npm_3.5.2.orig/.gitignore0000644000000000000000000000071112631326456013461 0ustar 00000000000000*.swp npm-debug.log /test/bin /test/output.log /test/*/*/node_modules /test/packages/npm-test-depends-on-spark/which-spark.log /test/packages/test-package/random-data.txt /test/root /node_modules/ronn /node_modules/tap /node_modules/.bin /node_modules/npm-registry-mock /node_modules/marked /html/api/ /html/doc/ /man/ /doc/*/npm-index.md /npmrc /release/ /npm-*.tgz /node_modules/npm-registry-client/test/fixtures /node_modules/npm-registry-couchapp *.pyc npm_3.5.2.orig/.npmignore0000644000000000000000000000072012631326456013470 0ustar 00000000000000*.swp .*.swp npm-debug.log /test/bin /test/output.log /test/packages/*/node_modules /test/packages/npm-test-depends-on-spark/which-spark.log /test/packages/test-package/random-data.txt /test/root node_modules/marked node_modules/ronn node_modules/tap node_modules/.bin node_modules/npm-registry-mock /npmrc /release/ # don't need these in the npm package. html/*.png # don't ignore .npmignore files # these are used in some tests. !.npmignore /npm-*.tgz *.pyc npm_3.5.2.orig/.npmrc0000644000000000000000000000005412631326456012611 0ustar 00000000000000save-prefix = ~ proprietary-attribs = false npm_3.5.2.orig/.travis.yml0000644000000000000000000000011612631326456013601 0ustar 00000000000000language: node_js script: "npm run-script tap" node_js: - "0.11" - "0.10" npm_3.5.2.orig/AUTHORS0000644000000000000000000003025012631326456012542 0ustar 00000000000000# Authors sorted by whether or not they're me Isaac Z. Schlueter Steve Steiner Mikeal Rogers Aaron Blohowiak Martyn Smith Charlie Robbins Francisco Treacy Cliffano Subagio Christian Eager Dav Glass Alex K. Wolfe James Sanders Reid Burke Arlo Breault Timo Derstappen Bart Teeuwisse Ben Noordhuis Tor Valamo Whyme.Lyu <5longluna@gmail.com> Olivier Melcher Tomaž Muraus Evan Meagher Orlando Vazquez Kai Chen George Miroshnykov Geoff Flarity Max Goodman Pete Kruckenberg Laurie Harper Chris Wong Scott Bronson Federico Romero Visnu Pitiyanuvath Irakli Gozalishvili Mark Cahill Tony Iain Sproat Trent Mick Felix Geisendörfer Jameson Little Conny Brunnkvist Will Elwood Dean Landolt Oleg Efimov Martin Cooper Jann Horn Andrew Bradley Maciej Małecki Stephen Sugden Michael Budde Jason Smith Gautham Pai David Trejo Paul Vorbach George Ornbo Tim Oxley Tyler Green Dave Pacheco Danila Gerasimov Rod Vagg Christian Howe Andrew Lunny Henrik Hodne Adam Blackburn Kris Windham Jens Grunert Joost-Wim Boekesteijn Dalmais Maxence Marcus Ekwall Aaron Stacy Phillip Howell Domenic Denicola James Halliday Jeremy Cantrell Ribettes Don Park Einar Otto Stangvik Kei Son Nicolas Morel Mark Dube Nathan Rajlich Maxim Bogushevich Meaglin Ben Evans Nathan Zadoks Brian White Jed Schmidt Ian Livingstone Patrick Pfeiffer Paul Miller Ryan Emery Carl Lange Jan Lehnardt Stuart P. 
Bentley Johan Sköld Stuart Knightley Niggler Paolo Fragomeni Jaakko Manninen Luke Arduini Larz Conwell Marcel Klehr Robert Kowalski Forbes Lindesay Vaz Allen Jake Verbaten Schabse Laks Florian Margaine Johan Nordberg Ian Babrou Di Wu Mathias Bynens Matt McClure Matt Lunn Alexey Kreschuk elisee Robert Gieseke François Frisch Trevor Burnham Alan Shaw TJ Holowaychuk Nicholas Kinsey Paulo Cesar Elan Shanker Jon Spencer Jason Diamond Maximilian Antoni Thom Blake Jess Martin Spain Train Alex Rodionov Matt Colyer Evan You bitspill Gabriel Falkenberg Alexej Yaroshevich Quim Calpe Steve Mason Wil Moore III Sergey Belov Tom Huang CamilleM Sébastien Santoro Evan Lucas Quinn Slack Alex Kocharin Daniel Santiago Denis Gladkikh Andrew Horton Zeke Sikelianos Dylan Greene Franck Cuny Yeonghoon Park Rafael de Oleza Mikola Lysenko Yazhong Liu Neil Gentleman Kris Kowal Alex Gorbatchev Shawn Wildermuth Wesley de Souza yoyoyogi J. Tangelder Jean Lauliac Andrey Kislyuk Thorsten Lorenz Julian Gruber Benjamin Coe Alex Ford Matt Hickford Sean McGivern C J Silverio Robin Tweedie Miroslav Bajtoš David Glasser Gianluca Casati Forrest L Norvell Karsten Tinnefeld Bryan Burgers David Beitey Evan You Zach Pomerantz Chris Williams sudodoki Mick Thompson Felix Rabe Michael Hayes Chris Dickinson Bradley Meck GeJ Andrew Terris Michael Nisi fengmk2 Adam Meadows Chulki Lee 不四 dead_horse Kenan Yildirim Laurie Voss Rebecca Turner Hunter Loftis Peter Richardson Jussi Kalliokoski Filip Weiss Timo Weiß Christopher Hiller Jérémy Lal Anders Janmyr Chris Meyers Ludwig Magnusson Wout Mertens Nick Santos Terin Stock Faiq Raza Thomas Torp Sam Mikes Mat Tyndall Tauren Mills Ron Martinez Kazuhito Hokamura Tristan Davies David Volm Lin Clark Ben Page Jeff Jo martinvd Mark J. Titorenko Oddur Sigurdsson Eric Mill Gabriel Barros KevinSheedy Aleksey Smolenchuk Ed Morley Blaine Bublitz Andrey Fedorov Daijiro Wachi Luc Thevenard Aria Stewart Charlie Rudolph Vladimir Rutsky Isaac Murchie Marcin Wosinek David Marr Bryan English Anthony Zotti Karl Horky Jordan Harband Guðlaugur Stefán Egilsson Helge Skogly Holm Peter A. Shevtsov Alain Kalker Bryant Williams Jonas Weber Tim Whidden Andreas Karolis Narkevicius Adrian Lynch Richard Littauer Oli Evans Matt Brennan Jeff Barczewski Danny Fritz Takaya Kobayashi Ra'Shaun Stovall Julien Meddah Michiel Sikma Jakob Krigovsky Charmander <~@charmander.me> Erik Wienhold James Butler Kevin Kragenbrink Arnaud Rinquin Mike MacCana Antti Mattila laiso Matt Zorn Kyle Mitchell Jeremiah Senkpiel Michael Klein Simen Bekkhus Victor thefourtheye Clay Carpenter bangbang93 Nick Malaguti Cedric Nelson Kat Marchán Andrew Eduardo Pinho Rachel Hutchison Ryan Temple Eugene Sharygin James Talmage jane arc Joseph Dykstra Andrew Crites Joshua Egan Carlos Alberto Thomas Cort Thaddee Tyl Steve Klabnik Andrew Murray Stephan Bönnemann Kyle M. Tarplee Derek Peterson Greg Whiteley murgatroid99 Marcin Cieslak João Reis Matthew Hasbach Jon Hall Anna Henningsen James Treworgy James Hartig Stephanie Snopek Kent C. Dodds Aaron Krause Daniel K O'Leary fscherwi Thomas Reggi Thomas Michael McTiernan Jason Kurian Sebastiaan Deckers lady3bean Tomi Carr Juan Caicedo Ashley Williams Andrew Marcinkevičius Jorrit Schippers Alex Lukin Aria Stewart Tiago Rodrigues Tim Nick Williams Louis Larry Ben Gotow Jakub Gieryluk npm_3.5.2.orig/CHANGELOG.md0000644000000000000000000027363712631326456013325 0ustar 00000000000000### v3.5.2 (2015-12-03): Weeeelcome to another npm release! 
The short version is that we fixed some `ENOENT` and some modules that resulted in modules going missing. We also eliminated the use of MD5 in our code base to help folks using Node.js in FIPS mode. And we fixed a bad URL in our license file. #### FIX URL IN LICENSE The license incorrectly identified the registry URL as `registry.npmjs.com` and this has been corrected to `registry.npmjs.org`. * [`cb6d81b`](https://github.com/npm/npm/commit/cb6d81bd611f68c6126a90127a9dfe5604d46c8c) [#10685](https://github.com/npm/npm/pull/10685) Fix npm public registry URL in notices. ([@kemitchell](https://github.com/kemitchell)) #### ENOENT? MORE LIKE ENOMOREBUGS The headliner this week was uncovered by the fixes to bundled dependency handling over the past few releases. What had been a frustratingly intermittent and hard to reproduce bug became something that happened every time in Travis. This fixes another whole bunch of errors where you would, while running an install have it crash with an `ENOENT` on `rename`, or the install would finish but some modules would be mysteriously missing and you'd have to install a second time. What's going on was a bit involved, so bear with me: `npm@3` generates a list of actions to take against the tree on disk. With the exception of lifecycle scripts, it expects these all to be able to act independently without interfering with each other. This means, for instance, that one should be able to upgrade `b` in `a→b→c` without having npm reinstall `c`. That works fine by the way. But it also means that the move action should be able to move `b` in `a→b→c@1.0.1` to `a→d→b→c@1.0.2` without moving or removing `c@1.0.1` and while leaving `c@1.0.2` in place if it was already installed. That is, the `move` action moves an individual node, replacing itself with an empty spot if it had children. This is not, as it might first appear, something where you move an entire branch to another location on the tree. When moving `b` we already took care to leave `c@1.0.1` in place so that other moves (or removes) could handle it, but we were stomping on the destination and so `c@1.0.2` was being removed. * [`f4385d8`](https://github.com/npm/npm/commit/f4385d8e7678349e75c80fae8a1f8f366f197937) [#10655](https://github.com/npm/npm/pull/10655) Preserve destination `node_modules` when moving. ([@iarna](https://github.com/iarna)) There was also a bug with `remove` where it was pruning the entire tree at the remove point, prior to running moves and adds. This was fine most of the time, but if we were moving one of the deps out from inside it, kaboom. * [`19c626d`](https://github.com/npm/npm/commit/19c626d69888f0cdc6e960254b3fdf523ec4b52c) [#10655](https://github.com/npm/npm/pull/10655) Get rid of the remove commit phase– we could have it prune _just_ the module being removed, but that isn't gaining us anything. ([@iarna](https://github.com/iarna)) After all that, we shouldn't be upgrading the `add` of a bundled package to a `move`. Moves save us from having to extract the package, but with a bundled dependency it's included in another package already so that doesn't gain us anything. * [`641a93b`](https://github.com/npm/npm/commit/641a93bd66a6aa4edf2d6167344b50d1a2afb593) [#10655](https://github.com/npm/npm/pull/10655) Don't convert adds to moves with bundled deps. 
([@iarna](https://github.com/iarna)) While I was in there, I also took some time to improve diagnostics to make this sort of thing easier to track down in the future: * [`a04ec04`](https://github.com/npm/npm/commit/a04ec04804e562b511cd31afe89c8ba94aa37ff2) [#10655](https://github.com/npm/ npm/pull/10655) Wrap rename so errors have stack traces. ([@iarna](https://github.com/iarna)) * [`8ea142f`](https://github.com/npm/npm/commit/8ea142f896a2764290ca5472442b27b047ab7a1a) [#10655](https://github.com/npm/npm/pull/10655) Add silly logging so function is debuggable ([@iarna](https://github.com/iarna)) #### NO MORE MD5 We updated modules that had been using MD5 for non-security purposes. While this is perfectly safe, if you compile Node in FIPS-compliance mode it will explode if you try to use MD5. We've replaced MD5 with Murmur, which conveys our intent better and is faster to boot. * [`f068b26`](https://github.com/npm/npm/commit/f068b2661a8d0269c184867e003cd08cb6c56cf2) [#10629](https://github.com/npm/npm/issues/10629) `unique-filename@1.1.0` ([@iarna](https://github.com/iarna)) * [`dba1b24`](https://github.com/npm/npm/commit/dba1b2402aaa2beceec798d3bd22d00650e01069) [#10629](https://github.com/npm/npm/issues/10629) `write-file-atomic@1.1.4` ([@othiym23](https://github.com/othiym23)) * [`8347a30`](https://github.com/npm/npm/commit/8347a308ef0d2cf0f58f96bba3635af642ec611f) [#10629](https://github.com/npm/npm/issues/10629) `fs-write-stream-atomic@1.0.5` ([@othiym23](https://github.com/othiym23)) #### DEPENDENCY UPDATES * [`9e2a2bb`](https://github.com/npm/npm/commit/9e2a2bb5bc71a0ab3b3637e8eec212aa22d5c99f) [nodejs/node-gyp#831](https://github.com/nodejs/node-gyp/pull/831) `node-gyp@3.2.1`: Improved \*BSD support. ([@bnoordhuis](https://github.com/bnoordhuis)) ### v3.5.1 (2015-11-25): #### THE npm CLI !== THE npm REGISTRY !== npm, INC. npm-the-CLI is licensed under the terms of the [Artistic License 2.0](https://github.com/npm/npm/blob/8d79c1a39dae908f27eaa37ff6b23515d505ef29/LICENSE), which is a liberal open-source license that allows you to take this code and do pretty much whatever you like with it (that is, of course, not legal language, and if you're doing anything with npm that leaves you in doubt about your legal rights, please seek the review of qualified counsel, which is to say, not members of the CLI team, none of whom have passed the bar, to my knowledge). At the same time the primary registry the CLI uses when looking up and downloading packages is a commercial service run by npm, Inc., and it has its own [Terms of Use](https://www.npmjs.com/policies/terms). Aside from clarifying the terms of use (and trying to make sure they're more widely known), the only recent changes to npm's licenses have been making the split between the CLI and registry clearer. You are still free to do whatever you like with the CLI's source, and you are free to view, download, and publish packages to and from `registry.npmjs.org`, but now the existing terms under which you can do so are more clearly documented. Aside from the two commits below, see also [the release notes for `npm@3.4.1`](https://github.com/npm/npm/releases/tag/v3.4.1), which is where the split between the CLI's code and the terms of use for the registry was first made more clear. * [`35a5dd5`](https://github.com/npm/npm/commit/35a5dd5abbfeec4f98a2b4534ec4ef5d16760581) [#10532](https://github.com/npm/npm/issues/10532) Clarify that `registry.npmjs.org` is the default, but that you're free to use the npm CLI with whatever registry you wish. 
([@kemitchell](https://github.com/kemitchell)) * [`fa6b013`](https://github.com/npm/npm/commit/fa6b0136a0e4a19d8979b2013622e5ff3f0446f8) [#10532](https://github.com/npm/npm/issues/10532) Having semi-duplicate release information in `README.md` was confusing and potentially inaccurate, so remove it. ([@kemitchell](https://github.com/kemitchell)) #### EASE UP ON WINDOWS BASH USERS It turns out that a fair number of us use bash on Windows (through MINGW or bundled with Git, plz – Cygwin is still a bridge too far, for both npm and Node.js). [@jakub-g](https://github.com/jakub-g) did us all a favor and relaxed the check for npm completion to support MINGW bash. Thanks, Jakub! * [`09498e4`](https://github.com/npm/npm/commit/09498e45c5c9e683f092ab1372670f81db4762b6) [#10156](https://github.com/npm/npm/issues/10156) completion: enable on Windows in git bash ([@jakub-g](https://github.com/jakub-g)) #### THE ONGOING SAGA OF BUNDLED DEPENDENCIES `npm@3.5.0` fixed up a serious issue with how `npm@3.4.1` (and potentially `npm@3.4.0` and `npm@3.3.12`) handled the case in which dependencies bundled into a package tarball are handled improperly when one or more of their own dependencies are older than what's latest on the registry. Unfortunately, in fixing that (quite severe) regression (see [`npm@3.5.0`'s release notes' for details](https://github.com/npm/npm/releases/tag/v3.5.0)), we introduced a new (small, and fortunately cosmetic) issue where npm superfluously warns you about bundled dependencies being stale. We have now fixed that, and hope that we haven't introduced any _other_ regressions in the process. :D * [`20824a7`](https://github.com/npm/npm/commit/20824a75bf7639fb0951a588e3c017a370ae6ec2) [#10501](https://github.com/npm/npm/issues/10501) Only warn about replacing bundled dependencies when actually doing so. ([@iarna](https://github.com/iarna)) #### MAKE NODE-GYP A LITTLE BLUER * [`1d14d88`](https://github.com/npm/npm/commit/1d14d882c3b5af0a7fee46e8e0e343d07e4c38cb) `node-gyp@3.2.0`: Support AIX, use `which` to find Python, updated to a newer version of `gyp`, and more! ([@bnoordhuis](https://github.com/bnoordhuis)) #### A BOUNTEOUS THANKSGIVING CORNUCOPIA OF DOC TWEAKS These are great! Keep them coming! Sorry for letting them pile up so deep, everybody. Also, a belated Thanksgiving to our Canadian friends, and a happy Thanksgiving to all our friends in the USA. * [`4659f1c`](https://github.com/npm/npm/commit/4659f1c5ad617c46a5e89b48abf0b1c4e6f04307) [#10244](https://github.com/npm/npm/issues/10244) In `npm@3`, `npm dedupe` doesn't take any arguments, so update documentation to reflect that. ([@bengotow](https://github.com/bengotow)) * [`625a7ee`](https://github.com/npm/npm/commit/625a7ee6b4391e90cb28a95f20a73fd794e1eebe) [#10250](https://github.com/npm/npm/issues/10250) Correct order of `org:team` in `npm team` documentation. ([@louislarry](https://github.com/louislarry)) * [`bea7f87`](https://github.com/npm/npm/commit/bea7f87399d784e3a6d3393afcca658a61a40d77) [#10371](https://github.com/npm/npm/issues/10371) Remove broken / duplicate link to tag. ([@WickyNilliams](https://github.com/WickyNilliams)) * [`0a25e29`](https://github.com/npm/npm/commit/0a25e2956e9ddd4065d6bd929559321afc512fde) [#10419](https://github.com/npm/npm/issues/10419) Remove references to nonexistent `npm-rm(1)` documentation. 
([@KenanY](https://github.com/KenanY)) * [`19b94e1`](https://github.com/npm/npm/commit/19b94e1e6781fe2f98ada0a3f49a1bda25e3e32d) [#10474](https://github.com/npm/npm/issues/10474) Clarify that install finds dependencies in `package.json`. ([@sleekweasel](https://github.com/sleekweasel)) * [`b25efc8`](https://github.com/npm/npm/commit/b25efc88067c843ffdda86ea0f50f95d136a638e) [#9948](https://github.com/npm/npm/issues/9948) Encourage users to file an issue, rather than emailing authors. ([@trodrigues](https://github.com/trodrigues)) * [`24f4ced`](https://github.com/npm/npm/commit/24f4cedc83b10061f26362bf2f005ab935e0cbfe) [#10497](https://github.com/npm/npm/issues/10497) Clarify what a package is slightly. ([@aredridel](https://github.com/aredridel)) * [`e8168d4`](https://github.com/npm/npm/commit/e8168d40caae00b2914ea09dbe4bd1b09ba3dcd5) [#10539](https://github.com/npm/npm/issues/10539) Remove an extra, spuriously capitalized letter. ([@alexlukin-softgrad](https://github.com/alexlukin-softgrad)) ### v3.5.0 (2015-11-19): #### TEEN ORCS AT THE GATES This week heralds the general release of the primary npm registry's [new support for private packages for organizations](http://blog.npmjs.org/post/133542170540/private-packages-for-organizations). For many potential users, it's the missing piece needed to make it easy for you to move your organization's private work onto npm. And now it's here! The functionality to support it has been in place in the CLI for a while now, thanks to [@zkat](https://github.com/zkat)'s hard work. During our final testing before the release, our ace support team member [@snopeks](https://github.com/snopeks) noticed that there had been some drift between the CLI team's implementation and what npm was actually preparing to ship. In the interests of everyone having a smooth experience with this _extremely useful_ new feature, we quickly made a few changes to square up the CLI and the web site experiences. * [`d7fb92d`](https://github.com/npm/npm/commit/d7fb92d1c53ba5196ad6dd2101a06792a4c0412b) [#9327](https://github.com/npm/npm/issues/9327) `npm access` no longer has problems when run in a directory that doesn't contain a `package.json`. ([@othiym23](https://github.com/othiym23)) * [`17df3b5`](https://github.com/npm/npm/commit/17df3b5d5dffb2e9c223b9cfa2d5fd78c39492a4) [npm/npm-registry-client#126](https://github.com/npm/npm-registry-client/issues/126) `npm-registry-client@7.0.8`: Allow the CLI to grant, revoke, and list permissions on unscoped (public) packages on the primary registry. ([@othiym23](https://github.com/othiym23)) #### NON-OPTIONAL INSTALLS, DEFINITELY NON-OPTIONAL * [`180263b`](https://github.com/npm/npm/commit/180263b) [#10465](https://github.com/npm/npm/pull/10465) When a non-optional dep fails, we check to see if it's only required by ONLY optional dependencies. If it is, we make it fail all the deps in that chain (and roll them back). If it isn't then we give an error. We do this by walking up through all of our ancestors until we either hit an optional dependency or the top of the tree. If we hit the top, we know to give the error. If you installed a module by hand but didn't `--save` it, your module won't have the top of the tree as an anscestor and so this code was failing to abort the install with an error This updates the logic so that hitting the top OR a module that was requested by the user will trigger the error message. 
([@iarna](https://github.com/iarna)) * [`b726a0e`](https://github.com/npm/npm/commit/b726a0e) [#9204](https://github.com/npm/npm/issues/9204) Ideally we would like warnings about your install to come AFTER the output from your compile steps or the giant tree of installed modules. To that end, we've moved warnings about failed optional deps to the show after your install completes. ([@iarna](https://github.com/iarna)) #### OVERRIDING BUNDLING * [`aed71fb`](https://github.com/npm/npm/commit/aed71fb) [#10482](https://github.com/npm/npm/issues/10482) We've been in our bundled modules code a lot lately, and our last go at this introduced a new bug, where if you had a module `a` that bundled a module `b`, which in turn required `c`, and the version of `c` that got bundled wasn't compatible with `b`'s `package.json`, we would then install a compatible version of `c`, but also erase `b` at the same time. This fixes that. It also reworks our bundled module support to be much closer to being in line with how we handle non-bundled modules and we're hopeful this will reduce any future errors around them. The new structure is hopefully much easier to reason about anyway. ([@iarna](https://github.com/iarna)) #### A BRIEF NOTE ON NPM'S BACKWARDS COMPATIBILITY We don't often have much to say about the changes we make to our internal testing and tooling, but I'm going to take this opportunity to reiterate that npm tries hard to maintain compatibility with a wide variety of Node versions. As this change shows, we want to ensure that npm works the same across: * Node.js 0.8 * Node.js 0.10 * Node.js 0.12 * the latest io.js release * Node.js 4 LTS * Node.js 5 Contributors who send us pull requests often notice that it's very rare that our tests pass across all of those versions (ironically, almost entirely due to the packages we use for testing instead of any issues within npm itself). We're currently beginning an effort, lasting the rest of 2015, to clean up our test suite, and not only get it passing on all of the above versions of Node.js, but working solidly on Windows as well. This is a compounding form of technical debt that we're finally paying down, and our hope is that cleaning up the tests will produce a more robust CLI that's a lot easier to write patches for. * [`791ec6b`](https://github.com/npm/npm/commit/791ec6b1bac0d1df59f5ebb4ccd16a29a5dc73f0) [#10233](https://github.com/npm/npm/issues/10233) Update Node.js versions that Travis uses to test npm. ([@iarna](https://github.com/iarna)) #### 0.8 + npm <1.4 COMPATIBLE? SURE WHY NOT Hey, you found the feature we added! * [`231c58a`](https://github.com/npm/npm/commit/231c58a) [#10337](https://github.com/npm/npm/pull/10337) Add two new flags, first `--legacy-bundling` which installs your dependencies such that if you bundle those dependencies, npm versions prior to `1.4` can still install them. This eliminates all automatic deduping. Second, `--global-style` which will install modules in your `node_modules` folder with the same layout as global modules. Only your direct dependencies will show in `node_modules` and everything they depend on will be flattened in their `node_modules` folders. This obviously will elminate some deduping. ([@iarna](https://github.com/iarna)) #### TYPOS IN THE LICENSE, OH MY * [`8d79c1a`](https://github.com/npm/npm/commit/8d79c1a39dae908f27eaa37ff6b23515d505ef29) [#10478](https://github.com/npm/npm/issues/10478) Correct two typos in npm's LICENSE. 
([@jorrit](https://github.com/jorrit)) ### v3.4.1 (2015-11-12): #### ASK FOR NOTHING, GET LATEST When you run `npm install foo`, you probably expect that you'll get the `latest` version of `foo`, whatever that is. And good news! That's what this change makes it do. We _think_ this is what everyone wants, but if this causes problems for you, we want to know! If it proves problematic for people we will consider reverting it (preferrably before this becomes `npm@latest`). Previously, when you ran `npm install foo` we would act as if you typed `npm install foo@*`. Now, like any range-type specifier, in addition to matching the range, it would also have to be `<=` the value of the `latest` dist-tag. Further, it would exclude prerelease versions from the list of versions considered for a match. This worked as expected most of the time, unless your `latest` was a prerelease version, in which case that version wouldn't be used, to everyone's surprise. Worse, if all your versions were prerelease versions it would just refuse to install anything. (We fixed that in [`npm@3.2.2`](https://github.com/npm/npm/releases/tag/v3.2.2) with [`e4a38080`](https://github.com/npm/npm/commit/e4a38080).) * [`1e834c2`](https://github.com/npm/npm/commit/1e834c2) [#10189](https://github.com/npm/npm/issues/10189) `npm-package-arg@4.1.0` Change the default version from `*` to `latest`. ([@zkat](https://github.com/zkat)) #### BUGS * [`bec4a84`](https://github.com/npm/npm/commit/bec4a84) [#10338](https://github.com/npm/npm/pull/10338) Failed installs could result in more rollback (removal of just installed packages) than we intended. This bug was first introduced by [`83975520`](https://github.com/npm/npm/commit/83975520). ([@iarna](https://github.com/iarna)) * [`06c732f`](https://github.com/npm/npm/commit/06c732f) [#10338](https://github.com/npm/npm/pull/10338) Updating a module could result in the module stealing some of its dependencies from the top level, potentially breaking other modules or resulting in many redundent installations. This bug was first introduced by [`971fd47a`](https://github.com/npm/npm/commit/971fd47a). ([@iarna](https://github.com/iarna)) * [`5653366`](https://github.com/npm/npm/commit/5653366) [#9980](https://github.com/npm/npm/issues/9980) npm, when removing a module, would refuse to remove the symlinked binaries if the module itself was symlinked as well. npm goes to some effort to ensure that it doesn't remove things that aren't is, and this code was being too conservative. This code has been rewritten to be easier to follow and to be unit-testable. ([@iarna](https://github.com/iarna)) #### LICENSE CLARIFICATION * [`80acf20`](https://github.com/npm/npm/commit/80acf20) [#10326](https://github.com/npm/npm/pull/10326) Update npm's licensing to more completely cover all of the various things that are npm. ([@kemitchell](https://github.com/kemitchell)) #### CLOSER TO GREEN TRAVIS * [`fc12da9`](https://github.com/npm/npm/commit/fc12da9) [#10232](https://github.com/npm/npm/pull/10232) `nock@1.9.0` Downgrade nock to a version that doesn't depend on streams2 in core so that more of our tests can pass in 0.8. ([@iarna](https://github.com/iarna)) ### v3.4.0 (2015-11-05): #### A NEW FEATURE This was a group effort, with [@isaacs](https://github.com/isaacs) dropping the implementation in back in August. Then, a few days ago, [@ashleygwilliams](https://github.com/ashleygwilliams) wrote up docs and just today [@othiym23](https://github.com/othiym23) wrote a test. 
It's a handy shortcut to update a dependency and then make sure tests still pass. This new command: ``` npm install-test x ``` is the equivalent of running: ``` npm install x && npm test ``` * [`1ac3e08`](https://github.com/npm/npm/commit/1ac3e08) [`bcb04f6`](https://github.com/npm/npm/commit/bcb04f6) [`b6c17dd`](https://github.com/npm/npm/commit/b6c17dd) [#9443](https://github.com/npm/npm/pull/9443) Add `npm install-test` command, alias `npm it`. ([@isaacs](https://github.com/isaacs), [@ashleygwilliams](https://github.com/ashleygwilliams), [@othiym23](https://github.com/othiym23)) #### BUG FIXES VIA DEPENDENCY UPDATES * [`31c0080`](https://github.com/npm/npm/commit/31c0080) [#8640](https://github.com/npm/npm/issues/8640) [npm/normalize-package-data#69](https://github.com/npm/normalize-package-data/pull/69) `normalize-package-data@2.3.5`: Fix a bug where if you didn't specify the name of a scoped module's binary, it would install it such that it was impossible to call it. ([@iarna](https://github.com/iarna)) * [`02b37bc`](https://github.com/npm/npm/commit/02b37bc) [npm/fstream-npm#14](https://github.com/npm/fstream-npm/pull/14) `fstream-npm@1.0.7`: Only filter `config.gypi` when it's in the build directory. ([@mscdex](https://github.com/mscdex)) * [`accb9d2`](https://github.com/npm/npm/commit/accb9d2) [npm/fstream-npm#15](https://github.com/npm/fstream-npm/pull/15) `fstream-npm@1.0.6`: Stop including directories that happened to have names matching whitelisted npm files in npm module tarballs. The most common cause was that if you had a README directory then everything in it would be included if wanted it or not. ([@taion](https://github.com/taion)) #### DOCUMENTATION FIXES * [`7cf6366`](https://github.com/npm/npm/commit/7cf6366) [#10036](https://github.com/npm/npm/pull/10036) Fix typo / over-abbreviation. ([@ifdattic](https://github.com/ifdattic)) * [`d0ad8f4`](https://github.com/npm/npm/commit/d0ad8f4) [#10176](https://github.com/npm/npm/pull/10176) Fix broken link, scopes => scope. ([@ashleygwilliams](https://github.com/ashleygwilliams)) * [`d623783`](https://github.com/npm/npm/commit/d623783) [#9460](https://github.com/npm/npm/issue/9460) Specifying the default command run by "npm start" and the fact that you can pass it arguments. ([@JuanCaicedo](https://github.com/JuanCaicedo)) #### DEPENDENCY UPDATES FOR THEIR OWN SAKE * [`0a4c29e`](https://github.com/npm/npm/commit/0a4c29e) [npm/npmlog#19](https://github.com/npm/npmlog/pull/19) `npmlog@2.0.0`: Make it possible to emit log messages with `error` as the prefix. ([@bengl](https://github.com/bengl)) * [`9463ce9`](https://github.com/npm/npm/commit/9463ce9) `read-package-json@2.0.2`: Minor cleanups. ([@KenanY](https://github.com/KenanY)) ### v3.3.12 (2015-11-02): Hi, a little hot-fix release for a bug introduced in 3.3.11. The ENOENT fix last week ([`f0e2088`](https://github.com/npm/npm/commit/f0e2088)) broke upgrades of modules that have bundled dependencies (like `npm`, augh!) * [`aedf7cf`](https://github.com/npm/npm/commit/aedf7cf) [#10192](//github.com/npm/npm/pull/10192) If a bundled module is going to be replacing a module that's currently on disk (for instance, when you upgrade a module that includes bundled dependencies) we want to select the version from the bundle in preference over the one that was there previously. ([@iarna](https://github.com/iarna)) ### v3.3.11 (2015-10-29): This is a dependency update week, so that means no PRs from our lovely users. Look for those next week. 
As it happens, the dependencies updated were just devdeps, so nothing for you all to worry about. But the bug fixes, oh geez, I tracked down some really long standing stuff this week!! The headliner is those intermittent `ENOENT` errors that no one could reproduce consistently? I think they're nailed! But also pretty important, the bug where `hapi` would install w/ a dep missing? Squashed! #### EEEEEEENOENT * [`f0e2088`](https://github.com/npm/npm/commit/f0e2088) [#10026](https://github.com/npm/npm/issues/10026) Eliminate some, if not many, of the `ENOENT` errors `npm@3` has seen over the past few months. This was happening when npm would, in its own mind, correct a bundled dependency, due to a `package.json` specifying an incompatible version. Then, when npm extracted the bundled version, what was on disk didn't match its mind and… well, when it tried to act on what was in its mind, we got an `ENOENT` because it didn't actually exist on disk. ([@iarna](https://github.com/iarna)) #### PARTIAL SHRINKWRAPS, NO LONGER A BAD DAY * [`712fd9c`](https://github.com/npm/npm/commit/712fd9c) [#10153](https://github.com/npm/npm/pull/10153) Imagine that you have a module, let's call it `fun-time`, and it depends on two dependencies, `need-fun@1` and `need-time`. Further, `need-time` requires `need-fun@2`. So after install the logical tree will look like this: ``` fun-time ├── need-fun@1 └── need-time └── need-fun@2 ``` Now, the `fun-time` author also distributes a shrinkwrap, but it only includes the `need-fun@1` in it. Resolving dependencies would look something like this: 1. Require `need-fun@1`: Use version from shrinkwrap (ignoring version) 2. Require `need-time`: User version in package.json 1. Require `need-fun@2`: Use version from shrinkwrap, which oh hey, is already installed at the top level, so no further action is needed. Which results in this tree: ``` fun-time ├── need-fun@1 └── need-time ``` We're ignoring the version check on things specified in the shrinkwrap so that you can override the version that will be installed. This is because you may want to use a different version than is specified by your dependencies' dependencies' `package.json` files. To fix this, we now only allow overrides of a dependency version when that dependency is a child (in the tree) of the thing that requires it. This means that when we're looking for `need-fun@2` we'll see `need-fun@1` and reject it because, although it's from a shrinkwrap, it's parent is `fun-time` and the package doing the requiring is `need-time`. ([@iarna](https://github.com/iarna)) #### STRING `package.bin` AND NON-NPMJS REGISTRIES * [`3de1463`](https://github.com/npm/npm/commit/3de1463) [#9187](https://github.com/npm/npm/issues/9187) If you were using a module with the `bin` field in your `package.json` set to a string on a non-npmjs registry then npm would crash, due to the our expectation that the `bin` field would be an object. We now pass all `package.json` data through a routine that normalizes the format, including the `bin` field. (This is the same routine that your `package.json` is passed through when read off of disk or sent to the registry for publication.) Doing this also ensures that older modules on npm's own registry will be treated exactly the same as new ones. (In the past we weren't always super careful about scrubbing `package.json` data on publish. And even when we were, those rules have subtly changed over time.) ([@iarna](https://github.com/iarna)) ### v3.3.10 (2015-10-22): Hey you all! 
Welcome to a busy bug fix and PR week. We've got changes to how `npm install` replaces dependencies during updates, improvements to shrinkwrap behavior, and all sorts of doc updates. In other news, `npm@3` landed in node master in preparation for `node@5` with [`41923c0`](https://github.com/nodejs/node/commit/41923c0). #### UPDATED DEPS NOW MAKE MORE SENSE * [`971fd47`](https://github.com/npm/npm/commit/971fd47) [#9929](https://github.com/npm/npm/pull/9929) Make the tree more consistent by doing updates in place. This means that trees after a dependency version update will more often look the same as after a fresh install. ([@iarna](https://github.com/iarna)) #### SHRINKWRAP + DEV DEPS NOW RESPECTED * [`eb28a8c`](https://github.com/npm/npm/commit/eb28a8c) [#9647](https://github.com/npm/npm/issues/9647) If a shrinkwrap already has dev deps, don't throw them away when someone later runs `npm install --save`. ([@iarna](https://github.com/iarna)) #### FANTASTIC DOCUMENTATION UPDATES * [`291162c`](https://github.com/npm/npm/commit/291162c) [#10021](https://github.com/npm/npm/pull/10021) Improve wording in the FAQ to be more empathetic and less jokey. ([@TaMe3971](https://github.com/TaMe3971)) * [`9a28c54`](https://github.com/npm/npm/commit/9a28c54) [#10020](https://github.com/npm/npm/pull/10020) Document the command to see the list of config defaults in the section on config defaults. ([@lady3bean](https://github.com/lady3bean)) * [`8770b0a`](https://github.com/npm/npm/commit/8770b0a) [#7600](https://github.com/npm/npm/issues/7600) Add shortcuts to all command documentation. ([@RichardLitt](https://github.com/RichardLitt)) * [`e9b7d0d`](https://github.com/npm/npm/commit/e9b7d0d) [#9950](https://github.com/npm/npm/pull/9950) On errors that can be caused by outdated node & npm, suggest updating as a part of the error message. ([@ForbesLindesay](https://github.com/ForbesLindesay)) #### NEW STANDARD HAS ALWAYS BEEN STANDARD * [`40c1b0f`](https://github.com/npm/npm/commit/40c1b0f) [#9954](https://github.com/npm/npm/pull/9954) Update to `standard@5` and reformat the source to work with it. ([@cbas](https://github.com/cbas)) ### v3.3.9 (2015-10-15): This week sees a few small changes ready to land: #### TRAVIS NODE 0.8 BUILDS REJOICE * [`25a234b`](https://github.com/npm/npm/commit/25a234b) [#9668](https://github.com/npm/npm/issues/9668) Install `npm@3`'s bundled dependencies with `npm@2`, so that the ancient npm that ships with node 0.8 can install `npm@3` directly. ([@othiym23](https://github.com/othiym23)) #### SMALL ERROR MESSAGE IMPROVEMENT * [`a332f61`](https://github.com/npm/npm/commit/a332f61) [#9927](https://github.com/npm/npm/pull/9927) Update error messages where we report a list of versions that you could have installed to show this as a comma separated list instead of as JSON. ([@iarna](https://github.com/iarna)) #### DEPENDENCY UPDATES * [`4cd74b0`](https://github.com/npm/npm/commit/4cd74b0) `nock@2.15.0` ([@pgte](https://github.com/pgte)) * [`9360976`](https://github.com/npm/npm/commit/9360976) `tap@2.1.1` ([@isaacs](https://github.com/isaacs)) * [`1ead0a4`](https://github.com/npm/npm/commit/1ead0a4) `which@1.2.0` ([@isaacs](https://github.com/isaacs)) * [`759f88a`](https://github.com/npm/npm/commit/759f88a) `has-unicode@1.0.1` ([@iarna](https://github.com/iarna)) ### v3.3.8 (2015-10-12): This is a small update release, we're reverting [`22a3af0`](https://github.com/npm/npm/commit/22a3af0) from last week's release, as it is resulting in crashes. We'll revisit this PR during this week. 
* [`ddde1d5`](https://github.com/npm/npm/commit/ddde1d5) Revert "lifecycle: Swap out custom logic with add-to-path module" ([@iarna](https://github.com/iarna)) ### v3.3.7 (2015-10-08): So, as Kat mentioned in last week's 2.x release, we're now swapping weeks between accepting PRs and doing dependency updates, in an effort to keep release management work from taking over our lives. This week is a PR week, so we've got a bunch of goodies for you. Relatedly, this week means 3.3.6 is now `latest` and it is WAY faster than previous 3.x releases. Give it or this a look! #### OPTIONAL DEPS, MORE OPTIONAL * [`2289234`](https://github.com/npm/npm/commit/2289234) [#9643](https://github.com/npm/npm/issues/9643) [#9664](https://github.com/npm/npm/issues/9664) `npm@3` was triggering `npm@2`'s build mechanics when it was linking bin files into the tree. This was originally intended to trigger rebuilds of bundled modules, but `npm@3`'s flat module structure confused it. This caused two seemingly unrelated issues. First, failing optional dependencies could under some circumstances (if they were built during this phase) trigger a full build failure. And second, rebuilds were being triggered of already installed modules, again, in some circumstances. Both of these are fixed by disabling the `npm@2` mechanics and adding a special rebuild phase for the initial installation of bundled modules. ([@iarna](https://github.com/iarna)) #### BAD NAME, NO CRASH * [`b78fec9`](https://github.com/npm/npm/commit/b78fec9) [#9766](https://github.com/npm/npm/issues/9766) Refactor all attempts to read the module name or package name to go via a single function, with appropriate guards unusual circumstances where they aren't where we expect them. This ultimately will ensure we don't see any more recurrences of the `localeCompare` error and related crashers. ([@iarna](https://github.com/iarna)) #### MISCELLANEOUS BUG FIXES * [`22a3af0`](https://github.com/npm/npm/commit/22a3af0) [#9553](https://github.com/npm/npm/pull/9553) Factor the lifecycle code to manage paths out into its own module and use that. ([@kentcdodds](https://github.com/kentcdodds)) * [`6a29fe3`](https://github.com/npm/npm/commit/6a29fe3) [#9677](https://github.com/npm/npm/pull/9677) Start testing our stuff in node 4 on travis ([@fscherwi](https://github.com/fscherwi)) * [`508c6a4`](https://github.com/npm/npm/commit/508c6a4) [#9669](https://github.com/npm/npm/issues/9669) Make `recalculateMetadata` more resilient to unexpectedly bogus dependency specifiers. ([@tmct](https://github.com/tmct)) * [`3c44763`](https://github.com/npm/npm/commit/3c44763) [#9643](https://github.com/npm/npm/issues/9463) Update `install --only` to ignore the `NODE_ENV` var and _just_ use the only value, if specified. ([@watilde](https://github.com/watilde)) * [`87336c3`](https://github.com/npm/npm/commit/87336c3) [#9879](https://github.com/npm/npm/pull/9879) `npm@3`'s shrinkwrap was refusing to shrinkwrap if an optional dependency was missing– patch it to allow this. 
([@mantoni](https://github.com/mantoni)) #### DOCUMENTATION UPDATES * [`82659fd`](https://github.com/npm/npm/commit/82659fd) [#9208](https://github.com/npm/npm/issues/9208) Correct the npm style guide around quote usage ([@aaroncrows](https://github.com/aaroncrows)) * [`a69c83a`](https://github.com/npm/npm/commit/a69c83a) [#9645](https://github.com/npm/npm/pull/9645) Fix spelling error in README ([@dkoleary88](https://github.com/dkoleary88)) * [`f2cf054`](https://github.com/npm/npm/commit/f2cf054) [#9714](https://github.com/npm/npm/pull/9714) Fix typos in our documentation ([@reggi](https://github.com/reggi)) * [`7224bef`](https://github.com/npm/npm/commit/7224bef) [#9759](https://github.com/npm/npm/pull/9759) Fix typo in npm-team docs ([@zkat](https://github.com/zkat)) * [`7e6e007`](https://github.com/npm/npm/commit/7e6e007) [#9820](https://github.com/npm/npm/pull/9820) Correct documentation as to `binding.gyp` ([@KenanY](https://github.com/KenanY)) ### v3.3.6 (2015-09-30): I have the most exciting news for you this week. YOU HAVE NO IDEA. Well, ok, maybe you do if you follow my twitter. Performance just got 5 bazillion times better (under some circumstances, ymmv, etc). So– my test scenario is our very own website. In `npm@2`, on my macbook running `npm ls` takes about 5 seconds. Personally it's more than I'd like, but it's entire workable. In `npm@3` it has been taking _50_ seconds, which is appalling. But after doing some work on Monday isolating the performance issues I've been able to reduce `npm@3`'s run time back down to 5 seconds. Other scenarios were even worse, there was one that until now in `npm@3` that took almost 6 minutes, and has been reduced to 14 seconds. * [`7bc0d4c`](https://github.com/npm/npm/commit/7bc0d4c) [`cf42217`](https://github.com/npm/npm/commit/cf42217) [#8826](https://github.com/npm/npm/issues/8826) Stop using deepclone on super big datastructures. Avoid cloning all-together even when that means mutating things, when possible. Otherwise use a custom written tree-copying function that understands the underlying datastructure well enough to only copy what we absolutely need to. ([@iarna](https://github.com/iarna)) In other news, look for us this Friday and Saturday at the amazing [Open Source and Feelings](https://osfeels.com) conference, where something like a third of the company will be attending. #### And finally a dependency update * [`a6a4437`](https://github.com/npm/npm/commit/a6a4437) `glob@5.0.15` ([@isaacs](https://github.com/isaacs)) #### And some subdep updates * [`cc5e6a0`](https://github.com/npm/npm/commit/cc5e6a0) `hoek@2.16.3` ([@nlf](https://github.com/nlf)) * [`912a516`](https://github.com/npm/npm/commit/912a516) `boom@2.9.0` ([@arb](https://github.com/arb)) * [`63944e9`](https://github.com/npm/npm/commit/63944e9) `bluebird@2.10.1` ([@petkaantonov](https://github.com/petkaantonov)) * [`ef16003`](https://github.com/npm/npm/commit/ef16003) `mime-types@2.1.7` & `mime-db@1.19.0` ([@dougwilson](https://github.com/dougwilson)) * [`2b8c0dd`](https://github.com/npm/npm/commit/2b8c0dd) `request@2.64.0` ([@simov](https://github.com/simov)) * [`8139124`](https://github.com/npm/npm/commit/8139124) `brace-expansion@1.1.1` ([@juliangruber](https://github.com/juliangruber)) ### v3.3.5 (2015-09-24): Some of you all may not be aware, but npm is ALSO a company. I tell you this 'cause npm-the-company had an all-staff get together this week, flying in our remote folks from around the world. 
That was great, but it also basically eliminated normal work on Monday and Tuesday. Still, we've got a couple of really important bug fixes this week. Plus a lil bit from the [now LTS 2.x branch](https://github.com/npm/npm/releases/tag/v2.14.6). #### ATTENTION WINDOWS USERS If you previously updated to npm 3 and you try to update again, you may get an error messaging telling you that npm won't install npm into itself. Until you are at 3.3.5 or greater, you can get around this with `npm install -f -g npm`. * [`bef06f5`](https://github.com/npm/npm/commit/bef06f5) [#9741](https://github.com/npm/npm/pull/9741) Uh... so... er... it seems that since `npm@3.2.0` on Windows with a default configuration, it's been impossible to update npm. Well, that's not actually true, there's a work around (see above), but it shouldn't be complaining in the first place. ([@iarna](https://github.com/iarna)) #### STACK OVERFLOWS ON PUBLISH * [`330b496`](https://github.com/npm/npm/commit/330b496) [#9667](https://github.com/npm/npm/pull/9667) We were keeping track of metadata about your project while packing the tree in a way that resulted in this data being written to packed tar files headers. When this metadata included cycles, it resulted in the the tar file entering an infinite recursive loop and eventually crashing with a stack overflow. I've patched this by keeping track of your metadata by closing over the variables in question instead, and I've further restricted gathering and tracking the metadata to times when it's actually needed. (Which is only if you need bundled modules.) ([@iarna](https://github.com/iarna)) #### LESS CRASHY ERROR MESSAGES ON BAD PACKAGES * [`829921f`](https://github.com/npm/npm/commit/829921f) [#9741](https://github.com/npm/npm/pull/9741) Packages with invalid names or versions were crashing the installer. These are now captured and warned as was originally intended. ([@iarna](https://github.com/iarna)) #### ONE DEPENDENCY UPDATE * [`963295c`](https://github.com/npm/npm/commit/963295c) `npm-install-checks@2.0.1` ([@iarna](https://github.com/iarna)) #### AND ONE SUBDEPENDENCY * [`448737d`](https://github.com/npm/npm/commit/448737d) `request@2.63.0` ([@simov](https://github.com/simov)) ### v3.3.4 (2015-09-17): This is a relatively quiet release, bringing a few bug fixes and some module updates, plus via the [2.14.5 release](https://github.com/npm/npm/releases/tag/v2.14.5) some forward compatibility fixes with versions of Node that aren't yet released. #### NO BETA NOTICE THIS TIME!! But, EXCITING NEWS FRIENDS, this week marks the exit of `npm@3` from beta. This means that the week of this release, [v3.3.3](https://github.com/npm/npm/releases/tag/v3.3.3) will become `latest` and this version (v3.3.4) will become `next`!! #### CRUFT FOR THE CRUFT GODS What I call "cruft", by which I mean, files sitting around in your `node_modules` folder, will no longer produce warnings in `npm ls` nor during `npm install`. This brings `npm@3`'s behavior in line with `npm@2`. * [`a127801`](https://github.com/npm/npm/commit/a127801) [#9285](https://github.com/npm/npm/pull/9586) Stop warning about cruft in module directories. ([@iarna](https://github.com/iarna)) #### BETTER ERROR MESSAGE * [`95ee92c`](https://github.com/npm/npm/commit/95ee92c) [#9433](https://github.com/npm/npm/issues/9433) Give better error messages for invalid urls in the dependecy list. 
([@jamietre](https://github.com/jamietre)) #### MODULE UPDATES * [`ebb92ca`](https://github.com/npm/npm/commit/ebb92ca) `retry@0.8.0` ([@tim-kos](https://github.com/tim-kos)) * [`55f1285`](https://github.com/npm/npm/commit/55f1285) `normalize-package-data@2.3.4` ([@zkat](https://github.com/zkat)) * [`6d4ebff`](https://github.com/npm/npm/commit/6d4ebff) `sha@2.0.1` ([@ForbesLindesay](https://github.com/ForbesLindesay)) * [`09a9c7a`](https://github.com/npm/npm/commit/09a9c7a) `semver@5.0.3` ([@isaacs](https://github.com/isaacs)) * [`745000f`](https://github.com/npm/npm/commit/745000f) `node-gyp@3.0.3` ([@rvagg](https://github.com/rvagg)) #### SUB DEP MODULE UPDATES * [`578ca25`](https://github.com/npm/npm/commit/578ca25) `request@2.62.0` ([@simov](https://github.com/simov)) * [`1d8996e`](https://github.com/npm/npm/commit/1d8996e) `jju@1.2.1` ([@rlidwka](https://github.com/rlidwka)) * [`6da1ba4`](https://github.com/npm/npm/commit/6da1ba4) `hoek@2.16.2` ([@nlf](https://github.com/nlf)) ### v3.3.3 (2015-09-10): This short week brought us brings us a few small bug fixes, a doc change and a whole lotta dependency updates. Plus, as usual, this includes a forward port of everything in [`npm@2.14.4`](https://github.com/npm/npm/releases/tag/v2.14.4). #### BETA BUT NOT FOREVER **_THIS IS BETA SOFTWARE_**. `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`. We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. #### REMOVE INSTALLED BINARIES ON WINDOWS So waaaay back at the start of August, I fixed a bug with [#9198](https://github.com/npm/npm/pull/9198). That fix made it so that if you had two modules installed that both installed the same binary (eg `gulp` & `gulp-cli`), that removing one wouldn't remove the binary if it was owned by the other. It did this by doing some hocus-pocus that, turns out, was Unix-specific, so on Windows it just threw up its hands and stopped removing installed binaries at all. Not great. So today we're fixing that– it let us maintain the same safety that we added in #9198, but ALSO works with windows. * [`25fbaed`](https://github.com/npm/npm/commit/25fbaed) [#9394](https://github.com/npm/npm/issues/9394) Treat cmd-shims the same way we treat symlinks ([@iarna](https://github.com/iarna)) #### API DOCUMENTATION HAS BEEN SACRIFICED THE API GOD The documentation of the internal APIs of npm is going away, because it would lead people into thinking they should integrate with npm by using it. Please don't do that! In the future, we'd like to give you a suite of stand alone modules that provide better, more stand alone APIs for your applications to build on. But for now, call the npm binary with `process.exec` or `process.spawn` instead. 
* [`2fb60bf`](https://github.com/npm/npm/commit/2fb60bf) Remove misleading API documentation ([@othiym23](https://github.com/othiym23)) #### ALLOW `npm link` ON WINDOWS W/ PRERELEASE VERSIONS OF NODE We never meant to have this be a restriction in the first place and it was only just discovered with the recent node 4.0.0 release candidate. * [`6665e54`](https://github.com/npm/npm/commit/6665e54) [#9505](https://github.com/npm/npm/pull/9505) Allow npm link to run on windows with prerelease versions of node ([@jon-hall](https://github.com/jon-hall)) #### graceful-fs update We're updating all of npm's deps to use the most recent `graceful-fs`. This turns out to be important for future not yet released versions of node, because older versions monkey-patch `fs` in ways that will break in the future. Plus it ALSO makes use of `process.binding` which is an internal API that npm definitely shouldn't have been using. We're not done yet, but this is the bulk of them. * [`e7bc98e`](https://github.com/npm/npm/commit/e7bc98e) `write-file-atomic@1.1.3` ([@iarna](https://github.com/iarna)) * [`7417600`](https://github.com/npm/npm/commit/7417600) `tar@2.2.1` ([@zkat](https://github.com/zkat)) * [`e4e9d40`](https://github.com/npm/npm/commit/e4e9d40) `read-package-json@2.0.1` ([@zkat](https://github.com/zkat)) * [`481611d`](https://github.com/npm/npm/commit/481611d) `read-installed@4.0.3` ([@zkat](https://github.com/zkat)) * [`0dabbda`](https://github.com/npm/npm/commit/0dabbda) `npm-registry-client@7.0.4` ([@zkat](https://github.com/zkat)) * [`c075a91`](https://github.com/npm/npm/commit/c075a91) `fstream@1.0.8` ([@zkat](https://github.com/zkat)) * [`2e4341a`](https://github.com/npm/npm/commit/2e4341a) `fs-write-stream-atomic@1.0.4` ([@zkat](https://github.com/zkat)) * [`18ad16e`](https://github.com/npm/npm/commit/18ad16e) `fs-vacuum@1.2.7` ([@zkat](https://github.com/zkat)) #### DEPENDENCY UPDATES * [`9d6666b`](https://github.com/npm/npm/commit/9d6666b) `node-gyp@3.0.1` ([@rvagg](https://github.com/rvagg)) * [`349c4df`](https://github.com/npm/npm/commit/349c4df) `retry@0.7.0` ([@tim-kos](https://github.com/tim-kos)) * [`f507551`](https://github.com/npm/npm/commit/f507551) `which@1.1.2` ([@isaacs](https://github.com/isaacs)) * [`e5b6743`](https://github.com/npm/npm/commit/e5b6743) `nopt@3.0.4` ([@zkat](https://github.com/zkat)) #### THE DEPENDENCIES OF OUR DEPENDENCIES ARE OUR DEPENDENCIES UPDATES * [`316382d`](https://github.com/npm/npm/commit/316382d) `mime-types@2.1.6` & `mime-db@1.18.0` * [`64b741e`](https://github.com/npm/npm/commit/64b741e) `spdx-correct@1.0.1` * [`fff62ac`](https://github.com/npm/npm/commit/fff62ac) `process-nextick-args@1.0.3` * [`9d6488c`](https://github.com/npm/npm/commit/9d6488c) `cryptiles@2.0.5` * [`1912012`](https://github.com/npm/npm/commit/1912012) `bluebird@2.10.0` * [`4d09402`](https://github.com/npm/npm/commit/4d09402) `readdir-scoped-modules@1.0.2` ### v3.3.2 (2015-09-04): #### PLEASE HOLD FOR THE NEXT AVAILABLE MAINTAINER This is a tiny little maintenance release, both to update dependencies and to keep `npm@3` up to date with changes made to `npm@2`. [@othiym23](https://github.com/othiym23) is putting out this release (again) as his esteemed colleague [@iarna](https://github.com/iarna) finishes relocating herself, her family, and her sizable anime collection all the way across North America. It contains [all the goodies in `npm@2.14.3`](https://github.com/npm/npm/releases/tag/v2.14.3) and one other dependency update. 
#### BETA WARNINGS FOR FUN AND PROFIT **_THIS IS BETA SOFTWARE_**. `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`. We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. That said, it's getting there! It will be leaving beta very soon! #### ONE OTHER DEPENDENCY UPDATE * [`bb5de34`](https://github.com/npm/npm/commit/bb5de3493531228df0bd3f0742d5493c826be6dd) `is-my-json-valid@2.12.2`: Upgrade to a new, modernized version of `json-pointer`. ([@mafintosh](https://github.com/mafintosh)) ### v3.3.1 (2015-08-27): Hi all, this `npm@3` update brings you another round of bug fixes. The headliner here is that `npm update` works again. We're running down the clock on blocker 3.x issues! Shortly after that hits zero we'll be promoting 3.x to latest!! And of course, we have changes that were brought forward from 2.x. Check out the release notes for [2.14.1](https://github.com/npm/npm/releases/tag/v2.14.1) and [2.14.2](https://github.com/npm/npm/releases/tag/v2.14.2). #### BETA WARNINGS FOR FUN AND PROFIT **_THIS IS BETA SOFTWARE_**. `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`. We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. #### NPM UPDATE, NOW AGAIN YOUR FRIEND * [`f130a00`](https://github.com/npm/npm/commit/f130a00) [#9095](https://github.com/npm/npm/issues/9095) `npm update` once again works! Previously, after selecting packages to update, it would then pick the wrong location to run the install from. ([@iarna](https://github.com/iarna)) #### MORE VERBOSING FOR YOUR VERBOSE LIFECYCLES * [`d088b7d`](https://github.com/npm/npm/commit/d088b7d) [#9227](https://github.com/npm/npm/pull/9227) Add some additional logging at the verbose and silly levels when running lifecycle scripts. Hopefully this will make debugging issues with them a bit easier! ([@saper](https://github.com/saper)) #### AND SOME OTHER BUG FIXES… * [`f4a5784`](https://github.com/npm/npm/commit/f4a5784) [#9308](https://github.com/npm/npm/issues/9308) Make fetching metadata for local modules faster! This ALSO means that doing things like running `npm repo` won't build your module and maybe run `prepublish`. 
([@iarna](https://github.com/iarna)) * [`4468c92`](https://github.com/npm/npm/commit/4468c92) [#9205](https://github.com/npm/npm/issues/9205) Fix a bug where local modules would sometimes not resolve relative links using the correct base path. ([@iarna](https://github.com/iarna)) * [`d395a6b`](https://github.com/npm/npm/commit/d395a6b) [#8995](https://github.com/npm/npm/issues/8995) Certain combinations of packages could result in different install orders for their initial installation than for reinstalls run on the same folder. ([@iarna](https://github.com/iarna)) * [`d119ea6`](https://github.com/npm/npm/commit/d119ea6) [#9113](https://github.com/npm/npm/issues/9113) Make extraneous packages _always_ show up in `npm ls`. Previously, if an extraneous package had a dependency that depended back on the original package, this would result in the package not showing up in `ls`. ([@iarna](https://github.com/iarna)) * [`02420dc`](https://github.com/npm/npm/commit/02420dc) [#9113](https://github.com/npm/npm/issues/9113) Stop warning about missing top-level package.json files. Errors in said files will still be reported. ([@iarna](https://github.com/iarna)) #### SOME DEP UPDATES * [`1ed1364`](https://github.com/npm/npm/commit/1ed1364) `rimraf@2.4.3` ([@isaacs](https://github.com/isaacs)) Added EPERM to delay/retry loop * [`e7b8315`](https://github.com/npm/npm/commit/e7b8315) `read@1.0.7` Smaller distribution package, better metadata ([@isaacs](https://github.com/isaacs)) #### SOME DEPS OF DEPS UPDATES * [`b273bcc`](https://github.com/npm/npm/commit/b273bcc) `mime-types@2.1.5` * [`df6e225`](https://github.com/npm/npm/commit/df6e225) `mime-db@1.17.0` * [`785f2ad`](https://github.com/npm/npm/commit/785f2ad) `is-my-json-valid@2.12.1` * [`88170dd`](https://github.com/npm/npm/commit/88170dd) `form-data@1.0.0-rc3` * [`af5357b`](https://github.com/npm/npm/commit/af5357b) `request@2.61.0` * [`337f96a`](https://github.com/npm/npm/commit/337f96a) `chalk@1.1.1` * [`3dfd74d`](https://github.com/npm/npm/commit/3dfd74d) `async@1.4.2` ### v3.3.0 (2015-08-13): This is a pretty EXCITING week. But I may be a little excitable– or possibly sleep deprived, it's sometimes hard to tell them apart. =D So [Kat](https://github.com/zkat) really went the extra mile this week and got the client side support for teams and orgs out in this week's 2.x release. You can't use that just yet, 'cause we have to turn on some server side stuff too, but this way it'll be there for you all the moment we do! Check out the details over in the [2.14.0 release notes](https://github.com/npm/npm/releases/tag/v2.14.0)! But we over here in 3.x ALSO got a new feature this week: check out the new `--only` and `--also` flags for better control over when dev and production dependencies are used by various npm commands. That, and some important bug fixes, round out this week. Enjoy everyone! #### NEVER SHALL NOT BETA THE BETA **_THIS IS BETA SOFTWARE_**. EXCITING NEW BETA WARNING!!! Ok, I fibbed, EXACTLY THE SAME BETA WARNINGS: `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`. We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs.
It's a significant rewrite, so we are _sure_ there are still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. #### ONLY ALSO DEV Hey we've got a SUPER cool new feature for you all, thanks to the fantastic work of [@davglass](https://github.com/davglass) and [@bengl](https://github.com/bengl) we have `--only=prod`, `--only=dev`, `--also=prod` and `--also=dev` options. These apply in various ways to: `npm install`, `npm ls`, `npm outdated` and `npm update`. So for instance:
```
npm install --only=dev
```
Only installs dev dependencies. By contrast:
```
npm install --only=prod
```
Will only install prod dependencies and is very similar to `--production` but differs in that it doesn't set the environment variables that `--production` does. The related new flag, `--also`, is most useful with things like:
```
npm shrinkwrap --also=dev
```
Shrinkwraps don't include dev deps by default, so this replaces passing in `--dev` in that scenario. That leads into the fact that this release deprecates `--dev`, as its semantics across commands were inconsistent and confusing. * [`3ab1eea`](https://github.com/npm/npm/commit/3ab1eea) [#9024](https://github.com/npm/npm/pull/9024) Add support for `--only`, `--also` and deprecate `--dev` ([@bengl](https://github.com/bengl)) #### DON'T TOUCH! THAT'S NOT YOUR BIN * [`b31812e`](https://github.com/npm/npm/commit/b31812e) [#8996](https://github.com/npm/npm/pull/8996) When removing a module that has bin files, if one of the bin files we're going to remove is a symlink to a DIFFERENT module, leave it alone. This only happens when you have two modules that try to provide the same bin. ([@iarna](https://github.com/iarna)) #### THERE'S AN END IN SIGHT * [`d2178a9`](https://github.com/npm/npm/commit/d2178a9) [#9223](https://github.com/npm/npm/pull/9223) Close a bunch of infinite loops that could show up with symlink cycles in your dependencies. ([@iarna](https://github.com/iarna)) #### OOPS DIDN'T MEAN TO FIX THAT Well, not _just_ yet. This was scheduled for next week, but it snuck into 2.x this week. * [`139dd92`](https://github.com/npm/npm/commit/139dd92) [#8716](https://github.com/npm/npm/pull/8716) `npm init` will now only pick up the modules you install, not everything else that got flattened with them. ([@iarna](https://github.com/iarna)) ### v3.2.2 (2015-08-08): Lots of lovely bug fixes for `npm@3`. I'm also suuuuper excited that I think we have a handle on stack explosions that affect a small portion of our users. We also have some tantalizing clues as to where some low-hanging fruit may be for performance issues. And of course, in addition to the `npm@3` specific bug fixes, there are some great ones coming in from `npm@2`! [@othiym23](https://github.com/othiym23) put together that release this week– check out its [release notes](https://github.com/npm/npm/releases/tag/v2.13.4) for the deets. #### AS ALWAYS STILL BETA **_THIS IS BETA SOFTWARE_**. Just like the airline safety announcements, we're not taking this plane off till we finish telling you: `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`.
We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there are still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. #### BUG FIXES * [`a8c8a13`](https://github.com/npm/npm/commit/a8c8a13) [#9050](https://github.com/npm/npm/issues/9050) Resolve peer deps relative to the parent of the requirer ([@iarna](http://github.com/iarna)) * [`05f0226`](https://github.com/npm/npm/commit/05f0226) [#9077](https://github.com/npm/npm/issues/9077) Fix crash when saving `git+ssh` urls ([@iarna](http://github.com/iarna)) * [`e4a3808`](https://github.com/npm/npm/commit/e4a3808) [#8951](https://github.com/npm/npm/issues/8951) Extend our patch that allows `*` to match something when a package only has prerelease versions, so that it applies everywhere and not just the cache. ([@iarna](http://github.com/iarna)) * [`d135abf`](https://github.com/npm/npm/commit/d135abf) [#8871](https://github.com/npm/npm/issues/8871) Don't warn about a missing `package.json` or missing fields in the global install directory. ([@iarna](http://github.com/iarna)) #### DEP VERSION BUMPS * [`990ee4f`](https://github.com/npm/npm/commit/990ee4f) `path-is-inside@1.0.1` ([@domenic](https://github.com/domenic)) * [`1f71ec0`](https://github.com/npm/npm/commit/1f71ec0) `lodash.clonedeep@3.0.2` ([@jdalton](https://github.com/jdalton)) * [`a091354`](https://github.com/npm/npm/commit/a091354) `marked@0.3.5` ([@chjj](https://github.com/chjj)) * [`fc51f28`](https://github.com/npm/npm/commit/fc51f28) `tap@1.3.2` ([@isaacs](https://github.com/isaacs)) * [`3569ec0`](https://github.com/npm/npm/commit/3569ec0) `nock@2.10.0` ([@pgte](https://github.com/pgte)) * [`ad5f6fd`](https://github.com/npm/npm/commit/ad5f6fd) `npm-registry-mock@1.0.1` ([@isaacs](https://github.com/isaacs)) ### v3.2.1 (2015-07-31): #### AN EXTRA QUIET RELEASE A bunch of stuff got deferred for various reasons, which just means more branches to land next week! Don't forget to check out [Kat's 2.x release](https://github.com/npm/npm/releases/tag/v2.13.4) for other quiet goodies. #### AS ALWAYS STILL BETA **_THIS IS BETA SOFTWARE_**. Yes, we're still reminding you of this. No, you can't be excused. `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`. We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there are still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. #### MAKING OUR TESTS TEST THE THING THEY TEST * [`6e53c3d`](https://github.com/npm/npm/commit/6e53c3d) [#8985](https://github.com/npm/npm/pull/8985) Many thanks to @bengl for noticing that one of our tests wasn't testing what it claimed it was testing!
([@bengl](https://github.com/bengl)) #### MY PACKAGE.JSON WAS ALREADY IN THE RIGHT ORDER * [`eb2c7aa`](https://github.com/npm/npm/commit/d00d0f) [#9068](https://github.com/npm/npm/pull/9079) Stop sorting keys in the `package.json` that we haven't edited. Many thanks to [@Qix-](https://github.com/Qix-) for bringing this up and providing a first pass at a patch for this. ([@iarna](https://github.com/iarna)) #### DEV DEP UPDATE * [`555f60c`](https://github.com/npm/npm/commit/555f60c) `marked@0.3.4` ### v3.2.0 (2015-07-24): #### MORE CONFIG, BETTER WINDOWS AND A BUG FIX This is a smallish release with a new config option and some bug fixes. And lots of module updates. #### BETA BETAS ON **_THIS IS BETA SOFTWARE_**. Yes, we're still reminding you of this. No, you can't be excused. `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`. We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there are still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. #### NEW CONFIGS, LESS PROGRESS * [`423d8f7`](https://github.com/npm/npm/commit/423d8f7) [#8704](https://github.com/npm/npm/issues/8704) Add the ability to disable the new progress bar with `--no-progress` ([@iarna](https://github.com/iarna)) #### AND BUG FIXES * [`b3ee452`](https://github.com/npm/npm/commit/b3ee452) [#9038](https://github.com/npm/npm/pull/9038) We previously disabled the use of the new `fs.access` API on Windows, but the bug we were seeing is fixed in `io.js@1.5.0` so we now use `fs.access` if you're using that version or greater. ([@iarna](https://github.com/iarna)) * [`b181fa3`](https://github.com/npm/npm/commit/b181fa3) [#8921](https://github.com/npm/npm/issues/8921) [#8637](https://github.com/npm/npm/issues/8637) Rejigger how we validate modules for install. This allowed us to fix a problem where arch/os checking wasn't being done at all. It also made it easy to add back in a check that declines to install a module in itself unless you force it. ([@iarna](https://github.com/iarna)) #### AND A WHOLE BUNCH OF SUBDEP VERSIONS These are all development dependencies and semver-compatible subdep upgrades, so they should not have visible impact on users.
* [`6b3f6d9`](https://github.com/npm/npm/commit/6b3f6d9) `standard@4.3.3` * [`f4e22e5`](https://github.com/npm/npm/commit/f4e22e5) `readable-stream@2.0.2` (inside concat-stream) * [`f130bfc`](https://github.com/npm/npm/commit/f130bfc) `minimatch@2.0.10` (inside node-gyp's copy of glob) * [`36c6a0d`](https://github.com/npm/npm/commit/36c6a0d) `caseless@0.11.0` * [`80df59c`](https://github.com/npm/npm/commit/80df59c) `chalk@1.1.0` * [`ea935d9`](https://github.com/npm/npm/commit/ea935d9) `bluebird@2.9.34` * [`3588a0c`](https://github.com/npm/npm/commit/3588a0c) `extend@3.0.0` * [`c6a8450`](https://github.com/npm/npm/commit/c6a8450) `form-data@1.0.0-rc2` * [`a04925b`](https://github.com/npm/npm/commit/a04925b) `har-validator@1.8.0` * [`ee7c095`](https://github.com/npm/npm/commit/ee7c095) `has-ansi@2.0.0` * [`944fc34`](https://github.com/npm/npm/commit/944fc34) `hawk@3.1.0` * [`783dc7b`](https://github.com/npm/npm/commit/783dc7b) `lodash._basecallback@3.3.1` * [`acef0fe`](https://github.com/npm/npm/commit/acef0fe) `lodash._baseclone@3.3.0` * [`dfe959a`](https://github.com/npm/npm/commit/dfe959a) `lodash._basedifference@3.0.3` * [`a03bc76`](https://github.com/npm/npm/commit/a03bc76) `lodash._baseflatten@3.1.4` * [`8a07d50`](https://github.com/npm/npm/commit/8a07d50) `lodash._basetostring@3.0.1` * [`7785e3f`](https://github.com/npm/npm/commit/7785e3f) `lodash._baseuniq@3.0.3` * [`826fb35`](https://github.com/npm/npm/commit/826fb35) `lodash._createcache@3.1.2` * [`76030b3`](https://github.com/npm/npm/commit/76030b3) `lodash._createpadding@3.6.1` * [`1a49ec6`](https://github.com/npm/npm/commit/1a49ec6) `lodash._getnative@3.9.1` * [`eebe47f`](https://github.com/npm/npm/commit/eebe47f) `lodash.isarguments@3.0.4` * [`09994d4`](https://github.com/npm/npm/commit/09994d4) `lodash.isarray@3.0.4` * [`b6f8dbf`](https://github.com/npm/npm/commit/b6f8dbf) `lodash.keys@3.1.2` * [`c67dd6b`](https://github.com/npm/npm/commit/c67dd6b) `lodash.pad@3.1.1` * [`4add042`](https://github.com/npm/npm/commit/4add042) `lodash.repeat@3.0.1` * [`e04993c`](https://github.com/npm/npm/commit/e04993c) `lru-cache@2.6.5` * [`2ed7da4`](https://github.com/npm/npm/commit/2ed7da4) `mime-db@1.15.0` * [`ae08244`](https://github.com/npm/npm/commit/ae08244) `mime-types@2.1.3` * [`e71410e`](https://github.com/npm/npm/commit/e71410e) `os-homedir@1.0.1` * [`67c13e0`](https://github.com/npm/npm/commit/67c13e0) `process-nextick-args@1.0.2` * [`12ee041`](https://github.com/npm/npm/commit/12ee041) `qs@4.0.0` * [`15564a6`](https://github.com/npm/npm/commit/15564a6) `spdx-license-ids@1.0.2` * [`8733bff`](https://github.com/npm/npm/commit/8733bff) `supports-color@2.0.0` * [`230943c`](https://github.com/npm/npm/commit/230943c) `tunnel-agent@0.4.1` * [`26a4653`](https://github.com/npm/npm/commit/26a4653) `ansi-styles@2.1.0` * [`3d27081`](https://github.com/npm/npm/commit/3d27081) `bl@1.0.0` * [`9efa110`](https://github.com/npm/npm/commit/9efa110) `async@1.4.0` #### MERGED FORWARD * As usual, we've ported all the `npm@2` goodies in this week's [v2.13.3](https://github.com/npm/npm/releases/tag/v2.13.3) release. ### v3.1.3 (2015-07-17): Rebecca: So Kat, I hear this week's other release uses a dialog between us to explain what changed? Kat: Well, you could say that… Rebecca: I would! This week I fixed more `npm@3` bugs! Kat: That sounds familiar. Rebecca: Eheheheh, well, before we look at those, a word from our sponsor… #### BETA IS AS BETA DOES **_THIS IS BETA SOFTWARE_**. Yes, we're still reminding you of this. No, you can't be excused. 
`npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`. We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there are still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. Rebecca: Ok, enough of the dialoguing, that's Kat's schtick. But do remember kids, betas hide in dark hallways waiting to break your stuff, stuff like… #### SO MANY LINKS YOU COULD MAKE A CHAIN * [`6d69ec9`](https://github.com/npm/npm/commit/6d69ec9) [#8967](https://github.com/npm/npm/issues/8967) Removing a module linked into your globals would result in having all of its subdeps removed. Since the npm release process does exactly this, it burned me -every- -single- -week-. =D While we're here, we also removed extraneous warnings that used to spill out when you'd remove a symlink. ([@iarna](https://github.com/iarna)) * [`fdb360f`](https://github.com/npm/npm/commit/fdb360f) [#8874](https://github.com/npm/npm/issues/8874) Linking scoped modules was failing outright, but this fixes that and updates our tests so we don't do it again. ([@iarna](https://github.com/iarna)) #### WE'LL TRY NOT TO CRACK YOUR WINDOWS * [`9fafb18`](https://github.com/npm/npm/commit/9fafb18) [#8701](https://github.com/npm/npm/issues/8701) `npm@3` introduced permissions checks that run before it actually tries to do something. This saves you from having an install fail halfway through. We did this using the shiny new `fs.access` function available in `node 0.12` and `io.js`, with fallback options for older nodes. Unfortunately the way we implemented the fallback caused racey problems for Windows systems. This fixes that by ensuring we only ever run any one check on a directory once. BUT it turns out there are bugs in `fs.access` on Windows. So this ALSO just disables the use of `fs.access` on Windows entirely until that settles out. ([@iarna](https://github.com/iarna)) #### ZOOM ZOOM, DEP UPDATES * [`5656baa`](https://github.com/npm/npm/commit/5656baa) `gauge@1.2.2`: Better handle terminal resizes while printing the progress bar ([@iarna](https://github.com/iarna)) #### MERGED FORWARD * Check out Kat's [super-fresh release notes for v2.13.2](https://github.com/npm/npm/releases/tag/v2.13.2) and see all the changes we ported from `npm@2`. ### v3.1.2 #### SO VERY BETA RELEASE So, `v3.1.1` managed to actually break installing local modules. And then immediately after I drove to an island for the weekend. 😁 So let's get this fixed outside the usual release train! Fortunately it didn't break installing _global_ modules, so you could swap it out for another version at least. #### DISCLAIMER MEANS WHAT IT SAYS **_THIS IS BETA SOFTWARE_**. Yes, we're still reminding you of this. No, you can't be excused. `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`.
We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there are still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. #### THIS IS IT, THE REASON * [`f5e19df`](https://github.com/npm/npm/commit/f5e19df) [#8893](https://github.com/npm/npm/issues/8893) Fix crash when installing local modules introduced by the fix for [#8608](https://github.com/npm/npm/issues/8608) ([@iarna](https://github.com/iarna)) ### v3.1.1 #### RED EYE RELEASE Rebecca's up too late writing tests, so you can have `npm@3` bug fixes! Lots of great new issues from you all! ❤️️ Keep it up! #### YUP STILL BETA, PLEASE PAY ATTENTION **_THIS IS BETA SOFTWARE_**. Yes, we're still reminding you of this. No, you can't be excused. `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`. We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there are still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. #### BOOGS * [`9badfd6`](https://github.com/npm/npm/commit/9babfd63f19f2d80b2d2624e0963b0bdb0d76ef4) [#8608](https://github.com/npm/npm/issues/8608) Make global installs and uninstalls MUCH faster by only reading the directories of modules referred to by arguments. ([@iarna](https://github.com/iarna)) * [`075a5f0`](https://github.com/npm/npm/commit/075a5f046ab6837f489b08d44cb601e9fdb369b7) [#8660](https://github.com/npm/npm/issues/8660) Failed optional deps would still result in the optional deps' own dependencies being installed. We now find them and fail them out of the tree. ([@iarna](https://github.com/iarna)) * [`c9fbbb5`](https://github.com/npm/npm/commit/c9fbbb540083396ea58fd179d81131d959d8e049) [#8863](https://github.com/npm/npm/issues/8863) The "no compatible version found" error message was including only the version requested, not the name of the package we wanted. Ooops! ([@iarna](https://github.com/iarna)) * [`32e6bbd`](https://github.com/npm/npm/commit/32e6bbd21744dcbe8c0720ab53f60caa7f2a0588) [#8806](https://github.com/npm/npm/issues/8806) The "uninstall" lifecycle was being run after all of a module's dependencies had been removed. This reverses that order-- this means "uninstall" lifecycles can make use of the package's dependencies. ([@iarna](https://github.com/iarna)) #### MERGED FORWARD * Check out the [v2.13.1 release notes](https://github.com/npm/npm/releases/tag/v2.13.1) and see all the changes we ported from `npm@2`. ### v3.1.0 (2015-07-02): This has been a brief week of bug fixes, plus some fun stuff merged forward from this week's 2.x release. See the [2.13.0 release notes](https://github.com/npm/npm/releases/tag/v2.13.0) for details on that.
You all have been AWESOME with [all](https://github.com/npm/npm/milestones/3.x) [the](https://github.com/npm/npm/milestones/3.2.0) `npm@3` bug reports! Thank you and keep up the great work! #### NEW PLACE, SAME CODE Remember how last week we said `npm@3` would go to `3.0-next` and latest tags? Yeaaah, no, please use `npm@v3.x-next` and `npm@v3.x-latest` going forward. I dunno why we said "suuure, we'll never do a feature release till we're out of beta" when we're still forward porting `npm@2.x` features. `¯\_(ツ)_/¯` If you do accidentally use the old tag names, I'll be maintaining them for a few releases, but they won't be around forever. #### YUP STILL BETA, PLEASE PAY ATTENTION **_THIS IS BETA SOFTWARE_**. `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`. We'll _also_ be publishing new releases of `npm@3` as `npm@v3.x-next` and `npm@v3.x-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there are still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. #### BUGS ON THE WINDOWS * [`0030ade`](https://github.com/npm/npm/commit/0030ade) [#8685](https://github.com/npm/npm/issues/8685) Windows would hang when trying to clone git repos ([@euprogramador](https://github.com/euprogramador)) * [`b259bcc`](https://github.com/npm/npm/commit/b259bcc) [#8786](https://github.com/npm/npm/pull/8786) Windows permissions checks would cause installations to fail under some circumstances. We're disabling the checks entirely for this release. I'm hoping to check back with this next week to get a Windows-friendly fix in. ([@iarna](https://github.com/iarna)) #### SO MANY BUGS SQUASHED, JUST CALL US RAID * [`0848698`](https://github.com/npm/npm/commit/0848698) [#8686](https://github.com/npm/npm/pull/8686) Stop leaving progress bar cruft on the screen during publication ([@ajcrites](https://github.com/ajcrites)) * [`57c3cea`](https://github.com/npm/npm/commit/57c3cea) [#8695](https://github.com/npm/npm/pull/8695) Remote packages with shrinkwraps made npm cause node + iojs to explode and catch fire. NO MORE. ([@iarna](https://github.com/iarna)) * [`2875ba3`](https://github.com/npm/npm/commit/2875ba3) [#8723](https://github.com/npm/npm/pull/8723) I uh, told you that engineStrict checking had gone away last week. TURNS OUT I LIED. So this is making that actually be true. ([@iarna](https://github.com/iarna)) * [`28064e5`](https://github.com/npm/npm/commit/28064e5) [#3358](https://github.com/npm/npm/issues/3358) Consistently allow Unicode BOMs at the start of package.json files. Previously this was allowed some of the time, like when you were installing modules, but not others, like running npm version or installing w/ `--save`. ([@iarna](https://github.com/iarna)) * [`3cb6ad2`](https://github.com/npm/npm/commit/3cb6ad2) [#8736](https://github.com/npm/npm/issues/8766) `npm@3` wasn't running the "install" lifecycle in your current (toplevel) module. This broke modules that relied on C compilation. BOO.
([@iarna](https://github.com/iarna)) * [`68da583`](https://github.com/npm/npm/commit/68da583) [#8766](https://github.com/npm/npm/issues/8766) To my great shame, `npm link package` wasn't working AT ALL if you didn't have `package` already installed. ([@iarna](https://github.com/iarna)) * [`edd7448`](https://github.com/npm/npm/commit/edd7448) `read-package-tree@5.0.0`: This update makes read-package-tree not explode when there's bad data in your node_modules folder. `npm@2` silently ignores this sort of thing. ([@iarna](https://github.com/iarna)) * [`0bb08c8`](https://github.com/npm/npm/commit/0bb08c8) [#8778](https://github.com/npm/npm/pull/8778) RELATEDLY, we now show any errors from your node_modules folder as warnings after your installation completes. We're also reporting these in `npm ls` now. ([@iarna](https://github.com/iarna)) * [`6c248ff`](https://github.com/npm/npm/commit/6c248ff) [#8779](https://github.com/npm/npm/pull/8779) Hey, you know how we used to complain if your `package.json` was missing stuff? Well guess what, we are again. I know, I know, you can thank me later. ([@iarna](https://github.com/iarna)) * [`d6f7c98`](https://github.com/npm/npm/commit/d6f7c98) So, when we were rolling back after errors we had untested code that tried to undo moves. Being untested, it turns out it was very broken. I've removed it until we have time to do this right. ([@iarna](https://github.com/iarna)) #### NEW VERSION Just the one. Others came in via the 2.x release. Do check out its changelog, immediately following this message. * [`4e602c5`](https://github.com/npm/npm/commit/4e602c5) `lodash@3.2.2` ### v3.0.0 (2015-06-25): Wow, it's finally here! This has been a long time coming. We are all delighted and proud to be getting this out into the world, and are looking forward to working with the npm user community to get it production-ready as quickly as possible. `npm@3` constitutes a nearly complete rewrite of npm's installer to be easier to maintain, and to bring a bunch of valuable new features and design improvements to you all. [@othiym23](https://github.com/othiym23) and [@isaacs](https://github.com/isaacs) have been [talking about the changes](http://blog.npmjs.org/post/91303926460/npm-cli-roadmap-a-periodic-update) in this release for well over a year, and it's been the primary focus of [@iarna](https://github.com/iarna) since she joined the team. Given that this is a near-total rewrite, all changes listed here are [@iarna](https://github.com/iarna)'s work unless otherwise specified. #### NO, REALLY, READ THIS PARAGRAPH. IT'S THE IMPORTANT ONE. **_THIS IS BETA SOFTWARE_**. `npm@3` will remain in beta until we're confident that it's stable and have assessed the effect of the breaking changes on the community. During that time we will still be doing `npm@2` releases, with `npm@2` tagged as `latest` and `next`. We'll _also_ be publishing new releases of `npm@3` as `npm@3.0-next` and `npm@3.0-latest` alongside those versions until we're ready to switch everyone over to `npm@3`. We need your help to find and fix its remaining bugs. It's a significant rewrite, so we are _sure_ there are still significant bugs remaining. So do us a solid and deploy it in non-critical CI environments and for day-to-day use, but maybe don't use it for production maintenance or frontline continuous deployment just yet. #### BREAKING CHANGES ##### `peerDependencies` `grunt`, `gulp`, and `broccoli` plugin maintainers take note! You will be affected by this change!
* [#6930](https://github.com/npm/npm/issues/6930) ([#6565](https://github.com/npm/npm/issues/6565)) `peerDependencies` no longer cause _anything_ to be implicitly installed. Instead, npm will now warn if a package's `peerDependencies` are missing, but it's up to the consumer of the module (i.e. you) to ensure the peers get installed / are included in `package.json` as direct `dependencies` or `devDependencies` of your package. * [#3803](https://github.com/npm/npm/issues/3803) npm also no longer checks `peerDependencies` until after it has fully resolved the tree. This shifts the responsibility for fulfilling peer dependencies from library / framework / plugin maintainers to application authors, and is intended to get users out of the dependency hell caused by conflicting `peerDependency` constraints. npm's job is to keep you _out_ of dependency hell, not put you in it. ##### `engineStrict` * [#6931](https://github.com/npm/npm/issues/6931) The rarely-used `package.json` option `engineStrict` has been deprecated for several months, producing warnings when it was used. Starting with `npm@3`, the value of the field is ignored, and engine violations will only produce warnings. If you, as a user, want strict `engines` field enforcement, just run `npm config set engine-strict true`. As with the peer dependencies change, this is about shifting control from module authors to application authors. It turns out `engineStrict` was very difficult to understand and even harder to use correctly, and more often than not just made modules using it difficult to deploy. ##### `npm view` * [`77f1aec`](https://github.com/npm/npm/commit/77f1aec) With `npm view` (aka `npm info`), always return arrays for versions, maintainers, etc. Previously npm would return a plain value if there was only one, and multiple values if there were more. ([@KenanY](https://github.com/KenanY)) #### KNOWN BUGS Again, this is a _**BETA RELEASE**_, so not everything is working just yet. Here are the issues that we already know about. If you run into something that isn't on this list, [let us know](https://github.com/npm/npm/issues/new)! * [#8575](https://github.com/npm/npm/issues/8575) Circular deps will never be removed by the prune-on-uninstall code. * [#8588](https://github.com/npm/npm/issues/8588) Local deps where the dep name and the name in the package.json differ don't result in an error. * [#8637](https://github.com/npm/npm/issues/8637) Modules can install themselves as direct dependencies. `npm@2` declined to do this. * [#8660](https://github.com/npm/npm/issues/8660) Dependencies of failed optional dependencies aren't rolled back when the optional dependency is, and then are reported as extraneous thereafter. #### NEW FEATURES ##### The multi-stage installer! * [#5919](https://github.com/npm/npm/issues/5919) Previously the installer had a set of steps it executed for each package and it would immediately start executing them as soon as it decided to act on a package. But now it executes each of those steps at the same time for all packages, waiting for all of one stage to complete before moving on. This eliminates many race conditions and makes the code easier to reason about. This fixes, for instance: * [#6926](https://github.com/npm/npm/issues/6926) ([#5001](https://github.com/npm/npm/issues/5001), [#6170](https://github.com/npm/npm/issues/6170)) `install` and `postinstall` lifecycle scripts now only execute _after_ all of the dependencies of the module with the script are installed. ##### Install: it looks different!
You'll now get a tree much like the one produced by `npm ls` that highlights in orange the packages that were installed. Similarly, any removed packages will have their names prefixed by a `-`. Also, `npm outdated` used to include the name of the module in the `Location` field:
```
Package     Current  Wanted  Latest  Location
deep-equal  MISSING  1.0.0   1.0.0   deep-equal
glob        4.5.3    4.5.3   5.0.10  rimraf > glob
```
Now it shows the module that required it as the final point in the `Location` field:
```
Package     Current  Wanted  Latest  Location
deep-equal  MISSING  1.0.0   1.0.0   npm
glob        4.5.3    4.5.3   5.0.10  npm > rimraf
```
Previously the `Location` field was telling you where the module was on disk. Now it tells you what requires the module. When more than one thing requires the module you'll see it listed once for each thing requiring it. ##### Install: it works different! * [#6928](https://github.com/npm/npm/issues/6928) ([#2931](https://github.com/npm/npm/issues/2931) [#2950](https://github.com/npm/npm/issues/2950)) `npm install` when you have an `npm-shrinkwrap.json` will ensure the modules specified in it are installed in exactly the shape specified, no matter what you had when you started. * [#6913](https://github.com/npm/npm/issues/6913) ([#1341](https://github.com/npm/npm/issues/1341) [#3124](https://github.com/npm/npm/issues/3124) [#4956](https://github.com/npm/npm/issues/4956) [#6349](https://github.com/npm/npm/issues/6349) [#5465](https://github.com/npm/npm/issues/5465)) `npm install` when some of your dependencies are missing sub-dependencies will result in those sub-dependencies being installed. That is, `npm install` now knows how to fix broken installs, most of the time. * [#5465](https://github.com/npm/npm/issues/5465) If you directly `npm install` a module that's already a subdep of something else and your new version is incompatible, it will now install the previous version nested in the things that need it. * [`a2b50cf`](https://github.com/npm/npm/commit/a2b50cf) [#5693](https://github.com/npm/npm/issues/5693) When installing a new module, if it's mentioned in your `npm-shrinkwrap.json` or your `package.json`, npm will use the version specifier from there if you didn't specify one yourself. ##### Flat, flat, flat! Your dependencies will now be installed *maximally flat*. Insofar as is possible, all of your dependencies, and their dependencies, and THEIR dependencies will be installed in your project's `node_modules` folder with no nesting. You'll only see modules nested underneath one another when two (or more) modules have conflicting dependencies. * [#3697](https://github.com/npm/npm/issues/3697) This will hopefully eliminate most cases where Windows users ended up with paths that were too long for Explorer and other standard tools to deal with. * [#6912](https://github.com/npm/npm/issues/6912) ([#4761](https://github.com/npm/npm/issues/4761) [#4037](https://github.com/npm/npm/issues/4037)) This also means that your installs will be deduped from the start. * [#5827](https://github.com/npm/npm/issues/5827) This deduping even extends to git deps. * [#6936](https://github.com/npm/npm/issues/6936) ([#5698](https://github.com/npm/npm/issues/5698)) Various commands are dedupe-aware now. This has some implications for the behavior of other commands: * `npm uninstall` removes any dependencies of the module that you specified that aren't required by any other module. Previously, it would only remove those that happened to be installed under it, resulting in leftover cruft if you'd ever deduped.
* `npm ls` now shows you your dependency tree organized around what requires what, rather than where those modules are on disk. * [#6937](https://github.com/npm/npm/issues/6937) `npm dedupe` now flattens the tree in addition to deduping. And bundling of dependencies when packing or publishing changes too: * [#2442](https://github.com/npm/npm/issues/2442) bundledDependencies no longer requires that you specify deduped sub deps. npm can now see that a dependency is required by something bundled and automatically include it. To put that another way, bundledDependencies should ONLY include things that you included in dependencies, optionalDependencies or devDependencies. * [#5437](https://github.com/npm/npm/issues/5437) When bundling a dependency that's both a `devDependency` and the child of a regular `dependency`, npm bundles the child dependency. As a demonstration of our confidence in our own work, npm's own dependencies are now flattened, deduped, and bundled in the `npm@3` style. This means that `npm@3` can't be packed or published by `npm@2`, which is something to be aware of if you're hacking on npm. ##### Shrinkwraps: they are a-changin'! First of all, they should be idempotent now ([#5779](https://github.com/npm/npm/issues/5779)). No more differences between the first time you install (without `npm-shrinkwrap.json`) and the second time (with it). * [#6781](https://github.com/npm/npm/issues/6781) Second, if you save your changes to `package.json` and you have `npm-shrinkwrap.json`, then it will be updated as well. This applies to all of the commands that update your tree: * `npm install --save` * `npm update --save` * `npm dedupe --save` ([#6410](https://github.com/npm/npm/issues/6410)) * `npm uninstall --save` * [#4944](https://github.com/npm/npm/issues/4944) ([#5161](https://github.com/npm/npm/issues/5161) [#5448](https://github.com/npm/npm/issues/5448)) Third, because `node_modules` folders are now deduped and flat, shrinkwrap has to also be smart enough to handle this. And finally, enjoy this shrinkwrap bug fix: * [#3675](https://github.com/npm/npm/issues/3675) When shrinkwrapping a dependency that's both a `devDependency` and the child of a regular `dependency`, npm now correctly includes the child. ##### The Age of Progress (Bars)! * [#6911](https://github.com/npm/npm/issues/6911) ([#1257](https://github.com/npm/npm/issues/1257) [#5340](https://github.com/npm/npm/issues/5340) [#6420](https://github.com/npm/npm/issues/6420)) The spinner is gone (yay? boo? will you miss it?), and in its place npm has _progress bars_, so you actually have some sense of how long installs will take. It's provided in Unicode and non-Unicode variants, and Unicode support is automatically detected from your environment. #### TINY JEWELS The bottom is where we usually hide the less interesting bits of each release, but each of these is a small but incredibly useful bit of this release, and very much worth checking out: * [`9ebe312`](https://github.com/npm/npm/commit/9ebe312) Build system maintainers, rejoice: npm does a better job of cleaning up after itself in your temporary folder. * [#6942](https://github.com/npm/npm/issues/6942) Check for permissions issues prior to actually trying to install anything. * Emit warnings at the end of the installation when possible, so that they'll be on your screen when npm stops. * [#3505](https://github.com/npm/npm/issues/3505) `npm --dry-run`: You can now ask that npm only report what it _would have done_ with the new `--dry-run` flag.
This can be passed to any of the commands that change your `node_modules` folder: `install`, `uninstall`, `update` and `dedupe`. * [`81b46fb`](https://github.com/npm/npm/commit/81b46fb) npm now knows the correct URLs for `npm bugs` and `npm repo` for repositories hosted on Bitbucket and GitLab, just like it does for GitHub (and GitHub support now extends to projects hosted as gists as well as traditional repositories). * [`5be4008a`](https://github.com/npm/npm/commit/5be4008a09730cfa3891d9f145e4ec7f2accd144) npm has been cleaned up to pass the [`standard`](http://npm.im/standard) style checker. Forrest and Rebecca both feel this makes it easier to read and understand the code, and should also make it easier for new contributors to put together merge-ready patches. ([@othiym23](https://github.com/othiym23)) #### ZARRO BOOGS * [`6401643`](https://github.com/npm/npm/commit/6401643) Make sure the global install directory exists before installing to it. ([@thefourtheye](https://github.com/thefourtheye)) * [#6158](https://github.com/npm/npm/issues/6158) When we remove modules we do so inside-out, running unbuild for each one. * [`960a765`](https://github.com/npm/npm/commit/960a765) The short usage information for each subcommand has been brought in sync with the documentation. ([@smikes](https://github.com/smikes)) npm_3.5.2.orig/CONTRIBUTING.md0000644000000000000000000000076712631326456013725 0ustar 00000000000000## Before you submit a new issue * Check if there's a simple solution in the [Troubleshooting](https://github.com/npm/npm/wiki/Troubleshooting) wiki. * [Search for similar issues](https://github.com/npm/npm/search?q=Similar%20issues&type=Issues). * Ensure your new issue conforms to the [Contributing Guidelines](https://github.com/npm/npm/wiki/Contributing-Guidelines). Participation in this open source project is subject to the [npm Code of Conduct](http://www.npmjs.com/policies/conduct). npm_3.5.2.orig/LICENSE0000644000000000000000000002301612631326456012501 0ustar 00000000000000The npm application Copyright (c) npm, Inc. and Contributors Licensed on the terms of The Artistic License 2.0 Node package dependencies of the npm application Copyright (c) their respective copyright owners Licensed on their respective license terms The npm public registry at https://registry.npmjs.org and the npm website at https://www.npmjs.com Operated by npm, Inc. Use governed by terms published on https://www.npmjs.com "Node.js" Trademark Joyent, Inc., https://joyent.com Neither npm nor npm, Inc. are affiliated with Joyent, Inc. The Node.js application Project of Node Foundation, https://nodejs.org The npm Logo Copyright (c) Mathias Pettersson and Brian Hammond "Gubblebum Blocky" typeface Copyright (c) Tjarda Koster, https://jelloween.deviantart.com Used with permission -------- The Artistic License 2.0 Copyright (c) 2000-2006, The Perl Foundation. Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. Preamble This license establishes the terms under which a given free software Package may be copied, modified, distributed, and/or redistributed. The intent is that the Copyright Holder maintains some artistic control over the development of that Package while still keeping the Package available as open source and free software. You are always permitted to make arrangements wholly outside of this license directly with the Copyright Holder of a given Package.
If the terms of this license do not permit the full use that you propose to make of the Package, you should contact the Copyright Holder and seek a different licensing arrangement. Definitions "Copyright Holder" means the individual(s) or organization(s) named in the copyright notice for the entire Package. "Contributor" means any party that has contributed code or other material to the Package, in accordance with the Copyright Holder's procedures. "You" and "your" means any person who would like to copy, distribute, or modify the Package. "Package" means the collection of files distributed by the Copyright Holder, and derivatives of that collection and/or of those files. A given Package may consist of either the Standard Version, or a Modified Version. "Distribute" means providing a copy of the Package or making it accessible to anyone else, or in the case of a company or organization, to others outside of your company or organization. "Distributor Fee" means any fee that you charge for Distributing this Package or providing support for this Package to another party. It does not mean licensing fees. "Standard Version" refers to the Package if it has not been modified, or has been modified only in ways explicitly requested by the Copyright Holder. "Modified Version" means the Package, if it has been changed, and such changes were not explicitly requested by the Copyright Holder. "Original License" means this Artistic License as Distributed with the Standard Version of the Package, in its current version or as it may be modified by The Perl Foundation in the future. "Source" form means the source code, documentation source, and configuration files for the Package. "Compiled" form means the compiled bytecode, object code, binary, or any other form resulting from mechanical transformation or translation of the Source form. Permission for Use and Modification Without Distribution (1) You are permitted to use the Standard Version and create and use Modified Versions for any purpose without restriction, provided that you do not Distribute the Modified Version. Permissions for Redistribution of the Standard Version (2) You may Distribute verbatim copies of the Source form of the Standard Version of this Package in any medium without restriction, either gratis or for a Distributor Fee, provided that you duplicate all of the original copyright notices and associated disclaimers. At your discretion, such verbatim copies may or may not include a Compiled form of the Package. (3) You may apply any bug fixes, portability changes, and other modifications made available from the Copyright Holder. The resulting Package will still be considered the Standard Version, and as such will be subject to the Original License. Distribution of Modified Versions of the Package as Source (4) You may Distribute your Modified Version as Source (either gratis or for a Distributor Fee, and with or without a Compiled form of the Modified Version) provided that you clearly document how it differs from the Standard Version, including, but not limited to, documenting any non-standard features, executables, or modules, and provided that you do at least ONE of the following: (a) make the Modified Version available to the Copyright Holder of the Standard Version, under the Original License, so that the Copyright Holder may include your modifications in the Standard Version. (b) ensure that installation of your Modified Version does not prevent the user installing or running the Standard Version. 
In addition, the Modified Version must bear a name that is different from the name of the Standard Version. (c) allow anyone who receives a copy of the Modified Version to make the Source form of the Modified Version available to others under (i) the Original License or (ii) a license that permits the licensee to freely copy, modify and redistribute the Modified Version using the same licensing terms that apply to the copy that the licensee received, and requires that the Source form of the Modified Version, and of any works derived from it, be made freely available in that license fees are prohibited but Distributor Fees are allowed. Distribution of Compiled Forms of the Standard Version or Modified Versions without the Source (5) You may Distribute Compiled forms of the Standard Version without the Source, provided that you include complete instructions on how to get the Source of the Standard Version. Such instructions must be valid at the time of your distribution. If these instructions, at any time while you are carrying out such distribution, become invalid, you must provide new instructions on demand or cease further distribution. If you provide valid instructions or cease distribution within thirty days after you become aware that the instructions are invalid, then you do not forfeit any of your rights under this license. (6) You may Distribute a Modified Version in Compiled form without the Source, provided that you comply with Section 4 with respect to the Source of the Modified Version. Aggregating or Linking the Package (7) You may aggregate the Package (either the Standard Version or Modified Version) with other packages and Distribute the resulting aggregation provided that you do not charge a licensing fee for the Package. Distributor Fees are permitted, and licensing fees for other components in the aggregation are permitted. The terms of this license apply to the use and Distribution of the Standard or Modified Versions as included in the aggregation. (8) You are permitted to link Modified and Standard Versions with other works, to embed the Package in a larger work of your own, or to build stand-alone binary or bytecode versions of applications that include the Package, and Distribute the result without restriction, provided the result does not expose a direct interface to the Package. Items That are Not Considered Part of a Modified Version (9) Works (including, but not limited to, modules and scripts) that merely extend or make use of the Package, do not, by themselves, cause the Package to be a Modified Version. In addition, such works are not considered parts of the Package itself, and are not subject to the terms of this license. General Provisions (10) Any use, modification, and distribution of the Standard or Modified Versions is governed by this Artistic License. By using, modifying or distributing the Package, you accept this license. Do not use, modify, or distribute the Package, if you do not accept this license. (11) If your Modified Version has been derived from a Modified Version made by someone other than you, you are nevertheless required to ensure that your Modified Version complies with the requirements of this license. (12) This license does not grant you the right to use any trademark, service mark, tradename, or logo of the Copyright Holder. 
(13) This license includes the non-exclusive, worldwide, free-of-charge patent license to make, have made, use, offer to sell, sell, import and otherwise transfer the Package with respect to any patent claims licensable by the Copyright Holder that are necessarily infringed by the Package. If you institute patent litigation (including a cross-claim or counterclaim) against any party alleging that the Package constitutes direct or contributory patent infringement, then this Artistic License to you shall terminate on the date that such litigation is filed. (14) Disclaimer of Warranty: THE PACKAGE IS PROVIDED BY THE COPYRIGHT HOLDER AND CONTRIBUTORS "AS IS' AND WITHOUT ANY EXPRESS OR IMPLIED WARRANTIES. THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR NON-INFRINGEMENT ARE DISCLAIMED TO THE EXTENT PERMITTED BY YOUR LOCAL LAW. UNLESS REQUIRED BY LAW, NO COPYRIGHT HOLDER OR CONTRIBUTOR WILL BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, OR CONSEQUENTIAL DAMAGES ARISING IN ANY WAY OUT OF THE USE OF THE PACKAGE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. -------- npm_3.5.2.orig/Makefile0000644000000000000000000001125312631326456013134 0ustar 00000000000000# vim: set softtabstop=2 shiftwidth=2: SHELL = bash PUBLISHTAG = $(shell node scripts/publish-tag.js) BRANCH = $(shell git rev-parse --abbrev-ref HEAD) markdowns = $(shell find doc -name '*.md' | grep -v 'index') README.md html_docdeps = html/dochead.html \ html/docfoot.html \ scripts/doc-build.sh \ package.json cli_mandocs = $(shell find doc/cli -name '*.md' \ |sed 's|.md|.1|g' \ |sed 's|doc/cli/|man/man1/|g' ) \ man/man1/npm-README.1 files_mandocs = $(shell find doc/files -name '*.md' \ |sed 's|.md|.5|g' \ |sed 's|doc/files/|man/man5/|g' ) \ man/man5/npm-json.5 \ man/man5/npm-global.5 misc_mandocs = $(shell find doc/misc -name '*.md' \ |sed 's|.md|.7|g' \ |sed 's|doc/misc/|man/man7/|g' ) \ man/man7/npm-index.7 cli_htmldocs = $(shell find doc/cli -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/cli/|html/doc/cli/|g' ) \ html/doc/README.html files_htmldocs = $(shell find doc/files -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/files/|html/doc/files/|g' ) \ html/doc/files/npm-json.html \ html/doc/files/npm-global.html misc_htmldocs = $(shell find doc/misc -name '*.md' \ |sed 's|.md|.html|g' \ |sed 's|doc/misc/|html/doc/misc/|g' ) \ html/doc/index.html mandocs = $(cli_mandocs) $(files_mandocs) $(misc_mandocs) htmldocs = $(cli_htmldocs) $(files_htmldocs) $(misc_htmldocs) all: doc latest: @echo "Installing latest published npm" @echo "Use 'make install' or 'make link' to install the code" @echo "in this folder that you're looking at right now." node cli.js install -g -f npm ${NPMOPTS} install: all node cli.js install -g -f ${NPMOPTS} # backwards compat dev: install link: uninstall node cli.js link -f clean: markedclean marked-manclean doc-clean uninstall rm -rf npmrc node cli.js cache clean uninstall: node cli.js rm npm -g -f doc: $(mandocs) $(htmldocs) markedclean: rm -rf node_modules/marked node_modules/.bin/marked .building_marked marked-manclean: rm -rf node_modules/marked-man node_modules/.bin/marked-man .building_marked-man docclean: doc-clean doc-clean: rm -rf \ .building_marked \ .building_marked-man \ html/doc \ man # use `npm install marked-man` for this to work. 
man/man1/npm-README.1: README.md scripts/doc-build.sh package.json @[ -d man/man1 ] || mkdir -p man/man1 scripts/doc-build.sh $< $@ man/man1/%.1: doc/cli/%.md scripts/doc-build.sh package.json @[ -d man/man1 ] || mkdir -p man/man1 scripts/doc-build.sh $< $@ man/man5/npm-json.5: man/man5/package.json.5 cp $< $@ man/man5/npm-global.5: man/man5/npm-folders.5 cp $< $@ man/man5/%.5: doc/files/%.md scripts/doc-build.sh package.json @[ -d man/man5 ] || mkdir -p man/man5 scripts/doc-build.sh $< $@ doc/misc/npm-index.md: scripts/index-build.js package.json node scripts/index-build.js > $@ html/doc/index.html: doc/misc/npm-index.md $(html_docdeps) @[ -d html/doc ] || mkdir -p html/doc scripts/doc-build.sh $< $@ man/man7/%.7: doc/misc/%.md scripts/doc-build.sh package.json @[ -d man/man7 ] || mkdir -p man/man7 scripts/doc-build.sh $< $@ html/doc/README.html: README.md $(html_docdeps) @[ -d html/doc ] || mkdir -p html/doc scripts/doc-build.sh $< $@ html/doc/cli/%.html: doc/cli/%.md $(html_docdeps) @[ -d html/doc/cli ] || mkdir -p html/doc/cli scripts/doc-build.sh $< $@ html/doc/files/npm-json.html: html/doc/files/package.json.html cp $< $@ html/doc/files/npm-global.html: html/doc/files/npm-folders.html cp $< $@ html/doc/files/%.html: doc/files/%.md $(html_docdeps) @[ -d html/doc/files ] || mkdir -p html/doc/files scripts/doc-build.sh $< $@ html/doc/misc/%.html: doc/misc/%.md $(html_docdeps) @[ -d html/doc/misc ] || mkdir -p html/doc/misc scripts/doc-build.sh $< $@ marked: node_modules/.bin/marked node_modules/.bin/marked: node cli.js install marked --no-global marked-man: node_modules/.bin/marked-man node_modules/.bin/marked-man: node cli.js install marked-man --no-global doc: man man: $(cli_docs) test: doc node cli.js test tag: npm tag npm@$(PUBLISHTAG) latest publish: link doc @git push origin :v$(shell npm -v) 2>&1 || true git clean -fd &&\ git push origin $(BRANCH) &&\ git push origin --tags &&\ npm publish --tag=$(PUBLISHTAG) release: @bash scripts/release.sh sandwich: @[ $$(whoami) = "root" ] && (echo "ok"; echo "ham" > sandwich) || (echo "make it yourself" && exit 13) .PHONY: all latest install dev link doc clean uninstall test man doc-clean docclean release npm_3.5.2.orig/README.md0000644000000000000000000001477312631326456012765 0ustar 00000000000000npm(1) -- a JavaScript package manager ============================== [![Build Status](https://img.shields.io/travis/npm/npm/master.svg)](https://travis-ci.org/npm/npm) ## SYNOPSIS This is just enough info to get you up and running. Much more info available via `npm help` once it's installed. ## IMPORTANT **You need node v0.8 or higher to run this program.** To install an old **and unsupported** version of npm that works on node 0.3 and prior, clone the git repo and dig through the old tags and branches. **npm is configured to use npm, Inc.'s public package registry at <https://registry.npmjs.org> by default.** You can configure npm to use any compatible registry you like, and even run your own registry. Check out the [doc on registries](https://docs.npmjs.com/misc/registry). Use of someone else's registry may be governed by terms of use. The terms of use for the default public registry are available at <https://www.npmjs.com/policies/terms>. ## Super Easy Install npm is bundled with [node](http://nodejs.org/download/). ### Windows Computers [Get the MSI](http://nodejs.org/download/). npm is in it. ### Apple Macintosh Computers [Get the pkg](http://nodejs.org/download/). npm is in it. ### Other Sorts of Unices Run `make install`. npm will be installed with node.
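For instance, a from-source install on one of those Unices might look like the sketch below. It assumes git, make, and node are already on your PATH, and that a global install via `sudo` suits your setup; adjust to taste.
```sh
# Build and install npm from a source checkout.
git clone https://github.com/npm/npm.git
cd npm
# Per the Makefile above, the install target runs `node cli.js install -g -f`.
sudo make install
```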
If you want a more fancy pants install (a different version, customized paths, etc.) then read on. ## Fancy Install (Unix) There's a pretty robust install script at <https://www.npmjs.com/install.sh>. You can download that and run it. Here's an example using curl: ```sh curl -L https://www.npmjs.com/install.sh | sh ``` ### Slightly Fancier You can set any npm configuration params with that script: ```sh npm_config_prefix=/some/path sh install.sh ``` Or, you can run it in uber-debuggery mode: ```sh npm_debug=1 sh install.sh ``` ### Even Fancier Get the code with git. Use `make` to build the docs and do other stuff. If you plan on hacking on npm, `make link` is your friend. If you've got the npm source code, you can also semi-permanently set arbitrary config keys using the `./configure --key=val ...`, and then run npm commands by doing `node cli.js <cmd>`. (This is helpful for testing, or running stuff without actually installing npm itself.) ## Windows Install or Upgrade You can download a zip file from <https://github.com/npm/npm/releases/latest>, and unpack it in the `node_modules\npm\` folder inside node's installation folder. To upgrade to npm 2, follow the Windows upgrade instructions in the npm Troubleshooting Guide: <https://github.com/npm/npm/wiki/Troubleshooting#upgrading-on-windows> If that's not fancy enough for you, then you can fetch the code with git, and mess with it directly. ## Installing on Cygwin No. ## Uninstalling So sad to see you go. ```sh sudo npm uninstall npm -g ``` Or, if that fails, ```sh sudo make uninstall ``` ## More Severe Uninstalling Usually, the above instructions are sufficient. That will remove npm, but leave behind anything you've installed. If you would like to remove all the packages that you have installed, then you can use the `npm ls` command to find them, and then `npm rm` to remove them. To remove cruft left behind by npm 0.x, you can use the included `clean-old.sh` script file. You can run it conveniently like this: ```sh npm explore npm -g -- sh scripts/clean-old.sh ``` npm uses two configuration files, one for per-user configs, and another for global (every-user) configs. You can view them by doing: ```sh npm config get userconfig # defaults to ~/.npmrc npm config get globalconfig # defaults to /usr/local/etc/npmrc ``` Uninstalling npm does not remove configuration files by default. You must remove them manually if you want them gone. Note that this means that future npm installs will not remember the settings that you have chosen. ## Using npm Programmatically Although npm can be used programmatically, its API is meant for use by the CLI *only*, and no guarantees are made regarding its fitness for any other purpose. If you want to use npm to reliably perform some task, the safest thing to do is to invoke the desired `npm` command with appropriate arguments. The semantic version of npm refers to the CLI itself, rather than the underlying API. _The internal API is not guaranteed to remain stable even when npm's version indicates no breaking changes have been made according to semver._ If you _still_ would like to use npm programmatically, it's _possible_. The API isn't very well documented, but it _is_ rather simple. Eventually, npm will be just a thin CLI wrapper around the modules that it depends on, but for now, there are some things that only the CLI can do. You should try using one of npm's dependencies first, and only use the API if what you're trying to do is only supported by npm itself.
```javascript var npm = require("npm") npm.load(myConfigObject, function (er) { if (er) return handleError(er) npm.commands.install(["some", "args"], function (er, data) { if (er) return commandFailed(er) // command succeeded, and data might have some info }) npm.registry.log.on("log", function (message) { .... }) }) ``` The `load` function takes an object hash of the command-line configs. The various `npm.commands.<cmd>` functions take an **array** of positional argument **strings**. The last argument to any `npm.commands.<cmd>` function is a callback. Some commands take other optional arguments. Read the source. You cannot set configs individually for any single npm function at this time. Since `npm` is a singleton, any call to `npm.config.set` will change the value for *all* npm commands in that process (see the short sketch below). See `./bin/npm-cli.js` for an example of pulling config values off of the command line arguments using nopt. You may also want to check out `npm help config` to learn about all the options you can set there. ## More Docs Check out the [docs](https://docs.npmjs.com/), especially the [faq](https://docs.npmjs.com/misc/faq). You can use the `npm help` command to read any of them. If you're a developer, and you want to use npm to publish your program, you should [read this](https://docs.npmjs.com/misc/developers). ## BUGS When you find issues, please report them: * web: <https://github.com/npm/npm/issues> Be sure to include *all* of the output from the npm command that didn't work as expected. The `npm-debug.log` file is also helpful to provide. You can also look for isaacs in #node.js on irc://irc.freenode.net. He will no doubt tell you to put the output in a gist or email. ## SEE ALSO * npm(1) * npm-faq(7) * npm-help(1) * npm-index(7) npm_3.5.2.orig/bin/0000755000000000000000000000000012631326456012242 5ustar 00000000000000npm_3.5.2.orig/changelogs/0000755000000000000000000000000012631326456013604 5ustar 00000000000000npm_3.5.2.orig/cli.js0000755000000000000000000000006012631326456012576 0ustar 00000000000000#!/usr/bin/env node require('./bin/npm-cli.js') npm_3.5.2.orig/configure0000755000000000000000000000101112631326456013372 0ustar 00000000000000#!/bin/bash # set configurations that will be "sticky" on this system, # surviving npm self-updates. CONFIGS=() i=0 # get the location of this file. unset CDPATH CONFFILE=$(cd $(dirname "$0"); pwd -P)/npmrc while [ $# -gt 0 ]; do conf="$1" case $conf in --help) echo "./configure --param=value ..." exit 0 ;; --*) CONFIGS[$i]="${conf:2}" ;; *) CONFIGS[$i]="$conf" ;; esac let i++ shift done for c in "${CONFIGS[@]}"; do echo "$c" >> "$CONFFILE" done npm_3.5.2.orig/debian/0000755000000000000000000000000012631326456012714 5ustar 00000000000000npm_3.5.2.orig/doc/0000755000000000000000000000000012631326456012237 5ustar 00000000000000npm_3.5.2.orig/html/0000755000000000000000000000000012631326456012436 5ustar 00000000000000npm_3.5.2.orig/lib/0000755000000000000000000000000012631326456012240 5ustar 00000000000000npm_3.5.2.orig/make.bat0000644000000000000000000000023412631326456013076 0ustar 00000000000000:: The tests run "make doc" in the prepublish script, :: so this file gives windows something that'll exit :: successfully, without having to install make.
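The README's programmatic-usage section above notes that `npm` is a singleton, so `npm.config.set` affects every later command in the process. Here is a minimal sketch of that behavior, using only the `npm.load`/`npm.commands`/`npm.config` entry points the README itself documents; the config keys and the `ls` invocation are purely illustrative:

```javascript
var npm = require('npm')

// load() must finish before any commands or config calls are made.
npm.load({ loglevel: 'silent' }, function (er) {
  if (er) return console.error(er)

  // npm is a singleton: this changes the registry for *every*
  // subsequent command run in this process, not just one call.
  npm.config.set('registry', 'https://registry.npmjs.org/')

  // Commands take an array of string arguments plus a callback.
  npm.commands.ls([], function (er, data) {
    if (er) return console.error(er)
  })
})
```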
npm_3.5.2.orig/node_modules/0000755000000000000000000000000012631326456014147 5ustar 00000000000000npm_3.5.2.orig/package.json0000644000000000000000000001224412631326456013763 0ustar 00000000000000{ "version": "3.5.2", "name": "npm", "description": "a package manager for JavaScript", "keywords": [ "install", "modules", "package manager", "package.json" ], "preferGlobal": true, "config": { "publishtest": false }, "homepage": "https://docs.npmjs.com/", "author": "Isaac Z. Schlueter (http://blog.izs.me)", "repository": { "type": "git", "url": "https://github.com/npm/npm" }, "bugs": { "url": "http://github.com/npm/npm/issues" }, "directories": { "bin": "./bin", "doc": "./doc", "lib": "./lib", "man": "./man" }, "main": "./lib/npm.js", "bin": "./bin/npm-cli.js", "dependencies": { "abbrev": "~1.0.7", "ansicolors": "~0.3.2", "ansistyles": "~0.1.3", "aproba": "~1.0.1", "archy": "~1.0.0", "async-some": "~1.0.2", "chownr": "~1.0.1", "cmd-shim": "~2.0.1", "columnify": "~1.5.2", "config-chain": "~1.1.9", "dezalgo": "~1.0.3", "editor": "~1.0.0", "fs-vacuum": "~1.2.7", "fs-write-stream-atomic": "~1.0.5", "fstream": "~1.0.8", "fstream-npm": "~1.0.7", "glob": "~5.0.15", "graceful-fs": "~4.1.2", "has-unicode": "~1.0.1", "hosted-git-info": "~2.1.4", "iferr": "~0.1.5", "inflight": "~1.0.4", "inherits": "~2.0.1", "ini": "~1.3.4", "init-package-json": "~1.9.1", "lockfile": "~1.0.1", "lodash.clonedeep": "~3.0.2", "lodash.union": "~3.1.0", "lodash.uniq": "~3.2.2", "lodash.without": "~3.2.1", "mkdirp": "~0.5.1", "node-gyp": "~3.2.1", "nopt": "~3.0.4", "normalize-git-url": "~3.0.1", "normalize-package-data": "~2.3.5", "npm-cache-filename": "~1.0.2", "npm-install-checks": "~2.0.1", "npm-package-arg": "~4.1.0", "npm-registry-client": "~7.0.9", "npm-user-validate": "~0.1.2", "npmlog": "~2.0.0", "once": "~1.3.2", "opener": "~1.4.1", "osenv": "~0.1.3", "path-is-inside": "~1.0.1", "read": "~1.0.7", "read-cmd-shim": "~1.0.1", "read-installed": "~4.0.3", "read-package-json": "~2.0.2", "read-package-tree": "~5.1.2", "realize-package-specifier": "~3.0.1", "retry": "~0.8.0", "rimraf": "~2.4.3", "semver": "~5.0.3", "sha": "~2.0.1", "slide": "~1.1.6", "sorted-object": "~1.0.0", "tar": "~2.2.1", "text-table": "~0.2.0", "uid-number": "0.0.6", "umask": "~1.1.0", "unique-filename": "~1.1.0", "unpipe": "~1.0.0", "validate-npm-package-name": "~2.2.2", "which": "~1.2.0", "wrappy": "~1.0.1", "write-file-atomic": "~1.1.4" }, "bundleDependencies": [ "abbrev", "ansi-regex", "ansicolors", "ansistyles", "aproba", "archy", "async-some", "chownr", "cmd-shim", "columnify", "config-chain", "debuglog", "dezalgo", "editor", "fs-vacuum", "fs-write-stream-atomic", "fstream", "fstream-npm", "glob", "graceful-fs", "has-unicode", "hosted-git-info", "iferr", "imurmurhash", "inflight", "inherits", "ini", "init-package-json", "lockfile", "lodash._baseindexof", "lodash._baseuniq", "lodash._bindcallback", "lodash._cacheindexof", "lodash._createcache", "lodash._getnative", "lodash.clonedeep", "lodash.isarguments", "lodash.isarray", "lodash.keys", "lodash.restparam", "lodash.union", "lodash.uniq", "lodash.without", "mkdirp", "node-gyp", "nopt", "normalize-git-url", "normalize-package-data", "npm-cache-filename", "npm-install-checks", "npm-package-arg", "npm-registry-client", "npm-user-validate", "npmlog", "once", "opener", "osenv", "path-is-inside", "read", "read-cmd-shim", "read-installed", "read-package-json", "read-package-tree", "readdir-scoped-modules", "realize-package-specifier", "request", "retry", "rimraf", "semver", "sha", "slide", "sorted-object", 
"strip-ansi", "tar", "text-table", "uid-number", "umask", "unique-filename", "unpipe", "validate-npm-package-license", "validate-npm-package-name", "which", "wrappy", "write-file-atomic" ], "devDependencies": { "deep-equal": "~1.0.1", "marked": "~0.3.5", "marked-man": "~0.1.5", "nock": "~1.9.0", "npm-registry-couchapp": "~2.6.11", "npm-registry-mock": "~1.0.1", "readable-stream": "~2.0.2", "require-inject": "~1.3.0", "sprintf-js": "~1.0.3", "standard": "~5.3.1", "tap": "~2.2.0" }, "scripts": { "dumpconf": "env | grep npm | sort | uniq", "prepublish": "node bin/npm-cli.js prune --prefix=. --no-global && rimraf test/*/*/node_modules && make -j4 doc", "preversion": "bash scripts/update-authors.sh && git add AUTHORS && git commit -m \"update AUTHORS\" || true", "tap": "tap --timeout 240", "test": "standard && npm run test-tap", "test-all": "standard && npm run test-legacy && npm run test-tap", "test-legacy": "node ./test/run.js", "test-tap": "npm run tap -- \"test/tap/*.js\"", "test-node": "\"$NODE\" ./test/run.js && \"$NODE\" \"node_modules/.bin/tap\" --timeout 240 \"test/tap/*.js\"" }, "license": "Artistic-2.0" } npm_3.5.2.orig/scripts/0000755000000000000000000000000012631326456013161 5ustar 00000000000000npm_3.5.2.orig/test/0000755000000000000000000000000012631326456012451 5ustar 00000000000000npm_3.5.2.orig/bin/node-gyp-bin/0000755000000000000000000000000012631326456014532 5ustar 00000000000000npm_3.5.2.orig/bin/npm0000755000000000000000000000112412631326456012760 0ustar 00000000000000#!/bin/sh (set -o igncr) 2>/dev/null && set -o igncr; # cygwin encoding fix basedir=`dirname "$0"` case `uname` in *CYGWIN*) basedir=`cygpath -w "$basedir"`;; esac NODE_EXE="$basedir/node.exe" if ! [ -x "$NODE_EXE" ]; then NODE_EXE=node fi NPM_CLI_JS="$basedir/node_modules/npm/bin/npm-cli.js" case `uname` in *CYGWIN*) NPM_PREFIX=`"$NODE_EXE" "$NPM_CLI_JS" prefix -g` NPM_PREFIX_NPM_CLI_JS="$NPM_PREFIX/node_modules/npm/bin/npm-cli.js" if [ -f "$NPM_PREFIX_NPM_CLI_JS" ]; then NPM_CLI_JS="$NPM_PREFIX_NPM_CLI_JS" fi ;; esac "$NODE_EXE" "$NPM_CLI_JS" "$@" npm_3.5.2.orig/bin/npm-cli.js0000755000000000000000000000371512631326456014150 0ustar 00000000000000#!/usr/bin/env node ;(function () { // wrapper in case we're in module_context mode // windows: running "npm blah" in this folder will invoke WSH, not node. /*global WScript*/ if (typeof WScript !== 'undefined') { WScript.echo( 'npm does not work when run\n' + 'with the Windows Scripting Host\n\n' + "'cd' to a different directory,\n" + "or type 'npm.cmd ',\n" + "or type 'node npm '." ) WScript.quit(1) return } process.title = 'npm' var log = require('npmlog') log.pause() // will be unpaused when config is loaded. log.info('it worked if it ends with', 'ok') var path = require('path') var npm = require('../lib/npm.js') var npmconf = require('../lib/config/core.js') var errorHandler = require('../lib/utils/error-handler.js') var configDefs = npmconf.defs var shorthands = configDefs.shorthands var types = configDefs.types var nopt = require('nopt') // if npm is called as "npmg" or "npm_g", then // run in global mode. 
if (path.basename(process.argv[1]).slice(-1) === 'g') { process.argv.splice(1, 1, 'npm', '-g') } log.verbose('cli', process.argv) var conf = nopt(types, shorthands) npm.argv = conf.argv.remain if (npm.deref(npm.argv[0])) npm.command = npm.argv.shift() else conf.usage = true if (conf.version) { console.log(npm.version) return } if (conf.versions) { npm.command = 'version' conf.usage = false npm.argv = [] } log.info('using', 'npm@%s', npm.version) log.info('using', 'node@%s', process.version) process.on('uncaughtException', errorHandler) if (conf.usage && npm.command !== 'help') { npm.argv.unshift(npm.command) npm.command = 'help' } // now actually fire up npm and run the command. // this is how to use npm programmatically: conf._exit = true npm.load(conf, function (er) { if (er) return errorHandler(er) npm.commands[npm.command](npm.argv, errorHandler) }) })() npm_3.5.2.orig/bin/npm.cmd0000644000000000000000000000074312631326456013525 0ustar 00000000000000:: Created by npm, please don't edit manually. @ECHO OFF SETLOCAL SET "NODE_EXE=%~dp0\node.exe" IF NOT EXIST "%NODE_EXE%" ( SET "NODE_EXE=node" ) SET "NPM_CLI_JS=%~dp0\node_modules\npm\bin\npm-cli.js" FOR /F "delims=" %%F IN ('CALL "%NODE_EXE%" "%NPM_CLI_JS%" prefix -g') DO ( SET "NPM_PREFIX_NPM_CLI_JS=%%F\node_modules\npm\bin\npm-cli.js" ) IF EXIST "%NPM_PREFIX_NPM_CLI_JS%" ( SET "NPM_CLI_JS=%NPM_PREFIX_NPM_CLI_JS%" ) "%NODE_EXE%" "%NPM_CLI_JS%" %* npm_3.5.2.orig/bin/read-package-json.js0000755000000000000000000000100112631326456016056 0ustar 00000000000000var argv = process.argv if (argv.length < 3) { console.error('Usage: read-package.json <file> [<field> ...]') process.exit(1) } var file = argv[2] var readJson = require('read-package-json') readJson(file, function (er, data) { if (er) throw er if (argv.length === 3) { console.log(data) } else { argv.slice(3).forEach(function (field) { field = field.split('.') var val = data field.forEach(function (f) { val = val[f] }) console.log(val) }) } }) npm_3.5.2.orig/bin/node-gyp-bin/node-gyp0000755000000000000000000000025412631326456016203 0ustar 00000000000000#!/usr/bin/env sh if [ "x$npm_config_node_gyp" = "x" ]; then node "`dirname "$0"`/../../node_modules/node-gyp/bin/node-gyp.js" "$@" else "$npm_config_node_gyp" "$@" fi npm_3.5.2.orig/bin/node-gyp-bin/node-gyp.cmd0000755000000000000000000000022312631326456016741 0ustar 00000000000000if not defined npm_config_node_gyp ( node "%~dp0\..\..\node_modules\node-gyp\bin\node-gyp.js" %* ) else ( node %npm_config_node_gyp% %* ) npm_3.5.2.orig/changelogs/CHANGELOG-1.md0000644000000000000000000011255612631326456015565 0ustar 00000000000000### v1.4.29 (2015-10-29): #### THINGS ARE HAPPENING IN LTS LAND In a special one-off release as part of the [strategy to get a version of npm into Node LTS that works with the current registry](https://github.com/nodejs/LTS/issues/37), modify npm to print out this deprecation banner literally every time npm is invoked to do anything: ``` npm WARN deprecated This version of npm lacks support for important features, npm WARN deprecated such as scoped packages, offered by the primary npm npm WARN deprecated registry. Consider upgrading to at least npm@2, if not the npm WARN deprecated latest stable version.
To upgrade to npm@2, run: npm WARN deprecated npm WARN deprecated npm -g install npm@latest-2 npm WARN deprecated npm WARN deprecated To upgrade to the latest stable version, run: npm WARN deprecated npm WARN deprecated npm -g install npm@latest npm WARN deprecated npm WARN deprecated (Depending on how Node.js was installed on your system, you npm WARN deprecated may need to prefix the preceding commands with `sudo`, or if npm WARN deprecated on Windows, run them from an Administrator prompt.) npm WARN deprecated npm WARN deprecated If you're running the version of npm bundled with npm WARN deprecated Node.js 0.10 LTS, be aware that the next version of 0.10 LTS npm WARN deprecated will be bundled with a version of npm@2, which has some small npm WARN deprecated backwards-incompatible changes made to `npm run-script` and npm WARN deprecated semver behavior. ``` The message basically tells the tale: Node 0.10 will finally be getting `npm@2`, so those of you who haven't upgraded your build systems to deal with its (relatively small) breaking changes should do so now. Also, this version doesn't even pretend that it can deal with scoped packages, which, given the confusing behavior of older versions of `npm@1.4`, where it would sometimes try to install packages from GitHub, is a distinct improvement. There is no good reason for you as an end user to upgrade to this version of npm yourself. * [`709e9b4`](https://github.com/npm/npm/commit/709e9b44f5df9817a1c4babfbf26a2329bd265fb) Print 20-line deprecation banner on all command invocations. ([@othiym23](https://github.com/othiym23)) * [`0c29d09`](https://github.com/npm/npm/commit/0c29d0906608e8e174bd30a7a245e19795326051) Crash out immediately with an exhortation to upgrade on attempts to use scoped packages. 
([@othiym23](https://github.com/othiym23)) ### v1.5.0-alpha-4 (2014-07-18): * fall back to `_auth` config as default auth when using default registry ([@isaacs](https://github.com/isaacs)) * support for 'init.version' for those who don't want to deal with semver 0.0.x oddities ([@rvagg](https://github.com/rvagg)) * [`be06213`](https://github.com/npm/npm/commit/be06213415f2d51a50d2c792b4cd0d3412a9a7b1) remove residual support for `win` log level ([@aterris](https://github.com/aterris)) ### v1.5.0-alpha-3 (2014-07-17): * [`a3a85dd`](https://github.com/npm/npm/commit/a3a85dd004c9245a71ad2f0213bd1a9a90d64cd6) `--save` scoped packages correctly ([@othiym23](https://github.com/othiym23)) * [`18a3385`](https://github.com/npm/npm/commit/18a3385bcf8bfb8312239216afbffb7eec759150) `npm-registry-client@3.0.2` ([@othiym23](https://github.com/othiym23)) * [`375988b`](https://github.com/npm/npm/commit/375988b9bf5aa5170f06a790d624d31b1eb32c6d) invalid package names are an early error for optional deps ([@othiym23](https://github.com/othiym23)) * consistently use `node-package-arg` instead of arbitrary package spec splitting ([@othiym23](https://github.com/othiym23)) ### v1.5.0-alpha-2 (2014-07-01): * [`54cf625`](https://github.com/npm/npm/commit/54cf62534e3331e3f454e609e44f0b944e819283) fix handling for 301s in `npm-registry-client@3.0.1` ([@Raynos](https://github.com/Raynos)) * [`e410861`](https://github.com/npm/npm/commit/e410861c69a3799c1874614cb5b87af8124ff98d) don't crash if no username set on `whoami` ([@isaacs](https://github.com/isaacs)) * [`0353dde`](https://github.com/npm/npm/commit/0353ddeaca8171aa7dbdd8102b7e2eb581a86406) respect `--json` for output ([@isaacs](https://github.com/isaacs)) * [`b3d112a`](https://github.com/npm/npm/commit/b3d112ae190b984cc1779b9e6de92218f22380c6) outdated: Don't show headings if there's nothing to output ([@isaacs](https://github.com/isaacs)) * [`bb4b90c`](https://github.com/npm/npm/commit/bb4b90c80dbf906a1cb26d85bc0625dc2758acc3) outdated: Default to `latest` rather than `*` for unspecified deps ([@isaacs](https://github.com/isaacs)) ### v1.5.0-alpha-1 (2014-07-01): * [`eef4884`](https://github.com/npm/npm/commit/eef4884d6487ee029813e60a5f9c54e67925d9fa) use the correct piece of the spec for GitHub shortcuts ([@othiym23](https://github.com/othiym23)) ### v1.5.0-alpha-0 (2014-07-01): * [`7f55057`](https://github.com/npm/npm/commit/7f55057807cfdd9ceaf6331968e666424f48116c) install scoped packages ([#5239](https://github.com/npm/npm/issues/5239)) ([@othiym23](https://github.com/othiym23)) * [`0df7e16`](https://github.com/npm/npm/commit/0df7e16c0232d8f4d036ebf4ec3563215517caac) publish scoped packages ([#5239](https://github.com/npm/npm/issues/5239)) ([@othiym23](https://github.com/othiym23)) * [`0689ba2`](https://github.com/npm/npm/commit/0689ba249b92b4c6279a26804c96af6f92b3a501) support (and save) --scope=@s config ([@othiym23](https://github.com/othiym23)) * [`f34878f`](https://github.com/npm/npm/commit/f34878fc4cee29901e4daf7bace94be01e25cad7) scope credentials to registry ([@othiym23](https://github.com/othiym23)) * [`0ac7ca2`](https://github.com/npm/npm/commit/0ac7ca233f7a69751fe4386af6c4daa3ee9fc0da) capture and store bearer tokens when sent by registry ([@othiym23](https://github.com/othiym23)) * [`63c3277`](https://github.com/npm/npm/commit/63c3277f089b2c4417e922826bdc313ac854cad6) only delete files that are created by npm ([@othiym23](https://github.com/othiym23)) * [`4f54043`](https://github.com/npm/npm/commit/4f540437091d1cbca3915cd20c2da83c2a88bb8e) 
`npm-package-arg@2.0.0` ([@othiym23](https://github.com/othiym23)) * [`9e1460e`](https://github.com/npm/npm/commit/9e1460e6ac9433019758481ec031358f4af4cd44) `read-package-json@1.2.3` ([@othiym23](https://github.com/othiym23)) * [`719d8ad`](https://github.com/npm/npm/commit/719d8adb9082401f905ff4207ede494661f8a554) `fs-vacuum@1.2.1` ([@othiym23](https://github.com/othiym23)) * [`9ef8fe4`](https://github.com/npm/npm/commit/9ef8fe4d6ead3acb3e88c712000e2d3a9480ebec) `async-some@1.0.0` ([@othiym23](https://github.com/othiym23)) * [`a964f65`](https://github.com/npm/npm/commit/a964f65ab662107b62a4ca58535ce817e8cca331) `npmconf@2.0.1` ([@othiym23](https://github.com/othiym23)) * [`113765b`](https://github.com/npm/npm/commit/113765bfb7d3801917c1d9f124b8b3d942bec89a) `npm-registry-client@3.0.0` ([@othiym23](https://github.com/othiym23)) ### v1.4.28 (2014-09-12): * [`f4540b6`](https://github.com/npm/npm/commit/f4540b6537a87e653d7495a9ddcf72949fdd4d14) [#6043](https://github.com/npm/npm/issues/6043) defer rollbacks until just before the CLI exits ([@isaacs](https://github.com/isaacs)) * [`1eabfd5`](https://github.com/npm/npm/commit/1eabfd5c03f33c2bd28823714ff02059eeee3899) [#6043](https://github.com/npm/npm/issues/6043) `slide@1.1.6`: wait until all callbacks have finished before proceeding ([@othiym23](https://github.com/othiym23)) ### v1.4.27 (2014-09-04): * [`4cf3c8f`](https://github.com/npm/npm/commit/4cf3c8fd78c9e2693a5f899f50c28f4823c88e2e) [#6007](https://github.com/npm/npm/issues/6007) request@2.42.0: properly set headers on proxy requests ([@isaacs](https://github.com/isaacs)) * [`403cb52`](https://github.com/npm/npm/commit/403cb526be1472bb7545fa8e62d4976382cdbbe5) [#6055](https://github.com/npm/npm/issues/6055) npmconf@1.1.8: restore case-insensitivity of environmental config ([@iarna](https://github.com/iarna)) ### v1.4.26 (2014-08-28): * [`eceea95`](https://github.com/npm/npm/commit/eceea95c804fa15b18e91c52c0beb08d42a3e77d) `github-url-from-git@1.4.0`: add support for git+https and git+ssh ([@stefanbuck](https://github.com/stefanbuck)) * [`e561758`](https://github.com/npm/npm/commit/e5617587e7d7ab686192391ce55357dbc7fed0a3) `columnify@1.2.1` ([@othiym23](https://github.com/othiym23)) * [`0c4fab3`](https://github.com/npm/npm/commit/0c4fab372ee76eab01dda83b6749429a8564902e) `cmd-shim@2.0.0`: upgrade to graceful-fs 3 ([@ForbesLindesay](https://github.com/ForbesLindesay)) * [`2d69e4d`](https://github.com/npm/npm/commit/2d69e4d95777671958b5e08d3b2f5844109d73e4) `github-url-from-username-repo@1.0.0`: accept slashes in branch names ([@robertkowalski](https://github.com/robertkowalski)) * [`81f9b2b`](https://github.com/npm/npm/commit/81f9b2bac9d34c223ea093281ba3c495f23f10d1) ensure lifecycle spawn errors caught properly ([@isaacs](https://github.com/isaacs)) * [`bfaab8c`](https://github.com/npm/npm/commit/bfaab8c6e0942382a96b250634ded22454c36b5a) `npm-registry-client@2.0.7`: properly encode % in passwords ([@isaacs](https://github.com/isaacs)) * [`91cfb58`](https://github.com/npm/npm/commit/91cfb58dda851377ec604782263519f01fd96ad8) doc: Fix 'npm help index' ([@isaacs](https://github.com/isaacs)) ### v1.4.25 (2014-08-21): * [`64c0ec2`](https://github.com/npm/npm/commit/64c0ec241ef5d83761ca8de54acb3c41b079956e) `npm-registry-client@2.0.6`: Print the notification header returned by the registry, and make sure status codes are printed without gratuitous quotes around them. 
([@othiym23](https://github.com/othiym23)) * [`a8ed12b`](https://github.com/npm/npm/commit/a8ed12b) `tar@1.0.1`: Add test for removing an extract target immediately after unpacking. ([@isaacs](https://github.com/isaacs)) * [`70fd11d`](https://github.com/npm/npm/commit/70fd11d) `lockfile@1.0.0`: Fix incorrect interaction between `wait`, `stale`, and `retries` options. Part 2 of race condition leading to `ENOENT` errors. ([@isaacs](https://github.com/isaacs)) * [`0072c4d`](https://github.com/npm/npm/commit/0072c4d) `fstream@1.0.2`: Fix a double-finish call which can result in excess FS operations after the `close` event. Part 2 of race condition leading to `ENOENT` errors. ([@isaacs](https://github.com/isaacs)) ### v1.4.24 (2014-08-14): * [`9344bd9`](https://github.com/npm/npm/commit/9344bd9b2929b5c399a0e0e0b34d45bce7bc24bb) doc: add new changelog ([@othiym23](https://github.com/othiym23)) * [`4be76fd`](https://github.com/npm/npm/commit/4be76fd65e895883c337a99f275ccc8c801adda3) doc: update version doc to include `pre-*` increment args ([@isaacs](https://github.com/isaacs)) * [`e4f2620`](https://github.com/npm/npm/commit/e4f262036080a282ad60e236a9aeebd39fde9fe4) build: add `make tag` to tag current release as `latest` ([@isaacs](https://github.com/isaacs)) * [`ec2596a`](https://github.com/npm/npm/commit/ec2596a7cb626772780b25b0a94a7e547a812bd5) build: publish with `--tag=v1.4-next` ([@isaacs](https://github.com/isaacs)) * [`9ee55f8`](https://github.com/npm/npm/commit/9ee55f892b8b473032a43c59912c5684fd1b39e6) build: add script to output `v1.4-next` publish tag ([@isaacs](https://github.com/isaacs)) * [`aecb56f`](https://github.com/npm/npm/commit/aecb56f95a84687ea46920a0b98aaa587fee1568) build: remove outdated `docpublish` make target ([@isaacs](https://github.com/isaacs)) * [`b57a9b7`](https://github.com/npm/npm/commit/b57a9b7ccd13e6b38831ed63595c8ea5763da247) build: remove unpublish step from `make publish` ([@isaacs](https://github.com/isaacs)) * [`2c6acb9`](https://github.com/npm/npm/commit/2c6acb96c71c16106965d5cd829b67195dd673c7) install: rename `.gitignore` when unpacking foreign tarballs ([@isaacs](https://github.com/isaacs)) * [`22f3681`](https://github.com/npm/npm/commit/22f3681923e993a47fc1769ba735bfa3dd138082) cache: detect non-gzipped tar files more reliably ([@isaacs](https://github.com/isaacs)) ### v1.4.23 (2014-07-31): * [`8dd11d1`](https://github.com/npm/npm/commit/8dd11d1) update several dependencies to avoid using `semver`s starting with 0. 
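As context for that last v1.4.23 note (this explanation is not from the original changelog): under the `semver` package that npm uses, ranges are deliberately stricter below `1.0.0`, which is what makes dependency specifiers starting with 0 awkward. A small sketch with made-up versions:

```javascript
var semver = require('semver')

// At >=1.0.0 a caret range accepts minor and patch bumps:
semver.satisfies('1.9.0', '^1.2.0') // => true

// At 0.x the same caret accepts only patch bumps:
semver.satisfies('0.1.9', '^0.1.2') // => true
semver.satisfies('0.2.0', '^0.1.2') // => false

// And at 0.0.x it pins the exact version:
semver.satisfies('0.0.4', '^0.0.3') // => false
```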
### v1.4.22 (2014-07-31): * [`d9a9e84`](https://github.com/npm/npm/commit/d9a9e84) `read-package-json@1.2.4` ([@isaacs](https://github.com/isaacs)) * [`86f0340`](https://github.com/npm/npm/commit/86f0340) `github-url-from-git@1.2.0` ([@isaacs](https://github.com/isaacs)) * [`a94136a`](https://github.com/npm/npm/commit/a94136a) `fstream@0.1.29` ([@isaacs](https://github.com/isaacs)) * [`bb82d18`](https://github.com/npm/npm/commit/bb82d18) `glob@4.0.5` ([@isaacs](https://github.com/isaacs)) * [`5b6bcf4`](https://github.com/npm/npm/commit/5b6bcf4) `cmd-shim@1.1.2` ([@isaacs](https://github.com/isaacs)) * [`c2aa8b3`](https://github.com/npm/npm/commit/c2aa8b3) license: Cleaned up legalese with actual lawyer ([@isaacs](https://github.com/isaacs)) * [`63fe0ee`](https://github.com/npm/npm/commit/63fe0ee) `init-package-json@1.0.0` ([@isaacs](https://github.com/isaacs)) ### v1.4.21 (2014-07-14): * [`88f51aa`](https://github.com/npm/npm/commit/88f51aa27eb9a958d1fa7ec50fee5cfdedd05110) fix handling for 301s in `npm-registry-client@2.0.3` ([@Raynos](https://github.com/Raynos)) ### v1.4.20 (2014-07-02): * [`0353dde`](https://github.com/npm/npm/commit/0353ddeaca8171aa7dbdd8102b7e2eb581a86406) respect `--json` for output ([@isaacs](https://github.com/isaacs)) * [`b3d112a`](https://github.com/npm/npm/commit/b3d112ae190b984cc1779b9e6de92218f22380c6) outdated: Don't show headings if there's nothing to output ([@isaacs](https://github.com/isaacs)) * [`bb4b90c`](https://github.com/npm/npm/commit/bb4b90c80dbf906a1cb26d85bc0625dc2758acc3) outdated: Default to `latest` rather than `*` for unspecified deps ([@isaacs](https://github.com/isaacs)) ### v1.4.19 (2014-07-01): * [`f687433`](https://github.com/npm/npm/commit/f687433) relative URLS for working non-root registry URLS ([@othiym23](https://github.com/othiym23)) * [`bea190c`](https://github.com/npm/npm/commit/bea190c) [#5591](https://github.com/npm/npm/issues/5591) bump nopt and npmconf ([@isaacs](https://github.com/isaacs)) ### v1.4.18 (2014-06-29): * Bump glob dependency from 4.0.2 to 4.0.3. It now uses graceful-fs when available, increasing resilience to [various filesystem errors](https://github.com/isaacs/node-graceful-fs#improvements-over-fs-module). ([@isaacs](https://github.com/isaacs)) ### v1.4.17 (2014-06-27): * replace escape codes with ansicolors ([@othiym23](https://github.com/othiym23)) * Allow to build all the docs OOTB. ([@GeJ](https://github.com/GeJ)) * Use core.longpaths on win32 git - fixes [#5525](https://github.com/npm/npm/issues/5525) ([@bmeck](https://github.com/bmeck)) * `npmconf@1.1.2` ([@isaacs](https://github.com/isaacs)) * Consolidate color sniffing in config/log loading process ([@isaacs](https://github.com/isaacs)) * add verbose log when project config file is ignored ([@isaacs](https://github.com/isaacs)) * npmconf: Float patch to remove 'scope' from config defs ([@isaacs](https://github.com/isaacs)) * doc: npm-explore can't handle a version ([@robertkowalski](https://github.com/robertkowalski)) * Add user-friendly errors for ENOSPC and EROFS. ([@voodootikigod](https://github.com/voodootikigod)) * bump tar and fstream deps ([@isaacs](https://github.com/isaacs)) * Run the npm-registry-couchapp tests along with npm tests ([@isaacs](https://github.com/isaacs)) ### v1.2.8000 (2014-06-17): * Same as v1.4.16, but with the spinner disabled, and a version number that starts with v1.2. 
### v1.4.16 (2014-06-17): * `npm-registry-client@2.0.2` ([@isaacs](https://github.com/isaacs)) * `fstream@0.1.27` ([@isaacs](https://github.com/isaacs)) * `sha@1.2.4` ([@isaacs](https://github.com/isaacs)) * `rimraf@2.2.8` ([@isaacs](https://github.com/isaacs)) * `npmlog@1.0.1` ([@isaacs](https://github.com/isaacs)) * `npm-registry-client@2.0.1` ([@isaacs](https://github.com/isaacs)) * removed redundant dependency ([@othiym23](https://github.com/othiym23)) * `npmconf@1.0.5` ([@isaacs](https://github.com/isaacs)) * Properly handle errors that can occur in the config-loading process ([@isaacs](https://github.com/isaacs)) ### v1.4.15 (2014-06-10): * cache: atomic de-race-ified package.json writing ([@isaacs](https://github.com/isaacs)) * `fstream@0.1.26` ([@isaacs](https://github.com/isaacs)) * `graceful-fs@3.0.2` ([@isaacs](https://github.com/isaacs)) * `osenv@0.1.0` ([@isaacs](https://github.com/isaacs)) * Only spin the spinner when we're fetching stuff ([@isaacs](https://github.com/isaacs)) * Update `osenv@0.1.0` which removes ~/tmp as possible tmp-folder ([@robertkowalski](https://github.com/robertkowalski)) * `ini@1.2.1` ([@isaacs](https://github.com/isaacs)) * `graceful-fs@3` ([@isaacs](https://github.com/isaacs)) * Update glob and things depending on glob ([@isaacs](https://github.com/isaacs)) * github-url-from-username-repo and read-package-json updates ([@isaacs](https://github.com/isaacs)) * `editor@0.1.0` ([@isaacs](https://github.com/isaacs)) * `columnify@1.1.0` ([@isaacs](https://github.com/isaacs)) * bump ansi and associated deps ([@isaacs](https://github.com/isaacs)) ### v1.4.14 (2014-06-05): * char-spinner: update to not bork windows ([@isaacs](https://github.com/isaacs)) ### v1.4.13 (2014-05-23): * Fix `npm install` on a tarball. ([`ed3abf1`](https://github.com/npm/npm/commit/ed3abf1aa10000f0f687330e976d78d1955557f6), [#5330](https://github.com/npm/npm/issues/5330), [@othiym23](https://github.com/othiym23)) * Fix an issue with the spinner on Node 0.8. ([`9f00306`](https://github.com/npm/npm/commit/9f003067909440390198c0b8f92560d84da37762), [@isaacs](https://github.com/isaacs)) * Re-add `npm.commands.cache.clean` and `npm.commands.cache.read` APIs, and document `npm.commands.cache.*` as npm-cache(3). 
([`e06799e`](https://github.com/npm/npm/commit/e06799e77e60c1fc51869619083a25e074d368b3), [@isaacs](https://github.com/isaacs)) ### v1.4.12 (2014-05-23): * remove normalize-package-data from top level, de-^-ify inflight dep ([@isaacs](https://github.com/isaacs)) * Always sort saved bundleDependencies ([@isaacs](https://github.com/isaacs)) * add inflight to bundledDependencies ([@othiym23](https://github.com/othiym23)) ### v1.4.11 (2014-05-22): * fix `npm ls` labeling issue * `node-gyp@0.13.1` * default repository to https:// instead of git:// * addLocalTarball: Remove extraneous unpack ([@isaacs](https://github.com/isaacs)) * Massive cache folder refactor ([@othiym23](https://github.com/othiym23) and [@isaacs](https://github.com/isaacs)) * Busy Spinner, no http noise ([@isaacs](https://github.com/isaacs)) * Per-project .npmrc file support ([@isaacs](https://github.com/isaacs)) * `npmconf@1.0.0`, Refactor config/uid/prefix loading process ([@isaacs](https://github.com/isaacs)) * Allow once-disallowed characters in passwords ([@isaacs](https://github.com/isaacs)) * Send npm version as 'version' header ([@isaacs](https://github.com/isaacs)) * fix cygwin encoding issue (Karsten Tinnefeld) * Allow non-github repositories with `npm repo` ([@evanlucas](https://github.com/evanlucas)) * Allow peer deps to be satisfied by grandparent * Stop optional deps moving into deps on `update --save` ([@timoxley](https://github.com/timoxley)) * Ensure only matching deps update with `update --save*` ([@timoxley](https://github.com/timoxley)) * Add support for `prerelease`, `preminor`, `prepatch` to `npm version` ### v1.4.10 (2014-05-05): * Don't set referer if already set * fetch: Send referer and npm-session headers * `run-script`: Support `--parseable` and `--json` * list runnable scripts ([@evanlucas](https://github.com/evanlucas)) * Use marked instead of ronn for html docs ### v1.4.9 (2014-05-01): * Send referer header (with any potentially private stuff redacted) * Fix critical typo bug in previous npm release ### v1.4.8 (2014-05-01): * Check SHA before using files from cache * adduser: allow change of the saved password * Make `npm install` respect `config.unicode` * Fix lifecycle to pass `Infinity` for config env value * Don't return 0 exit code on invalid command * cache: Handle 404s and other HTTP errors as errors * Resolve ~ in path configs to env.HOME * Include npm version in default user-agent conf * npm init: Use ISC as default license, use save-prefix for deps * Many test and doc fixes ### v1.4.7 (2014-04-15): * Add `--save-prefix` option that can be used to override the default of `^` when using `npm install --save` and its counterparts. ([`64eefdf`](https://github.com/npm/npm/commit/64eefdfe26bb27db8dc90e3ab5d27a5ef18a4470), [@thlorenz](https://github.com/thlorenz)) * Allow `--silent` to silence the echoing of commands that occurs with `npm run`. ([`c95cf08`](https://github.com/npm/npm/commit/c95cf086e5b97dbb48ff95a72517b203a8f29eab), [@Raynos](https://github.com/Raynos)) * Some speed improvements to the cache, which should improve install times. ([`cb94310`](https://github.com/npm/npm/commit/cb94310a6adb18cb7b881eacb8d67171eda8b744), [`3b0870f`](https://github.com/npm/npm/commit/3b0870fb2f40358b3051abdab6be4319d196b99d), [`120f5a9`](https://github.com/npm/npm/commit/120f5a93437bbbea9249801574a2f33e44e81c33), [@isaacs](https://github.com/isaacs)) * Improve ability to retry registry requests when a subset of the registry servers are down. 
([`4a5257d`](https://github.com/npm/npm/commit/4a5257de3870ac3dafa39667379f19f6dcd6093e), https://github.com/npm/npm-registry-client/commit/7686d02cb0b844626d6a401e58c0755ef3bc8432, [@isaacs](https://github.com/isaacs)) * Fix marking of peer dependencies as extraneous. ([`779b164`](https://github.com/npm/npm/commit/779b1649764607b062c031c7e5c972151b4a1754), https://github.com/npm/read-installed/commit/6680ba6ef235b1ca3273a00b70869798ad662ddc, [@isaacs](https://github.com/isaacs)) * Fix npm crashing when doing `npm shrinkwrap` in the presence of a `package.json` with no dependencies. ([`a9d9fa5`](https://github.com/npm/npm/commit/a9d9fa5ad3b8c925a589422b7be28d2735f320b0), [@kislyuk](https://github.com/kislyuk)) * Fix error when using `npm view` on packages that have no versions or have been unpublished. ([`94df2f5`](https://github.com/npm/npm/commit/94df2f56d684b35d1df043660180fc321b743dc8), [@juliangruber](https://github.com/juliangruber); [`2241a09`](https://github.com/npm/npm/commit/2241a09c843669c70633c399ce698cec3add40b3), [@isaacs](https://github.com/isaacs)) ### v1.4.6 (2014-03-19): * Fix extraneous package detection to work in more cases. ([`f671286`](https://github.com/npm/npm/commit/f671286), npm/read-installed#20, [@LaurentVB](https://github.com/LaurentVB)) ### v1.4.5 (2014-03-18): * Sort dependencies in `package.json` when doing `npm install --save` and all its variants. ([`6fd6ff7`](https://github.com/npm/npm/commit/6fd6ff7e536ea6acd33037b1878d4eca1f931985), [@domenic](https://github.com/domenic)) * Add `--save-exact` option, usable alongside `--save` and its variants, which will write the exact version number into `package.json` instead of the appropriate semver-compatibility range. ([`17f07df`](https://github.com/npm/npm/commit/17f07df8ad8e594304c2445bf7489cb53346f2c5), [@timoxley](https://github.com/timoxley)) * Accept gzipped content from the registry to speed up downloads and save bandwidth. ([`a3762de`](https://github.com/npm/npm/commit/a3762de843b842be8fa0ab57cdcd6b164f145942), npm/npm-registry-client#40, [@fengmk2](https://github.com/fengmk2)) * Fix `npm ls`'s `--depth` and `--log` options. ([`1d29b17`](https://github.com/npm/npm/commit/1d29b17f5193d52a5c4faa412a95313dcf41ed91), npm/read-installed#13, [@zertosh](https://github.com/zertosh)) * Fix "Adding a cache directory to the cache will make the world implode" in certain cases. ([`9a4b2c4`](https://github.com/npm/npm/commit/9a4b2c4667c2b1e0054e3d5611ab86acb1760834), domenic/path-is-inside#1, [@pmarques](https://github.com/pmarques)) * Fix readmes not being uploaded in certain rare cases. ([`527b72c`](https://github.com/npm/npm/commit/527b72cca6c55762b51e592c48a9f28cc7e2ff8b), [@isaacs](https://github.com/isaacs)) ### v1.4.4 (2014-02-20): * Add `npm t` as an alias for `npm test` (which is itself an alias for `npm run test`, or even `npm run-script test`). We like making running your tests easy. ([`14e650b`](https://github.com/npm/npm/commit/14e650bce0bfebba10094c961ac104a61417a5de), [@isaacs](https://github.com/isaacs)) ### v1.4.3 (2014-02-16): * Add back `npm prune --production`, which was removed in 1.3.24. ([`acc4d02`](https://github.com/npm/npm/commit/acc4d023c57d07704b20a0955e4bf10ee91bdc83), [@davglass](https://github.com/davglass)) * Default `npm install --save` and its counterparts to use the `^` version specifier, instead of `~`. 
([`0a3151c`](https://github.com/npm/npm/commit/0a3151c9cbeb50c1c65895685c2eabdc7e2608dc), [@mikolalysenko](https://github.com/mikolalysenko)) * Make `npm shrinkwrap` output dependencies in a sorted order, so that diffs between shrinkwrap files should be saner now. ([`059b2bf`](https://github.com/npm/npm/commit/059b2bfd06ae775205a37257dca80142596a0113), [@Raynos](https://github.com/Raynos)) * Fix `npm dedupe` not correctly respecting dependency constraints. ([`86028e9`](https://github.com/npm/npm/commit/86028e9fd8524d5e520ce01ba2ebab5a030103fc), [@rafeca](https://github.com/rafeca)) * Fix `npm ls` giving spurious warnings when you used `"latest"` as a version specifier. (https://github.com/npm/read-installed/commit/d2956400e0386931c926e0f30c334840e0938f14, [@bajtos](https://github.com/bajtos)) * Fixed a bug where using `npm link` on packages without a `name` value could cause npm to delete itself. ([`401a642`](https://github.com/npm/npm/commit/401a64286aa6665a94d1d2f13604f7014c5fce87), [@isaacs](https://github.com/isaacs)) * Fixed `npm install ./pkg@1.2.3` to actually install the directory at `pkg@1.2.3`; before it would try to find version `1.2.3` of the package `./pkg` in the npm registry. ([`46d8768`](https://github.com/npm/npm/commit/46d876821d1dd94c050d5ebc86444bed12c56739), [@rlidwka](https://github.com/rlidwka); see also [`f851b79`](https://github.com/npm/npm/commit/f851b79a71d9a5f5125aa85877c94faaf91bea5f)) * Fix `npm outdated` to respect the `color` configuration option. ([`d4f6f3f`](https://github.com/npm/npm/commit/d4f6f3ff83bd14fb60d3ac6392cb8eb6b1c55ce1), [@timoxley](https://github.com/timoxley)) * Fix `npm outdated --parseable`. ([`9575a23`](https://github.com/npm/npm/commit/9575a23f955ce3e75b509c89504ef0bd707c8cf6), [@yhpark](https://github.com/yhpark)) * Fix a lockfile-related errors when using certain Git URLs. ([`164b97e`](https://github.com/npm/npm/commit/164b97e6089f64e686db7a9a24016f245effc37f), [@nigelzor](https://github.com/nigelzor)) ### v1.4.2 (2014-02-13): * Fixed an issue related to mid-publish GET requests made against the registry. (https://github.com/npm/npm-registry-client/commit/acbec48372bc1816c67c9e7cbf814cf50437ff93, [@isaacs](https://github.com/isaacs)) ### v1.4.1 (2014-02-13): * Fix `npm shrinkwrap` forgetting to shrinkwrap dependencies that were also development dependencies. ([`9c575c5`](https://github.com/npm/npm/commit/9c575c56efa9b0c8b0d4a17cb9c1de3833004bcd), [@diwu1989](https://github.com/diwu1989)) * Fixed publishing of pre-existing packages with uppercase characters in their name. (https://github.com/npm/npm-registry-client/commit/9345d3b6c3d8510dd5c4418f27ee1fce59acebad, [@isaacs](https://github.com/isaacs)) ### v1.4.0 (2014-02-12): * Remove `npm publish --force`. See https://github.com/npm/npmjs.org/issues/148. ([@isaacs](https://github.com/isaacs), npm/npm-registry-client@2c8dba990de6a59af6545b75cc00a6dc12777c2a) * Other changes to the registry client related to saved configs and couch logins. ([@isaacs](https://github.com/isaacs); npm/npm-registry-client@25e2b019a1588155e5f87d035c27e79963b75951, npm/npm-registry-client@9e41e9101b68036e0f078398785f618575f3cdde, npm/npm-registry-client@2c8dba990de6a59af6545b75cc00a6dc12777c2a) * Show an error to the user when doing `npm update` and the `package.json` specifies a version that does not exist. 
([@evanlucas](https://github.com/evanlucas), [`027a33a`](https://github.com/npm/npm/commit/027a33a5c594124cc1d82ddec5aee2c18bc8dc32)) * Fix some issues with cache ownership in certain installation configurations. ([@outcoldman](https://github.com/outcoldman), [`a132690`](https://github.com/npm/npm/commit/a132690a2876cda5dcd1e4ca751f21dfcb11cb9e)) * Fix issues where GitHub shorthand dependencies `user/repo` were not always treated the same as full Git URLs. ([@robertkowalski](https://github.com/robertkowalski), https://github.com/meryn/normalize-package-data/commit/005d0b637aec1895117fcb4e3b49185eebf9e240) ### v1.3.26 (2014-02-02): * Fixes and updates to publishing code ([`735427a`](https://github.com/npm/npm/commit/735427a69ba4fe92aafa2d88f202aaa42920a9e2) and [`c0ac832`](https://github.com/npm/npm/commit/c0ac83224d49aa62e55577f8f27d53bbfd640dc5), [@isaacs](https://github.com/isaacs)) * Fix `npm bugs` with no arguments. ([`b99d465`](https://github.com/npm/npm/commit/b99d465221ac03bca30976cbf4d62ca80ab34091), [@Hoops](https://github.com/Hoops)) ### v1.3.25 (2014-01-25): * Remove gubblebum blocky font from documentation headers. ([`6940c9a`](https://github.com/npm/npm/commit/6940c9a100160056dc6be8f54a7ad7fa8ceda7e2), [@isaacs](https://github.com/isaacs)) ### v1.3.24 (2014-01-19): * Make the search output prettier, with nice truncated columns, and a `--long` option to create wrapping columns. ([`20439b2`](https://github.com/npm/npm/commit/20439b2) and [`3a6942d`](https://github.com/npm/npm/commit/3a6942d), [@timoxley](https://github.com/timoxley)) * Support multiple packagenames in `npm docs`. ([`823010b`](https://github.com/npm/npm/commit/823010b), [@timoxley](https://github.com/timoxley)) * Fix the `npm adduser` bug regarding "Error: default value must be string or number" again. ([`b9b4248`](https://github.com/npm/npm/commit/b9b4248), [@isaacs](https://github.com/isaacs)) * Fix `scripts` entries containing whitespaces on Windows. ([`80282ed`](https://github.com/npm/npm/commit/80282ed), [@robertkowalski](https://github.com/robertkowalski)) * Fix `npm update` for Git URLs that have credentials in them ([`93fc364`](https://github.com/npm/npm/commit/93fc364), [@danielsantiago](https://github.com/danielsantiago)) * Fix `npm install` overwriting `npm link`-ed dependencies when they are tagged Git dependencies. ([`af9bbd9`](https://github.com/npm/npm/commit/af9bbd9), [@evanlucas](https://github.com/evanlucas)) * Remove `npm prune --production` since it buggily removed some dependencies that were necessary for production; see [#4509](https://github.com/npm/npm/issues/4509). Hopefully it can make its triumphant return, one day. ([`1101b6a`](https://github.com/npm/npm/commit/1101b6a), [@isaacs](https://github.com/isaacs)) Dependency updates: * [`909cccf`](https://github.com/npm/npm/commit/909cccf) `read-package-json@1.1.6` * [`a3891b6`](https://github.com/npm/npm/commit/a3891b6) `rimraf@2.2.6` * [`ac6efbc`](https://github.com/npm/npm/commit/ac6efbc) `sha@1.2.3` * [`dd30038`](https://github.com/npm/npm/commit/dd30038) `node-gyp@0.12.2` * [`c8c3ebe`](https://github.com/npm/npm/commit/c8c3ebe) `npm-registry-client@0.3.3` * [`4315286`](https://github.com/npm/npm/commit/4315286) `npmconf@0.1.12` ### v1.3.23 (2014-01-03): * Properly handle installations that contained a certain class of circular dependencies. 
([`5dc93e8`](https://github.com/npm/npm/commit/5dc93e8c82604c45b6067b1acf1c768e0bfce754), [@substack](https://github.com/substack)) ### v1.3.22 (2013-12-25): * Fix a critical bug in `npm adduser` that would manifest in the error message "Error: default value must be string or number." ([`fba4bd2`](https://github.com/npm/npm/commit/fba4bd24bc2ab00ccfeda2043aa53af7d75ef7ce), [@isaacs](https://github.com/isaacs)) * Allow `npm bugs` in the current directory to open the current package's bugs URL. ([`d04cf64`](https://github.com/npm/npm/commit/d04cf6483932c693452f3f778c2fa90f6153a4af), [@evanlucas](https://github.com/evanlucas)) * Several fixes to various error messages to include more useful or updated information. ([`1e6f2a7`](https://github.com/npm/npm/commit/1e6f2a72ca058335f9f5e7ca22d01e1a8bb0f9f7), [`ff46366`](https://github.com/npm/npm/commit/ff46366bd40ff0ef33c7bac8400bc912c56201d1), [`8b4bb48`](https://github.com/npm/npm/commit/8b4bb4815d80a3612186dc5549d698e7b988eb03); [@rlidwka](https://github.com/rlidwka), [@evanlucas](https://github.com/evanlucas)) ### v1.3.21 (2013-12-17): * Fix a critical bug that prevented publishing due to incorrect hash calculation. ([`4ca4a2c`](https://github.com/npm/npm-registry-client/commit/4ca4a2c6333144299428be6b572e2691aa59852e), [@dominictarr](https://github.com/dominictarr)) ### v1.3.20 (2013-12-17): * Fixes a critical bug in v1.3.19. Thankfully, due to that bug, no one could install npm v1.3.19 :) ### v1.3.19 (2013-12-16): * Adds atomic PUTs for publishing packages, which should result in far fewer requests and less room for replication errors on the server-side. ### v1.3.18 (2013-12-16): * Added an `--ignore-scripts` option, which will prevent `package.json` scripts from being run. Most notably, this will work on `npm install`, so e.g. `npm install --ignore-scripts` will not run preinstall and prepublish scripts. ([`d7e67bf`](https://github.com/npm/npm/commit/d7e67bf0d94b085652ec1c87d595afa6f650a8f6), [@sqs](https://github.com/sqs)) * Fixed a bug introduced in 1.3.16 that would manifest with certain cache configurations, by causing spurious errors saying "Adding a cache directory to the cache will make the world implode." ([`966373f`](https://github.com/npm/npm/commit/966373fad8d741637f9744882bde9f6e94000865), [@domenic](https://github.com/domenic)) * Re-fixed the multiple download of URL dependencies, whose fix was reverted in 1.3.17. ([`a362c3f`](https://github.com/npm/npm/commit/a362c3f1919987419ed8a37c8defa19d2e6697b0), [@spmason](https://github.com/spmason)) ### v1.3.17 (2013-12-11): * This release reverts [`644c2ff`](https://github.com/npm/npm/commit/644c2ff3e3d9c93764f7045762477f48864d64a7), which avoided re-downloading URL and shrinkwrap dependencies when doing `npm install`. You can see the in-depth reasoning in [`d8c907e`](https://github.com/npm/npm/commit/d8c907edc2019b75cff0f53467e34e0ffd7e5fba); the problem was that the patch changed the behavior of `npm install -f` to reinstall all dependencies. * A new version of the no-re-downloading fix has been submitted as [#4303](https://github.com/npm/npm/issues/4303) and will hopefully be included in the next release. ### v1.3.16 (2013-12-11): * Git URL dependencies are now updated on `npm install`, fixing a two-year-old bug ([`5829ecf`](https://github.com/npm/npm/commit/5829ecf032b392d2133bd351f53d3c644961396b), [@robertkowalski](https://github.com/robertkowalski)).
Additional progress on reducing the resulting Git-related I/O is tracked as [#4191](https://github.com/npm/npm/issues/4191), but for now, this will be a big improvement. * Added a `--json` mode to `npm outdated` to give a parseable output. ([`0b6c9b7`](https://github.com/npm/npm/commit/0b6c9b7c8c5579f4d7d37a0c24d9b7a12ccbe5fe), [@yyx990803](https://github.com/yyx990803)) * Made `npm outdated` much prettier and more useful. It now outputs a color-coded and easy-to-read table. ([`fd3017f`](https://github.com/npm/npm/commit/fd3017fc3e9d42acf6394a5285122edb4dc16106), [@quimcalpe](https://github.com/quimcalpe)) * Added the `--depth` option to `npm outdated`, so that e.g. you can do `npm outdated --depth=0` to show only top-level outdated dependencies. ([`1d184ef`](https://github.com/npm/npm/commit/1d184ef3f4b4bc309d38e9128732e3e6fb46d49c), [@yyx990803](https://github.com/yyx990803)) * Added a `--no-git-tag-version` option to `npm version`, for doing the usual job of `npm version` minus the Git tagging. This could be useful if you need to increase the version in other related files before actually adding the tag. ([`59ca984`](https://github.com/npm/npm/commit/59ca9841ba4f4b2f11b8e72533f385c77ae9f8bd), [@evanlucas](https://github.com/evanlucas)) * Made `npm repo` and `npm docs` work without any arguments, adding them to the list of npm commands that work on the package in the current directory when invoked without arguments. ([`bf9048e`](https://github.com/npm/npm/commit/bf9048e2fa16d43fbc4b328d162b0a194ca484e8), [@robertkowalski](https://github.com/robertkowalski); [`07600d0`](https://github.com/npm/npm/commit/07600d006c652507cb04ac0dae9780e35073dd67), [@wilmoore](https://github.com/wilmoore)). There are a few other commands we still want to implement this for; see [#4204](https://github.com/npm/npm/issues/4204). * Pass through the `GIT_SSL_NO_VERIFY` environment variable to Git, if it is set; we currently do this with a few other environment variables, but we missed that one. ([`c625de9`](https://github.com/npm/npm/commit/c625de91770df24c189c77d2e4bc821f2265efa8), [@arikon](https://github.com/arikon)) * Fixed `npm dedupe` on Windows due to incorrect path separators being used ([`7677de4`](https://github.com/npm/npm/commit/7677de4583100bc39407093ecc6bc13715bf8161), [@mcolyer](https://github.com/mcolyer)). * Fixed the `npm help` command when multiple words were searched for; it previously gave a `ReferenceError`. ([`6a28dd1`](https://github.com/npm/npm/commit/6a28dd147c6957a93db12b1081c6e0da44fe5e3c), [@dereckson](https://github.com/dereckson)) * Stopped re-downloading URL and shrinkwrap dependencies, as demonstrated in [#3463](https://github.com/npm/npm/issues/3463) ([`644c2ff`](https://github.com/isaacs/npm/commit/644c2ff3e3d9c93764f7045762477f48864d64a7), [@spmason](https://github.com/spmason)). You can use the `--force` option to force re-download and installation of all dependencies. npm_3.5.2.orig/changelogs/CHANGELOG-2.md0000644000000000000000000067573112631326456015577 0ustar 00000000000000### v2.14.14 (2015-12-03): #### FIX URL IN LICENSE The license incorrectly identified the registry URL as `registry.npmjs.com` and this has been corrected to `registry.npmjs.org`. * [`6051a69`](https://github.com/npm/npm/commit/6051a69b1adc80f5f200077067e831643f655bd4) [#10685](https://github.com/npm/npm/pull/10685) Fix npm public registry URL in notices. ([@kemitchell](https://github.com/kemitchell)) #### NO MORE MD5 We updated modules that had been using MD5 for non-security purposes. 
While this is perfectly safe, if you compile Node in FIPS-compliance mode it will explode if you try to use MD5. We've replaced MD5 with Murmur, which conveys our intent better and is faster to boot. * [`30b5994`](https://github.com/npm/npm/commit/30b599496a9762482e1cef945a378e3a534fd366) [#10629](https://github.com/npm/npm/issues/10629) `write-file-atomic@1.1.4` ([@othiym23](https://github.com/othiym23)) * [`68c63ff`](https://github.com/npm/npm/commit/68c63ff1279d3d5ea7b2c970ab5562a8e0536f27) [#10629](https://github.com/npm/npm/issues/10629) `fs-write-stream-atomic@1.0.5` ([@othiym23](https://github.com/othiym23)) #### DEPENDENCY UPDATES * [`e48e5a9`](https://github.com/npm/npm/commit/e48e5a90b4dcf76124b7e9ea3b295c1383e7f0c8) [nodejs/node-gyp#831](https://github.com/nodejs/node-gyp/pull/831) `node-gyp@3.2.1`: Improved \*BSD support. ([@bnoordhuis](https://github.com/bnoordhuis)) ### v2.14.13 (2015-11-25): #### THE npm CLI !== THE npm REGISTRY !== npm, INC. npm-the-CLI is licensed under the terms of the [Artistic License 2.0](https://github.com/npm/npm/blob/8d79c1a39dae908f27eaa37ff6b23515d505ef29/LICENSE), which is a liberal open-source license that allows you to take this code and do pretty much whatever you like with it (that is, of course, not legal language, and if you're doing anything with npm that leaves you in doubt about your legal rights, please seek the review of qualified counsel, which is to say, not members of the CLI team, none of whom have passed the bar, to my knowledge). At the same time the primary registry the CLI uses when looking up and downloading packages is a commercial service run by npm, Inc., and it has its own [Terms of Use](https://www.npmjs.com/policies/terms). Aside from clarifying the terms of use (and trying to make sure they're more widely known), the only recent changes to npm's licenses have been making the split between the CLI and registry clearer. You are still free to do whatever you like with the CLI's source, and you are free to view, download, and publish packages to and from `registry.npmjs.org`, but now the existing terms under which you can do so are more clearly documented. Aside from the two commits below, see also [the release notes for `npm@2.14.11`](https://github.com/npm/npm/releases/tag/v2.14.11), which is where the split between the CLI's code and the terms of use for the registry was first made more clear. * [`1f3e936`](https://github.com/npm/npm/commit/1f3e936aab6840667948ef281e0c3621df365131) [#10532](https://github.com/npm/npm/issues/10532) Clarify that `registry.npmjs.org` is the default, but that you're free to use the npm CLI with whatever registry you wish. ([@kemitchell](https://github.com/kemitchell)) * [`6733539`](https://github.com/npm/npm/commit/6733539eeb9b32a5f2d1a6aa797987e2252fa760) [#10532](https://github.com/npm/npm/issues/10532) Having semi-duplicate release information in `README.md` was confusing and potentially inaccurate, so remove it. ([@kemitchell](https://github.com/kemitchell)) #### EASE UP ON WINDOWS BASH USERS It turns out that a fair number of us use bash on Windows (through MINGW or bundled with Git, plz – Cygwin is still a bridge too far, for both npm and Node.js). [@jakub-g](https://github.com/jakub-g) did us all a favor and relaxed the check for npm completion to support MINGW bash. Thanks, Jakub! 

* [`460cc09`](https://github.com/npm/npm/commit/460cc0950fd6a005c4e5c4f85af807814209b2bb) [#10156](https://github.com/npm/npm/issues/10156) completion: enable on Windows in git bash ([@jakub-g](https://github.com/jakub-g))

#### MAKE NODE-GYP A LITTLE BLUER

* [`333e118`](https://github.com/npm/npm/commit/333e1181082842c21edc62f0ce515928424dff1f) `node-gyp@3.2.0`: Support AIX, use `which` to find Python, updated to a newer version of `gyp`, and more! ([@bnoordhuis](https://github.com/bnoordhuis))

#### WE LIKE SPDX AND ALL BUT IT'S NOT ACTUALLY A DIRECT DEP, SORRY

* [`1f4b4bb`](https://github.com/npm/npm/commit/1f4b4bbdf8758281beecb7eaf75d05a6c4a77c15) Removed `spdx` as a direct npm dependency, since we don't actually need it at that level, and updated subdeps for `validate-npm-package-license` ([@othiym23](https://github.com/othiym23))

#### A BOUNTEOUS THANKSGIVING CORNUCOPIA OF DOC TWEAKS

These are great! Keep them coming! Sorry for letting them pile up so deep, everybody. Also, a belated Thanksgiving to our Canadian friends, and a happy Thanksgiving to all our friends in the USA.

* [`6101f44`](https://github.com/npm/npm/commit/6101f44737645d9379c3396fae81bbc4d94e1f7e) [#10250](https://github.com/npm/npm/issues/10250) Correct order of `org:team` in `npm team` documentation. ([@louislarry](https://github.com/louislarry))
* [`e8769f9`](https://github.com/npm/npm/commit/e8769f9807b91582c15ef130733e2e72b6c7bda4) [#10371](https://github.com/npm/npm/issues/10371) Remove broken / duplicate link to tag. ([@WickyNilliams](https://github.com/WickyNilliams))
* [`1ae2dbe`](https://github.com/npm/npm/commit/1ae2dbe759feb80d8634569221ec6ee2c6d1d1ff) [#10419](https://github.com/npm/npm/issues/10419) Remove references to nonexistent `npm-rm(1)` documentation. ([@KenanY](https://github.com/KenanY))
* [`777a271`](https://github.com/npm/npm/commit/777a271830a42d4ee62540a89f764a6e7d62de19) [#10474](https://github.com/npm/npm/issues/10474) Clarify that install finds dependencies in `package.json`. ([@sleekweasel](https://github.com/sleekweasel))
* [`dcf4b5c`](https://github.com/npm/npm/commit/dcf4b5cbece1b0ef55ab7665d9acacc0b6b7cd6e) [#10497](https://github.com/npm/npm/issues/10497) Clarify what a package is slightly. ([@aredridel](https://github.com/aredridel))
* [`447b3d6`](https://github.com/npm/npm/commit/447b3d669b2b6c483b8203754ac0a002c67bf015) [#10539](https://github.com/npm/npm/issues/10539) Remove an extra, spuriously capitalized letter. ([@alexlukin-softgrad](https://github.com/alexlukin-softgrad))

### v2.14.12 (2015-11-19):

#### TEEN ORCS AT THE GATES

This week heralds the general release of the primary npm registry's [new support for private packages for organizations](http://blog.npmjs.org/post/133542170540/private-packages-for-organizations). For many potential users, it's the missing piece needed to make it easy for you to move your organization's private work onto npm. And now it's here! The functionality to support it has been in place in the CLI for a while now, thanks to [@zkat](https://github.com/zkat)'s hard work.

During our final testing before the release, our ace support team member [@snopeks](https://github.com/snopeks) noticed that there had been some drift between the CLI team's implementation and what npm was actually preparing to ship. In the interests of everyone having a smooth experience with this _extremely useful_ new feature, we quickly made a few changes to square up the CLI and the web site experiences.
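
If you're trying the feature out, this is the shape of the CLI side (a sketch; the org, team, and package names here are all made up):

```sh
# grant a team in your org read-write access to a package...
npm access grant read-write myorg:developers my-package

# ...then list the packages that team can see
npm access ls-packages myorg:developers
```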

* [`0e8b15e`](https://github.com/npm/npm/commit/0e8b15e9fbc89e31bd00e573b648846beddfb835) [#9327](https://github.com/npm/npm/issues/9327) `npm access` no longer has problems when run in a directory that doesn't contain a `package.json`. ([@othiym23](https://github.com/othiym23))
* [`c4e939c`](https://github.com/npm/npm/commit/c4e939c1d493601d25dcb88e6ffcca73076fd3fd) [npm/npm-registry-client#126](https://github.com/npm/npm-registry-client/issues/126) `npm-registry-client@7.0.8`: Allow the CLI to grant, revoke, and list permissions on unscoped (public) packages on the primary registry. ([@othiym23](https://github.com/othiym23))

#### A BRIEF NOTE ON NPM'S BACKWARDS COMPATIBILITY

We don't often have much to say about the changes we make to our internal testing and tooling, but I'm going to take this opportunity to reiterate that npm tries hard to maintain compatibility with a wide variety of Node versions. As this change shows, we want to ensure that npm works the same across:

* Node.js 0.8
* Node.js 0.10
* Node.js 0.12
* the latest io.js release
* Node.js 4 LTS
* Node.js 5

Contributors who send us pull requests often notice that it's very rare that our tests pass across all of those versions (ironically, almost entirely due to the packages we use for testing instead of any issues within npm itself). We're currently beginning an effort, lasting the rest of 2015, to clean up our test suite, and not only get it passing on all of the above versions of Node.js, but working solidly on Windows as well. This is a compounding form of technical debt that we're finally paying down, and our hope is that cleaning up the tests will produce a more robust CLI that's a lot easier to write patches for.

* [`d743620`](https://github.com/npm/npm/commit/d743620a0005213a65d25de771661b4d48a09717) [#10233](https://github.com/npm/npm/issues/10233) Update Node.js versions that Travis uses to test npm. ([@iarna](https://github.com/iarna))

#### TYPOS IN THE LICENSE, OH MY

* [`58ac241`](https://github.com/npm/npm/commit/58ac241f556b2c202a8ee33321965e2540361ca7) [#10478](https://github.com/npm/npm/issues/10478) Correct two typos in npm's LICENSE. ([@jorrit](https://github.com/jorrit))

### v2.14.11 (2015-11-12):

#### ASK FOR NOTHING, GET LATEST

When you run `npm install foo`, you probably expect that you'll get the `latest` version of `foo`, whatever that is. And good news! That's what this change makes it do.

We _think_ this is what everyone wants, but if this causes problems for you, we want to know! If it proves problematic for people we will consider reverting it (preferably before this becomes `npm@latest`).

Previously, when you ran `npm install foo` we would act as if you typed `npm install foo@*`. Now, like any range-type specifier, in addition to matching the range, it would also have to be `<=` the value of the `latest` dist-tag. Further, it would exclude prerelease versions from the list of versions considered for a match.

This worked as expected most of the time, unless your `latest` was a prerelease version, in which case that version wouldn't be used, to everyone's surprise.

* [`6f0a646`](https://github.com/npm/npm/commit/6f0a646cd865b24fe3ff25365bf5421780e63e01) [#10189](https://github.com/npm/npm/issues/10189) `npm-package-arg@4.1.0`: Change the default version from `*` to `latest`. ([@zkat](https://github.com/zkat))
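
Concretely, assuming a hypothetical package `foo` whose `latest` dist-tag points at `1.2.3` while a `2.0.0-beta.1` prerelease is also published:

```sh
npm install foo                # now: installs whatever `latest` points at (1.2.3)
npm install foo@*              # old default: any version <= latest, prereleases excluded
npm install foo@2.0.0-beta.1   # prereleases still have to be requested explicitly
```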

#### LICENSE CLARIFICATION

* [`54a9046`](https://github.com/npm/npm/commit/54a90461f068ea89baa5d70248cdf1581897936d) [#10326](https://github.com/npm/npm/issues/10326) Clarify what-all is covered by npm's license and point to the registry's terms of use. ([@kemitchell](https://github.com/kemitchell))

#### CLOSER TO GREEN TRAVIS

* [`28efd3d`](https://github.com/npm/npm/commit/28efd3d7dfb2fa3755076ae706ea4d38c6ee6900) [#10232](https://github.com/npm/npm/issues/10232) `nock@1.9.0`: Downgrade nock to a version that doesn't depend on streams2 in core so that more of our tests can pass in 0.8. ([@iarna](https://github.com/iarna))

#### A BUG FIX

* [`eacac8f`](https://github.com/npm/npm/commit/eacac8f05014d15217c3d8264d0b00a72eafe2d2) [#9965](https://github.com/npm/npm/issues/9965) Fix a corrupt `package.json` file introduced by a merge conflict in [`022691a`](https://github.com/npm/npm/commit/022691a). ([@waynebloss](https://github.com/waynebloss))

#### A DEPENDENCY UPGRADE

* [`ea7d8e0`](https://github.com/npm/npm/commit/ea7d8e00a67a3d5877ed72c9728909c848468a9b) [npm/nopt#51](https://github.com/npm/nopt/pull/51) `nopt@3.0.6`: Allow types checked to be validated by passed-in name in addition to the JS name of the type / class. ([@wbecker](https://github.com/wbecker))

### v2.14.10 (2015-11-05):

There's nothing in here that isn't in the `npm@3.4.0` release notes, but all of the commit shasums have been adjusted to be correct. Enjoy!

#### BUG FIXES VIA DEPENDENCY UPDATES

* [`204c558`](https://github.com/npm/npm/commit/204c558c06637a753c0b41d0cf19f564a1ac3715) [#8640](https://github.com/npm/npm/issues/8640) [npm/normalize-package-data#69](https://github.com/npm/normalize-package-data/pull/69) `normalize-package-data@2.3.5`: Fix a bug where if you didn't specify the name of a scoped module's binary, it would install it such that it was impossible to call it. ([@iarna](https://github.com/iarna))
* [`bbdf4ee`](https://github.com/npm/npm/commit/bbdf4ee0a3cd12be6a2ace255b67d573a72f1f8f) [npm/fstream-npm#14](https://github.com/npm/fstream-npm/pull/14) `fstream-npm@1.0.7`: Only filter `config.gypi` when it's in the build directory. ([@mscdex](https://github.com/mscdex))
* [`d82ff81`](https://github.com/npm/npm/commit/d82ff81403e906931fac701775723626dcb443b3) [npm/fstream-npm#15](https://github.com/npm/fstream-npm/pull/15) `fstream-npm@1.0.6`: Stop including directories that happened to have names matching whitelisted npm files in npm module tarballs. The most common cause was that if you had a README directory then everything in it would be included whether you wanted it or not. ([@taion](https://github.com/taion))

#### DOCUMENTATION FIXES

* [`16361d1`](https://github.com/npm/npm/commit/16361d122f2ff6d1a4729c66153b7c24c698fd19) [#10036](https://github.com/npm/npm/pull/10036) Fix typo / over-abbreviation. ([@ifdattic](https://github.com/ifdattic))
* [`d1343dd`](https://github.com/npm/npm/commit/d1343dda42f113dc322f95687f5a8c7d71a97c35) [#10176](https://github.com/npm/npm/pull/10176) Fix broken link, scopes => scope. ([@ashleygwilliams](https://github.com/ashleygwilliams))
* [`110663d`](https://github.com/npm/npm/commit/110663d000a3908a4853393d9abae481700cf4dc) [#9460](https://github.com/npm/npm/issues/9460) Specifying the default command run by "npm start" and the fact that you can pass it arguments (sketched below). ([@JuanCaicedo](https://github.com/JuanCaicedo))
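
Both halves of that last doc fix look like this in practice (a sketch; `--port` belongs to the hypothetical server script, not to npm):

```sh
# with no "start" script in package.json, `npm start` runs `node server.js`
npm start

# arguments after -- are passed through to the start script
npm start -- --port 8080
```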

#### DEPENDENCY UPDATES FOR THEIR OWN SAKE

* [`7476d2d`](https://github.com/npm/npm/commit/7476d2d31552a41671c425aa7fcc2844e0381008) [npm/npmlog#19](https://github.com/npm/npmlog/pull/19) `npmlog@2.0.0`: Make it possible to emit log messages with `error` as the prefix. ([@bengl](https://github.com/bengl))
* [`6ca7888`](https://github.com/npm/npm/commit/6ca7888862cfe8bf802dc7c66632c102acd94cf5) `read-package-json@2.0.2`: Minor cleanups. ([@KenanY](https://github.com/KenanY))

### v2.14.9 (2015-10-29):

There's still life in `npm@2`, but for now, enjoy these dependency upgrades! Also, [@othiym23](https://github.com/othiym23) says hi! _waves_ [@zkat](https://github.com/zkat) has her hands full, and [@iarna](https://github.com/iarna)'s handling `npm@3`, so I'm dealing with `npm@2` and the totally nonexistent weird bridge `npm@1.4` LTS release that may or may not be happening this week.

#### CAN'T STOP WON'T STOP UPDATING THOSE DEPENDENCIES

* [`f52f0cb`](https://github.com/npm/npm/commit/f52f0cb51526314197e9d67619feebbd82a397b7) [#10150](https://github.com/npm/npm/issues/10150) `chmodr@1.0.2`: Use `fs.lstat()` to check if an entry is a directory, making `chmodr()` work properly with NFS mounts on Windows. ([@sheerun](https://github.com/sheerun))
* [`f7011d7`](https://github.com/npm/npm/commit/f7011d7b3b1d9148a6cd8f7b8359d6fe3269a912) [#10150](https://github.com/npm/npm/issues/10150) `which@1.2.0`: Additional command-line parameters, which is nice but not used by npm. ([@isaacs](https://github.com/isaacs))
* [`ebcc0d8`](https://github.com/npm/npm/commit/ebcc0d8629388da0b849bbbad590382cd7268f51) [#10150](https://github.com/npm/npm/issues/10150) `minimatch@3.0.0`: Don't package browser version. ([@isaacs](https://github.com/isaacs))
* [`8c98dce`](https://github.com/npm/npm/commit/8c98dce5ffe242bafbe92b849e73e8de1803e256) [#10150](https://github.com/npm/npm/issues/10150) `fstream-ignore@1.0.3`: Upgrade to use `minimatch@3` (for deduping purposes). ([@othiym23](https://github.com/othiym23))
* [`db9ef33`](https://github.com/npm/npm/commit/db9ef337c253ecf21c921055bf8742e10d1cb3bb) [#10150](https://github.com/npm/npm/issues/10150) `request@2.65.0`: Dependency upgrades and a few bug fixes, mostly related to cookie handling. ([@simov](https://github.com/simov))

#### DEVDEPENDENCIES TOO, I GUESS, IT'S COOL

* [`dfbf621`](https://github.com/npm/npm/commit/dfbf621afa09c46991249b4f9a995d1823ea7ede) [#10150](https://github.com/npm/npm/issues/10150) `tap@2.2.0`: Better handling of test ordering (including some test fixes for npm). ([@isaacs](https://github.com/isaacs))
* [`cf5ad5a`](https://github.com/npm/npm/commit/cf5ad5a8c88bfd72e30ef8a8d1d3c5508e0b3c23) [#10150](https://github.com/npm/npm/issues/10150) `nock@2.16.0`: More expectations, documentation, and bug fixes. ([@pgte](https://github.com/pgte))

### v2.14.8 (2015-10-08):

#### SLOWLY RECOVERING FROM FEELINGS

OS&F is definitely my favorite convention I've gone to. Y'all should check it out next year! Rebecca and Kat are back, although Forrest is out at [&yet conf](http://andyetconf.com/).

This week sees another tiny LTS release with non-code-related patches -- just CI/release things.

Meanwhile, have you heard? `npm@3` is much faster now! Go upgrade with `npm install -g npm@latest` and give it a whirl if you haven't already!

#### IF YOU CHANGE CASING ON A FILE, YOU ARE NOT MY FRIEND

Seriously.
I love me some case-sensitive filesystems, but a lot of us have to deal with `git` and its funky support for case normalizing systems. Have mercy and just don't bother if all you're changing is casing, please? Otherwise, I have to do this little dance to prevent horrible conflicts.

* [`c3a7b61`](https://github.com/npm/npm/commit/c3a7b619786650a45653c8b55b8741fc7bb5cfda) [#9804](https://github.com/npm/npm/pull/9804) Remove the readme file with weird casing. ([@zkat](https://github.com/zkat))
* [`f3f619e`](https://github.com/npm/npm/commit/f3f619e06e4be1378dbf286f897b50e9c69c9557) [#9804](https://github.com/npm/npm/pull/9804) Add the readme file back in, with desired casing. ([@zkat](https://github.com/zkat))

#### IDK. OUR CI DOESN'T EVEN FULLY WORK YET BUT SURE

Either way, it's nice to make sure we're running stuff on the latest Node. `4.2` is getting released very soon, though (this week?), and that'll be the first official LTS release!

* [`bd0b9ab`](https://github.com/npm/npm/commit/bd0b9ab6e60a31448794bbd88f94672572c3cb55) [#9827](https://github.com/npm/npm/pull/9827) Add node `4.0` and `4.1` to TravisCI ([@JaKXz](https://github.com/JaKXz))

### v2.14.7 (2015-10-01):

#### MORE RELEASE STAGGERING?!

Hi all, and greetings from [Open Source & Feelings](http://osfeels.com)!

So we're switching gears a little with how we handle our weekly releases: from now on, we're going to stagger release weeks between dependency bumps and regular patches. So, this week, aside from a doc change, we'll be doing only version bumps. Expect actual patches next week!

#### TOTALLY FOLLOWING THE RULES ALREADY

So I snuck this in, because it's our own [@snopeks](https://github.com/snopeks)' first contribution to the main `npm` repo. She's been helping with building support documents for Orgs, and contributed her general intro guide to the new feature so you can read it with `npm help orgs` right in your terminal!

* [`8324ea0`](https://github.com/npm/npm/commit/8324ea023ace4e08b6b8959ad199e2457af9f9cf) [#9761](https://github.com/npm/npm/pull/9761) Added general user guide for Orgs. ([@snopeks](https://github.com/snopeks))

#### JUST. ONE. MORE.

* [`9a502ca`](https://github.com/npm/npm/commit/9a502ca96e2d43ec75a8f684c9ca33af7e910f0a) Use unique package name in tests to work around weird test-state-based failures. ([@iarna](https://github.com/iarna))

#### OKAY ACTUALLY THE THING I WAS SUPPOSED TO DO

Anyway -- here's your version bump! :)

* [`4aeb94c`](https://github.com/npm/npm/commit/4aeb94c9f0df3f41802cf2e0397a998f3b527c25) `request@2.64.0`: No longer defaulting to `application/json` for `json` requests. Also some minor doc and packaging patches. ([@simov](https://github.com/simov)) `minimatch@3.0.0`: No longer packaging browser modules. ([@isaacs](https://github.com/isaacs))
* [`a18b213`](https://github.com/npm/npm/commit/a18b213e6945a8f5faf882927829ac95f844e2aa) `glob@5.0.15`: Upgraded `minimatch` dependency. ([@isaacs](https://github.com/isaacs))
* [`9eb64d4`](https://github.com/npm/npm/commit/9eb64e44509519ca9d788502edb2eba4cea5c86b) `nock@2.13.0` ([@pgte](https://github.com/pgte))

### v2.14.6 (2015-09-24):

#### `¯\_(ツ)_/¯`

Since `2.x` is LTS now, you can expect a slowdown in overall release sizes. On top of that, we had our all-company-npm-internal-conf thing on Monday and Tuesday so there wasn't really time to do much at all.

Still, we're bringing you a couple of tiny little changes this week!

* [`7b7da13`](https://github.com/npm/npm/commit/7b7da13c6cdf5eae53c20d5c69afc4c16e6f715d) [#9471](https://github.com/npm/npm/pull/9471) When the port for a tarball is different than the registry it's in, but the hostname is the same, the protocol is now allowed to change, too. ([@fastest963](https://github.com/fastest963))
* [`6643ada`](https://github.com/npm/npm/commit/6643adaf9f37f08893e3ad28b797c55a36b2a152) `request@2.63.0`: Use `application/json` as the default content type when making `json` requests. ([@simov](https://github.com/simov))

### v2.14.5 (2015-09-17):

#### NPM IS DEAD. LONG LIVE NPM

That's right folks. As of this week, `npm@next` is `npm@3`, which means it'll be `npm@latest` next week! There's some really great shiny new things over there, and you should really take a look.

Many kudos to [@iarna](https://github.com/iarna) for her hard work on `npm@3`!

Don't worry, we'll keep `2.x` around for a while (as LTS), but you won't see many, if any, new features on this end. From now on, we're going to use `latest-2` and `next-2` as the dist tags for the `npm@2` branch.

#### OKAY THAT'S FINE CAN I DEPRECATE THINGS NOW?

Yes! Especially if you're using scoped packages. Apparently, deprecating them never worked, but that should be better now. :)

* [`eca7b24`](https://github.com/npm/npm/commit/eca7b24de9a0090da02a93a69726f5e70ab80543) [#9558](https://github.com/npm/npm/issues/9558) Add tests for npm deprecate. ([@zkat](https://github.com/zkat))
* [`648fe16`](https://github.com/npm/npm/commit/648fe16157ef0db22395ae056d1dd4b4c1605bf4) [#9558](https://github.com/npm/npm/issues/9558) `npm-registry-client@7.0.7`: Fixes `npm deprecate` so you can actually deprecate scoped modules now (it never worked). ([@zkat](https://github.com/zkat))
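
For reference, deprecation itself is a one-liner (sketched with a hypothetical scoped package; the semver range controls which versions get the warning):

```sh
# warn anybody who installs a pre-2.0.0 version of a scoped package
npm deprecate @myorg/my-package@"<2.0.0" "critical bug fixed in 2.0.0"
```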

#### WTF IS `node-waf`

idk. Some old thing. We don't talk about it anymore.

* [`cf1b39f`](https://github.com/npm/npm/commit/cf1b39fc95a9ffad7fba4c2fee705c53b19d1d16) [#9584](https://github.com/npm/npm/issues/9584) Fix ancient references to `node-waf` in the docs to refer to the `node-gyp` version of things. ([@KenanY](https://github.com/KenanY))

#### THE `graceful-fs` AND `node-gyp` SAGA CONTINUES

Last week had some sweeping `graceful-fs` upgrades, and this takes care of one of the stragglers, as well as bumping `node-gyp`. `node@4` users might be excited about this, or even `node@<4` users who previously had to cherry-pick a bunch of patches to get the latest npm working.

* [`e07354f`](https://github.com/npm/npm/commit/e07354f3ff3a6be568fe950f1f825897f72912d8) `sha@2.0.1`: Upgraded graceful-fs! ([@ForbesLindesay](https://github.com/ForbesLindesay))
* [`83cb6ee`](https://github.com/npm/npm/commit/83cb6ee4045b85e565e9678ca1878877e1dc75bd) `node-gyp@3.0.3` ([@rvagg](https://github.com/rvagg))

#### DEPS! DEPS! MORE DEPS! OK STOP DEPS

* [`0d60888`](https://github.com/npm/npm/commit/0d608889615a1cb63f5f852337e955053f201aeb) `normalize-package-data@2.3.4`: Use an external package to check for built-in node modules. ([@sindresorhus](https://github.com/sindresorhus))
* [`79b4dac`](https://github.com/npm/npm/commit/79b4dac11f1c2d8ad5489fc3104734e1c10d4793) `retry@0.8.0` ([@tim-kos](https://github.com/tim-kos))
* [`c164941`](https://github.com/npm/npm/commit/c164941d3c792904d5b126a4fd36eefbe0699f52) `request@2.62.0`: node 4 added to build targets. Option initialization issues fixed. ([@simov](https://github.com/simov))
* [`0fd878a`](https://github.com/npm/npm/commit/0fd878a44d5ae303325808d1f00df4dce7549d50) `lru-cache@2.7.0`: Cache serialization support and fixes a cache length bug. ([@isaacs](https://github.com/isaacs))
* [`6a7a114`](https://github.com/npm/npm/commit/6a7a114a45b4699995d6e09164fdfd0fa1274591) `nock@2.12.0` ([@pgte](https://github.com/pgte))
* [`6b25e6d`](https://github.com/npm/npm/commit/6b25e6d2235c11f4444104db4545cb42a0267666) `semver@5.0.3`: Removed uglify-js dead code. ([@isaacs](https://github.com/isaacs))

### v2.14.4 (2015-09-10):

#### THE GREAT NODEv4 SAGA

So [Node 4 is out now](https://nodejs.org/en/blog/release/v4.0.0/) and that's going to involve a number of things over in npm land. Most importantly, it's the last major release that will include the `2.x` branch of npm. That also means that `2.x` is going to go into LTS mode in the coming weeks -- once `npm@3` becomes our official `latest` release. You can most likely expect Node 5 to include `npm@3` by default, whenever that happens. We'll go into more detail about LTS at that point, as well, so keep your eyes peeled for announcements!

#### NODE IS DEAD. LONG LIVE NODE!

Node 4 being released means that a few things that used to be floating patches are finally making it right into npm proper. This week, we've got two such updates, both to dependencies:

* [`505d9e4`](https://github.com/npm/npm/commit/505d9e40c13b8b0bb3f70ee9886f7b73ba569407) `node-gyp@3.0.1`: Support for node nightlies and compilation for both node and io.js without extra patching ([@rvagg](https://github.com/rvagg))

[@thefourtheye](https://github.com/thefourtheye) was kind enough to submit a *bunch* of PRs to npm's dependencies updating them to `graceful-fs@4.1.2`, which mainly makes it so we're no longer monkey-patching `fs`. The following are all updates related to this:

* [`10cb189`](https://github.com/npm/npm/commit/10cb189c773fef804214018d57509cc7a943184b) `write-file-atomic@1.1.3` ([@thefourtheye](https://github.com/thefourtheye))
* [`edfb80b`](https://github.com/npm/npm/commit/edfb80b39f8cfce9a993f139eb98248001198e09) `tar@2.2.1` ([@thefourtheye](https://github.com/thefourtheye))
* [`aa6e1ee`](https://github.com/npm/npm/commit/aa6e1eede7d71fa69d7256afdfbaa3406bc39a5b) `read-package-json@2.0.1` ([@thefourtheye](https://github.com/thefourtheye))
* [`18971a3`](https://github.com/npm/npm/commit/18971a361635ed3958ecd39b63930ae1e56f8612) `read-installed@4.0.3` ([@thefourtheye](https://github.com/thefourtheye))
* [`a4cba71`](https://github.com/npm/npm/commit/a4cba71bd2532236fda7385bf55e8790cafd4f0a) `fstream@1.0.8` ([@thefourtheye](https://github.com/thefourtheye))
* [`70a38e2`](https://github.com/npm/npm/commit/70a38e29418951ac61ab6cf269d188074fe8ac3a) `fs-write-stream-atomic@1.0.4` ([@thefourtheye](https://github.com/thefourtheye))
* [`9cbd20f`](https://github.com/npm/npm/commit/9cbd20f691e37960e4ba12d401abd1069657cb47) `fs-vacuum@1.2.7` ([@thefourtheye](https://github.com/thefourtheye))

#### OTHER PATCHES

* [`c4dd521`](https://github.com/npm/npm/commit/c4dd5213b2f3283ea0392845e5f78cac4573529e) [#9506](https://github.com/npm/npm/issues/9506) Make `npm link` work on Windows when using node pre-release/RC releases. ([@jon-hall](https://github.com/jon-hall))
* [`b6bc29c`](https://github.com/npm/npm/commit/b6bc29c1401b3d6b570c09cbef1866bdb0436b59) [#9544](https://github.com/npm/npm/issues/9549) `process.binding` is being deprecated, so our only direct usage has been removed. ([@ChALkeR](https://github.com/ChALkeR))

#### MORE DEPENDENCIES!

* [`d940594`](https://github.com/npm/npm/commit/d940594e479a7f012b6dd6952e8ef985ba2a6216) `tap@1.4.1` ([@isaacs](https://github.com/isaacs))
* [`ee38486`](https://github.com/npm/npm/commit/ee3848669331fd98879a3175789d963543f67ce3) `which@1.1.2`: Added tests for Windows-related dead code that was previously helping a silent failure happen. Travis stuff, too. ([@isaacs](https://github.com/isaacs))

#### DOC UPDATES

* [`475daf5`](https://github.com/npm/npm/commit/475daf54ad07777938d1d7ee1a3e576961e84510) [#9492](https://github.com/npm/npm/issues/9492) Clarify how `.npmignore` and `.gitignore` are found and used by npm. ([@addaleax](https://github.com/addaleax))
* [`b2c391d`](https://github.com/npm/npm/commit/b2c391d7833249626a6d7650363a83bcc778717a) `nopt@3.0.4`: Minor clarifications to docs about how array and errors work. ([@zkat](https://github.com/zkat))

### v2.14.3 (2015-09-03):

#### TEAMS AND ORGS STILL BETA. CLI CODE STILL SOLID.

Our closed beta for Teens and Orcs is happening! The web team is hard at work making sure everything looks pretty and usable and such. Once we fix things stemming from that beta, you can expect the feature to be available publicly. Some time after that, it'll even be available for free for FOSS orgs. It'll Be Done When It's Done™.

#### OH GOOD, I CAN ACTUALLY UPSTREAM NOW

Looks like last week's release foiled our own test suite when trying to upstream it to Node! Just a friendly reminder that no, `.npmrc` is no longer included when you pack/release a package! [@othiym23](https://github.com/othiym23) and [@isaacs](https://github.com/isaacs) managed to suss the really strange test failures resulting from that, and we've patched it in this release.

* [`01a3428`](https://github.com/npm/npm/commit/01a3428534b754dca89a56fd1e49f55cb22f6f25) [#9476](https://github.com/npm/npm/issues/9476) test: Recreate `.npmrc` files when missing so downstream packagers can run tests on packed npm. ([@othiym23](https://github.com/othiym23))

#### TALKING ABOUT THE CHANGELOG IN THE CHANGELOG IS LIKE, POMO OR SOMETHING

* [`c1e7a83`](https://github.com/npm/npm/commit/c1e7a83c0ae7aadf01aecc57cf8a0ae2009d4da8) [#9431](https://github.com/npm/npm/issues/9431) CHANGELOG: clarify windows-related nature of patch ([@saper](https://github.com/saper))

#### devDependencies UPDATED

No actual dep updates this week, but we're bumping a couple of devDeps:

* [`8454835`](https://github.com/npm/npm/commit/84548351bfd63e3e305d195abbcad24c6b7c3e8e) `tap@1.4.0`: Add `t.contains()` as alias to `t.match()` ([@isaacs](https://github.com/isaacs))
* [`13d2216`](https://github.com/npm/npm/commit/13d22161bcdeb6e1ed095d5ba2f77e6abfffa5eb) `deep-equal@1.0.1`: Make `null == undefined` in non-strict mode ([@isaacs](https://github.com/isaacs))

### v2.14.2 (2015-08-27):

#### GETTING THAT PESKY `preferGlobal` WARNING RIGHT

So apparently the `preferGlobal` option hasn't quite been warning correctly for some time. But now it should be all better! tl;dr: if you try and install a dependency with `preferGlobal: true`, and it's _not already_ in your `package.json`, you'll get a warning that the author would really rather you install it with `--global`. This should prevent Windows PowerShell from thinking npm has failed just because of a benign warning.
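
For package authors wondering where that warning comes from, it's driven by a single field in `package.json`. A sketch with a hypothetical CLI tool (only the `preferGlobal` line matters here):

```sh
# the relevant bits of the hypothetical tool's package.json:
#   {
#     "name": "example-cli",
#     "preferGlobal": true,
#     "bin": { "example-cli": "./cli.js" }
#   }

npm install example-cli      # warns that the author prefers a global install
npm install -g example-cli   # installs without the warning
```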

* [`bbb25f3`](https://github.com/npm/npm/commit/bbb25f30d582f8979168c79233a9f8f840974f90) [#8841](https://github.com/npm/npm/issues/8841) [#9409](https://github.com/npm/npm/issues/9409) The `preferGlobal` warning shouldn't happen if the dependency being installed is listed in `devDependencies`. ([@saper](https://github.com/saper))
* [`222fcec`](https://github.com/npm/npm/commit/222fcec85ccd30d35899e5037079fb14625af4e2) [#9409](https://github.com/npm/npm/issues/9409) `preferGlobal` now prints a warning when there are no dependencies for the current package. ([@zkat](https://github.com/zkat))
* [`5cfed6d`](https://github.com/npm/npm/commit/5cfed6d7a1a5f2731688cfc8293b5e43a6355393) [#9409](https://github.com/npm/npm/issues/9409) Verify that `preferGlobal` is warning as expected (when a `preferGlobal` dependency is installed, but isn't listed in either `dependencies` or `devDependencies`). ([@zkat](https://github.com/zkat))

#### BUMP +1

* [`eeafce2`](https://github.com/npm/npm/commit/eeafce2d06883c0f51bf403415b6bc5f2647eba3) `validate-npm-package-license@3.0.1`: Include additional metadata in parsed license object, useful for license checkers. ([@kemitchell](https://github.com/kemitchell))
* [`1502a28`](https://github.com/npm/npm/commit/1502a285f84aa548806b3eafc8889e6288e810f3) `normalise-package-data@2.3.2`: Updated to use `validate-npm-package-license@3.0.1`. ([@othiym23](https://github.com/othiym23))
* [`cbde823`](https://github.com/npm/npm/commit/cbde8233436bf0ea62a4740869b4990322c20659) `init-package-json@1.9.1`: Add a `silent` option to suppress output on writing the generated `package.json`. Also, updated to use `validate-npm-package-license@3.0.1`. ([@zkat](https://github.com/zkat))
* [`08fda46`](https://github.com/npm/npm/commit/08fda465452b4d77f1ced8050ee3a35a77fc30a5) `tar@2.2.0`: Minor improvements. ([@othiym23](https://github.com/othiym23))
* [`dc2f20b`](https://github.com/npm/npm/commit/dc2f20b53fff77203139c863b48da0e959df2ac9) `rimraf@2.4.3`: `EPERM` now triggers a delay / retry loop (since Windows throws this when things still hold a handle). ([@isaacs](https://github.com/isaacs))
* [`e8acb27`](https://github.com/npm/npm/commit/e8acb273aa67ee0394d0431650e1b2a7d09c8554) `read@1.0.7`: Fix licensing ambiguity. ([@isaacs](https://github.com/isaacs))

#### OTHER STUFF THAT'S RELEVANT

* [`73a1ee0`](https://github.com/npm/npm/commit/73a1ee0be90fa1928521b63f28bef83b8ffab61d) [#9386](https://github.com/npm/npm/issues/9386) Include additional unignorable files in documentation. ([@mjhasbach](https://github.com/mjhasbach))
* [`0313e40`](https://github.com/npm/npm/commit/0313e40ee0f757fce8861be590ad668c23d7be53) [#9396](https://github.com/npm/npm/issues/9396) Improve the `EISDIR` error message returned by npm's error-handling code to give users a better hint of what's most likely going on. Usually, error reports with this error code are about people trying to install things without a `package.json`. ([@KenanY](https://github.com/KenanY))
* [`2677457`](https://github.com/npm/npm/commit/26774579c739c5951351e58263cf4d6ea3d66ec8) [#9360](https://github.com/npm/npm/issues/9360) Make it easier to run only _some_ of npm tests with lifecycle scripts via `npm tap test/tap/testname.js`. ([@iarna](https://github.com/iarna))

### v2.14.1 (2015-08-20):

#### SECURITY FIX

There are patches for two information leaks of moderate severity in `npm@2.14.1`:

1. In some cases, npm was leaking sensitive credential information into the child environment when running package and lifecycle scripts.
   This could lead to packages being published with files (most notably `config.gypi`, a file created by `node-gyp` that is a cache of environmental information regenerated on every run) containing the bearer tokens used to authenticate users to the registry. Users with affected packages have been notified (and the affected tokens invalidated), and now npm has been modified to not upload files that could contain this information, as well as scrubbing the sensitive information out of the environment passed to child scripts.

2. Per-package `.npmrc` files are used by some maintainers as a way to scope those packages to a specific registry and its credentials. This is a reasonable use case, but by default `.npmrc` was packed into packages, leaking those credentials. npm will no longer include `.npmrc` when packing tarballs.

If you maintain packages and believe you may be affected by either of the above scenarios (especially if you've received a security notification from npm recently), please upgrade to `npm@2.14.1` as soon as possible. If you believe you may have inadvertently leaked your credentials, upgrade to `npm@2.14.1` on the affected machine, and run `npm logout` and then `npm login`. Your access tokens will be invalidated, which will eliminate any risk posed by tokens inadvertently included in published packages. We apologize for the inconvenience this causes, as well as the oversight that led to the existence of this issue in the first place.

Huge thanks to [@ChALkeR](https://github.com/ChALkeR) for bringing these issues to our attention, and for helping us identify affected packages and maintainers. Thanks also to the Node.js security working group for their coördination with the team in our response to this issue. We appreciate everybody's patience and understanding tremendously.

* [`b9474a8`](https://github.com/npm/npm/commit/b9474a843ca55b7c5fac6da33989e8eb39aff8b1) `fstream-npm@1.0.5`: Stop publishing build cruft (`config.gypi`) and per-project `.npmrc` files to keep local configuration out of published packages. ([@othiym23](https://github.com/othiym23))
* [`13c286d`](https://github.com/npm/npm/commit/13c286dbdc3fa8fec4cb79fc4d1ee505c8a41b2e) [#9348](https://github.com/npm/npm/issues/9348) Filter "private" (underscore-prefixed, even when scoped to a registry) configuration values out of child environments. ([@othiym23](https://github.com/othiym23))

#### BETTER WINDOWS INTEGRATION, ONE STEP AT A TIME

* [`e40e71f`](https://github.com/npm/npm/commit/e40e71f2f838a8a42392f44e3eeec04e323ab743) [#6412](https://github.com/npm/npm/issues/6412) Improve the search strategy used by the npm shims for Windows to prioritize your own local npm installs. npm has really needed this tweak for a long time, so hammer on it and let us know if you run into issues, but with luck it will Just Work. ([@joaocgreis](https://github.com/joaocgreis))
* [`204ebbb`](https://github.com/npm/npm/commit/204ebbb3e0cab696a429a878ceeb4a7e78ec2b94) [#8751](https://github.com/npm/npm/issues/8751) [#7333](https://github.com/npm/npm/issues/7333) Keep [autorun scripts](https://technet.microsoft.com/en-us/sysinternals/bb963902.aspx) from interfering with npm package and lifecycle script execution on Windows by adding `/d` and `/s` when invoking `cmd.exe`. ([@saper](https://github.com/saper))

#### IT SEEMED LIKE AN IDEA AT THE TIME

* [`286f3d9`](https://github.com/npm/npm/commit/286f3d97103812f0fd84b70352addbe899e258f9) [#9201](https://github.com/npm/npm/pull/9201) For a while npm was building HTML partials for use on [`docs.npmjs.com`](https://docs.npmjs.com), but we weren't actually using them. Stop building them, which makes running the full test suite and installation process around a third faster. ([@isaacs](https://github.com/isaacs))

#### A SINGLE LONELY DEPENDENCY UPGRADE

* [`b343b95`](https://github.com/npm/npm/commit/b343b956ef777e321e4251ddc96ec6d80827d9e2) `request@2.61.0`: Bug fixes and keep-alive tweaks. ([@simov](https://github.com/simov))

### v2.14.0 (2015-08-13):

#### IT'S HERE! KINDA!

This release adds support for teens and orcs (err, teams and organizations) to the npm CLI! Note that the web site and registry-side features of this are still not ready for public consumption.

A beta should be starting in the next couple of weeks, and the features themselves will become public once all that's done. Keep an eye out for more news!

All of these changes were done under [`#9011`](https://github.com/npm/npm/pull/9011):

* [`6424170`](https://github.com/npm/npm/commit/6424170fc17c666a6efc090370ec691e0cab1792) Added new `npm team` command and subcommands. ([@zkat](https://github.com/zkat))
* [`52220d1`](https://github.com/npm/npm/commit/52220d146d474ec29b683bd99c06f75cbd46a9f4) Added documentation for new `npm team` command. ([@zkat](https://github.com/zkat))
* [`4e66830`](https://github.com/npm/npm/commit/4e668304850d02df8eb27a779fda76fe5de645e7) Updated `npm access` to support teams and organizations. ([@zkat](https://github.com/zkat))
* [`ea3eb87`](https://github.com/npm/npm/commit/ea3eb8733d9fa09ce34106b1b19fb1a8f95844a5) Gussied up docs for `npm access` with new commands. ([@zkat](https://github.com/zkat))
* [`6e0b431`](https://github.com/npm/npm/commit/6e0b431c1de5e329c86e57d097aa88ebfedea864) Fix up `npm whoami` to make the underlying API usable elsewhere. ([@zkat](https://github.com/zkat))
* [`f29c931`](https://github.com/npm/npm/commit/f29c931012ce5ccd69c29d83548f27e443bf7e62) `npm-registry-client@7.0.1`: Upgrade `npm-registry-client` API to support `team` and `access` calls against the registry. ([@zkat](https://github.com/zkat))
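
Once the registry side opens up, the new subcommands will look roughly like this (a sketch; the org and team are made up):

```sh
# create a team in your org, add a member, then list the team's users
npm team create myorg:developers
npm team add myorg:developers some-username
npm team ls myorg:developers
```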

#### A FEW EXTRA VERSION BUMPS

* [`c977e12`](https://github.com/npm/npm/commit/c977e12cbfa50c2f52fc807f5cc19ba1cc1b39bf) `init-package-json@1.8.0`: Checks for some `npm@3` metadata. ([@iarna](https://github.com/iarna))
* [`5c8c9e5`](https://github.com/npm/npm/commit/5c8c9e5ae177ba7d0d298cfa42f3fc7f0271e4ec) `columnify@1.5.2`: Updated some dependencies. ([@timoxley](https://github.com/timoxley))
* [`5d56742`](https://github.com/npm/npm/commit/5d567425768b75aeab402c817a53d8b2bc60d8de) `chownr@1.0.1`: Tests, docs, and minor style nits. ([@isaacs](https://github.com/isaacs))

#### ALSO A DOC FIX

* [`846fcc7`](https://github.com/npm/npm/commit/846fcc79b86984b109a97366b0422f995a45f8bf) [`#9200`](https://github.com/npm/npm/pull/9200) Remove single quotes around semver range, thus making it valid semver. ([@KenanY](https://github.com/KenanY))

### v2.13.5 (2015-08-07):

This is another quiet week for the `npm@2` release. [@zkat](https://github.com/zkat) has been working hard on polishing the CLI bits of the registry's new feature to support direct management of teams and organizations, and [@iarna](https://github.com/iarna) continues to work through the list of issues blocking the general release of `npm@3`, which is looking more and more solid all the time.

[@othiym23](https://github.com/othiym23) and [@zkat](https://github.com/zkat) have also been at this week's Node.js / io.js [collaborator summit](https://github.com/nodejs/summit/tree/master), both as facilitators and participants. This is a valuable opportunity to get some face time with other contributors and to work through a bunch of important discussions, but it does leave us feeling kind of sleepy. Running meetings is hard!

What does that leave for this release? A few of the more tricky bug fixes that have been sitting around for a little while now, and a couple dependency upgrades. Nothing too fancy, but most of these were contributed by developers like _you_, which we think is swell. Thanks!

#### BUG FIXES

* [`d7271b8`](https://github.com/npm/npm/commit/d7271b8226712479cdd339bf85faf7e394923e0d) [#4530](https://github.com/npm/npm/issues/4530) The bash completion script for npm no longer alters global completion behavior around word breaks. ([@whitty](https://github.com/whitty))
* [`c9ce294`](https://github.com/npm/npm/commit/c9ce29415a0a8fc610690b6e9d91b64d6e36cfcc) [#7198](https://github.com/npm/npm/issues/7198) When setting up dependencies to be shared via `npm link <package>`, only run the lifecycle scripts during the original link, not when running `npm link <package>` or `npm install --link` against them. ([@murgatroid99](https://github.com/murgatroid99))
* [`422da66`](https://github.com/npm/npm/commit/422da664bd3ce71313da447f170507faf5aac46a) [#9108](https://github.com/npm/npm/issues/9108) Clear up minor confusion around wording in `bundledDependencies` section of `package.json` docs. ([@derekpeterson](https://github.com/derekpeterson))
* [`6b42d99`](https://github.com/npm/npm/commit/6b42d99460885e715772d3487b1c548d2bc8a738) [#9146](https://github.com/npm/npm/issues/9146) Include scripts that run for `preversion`, `version`, and `postversion` in the section for lifecycle scripts rather than the generic `npm run-script` output. ([@othiym23](https://github.com/othiym23))

#### NOPE, NOT DONE WITH DEPENDENCY UPDATES

* [`91a48bb`](https://github.com/npm/npm/commit/91a48bb5ef5a990781c86f8b69b8a32cf4fac2d9) `chmodr@1.0.1`: Ignore symbolic links when recursively changing mode, just like the Unix command. ([@isaacs](https://github.com/isaacs))
* [`4bbc86e`](https://github.com/npm/npm/commit/4bbc86e3825e2eee9a8758ba26bdea0cb6a2581e) `nock@2.10.0` ([@pgte](https://github.com/pgte))

### v2.13.4 (2015-07-30):

#### JULY ENDS ON A FAIRLY QUIET NOTE

Hey everyone! I hope you've had a great week. We're having a fairly small release this week while we wrap up Teams and Orgs (or, as we've taken to calling it internally, _Teens and Orcs_).

In other exciting news, a bunch of us are gonna be at the [Node.js Collaborator Summit](https://github.com/nodejs/summit/issues/1), and you can also find us at [wafflejs](https://wafflejs.com/) on Wednesday. Hopefully we'll be seeing some of you there. :)

#### THE PATCH!!!

So here it is. The patch. Hope it helps. (Thanks, [@ktarplee](https://github.com/ktarplee)!)

* [`2e58c48`](https://github.com/npm/npm/commit/2e58c4819e3cafe4ae23ab7f4a520fe09258cfd7) [#9033](https://github.com/npm/npm/pull/9033) `npm version` now works on git submodules ([@ktarplee](https://github.com/ktarplee))

#### OH AND THERE'S A DEV DEPENDENCIES UPDATE

Hooray.

* [`d204683`](https://github.com/npm/npm/commit/d2046839d471322e61e3ceb0f00e78e5c481f967) `nock@2.9.1` ([@pgte](https://github.com/pgte))

### v2.13.3 (2015-07-23):

#### I'M SAVING THE GOOD JOKES FOR MORE INTERESTING RELEASES

It's pretty hard to outdo last week's release buuuuut~ I promise I'll have a treat when we release our shiny new **Teams and Organizations** feature! :D (Coming Soon™). It'll be a real *gem*.

That means it's a pretty low-key release this week. We got some nice documentation tweaks, a few bugfixes, and other such things, though! Oh, and a _bunch of version bumps_. Thanks, `semver`!

#### IT'S THE LITTLE THINGS THAT MATTER

* [`2fac6ae`](https://github.com/npm/npm/commit/2fac6aeffefba2934c3db395b525d931599c34d8) [#9012](https://github.com/npm/npm/issues/9012) A convenience for releases -- using the globally-installed npm before now was causing minor annoyances, so we just use the exact same npm we're releasing to build the new release. ([@zkat](https://github.com/zkat))

#### WHAT DOES THIS BUTTON DO?

There's a couple of doc updates! The last one might be interesting.

* [`4cd3205`](https://github.com/npm/npm/commit/4cd32050c0f89b7f1ae486354fa2c35eea302ba5) [#9002](https://github.com/npm/npm/issues/9002) Updated docs to list the various files that npm automatically includes and excludes, regardless of settings. ([@SimenB](https://github.com/SimenB))
* [`cf09e75`](https://github.com/npm/npm/commit/cf09e754931739af32647d667b671e72a4c79081) [#9022](https://github.com/npm/npm/issues/9022) Document the `"access"` field in `"publishConfig"`. Did you know you don't need to use `--access=public` when publishing scoped packages?! Just put it in your `package.json`! (There's a sketch after this list.) Go refresh yourself on scoped packages by [checking our docs](https://docs.npmjs.com/getting-started/scoped-packages) on them. ([@boennemann](https://github.com/boennemann))
* [`bfd73da`](https://github.com/npm/npm/commit/bfd73da33349cc2afb8278953b2ae16ea95023de) [#9013](https://github.com/npm/npm/issues/9013) fixed typo in changelog ([@radarhere](https://github.com/radarhere))
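
A sketch of that `publishConfig` trick with a hypothetical scoped package that should always publish publicly:

```sh
# the relevant bit of the hypothetical package's package.json:
#   {
#     "name": "@myorg/my-package",
#     "version": "1.0.0",
#     "publishConfig": { "access": "public" }
#   }

npm publish   # behaves like `npm publish --access=public`
```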

#### THE SEMVER MAJOR VERSION APOCALYPSE IS UPON US

Basically, `semver` is up to `@5`, and that meant we needed to go in and update a bunch of our dependencies manually. `node-gyp` is still pending update, since it's not ours, though!

* [`9232e58`](https://github.com/npm/npm/commit/9232e58d54c032c23716ef976023d36a42bfdcc9) [#8972](https://github.com/npm/npm/issues/8972) `init-package-json@1.7.1` ([@othiym23](https://github.com/othiym23))
* [`ba44f6b`](https://github.com/npm/npm/commit/ba44f6b4201a4faee025341b123e372d8f45b6d9) [#8972](https://github.com/npm/npm/issues/8972) `normalize-package-data@2.3.1` ([@othiym23](https://github.com/othiym23))
* [`3901d3c`](https://github.com/npm/npm/commit/3901d3cf191880bb4420b1d6b8aedbcd8fc26cdf) [#8972](https://github.com/npm/npm/issues/8972) `npm-install-checks@1.0.6` ([@othiym23](https://github.com/othiym23))
* [`ffcc7dd`](https://github.com/npm/npm/commit/ffcc7dd12f8bb94ff0f64c465c57e460b3f24a24) [#8972](https://github.com/npm/npm/issues/8972) `npm-package-arg@4.0.2` ([@othiym23](https://github.com/othiym23))
* [`7128f9e`](https://github.com/npm/npm/commit/7128f9ec10c0c8482087511b716dbddb54249626) [#8972](https://github.com/npm/npm/issues/8972) `npm-registry-client@6.5.1` ([@othiym23](https://github.com/othiym23))
* [`af28911`](https://github.com/npm/npm/commit/af28911ecd54a844f848c6ae41887097d6aa2f3b) [#8972](https://github.com/npm/npm/issues/8972) `read-installed@4.0.2` ([@othiym23](https://github.com/othiym23))
* [`3cc817a`](https://github.com/npm/npm/commit/3cc817a0f34f698b580ff6ff02308700efc54f7c) [#8972](https://github.com/npm/npm/issues/8972) node-gyp needs its own version of semver ([@othiym23](https://github.com/othiym23))
* [`f98eccc`](https://github.com/npm/npm/commit/f98eccc6e3a6699ca0aa9ecbad93a3b995583871) [#8972](https://github.com/npm/npm/issues/8972) `semver@5.0.1`: Stop including browser builds. ([@isaacs](https://github.com/isaacs))

#### \*BUMP\*

And some other version bumps for good measure.

* [`254ecfb`](https://github.com/npm/npm/commit/254ecfb04f026c2fd16427db01a53600c1892c8b) [#8990](https://github.com/npm/npm/issues/8990) `marked-man@0.1.5`: Fixes an issue with documentation rendering where backticks in 2nd-level headers would break rendering (?!?!) ([@steveklabnik](https://github.com/steveklabnik))
* [`79efd79`](https://github.com/npm/npm/commit/79efd79ac216da8cee8636fb2ed926b0196a4eb6) `minimatch@2.0.10`: A pattern like `'*.!(x).!(y)'` should not match a name like `'a.xyz.yab'`. ([@isaacs](https://github.com/isaacs))
* [`39c7dc9`](https://github.com/npm/npm/commit/39c7dc9a4e17cd35a5ed882ba671821c9a900f9e) `request@2.60.0`: A few bug fixes and doc updates. ([@simov](https://github.com/simov))
* [`72d3c3a`](https://github.com/npm/npm/commit/72d3c3a9e1e461608aa21b14c01a650333330da9) `rimraf@2.4.2`: Minor doc and dep updates ([@isaacs](https://github.com/isaacs))
* [`7513035`](https://github.com/npm/npm/commit/75130356a06f5f4fbec3786aac9f9f0b36dfe010) `nock@2.9.1` ([@pgte](https://github.com/pgte))
* [`3d9aa82`](https://github.com/npm/npm/commit/3d9aa82260f0643a32c13d0c1ed16f644b6fd4ab) Fixes this thing where Kat decided to save `nock` as a regular dependency ;) ([@othiym23](https://github.com/othiym23))

### v2.13.2 (2015-07-16):

#### HOLD ON TO YOUR TENTACLES... IT'S NPM RELEASE TIME!

Kat: Hooray! Full team again, and we've got a pretty small patch release this week, about everyone's favorite recurring issue: git URLs!

Rebecca: No Way! Again?

Kat: The ride never ends! In the meantime, there's some fun, exciting work in the background to get orgs and teams out the door. Keep an eye out for news. :)

Rebecca: And make sure to keep an eye out for patches for the super-fresh `npm@3`!

#### LET'S GIT INKY

Rebecca: So what's this about another git URL issue?

Kat: Welp, I apparently broke backwards-compatibility on what are actually invalid `git+https` URLs! So I'm making it work, but we're gonna deprecate URLs that look like `git+https://user@host:path/is/here`.

Rebecca: What should we use instead?!

Kat: Just do me a solid and use `git+ssh://user@host:path/here` or `git+https://user@host/absolute/https/path` instead!

* [`769f06e`](https://github.com/npm/npm/commit/769f06e5455d7a9fc738379de2e05868df0dab6f) Updated tests for `getResolved` so the URLs are run through `normalize-git-url`. ([@zkat](https://github.com/zkat))
* [`edbae68`](https://github.com/npm/npm/commit/edbae685bf48971e878ced373d6825fc1891ee47) [#8881](https://github.com/npm/npm/issues/8881) Added tests to verify that `git+https:` URLs are handled compatibly. ([@zkat](https://github.com/zkat))

#### NEWS FLASH! DOCUMENTATION IMPROVEMENTS!

* [`bad4e014`](https://github.com/npm/npm/commit/bad4e0143cc95754a682f1da543b2b4e196e924b) [#8924](https://github.com/npm/npm/pull/8924) Make sure documented default values in `lib/cache.js` properly correspond to current code. ([@watilde](https://github.com/watilde))
* [`e7a11fd`](https://github.com/npm/npm/commit/e7a11fdf70e333cdfe3dac94a1a30907adb76d59) [#8036](https://github.com/npm/npm/issues/8036) Clarify the documentation for `.npmrc` to clarify that it's not read at the project level when doing global installs. ([@espadrine](https://github.com/espadrine))

#### STAY FRESH~

Kat: That's it for npm core changes!

Rebecca: Great! Let's look at the fresh new dependencies, then!

Kat: See you all next week!

Both: Stay Freeesh~

(some cat form of Forrest can be seen snoring in the corner)

* [`bfa1f45`](https://github.com/npm/npm/commit/bfa1f45ee760d05039557d2245b7e3df9fda8def) `normalize-git-url@3.0.1`: Fixes URL normalization such that `git+https:` accepts scp syntax, which gets converted into absolute-path `https:` URLs. Also fixes scp syntax so you can have absolute paths after the `:` (`git@myhost.org:/some/absolute/place.git`) ([@zkat](https://github.com/zkat))
* [`6f757d2`](https://github.com/npm/npm/commit/6f757d22b53f91da0bebec6b5d16c1f4dbe130b4) `glob@5.0.15`: Better handling of ENOTSUP ([@isaacs](https://github.com/isaacs))
* [`0920819`](https://github.com/npm/npm/commit/09208197fb8b0c6d5dbf6bd7f59970cf366de989) `node-gyp@2.0.2`: Fixes an issue with long paths on Win32 ([@TooTallNate](https://github.com/TooTallNate))

### v2.13.1 (2015-07-09):

#### KAUAI WAS NICE. I MISS IT.

But Forrest's still kinda on vacation, and not just mentally, because he's hanging out with the fine meatbags at CascadiaFest. Enjoy this small bug release.

#### MAKE OURSELVES HAPPY

* [`40981f2`](https://github.com/npm/npm/commit/40981f2e0c9c12bb003ccf188169afd1d201f5af) [#8862](https://github.com/npm/npm/issues/8862) Make the lifecycle's safety check work with scoped packages. ([@tcort](https://github.com/tcort))
* [`5125856`](https://github.com/npm/npm/commit/512585622481dbbda9a0306932468d59efaff658) [#8855](https://github.com/npm/npm/issues/8855) Make dependency versions of `"*"` match `"latest"` when all versions are prerelease. ([@iarna](https://github.com/iarna))
* [`22fdc1d`](https://github.com/npm/npm/commit/22fdc1d52602ba7098af978c75fca8f7d1060141) Visually emphasize the correct way to write lifecycle scripts. ([@josh-egan](https://github.com/josh-egan))

#### MAKE TRAVIS HAPPY

* [`413c3ac`](https://github.com/npm/npm/commit/413c3ac2ab2437f3011c6ca0d1630109ec14e604) Use npm's `2.x` branch for testing its `2.x` branch. ([@iarna](https://github.com/iarna))

* [`7602f64`](https://github.com/npm/npm/commit/7602f64826f7a465d9f3a20bd87a376d992607e6) Don't prompt for GnuPG passphrase in version lifecycle tests. ([@othiym23](https://github.com/othiym23))

#### MAKE `npm outdated` HAPPY

* [`d338668`](https://github.com/npm/npm/commit/d338668601d1ebe5247a26237106e80ea8cd7f48) [#8796](https://github.com/npm/npm/issues/8796) `fstream-npm@1.0.4`: When packing the package tarball, npm no longer crashes for packages with certain combinations of `.npmignore` entries, `.gitignore` entries, and lifecycle scripts. ([@iarna](https://github.com/iarna))
* [`dbe7c9c`](https://github.com/npm/npm/commit/dbe7c9c74734be870d16dd61b9e7f746123011f6) `nock@2.7.0`: Add matching based on query strings. ([@othiym23](https://github.com/othiym23))

There are new versions of `strip-ansi` and `ansi-regex`, but npm only uses them indirectly, so we pushed them down into their dependencies where they can get updated at their own pace.

* [`06b6ca5`](https://github.com/npm/npm/commit/06b6ca5b5333025f10c8d901628859bd4678e027) undeduplicate `ansi-regex` ([@othiym23](https://github.com/othiym23))
* [`b168e33`](https://github.com/npm/npm/commit/b168e33ad46faf47020a45f72ba8cec8c644bdb9) undeduplicate `strip-ansi` ([@othiym23](https://github.com/othiym23))

### v2.13.0 (2015-07-02):

#### FORREST IS OUT! LET'S SNEAK IN ALL THE THINGS!

Well, not _everything_. Just a couple of goodies, like the new `npm ping` command, and the ability to add files to the commits created by `npm version` with the new version hooks. There's also a couple of bugfixes in `npm` itself and some of its dependencies. Here we go!

#### YES HELLO THIS IS NPM REGISTRY SORRY NO DOG HERE

Yes, that's right! We now have a dedicated `npm ping` command. It's super simple and super easy. You ping. We tell you whether you pinged right by saying hello right back. This should help out folks dealing with things like proxy issues or other registry-access debugging issues. Give it a shot!

This addresses [#5750](https://github.com/npm/npm/issues/5750), and will help with the `npm doctor` stuff described in [#6756](https://github.com/npm/npm/issues/6756).

* [`f1f7a85`](https://github.com/npm/npm/commit/f1f7a85) Add ping command to CLI ([@michaelnisi](https://github.com/michaelnisi))
* [`8cec629`](https://github.com/npm/npm/commit/8cec629) Add ping command to npm-registry-client ([@michaelnisi](https://github.com/michaelnisi))
* [`0c0c92d`](https://github.com/npm/npm/commit/0c0c92d) Fixed ping command issues (added docs, tests, fixed minor bugs, etc) ([@zkat](https://github.com/zkat))

#### I'VE WANTED THIS FOR `version` SINCE LIKE LITERALLY FOREVER AND A DAY

Seriously! This patch lets you add files to the `version` commit before it's made, so you can add additional metadata files, more automated changes to `package.json`, or even generate `CHANGELOG.md` automatically pre-commit if you're into that sort of thing. I'm so happy this is there I can't even. Do you have other fun use cases for this? Tell [npmbot (@npmjs)](http://twitter.com/npmjs) about it!

* [`582f170`](https://github.com/npm/npm/commit/582f170) [#8620](https://github.com/npm/npm/issues/8620) version: Allow scripts to add files to the commit. ([@jamestalmage](https://github.com/jamestalmage))
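
A sketch of what that enables (the changelog generator here is hypothetical; the point is the `git add` inside the `version` script):

```sh
# in package.json:
#   "scripts": {
#     "version": "node scripts/update-changelog.js && git add CHANGELOG.md"
#   }

# `npm version` runs the script, then commits package.json *and* CHANGELOG.md
npm version patch
```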

#### ALL YOUR FILE DESCRIPTORS ARE BELONG TO US

We've had problems in the past with things like `EMFILE` errors popping up when trying to install packages with a bunch of dependencies. Isaac patched up [`graceful-fs`](https://github.com/isaacs/node-graceful-fs) to handle this case better, so we should be seeing fewer of those.

* [`022691a`](https://github.com/npm/npm/commit/022691a) `graceful-fs@4.1.2`: Updated so we can monkey patch globally. ([@isaacs](https://github.com/isaacs))
* [`c9fb0fd`](https://github.com/npm/npm/commit/c9fb0fd) Globally monkey-patch graceful-fs. This should fix some errors when installing packages with lots of dependencies. ([@isaacs](https://github.com/isaacs))

#### READ THE FINE DOCS. THEY'VE IMPROVED

* [`5587d0d`](https://github.com/npm/npm/commit/5587d0d) Nice clarification for `directories.bin` ([@ujane](https://github.com/ujane))
* [`20673c7`](https://github.com/npm/npm/commit/20673c7) Hey, Windows folks! Check out [`nvm-windows`](https://github.com/coreybutler/nvm-windows) ([@ArtskydJ](https://github.com/ArtskydJ))

#### MORE NUMBERS! MORE VALUE!

* [`5afa2d5`](https://github.com/npm/npm/commit/5afa2d5) `validate-npm-package-name@2.2.2`: Documented package name rules in README ([@zeusdeux](https://github.com/zeusdeux))
* [`021f4d9`](https://github.com/npm/npm/commit/021f4d9) `rimraf@2.4.1`: [#74](https://github.com/isaacs/rimraf/issues/74) Use async function for bin (to better handle Windows' `EBUSY`) ([@isaacs](https://github.com/isaacs))
* [`5223432`](https://github.com/npm/npm/commit/5223432) `osenv@0.1.3`: Use `os.homedir()` polyfill for more reliable output. io.js added the function and the polyfill does a better job than the prior solution. ([@sindresorhus](https://github.com/sindresorhus))
* [`8ebbc90`](https://github.com/npm/npm/commit/8ebbc90) `npm-cache-filename@1.0.2`: Make sure different git references get different cache folders. This should prevent `foo/bar#v1.0` and `foo/bar#master` from sharing the same cache folder. ([@tomekwi](https://github.com/tomekwi))
* [`367b854`](https://github.com/npm/npm/commit/367b854) `lru-cache@2.6.5`: Minor test/typo changes ([@isaacs](https://github.com/isaacs))
* [`9fcae61`](https://github.com/npm/npm/commit/9fcae61) `glob@5.0.13`: Tiny doc change + stop firing 'match' events for ignored items. ([@isaacs](https://github.com/isaacs))

#### OH AND ONE MORE THING

* [`7827249`](https://github.com/npm/npm/commit/7827249) `PeerDependencies` errors now include the package version. ([@NickHeiner](https://github.com/NickHeiner))

### v2.12.1 (2015-06-25):

#### HEY WHERE DID EVERYBODY GO

I keep [hearing some commotion](https://github.com/npm/npm/releases/tag/v3.0.0). Is there something going on? Like, a party or something? Anyway, here's a small release with at least two significant bug fixes, at least one of which some of you have been waiting for for quite a while.

#### REMEMBER WHEN I SAID "REMEMBER WHEN I SAID THAT THING ABOUT PERMISSIONS?"?

`npm@2.12.0` has a change that introduces a fix for a permissions problem whereby the `_locks` directory in the cache directory can end up being owned by root. The fix in 2.12.0 takes care of that problem, but introduces a new problem for Windows users where npm tries to call `process.getuid()`, which doesn't exist on Windows. It was easy enough to fix (but more or less impossible to test, thanks to all the external dependencies involved with permissions and platforms and whatnot), but as a result, Windows users might want to skip `npm@2.12.0` and go straight to `npm@2.12.1`. Sorry about that!

* [`7e5da23`](https://github.com/npm/npm/commit/7e5da238ee869201fdb9027c27b79b0f76b440a8) When using the new, "fixed" cache directory creator, be extra-careful to not call `process.getuid()` on platforms that lack it. ([@othiym23](https://github.com/othiym23))

#### WHEW! ALL DONE FIXING GIT FOREVER!

New npm CLI team hero [@zkat](https://github.com/zkat) has finally (FINALLY) fixed the regression somebody (hi!) introduced a couple months ago whereby git URLs of the format `git+ssh://user@githost.com:org/repo.git` suddenly stopped working, and also started being saved (and cached) incorrectly. I am 100% sure there are absolutely no more bugs in the git caching code at all ever. Mm hm. Yep. Pretty sure. Maybe. Hmm... I hope. *Sighs audibly.* [Let us know](http://github.com/npm/npm/issues/new) if we broke something else with this fix.

* [`94ca4a7`](https://github.com/npm/npm/commit/94ca4a711619ba8e40ce3d20bc42b13cdb7611b7) [#8031](https://github.com/npm/npm/issues/8031) Even though `git+ssh://user@githost.com:org/repo.git` isn't a URL, treat it like one for the purposes of npm. ([@zkat](https://github.com/zkat))
* [`e7f56e5`](https://github.com/npm/npm/commit/e7f56e5a97fcf1c52d5c5bee71303b0126129815) [#8031](https://github.com/npm/npm/issues/8031) `normalize-git-url@2.0.0`: Handle git URLs (and URL-like remote refs) in a manner consistent with npm's docs. ([@zkat](https://github.com/zkat))

#### YEP, THERE ARE STILL DEPENDENCY UPGRADES

* [`679bf47`](https://github.com/npm/npm/commit/679bf4745ac2cfbb01c9ce273e189807fd04fa33) [#40](http://github.com/npm/read-installed/issues/40) `read-installed@4.0.1`: Handle prerelease versions in top-level dependencies not in `package.json` without marking those packages as invalid. ([@benjamn](https://github.com/benjamn))
* [`3a67410`](https://github.com/npm/npm/commit/3a6741068c9119174c920496778aeee870ebdac0) `tap@1.3.1` ([@isaacs](https://github.com/isaacs))
* [`151904a`](https://github.com/npm/npm/commit/151904af39dc24567f8c98529a2a64a4dbcc960a) `nopt@3.0.3` ([@isaacs](https://github.com/isaacs))

### v2.12.0 (2015-06-18):

#### REMEMBER WHEN I SAID THAT THING ABOUT PERMISSIONS?

About [a million people](https://github.com/npm/npm/issues?utf8=%E2%9C%93&q=is%3Aissue+EACCES+_locks) have filed issues related to having a tough time using npm after they've run npm once or twice with sudo. "Don't worry about it!" I said. "We've fixed all those permissions problems ages ago! Use this one weird trick and you'll never have to deal with this again!"

Well, uh, if you run npm with root the first time you run npm on a machine, it turns out that the directory npm uses to store lockfiles ends up being owned by the wrong user (almost always root), and that can, well, it can cause problems sometimes. By which I mean every time you run npm without being root it'll barf with `EACCES` errors. Whoops!

This is an obnoxious regression, and to prevent it from recurring, we've made it so that the cache, cached git remotes, and the lockfile directories are all created and maintained using the same utility module, which not only creates the relevant paths with the correct permissions, but will fix the permissions on those directories (if it can) when it notices that they're broken. An `npm install` run as root ought to be sufficient to fix things up (and if that doesn't work, first tell us about it, and then run `sudo chown -R $(whoami) $HOME/.npm`).
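
Spelled out as a shell session (these are the same commands the paragraph above recommends, nothing new):

```sh
# An install run as root gives the fixed-up directory creator a chance to
# repair the permissions on npm's cache and lock directories...
sudo npm install

# ...and if anything under the cache is still owned by root, hand it back:
sudo chown -R $(whoami) "$HOME/.npm"
```
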
Also, I apologize for inadvertently gaslighting any of you by claiming this bug wasn't actually a bug. I do think we've got this permanently dealt with now, but I'll be paying extra-close attention to permissions issues related to the cache for a while.

* [`85d1a53`](https://github.com/npm/npm/commit/85d1a53d7b5e0fc04823187e522ae3711ede61fa) Set permissions on lock directory to the owner of the process. ([@othiym23](https://github.com/othiym23))

#### I WENT TO NODECONF AND ALL I GOT WAS THIS LOUSY SPDX T-SHIRT

That's not literally true. We spent very little time discussing SPDX, [@kemitchell](https://github.com/kemitchell) is a champ, and I had a lot of fun playing drum & bass to a mostly empty Boogie Barn and only ended up with one moderately severe cold for my pains. Another winner of a NodeConf! (I would probably wear an SPDX T-shirt if somebody gave me one, though.)

A bunch of us did have a spirited discussion of the basics of open-source intellectual property, and the convergence of me, [@kemitchell](https://github.com/kemitchell), and [@jandrieu](https://github.com/jandrieu) in one place allowed us to hammer out a small but significant issue that had been bedeviling early adopters of the new SPDX expression syntax in `package.json` license fields: how to deal with packages that are left without a license on purpose.

Refer to [the docs](https://github.com/npm/npm/blob/16a3dd545b10f8a2464e2037506ce39124739b41/doc/files/package.json.md#license) for the specifics, but the short version is that instead of using `LicenseRef-LICENSE` for proprietary licenses, you can now use either `UNLICENSED` if you want to make it clear that you don't _want_ your software to be licensed (and want npm to stop warning you about this), or `SEE LICENSE IN <filename>` if there's a license with custom text you want to use. At some point in the near term, we'll be updating npm to verify that the mentioned file actually exists, but for now you're all on the honor system.
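
Concretely, a proprietary package with custom license text would declare something like this in its `package.json` (the package name and the `LICENSE.txt` filename here are placeholders):

```json
{
  "name": "some-proprietary-thing",
  "version": "1.0.0",
  "license": "SEE LICENSE IN LICENSE.txt"
}
```

If you don't want the code licensed at all, use `"license": "UNLICENSED"` instead; either form keeps npm from warning at you about the license field.
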
* [`4827fc7`](https://github.com/npm/npm/commit/4827fc784117c17f35dd9b51b21d1eff6094f661) [#8557](https://github.com/npm/npm/issues/8557) `normalize-package-data@2.2.1`: Allow `UNLICENSED` and `SEE LICENSE IN <filename>` in "license" field of `package.json`. ([@kemitchell](https://github.com/kemitchell))
* [`16a3dd5`](https://github.com/npm/npm/commit/16a3dd545b10f8a2464e2037506ce39124739b41) [#8557](https://github.com/npm/npm/issues/8557) Document the new accepted values for the "license" field. ([@kemitchell](https://github.com/kemitchell))
* [`8155311`](https://github.com/npm/npm/commit/81553119350deaf199e79e38e35b52a5c8ad206c) [#8557](https://github.com/npm/npm/issues/8557) `init-package-json@1.7.0`: Support new "license" field values at init time. ([@kemitchell](https://github.com/kemitchell))

#### SMALLISH BUG FIXES

* [`9d8cac9`](https://github.com/npm/npm/commit/9d8cac94a258db648a2b1069b1c8c6529c79d013) [#8548](https://github.com/npm/npm/issues/8548) Remove extraneous newline from `npm view` output, making it easier to use in shell scripts. ([@eush77](https://github.com/eush77))
* [`765fd4b`](https://github.com/npm/npm/commit/765fd4bfca8ea3e2a4a399765b17eec40a3d893d) [#8521](https://github.com/npm/npm/issues/8521) When checking for outdated packages, or updating packages, raise an error when the registry is unreachable instead of silently "succeeding". ([@ryantemple](https://github.com/ryantemple))

#### SMALLERISH DOCUMENTATION TWEAKS

* [`5018335`](https://github.com/npm/npm/commit/5018335ce1754a9f771954ecbc1a93acde9b8c0a) [#8365](https://github.com/npm/npm/issues/8365) Add details about which git environment variables are whitelisted by npm. ([@nmalaguti](https://github.com/nmalaguti))
* [`bed9edd`](https://github.com/npm/npm/commit/bed9edddfdcc6d22a80feab33b53e4ef9172ec72) [#8554](https://github.com/npm/npm/issues/8554) Fix typo in version docs. ([@rainyday](https://github.com/rainyday))

#### WELL, I GUESS THERE ARE MORE DEPENDENCY UPGRADES

* [`7ce2f06`](https://github.com/npm/npm/commit/7ce2f06f6f34d469b1d2e248084d4f3fef10c05e) `request@2.58.0`: Refactor tunneling logic, and use `extend` instead of abusing `util._extend`. ([@simov](https://github.com/simov))
* [`e6c6195`](https://github.com/npm/npm/commit/e6c61954aad42e20eec49745615c7640b2026a6c) `nock@2.6.0`: Refined interception behavior. ([@pgte](https://github.com/pgte))
* [`9583cc3`](https://github.com/npm/npm/commit/9583cc3cb192c2fced006927cfba7cd37b588605) `fstream-npm@1.0.3`: Ensure that `main` entry in `package.json` is always included in the bundled package tarball. ([@coderhaoxin](https://github.com/coderhaoxin))
* [`df89493`](https://github.com/npm/npm/commit/df894930f2716adac28740b29b2e863170919990) `fstream@1.0.7` ([@isaacs](https://github.com/isaacs))
* [`9744049`](https://github.com/npm/npm/commit/974404934758124aa8ae5b54f7d5257c3bd6b588) `dezalgo@1.0.3`: `dezalgo` should be usable in the browser, and can be now that `asap` has been upgraded to be browserifiable. ([@mvayngrib](https://github.com/mvayngrib))

### v2.11.3 (2015-06-11):

This was a very quiet week. This release was done by [@iarna](https://github.com/iarna), while the rest of the team hangs out at NodeConf Adventure!

#### TESTS IN 0.8 FAIL LESS

* [`5b3b3c2`](https://github.com/npm/npm/commit/5b3b3c2) [#8491](https://github.com/npm/npm/pull/8491) Updates a test to use only 0.8 compatible features ([@watilde](https://github.com/watilde))

#### THE TREADMILL OF UPDATES NEVER CEASES

* [`9f439da`](https://github.com/npm/npm/commit/9f439da) `spdx@0.4.1`: License range updates ([@kemitchell](https://github.com/kemitchell))
* [`2dd055b`](https://github.com/npm/npm/commit/2dd055b) `normalize-package-data@2.2.1`: Fixes a crashing bug when the package.json `scripts` property is not an object. ([@iarna](https://github.com/iarna))
* [`e02e85d`](https://github.com/npm/npm/commit/e02e85d) `osenv@0.1.2`: Switches to using the `os-tmpdir` module instead of `os.tmpdir()` for greater consistency in behavior between node versions. ([@iarna](https://github.com/iarna))
* [`a6f0265`](https://github.com/npm/npm/commit/a6f0265) `ini@1.3.4` ([@isaacs](https://github.com/isaacs))
* [`7395977`](https://github.com/npm/npm/commit/7395977) `rimraf@2.4.0` ([@isaacs](https://github.com/isaacs))

### v2.11.2 (2015-06-04):

Another small release this week, brought to you by the latest addition to the CLI team, [@zkat](https://github.com/zkat) (Hi, all!)

Mostly small documentation tweaks and version updates. Oh! And `npm outdated` is actually sorted now. Rejoice!

It's gonna be a while before we get another palindromic version number. Enjoy it while it lasts. :3

#### QUALITY OF LIFE HAS NEVER BEEN BETTER

* [`31aada4`](https://github.com/npm/npm/commit/31aada4ccc369c0903ff7f233f464955d12c6fe2) [#8401](https://github.com/npm/npm/issues/8401) `npm outdated` output is just that much nicer to consume now, due to sorting by name. ([@watilde](https://github.com/watilde))
* [`458a919`](https://github.com/npm/npm/commit/458a91925d8b20c5e672ba71a86745aad654abaf) [#8469](https://github.com/npm/npm/pull/8469) Explicitly set `cwd` for `preversion`, `version`, and `postversion` scripts. This makes the scripts findable relative to the root dir. ([@alexkwolfe](https://github.com/alexkwolfe))
* [`55d6d71`](https://github.com/npm/npm/commit/55d6d71562e979e745c9db88861cc39f99b9f3ec) Ensure package name and version are included in display during `npm version` lifecycle execution. Gets rid of those little `undefined`s in the console. ([@othiym23](https://github.com/othiym23))

#### WORDS HAVE NEVER BEEN QUITE THIS READABLE

* [`3901e49`](https://github.com/npm/npm/commit/3901e4974c800e7f9fba4a5b2ff88da1126d5ef8) [#8462](https://github.com/npm/npm/pull/8462) English apparently requires correspondence between indefinite articles and attached nouns. ([@Enet4](https://github.com/Enet4))
* [`5a744e4`](https://github.com/npm/npm/commit/5a744e4b143ef7b2f50c80a1d96fdae4204d452b) [#8421](https://github.com/npm/npm/pull/8421) The effect of `npm prune`'s `--production` flag and how to use it have been documented a bit better. ([@foiseworth](https://github.com/foiseworth))
* [`eada625`](https://github.com/npm/npm/commit/eada625993485f0a2c5324b06f02bfa0a95ce4bc) We've updated our `.mailmap` and `AUTHORS` files to make sure credit is given where credit is due. ([@othiym23](https://github.com/othiym23))

#### VERSION NUMBERS HAVE NEVER BEEN BIGGER

* [`c929fd1`](https://github.com/npm/npm/commit/c929fd1d0604b5878ed05706447e078d3e41f5b3) `readable-stream@1.1.13`: Manually deduped `v1.1.13` (streams3) to make deduping more reliable on `npm@<3`. ([@othiym23](https://github.com/othiym23))
* [`a9b4b78`](https://github.com/npm/npm/commit/a9b4b78dcc85571fd1cdd737903f7f37a5e6a755) `request@2.57.0`: Replace dependency on IncomingMessage's `.client` with `.socket` as the former was deprecated in io.js 2.2.0. ([@othiym23](https://github.com/othiym23))
* [`4b5e557`](https://github.com/npm/npm/commit/4b5e557a23cdefd521ad154111e3d4dcc81f1cdb) `abbrev@1.0.7`: Better testing, with coverage. ([@othiym23](https://github.com/othiym23))
* [`561affe`](https://github.com/npm/npm/commit/561affee21df9bbea5a47298f2452f533be8f359) `semver@4.3.6`: .npmignore added for less cruft, and better testing, with coverage. ([@othiym23](https://github.com/othiym23))
* [`60aef3c`](https://github.com/npm/npm/commit/60aef3cf5d84d757752db3eb8ede2cb385469e7b) `graceful-fs@3.0.8`: io.js fixes. ([@zkat](https://github.com/zkat))
* [`f8bd453`](https://github.com/npm/npm/commit/f8bd453b1a1c46ba7666cb166595e8a011eae443) `config-chain@1.1.9`: Added MIT license to package.json ([@zkat](https://github.com/zkat))

### v2.11.1 (2015-05-28):

This release is brought to you from poolside at the Omni Amelia Island Resort and JSConf 2015, which is why it's so tiny.

#### CONFERENCE WIFI CAN'T STOP THESE BUG FIXES

* [`cf109a6`](https://github.com/npm/npm/commit/cf109a682f38a059a994da953d5c1b4aaece5e2f) [#8381](https://github.com/npm/npm/issues/8381) Documented a subtle gotcha with `.npmrc`, which is that it needs to have its permissions set such that only the owner can read or write the file. ([@colakong](https://github.com/colakong))
* [`180da67`](https://github.com/npm/npm/commit/180da67c9fa53103d625e2f031626c2453c7ebcd) [#8365](https://github.com/npm/npm/issues/8365) Git 2.3 adds support for `GIT_SSH_COMMAND`, which lets you specify the exact SSH command for git to use (with, for example, a specific identity passed in on the command line). ([@nmalaguti](https://github.com/nmalaguti))
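
As a sketch of why that's handy (the key path and repository URL here are made up), npm passes `GIT_SSH_COMMAND` through to git, so git dependencies can be fetched with a specific identity:

```sh
# Requires Git 2.3 or newer.
GIT_SSH_COMMAND='ssh -i ~/.ssh/deploy-key -o IdentitiesOnly=yes' \
  npm install git+ssh://git@github.com/someorg/some-private-repo.git
```
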

#### MY (VIRGIN) PINA COLADA IS GETTING LOW, BETTER UPGRADE THESE DEPENDENCIES

* [`b72de41`](https://github.com/npm/npm/commit/b72de41c5cc9f0c46d3fa8f062c75bd273641474) `node-gyp@2.0.0`: Use a newer version of `gyp`, and generally improve support for Visual Studio and Windows. ([@TooTallNate](https://github.com/TooTallNate))
* [`8edbe21`](https://github.com/npm/npm/commit/8edbe210af41e8f248f5bb92c72de92f54fda3b1) `node-gyp@2.0.1`: Don't crash when Python's version doesn't parse as valid semver. ([@TooTallNate](https://github.com/TooTallNate))
* [`ba0e0a8`](https://github.com/npm/npm/commit/ba0e0a845a4f29717aba566b416a27d1a22f5d08) `glob@5.0.10`: Add coverage to tests. ([@isaacs](https://github.com/isaacs))
* [`7333701`](https://github.com/npm/npm/commit/7333701b5d4f01673f37d64992c63c4e15864d6d) `request@2.56.0`: Bug fixes and dependency upgrades. ([@simov](https://github.com/simov))

### v2.11.0 (2015-05-21):

For the first time in a very long time, we've added new events to the life cycle used by `npm run-script`. Since running `npm version (major|minor|patch)` is typically the last thing many developers do before publishing their updated packages, it makes sense to add life cycle hooks to run tests or otherwise preflight the package before doing a full publish. Thanks, as always, to the indefatigable [@watilde](https://github.com/watilde) for yet another great usability improvement for npm!

#### FEATURELETS

* [`b07f7c7`](https://github.com/npm/npm/commit/b07f7c7c1e5021730b3c320f1b3a46e70f8a21ff) [#7906](https://github.com/npm/npm/issues/7906) Add new [`scripts`](https://github.com/npm/npm/blob/master/doc/misc/npm-scripts.md) to allow you to run scripts before and after the [`npm version`](https://github.com/npm/npm/blob/master/doc/cli/npm-version.md) command has run. This makes it easy to, for instance, require that your test suite passes before bumping the version by just adding `"preversion": "npm test"` to the scripts section of your `package.json`. ([@watilde](https://github.com/watilde))
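
A sketch of the whole pattern (the `tap` invocation and the pushes are examples, not requirements): `preversion` runs before the version is bumped, so a failing test suite aborts the bump entirely, while `postversion` runs after the commit and tag exist:

```json
{
  "scripts": {
    "test": "tap test/*.js",
    "preversion": "npm test",
    "postversion": "git push && git push --tags"
  }
}
```
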
* [`8a46136`](https://github.com/npm/npm/commit/8a46136f42e416cbadb533bcf89d73d681ed421d) [#8185](https://github.com/npm/npm/issues/8185) When we get a "not found" error from the registry, we'll now check to see if the package name you specified is invalid and if so, give you a better error message. ([@thefourtheye](https://github.com/thefourtheye))

#### BUG FIXES

* [`9bcf573`](https://github.com/npm/npm/commit/9bcf5730bd0316f210dafea898afe9103849cea9) [#8324](https://github.com/npm/npm/pull/8324) On Windows, when you've configured a custom `node-gyp`, run it with node itself instead of using the default open action (which is almost never what you want). ([@bangbang93](https://github.com/bangbang93))
* [`1da9b04`](https://github.com/npm/npm/commit/1da9b0411d3416c7fca17d08cbbcfca7ae86e92d) [#7195](https://github.com/npm/npm/issues/7195) [#7260](https://github.com/npm/npm/issues/7260) `npm-registry-client@6.4.0`: (Re-)allow publication of existing mixed-case packages (part 1). ([@smikes](https://github.com/smikes))
* [`e926783`](https://github.com/npm/npm/commit/e9267830ab261c751f12723e84d2458ae9238646) [#7195](https://github.com/npm/npm/issues/7195) [#7260](https://github.com/npm/npm/issues/7260) `normalize-package-data@2.2.0`: (Re-)allow publication of existing mixed-case packages (part 2). ([@smikes](https://github.com/smikes))

#### DOCUMENTATION IMPROVEMENTS

* [`f62ee05`](https://github.com/npm/npm/commit/f62ee05333b141539a8e851c620dd2e82ff06860) [#8314](https://github.com/npm/npm/issues/8314) Update the README to warn folks away from using the CLI's internal API. For the love of glob, just use a child process to run the CLI! ([@claycarpenter](https://github.com/claycarpenter))
* [`1093921`](https://github.com/npm/npm/commit/1093921c04db41ab46db24a170a634a4b2acd8d9) [#8279](https://github.com/npm/npm/pull/8279) Update the documentation to note that, yes, you can publish scoped packages to the public registry now! ([@mantoni](https://github.com/mantoni))
* [`f87cde5`](https://github.com/npm/npm/commit/f87cde5234a760d3e515ffdaacaed6f5b71dbf44) [#8292](https://github.com/npm/npm/pull/8292) Fix typo in an example and grammar in the description in the [shrinkwrap documentation](https://github.com/npm/npm/blob/master/doc/cli/npm-shrinkwrap.md). ([@vshih](https://github.com/vshih))
* [`d3526ce`](https://github.com/npm/npm/commit/d3526ceb09a0c29fdb7d4124536ae09057d033e7) Improve the formatting in the [shrinkwrap documentation](https://github.com/npm/npm/blob/master/doc/cli/npm-shrinkwrap.md). ([@othiym23](https://github.com/othiym23))
* [`19fe6d2`](https://github.com/npm/npm/commit/19fe6d20883e28956ff916fe4dae42d73ee6195b) [#8311](https://github.com/npm/npm/pull/8311) Update [README.md](https://github.com/npm/npm#readme) to use syntax highlighting in its code samples and bits of shell scripts. ([@SimenB](https://github.com/SimenB))

#### DEPENDENCY UPDATES! ALWAYS AND FOREVER!

* [`fc52160`](https://github.com/npm/npm/commit/fc52160d0223226fffe4166f42fdfd3b899b3c1e) [#4700](https://github.com/npm/npm/issues/4700) [#5044](https://github.com/npm/npm/issues/5044) `init-package-json@1.6.0`: Make entering an invalid version while running `npm init` give you an immediate error and prompt you to correct it. ([@watilde](https://github.com/watilde))
* [`738853e`](https://github.com/npm/npm/commit/738853eb1f55636476a2a410c2c04732eec9d51e) [#7763](https://github.com/npm/npm/issues/7763) `fs-write-stream-atomic@1.0.3`: Fix a bug where errors would not propagate, making error messages unhelpful. ([@iarna](https://github.com/iarna))
* [`6d74a2d`](https://github.com/npm/npm/commit/6d74a2d2ac7f92750cf6a2cfafae1af23b569098) `npm-package-arg@4.0.1`: Fix tests on Windows ([@Bacra](https://github.com/Bacra)) and with more recent `hosted-git-info`. ([@iarna](https://github.com/iarna))
* [`50f7178`](https://github.com/npm/npm/commit/50f717852fbf713ef6cbc4e0a9ab42657decbbbd) `hosted-git-info@2.1.4`: Correct spelling in its documentation. ([@iarna](https://github.com/iarna))
* [`d7956ca`](https://github.com/npm/npm/commit/d7956ca17c057d5383ff0d3fc5cf6ac2940b034d) `glob@5.0.7`: Fix a bug where unusual error conditions could make further use of the module fail. ([@isaacs](https://github.com/isaacs))
* [`44f7d74`](https://github.com/npm/npm/commit/44f7d74c5d3181d37da7ea7949c86b344153f8d9) `tap@1.1.0`: Update to the most recent tap to get a whole host of bug fixes and integration with [coveralls](https://coveralls.io/). ([@isaacs](https://github.com/isaacs))
* [`c21e8a8`](https://github.com/npm/npm/commit/c21e8a8d94bcf0ad79dc583ddc53f8366d4813b3) `nock@2.2.0` ([@othiym23](https://github.com/othiym23))

#### LICENSE FILES FOR THE LICENSE GOD

* Add missing ISC license file to package ([@kasicka](https://github.com/kasicka)):
  * [`aa9908c`](https://github.com/npm/npm/commit/aa9908c20017729673b9d410b77f9a16b7aae8a4) `realize-package-specifier@3.0.1`
  * [`23a3b1a`](https://github.com/npm/npm/commit/23a3b1a726b9176c70ce0ccf3cd9d25c54429bdf) `fs-vacuum@1.2.6`
  * [`8e04bba`](https://github.com/npm/npm/commit/8e04bba830d4353d84751d21803cd127c96153a7) `dezalgo@1.0.2`
  * [`50f7178`](https://github.com/npm/npm/commit/50f717852fbf713ef6cbc4e0a9ab42657decbbbd) `hosted-git-info@2.1.4`
  * [`6a54917`](https://github.com/npm/npm/commit/6a54917fbd4df995495a95d4b548defd44b77c93) `write-file-atomic@1.1.2`
  * [`971f92c`](https://github.com/npm/npm/commit/971f92c4a4e5514217d1e4db45d1ccf71a60ff19) `async-some@1.0.2`
  * [`67b50b7`](https://github.com/npm/npm/commit/67b50b7667a42bb3340a660eb2e617e1a554d2d4) `normalize-git-url@1.0.1`

#### SPDX LICENSE UPDATES

* Switch license to [BSD-2-Clause](http://spdx.org/licenses/BSD-2-Clause.html#licenseText) from plain "BSD" ([@isaacs](https://github.com/isaacs)):
  * [`efdb733`](https://github.com/npm/npm/commit/efdb73332eeedcad4c609796929070b62abb37ab) `npm-user-validate@0.1.2`
  * [`e926783`](https://github.com/npm/npm/commit/e9267830ab261c751f12723e84d2458ae9238646) `normalize-package-data@2.2.0`
* Switch license to [ISC](http://spdx.org/licenses/ISC.html#licenseText) from [BSD](http://spdx.org/licenses/BSD-2-Clause.html#licenseText) ([@isaacs](https://github.com/isaacs)):
  * [`c300956`](https://github.com/npm/npm/commit/c3009565a964f0ead4ac4ab234b1a458e2365f17) `block-stream@0.0.8`
  * [`1de1253`](https://github.com/npm/npm/commit/1de125355765fecd31e682ed0ff9d2edbeac0bb0) `lockfile@1.0.1`
  * [`0d5698a`](https://github.com/npm/npm/commit/0d5698ab132e376c7aec93ae357c274932116220) `osenv@0.1.1`
  * [`2e84921`](https://github.com/npm/npm/commit/2e84921474e1ffb18de9fce4616e73171fa8046d) `abbrev@1.0.6`
  * [`872fac9`](https://github.com/npm/npm/commit/872fac9d10c11607e4d0348c08a683b84e64d30b) `chmodr@0.1.1`
  * [`01eb7f6`](https://github.com/npm/npm/commit/01eb7f60acba584346ad8aae846657899f3b6887) `chownr@0.0.2`
  * [`294336f`](https://github.com/npm/npm/commit/294336f0f31c7b9fe31a50075ed750db6db134d1) `read@1.0.6`
  * [`ebdf6a1`](https://github.com/npm/npm/commit/ebdf6a14d17962cdb7128402c53b452f91d44ca7) `graceful-fs@3.0.7`
* Switch license to [ISC](http://spdx.org/licenses/ISC.html#licenseText) from [MIT](http://spdx.org/licenses/MIT.html#licenseText) ([@isaacs](https://github.com/isaacs)):
  * [`e5d237f`](https://github.com/npm/npm/commit/e5d237fc0f436dd2a89437ebf8a9632a2e35ccbe) `nopt@3.0.2`
  * [`79fef14`](https://github.com/npm/npm/commit/79fef1421b78f044980f0d1bf0e97039b6992710) `rimraf@2.3.4`
  * [`22527da`](https://github.com/npm/npm/commit/22527da4816e7c2746cdc0317c5fb4a85152d554) `minimatch@2.0.8`
  * [`882ac87`](https://github.com/npm/npm/commit/882ac87a6c4123ca985d7ad4394ea5085e5b0ef5) `lru-cache@2.6.4`
  * [`9d9d015`](https://github.com/npm/npm/commit/9d9d015a2e972f68664dda54fbb204db28b21ede) `npmlog@1.2.1`

### v2.10.1 (2015-05-14):

#### BUG FIXES & DOCUMENTATION TWEAKS

* [`dc77520`](https://github.com/npm/npm/commit/dc7752013ffce13a3d3f13e518a0052c22fc1158) When getting back a 404 from a request to a private registry that uses a registry path that extends past the root (`http://registry.enterprise.co/path/to/registry`), display the name of the nonexistent package, rather than the first element in the registry API path. Sorry, Artifactory users! ([@hayes](https://github.com/hayes))
* [`f70dea9`](https://github.com/npm/npm/commit/f70dea9b4766f6eaa55012c3e8087e9cb04fd4ce) Make clearer that `--registry` can be used on a per-publish basis to push a package to a non-default registry. ([@mischkl](https://github.com/mischkl))
* [`a3e26f5`](https://github.com/npm/npm/commit/a3e26f5b4465991a941a325468ab7725670d2a94) Did you know that GitHub shortcuts can have commit-ishes included (`org/repo#branch`)? They can! ([@iarna](https://github.com/iarna))
* [`0e2c091`](https://github.com/npm/npm/commit/0e2c091a539b61fdc60423b6bbaaf30c24e4b1b8) Some errors from `readPackage` were being swallowed, potentially leading to invalid package trees on disk. ([@smikes](https://github.com/smikes))

#### DEPENDENCY UPDATES! STILL! MORE! AGAIN!

* [`0b901ad`](https://github.com/npm/npm/commit/0b901ad0811d84dda6ca0755a9adc8d47825edd0) `lru-cache@2.6.3`: Removed some cruft from the published package. ([@isaacs](https://github.com/isaacs))
* [`d713e0b`](https://github.com/npm/npm/commit/d713e0b14930c563e3fdb6ac6323bae2a8924652) `mkdirp@0.5.1`: Made compliant with `standard`, dropped support for Node 0.6, added (Travis) support for Node 0.12 and io.js. ([@isaacs](https://github.com/isaacs))
* [`a2d6578`](https://github.com/npm/npm/commit/a2d6578b6554c5c9d48fe2006751759f4da57520) `glob@1.0.3`: Updated to use `tap@1`. ([@isaacs](https://github.com/isaacs))
* [`64cd1a5`](https://github.com/npm/npm/commit/64cd1a570aaa5f24ccba190948ec9456297c97f5) `fstream@1.0.6`: Made compliant with [`standard`](http://npm.im/standard) (done by [@othiym23](https://github.com/othiym23), and then debugged and fixed by [@iarna](https://github.com/iarna)), and license changed to ISC. ([@othiym23](https://github.com/othiym23) / [@iarna](https://github.com/iarna))
* [`b527a7c`](https://github.com/npm/npm/commit/b527a7c2ba3c4002f443dd2c536ff4ff41a38b86) `which@1.1.1`: Callers can pass in their own `PATH` instead of relying on `process.env`. ([@isaacs](https://github.com/isaacs))

### v2.10.0 (2015-05-08):

#### THE IMPLICATIONS ARE MORE PROFOUND THAN THEY APPEAR

If you've done much development in The Enterprise®™, you know that keeping track of software licenses is far more important than one might expect / hope / fear. Tracking licenses is a hassle, and while many (if not most) of us have (reluctantly) gotten around to setting a license to use by default with all our new projects (even if it's just WTFPL), that's about as far as most of us think about it. In big enterprise shops, ensuring that projects don't inadvertently use software with unacceptably encumbered licenses is serious business, and developers spend a surprising (and appalling) amount of time ensuring that licensing is covered by writing automated checkers and other license auditing tools.

The Linux Foundation has been working on a machine-parseable syntax for license expressions in the form of [SPDX](https://spdx.org/), an appropriately enterprisey acronym. IP attorney and JavaScript culture hero [Kyle Mitchell](http://kemitchell.com/) has put a considerable amount of effort into bringing SPDX to JavaScript and Node. He's written [`spdx.js`](https://github.com/kemitchell/spdx.js), a JavaScript SPDX expression parser, and has integrated it into npm in a few different ways.

For you as a user of npm, this means:

* npm now has proper support for dual licensing in `package.json`, due to SPDX's compound expression syntax. Run `npm help package.json` for details.
* npm will warn you if the `package.json` for your project is either missing a `"license"` field, or if the value of that field isn't a valid SPDX expression (pro tip: `"BSD"` becomes `"BSD-2-Clause"` in SPDX (unless you really want one of its variants); `"MIT"` and `"ISC"` are fine as-is; the [full list](https://github.com/shinnn/spdx-license-ids/blob/master/spdx-license-ids.json) is its own package).
* `npm init` now demands that you use a valid SPDX expression when using it interactively (pro tip: I mostly use `npm init -y`, having previously run `npm config set init.license=MIT` / `npm config set init.author.email=foo` / `npm config set init.author.name=me`).
* The documentation for `package.json` has been updated to tell you how to use the `"license"` field properly with SPDX.

In general, this shouldn't be a big deal for anybody other than people trying to run their own automated license validators, but in the long run, if everybody switches to this format, many people's lives will be made much simpler. I think this is an important improvement for npm and am very thankful to Kyle for taking the lead on this. Also, even if you think all of this is completely stupid, just [choose a license](http://en.wikipedia.org/wiki/License-free_software) anyway. Future you will thank past you someday, unless you are [djb](http://cr.yp.to/), in which case you are djb, and more power to you.

* [`8669f7d`](https://github.com/npm/npm/commit/8669f7d88c472ccdd60e140106ac43cca636a648) [#8179](https://github.com/npm/npm/issues/8179) Document how to use SPDX in `license` stanzas in `package.json`, including how to migrate from old busted license declaration arrays to fancy new compound-license clauses. ([@kemitchell](https://github.com/kemitchell))
* [`98ad98c`](https://github.com/npm/npm/commit/98ad98cb11f3d3ba29a488ef1ab050b066d9c7f6) [#8197](https://github.com/npm/npm/issues/8197) `init-package-json@1.5.0` Ensure that packages bootstrapped with `npm init` use an SPDX-compliant license expression. ([@kemitchell](https://github.com/kemitchell))
* [`2ad3905`](https://github.com/npm/npm/commit/2ad3905e9139b0be2b22accf707b814469de813e) [#8197](https://github.com/npm/npm/issues/8197) `normalize-package-data@2.1.0`: Warn when a package is missing a license declaration, or using a license expression that isn't valid SPDX. ([@kemitchell](https://github.com/kemitchell))
* [`127bb73`](https://github.com/npm/npm/commit/127bb73ccccc59a1267851c702d8ebd3f3a97e81) [#8197](https://github.com/npm/npm/issues/8197) `tar@2.1.1`: Switch from `BSD` to `ISC` for license, where the latter is valid SPDX. ([@othiym23](https://github.com/othiym23))
* [`e9a933a`](https://github.com/npm/npm/commit/e9a933a9148180d9d799f99f4154f5110ff2cace) [#8197](https://github.com/npm/npm/issues/8197) `once@1.3.2`: Switch from `BSD` to `ISC` for license, where the latter is valid SPDX. ([@othiym23](https://github.com/othiym23))
* [`412401f`](https://github.com/npm/npm/commit/412401fb6a19b18f3e02d97a24d4dafed650c186) [#8197](https://github.com/npm/npm/issues/8197) `semver@4.3.4`: Switch from `BSD` to `ISC` for license, where the latter is valid SPDX. ([@othiym23](https://github.com/othiym23))
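
For instance, a dual-licensed package (the name here is made up) can now declare this, and npm won't complain:

```json
{
  "name": "example-dual-licensed",
  "version": "1.0.0",
  "license": "(MIT OR Apache-2.0)"
}
```
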
As a corollary to the previous changes, I've put some work into making `npm install` spew out fewer pointless warnings about missing values in transitive dependencies. From now on, npm will only warn you about missing READMEs, license fields, and the like for top-level projects (including packages you directly install into your application, but we may relax that eventually).

Practically _nobody_ liked having those warnings displayed for child dependencies, for the simple reason that there was very little that anybody could _do_ about those warnings, unless they happened to be the maintainers of those dependencies themselves. Since many, many projects don't have SPDX-compliant licenses, the number of warnings reached a level where they ran the risk of turning into a block of visual noise that developers (read: me, and probably you) would ignore forever.

So I fixed it. If you still want to see the messages about child dependencies, they're still there, but have been pushed down a logging level to `info`. You can display them by running `npm install -d` or `npm install --loglevel=info`.

* [`eb18245`](https://github.com/npm/npm/commit/eb18245f55fb4cd62a36867744bcd1b7be0a33e2) Only warn on normalization errors for top-level dependencies. Transitive dependency validation warnings are logged at `info` level. ([@othiym23](https://github.com/othiym23))

#### BUG FIXES

* [`e40e809`](https://github.com/npm/npm/commit/e40e8095d2bc9fa4eb8f01aa22067e0068fa8a54) `tap@1.0.1`: TAP: The Next Generation. Fix up many tests so they work properly with the new major version of `node-tap`. Look at all the colors! ([@isaacs](https://github.com/isaacs))
* [`f9314e9`](https://github.com/npm/npm/commit/f9314e97d26532c0ef2b03e98f3ed300b7cd5026) `nock@1.9.0`: Minor tweaks and bug fixes. ([@pgte](https://github.com/pgte))
* [`45c2b1a`](https://github.com/npm/npm/commit/45c2b1aaa051733fa352074994ae6e569fd51e8b) [#8187](https://github.com/npm/npm/issues/8187) `npm ls` wasn't properly recognizing dependencies installed from GitHub repositories as git dependencies, and so wasn't displaying them as such. ([@zornme](https://github.com/zornme))
* [`1ab57c3`](https://github.com/npm/npm/commit/1ab57c38116c0403965c92bf60121f0f251433e4) In some cases, `npm help` was using something that looked like a regular expression where a glob pattern should be used, and vice versa. ([@isaacs](https://github.com/isaacs))

### v2.9.1 (2015-04-30):

#### WOW! MORE GIT FIXES! YOU LOVE THOSE!

The first item below is actually a pretty big deal, as it fixes (with a one-word change and a much, much longer test case (thanks again, [@iarna](https://github.com/iarna))) a regression that's been around for months now. If you're depending on multiple branches of a single git dependency in a single project, you probably want to check out `npm@2.9.1` and verify that things (again?) work correctly in your project.

* [`178a6ad`](https://github.com/npm/npm/commit/178a6ad540215820d16217465a5f220d8c95a313) [#7202](https://github.com/npm/npm/issues/7202) When caching git dependencies, do so by the whole URL, including the branch name, so that if a single application depends on multiple branches from the same repository (in practice, multiple version tags), every install is of the correct version, instead of reusing whichever branch the caching process happened to check out first. ([@iarna](https://github.com/iarna))
* [`63b79cc`](https://github.com/npm/npm/commit/63b79ccde092a9cb3b1f34abe43e1d2ba69c0dbf) [#8084](https://github.com/npm/npm/issues/8084) Ensure that Bitbucket, GitHub, and Gitlab dependencies are installed the same way as non-hosted git dependencies, fixing `npm install --link`. ([@laiso](https://github.com/laiso))

#### DOCUMENTATION FIXES AND TWEAKS

These changes may seem simple and small (except Lin's fix to the package name restrictions, which was more an egregious oversight on our part), but cleaner documentation makes npm significantly more pleasant to use. I really appreciate all the typo fixes, clarifications, and formatting tweaks people send us, and am delighted that we get so many of these pull requests. Thanks, everybody!

* [`ca478dc`](https://github.com/npm/npm/commit/ca478dcaa29b8f07cd6fe515a3c4518166819291) [#8137](https://github.com/npm/npm/issues/8137) Somehow, we had failed to clearly document the full restrictions on package names. [@linclark](https://github.com/linclark) has now fixed that, although we will take with us to our graves the reasons why the maximum package name length is 214 characters (well, OK, it's because that was the longest name in the registry when we decided to put a cap on the name length). ([@linclark](https://github.com/linclark))
* [`b574076`](https://github.com/npm/npm/commit/b5740767c320c1eff3576a8d63952534a0fbb936) [#8079](https://github.com/npm/npm/issues/8079) Make the `npm shrinkwrap` documentation use code formatting for examples consistently. It would be great to do this for more commands HINT HINT. ([@RichardLitt](https://github.com/RichardLitt))
* [`1ff636e`](https://github.com/npm/npm/commit/1ff636e2db3852a53e38c866fed7eafdacd307fc) [#8105](https://github.com/npm/npm/issues/8105) Document that the global `npmrc` goes in `$PREFIX/etc/npmrc`, instead of `$PREFIX/npmrc`. ([@anttti](https://github.com/anttti))
* [`c3f2f7c`](https://github.com/npm/npm/commit/c3f2f7c299342e1c1eccc55a976a63c607f51621) [#8127](https://github.com/npm/npm/issues/8127) Document how to use `npm run build` directly (hint: it's different from `npm build`!). ([@mikemaccana](https://github.com/mikemaccana))
* [`873e467`](https://github.com/npm/npm/commit/873e46757e1986761b15353f94580a071adcb383) [#8069](https://github.com/npm/npm/issues/8069) Take the old, dead npm mailing list address out of `package.json`. It seems that people don't have much trouble figuring out how to report errors to npm. ([@robertkowalski](https://github.com/robertkowalski))

#### ENROBUSTIFICATIONMENT

* [`5abfc9c`](https://github.com/npm/npm/commit/5abfc9c9017da714e47a3aece750836b4f9af6a9) [#7973](https://github.com/npm/npm/issues/7973) `npm run-script` completion will only suggest run scripts, instead of including dependencies. If for some reason you still wanted it to suggest dependencies, let us know. ([@mantoni](https://github.com/mantoni))
* [`4b564f0`](https://github.com/npm/npm/commit/4b564f0ce979dc74c09604f4d46fd25a2ee63804) [#8081](https://github.com/npm/npm/issues/8081) Use `osenv` to parse the environment's `PATH` in a platform-neutral way. ([@watilde](https://github.com/watilde))
* [`a4b6238`](https://github.com/npm/npm/commit/a4b62387b41848818973eeed056fd5c6570274f3) [#8094](https://github.com/npm/npm/issues/8094) When we refactored the configuration code to split out checking for IPv4 local addresses, we inadvertently completely broke it by failing to return the values. In addition, just the call to `os.networkInterfaces()` could throw on systems where querying the network configuration requires elevated privileges (e.g. Amazon Lambda). Add the return, and trap errors so they don't cause npm to explode. Thanks to [@mhart](https://github.com/mhart) for bringing this to our attention! ([@othiym23](https://github.com/othiym23))

#### DEPENDENCY UPDATES WAIT FOR NO SOPHONT

* [`000cd8b`](https://github.com/npm/npm/commit/000cd8b52104942ac3404f0ad0651d82f573da37) `rimraf@2.3.3`: More informative assertions on argument validation failure. ([@isaacs](https://github.com/isaacs))
* [`530a2e3`](https://github.com/npm/npm/commit/530a2e369128270f3e098f0e9be061533003b0eb) `lru-cache@2.6.2`: Revert to old key access-time behavior, as it was correct all along. ([@isaacs](https://github.com/isaacs))
* [`d88958c`](https://github.com/npm/npm/commit/d88958ca02ce81b027b9919aec539d0145875a59) `minimatch@2.0.7`: Feature detection and test improvements. ([@isaacs](https://github.com/isaacs))
* [`3fa39e4`](https://github.com/npm/npm/commit/3fa39e4d492609d5d045033896dcd99f7b875329) `nock@1.7.1` ([@pgte](https://github.com/pgte))

### v2.9.0 (2015-04-23):

This week was kind of a breather to concentrate on fixing up the tests on the `multi-stage` branch, and not mess with git issues for a little while. Unfortunately, there are now enough severe git issues that we'll probably have to spend another couple weeks tackling them. In the meantime, enjoy these two small features. They're just enough to qualify for a semver-minor bump:

#### NANOFEATURES

* [`2799322`](https://github.com/npm/npm/commit/279932298ce5b589c5eea9439ac40b88b99c6a4a) [#7426](https://github.com/npm/npm/issues/7426) Include local modules in `npm outdated` and `npm update`. ([@ArnaudRinquin](https://github.com/ArnaudRinquin))
* [`2114862`](https://github.com/npm/npm/commit/21148620fa03a582f4ec436bb16bd472664f2737) [#8014](https://github.com/npm/npm/issues/8014) The prefix used before the version on version tags is now configurable via `tag-version-prefix`. Be careful with this one and read the docs before using it. ([@kkragenbrink](https://github.com/kkragenbrink))
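
As a sketch of what that means in practice, setting it to the empty string makes `npm version` create tags without the leading `v` (this affects every tag npm creates afterward, which is exactly why the warning above applies):

```sh
npm config set tag-version-prefix ""
npm version patch   # now tags the release as "2.9.1" instead of "v2.9.1"
```
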

#### OTHER MINOR TWEAKS

* [`18ce0ec`](https://github.com/npm/npm/commit/18ce0ecd2d94ad3af01e997f1396515892dd363c) [#3032](https://github.com/npm/npm/issues/3032) `npm unpublish` will now use the registry set in `package.json`, just like `npm publish`. This only applies, for now, when unpublishing the entire package, as unpublishing a single version requires the name be included on the command line and therefore doesn't read from `package.json`. ([@watilde](https://github.com/watilde))
* [`9ad2100`](https://github.com/npm/npm/commit/9ad210042242e51d52b2a8b633d8e59248f5faa4) [#8008](https://github.com/npm/npm/issues/8008) Once again, when considering what to install on `npm install`, include `devDependencies`. ([@smikes](https://github.com/smikes))
* [`5466260`](https://github.com/npm/npm/commit/546626059909dca1906454e820ca4e315c1795bd) [#8003](https://github.com/npm/npm/issues/8003) Clarify the documentation around scopes to make it easier to understand how they support private packages. ([@smikes](https://github.com/smikes))

#### DEPENDENCIES WILL NOT STOP UNTIL YOU ARE VERY SLEEPY

* [`faf65a7`](https://github.com/npm/npm/commit/faf65a7bbb2fad13216f64ed8f1243bafe743f97) `init-package-json@1.4.2`: If there are multiple validation errors and warnings, ensure they all get displayed (includes a rad new way of testing `init-package-json` contributed by [@michaelnisi](https://github.com/michaelnisi)). ([@MisumiRize](https://github.com/MisumiRize))
* [`7f10f38`](https://github.com/npm/npm/commit/7f10f38d29a8423d7cde8103fa7b64ac728da1e0) `editor@1.0.0`: `1.0.0` is literally more than `0.1.0` (no change aside from version number). ([@substack](https://github.com/substack))
* [`4979af3`](https://github.com/npm/npm/commit/4979af3fcae5a3962383b7fdad3162381e62eefe) [#6805](https://github.com/npm/npm/issues/6805) `npm-registry-client@6.3.3`: Decode scoped package names sent by the registry so they look nicer. ([@mmalecki](https://github.com/mmalecki))

### v2.8.4 (2015-04-16):

This is the fourth release of npm this week, so it's mostly just landing a few small outstanding PRs on dependencies and some tiny documentation tweaks. `npm@2.8.3` is where the real action is.

* [`ee2bd77`](https://github.com/npm/npm/commit/ee2bd77f3c64d38735d1d31028224a5c40422a9b) [#7983](https://github.com/npm/npm/issues/7983) `tar@2.1.0`: Better error reporting in corrupted tar files, and add support for the `fromBase` flag (rescued from the dustbin of history by [@deanmarano](https://github.com/deanmarano)). ([@othiym23](https://github.com/othiym23))
* [`d8eee6c`](https://github.com/npm/npm/commit/d8eee6cf9d2ff7aca68dfaed2de76824a3e0d9af) `init-package-json@1.4.1`: Add support for a default author, and only add scope to a package name once. ([@othiym23](https://github.com/othiym23))
* [`4fc5d98`](https://github.com/npm/npm/commit/4fc5d98b785f601c60d4dc0a2c8674f0cccf6262) `lru-cache@2.6.1`: Small tweaks to cache value aging and entry counting that are irrelevant to npm. ([@isaacs](https://github.com/isaacs))
* [`1fe5840`](https://github.com/npm/npm/commit/1fe584089f5bef133de5518aa26eaf6064be2bf7) [#7946](https://github.com/npm/npm/issues/7946) Make `npm init` text friendlier. ([@sandfox](https://github.com/sandfox))

### v2.8.3 (2015-04-15):

#### TWO SMALL GIT TWEAKS

This is the last of a set of releases intended to ensure npm's git support is robust enough that we can stop working on it for a while. These fixes are small, but prevent a common crasher and clear up one of the more confusing error messages coming out of npm when working with repositories hosted on git.

* [`387f889`](https://github.com/npm/npm/commit/387f889c0e8fb617d9cc9a42ed0a3ec49424ab5d) [#7961](https://github.com/npm/npm/issues/7961) Ensure that hosted git SSH URLs always have a valid protocol when stored in `resolved` fields in `npm-shrinkwrap.json`. ([@othiym23](https://github.com/othiym23))
* [`394c2f5`](https://github.com/npm/npm/commit/394c2f5a1227232c0baf42fbba1402aafe0d6ffb) Switch the order in which hosted Git providers are checked to `git:`, `git+https:`, then `git+ssh:` (from `git:`, `git+ssh:`, then `git+https:`) in an effort to go from most to least likely to succeed, to make for less confusing error messages. ([@othiym23](https://github.com/othiym23))

### v2.8.2 (2015-04-14):

#### PEACE IN OUR TIME

npm has been having an issue with CouchDB's web server since the release of io.js and Node.js 0.12.0 that has consumed a huge amount of my time to little visible effect. Sam Mikes picked up the thread from me, and after a [_lot_ of effort](https://github.com/npm/npm/issues/7699#issuecomment-93091111) figured out that ultimately there are probably a couple problems with the new HTTP Agent keep-alive handling in new versions of Node. In addition, `npm-registry-client` was gratuitously sending a body along with a GET request, which was triggering the bugs. Sam removed about 10 bytes from one file in `npm-registry-client`, and this problem, which has been bugging us for months, completely went away.

In conclusion, Sam Mikes is great, and anybody using a private registry hosted on CouchDB should thank him for his hard work.
Also, thanks to the community at large for pitching in on this bug, which has been around for months now.

* [`431c3bf`](https://github.com/npm/npm/commit/431c3bf6cdec50f9f0c735f478cb2f3f337d3313) [#7699](https://github.com/npm/npm/issues/7699) `npm-registry-client@6.3.2`: Don't send body with HTTP GET requests when logging in. ([@smikes](https://github.com/smikes))

### v2.8.1 (2015-04-12):

#### CORRECTION: NPM'S GIT INTEGRATION IS DOING OKAY

A [helpful bug report](https://github.com/npm/npm/issues/7872#issuecomment-91809553) led to another round of changes to [`hosted-git-info`](https://github.com/npm/hosted-git-info/commit/827163c74531b69985d1ede7abced4861e7b0cd4), some additional test-writing, and a bunch of hands-on testing against actual private repositories. While the complexity of npm's git dependency handling is nearly fractal (because npm is very complex, and git is even more complex), it's feeling way more solid than it has for a while. We think this is a substantial improvement over what we had before, so give `npm@2.8.1` a shot if you have particularly complex git use cases and [let us know](https://github.com/npm/npm/issues/new) how it goes.

(NOTE: These changes mostly affect cloning and saving references to packages hosted in git repositories, and don't address some known issues with things like lifecycle scripts not being run on npm dependencies. Work continues on other issues that affect parity between git and npm registry packages.)

* [`66377c6`](https://github.com/npm/npm/commit/66377c6ece2cf4d53d9a618b7d9824e1452bc293) [#7872](https://github.com/npm/npm/issues/7872) `hosted-git-info@2.1.2`: Pass through credentials embedded in SSH and HTTPS git URLs. ([@othiym23](https://github.com/othiym23))
* [`15efe12`](https://github.com/npm/npm/commit/15efe124753257728a0ddc64074fa5a4b9c2eb30) [#7872](https://github.com/npm/npm/issues/7872) Use the new version of `hosted-git-info` to pass along credentials embedded in git URLs. Test it. Test it a lot. ([@othiym23](https://github.com/othiym23))

#### SCOPED DEPENDENCIES AND PEER DEPENDENCIES: NOT QUITE REESE'S

Big thanks to [@ewie](https://github.com/ewie) for identifying an issue with how npm was handling `peerDependencies` that were implicitly installed from the `package.json` files of scoped dependencies. This [will be a moot point](https://github.com/npm/npm/issues/6565#issuecomment-74971689) with the release of `npm@3`, but until then, it's important that `peerDependency` auto-installation work as expected.

* [`b027319`](https://github.com/npm/npm/commit/b0273190c71eba14395ddfdd1d9f7ba625297523) [#7920](https://github.com/npm/npm/issues/7920) Scoped packages with `peerDependencies` were installing the `peerDependencies` into the wrong directory. ([@ewie](https://github.com/ewie))
* [`649e31a`](https://github.com/npm/npm/commit/649e31ae4fd02568bae5dc6b4ea783431ce3d63e) [#7920](https://github.com/npm/npm/issues/7920) Test `peerDependency` installs involving scoped packages using `npm-package-arg` instead of simple path tests, for consistency. ([@othiym23](https://github.com/othiym23))

#### MAKING IT EASIER TO WRITE NPM TESTS, VERSION 0.0.1

[@iarna](https://github.com/iarna) and I ([@othiym23](https://github.com/othiym23)) have been discussing a [candidate plan](https://github.com/npm/npm/wiki/rewriting-npm's-tests:-a-plan-maybe) for improving npm's test suite, with the goal of making it easier for new contributors to get involved with npm by reducing the learning curve necessary to be able to write good tests for proposed changes.
This is the first substantial piece of that effort. Here's what the commit message for [`ed7e249`](https://github.com/npm/npm/commit/ed7e249d50444312cd266942ce3b89e1ca049bdf) had to say about this work:

> It's too difficult for npm contributors to figure out what the conventional
> style is for tests. Part of the problem is that the documentation in
> CONTRIBUTING.md is inadequate, but another important factor is that the tests
> themselves are written in a variety of styles. One of the most notable
> examples of this is the fact that many tests use fixture directories to store
> precooked test scenarios and package.json files.
>
> This had some negative consequences:
>
> * tests weren't idempotent
> * subtle dependencies between tests existed
> * new tests get written in this deprecated style because it's not
>   obvious that the style is out of favor
> * it's hard to figure out why a lot of those directories existed,
>   because they served a variety of purposes, so it was difficult to
>   tell when it was safe to remove them
>
> All in all, the fixture directories were a major source of technical debt, and
> cleaning them up, while time-consuming, makes the whole test suite much more
> approachable, and makes it more likely that new tests written by outside
> contributors will follow a conventional style. To support that, all of the
> tests touched by this change were cleaned up to pass the `standard` style
> checker.

And here's a little extra context from a comment I left on [#7929](https://github.com/npm/npm/issues/7929):

> One of the other things that encouraged me was looking at this
> [presentation on technical debt](http://www.slideshare.net/nnja/pycon-2015-technical-debt-the-monster-in-your-closet)
> from Pycon 2015, especially slide 53, which I interpreted in terms of
> difficulty getting new contributors to submit patches to an OSS project like
> npm.

npm has a long ways to go, but I feel good about this change.

* [`ed7e249`](https://github.com/npm/npm/commit/ed7e249d50444312cd266942ce3b89e1ca049bdf) [#7929](https://github.com/npm/npm/issues/7929) Eliminate fixture directories from `test/tap`, leaving each test self-contained. ([@othiym23](https://github.com/othiym23))
* [`4928d30`](https://github.com/npm/npm/commit/4928d30140821c63e03fffed73f8d88ebdc43710) [#7929](https://github.com/npm/npm/issues/7929) Move fixture files from `test/tap/*` to `test/fixtures`. ([@othiym23](https://github.com/othiym23))
* [`e925deb`](https://github.com/npm/npm/commit/e925debca91092a814c1a00933babc3a8cf975be) [#7929](https://github.com/npm/npm/issues/7929) Tweak the run scripts to stop slaughtering the CPU on doc rebuild. ([@othiym23](https://github.com/othiym23))
* [`65bf7cf`](https://github.com/npm/npm/commit/65bf7cffaf91c426b676c47529eee796f8b8b75c) [#7923](https://github.com/npm/npm/issues/7923) Use an alias of scripts and run-scripts in `npm run test-all` ([@watilde](https://github.com/watilde))
* [`756a3fb`](https://github.com/npm/npm/commit/756a3fbb852a2469afe706635ed88d22c37743e5) [#7923](https://github.com/npm/npm/issues/7923) Sync timeout time of `npm run-script test-all` to be the same as `test` and `tap` scripts. ([@watilde](https://github.com/watilde))
* [`8299b5f`](https://github.com/npm/npm/commit/8299b5fb6373354a7fbaab6f333863758812ae90) Set a timeout for tap tests for `npm run-script test-all`. ([@othiym23](https://github.com/othiym23))

#### THE EVER-BEATING DRUM OF DEPENDENCY UPDATES

* [`d90d0b9`](https://github.com/npm/npm/commit/d90d0b992acbf62fd5d68debf9d1dbd6cfa20804) [#7924](https://github.com/npm/npm/issues/7924) Remove `child-process-close`, as it was included for Node 0.6 compatibility, and npm no longer supports 0.6. ([@robertkowalski](https://github.com/robertkowalski))
* [`16427c1`](https://github.com/npm/npm/commit/16427c1f3ea3d71ee753c62eb4c2663c7b32b84f) `lru-cache@2.5.2`: More accurate updating of expiry times when `maxAge` is set. ([@isaacs](https://github.com/isaacs))
* [`03cce83`](https://github.com/npm/npm/commit/03cce83b64344a9e0fe036dce214f4d68cfcc9e7) `nock@1.6.0`: Mocked network error handling. ([@pgte](https://github.com/pgte))
* [`f93b1f0`](https://github.com/npm/npm/commit/f93b1f0b7eb5d1b8a7967e837bbd756db1091d00) `glob@5.0.5`: Use `path-is-absolute` polyfill, allowing newer Node.js and io.js versions to use `path.isAbsolute()`. ([@sindresorhus](https://github.com/sindresorhus))
* [`a70d694`](https://github.com/npm/npm/commit/a70d69495a6e96997e64855d9e749d943ee6d64f) `request@2.55.0`: Bug fixes and simplification. ([@simov](https://github.com/simov))
* [`2aecc6f`](https://github.com/npm/npm/commit/2aecc6f4083526feeb14615b4e5484edc66175b5) `columnify@1.5.1`: Switch to using babel from 6to5. ([@timoxley](https://github.com/timoxley))

### v2.8.0 (2015-04-09):

#### WE WILL NEVER BE DONE FIXING NPM'S GIT SUPPORT

If you look at [the last release's release notes](https://github.com/npm/npm/blob/master/CHANGELOG.md#git-mean-git-tuff-git-all-the-way-away-from-my-stuff), you will note that they confidently assert that it's perfectly OK to force all GitHub URLs through the same `git:` -> `git+ssh:` fallback flow for cloning. It turns out that many users depend on `git+https:` URLs in their build environments because they use GitHub auth tokens instead of SSH keys. Also, in some cases you just want to be able to explicitly say how a given dependency should be cloned from GitHub.

Because of the way we resolved the inconsistency in GitHub shorthand handling [before](https://github.com/npm/npm/blob/master/CHANGELOG.md#bug-fixes-1), this turned out to be difficult to work around. So instead of hacking around it, we completely redid how git is handled within npm and its attendant packages. Again.

This time, we changed things so that `normalize-package-data` and `read-package-json` leave more of the git logic to npm itself, which makes handling shorthand syntax consistently much easier, and also allows users to resume using explicit, fully-qualified git URLs without npm messing with them.

Here's a summary of what's changed:

* Instead of converting the GitHub shorthand syntax to a `git+ssh:`, `git:`, or `git+https:` URL and saving that, save the shorthand itself to `package.json`.
* If presented with shortcuts, try cloning via the git protocol, SSH, and HTTPS (in that order).
* No longer prompt for credentials -- it didn't work right with the spinner, and wasn't guaranteed to work anyway. We may experiment with doing this a better way in the future. Users can override this by setting `GIT_ASKPASS` in their environment if they want to experiment with interactive cloning, but should also set `--no-spin` on the npm command line (or run `npm config set spin=false`).
* **EXPERIMENTAL FEATURE**: Add support for `github:`, `gist:`, `bitbucket:`, and `gitlab:` shorthand prefixes.
  GitHub shortcuts will continue to be normalized to `org/repo` instead of being saved as `github:org/repo`, but `gitlab:`, `gist:`, and `bitbucket:` prefixes will be used on the command line and from `package.json`. BE CAREFUL WITH THIS. `package.json` files published with the new shorthand syntax can _only_ be read by `npm@2.8.0` and later, and this feature is mostly meant for playing around with it. If you want to save git dependencies in a form that older versions of npm can read, use `--save-exact`, which will save the git URL and resolved commit hash of the head of the branch in a manner similar to the way that `--save-exact` pins versions for registry dependencies. This is documented (so check `npm help install` for details), but we're not going to make a lot of noise about it until it has a chance to bake in a little more.

It is [@othiym23](https://github.com/othiym23)'s sincere hope that this will resolve all of the inconsistencies users were seeing with GitHub and git-hosted packages, but given the level of change here, that may just be a fond wish. Extra testing of this change is requested.
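
Here's a sketch of the new shorthand in action (the user and repository names are made up):

```sh
npm install --save someuser/somelib             # GitHub shorthand, saved as "someuser/somelib"
npm install --save bitbucket:someuser/somelib   # saved as "bitbucket:someuser/somelib"
npm install --save gitlab:someuser/somelib      # npm@2.8.0+ only, per the caveat above

# To save in a form older npms can read: pin the resolved git URL and commit hash.
npm install --save --save-exact someuser/somelib
```
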
* [`6b0f588`](https://github.com/npm/npm/commit/6b0f58877f37df9904490ffbaaad33862bd36dce)
  [#7867](https://github.com/npm/npm/issues/7867) Use git shorthand and git
  URLs as presented by user. Support new `hosted-git-info` shortcut syntax.
  Save shorthand in `package.json`. Try cloning via `git:`, `git+ssh:`, and
  `git+https:`, in that order, when supported by the underlying hosting
  provider. ([@othiym23](https://github.com/othiym23))
* [`75d4267`](https://github.com/npm/npm/commit/75d426787869d54ca7400408f562f971b34649ef)
  [#7867](https://github.com/npm/npm/issues/7867) Document new GitHub, GitHub
  gist, Bitbucket, and GitLab shorthand syntax.
  ([@othiym23](https://github.com/othiym23))
* [`7d92c75`](https://github.com/npm/npm/commit/7d92c7592998d90ec883fa989ca74f04ec1b93de)
  [#7867](https://github.com/npm/npm/issues/7867) When `--save-exact` is used
  with git shorthand or URLs, save the fully-resolved URL, with branch name
  resolved to the exact hash for the commit checked out.
  ([@othiym23](https://github.com/othiym23))
* [`9220e59`](https://github.com/npm/npm/commit/9220e59f8def8c82c6d331a39ba29ad4c44e3a9b)
  [#7867](https://github.com/npm/npm/issues/7867) Ensure that non-prefixed
  and non-normalized GitHub shortcuts are saved to `package.json`.
  ([@othiym23](https://github.com/othiym23))
* [`dd398e9`](https://github.com/npm/npm/commit/dd398e98a8eba27eeba84378200da3d078fdf980)
  [#7867](https://github.com/npm/npm/issues/7867) `hosted-git-info@2.1.1`:
  Ensure that `gist:` shorthand survives being round-tripped through
  `package.json`. ([@othiym23](https://github.com/othiym23))
* [`33d1420`](https://github.com/npm/npm/commit/33d1420bf2f629332fceb2ac7e174e63ac48f96a)
  [#7867](https://github.com/npm/npm/issues/7867) `hosted-git-info@2.1.0`:
  Add support for auth embedded directly in git URLs.
  ([@othiym23](https://github.com/othiym23))
* [`23a1d5a`](https://github.com/npm/npm/commit/23a1d5a540e8db27f5cd0245de7c3694e2bddad1)
  [#7867](https://github.com/npm/npm/issues/7867) `hosted-git-info@2.0.2`:
  Make it possible to determine in which form a hosted git URL was passed.
  ([@iarna](https://github.com/iarna))
* [`eaf75ac`](https://github.com/npm/npm/commit/eaf75acb718611ad5cfb360084ec86938d9c66c5)
  [#7867](https://github.com/npm/npm/issues/7867)
  `normalize-package-data@2.0.0`: Normalize GitHub specifiers so they pass
  through shortcut syntax and preserve explicit URLs.
  ([@iarna](https://github.com/iarna))
* [`95e0535`](https://github.com/npm/npm/commit/95e0535e365e0aca49c634dd2061a0369b0475f1)
  [#7867](https://github.com/npm/npm/issues/7867) `npm-package-arg@4.0.0`:
  Add git URL and shortcut to hosted git spec and use
  `hosted-git-info@2.0.2`. ([@iarna](https://github.com/iarna))
* [`a808926`](https://github.com/npm/npm/commit/a8089268d5f3d57f42dbaba02ff6437da5121191)
  [#7867](https://github.com/npm/npm/issues/7867)
  `realize-package-specifier@3.0.0`: Use `npm-package-arg@4.0.0` and test
  shortcut specifier behavior. ([@iarna](https://github.com/iarna))
* [`6dd1e03`](https://github.com/npm/npm/commit/6dd1e039bddf8cf5383343f91d84bc5d78acd083)
  [#7867](https://github.com/npm/npm/issues/7867) `init-package-json@1.4.0`:
  Allow dependency on `read-package-json@2.0.0`.
  ([@iarna](https://github.com/iarna))
* [`63254bb`](https://github.com/npm/npm/commit/63254bb6358f66752aca6aa1a275271b3ae03f7c)
  [#7867](https://github.com/npm/npm/issues/7867) `read-installed@4.0.0`: Use
  `read-package-json@2.0.0`. ([@iarna](https://github.com/iarna))
* [`254b887`](https://github.com/npm/npm/commit/254b8871f5a173bb464cc5b0ace460c7878b8097)
  [#7867](https://github.com/npm/npm/issues/7867) `read-package-json@2.0.0`:
  Use `normalize-package-data@2.0.0`. ([@iarna](https://github.com/iarna))
* [`0b9f8be`](https://github.com/npm/npm/commit/0b9f8be62fe5252abe54d49e36a696f4816c2eca)
  [#7867](https://github.com/npm/npm/issues/7867) `npm-registry-client@6.3.0`:
  Mark compatibility with `normalize-package-data@2.0.0` and
  `npm-package-arg@4.0.0`. ([@iarna](https://github.com/iarna))
* [`f40ecaa`](https://github.com/npm/npm/commit/f40ecaad68f77abc50eb6f5b224e31dec3d250fc)
  [#7867](https://github.com/npm/npm/issues/7867) Extract a common method to
  use when cloning git repos for testing.
  ([@othiym23](https://github.com/othiym23))

#### TEST FIXES FOR NODE 0.8

npm continues to [get closer](https://github.com/npm/npm/issues/7842) to
being completely green on Travis for Node 0.8.

* [`26d36e9`](https://github.com/npm/npm/commit/26d36e9cf0eca69fe1863d2ea536c28555b9e8de)
  [#7842](https://github.com/npm/npm/issues/7842) When spawning child
  processes, map exit code 127 to ENOENT so Node 0.8 handles child process
  failures the same as later versions.
  ([@SonicHedgehog](https://github.com/SonicHedgehog))
* [`54cd895`](https://github.com/npm/npm/commit/54cd8956ea783f96749e46597d8c2cb9397c5d5f)
  [#7842](https://github.com/npm/npm/issues/7842) Node 0.8 requires -e with
  -p when evaluating snippets; fix test.
  ([@SonicHedgehog](https://github.com/SonicHedgehog))

#### SMALL FIX AND DOC TWEAK

* [`20e9003`](https://github.com/npm/npm/commit/20e90031b847e9f7c7168f3dad8b1e526f9a2586)
  `tar@2.0.1`: Fix regression where relative symbolic links within an
  extraction root that pointed within an extraction root would get
  normalized to absolute symbolic links.
  ([@isaacs](https://github.com/isaacs))
* [`2ef8898`](https://github.com/npm/npm/commit/2ef88989c41bee1578570bb2172c90ede129dbd1)
  [#7879](https://github.com/npm/npm/issues/7879) Better document that
  `npm publish --tag=foo` will not set `latest` to that version.
  ([@linclark](https://github.com/linclark))

### v2.7.6 (2015-04-02):

#### GIT MEAN, GIT TUFF, GIT ALL THE WAY AWAY FROM MY STUFF

Part of the reason that we're reluctant to take patches to how npm deals
with git dependencies is that every time we touch the git support, something
breaks. The last few releases are a case in point. `npm@2.7.4` completely
broke installing private modules from GitHub, and `npm@2.7.5` fixed them at
the cost of logging a misleading error message that caused many people to
believe that their dependencies hadn't been successfully installed when they
actually had been.

This all started from a desire to ensure that GitHub shortcut syntax is
being handled correctly. The correct behavior is for npm to try to clone all
dependencies on GitHub (whether they're specified with the GitHub
`organization/repository` shortcut syntax or not) via the plain `git:`
protocol first, and to fall back to using `git+ssh:` if `git:` doesn't work.
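In terms of plain git commands, that intended fallback looks roughly like
this -- the `org/repo` name is a placeholder, and `git+ssh:` is npm's
notation for git's `ssh://` scheme; this sketches the ordering only, not
npm's actual implementation:

    # Placeholder repository; try the plain git protocol, then fall back to SSH.
    git clone git://github.com/org/repo.git ||
      git clone ssh://git@github.com/org/repo.git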
Previously, npm would use `git:` and `git+ssh:` in some cases (most notably
when using GitHub shortcut syntax on the command line), and `git+https:` in
others (when the GitHub shortcut syntax was present in `package.json`). This
led to subtle and hard-to-understand inconsistencies, and we're glad that as
of `npm@2.7.6`, we've finally gotten things to where they were before we
started, only slightly more consistent overall.

We are now going to go back to our policy of being extremely reluctant to
touch the code that handles Git dependencies.

* [`b747593`](https://github.com/npm/npm/commit/b7475936f473f029e6a027ba1b16277523747d0b)
  [#7630](https://github.com/npm/npm/issues/7630) Don't automatically log
  all git failures as errors. `maybeGithub` needs to be able to fail without
  logging to support its fallback logic.
  ([@othiym23](https://github.com/othiym23))
* [`cd67a0d`](https://github.com/npm/npm/commit/cd67a0db07891d20871822696c26692c8a84866a)
  [#7829](https://github.com/npm/npm/issues/7829) When fetching a git remote
  URL, handle failures gracefully (without assuming standard output exists).
  ([@othiym23](https://github.com/othiym23))
* [`637c7d1`](https://github.com/npm/npm/commit/637c7d1411fe07f409cf91f2e65fd70685cb253c)
  [#7829](https://github.com/npm/npm/issues/7829) When fetching a git remote
  URL, handle failures gracefully (without assuming standard _error_
  exists). ([@othiym23](https://github.com/othiym23))

#### OTHER SIGNIFICANT FIXES

* [`78005eb`](https://github.com/npm/npm/commit/78005ebb6f4103c20f077669c3929b7ea46a4c0d)
  [#7743](https://github.com/npm/npm/issues/7743) Always quote arguments
  passed to `npm run-script`. This allows build systems and the like to
  safely escape glob patterns passed as arguments to `run-scripts` with
  `npm run-script`; an illustrative invocation follows.
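For example, a glob passed through after `--` should now reach the script
intact rather than being mangled along the way -- the script name and
pattern here are hypothetical:

    # Hypothetical script and glob; the quoted pattern survives unexpanded.
    npm run-script lint -- "lib/**/*.js"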
npm_3.5.2.orig/html/favicon.ico0000644000000000000000000001566612631326456014575 0ustar 00000000000000[binary icon data omitted]
npm_3.5.2.orig/html/index.html0000644000000000000000000000510312631326456014432 0ustar 00000000000000npm - JavaScript Package Manager

npm

npm is a package manager for node. You can use it to install and publish your node programs. It manages dependencies and does other cool stuff.

Easy Zero Line Install

Install Node.js
(npm comes with it.)

Because a one-line install is one too many.

Fancy Install

  1. Get the code.
  2. Do what the README says to do.

There's a pretty thorough install script at https://npmjs.org/install.sh
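One common way to run it, assuming a Unix-ish shell with curl available:

  curl -L https://npmjs.org/install.sh | sh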

For maximum security, make sure to thoroughly inspect every program that you run on your computer!

Other Cool Stuff

npm_3.5.2.orig/html/n-64.png0000644000000000000000000000124712631326456013634 0ustar 00000000000000[binary PNG data omitted]
npm_3.5.2.orig/html/n-large.png0000644000000000000000000000127312631326456014474 0ustar 00000000000000[binary PNG data omitted]
npm_3.5.2.orig/html/npm-16.png0000644000000000000000000000022112631326456014155 0ustar 00000000000000[binary PNG data omitted]
npm_3.5.2.orig/html/npm-256-square.png0000644000000000000000000000633212631326456015552 0ustar 00000000000000[binary PNG data omitted]
npm_3.5.2.orig/html/npm-256w.png0000644000000000000000000000614112631326456014441 0ustar 00000000000000[binary PNG data omitted]
npm_3.5.2.orig/html/npm-64-square.png0000644000000000000000000000553412631326456015466 0ustar 00000000000000[binary PNG data omitted]
npm_3.5.2.orig/html/npm-fin.png0000644000000000000000000013525512631326456014513 0ustar 00000000000000[binary PNG data omitted]
npm_3.5.2.orig/html/npm-large.png0000644000000000000000000166110512631326456015030 0ustar 00000000000000[binary PNG data omitted]
EDZ]!.hZxzzՍ\.xiiimՄ#y̙KGMBHHHHMM5Kr6x8 2!BEQ5GP#t:Iwyq+V(((FlٲsNs6fg6l@_777>i0ك p""" [dn!K t:Fjtr`,Yr? QQQ.]Bneқq}GΉb.{wwoVUU!fĵkn޼I:?4440bpp`0 JgW0ܳ>+ }mܸ\zjqyyy``)5wzݻcbbӲChݺu]"""6ohӨTRM&uvv"2 F!>+:9/8{wxff&u;-߿ߛR7q}kFR `jSXXH:ؿ&HTWWw ZȠO.d2JMW]]_q3f =yUۘ +V@0B/[?OLLdTf\X,VHHY`NW,T*mmm$DJh4E^GXgT*wؑw[.//7)cbbN>mo@Io!y nݪP(&!!!YYYH$ W^Eݸq~+K(ŋ;^YرcjÇGDDFyEb]$phd ɤRiSSX,FXjz86]VVvד 9駟:[-,,$hoo޽6m!i:\uִo]nnnhv2nkkg $I2 XΟ?sww7Wmc&*\x_D3eJzW[ŋ$VH_z5xCp8aaa2\K$&T*DNjEj:'\#,NoΝ[l1hN\\\@oXզ&$$Lp}Y|ɓ'q>`0Я%C Je___Vg lhhDMMM)X^xq</11]d /{WWW1AN&0y)S8znܸ_|o0I'`. F,_<666((1H$RT,)twwcY.`V;44D'' ֞={̛=iӦ 6B'^xxCg)噙\.W"m"K\\\<<::ܹs=ϮgWΞ=C :p@dd$G[o4i H˗I*Ja@8vؚ5kjSNfpDx@T2͒;w.]ôU8\nrr2N`Q,+44 ·\ߋ➞TI 0@G׎T X͛csV… KKKY,P(,+Wxﯬ,//I yz98ٳg/^8\kk9sk׮f̘!fITxw˄p?$i,mѢE˖-C(T*dbqww7GFZNH2V_UUbScRp蹹 &t\,Q7;I07nGSt:#駟Fl_vU*y4/Qa oi$*OJJ'N+///00Q]oYvq~p8aaal6Ѱ[MMMtA&]zaF!Ejz&:9 .//"qqqG1˜3bqH](Z jZTRa;Rqcǎ]%MO:yg<"gԩZbCz===b8,,`!o64ؖR$>D"Adi(HbO۷oy/FfڸqctttmmD/$$iY YR… <\mٲY 0uuu}CCL&yfQ31c`lҀoEg}V(2Ͳ'|񘳢|g6l`.kڈǏ\ p|}} K$mmm@BhtQH:'NK@X,f8`F-*..fX CZZS(v=ooo@(ڱcGrr2)RO```ii)iiiohht䱯O*޸qC.c42']q6lHKKFW\g:KTx/^ *--5͛73Cg||'۲eΝ;ыFƪu~f$H:::d2EQappcDsssww7 /?qկ}]8y/QaŊ o退*WDQ]\\ח/__ԄM>*"""\]]QbX*Kr olliKVӋ$T*:3|!D"aRdK???tے[nMKKo>>>$J)ࠆ{G t:2P(uuu---S￟8؜L&5k[VVpB9^o>})|7K.m=㹹=v|ӦMX,Vhh( @42N˓H$8H'$+$P#L$cLUQQaPXM܏Vpbm۷oƍE㫭.))X,օ +Gh ͛7J%)FNz2*'RQ'{檫mr[uWWy/Q!:::55={˗/8%yYY٢Ez=˖-cڎٳCK ޲e˒%K ڷ~ǥR脄\m8:6]VV`;eZZǓB(,,ill֗/_~I 'ھ}m";4<>>---fvDx7/_{l^^J;9)iZj5R& ,++2 /"?f͛#Hx<Ç3g,].ttt$e777@ヰXɓ'rrq^YYY^^~LV \]8MNNFJKK^x <}͟?ߔJ:::ȡkbgSS:rg}&>7nܷoӺ`DDDnn.N`Q...96//믿=`FQVSE/@''Kfrqq~~~w333\D"A$o駟:NUTTDEEUWW[zC6mڻw/z)JJJbbblhXh訬$AF輅B"6o|+o4444##Վ(7)S7T"ɼLLT9sfyy9s~12SSS0ʕ+333>>`Ν8ႥZ,\T|̋bD{zFL8go߾7RRLHHHIIj'hѢ⟻BHOOx2B 6 +++R)<@z`XA?_QQanٲ7-pu֡KKKLappp̙&~/<} .hm̽{^|[^bZ׿FF@}n% |~qq1@4 R4?@X`,K,)**2KU111 H$v}T(tj`0ol6j޼y86~ ֒ъP(loojjZRQEI x衇\xᇭOr[%*dff^]\\lJ 01QPTTtĉh4:w[0^paݺummm8|||8Nxx+A?uTNN.zF(JV&' ,0q"ȼS5sVF5###44 {Z__sY3ֹk.clG(455ҋ$ X{(:uW_-[Վp^{ M~̙KR }gbKAAŋ"wx/+Lmmm֭pJ(77Hh*++!X@/@i`iYYYX햢DR`ZHIyd?(F0%]hhhFF舉gNHD";EQ;v0%5ĭrX|vRtj$#_nݕ`C!6wYgKTx/&Dɓ'gdd[㒒ݻw;wnpp?x 왟|ɖ-[pdb,?>''G("N@jF  _Yn?w\ttt}}=Z\\N}w=//oʕ86 wvvUTFQ._~&ɰH}?#~m+oB7P[[;|4YO~M;'.]ڳgOQQ零nJJ 'H 5o<fQ---999Jpz^ժT*(zX,VYYق x!mf0~V~/+B?%ȈR*!V%#\8!p_~+[n[t3g,Qx gF%Qaܹ׮]3{+Wܸq쥗^zᇙW{{{ׯ_)`QѸH$gϞtVEXL1o޼]]]aO dB3tT]IEﺕ\.ߺu@D ("~BAOH\v(PPtBBooo}}=ADFF@0J{{7`sON4I\vuppP!-&ۻqK. .DwHT5kL&cxeވ?J,@?tPzzz}}=ڔǏlfCCCɁ6oCKF9~оYYY ۤ###15XZ@@͛CCC-r|~QQ9"pT*V%w49A#,M$y{{XOvvv\\fº@ 0=VOQ,;dXEEE(`hoo 2!~hhH5t:aÆ> qT+otUUU7rJL\pXaވueff|ssf͚Oggg3oQVVF ;XD>8NddY·NW#{SNNfV%W#trkϮ]r'}HHȲevؑjXb޽@\rҥI;G7oF8&HtʕNRɣBy&y$`4({ A6Ţ[aE_|ŋ o XIZZچ Lwݽ{ɓ{%0 N=PVV/~ݽ~"\]]9S999%%%s zFCQZT*-[W_NX]XXخZ*77A|GGv͛WUU 򺺺+W477VU}}}#0? 
؏FDD 6 B+ot{PwHTxN>V;wncc|&V~4e ?x".Lrѷz>**j8݃p8@Ȉe4j5EQ" trdݽsVfƖDĖDءܹsflvUUռyp= DR 7o޸qގh8+:!MN(0jqqqNm۶m߾黐BxiiiVM#444##-ܑFR*QG\vʕ+}mm-bQ$8؜Mniii\wHTرc֭[_tww?&V2}x>t}瓝eB|饗yP|7n$8?Ep8Pe9JroȠp\t:@j5#\]]m6Ν;]__L]`AYYfH_ /^hMx{{#FFt:y:<<,kkk#z{{("6zL+otѢE.\\wHTHNNrh{nܸib%l6[*a$wx fnݺ6%DFFW_^^P z0yEEE y={[H$qqqǎsڵ0_);v$''wAO?-sz)VWWWUU7("Y`111wFlGFlٲsNDcǎ]mozY&'W]fIxD9r!]?裏yt1X,֪U8N`` 1/| VVəNH3s,֪Urss&tw2~~~ۥKB688L,GGGyUUUlNh4Thnnh4jazE0q-V;uT۷oƍ |>7D󛞨044j;1ccEȑ# 0ds> ˗#c!c캺:Düt:U=I)`} T*RRR0XX,VYY=b0x}^C󛞨@.|Ad3gt=z4--M$1ېCӦMcQyM@|}}9Nxx}Nf'؞ ݈t:R6&**"~S )..FN ~7sB1999666>>>H^A̍P(Ă&ꫯQaHTz')IXbE__9lɩ?ѣ׮]IWZE>ƍmGEE9s0(GGp.Ӧ}EEE_}OH0/w#pss ...M"$''gee@;k׮Koŋl6A` www Q[njuuuMLLT*T:444<<<888::E,uܙl8g#WL Z k׮ollԳGyDIppԩÇ_rFvrr_xJ8'$$ %%%'љ?JT*rl L&ChΝ  l6ŋ6ݻw[Ə%&&|Fc f}Lo.(( CLN*655D {zzz{{%f#(>oJBA#Q}͚5}]^B2o̙3߿[oٴi-w(i04.NJEEE:>9( \>0˛_wwwf***,ZY]]]LL̕+Wj"##sss1uחxYbN8XFlooJSSS㝝o&/T* <8?#WtyV1GBwwʕ+KJJ^z%= y',dhh UQQaY_͛ѭ߾}{WWBr### 1>L,[fd2R9999@sww #HJMQ 0L8v2B=<'hhh0r;v1hs$*{믿g!_XŠC}ŬK.`k_~-[Эf466FFFZT` FDD]j1 JUVVV\\? Q*_hM^ño#`h ۶mC(f'>"Hd*yLB 3ϧ|1@LUJZ`0HЬ{]v?ۻNooddٲe?0%ZJ---7np2/9x `Z5556l0~偁bl _ N}}}9rܹ>\d٧uĉ؉  RӓX,GGG'-Y$TyG0\>99T*$'ͥ0K./ l6*iY DXXXppsNtG},h!?hf~X'jzjjjzzS jHR "&gp :1DLxyx<ӧ,{Ur퍍HSSSTTԞ={ccc]]]YD||3gtS*3$dBH\Ellv/Mzn31sf/\`IuiF"O-Z4>>~ͺj@pm\NމpD9O_֬Yn-'z޽wѳoqqq6#ZZZ|S:::,SdDc.on9::X,.hO*++gVJLB \FܵkױcY0ZjS1DbY E(Zk)))7o[ oG2訫ѺyBPToݸq#`ZFͭе}[X!200`;GMKK|rRR#}_~͎>)p@%77ڵk>>>OhyyyFbfV~Pdqɱ,"$$Ľ{:::,/F޲e?~{yy`0pyd?8HxxxhhͶGG  'f!JrrryooG}~cbb{oF$dIJ55502WWWLnZ<ټy3^|y__ֳ^z}ǏBREa-ؙ3gh ZjÉhhh>y0;VQQܵkvriii7@ pqqA!.\fE"%#pGP($CWxmDBFQE `t$`r˖-4r'33е̝uRo$*';UB+**'X~~~nnk8B믮nr tɓ<PDprJTTT}}jDl6… sݡT*3228 H)U(b9JdN \.J(YbEOO`Ztbɒ%Ư7==~x#$QY-`dR>555漝o޽{(122o:u hg.\B -nACt:}i4}ꅻwNMM5-)))3cdڵp% VO&555566677HRZ-Gmydݺuf~í- g?=uThhk;QW_ϷF@IB^^G= YlY?,T*{]zO} q\.7==}u嘘Zmh4ڵkpqKHf/\`A+//@GFڷoD"O`juVggАL&#']Zؚ۷|;^777C2*BU.F rvv...޲eEӧOgggs:RVI\xu991~t:ͽž={8`㍀ fϝ;K/i4ϧD_|EPPs+NW??|pEET*5Ƽ^DQQQ/^D(#""8' UXTT=h ???} /,,X\v/'=|j*++lv]],YB(`hSSSJRVWZ[[E"yN-+{EEE!k?~ȕzxxg*Bm۶BG |>f!`(ŋl62Pt:"V$ŕv3h4w}닎'&&E*IC&4==-GyB 񓓓---MMM7olllFٳ/`Z>' t*;Q!+++..K>|[lj%J*++7mm!]o.//'l3<}8Yjj"`h',,F!H$<kElJ2+++99,V߻wemٳgM$b$nhٵkױc~eX)2'2LѨ7o߸qC$IR ?7l؀8֚5kmb8q͝ RSS3LLL1$*TUU=?p#G|d0cI/]DS)((عsTvZFFF:;;#7xB F"]f@ʆ DMM͛޽aJk) ZM^!OZZZZ=ܾ}{ʕ i4%K(J#כbNT8y~;osOCC>觟~zttT3u?Q]]/9?uqq>k{ѣΝ3Zk׮z*sTUUm߾ k 0(c7M9~8DX.nlvSSNzyyYGDqqq _{~~~X7@OSSSjZT+###urƍ_rLmZ&9zhtt*;Q7''ӧFa=7Dr O?t{̙3]]]F4s&Kd#`P4-<< +++xgϞh4X.OOZCO\T*8` :t(!! 
vٳg3 P{{ڵkIo @wSSS3y$}dt*Y7%?>e2`ZW\ٸq=w\pp*;Qt֭|͆ )j?s?%gMMM/t/HLrssO:aЊzׯw}8š7|333qC r{|c.}@VQQg:;;JJJ,1VV6{\.OKK;xASG0?>777D`Rkkk7ntwwd;i 39 x*,,|W_@ 0:s'*TVVnڴہ_|O!o޸qsSE+[HW cǎ WWW2[lȑ#s/eXt:%%%<ŋX.nJ/\f3?Ukkk===ŝ={;;;1Lt40ZfX<@3SO]rq0]vQaNTs%+xb=Wu떞CI۷𷶐755Q^H$œLf4&&ruup8X>~^3;;GFF 0sL&S 8::jLOt.k-4..ԑLhn߾]WWW[[pgBKRQr/ً/xilvvv+e0ƩkDkx;h'O|)dxx'~JXnԧZ}B'>PHa+W֜vTTTee%Br1ݼr󫪪 0O4τ`)P^^N"fԑ@4t4BhnnBU,+J2J-F@X;wy7/{{{>5wBggիmԧ'|QύXj tuuY}駟>3VЋ***(,֭[Vt[e"`h!!!\.wIKhX>1~zz9lٳg3Pε62U# C Kئ1Px枞B1999::*ge2Iz)))IIIi򗿬3rAAAϟ7N]s$RrMN,dŔEh"J,Zwt7n6QAV[Y{'_ko6Sr_>777==< 1ݻSRR@fdl۶m˖-LKK3ôTYj̙3qqq"hdff"Khz{{o߾=::V'&&U*`z!L2ǐ1//ϽؽދpС}J g6l/@O?t}}=!tݻw;H=9qDll,#`P'22јW$+))z򪭭3,--ef;%h~~~a"(..M٤rl$J2-- H )..FG"[n qpVOOw|=`BEߘi̽@S)iZoX,ֿE988P"+^]aɒ%HFSUo&::ioo۷oߎ;\b@cǎegg' h;Y Dpp0p8p -- p0K.:Ν+))48;;f$E#ѣGlX,뛘Y$qmB(p[əB1WTMfYEx;RTTd5 ?Q罀 =B|CDEE]pKKK r>>>IZ%%%< 1o/:::X,6,L& 9oaHHHPP9dff3D>~qq HDINN~>d688ا%?5M{{{[[Dq ,BCӑ@ ;;; ѿE}݆d#)]'T*#Gϓ]px<T*Au[lALt%|c(zg_www#\ff1 GGG3NsH;ٲeKdd$ $ 80o1ᡡJCZb &&&nݺ8::(xLntJT`0ccch^BJTԙt q$bdHU (xZE-ooob(_Ŋ1C߿(;; 19:go߾,FA3i'doC iih0w$WRRqww?|0b`jD"J3!455 ›7o677ONN"P`hsFCLVTX˛p7$U/$i;;;M522y="cTi}\tqq+ 0_}ʕBX,ֵkjkkwA72>{2JOO', /f_l „ci,ŪUϟ?Oҏ777Q###MMMZW^-((8tPll/+O{p? `r&IT ޘ?蔨` U899_% 4'3 @jw( )SSSh &&&^:&&F$! 㓓ua///ywEEE^^GO(̓CjkkPݶm"""~NPPP(LII< u MMMсN?&'dvZ???EޥnݚYN$`L`1Q`5PLTbggww?CCCD HTn֭vo[D G"xG}+W BmbccBK°]]DDDKKKtt-GX^^^]%''mvt:=))I(ޥM"DXLN(JL&H^/y|||֬YrJ__^{o_M355Yykoo7~昨`͓T*տKɖ/6U0-.]tja%CV)GZ'|2==hAm*,,D4+ 7%%'afeǎ̦`0=Z]]~z$J,aaa\.@YL]:ZjթS D 35"/_|9;;{>+_EGG79$*R3~昨`USK*A,Y;~"mA l_Wm|>`PW\yW֬Yl"_FIIIgΜ B@ָ:tHr֯__]]}QZFYӕggt@@m!!!q_??cSSS_~u=î>֭[w]XXHFL2 i ` .袻$'F:%*ԯJ?gbbB JVT WT!HWppp ۷d2&hᴷիW'&&$m۶mmm KE薖uҥ'NPh䠻jnn>uTRRҶm6lf͚uֱX>khhhbbB.T*FC&L$ t: 7DBٙIŒt˗gwM 6@6_)00U0^^G@СC]]]'N{M[R"""\]]***;zL&3%%Ŋ? jllp8rQL "ɪX7|7l@[lٹs477 bTT*V &g# :{mMV/Ą8;;'-dye˖mڴ|6lHT{1=^^^.J9αcpQ*ǵlْ@y,:>p̊@ #rda TmmmVV־}rHE9̌\X!aaaq`7olhh }}}3Z2 QGR`XQGߖ.]j> ~˗WTTO0lvkgg_zGY)77wzzںgsPVVsϭ]J/GvuuG dffݻW P #ÕOѸ\ncccXXE/66v##⊊FZP… YYYoٺuK/o|>O?%ox/@,[ A0-$*t#Yhd>>`&+XaEJh;Ommmd{?BBBlUPPfp\nddAoCJUYY>L(((birs_o~C{QQQ|yEիuuu8ꂁڵ+999""zxx &寕y1gElsNNοo߾,eF~˖- --rܢGl98|pdddTTTUU՜ضmzh``C{llLTJ^-B(GL$wF3-3DJE'rpp0DMW@U~o۷ϖ5O=TmmmwwΝ;Ϝ9c/D" p8~~~|ֺpdii)b`4?9㋳szzzDDDTTԕ+W>|𸸸gZnbccm\vرcv^?R + ZwDߨ?|rۍ_@ $$*-ӳGGEY.0aƊ4'G$*x[+::wwӧOOMM%$$ddd SrCCCxǎDL ?66v7x{{ws?mBBBLUǙ3gΞ=g zb2hEDdddhh(k|>'ߐk|07cccZCCCRTTd[n!H$ňѸ!588h`'O~f!H)dO裞W^HpppGGG}}=~ 9xʕ+_u:zhWWyA@(AήW~~~l,3X,YmgyyyAA/6d2SSSъ~Ύ;ZZZ"##|oww7B``rJT0eJ2@}2$ɜ܌=<<,ڵkAAAֱwVX'n2r [cի~~~ x 77_\6"`h[lٽ{Mli7+,,$1 FѪ+**02 Ph@H$={dggk4l.]u?8XZT*'''U*MMMWn޼yƍׯ׷MLL P8$$q0 ;::`0YN+*`#,^XBO2 Ԯ ɐ=b^z%{{CqOOO[h`Peee>/Kp8q|U:tHTC۔zeN:wɴ0k_/2U=.\p # fܹ3;;(ҥKl6ƍsdd$zx|>ppF#c^gojjxe vEM JT B2qFggT*%/* ZhtqD"qrrBLhٲeƿg,88ܹsFt.o4Q312hdo NhVb1Q~P&~)J9rt(\=ټys}}}kk+:AfeeYW^1~Z5qwwOII-(( D@~Ntt!n,//''nnnTK8g0999>>>ݒXd)ٚoy<oϒas=p|˗[ZZzzz$BPT:,Ӓ&vժUƯ^79::YܥONNS(Yj݉ d\Maiw}Ct$*Pw+f͚sΑ?ȑ#N? A@|i544Ǐ!,wZ*55p哳񠠠={xZv7n' B.≉qDBɄ`S/_ wEٙpleP2_%wpR`ooo݉ d|NaiK,vlz!EjܹD111{쉎p8Ȼ]0rUUU :,,+((8uTWWD+[,%%dZ!'Nر#**83AIҦnD299Ibxddd||ydZ}!$*:߯!LEd/DjSkf HyX4'NlrΌhPn@@@@mm̴;޽{===:u,&&ƒfH$Z}}}JRPܾ}N(f+*I~ƯWDlпa>Ⱥ'%K`EZŋU*%+JVNNN||T*E@@^}Dcq[RR.^XI=<<Ν;WTTݍÇaO޽;<OK7 PFu6jrFk.C[ ggg"=NMMeX111/_60z{{'%%f?dx`}jT*iMNNNMMNῴ`;\]]2ݦJTk@ _=ܣ!TMdK.S"I6F3PCU.]󧧧<Zee/x<[?__LOOO ͹؎;ZZZX,͆hL&ӜҥK'N04-//ϖ睙I) Xb8Ejhh())ywx㍠uy{{oذ7[oUPPP__,zK.eDwwwԋDydE!dDM'DLh*mɒ%vvvn&*Sc )j%%%MOOGȡjjjYr}( qL466h4Lxxx߿η!؉' fYTmۖ~z8҃H?裏t؀DRLMMT*\.HƵ_~ԩzGy衇x≗_~9%%ٳ"mŠ ? 
4X,ֿJr&&&/ǺO(OT|l-^@2R)Co׭[hጌ$''^:**_``SRRRu_HNNb@ ؿćNZP-8zhuul.0#<<ҴL&0[ZSZV 0͛7oذɓ.cBBB˖ Xffd^&{4-661,,̴[pf|w?7 " @Lkjj :b~_xGx_|1111??իbX.+JZMV졇BL$ ZN.$*%cE],HUi Qy6;T*8h4鬬,LUSSnT}8&yС'N" qa=EZZZ8Y0377nnn垞:OIIAw]dd$J;v@LNRDӧO'''?3O=Ժu6m?!55ԩSW\illlkkH$IX|A0vWjD{߇5bV |P@v 6{WI>#nB@@㓓###9 O#K.;;f0p!̈_ƚBD&600P 2,kF...999dED3/8(&PxͶQT*H$ZF A01-&*W.P29idVTf~fh4ڒ%K Q2L-@f wrׯ_'777===??544 &` `ѣG=޳guf{eXd\ZZjJccct߷nPbzzZ,߼yI$IRB1:::222E^nX`Z&VT@UJTdc~0 ']vیݿoyy^>H\pB\"WѺ)Sh~$ BD~ڈWqqٳ*t6c ӬGyyy+W?ǎC3@}lllYYBZ"""D"AzXڊh/yzzsܟ=O:999&ĝ;wjkkcbbΝ;g@!:uM jR/:::*Ф=FJ>SF!c{r9`LA{{;Go^vuuI#MT(--5k.DbiiI wޝ2eBHhIT0?иe]ՙf=/CCCi/W_=y$)"{m rynnnFF6vҥa^ϟ6xyy2CH-..u',,&Xe:::nJִv鐐4D[jD"ikkkooSTuuu7o޼uyBE&b镕=쳺|󻺺G:ӝC899iR lbU"XJPL>@;ӧO/О|H$ cو\n8###'';9|hPPPII>H433SY c۷oOMMerAJJiӅBatttii&E9!J/ˇMuy`4 qsssG &M%RQQ1c MJhoo>}z[[G"5,S _~eJ7o*4둖vTXx LO>d˖-* mwD"P(UƪرcbXÙU`МGBmmmLL̹sL!2֭KKK1\~=**!MCvd9`dr9JSd2Z$B w!#P`:8 ƪU AC9+++ϙ 9}>5k!o~AZC |K4fccCѲ-!R; ˦MJe^^+7iҤ&.gnݺK.}~URR'I{zz={2ٿڵkɴd)vDUUUOCZZZH$wܩyիW?={DEE ͛3g΢E"""?$8A8q 66vUiŊ1QTJB ?I۷oOdJ^Xp8\ a s.]~8 /}M2D,;D,9 CwGkkkKS 4,Xpԩ/wޥXTJjff_rY$?~-QT9%K$$$ &tqppB)((HMM=s b`lllҞw|~zzP(.--5Ȅ-_QDY111#=Ǘ\RR"ˇ@ HIIA ?)c{zzJ(d|n2: /C1?Akk3mmm7W~(橓N&~tuuiތ2g{jFOOћ@( JR֭[^2% @9;;# =("(<<F(Bzbȑ# Qڽ{ oܸqv3df۲eF>^{׭[dΝ111Νiii!RTeee?Qjjj? }0M?|@@+dz4{oboo/Q#svvF4wT0\7䣏>ھ}&%H$e˖_Hd2-\p8#n{}Uںu>SJ///GOTUUO^^+;>JSS9BjW@hK.],"իte744AdB!:&&k.[^^JXqqqM++J%G}}}WWL&RT*yML˒%K9j%)S ûy&RG_R;G0i$}%s0ǦM8I 2_o5<{{{?]ooקUZBB'|BWi]~]inݺ[UUd%ڶ iLΞ=SPVxL&%Qڈh,ƆR)CSL~)-AP(J$*^xbÆ xD",_.ͱX,Ziiix{GzNl ZGz-ǿYYY׮]>STT"0(}٦MihOCCC\\܎;֮]+ JAAAFFɓ' _lnk3l2b5oذa(yyy啘d(wyT*mkk#?PG3gٳ'O,,,'Qs8Y'jhhRibpt:6ioo`{hz3MPQa$|/233T*c-GJaa ~G$ H$B 455⌌VAJJ.㑑111zĉ3gҸ&cylC`{{;w޽{޽nLG~njj"XTg}||&L`nnnmmSOM:uʔ)L>x GHTx$*7A|5rdzzUV}W_^˗!Q^~}h{ڣRr(AAA \]]w޽sμ ]慴r0Ɲs+Z=\UxO?ʕ+===@&!JM+\{%''# =)"(""iban$000%%E"0vD> /z=d<49)Q.\288CP7lpE7SBCCE"!&$%%eggP"& OOϝ;wyP( MHH`! 芊 }ȶZ|>?==鼡CT*U(J* -Uƍ{g-Zhav # &^o>DҨBdepb4QMWq@R eiىP^pU$}ghOߟ\laaar9X?YIKK񝃃ѣGBaLLLuufG[``7XH$Vݿ%*x<\+C;h8a34:Oׄl[[[>eMTZ5u^V7+G! =EEEVںuH$ Ɲ/C{AX,hF%<?--60a^b SK0MjZR)Jeh lz 6E r92RqڴiK. ;w.axG zy_&$*FR>kE"( &w YYYR՚=6N"0~;w ڵԾ#G`uOЪ؉'n޼iΝ'N crFtGJGn4%%%K,1X޽~Ǐlڽ{l%rss;E!ZȨ)3Rimm-}'o… LBK!!!?vvv"KWlbY[[ϟ??>>>??ʕ+{]lw=ax!QaD]N@d2Ra!VVV蘒ViDƝ@иP"՚D#)Vȍ[B*++ɍWÃ%QxΜ9zX,w}tRZJ#'U<|~aƍm DVݿȑ# 7|h^lvvv|||GG喗k:ڵ+nذ!99U 5ةuV]]]SSPBgggKK y$c՞!???__ &p\[[[GGG 6af~a?_xA/Jk"Qθb~' ws)G#%Q\zX,_K/Uwee%]-\_h[FJIOO}Y9θq&N8y䧞z [%0B@cݸqqЋ&Ҙ}CBBN>Ky<.1Ԏ`aabq͍djDBc4688H.'ҧAysYwws禧ϛ7Bz _|g}m۶?rʕ+I5DBgV,gggJ`֮];@,[|׮]E h,r㎎6Xy{{oٲX˗/gddl޼{TKG}ccݻwcKKK__T*mooojjjnnwyR/%KK˩SN>i Ax,SG zy_Ҿ1bGJիW5)sʔ)Q#XWWGnl@$>YYYo].^ҥKt3\xtzP@jjj.X,~z{{ *@vZHhhD"KMM-**B4\]]|Ϟ/--B;233###uFr|ly]"@$Z[[srrFzIV7Sñ1cƬY|||4o LDzE !!QAstU.Z,,,8QޏfmmMci*  -ʁ6mڄ<\]]]VVqo-!lʲe˶lhofffRRRNNX,@Liݏa^~+V!\cҰM;r+omm5Xݻפ.ҭ*))ٷoߞ={& 2*[ڻwd2 ^NNNq8GGG__ygSp=ax&L@ `cci:cffy9===GEI4$*Dʐ[/(6m:pfΜyܹ͛79r9G!hHr-).\yyy~ޑ$˺8q"???66V__v0套 6X%''BA`{C&)ʮꚚ7o655j ^l6{ܸq盛zxxϟ?LJ DA#G۽+>wTC Zi8KKK#u?ѻRĺ)W_}w_x'իWB?>(11)--]f֭[E"ڵk#V-fP01<==G˗رСC&mE¥ߎ"feeEFFFEEUWW39V 5[`ؤ0qdX=@H$RST޹s *+++**n޼с@h˵bXfZpEf̘A L0Q2adefy7Z/--5+ի2a„ ?+hXHww d;3GG(!!O>4OO3gL6 1=xɉ@5k^z%]˫bMygmڴ\3hP(D₂t ˟`;bw%&.(( G"Q$ 3{UUUjr ͛322atT*jjjȐ|(' (-Q8"?,^xѢEHKa'N@:tqЗc+WWF&|`cc+FTj^-rlr7֓VQ* QW(._ܔL4_,ϝ;C\xqƍ?#ZחJ DgQSIw,Y ^vСC;v 7ӉXxx8CP$&&C>qMM͎;^zm۶1ǯ7"қoIb?CV8q֭[ZMKKkll$>>>nNNNtZ^^~z#X```dd$gee]t[G"pZXX駟>dOg͛---DTTAd)Ћ̜9sÆ 'N(//{nSS˗?W_}uʔ)VVVC hA0٫\. 
- i^~GTJci0V*h)s?=z4,,/s^37RnJbQqSND:Wٷo#""ŦyF篣⼼<$"-֮]K{]ZZjdrYYYq7nGH$?OOϝ;will*//J2\.WTXXX,^x…'Oxq(!hkkC`0D0I;*k^1w 2 u X,[[&466&%%}g+W|뭷̙f~dF=CW(?ƍ/^#e˖mٲ%((1Ѷ@Jrr2 X,HFi3pƍڵK/䵄<== h\nbbbxxxLL̹stXLVߺu<677JҞ2%Ė:cii9gΜE9;;[XXX[[|;;;&LӄD Q ڪy!e;jD&u bGCaffFRSS;* I\sk 9GDv@P Hw I*,,,::Zs۶mxgϞɉۆ %KC UWW֒:::Ri___{{{OOO/EP Pdmm`~nܸqǏ, !>0h[07HwM?#sNٳgsssccc 2-"""/_O~/$2ЋZx޽vJWWWww76I=gyoʔ)gKKq9S0 Q}}ߔ:880$Qa:;;i3;ijJkhzc ٓU۞W_}522_ʊ?xٳgO6M,zn{[Vg7o޵kiD"srg ޽{wܙGZBgH_!YZa[LP]QQ7JNN֮uuuRD @/ΝOO8Z[[U*׀蝥m4/ :V ]."f 믓=:c 0)B.ż/}TTԢEtرcIҶ?WO}]\OUUU6m”MЪ\J```BBNKKۿNNNFFFqq1b&u۶m666O&-R\\xAB9PRRYZdL[-[BCCcbbΝ;GEAhvuww7SཫK.666655ݽ{P( hImmmLLSCC(;rX,GL= OOO jS D"&w}GLVdee%Ac H$U*C$ݢ444T* @urr;v,Ͷ3gμy\\\"0MQᱰ!Qa tMW ٱcZRѸk!:w5H6h(]ѣ^^^ <`-mrϾ˖-F SVXqaggg, hIGGǮ]8.Hˌ@ H|LMNN ,))ٷoߞ={L*tߝP($fFl˖-݀BwȹxBlvVVVN0ej J3---7oz$e6Мq㬭PS9s?<ɍ鑾2DruCd2-VcbbbZR1TD"@MTCMMEEEӧOg`+ek֬yMh# $* 9I $eU~aƍ-!w#%K &Lijj" `FF`ܶm6sWLLL *,,4BaQ98899ZF~Ud"% ƍܵk HE"/T*ȣT*%$555d|E\ ecc3vX[[[ J3`𜜜} ֆ~yph]uu5,$00ŋn۴lٲo֘Njvvvux?wy'--m0*]ATfggO{͝;NKgϞ 7>7#/ƭE"щ' 6r#iI.T<^pY^^>9m7oddϞ=` GGG;wnB%%%Xxj2FB033koo'j2"> 9lП}y!D#ӣՇ&J^<^xt $^$jt4OT{ +<ŋIOFJ}q &|RZssܹs߿O]6==4ƍ2+T'N;Μ9sկԩS5,^3eIE>|믿/)>?F(@\]]E"Qdd$x:Fz=KLӧCBBFz5րdff Zh&qA &[H] 5b#o&88AT* JR޽{dV^^^VVV]]800X0 rg %K/^<{l@CuuuSLAannΜLD" Fj=DI&/ E"3}}}ň? ~̙3111Q{/]d@yf=fO1  'g -U7З?6SWWWQQQZZZVVFDmmm.FaSXc={%KO>vX333D F/_kjrvvw___7n0'p\Bd2RJLh 8@EAx~VXz|Đ۷=Y!Ƒ CZ2f۹s'*/[Ӛ<:o^~H2zP(D b޿.DmpMKK3c^vmUUUXX ))ijP/_א(<9_Wff&Kx⨨(?w۷4ƍ[lG}޺utݻGx]]ݞ={BBB6o\\\蒿VVVcccRR''11QQ1钑իH4Dsrrr:~ӧuK, JCڵk{կkHeqpp@ϟ_vڥKN>zCBB6nܘq?'i;eܸq/ҁ.]D0a\GC7\zwwm۶q=.\nff&ber"}>((LT*o߾}̙?ZdɬYN3Ϭ\ =ZXXXSS+J JBfCL0W_=x_QQQWWWVV矿; .}_mܫ,+,X ;;Kawu!2 ]zܹs-,,cMTrtÇ{{{gR>?TWWGGGo߾}ݺuׯͺr}XA~@L܋KJJ.\aʡHIIYRNBBn@ }95k\~]om6l&ݿp9s޽a1TZWWWIyfSSS?y $\.Gù,\pѢE6 qƙ^. 277O~mZJ_jӧ:^xرcZU;wf̘11w\r<57nLNN =ztԩھ_uPHځSh<<<덾q5kÇUlٲt6!tERT'OϟG4e޽&?ői!22RKkᓐV@/_~)c:ũ[nR\. ]UUUuuuMM۷;::$ yk(-P(M4iwwwF;v,_4UtO?FbʕyyyK.1j ȍY_IQ^^VVVbpBٳH:~zZJx뭷hl;.]z &U`ccY:;;>xSN3N CH]^.X[[Xb͚5< )k<==oݺe"MkJJ :OHH0q@mɒ%UWW✜$3/VĘB͛7ɣ#AC MUU-_{ImδS !mMMM]]]sss__@WWWGGG7SZZZٍ7naaGbEEE;w0jSheʗ 2L_Axj_ 60d%eccCcSTtViTG.^hoo|)F->JIo/_6mZll,,ME+˼Jnjjz׬KK.][[yf-M"rBƄ"(,,f#&:啜wcǎeddes>阕رСCvlRC4S]]$&&&C}WD^% Bd)V) 2{n{{T*immmkk# Pxyy͙3{ܸq\. 0<@G=A_~%{0f͚UZZjWW_}ZO=Tcc0a[>|k۶m(LaÆttO<9i$ lkknʕr^{^l``KqQg_~922rѢE$GN4[l[߿}vz6UnnnJZ'>h^QQQFFƱc4̒'vׯYW^rٳrvqGdh{G05dxΝw޿``CP Prڴi3gtwwl6mNNN/ L`aa 3ݻw'N8D"j>5^Qд(J&$ٳ <_4.ocffFoAnġ~O8A>P(bTS_|122r&boD6X[[7%99D1-ihhߵkiD"Q."diΜ:HZJ\ 0.(\-^72et5[a(z]J@TrI(J3-MF*)rƍ>lgcuvxs{|JSl5WxD"|29:\v&jTKO:2l^+F*4;;;111>>V5 _XX$/~7 (D"QHHP(ܲe ٹs'ye޽b!"jQ1F>7((ѢXw c7ZmBhn2'ǏOOO9ڍiFzNGo^uj={bƄD67ÙɸE*ݯ_ST*Uaaᦝ|`nnr\" {yO*JrrsXAiie9VTTD쮮.ԩSvPX BQVVF ɓ ( M}ww7i7DL&Vf4Ԕ饥%VK^L@LNN3 zzz={bb6'rVrmt;iX ~Xw9ch4/ w)qi,#ZC>- ,KKK: lڇHJglxXC9rȕ+Wۇ@҆<r0&&&-99!Z~R~%rښOmC(^xO?s 9T* D 0f >--l |Ԅ&~0ZvB.u:ϟGGGGFFȣZv_`l63444<<<,,F;9ELA.T؁hĢeFtw?~ևJzkuHP~WbUOʅ ]ȕ f JC'ODFF*Xkٳ۷o#Rbq^^L&ô]vXNQT=t{3iFr $%%߹seraB[(޿999x ~~~~Q{{{nn1,--DhR6'??mnbbD"ٿ?FDvQaU۷oGL ?P aƢvʉ'/`Pݝu[и@2B͛v.\.ު i !HMMxlvƍ%%%L([SSS.]dr<$$ؙH$nooW(_F&kjjYBRRR__9(lԱrx5.z #455W^%";;{ AMjKKK)9򝛛|=y#Jrl6+"""&&&66644\PaUQA(TaC|>v FcB6IE[[[KKKBB#zhk9 dh/TrړRۏ;fmQG&Dj_Xٌ6,,,LS:;; ߯SRSSFz TFU([\\D,UZZjTzbV*#'E"Utttwww]]]UUG8f L[uCdddD,_kZh`0 I Y///.2UoTTE"vae] as]P-#i)Tt郾zӧOSSSh^o2hY4s8z׍Bb3*,--\?T$m*ff:ck;wN@'JaaaGGy@KOOT[[ڪT*ɁL\.ki¹%)sHtwsrrHee%o???!%ga0,u'&I {۷?~v_`nnnpٳg80,ƍ Om߿0-F7ߐܹsޣGN: |8--ҥKHV7a„7o&$$رQd8+9z7l`k׮qapכo80M&eggߧdee+^T;A >|^^^NNNLL6 ̨,~ngNT8稑дthܤ)k֬Yh*߿1U* 4ޱe<l ʕ+Ǐ[VhŖP(sw @Bu[zM`Ν;w,XxbLmnNqFZZZzz\.8ܹS F''[FՓIRSSطo_uǎl̤I! 
2UXXXSSHsț 'Ũ888S.\]]]\\b1(//G@bˀDG=斍H0,gbݻwoݺuʔ)\ ѣG_}URhkmcc#] $FcE6SϞ=KlMvر<0S\E;vy$i111-͛qM)\RSST*kkk+**)= X<ѿ~uQ$yyyyzzvCJ !QL^ooo33f j*ή5 a4ׯ_ 5kVbbb=87xY 6wSV;@ 0:@Yxݻ>oNcᛘѣG#]"s&Snܸ? `D2{lTB)۷o߻woJJY@bsTj8xzzZ9sqH3~n޼ hLR\\l(P___UUUVVV^^^YYIt:`x}z!}||:wܩS'!Ԃ LPb̉ hFZ +hݣe˖EGGwЁ?ѣG_reKhе=666>>_ࡲ28\j*O҅0πsBBBύGzxJ(HRd@Q1 U>///''';;;oo=zݛ|h#6`*HTx3'* 5UThBHDv…;wLNN9r$X[RR?7>HoP(7J搨`m۶VJΎ3ƚ\]]bwq7,_ـ92,))iǎgϖH$ 6Yرc))).],^844wryE*׫P***0yܹaXӷo.] > B Q1Ϝ`oofj[?Z$ B67֭[Ǐ4iҦMH7;v^OIIzʏ455јbccCoiKVm6D(iiio t&aBDDD~~\Y&JP,X_7n0www3zy…4ZBJJ*//!yyy sttl߾}߾} Я_;"DTQQ  3 -($˭'*PӔw}766Ŕ{oNNP^QA lLΙ3z&*3&//Ϫ)h SϟqwrhGɨT*[2Չoeff~'N@@9)}G0߽{͛ҘLτDXlx䄄&H7Ή8n޼y&Mzw>?S\U,JFV+ (yyy&) &񜝝۶mۦMP6hРo! **|SF 7//o…Νq?[EKއ}880I#EPh̼O!obHKpuu%BkC 8p v`j mk1lذ_~ϡfTTfDN֏}bgT*cbbvܹm۶^xoJÇwޓ~GVӵ:r$U.[ٰf4e77H0jF|Cu~᭷޲tr7s=zXjUxx#G>_[?Cˆ1((%6I`Heee\\ɕJ"!&&B!'޽{SRR #͐@ طogϦGynI355v„ 7o\vb-ZVEijj"ܣܽ{7;;?y;;;ֶ[n!!!AAAڶm **<߂899Yy3tNxəB?.&X|IY4*u Bz6ӑ^xaܹf`0vؼsu"b>>>˖-ol2x`h5~ؙ !??SN#FqgO#ˏ;(RRRw>k,4cj["B\~޽/J"7o|wwwD f'Q(555YYYgϞMMM%ӧO޽{޽_|蔔 .!KB777wwwooowȑ7o޻wGF QY۷GL o1U > QZ,!Q7pғ@466=zW_}h.8XBf+ zve 믿fggO6֭[,SӘB/CEVWPSNǎ{ー/?0FNwvv&o.]s1cfP]lٲ . cٳg# 5R\\VYYi[%9Xww/2<<<**e++Jծ];z ZmYYݻwܹŝ֭:ڧBO!ttt2dHhh\\\ ?sSQQ  Dqq1た@ 0W >D蘡tmZ{vmÆ N85k^iyL{BH^$3(Vؾ}SNڵkĈ9߬vDGowyHСC>Ipp?XTT#_~eΜ9J%"R 9j(oXl0a͛7ccc[{nsV޽{M6͛7oذa=zԩSg̘|򌌌~BT"KxmN4iݺugϞ)///..>uҥKCCCܐ`JJJ㼼xIkkYn)--EC QZ2 0D_~eذaZBBBݹeҸ4RI{W**j1vXLvYf`{SSZ)gFFFpD#IRs̙1cƩSʕj Nh<]GYxqRR#-ʙ3gnܸCȝe?eܸq11-''ʵkvu1BJm6;̤aϜ9s…Ν3˫VC#&TVVfeeݣ6RryCCbX!-aذa!!!}"فboo% BL ԒD'''4#`zǻ3Kʕ+HϞ&O t6VTP*-o**rFxN:/jZ.s7]'*L>W^?WTTj_YS&Pf̘c>ʭ[>ӧO# s}J$0[[[Ĵ8}vr WlV$-;>>>#JwHG+**ʢdff544rCIBPp/>ߩSC4gϞ"͕X2: &Djxk ɢLxwF[.%%%))iڴigoǎ4P4.P p!QNIMM]hS.iq4]GZ Y ˔~/^dPk_>SWѣGر_~oV*Ǔ{ϟM.b11-r$sΑĉbΝVr'Lݽ{7h4켼zLVSSSWWGޓ POw;w6Ipwwo۶X,&NJuyyy&Y9π -ZGhB^ݟ6mi{Σ| 6и[nӇ544̚5Grʔ){m׮3}j͚5W6 tW5ʼntPq_|9==ga_ B .7n jǽmFNvtȈ3Uٳ9ڵHZĉ)))ϟor0 "!!!99޽{朂lJA.WUUj | LիW>}zfgg׮]NsznJ`MhZIlmmU*JkoooEk-khO_СCz}ttG}ĭ?APp#魨ÇgggyԂC%66v˖- :Μ'V gΜYh٦+3׹NVXq9&Gqq񫯾Jŋ$aK.-[ʕ+00%$$fΜ@ I~JJJzzT*mr"##3K(yfr?~DDF,...***++QTUUU奥'r9 `H$իWϞ=tDյkWWWW,PNTPN_d &/=ђ`t n!99yϞ=6l kӦU=> vʺzĉkkk6 -$ʢERSS/rhT*]&L` LmD:thMMɓ'_y\.oe'/z=E*tB6X K/\WQ~-[^bt}ՓJ";;wIHH kDTu3EP]P(͛C/`Iꪪ***+yh m۶:th߾{@@@{㼼0U9ZNx<ڗ.e'NVL,Z?ҥ}HDi_HYfի'yɉ9cƌ?iH:sKW`apP(daGZ-!رcw[M9yܹsǽqƑ>Ҙ1cs )ϟOII!ǒ'Ν;( JUWW'444gςJ~~5h۵k'E"mN JJJ㐨`*𯐨BLO|=zTVVr+2|g}8|bGD֦oODEEC$ _Ѹ@s]~={k+Ќ ; sPܸd*?;{,9G!getttXXj14AzX#266,̍NS(rԤV+++srr333kqqe&شiY(sw[`&JKK|~} gZ`D}}=ӫ ]ӧO NSSҥKSRRl2~xlDfKsAv j׮믿Ɔ1|Šɻx>} [?I>ӴׯYV Vlҥk׮AL4J}}=y5z^4Qj5yO~" (cdzD PJHH\QVV  OPTVVdHTLuuul.2dȐk׮q+P/^$o\[nqɓy~42U ߒқ ddGX \Dn߿?>>y\2{lDBVpvvF :NCBII}۷jkkQ'Z Bʧo~ذaÇ2dHnݐDfuA`)ۛsdZ2彈FW믿޹sO> 5k,ZD"L;YYYW^8qbmm-kt/Dk׮uq̘1_}5\OWS|O, ;j٩xJLL̦MpO7bqqqKn9L H$ӦMC@,߿۷ogffrZmHZ@Y(|>F 1"88SN SMZ!m۶EgD8_קDEEqnRSS7n8uT;;;zG$1C9uԔ)S\/_#ϷifG_<.\XTT躴ZNct=LB`aE<Cv"v믿e˖$cƌ_ݻ|'N K???D2o<3Ow'$ܽ{777ZP455jCC-& {1|#F 0䛆D , sttĵ&aDΝ;{)[LyW=ʭ-/));w!C+G/w9$*LVVM?rH}}}޽_uȑg^z5Œ'=zhV;eݺuVimwQnY%+00oai(,Yة@.#NNN2ʠAv1l0te%99 D*_>)))<<\"#&&WZZeHH.// JccbHK@lll>t=zD"'''[[[/ B<==@“pVx[i圝Ξ=/rnS-[zyyqh6|**799yԩ!=z4>>>33455L441c &xAvv %9]lllBBBhooJ"A0CIdf&*0z}||96mJKK۸qYڴiÉm&OVThhh0۝M>}'ҨF=oX"11т/hZ**ʕ+򚚚 ڵk֭ߺmu񺺺'rul8zƍ-[fFPQw,EF׳SQ쎣6TI K=)2t5W\ysƍsEGG1xzr<$444P{v&bkk;hР>}mV$k׮M6B!xJeeeqZmqqIVm NNNhX˙j[[?~;[.rqq!+77wĉ|;;}Cʮ]~m 86cnBz^PpDeĜv '|@.>gϞkrpyYt)ii7WSSWPPPZZj( J+++ i 2I7:999M$yxxxzzvA,3=3XpAE(++kjjb̃ GgS%K,]lܦ+*<{WN4I*~zkʾ}͛g1W'''F+*)Lz-NAyz:|>PΝuvV$G{Yreyy9b ;vصkٳBZE*RYSSSYYYVVVEg⏐<O,ӧwޝ:urttD;wر#g6m"zB=x$ftP(-r œ'~_p.vڷo_bb"mۚ&ϧqJT0:thuuɓ'_y3wrh>ȃΘ1.YǨjJ.ذZEnI&m߾=#ޢ9s&..ƍ0DfPBCCcccLR*%%%䍡HBuuu)u={k߾OW+B4$*4A`:wlia&fT*5D4|nKIIIJJz-V[QQSN-j́fbήVgΜ|ȑ 
&ptՋH$b:PUVQ4-.ӔCn۶?`͛7/_Ns.Qz=o<$rCQSST*e2lySQQh(`֭[Ν۶mkggֳgݻbI"Ƒ >VT0ࠢBt .< rffԩSǎaÆ QbccC V5 ,j!eݺuVjz=ї`P__?qDwwcǎ9,(vXt:vpWW^ ѣGrrI+0_~~mUUN GEE/;bLVQ T*-,,,(((t: 󽼼:wL^mmmɛ^zӧ[n,LOA0N,#CE#Z@zhXRC[;zh^f͚իWs.?P"""VXcDShcc#w VRu \.7DQFuѣ 1rttdi4|>;irL1<|۵kuֹso`nʆ ֭[P(`vڤhL}v֍ޙ+Fb#Cw>ꊆܬ?m޽{?kFDDxxxvc-P(,cV0ӧ8q)?R__O# ꫯzšccchEITj# z7,XqFDqq+VK  !4ʤIbccG z}cccWT5558'''777//v۷؈D>}߿{jq0tx3z-9Q9ws~܀Mh/_4qDSmK#BZ̩q:r\/hg8sN޽ W_qZohED  XxlٲsJ (?ӊ+.]sNSMZMVmƦ&JURRWZZh+`C6m\]]BPupT*EDLha-6668p7Ι3g7o4hIZ=,MRYRr܉'fffM\D~z.]F}www?.EMa%D"Dk3. |TTx&JDDJbt9r?rC! }ܹqqq(DV+JɦdRrss,l,p@ ѣAܥK#uA`bU{{{|ZMDEEwa.e_|y᯽ZBBBY^;ӺzWW^4iғkkkR ИIb܅ <<<>D**&Jl]tiȐ!}9s z ]0z4hѣRzNg`Rȃ>~s熆vPvݺuǎuT*Y9[\]]M2HXS\\ שS'}HT0 U]]ma{oرcϟ?1PTM"mlڵRWWl%񢣣꒨h>ITx!k;wL8/@/YSN#/֫W4BXTJ.WX1sAt77l_߾}ZRi4D   &DZ@(m.Zb^^xn߾h^VYr%ic/~ B_>#|r+9 T*FWANE־Λiiiixx8۷/ѬA}%%%0 .ҥڵk-nU|={>ɓ'KQQQIӷo&2Qt:=NsqqxCEfyzz"Cq-LT0`s ,x ō?~x@@_TT*}ĵ ڽ{^ jӦ 7mEGmذ4[jL7crXX&g;cJ{GΎx(աCO?TѬ^}0`TeeeBBB.].\x}l㏟| y`͛7ʤRBhjjjMe+//G@ؗgz{{s">-LTpuuER(;wd_~%G?999 ?J/Za'|ҭ[K.ѻXsxGYh4L7cCvZf "ɑ]hQmm-a@ HHHPTر#rz5uԋ/" &VΞ=swyG9`^x__bŞ={Μ9oݻw/??Q'2X2>ѩɹi-faaaz~…\xLdɒ#G~w 4.ёMmll6F͈#wI2BU>3}z~~>Ѭy;w.88F}7G4hPFFVE@xSN%''/YdƌǏ'}SFFFY>#?/))J L`0,UUU!F#kjj2U W-=UO>~ݹssI&M0?O|~I{RΦ% c{9tSNk׮1۹#"""##9sXTT.4PŽhZ/Nf9r۶m @@K!x7￿kqqqd޼ysrrrss 2S+@ 8;;#A0}M5'*`|o߾zjHH eѢE~a.]Z4zB<:BԩS?ޚ3߂ <8ydtLj5ӻNEC^G駟m۶c" T*]n`΃bbb"""$7b,NW]]O)--555UUU䵶ZX0Sh'd\)9Dv:t(y۲eKll,O>ٿڵk͛ךJ7WT@£N<).\HW /mB_A2eJv=:zhD0XNh4֭[ƍM8W~vbx%99y͚5uuu 0D&fcǎ0D‚j\N*+++**?R)^Gh X0$*4 A`RN'*p%_>}46^T.]455uƍ/b988лUr1+W]/mBH$J'G3.lR; \(pbCM6H$r=D@!Z62jԨI&!&`U EQQQII yn-//khhUVVVUUo lrvvFE VVV  T :-LTpuuEo4 @*rnsrrf͚5z70>N{5WT0ne݋/~O n~!!C,Zh׮]̍'17o_SSӁb ILW%իW'$$ ͚Erʊ+.\s.R%IXXzLVVVVZZZ^^.J#j]]g Wuu5 3\&())ACIB-- cRuС9s&… C}7vvvn **E9~iӞ&~XPPPP[[۶m[Z-Cgng@AyZMYxƍQYÆ Ioxʕ@@9wY`A\\!H,#&9:RQQQ]]]WWT*KJJȃU10ONNN(`PQY^^^Ь'*#Q f̘,Ysq?􄄄HOOϧl>Ҟ ,N:̥M(VEZ |NNN&*t:CNEC ʬYl¡.\jjjbb X8OjUVVƍ͛'H)MJV*VWWdƆJIII}}=!HTlq:t@ؗgZEL&3`%((ƍxNaÆ)S4;ƆD-E$YR31bӧ{K7+*h4ZHBEvȁP*1clݺ5001 )k֬0GPH*66BF544d2\$J kqq1u"q`%%%qO9I7 D.VN:$*wSPKScV;yWVѣ9~m=ϟ/o{qz yJoMvx۶mrsso:eʔ2 NA Bf Q"m .„r8ޞR Go߾}d 7}tD>=I0!!7p|xҢhꪪJKKe2ٳgwk'N| p? .ܴiO?T[[,/V[[  ] XM/|ChZݻw;Dtt4G۰a_R^tO?-HΟ?QAe?oѢE?ߝzEt!DhdSH SK̘1vҥ#=O>1+V@'$$L4ŋ`XZZZjkk+++ϟ?s5k̜9'LKKKNN2dHvv;Se˖,*NR!v sWBN8%$*P(6mڈ#oN'wz1 pV;hР^zyHCBBҥK֯_8@0jo6M&! 
-,,ܶm0) l܃yt;v\tIV#'n M}gD"4FAYYYVu޼y/>#O?˗mg-QիQQQ[/&ԹL&SKK sۗH$FD&LP[[uVXKT@E]\dy\h8g-)) h?c/d bdR*gΜٴi[o5~Ν;s=Ç|?PTTDN4Du!gUPkL'N|駯?޻wĞ={3f8PPPP__[[[pKttt߾}*JS(}d2]!Q%ZAp?|%ӽ{wz>}X,fbmmmᶉxqiR۾D"a3/Dba"Q5NB4LN.F# $$$̘1$Ȁkڵ3g?ѧO1cܰañc  Zt[8!""GYdɁ]VQQQVV6g@xfP*}R$!Qi}1ׯ1cG=G 8gQ<裏LDfo,ퟜ|ڵ3gdeeyx(DrfӉvLl۶muuu.nj~Jޖnp2n qުA/22A`Ȱ d8W~U3O#qSN{֭[ttt```ddd%>>^ 0HT("AbJA!QA'*QBm0M4믿o?^z<33iLTygV^O?ݶl65QfΝ**++nlVؙv+i?i]aaa=ܨQ8ys9^nw|Gׇ0D"?V޹s1cޞ={|SLd4 pr DfR)M r#G,((pW3tf # UT ')k/u!=a„̃~7?ٳRRR{ャ,Q^oٳnɋ/o x بZ*JP Қf @ ҥKrrr|||I|Oɶ6ڿD"$*x+;];'66A`**8\9N;vltttaa2ۼy%&ҧ3C$ (k׮9s&++ޗ*N0\+%i=2]dz ?SoN>]]]-sԨQVz饗Gs*jk׮E@9uuu.]BpT[[6֒^ yU9sVKP(ҥKΝ###D"!LIIѣGllPQCfZ{`)U||<2,HT-3tdMzRI۳>7p.d?}99bjiiabyXD' D[oL&6++ldÆ ӦMc0a֐pqas[9~~~Bnuע(dOAɝ^4_}ҥ1qm۶-^za*(d4MSSymnn6uuu2˫l ^I$HX@;ߎ 븵# SRDzך***Ѩ[zdOǎǏQTTʭ?CV^|7%K577ӻM\Q`bDVZ&O,]tѢE,N?h͍t9sjvx<CNAʏ?y#GWk(/ªUGL>}w9~8tijjB-fVtUדA`dRrFH$;v |>?..nӧO{4x>׉D"oU]] ؗ ] +&DFщQ}}}}XXXvvvNN":uk B6šM&k.̚5?fti<^n9HWOY8lVT 9@(ΝۺuC__}E9r{׽{wÆ ;vXiiŋ7mڄp325 z@ikkh4eeeŔz |--]v߿ZZZ~¼>HT J.#ܕr \IT@ݴ6OHO wPrrr9'O`0xAsn0((DYZ)nsݺuk׮?~]mN/Hv-]b0A eӉF(۷~wU*2t" HJJ?䓕+WZ K̂Ӻu6j(Bdlk4[(d$E: 566"V;B({}H$>$*xy< `_TT2$*c8OBjfy{f <3#""uRRRQꫯ^|EN%J{')(oܹSR=4f3Z%6k,<"T7rrŎ;rss\B"9s޽{7rH!!!*I6lXxqMM b1PPf3( 7---J$AƶV322 ԻwPUyϛ4Xp ]ɄW]] B4$*8Υgd8| HT T*Uddĉnʹ_ꫯۜh?ǣw21ȹ܃!rk*((qf s).qZEWOdggr(䚐{M4 qTw}hѢ_GH҃#p2SEEE_HQ(ZžP|?.I=zb m٪!4ZQrD qAaZ- y5Lz^T:uj?css̝;rB@숉A؇ sJf0 :(Fm2@o}FPYC7 B:AAA̝#Lv diZ7.997ޘ0a޽{K.ўmȅ5ʂ }]dx:_~#cŋ_ڸq#WWW_~ڵkUUU* Wɫ! _#Cӂ[).1xeeeQQQAAAaaaIIIuu54Fےfʍ_Ӈ:pĐ @@) "J #P,!b1>oXll ]SGG"i0;>wSDġ杰k.??zҤI~ѣao9N:|Wd(X,g.M=1cƘ1cȠk˖-d؈ѣGZ*%%qd-l޼w)--EL`G|d¢"ZVVV__lO,1hР}SBa```PPh뱈JV@MMMO;uTCCeбX,NWr#2w%*.%*D"48_W bcc9G)--߿F~wHTpפy[̌3֯_n3P($c0ΓdݺuǏߵkW"tEFW`8,dO4iԨQ9w uԩ!CG >~(((Xxm_A;]p ժVe2Y1\Tj): &px]v@޽{hhP($D,XH-PQss.R\\|#GhT*%8B  ] :ub~J2::grݸq /8Vb2e{H orҞQ@@jmmmMT@=4ܹs'駎=ԩSFх7D";̜X˻1ͬ-"H&NH7o|Y nʕ+=0Zܤu}֭ׯ_|5k~ @8j*RRVkZFD &`J??$D" ###C)!!!v`&8ΝSTIP466 }hxx8" 8XeHThP=</00pK,⋳gϾv3>쳵kצ_rţv mӾfh$=0gFwfCV =˓'O9& DtA<)_}Es?Bp̘1>|xͧOƁ+\sM6mٲe*)NBލt;O8Px2ZUUUNddРVu:s_pQ{"Hbbb"(Xc>88(jgϞ=qDAAh4 JBӱ'Jq $A`L&s碢8/={:!!!++k޽ @ O>ͣviKTwg"j~k׮9sfԨQvfӶQ Xt/g:!IJNav #G|'srrN xc2b^Fq1Nynn.}g/2>l0պr]DEL$*0$!n6tZr7g2]^$9sʫjvfh4-{rJF}8(2u+V؟FSΟ?p|ěʻwNMME(E*jRi+P___WWGHިT*L+@ԭ[{cǎqqq *sZ[[hD|w%) lܸ\4ZQ*䵡AדZGBA{9F(&%%s=:uH$]((ePQN= MڵkL&#u:]qqKT*gpTTJ%` tO.2TTQ&cyÓ'O~W+**BCCK;911qȑA&j2D"a"堩K.Z_QPjw=-bŊ q32Eƍnq?ٳ۶m;tnke~ZZ⠄/bݺuW^j4⮕+Wfgg##φں:Vțꪪ*BQSSV I$.]ĈD[Brr2{ރ탇ŁOijjR[[ڪT*/]te^ωAt** ]틉 n{.nZmXXؤIkGCsCCCG~O?tԩ;۱cݵ̡ BʬY>c>3ZQ 7_ٳg|b! ̣f \pa}իWnĉ[z?;/v83}#.jiiojT]]-˫*++{N!))444...!!!"""(({F!D!|>,KmmmAAAIIF!gqbn_!Q]G숉AX Hň49ܢV]߈'(Dݛ`3mڴW_}\7nj1eef33JDzLrfbz[nڵ'Nܹs'D9/^}D eF(>,iW\uGy^|{DwΞ=pBffG}88ՔFNg0T*Ueeeyyykxب(?::W^/o1lUUL&S(z^R]|ɓr;~A,oF  ] -DVz2r#t}#w#$''?aaaعsK.Ӈtݺu|AFFŋ\b&* ^no^\\|1>Cl&OvڵgϞ?CF,wvyӧXAA|%KH3C@<m۶N!z͟Z-jWdeBxHKBD 88GWUUUWWjX^^?MMM^+QڥD&kZ**[˖-ر5chg2X,ƣ@N24Y"0R.ɈgϞ={Ҿ}XQiHz駧OiӦ#G}DJ <=)&LػwΝ;wͽ-F3.\_G:'iխX?D@} @^#-#QCbbZJe]]rʕcǎ/"f $*؇ Rm/]߈P(t.{"""<*2^8x`1kqD:O;h2h]%Q]rBCC {5tݻwӛD+/x?@F[}MKKbKT`ȐCnUˣG>piKCC:!uс\B@7lذcǎ.^xӦM6n8b8t-%$N~mmHBAAAaaaIIRtd8? іd8p ԭ[pnX/`4!PQ-l Ԝ>}z߾}gΜ}" ҺPe*gjq QQQ?GEt:WѣGnܹ~![f"Q dX 7Sf)eݺu3gtzZ?{#6m#vmD}I&effAscwO`g6[l)..~';&*|׿ ryiXhOTp%"\LTìYfϞ}1ce #Gꫯýg CCC~i$>yg* tHntYlvݘVP֯_|r\04# G]UUUxOhllNhii!C{[8uiРA8p`.]b@ ~BC`hUDʕ+.\p'`vA}ܕ@5wK Jh@,m2޿YYY׮]=!2^_ȉ;wtl6]ߠb1 $&ҿe.cqD[k;v,ڵm^^^ǎ'Nڪ? 
Eq H4f̘'x"??gΜT;QT3fxWy睅 " 2B.AK,!7}v}(e2YQQQqqqAAAIIImmFMD j2FfKHH%ׯSNBж&@@&2 !D"**OҾ BW[[  ] dݠeSECz,{H^tB`Wv´>=}ZZik׮3g2---')5 R8רT__ٳgϞpm۷oȑ@.39_233#GlٲԩSn,]feK.22믿yyy]222U2w)䵼h4UK ca$V-!ҷoߘP")xT yNCs!v",C\JT D-)4͍oR믿",M;v]ވbB/DE7hp9ZXXثWC޽BD"-뽦כ xrر'Nd2ܛd-%;;{ʕŲoK.ݰa#dZKJJ+4VUU566677d+f6w9###--W^;v "@ F4^ B,$* 싋CX9iЁaO:h|.bb69/>}o"̎;w[qb0z>} OҦOΡ{!FFF"\﷾O2Vi4jmZZZP$u]t8p^TFވD"%X,zqB h4sNLL x3C싏GX <1pQT"-6jΚ5?FYYYYޟ5ʹg G&**0t8ԕAAASNO,#4/fs:_姟~ڶmÇ_ӝddd^zH:o|> ΑJХF'2*WFՒ7MMM( 99}ҥX,DmmmHTh4666"AE"%&&"l2 WHuR[n͚5]tQ*6 FiZϟzjus+Qw**le…K,\q/**>^?cǎ=|UV3qD`` V_v:twNMME( &I.WTTWBhjuccJ"x}#!!!$$D*t` $*xH`_KKVE 8 w,sW9>^Q>B(2dlP(v1~x(wߙ3guHTѤ[DGڃYJYn̙3o{ϬeLq/HNcr}*Q&sμ?xر˖-:u*b9mƍ#F@z}uuuUUUEE\.ohhh4*J}}-9 ѣG^RRR;vGGFFbB0@*Q˾䮐|e> Qq555}d q5Q=~ZD"v>hܸqVuԨQLf/^2d]UX,׉g"Q"???>w+*'*̚5kΜ9[n%p" BvV<ӻv"w+WPu[ӦM={ᄏ`NBOrlnjR*jZRRH[t-)))<<<888666111>>>::n*sH$ː;"R(}dԆ ] dlNWE%GZPQf{%'LRR---86#Ϙ1cbx&PMommbB='g#""v1|p_V7= a3Qƽ(&L ]vXNS_ʼy}]Xu ϟ8226QՊ?544t:Z[[T*ߐ7*Cu%$$JN$Ipppbbb׮]"G 2E>焅!}{a9=`w.J\G\v,ةS'ɴf͚ qdo63lٲNGLؖJE.&)ůSRR9"P(>~5֭ۛo9nܸ{ܹ͸Iɿ)&MZ|yBBrAh̏>q&R[[K^5^PTWW(F߬Dҹs.]ƊD"[B׮]k\\j8_C{%rqcmI8`0 Q9R+;A?CGenLTt\MTj r}#N=}~)22X,"''磏>8p`YYmq~k`s GUT`hㅅ\?ȅ BvRmd2!Q&))i޼ycǎo/]h;D9r+zt@B;eddl۶ x%[}CCC}}=yUTͤgU]]]E!o VE@ &LII{k\\@VTTRiWknn}R]]  L&s;u鸹: ^Zv"-sݜ{?={vȐ!􆥥 ΙXџСC=-ĉ DgGRmmmk!䂀L&ouȝg$tXn̙3Go_mllDX$2tЕ+W8(99b5x2jZMz[BByyyeeeMM `AhhhlllLL yս{=zW85WH$Hqd< ]|=QA*񹞨@VUӉD"'~vΝ҂D;yGI~˗K׉w=اHT M4`zF\Spعq!X+>e9t2eѣ8e˖_~a3g<ݻw_b EFF/ E"@ 'ozѿ"-+9RjeMMMqpBll,࠺:MnLTzR **x5Q(8bhZD@@PP9;Dzݠ@ }'ɑ%J$&B@ N;!۸]ȑ#[n=u{2yKbtV\80jfJ!olOHR\\\UUE9 l Pl{驩䏘Y QF i`lnjj—ΉDT[[  ]D"Qxx8CD444Ҙ\cc# mzdzmF4bb;C˜,sOJKK2**82A:qJ>=ѣGnz L%%++kŊ={D@|P~ϟ8˖p3ŢRl )B C6tXfKK ٢Aٳgǎy<9$ **`0455!r !lrWBN:WD"_}} qq dEKE>z1H e"e\riiin wesCE=L9qķ~{bŎ\ʰaV^=tPħ\xqҤI_| 8='HځW]]mKH~zQQQyyy]]L&%Ϗ'li )))x8ŌF#qV;vDP(6!Qi@ZFb[X,=尚L&"3< m& Rئ0A (X+k 9E&q~:bbӧ޽{Xbȑorynn.!hRJQQQAAAqqqeeZ6dF^%7KLLLOO4hyMNNJAAA|>KzoZcDCNO0až={rss\<=7( ,Xh.w^bw~ F&SJJJ*++Z-ڞݷQX('&&8O>B0L!f3= )j[>65L񤳇C]g N`hvR*.r  t[FqZ'tp,#')SI_5>b1CC6lL2{켼zșr@@PPZHTp9yɝRƏk׮+W`dnJ/thċT6m0aw+PJZtzV-t!li iiiݻw D,j&x3;}w~q2sn{=NJ0s{=;y!USxPd'. aa*T˓A>:*lmٲsagT^vid9SLȬ+'?&-/'5.ZRq1Y9r*B6JKK̙3y۶m㷿|駿/Fc]z"ZZ7IXhQ}}o&>|ȑ#gSSS waNbh:4f9,TadgnHD7]]]---Gmnn+h4"ё^  B;3*((@XPt z(Th4z뭷.Y;Hch>SJi-+'x8_tkLpBz%dT:%~{AJ;po~I&}˗/߲eKss3Ƕm._N=bժU555Af^յ6{KKK{{{C3jׯ_~~lt:Z B J9Fggm۾={kkk8Nɼp`qHAԄ ć\.A޷XKPAQx#饗?hW,ĀƊ4 (=GB juOCGΊ+LI: ?,Yr뭷x Q"Vns̙UUU˗/8oo}ܹwu\X_=jԨ7޵qjjjv6sֆe0zݿ}feeL₂ QAeg?0oA:=cQey;:Lff^[QQaÆe˖}wuuuKnw= _-۩ S;;ݥȔlii$45500d2ӇLOO7EEE%%%Ņn@ r H8@̪m۶}p8\SS{ÇLcW!Ag"''AHPmm-_vv6@*?(T`pg;v8򒹫RJF}8qڵk1Z Fc'cJDh4&WЋkP5jԨs=zYI?y'?C9s,^ cKB6,Ves5wRR~&O`cۧL_?|뭷G",/:sfϞ=w\koK W\9~xD8uE!9 X |%V5Vt:m6b!ݛFA@P (Tx]vLk׮BhTR(P! i8xw>#,qt:2)Yg%6 Bfɒ% Kd~>ox<Ç.TT*ݞmٴZm~~sN'@A z"Hcccmoo߹s~~CJ&.'(THZff&  #<%DB^'jyff>Ow/..Fs}Gf^ЋK8|\!DPH*O6^_w뭷u]䌖!} w_~ԩSYʯ9O5 ]]]v<(W8_q+VlڴDLnB||Yti}}ʕ+Q L|W1ijVFtذa#F ?sssQ $K---MMMO|!A }18$!;;AH CcB*#^ntضmJz?a, ӄGNҽ{~@|d]dɭޚ XvT`m4Zmue\d8ӦM[zuee={$@%7'#, (ɑ_UU5h D@$b2#?c+UWW:trO> hZ l6~۷g]^^+K2' 䲅@⺻^o{{{GGkmmݲeƍ7oެm6 RG"*$'''AHFp:K.KE!0gKhh4֡?{#s_SBy   QZvرdꫯ|R9G fqϟ/G$BX;3qGkg|\r9sW^zݺuwP=d C**6_1cƐc +E `v9k@۷5 jFd2 *f 8s>|xNN=(\qՊ+@H^/Rydf^y#2vC܌ ć QA6)<*?&v2+-5 b?~x2\jkkE>(TQK BHoT![&΃>㏋N:q̙x8M~ Jjټuɮt?c ,8v1sWYwݽ{BִT,'*'S(Tu„ oUW]hp^#?u}չna;@rT*V#F83t\Z@GG1B!H$ z{?Ç#>`!c !`BQ&:ɠa㮮.-ol4̌2ׯWT_/Xs=W_?~h?M_t?!&Dz[?|3}˗7cѢE>5\rӽY,fÌӻczOPp:9W]uڵkܹh,ٗ|jr,z8?eʔ/wy'ܬ8Rކ:$"B.+dJ> EP**33s9ru::tT LoaAIA Mmmm7o ݎ8,OtTH u/, N B^tw7pýO?I"Vz>##c„ eee;v@c(*lV~_6SzNLS sN~oy=N[by:u꧟~*Ql63uP! 
+Ygu=\q֭[jn xGFC@]wr  ~ ;rȱ& >LT*8$!֧ZV #G0`@ff^jY(S51Zb8D9HE555_|O~vtt 8L&ł8vBrrrr#~*H2DZ(JAN' MMMr/,XVٿl={/(-IcSv; ^Mz !ovv_e˖ r)!'|?N2e׮]2uɕY3cn?]+{o=׻w;c|;Sm۶6EE pT* x7@u4@|~r;6n(zq:t(VpÇx<@ @FB0yPTsYRRbZ͜Rx p81(Tz̙+ LHb1Cx`U#q!, Ut:eCfYV2sGk.JO}b{Vf3rP 6 Y_dFP ߏBzJyqT[[ đ ry_9㉎ pKH:*3/O>;Z߿r֬Y3i$GIcҩR(M%B_|ggѢEvG]F+(rtOٮI Nf]qÇl.x91{d2iҤ%Kx㍈:rP|ȑֶ6x::::;;~(byu6N0N~VkfffZZnlX@B D@P@P!uCġP!"$[P 9Gt:% ph8gk"ɷr/~ yMNz R\6 R_ѧOݻw\#C"9a#9sw[lٔ)S$=2R|4,O B{Pp:Fqĉ&Lشiӊ+Ș T32S$:YO̙?0:2kkk9ommu&$5 $`,##fdpՕ}tT*"#s-+E~␄l _fj '@G^\vG׋mr ęnޣu_^zNr 4xz @\pn{ҥ3gdl]p|C4S9+^KteDX1W+W0o<+p?~%\'s䣏>ڿ q)G!2_BsF=#pɒ%(~#G=J~477wttZ%~Z%H\TXX8xAl6ÑAYB`08" WХxP\!An_WPP 0#`MyUdrV+”/^zxWo߾=--摏Pj,p8L5Y3=>ᾩWՀV\9d Ѯ.ƅ ,/(?>''`c999G6m?Jz.xN}ܸqSNE(rt5pގVsNl$l6%( pH. Ammmv;5孳ߋ% kllDCK .֋+h8b**Ưoo\5L*ИtT*JkarHkk͑H|s5׼[lV~^?9?8tѣG/_\Ne}Z,좢BjE avƠy<|Y*$ g 0B^;h2p8vttiwQGt:]rUV]y6|FCXRB(\n$kٲe=ܤIQp8L-g0N/pdq:y,%3B-Kr8~>}[YY9/r CGryE***^xh466644mmmn;V@$V@v%^w:})..̴X,yyy%ٌ KBҬV+:*,uuuazG]pEBrȝ&:!, Ƞ!l7rKv vm%{$W]uUaa322hlZ=Qޣ@Rш@0\BCiW_mٲeҤIGwL;陌Q!yw9k֬^z שRFQ~&(W\ԩSɲnݺݻwKhnPA(#G\|9gZ[[ce F!?`0vJo߾}ݻhD >*:*\nwZ (mJ]GG: %K'AR%}N/  D89mѣ7xkt: hToCbB[a3#GSLLb3 K8=мypťw\;&BuT8!{キf͚ݻw|>ov$at$  sdjiii崷ljj{<$(G,tZ,ֻw~,@o9)rqG1R{{;;dZԹn\B 8C NSNa'w\vJ~DquT 1K9V+yܼD﩮.XS*TQi'ODy{#A uRTs-Ztm)|F7' q Pee?]vy<1o0+Qt:a(O ">9^gssѣGp/uuuh 4f9???'''++h4~ 0`żQ)6 KJ)IB(IQEBf: Qz= 7E&uvv6VjtɫjΝC k9`R D"L/љl0'\ws̙xb m?9X&_}ڴi {dvb[r>Vpϝ; 'ϙpb%%%!AtLAN1D'9ddd  G(((@XBe#Php2#|>8cS+]%*-zWWIiOck{Q%Ě-zg?WD"[3i 뮷zkرʼf1>ęɽrSӧwq嗓aʕ۷oommF.!Gp|CE(d) vr^/[{{{ !ȑ#MMM @T*fKXV=xaÆ 2T"JcXb$$455!#nJGBrp8P!Bj L~DʯUh HxL=F2J4JnF^O)}5ٗCM8q׮]) T*U*Ϟ.AÆ S5%榫(W?={Iy睭[(g/T%!k)LӉ 0BS HmOgb˺?ڠQv"l=-"Ch-TKKKwO4),SyigϞÇoqX(Wo8q y͛7744aâ7Cu82`3f WB@jl6[,`0hھ}6,..F~98ft h@/~hW, _P =CS4mh2f3zCRh|JtB?L&v)mA;ږ.]:s$yww7힧F?/,,nǴ 4l6,Y:*$-''AH0C*H|3B*kVvv" pj(THdP! EQf3{M; G zyb~x=]]]@ x|w})fz饗a0{/rtPtPGFFƴib 裏rTbSQQ / 2FFqFub|)V$ @Awu09Bf*Tp:2g.@ QZ>uq!4N$Tɦʱ 6[<͋F^{maON::*oɒ%5~aAFfE>Yp^P}2vW+<򈰳|:J0r˗JP u*JV;dZ-K^^ѣ.Y䫯:x#G6nܸ`S!O$' AS$r҂ ܤ KEr%ѩjllDC\. [\\,HS{YNĶI^WQWST4 ~Tf2KTkQMi37ljj=zt"C:z''ŋYFe79)Д6r?OH?/ݣhl!oeeeVN$JѨT_voȑ#㏟ziӦ_}z* I[ryV4hR:u  CP՘$TG*#Ž d#X ~3x_F'DŽB!JǓ;SD_lݺBbc&fΜ9#ϙ$>JrR >qO<N =>ys:7n j7@6mr/O?iW' `q͆ \utt :׋`rrss#qdff",B]QpjXT~DX ڎ $VfVZ|qX/Ry I1FJ+1k֬sG6 9Z\zs)$CxR=rvDGUUU޽ jH^>}(%7w>/* 1*p8U[[;tTCrrr q#N̂O:GMxE[@fիT*GtԂ b+TD"bN%4i9P}S_ nNش 1t٬j||CBrPpr^zС>t*HcT@pHySHRx<T*fWQCBPZǒHX CZb$0Lfj h,#jG*u]h;8tTS1XUɱbB\q[w8 IxƎ8(V C(G(BHlh48v WI3LvqHQwwf?#?YYYB8 fP#~ǥ̮訐8of 4>F':*Ps'( 9`QR>,Y^Q@>Yap0 ̘1qP2`)h4(TB kD 䪭 Alrn7␜t!AuuuBNA` <']zq;",>m`w XJ a@! +TwH+Wׯ]r 6B^ ;*YK:E)1/A(\®N*ٳ~AȕpC r(Tݎ KL&lFRQ+qH:*ćB*TppdL~ z=KMEQ3V%`ɞBgg'ﯩh(xif%n߾>#˖oǦB(b0 ˣK XC=ya̓8 qTTT 2I :*H8BPAj5Rz 5onn[ 9 SSbso̬P],ӻExA#rb| LLfLAaȑ˗/EO*H`_LW2t)r{#6sO222aߏ8đ 0#TGY'iF9h]@I  ^(~_fcc(T4h[y@ 8$!//AHPmm-GNN  ,@P ]:iҤh4:{lm$9(U=uuu5Q(BIvww9xXfC ɯ\ 'l!Q@$Wرc5'AGQ(TE W , &! W---v;uvv~! B8P’ NS~DoeGfYl$\s2|>=}nwYYY&i4r{@cgCPwwN@2[|wl$ :*0>ȮaQcb < NpJ x ̘1q*HZfH@ P vAP@jŷx<`qHBVV! '"" ",Tmi;<^ɍl[vmڌFCi0bB0:*TTߥA\fIa4qyɠ\ABrDD͞=GI UZn(^7 !~?@ ( sIC ~jkk8 f\.P#,Y*X,mx|Ď"ĉm֣EZl41KB56dde d $r1tT\a /h##G\|9·(J8$n# W=  ^.88 $AA` -j*t2#~bp̳>JKK~22 cB \]XGɄh4̲##h4∍Xڿk698TVVFpS^Q 9Bp8R0$YHfbARɆnRѢBQ&JɱgB3@zpx$:rVZ%{K/wqGF)7\9(KCȑxUr~шIB͛Gl+:*[,qD]UUE~"p2VBɉ-c8Baҕ6 AYjkkCxB^Mff&8ć N|QQNK5+iz Sz|>ߠA/ȌR`p8B3\j6>fˏӁed="r1tT`$Uޅ@NG}TVVP)AOCGP*:*\PłSGC577#"lX gBg*̷~=-C77ɴ{ 6J1S`zPH\cL> 6<Ť#AP! 
rs状\AV [@ctdCE( -R`# A vڊ bF!E B  jllDC 3(ToeA*H>^>?OqMw}RD 84 B6G;om?& I{GE[@X٬ Ik6vXriQոD"(T ͆BwfMb8C~Pba.PweŶIRp]wEG_|7u>m $AC$TGSc.#˒pȑ#}P!fܸqdG*1Qh4RltTOR)T2Ћf^4jRF`@Pb ވ|(THdB5#+W ?-W `GH$"8Io =#G._M) J9PtT l6# KQ(THNVVZ!ojjQ!h4*UVUUuVlOGGk4۷zqy|NNTBMj* pSf32#BGNC&7ܹs/"9H.dvTÂ_ `deeeP@Z-Z$GVciRP`0(@9Rss3;ݎԹn*$';;AHP}}=GAAPzP!H9N1"^b9uT8f$3f8N)L)%fzxuƣh42X= h .D/$R&?s }? УW Y02sb`XF#Bv;rSvV(9訐8*WTT BQ竲\w_?re˖q?[J[ z+Fz)[ş_Bo 0qVVQ 5VsPBGcNz/첥K" 2 @rQ "a Ukk+;Áۖuwwwtt )9B8Q*ЀB>^FEwd>IGXHnѢENeDrF&h/9_*l6Sq<ߏ <]bBvT`}:3gΜ7oB # H 0`ʔ)Ͽ{Gj3bd6_^f1*`0(=zӑN8kŗ !`¨FS|rgzӧv7.\C{<Ane y{@e~?dVDoQ#ǧVIRBa  ;* 08y Y :*^VV6SZZp88999>()PߏB1 sQ|rс d2l6!+  $ l\.ZxKAG^jƌ|O C+W_pECFtT ۷_~eee6-++Bn/= ("@ "aSdl6#_^/,z!T)$T)BRARF;|ԩSg̘`1.u]Gd?y;v 0@ ;}BSS/(<^ry8ٞ={ꪊ r >\䖒MG6ˏ [L'g,w!# Q!?èQ>裲2DNFFJlbrw֧O\B~% K +:*|>L{ JZ)B fΥ ␄\!A B(zaFFBB{~-'h2I:vu뭷?W_}hwZ~͛7~/*N|88pĉ׮]+j)O6jAcCln;<묳uL&B!6}֧<شWF;*VEɉ)a]QF{GF4h0L{.))̴l"דXꭝNB>rU>~)c/_k0}tq ֯_O-,j.kQr 7ܱcGFF,w\QjB2ziPj~ M[[o>\rIzzr9܂`0&h4j69rz Q`0B)S sA4qXYgJJJۧO T tTP,ZhPp81 S_dAb7PE09(TH\]] PB)XZxstnG1cƚ5k.\ػwoj۶mg}6WW̜5kK/$!I褐M>7x+.T8h{ӧ7,//qx)T`S<`2M2o?iĉdEu B!6F̐BK,v08Lg}6f̘*yC"#--bL&9p4TixʤX?`& !GS$l6),a5qVů@X+< :*dgg˻x[߬cDGO>䣏>xR|h4)TizZϝ;w۶mC R*Daqȁ:|򡤾TuTHo8pছn3fC=R׋ϡ"e0sD898@-ȷh1,وSgϞQFmܸqСPd)-!&䧴tC9X^f ,tT h @ZZZ9tTHQ8v݈Cq˜ ߏU0FPA{6Η.'z_ ,ESZy{'TaD СC/JFT@&4hَ<ę0a~P(D5=ʯ/ƍw58p}6 Dzٜ̖fq}Rx\H%GI.B XTp#sڋ/P,, 2@b0_~C>|a$-!s|ʤ p bR( 4$|'N H}abe@0BшgMayֲ/Tbp8~8 쮙3gN'7x`0(7x#y[n,Ž 9rdΜ9dYnPl;]]]lF``0Gf2:*X lb1q%7<0k,(VEv zbm6[aa\pm⋟}.\xW8PR訠[9tT>B _;\y<_(TGBrB|(T2á#VW֭_*p}6lgZK+R֯_/}877 X/1"O;faVTTtM楫Lt:CfOȡrz<1z|B!7a 88 d\./2^ ABQDZVF253f+|W>xM|3f 4j `Plr l]vY,`I8$'A Xt:[>W:H$b~9/ٜR"7.^ve~*pqM & 6l֭J ^@?ϸz8juq~{YYk<=b85VȾcFYZ~\{a p`XNKK6lXyy;dȐŢN|Ⰸbt:) <~ *f3 RQ|ӝ8tTO)rzktTL.̩gaF1$d'~Hp3w%;v;s*0cǎz'p8 m/|駙uT?]Iw !b;* rwߍ3fBhZ, X_~q\ׯ'O>lذlٌba=cP*g>' KV+RI.CrrrP!>*0#TG^/&>gԗnˣ_Hr82Gh4;vPVM8qblx_p{0a[oŠs(6߶dX~̲YOSB,aS7"igB?jԨP_/Vѣ̙ꫯ~7ػwu?O>}iii&)6#e"2 @řb`%*Pz&5S$P!q BB*Ⱦgndµ9S^R9%Zϟ?$z 9(%vww>|ܸq6lPIN qf]袋jjjN_ ,{w&EuޗYzga`f@n0I ^4Jxޫ+2(**{˅&A fP# E t``}z~ MOשsޯ?>UuꜪ9Uw+G}TI*{"Qfd$* 9%;NNfҼc*ZuwwO6瞛5knRlqڼ p8 Q"im)B9f3DYeѽ*4~ I"o4$nG.TVV~7#_`^n{ݿ㑾A?+ذa̙3J,”A8v;y睟'}N k(wf M F푑$WȨ w1Qק'$*h?+U ǥ\n3APZZp8rss HϟS @;(f% S<]9 QAp8ߏ8v#):z()0DY| C#dz[!$GUVymݺusřEEE\sͪU$nj2ޅ/>SөCD dAD q#~_رcǓO>v?ڴiovz䒨 zP@-,3d=HFlUx ~r, . Z?8&H$w&"8١C&OaÆS"*fejP^SS3z芊߼y3W?p71QAty?~7Gd2QDH+lf4JEy iN:sOT ^xᅤ5{lDCPQ \jAyy9,)))--zHKUgqvz;`qPՊDP^.}ڕ Q!mHTHݑ#G$־BJT(//Bxi&*$#:'[ yEs9~Ҿꪫj'zcF HT^]QaT?Q<XVT0 pQȄoEiXQh|:L(\qdh"D#ǥ4|z}nnȑ#+++^t\eee79-Cf\HKO8+A@E8H"!EF(++C!Qa؆5  L2>'G]@߾|; _]wٳM9d߻n6#Q15UT ;Fq**'HpN$6Dww#%㪫ڸT^=$O]ZZ:w5kY2Mhj-3Xl**+E)hT}*ԋ<ixM Wze^G"4|=dK/q<A"r7\7^oqqqaafxCD&x@ٮF 777}ȁ3L@ ~zN|ƿ**.evLLXZ!r 9[H$²,$ȇl|+*p[ bBɊ ӧ^qh@\‚2~)++=ztMMͨQȟ $bX F f Fa$*(nGTAj"QAA$*v#Cyyy 7 \=M8L: 8 <쿘^7)&*.O>{b>ŋx'C2`ko{ 'W DTTdE7^MCapo=|f̘cǒ>pCC@9PQAz=@#BJg"TW‚r@E$*AHA`  vvvrhIs.`x*9S/YYÚzy͘1U/<+زe˔)S2"QAi4{ϏN'kqܻf!s2cs?44DO8&*=+*`TT`oӧOGT+qƉDbZX*";+am"^fFBD98x( "Q!=AH,vxS 5dʋVfШt<;iOS6+d2 lܸQӵΘ1C6/v&SN+pȑ@H"' %=j%i~s,ɣ C\*@.aroEh4ʷ.p~xو9**vq jlll?Zd!p "/p% BCv! $*D(l}}}>JDe}k[Qᄋ.h„ ;vQ{^s5O?LOddZs2gO$l**)Qh42;[Q![kIŽ限7eiѢE ,@GN:b`Ə<˧rez @Ԕ V]f5X! EEEB> $QZZ 0DWTYK%Q|^uïpΝ;u:ݚ5kΝ+wlfZWZu}(Lbe GLGN3F`2ȃD"l^?;;͢,CCCXEĊ VT m;DTT`i˖-C@@؜xĴs=b*344DmBEЎ@ 8D"X,Vq( ~! ! 
׋ 0D@YWW8QAc[dI[[F+N9-.._ressLbez xMU,f0䛭 Q4l1Ll>NcU}۶ 8ӧO_z5Jz\ ȸqܸqsNCCCQQ/A bX q ֓V$*ZQi'/ "i@E!Q18EHTH3_KK RBE8EEEz9?X,q3HT8 h aP%64bL&A"r BC?~AHA`cE(듾*՚qk0hMPknn^t]..ݝF'?߽{_ןg9 * 魍/ Zr1IT0 W cŘl}'0 𬵵t"j |cǎ/!I&)''`0E%Р!$*hi0-FXO!v;]!\ QAAӃCBr&BrrrrS4 JDeZz=ŕtsss.\b [^Q HN%\B~_/Gfh^N!wE ᥚ**av'%;Du+*j9=MBS(QQAUkkvR5 \@Q ƍ7~1cdNFz0 f zHTPՊDP*), "UlppY4#I4XGk'Δl6]tww͢|ޮUTTHRVD"5K/}g> RGLUPQF[֗I !1QNNR^̮_*PrE"^)SL L-aSE B=.%[H'$??_@~VWWnbZf3饈 bHTЦ,F1MS ȸn#>*őf [qLj +EǎC@i6:::xD6O*4rܹk֬I{ tA'b;sٲe"{$*3Y !F:6,QTCLl5'\Ip#}@|V^=}tA -ܳ@H=//555;"77&ZFͣՈF7&2ChA$[f% !>uvuڗipB\QQCx}4҇ TP_uvv5>ͫ^+رcDŽ xo~ᄏWIjfęCIc܉ ii4+*d ;DłUelٲٳg#ZfeRt: *++Nnw86l6)bY AFpHTUBE9 Q~/|!EGAzHT` Jp8oGy7cƌCEx 8_3\vyX,A` X, **{b0&nm$xavv6ljɷm'h&9477/Zq @C^^^]]#?˝Npvjb^ uA**D"'6| h1) Gz #I! pLT())N)?Cm *TU]`qFN:cƌhD* G6mby,ReWet]QA 0rUkW19  QAfϞ߈7 zAII@~}X+!QA0-@B!$A9Z TT( ix<B:::$45#TT`jJ߈+*0{xEM0aǎ^NTƉD" 럼[[n=!94`n<4ڲLrOgQbh4dÃ/46E?{>b2tM>}Ո֠U^^ޘ1cjkkb˕[XXH~?#- $*hA8EwrॳAtb~DP#i@BPQ!9׋ 0룑 ilsW:* *~7lqbΝ:n͚5s=_͍iL,rU*^t,---++#LFQQAy<8O?t޽mkk۷oÙlFX023ӎAzzzA@ `*#)F΄ Д1Q^^PSXL& .\bũՙ+*?|.bѐi8WfU5,^Mx<˽ lfӯԚ@ZHf"/ȡlE'QQAիWO>q2qto\PPP&(..v:#G$),,@e4f]@Qıc>|~c߾}'Nq#f]eeBE KT tZ+ ғ `M/l QTTPb'ڧf૯:l^Okd"XEk̙d/_|Eqbr"cDN(R|Vur**0{lLNVVb'ҲCqZȎsoeD-[6{lAȝqd"zKJJ 9r겲"T/4Xr lhh/ܷo_GGG03ȱz8k3i`B.=NH$dq"8DfwR;V+. 9  Q={?8$*(\\1-a`ԨQ^D`**h**oСC~#Gݻk׮GfY!K'**(@O?@fhE:$*R'_Z^/@[[Fԛ=N_* x& |Lsy͘1矧8U wy;pȔ@8W`bXz2344`:2iTTPr c@%x2ӧ^q74 .+_p8l6[YYѣnjC~c|/԰&NI">,f)ܹs޽lUCQP(WJ`P>:l҅Op88 W~~>"F@+*hh**)}#6MS&+s=7 /@krTT8d2߿ݺusI{#CCCrtLwi:8**0&Q0 W5"娔"X+*mP>+ ̈i n[,p8FYOHKaT@FK$Ϸm'pY<4r,T:~qPݎE@}=AG?;BPXX ȑ#BZ DDI@PYVbHNT_I>裷vۤIo-3xv?o&LiQ,E"%ȭA VTPA𭨠$ -@q\ڪʞ*H#Q@&YYY2Lv^PWWWTT0 I!Q2;;;=O~ڵ>OF#&Z( %lHT!&:ݎъDPh``qHAR ɡHGi-r ꒾*9{zzT9KFs%;6qD"1o޼kJ ˯ȬD"gwɒ%]vI^"X^x. $e^m8ڊ5ԡq2fF򳦦i @JF)(\(9zh 8xm۶nW_)`/HD%0 Dep QAz?qHCQQ"$*$W\\ ȭtrED٬ T HB}6LMT81k͚5K,A_q 7 .dGRj^[__C [mBEr ~#Gvرyw}WEwXBl6\#JHTJeƝ@ 8AHQWWp&fC0 ,RQ'tCFԚ@LT8 .gʕ`"/7 &M}v._RR ٷoȑ#gϞsڦba&DQ<>YYY x<2|A'!r&f-Z8@+@t:r+ėnرƍkllzx a @n_rlݺ裏h4bb:HTPÁ$IPcǎ!tf͆8H$֭BPPP w%P $*D+!4TM*PQK$"M-[6m_ONNUeD"~Y}l6UAX,Vq( O/=B*PN!9`G#QӉX:*|oQi˔jr<?kGZvyR`,; Ĝ9s֭[w/)44{v}ppK/}嗩oft:.;nᄏuR&Q!*Pٽ䬬,)+Ex&*s@ɦON?xgmLpq+}? 8n81-b~52 ~(|/r„Z(Hq3]vmڴ魷:x 3XGA0U:ՊDBzrssT=zAHA`Xi ӣ}t:Ν;=\$RIT &M}n0*a-Y󝾃574{'PyWȕH 4^8SLv^dbSQ Rdggy rXP+0ZZZPu qʈ)Dtǎ;NP[[[TTDX3A}O |׭̕D"@ ޾m۶zk  Ae !Q:JD`q.Ӊ/EǎCzfsAA֢Me&!)gE8{x„ W]u(!>DR۶m{.$Gp5jw}{ɿϬ2'm2Np7/Yd˖-cǎ}}wyf-XN͠Ml^<_$#QAV[cF@d$*(VIIIkk+JA ** iQQkjj7n\]]]aab@j QA;@bX ؿᄏiӦ]]]⛝!2hW)>䃷 V]d2!#O/ yyyB> $ tvvGKrǏ jg}VWW7|;Db̙D⢋.ڸq%l%Ky睍3$Ȕgg~?㚔WX$զK.W_2N#WBo/_> ,q';l9~D>҆pS%;D"V!ECdNx|>߸qP6??_|!j4Oݐ*!fGpXH$~M6o޼{nOzm 'Cb!QA!v;Dn]6 KK700 x<B$0ł^r p2eݻУ=e|}}7cq 2O3qQ۷O4%NTnfgg}䗓|׿twr+~_VUUg QIH$"}4@ |>RoL:qܸqSSSgL& ) (|Ȩ\% BS 7>~wx<. XHTP  GfZ5z{{i),,DRt!RAnHT`p{ZZƎH$N;ݩՍ;7?>l,=9^u&NHy֮]{+_F3tZ ȸ& '[,Ye˖qƥ}ݮG_uuu_ /$*0 id~(#7בh`0HKkR%] zӧ#0**ҸѣGt8fc!9FpMDرc|-[nzA2`0 0rL&Ue"g $**!vTKDr!EF@ HT`TH5Jزe_ןg|q29r<Dњ5k,Y2z ڰaß zH+{3UT8Ə?sLr8һ9_t? Mɤ L6idX>]4iҘ1cۇb.]>!.+: Nlٲٳg#0,_uuu>AUU`h4 (iv!ZǿwٹsgWWW, ~? Df2p88pG礝TLv;VXtE׋ +**BDO!hhW){.2說jkk5;z?|n尐i})X,ܼr 6+V[yfhETړ^{-iӔmv=o2eʙDf3ɝ3,G,l6rX~y~J{E$*bܸq-B`0s\uuu n /b - EMDb߾}H$2 Pˌc4Ѯ*9󑖩dDPnAbIDzzl櫨ARt1!|Anfb@-HL"7l%nbGɳH|oeRG?HȊ+/^\YYBKKKo֥Kݻfezʐ قmvn޼sI8?$-ԩS'Ok\DX,,QD*ְ<#@Ǘ"-枨)>/CX` ^|෡^PVVt:Ś Vl6c@ڣrA#8O,ۻw֭[?O?c*AkȐ*ILA0G@}PQA. 
/(4 Q!uBv A`cE p#VTc7xu]gO^g;-zꩧ>S6)@9Xh4*']|Kvww{^r3feOg\bE=3fx5eĤ[tM>)+;;AbE6Q%{-[oGA)ʱ߻Y:NK/uvvnDRD8REFgB~vTK(F5H Z!@ݻرoD"^O5 RЮ*S `$Gȃ QA/w3qX%(,,DRAN$8&*EFM BrSubz:zI逸J@_W﫪ۈde|^Oh4VWWϟ?e OF?~7֒dbElAz{ݼy \MFW\qw뵵#&*0J*9ٖ}úsNF%Dɾs,(|'&Mz7|>gi.QF3x<.n7:fz@*=k׮;wٳ' >|+*M?ךh4 xhs:3ZĹUJ رcB^A)G1y,kEOD"qe+g;ۍrrr֮];gv_ pZsɐ[_m&N]&{'qXfܹsجb f,ˬ'}$R^!y睗$o H3PWW7k֬ləf'*ƌcǎM6]yGA8 HWw+^ș}Vp*cɤO~hYTVVVVWWgaaaqq1SCE͎0@>o.d2agesꃩr4VqB!aH_.77qHEGG@{TSx _~wL֜ΝO~rb{HpI&͜9sÆ jD.=o޼%Kݻwxvv6ǦbuΌ{"F/_~v{&&*֯_O`01H M7gmrPˉӧ>|xڵW]u {JT^Qyy&4D +_?q$A:$HT5rj3//v{}]vݽVl.r)&*)HH7l0sLYWol6ŋxauFjҙ&EƽA-ZtM6\Cd.6**LTPrs=cUrxErk&,$* UW]AD 8Α#GVTT{bٰx< ڔ%@jemTTPÁGJ]6 @O2 B> $QVV أtTfY{u͙3_q2r^|@Yf?~7)b"E >'Nm6%rNNN*|˗ٳ>bB5)gI&|I×NnT)p3,8Yܭyx-HnHz㎶z* #<@N\zɟKKK!- CL| a 2EQžG͆ROO@bAt!//AHё#G$h$*P%IXlppPzc]ӽ[ӧOMT b^?Sz$k֬;w.F$++Ŏ>9!.^xl6јA6s颞!^o[%۩`>9,+*(yIgyfҥ#'-79^]䶅4V:tPKK 33'ngr<s~~~YYYEEȑ#XpȬlPH$up3aV%.%Hn lx(]oo/R^PTT oEzPQ=1눯cǎw}{a㑸MZݸGf1;\7oҥKK+g R/N{W/- ZQAW>wyo#,El^ h'*|͝;wΙ3/yx&t3`0i9|Ί N.[9 ƍM+a IHyNt\.%BÑ_% Ǩ@Ċ H6 TTPh4ed/[D?pp >.͆1~?␆!EGEGDO~u:8ZdM7vihu>*B4;꬙?0 +VXp!F$ '~6nܸ]v/XV9CP&VT ~uW4ͳH2;"6)5&L矯_~/ bE'*5$n\pAkkQ 8= '#]PPxl6[qqqUUWVVu v !Q@!SD"@CB*u6 ol% B,җcǎ!IPa +shٶ~FJH$?J0>nVꨀSܼlٲn{^FΔb!Ss0DE~4)^ >ɠD,c=v7'|RZZz-2t/r t:PWSHYG.L|noW%[Qo.VT"]iӦ=sfB4MKrE:6#//v;N4rQFԌ=dCCCMYYY(Vg J}}}r$ Cuww#gx:tGk6Lr:Qi̦Dɤ̀S/:=ʕ+WUUl%v{{{ʮUVl\&.EBΏ;UVT܂??o޼/89GNT#lX ; _~#1H<=Ǣ(3i<'GzWfTӂ-[L2elFdfZ9pv|U&VT8jK+o߮3jH'ܗ9VT|#wIJ9v7xc{{e f2bZB}}}]]]ee%@ Ɂ(\0D2hD2!QA!PQT t D␆!HJ"rCxf3nj@NxPWRl۶mĉ#M:uڴio|3>3: E F̊ƐfYXGEں͛7_yZ>?y LMT#*&*M˗/'իWƑ}lDVALK[ !.@4$@ɐq*Q&$*(`K+ݎLһHTHFRс $QVV ȍcBIIf.cDͦy``@Mc0,DVb ;>iҤ3gnذAֽ߼y3g}n$OD.HM|x<qtMw߭]??4M/QPQ\x=/?+8NDCӬ,y~z`XV+閐1 ѣŴƊ %*q (2 /+V(B QT Ad2q'r!8z('3QQXZgsbH=GQR^kd:c Z䫯zݺuv~5e$ҨbҞM LA,]{I%|+*(!Q۲e\a-4URV`' !?|>YQQGst:<P8*3(ثX` JRA:;;, ~(Bvg$^/ 7@ކo:>Dx<Sb1~eeϿ袋&Nm6Yڑ,Zr)=C-L|J0 p^{O?cLim8>#>+BLTF;IE{{ɓ7l0uTDCsSGFb jkjj|>_SSScccee%JHrDQk **(\0D2lF2TTP$**uV +KߏD4 )BEPQAn}^>- D"100@1 R_Jk1cd%e^߽{KZ&>%**isTn iUzq Gs}~a'ߊ xO yr" IHv=zر>E}H\BfA^%_J`J]]]];Jǃ ByyyPN$*Ȣ_e{dZo$HPfSfBEŊ 4,k\S^:k֬Y/~Ev21QiF*[7%.P;s>W^ye (vB( dǹWTWԑvܹ-B444:EEP;tO2u\uuu>oܸqcǎ9rFfdh4Kp TTPA!2a;r~fQ4]v {K Q!u™`޵8&*h9xWQJ:S!{ӬF^gS^ڳg9眓ebV2Sٺ)xWA+.Rrw~nf@:lAjwA?8kjet .bD7r566|&1-AKGd($*h Ge%h4♘29HTPł IP%+SBs׵a-Rى IAA 7TTdv$E}%975jĉlH$B8c -Z衇Jg%qܝ1@5CrBc avufX,HSh4ߩnGTI}\@(BBz?~A8d0D^dRYDmSDO?'?Ƀ>HegWTcm.H/_/<WB lHK_W믿 +VUv~-qnq<瞨@ZiIϫ:m4,%l6ۿ˿׿~'6mڴo߾o~G~v[V@PQA AnppA f!QA!l6VI$(D BFAH\q\ShTTeZ}mժ8ņzzR6E}ӬM&g8x^uU+TT)**6 [n^ /8p`Μ9|:O '|K^m'Onmm5jѩqL֊JKK, xԇ@EmXX@PQ!L&d+n1 8Ez AEaiJKK/AJTJJywӟ.\[nI;Rr:|~M63114*NT[_~_WUUiQ;W\qw}'Dr]o̮f'Oްaĉ BEP&._^X`s2eʷ~nݺ:s1i&*]φo}@h'9mڴ_|q̙:G0Lb:1-fq4"84Hc2 MơEQ$*(D@})wMFRq! ׋ *p̖x"  % \!Fތ3o~ls ?)Qa4LH$ڧ~:_Xփ5+h4=o?O .:ujnnJ>x]we)N^QAedt.C4hpv2xX,& i HT TTPbqP ZtW`P$*p`"Q @^rrr2be%@EPQAnVAuT/*+e֥7xc\p"O}#nL{jժ{LC vU (D|__n|VU Gy)OıN{*!QX_C-[hA8|ĴQFy<l6łV`X4BcR5PEEe Øى ЅT "5 yyy:qHE{{;**СC>\GEJ$ʹ)p!LLߺu3n \;YPB%~ZZZnW4D,JJ饗^}Y`ժUK.?ƍ3kk$ D@/_wQi3̕r` ."({! \%" ʽ(ȵ+* "qA.Dx%kLȼy߼| ?a2twuUu= qaTZZZ ԘFi ܡ<LDj5** D$TTNbEhG]ll, ($*PDv LL08,\9s| ,xG|<ﳓ-'=຺:v}}~`dB=r˖-۶m[VV֓O>$@nnw}3|bDijPQ;W6mڔчdoIKK˭yQQQaaafYףxOP`c!QADPVY6-0;1_ p[EEEBCCz=bAໂ(APHTHK󢦦&11qϞ=m۶^Zs*ѣG{YI=g^U{*++v;O>uTNNc=V'"Xp8VQ4mGN:>"֭[ddNz6mΝ;g9-V ׭Ru{znU}u177t )NCB|R㎻+--YfOLLl6L!&EnLDPVYr$իB***Ѝv˗`-.FRiӦ͟?nK2tP=uf޼y_~… tЏjy~ UTp8ܯ\'xb˖-?ثV*HcA t:I5|͐!C222Nzdr%%%?'L$NO=:GΝ;ױcǝ;wn;Er&4C;RSSҚ6mC=,, i T4h^Zl ",$*0 IHT]hhMEE  C=/P#hjTT@PE6*ʓO>9q;4(u޽{O8^jh{r'xhJKlӦ {7 ]eaXH\>۹ss=7a„4̡w#[o7y?MR~ݻ_pd2!FlA5kJ%""ccc⢣TT I,l W\C`l ?HRII /&rt:\.BXqqqEBB 4 L" Y_5&&fÆ v:Nj]h,X%*ロ\˗/ݺQ\9%NR*#իWo۶mĉ<|zT/~7GG1x 4SIo~4k֬ )Nh4͚5KII! 
kj֢}נ*V+X`4QN 3 WQQ8|j!Q"TTJii)DrFy朜Z+O7kMxHl1'zK/p~- I{ 1 /Ƣ=xl ELgΜGM2o[LL"y=3_5;_̓KC%SQL19pl(g˗/ׯ ؕ@HI&wŅ&$$$%%4HKl P e<ַP՘cSMM ;@#QA཭YVޗɄLp-A>_+Ƞz촴4.RnسgV=vXv޳jkkym^ϟ߭[7᎔ +YYY_$|-q}FѣԩSx,!%%ȑ#|g(&^nFn8A)QTIIIM6MLL #Oi5ޓ L VD6 :rwzڵkL&)|_\rAI&(JLL*AR2_.= t@;t0iҤŋ &| n &:esɿo/i<9w\pԫ,DC.Xq ,v;Ot2͞䄄H@b0'TT`jhFljt:X4JEE/LUUU"""_!^ EhtP^^.^z oҥ|7|FJѨj=2kvLS6v>pyzATP\.>_Z]ڵ#߰o߾9/#VQA֭[=a„ѣG7kL&[u[`JkԹQդB O7)==9:<yƻEDD%%%%5m4** !<ޓ kXf₊ ̪{ ZF@z ;шA +ࣂ$AP6m 5t2OT|2O2{oԨQB/!9^F|%*H~ 7|ӧO_|MYeff_fSTj4nq) ZnW.,jnf{bbb͛5k0 a<jjjh4hTrXFo@zޙf4xAӑ| TT0QA-)) —Yc~ᇟ1'߄D!p,Ydٽzڽ{7Ks\.׈#6nM UTbՅ 2337o %Xj?/"M <*C >JOOG($|Zd2 Ȕ2F뮻Ұp#280 n؎FAv9?,0Hxj0XGEEEBCNhornBp R A+VdffBHH/ZYWBii}>!Cv-'*3&Lйs3gܹ7?:: -Z4}]$*SWC&MT3׬͛7O:5==]>ƹf4fR@)ڰPQ 8HJBj颢ZlL9 Dq!w4lvrXHTIR]ޅ!xG/!!Ar dćjRI]vL]K hFƍC$ BM6mٲ=ӭ[ѣGϚ5Ϗ=sΥKᇑC<`euNdٰXFAz*/^eUTTj_UVV" A(&*Pjo%ä+o߾~1 /Ҽys.T*qrUUU'֮];fF?3f|g@P4Jx|ϢQŻao6!ĉ%O?4_ƍKMMIA?3gΈ#ȓfQ9.rjZWTW:t 9 =FҤ [fPu]qkѢEBB 8 C!n S*M6+u**$!Qwz^"\^YYlk׮!^ QAhHT?ŀA 7l+*KOOcGjj/BBBxag-[dZZ -jݺ;VZդI?2VѣG@(~X@Z%$Y &xӊ+wdɒ?C>RJJ!d~IkRIݍޗ߰ pHvݺuL[ +i d`6㎴| _۷~8x{wEzSdH{)n[㪫PTXXl' ,0Lt_ 3 qjb{ (**BhOj=j*՗!~\\ܯCy_~|rS2YK|Y%5557aHHȏ?dnFgffڵ>~ ?cjHT`MZj%+S\\,˩SN䒣Dw(&*%4 $*u(& ȍZhJHHHNNn׮SO=k_?Ñ#GޓдiSvLf!QADnlnuFA)))A#jX? QGHT.>>AT^^_ 5/ZO͚5{w\T ѫ#{_\tWP7i I**kժW^K#*.O>}lT |Orrs粳p6lȑ#WQQ!jС .QT³3WzV@s<5BCC¢'|r̙7n\!62,Lcg!Q(rss1& @OI2&y2'-[XXXV ꫯnذرc/^=Lǎ].q{j1bn[dܹd|أGiǙ4b-X{+Wedd}}RV)np8 "F |?AE4[F(@ȃ 88XR;$iӦ[ѣӧ7oMHz??LF|row\\\QQѴi|Ԃ1cx6@lذxFDl6_u:ݞ={?.~jIIٳv+WޱcǙ3gڴiؿK1FT-BkI\EG֭KOOGdHR!Qą\ F))){~饗ѣw}'nٲlƨ$ CEQ* q`Pmm-X둨z1LxrT]]9Add$#TTBTbS3/?}{$*\5.wU۰aCDDĥK|+q!2ooxJ-\_~ d>n~ TZlm۶k&''N'*N1y?dXDF&}xґZr?3駟޽{{N>sF6bEFCq%9p&iwygС<pXkI&={:u?o'-[ΝODm4021㐨 *V+VSE{ =PQ;R]]84VTTࣂ!~w'70'TW_JLMM]xiڀ}7bĈk޼9Lm2n85rHio޽_xSNjxܧOիW7:JRVSKtfȽIn E2}'"E:<ؘ׹s?ĉyyyj syꩧ޸82'cg,@E"}]$0/DDbh6YVTTNׇ" 1f3^-q@bbb@1Q!119T zV hǴiڷoKn7nT(W\i+ r3gR璨@u]|ɺuRRR<t:L%::x…p}?,c?ڴi#NOO3g/"=zW|yѪ*@ʤO½R)UC]` gVBr&0CO<}פI@[R! LDY555@`P =׮]CjM&Qee%~G|tU!qqq(&*4m*Ql6#?!0N"kΝ}]3PIC7y,_V=S[[3gȑO>d<^~>_~AcJfv6gIIIOޱcީXpa^VZ%pϯ6l[QAP\Im [q978O\AHT%BͿoLD4 U6!Qz7Hx Qee%-E-::A@]BB ($*@7< S'AFFý9qΜ9111 71K7>|X;裏6lؐbX26f'7ā:rWҒ7{liK. O>䧟~* yoڴ)//Y֭)#!Dܓkt˩{0R%,ح <4 )cbuF7HOQQ/^uRU^^dW?`}/4i *//֯FB=ƈF6>ðth\\\T*ׯ_/eVWW罠w={Lrw&zQDAR]\{ǎs{Bva fͪӧÇyQ$_"99JMMzuuu˖-͛w ҷ5jT6mxo駟ٳGcw贴͛&2 >p C#HC!wd(sSN%%%IlٲG}gYԩŋ7oެ믿>i<)XXvC;!77o ;#xU  "3tHT`TUU!ԙL&Z;iuyg0q؁,//G AB+e0PQA@ܟOT9sf6mnHXQޙ3gwطo?yJ=\.W=p. ~ZNNv-PX ڶmYF'vҥ}w2dbYp?N'DLQFt/eggn>v;@{RrXQ*XS 2QPQMvIX`4$*.44h9"O=ł8!::Aŕ+W/PCPn~;***QيDii)O81߇o۶~brNIXڵgg}7[%/b֭<l7e=޽{?Cr i TTرc].O1uԁ~Gry)?~<9jZ_nEsV}kݺu vCek0@\ *`VWVTHql6# =%%% F(;_#^#¡XNPQP0f $JG]~=ϱM=/#Zl)aޓ)>>ɓzcǎ_~ᅲU눈~o [QQjh'}7yӧ#psc:0<n(JT **n#Q& Av/$*ҁDf31**xDM"\O&ueX -hbӦM deeUWWzp8Z:bsǯ'^gr,YD*, 7H+*իi6Ν+3t:W^ݭ[ŋotX(&*ԹQ|7tз~q!xA 5!BDaPQFQ9Ic[ P8F"##" BDF xIT`'7`͚5my t:I&q()\+88̙3ߟ ۩J)SH> ,J͘1H}\kedd_VT`amFA׭[8@C ed d$N}.{-..>} $๢o~dis΍5j_|Eyy9.c( k].^ P[H >l t: CHT`\uu5 ":Sa lHT`dBErIc9mpdXpe!&&AQqq1АxAPȸZĉ:$[a3f͚n(zҥKo_mEШE_~d40S)tRSi&//O[ٳgϞ&Mo1[I{HTEbbbnn.Ay!TGH6,`YUU "j SQQ l6c‘b^ 4$)) A!z2@KNΞ=;h 60;;jG9+\rQpU4 MZiiicƌ^4O $ȑ#].׸q$׭[׭[y]t {IT*vtvp<8TT@EyR( DDVkZāA\g4yeeeT*dB8V~C|QUU}. ;~;'`jlۼy5k5#ӧ{we\ՒinI';;uܶ%A'Z\Fx,CJ ,CT*$*faci!] 
C}F/~t2|I 8>>N$W|*aggǬ: x~~W6774Md{{| }֖}NCIENDB`npm_3.5.2.orig/html/npm-logo-white-trans.png0000644000000000000000000032556512631326456017157 0ustar 00000000000000PNG  IHDR8 hBtEXtSoftwareAdobe ImageReadyqe<PLTE̙fff333.tRNSKFIDATxv۸Qsĉ-$sjƖ%x@lcxS:#K G.Hg[W {`G,xE-q 6"u(, ¯\vߑjBfUjyoI>2\;J?pp 8H-pp `ؽ P=v:`Zc\`cׂ B ~ P$,WjAH[ bi/• pU. ~ {-@.yB lС>ׂ_ߢ(? 36?Ʈ:t |!E ?j[~p<  ,.p Z\Ϲ~ |!8@8 |"?~^~ |8@8#8@8C:~ X\ϵ~  |"?>ׂZs8@8=S-x_\-.\`q?>ߢ|@8#X\?jAxX\?O(s."S--r8@8>~8> s-)=X\?ϵ ,. Epp|  ,.Z\8OA~@8r?X.EEQ.*.?fjа8`ZC t ~0@-)@8ֹ`~@8ʹcq W8d-X8^.Eb|_ D ^ ~$~ t@>aq~@8`\`q@ H.:~L[ ~}\ 8)؋Уؽ \ t0D=p 0R.`~_-)spr@8: Ee@82A,~\l~dԂ @8or?ߓ ?]soQD?p_ v/pzp*?jhM@8`\`/mgNj_s X}^ֵ 0r/b|sL ~UsA- ~Ekhqp\`q[pZ[س8 ~k  |"z@8ZE@0p|sUNJO A@88 rW``pv/p2X\^0a0ajArz AA jpt++ n ^40@8CEsz@8j`bcC, ڔ۽p0``0  +`({صoT[X"^ | _aPe6%,g EW \\-.@0B,W [`m o/~~С\`$(  Zەp&ؽpQ 􂭂bK = `J0X1yK%rAp!z!3?@8f^\   + h~p̔ E@/2wqk d0Xp2 @0T W(??@8-.jzl E7%,:q`qOAbc1j |-zS- :?@8,.{A` qp>X\` lQ 6O׋ *}z~p| ,.@/ vYb Z-r`uq{&z`S0 `" , b`sX\zn`s?U`\t`X1$,@2ht̷@.@/ 2hC>{"b@q0X--&?@8 @/'}d0gzqAtW@/ ,1-a|`X\H@1`ϻ_^ 8Ю aJ@v.``=-a|`p|(lŀ' { v -ӎX\z V;`s%~ n  zhm \v/b-"=T (uqp^ | d@˫fq\^ ؔ`[ ؋N`%RB9@8QjAHQ-@/ j-^J,`q=!m`? X3T lKO>{A8 @/Q W-qM n?@.EA ؕ\Y\-.@0M0ثa1X\zHn &ؽ  B  h aqzA`ӹKX\mVK djM챸 `b`k^ vK  @0X:`b g>pp `sppA  [b`}'^Alrz Zz `M dX\_.X\zNg1уoX\z`b`T%? oQ``k1-*X\-.``u4?I \~ ArC}A 8@8p!\Q0-6?6ي}b nǹ1J$, ,.p!wW;`%yqp0ߢzL;36%$X8@889 8@8s { ؖ=?,.@0P L?ߢzw0V q0ؽ߸v8@8( ,.``%`R  )K K` @/ wv/WEQ.`` $Mo`$Xb`[P;pp[@/08P1?,.```dtCN{A8y}X\@1p0r/.- lJ2Ltc\`/K `~0}joQy0?*%C?`aq TXa`bp^;ED Ya8X; f1}1p5` E 6;xX\@1Ow>1M.޵~/oQ``H^x%< @0```b{os.p!b`W~؋z`Ϧ@2^xX\ v m Ňr. @/0@A  IRxX\``da?۽@1-`Qȫ|hqs`Qɷw㳸9A ꊱwC@0pd[M'>~䷛at@0#O~~ @0(@Dm~y݃6%ؕG~NiK $;1܏od!smKDp[@ab0{Utp|c􂅃b  8wW@D  =tS|<Xv` , ,1`tɍp&­}bP{  4 :`jyw ?ر`+ 9k 7=~<wP"z`0̩dt~=VѲ @1XlrFrf+J 8~ݠCp 38P2Xڅnfs ڥ~XQbpWzlH=T:HVM v%8t? 
%|ͺ-/^0a0ŶB5|\W8 Oe(Fݠpp CIP J-3eɟ3= q󝁩M/";-;@0p%% o Gm_"~G0J!zb0k s`!'ItP"z` ؕ0k w:#'<'A@2^ cfBίcz`C<չ'¯Wv88y؁eK 'Jn%'Ǐl:@/`0ǮɼK5~t.=3, H<{/KӁg X'(%.T3X](ğqcR1bC`K&07cN% ,1+`-gnu'G{y_j8xyez~Mb ՟7%,; ?8ж>U,4ܝ odXv @1XޕS*Mcc_Nv0 b$0x`+||pp 8`FoogG;%,; ͤ{ܗw\Az`C`GE'E% ?%8j0P $*Т]:; .P.?'Ѳ`b`bO TZ?Fto8(;ezF`8`4#27Sأ8/(.h`0c0P 1b{- ۅ@/pasmJ t_|^rwF8p؁g K eO{u !soc,.=t&ݠ)G-qDpp%AަK $Z_pPg?1%sO8@0!Xb`[@R/q.8S/1/8( @1Y:H'=oc]b[_89( 6 q``S] tfs0̌<=f|cpp`Hm ndRpp5Xv4 ˶nOcCÙsafǰcwp}`b8`.7ޗzOݒoGt`8P1+/cp {0⻼2.|?#= vk Q~]Wo8ys!t~~;@0.NkX _2F;́#buUA[~,.v` d*Supj,;@1p4Zg}G qpp@0p5#Af*IA_]83\O``bm2Uu<+>9ՆW;uqQ $ *S ge6atPD}<`Q ._nKU1k8uA!z[\49%Q7Pc߾qpvk N0`= @2XH9C/\dt}oPz``Sm 44Iϟ:pJy?7ItP"%%E=>'3(`Nosz`оٕ[o9` 4ڔ`dpg]d/M.YyZvz``Sm ܖXU|?#sn?l 8Ut`0Z0SK J2tT;c?~YJa!^0@1x_a 1pgݺ\xT}\ՙ.m+At\a\a ؖ0HC|'"ş[v[`y}j o7w,8(+ <@1P фJ3k:09yr0`ҁe8.ڕ0xr(ݠc.#,3J` Xb -0zo/=8X1.v`zb`[#J3S❙݀-pk ><`O͙Gc*ow;l f*OM߾{ ;Vw;cs%|7TU"O_f8}؁ "Xb`[Ew:p]6U;"y**O){ hk^Pn~-;@/```Sd0cWۄ$Zv` μ'`1~)ΪcVgKp6 WCA ̼ Xb,6Pap\ _^Yn8ψ` Xbb0p@.u<u@1 W>S?npP ݱD`bC`hnpv7̷~:pP"z``AQ&H8dP?|6},;@/ yA|=;L}#ote8d`t;Xv` %[x*?|%E8ܲ@1P c?Hp;8 ֹv%pc^tbTo`N*~FIVXbd}7x<%Ѵ=x@88yѹ8``_acږ01Fe~fgPuAAVDAM (K'Y3r ^XpP^SmJ+`Qݻ8W4 pP2Xv ^b`Sd0Pp&A?p2嶼!Ƹw8(q؁e%%$}v~kcvD@1P 6qj8Voi=8z @0P P ZL[>:*CɌcp <ïz Ϲv%ìn-4fnG\k`b$ !ZuYC)|H8+@p&){~c{--?糖> :#|mcZ$ܱ޴`b`bص5Ipcpఃgɩ,1v+`+7&Ol'+XtV8f%TI0`K>ptB<is-; ,1<Ԧ^2g w[갃5߲@1P h:6rv?p9``W{8z`]l9OuS%Md=qA @8/,wNbыCZ9жQZ.[{Qvd^xf ٿޖ`o~7H7xB7eL;33>@1 6MGa*`8T'u؁e͊A\:`/=Ƅ'׋;4>6\tlVKGىK 1Xq%l_ p0x8snj݃k= * W9Xx@osҷ.k K $5ŎO G)p,η1o~I;L@1XOܬʿ=$Zv <}ؔCt|trd^}ԭal ,1`Y_W5sԁs >/Upytčb-,5 rpN!P\5y8xTV=WوgZ1:Tv` nO1p^Of: Ypt q륃t:e s% Z_BËu@sX04v癛0bc͢ڈpPZ7|`Y:XpZoz)269P0+uYOC8(\Rf*(Hoů 1V88JK?`g'8жn?{/~̋ߏ9|_;J$.```b@g%+T/{e+j,Z찃q1JHF$I7@ +jrNv @23Xz^qpT+ ?vy=&qOQ0.K>ȏX ,p9aAk^eR`=T VzpuyP`PknPMt )맖.ca``] 9"GH0cpP>:UnY F{`QrכZm8]̑` ֏kvP!_vW0)缥u/ڍdb>YȽv Ǚt앬P``b@OwB{׍vc%X ,Y0AaIl`IەL8l:Ia~1oD:B8A'iv7``dGAliu71 'tʿI|Nb`['tVv~~8' )}L<XR` 4MvPa~nv!({f[K֍i8y>UhuUQ_PFsmDgX]㰃e=ө0T:Ff_0c1A$yr,9Unk.;/8P2pV:@~ o1U|-ӟ2Aaoa0P lK HVLG ~e1Hy#GeppPZTI~0P P y k0|9Sn FG-2/?  v:mJ >5..l>8~Qfs3kvpYb -g㫊E%/(ApP!uA|10%`nـ&EtPȋW xRa㊿;P `}MW=6))\eЯv{^cA-H9ڍf +uru6+<Ɵ&|mJ/b,*\:')(~BQܕ'5̪`B/tlB ?*JAl;:A{3w\4+Xr1aẽX8fAko]pgdʂO{2ePu*{׫1'TLZ^v`[Cٱ5}.i1=Pbp$:E& tt (~JWHMGVK++A);a5 5yJq+b`Wgņ2GӑUqZ'c_B(:B8|Q3T+uN⌛{(혮c S#e4@t)pP#8'lKQ,7!4q:!_ {t5%.wr89OkuU,{;b0@toc\"89MrM v%Pc6wUxf>1NjpJg򷜰Pc5HT{jÆtI1,:k91ľw >im 4: /AZ< jLv`bZVT 3r^im^8Mr2t0V:P JXT 1F G P &;._h 5-vNbl)A2Xأk7+?cŭ_86/?3| 5nwkvP5Xb`[B 77b c\pP; *|͛ M+ܸ9koH~/`bbإL}oQuf_+d?͗%޺o:ց_2F vvEB+*݉;ޭ:ث ,-}^7<w``U e p˼Q]qeA?Qt؁e%l/^x45:J^w1x:1ę*uNbeفbȨ35 o6*bupٷD+OS};pP] ی.J,3F#a,`c 2s8x&_qkY_'XvC#<@ӗ k&pAP WTH5~;VbxrFض|4T4_9']" U LW8=cNZT 8`prrcta:?٭?Z,[v\:,0c>~P [2)S9<6A{n*޿o\d@)uo4~ab`Y8Ȝ/ʯ:H2^DPðʫXf8@EB.H8W8xa fY[-;p(4b:pPd==d};Nͭ^­y*}`فb⮔:(>,)%`Tp )H;8jٕEջoj=(bBeKkESc$6M--6*G" p9Ͻk?C:plKͨBCST {,ZWz=ba}tެW(4@lT2A|~E{8xTzꅃIۀdWgL^C!i0P k AtP;`e.Mރb؏e]1ؗ`*>vƗM5k ؖ%^wxZ|#e1Ha5< &N$*+Xz2GddC~88x,j̋:z:]z;Yc30277QaŁtcP{h3̲<76wtF2x@1x?˝E8h1_9.Ѷ9E%ʂϻE"ob1KoYlL4ҁ&Xv pV_z7fQ,w?ʿr;ppM4XqJ^vpyXTy土j!&Fش. 
pcpP!cvA{gt]}_pۭ{b K0x/I0|h=p88*I9gGW%XWk l /<)\WG#sԯiQ1Ph.v:`bd!4ygAj2ưaLAj=Mh1pM/ bp+~p˫Zjd0ˆčdouE  dN|> g%W[vbفm u67Be:*@ML1ㆃ.`ܲE͒b#E龁fP Ƿ%~V2}8HAtPy sO?IWF7Veaǯbp vAIN 8s# n*U#!n4"WJ̈́[|Xqp7A5nBrU<-5W|ňbtu7^BXvpr~gJ3Y &ɀ]8x;{q~|͢,;(t뭳_!._:lT͋AR X%t+KEkM&-hs[>bA c0P 96O:beAs/f|q*_m°nab/Agc2+BTjY`xQ=F_mnu)+x8lCp.Sns$YRP6QnssQ(9ݸP8Agy$8(|j~-?3rڨ /A0so)F>Z8#v`@;Vɝ >O `pPtAtbۤ:Wƣ=7A>rgcP8Nǀ~P$ZvpZvn{Go7`dHt0U8fSw#߹u:a$6$Zvl&~u{[9]76\F%΅l8 1Itpn{D;ZdpX 0W̯,s"mqn Jcsk=XAjpY nM`_hp0{8%V<**‹A@-uSt``Rh [;S >f5̻pPinͽ@8hFz9^[^ E6 \1*"  z,u> vtNe<AG(zjhsa6/+I1`e~R”+_&(U}J3VXvx*[ n%uFs0(N2@88'}lg  ֮ZUNGۙE,n'̧rp/-(GS;\uG \LU ߨpP2[8Xn~,;7쫋\/tzd0ďAAjJ3 I<%Mj7{R+~A|=n/u T>Vp[q.)Á =>,Sa{Xp t{ļX8 6& W&,XvT.??]] -*_ ~-2)E?d8x8H5F`.{ʻ9͆m ;b sJ5<7|y/Q8S o@8)k5ay-625Og )0Խ3s!OAV:hHֲX2w٢C _u|8F 8L8% mNp+ωvp}+S~ 6*_ UG:JZ[h,'쪱م֤aED^:\s't+T|9-w,D;X]~~oDaI:߁E,5*$^a!eVmxkN̰Q@͇u>Gp5j}3xod"~3+nǻ"X||[;L2WAAԹpjt7TLI]`sGYGKMeF/<~C hW86e:8N[a-BOwth1vJ sd1m8x^ o1$VuR5E/Oֆ9}(707%@g^NppP1,6m,;Sc "}%VA: 曃vp+޻_- _ 4h: n'w4Z8(ʈm8T:R\uo1ԙaph |ӆ(~9͎2x{@l/0~p[6 v6XvɸF49GpݵQ_bd87++oi[ojl/Q( YF $6 #rR8h?JrN{((!6ܢG8=Nby|_ Ap'9Ėbס+lTxS \ap q4>3Bmy*T=6* X@881}#tj*}[3j ?QAZ]?50gp.PE-ZicAp?ǧ@mTދAP @8F)d?|I@GЕ'3FٿL I \}HH&Ma@jԏG q2P `ݯ xnjO//P1̛ ҕdPzb9ȼY܊T= WL#iƿmrNɻ6x7T[9oG^[JfNr2>.Gnpl3?+gzpDoڟ_tP?m ' '_z98m?`I£Pj_slT+ߖF&#b¢qjyl~>}mv`uߔ *]t_zq-tKQ[%0u2EFMy՛fpPp|%44ߣ3$ 0Լq֨2Xc( d\"cEWav` @"ݥ\E4V {~A7z_ sG~7+Ӂg%(57JڙEO1 0wp 7Pؤގ!?*NF}bk4V;h8(XaBlpϻNO9٨[_k$ZpP$5Qb(Ӳ TC*_oW18mHGz8Q;`KJ ״N)x2.[`b 2ɠWޏ3ggpPE, y6 &;ڲ~H'z0ؕp8V(GAp{e/8b8mŁppaa 7g2݇ApPi]<+AtD7־L9 ~=|S"p~cH5(: O<x3H0 - 崻g,|:8] b S(rGapݟ~8V5LI`m R \W%A1ZcK%4Z2(=:v|pP{:Jn8rfs. mT^d %4b2$N ה,;XNlƎTnV|W b%e,,BM8N]]-;=P7.ou4c `dp ?b9,=~ ^pjN(t- 8O[#^SV}Qo)t2"G2m J,@84(|VY!ܛW^)>mK O<Tz(;ޓM z}Svp8ȸ?uMuvNbg}l7 >_}F gb㠷)& !حpP餈s[z8K8x?\O/^˺sKa=&],5xŚv`dlJ5L J*ۏJ  oí8^ӁJߐ6 ij(_b 6A~pOR?Ȋ_Jx@[|CLq!C+;u[ %wnoڲE&Ub3T1p?URVA:7m1mt{g ;`[hy{N*}Ma_ο_0*ipz=~?R㍵=8(W2ij}EiALMBopp Wf|;(òf^@n.e~$`pyy3VxTboOi}1, ޛ8`p.=:+(/}'␟$0zoSˋ-Z &6^W#܉cp8tkz9¤1L^׌uܚ_C|2 r0-A1œw2JJYzȿ z*M*+O6ʽ/;J9xl?nMI0œ!8̑q3yA58H@\?M畷;'k벃X L Bv'|`̷S88?Ϙ>I Xd4K8?,w/f/lT]c8&t V |.B \x2X9gͅMKXᕑt  )-:d"l ޿]| vhѝApsp AEʓwݱ۲T8ڼ8`ah WuN췧OL:~_MYlv`AM 7d2lXщd0 wG8tp}=ڕCeͲXݔ J;a`Q $dA|@8xT61A֣{?ݹȼtpj`;b]c@w}*&Hр@2mVfwnWV&f ]^sqzw_a}ǓUcxS.p,Y b%@8lQ<\|OAqEtః_Xenzb ?hTVv9>  ^x 篎kOđJ˫WsGQ2&Q_ R6K& n2K ֞Nk$jak-;߇+TlE}ߖ%(N G ԟԋ 9e_.8 {M1D HL:?U?Kt=凃-L=vN)wH?꩗e^ 5a28@8/wL}?<~L:UT8%  I:Ť"FǷ%a28$ȗ~b>X88]qp./Sl\sI}hzxDN-}> o2P @2@8( ^rZqpj2~芃& 8'뙺3nIS{GjrJY0ؗ`Sl $ȏ; ڙFħGA~-\,;8*5ޛh>v+b7@8&l VxOw7n'M/9G ^P7ɈO10s]Aro]yq8^'͟s'w,+ͷoW7l/MoW4zvI[s? S "&<={"3l!7?5wNbW{,Uk,Gi`b)Zf ,]}9H5;XnBp`@2{2?h),)<t eC.W;'q/-ҕܫxS ̙`dc@81xSs''?Jf`r)sG1GQwQ*>KP @2H1Ye0CYiX1x)ͫ9 [˄1m_8{)_Ey[ՋVR6+B1@@ d8z{vߒxE:gNw.;_Q/=p0:_ sOœV kxSǛ |1RB `BjnB!p4:MTˎ=ܥ Lߵo9A{]XEJ BqE/IpO⁃k-k&ma^^zp?UNNpiqv{k$Dii#D͠Oz B% [~Yu`$8(:OoZeߪޅ|Q&es}!Bh1 ~2ûhmoG[Z^/ghd{zQQ'E$#![LOv;ݒ^3-j+6:vفhvP֐ !Bhˠg8z|Nt ;Tޚbl_xe`XDDnVP*7#"^z{Bĩ mK=(k;8H}Q)CE=qp`=eNA|JB|Gfn8+1:?eM=:\]V a q3"B+b@ f^HpߠR0U(Ezk7en:&+^Ϊ>%@P !B|g:Qمgd2_iOSE\xӢTf An#2 sppޗ4^bP Ƞ  58Xnmz`fn(:ng{EAN/'B;Ā!ẁ C=^KяM@'*M/V1>^V̅8~d8?Dq1(kd2@V[cp 8XbD O춼t`&x|2!B 8z UAAglꑓ]q0lmсB8e^,EZfo58_MErm2½l8l a.;@bz?AhQ!.pa,rKzT}8l@^*\*!Bd*`BUpQXGg-s sB'9[ @d 1?͸\+80r˴[;B11S !]oT+3}Y~*ͷݓW B! 
P8pp^pկ1Kbz#8A{7Whh'!!rVp~?ފdπd$6/B`1@# V-`2]N%m`-8x;j~piID=ĀB 8>^z9m l߽Tq̮:Hv!dB]@d~ lK7U$4|-=QcadeuvXl$B]+(8-PhPRX&ƿ"ߕ;*Tk'!@ B t988j׽eG:P/WrlAKID.bA?4Ƴx Wt9(Ma-er]Ap}86pBzbX-^dH/JV-)C>B{-@XW':ȕ,B;2 AA;:y ut`pTczo.;@] 68 raXaWi| F80C.; tu?% 4\|7MPp0R>?AY= *vI+&N, @\| AW=CeXQ58hܓJMA B[Yn޲J܃f:+iq=Ý@cw?teDȀk B8zGW.WI LhrO"B(1!8@Hq0:` T m -80}I"|,R 8>6\bcƣ/G\9180]ܓ  ~@ Bt`mߚy\="~-{G6f[u}zzBGlM `! e`eәnkpP͜] : !8@ "nk_cj{*= l$;:@(1 Bd7_uPp:QA+A Bp@cp>Ú6 c+sO"B!8@ ozˑr'8XB΃'S[_%K DQbBgE`ѡ/~Sܺj1:$"̉c v[ -Sҽ{mѸ6Of{BĀB!8@p0fp~5]`ˮB9:zÇ1: Bp<&8XyzU:8[4m:v>$"&Bȯpprt@u"WJ9p0ۏОJkLn0#.;@Eo{_^Nv|Νky,5}vUp@jPlύ'#@S׻ZIy ۵]\8ВMb BFscn޷8nYٞJ5UpVK;:  F=!BʣDhžk9p&M=M,f㠬Ϸ,? tIv!-f1@̫}3`xʞjq0j4fAaoJcG,_sO"B1PB(8Ոԛl{]8hf {Wv_v/Гݓ :@h'1 4΃ir >|76ڢMA'CQ}{921 6EZ{ȻQp[b,I$!Fޱg=O%p.!& w~ :wcpY~ԺD!k5:EA=T  t=1 6" .$8-S"X6mՄJ@s x !-b@Bhcm"2Bb˓ .s֕؂/6u52~ABb2C2awG哃-%)8XuAڤI;p a`.;@(50A *!3 r8NQ҂cdl+a5ʿێhDO,H A BJ]ppNfpP .v>MZQq5 #ɂ\@ ͱ~o3HKƍ@E>6gW>Zes`zehevс=.; BhoA2X_7g`.6> tQ8 l8HҍޟB Bm@Uo࠻+";8X]}RR9([#Iݓ;t0 4B4 H-|kp|N}օf','?M#:fB@2hk??.p6iLe6wt#t=DcRAm 2O4r.n`pwd.r|2 P&Z!5dEyzOvLG4ĺ/T鍾DR|@ BA 9K# .8ˢ *:At,2@ܓ !X J$Aq0-WZ7p)bEȗ PtfP!K>0pa[=ԟ}d8+µMFO y:: ЋTJlqkb>ONd3JeBj ^xvgd&h"b.FDq!e`H=4 G}+U V YIp.; M@]!.a܎! #@#oufBppSp`aX(*!~28sp?~ȝ@iZwSAAʂ"ݓ7@@ B&]bmC>@H\cp7hFuvN"t>ٌBHˠ o ,=-ʥ`yuf6gycՅ*vt vlyLN B@DNvxKH Ћa#$k%ʋڸO.Nx^9 d) uǢ}F4xWoߓ7@.A Ի  ;ԛ|lپ/}sF$;oX%jy>!C@A!d L? +699Qbx=<p_'F'A B  y̓p`VO_fLdOE<8*Q75:D.OF@ BFȠ:F~bbv%&_ck;~w7 thVp 0P fP"t9p|ivf=:Ͽ˰MN&cY5r{^_τ^vP1B^2k.̼z8@֍e -:"|8Yky-Ё'UKpdI =!WAIF9=tF]એo^M-,9lŠ޶B.; 9@&Ġ@ BȠ (3PRL)}ߟ3UM3jӦN@*T@Ā2!d ";NfěN-_'/]8 k1{TH9l;lLwc7յ@fR|$Xp !À53P )IOһ( Yzuq~tSᴽ`~tO"d")3Ȁ (:30K.߿'Ϩ?!ɴU(k8 p ipRL.)J;D8ÒXP @w p0V蚨wl G]:,+U:\DCf@U#pP1B(2 //2}3òM0ű--Ki61 ;ɿ) (86np90 @AK%F8 >1@eF?G}+ +l2 bqlW"I{_,*@!Մh!FBCd:dXf{p`3p3eA@ iu 8zC |8*PclwJԞXz p u)Hic4B95d8X/pIJE hZp +k';qP^Kw?TRqЌ!:88SB޼k> b8l/A;8Zm̓Y byYv7@ \"PN |e!4&6_,Ύv%ocrp0_ܴ< 8ݨ] `åTW=0sd@A>Uu730@IiW\ -p ~||ws 35vxU8P6Sp |.X]Ab@B(2}֮Ͱ-tҹA8(*/uO8p"tT":Bmɛ?A(vd^࠸7{0YeU( ۺ 8kW|+tBh=g 02R?8Ez8k;80A:ࠑ#lr!d eyϿl;2`a(Z\ǁڟFm"vvv8@ye.8fA෿d=Z=!pMNtDt \BܖYdp >O 8hc!h= lfّ#ub;8y Cp*8ɧ,p{Wb;dzDe!\EgF OAp(tp8V p w(_vX]w+/Wz|Yǁ:Hq$? qe288wV75F p𱰡!.el_׳.!Bt AyfSh,G΁eud?Ҏ HUׂ8 Α`[88e'lW9:ܜ,R) Dpp/803+=sw^p0SAL7,ʫ/FԎ]@8@p%˰8T8Ps16Upq-13pPyA!ooT%讅6v$8`g =R!8@1n\!Tr;3X{&;8_ewqбe= vp<`܀feZnޑk涛8㠧l-oy8qrh N4 ࠺B oս%񜍞8qP79wk!vAdCqEXטceZ򗝮kCq`wT|rl M 8wff;D[!f1Vݱ6-iWǁEk㍀&J8@pfa{+ꃃzؔAOeS*f}JG$] WZ,)Vf2ձ,>:8PFθȲdžZt4?`t_!B*8Pr./ydAEW:Y[  8h:B80Ѝ8^]q0__5e!`yǙtD@p 8@!*+vQ;\?~U~ qm`ݔ/!wjtvPjA:p`BY &- 8u/G;-:5Q!4ᆫJp`?̀*mp ףYOAHt|8]B *RJ9i4֗dLZr/o |q:'p6# H)]p Da* 7{+ʲ0 ُqqTn]K]B9A )VpPoܾ[8|S`.S㠭:"鶁p4 t2 \%Hȹ',bASj?u4wTGj2舩.LȠ@t:p? A~l}?o*'VSo(Eq7[W 'd8@Ò pv^" |;Ԛt-J_ud|]T;pнpsٻj;paK i<ߵϴ\Ȓd׶<:V `+bB!'2hpAY8Ė7pP-ͼpz!헅q0l88~TAR,:L(tޤppIp``58vF96u ۴g:(;Bhpp@Hx8@(2h]t hA b0-,98Ph:P{@"8x]p03{*M֑yMApP d Nܿ6`oY!|;ɒ?_]ណﲉ+8t(r8@(3XFt78hFp`?vo`z{p>$\EMG*[^㨂|9Pz8@o Ks"BArp`ARտ 8pKS;;]M'đ pPFf w:P>pppatpW![U:89ebdSߐAy9I'˾Ki1 8XyH-o_NeW8XBBKB *yop@J4jY. 
p`0ϚS}Ge߮jC~KnjcāsX\Uot\Tm]t 4p8,X%S {sKюy߹Cd:t֮;BOK#IDR|ePLBp5:y#e5iYG|4^KŦ໚뎃 e,t t̠(8p%Y.Te5mǎ8 =i@8@28Ri~,TK0IdpPF_ݵ4AooAq=)_88Ǝ䀎n@DFmU8ЫG{ӮDZu@mŧɚս/`A .G 2EGi"0 ,-쬘0cp0DWԞꮭVGB#Ga{䊨`1jGdp0tT2VdnܡtK"mt i:̆穾p@p0ع7 k;ߤ 8@H8$vvl5<4$R;^ߒʮ"@6L7϶),#!d8 Aܳ 8ziMIkvVS?c&i'Q@ȠzEt Y =780˦-_H1,`JpbیfP5']~@mȺ&: ]@Wt c9fpV^'Muf[jQi0]-#8pkFʟ,aӻ"pk~u{oPK^8zzp Qy0yd TOۧpE:6XA{}p0u:Z-oѓpPj8f:@:HNA=Sp p0tA EwȖY㠜O~p6nAע-&8+TVp]f&du:WAp*p9p|ǁӖ:FUw7vuܑ8g98ͤr |dPkiٰ:T㠑4||c{APqA58`}3:IipngOu> qF~Qtm@B 8}ۻ@@΀6Gˇ/ѓ࠮{:{;҃FJpT)tJV唢/8p 6UHq svoLB#r q8@"o%-lAp]Pr^6t*[7:!Et;wqx-9f1'68G:mLEa{o5:SVtxnx Ptd|d˶NzZNVp]=@>tB`GoA wP/lgjxEwMj8E8j yK;yR8(N#@A8pnF 8$,0{_DuˤMop6x;Y :\}?pnd kf}p8P|G8]S.7m}f30czwzl'6kR*K r{3j1ޯ1>%~qoWߌa}Yy}mӤn8@w" R:niA)]EW?is5jVp1te翀t3s@Ԣ8J^p7<8Î ;g7SG?2w*A\a2 ƶSpV)%28huvz S2D3*eAjM\cۦLylࠬ-%vǁQE dTm|ׂp,|_x *,`.pBdt̠0oiY483p`1(12pSLDk?:\w-kG. c p84K+u:ws,8@p#P4fkX\hA7Q |؁oA2$lAsq= (5Sp*As]E '(F )"Mp G8ь%A"ؼo@w\ *d8q1'(3e QUTK2Gvݑ(ɦ|R^HfEwIѻ7ᴇDfZ^ 2-1jX $XNy 8 |!,o=㠺⋬h܀t4t8mpqXUK0)pмu|Ro.**|x&zd@:r2pfPYAv :UjHp7eGNx  ;%8(ߍJ|pЎw?ޯ:CWpvgi* `j8@ [T 2(Elht'@e6 ~:88x{O$o{IDq1 dD %}k~/A po@g_yQ-T c숰;Ttn^"Sd{טoen 8+ndAVpv,xv=3S‘N8y8ifLqp!3p·9Apk+mɠۼM|k鎯q *u %dPA$tbRࠥvfFe1kX 8\^cvR_:=}18P>>jZ8@p@A@rY, ,Ns L dlEK|UQE[mq vb0qx C8؂:[]=[74BJ3]pRnbq:8(^\226aJљO8nj-XĹJ^pPߘlïi);޶r"lBy8sd86fPwӈWIAptb >L u[,غ8xW7՟@l6j:p?GJ_4~~2h8>> b@Y=) iM0Ct+y;^%tX Bo`esZU=4Gn=|Gw:1}f>[vl8>t|wkvmEWrQVĖ״V'8qT!tF@G000@?>Uz)uE_?n+?uH1BFU}fEL(Eg8\޹H0@tȔ9k`8nqou  ;NnFK&wp8`}&9&tYnoluuEl}oBlOE=8mAw 80x@Pr8T3d;)."8Џw(Fr@ UY qpTblip5h:pNzD6c,nu{r,Qhmo 8e,Up@ $܃)|ndptYh**3Azd@eaI>pp58q Vʹ? -~>UcMbq S=tP2h ~p8OrɔQX^Age_B@[A4r8H ~H8D|-'ތ?0ݹ98o>GAW6 OqT!p, %<#Rp8LK:ϻscpгXN9C^RL rb p8p\$$? t$ 8xI&T6;Tc:AQ(q##[f2@ep8$P3qlтp`qퟐ]>HAtdP,,!pp~..5$@h|p0Q7l@AceƵ:{㐃0>[ p8qWӬ˪hh Vwp  q2 "E8([rc/q:2ZA(/$t҂Ab&:{oނ3ދ'okACP̠ LXB9G>U nQ46:~QqEa1s B 8.iǷo{A }6qPzP}U\zw q2@*NpXRp0]_TG?8P[tA Gd4@8TVySu޶ 83S7(OvQ#8ʎO)|!V2 uM{7[p ݡsEQ@˿݈Hq\ Z`HYe;c{3˛ąC~{oz)Vu((O\cGܭ*9u7]%ǿSv p+:2[ /8H4C p9*uo4kqpr\ܒx988]ý t98`vwEfOJ||T83#ͫ P"x5p0:́d8ҕ7\5M,*8(O;-w8/W45dp&X{AN{3{b0&m88 l8'ktbrTDYbd`'2j81iPycueNH\. \ޒJ\ 8؁ *Qρ_hM_]iek䯳qTE* HA tluAp@A p$?t-8XuT޺ez{ q N^=85`ppcǁ5zx8to<࠾H\ǽ;R-AFeq,"8p &:B7v ;Q G{{8eQͻX^q-ݿ;GZp<  ̀.88\ ${Obi-m1v|J6k̙;W"jɱ{ 8G'p@8Ĝh uhuooׁg2_yم1/9pKb@x;f2@:(b8KY288Oq_&a>}vݛ}O[ Gd`fpf8R-pk;1nRz<8m]a*_:|?uWA!F 2@p/ h j=+s݂@oFBNpp~b3F6KqϔE%k7QŧV%Q:"e;Cd7^K +8H xw[,"l|o:(j+h p IX*A&ppEAIhR 쫻8h{`l{޸k2 ԬE :hN: P#app18 I|Q}S@Jʲeܸe:qZ\85:\D=8䬖+S8iRt{g\*l抒3'̣sՁhn+:.wg@ l W&c*T(-ATbwD хAhpQ6Nl|W UCS#_.! 
npp/8lp1˙ ƒtB߮j8kvSR>[4鎃am} 8qpp4V&8p$MH1 "&8X~_ºD `8@dp8v}8Ș3FQrAչl5CH D,Q]QY p8Hetjs2)qCX玁fZg%W0o*bC~u p<%+ܺL{joh΃Kq8npp˺ tNp}֘Ȥj:|u@D {1}8rASJKbt D28?[-88|nv ,+_?(ş["s:PrRv8&S9 )B7,,/zzS'U{lx : ǮL8@p8Vd|8PNi?]W?"ج "xܭ݂yyA7UBeǁVPͷ+h,yyEm9J[#&9XA<qAcLZ:{?ݍ`GWFIygCVtٜ.\uq6GOs8@p@Iq7A9R~}l7L,LZ\p*p8Т T2:Yu·/;ϫJ;^Dq˅l>ǀ࠸#8&){YH6z^a3rA쎃p 0@X1ic@p0C^;6\}ߞjW+lxmmL*Uoqq84/\I[ 0q,܆x_L C?8tqQZq!AIhQ%P]efX/s{|qf2⏷n͕U;B% (-.{ޠڬ,*P|9 tZ=cåe\2Y>!,M+t0q"p%`πgYK\o㠝ټe|1dԹusRS~A1d G8@pZμxut pӢ;[Tģ\6_>{mM\u 8@p8q@80io2v/`Y)[c/cq<D!nJNAp娞8H8_'gb8x$ kϿ Mwĝ뀃Jp)8AAR[ؿf/Yf.p5|[9 &}\uwop"!JK:X]:?!$8( -{L_cy[/ ?9jhfWlhN䀎Hٮl{syWW$8#tHQ:X)6Z8h%:5A889( ^&i`ਂ@pPzq@8X-)8ؒ떝K,m=q^?PxgA^حyZ8Hjiyw#=iL\m[W|0 :!h= [9W/܈ 8ӉtCQ6?y;A;}w/運kt8~νq%#889A;=lzY_ٺAs2LB:\.p`]Y]⯓xvr|q!(OV`dq@8mۯ:  tLBvk`99|-88A4r<:VW7QdLt57 8`8s4Tngp8𹁴?`g4ɺ':+ۨqpLp<pm1pp(W`k`*ݰ++ru8@| 8JJUN'^t ċ@jNs$A ˺Mߚ_ŁO;C(6_u ࠸qa("8@v ̇w*3ay *:gK uѪ>Yřʦf{yrICںa*Qq80^ypPEA pЏw6OϩĔ9>e Ϯ8 pP73l1 p8\"EtIhpbJ{:8F̊5;KpP p`Qzn9]" p8p =:pKfgDnQL pfϞAArq;fЂpp[\%ʕj| T8(ʶ1k4=V:Z6ͱBvv$1 t 4 8Ctg"ppCu`WŁꁃz9Q]Tx-Λ:`thnVAmtKIU {,=%Dnu.h>\EB3ݕAv_)ʠ$8a)G`Į1j;D N6g/0,.5_ս8858P1T[qxqx\ A˗Qw`涵(׳_%8VS &q8Htu3 08(>zh,b-ƶ6 #if%W\x5Rdt8$ )t0noүvΨmezU}WAM>v{M|{/ ( 8_c Fbb7Fgx[ QǁY9 `JUf8 qL&$2_ս/m~mz*U[6)S,p;pYRRq<0 3n Jp{v,f;EXi>&YD!yCߙAj.iS,gn|X:T-ٚ\yRba-1%/R pb ʞ bKMg,WxH gӴWʲݥ] ;zǚ4E$3ʙrDL`,K/_=#`yf* dM^\ p0p= ^&6>k{l+A.+o A*pgZU.8dx N.ut;6?pq|@̀<[T-GTdրڻgП=4_ݎ Ui}D;6E2AְL=6^"Em}ʞG8] zgB78詎f틟588;д=BQ*0V{d 3iSp $3)-xap`.n *Cif ˬ8`fɳr"*/t;1U`vԮZ(yDD1K 8H:p`jj>{2$bpВ\WH48Ǚb0M8\iGd h%pz,Q2 kX:=G:{Ȫꠂb4#0?ymPCbp8`WXC@ۅS L^ c^]8! n;}8〽 8\ĝrɷ⃃bKstҋ(lY,y+;a>Xp@{ph p0OT@q>,gv|w:gO<D!U@5j|pdo.DC[8Nצ$u, >z0'h`玃(p8`Re?prH-]p-}'8M#+eRY5cgbk|y0#: x>8 m{+Ϲ|mv3VrkcW7Ekzpp_F8άa!x2ۚ -#ME-;'e^w5p= :8P8Np@1B 8`^Gb?`635dowݹ}փw|1 ;L_z;j:{NWtQ8`<,&8(ρu:8"^_G |.pÝ`TC=9 }8)~u18өA_:rynK׌6hp`AI4Oe EgP(ݵ 8|(N޽xqp_Vsd[>P9c{"tx/3pHi 8bt38(~ `m=~:imں@]OF.AŅ8BVS%ZCIࠆ_V)Gֻkdss7N_=gb;:߄Gp%<KJpu#pP 39݃ln߲n8x93b-ߏ1 8p\ZEDUvtbHZ˳(pY0.";p0<3;/w926 +p@r_AvDm-w80xd"rьhc$@g~8D'QpL:>z"zPŘg.vAlxb8/Z}LB= )[.@n)2O{,cr濝p8HMARkm}8 *ʋ;QuImJّn2G898A IACfl27~y Ŷ㠛h_~9h&\ǁ28 *[pp8(Hݡ|Z2Zp^[q1OWe)򪁃~~e5 wMq8Ac8Ul&*pn-&+`` ,#[ |]pR],ypwOśMM*`qp+8(gg@rLy⛊*KCS?My'Ƣ1`՘rU'M#A]69b1eκ Ӧ[.z^ ގTF.k m̀ 78`ҀK:21^<&qw~Z@ʭSٛ/o;>6F8qo $9 _=  |XзU3n]e-oI9qO8TpABp% pR\q0 ӱr!wT8꼗iGAf{+88Af:*)AJz's18("Z ɽu[ [P'6ߑedG4=gyΕ>*x^@p8`;f ٶH#4;*Bz}Muilf`y]q3888% *ma A}9=*'tߒ8x 7Woeޢ 8X!ppqp2\M]h0o 8'3SwH4x¾Udr 5&Pn#/-M3pA۩/3*#IΞ3%c|W4u.p}7݌8h +_WWa w|ǐJhu싁 gnp0j*mO,bʞZw Qq@p8ut/4fW|o 9!'Uu/JLfh:Jp@ImhhjV%{A8dSnfv\|YfKz: GV} }GtND^C4)H>jjQU\͛/OҖ;J.z1+g p87NҘA8s*1'ҁbNCɢHqPX^ pxYB:bBpPI̮TގmW]q70q0}Q-W9@ p8p 8NO2R֝YC5u-ݑ^8w]u`qpCA8^ H!v2mM{vyGf^O.ј:Pd p8 UF.t(u m;䰛 nm-48A]> t̯99 p8B,! 
!ZfW򉢃pWDF{'a,8fuKS^r¼"7 ZpP8^〨 N~o+fp0u=tXdiޅ:Q&;uV߿ѯe(G'tYѬϲp@ pTUK(td\Yć7pP,W`?`tc*b6k K@Ӯ7t-ł(2pfUK8He` 6dVNAѐmUֽSٶm禤`v Jth^PׁP8`~ꞇ ,;Cm]-7)G؏t &LT_cLW@q1wUq@e$L5p 1sw ` h[Brd# 8'L~H/;Y%e|/7 &cˈPH؝B TTS繉࠭VG 0GNp`Ԝ8Oۻ ` ax|AiࠩHQ8@[GFQ2`g<,/ 88pq\!u%p_MI4 DBuA~ Y& m۰Z+p0 {pPLg24z7V69fIc @8U *%iJU4Aֽ$p0)qn8`22`wTSIG#3HJhWHY4g) *u9$Owp8Y̵p;8^'p~̧s?4}p4q 4NAp{zyl88NWKn1_߫Z=rtɛ8(Vtڀئpp8|cNt68N]| .d38J>7 = p|=ƋX+FSlLt6yZ񥤪XEe-Ah;8j8Y$۷ķ18@e]Z/n⠎^p!5at 8 q@HJKGp'ބd5OW=I4{@'q6WBp@fጞ+`pV W=48_nw(taiyBw$#NQ918 q@Rm F =8h[fy-8Aufm3] p8YWu8iOi_w {MTfCV8@+B,be Q@V$܄ڦKe3mu!ݎJ20XFyW df?AP4uC VJ&Nv!q$hV -Alzl &!8xwH~m{8Z?DgN' rVƒZPp%ޟ7߳3&|֎]Mā ޖyva98&hV 6tP3Q}^r lM5}A12T''yüMT̈G|-cHPm1ծ_0BO|BN߾`RSqAXp 7Un|pYhV 6 d0WӥAէwM[mtF8x ?$ ?~ҲzH `y 1(&s) Np F~Yn\_՛y־`he4+`;8~2hTj@AIࠈ*t?$u'd5,4D0;8J&aAA8XTCVsE&>}Y ai\m:(TSR-o&e<;ky?#2m: 8;+ Dc_gi/7Y͗"m5ė4UI j$D{ \Yr__|fyA*^O-m)FE\*8ΑQ81e,KNq=W wƒ:q{Wbp;>ջ٣hNԵ{8Q6C;8}i9`[j~e>sS`+f+G"U5[yIrx&jL F67 .&CrZz/`:>y8{Ahp>'QS$e:P_ae~o ~3q0.8@pv8p?'@  qxq}q/GQn'Bp8TO^9yWJqg`VA][\/!Vp8 'ocUW`7:^eQNn \S M߀ ɲڭ*mE{;̏+؂4epnN`^`8@e'`; 4 jpjOH(2VMYUO+\N[8'|/cZ0&ڧ28C{ T&z ߞ8vXHl8p<8hsV'8.zV -`’оz5C7*]p/ $Ӻ!eyȬJCl*qzgkAHPv ,|Np6_וY^C@zӋBgi"L+As(hs~y'N{5,Xp@ C&qpj^Ɓe#j W)tDY9$d$mb2S|BOM\.v3)V \\Hp*q5?-*b8.d9pesr pв\._E\e8X1qsMvsEC8@{E r1،(M?q>pQ/-\%rp(gIptq^>+A$8t+\݈*.qYRqKzJ %h }%F|HĒԠKځ IjjfFj3Η{-B'9q4?ý/16}*Alh8`~AJQWIwKz&f9`]T?N<1$q8&~zS>vovKƸC O˨(\JO t]:&-(W m:8RaW~p8l /hxm&>f=+@8&*~9LAAlhࢻ֑7E B< p  ࠫN=l83@lsk)8a880$փp0{On5u۽C"qpf:(Wh/HwQE)սR$p0$N)C E;$Kt&^k;0pcm%V.ārZL曳iHm 4xxlE bm{~3vܑ8x܍z_v<%RLϴfbp8XHuh5l μCA`58ƉGבґ8y\y94t܀=74qI V |p>aAHs> MX؋>P[gKnꭩ7{KL .WĚp8`~& vl` &j2C|K|oo)&.mtt_J)\@]`&pZp_p`N 85|49&#*kF+4'^\v[spDSpWpg' Z~MO؉Q]pWWhO|_boLmJՃA88@ t!TMZ ;E ̀4e9VE_붸%*Ad0ɏMsR 8@xChgYA9kA93q9 .nL"r -iuNKPG]ґy;8('"j M9qWK<)8]`CmxM}Ŧ[.ڌH;OĀt|֟qgt;tܼmAZZ8Hk=\ /+vpvR+[Az"C$ \+ AVp2qP{G[%\(Ag (It%%|s>a/l9l^*38N{5Se/U֝ikj0/ڃע$A$ҕ༤'J ڢ1.]CDzq}}SzJ|ĻN2OM<8"8*W]C$m:g=p$'aC-8]-l01q0.S.'sQ|bB&<%*^57lh"4h5tjH6&Hdk8(\Μ="˩Òn8,;$p }cgKpA\9Vܺb jp<7h8*€i|esH -A;//lAicY!:ͷ888?hUAMvF"5[5GsG>ok]rS.YuZĜY GN8ZMo,]5f@A78Vvu5;ppv%>P%p q6ݯZ/ap'8H]4!&cF@̞.F88xW ݂eKԛ8M’є,UA 8@kYaQگEAt?Cpp*WT6|kqs_R[pPE]g+[t191zp@=3TVSu`꨽mV(pӷ}8ӽr;7pر\Ͳ{ p{8 q@6]m8Μ376kq :gQpblltD8oŠ88@;Fr^lp88(AMo- U كZL/LI ,ptcQA":mcdāZ)kWי`f pU" &!|k7`a pc1qL8@Z$P%L 8Z@ߜL2O858)Xlف ; *. -.pnoɾL?@=^^~GI pVbO09U{e/:]"X!yGASnW(ݢŠfeƈi%U:TQ8 pyrTL4^k$&{$p8@L+Zԏ98п6pg;_|tC~N/)S[88@~4Q7< 8h6䫴J$?H8iʱS[M[`rn-\pqHC˗ [ Y L y ar_tPQNLwF_ZȷIi485 ,)8aဂF1m% AH 夎:@oeo~|m#J(̶l$ 9X `&j\yZjͷ%98rVx$), kU#)U+Nxk?|¦ `Y_Qaz'Ƿv4Rjܩm/NَqBK??H y Yxh8pBTR, .qׇNp{.R+סҬDK~ H`O?|®Q' !fuA{K_*U qoQar^}49`KTzzp8@q$'l5HIGp0&UX s?qgt8t8]) p'lg w=&>^W08}u+(7P[tcp 5SP:7q|$p)}1` VxۇufO" ,OV2,|# LOHו|eբvF+A p{fɦY Q]y 1qJtq\/]֌y{uY@$ 4T[Tzp8 AuC TY; A"vLyd= @/Z=,1&TaXE;c>5qP/^Iw%}yBmట !D8@*S''aXA(c&.FMі^C"Rncn'V$,1Ud; ֎ K@@5M';$orp N* P;ėXκ(JP>8`":8| Vj4qpd"*':^. ¼JE[|0QY}Jnl8-@<p`o2` jՓA1|^+ң #ww93 ~u \8jb qOp/) 8M{3Uk<5k Z3hGFY*[NzJpeCp’#p@q$Qawp`2cUc4q[I,5ip- ڢ qL'P -ͯ=G:/}煥A[ 8Οq02 pfePS4ϐ`Ai߻ip#@|KI@p78Lj@l^LLCp`8*maN ٪j*aBEt^N57b \77HߓR̥q48ώbA=`]8pU-3^61A bRA7(^Ҋ*Y_ CbRA*06M$=g#OZW۸~KmXq~mJ9(C \Z%qglE84*].A@_iWN8 q8@7G Y8+m|AH]?~DƩ#g}}8(ѧ[ t9};#;w ﬔI<+Bd~}{eZqZ’8@# -*P]$*Ӥ`T'!eT-=y\ y_ M#A֡ex\7 ~8{g | pS~~@ vR{}׃[b.g]p0P5}Kg80ZBlÀ18 SAp5\ܠ6xZpp#ZO|8`9.[}!48^ , Aw}8ȷGBd*8{Mʔzp8 mr&j΀4x-쩈+#W:qmq V70k|ӭEDž8@|U`~aAZx4 ߤ@/@n *ZG0kQ [JJ=| pF= VEm5h7@+&ihѦBpVDdk q(08DU@!8? 
M_pH{ V] D;8@#(!Fc7P _K +[ 8'6Y6?%q8@8p,Ѐur0qW5vM%v$[pPP8(W҆2t68(S%  38xX8I pP "ZGmn0$H B68J@TWJ8~Ɓ88Xo#yap@`!-aRepd{'NZ c<_F 6|[8L?p#j&8L*aC 7k @F"ɞA+#`*,%e$qe48Fj)o\!uVפ'-m!$`DA 8PV"(b\ƄAcu6;ߛp48@ NoP8]|)@V,AnjUF[*8xw$p- 8@`A NP{e[=v i>$.6Tj8X `=pV.Vfp2SkKdz2k-,^8h OuH?:,c$6~i;ppGh\'M 8X.vDA 8 ϫ޻U;Z-DgATpH9b=nGpA R>Cxzi n=8AHn@u۹2Ūͣ ,c)8(9H}O8)XA.DAKZ9chp8Ɩ1~N~U!J_ր Hfᥠ8@"$q@mћ4,TizpII5 8@oj'$ޠ2 Y$18-h- nmHNL!ڹ0п6 G5@u' <_6BUI;gZ`8|R GZF N8p6F8G\A\z*1:^Ei[px#qq$1hi &w4$yW7[׹l (n@$)hp8B@p0qt8O 8JBnA*3&fj=tLA 8 qf( R ǨA;&q}_GVv-[j؁|LqR]9p8V瑃ůe +w[j*l1:0 p8@=  8Ȼ<O$8,֒}-!WRp8MeBxdpŝ=Jt^"$~}ŚāέlW'r 8@H`(EJ Dr&ċ8ʱ!q_8l8pͭ&1e f$8@AS!Q#4Q8H> 1V>.5X*q:Бx޻Fp8M8ptPpL_1ruV|\FO>zBi GH 4#F* ^#9(Fi p;K%Υk&I7 w"qtbB0oc3AR=қeNTA IdoО3Aov±Go㾩J )eI p,񉐔UIkͽ7Yn5jeF:j 8@ zB58SP@QB&%f_`7v;zoq636Jp8@Q{: B$FY}` ԡyJale%WXvp`K[)G! 5@$Xw3Ɂk8{on7,n Cp@pޯj@G%9hl"W58h*/Ry܊o18Y .H0>QțR.:?&ݲZz8J*C '&q&BAHZwDr:$}麻[pqn8h%װD|Jk,ĀD#q_#`¬ u-8I) 48[UNcp&eQ\Iآl I\:QSC3ZƋ'Gw08x8tmi@*-֔,ltBQ38I RWT; Y]KV{EaMUs?CuN@ Naopd+p ٘ wƭܺUM LA/"lco78xЯe,ҵ氲nܭi) {vp d"0=S!p58Y38jpwr.6;<;#qXТ2%ai%q}Tm D3SL6(3Or{? Nod귯D'OnRܙ&R3F1$-x̀ gG (dJൌf~㑆 ,v|y ݼ&gI{%{F$H 8prҀх'cɁ*8+SoYRaHmGu-'8@TQ'S(3ϕp|$eL-s%A.08e}WtWqpQ 8=pOQA48Sa=%_4dWfet3:+TGO 8@TԀG! 8N?3 tsAuh?.՛K i_ā&Mu,Q;P'M P,jmsTrЌ!ۯ+Ґ \W(nIR[pСQL(ijy@p0㮼UxpB2 DK8|8QਡQPTj0:ui2ZPww|u^ 4 sVxUkhf@J ~Or |B{{i' gf [U*Z}ΆP@'S&)C\r* $Y5xsVov(Dw!H gM|24!` 9xZ. 0zm`2 Ec8pNQђ8 q8@;}J*\WҽIh(q&i_ ~ω8pe8Kj(APr<ZU-0XUH$gwBeeG~p*'q8X,x5\Wgg﵌Zj;ytg ׃%Ejʼnջ5N]280%/jꅠO`k}¥3n^W(Uw5%KāS 8@$ (|)9 jkQp 6*/uexI`mM 8@NYBy<8xNRn_-l $ p0|?#ā['8@$$CԠYWr`ZFUp!sw8μ_7(Q~ X"q8I;hyԀ A ޠ U-07ਅ`]Y!{Dp@Rj@A p]UQ@i% }\@)q0$*99|HQĀL Ԡg;p43άJHL bp!=A,vpC6ƾ|ozHQ?I'UqUX=&qo1L2vPj5p/.Av Ǝ*|;0 ~$y)oI-ZwpfS 5@PA@r`ZF p n<'qp[h&S++y;b&XxW%[!q@2v̨q85._ۑp!Ҵ 38uA`qWPsh88`tD @a y%So&@ìS[Nt}uQsM&e7( RRȪR: ]jY*( _] v%gWT҈D8IuQ( hʞ*ɁkUg_veXM+v˧5ʿk:Ob-A8@3QԀ j'5pyo$,q /wpp2=p*(Ac 8@?AA 6^drо9t<`88Ql|TY'E?|IKŀ8Фd518 8LWq@@p6j';T+9q-4SU0:PL=j<2OT{WZ1yjиGp`pR{@ j%U\bI]rT vW0U-: /q 6#Ԙ$Z5A{XlZ륚.NsFTn^q,8 qA7Ԁ.PXS۝ŕT8^C+Tdp:-^ע\'pPb.!$C_ݡj5fї+ 'A^¤ʐ ['>p T?C ~(]5p b<7^ e6R]eC :U7%jnݨop<ם\ ^TciG 8ȶ% q~z-J/EP炃uM|>p%j(ۯ-G% g8SOf jPtr0ZNp/L;p \#bljkIs q$s5@PA35p"cUamā73yqagWN8@āNSEfdʰ-q`PPŢ AG.p@pj5<Y;gnacWrtvA p@pp8P]9}ưU ڸ=%F(q`5dJe#b/P5P!CC*d8sw,޹SvG*Wo8@ZyRu"xdj5<[rPk$-Ps57ܺHX&307AJ{R"q8@>jP3Ω l8 Zf{fYj588&vpΗDJ:(aq[(`n@Qn֓Bp8Xj?tr {-cs8kX 5ʯj-,0`l?H 2p9WՉ{pNqRGuU&qPN&HUj`A 8xN_O`kW<Tks<W( qL~L~p8'gft75r+瀃poWpTa18؀@1i88( pجEuݯ &W[%I}38 q8RP@P`99YZ8_e`q*d#_8CPLp~Q:@ EŦXJ-`* NJ ]rCr5888`lNj8x=˕ljcҔAQ m@Chw8G#縨 A Wpqmu#$ apP<)D8`l2rc5npRU8?j>q.;87 8`LJ @ mlaM5\s*lQ 4+]opоav[MF!S!pp9"p"fࠡ .JDC`U 8Qj8pNl!._n-quڮv&06}+}=BPJlDg5D +]{[O 8Ok880#5擃aw\b_Q$MnjJp8Dj86Ô/B[4Z~H8 8HzucWIq8`!p|X?Sa*jJ3TnujcU!l8HEp8jL80#58AZDZZ8׉8jfH 94=j5&StE {@t#|'kYw ^Np8)J?A3q;)jp"8؜q8vn{T@p\ f.(23h{.k <]y$Ig)7hƻⷪڐUN0^˸58ed y%. 20O1vl EL8  8H j80%{bkBE⠹V"q'p8)2 9/Y*]aQM%2Tb58HA%8pO}Rrcj`I (kppz9nnA#kۓ<U p=5 p0I \{SFJjv H@p8pOA `91ͧ[{'JMMXf9$$Ԭ8Xr0a*".` #PfQdݢ4=j5@P`Ҏa_J!8X18bp@ 5g pp90ww8(oވ |pd '[$xc8E.jPQ\9EbE{bǣ Vb58Ap@p8X5 PcziPuUdv ^jj`Zop.!h88`hG 8P`iZhz87`A{sz_H(g,ОJPA_95sZaZe pи'DU[ (xTNABYijԧ]epPqȯe lkR8E8P@h ~Y=g䛢p?ODC‡[v% 9[j;굡TK.p28XP p8]5@ A' 5ޭ`Z5Qe p8BP8jeY\Lz+N:ϯe$q`JAj5DP}y?ܴ7Q R#qp8p@!!ۼ16ۓ58*Ӕ -{uFI8~-#!lH:SWe[Z 4]E2Dy8@ m^^IB/7n8x%80zV;5@P NVK58ȳb8Z}jj5@l,Tٷki2-ghs8pZ~<ё3AP@PrrY͜AUXڛz~d8xC#8@Zp78 jPyBg ~ԄSIR=tāKr,p@Gre>jEP`|š:q0y~>8e0HCWcwHyFl84j88%3iKrmʉRbHds,/3ҚG6r A ȁ〶lq,?WZ 4H⠩h'PD8pWpTQӒ8 q8l A ȁsrP%.-Ag,ll8 6Lުh@gH8^  5 qVTj +Z&p[M8yK +~)@<8s' A " P&Ic}X :;NG-&* ;'pԴF&MSP@P6j8څ"IUE-'*@@ā##q@8:I72AP)ԀjpOi2B! 
UnE-'O 8=Kā'LQ@ rDsF}!Lr^$Йp 5@v$]wNiE=DVkB`g8o5@ UbQrȓu׋PԷrUj9yppcREj+HבLZ%N^m$qT$R&U5 ݯ݉8/I-PFiUmRD cupj5-[P88HK$!oJ|A^8H?B>[8h[Pt48XCV,>J$qӬyf-%u*~p(T A p,b/K$G$UAm /QxB! 8@@pN.Sz{S")m:ӟqs.?7TXb-k@* "ɭA,Q'W(Tp2j@⒃j7; KTg$VZ%[ǤR'Y5|4WIKCe?+ 2V p8 q8֡㩁p9CbX >W3r#;a!ȵWn@̆U<:&U5 (^HWS4=Bi4ط%q+GeAýяہP8OKFW?M #ӪzT!erkq pM?J'!@{0&Z5qۚ{WX~8~58أ0玨>A r{!ʛe8خJ\֤/4՞ |?J '!@|Du}fj\VVJσK9؁#(̛[P8+YF~Cv RcF`] l?J)" Sj%#8*WRՑӲL5e,8-8HBP8 W VGMvn3q/QxdяB&!@+G^mvYQ V p9l櫿ƶoL{"8[ q krP\y@ m}Cnsmn8@( \Bz{r`_f+$pwζT?/qP,(RJ 8`, 5@Y;ppl6mL6 D ppnCp*jhRcvร !<8<7 jnȁ,)lRBWD3NKp$+8H) puP8ؗxjxk}%8H9o k`Dp](.cs\aq`rh ?psK vJ 8P(7FXʹrpD865 nl7bāƒ∯)āQ^Yp| MөA 7rP֬vyg7Z>sZp|>Stv!#FSZt0˰rSփHyN8sS.G ȻGI B؈fOUQV-rΧ MGD 8p;!?r`bnA^i1 Ԧ 8X9S9&j@pPLj@i瑃 {2I7~V= bm8@U'A rgg?pZ܊4BM(\Y*Lj5@PxR/u{;28vNp[)Lj8 5@du85I;V?pP0V \8 ʱpXK5@p pPpЗpM7pN: A pLy2ۓf\VV =l `8sAd58@1mljAO+%E KTQADCQ82't z A x&)!pIqv%"*8F!'8أ!ARxrक़VhR'27cߐvpkǍc%5A ACȁRG_B!47ŧ}88@+p@Ԁq)LoA x'jP࿺sd4df,)~eku 8p`KλL 88 wuWh"=ym >'0.7Lt 5@p j5\(:R+4H_ٗ8m52pG5qB 8N \hK W_=n1}]" 4`\ EN B@ G.e9fG4LcE|pN,> d@@n  88$>8el6ůqhv pkOKR6!1`,qw'=YT-|WH4pMo8LYzP̨jY2|'WmW90A]kL/K= 8k9=j@Ħ_tppWgLp=8@nx"ࠧotpj%Vq;*];`~9d05XͺMfCPǐ4TikY5-S[mp]{AlrsA3Iht#=S!"jp9]Pj|ppUߡ*8@Q-a<r +A_iudijR,fT ;*hB)N"k`"h- t~vSC\zjj+GQ~΍@ "=o|vX׌o’G5'rN8@0*4qIڶL? pS 5@ȁnb#Ueum1q;WssRTvfeBE@A)8nR4թ>c7W9b *9 "q058uLj@QIz^h(/ߨl !8@$òt:u$!; t#kgqYHm*QEߢd,1ҩsn*G r0RޫmvYz~Ҋ۷ ؉2 Av#$ d>5@p 9UUk{hŞV˽GȲ_Z~jIF9r8S 8@t5D)r!!br2 w#AS."e*d@=t0lZƝ/>"q`P9zcBEP'aDkQVAi jG{  2l FH l]GR9#B@ZFa+#%pfS[ M(MM+[b ƨA9&TPjJȁ_jdz.hAU@}:Z([XTq +1{t58P;F`- `b79Vvv\O]N̉<5&qH M-Sd $)fP鸜^ޕ8V0$}q8@Aƅ{/"krP!/u.R8_F>Gn{R[#`n"j8 maI{!8h[lp4'WQ 8@[u5@ #&p0V̑tb8ЩgoqY$t^_P19( wߗۚ|hW~&A "qyZD PBɁklhnU'k%Tns~1-~*̧PBȁtc8hikū~" Uyx9do pppĶPB`cZA9;5u?78ZF  ̧PBZ6X1OXUmg [d'wZ8@(0B (XڲpF/8x7_4M)cp֖.&oPWb@ $' Uʝ~GRyit(jP`>@ JdDV bF۳ٶ4Nn)M{p@P굌妪WC=?Φ/bFo;euLj8@t$88؏pS쵌Cc61W[ ^nqUu&q8@7h@‬!@H ~N!5G3-G5*@ @2uEw?{~Y80ޮnH :gTJBȁ\e,7]0E?w*FS[AQYB$I3Pu8hl'FePq{")dCV?$hq:\.t 8(t 5@ ya2J)Cz˶ǒ ="^1}phQ2` <(%ʯe,7]YM=I޷ &Ry8YmrpwTI8PmyX8Z)Q-A)C\jHBUlI֛~ rcH/߸`{_L>Ktp50B |P !8@mAn')8YyK jB]/W kLP`:@ @L]J$ 8C!!%*ɧD6< p)j5@pW9k_Oc^Ќ4ӣOT/pVƋ N!`99x`*IRVץ܎8f6.HA(Ď@V>T8g{=7iYAKPAPj`eLn1q1D]6j#q8@50B @擃,3<LLYǍVY:"DOCӤ>c jrȬ$NI+8} ?a3pKp@p uS Xzhā%+6 pihU‰A @iORpĒ:(8WyL98 j  ]7Ɋ*+JCs/LO]ҵrp8@$:2QQ5@pܓ2kS,KtU5I8VLg*Pjr઎(B/f!=P䨂ā\.?rB BkjTQ>r(=w"q`'/GL[]쥓>1?98@PuHIU̱4q0oӹLi8(n<dJy?O5@pp:'*͇zĬmދ٨⧰#q<.?98@Pjc=8eamp%-OCg0"BP_EqZ7.HPGO,mlʱ,.~EjDr \emu \ ~B$vԀb) J) 8@AkoߪpYeduW[7$ХC 36[ j,TB / YH dr+P_O"j<-)ӑ*@ BCAk[0At,^AvVH$^AA !JìX@p$G|N7p*w +Pғj#275{ 񠃍Ήµ3'|P!2 pOРЋ^Hįe((rҜs  h;{6?ijE'Kw5Kv0/zMk0G!"Prޖ8S7&O#:bG+?ھ=5@!`9K7[/עs_hA r@<˺4u~miU_ЀG!"^z&SBp9P2RQ)Avg684rބYrh 8@&ͿҾ7uin*tP: %`8D B$! VU@iJ1[}.ۛyB(B5 8DZRO~|@ ±p>!8  jAw?wPQ(f% ?8`gjPN +[}[o-T7!(88!!8@AɁK}M;v>Bj@Wg&.ߙ71C 8|B1nC  B^,pe;_y j\P4rKA^N BpBp9J*T`&^t ݣj^Sb ‘YK١V 4@ LPBp9Glݖ3&`zdƉ !!8X 'D?P; hPBp9" scUPjt7ikp,qP;5$6;@уb ݎ388ޱq+4OK6``> 5@p 8*;Q]' ԀBp9Py߷Q?(>z8"jBP!`\rpxO{p p@!5@d B ZHM xAZpR ]ԀBpN3ӏ*8XԀBpgm8h82' V5PY9kGG18S^P@ ZƉfVmX!(V !YB2o1{T8aQब!Gr01q ZmH{ᓚ|yAT/o5Z7< ! 
d B6;経tG~>qvsT!#q|b8Qa}Gݙ-I{!)=IdF}*ՠyBjUxR96*8Q6l>ؿ5S A8 w9XXF1_޿qlVQT5P @9[<\A8=˛AAMj<2vyk8H/X5T@5`ُep_3B8}-3jʯKn^V:pٿl[r>|{ lZ5 a`Yv  c9hSs> 2+$?8 T@5k+>qAL85A4FV rOKFsa;ԗp X5T(oSWXq(|u }7Tpon~ZٴrГ6u=<ݞip^pp*AS @8Aoきɽ6=vE5P တ`cm|:q0>p \'h B5% pE7s@R W {&-S6Lz׍Q T@8 s93Aҍ>Ϭ_&6hU^Cp@r0 owC8xq>]Nl 9IC|\ P _' m^(`S۾;z jS\7ZXFס^>@ =Dԗp`7)w+A5P ဧor88"p X5ඉ\Yi#5PT@8 i9cs^\~Ϝe{ Ur+D͎pPU@8 f9XAFˢo{EovoDLk,67grUVMl>$`)W c\n.Gpp|SRyr``T3(V  ԯ<  |,pcp6Abq0"w5`/T@5~{zX947Mʁ&|d/B5NM8?Oupp0-qmhY8P Tخԩa~?6vG8|W(ev4p}3GvjM{Q5P T@8 A98y2ɽ_as`rPKՠZ5`Xg`U5ဌ~_ y^ny䟍6GS3' mgef`gF/FSt!߼j*<*V̭7j]@8@9jtx8|$zG8x DޘzɫA]@8)ڬQ-۰pPeW]0oVv ဥ_d.m1)G܏en.#ELg$]5Q @(7):U8H]OV qs:>pD5X] @92 P`׍ԏe8 xрKT@5v.ࣿԴ)+ߎ>]8HXFO_lq@5n.7`_ iˁOΪsgZJo T lrcgm|y,omey}BW [b(Up*mG/'㧀 cKx~]dՀscoD5P ဤcz`1]܊Ol:Isp5P nOW|qppu3?p cm)0u@5P T@8 c9(}6 yd6'z`?q㉑_<~4`{q r5T@8 plק$}ax8yO/UjIdW@5xkro2Tp0嬵d i^ VDf@5P ာc_Cp0c`GrU8\Id@5P ǔ"8.Jd݄ ,8 8P Tՠ8/Lp0'ڇ'X5`z4׎@_@5>q^{|Io[ܸk",8(V X ypTO^T I9Co^}-/cBm| @P `Ǟ1:q.|[De?R`hj/-b8ίzt5k8HS8hPU|\}p@rpo18c㾿[_ cw8V X03VgTG eИٯ{pݪ+fX5`#Av>-._Z܄zՠ>~56+/,oq֍kƍe4p@jjɋv `狙&HXFFFwkpr,\7fԤ A9RcoT@8`SrPr /@`U#@5P ]AKji헯x*ՠƽ]8P TP]-d=Z>hV-@8XU Q5oH|GIz Xbp=r(n8wpo)?r9q_, &F`d5x93d vTmI4=9P~$pr5++N k63M283 bj`@8P-6$>pVLp5kVdD''P*z AW )ISZj>^ 2n֢Ajl Lq9Q#ׅbw܄)syѷW5P `unUٳqDM5WXƾ$p`DDԓϋ k(a>ja-܅7/濠ݷov ǔbcD.m5x,oE<ٺc\miη;op-l/`u(x5P `X~¿w}د8xAkm5ظS;^ϜgH6P 'j4PB".^6—TϿf's7<[w,}CQ ?ărp >[x<;@5P v/O-i8SL_ j-{xƪՠ8jSI7z]Q?}l_ o9@5xaW>K_8?_j‹iˤEP ,ϕ/] r_9qՠ>gXT@8`t>x!~W.̈W*YAâ@5XQ+}>2n\ljpmlTT\`c/|(u^3 ۼn%h`1prS \uGh7P ll'G5sŭcx҅6T ,俲 \7xUS;TדGE_]`}Wd8N6Ȯ~p<@\W/ aTXGw?ɻ:>cjhFွ&E@8k<z+WϳǿA!G8 骁ϗ@5P& cn,y6=b~8*@8`|נ n= d@8P VArptZ7Mc_ۨ[Tm |͠$5x9e[[p* &? 'G:Y6U]"0et4tPe }j7b(v sʿٗ9A7r;bp@05VLas//7?;l#bωM5nP 2U| N3M'r/YCLyP(7Mj9̰qRX5T@5<䱌MC/KÏp@jPU@5vP `Mg~z. |sUi@8@9Xmqia88^ l]6P A5xj9XXƾ/̷-I.Phpk5vP 0sgqP^SO ềp@jP1Q8% {;IAfk*bop>[6&Έj<x,cY4qF p)Ɓj(o)|x:>q|DE8(8P M5TVY W8j2% ~+AK>,A#p!Q8T 2 QMg6p&fKk P  s5(Ǹp!SUq0 tpj>屌%A;7zp$,'U@5P R?A7K3˲ꇘj LvϷm3u_Ş,jE'xG{+-/U?2d #@84e4/7:r]R&nkX5P V|lW3'$LUĶOT`rP2 n`々#bobjp{R =B|$c^x{Svo&3ތ6Ip3!Fz?F8Hw4j@5P GBs@97vFOp zɻ.U8GЃ]nခՄ({B5P6.uB88< =1U8`ت Q8PT~G% ?An 0ce(pc(K>8zCr5AP g>:[G >"y5px5Ĩ6|7mFpMv ^%Է{H/Uz0ါ(w]Wj猔/wp>@v /,8'n8yj|hq5P +1AY1U8عܭ18ВW `guRv = (7z5TnBUV#T8ဧbqX 8Hk r0wIo8ulpkk pC6T!Heof (DGoApA_q"ՀGmp2֡ȉ{x Wmi4pk|v"}8s[x] <{B5@8Z9y\:#d:ذ4(}\w9ws>WGxF4P @5@8`rPWHߜO>r]LJn`j`8=뜫4@8 wr7'峷(};i\{c (mrp08[~Csd|YοN-zpO`jPe+>)\xZ&jl(|ڪ(|4vDL-<WT ဍAY1U^.S ~>.fCjp#A]1U^B4rDd3 m|Z ~6@8堏~1}3ͅ'h(|Г 0 jgc}p {j`Ǘcv];F }YI'|R v Pj7F%̈ cnk`T(C?~kOޘlv6Q?=@5kp@rз*_OՇ #rl <6v Gr08ئ{.^֛u.ƅ(u9iNwqQtm5ppUlTڤ?*2;MjM5P HYbmɑRoG<Oįnk`C5@8@9XQߦ)oXlsv`h@5P P6O`}AU(Fՠ[ sǹ}⟲>ў5u`@5Tv ^ΥArЦ)oXr @Dž(S?!}_pmdx,p_{@8xw5@8@9{-#pE9ՠ@A8P ?oT ?j9ruDp,[O@XHت'ஏ$7^+o8#/6Vj,pPjC5@8O* ?tJ8HXF  B>v PϽ36)愃@8HX L|T5ȸk 0RvzL ($)A2n\;M ox_5+.A b㠾ATr]OKnt0KAj`&䫟k˩ t-~]n?Tj2.8|v\jCy)/~y(7uGA՛H8AUP T7F0 뮰 [5hp+ؖ+u8HZl]5`r0=s8XF! @5@8@91ppA2 ܪjU ,kpT (_?QQ}gh N5hU5ဇ” =|e|S@TH^ό6~{pb@5jprZRo Y9^.P" ~]xOHx9ƼyTJ2ȻqPDr \kc1å9|,pp@5@8g+e\:c\)p> kǫvM5@8@9جԴ(@TG@Kx6Sس4l\ 6wt X9qP @8@9ذgVչ=G3 prV|*2E P `O+7BDem D^u?t-yArPW TP @8 |9ha]8qP T U7qWVTPrr 浨p@&E98q ͸qhJ}|8ͪ Tvn yUP @8@9xp#zQ @5@9bqyB5r#M&/iF<2x" (Q7mqWC ^8dbP(ރU7f0Sroㆃm-^p+?֝paA9xN5ܨ- s炍o!tkr@8`ڌn t5]`rptL=H4qpO/Cی G}ݪ 9ՠJUF||_k e7XqUxZ5͘'I'`p[P @8`=یi7z r09%Ȍ39M5U)ޫpi}玪 r0N1'xB;kLy߸@T@9z`AjL.uJ8vj<'~4T p9A[1F8H=*ZrL6rOƎjPVh 7T^k dXx; 594F}&/g~4/L4T@8`rOO R~ T@8@9ȵA?<w{prp9& AoLgWpߕh`ZyIAk `Yg j9A1<,8Y@8~=q&Pe,ݯMKUp3[2<D`r|pp{wT@9qz8PP┃.Xr@85[olAjp'e<( 堄8P @8 Z2Cͱ. 
npm_3.5.2.orig/html/static/0000755000000000000000000000000012631326456013725 5ustar 00000000000000npm_3.5.2.orig/html/static/style.css0000644000000000000000000001300312631326456015574 0ustar 00000000000000/* reset */
* { margin:0; padding:0; border:none; font-family:inherit; font-size:inherit; font-weight:inherit; }
:target::before { content:" >>> "; position:absolute; display:block; opacity:0.5; color:#f00; margin:0 0 0 -2em; }
abbr, acronym { border-bottom:1px dotted #aaa; }
kbd, code, pre { font-family:monospace; margin:0; font-size:18px; line-height:24px; background:#eee; outline:1px solid #ccc; }
kbd code, kbd pre, kbd kbd, pre code, pre pre, pre kbd, code code, code pre, code kbd { outline: none }
.dollar::before { content:"$ "; display:inline; }
p, ul, ol, dl, pre { margin:30px 0; line-height:30px; }
hr { margin:30px auto 29px; width:66%; height:1px; background:#aaa; }
pre { display:block; }
dd :first-child { margin-top:0; }
body { quotes:"“" "”" "‘" "’"; width:666px; margin:30px auto 120px; font-family:Times New Roman, serif; font-size:20px; background:#fff; line-height:30px; color:#111; }
blockquote { position:relative; font-size:16px; line-height:30px; font-weight:bold; width:85%; margin:0 auto; }
blockquote::before { font-size:90px; display:block; position:absolute; top:20px; right:100%; content:"“"; padding-right:10px; color:#ccc; }
.source cite::before { content:"— "; }
.source { padding-left:20%; margin-top:30px; }
.source cite span { font-style:normal; }
blockquote p { margin-bottom:0; }
.quote blockquote { font-weight:normal; }
h1, h2, h3, h4, h5, h6, dt, #header { font-family:serif; font-size:20px; font-weight:bold; }
h2 { background:#eee; }
h1, h2 { line-height:40px; }
i, em, cite { font-style:italic; }
b, strong { font-weight:bold; }
i, em, cite, b, strong, small { line-height:28px; }
small, .small, .small *, aside { font-style:italic; color:#669; font-size:18px; }
small a, .small a { text-decoration:underline; }
del { text-decoration:line-through; }
ins { text-decoration:underline; }
.alignright { display:block; float:right; margin-left:1em; }
.alignleft { display:block; float:left; margin-right:1em; }
q:before, q q q:before, q q q q q:before, q q q q q q q:before { content:"“"; }
q q:before, q q q q:before, q q q q q q:before, q q q q q q q q:before { content:"‘"; }
q:after, q q q:after, q q q q q:after, q q q q q q q:after { content:"”"; }
q q:after, q q q q:after, q q q q q q:after, q q q q q q q q:after { content:"’"; }
a { color:#00f; text-decoration:none; }
a:visited { color:#636; }
a:hover, a:active { color:#c00!important; text-decoration:underline; }
h1 { font-weight:bold; background:#fff; }
h1 a, h1 a:visited { font-family:monospace; font-size:60px; color:#c00; display:block; }
h1 a:focus, h1 a:hover, h1 a:active { color:#f00!important; text-decoration:none; }
.navigation { display:table; width:100%; margin:0 0 30px 0; position:relative; }
#nav-above { margin-bottom:0; }
.navigation .nav-previous { display:table-cell; text-align:left; width:50%; }
/* hang the » and « off into the margins */
.navigation .nav-previous a:before, .navigation .nav-next a:after { content: "«"; display:block; height:30px; margin-bottom:-30px; text-decoration:none; margin-left:-15px; }
.navigation .nav-next a:after { content: "»"; text-align:right; margin-left:0; margin-top:-30px; margin-right:-15px; }
.navigation .nav-next { display:table-cell; text-align:right; width:50%; }
.navigation a { display:block; width:100%; height:100%; }
input, button, textarea { border:0; line-height:30px; }
textarea { height:300px; }
input { height:30px; line-height:30px; }
input.submit, input#submit, input.button, button, input[type=submit] { cursor:hand; cursor:pointer; outline:1px solid #ccc; }
#wrapper { margin-bottom:90px; position:relative; z-index:1; *zoom:1; background:#fff; }
#wrapper:after { display:block; content:"."; visibility:hidden; width:0; height:0; clear:both; }
.sidebar .xoxo > li { float:left; width:50%; }
.sidebar li { list-style:none; }
.sidebar #elsewhere { margin-left:-10%; margin-right:-10%; }
.sidebar #rss-links, .sidebar #twitter-feeds { float:right; clear:right; width:20%; }
.sidebar #comment { clear:both; float:none; width:100%; }
.sidebar #search { clear:both; float:none; width:100%; }
.sidebar #search h2 { margin-left:40%; }
.sidebar #search #s { width:90%; float:left; }
.sidebar #search #searchsubmit { width:10%; float:right; }
.sidebar * { font-size:15px; line-height:30px; }
#footer, #footer * { text-align:center; font-size:16px; color:#ccc; font-style:italic; word-spacing:1em; margin-top:0; }
#toc { position:absolute; top:0; right:0; padding:40px 0 40px 20px; margin:0; width:200px; opacity:0.2; z-index:-1; }
#toc:hover { opacity:1; background:#fff; z-index:999; }
#toc ul { padding:0; margin:0; }
#toc, #toc li { list-style-type:none; font-size:15px; line-height:15px; }
#toc li { padding:0 0 0 10px; }
#toc li a { position:relative; display:block; }
table#npmlogo { line-height:10px; width:180px; margin:0 auto; }
@media print {
  a[href] { color:inherit; }
  a[href]:after { white-space:nowrap; content:" " attr(href); }
  a[href^=\#], .navigation { display:none; }
}
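The `#toc` rules above style a table of contents that the next file, toc.js, builds at runtime from the headings inside `#wrapper`. As a rough, self-contained sketch of what that script generates (the heading ids and titles here are invented for illustration, not taken from the package):

// Standalone sketch of toc.js's nesting logic (hypothetical heading data,
// not part of the package); run with node to see the generated innerHTML.
var headings = [
  { tag: 'H2', id: 'install', text: 'install' },        // depth 2
  { tag: 'H3', id: 'install-global', text: 'global' },  // depth 3
  { tag: 'H2', id: 'configure', text: 'configure' }     // depth 2
]
var l = 2
var html = headings.map(function (el) {
  var i = el.tag.charAt(1)
  var out = ''
  while (i > l) { out += '<ul>'; l++ }
  while (i < l) { out += '</ul>'; l-- }
  return out + '<li><a href="#' + el.id + '">' + el.text + '</a>'
}).join('\n')
console.log(html)
// -> <li><a href="#install">install</a>
//    <ul><li><a href="#install-global">global</a>
//    </ul><li><a href="#configure">configure</a>

Nesting comes from emitting unbalanced <ul>/</ul> tags as the heading depth changes; the browser closes the open <li> and <ul> elements when the string is assigned to innerHTML.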
npm_3.5.2.orig/html/static/toc.js0000644000000000000000000000132712631326456015053 0ustar 00000000000000;(function () {
  var wrapper = document.getElementById('wrapper')
  var els = Array.prototype.slice.call(wrapper.getElementsByTagName('*'), 0)
    .filter(function (el) {
      return el.parentNode === wrapper &&
        el.tagName.match(/H[1-6]/) &&
        el.id
    })
  var l = 2
  var toc = document.createElement('ul')
  toc.innerHTML = els.map(function (el) {
    var i = el.tagName.charAt(1)
    var out = ''
    while (i > l) {
      out += '<ul>'
      l++
    }
    while (i < l) {
      out += '</ul>'
      l--
    }
    out += '<li><a href="#' + el.id + '">' +
      (el.innerText || el.text || el.innerHTML) +
      '</a>'
    return out
  }).join('\n')
  toc.id = 'toc'
  document.body.appendChild(toc)
})()
npm_3.5.2.orig/lib/access.js0000644000000000000000000000627712631326456014043 0ustar 00000000000000'use strict'

var resolve = require('path').resolve

var readPackageJson = require('read-package-json')

var mapToRegistry = require('./utils/map-to-registry.js')
var npm = require('./npm.js')
var whoami = require('./whoami')

module.exports = access

access.usage =
  'npm access public [<package>]\n' +
  'npm access restricted [<package>]\n' +
  'npm access grant <read-only|read-write> <scope:team> [<package>]\n' +
  'npm access revoke <scope:team> [<package>]\n' +
  'npm access ls-packages [<user>|<scope>|<scope:team>]\n' +
  'npm access ls-collaborators [<package> [<user>]]\n' +
  'npm access edit [<package>]'

access.subcommands = ['public', 'restricted', 'grant', 'revoke', 'ls-packages', 'ls-collaborators', 'edit']

access.completion = function (opts, cb) {
  var argv = opts.conf.argv.remain
  if (argv.length === 2) {
    return cb(null, access.subcommands)
  }

  switch (argv[2]) {
    case 'grant':
      if (argv.length === 3) {
        return cb(null, ['read-only', 'read-write'])
      } else {
        return cb(null, [])
      }
      break
    case 'public':
    case 'restricted':
    case 'ls-packages':
    case 'ls-collaborators':
    case 'edit':
      return cb(null, [])
    case 'revoke':
      return cb(null, [])
    default:
      return cb(new Error(argv[2] + ' not recognized'))
  }
}

function access (args, cb) {
  var cmd = args.shift()
  var params
  return parseParams(cmd, args, function (err, p) {
    if (err) { return cb(err) }
    params = p
    return mapToRegistry(params.package, npm.config, invokeCmd)
  })

  function invokeCmd (err, uri, auth, base) {
    if (err) { return cb(err) }
    params.auth = auth
    try {
      return npm.registry.access(cmd, uri, params, function (err, data) {
        !err && data && console.log(JSON.stringify(data, undefined, 2))
        cb(err, data)
      })
    } catch (e) {
      cb(e.message + '\n\nUsage:\n' + access.usage)
    }
  }
}

function parseParams (cmd, args, cb) {
  // mapToRegistry will complain if package is undefined,
  // but it's not needed for ls-packages
  var params = { 'package': '' }
  if (cmd === 'grant') {
    params.permissions = args.shift()
  }
  if (['grant', 'revoke', 'ls-packages'].indexOf(cmd) !== -1) {
    var entity = (args.shift() || '').split(':')
    params.scope = entity[0]
    params.team = entity[1]
  }

  if (cmd === 'ls-packages') {
    if (!params.scope) {
      whoami([], true, function (err, scope) {
        params.scope = scope
        cb(err, params)
      })
    } else {
      cb(null, params)
    }
  } else {
    getPackage(args.shift(), function (err, pkg) {
      if (err) return cb(err)
      params.package = pkg

      if (cmd === 'ls-collaborators') params.user = args.shift()
      cb(null, params)
    })
  }
}

function getPackage (name, cb) {
  if (name && name.trim()) {
    cb(null, name.trim())
  } else {
    readPackageJson(
      resolve(npm.prefix, 'package.json'),
      function (err, data) {
        if (err) {
          if (err.code === 'ENOENT') {
            cb(new Error('no package name passed to command and no package.json found'))
          } else {
            cb(err)
          }
        } else {
          cb(null, data.name)
        }
      }
    )
  }
}
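For orientation, a sketch of what parseParams above hands to the registry client for a grant invocation; the org, team, and package names here are invented:

// npm access grant read-write myorg:developers foo
// => parseParams('grant', ['read-write', 'myorg:developers', 'foo'], cb)
// calls cb(null, params) with:
var params = {
  'package': 'foo',          // from getPackage(args.shift())
  permissions: 'read-write', // only set for `grant`
  scope: 'myorg',            // entity part before the ':'
  team: 'developers'         // entity part after the ':'
}
console.log(params)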
npm_3.5.2.orig/lib/adduser.js0000644000000000000000000000744512631326456014227 0ustar 00000000000000module.exports = adduser

var log = require('npmlog')
var npm = require('./npm.js')
var read = require('read')
var userValidate = require('npm-user-validate')
var crypto

try {
  crypto = require('crypto')
} catch (ex) {}

adduser.usage = 'npm adduser [--registry=url] [--scope=@orgname] [--always-auth]'

function adduser (args, cb) {
  if (!crypto) {
    return cb(new Error(
      'You must compile node with ssl support to use the adduser feature'
    ))
  }

  var creds = npm.config.getCredentialsByURI(npm.config.get('registry'))
  var c = {
    u: creds.username || '',
    p: creds.password || '',
    e: creds.email || ''
  }
  var u = {}
  var fns = [readUsername, readPassword, readEmail, save]

  loop()
  function loop (er) {
    if (er) return cb(er)
    var fn = fns.shift()
    if (fn) return fn(c, u, loop)
    cb()
  }
}

function readUsername (c, u, cb) {
  var v = userValidate.username
  read({prompt: 'Username: ', default: c.u || ''}, function (er, un) {
    if (er) {
      return cb(er.message === 'cancelled' ? er.message : er)
    }

    // make sure it's valid. we have to do this here, because
    // couchdb will only ever say "bad password" with a 401 when
    // you try to PUT a _users record that the validate_doc_update
    // rejects for *any* reason.

    if (!un) {
      return readUsername(c, u, cb)
    }

    var error = v(un)
    if (error) {
      log.warn(error.message)
      return readUsername(c, u, cb)
    }

    c.changed = c.u !== un
    u.u = un
    cb(er)
  })
}

function readPassword (c, u, cb) {
  var v = userValidate.pw

  var prompt
  if (c.p && !c.changed) {
    prompt = 'Password: (or leave unchanged) '
  } else {
    prompt = 'Password: '
  }

  read({prompt: prompt, silent: true}, function (er, pw) {
    if (er) {
      return cb(er.message === 'cancelled' ? er.message : er)
    }

    if (!c.changed && pw === '') {
      // when the username was not changed,
      // empty response means "use the old value"
      pw = c.p
    }

    if (!pw) {
      return readPassword(c, u, cb)
    }

    var error = v(pw)
    if (error) {
      log.warn(error.message)
      return readPassword(c, u, cb)
    }

    c.changed = c.changed || c.p !== pw
    u.p = pw
    cb(er)
  })
}

function readEmail (c, u, cb) {
  var v = userValidate.email
  var r = { prompt: 'Email: (this IS public) ', default: c.e || '' }
  read(r, function (er, em) {
    if (er) {
      return cb(er.message === 'cancelled' ? er.message : er)
    }

    if (!em) {
      return readEmail(c, u, cb)
    }

    var error = v(em)
    if (error) {
      log.warn(error.message)
      return readEmail(c, u, cb)
    }

    u.e = em
    cb(er)
  })
}

function save (c, u, cb) {
  // save existing configs, but yank off for this PUT
  var uri = npm.config.get('registry')
  var scope = npm.config.get('scope')

  // there may be a saved scope and no --registry (for login)
  if (scope) {
    if (scope.charAt(0) !== '@') scope = '@' + scope

    var scopedRegistry = npm.config.get(scope + ':registry')
    var cliRegistry = npm.config.get('registry', 'cli')
    if (scopedRegistry && !cliRegistry) uri = scopedRegistry
  }

  var params = {
    auth: {
      username: u.u,
      password: u.p,
      email: u.e
    }
  }
  npm.registry.adduser(uri, params, function (er, doc) {
    if (er) return cb(er)

    // don't want this polluting the configuration
    npm.config.del('_token', 'user')

    if (scope) npm.config.set(scope + ':registry', uri, 'user')

    if (doc && doc.token) {
      npm.config.setCredentialsByURI(uri, {
        token: doc.token
      })
    } else {
      npm.config.setCredentialsByURI(uri, {
        username: u.u,
        password: u.p,
        email: u.e,
        alwaysAuth: npm.config.get('always-auth')
      })
    }

    log.info('adduser', 'Authorized user %s', u.u)
    npm.config.save('user', cb)
  })
}
npm_3.5.2.orig/lib/bin.js0000644000000000000000000000073612631326456013354 0ustar 00000000000000module.exports = bin

var npm = require('./npm.js')
var osenv = require('osenv')

bin.usage = 'npm bin [--global]'

function bin (args, silent, cb) {
  if (typeof cb !== 'function') {
    cb = silent
    silent = false
  }
  var b = npm.bin
  var PATH = osenv.path()

  if (!silent) console.log(b)
  process.nextTick(cb.bind(this, null, b))

  if (npm.config.get('global') && PATH.indexOf(b) === -1) {
    npm.config.get('logstream').write('(not in PATH env variable)\n')
  }
}
npm_3.5.2.orig/lib/bugs.js0000644000000000000000000000143712631326456013543 0ustar 00000000000000module.exports = bugs

bugs.usage = 'npm bugs [<pkgname>]'

var npm = require('./npm.js')
var log = require('npmlog')
var opener = require('opener')
var fetchPackageMetadata = require('./fetch-package-metadata.js')

bugs.completion = function (opts, cb) {
  // FIXME: there used to be registry completion here, but it stopped making
  // sense somewhere around 50,000 packages on the registry
  cb()
}

function bugs (args, cb) {
  var n = args.length ? args[0] : '.'
  fetchPackageMetadata(n, '.', function (er, d) {
    if (er) return cb(er)

    var url = d.bugs && ((typeof d.bugs === 'string') ? d.bugs : d.bugs.url)
    if (!url) {
      url = 'https://www.npmjs.org/package/' + d.name
    }
    log.silly('bugs', 'url', url)
    opener(url, { command: npm.config.get('browser') }, cb)
  })
}
npm_3.5.2.orig/lib/build.js0000644000000000000000000002103112631326456013672 0ustar 00000000000000// npm build command

// everything about the installation after the creation of
// the .npm/{name}/{version}/package folder.
// linking the modules into the npm.root,
// resolving dependencies, etc.

// This runs AFTER install or link are completed.

var npm = require('./npm.js')
var log = require('npmlog')
var chain = require('slide').chain
var fs = require('graceful-fs')
var path = require('path')
var lifecycle = require('./utils/lifecycle.js')
var readJson = require('read-package-json')
var link = require('./utils/link.js')
var linkIfExists = link.ifExists
var cmdShim = require('cmd-shim')
var cmdShimIfExists = cmdShim.ifExists
var asyncMap = require('slide').asyncMap
var ini = require('ini')
var writeFile = require('write-file-atomic')
var packageId = require('./utils/package-id.js')

module.exports = build
build.usage = 'npm build [<folder>]'

build._didBuild = {}
build._noLC = {}
function build (args, global, didPre, didRB, cb) {
  if (typeof cb !== 'function') {
    cb = didRB
    didRB = false
  }
  if (typeof cb !== 'function') {
    cb = didPre
    didPre = false
  }
  if (typeof cb !== 'function') {
    cb = global
    global = npm.config.get('global')
  }

  // it'd be nice to asyncMap these, but actually, doing them
  // in parallel generally munges up the output from node-waf
  var builder = build_(global, didPre, didRB)
  chain(args.map(function (arg) {
    return function (cb) {
      builder(arg, cb)
    }
  }), cb)
}

function build_ (global, didPre, didRB) {
  return function (folder, cb) {
    folder = path.resolve(folder)
    if (build._didBuild[folder]) log.info('build', 'already built', folder)
    build._didBuild[folder] = true
    log.info('build', folder)
    readJson(path.resolve(folder, 'package.json'), function (er, pkg) {
      if (er) return cb(er)
      chain([
        !didPre && [lifecycle, pkg, 'preinstall', folder],
        [linkStuff, pkg, folder, global, didRB],
        [writeBuiltinConf, pkg, folder],
        didPre !== build._noLC && [lifecycle, pkg, 'install', folder],
        didPre !== build._noLC && [lifecycle, pkg, 'postinstall', folder],
        didPre !== build._noLC &&
          npm.config.get('npat') &&
          [lifecycle, pkg, 'test', folder]
      ], cb)
    })
  }
}
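// Illustrative note, not in the original source: for a plain `npm build
// <folder>` (didPre and didRB both false) the chain above runs the steps
// in this order, with `test` only when the `npat` config is set:
//
//   preinstall -> linkStuff -> writeBuiltinConf -> install -> postinstall -> test
//
// `didPre` and `build._noLC` let callers that have already run preinstall
// (such as the installer) reuse this code without repeating scripts.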
var writeBuiltinConf = build.writeBuiltinConf = function (pkg, folder, cb) {
  // the builtin config is "sticky". Any time npm installs
  // itself globally, it puts its builtin config file there
  var parent = path.dirname(folder)
  var dir = npm.globalDir

  if (pkg.name !== 'npm' ||
      !npm.config.get('global') ||
      !npm.config.usingBuiltin ||
      dir !== parent) {
    return cb()
  }

  var data = ini.stringify(npm.config.sources.builtin.data)
  writeFile(path.resolve(folder, 'npmrc'), data, cb)
}

var linkStuff = build.linkStuff = function (pkg, folder, global, didRB, cb) {
  // allow to opt out of linking binaries.
  if (npm.config.get('bin-links') === false) return cb()

  // if it's global, and folder is in {prefix}/node_modules,
  // then bins are in {prefix}/bin
  // otherwise, then bins are in folder/../.bin
  var parent = pkg.name && pkg.name[0] === '@' ? path.dirname(path.dirname(folder)) : path.dirname(folder)
  var gnm = global && npm.globalDir
  var gtop = parent === gnm

  log.info('linkStuff', packageId(pkg))
  log.silly('linkStuff', packageId(pkg), 'has', parent, 'as its parent node_modules')
  if (global) log.silly('linkStuff', packageId(pkg), 'is part of a global install')
  if (gnm) log.silly('linkStuff', packageId(pkg), 'is installed into a global node_modules')
  if (gtop) log.silly('linkStuff', packageId(pkg), 'is installed into the top-level global node_modules')

  shouldWarn(pkg, folder, global, function () {
    asyncMap(
      [linkBins, linkMans, !didRB && rebuildBundles],
      function (fn, cb) {
        if (!fn) return cb()
        log.verbose(fn.name, packageId(pkg))
        fn(pkg, folder, parent, gtop, cb)
      },
      cb
    )
  })
}

function shouldWarn (pkg, folder, global, cb) {
  var parent = path.dirname(folder)
  var top = parent === npm.dir
  var cwd = npm.localPrefix

  readJson(path.resolve(cwd, 'package.json'), function (er, topPkg) {
    if (er) return cb(er)

    var linkedPkg = path.basename(cwd)
    var currentPkg = path.basename(folder)

    // current searched package is the linked package on first call
    if (linkedPkg !== currentPkg) {
      // don't generate a warning if it's listed in dependencies
      if (Object.keys(topPkg.dependencies || {})
          .concat(Object.keys(topPkg.devDependencies || {}))
          .indexOf(currentPkg) === -1) {
        if (top && pkg.preferGlobal && !global) {
          log.warn('prefer global', packageId(pkg) + ' should be installed with -g')
        }
      }
    }

    cb()
  })
}

function rebuildBundles (pkg, folder, parent, gtop, cb) {
  if (!npm.config.get('rebuild-bundle')) return cb()

  var deps = Object.keys(pkg.dependencies || {})
    .concat(Object.keys(pkg.devDependencies || {}))
  var bundles = pkg.bundleDependencies || pkg.bundledDependencies || []

  fs.readdir(path.resolve(folder, 'node_modules'), function (er, files) {
    // error means no bundles
    if (er) return cb()

    log.verbose('rebuildBundles', files)
    // don't asyncMap these, because otherwise build script output
    // gets interleaved and is impossible to read
    chain(files.filter(function (file) {
      // rebuild if:
      // not a .folder, like .bin or .hooks
      return !file.match(/^[\._-]/) &&
        // not some old 0.x style bundle
        file.indexOf('@') === -1 &&
        // either not a dep, or explicitly bundled
        (deps.indexOf(file) === -1 || bundles.indexOf(file) !== -1)
    }).map(function (file) {
      file = path.resolve(folder, 'node_modules', file)
      return function (cb) {
        if (build._didBuild[file]) return cb()
        log.verbose('rebuild bundle', file)
        // if file is not a package dir, then don't do it.
        fs.lstat(path.resolve(file, 'package.json'), function (er) {
          if (er) return cb()
          build_(false)(file, cb)
        })
      }
    }), cb)
  })
}
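// Illustrative note, not in the original source: linkBins below exposes each
// entry in pkg.bin at {prefix}/bin for top-level global installs, or at
// <parent>/.bin otherwise. npm.modes.exec is npm's computed executable mode,
// roughly 0777 masked by the configured umask, so a typical umask of 022
// yields 0755 bins.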
function linkBins (pkg, folder, parent, gtop, cb) {
  if (!pkg.bin || !gtop && path.basename(parent) !== 'node_modules') {
    return cb()
  }
  var binRoot = gtop ? npm.globalBin : path.resolve(parent, '.bin')
  log.verbose('link bins', [pkg.bin, binRoot, gtop])

  asyncMap(Object.keys(pkg.bin), function (b, cb) {
    linkBin(
      path.resolve(folder, pkg.bin[b]),
      path.resolve(binRoot, b),
      gtop && folder,
      function (er) {
        if (er) return cb(er)
        // bins should always be executable.
        // XXX skip chmod on windows?
        var src = path.resolve(folder, pkg.bin[b])
        fs.chmod(src, npm.modes.exec, function (er) {
          if (er && er.code === 'ENOENT' && npm.config.get('ignore-scripts')) {
            return cb()
          }
          if (er || !gtop) return cb(er)
          var dest = path.resolve(binRoot, b)
          var out = npm.config.get('parseable')
            ? dest + '::' + src + ':BINFILE'
            : dest + ' -> ' + src
          log.clearProgress()
          console.log(out)
          log.showProgress()
          cb()
        })
      }
    )
  }, cb)
}

function linkBin (from, to, gently, cb) {
  if (process.platform !== 'win32') {
    return linkIfExists(from, to, gently, cb)
  } else {
    return cmdShimIfExists(from, to, cb)
  }
}

function linkMans (pkg, folder, parent, gtop, cb) {
  if (!pkg.man || !gtop || process.platform === 'win32') return cb()

  var manRoot = path.resolve(npm.config.get('prefix'), 'share', 'man')
  log.verbose('linkMans', 'man files are', pkg.man, 'in', manRoot)

  // make sure that the mans are unique.
  // otherwise, if there are dupes, it'll fail with EEXIST
  var set = pkg.man.reduce(function (acc, man) {
    acc[path.basename(man)] = man
    return acc
  }, {})
  pkg.man = pkg.man.filter(function (man) {
    return set[path.basename(man)] === man
  })

  asyncMap(pkg.man, function (man, cb) {
    if (typeof man !== 'string') return cb()
    log.silly('linkMans', 'preparing to link', man)
    var parseMan = man.match(/(.*\.([0-9]+)(\.gz)?)$/)
    if (!parseMan) {
      return cb(new Error(
        man + ' is not a valid name for a man file. ' +
        'Man files must end with a number, ' +
        'and optionally a .gz suffix if they are compressed.'
      ))
    }

    var stem = parseMan[1]
    var sxn = parseMan[2]
    var bn = path.basename(stem)
    var manSrc = path.resolve(folder, man)
    var manDest = path.join(manRoot, 'man' + sxn, bn)

    linkIfExists(manSrc, manDest, gtop && folder, cb)
  }, cb)
}
npm_3.5.2.orig/lib/cache/0000755000000000000000000000000012631326456013303 5ustar 00000000000000npm_3.5.2.orig/lib/cache.js0000644000000000000000000002415012631326456013643 0ustar 00000000000000// XXX lib/utils/tar.js and this file need to be rewritten.

// URL-to-cache folder mapping:
// : -> !
// @ -> _
// http://registry.npmjs.org/foo/version -> cache/http!/...
//

/*
fetching a URL:
1. Check for URL in inflight URLs. If present, add cb, and return.
2. Acquire lock at {cache}/{sha(url)}.lock
   retries = {cache-lock-retries, def=10}
   stale = {cache-lock-stale, def=60000}
   wait = {cache-lock-wait, def=10000}
3. if lock can't be acquired, then fail
4. fetch url, clear lock, call cbs

cache folders:
1. urls: http!/server.com/path/to/thing
2. c:\path\to\thing: file!/c!/path/to/thing
3. /path/to/thing: file!/path/to/thing
4. git@ private: git_github.com!npm/npm
5. git://public: git!/github.com/npm/npm
6. git+blah:// git-blah!/server.com/foo/bar

adding a folder:
1. tar into tmp/random/package.tgz
2. untar into tmp/random/contents/package, stripping one dir piece
3. tar tmp/random/contents/package to cache/n/v/package.tgz
4. untar cache/n/v/package.tgz into cache/n/v/package
5. rm tmp/random

Adding a url:
1. fetch to tmp/random/package.tgz
2. goto folder(2)

adding a name@version:
1. registry.get(name/version)
2. if response isn't 304, add url(dist.tarball)

adding a name@range:
1. registry.get(name)
2. Find a version that satisfies
3. add name@version

adding a local tarball:
1. untar to tmp/random/{blah}
2. goto folder(2)

adding a namespaced package:
1. lookup registry for @namespace
2. namespace_registry.get('name')
3. add url(namespace/latest.tarball)
*/
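// Illustrative example, not in the original source: applying rule 6 above,
// a dependency fetched from git+https://server.com/foo/bar would be cached
// under {cache}/git-https!/server.com/foo/bar, with ':' rewritten to '!'
// and '@' to '_' as described in the mapping at the top of this file.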
add url(namespace/latest.tarball) */ exports = module.exports = cache cache.unpack = unpack cache.clean = clean cache.read = read var npm = require('./npm.js') var fs = require('graceful-fs') var writeFileAtomic = require('write-file-atomic') var assert = require('assert') var rm = require('./utils/gently-rm.js') var readJson = require('read-package-json') var log = require('npmlog') var path = require('path') var asyncMap = require('slide').asyncMap var tar = require('./utils/tar.js') var fileCompletion = require('./utils/completion/file-completion.js') var deprCheck = require('./utils/depr-check.js') var addNamed = require('./cache/add-named.js') var addLocal = require('./cache/add-local.js') var addRemoteTarball = require('./cache/add-remote-tarball.js') var addRemoteGit = require('./cache/add-remote-git.js') var inflight = require('inflight') var realizePackageSpecifier = require('realize-package-specifier') var npa = require('npm-package-arg') var getStat = require('./cache/get-stat.js') var cachedPackageRoot = require('./cache/cached-package-root.js') var mapToRegistry = require('./utils/map-to-registry.js') cache.usage = 'npm cache add <tarball file>' + '\nnpm cache add <folder>' + '\nnpm cache add <tarball url>' + '\nnpm cache add <git url>' + '\nnpm cache add <name>@<version>' + '\nnpm cache ls [<path>]' + '\nnpm cache clean [<pkg>[@<version>]]' cache.completion = function (opts, cb) { var argv = opts.conf.argv.remain if (argv.length === 2) { return cb(null, ['add', 'ls', 'clean']) } switch (argv[2]) { case 'clean': case 'ls': // cache and ls are easy, because the completion is // what ls_ returns anyway. // just get the partial words, minus the last path part var p = path.dirname(opts.partialWords.slice(3).join('/')) if (p === '.') p = '' return ls_(p, 2, cb) case 'add': // Same semantics as install and publish. return npm.commands.install.completion(opts, cb) } } function cache (args, cb) { var cmd = args.shift() switch (cmd) { case 'rm': case 'clear': case 'clean': return clean(args, cb) case 'list': case 'sl': case 'ls': return ls(args, cb) case 'add': return add(args, npm.prefix, cb) default: return cb('Usage: ' + cache.usage) } } // if the pkg and ver are in the cache, then // just do a readJson and return. // if they're not, then fetch them from the registry.
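// A sketch of the read() contract below (name and version are illustrative
// values, not from this file):
//   read('abbrev', '1.0.7', true, function (er, data) {
//     // data is the cached package.json, or read() fell through to
//     // addNamed() and fetched it from the registry first.
//   })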
function read (name, ver, forceBypass, cb) { assert(typeof name === 'string', 'must include name of module to install') assert(typeof cb === 'function', 'must include callback') if (forceBypass === undefined || forceBypass === null) forceBypass = true var root = cachedPackageRoot({name: name, version: ver}) function c (er, data) { if (er) log.verbose('cache', 'addNamed error for', name + '@' + ver, er) if (data) deprCheck(data) return cb(er, data) } if (forceBypass && npm.config.get('force')) { log.verbose('using force', 'skipping cache') return addNamed(name, ver, null, c) } readJson(path.join(root, 'package', 'package.json'), function (er, data) { if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er) if (data) { if (!data.name) return cb(new Error('No name provided')) if (!data.version) return cb(new Error('No version provided')) } if (er) return addNamed(name, ver, null, c) else c(er, data) }) } function normalize (args) { var normalized = '' if (args.length > 0) { var a = npa(args[0]) if (a.name) normalized = a.name if (a.rawSpec) normalized = [normalized, a.rawSpec].join('/') if (args.length > 1) normalized = [normalized].concat(args.slice(1)).join('/') } if (normalized.substr(-1) === '/') { normalized = normalized.substr(0, normalized.length - 1) } normalized = path.normalize(normalized) log.silly('ls', 'normalized', normalized) return normalized } // npm cache ls [<path>] function ls (args, cb) { var prefix = npm.config.get('cache') if (prefix.indexOf(process.env.HOME) === 0) { prefix = '~' + prefix.substr(process.env.HOME.length) } ls_(normalize(args), npm.config.get('depth'), function (er, files) { console.log(files.map(function (f) { return path.join(prefix, f) }).join('\n').trim()) cb(er, files) }) } // Calls cb with list of cached pkgs matching show. function ls_ (req, depth, cb) { return fileCompletion(npm.cache, req, depth, cb) } // npm cache clean [<path>] function clean (args, cb) { assert(typeof cb === 'function', 'must include callback') if (!args) args = [] var f = path.join(npm.cache, normalize(args)) if (f === npm.cache) { fs.readdir(npm.cache, function (er, files) { if (er) return cb() asyncMap( files.filter(function (f) { return npm.config.get('force') || f !== '-' }).map(function (f) { return path.join(npm.cache, f) }), rm, cb ) }) } else { rm(f, cb) } } // npm cache add <tarball file> // npm cache add <folder> // npm cache add <tarball url> // npm cache add <git url> cache.add = function (pkg, ver, where, scrub, cb) { assert(typeof pkg === 'string', 'must include name of package to install') assert(typeof cb === 'function', 'must include callback') if (scrub) { return clean([], function (er) { if (er) return cb(er) add([pkg, ver], where, cb) }) } return add([pkg, ver], where, cb) } var adding = 0 function add (args, where, cb) { // this is hot code. almost everything passes through here. // the args can be any of: // ['url'] // ['pkg', 'version'] // ['pkg@version'] // ['pkg', 'url'] // This is tricky, because urls can contain @ // Also, in some cases we get [name, null] rather // than just a single argument.
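// For example (illustrative values): add(['abbrev', '1.x'], where, cb)
// builds the spec 'abbrev@1.x', while add(['abbrev@1.x', null], where, cb)
// passes the single argument through as the spec unchanged.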
var usage = 'Usage:\n' + ' npm cache add <tarball-url>\n' + ' npm cache add <pkg>@<ver>\n' + ' npm cache add <tarball>\n' + ' npm cache add <folder>\n' var spec log.silly('cache add', 'args', args) if (args[1] === undefined) args[1] = null // at this point the args length must ==2 if (args[1] !== null) { spec = args[0] + '@' + args[1] } else if (args.length === 2) { spec = args[0] } log.verbose('cache add', 'spec', spec) if (!spec) return cb(usage) adding++ cb = afterAdd(cb) realizePackageSpecifier(spec, where, function (err, p) { if (err) return cb(err) log.silly('cache add', 'parsed spec', p) switch (p.type) { case 'local': case 'directory': addLocal(p, null, cb) break case 'remote': // get auth, if possible mapToRegistry(spec, npm.config, function (err, uri, auth) { if (err) return cb(err) addRemoteTarball(p.spec, { name: p.name }, null, auth, cb) }) break case 'git': case 'hosted': addRemoteGit(p.rawSpec, cb) break default: if (p.name) return addNamed(p.name, p.spec, null, cb) cb(new Error("couldn't figure out how to install " + spec)) } }) } function unpack (pkg, ver, unpackTarget, dMode, fMode, uid, gid, cb) { if (typeof cb !== 'function') { cb = gid gid = null } if (typeof cb !== 'function') { cb = uid uid = null } if (typeof cb !== 'function') { cb = fMode fMode = null } if (typeof cb !== 'function') { cb = dMode dMode = null } read(pkg, ver, false, function (er) { if (er) { log.error('unpack', 'Could not read data for %s', pkg + '@' + ver) return cb(er) } npm.commands.unbuild([unpackTarget], true, function (er) { if (er) return cb(er) tar.unpack( path.join(cachedPackageRoot({ name: pkg, version: ver }), 'package.tgz'), unpackTarget, dMode, fMode, uid, gid, cb ) }) }) } function afterAdd (cb) { return function (er, data) { adding-- if (er || !data || !data.name || !data.version) return cb(er, data) log.silly('cache', 'afterAdd', data.name + '@' + data.version) // Save the resolved, shasum, etc. into the data so that the next // time we load from this cached data, we have all the same info.
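// Note that the write below is deduplicated with inflight(): if several
// parallel adds of the same package finish at once, only the first caller
// actually writes package/package.json; the rest just queue their
// callbacks onto that in-flight write.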
var pj = path.join(cachedPackageRoot(data), 'package', 'package.json') var done = inflight(pj, cb) if (!done) return log.verbose('afterAdd', pj, 'already in flight; not writing') log.verbose('afterAdd', pj, 'not in flight; writing') getStat(function (er, cs) { if (er) return done(er) writeFileAtomic(pj, JSON.stringify(data), { chown: cs }, function (er) { if (!er) log.verbose('afterAdd', pj, 'written') return done(er, data) }) }) } } npm_3.5.2.orig/lib/completion.js0000644000000000000000000001604612631326456014756 0ustar 00000000000000 module.exports = completion completion.usage = 'source <(npm completion)' var npm = require('./npm.js') var npmconf = require('./config/core.js') var configDefs = npmconf.defs var configTypes = configDefs.types var shorthands = configDefs.shorthands var nopt = require('nopt') var configNames = Object.keys(configTypes) .filter(function (e) { return e.charAt(0) !== '_' }) var shorthandNames = Object.keys(shorthands) var allConfs = configNames.concat(shorthandNames) var once = require('once') completion.completion = function (opts, cb) { if (opts.w > 3) return cb() var fs = require('graceful-fs') var path = require('path') var bashExists = null var zshExists = null fs.stat(path.resolve(process.env.HOME, '.bashrc'), function (er) { bashExists = !er next() }) fs.stat(path.resolve(process.env.HOME, '.zshrc'), function (er) { zshExists = !er next() }) function next () { if (zshExists === null || bashExists === null) return var out = [] if (zshExists) out.push('~/.zshrc') if (bashExists) out.push('~/.bashrc') if (opts.w === 2) { out = out.map(function (m) { return ['>>', m] }) } cb(null, out) } } function completion (args, cb) { if (process.platform === 'win32' && !(/^MINGW(32|64)$/.test(process.env.MSYSTEM))) { var e = new Error('npm completion supported only in MINGW / Git bash on Windows') e.code = 'ENOTSUP' e.errno = require('constants').ENOTSUP return cb(e) } // if the COMP_* isn't in the env, then just dump the script. if (process.env.COMP_CWORD === undefined || process.env.COMP_LINE === undefined || process.env.COMP_POINT === undefined) { return dumpScript(cb) } console.error(process.env.COMP_CWORD) console.error(process.env.COMP_LINE) console.error(process.env.COMP_POINT) // get the partial line and partial word, // if the point isn't at the end. // ie, tabbing at: npm foo b|ar var w = +process.env.COMP_CWORD var words = args.map(unescape) var word = words[w] var line = process.env.COMP_LINE var point = +process.env.COMP_POINT var partialLine = line.substr(0, point) var partialWords = words.slice(0, w) // figure out where in that last word the point is. var partialWord = args[w] var i = partialWord.length while (partialWord.substr(0, i) !== partialLine.substr(-1 * i) && i > 0) { i-- } partialWord = unescape(partialWord.substr(0, i)) partialWords.push(partialWord) var opts = { words: words, w: w, word: word, line: line, lineLength: line.length, point: point, partialLine: partialLine, partialWords: partialWords, partialWord: partialWord, raw: args } cb = wrapCb(cb, opts) console.error(opts) if (partialWords.slice(0, -1).indexOf('--') === -1) { if (word.charAt(0) === '-') return configCompl(opts, cb) if (words[w - 1] && words[w - 1].charAt(0) === '-' && !isFlag(words[w - 1])) { // awaiting a value for a non-bool config. // don't even try to do this for now console.error('configValueCompl') return configValueCompl(opts, cb) } } // try to find the npm command. // it's the first thing after all the configs. 
// take a little shortcut and use npm's arg parsing logic. // don't have to worry about the last arg being implicitly // boolean'ed, since the last block will catch that. var parsed = opts.conf = nopt(configTypes, shorthands, partialWords.slice(0, -1), 0) // check if there's a command already. console.error(parsed) var cmd = parsed.argv.remain[1] if (!cmd) return cmdCompl(opts, cb) Object.keys(parsed).forEach(function (k) { npm.config.set(k, parsed[k]) }) // at this point, if words[1] is some kind of npm command, // then complete on it. // otherwise, do nothing cmd = npm.commands[cmd] if (cmd && cmd.completion) return cmd.completion(opts, cb) // nothing to do. cb() } function dumpScript (cb) { var fs = require('graceful-fs') var path = require('path') var p = path.resolve(__dirname, 'utils/completion.sh') // The Darwin patch below results in callbacks first for the write and then // for the error handler, so make sure we only call our callback once. cb = once(cb) fs.readFile(p, 'utf8', function (er, d) { if (er) return cb(er) d = d.replace(/^\#\!.*?\n/, '') process.stdout.write(d, function () { cb() }) process.stdout.on('error', function (er) { // Darwin is a real dick sometimes. // // This is necessary because the "source" or "." program in // bash on OS X closes its file argument before reading // from it, meaning that you get exactly 1 write, which will // work most of the time, and will always raise an EPIPE. // // Really, one should not be tossing away EPIPE errors, or any // errors, so casually. But, without this, `. <(npm completion)` // can never ever work on OS X. if (er.errno === 'EPIPE') er = null cb(er) }) }) } function unescape (w) { if (w.charAt(0) === '\'') return w.replace(/^'|'$/g, '') else return w.replace(/\\ /g, ' ') } function escape (w) { if (!w.match(/\s+/)) return w return '\'' + w + '\'' } // The command should respond with an array. Loop over that, // wrapping quotes around any that have spaces, and writing // them to stdout. Use console.log, not the outfd config. // If any of the items are arrays, then join them with a space. // Ie, returning ['a', 'b c', ['d', 'e']] would allow it to expand // to: 'a', 'b c', or 'd' 'e' function wrapCb (cb, opts) { return function (er, compls) { if (!Array.isArray(compls)) compls = compls ? [compls] : [] compls = compls.map(function (c) { if (Array.isArray(c)) c = c.map(escape).join(' ') else c = escape(c) return c }) if (opts.partialWord) { compls = compls.filter(function (c) { return c.indexOf(opts.partialWord) === 0 }) } console.error([er && er.stack, compls, opts.partialWord]) if (er || compls.length === 0) return cb(er) console.log(compls.join('\n')) cb() } } // the current word has a dash. Return the config names, // with the same number of dashes as the current word has. function configCompl (opts, cb) { var word = opts.word var split = word.match(/^(-+)((?:no-)*)(.*)$/) var dashes = split[1] var no = split[2] var flags = configNames.filter(isFlag) console.error(flags) return cb(null, allConfs.map(function (c) { return dashes + c }).concat(flags.map(function (f) { return dashes + (no || 'no-') + f }))) } // expand with the valid values of various config values. // not yet implemented. function configValueCompl (opts, cb) { console.error('configValue', opts) return cb(null, []) } // check if the thing is a flag or not. function isFlag (word) { // shorthands never take args. 
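// Examples (illustrative, not exhaustive): '--no-save' and '--force' count
// as flags, since one is negated and the other is a Boolean config, while
// '--registry' does not -- it expects a value.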
var split = word.match(/^(-*)((?:no-)+)?(.*)$/) var no = split[2] var conf = split[3] return no || configTypes[conf] === Boolean || shorthands[conf] } // complete against the npm commands function cmdCompl (opts, cb) { return cb(null, npm.fullList) } npm_3.5.2.orig/lib/config/0000755000000000000000000000000012631326456013505 5ustar 00000000000000npm_3.5.2.orig/lib/config.js0000644000000000000000000001716112631326456014051 0ustar 00000000000000 module.exports = config config.usage = 'npm config set <key> <value>' + '\nnpm config get [<key>]' + '\nnpm config delete <key>' + '\nnpm config list' + '\nnpm config edit' + '\nnpm set <key> <value>' + '\nnpm get [<key>]' var log = require('npmlog') var npm = require('./npm.js') var npmconf = require('./config/core.js') var fs = require('graceful-fs') var writeFileAtomic = require('write-file-atomic') var types = npmconf.defs.types var ini = require('ini') var editor = require('editor') var os = require('os') var umask = require('./utils/umask') config.completion = function (opts, cb) { var argv = opts.conf.argv.remain if (argv[1] !== 'config') argv.unshift('config') if (argv.length === 2) { var cmds = ['get', 'set', 'delete', 'ls', 'rm', 'edit'] if (opts.partialWord !== 'l') cmds.push('list') return cb(null, cmds) } var action = argv[2] switch (action) { case 'set': // todo: complete with valid values, if possible. if (argv.length > 3) return cb(null, []) // fallthrough /*eslint no-fallthrough:0*/ case 'get': case 'delete': case 'rm': return cb(null, Object.keys(types)) case 'edit': case 'list': case 'ls': return cb(null, []) default: return cb(null, []) } } // npm config set key value // npm config get key // npm config list function config (args, cb) { var action = args.shift() switch (action) { case 'set': return set(args[0], args[1], cb) case 'get': return get(args[0], cb) case 'delete': case 'rm': case 'del': return del(args[0], cb) case 'list': case 'ls': return list(cb) case 'edit': return edit(cb) default: return unknown(action, cb) } } function edit (cb) { var e = npm.config.get('editor') var which = npm.config.get('global') ? 'global' : 'user' var f = npm.config.get(which + 'config') if (!e) return cb(new Error('No EDITOR config or environ set.')) npm.config.save(which, function (er) { if (er) return cb(er) fs.readFile(f, 'utf8', function (er, data) { if (er) data = '' data = [ ';;;;', '; npm ' + (npm.config.get('global') ? 'globalconfig' : 'userconfig') + ' file', '; this is a simple ini-formatted file', '; lines that start with semi-colons are comments.', '; read `npm help config` for help on the various options', ';;;;', '', data ].concat([ ';;;;', '; all options with default values', ';;;;' ]).concat(Object.keys(npmconf.defaults).reduce(function (arr, key) { var obj = {} obj[key] = npmconf.defaults[key] if (key === 'logstream') return arr return arr.concat( ini.stringify(obj) .replace(/\n$/m, '') .replace(/^/g, '; ') .replace(/\n/g, '\n; ') .split('\n')) }, [])) .concat(['']) .join(os.EOL) writeFileAtomic( f, data, function (er) { if (er) return cb(er) editor(f, { editor: e }, cb) } ) }) }) } function del (key, cb) { if (!key) return cb(new Error('no key provided')) var where = npm.config.get('global') ?
'global' : 'user' npm.config.del(key, where) npm.config.save(where, cb) } function set (key, val, cb) { if (key === undefined) { return unknown('', cb) } if (val === undefined) { if (key.indexOf('=') !== -1) { var k = key.split('=') key = k.shift() val = k.join('=') } else { val = '' } } key = key.trim() val = val.trim() log.info('config', 'set %j %j', key, val) var where = npm.config.get('global') ? 'global' : 'user' if (key.match(/umask/)) val = umask.fromString(val) npm.config.set(key, val, where) npm.config.save(where, cb) } function get (key, cb) { if (!key) return list(cb) if (!publicVar(key)) { return cb(new Error('---sekretz---')) } var val = npm.config.get(key) if (key.match(/umask/)) val = umask.toString(val) console.log(val) cb() } function sort (a, b) { return a > b ? 1 : -1 } function publicVar (k) { return !(k.charAt(0) === '_' || k.indexOf(':_') !== -1 || types[k] !== types[k]) } function getKeys (data) { return Object.keys(data).filter(publicVar).sort(sort) } function list (cb) { var msg = '' var long = npm.config.get('long') var cli = npm.config.sources.cli.data var cliKeys = getKeys(cli) if (cliKeys.length) { msg += '; cli configs\n' cliKeys.forEach(function (k) { if (cli[k] && typeof cli[k] === 'object') return if (k === 'argv') return msg += k + ' = ' + JSON.stringify(cli[k]) + '\n' }) msg += '\n' } // env configs var env = npm.config.sources.env.data var envKeys = getKeys(env) if (envKeys.length) { msg += '; environment configs\n' envKeys.forEach(function (k) { if (env[k] !== npm.config.get(k)) { if (!long) return msg += '; ' + k + ' = ' + JSON.stringify(env[k]) + ' (overridden)\n' } else msg += k + ' = ' + JSON.stringify(env[k]) + '\n' }) msg += '\n' } // user config file var uconf = npm.config.sources.user.data var uconfKeys = getKeys(uconf) if (uconfKeys.length) { msg += '; userconfig ' + npm.config.get('userconfig') + '\n' uconfKeys.forEach(function (k) { var val = (k.charAt(0) === '_') ? '---sekretz---' : JSON.stringify(uconf[k]) if (uconf[k] !== npm.config.get(k)) { if (!long) return msg += '; ' + k + ' = ' + val + ' (overridden)\n' } else msg += k + ' = ' + val + '\n' }) msg += '\n' } // global config file var gconf = npm.config.sources.global.data var gconfKeys = getKeys(gconf) if (gconfKeys.length) { msg += '; globalconfig ' + npm.config.get('globalconfig') + '\n' gconfKeys.forEach(function (k) { var val = (k.charAt(0) === '_') ? '---sekretz---' : JSON.stringify(gconf[k]) if (gconf[k] !== npm.config.get(k)) { if (!long) return msg += '; ' + k + ' = ' + val + ' (overridden)\n' } else msg += k + ' = ' + val + '\n' }) msg += '\n' } // builtin config file var builtin = npm.config.sources.builtin || {} if (builtin && builtin.data) { var bconf = builtin.data var bpath = builtin.path var bconfKeys = getKeys(bconf) if (bconfKeys.length) { msg += '; builtin config ' + bpath + '\n' bconfKeys.forEach(function (k) { var val = (k.charAt(0) === '_') ? 
'---sekretz---' : JSON.stringify(bconf[k]) if (bconf[k] !== npm.config.get(k)) { if (!long) return msg += '; ' + k + ' = ' + val + ' (overridden)\n' } else msg += k + ' = ' + val + '\n' }) msg += '\n' } } // only show defaults if --long if (!long) { msg += '; node bin location = ' + process.execPath + '\n' + '; cwd = ' + process.cwd() + '\n' + '; HOME = ' + process.env.HOME + '\n' + '; "npm config ls -l" to show all defaults.\n' console.log(msg) return cb() } var defaults = npmconf.defaults var defKeys = getKeys(defaults) msg += '; default values\n' defKeys.forEach(function (k) { if (defaults[k] && typeof defaults[k] === 'object') return var val = JSON.stringify(defaults[k]) if (defaults[k] !== npm.config.get(k)) { msg += '; ' + k + ' = ' + val + ' (overridden)\n' } else msg += k + ' = ' + val + '\n' }) msg += '\n' console.log(msg) return cb() } function unknown (action, cb) { cb('Usage:\n' + config.usage) } npm_3.5.2.orig/lib/dedupe.js0000644000000000000000000001233212631326456014045 0ustar 00000000000000var util = require('util') var path = require('path') var validate = require('aproba') var without = require('lodash.without') var asyncMap = require('slide').asyncMap var chain = require('slide').chain var npa = require('npm-package-arg') var log = require('npmlog') var npm = require('./npm.js') var Installer = require('./install.js').Installer var findRequirement = require('./install/deps.js').findRequirement var earliestInstallable = require('./install/deps.js').earliestInstallable var checkPermissions = require('./install/check-permissions.js') var decomposeActions = require('./install/decompose-actions.js') var loadExtraneous = require('./install/deps.js').loadExtraneous var filterInvalidActions = require('./install/filter-invalid-actions.js') var recalculateMetadata = require('./install/deps.js').recalculateMetadata var sortActions = require('./install/diff-trees.js').sortActions var moduleName = require('./utils/module-name.js') var packageId = require('./utils/package-id.js') var childPath = require('./utils/child-path.js') module.exports = dedupe module.exports.Deduper = Deduper dedupe.usage = 'npm dedupe' function dedupe (args, cb) { validate('AF', arguments) // the /path/to/node_modules/.. 
var where = path.resolve(npm.dir, '..') var dryrun = false if (npm.command.match(/^find/)) dryrun = true if (npm.config.get('dry-run')) dryrun = true new Deduper(where, dryrun).run(cb) } function Deduper (where, dryrun) { validate('SB', arguments) Installer.call(this, where, dryrun, []) this.noPackageJsonOk = true this.topLevelLifecycles = false } util.inherits(Deduper, Installer) Deduper.prototype.normalizeTree = function (log, cb) { validate('OF', arguments) log.silly('dedupe', 'normalizeTree') // If we're looking globally only look at the one package we're operating on if (npm.config.get('global')) { var args = this.args this.currentTree.children = this.currentTree.children.filter(function (child) { return args.filter(function (arg) { return arg === moduleName(child) }).length }) } Installer.prototype.normalizeTree.call(this, log, cb) } Deduper.prototype.loadIdealTree = function (cb) { validate('F', arguments) log.silly('install', 'loadIdealTree') var self = this chain([ [this.newTracker(this.progress.loadIdealTree, 'cloneCurrentTree')], [this, this.cloneCurrentTreeToIdealTree], [this, this.finishTracker, 'cloneCurrentTree'], [this.newTracker(this.progress.loadIdealTree, 'loadAllDepsIntoIdealTree', 10)], [ function (next) { loadExtraneous(self.idealTree, self.progress.loadAllDepsIntoIdealTree, next) } ], [this, this.finishTracker, 'loadAllDepsIntoIdealTree'], [this, function (next) { recalculateMetadata(this.idealTree, log, next) }] ], cb) } Deduper.prototype.generateActionsToTake = function (cb) { validate('F', arguments) log.silly('dedupe', 'generateActionsToTake') chain([ [this.newTracker(log, 'hoist', 1)], [hoistChildren, this.idealTree, this.differences], [this, this.finishTracker, 'hoist'], [this.newTracker(log, 'sort-actions', 1)], [this, function (next) { this.differences = sortActions(this.differences) next() }], [this, this.finishTracker, 'sort-actions'], [filterInvalidActions, this.where, this.differences], [checkPermissions, this.differences], [decomposeActions, this.differences, this.todo] ], cb) } function move (node, hoistTo, diff) { node.parent.children = without(node.parent.children, node) hoistTo.children.push(node) node.fromPath = node.path node.path = childPath(hoistTo.path, node) node.parent = hoistTo if (!diff.filter(function (action) { return action[0] === 'move' && action[1] === node }).length) { diff.push(['move', node]) } } function moveRemainingChildren (node, diff) { node.children.forEach(function (child) { move(child, node, diff) moveRemainingChildren(child, diff) }) } function remove (child, diff, done) { remove_(child, diff, {}, done) } function remove_ (child, diff, seen, done) { if (seen[child.path]) return done() seen[child.path] = true diff.push(['remove', child]) child.parent.children = without(child.parent.children, child) asyncMap(child.children, function (child, next) { remove_(child, diff, seen, next) }, done) } function hoistChildren (tree, diff, next) { hoistChildren_(tree, diff, {}, next) } function hoistChildren_ (tree, diff, seen, next) { validate('OAOF', arguments) if (seen[tree.path]) return next() seen[tree.path] = true asyncMap(tree.children, function (child, done) { if (!tree.parent) return hoistChildren_(child, diff, seen, done) var better = findRequirement(tree.parent, moduleName(child), child.package._requested || npa(packageId(child))) if (better) { return chain([ [remove, child, diff], [recalculateMetadata, tree, log] ], done) } var hoistTo = earliestInstallable(tree, tree.parent, child.package) if (hoistTo) { move(child, hoistTo, 
diff) chain([ [recalculateMetadata, hoistTo, log], [hoistChildren_, child, diff, seen], [ function (next) { moveRemainingChildren(child, diff) next() } ] ], done) } else { done() } }, next) } npm_3.5.2.orig/lib/deprecate.js0000644000000000000000000000241412631326456014533 0ustar 00000000000000var npm = require('./npm.js') var mapToRegistry = require('./utils/map-to-registry.js') var npa = require('npm-package-arg') module.exports = deprecate deprecate.usage = 'npm deprecate <pkg>[@<version>] <message>' deprecate.completion = function (opts, cb) { // first, get a list of remote packages this user owns. // once we have a user account, then don't complete anything. if (opts.conf.argv.remain.length > 2) return cb() // get the list of packages by user var path = '/-/by-user/' mapToRegistry(path, npm.config, function (er, uri, c) { if (er) return cb(er) if (!(c && c.username)) return cb() var params = { timeout: 60000, auth: c } npm.registry.get(uri + c.username, params, function (er, list) { if (er) return cb() console.error(list) return cb(null, list[c.username]) }) }) } function deprecate (args, cb) { var pkg = args[0] var msg = args[1] if (msg === undefined) return cb('Usage: ' + deprecate.usage) // fetch the data and make sure it exists. var p = npa(pkg) mapToRegistry(p.name, npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { version: p.spec, message: msg, auth: auth } npm.registry.deprecate(uri, params, cb) }) } npm_3.5.2.orig/lib/dist-tag.js0000644000000000000000000000715412631326456014311 0ustar 00000000000000module.exports = distTag var log = require('npmlog') var npa = require('npm-package-arg') var semver = require('semver') var npm = require('./npm.js') var mapToRegistry = require('./utils/map-to-registry.js') var readLocalPkg = require('./utils/read-local-package.js') distTag.usage = 'npm dist-tag add <pkg>@<version> [<tag>]' + '\nnpm dist-tag rm <pkg> <tag>' + '\nnpm dist-tag ls [<pkg>]' distTag.completion = function (opts, cb) { var argv = opts.conf.argv.remain if (argv.length === 2) { return cb(null, ['add', 'rm', 'ls']) } switch (argv[2]) { default: return cb() } } function distTag (args, cb) { var cmd = args.shift() switch (cmd) { case 'add': case 'a': case 'set': case 's': return add(args[0], args[1], cb) case 'rm': case 'r': case 'del': case 'd': case 'remove': return remove(args[1], args[0], cb) case 'ls': case 'l': case 'sl': case 'list': return list(args[0], cb) default: return cb('Usage:\n' + distTag.usage) } } function add (spec, tag, cb) { var thing = npa(spec || '') var pkg = thing.name var version = thing.rawSpec var t = (tag || npm.config.get('tag')).trim() log.verbose('dist-tag add', t, 'to', pkg + '@' + version) if (!pkg || !version || !t) return cb('Usage:\n' + distTag.usage) if (semver.validRange(t)) { var er = new Error('Tag name must not be a valid SemVer range: ' + t) return cb(er) } fetchTags(pkg, function (er, tags) { if (er) return cb(er) if (tags[t] === version) { log.warn('dist-tag add', t, 'is already set to version', version) return cb() } tags[t] = version mapToRegistry(pkg, npm.config, function (er, uri, auth, base) { var params = { 'package': pkg, distTag: t, version: version, auth: auth } npm.registry.distTags.add(base, params, function (er) { if (er) return cb(er) console.log('+' + t + ': ' + pkg + '@' + version) cb() }) }) }) } function remove (tag, pkg, cb) { log.verbose('dist-tag del', tag, 'from', pkg) fetchTags(pkg, function (er, tags) { if (er) return cb(er) if (!tags[tag]) { log.info('dist-tag del', tag, 'is not a dist-tag on', pkg) return cb(new Error(tag + ' is not a
dist-tag on ' + pkg)) } var version = tags[tag] delete tags[tag] mapToRegistry(pkg, npm.config, function (er, uri, auth, base) { var params = { 'package': pkg, distTag: tag, auth: auth } npm.registry.distTags.rm(base, params, function (er) { if (er) return cb(er) console.log('-' + tag + ': ' + pkg + '@' + version) cb() }) }) }) } function list (pkg, cb) { if (!pkg) { return readLocalPkg(function (er, pkg) { if (er) return cb(er) if (!pkg) return cb(distTag.usage) list(pkg, cb) }) } fetchTags(pkg, function (er, tags) { if (er) { log.error('dist-tag ls', "Couldn't get dist-tag data for", pkg) return cb(er) } var msg = Object.keys(tags).map(function (k) { return k + ': ' + tags[k] }).sort().join('\n') console.log(msg) cb(er, tags) }) } function fetchTags (pkg, cb) { mapToRegistry(pkg, npm.config, function (er, uri, auth, base) { if (er) return cb(er) var params = { 'package': pkg, auth: auth } npm.registry.distTags.fetch(base, params, function (er, tags) { if (er) return cb(er) if (!tags || !Object.keys(tags).length) { return cb(new Error('No dist-tags found for ' + pkg)) } cb(null, tags) }) }) } npm_3.5.2.orig/lib/docs.js0000644000000000000000000000176412631326456013536 0ustar 00000000000000module.exports = docs docs.usage = 'npm docs <pkg>' + '\nnpm docs .' var npm = require('./npm.js') var opener = require('opener') var log = require('npmlog') var fetchPackageMetadata = require('./fetch-package-metadata.js') docs.completion = function (opts, cb) { // FIXME: there used to be registry completion here, but it stopped making // sense somewhere around 50,000 packages on the registry cb() } function docs (args, cb) { if (!args || !args.length) args = ['.'] var pending = args.length log.silly('docs', args) args.forEach(function (proj) { getDoc(proj, function (err) { if (err) { return cb(err) } --pending || cb() }) }) } function getDoc (project, cb) { log.silly('getDoc', project) fetchPackageMetadata(project, '.', function (er, d) { if (er) return cb(er) var url = d.homepage if (!url) url = 'https://www.npmjs.org/package/' + d.name return opener(url, {command: npm.config.get('browser')}, cb) }) } npm_3.5.2.orig/lib/edit.js0000644000000000000000000000156612631326456013527 0ustar 00000000000000// npm edit <pkg>[@<version>] // open the package folder in the $EDITOR module.exports = edit edit.usage = 'npm edit <pkg>[@<version>]' edit.completion = require('./utils/completion/installed-shallow.js') var npm = require('./npm.js') var path = require('path') var fs = require('graceful-fs') var editor = require('editor') function edit (args, cb) { var p = args[0] if (args.length !== 1 || !p) return cb(edit.usage) var e = npm.config.get('editor') if (!e) { return cb(new Error( "No editor set. Set the 'editor' config, or $EDITOR environ." )) } p = p.split('/') .join('/node_modules/') .replace(/(\/node_modules)+/, '/node_modules') var f = path.resolve(npm.dir, p) fs.lstat(f, function (er) { if (er) return cb(er) editor(f, { editor: e }, function (er) { if (er) return cb(er) npm.commands.rebuild(args, cb) }) }) } npm_3.5.2.orig/lib/explore.js0000644000000000000000000000214712631326456014260 0ustar 00000000000000// npm explore <pkg>[@<version>] // open a subshell to the package folder.
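// For example: `npm explore connect -- ls` runs one command inside
// connect's install folder, while plain `npm explore connect` (no -- and
// no command) opens an interactive shell there instead.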
module.exports = explore explore.usage = 'npm explore [ -- ]' explore.completion = require('./utils/completion/installed-shallow.js') var npm = require('./npm.js') var spawn = require('./utils/spawn') var path = require('path') var fs = require('graceful-fs') function explore (args, cb) { if (args.length < 1 || !args[0]) return cb(explore.usage) var p = args.shift() args = args.join(' ').trim() if (args) args = ['-c', args] else args = [] var cwd = path.resolve(npm.dir, p) var sh = npm.config.get('shell') fs.stat(cwd, function (er, s) { if (er || !s.isDirectory()) { return cb(new Error( "It doesn't look like " + p + ' is installed.' )) } if (!args.length) { console.log( '\nExploring ' + cwd + '\n' + "Type 'exit' or ^D when finished\n" ) } var shell = spawn(sh, args, { cwd: cwd, stdio: 'inherit' }) shell.on('close', function (er) { // only fail if non-interactive. if (!args.length) return cb() cb(er) }) }) } npm_3.5.2.orig/lib/faq.js0000644000000000000000000000020712631326456013344 0ustar 00000000000000module.exports = faq faq.usage = 'npm faq' var npm = require('./npm.js') function faq (args, cb) { npm.commands.help(['faq'], cb) } npm_3.5.2.orig/lib/fetch-package-metadata.js0000644000000000000000000002473512631326456017051 0ustar 00000000000000'use strict' var fs = require('graceful-fs') var path = require('path') var zlib = require('zlib') var log = require('npmlog') var realizePackageSpecifier = require('realize-package-specifier') var tar = require('tar') var once = require('once') var semver = require('semver') var readPackageTree = require('read-package-tree') var readPackageJson = require('read-package-json') var iferr = require('iferr') var rimraf = require('rimraf') var clone = require('lodash.clonedeep') var validate = require('aproba') var unpipe = require('unpipe') var normalizePackageData = require('normalize-package-data') var npm = require('./npm.js') var mapToRegistry = require('./utils/map-to-registry.js') var cache = require('./cache.js') var cachedPackageRoot = require('./cache/cached-package-root.js') var tempFilename = require('./utils/temp-filename.js') var getCacheStat = require('./cache/get-stat.js') var unpack = require('./utils/tar.js').unpack var pulseTillDone = require('./utils/pulse-till-done.js') var parseJSON = require('./utils/parse-json.js') function andLogAndFinish (spec, tracker, done) { validate('SF', [spec, done]) return function (er, pkg) { if (er) { log.silly('fetchPackageMetaData', 'error for ' + spec, er) if (tracker) tracker.finish() } return done(er, pkg) } } module.exports = function fetchPackageMetadata (spec, where, tracker, done) { if (!done) { done = tracker || where tracker = null if (done === where) where = null } if (typeof spec === 'object') { var dep = spec spec = dep.raw } var logAndFinish = andLogAndFinish(spec, tracker, done) if (!dep) { log.silly('fetchPackageMetaData', spec) return realizePackageSpecifier(spec, where, iferr(logAndFinish, function (dep) { fetchPackageMetadata(dep, where, tracker, done) })) } if (dep.type === 'version' || dep.type === 'range' || dep.type === 'tag') { fetchNamedPackageData(dep, addRequestedAndFinish) } else if (dep.type === 'directory') { fetchDirectoryPackageData(dep, where, addRequestedAndFinish) } else { fetchOtherPackageData(spec, dep, where, addRequestedAndFinish) } function addRequestedAndFinish (er, pkg) { if (pkg) { pkg._requested = dep pkg._spec = spec pkg._where = where if (!pkg._args) pkg._args = [] pkg._args.push([pkg._spec, pkg._where]) // non-npm registries can and will return unnormalized 
data, plus // even the npm registry may have package data normalized with older // normalization rules. This ensures we get package data in a consistent, // stable format. try { normalizePackageData(pkg) } catch (ex) { // don't care } } logAndFinish(er, pkg) } } function fetchOtherPackageData (spec, dep, where, next) { validate('SOSF', arguments) log.silly('fetchOtherPackageData', spec) cache.add(spec, null, where, false, iferr(next, function (pkg) { var result = clone(pkg) result._inCache = true next(null, result) })) } function fetchDirectoryPackageData (dep, where, next) { validate('OSF', arguments) log.silly('fetchDirectoryPackageData', dep.name || dep.rawSpec) readPackageJson(path.join(dep.spec, 'package.json'), false, next) } var regCache = {} function fetchNamedPackageData (dep, next) { validate('OF', arguments) log.silly('fetchNamedPackageData', dep.name || dep.rawSpec) mapToRegistry(dep.name || dep.rawSpec, npm.config, iferr(next, function (url, auth) { if (regCache[url]) { pickVersionFromRegistryDocument(clone(regCache[url])) } else { npm.registry.get(url, {auth: auth}, pulseTillDone('fetchMetadata', iferr(next, pickVersionFromRegistryDocument))) } function returnAndAddMetadata (pkg) { delete pkg._from delete pkg._resolved delete pkg._shasum next(null, pkg) } function pickVersionFromRegistryDocument (pkg) { if (!regCache[url]) regCache[url] = pkg var versions = Object.keys(pkg.versions).sort(semver.rcompare) if (dep.type === 'tag') { var tagVersion = pkg['dist-tags'][dep.spec] if (pkg.versions[tagVersion]) return returnAndAddMetadata(pkg.versions[tagVersion]) } else { var latestVersion = pkg['dist-tags'][npm.config.get('tag')] || versions[0] // Find the the most recent version less than or equal // to latestVersion that satisfies our spec for (var ii = 0; ii < versions.length; ++ii) { if (semver.gt(versions[ii], latestVersion)) continue if (semver.satisfies(versions[ii], dep.spec)) { return returnAndAddMetadata(pkg.versions[versions[ii]]) } } // Failing that, try finding the most recent version that matches // our spec for (var jj = 0; jj < versions.length; ++jj) { if (semver.satisfies(versions[jj], dep.spec)) { return returnAndAddMetadata(pkg.versions[versions[jj]]) } } // Failing THAT, if the range was '*' uses latestVersion if (dep.spec === '*') { return returnAndAddMetadata(pkg.versions[latestVersion]) } } // And failing that, we error out var targets = versions.length ? 'Valid install targets:\n' + versions.join(', ') + '\n' : 'No valid targets found.' 
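// Worked example for the selection above (hypothetical registry data):
// given versions ['2.0.0', '1.4.1', '1.4.0'], a 'latest' dist-tag of
// 1.4.1, and a spec of '^1.0.0', the first loop skips 2.0.0 (newer than
// latest) and returns 1.4.1; the second loop only comes into play when
// nothing at or below latest satisfies the spec.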
var er = new Error('No compatible version found: ' + dep.raw + '\n' + targets) return next(er) } })) } function retryWithCached (pkg, asserter, next) { if (!pkg._inCache) { cache.add(pkg._spec, null, pkg._where, false, iferr(next, function (newpkg) { Object.keys(newpkg).forEach(function (key) { if (key[0] !== '_') return pkg[key] = newpkg[key] }) pkg._inCache = true return asserter(pkg, next) })) } return !pkg._inCache } module.exports.addShrinkwrap = function addShrinkwrap (pkg, next) { validate('OF', arguments) if (pkg._shrinkwrap !== undefined) return next(null, pkg) if (retryWithCached(pkg, addShrinkwrap, next)) return pkg._shrinkwrap = null // FIXME: cache the shrinkwrap directly var pkgname = pkg.name var ver = pkg.version var tarball = path.join(cachedPackageRoot({name: pkgname, version: ver}), 'package.tgz') untarStream(tarball, function (er, untar) { if (er) { if (er.code === 'ENOTTARBALL') { pkg._shrinkwrap = null return next() } else { return next(er) } } if (er) return next(er) var foundShrinkwrap = false untar.on('entry', function (entry) { if (!/^(?:[^\/]+[\/])npm-shrinkwrap.json$/.test(entry.path)) return log.silly('addShrinkwrap', 'Found shrinkwrap in ' + pkgname + ' ' + entry.path) foundShrinkwrap = true var shrinkwrap = '' entry.on('data', function (chunk) { shrinkwrap += chunk }) entry.on('end', function () { untar.close() log.silly('addShrinkwrap', 'Completed reading shrinkwrap in ' + pkgname) try { pkg._shrinkwrap = parseJSON(shrinkwrap) } catch (ex) { var er = new Error('Error parsing ' + pkgname + '@' + ver + "'s npm-shrinkwrap.json: " + ex.message) er.type = 'ESHRINKWRAP' return next(er) } next(null, pkg) }) entry.resume() }) untar.on('end', function () { if (!foundShrinkwrap) { pkg._shrinkwrap = null next(null, pkg) } }) }) } module.exports.addBundled = function addBundled (pkg, next) { validate('OF', arguments) if (pkg._bundled !== undefined) return next(null, pkg) if (!pkg.bundleDependencies) return next(null, pkg) if (retryWithCached(pkg, addBundled, next)) return pkg._bundled = null var pkgname = pkg.name var ver = pkg.version var tarball = path.join(cachedPackageRoot({name: pkgname, version: ver}), 'package.tgz') var target = tempFilename('unpack') getCacheStat(iferr(next, function (cs) { log.verbose('addBundled', 'extract', tarball) unpack(tarball, target, null, null, cs.uid, cs.gid, iferr(next, function () { log.silly('addBundled', 'read tarball') readPackageTree(target, function (er, tree) { log.silly('cleanup', 'remove extracted module') rimraf(target, function () { if (tree) { pkg._bundled = tree.children } next(null, pkg) }) }) })) })) } // FIXME: hasGzipHeader / hasTarHeader / untarStream duplicate a lot // of code from lib/utils/tar.js– these should be brought together. 
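// Magic-number reference for the sniffers below: gzip streams begin with
// the bytes 0x1F 0x8B 0x08, and POSIX ustar archives carry the string
// 'ustar' (0x75 0x73 0x74 0x61 0x72) at byte offset 257. So, for example,
//   hasGzipHeader(new Buffer([0x1F, 0x8B, 0x08])) // => true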
function hasGzipHeader (c) { return c[0] === 0x1F && c[1] === 0x8B && c[2] === 0x08 } function hasTarHeader (c) { return c[257] === 0x75 && // tar archives have 7573746172 at position c[258] === 0x73 && // 257 and 003030 or 202000 at position 262 c[259] === 0x74 && c[260] === 0x61 && c[261] === 0x72 && ((c[262] === 0x00 && c[263] === 0x30 && c[264] === 0x30) || (c[262] === 0x20 && c[263] === 0x20 && c[264] === 0x00)) } function untarStream (tarball, cb) { validate('SF', arguments) cb = once(cb) var stream var file = stream = fs.createReadStream(tarball) var tounpipe = [file] file.on('error', function (er) { er = new Error('Error extracting ' + tarball + ' archive: ' + er.message) er.code = 'EREADFILE' cb(er) }) file.on('data', function OD (c) { if (hasGzipHeader(c)) { doGunzip() } else if (hasTarHeader(c)) { doUntar() } else { if (file.close) file.close() if (file.destroy) file.destroy() var er = new Error('Non-gzip/tarball ' + tarball) er.code = 'ENOTTARBALL' return cb(er) } file.removeListener('data', OD) file.emit('data', c) cb(null, stream) }) function doGunzip () { var gunzip = stream.pipe(zlib.createGunzip()) gunzip.on('error', function (er) { er = new Error('Error extracting ' + tarball + ' archive: ' + er.message) er.code = 'EGUNZIP' cb(er) }) tounpipe.push(gunzip) stream = gunzip doUntar() } function doUntar () { var untar = stream.pipe(tar.Parse()) untar.on('error', function (er) { er = new Error('Error extracting ' + tarball + ' archive: ' + er.message) er.code = 'EUNTAR' cb(er) }) tounpipe.push(untar) stream = untar addClose() } function addClose () { stream.close = function () { tounpipe.forEach(function (stream) { unpipe(stream) }) if (file.close) file.close() if (file.destroy) file.destroy() } } } npm_3.5.2.orig/lib/fetch-package-metadata.md0000644000000000000000000000341712631326456017027 0ustar 00000000000000fetch-package-metadata ---------------------- var fetchPackageMetadata = require("npm/lib/fetch-package-metadata") fetchPackageMetadata(spec, contextdir, callback) This will get package metadata (and if possible, ONLY package metadata) for a specifier as passed to `npm install` et al, eg `npm@next` or `npm@^2.0.3` ## fetchPackageMetadata(*spec*, *contextdir*, *tracker*, *callback*) * *spec* **string** | **object** -- The package specifier, can be anything npm can understand (see [realize-package-specifier]), or it can be the result from realize-package-specifier or npm-package-arg (for non-local deps). * *contextdir* **string** -- The directory from which relative paths to local packages should be resolved. * *tracker* **object** -- **(optional)** An are-we-there-yet tracker group as provided by `npm.log.newGroup()`. * *callback* **function (er, package)** -- Called when the package information has been loaded. `package` is the object form of the `package.json` matching the requested spec. In the case of named packages, it comes from the registry and thus may not exactly match what's found in the associated tarball. [realize-package-specifier]: https://github.com/npm/realize-package-specifier In the case of tarballs and git repos, it will use the cache to download them in order to get the package metadata. For named packages, only the metadata is downloaded (eg http://registry.npmjs.org/package). For local directories, the package.json is read directly. For local tarballs, the tarball is streamed in memory and just the package.json is extracted from it.
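For example (the package name and range here are illustrative, not taken from npm's own dependencies):

    var fetchPackageMetadata = require("npm/lib/fetch-package-metadata")
    fetchPackageMetadata("abbrev@^1.0.0", process.cwd(), function (er, pkg) {
      if (er) return console.error(er)
      // pkg is the registry's package.json object for the matched version
      console.log(pkg.name + "@" + pkg.version)
    })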
(Due to the nature of tars, having the package.json early in the file will result in it being loaded faster – the extractor short-circuits the uncompress/untar streams as best as it can.) npm_3.5.2.orig/lib/get.js0000644000000000000000000000035312631326456013356 0ustar 00000000000000 module.exports = get get.usage = 'npm get <key> <value> (See `npm config`)' var npm = require('./npm.js') get.completion = npm.commands.config.completion function get (args, cb) { npm.commands.config(['get'].concat(args), cb) } npm_3.5.2.orig/lib/help-search.js0000644000000000000000000001317212631326456014775 0ustar 00000000000000 module.exports = helpSearch var fs = require('graceful-fs') var path = require('path') var asyncMap = require('slide').asyncMap var npm = require('./npm.js') var glob = require('glob') var color = require('ansicolors') helpSearch.usage = 'npm help-search <text>' function helpSearch (args, silent, cb) { if (typeof cb !== 'function') { cb = silent silent = false } if (!args.length) return cb(helpSearch.usage) var docPath = path.resolve(__dirname, '..', 'doc') return glob(docPath + '/*/*.md', function (er, files) { if (er) return cb(er) readFiles(files, function (er, data) { if (er) return cb(er) searchFiles(args, data, function (er, results) { if (er) return cb(er) formatResults(args, results, cb) }) }) }) } function readFiles (files, cb) { var res = {} asyncMap(files, function (file, cb) { fs.readFile(file, 'utf8', function (er, data) { res[file] = data return cb(er) }) }, function (er) { return cb(er, res) }) } function searchFiles (args, files, cb) { var results = [] Object.keys(files).forEach(function (file) { var data = files[file] // skip if no matches at all var match for (var a = 0, l = args.length; a < l && !match; a++) { match = data.toLowerCase().indexOf(args[a].toLowerCase()) !== -1 } if (!match) return var lines = data.split(/\n+/) // if a line has a search term, then skip it and the next line. // if the next line has a search term, then skip all 3 // otherwise, set the line to null. then remove the nulls. l = lines.length for (var i = 0; i < l; i++) { var line = lines[i] var nextLine = lines[i + 1] var ll match = false if (nextLine) { for (a = 0, ll = args.length; a < ll && !match; a++) { match = nextLine.toLowerCase() .indexOf(args[a].toLowerCase()) !== -1 } if (match) { // skip over the next line, and the line after it. i += 2 continue } } match = false for (a = 0, ll = args.length; a < ll && !match; a++) { match = line.toLowerCase().indexOf(args[a].toLowerCase()) !== -1 } if (match) { // skip over the next line i++ continue } lines[i] = null } // now squish any string of nulls into a single null lines = lines.reduce(function (l, r) { if (!(r === null && l[l.length - 1] === null)) l.push(r) return l }, []) if (lines[lines.length - 1] === null) lines.pop() if (lines[0] === null) lines.shift() // now see how many args were found at all. var found = {} var totalHits = 0 lines.forEach(function (line) { args.forEach(function (arg) { var hit = (line || '').toLowerCase() .split(arg.toLowerCase()).length - 1 if (hit > 0) { found[arg] = (found[arg] || 0) + hit totalHits += hit } }) }) var cmd = 'npm help ' if (path.basename(path.dirname(file)) === 'api') { cmd = 'npm apihelp ' } cmd += path.basename(file, '.md').replace(/^npm-/, '') results.push({ file: file, cmd: cmd, lines: lines, found: Object.keys(found), hits: found, totalHits: totalHits }) }) // if only one result, then just show that help section.
if (results.length === 1) { return npm.commands.help([results[0].file.replace(/\.md$/, '')], cb) } if (results.length === 0) { console.log('No results for ' + args.map(JSON.stringify).join(' ')) return cb() } // sort results by number of results found, then by number of hits // then by number of matching lines results = results.sort(function (a, b) { return a.found.length > b.found.length ? -1 : a.found.length < b.found.length ? 1 : a.totalHits > b.totalHits ? -1 : a.totalHits < b.totalHits ? 1 : a.lines.length > b.lines.length ? -1 : a.lines.length < b.lines.length ? 1 : 0 }) cb(null, results) } function formatResults (args, results, cb) { if (!results) return cb(null) var cols = Math.min(process.stdout.columns || Infinity, 80) + 1 var out = results.map(function (res) { var out = res.cmd var r = Object.keys(res.hits) .map(function (k) { return k + ':' + res.hits[k] }).sort(function (a, b) { return a > b ? 1 : -1 }).join(' ') out += ((new Array(Math.max(1, cols - out.length - r.length))) .join(' ')) + r if (!npm.config.get('long')) return out out = '\n\n' + out + '\n' + (new Array(cols)).join('—') + '\n' + res.lines.map(function (line, i) { if (line === null || i > 3) return '' for (var out = line, a = 0, l = args.length; a < l; a++) { var finder = out.toLowerCase().split(args[a].toLowerCase()) var newOut = '' var p = 0 finder.forEach(function (f) { newOut += out.substr(p, f.length) var hilit = out.substr(p + f.length, args[a].length) if (npm.color) hilit = color.bgBlack(color.red(hilit)) newOut += hilit p += f.length + args[a].length }) } return newOut }).join('\n').trim() return out }).join('\n') if (results.length && !npm.config.get('long')) { out = 'Top hits for ' + (args.map(JSON.stringify).join(' ')) + '\n' + (new Array(cols)).join('—') + '\n' + out + '\n' + (new Array(cols)).join('—') + '\n' + '(run with -l or --long to see more context)' } console.log(out.trim()) cb(null, results) } npm_3.5.2.orig/lib/help.js0000644000000000000000000001407612631326456013536 0ustar 00000000000000 module.exports = help help.completion = function (opts, cb) { if (opts.conf.argv.remain.length > 2) return cb(null, []) getSections(cb) } var path = require('path') var spawn = require('./utils/spawn') var npm = require('./npm.js') var log = require('npmlog') var opener = require('opener') var glob = require('glob') function help (args, cb) { var argv = npm.config.get('argv').cooked var argnum = 0 if (args.length === 2 && ~~args[0]) { argnum = ~~args.shift() } // npm help foo bar baz: search topics if (args.length > 1 && args[0]) { return npm.commands['help-search'](args, argnum, cb) } var section = npm.deref(args[0]) || args[0] // npm help : show basic usage if (!section) { var valid = argv[0] === 'help' ? 0 : 1 return npmUsage(valid, cb) } // npm -h: show command usage if (npm.config.get('usage') && npm.commands[section] && npm.commands[section].usage) { npm.config.set('loglevel', 'silent') log.level = 'silent' console.log(npm.commands[section].usage) return cb() } // npm apihelp
    : Prefer section 3 over section 1 var apihelp = argv.length && argv[0].indexOf('api') !== -1 var pref = apihelp ? [3, 1, 5, 7] : [1, 3, 5, 7] if (argnum) { pref = [ argnum ].concat(pref.filter(function (n) { return n !== argnum })) } // npm help
    : Try to find the path var manroot = path.resolve(__dirname, '..', 'man') // legacy if (section === 'global') section = 'folders' else if (section === 'json') section = 'package.json' // find either /section.n or /npm-section.n // The glob is used in the glob. The regexp is used much // further down. Globs and regexps are different var compextglob = '.+(gz|bz2|lzma|[FYzZ]|xz)' var compextre = '\\.(gz|bz2|lzma|[FYzZ]|xz)$' var f = '+(npm-' + section + '|' + section + ').[0-9]?(' + compextglob + ')' return glob(manroot + '/*/' + f, function (er, mans) { if (er) return cb(er) if (!mans.length) return npm.commands['help-search'](args, cb) mans = mans.map(function (man) { var ext = path.extname(man) if (man.match(new RegExp(compextre))) man = path.basename(man, ext) return man }) viewMan(pickMan(mans, pref), cb) }) } function pickMan (mans, pref_) { var nre = /([0-9]+)$/ var pref = {} pref_.forEach(function (sect, i) { pref[sect] = i }) mans = mans.sort(function (a, b) { var an = a.match(nre)[1] var bn = b.match(nre)[1] return an === bn ? (a > b ? -1 : 1) : pref[an] < pref[bn] ? -1 : 1 }) return mans[0] } function viewMan (man, cb) { var nre = /([0-9]+)$/ var num = man.match(nre)[1] var section = path.basename(man, '.' + num) // at this point, we know that the specified man page exists var manpath = path.join(__dirname, '..', 'man') var env = {} Object.keys(process.env).forEach(function (i) { env[i] = process.env[i] }) env.MANPATH = manpath var viewer = npm.config.get('viewer') var conf switch (viewer) { case 'woman': var a = ['-e', '(woman-find-file \'' + man + '\')'] conf = { env: env, stdio: 'inherit' } var woman = spawn('emacsclient', a, conf) woman.on('close', cb) break case 'browser': opener(htmlMan(man), { command: npm.config.get('browser') }, cb) break default: conf = { env: env, stdio: 'inherit' } var manProcess = spawn('man', [num, section], conf) manProcess.on('close', cb) break } } function htmlMan (man) { var sect = +man.match(/([0-9]+)$/)[1] var f = path.basename(man).replace(/([0-9]+)$/, 'html') switch (sect) { case 1: sect = 'cli' break case 3: sect = 'api' break case 5: sect = 'files' break case 7: sect = 'misc' break default: throw new Error('invalid man section: ' + sect) } return path.resolve(__dirname, '..', 'html', 'doc', sect, f) } function npmUsage (valid, cb) { npm.config.set('loglevel', 'silent') log.level = 'silent' console.log([ '\nUsage: npm ', '', 'where is one of:', npm.config.get('long') ? 
usages() : ' ' + wrap(Object.keys(npm.commands)), '', 'npm <cmd> -h quick help on <cmd>', 'npm -l display full usage info', 'npm faq commonly asked questions', 'npm help <term> search for help on <term>', 'npm help npm involved overview', '', 'Specify configs in the ini-formatted file:', ' ' + npm.config.get('userconfig'), 'or on the command line via: npm <command> --key value', 'Config info can be viewed via: npm help config', '', 'npm@' + npm.version + ' ' + path.dirname(__dirname) ].join('\n')) cb(valid) } function usages () { // return a string of <cmd>: <usage> var maxLen = 0 return Object.keys(npm.commands).filter(function (c) { return c === npm.deref(c) }).reduce(function (set, c) { set.push([c, npm.commands[c].usage || '']) maxLen = Math.max(maxLen, c.length) return set }, []).map(function (item) { var c = item[0] var usage = item[1] return '\n ' + c + (new Array(maxLen - c.length + 2).join(' ')) + (usage.split('\n').join('\n' + (new Array(maxLen + 6).join(' ')))) }).join('\n') } function wrap (arr) { var out = [''] var l = 0 var line line = process.stdout.columns if (!line) { line = 60 } else { line = Math.min(60, Math.max(line - 16, 24)) } arr.sort(function (a, b) { return a < b ? -1 : 1 }) .forEach(function (c) { if (out[l].length + c.length + 2 < line) { out[l] += ', ' + c } else { out[l++] += ',' out[l] = c } }) return out.join('\n ').substr(2) } function getSections (cb) { var g = path.resolve(__dirname, '../man/man[0-9]/*.[0-9]') glob(g, function (er, files) { if (er) return cb(er) cb(null, Object.keys(files.reduce(function (acc, file) { file = path.basename(file).replace(/\.[0-9]+$/, '') file = file.replace(/^npm-/, '') acc[file] = true return acc }, { help: true }))) }) } npm_3.5.2.orig/lib/init.js0000644000000000000000000000221212631326456013546 0ustar 00000000000000// initialize a package.json file module.exports = init var log = require('npmlog') var npm = require('./npm.js') var initJson = require('init-package-json') init.usage = 'npm init [--force|-f|--yes|-y]' function init (args, cb) { var dir = process.cwd() log.pause() var initFile = npm.config.get('init-module') if (!initJson.yes(npm.config)) { console.log([ 'This utility will walk you through creating a package.json file.', 'It only covers the most common items, and tries to guess sensible defaults.', '', 'See `npm help json` for definitive documentation on these fields', 'and exactly what they do.', '', 'Use `npm install --save` afterwards to install a package and', 'save it as a dependency in the package.json file.', '', 'Press ^C at any time to quit.'
].join('\n')) } initJson(dir, initFile, npm.config, function (er, data) { log.resume() log.silly('package data', data) if (er && er.message === 'canceled') { log.warn('init', 'canceled') return cb(null, data) } log.info('init', 'written successfully') cb(er, data) }) } npm_3.5.2.orig/lib/install/0000755000000000000000000000000012631326456013706 5ustar 00000000000000npm_3.5.2.orig/lib/install-test.js0000644000000000000000000000076512631326456015221 0ustar 00000000000000'use strict' // npm install-test // Runs `npm install` and then runs `npm test` module.exports = installTest var install = require('./install.js') var test = require('./test.js') installTest.usage = '\nnpm install-test [args]' + '\nSame args as `npm install`' + '\n\nalias: npm it' installTest.completion = install.completion function installTest (args, cb) { install(args, function (er) { if (er) { return cb(er) } test([], cb) }) } npm_3.5.2.orig/lib/install.js0000644000000000000000000006076012631326456014253 0ustar 00000000000000'use strict' // npm install <pkg> <pkg> <pkg> // // See doc/install.md for more description // Managing contexts... // there's a lot of state associated with an "install" operation, including // packages that are already installed, parent packages, current shrinkwrap, and // so on. We maintain this state in a "context" object that gets passed around. // every time we dive into a deeper node_modules folder, the "family" list that // gets passed along uses the previous "family" list as its __proto__. Any // "resolved precise dependency" things that aren't already on this object get // added, and then that's passed to the next generation of installation. module.exports = install module.exports.Installer = Installer install.usage = '\nnpm install (with no args, in package dir)' + '\nnpm install [<@scope>/]<pkg>' + '\nnpm install [<@scope>/]<pkg>@<tag>' + '\nnpm install [<@scope>/]<pkg>@<version>' + '\nnpm install [<@scope>/]<pkg>@<version range>' + '\nnpm install <folder>' + '\nnpm install <tarball file>' + '\nnpm install <tarball url>' + '\nnpm install <git:// url>' + '\nnpm install <github username>/<github project>' + '\n\nalias: npm i' + '\ncommon options: [--save|--save-dev|--save-optional] [--save-exact]' install.completion = function (opts, cb) { validate('OF', arguments) // install can complete to a folder with a package.json, or any package. // if it has a slash, then it's gotta be a folder // if it starts with https?://, then just give up, because it's a url if (/^https?:\/\//.test(opts.partialWord)) { // do not complete to URLs return cb(null, []) } if (/\//.test(opts.partialWord)) { // Complete fully to folder if there is exactly one match and it // is a folder containing a package.json file. If that is not the // case we return 0 matches, which will trigger the default bash // complete.
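// For example (hypothetical paths): completing './pack' offers
// './packages' only if it is the single sibling directory matching that
// prefix and it contains a package.json; in every other case we return no
// matches so bash falls back to its default filename completion.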
var lastSlashIdx = opts.partialWord.lastIndexOf('/') var partialName = opts.partialWord.slice(lastSlashIdx + 1) var partialPath = opts.partialWord.slice(0, lastSlashIdx) if (partialPath === '') partialPath = '/' var annotatePackageDirMatch = function (sibling, cb) { var fullPath = path.join(partialPath, sibling) if (sibling.slice(0, partialName.length) !== partialName) { return cb(null, null) // not name match } fs.readdir(fullPath, function (err, contents) { if (err) return cb(null, { isPackage: false }) cb( null, { fullPath: fullPath, isPackage: contents.indexOf('package.json') !== -1 } ) }) } return fs.readdir(partialPath, function (err, siblings) { if (err) return cb(null, []) // invalid dir: no matching asyncMap(siblings, annotatePackageDirMatch, function (err, matches) { if (err) return cb(err) var cleaned = matches.filter(function (x) { return x !== null }) if (cleaned.length !== 1) return cb(null, []) if (!cleaned[0].isPackage) return cb(null, []) // Success - only one match and it is a package dir return cb(null, [cleaned[0].fullPath]) }) }) } // FIXME: there used to be registry completion here, but it stopped making // sense somewhere around 50,000 packages on the registry cb() } // system packages var fs = require('fs') var path = require('path') // dependencies var log = require('npmlog') var readPackageTree = require('read-package-tree') var chain = require('slide').chain var asyncMap = require('slide').asyncMap var archy = require('archy') var mkdirp = require('mkdirp') var rimraf = require('rimraf') var iferr = require('iferr') var validate = require('aproba') // npm internal utils var npm = require('./npm.js') var locker = require('./utils/locker.js') var lock = locker.lock var unlock = locker.unlock var ls = require('./ls.js') var parseJSON = require('./utils/parse-json.js') // install specific libraries var copyTree = require('./install/copy-tree.js') var readShrinkwrap = require('./install/read-shrinkwrap.js') var recalculateMetadata = require('./install/deps.js').recalculateMetadata var loadDeps = require('./install/deps.js').loadDeps var loadDevDeps = require('./install/deps.js').loadDevDeps var getAllMetadata = require('./install/deps.js').getAllMetadata var loadRequestedDeps = require('./install/deps.js').loadRequestedDeps var loadExtraneous = require('./install/deps.js').loadExtraneous var pruneTree = require('./install/prune-tree.js') var diffTrees = require('./install/diff-trees.js') var checkPermissions = require('./install/check-permissions.js') var decomposeActions = require('./install/decompose-actions.js') var filterInvalidActions = require('./install/filter-invalid-actions.js') var validateTree = require('./install/validate-tree.js') var validateArgs = require('./install/validate-args.js') var saveRequested = require('./install/save.js').saveRequested var getSaveType = require('./install/save.js').getSaveType var doSerialActions = require('./install/actions.js').doSerial var doReverseSerialActions = require('./install/actions.js').doReverseSerial var doParallelActions = require('./install/actions.js').doParallel var doOneAction = require('./install/actions.js').doOne var packageId = require('./utils/package-id.js') var moduleName = require('./utils/module-name.js') var errorMessage = require('./utils/error-message.js') function unlockCB (lockPath, name, cb) { validate('SSF', arguments) return function (installEr) { var args = arguments try { unlock(lockPath, name, reportErrorAndReturn) } catch (unlockEx) { process.nextTick(function () { 
reportErrorAndReturn(unlockEx) }) } function reportErrorAndReturn (unlockEr) { if (installEr) { if (unlockEr && unlockEr.code !== 'ENOTLOCKED') { log.warn('unlock' + name, unlockEr) } return cb.apply(null, args) } if (unlockEr) return cb(unlockEr) return cb.apply(null, args) } } } function install (where, args, cb) { if (!cb) { cb = args args = where where = null } var globalTop = path.resolve(npm.globalDir, '..') if (!where) { where = npm.config.get('global') ? globalTop : npm.prefix } validate('SAF', [where, args, cb]) // the /path/to/node_modules/.. var dryrun = !!npm.config.get('dry-run') if (npm.config.get('dev')) { log.warn('install', 'Usage of the `--dev` option is deprecated. Use `--only=dev` instead.') } if (where === globalTop && !args.length) { args = ['.'] } args = args.filter(function (a) { return path.resolve(a) !== npm.prefix }) new Installer(where, dryrun, args).run(cb) } function Installer (where, dryrun, args) { validate('SBA', arguments) this.where = where this.dryrun = dryrun this.args = args this.currentTree = null this.idealTree = null this.differences = [] this.todo = [] this.progress = {} this.noPackageJsonOk = !!args.length this.topLevelLifecycles = !args.length this.npat = npm.config.get('npat') this.dev = npm.config.get('dev') || (!/^prod(uction)?$/.test(npm.config.get('only')) && !npm.config.get('production')) || /^dev(elopment)?$/.test(npm.config.get('only')) this.prod = !/^dev(elopment)?$/.test(npm.config.get('only')) this.rollback = npm.config.get('rollback') this.link = npm.config.get('link') this.global = this.where === path.resolve(npm.globalDir, '..') } Installer.prototype = {} Installer.prototype.run = function (cb) { validate('F', arguments) // FIXME: This is bad and I should feel bad. // lib/install needs to have some way of sharing _limited_ // state with the things it calls. Passing the object is too // much. The global config is WAY too much. =( =( // But not having this is gonna break linked modules in // subtle stupid ways, and refactoring all this code isn't // the right thing to do just yet. 
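  // What follows is a save/restore of the 'global' flag around the whole
  // run: the callback is wrapped so the previous value is always put back.
  // A sketch of the idiom (no additional behavior):
  //   var prev = npm.config.get('global')
  //   npm.config.set('global', true)
  //   // ... install steps ...
  //   npm.config.set('global', prev)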
if (this.global) { var prevGlobal = npm.config.get('global') npm.config.set('global', true) var next = cb cb = function () { npm.config.set('global', prevGlobal) next.apply(null, arguments) } } var installSteps = [] var postInstallSteps = [] installSteps.push( [this.newTracker(log, 'loadCurrentTree', 4)], [this, this.loadCurrentTree], [this, this.finishTracker, 'loadCurrentTree'], [this.newTracker(log, 'loadIdealTree', 12)], [this, this.loadIdealTree], [this, this.finishTracker, 'loadIdealTree'], [this, this.debugTree, 'currentTree', 'currentTree'], [this, this.debugTree, 'idealTree', 'idealTree'], [this.newTracker(log, 'generateActionsToTake')], [this, this.generateActionsToTake], [this, this.finishTracker, 'generateActionsToTake'], [this, this.debugActions, 'diffTrees', 'differences'], [this, this.debugActions, 'decomposeActions', 'todo']) if (!this.dryrun) { installSteps.push( [this.newTracker(log, 'executeActions', 8)], [this, this.executeActions], [this, this.finishTracker, 'executeActions']) var node_modules = path.resolve(this.where, 'node_modules') var staging = path.resolve(node_modules, '.staging') postInstallSteps.push( [this.newTracker(log, 'rollbackFailedOptional', 1)], [this, this.rollbackFailedOptional, staging, this.todo], [this, this.finishTracker, 'rollbackFailedOptional'], [this, this.commit, staging, this.todo], [this.newTracker(log, 'runTopLevelLifecycles', 2)], [this, this.runTopLevelLifecycles], [this, this.finishTracker, 'runTopLevelLifecycles']) if (getSaveType(this.args)) { postInstallSteps.push( [this, this.saveToDependencies]) } } postInstallSteps.push( [this, this.printInstalled]) var self = this chain(installSteps, function (installEr) { if (installEr) self.failing = true chain(postInstallSteps, function (postInstallEr) { if (self.idealTree) { self.idealTree.warnings.forEach(function (warning) { if (warning.code === 'EPACKAGEJSON' && self.global) return if (warning.code === 'ENOTDIR') return errorMessage(warning).summary.forEach(function (logline) { log.warn.apply(log, logline) }) }) } if (installEr && postInstallEr) { var msg = errorMessage(postInstallEr) msg.summary.forEach(function (logline) { log.warn.apply(log, logline) }) msg.detail.forEach(function (logline) { log.verbose.apply(log, logline) }) } cb(installEr || postInstallEr, self.getInstalledModules(), self.idealTree) }) }) } Installer.prototype.loadArgMetadata = function (next) { var self = this getAllMetadata(this.args, this.currentTree, iferr(next, function (args) { self.args = args next() })) } Installer.prototype.newTracker = function (tracker, name, size) { validate('OS', [tracker, name]) if (size) validate('N', [size]) this.progress[name] = tracker.newGroup(name, size) var self = this return function (next) { self.progress[name].silly(name, 'Starting') next() } } Installer.prototype.finishTracker = function (name, cb) { validate('SF', arguments) this.progress[name].silly(name, 'Finishing') this.progress[name].finish() cb() } Installer.prototype.loadCurrentTree = function (cb) { validate('F', arguments) log.silly('install', 'loadCurrentTree') var todo = [] if (this.global) { todo.push([this, this.readGlobalPackageData]) } else { todo.push([this, this.readLocalPackageData]) } todo.push( [this, this.normalizeTree, log.newGroup('normalizeTree')]) chain(todo, cb) } Installer.prototype.loadIdealTree = function (cb) { validate('F', arguments) log.silly('install', 'loadIdealTree') chain([ [this.newTracker(this.progress.loadIdealTree, 'cloneCurrentTree')], [this, this.cloneCurrentTreeToIdealTree], 
[this, this.finishTracker, 'cloneCurrentTree'], [this.newTracker(this.progress.loadIdealTree, 'loadShrinkwrap')], [this, this.loadShrinkwrap], [this, this.finishTracker, 'loadShrinkwrap'], [this.newTracker(this.progress.loadIdealTree, 'loadAllDepsIntoIdealTree', 10)], [this, this.loadAllDepsIntoIdealTree], [this, this.finishTracker, 'loadAllDepsIntoIdealTree'], [this, function (next) { recalculateMetadata(this.idealTree, log, next) }], [this, this.debugTree, 'idealTree:prePrune', 'idealTree'], [this, function (next) { next(pruneTree(this.idealTree)) }] ], cb) } Installer.prototype.loadAllDepsIntoIdealTree = function (cb) { validate('F', arguments) log.silly('install', 'loadAllDepsIntoIdealTree') var saveDeps = getSaveType(this.args) var cg = this.progress.loadAllDepsIntoIdealTree var installNewModules = !!this.args.length var steps = [] if (installNewModules) { steps.push([validateArgs, this.idealTree, this.args]) steps.push([loadRequestedDeps, this.args, this.idealTree, saveDeps, cg.newGroup('loadRequestedDeps')]) } else { if (this.prod) { steps.push( [loadDeps, this.idealTree, cg.newGroup('loadDeps')]) } if (this.dev) { steps.push( [loadDevDeps, this.idealTree, cg.newGroup('loadDevDeps')]) } } steps.push( [loadExtraneous.andResolveDeps, this.idealTree, cg.newGroup('loadExtraneous')]) chain(steps, cb) } Installer.prototype.generateActionsToTake = function (cb) { validate('F', arguments) log.silly('install', 'generateActionsToTake') var cg = this.progress.generateActionsToTake chain([ [validateTree, this.idealTree, cg.newGroup('validateTree')], [diffTrees, this.currentTree, this.idealTree, this.differences, cg.newGroup('diffTrees')], [this, this.computeLinked], [filterInvalidActions, this.where, this.differences], [checkPermissions, this.differences], [decomposeActions, this.differences, this.todo] ], cb) } Installer.prototype.computeLinked = function (cb) { validate('F', arguments) if (!this.link || this.global) return cb() var linkTodoList = [] var self = this asyncMap(this.differences, function (action, next) { var cmd = action[0] var pkg = action[1] if (cmd !== 'add' && cmd !== 'update') return next() var isReqByTop = pkg.package._requiredBy.filter(function (name) { return name === '/' }).length var isReqByUser = pkg.package._requiredBy.filter(function (name) { return name === '#USER' }).length var isExtraneous = pkg.package._requiredBy.length === 0 if (!isReqByTop && !isReqByUser && !isExtraneous) return next() isLinkable(pkg, function (install, link) { if (install) linkTodoList.push(['global-install', pkg]) if (link) linkTodoList.push(['global-link', pkg]) if (install || link) { pkg.parent.children = pkg.parent.children.filter(function (child) { return child !== pkg }) } next() }) }, function () { if (linkTodoList.length === 0) return cb() pruneTree(self.idealTree) self.differences.length = 0 Array.prototype.push.apply(self.differences, linkTodoList) diffTrees(self.currentTree, self.idealTree, self.differences, log.newGroup('d2'), cb) }) } function isLinkable (pkg, cb) { var globalPackage = path.resolve(npm.globalPrefix, 'lib', 'node_modules', moduleName(pkg)) var globalPackageJson = path.resolve(globalPackage, 'package.json') fs.stat(globalPackage, function (er) { if (er) return cb(true, true) fs.readFile(globalPackageJson, function (er, data) { var json = parseJSON.noExceptions(data) cb(false, json && json.version === pkg.package.version) }) }) } Installer.prototype.executeActions = function (cb) { validate('F', arguments) log.silly('install', 'executeActions') var todo = this.todo 
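  // The steps pushed below run the decomposed actions in phases, some
  // parallel and some serial: fetch -> extract -> preinstall -> remove ->
  // move -> finalize -> build -> global-link / update-linked -> install ->
  // postinstall, with node_modules/.staging locked for the duration.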
var cg = this.progress.executeActions var node_modules = path.resolve(this.where, 'node_modules') var staging = path.resolve(node_modules, '.staging') var steps = [] var trackLifecycle = cg.newGroup('lifecycle') cb = unlockCB(node_modules, '.staging', cb) steps.push( [doSerialActions, 'global-install', staging, todo, trackLifecycle.newGroup('global-install')], [doParallelActions, 'fetch', staging, todo, cg.newGroup('fetch', 10)], [lock, node_modules, '.staging'], [rimraf, staging], [mkdirp, staging], [doParallelActions, 'extract', staging, todo, cg.newGroup('extract', 10)], [doParallelActions, 'preinstall', staging, todo, trackLifecycle.newGroup('preinstall')], [doReverseSerialActions, 'remove', staging, todo, cg.newGroup('remove')], [doSerialActions, 'move', staging, todo, cg.newGroup('move')], [doSerialActions, 'finalize', staging, todo, cg.newGroup('finalize')], [doSerialActions, 'build', staging, todo, trackLifecycle.newGroup('build')], [doSerialActions, 'global-link', staging, todo, trackLifecycle.newGroup('global-link')], [doParallelActions, 'update-linked', staging, todo, trackLifecycle.newGroup('update-linked')], [doSerialActions, 'install', staging, todo, trackLifecycle.newGroup('install')], [doSerialActions, 'postinstall', staging, todo, trackLifecycle.newGroup('postinstall')]) if (this.npat) { steps.push( [doParallelActions, 'test', staging, todo, trackLifecycle.newGroup('npat')]) } var self = this chain(steps, function (er) { if (!er || self.rollback) { rimraf(staging, function () { cb(er) }) } else { cb(er) } }) } Installer.prototype.rollbackFailedOptional = function (staging, actionsToRun, cb) { if (!this.rollback) return cb() var failed = actionsToRun.map(function (action) { return action[1] }).filter(function (pkg) { return pkg.failed && pkg.rollback }) asyncMap(failed, function (pkg, next) { asyncMap(pkg.rollback, function (rollback, done) { rollback(staging, pkg, done) }, next) }, cb) } Installer.prototype.commit = function (staging, actionsToRun, cb) { var toCommit = actionsToRun.map(function (action) { return action[1] }).filter(function (pkg) { return !pkg.failed && pkg.commit }) asyncMap(toCommit, function (pkg, next) { asyncMap(pkg.commit, function (commit, done) { commit(staging, pkg, done) }, function () { pkg.commit = [] next.apply(null, arguments) }) }, cb) } Installer.prototype.runTopLevelLifecycles = function (cb) { validate('F', arguments) if (this.failing) return cb() log.silly('install', 'runTopLevelLifecycles') var steps = [] var trackLifecycle = this.progress.runTopLevelLifecycles if (!this.topLevelLifecycles) { trackLifecycle.finish() return cb() } steps.push( [doOneAction, 'preinstall', this.idealTree.path, this.idealTree, trackLifecycle.newGroup('preinstall:.')], [doOneAction, 'build', this.idealTree.path, this.idealTree, trackLifecycle.newGroup('build:.')], [doOneAction, 'install', this.idealTree.path, this.idealTree, trackLifecycle.newGroup('install:.')], [doOneAction, 'postinstall', this.idealTree.path, this.idealTree, trackLifecycle.newGroup('postinstall:.')]) if (this.npat) { steps.push( [doOneAction, 'test', this.idealTree.path, this.idealTree, trackLifecycle.newGroup('npat:.')]) } if (this.dev) { steps.push( [doOneAction, 'prepublish', this.idealTree.path, this.idealTree, trackLifecycle.newGroup('prepublish')]) } chain(steps, cb) } Installer.prototype.saveToDependencies = function (cb) { validate('F', arguments) if (this.failing) return cb() log.silly('install', 'saveToDependencies') saveRequested(this.args, this.idealTree, cb) } 
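// A minimal sketch of driving the Installer class exported above directly
// (the project path and package name here are hypothetical):
//
//   var Installer = require('./install.js').Installer
//   var installer = new Installer('/path/to/project', true /* dryrun */, ['left-pad'])
//   installer.run(function (er, installed, idealTree) {
//     if (er) return console.error(er)
//     // installed is a list of [packageId, installPath] pairs
//     installed.forEach(function (pair) {
//       console.log('%s -> %s', pair[0], pair[1])
//     })
//   })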
Installer.prototype.readGlobalPackageData = function (cb) { validate('F', arguments) log.silly('install', 'readGlobalPackageData') var self = this this.loadArgMetadata(iferr(cb, function () { mkdirp(self.where, iferr(cb, function () { var pkgs = {} self.args.forEach(function (pkg) { pkgs[pkg.name] = true }) readPackageTree(self.where, function (ctx, kid) { return ctx.parent || pkgs[kid] }, iferr(cb, function (currentTree) { self.currentTree = currentTree return cb() })) })) })) } Installer.prototype.readLocalPackageData = function (cb) { validate('F', arguments) log.silly('install', 'readLocalPackageData') var self = this mkdirp(this.where, iferr(cb, function () { readPackageTree(self.where, iferr(cb, function (currentTree) { self.currentTree = currentTree self.currentTree.warnings = [] if (!self.noPackageJsonOk && !currentTree.package) { log.error('install', "Couldn't read dependencies") var er = new Error("ENOENT, open '" + path.join(self.where, 'package.json') + "'") er.code = 'ENOPACKAGEJSON' er.errno = 34 return cb(er) } if (!currentTree.package) currentTree.package = {} self.loadArgMetadata(iferr(cb, function () { if (currentTree.package._shrinkwrap) return cb() fs.readFile(path.join(self.where, 'npm-shrinkwrap.json'), function (er, data) { if (er) return cb() try { currentTree.package._shrinkwrap = parseJSON(data) } catch (ex) { return cb(ex) } return cb() }) })) })) })) } Installer.prototype.cloneCurrentTreeToIdealTree = function (cb) { validate('F', arguments) log.silly('install', 'cloneCurrentTreeToIdealTree') this.idealTree = copyTree(this.currentTree) this.idealTree.warnings = [] cb() } Installer.prototype.loadShrinkwrap = function (cb) { validate('F', arguments) log.silly('install', 'loadShrinkwrap') var installNewModules = !!this.args.length if (installNewModules) { readShrinkwrap(this.idealTree, cb) } else { readShrinkwrap.andInflate(this.idealTree, cb) } } Installer.prototype.normalizeTree = function (log, cb) { validate('OF', arguments) log.silly('install', 'normalizeTree') recalculateMetadata(this.currentTree, log, iferr(cb, function (tree) { tree.children.forEach(function (child) { if (child.package._requiredBy.length === 0) { child.package._requiredBy.push('#EXISTING') } }) cb(null, tree) })) } Installer.prototype.getInstalledModules = function () { return this.differences.filter(function (action) { var mutation = action[0] return (mutation === 'add' || mutation === 'update') }).map(function (action) { var child = action[1] return [child.package._id, child.path] }) } Installer.prototype.printInstalled = function (cb) { validate('F', arguments) log.silly('install', 'printInstalled') var self = this log.clearProgress() this.differences.forEach(function (action) { var mutation = action[0] var child = action[1] var name = packageId(child) var where = path.relative(self.where, child.path) if (mutation === 'remove') { console.log('- ' + name + ' ' + where) } else if (mutation === 'move') { var oldWhere = path.relative(self.where, child.fromPath) console.log(name + ' ' + oldWhere + ' -> ' + where) } }) var addedOrMoved = this.differences.filter(function (action) { var mutation = action[0] var child = action[1] return !child.failed && (mutation === 'add' || mutation === 'update') }).map(function (action) { var child = action[1] return packageId(child) }) log.showProgress() if (!addedOrMoved.length) return cb() recalculateMetadata(this.idealTree, log, iferr(cb, function (tree) { log.clearProgress() ls.fromTree(self.where, tree, addedOrMoved, false, function () { 
log.showProgress() cb() }) })) } Installer.prototype.debugActions = function (name, actionListName, cb) { validate('SSF', arguments) var actionsToLog = this[actionListName] log.silly(name, 'action count', actionsToLog.length) actionsToLog.forEach(function (action) { log.silly(name, action.map(function (value) { return (value && value.package) ? packageId(value) : value }).join(' ')) }) cb() } // This takes an object and a property name instead of a value to allow us // to define the arguments for use by chain before the property exists yet. Installer.prototype.debugTree = function (name, treeName, cb) { validate('SSF', arguments) log.silly(name, this.prettify(this[treeName]).trim()) cb() } Installer.prototype.prettify = function (tree) { validate('O', arguments) var seen = {} function byName (aa, bb) { return packageId(aa).localeCompare(packageId(bb)) } function expandTree (tree) { seen[tree.path] = true return { label: packageId(tree), nodes: tree.children.filter(function (tree) { return !seen[tree.path] }).sort(byName).map(expandTree) } } return archy(expandTree(tree), '', { unicode: npm.config.get('unicode') }) } npm_3.5.2.orig/lib/link.js0000644000000000000000000001313212631326456013533 0ustar 00000000000000// link with no args: symlink the folder to the global location // link with package arg: symlink the global to the local var npm = require('./npm.js') var symlink = require('./utils/link.js') var fs = require('graceful-fs') var log = require('npmlog') var asyncMap = require('slide').asyncMap var chain = require('slide').chain var path = require('path') var build = require('./build.js') var npa = require('npm-package-arg') module.exports = link link.usage = 'npm link (in package dir)' + '\nnpm link [<@scope>/][@]' + '\n\nalias: npm ln' link.completion = function (opts, cb) { var dir = npm.globalDir fs.readdir(dir, function (er, files) { cb(er, files.filter(function (f) { return !f.match(/^[\._-]/) })) }) } function link (args, cb) { if (process.platform === 'win32') { var semver = require('semver') if (!semver.gte(process.version, '0.7.9')) { var msg = 'npm link not supported on windows prior to node 0.7.9' var e = new Error(msg) e.code = 'ENOTSUP' e.errno = require('constants').ENOTSUP return cb(e) } } if (npm.config.get('global')) { return cb(new Error( 'link should never be --global.\n' + 'Please re-run this command with --local' )) } if (args.length === 1 && args[0] === '.') args = [] if (args.length) return linkInstall(args, cb) linkPkg(npm.prefix, cb) } function parentFolder (id, folder) { if (id[0] === '@') { return path.resolve(folder, '..', '..') } else { return path.resolve(folder, '..') } } function linkInstall (pkgs, cb) { asyncMap(pkgs, function (pkg, cb) { var t = path.resolve(npm.globalDir, '..') var pp = path.resolve(npm.globalDir, pkg) var rp = null var target = path.resolve(npm.dir, pkg) function n (er, data) { if (er) return cb(er, data) // we want the ONE thing that was installed into the global dir var installed = data.filter(function (info) { var id = info[0] var folder = info[1] return parentFolder(id, folder) === npm.globalDir }) var id = installed[0][0] pp = installed[0][1] var what = npa(id) pkg = what.name target = path.resolve(npm.dir, pkg) next() } // if it's a folder, a random not-installed thing, or not a scoped package, // then link or install it first if (pkg[0] !== '@' && (pkg.indexOf('/') !== -1 || pkg.indexOf('\\') !== -1)) { return fs.lstat(path.resolve(pkg), function (er, st) { if (er || !st.isDirectory()) { npm.commands.install(t, pkg, n) } 
else { rp = path.resolve(pkg) linkPkg(rp, n) } }) } fs.lstat(pp, function (er, st) { if (er) { rp = pp return npm.commands.install(t, [pkg], n) } else if (!st.isSymbolicLink()) { rp = pp next() } else { return fs.realpath(pp, function (er, real) { if (er) log.warn('invalid symbolic link', pkg) else rp = real next() }) } }) function next () { if (npm.config.get('dry-run')) return resultPrinter(pkg, pp, target, rp, cb) chain( [ [ function (cb) { log.verbose('link', 'symlinking %s to %s', pp, target) cb() } ], [symlink, pp, target], // do not run any scripts rp && [build, [target], npm.config.get('global'), build._noLC, true], [resultPrinter, pkg, pp, target, rp] ], cb ) } }, cb) } function linkPkg (folder, cb_) { var me = folder || npm.prefix var readJson = require('read-package-json') log.verbose('linkPkg', folder) readJson(path.resolve(me, 'package.json'), function (er, d) { function cb (er) { return cb_(er, [[d && d._id, target, null, null]]) } if (er) return cb(er) if (!d.name) { er = new Error('Package must have a name field to be linked') return cb(er) } if (npm.config.get('dry-run')) return resultPrinter(path.basename(me), me, target, cb) var target = path.resolve(npm.globalDir, d.name) symlink(me, target, false, true, function (er) { if (er) return cb(er) log.verbose('link', 'build target', target) // also install missing dependencies. npm.commands.install(me, [], function (er) { if (er) return cb(er) // build the global stuff. Don't run *any* scripts, because // install command already will have done that. build([target], true, build._noLC, true, function (er) { if (er) return cb(er) resultPrinter(path.basename(me), me, target, cb) }) }) }) }) } function resultPrinter (pkg, src, dest, rp, cb) { if (typeof cb !== 'function') { cb = rp rp = null } var where = dest rp = (rp || '').trim() src = (src || '').trim() // XXX If --json is set, then look up the data from the package.json if (npm.config.get('parseable')) { return parseableOutput(dest, rp || src, cb) } if (rp === src) rp = null log.clearProgress() console.log(where + ' -> ' + src + (rp ? ' -> ' + rp : '')) log.showProgress() cb() } function parseableOutput (dest, rp, cb) { // XXX this should match ls --parseable and install --parseable // look up the data from package.json, format it the same way. // // link is always effectively 'long', since it doesn't help much to // *just* print the target folder. // However, we don't actually ever read the version number, so // the second field is always blank. 
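  // so a line of output looks like (hypothetical paths):
  //   /usr/local/lib/node_modules/foo::/real/source/of/foo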
log.clearProgress() console.log(dest + '::' + rp) log.showProgress() cb() } npm_3.5.2.orig/lib/logout.js0000644000000000000000000000177012631326456014114 0ustar 00000000000000module.exports = logout var dezalgo = require('dezalgo') var log = require('npmlog') var npm = require('./npm.js') var mapToRegistry = require('./utils/map-to-registry.js') logout.usage = 'npm logout [--registry=] [--scope=<@scope>]' function logout (args, cb) { cb = dezalgo(cb) mapToRegistry('/', npm.config, function (err, uri, auth, normalized) { if (err) return cb(err) if (auth.token) { log.verbose('logout', 'clearing session token for', normalized) npm.registry.logout(normalized, { auth: auth }, function (err) { if (err) return cb(err) npm.config.clearCredentialsByURI(normalized) npm.config.save('user', cb) }) } else if (auth.username || auth.password) { log.verbose('logout', 'clearing user credentials for', normalized) npm.config.clearCredentialsByURI(normalized) npm.config.save('user', cb) } else { cb(new Error( 'Not logged in to', normalized + ',', "so can't log out." )) } }) } npm_3.5.2.orig/lib/ls.js0000644000000000000000000003154512631326456013224 0ustar 00000000000000// show the installed versions of packages // // --parseable creates output like this: // ::: // Flags are a :-separated list of zero or more indicators module.exports = exports = ls var path = require('path') var url = require('url') var readPackageTree = require('read-package-tree') var log = require('npmlog') var archy = require('archy') var semver = require('semver') var color = require('ansicolors') var npa = require('npm-package-arg') var iferr = require('iferr') var npm = require('./npm.js') var mutateIntoLogicalTree = require('./install/mutate-into-logical-tree.js') var recalculateMetadata = require('./install/deps.js').recalculateMetadata var packageId = require('./utils/package-id.js') ls.usage = 'npm ls [[<@scope>/] ...]' + '\n\naliases: list, la, ll' ls.completion = require('./utils/completion/installed-deep.js') function ls (args, silent, cb) { if (typeof cb !== 'function') { cb = silent silent = false } var dir = path.resolve(npm.dir, '..') readPackageTree(dir, andRecalculateMetadata(iferr(cb, function (physicalTree) { lsFromTree(dir, physicalTree, args, silent, cb) }))) } function andRecalculateMetadata (next) { return function (er, tree) { recalculateMetadata(tree || {}, log, next) } } var lsFromTree = ls.fromTree = function (dir, physicalTree, args, silent, cb) { if (typeof cb !== 'function') { cb = silent silent = false } // npm ls 'foo@~1.3' bar 'baz@<2' if (!args) { args = [] } else { args = args.map(function (a) { var p = npa(a) var name = p.name var ver = semver.validRange(p.rawSpec) || '' return [ name, ver ] }) } var data = mutateIntoLogicalTree.asReadInstalled(physicalTree) pruneNestedExtraneous(data) filterByEnv(data) var bfs = filterFound(bfsify(data), args) var lite = getLite(bfs) if (silent) return cb(null, data, lite) var long = npm.config.get('long') var json = npm.config.get('json') var out if (json) { var seen = [] var d = long ? 
bfs : lite // the raw data can be circular out = JSON.stringify(d, function (k, o) { if (typeof o === 'object') { if (seen.indexOf(o) !== -1) return '[Circular]' seen.push(o) } return o }, 2) } else if (npm.config.get('parseable')) { out = makeParseable(bfs, long, dir) } else if (data) { out = makeArchy(bfs, long, dir) } console.log(out) if (args.length && !data._found) process.exitCode = 1 var er // if any errors were found, then complain and exit status 1 if (lite.problems && lite.problems.length) { er = lite.problems.join('\n') } cb(er, data, lite) } function pruneNestedExtraneous (data, visited) { visited = visited || [] visited.push(data) for (var i in data.dependencies) { if (data.dependencies[i].extraneous) { data.dependencies[i].dependencies = {} } else if (visited.indexOf(data.dependencies[i]) === -1) { pruneNestedExtraneous(data.dependencies[i], visited) } } } function filterByEnv (data) { var dev = npm.config.get('dev') || /^dev(elopment)?$/.test(npm.config.get('only')) var production = npm.config.get('production') || /^prod(uction)?$/.test(npm.config.get('only')) var dependencies = {} var devDependencies = data.devDependencies || [] Object.keys(data.dependencies).forEach(function (name) { var keys = Object.keys(devDependencies) if (production && !dev && keys.indexOf(name) !== -1) return if (dev && !production && keys.indexOf(name) === -1) return if (!dev && keys.indexOf(name) !== -1 && data.dependencies[name].missing) return dependencies[name] = data.dependencies[name] }) data.dependencies = dependencies } function alphasort (a, b) { a = a.toLowerCase() b = b.toLowerCase() return a > b ? 1 : a < b ? -1 : 0 } function isCruft (data) { return data.extraneous && data.error && data.error.code === 'ENOTDIR' } function getLite (data, noname) { var lite = {} if (isCruft(data)) return lite var maxDepth = npm.config.get('depth') if (!noname && data.name) lite.name = data.name if (data.version) lite.version = data.version if (data.extraneous) { lite.extraneous = true lite.problems = lite.problems || [] lite.problems.push('extraneous: ' + packageId(data) + ' ' + (data.path || '')) } if (data.error && data.path !== path.resolve(npm.globalDir, '..') && (data.error.code !== 'ENOENT' || noname)) { lite.invalid = true lite.problems = lite.problems || [] var message = data.error.message lite.problems.push('error in ' + data.path + ': ' + message) } if (data._from) { lite.from = data._from } if (data._resolved) { lite.resolved = data._resolved } if (data.invalid) { lite.invalid = true lite.problems = lite.problems || [] lite.problems.push('invalid: ' + packageId(data) + ' ' + (data.path || '')) } if (data.peerInvalid) { lite.peerInvalid = true lite.problems = lite.problems || [] lite.problems.push('peer dep not met: ' + packageId(data) + ' ' + (data.path || '')) } var deps = (data.dependencies && Object.keys(data.dependencies)) || [] if (deps.length) { lite.dependencies = deps.map(function (d) { var dep = data.dependencies[d] if (dep.missing && !dep.optional) { lite.problems = lite.problems || [] var p if (data.depth > maxDepth) { p = 'max depth reached: ' } else { p = 'missing: ' } p += d + '@' + dep.requiredBy + ', required by ' + packageId(data) lite.problems.push(p) return [d, { required: dep.requiredBy, missing: true }] } else if (dep.peerMissing) { lite.problems = lite.problems || [] dep.peerMissing.forEach(function (missing) { var pdm = 'peer dep missing: ' + missing.requires + ', required by ' + missing.requiredBy lite.problems.push(pdm) }) return [d, { required: dep, peerMissing: true 
}] } return [d, getLite(dep, true)] }).reduce(function (deps, d) { if (d[1].problems) { lite.problems = lite.problems || [] lite.problems.push.apply(lite.problems, d[1].problems) } deps[d[0]] = d[1] return deps }, {}) } return lite } function bfsify (root) { // walk over the data, and turn it from this: // +-- a // | `-- b // | `-- a (truncated) // `--b (truncated) // into this: // +-- a // `-- b // which looks nicer var queue = [root] var seen = [root] while (queue.length) { var current = queue.shift() var deps = current.dependencies = current.dependencies || {} Object.keys(deps).forEach(function (d) { var dep = deps[d] if (dep.missing) return if (seen.indexOf(dep) !== -1) { if (npm.config.get('parseable') || !npm.config.get('long')) { delete deps[d] return } else { dep = deps[d] = Object.create(dep) dep.dependencies = {} } } queue.push(dep) seen.push(dep) }) } return root } function filterFound (root, args) { if (!args.length) return root var deps = root.dependencies if (deps) { Object.keys(deps).forEach(function (d) { var dep = filterFound(deps[d], args) if (dep.peerMissing) return // see if this one itself matches var found = false for (var i = 0; !found && i < args.length; i++) { if (d === args[i][0]) { found = semver.satisfies(dep.version, args[i][1], true) } } // included explicitly if (found) dep._found = true // included because a child was included if (dep._found && !root._found) root._found = 1 // not included if (!dep._found) delete deps[d] }) } if (!root._found) root._found = false return root } function makeArchy (data, long, dir) { var out = makeArchy_(data, long, dir, 0) return archy(out, '', { unicode: npm.config.get('unicode') }) } function makeArchy_ (data, long, dir, depth, parent, d) { if (data.missing) { if (depth - 1 <= npm.config.get('depth')) { // just missing var unmet = 'UNMET ' + (data.optional ? 'OPTIONAL ' : '') + 'DEPENDENCY' if (npm.color) { if (data.optional) { unmet = color.bgBlack(color.yellow(unmet)) } else { unmet = color.bgBlack(color.red(unmet)) } } data = unmet + ' ' + d + '@' + data.requiredBy } else { data = d + '@' + data.requiredBy } return data } var out = {} // the top level is a bit special. 
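  // The labels built up below are handed to archy, which renders a tree
  // along these lines (illustrative):
  //   my-app@1.0.0
  //   +-- foo@2.1.0
  //   `-- bar@0.3.0 extraneous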
out.label = data._id || '' if (data._found === true && data._id) { if (npm.color) { out.label = color.bgBlack(color.yellow(out.label.trim())) + ' ' } else { out.label = out.label.trim() + ' ' } } if (data.link) out.label += ' -> ' + data.link if (data.invalid) { if (data.realName !== data.name) out.label += ' (' + data.realName + ')' var invalid = 'invalid' if (npm.color) invalid = color.bgBlack(color.red(invalid)) out.label += ' ' + invalid } if (data.peerInvalid) { var peerInvalid = 'peer invalid' if (npm.color) peerInvalid = color.bgBlack(color.red(peerInvalid)) out.label += ' ' + peerInvalid } if (data.peerMissing) { var peerMissing = 'UNMET PEER DEPENDENCY' if (npm.color) peerMissing = color.bgBlack(color.red(peerMissing)) out.label = peerMissing + ' ' + out.label } if (data.extraneous && data.path !== dir) { var extraneous = 'extraneous' if (npm.color) extraneous = color.bgBlack(color.green(extraneous)) out.label += ' ' + extraneous } if (data.error && depth) { var message = data.error.message if (message.indexOf('\n')) message = message.slice(0, message.indexOf('\n')) var error = 'error: ' + message if (npm.color) error = color.bgRed(color.brightWhite(error)) out.label += ' ' + error } // add giturl to name@version if (data._resolved) { try { var type = npa(data._resolved).type var isGit = type === 'git' || type === 'hosted' if (isGit) { out.label += ' (' + data._resolved + ')' } } catch (ex) { // npa threw an exception then it ain't git so whatev } } if (long) { if (dir === data.path) out.label += '\n' + dir out.label += '\n' + getExtras(data, dir) } else if (dir === data.path) { if (out.label) out.label += ' ' out.label += dir } // now all the children. out.nodes = [] if (depth <= npm.config.get('depth')) { out.nodes = Object.keys(data.dependencies || {}) .sort(alphasort).filter(function (d) { return !isCruft(data.dependencies[d]) }).map(function (d) { return makeArchy_(data.dependencies[d], long, dir, depth + 1, data, d) }) } if (out.nodes.length === 0 && data.path === dir) { out.nodes = ['(empty)'] } return out } function getExtras (data) { var extras = [] if (data.description) extras.push(data.description) if (data.repository) extras.push(data.repository.url) if (data.homepage) extras.push(data.homepage) if (data._from) { var from = data._from if (from.indexOf(data.name + '@') === 0) { from = from.substr(data.name.length + 1) } var u = url.parse(from) if (u.protocol) extras.push(from) } return extras.join('\n') } function makeParseable (data, long, dir, depth, parent, d) { depth = depth || 0 return [ makeParseable_(data, long, dir, depth, parent, d) ] .concat(Object.keys(data.dependencies || {}) .sort(alphasort).map(function (d) { return makeParseable(data.dependencies[d], long, dir, depth + 1, data, d) })) .filter(function (x) { return x }) .join('\n') } function makeParseable_ (data, long, dir, depth, parent, d) { if (data.hasOwnProperty('_found') && data._found !== true) return '' if (data.missing) { if (depth < npm.config.get('depth')) { data = npm.config.get('long') ? path.resolve(parent.path, 'node_modules', d) + ':' + d + '@' + JSON.stringify(data.requiredBy) + ':INVALID:MISSING' : '' } else { data = path.resolve(dir || '', 'node_modules', d || '') + (npm.config.get('long') ? ':' + d + '@' + JSON.stringify(data.requiredBy) + ':' + // no realpath resolved ':MAXDEPTH' : '') } return data } if (!npm.config.get('long')) return data.path return data.path + ':' + (data._id || '') + ':' + (data.realPath !== data.path ? data.realPath : '') + (data.extraneous ? 
':EXTRANEOUS' : '') + (data.error && data.path !== path.resolve(npm.globalDir, '..') ? ':ERROR' : '') + (data.invalid ? ':INVALID' : '') + (data.peerInvalid ? ':PEERINVALID' : '') + (data.peerMissing ? ':PEERINVALID:MISSING' : '') } npm_3.5.2.orig/lib/npm.js0000644000000000000000000002775012631326456013403 0ustar 00000000000000;(function () { // windows: running 'npm blah' in this folder will invoke WSH, not node. /*globals WScript*/ if (typeof WScript !== 'undefined') { WScript.echo( 'npm does not work when run\n' + 'with the Windows Scripting Host\n\n' + '"cd" to a different directory,\n' + 'or type "npm.cmd ",\n' + 'or type "node npm ".' ) WScript.quit(1) return } var gfs = require('graceful-fs') // Patch the global fs module here at the app level var fs = gfs.gracefulify(require('fs')) var EventEmitter = require('events').EventEmitter var npm = module.exports = new EventEmitter() var npmconf = require('./config/core.js') var log = require('npmlog') var path = require('path') var abbrev = require('abbrev') var which = require('which') var CachingRegClient = require('./cache/caching-client.js') var parseJSON = require('./utils/parse-json.js') npm.config = { loaded: false, get: function () { throw new Error('npm.load() required') }, set: function () { throw new Error('npm.load() required') } } npm.commands = {} npm.rollbacks = [] try { // startup, ok to do this synchronously var j = parseJSON(fs.readFileSync( path.join(__dirname, '../package.json')) + '') npm.version = j.version } catch (ex) { try { log.info('error reading version', ex) } catch (er) {} npm.version = ex } var commandCache = {} // short names for common things var aliases = { 'rm': 'uninstall', 'r': 'uninstall', 'un': 'uninstall', 'unlink': 'uninstall', 'remove': 'uninstall', 'rb': 'rebuild', 'list': 'ls', 'la': 'ls', 'll': 'ls', 'ln': 'link', 'i': 'install', 'isntall': 'install', 'it': 'install-test', 'up': 'update', 'upgrade': 'update', 'c': 'config', 'dist-tags': 'dist-tag', 'info': 'view', 'show': 'view', 'find': 'search', 's': 'search', 'se': 'search', 'author': 'owner', 'home': 'docs', 'issues': 'bugs', 'unstar': 'star', // same function 'apihelp': 'help', 'login': 'adduser', 'add-user': 'adduser', 'tst': 'test', 't': 'test', 'find-dupes': 'dedupe', 'ddp': 'dedupe', 'v': 'view', 'verison': 'version' } var aliasNames = Object.keys(aliases) // these are filenames in . var cmdList = [ 'install', 'install-test', 'uninstall', 'cache', 'config', 'set', 'get', 'update', 'outdated', 'prune', 'pack', 'dedupe', 'rebuild', 'link', 'publish', 'star', 'stars', 'tag', 'adduser', 'logout', 'unpublish', 'owner', 'access', 'team', 'deprecate', 'shrinkwrap', 'help', 'help-search', 'ls', 'search', 'view', 'init', 'version', 'edit', 'explore', 'docs', 'repo', 'bugs', 'faq', 'root', 'prefix', 'bin', 'whoami', 'dist-tag', 'ping', 'test', 'stop', 'start', 'restart', 'run-script', 'completion' ] var plumbing = [ 'build', 'unbuild', 'xmas', 'substack', 'visnup' ] var littleGuys = [ 'isntall' ] var fullList = cmdList.concat(aliasNames).filter(function (c) { return plumbing.indexOf(c) === -1 }) var abbrevs = abbrev(fullList) // we have our reasons fullList = npm.fullList = fullList.filter(function (c) { return littleGuys.indexOf(c) === -1 }) Object.keys(abbrevs).concat(plumbing).forEach(function addCommand (c) { Object.defineProperty(npm.commands, c, { get: function () { if (!loaded) { throw new Error( 'Call npm.load(config, cb) before using this command.\n' + 'See the README.md or cli.js for example usage.' 
) } var a = npm.deref(c) if (c === 'la' || c === 'll') { npm.config.set('long', true) } npm.command = c if (commandCache[a]) return commandCache[a] var cmd = require(__dirname + '/' + a + '.js') commandCache[a] = function () { var args = Array.prototype.slice.call(arguments, 0) if (typeof args[args.length - 1] !== 'function') { args.push(defaultCb) } if (args.length === 1) args.unshift([]) npm.registry.version = npm.version if (!npm.registry.refer) { npm.registry.refer = [a].concat(args[0]).map(function (arg) { // exclude anything that might be a URL, path, or private module // Those things will always have a slash in them somewhere if (arg && arg.match && arg.match(/\/|\\/)) { return '[REDACTED]' } else { return arg } }).filter(function (arg) { return arg && arg.match }).join(' ') } cmd.apply(npm, args) } Object.keys(cmd).forEach(function (k) { commandCache[a][k] = cmd[k] }) return commandCache[a] }, enumerable: fullList.indexOf(c) !== -1, configurable: true }) // make css-case commands callable via camelCase as well if (c.match(/\-([a-z])/)) { addCommand(c.replace(/\-([a-z])/g, function (a, b) { return b.toUpperCase() })) } }) function defaultCb (er, data) { log.disableProgress() if (er) console.error(er.stack || er.message) else console.log(data) } npm.deref = function (c) { if (!c) return '' if (c.match(/[A-Z]/)) { c = c.replace(/([A-Z])/g, function (m) { return '-' + m.toLowerCase() }) } if (plumbing.indexOf(c) !== -1) return c var a = abbrevs[c] if (aliases[a]) a = aliases[a] return a } var loaded = false var loading = false var loadErr = null var loadListeners = [] function loadCb (er) { loadListeners.forEach(function (cb) { process.nextTick(cb.bind(npm, er, npm)) }) loadListeners.length = 0 } npm.load = function (cli, cb_) { if (!cb_ && typeof cli === 'function') { cb_ = cli cli = {} } if (!cb_) cb_ = function () {} if (!cli) cli = {} loadListeners.push(cb_) if (loaded || loadErr) return cb(loadErr) if (loading) return loading = true var onload = true function cb (er) { if (loadErr) return loadErr = er if (er) return cb_(er) if (npm.config.get('force')) { log.warn('using --force', 'I sure hope you know what you are doing.') } npm.config.loaded = true loaded = true loadCb(loadErr = er) onload = onload && npm.config.get('onload-script') if (onload) { require(onload) onload = false } } log.pause() load(npm, cli, cb) } function load (npm, cli, cb) { which(process.argv[0], function (er, node) { if (!er && node.toUpperCase() !== process.execPath.toUpperCase()) { log.verbose('node symlink', node) process.execPath = node process.installPrefix = path.resolve(node, '..', '..') } // look up configs var builtin = path.resolve(__dirname, '..', 'npmrc') npmconf.load(cli, builtin, function (er, config) { if (er === config) er = null npm.config = config if (er) return cb(er) // if the 'project' config is not a filename, and we're // not in global mode, then that means that it collided // with either the default or effective userland config if (!config.get('global') && config.sources.project && config.sources.project.type !== 'ini') { log.verbose( 'config', 'Skipping project config: %s. 
(matches userconfig)', config.localPrefix + '/.npmrc' ) } // Include npm-version and node-version in user-agent var ua = config.get('user-agent') || '' ua = ua.replace(/\{node-version\}/gi, process.version) ua = ua.replace(/\{npm-version\}/gi, npm.version) ua = ua.replace(/\{platform\}/gi, process.platform) ua = ua.replace(/\{arch\}/gi, process.arch) config.set('user-agent', ua) var color = config.get('color') log.level = config.get('loglevel') log.heading = config.get('heading') || 'npm' log.stream = config.get('logstream') switch (color) { case 'always': log.enableColor() npm.color = true break case false: log.disableColor() npm.color = false break default: var tty = require('tty') if (process.stdout.isTTY) npm.color = true else if (!tty.isatty) npm.color = true else if (tty.isatty(1)) npm.color = true else npm.color = false break } log.resume() if (config.get('progress')) { log.enableProgress() } else { log.disableProgress() } // at this point the configs are all set. // go ahead and spin up the registry client. npm.registry = new CachingRegClient(npm.config) var umask = npm.config.get('umask') npm.modes = { exec: parseInt('0777', 8) & (~umask), file: parseInt('0666', 8) & (~umask), umask: umask } var gp = Object.getOwnPropertyDescriptor(config, 'globalPrefix') Object.defineProperty(npm, 'globalPrefix', gp) var lp = Object.getOwnPropertyDescriptor(config, 'localPrefix') Object.defineProperty(npm, 'localPrefix', lp) return cb(null, npm) }) }) } Object.defineProperty(npm, 'prefix', { get: function () { return npm.config.get('global') ? npm.globalPrefix : npm.localPrefix }, set: function (r) { var k = npm.config.get('global') ? 'globalPrefix' : 'localPrefix' npm[k] = r return r }, enumerable: true }) Object.defineProperty(npm, 'bin', { get: function () { if (npm.config.get('global')) return npm.globalBin return path.resolve(npm.root, '.bin') }, enumerable: true }) Object.defineProperty(npm, 'globalBin', { get: function () { var b = npm.globalPrefix if (process.platform !== 'win32') b = path.resolve(b, 'bin') return b } }) Object.defineProperty(npm, 'dir', { get: function () { if (npm.config.get('global')) return npm.globalDir return path.resolve(npm.prefix, 'node_modules') }, enumerable: true }) Object.defineProperty(npm, 'globalDir', { get: function () { return (process.platform !== 'win32') ? 
path.resolve(npm.globalPrefix, 'lib', 'node_modules') : path.resolve(npm.globalPrefix, 'node_modules') }, enumerable: true }) Object.defineProperty(npm, 'root', { get: function () { return npm.dir } }) Object.defineProperty(npm, 'cache', { get: function () { return npm.config.get('cache') }, set: function (r) { return npm.config.set('cache', r) }, enumerable: true }) var tmpFolder var rand = require('crypto').randomBytes(4).toString('hex') Object.defineProperty(npm, 'tmp', { get: function () { if (!tmpFolder) tmpFolder = 'npm-' + process.pid + '-' + rand return path.resolve(npm.config.get('tmp'), tmpFolder) }, enumerable: true }) // the better to repl you with Object.getOwnPropertyNames(npm.commands).forEach(function (n) { if (npm.hasOwnProperty(n) || n === 'config') return Object.defineProperty(npm, n, { get: function () { return function () { var args = Array.prototype.slice.call(arguments, 0) var cb = defaultCb if (args.length === 1 && Array.isArray(args[0])) { args = args[0] } if (typeof args[args.length - 1] === 'function') { cb = args.pop() } npm.commands[n](args, cb) } }, enumerable: false, configurable: true }) }) if (require.main === module) { require('../bin/npm-cli.js') } })() npm_3.5.2.orig/lib/outdated.js0000644000000000000000000003011212631326456014404 0ustar 00000000000000/* npm outdated [pkg] Does the following: 1. check for a new version of pkg If no packages are specified, then run for all installed packages. --parseable creates output like this: ::: */ module.exports = outdated outdated.usage = 'npm outdated [[<@scope>/] ...]' outdated.completion = require('./utils/completion/installed-deep.js') var os = require('os') var url = require('url') var path = require('path') var log = require('npmlog') var readPackageTree = require('read-package-tree') var readJson = require('read-package-json') var asyncMap = require('slide').asyncMap var color = require('ansicolors') var styles = require('ansistyles') var table = require('text-table') var semver = require('semver') var npa = require('npm-package-arg') var mutateIntoLogicalTree = require('./install/mutate-into-logical-tree.js') var cache = require('./cache.js') var npm = require('./npm.js') var long = npm.config.get('long') var mapToRegistry = require('./utils/map-to-registry.js') var isExtraneous = require('./install/is-extraneous.js') var recalculateMetadata = require('./install/deps.js').recalculateMetadata var moduleName = require('./utils/module-name.js') function uniqName (item) { return item[0].path + '|' + item[1] + '|' + item[7] } function uniq (list) { var uniqed = [] var seen = {} list.forEach(function (item) { var name = uniqName(item) if (seen[name]) return seen[name] = true uniqed.push(item) }) return uniqed } function andRecalculateMetadata (next) { return function (er, tree) { if (er) return next(er) recalculateMetadata(tree, log, next) } } function outdated (args, silent, cb) { if (typeof cb !== 'function') { cb = silent silent = false } var dir = path.resolve(npm.dir, '..') // default depth for `outdated` is 0 (cf. 
`ls`) if (npm.config.get('depth') === Infinity) npm.config.set('depth', 0) readPackageTree(dir, andRecalculateMetadata(function (er, tree) { mutateIntoLogicalTree(tree) outdated_(args, '', tree, {}, 0, function (er, list) { list = uniq(list || []).sort(function (aa, bb) { return aa[0].path.localeCompare(bb[0].path) || aa[1].localeCompare(bb[1]) }) if (er || silent || list.length === 0) return cb(er, list) log.disableProgress() if (npm.config.get('json')) { console.log(makeJSON(list)) } else if (npm.config.get('parseable')) { console.log(makeParseable(list)) } else { var outList = list.map(makePretty) var outHead = [ 'Package', 'Current', 'Wanted', 'Latest', 'Location' ] if (long) outHead.push('Package Type') var outTable = [outHead].concat(outList) if (npm.color) { outTable[0] = outTable[0].map(function (heading) { return styles.underline(heading) }) } var tableOpts = { align: ['l', 'r', 'r', 'r', 'l'], stringLength: function (s) { return ansiTrim(s).length } } console.log(table(outTable, tableOpts)) } cb(null, list.map(function (item) { return [item[0].parent.path].concat(item.slice(1, 7)) })) }) })) } // [[ dir, dep, has, want, latest, type ]] function makePretty (p) { var dep = p[0] var depname = p[1] var dir = dep.path var has = p[2] var want = p[3] var latest = p[4] var type = p[6] var deppath = p[7] if (!npm.config.get('global')) { dir = path.relative(process.cwd(), dir) } var columns = [ depname, has || 'MISSING', want, latest, deppath ] if (long) columns[5] = type if (npm.color) { columns[0] = color[has === want ? 'yellow' : 'red'](columns[0]) // dep columns[2] = color.green(columns[2]) // want columns[3] = color.magenta(columns[3]) // latest columns[4] = color.brightBlack(columns[4]) // dir if (long) columns[5] = color.brightBlack(columns[5]) // type } return columns } function ansiTrim (str) { var r = new RegExp('\x1b(?:\\[(?:\\d+[ABCDEFGJKSTm]|\\d+;\\d+[Hfm]|' + '\\d+;\\d+;\\d+m|6n|s|u|\\?25[lh])|\\w)', 'g') return str.replace(r, '') } function makeParseable (list) { return list.map(function (p) { var dep = p[0] var depname = p[1] var dir = dep.path var has = p[2] var want = p[3] var latest = p[4] var type = p[6] var out = [ dir, depname + '@' + want, (has ? 
(depname + '@' + has) : 'MISSING'), depname + '@' + latest ] if (long) out.push(type) return out.join(':') }).join(os.EOL) } function makeJSON (list) { var out = {} list.forEach(function (p) { var dep = p[0] var depname = p[1] var dir = dep.path var has = p[2] var want = p[3] var latest = p[4] var type = p[6] if (!npm.config.get('global')) { dir = path.relative(process.cwd(), dir) } out[depname] = { current: has, wanted: want, latest: latest, location: dir } if (long) out[depname].type = type }) return JSON.stringify(out, null, 2) } function outdated_ (args, path, tree, parentHas, depth, cb) { if (!tree.package) tree.package = {} if (path && tree.package.name) path += ' > ' + tree.package.name if (!path && tree.package.name) path = tree.package.name if (depth > npm.config.get('depth')) { return cb(null, []) } var types = {} var pkg = tree.package var deps = tree.children.filter(function (child) { return !isExtraneous(child) }) || [] deps.forEach(function (dep) { types[moduleName(dep)] = 'dependencies' }) Object.keys(tree.missingDeps).forEach(function (name) { deps.push({ package: { name: name }, path: tree.path, parent: tree, isMissing: true }) types[name] = 'dependencies' }) // If we explicitly asked for dev deps OR we didn't ask for production deps // AND we asked to save dev-deps OR we didn't ask to save anything that's NOT // dev deps then… // (All the save checking here is because this gets called from npm-update currently // and that requires this logic around dev deps.) // FIXME: Refactor npm update to not be in terms of outdated. var dev = npm.config.get('dev') || /^dev(elopment)?$/.test(npm.config.get('also')) var prod = npm.config.get('production') || /^prod(uction)?$/.test(npm.config.get('only')) if ((dev || !prod) && (npm.config.get('save-dev') || ( !npm.config.get('save') && !npm.config.get('save-optional')))) { Object.keys(tree.missingDevDeps).forEach(function (name) { deps.push({ package: { name: name }, path: tree.path, parent: tree, isMissing: true }) if (!types[name]) { types[name] = 'devDependencies' } }) } if (npm.config.get('save-dev')) { deps = deps.filter(function (dep) { return pkg.devDependencies[moduleName(dep)] }) deps.forEach(function (dep) { types[moduleName(dep)] = 'devDependencies' }) } else if (npm.config.get('save')) { // remove optional dependencies from dependencies during --save. deps = deps.filter(function (dep) { return !pkg.optionalDependencies[moduleName(dep)] }) } else if (npm.config.get('save-optional')) { deps = deps.filter(function (dep) { return pkg.optionalDependencies[moduleName(dep)] }) deps.forEach(function (dep) { types[moduleName(dep)] = 'optionalDependencies' }) } var doUpdate = dev || ( !prod && !Object.keys(parentHas).length && !npm.config.get('global') ) if (doUpdate) { Object.keys(pkg.devDependencies).forEach(function (k) { if (!(k in parentHas)) { deps[k] = pkg.devDependencies[k] types[k] = 'devDependencies' } }) } var has = Object.create(parentHas) tree.children.forEach(function (child) { if (child.package.name && child.package.private) { deps = deps.filter(function (dep) { return dep !== child }) } has[child.package.name] = { version: child.package.version, from: child.package._from } }) // now get what we should have, based on the dep. 
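  // (each entry of `has` is shaped { version: ..., from: ... }, as built
  // from the installed children just above)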
// if has[dep] !== shouldHave[dep], then cb with the data // otherwise dive into the folder asyncMap(deps, function (dep, cb) { var name = moduleName(dep) var required = (tree.package.dependencies)[name] || (tree.package.optionalDependencies)[name] || (tree.package.devDependencies)[name] || dep.package._requested && dep.package._requested.spec || '*' if (!long) return shouldUpdate(args, dep, name, has, required, depth, path, cb) shouldUpdate(args, dep, name, has, required, depth, path, cb, types[name]) }, cb) } function shouldUpdate (args, tree, dep, has, req, depth, pkgpath, cb, type) { // look up the most recent version. // if that's what we already have, or if it's not on the args list, // then dive into it. Otherwise, cb() with the data. // { version: , from: } var curr = has[dep] function skip (er) { // show user that no viable version can be found if (er) return cb(er) outdated_(args, pkgpath, tree, has, depth + 1, cb) } function doIt (wanted, latest) { if (!long) { return cb(null, [[tree, dep, curr && curr.version, wanted, latest, req, null, pkgpath]]) } cb(null, [[tree, dep, curr && curr.version, wanted, latest, req, type, pkgpath]]) } if (args.length && args.indexOf(dep) === -1) return skip() var parsed = npa(dep + '@' + req) if (parsed.type === 'git' || parsed.type === 'hosted') { return doIt('git', 'git') } // search for the latest package mapToRegistry(dep, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth: auth }, updateDeps) }) function updateLocalDeps (latestRegistryVersion) { readJson(path.resolve(parsed.spec, 'package.json'), function (er, localDependency) { if (er) return cb() var wanted = localDependency.version var latest = localDependency.version if (latestRegistryVersion) { latest = latestRegistryVersion if (semver.lt(wanted, latestRegistryVersion)) { wanted = latestRegistryVersion req = dep + '@' + latest } } if (curr.version !== wanted) { doIt(wanted, latest) } else { skip() } }) } function updateDeps (er, d) { if (er) { if (parsed.type !== 'local') return cb(er) return updateLocalDeps() } if (!d || !d['dist-tags'] || !d.versions) return cb() var l = d.versions[d['dist-tags'].latest] if (!l) return cb() var r = req if (d['dist-tags'][req]) { r = d['dist-tags'][req] } if (semver.validRange(r, true)) { // some kind of semver range. // see if it's in the doc. var vers = Object.keys(d.versions) var v = semver.maxSatisfying(vers, r, true) if (v) { return onCacheAdd(null, d.versions[v]) } } // We didn't find the version in the doc. See if cache can find it. cache.add(dep, req, null, false, onCacheAdd) function onCacheAdd (er, d) { // if this fails, then it means we can't update this thing. // it's probably a thing that isn't published. 
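      // e.g. an unpublished or private dependency typically lands here;
      // ETARGET ("no viable version") is surfaced to the user via skip(er),
      // while any other error is skipped over silently below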
if (er) { if (er.code && er.code === 'ETARGET') { // no viable version found return skip(er) } return skip() } // check that the url origin hasn't changed (#1727) and that // there is no newer version available var dFromUrl = d._from && url.parse(d._from).protocol var cFromUrl = curr && curr.from && url.parse(curr.from).protocol if (!curr || dFromUrl && cFromUrl && d._from !== curr.from || d.version !== curr.version || d.version !== l.version) { if (parsed.type === 'local') return updateLocalDeps(l.version) doIt(d.version, l.version) } else { skip() } } } } npm_3.5.2.orig/lib/owner.js0000644000000000000000000001676312631326456013745 0ustar 00000000000000module.exports = owner owner.usage = 'npm owner add [<@scope>/]' + '\nnpm owner rm [<@scope>/]' + '\nnpm owner ls [<@scope>/]' var npm = require('./npm.js') var log = require('npmlog') var mapToRegistry = require('./utils/map-to-registry.js') var readLocalPkg = require('./utils/read-local-package.js') owner.completion = function (opts, cb) { var argv = opts.conf.argv.remain if (argv.length > 4) return cb() if (argv.length <= 2) { var subs = ['add', 'rm'] if (opts.partialWord === 'l') subs.push('ls') else subs.push('ls', 'list') return cb(null, subs) } npm.commands.whoami([], true, function (er, username) { if (er) return cb() var un = encodeURIComponent(username) var byUser, theUser switch (argv[2]) { case 'ls': // FIXME: there used to be registry completion here, but it stopped // making sense somewhere around 50,000 packages on the registry return cb() case 'rm': if (argv.length > 3) { theUser = encodeURIComponent(argv[3]) byUser = '-/by-user/' + theUser + '|' + un return mapToRegistry(byUser, npm.config, function (er, uri, auth) { if (er) return cb(er) console.error(uri) npm.registry.get(uri, { auth: auth }, function (er, d) { if (er) return cb(er) // return the intersection return cb(null, d[theUser].filter(function (p) { // kludge for server adminery. return un === 'isaacs' || d[un].indexOf(p) === -1 })) }) }) } // else fallthrough /*eslint no-fallthrough:0*/ case 'add': if (argv.length > 3) { theUser = encodeURIComponent(argv[3]) byUser = '-/by-user/' + theUser + '|' + un return mapToRegistry(byUser, npm.config, function (er, uri, auth) { if (er) return cb(er) console.error(uri) npm.registry.get(uri, { auth: auth }, function (er, d) { console.error(uri, er || d) // return mine that they're not already on. if (er) return cb(er) var mine = d[un] || [] var theirs = d[theUser] || [] return cb(null, mine.filter(function (p) { return theirs.indexOf(p) === -1 })) }) }) } // just list all users who aren't me. 
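// Hedged sketch (not part of the original file): the registry views this
// completion code relies on return shapes roughly like
//   GET <registry>/-/users             -> { alice: { ... }, bob: { ... } }
//   GET <registry>/-/by-user/alice|bob -> { alice: ['pkg-a'], bob: ['pkg-b'] }
// (usernames and packages invented), so the add/rm candidates fall out of
// simple set differences between the two package lists.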
return mapToRegistry('-/users', npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth: auth }, function (er, list) { if (er) return cb() return cb(null, Object.keys(list).filter(function (n) { return n !== un })) }) }) default: return cb() } }) } function owner (args, cb) { var action = args.shift() switch (action) { case 'ls': case 'list': return ls(args[0], cb) case 'add': return add(args[0], args[1], cb) case 'rm': case 'remove': return rm(args[0], args[1], cb) default: return unknown(action, cb) } } function ls (pkg, cb) { if (!pkg) { return readLocalPkg(function (er, pkg) { if (er) return cb(er) if (!pkg) return cb(owner.usage) ls(pkg, cb) }) } mapToRegistry(pkg, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth: auth }, function (er, data) { var msg = '' if (er) { log.error('owner ls', "Couldn't get owner data", pkg) return cb(er) } var owners = data.maintainers if (!owners || !owners.length) { msg = 'admin party!' } else { msg = owners.map(function (o) { return o.name + ' <' + o.email + '>' }).join('\n') } console.log(msg) cb(er, owners) }) }) } function add (user, pkg, cb) { if (!user) return cb(owner.usage) if (!pkg) { return readLocalPkg(function (er, pkg) { if (er) return cb(er) if (!pkg) return cb(new Error(owner.usage)) add(user, pkg, cb) }) } log.verbose('owner add', '%s to %s', user, pkg) mutate(pkg, user, function (u, owners) { if (!owners) owners = [] for (var i = 0, l = owners.length; i < l; i++) { var o = owners[i] if (o.name === u.name) { log.info( 'owner add', 'Already a package owner: ' + o.name + ' <' + o.email + '>' ) return false } } owners.push(u) return owners }, cb) } function rm (user, pkg, cb) { if (!pkg) { return readLocalPkg(function (er, pkg) { if (er) return cb(er) if (!pkg) return cb(new Error(owner.usage)) rm(user, pkg, cb) }) } log.verbose('owner rm', '%s from %s', user, pkg) mutate(pkg, user, function (u, owners) { var found = false var m = owners.filter(function (o) { var match = (o.name === user) found = found || match return !match }) if (!found) { log.info('owner rm', 'Not a package owner: ' + user) return false } if (!m.length) { return new Error( 'Cannot remove all owners of a package. Add someone else first.' 
) } return m }, cb) } function mutate (pkg, user, mutation, cb) { if (user) { var byUser = '-/user/org.couchdb.user:' + user mapToRegistry(byUser, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth: auth }, mutate_) }) } else { mutate_(null, null) } function mutate_ (er, u) { if (!er && user && (!u || u.error)) { er = new Error( "Couldn't get user data for " + user + ': ' + JSON.stringify(u) ) } if (er) { log.error('owner mutate', 'Error getting user data for %s', user) return cb(er) } if (u) u = { name: u.name, email: u.email } mapToRegistry(pkg, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth: auth }, function (er, data) { if (er) { log.error('owner mutate', 'Error getting package data for %s', pkg) return cb(er) } // save the number of maintainers before mutation so that we can figure // out if maintainers were added or removed var beforeMutation = data.maintainers.length var m = mutation(u, data.maintainers) if (!m) return cb() // handled if (m instanceof Error) return cb(m) // error data = { _id: data._id, _rev: data._rev, maintainers: m } var dataPath = pkg.replace('/', '%2f') + '/-rev/' + data._rev mapToRegistry(dataPath, npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { method: 'PUT', body: data, auth: auth } npm.registry.request(uri, params, function (er, data) { if (!er && data.error) { er = new Error('Failed to update package metadata: ' + JSON.stringify(data)) } if (er) { log.error('owner mutate', 'Failed to update package metadata') } else if (m.length > beforeMutation) { console.log('+ %s (%s)', user, pkg) } else if (m.length < beforeMutation) { console.log('- %s (%s)', user, pkg) } cb(er, data) }) }) }) }) } } function unknown (action, cb) { cb('Usage: \n' + owner.usage) } npm_3.5.2.orig/lib/pack.js0000644000000000000000000000336712631326456013525 0ustar 00000000000000// npm pack // Packs the specified package into a .tgz file, which can then // be installed. module.exports = pack var install = require('./install.js') var cache = require('./cache.js') var fs = require('graceful-fs') var chain = require('slide').chain var path = require('path') var cwd = process.cwd() var writeStream = require('fs-write-stream-atomic') var cachedPackageRoot = require('./cache/cached-package-root.js') pack.usage = 'npm pack [[<@scope>/]...]' // if it can be installed, it can be packed. 
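// Illustrative example (hypothetical package name/version): given the
// tarball-naming logic in pack_() below,
//   npm pack             ->  my-pkg-1.2.3.tgz     for { name: 'my-pkg' }
//   npm pack @me/my-pkg  ->  me-my-pkg-1.2.3.tgz  (leading '@' dropped, '/' -> '-')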
pack.completion = install.completion function pack (args, silent, cb) { if (typeof cb !== 'function') { cb = silent silent = false } if (args.length === 0) args = ['.'] chain( args.map(function (arg) { return function (cb) { pack_(arg, cb) } }), function (er, files) { if (er || silent) return cb(er, files) printFiles(files, cb) } ) } function printFiles (files, cb) { files = files.map(function (file) { return path.relative(cwd, file) }) console.log(files.join('\n')) cb() } // add to cache, then cp to the cwd function pack_ (pkg, cb) { cache.add(pkg, null, null, false, function (er, data) { if (er) return cb(er) // scoped packages get special treatment var name = data.name if (name[0] === '@') name = name.substr(1).replace(/\//g, '-') var fname = name + '-' + data.version + '.tgz' var cached = path.join(cachedPackageRoot(data), 'package.tgz') var from = fs.createReadStream(cached) var to = writeStream(fname) var errState = null from.on('error', cb_) to.on('error', cb_) to.on('close', cb_) from.pipe(to) function cb_ (er) { if (errState) return if (er) return cb(errState = er) cb(null, fname) } }) } npm_3.5.2.orig/lib/ping.js0000644000000000000000000000100712631326456013531 0ustar 00000000000000var npm = require('./npm.js') module.exports = ping ping.usage = 'npm ping\nping registry' function ping (args, silent, cb) { if (typeof cb !== 'function') { cb = silent silent = false } var registry = npm.config.get('registry') if (!registry) return cb(new Error('no default registry set')) var auth = npm.config.getCredentialsByURI(registry) npm.registry.ping(registry, {auth: auth}, function (er, pong) { if (!silent) console.log(JSON.stringify(pong)) cb(er, er ? null : pong) }) } npm_3.5.2.orig/lib/prefix.js0000644000000000000000000000044512631326456014076 0ustar 00000000000000module.exports = prefix var npm = require('./npm.js') prefix.usage = 'npm prefix [-g]' function prefix (args, silent, cb) { if (typeof cb !== 'function') { cb = silent silent = false } if (!silent) console.log(npm.prefix) process.nextTick(cb.bind(this, null, npm.prefix)) } npm_3.5.2.orig/lib/prune.js0000644000000000000000000000277412631326456013741 0ustar 00000000000000// prune extraneous packages. module.exports = prune prune.usage = 'npm prune [[<@scope>/]...] [--production]' var readInstalled = require('read-installed') var npm = require('./npm.js') var path = require('path') var readJson = require('read-package-json') var log = require('npmlog') prune.completion = require('./utils/completion/installed-deep.js') function prune (args, cb) { // check if is a valid package.json file var jsonFile = path.resolve(npm.dir, '..', 'package.json') readJson(jsonFile, log.warn, function (er) { if (er) return cb(er) next() }) function next () { var opt = { depth: npm.config.get('depth'), dev: !npm.config.get('production') || npm.config.get('dev') } readInstalled(npm.prefix, opt, function (er, data) { if (er) return cb(er) prune_(args, data, cb) }) } } function prune_ (args, data, cb) { npm.commands.unbuild(prunables(args, data, []), cb) } function prunables (args, data, seen) { var deps = data.dependencies || {} return Object.keys(deps).map(function (d) { if (typeof deps[d] !== 'object' || seen.indexOf(deps[d]) !== -1) return null seen.push(deps[d]) if (deps[d].extraneous && (args.length === 0 || args.indexOf(d) !== -1)) { var extra = deps[d] delete deps[d] return extra.path } return prunables(args, deps[d], seen) }).filter(function (d) { return d !== null }) .reduce(function FLAT (l, r) { return l.concat(Array.isArray(r) ? 
r.reduce(FLAT, []) : r) }, []) } npm_3.5.2.orig/lib/publish.js0000644000000000000000000001200512631326456014242 0ustar 00000000000000 module.exports = publish var npm = require('./npm.js') var log = require('npmlog') var path = require('path') var readJson = require('read-package-json') var lifecycle = require('./utils/lifecycle.js') var chain = require('slide').chain var mapToRegistry = require('./utils/map-to-registry.js') var cachedPackageRoot = require('./cache/cached-package-root.js') var createReadStream = require('graceful-fs').createReadStream var npa = require('npm-package-arg') var semver = require('semver') var getPublishConfig = require('./utils/get-publish-config.js') publish.usage = 'npm publish [|] [--tag ] [--access ]' + "\n\nPublishes '.' if no argument supplied" + '\n\nSets tag `latest` if no --tag specified' publish.completion = function (opts, cb) { // publish can complete to a folder with a package.json // or a tarball, or a tarball url. // for now, not yet implemented. return cb() } function publish (args, isRetry, cb) { if (typeof cb !== 'function') { cb = isRetry isRetry = false } if (args.length === 0) args = ['.'] if (args.length !== 1) return cb(publish.usage) log.verbose('publish', args) var t = npm.config.get('tag').trim() if (semver.validRange(t)) { var er = new Error('Tag name must not be a valid SemVer range: ' + t) return cb(er) } var arg = args[0] // if it's a local folder, then run the prepublish there, first. readJson(path.resolve(arg, 'package.json'), function (er, data) { if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er) if (data) { if (!data.name) return cb(new Error('No name provided')) if (!data.version) return cb(new Error('No version provided')) } // Error is OK. Could be publishing a URL or tarball, however, that means // that we will not have automatically run the prepublish script, since // that gets run when adding a folder to the cache. if (er) return cacheAddPublish(arg, false, isRetry, cb) else cacheAddPublish(arg, true, isRetry, cb) }) } // didPre in this case means that we already ran the prepublish script, // and that the 'dir' is an actual directory, and not something silly // like a tarball or name@version thing. // That means that we can run publish/postpublish in the dir, rather than // in the cache dir. function cacheAddPublish (dir, didPre, isRetry, cb) { npm.commands.cache.add(dir, null, null, false, function (er, data) { if (er) return cb(er) log.silly('publish', data) var cachedir = path.resolve(cachedPackageRoot(data), 'package') chain( [ !didPre && [lifecycle, data, 'prepublish', cachedir], [publish_, dir, data, isRetry, cachedir], [lifecycle, data, 'publish', didPre ? dir : cachedir], [lifecycle, data, 'postpublish', didPre ? dir : cachedir] ], cb ) }) } function publish_ (arg, data, isRetry, cachedir, cb) { if (!data) return cb(new Error('no package.json file found')) var mappedConfig = getPublishConfig( data.publishConfig, npm.config, npm.registry ) var config = mappedConfig.config var registry = mappedConfig.client data._npmVersion = npm.version data._nodeVersion = process.versions.node delete data.modules if (data.private) { return cb(new Error( 'This package has been marked as private\n' + "Remove the 'private' field from the package.json to publish it." 
)) } mapToRegistry(data.name, config, function (er, registryURI, auth, registryBase) { if (er) return cb(er) var tarballPath = cachedir + '.tgz' // we just want the base registry URL in this case log.verbose('publish', 'registryBase', registryBase) log.silly('publish', 'uploading', tarballPath) data._npmUser = { name: auth.username, email: auth.email } var params = { metadata: data, body: createReadStream(tarballPath), auth: auth } // registry-frontdoor cares about the access level, which is only // configurable for scoped packages if (config.get('access')) { if (!npa(data.name).scope && config.get('access') === 'restricted') { return cb(new Error("Can't restrict access to unscoped packages.")) } params.access = config.get('access') } log.showProgress('publish:' + data._id) registry.publish(registryBase, params, function (er) { if (er && er.code === 'EPUBLISHCONFLICT' && npm.config.get('force') && !isRetry) { log.warn('publish', 'Forced publish over ' + data._id) return npm.commands.unpublish([data._id], function (er) { // ignore errors. Use the force. Reach out with your feelings. // but if it fails again, then report the first error. publish([arg], er || true, cb) }) } // report the unpublish error if this was a retry and unpublish failed if (er && isRetry && isRetry !== true) return cb(isRetry) if (er) return cb(er) log.clearProgress() console.log('+ ' + data._id) cb() }) }) } npm_3.5.2.orig/lib/rebuild.js0000644000000000000000000000407512631326456014232 0ustar 00000000000000 module.exports = rebuild var readInstalled = require('read-installed') var semver = require('semver') var log = require('npmlog') var npm = require('./npm.js') var npa = require('npm-package-arg') rebuild.usage = 'npm rebuild [[<@scope>/]...]' rebuild.completion = require('./utils/completion/installed-deep.js') function rebuild (args, cb) { var opt = { depth: npm.config.get('depth'), dev: true } readInstalled(npm.prefix, opt, function (er, data) { log.info('readInstalled', typeof data) if (er) return cb(er) var set = filter(data, args) var folders = Object.keys(set).filter(function (f) { return f !== npm.prefix }) if (!folders.length) return cb() log.silly('rebuild set', folders) cleanBuild(folders, set, cb) }) } function cleanBuild (folders, set, cb) { npm.commands.build(folders, function (er) { if (er) return cb(er) log.clearProgress() console.log(folders.map(function (f) { return set[f] + ' ' + f }).join('\n')) log.showProgress() cb() }) } function filter (data, args, set, seen) { if (!set) set = {} if (!seen) seen = {} if (set.hasOwnProperty(data.path)) return set if (seen.hasOwnProperty(data.path)) return set seen[data.path] = true var pass if (!args.length) pass = true // rebuild everything else if (data.name && data._id) { for (var i = 0, l = args.length; i < l; i++) { var arg = args[i] var nv = npa(arg) var n = nv.name var v = nv.rawSpec if (n !== data.name) continue if (!semver.satisfies(data.version, v, true)) continue pass = true break } } if (pass && data._id) { log.verbose('rebuild', 'path, id', [data.path, data._id]) set[data.path] = data._id } // need to also dive through kids, always. // since this isn't an install these won't get auto-built unless // they're not dependencies. 
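// Illustrative example for the npa()/semver argument filter earlier in this
// function (hypothetical package name, not from the original source):
//   npm rebuild node-sass         rebuilds every installed copy of node-sass
//   npm rebuild node-sass@^3.0.0  rebuilds only copies whose version satisfies ^3.0.0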
Object.keys(data.dependencies || {}).forEach(function (d) { // return var dep = data.dependencies[d] if (typeof dep === 'string') return filter(dep, args, set, seen) }) return set } npm_3.5.2.orig/lib/repo.js0000644000000000000000000000267012631326456013550 0ustar 00000000000000module.exports = repo repo.usage = 'npm repo []' var npm = require('./npm.js') var opener = require('opener') var hostedGitInfo = require('hosted-git-info') var url_ = require('url') var fetchPackageMetadata = require('./fetch-package-metadata.js') repo.completion = function (opts, cb) { // FIXME: there used to be registry completion here, but it stopped making // sense somewhere around 50,000 packages on the registry cb() } function repo (args, cb) { var n = args.length ? args[0] : '.' fetchPackageMetadata(n, '.', function (er, d) { if (er) return cb(er) getUrlAndOpen(d, cb) }) } function getUrlAndOpen (d, cb) { var r = d.repository if (!r) return cb(new Error('no repository')) // XXX remove this when npm@v1.3.10 from node 0.10 is deprecated // from https://github.com/npm/npm-www/issues/418 var info = hostedGitInfo.fromUrl(r.url) var url = info ? info.browse() : unknownHostedUrl(r.url) if (!url) return cb(new Error('no repository: could not get url')) opener(url, { command: npm.config.get('browser') }, cb) } function unknownHostedUrl (url) { try { var idx = url.indexOf('@') if (idx !== -1) { url = url.slice(idx + 1).replace(/:([^\d]+)/, '/$1') } url = url_.parse(url) var protocol = url.protocol === 'https:' ? 'https:' : 'http:' return protocol + '//' + (url.host || '') + url.path.replace(/\.git$/, '') } catch (e) {} } npm_3.5.2.orig/lib/restart.js0000644000000000000000000000010012631326456014251 0ustar 00000000000000module.exports = require('./utils/lifecycle.js').cmd('restart') npm_3.5.2.orig/lib/root.js0000644000000000000000000000042712631326456013564 0ustar 00000000000000module.exports = root var npm = require('./npm.js') root.usage = 'npm root [-g]' function root (args, silent, cb) { if (typeof cb !== 'function') { cb = silent silent = false } if (!silent) console.log(npm.dir) process.nextTick(cb.bind(this, null, npm.dir)) } npm_3.5.2.orig/lib/run-script.js0000644000000000000000000001221112631326456014701 0ustar 00000000000000module.exports = runScript var lifecycle = require('./utils/lifecycle.js') var npm = require('./npm.js') var path = require('path') var readJson = require('read-package-json') var log = require('npmlog') var chain = require('slide').chain runScript.usage = 'npm run-script [-- ...]' + '\n\nalias: npm run' runScript.completion = function (opts, cb) { // see if there's already a package specified. var argv = opts.conf.argv.remain if (argv.length >= 4) return cb() if (argv.length === 3) { // either specified a script locally, in which case, done, // or a package, in which case, complete against its scripts var json = path.join(npm.localPrefix, 'package.json') return readJson(json, function (er, d) { if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er) if (er) d = {} var scripts = Object.keys(d.scripts || {}) console.error('local scripts', scripts) if (scripts.indexOf(argv[2]) !== -1) return cb() // ok, try to find out which package it was, then var pref = npm.config.get('global') ? 
npm.config.get('prefix') : npm.localPrefix var pkgDir = path.resolve(pref, 'node_modules', argv[2], 'package.json') readJson(pkgDir, function (er, d) { if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er) if (er) d = {} var scripts = Object.keys(d.scripts || {}) return cb(null, scripts) }) }) } readJson(path.join(npm.localPrefix, 'package.json'), function (er, d) { if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er) d = d || {} cb(null, Object.keys(d.scripts || {})) }) } function runScript (args, cb) { if (!args.length) return list(cb) var pkgdir = npm.localPrefix var cmd = args.shift() readJson(path.resolve(pkgdir, 'package.json'), function (er, d) { if (er) return cb(er) run(d, pkgdir, cmd, args, cb) }) } function list (cb) { var json = path.join(npm.localPrefix, 'package.json') var cmdList = [ 'publish', 'install', 'uninstall', 'test', 'stop', 'start', 'restart', 'version' ].reduce(function (l, p) { return l.concat(['pre' + p, p, 'post' + p]) }, []) return readJson(json, function (er, d) { if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er) if (er) d = {} var allScripts = Object.keys(d.scripts || {}) var scripts = [] var runScripts = [] allScripts.forEach(function (script) { if (cmdList.indexOf(script) !== -1) scripts.push(script) else runScripts.push(script) }) if (log.level === 'silent') { return cb(null, allScripts) } if (npm.config.get('json')) { console.log(JSON.stringify(d.scripts || {}, null, 2)) return cb(null, allScripts) } if (npm.config.get('parseable')) { allScripts.forEach(function (script) { console.log(script + ':' + d.scripts[script]) }) return cb(null, allScripts) } var s = '\n ' var prefix = ' ' if (scripts.length) { console.log('Lifecycle scripts included in %s:', d.name) } scripts.forEach(function (script) { console.log(prefix + script + s + d.scripts[script]) }) if (!scripts.length && runScripts.length) { console.log('Scripts available in %s via `npm run-script`:', d.name) } else if (runScripts.length) { console.log('\navailable via `npm run-script`:') } runScripts.forEach(function (script) { console.log(prefix + script + s + d.scripts[script]) }) return cb(null, allScripts) }) } function run (pkg, wd, cmd, args, cb) { if (!pkg.scripts) pkg.scripts = {} var cmds if (cmd === 'restart' && !pkg.scripts.restart) { cmds = [ 'prestop', 'stop', 'poststop', 'restart', 'prestart', 'start', 'poststart' ] } else { if (!pkg.scripts[cmd]) { if (cmd === 'test') { pkg.scripts.test = 'echo \'Error: no test specified\'' } else if (cmd === 'env') { if (process.platform === 'win32') { log.verbose('run-script using default platform env: SET (Windows)') pkg.scripts[cmd] = 'SET' } else { log.verbose('run-script using default platform env: env (Unix)') pkg.scripts[cmd] = 'env' } } else if (npm.config.get('if-present')) { return cb(null) } else { return cb(new Error('missing script: ' + cmd)) } } cmds = [cmd] } if (!cmd.match(/^(pre|post)/)) { cmds = ['pre' + cmd].concat(cmds).concat('post' + cmd) } log.verbose('run-script', cmds) chain(cmds.map(function (c) { // pass cli arguments after -- to script. if (pkg.scripts[c] && c === cmd) { pkg.scripts[c] = pkg.scripts[c] + joinArgs(args) } // when running scripts explicitly, assume that they're trusted. return [lifecycle, pkg, c, wd, true] }), cb) } // join arguments after '--' and pass them to script, // handle special characters such as ', ", ' '. 
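// Illustrative example (not in the original source): joinArgs(['a b', 'say "hi"'])
// produces the raw string
//    "a b" "say \"hi\""
// i.e. every argument is double-quoted and embedded quotes are escaped.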
function joinArgs (args) { var joinedArgs = '' args.forEach(function (arg) { joinedArgs += ' "' + arg.replace(/"/g, '\\"') + '"' }) return joinedArgs } npm_3.5.2.orig/lib/search.js0000644000000000000000000001715412631326456014053 0ustar 00000000000000 module.exports = exports = search var npm = require('./npm.js') var columnify = require('columnify') var updateIndex = require('./cache/update-index.js') search.usage = 'npm search [--long] [search terms ...]' + '\n\naliases: s, se' search.completion = function (opts, cb) { var compl = {} var partial = opts.partialWord var ipartial = partial.toLowerCase() var plen = partial.length // get the batch of data that matches so far. // this is an example of using npm.commands.search programmatically // to fetch data that has been filtered by a set of arguments. search(opts.conf.argv.remain.slice(2), true, function (er, data) { if (er) return cb(er) Object.keys(data).forEach(function (name) { data[name].words.split(' ').forEach(function (w) { if (w.toLowerCase().indexOf(ipartial) === 0) { compl[partial + w.substr(plen)] = true } }) }) cb(null, Object.keys(compl)) }) } function search (args, silent, staleness, cb) { if (typeof cb !== 'function') { cb = staleness staleness = 600 } if (typeof cb !== 'function') { cb = silent silent = false } var searchopts = npm.config.get('searchopts') var searchexclude = npm.config.get('searchexclude') if (typeof searchopts !== 'string') searchopts = '' searchopts = searchopts.split(/\s+/) var opts = searchopts.concat(args).map(function (s) { return s.toLowerCase() }).filter(function (s) { return s }) if (typeof searchexclude === 'string') { searchexclude = searchexclude.split(/\s+/) } else { searchexclude = [] } searchexclude = searchexclude.map(function (s) { return s.toLowerCase() }) getFilteredData(staleness, opts, searchexclude, function (er, data) { // now data is the list of data that we want to show. // prettify and print it, and then provide the raw // data to the cb. if (er || silent) return cb(er, data) console.log(prettify(data, args)) cb(null, data) }) } function getFilteredData (staleness, args, notArgs, cb) { updateIndex(staleness, function (er, data) { if (er) return cb(er) return cb(null, filter(data, args, notArgs)) }) } function filter (data, args, notArgs) { // data={:{package data}} return Object.keys(data).map(function (d) { return data[d] }).filter(function (d) { return typeof d === 'object' }).map(stripData).map(getWords).filter(function (data) { return filterWords(data, args, notArgs) }).reduce(function (l, r) { l[r.name] = r return l }, {}) } function stripData (data) { return { name: data.name, description: npm.config.get('description') ? data.description : '', maintainers: (data.maintainers || []).map(function (m) { return '=' + m.name }), url: !Object.keys(data.versions || {}).length ? 
data.url : null, keywords: data.keywords || [], version: Object.keys(data.versions || {})[0] || [], time: data.time && data.time.modified && (new Date(data.time.modified).toISOString() // remove time .split('T').join(' ') .replace(/:[0-9]{2}\.[0-9]{3}Z$/, '')) .slice(0, -5) || 'prehistoric' } } function getWords (data) { data.words = [ data.name ] .concat(data.description) .concat(data.maintainers) .concat(data.url && ('<' + data.url + '>')) .concat(data.keywords) .map(function (f) { return f && f.trim && f.trim() }) .filter(function (f) { return f }) .join(' ') .toLowerCase() return data } function filterWords (data, args, notArgs) { var words = data.words for (var i = 0, l = args.length; i < l; i++) { if (!match(words, args[i])) return false } for (i = 0, l = notArgs.length; i < l; i++) { if (match(words, notArgs[i])) return false } return true } function match (words, arg) { if (arg.charAt(0) === '/') { arg = arg.replace(/\/$/, '') arg = new RegExp(arg.substr(1, arg.length - 1)) return words.match(arg) } return words.indexOf(arg) !== -1 } function prettify (data, args) { var searchsort = (npm.config.get('searchsort') || 'NAME').toLowerCase() var sortField = searchsort.replace(/^\-+/, '') var searchRev = searchsort.charAt(0) === '-' var truncate = !npm.config.get('long') if (Object.keys(data).length === 0) { return 'No match found for ' + (args.map(JSON.stringify).join(' ')) } var lines = Object.keys(data).map(function (d) { // strip keyname return data[d] }).map(function (dat) { dat.author = dat.maintainers delete dat.maintainers dat.date = dat.time delete dat.time return dat }).map(function (dat) { // split keywords on whitespace or , if (typeof dat.keywords === 'string') { dat.keywords = dat.keywords.split(/[,\s]+/) } if (Array.isArray(dat.keywords)) { dat.keywords = dat.keywords.join(' ') } // split author on whitespace or , if (typeof dat.author === 'string') { dat.author = dat.author.split(/[,\s]+/) } if (Array.isArray(dat.author)) { dat.author = dat.author.join(' ') } return dat }) lines.sort(function (a, b) { var aa = a[sortField].toLowerCase() var bb = b[sortField].toLowerCase() return aa === bb ? 0 : aa < bb ? -1 : 1 }) if (searchRev) lines.reverse() var columns = npm.config.get('description') ? ['name', 'description', 'author', 'date', 'version', 'keywords'] : ['name', 'author', 'date', 'version', 'keywords'] var output = columnify( lines, { include: columns, truncate: truncate, config: { name: { maxWidth: 40, truncate: false, truncateMarker: '' }, description: { maxWidth: 60 }, author: { maxWidth: 20 }, date: { maxWidth: 11 }, version: { maxWidth: 11 }, keywords: { maxWidth: Infinity } } } ) output = trimToMaxWidth(output) output = highlightSearchTerms(output, args) return output } var colors = [31, 33, 32, 36, 34, 35] var cl = colors.length function addColorMarker (str, arg, i) { var m = i % cl + 1 var markStart = String.fromCharCode(m) var markEnd = String.fromCharCode(0) if (arg.charAt(0) === '/') { return str.replace( new RegExp(arg.substr(1, arg.length - 2), 'gi'), function (bit) { return markStart + bit + markEnd } ) } // just a normal string, do the split/map thing var pieces = str.toLowerCase().split(arg.toLowerCase()) var p = 0 return pieces.map(function (piece) { piece = str.substr(p, piece.length) var mark = markStart + str.substr(p + piece.length, arg.length) + markEnd p += piece.length + arg.length return piece + mark }).join('') } function colorize (line) { for (var i = 0; i < cl; i++) { var m = i + 1 var color = npm.color ? 
'\u001B[' + colors[i] + 'm' : '' line = line.split(String.fromCharCode(m)).join(color) } var uncolor = npm.color ? '\u001B[0m' : '' return line.split('\u0000').join(uncolor) } function getMaxWidth () { var cols try { var tty = require('tty') var stdout = process.stdout cols = !tty.isatty(stdout.fd) ? Infinity : process.stdout.getWindowSize()[0] cols = (cols === 0) ? Infinity : cols } catch (ex) { cols = Infinity } return cols } function trimToMaxWidth (str) { var maxWidth = getMaxWidth() return str.split('\n').map(function (line) { return line.slice(0, maxWidth) }).join('\n') } function highlightSearchTerms (str, terms) { terms.forEach(function (arg, i) { str = addColorMarker(str, arg, i) }) return colorize(str).trim() } npm_3.5.2.orig/lib/set.js0000644000000000000000000000042412631326456013371 0ustar 00000000000000 module.exports = set set.usage = 'npm set (See `npm config`)' var npm = require('./npm.js') set.completion = npm.commands.config.completion function set (args, cb) { if (!args.length) return cb(set.usage) npm.commands.config(['set'].concat(args), cb) } npm_3.5.2.orig/lib/shrinkwrap.js0000644000000000000000000001055712631326456014776 0ustar 00000000000000// emit JSON describing versions of all packages currently installed (for later // use with shrinkwrap install) module.exports = exports = shrinkwrap var path = require('path') var log = require('npmlog') var writeFileAtomic = require('write-file-atomic') var iferr = require('iferr') var readPackageTree = require('read-package-tree') var validate = require('aproba') var npm = require('./npm.js') var recalculateMetadata = require('./install/deps.js').recalculateMetadata var validatePeerDeps = require('./install/deps.js').validatePeerDeps var isExtraneous = require('./install/is-extraneous.js') var isOnlyDev = require('./install/is-dev.js').isOnlyDev var packageId = require('./utils/package-id.js') var moduleName = require('./utils/module-name.js') shrinkwrap.usage = 'npm shrinkwrap' function shrinkwrap (args, silent, cb) { if (typeof cb !== 'function') { cb = silent silent = false } if (args.length) { log.warn('shrinkwrap', "doesn't take positional args") } var dir = path.resolve(npm.dir, '..') npm.config.set('production', true) readPackageTree(dir, andRecalculateMetadata(iferr(cb, function (tree) { var pkginfo = treeToShrinkwrap(tree, !!npm.config.get('dev') || /^dev(elopment)?$/.test(npm.config.get('also'))) shrinkwrap_(pkginfo, silent, cb) }))) } function andRecalculateMetadata (next) { validate('F', arguments) return function (er, tree) { validate('EO', arguments) if (er) return next(er) recalculateMetadata(tree, log, next) } } function treeToShrinkwrap (tree, dev) { validate('OB', arguments) var pkginfo = {} if (tree.package.name) pkginfo.name = tree.package.name if (tree.package.version) pkginfo.version = tree.package.version var problems = [] if (tree.children.length) { shrinkwrapDeps(dev, problems, pkginfo.dependencies = {}, tree) } if (problems.length) pkginfo.problems = problems return pkginfo } function shrinkwrapDeps (dev, problems, deps, tree, seen) { validate('BAOO', [dev, problems, deps, tree]) if (!seen) seen = {} if (seen[tree.path]) return seen[tree.path] = true Object.keys(tree.missingDeps).forEach(function (name) { var invalid = tree.children.filter(function (dep) { return moduleName(dep) === name })[0] if (invalid) { problems.push('invalid: have ' + invalid.package._id + ' (expected: ' + tree.missingDeps[name] + ') ' + invalid.path) } else if (!tree.package.optionalDependencies || 
!tree.package.optionalDependencies[name]) { var topname = packageId(tree) problems.push('missing: ' + name + '@' + tree.package.dependencies[name] + (topname ? ', required by ' + topname : '')) } }) tree.children.sort(function (aa, bb) { return moduleName(aa).localeCompare(moduleName(bb)) }).forEach(function (child) { if (!dev && isOnlyDev(child)) { log.warn('shrinkwrap', 'Excluding devDependency: %s', packageId(child), child.parent.package.dependencies) return } var pkginfo = deps[moduleName(child)] = {} pkginfo.version = child.package.version pkginfo.from = child.package._from pkginfo.resolved = child.package._resolved if (isExtraneous(child)) { problems.push('extraneous: ' + child.package._id + ' ' + child.path) } validatePeerDeps(child, function (tree, pkgname, version) { problems.push('peer invalid: ' + pkgname + '@' + version + ', required by ' + child.package._id) }) if (child.children.length) { shrinkwrapDeps(dev, problems, pkginfo.dependencies = {}, child, seen) } }) } function shrinkwrap_ (pkginfo, silent, cb) { if (pkginfo.problems) { return cb(new Error('Problems were encountered\n' + 'Please correct and try again.\n' + pkginfo.problems.join('\n'))) } save(pkginfo, silent, cb) } function save (pkginfo, silent, cb) { // copy the keys over in a well defined order // because javascript objects serialize arbitrarily var swdata try { swdata = JSON.stringify(pkginfo, null, 2) + '\n' } catch (er) { log.error('shrinkwrap', 'Error converting package info to json') return cb(er) } var file = path.resolve(npm.prefix, 'npm-shrinkwrap.json') writeFileAtomic(file, swdata, function (er) { if (er) return cb(er) if (silent) return cb(null, pkginfo) log.clearProgress() console.log('wrote npm-shrinkwrap.json') log.showProgress() cb(null, pkginfo) }) } npm_3.5.2.orig/lib/star.js0000644000000000000000000000213612631326456013551 0ustar 00000000000000module.exports = star var npm = require('./npm.js') var log = require('npmlog') var asyncMap = require('slide').asyncMap var mapToRegistry = require('./utils/map-to-registry.js') star.usage = 'npm star [...]\n' + 'npm unstar [...]' star.completion = function (opts, cb) { // FIXME: there used to be registry completion here, but it stopped making // sense somewhere around 50,000 packages on the registry cb() } function star (args, cb) { if (!args.length) return cb(star.usage) var s = npm.config.get('unicode') ? '\u2605 ' : '(*)' var u = npm.config.get('unicode') ? '\u2606 ' : '( )' var using = !(npm.command.match(/^un/)) if (!using) s = u asyncMap(args, function (pkg, cb) { mapToRegistry(pkg, npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { starred: using, auth: auth } npm.registry.star(uri, params, function (er, data, raw, req) { if (!er) { console.log(s + ' ' + pkg) log.verbose('star', data) } cb(er, data, raw, req) }) }) }, cb) } npm_3.5.2.orig/lib/stars.js0000644000000000000000000000207612631326456013737 0ustar 00000000000000module.exports = stars stars.usage = 'npm stars []' var npm = require('./npm.js') var log = require('npmlog') var mapToRegistry = require('./utils/map-to-registry.js') function stars (args, cb) { npm.commands.whoami([], true, function (er, username) { var name = args.length === 1 ? 
        args[0] : username

    if (er) {
      if (er.code === 'ENEEDAUTH' && !name) {
        var needAuth = new Error("'npm stars' on your own user account requires auth")
        needAuth.code = 'ENEEDAUTH'
        return cb(needAuth)
      }
      if (er.code !== 'ENEEDAUTH') return cb(er)
    }

    mapToRegistry('', npm.config, function (er, uri, auth) {
      if (er) return cb(er)

      var params = {
        username: name,
        auth: auth
      }
      npm.registry.stars(uri, params, showstars)
    })
  })

  function showstars (er, data) {
    if (er) return cb(er)

    if (data.rows.length === 0) {
      log.warn('stars', 'user has not starred any packages.')
    } else {
      data.rows.forEach(function (a) {
        console.log(a.value)
      })
    }
    cb()
  }
}
npm_3.5.2.orig/lib/start.js0000644000000000000000000000007612631326456013736 0ustar 00000000000000module.exports = require('./utils/lifecycle.js').cmd('start')
npm_3.5.2.orig/lib/stop.js0000644000000000000000000000007512631326456013565 0ustar 00000000000000module.exports = require('./utils/lifecycle.js').cmd('stop')
npm_3.5.2.orig/lib/substack.js0000644000000000000000000000073012631326456014415 0ustar 00000000000000module.exports = substack

var npm = require('./npm.js')

var isms = [
  '\u001b[32mbeep \u001b[35mboop\u001b[m',
  'Replace your configs with services',
  'SEPARATE ALL THE CONCERNS!',
  'MODULE ALL THE THINGS!',
  '\\o/',
  'but first, burritos',
  'full time mad scientist here',
  'c/,,\\'
]

function substack (args, cb) {
  var i = Math.floor(Math.random() * isms.length)
  console.log(isms[i])
  var c = args.shift()
  if (c) npm.commands[c](args, cb)
  else cb()
}
npm_3.5.2.orig/lib/tag.js0000644000000000000000000000212712631326456013353 0ustar 00000000000000// turns out tagging isn't very complicated
// all the smarts are in the couch.
module.exports = tag

tag.usage = '[DEPRECATED] npm tag <name>@<version> [<tag>]' +
            '\nSee `dist-tag`'

tag.completion = require('./unpublish.js').completion

var npm = require('./npm.js')
var mapToRegistry = require('./utils/map-to-registry.js')
var npa = require('npm-package-arg')
var semver = require('semver')
var log = require('npmlog')

function tag (args, cb) {
  var thing = npa(args.shift() || '')
  var project = thing.name
  var version = thing.rawSpec
  var t = args.shift() || npm.config.get('tag')

  t = t.trim()

  if (!project || !version || !t) return cb('Usage:\n' + tag.usage)

  if (semver.validRange(t)) {
    var er = new Error('Tag name must not be a valid SemVer range: ' + t)
    return cb(er)
  }

  log.warn('tag', 'This command is deprecated. Use \`npm dist-tag\` instead.')

  mapToRegistry(project, npm.config, function (er, uri, auth) {
    if (er) return cb(er)

    var params = {
      version: version,
      tag: t,
      auth: auth
    }
    npm.registry.tag(uri, params, cb)
  })
}
npm_3.5.2.orig/lib/team.js0000644000000000000000000000257212631326456013530 0ustar 00000000000000var mapToRegistry = require('./utils/map-to-registry.js')
var npm = require('./npm')

module.exports = team

team.subcommands = ['create', 'destroy', 'add', 'rm', 'ls', 'edit']

team.usage = 'npm team create <scope:team>\n' +
             'npm team destroy <scope:team>\n' +
             'npm team add <scope:team> <user>\n' +
             'npm team rm <scope:team> <user>\n' +
             'npm team ls <scope>|<scope:team>\n' +
             'npm team edit <scope:team>'

team.completion = function (opts, cb) {
  var argv = opts.conf.argv.remain
  if (argv.length === 2) {
    return cb(null, team.subcommands)
  }
  switch (argv[2]) {
    case 'ls':
    case 'create':
    case 'destroy':
    case 'add':
    case 'rm':
    case 'edit':
      return cb(null, [])
    default:
      return cb(new Error(argv[2] + ' not recognized'))
  }
}

function team (args, cb) {
  // Entities are in the format <scope>:<team>
  var cmd = args.shift()
  var entity = (args.shift() || '').split(':')
  return mapToRegistry('/', npm.config, function (err, uri, auth) {
    if (err) { return cb(err) }
    try {
      return npm.registry.team(cmd, uri, {
        auth: auth,
        scope: entity[0],
        team: entity[1],
        user: args.shift()
      }, function (err, data) {
        !err && data && console.log(JSON.stringify(data, undefined, 2))
        cb(err, data)
      })
    } catch (e) {
      cb(e.message + '\n\nUsage:\n' + team.usage)
    }
  })
}
npm_3.5.2.orig/lib/test.js0000644000000000000000000000044612631326456013561 0ustar 00000000000000module.exports = test

var testCmd = require('./utils/lifecycle.js').cmd('test')

function test (args, cb) {
  testCmd(args, function (er) {
    if (!er) return cb()
    if (er.code === 'ELIFECYCLE') {
      return cb('Test failed. See above for more details.')
    }
    return cb(er)
  })
}
npm_3.5.2.orig/lib/unbuild.js0000644000000000000000000000746512631326456014244 0ustar 00000000000000module.exports = unbuild
module.exports.rmStuff = rmStuff

unbuild.usage = 'npm unbuild <folder>\n(this is plumbing)'

var readJson = require('read-package-json')
var gentlyRm = require('./utils/gently-rm.js')
var npm = require('./npm.js')
var path = require('path')
var isInside = require('path-is-inside')
var lifecycle = require('./utils/lifecycle.js')
var asyncMap = require('slide').asyncMap
var chain = require('slide').chain
var log = require('npmlog')
var build = require('./build.js')

// args is a list of folders.
// remove any bins/etc, and then delete the folder.
function unbuild (args, silent, cb) {
  if (typeof silent === 'function') {
    cb = silent
    silent = false
  }
  asyncMap(args, unbuild_(silent), cb)
}

function unbuild_ (silent) {
  return function (folder, cb_) {
    function cb (er) {
      cb_(er, path.relative(npm.root, folder))
    }
    folder = path.resolve(folder)
    var base = isInside(folder, npm.prefix) ? npm.prefix : folder
    delete build._didBuild[folder]
    log.verbose('unbuild', folder.substr(npm.prefix.length + 1))
    readJson(path.resolve(folder, 'package.json'), function (er, pkg) {
      // if no json, then just trash it, but no scripts or whatever.
if (er) return gentlyRm(folder, false, base, cb) chain( [ [lifecycle, pkg, 'preuninstall', folder, false, true], [lifecycle, pkg, 'uninstall', folder, false, true], !silent && function (cb) { log.clearProgress() console.log('unbuild ' + pkg._id) log.showProgress() cb() }, [rmStuff, pkg, folder], [lifecycle, pkg, 'postuninstall', folder, false, true], [gentlyRm, folder, false, base] ], cb ) }) } } function rmStuff (pkg, folder, cb) { // if it's global, and folder is in {prefix}/node_modules, // then bins are in {prefix}/bin // otherwise, then bins are in folder/../.bin var parent = path.dirname(folder) var gnm = npm.dir var top = gnm === parent log.verbose('unbuild rmStuff', pkg._id, 'from', gnm) if (!top) log.verbose('unbuild rmStuff', 'in', parent) asyncMap([rmBins, rmMans], function (fn, cb) { fn(pkg, folder, parent, top, cb) }, cb) } function rmBins (pkg, folder, parent, top, cb) { if (!pkg.bin) return cb() var binRoot = top ? npm.bin : path.resolve(parent, '.bin') asyncMap(Object.keys(pkg.bin), function (b, cb) { if (process.platform === 'win32') { chain([ [gentlyRm, path.resolve(binRoot, b) + '.cmd', true, folder], [gentlyRm, path.resolve(binRoot, b), true, folder] ], cb) } else { gentlyRm(path.resolve(binRoot, b), true, folder, cb) } }, cb) } function rmMans (pkg, folder, parent, top, cb) { if (!pkg.man || !top || process.platform === 'win32' || !npm.config.get('global')) { return cb() } var manRoot = path.resolve(npm.config.get('prefix'), 'share', 'man') log.verbose('rmMans', 'man files are', pkg.man, 'in', manRoot) asyncMap(pkg.man, function (man, cb) { if (Array.isArray(man)) { man.forEach(rmMan) } else { rmMan(man) } function rmMan (man) { log.silly('rmMan', 'preparing to remove', man) var parseMan = man.match(/(.*\.([0-9]+)(\.gz)?)$/) if (!parseMan) { log.error( 'rmMan', man, 'is not a valid name for a man file.', 'Man files must end with a number, ' + 'and optionally a .gz suffix if they are compressed.' ) return cb() } var stem = parseMan[1] var sxn = parseMan[2] var gz = parseMan[3] || '' var bn = path.basename(stem) var manDest = path.join( manRoot, 'man' + sxn, (bn.indexOf(pkg.name) === 0 ? bn : pkg.name + '-' + bn) + '.' + sxn + gz ) gentlyRm(manDest, true, cb) } }, cb) } npm_3.5.2.orig/lib/uninstall.js0000644000000000000000000000447012631326456014614 0ustar 00000000000000'use strict' // remove a package. module.exports = uninstall module.exports.Uninstaller = Uninstaller uninstall.usage = 'npm uninstall [<@scope>/][@]... [--save|--save-dev|--save-optional]' + '\n\naliases: remove, rm, r, un, unlink' var util = require('util') var path = require('path') var validate = require('aproba') var chain = require('slide').chain var readJson = require('read-package-json') var npm = require('./npm.js') var Installer = require('./install.js').Installer var getSaveType = require('./install/save.js').getSaveType var removeDeps = require('./install/deps.js').removeDeps var loadExtraneous = require('./install/deps.js').loadExtraneous var log = require('npmlog') uninstall.completion = require('./utils/completion/installed-shallow.js') function uninstall (args, cb) { validate('AF', arguments) // the /path/to/node_modules/.. var dryrun = !!npm.config.get('dry-run') if (args.length === 1 && args[0] === '.') args = [] args = args.filter(function (a) { return path.resolve(a) !== where }) var where = npm.config.get('global') || !args.length ? 
path.resolve(npm.globalDir, '..') : npm.prefix if (args.length) { new Uninstaller(where, dryrun, args).run(cb) } else { // remove this package from the global space, if it's installed there readJson(path.resolve(npm.localPrefix, 'package.json'), function (er, pkg) { if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er) if (er) return cb(uninstall.usage) new Uninstaller(where, dryrun, [pkg.name]).run(cb) }) } } function Uninstaller (where, dryrun, args) { validate('SBA', arguments) Installer.call(this, where, dryrun, args) } util.inherits(Uninstaller, Installer) Uninstaller.prototype.loadArgMetadata = function (next) { this.args = this.args.map(function (arg) { return {name: arg} }) next() } Uninstaller.prototype.loadAllDepsIntoIdealTree = function (cb) { validate('F', arguments) log.silly('uninstall', 'loadAllDepsIntoIdealtree') var saveDeps = getSaveType(this.args) var cg = this.progress.loadAllDepsIntoIdealTree var steps = [] steps.push( [removeDeps, this.args, this.idealTree, saveDeps, cg.newGroup('removeDeps')], [loadExtraneous, this.idealTree, cg.newGroup('loadExtraneous')]) chain(steps, cb) } npm_3.5.2.orig/lib/unpublish.js0000644000000000000000000000701712631326456014614 0ustar 00000000000000 module.exports = unpublish var log = require('npmlog') var npm = require('./npm.js') var readJson = require('read-package-json') var path = require('path') var mapToRegistry = require('./utils/map-to-registry.js') var npa = require('npm-package-arg') var getPublishConfig = require('./utils/get-publish-config.js') unpublish.usage = 'npm unpublish [<@scope>/][@]' unpublish.completion = function (opts, cb) { if (opts.conf.argv.remain.length >= 3) return cb() npm.commands.whoami([], true, function (er, username) { if (er) return cb() var un = encodeURIComponent(username) if (!un) return cb() var byUser = '-/by-user/' + un mapToRegistry(byUser, npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth: auth }, function (er, pkgs) { // do a bit of filtering at this point, so that we don't need // to fetch versions for more than one thing, but also don't // accidentally a whole project. pkgs = pkgs[un] if (!pkgs || !pkgs.length) return cb() var pp = npa(opts.partialWord).name pkgs = pkgs.filter(function (p) { return p.indexOf(pp) === 0 }) if (pkgs.length > 1) return cb(null, pkgs) mapToRegistry(pkgs[0], npm.config, function (er, uri, auth) { if (er) return cb(er) npm.registry.get(uri, { auth: auth }, function (er, d) { if (er) return cb(er) var vers = Object.keys(d.versions) if (!vers.length) return cb(null, pkgs) return cb(null, vers.map(function (v) { return pkgs[0] + '@' + v })) }) }) }) }) }) } function unpublish (args, cb) { if (args.length > 1) return cb(unpublish.usage) var thing = args.length ? npa(args[0]) : {} var project = thing.name var version = thing.rawSpec log.silly('unpublish', 'args[0]', args[0]) log.silly('unpublish', 'thing', thing) if (!version && !npm.config.get('force')) { return cb( 'Refusing to delete entire project.\n' + 'Run with --force to do this.\n' + unpublish.usage ) } if (!project || path.resolve(project) === npm.localPrefix) { // if there's a package.json in the current folder, then // read the package name and version out of that. 
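    // Illustrative example (hypothetical package name, not from the source):
    //   npm unpublish foo         -> refused without --force (would drop every version)
    //   npm unpublish foo@1.0.1   -> removes only version 1.0.1 from the registry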
    var cwdJson = path.join(npm.localPrefix, 'package.json')
    return readJson(cwdJson, function (er, data) {
      if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er)
      if (er) return cb('Usage:\n' + unpublish.usage)
      log.verbose('unpublish', data)
      gotProject(data.name, data.version, data.publishConfig, cb)
    })
  }
  return gotProject(project, version, cb)
}

function gotProject (project, version, publishConfig, cb_) {
  if (typeof cb_ !== 'function') {
    cb_ = publishConfig
    publishConfig = null
  }

  function cb (er) {
    if (er) return cb_(er)
    console.log('- ' + project + (version ? '@' + version : ''))
    cb_()
  }

  var mappedConfig = getPublishConfig(publishConfig, npm.config, npm.registry)
  var config = mappedConfig.config
  var registry = mappedConfig.client

  // remove from the cache first
  npm.commands.cache(['clean', project, version], function (er) {
    if (er) {
      log.error('unpublish', 'Failed to clean cache')
      return cb(er)
    }

    mapToRegistry(project, config, function (er, uri, auth) {
      if (er) return cb(er)

      var params = {
        version: version,
        auth: auth
      }
      registry.unpublish(uri, params, cb)
    })
  })
}
npm_3.5.2.orig/lib/update.js0000644000000000000000000000326112631326456014062 0ustar 00000000000000module.exports = update

update.usage = 'npm update [-g] [<pkg>...]'

var url = require('url')
var log = require('npmlog')
var chain = require('slide').chain
var npm = require('./npm.js')
var Installer = require('./install.js').Installer

update.completion = npm.commands.outdated.completion

function update (args, cb) {
  var dryrun = false
  if (npm.config.get('dry-run')) dryrun = true

  npm.commands.outdated(args, true, function (er, rawOutdated) {
    if (er) return cb(er)
    var outdated = rawOutdated.map(function (ww) {
      return {
        dep: ww[0],
        depname: ww[1],
        current: ww[2],
        wanted: ww[3],
        latest: ww[4],
        req: ww[5],
        what: ww[1] + '@' + ww[3]
      }
    })

    var wanted = outdated.filter(function (ww) {
      if (ww.current === ww.wanted && ww.wanted !== ww.latest) {
        log.verbose(
          'outdated',
          'not updating', ww.depname,
          "because it's currently at the maximum version that matches its specified semver range"
        )
      }
      return ww.current !== ww.wanted
    })
    if (wanted.length === 0) return cb()
    log.info('outdated', 'updating', wanted)
    var toInstall = {}
    wanted.forEach(function (ww) {
      // use the initial installation method (repo, tar, git) for updating
      if (url.parse(ww.req).protocol) ww.what = ww.req

      var where = ww.dep.parent && ww.dep.parent.path || ww.dep.path
      if (toInstall[where]) {
        toInstall[where].push(ww.what)
      } else {
        toInstall[where] = [ww.what]
      }
    })
    chain(Object.keys(toInstall).map(function (where) {
      return [new Installer(where, dryrun, toInstall[where]), 'run']
    }), cb)
  })
}
npm_3.5.2.orig/lib/utils/0000755000000000000000000000000012631326456013400 5ustar 00000000000000npm_3.5.2.orig/lib/version.js0000644000000000000000000001374112631326456014271 0ustar 00000000000000// npm version
module.exports = version

var semver = require('semver')
var path = require('path')
var fs = require('graceful-fs')
var writeFileAtomic = require('write-file-atomic')
var chain = require('slide').chain
var log = require('npmlog')
var npm = require('./npm.js')
var git = require('./utils/git.js')
var assert = require('assert')
var lifecycle = require('./utils/lifecycle.js')
var parseJSON = require('./utils/parse-json.js')

version.usage = 'npm version [<newversion> | major | minor | patch | premajor | preminor | prepatch | prerelease]' +
                '\n(run in package dir)\n' +
                "'npm -v' or 'npm --version' to print npm version " +
                '(' + npm.version + ')\n' +
                "'npm view <pkg> version' to view a package's " +
                'published version\n' +
                "'npm ls' to inspect current package/dependency versions"

function version (args, silent, cb_) {
  if (typeof cb_ !== 'function') {
    cb_ = silent
    silent = false
  }
  if (args.length > 1) return cb_(version.usage)

  var packagePath = path.join(npm.localPrefix, 'package.json')
  fs.readFile(packagePath, function (er, data) {
    if (data) data = data.toString()
    try {
      data = parseJSON(data)
    } catch (e) {
      er = e
      data = null
    }

    if (!args.length) return dump(data, cb_)

    if (er) {
      log.error('version', 'No valid package.json found')
      return cb_(er)
    }

    var newVersion = semver.valid(args[0])
    if (!newVersion) newVersion = semver.inc(data.version, args[0])
    if (!newVersion) return cb_(version.usage)
    if (data.version === newVersion) return cb_(new Error('Version not changed'))
    data.version = newVersion

    var lifecycleData = Object.create(data)
    lifecycleData._id = data.name + '@' + newVersion
    var localData = {}

    var where = npm.prefix
    chain([
      [checkGit, localData],
      [lifecycle, lifecycleData, 'preversion', where],
      [updatePackage, newVersion, silent],
      [lifecycle, lifecycleData, 'version', where],
      [commit, localData, newVersion],
      [lifecycle, lifecycleData, 'postversion', where]
    ], cb_)
  })
}

function readPackage (cb) {
  var packagePath = path.join(npm.localPrefix, 'package.json')
  fs.readFile(packagePath, function (er, data) {
    if (er) return cb(new Error(er))
    if (data) data = data.toString()
    try {
      data = JSON.parse(data)
    } catch (e) {
      er = e
      data = null
    }
    cb(er, data)
  })
}

function updatePackage (newVersion, silent, cb_) {
  function cb (er) {
    if (!er && !silent) console.log('v' + newVersion)
    cb_(er)
  }

  readPackage(function (er, data) {
    if (er) return cb(new Error(er))
    data.version = newVersion
    write(data, 'package.json', cb)
  })
}

function commit (localData, newVersion, cb) {
  updateShrinkwrap(newVersion, function (er, hasShrinkwrap) {
    if (er || !localData.hasGit) return cb(er)
    _commit(newVersion, hasShrinkwrap, cb)
  })
}

function updateShrinkwrap (newVersion, cb) {
  fs.readFile(path.join(npm.localPrefix, 'npm-shrinkwrap.json'), function (er, data) {
    if (er && er.code === 'ENOENT') return cb(null, false)

    try {
      data = data.toString()
      data = parseJSON(data)
    } catch (er) {
      log.error('version', 'Bad npm-shrinkwrap.json data')
      return cb(er)
    }

    data.version = newVersion
    write(data, 'npm-shrinkwrap.json', function (er) {
      if (er) {
        log.error('version', 'Bad npm-shrinkwrap.json data')
        return cb(er)
      }
      cb(null, true)
    })
  })
}

function dump (data, cb) {
  var v = {}

  if (data && data.name && data.version) v[data.name] = data.version
  v.npm = npm.version
  Object.keys(process.versions).sort().forEach(function (k) {
    v[k] = process.versions[k]
  })

  if (npm.config.get('json')) v = JSON.stringify(v, null, 2)

  console.log(v)
  cb()
}

function checkGit (localData, cb) {
  fs.stat(path.join(npm.localPrefix, '.git'), function (er, s) {
    var doGit = !er && npm.config.get('git-tag-version')
    if (!doGit) {
      if (er) log.verbose('version', 'error checking for .git', er)
      log.verbose('version', 'not tagging in git')
      return cb(null, false)
    }

    // check for git
    git.whichAndExec(
      [ 'status', '--porcelain' ],
      { env: process.env },
      function (er, stdout) {
        if (er && er.code === 'ENOGIT') {
          log.warn(
            'version',
            'This is a Git checkout, but the git command was not found.',
            'npm could not create a Git tag for this release!'
          )
          return cb(null, false)
        }

        var lines = stdout.trim().split('\n').filter(function (line) {
          return line.trim() && !line.match(/^\?\? /)
        }).map(function (line) {
          return line.trim()
        })
        if (lines.length && !npm.config.get('force')) {
          return cb(new Error(
            'Git working directory not clean.\n' + lines.join('\n')
          ))
        }
        localData.hasGit = true
        cb(null, true)
      }
    )
  })
}

function _commit (version, hasShrinkwrap, cb) {
  var options = { env: process.env }
  var message = npm.config.get('message').replace(/%s/g, version)
  var sign = npm.config.get('sign-git-tag')
  var flag = sign ? '-sm' : '-am'
  chain(
    [
      git.chainableExec([ 'add', 'package.json' ], options),
      hasShrinkwrap && git.chainableExec([ 'add', 'npm-shrinkwrap.json' ], options),
      git.chainableExec([ 'commit', '-m', message ], options),
      git.chainableExec([
        'tag', npm.config.get('tag-version-prefix') + version,
        flag, message
      ], options)
    ],
    cb
  )
}

function write (data, file, cb) {
  assert(data && typeof data === 'object', 'must pass data to version write')
  assert(typeof file === 'string', 'must pass filename to write to version write')

  log.verbose('version.write', 'data', data, 'to', file)
  writeFileAtomic(
    path.join(npm.localPrefix, file),
    new Buffer(JSON.stringify(data, null, 2) + '\n'),
    cb
  )
}
npm_3.5.2.orig/lib/view.js0000644000000000000000000002055212631326456013554 0ustar 00000000000000// npm view [pkg [pkg ...]]
module.exports = view

view.usage = 'npm view [<@scope>/]<pkg>[@<version>] [<field>[.subfield]...]' +
             '\n\naliases: info, show, v'

var npm = require('./npm.js')
var readJson = require('read-package-json')
var log = require('npmlog')
var util = require('util')
var semver = require('semver')
var mapToRegistry = require('./utils/map-to-registry.js')
var npa = require('npm-package-arg')
var path = require('path')

view.completion = function (opts, cb) {
  if (opts.conf.argv.remain.length <= 2) {
    // FIXME: there used to be registry completion here, but it stopped making
    // sense somewhere around 50,000 packages on the registry
    return cb()
  }
  // have the package, get the fields.
  var tag = npm.config.get('tag')
  mapToRegistry(opts.conf.argv.remain[2], npm.config, function (er, uri, auth) {
    if (er) return cb(er)

    npm.registry.get(uri, { auth: auth }, function (er, d) {
      if (er) return cb(er)
      var dv = d.versions[d['dist-tags'][tag]]
      var fields = []
      d.versions = Object.keys(d.versions).sort(semver.compareLoose)
      fields = getFields(d).concat(getFields(dv))
      cb(null, fields)
    })
  })

  function getFields (d, f, pref) {
    f = f || []
    if (!d) return f
    pref = pref || []
    Object.keys(d).forEach(function (k) {
      if (k.charAt(0) === '_' || k.indexOf('.') !== -1) return
      var p = pref.concat(k).join('.')
      f.push(p)
      if (Array.isArray(d[k])) {
        d[k].forEach(function (val, i) {
          var pi = p + '[' + i + ']'
          if (val && typeof val === 'object') getFields(val, f, [p])
          else f.push(pi)
        })
        return
      }
      if (typeof d[k] === 'object') getFields(d[k], f, [p])
    })
    return f
  }
}

function view (args, silent, cb) {
  if (typeof cb !== 'function') {
    cb = silent
    silent = false
  }

  if (!args.length) args = ['.']

  var pkg = args.shift()
  var nv = npa(pkg)
  var name = nv.name
  var local = (name === '.'
|| !name)
  if (npm.config.get('global') && local) {
    return cb(new Error('Cannot use view command in global mode.'))
  }

  if (local) {
    var dir = npm.prefix
    readJson(path.resolve(dir, 'package.json'), function (er, d) {
      d = d || {}
      if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er)
      if (!d.name) return cb(new Error('Invalid package.json'))

      var p = d.name
      nv = npa(p)
      if (pkg && ~pkg.indexOf('@')) {
        nv.rawSpec = pkg.split('@')[pkg.indexOf('@')]
      }

      fetchAndRead(nv, args, silent, cb)
    })
  } else {
    fetchAndRead(nv, args, silent, cb)
  }
}

function fetchAndRead (nv, args, silent, cb) {
  // get the data about this package
  var name = nv.name
  var version = nv.rawSpec || npm.config.get('tag')

  mapToRegistry(name, npm.config, function (er, uri, auth) {
    if (er) return cb(er)

    npm.registry.get(uri, { auth: auth }, function (er, data) {
      if (er) return cb(er)
      if (data['dist-tags'] && data['dist-tags'].hasOwnProperty(version)) {
        version = data['dist-tags'][version]
      }

      if (data.time && data.time.unpublished) {
        var u = data.time.unpublished
        er = new Error('Unpublished by ' + u.name + ' on ' + u.time)
        er.statusCode = 404
        er.code = 'E404'
        er.pkgid = data._id
        return cb(er, data)
      }

      var results = []
      var error = null
      var versions = data.versions || {}
      data.versions = Object.keys(versions).sort(semver.compareLoose)
      if (!args.length) args = ['']

      // remove readme unless we asked for it
      if (args.indexOf('readme') === -1) {
        delete data.readme
      }

      Object.keys(versions).forEach(function (v) {
        if (semver.satisfies(v, version, true)) {
          args.forEach(function (args) {
            // remove readme unless we asked for it
            if (args.indexOf('readme') !== -1) {
              delete versions[v].readme
            }
            results.push(showFields(data, versions[v], args))
          })
        }
      })
      results = results.reduce(reducer, {})
      var retval = results

      if (args.length === 1 && args[0] === '') {
        retval = cleanBlanks(retval)
        log.silly('cleanup', retval)
      }

      if (error || silent) cb(error, retval)
      else printData(results, data._id, cb.bind(null, error, retval))
    })
  })
}

function cleanBlanks (obj) {
  var clean = {}
  Object.keys(obj).forEach(function (version) {
    clean[version] = obj[version]['']
  })
  return clean
}

function reducer (l, r) {
  if (r) {
    Object.keys(r).forEach(function (v) {
      l[v] = l[v] || {}
      Object.keys(r[v]).forEach(function (t) {
        l[v][t] = r[v][t]
      })
    })
  }
  return l
}

// return whatever was printed
function showFields (data, version, fields) {
  var o = {}
  ;[data, version].forEach(function (s) {
    Object.keys(s).forEach(function (k) {
      o[k] = s[k]
    })
  })
  return search(o, fields.split('.'), version.version, fields)
}

function search (data, fields, version, title) {
  var field
  var tail = fields
  while (!field && fields.length) field = tail.shift()
  fields = [field].concat(tail)
  var o
  if (!field && !tail.length) {
    o = {}
    o[version] = {}
    o[version][title] = data
    return o
  }
  var index = field.match(/(.+)\[([^\]]+)\]$/)
  if (index) {
    field = index[1]
    index = index[2]
    if (data[field] && data[field].hasOwnProperty(index)) {
      return search(data[field][index], tail, version, title)
    } else {
      field = field + '[' + index + ']'
    }
  }
  if (Array.isArray(data)) {
    if (data.length === 1) {
      return search(data[0], fields, version, title)
    }
    var results = []
    data.forEach(function (data, i) {
      var tl = title.length
      var newt = title.substr(0, tl - fields.join('.').length - 1) +
                 '[' + i + ']' + [''].concat(fields).join('.')
      results.push(search(data, fields.slice(), version, newt))
    })
    results = results.reduce(reducer, {})
    return results
  }
  if (!data.hasOwnProperty(field)) return undefined
  data = data[field]
  if (tail.length) {
    if
(typeof data === 'object') { // there are more fields to deal with. return search(data, tail, version, title) } else { return new Error('Not an object: ' + data) } } o = {} o[version] = {} o[version][title] = data return o } function printData (data, name, cb) { var versions = Object.keys(data) var msg = '' var includeVersions = versions.length > 1 var includeFields versions.forEach(function (v) { var fields = Object.keys(data[v]) includeFields = includeFields || (fields.length > 1) fields.forEach(function (f) { var d = cleanup(data[v][f]) if (includeVersions || includeFields || typeof d !== 'string') { d = cleanup(data[v][f]) d = npm.config.get('json') ? JSON.stringify(d, null, 2) : util.inspect(d, false, 5, npm.color) } else if (typeof d === 'string' && npm.config.get('json')) { d = JSON.stringify(d) } if (f && includeFields) f += ' = ' if (d.indexOf('\n') !== -1) d = ' \n' + d msg += (includeVersions ? name + '@' + v + ' ' : '') + (includeFields ? f : '') + d + '\n' }) }) // preserve output symmetry by adding a whitespace-only line at the end if // there's one at the beginning if (/^\s*\n/.test(msg)) msg += '\n' // print directly to stdout to not unnecessarily add blank lines process.stdout.write(msg) cb(null, data) } function cleanup (data) { if (Array.isArray(data)) { return data.map(cleanup) } if (!data || typeof data !== 'object') return data if (typeof data.versions === 'object' && data.versions && !Array.isArray(data.versions)) { data.versions = Object.keys(data.versions || {}) } var keys = Object.keys(data) keys.forEach(function (d) { if (d.charAt(0) === '_') delete data[d] else if (typeof data[d] === 'object') data[d] = cleanup(data[d]) }) keys = Object.keys(data) if (keys.length <= 3 && data.name && (keys.length === 1 || keys.length === 3 && data.email && data.url || keys.length === 2 && (data.email || data.url))) { data = unparsePerson(data) } return data } function unparsePerson (d) { if (typeof d === 'string') return d return d.name + (d.email ? ' <' + d.email + '>' : '') + (d.url ? 
' (' + d.url + ')' : '') } npm_3.5.2.orig/lib/visnup.js0000644000000000000000000000774312631326456014135 0ustar 00000000000000module.exports = visnup var npm = require('./npm.js') var handsomeFace = [ [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 232, 237, 236, 236, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 235, 236, 235, 233, 237, 235, 233, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 232, 235, 233, 232, 235, 235, 234, 233, 236, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 237, 235, 232, 232, 234, 233, 233, 232, 232, 233, 232, 232, 235, 232, 233, 234, 234, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 232, 232, 232, 239, 238, 235, 233, 232, 232, 232, 232, 232, 232, 232, 233, 235, 232, 233, 233, 232, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 234, 234, 232, 233, 234, 233, 234, 235, 233, 235, 60, 238, 238, 234, 234, 233, 234, 233, 238, 251, 246, 233, 233, 232, 0, 0, 0, 0, 0, 0], [0, 0, 233, 233, 233, 232, 232, 239, 249, 251, 252, 231, 231, 188, 250, 254, 59, 60, 255, 231, 231, 231, 252, 235, 239, 235, 232, 233, 0, 0, 0, 0, 0, 0], [0, 0, 232, 233, 232, 232, 232, 248, 231, 231, 231, 231, 231, 231, 231, 254, 238, 254, 231, 231, 231, 231, 231, 252, 233, 235, 237, 233, 234, 0, 0, 0, 0, 0], [0, 0, 233, 232, 232, 232, 248, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 251, 233, 233, 233, 236, 233, 0, 0, 0, 0], [232, 233, 233, 232, 232, 246, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 249, 233, 234, 234, 0, 0, 0, 0], [232, 232, 232, 232, 233, 249, 231, 255, 255, 255, 255, 254, 109, 60, 239, 237, 238, 237, 235, 235, 235, 235, 236, 235, 235, 235, 234, 232, 232, 232, 232, 232, 233, 0], [0, 232, 232, 233, 233, 233, 233, 233, 233, 233, 233, 233, 235, 236, 238, 238, 235, 188, 254, 254, 145, 236, 252, 254, 254, 254, 254, 249, 236, 235, 232, 232, 233, 0], [0, 0, 233, 237, 249, 239, 233, 252, 231, 231, 231, 231, 231, 231, 254, 235, 235, 254, 231, 231, 251, 235, 237, 231, 231, 231, 231, 7, 237, 235, 232, 233, 233, 0], [0, 0, 0, 0, 233, 248, 239, 233, 231, 231, 231, 231, 254, 233, 233, 235, 254, 255, 231, 254, 237, 236, 254, 239, 235, 235, 233, 233, 232, 232, 233, 232, 0, 0], [0, 0, 0, 232, 233, 246, 255, 255, 236, 236, 236, 236, 236, 255, 231, 231, 231, 231, 231, 231, 252, 234, 248, 231, 231, 231, 231, 248, 232, 232, 232, 0, 0, 0], [0, 0, 0, 0, 235, 237, 7, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 255, 238, 235, 7, 231, 231, 231, 246, 232, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 235, 103, 188, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 231, 252, 232, 238, 231, 231, 255, 244, 232, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 235, 236, 103, 146, 253, 255, 231, 231, 231, 231, 231, 253, 251, 250, 250, 250, 246, 232, 235, 152, 255, 146, 66, 233, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 233, 103, 146, 146, 146, 146, 254, 231, 231, 231, 109, 103, 146, 255, 188, 239, 240, 103, 255, 253, 103, 238, 234, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 232, 235, 109, 146, 146, 146, 146, 146, 252, 152, 146, 146, 146, 146, 146, 146, 146, 146, 146, 146, 103, 235, 233, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 235, 235, 103, 146, 146, 146, 146, 146, 146, 188, 188, 188, 188, 188, 188, 152, 146, 146, 146, 66, 235, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 233, 235, 66, 146, 146, 146, 146, 152, 255, 146, 240, 239, 241, 109, 146, 146, 146, 103, 233, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 234, 237, 109, 146, 146, 146, 146, 
146, 254, 231, 231, 188, 146, 146, 146, 103, 233, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 233, 237, 60, 103, 146, 146, 146, 146, 146, 103, 66, 60, 235, 232, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 232, 233, 233, 236, 235, 237, 235, 237, 237, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0] ] function visnup (args, cb) { handsomeFace.forEach(function (line) { console.log(line.map(function (ch) { return '\u001b[' + (ch ? '48;5;' + ch : ch) + 'm' }).join(' ')) }) var c = args.shift() if (c) npm.commands[c](args, cb) else cb() } npm_3.5.2.orig/lib/whoami.js0000644000000000000000000000261712631326456014070 0ustar 00000000000000var npm = require('./npm.js') module.exports = whoami whoami.usage = 'npm whoami [--registry ]\n(just prints username according to given registry)' function whoami (args, silent, cb) { // FIXME: need tighter checking on this, but is a breaking change if (typeof cb !== 'function') { cb = silent silent = false } var registry = npm.config.get('registry') if (!registry) return cb(new Error('no default registry set')) var auth = npm.config.getCredentialsByURI(registry) if (auth) { if (auth.username) { if (!silent) console.log(auth.username) return process.nextTick(cb.bind(this, null, auth.username)) } else if (auth.token) { return npm.registry.whoami(registry, { auth: auth }, function (er, username) { if (er) return cb(er) if (!username) { var needNewSession = new Error( 'Your auth token is no longer valid. Please log in again.' ) needNewSession.code = 'ENEEDAUTH' return cb(needNewSession) } if (!silent) console.log(username) cb(null, username) }) } } // At this point, if they have a credentials object, it doesn't have a token // or auth in it. Probably just the default registry. var needAuth = new Error( 'this command requires you to be logged in.' ) needAuth.code = 'ENEEDAUTH' process.nextTick(cb.bind(this, needAuth)) } npm_3.5.2.orig/lib/xmas.js0000644000000000000000000000311112631326456013542 0ustar 00000000000000// happy xmas var log = require('npmlog') module.exports = function (args, cb) { var s = process.platform === 'win32' ? ' *' : ' \u2605' var f = '\uFF0F' var b = '\uFF3C' var x = process.platform === 'win32' ? 
' ' : '' var o = [ '\u0069', '\u0020', '\u0020', '\u0020', '\u0020', '\u0020', '\u0020', '\u0020', '\u0020', '\u0020', '\u0020', '\u0020', '\u0020', '\u2E1B', '\u2042', '\u2E2E', '&', '@', '\uFF61' ] var oc = [21, 33, 34, 35, 36, 37] var l = '\u005e' function w (s) { process.stderr.write(s) } w('\n') ;(function T (H) { for (var i = 0; i < H; i++) w(' ') w(x + '\u001b[33m' + s + '\n') var M = H * 2 - 1 for (var L = 1; L <= H; L++) { var O = L * 2 - 2 var S = (M - O) / 2 for (i = 0; i < S; i++) w(' ') w(x + '\u001b[32m' + f) for (i = 0; i < O; i++) { w( '\u001b[' + oc[Math.floor(Math.random() * oc.length)] + 'm' + o[Math.floor(Math.random() * o.length)] ) } w(x + '\u001b[32m' + b + '\n') } w(' ') for (i = 1; i < H; i++) w('\u001b[32m' + l) w('| ' + x + ' |') for (i = 1; i < H; i++) w('\u001b[32m' + l) if (H > 10) { w('\n ') for (i = 1; i < H; i++) w(' ') w('| ' + x + ' |') for (i = 1; i < H; i++) w(' ') } })(20) w('\n\n') log.heading = '' log.addLevel('npm', 100000, log.headingStyle) log.npm('loves you', 'Happy Xmas, Noders!') cb() } var dg = false Object.defineProperty(module.exports, 'usage', {get: function () { if (dg) module.exports([], function () {}) dg = true return ' ' }}) npm_3.5.2.orig/lib/cache/add-local-tarball.js0000644000000000000000000001325412631326456017105 0ustar 00000000000000var mkdir = require('mkdirp') var assert = require('assert') var fs = require('graceful-fs') var writeFileAtomic = require('write-file-atomic') var path = require('path') var sha = require('sha') var npm = require('../npm.js') var log = require('npmlog') var tar = require('../utils/tar.js') var pathIsInside = require('path-is-inside') var getCacheStat = require('./get-stat.js') var cachedPackageRoot = require('./cached-package-root.js') var chownr = require('chownr') var inflight = require('inflight') var once = require('once') var writeStream = require('fs-write-stream-atomic') var tempFilename = require('../utils/temp-filename.js') var rimraf = require('rimraf') var packageId = require('../utils/package-id.js') module.exports = addLocalTarball function addLocalTarball (p, pkgData, shasum, cb) { assert(typeof p === 'string', 'must have path') assert(typeof cb === 'function', 'must have callback') if (!pkgData) pkgData = {} // If we don't have a shasum yet, compute it. if (!shasum) { return sha.get(p, function (er, shasum) { if (er) return cb(er) log.silly('addLocalTarball', 'shasum (computed)', shasum) addLocalTarball(p, pkgData, shasum, cb) }) } if (pathIsInside(p, npm.cache)) { if (path.basename(p) !== 'package.tgz') { return cb(new Error('Not a valid cache tarball name: ' + p)) } log.verbose('addLocalTarball', 'adding from inside cache', p) return addPlacedTarball(p, pkgData, shasum, cb) } addTmpTarball(p, pkgData, shasum, function (er, data) { if (data) { data._resolved = p data._shasum = data._shasum || shasum } return cb(er, data) }) } function addPlacedTarball (p, pkgData, shasum, cb) { assert(pkgData, 'should have package data by now') assert(typeof cb === 'function', 'cb function required') getCacheStat(function (er, cs) { if (er) return cb(er) return addPlacedTarball_(p, pkgData, cs.uid, cs.gid, shasum, cb) }) } function addPlacedTarball_ (p, pkgData, uid, gid, resolvedSum, cb) { var folder = path.join(cachedPackageRoot(pkgData), 'package') // First, make sure we have the shasum, if we don't already. 
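// (Sketch of what the `sha` module is doing for us here, for orientation
// only: sha.get(p, cb) streams the tarball through a SHA-1 hash, roughly
// crypto.createHash('sha1').update(bytes).digest('hex') -- the real work
// stays delegated to `sha` below.)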
if (!resolvedSum) { sha.get(p, function (er, shasum) { if (er) return cb(er) addPlacedTarball_(p, pkgData, uid, gid, shasum, cb) }) return } mkdir(folder, function (er) { if (er) return cb(er) var pj = path.join(folder, 'package.json') var json = JSON.stringify(pkgData, null, 2) writeFileAtomic(pj, json, function (er) { cb(er, pkgData) }) }) } function addTmpTarball (tgz, pkgData, shasum, cb) { assert(typeof cb === 'function', 'must have callback function') assert(shasum, 'must have shasum by now') cb = inflight('addTmpTarball:' + tgz, cb) if (!cb) return log.verbose('addTmpTarball', tgz, 'already in flight; not adding') log.verbose('addTmpTarball', tgz, 'not in flight; adding') // we already have the package info, so just move into place if (pkgData && pkgData.name && pkgData.version) { log.verbose( 'addTmpTarball', 'already have metadata; skipping unpack for', packageId(pkgData) ) return addTmpTarball_(tgz, pkgData, shasum, cb) } // This is a tarball we probably downloaded from the internet. The shasum's // already been checked, but we haven't ever had a peek inside, so we unpack // it here just to make sure it is what it says it is. // // NOTE: we might not have any clue what we think it is, for example if the // user just did `npm install ./foo.tgz` var target = tempFilename('unpack') getCacheStat(function (er, cs) { if (er) return cb(er) log.verbose('addTmpTarball', 'validating metadata from', tgz) tar.unpack(tgz, target, null, null, cs.uid, cs.gid, function (unpackEr, data) { // cleanup the extracted package and move on with the metadata rimraf(target, function () { if (unpackEr) return cb(unpackEr) // check that this is what we expected. if (!data.name) { return cb(new Error('No name provided')) } else if (pkgData.name && data.name !== pkgData.name) { return cb(new Error('Invalid Package: expected ' + pkgData.name + ' but found ' + data.name)) } if (!data.version) { return cb(new Error('No version provided')) } else if (pkgData.version && data.version !== pkgData.version) { return cb(new Error('Invalid Package: expected ' + packageId(pkgData) + ' but found ' + packageId(data))) } addTmpTarball_(tgz, data, shasum, cb) }) }) }) } function addTmpTarball_ (tgz, data, shasum, cb) { assert(typeof cb === 'function', 'must have callback function') cb = once(cb) assert(data.name, 'should have package name by now') assert(data.version, 'should have package version by now') var root = cachedPackageRoot(data) var pkg = path.resolve(root, 'package') var target = path.resolve(root, 'package.tgz') getCacheStat(function (er, cs) { if (er) return cb(er) mkdir(pkg, function (er, created) { // chown starting from the first dir created by mkdirp, // or the root dir, if none had to be created, so that // we know that we get all the children. function chown () { chownr(created || root, cs.uid, cs.gid, done) } if (er) return cb(er) var read = fs.createReadStream(tgz) var write = writeStream(target, { mode: npm.modes.file }) var fin = cs.uid && cs.gid ? 
chown : done read.on('error', cb).pipe(write).on('error', cb).on('close', fin) }) }) function done () { data._shasum = data._shasum || shasum cb(null, data) } } npm_3.5.2.orig/lib/cache/add-local.js0000644000000000000000000001006412631326456015462 0ustar 00000000000000var assert = require('assert') var path = require('path') var mkdir = require('mkdirp') var chownr = require('chownr') var pathIsInside = require('path-is-inside') var readJson = require('read-package-json') var log = require('npmlog') var npm = require('../npm.js') var tar = require('../utils/tar.js') var deprCheck = require('../utils/depr-check.js') var getCacheStat = require('./get-stat.js') var cachedPackageRoot = require('./cached-package-root.js') var addLocalTarball = require('./add-local-tarball.js') var sha = require('sha') var inflight = require('inflight') var lifecycle = require('../utils/lifecycle.js') var iferr = require('iferr') module.exports = addLocal function addLocal (p, pkgData, cb_) { assert(typeof p === 'object', 'must have spec info') assert(typeof cb === 'function', 'must have callback') pkgData = pkgData || {} function cb (er, data) { if (er) { log.error('addLocal', 'Could not install %s', p.spec) return cb_(er) } if (data && !data._fromHosted) { data._from = path.relative(npm.prefix, p.spec) || '.' var resolved = path.relative(npm.prefix, p.spec) if (resolved) data._resolved = 'file:' + resolved } return cb_(er, data) } if (p.type === 'directory') { addLocalDirectory(p.spec, pkgData, null, cb) } else { addLocalTarball(p.spec, pkgData, null, cb) } } // At this point, if shasum is set, it's something that we've already // read and checked. Just stashing it in the data at this point. function addLocalDirectory (p, pkgData, shasum, cb) { assert(pkgData, 'must pass package data') assert(typeof cb === 'function', 'must have callback') // if it's a folder, then read the package.json, // tar it to the proper place, and add the cache tar if (pathIsInside(p, npm.cache)) { return cb(new Error( 'Adding a cache directory to the cache will make the world implode.' 
)) } readJson(path.join(p, 'package.json'), false, function (er, data) { if (er) return cb(er) if (!data.name) { return cb(new Error('No name provided in package.json')) } else if (pkgData.name && pkgData.name !== data.name) { return cb(new Error( 'Invalid package: expected ' + pkgData.name + ' but found ' + data.name )) } if (!data.version) { return cb(new Error('No version provided in package.json')) } else if (pkgData.version && pkgData.version !== data.version) { return cb(new Error( 'Invalid package: expected ' + pkgData.name + '@' + pkgData.version + ' but found ' + data.name + '@' + data.version )) } deprCheck(data) // pack to {cache}/name/ver/package.tgz var root = cachedPackageRoot(data) var tgz = path.resolve(root, 'package.tgz') var pj = path.resolve(root, 'package/package.json') var wrapped = inflight(tgz, next) if (!wrapped) return log.verbose('addLocalDirectory', tgz, 'already in flight; waiting') log.verbose('addLocalDirectory', tgz, 'not in flight; packing') getCacheStat(function (er, cs) { mkdir(path.dirname(pj), function (er, made) { if (er) return cb(er) var doPrePublish = !pathIsInside(p, npm.tmp) if (doPrePublish) { lifecycle(data, 'prepublish', p, iferr(cb, thenPack)) } else { thenPack() } function thenPack () { tar.pack(tgz, p, data, function (er) { if (er) { log.error('addLocalDirectory', 'Could not pack', p, 'to', tgz) return cb(er) } if (!cs || isNaN(cs.uid) || isNaN(cs.gid)) wrapped() chownr(made || tgz, cs.uid, cs.gid, wrapped) }) } }) }) function next (er) { if (er) return cb(er) // if we have the shasum already, just add it if (shasum) { return addLocalTarball(tgz, data, shasum, cb) } else { sha.get(tgz, function (er, shasum) { if (er) { return cb(er) } data._shasum = shasum return addLocalTarball(tgz, data, shasum, cb) }) } } }) } npm_3.5.2.orig/lib/cache/add-named.js0000644000000000000000000002176212631326456015463 0ustar 00000000000000var path = require('path') var assert = require('assert') var fs = require('graceful-fs') var http = require('http') var log = require('npmlog') var semver = require('semver') var readJson = require('read-package-json') var url = require('url') var npm = require('../npm.js') var deprCheck = require('../utils/depr-check.js') var inflight = require('inflight') var addRemoteTarball = require('./add-remote-tarball.js') var cachedPackageRoot = require('./cached-package-root.js') var mapToRegistry = require('../utils/map-to-registry.js') var pulseTillDone = require('../utils/pulse-till-done.js') var packageId = require('../utils/package-id.js') module.exports = addNamed function getOnceFromRegistry (name, from, next, done) { function fixName (err, data, json, resp) { // this is only necessary until npm/npm-registry-client#80 is fixed if (err && err.pkgid && err.pkgid !== name) { err.message = err.message.replace( new RegExp(': ' + err.pkgid.replace(/(\W)/g, '\\$1') + '$'), ': ' + name ) err.pkgid = name } next(err, data, json, resp) } mapToRegistry(name, npm.config, function (er, uri, auth) { if (er) return done(er) var key = 'registry:' + uri next = inflight(key, next) if (!next) return log.verbose(from, key, 'already in flight; waiting') else log.verbose(from, key, 'not in flight; fetching') npm.registry.get(uri, { auth: auth }, pulseTillDone('fetchRegistry', fixName)) }) } function addNamed (name, version, data, cb_) { assert(typeof name === 'string', 'must have module name') assert(typeof cb_ === 'function', 'must have callback') var key = name + '@' + version log.silly('addNamed', key) function cb (er, data) { if (data && 
!data._fromHosted) data._from = key cb_(er, data) } if (semver.valid(version, true)) { log.verbose('addNamed', JSON.stringify(version), 'is a plain semver version for', name) addNameVersion(name, version, data, cb) } else if (semver.validRange(version, true)) { log.verbose('addNamed', JSON.stringify(version), 'is a valid semver range for', name) addNameRange(name, version, data, cb) } else { log.verbose('addNamed', JSON.stringify(version), 'is being treated as a dist-tag for', name) addNameTag(name, version, data, cb) } } function addNameTag (name, tag, data, cb) { log.info('addNameTag', [name, tag]) var explicit = true if (!tag) { explicit = false tag = npm.config.get('tag') } getOnceFromRegistry(name, 'addNameTag', next, cb) function next (er, data, json, resp) { if (!er) er = errorResponse(name, resp) if (er) return cb(er) log.silly('addNameTag', 'next cb for', name, 'with tag', tag) engineFilter(data) if (data['dist-tags'] && data['dist-tags'][tag] && data.versions[data['dist-tags'][tag]]) { var ver = data['dist-tags'][tag] return addNamed(name, ver, data.versions[ver], cb) } if (!explicit && Object.keys(data.versions).length) { return addNamed(name, '*', data, cb) } er = installTargetsError(tag, data) return cb(er) } } function engineFilter (data) { var npmv = npm.version var nodev = npm.config.get('node-version') var strict = npm.config.get('engine-strict') if (!nodev || npm.config.get('force')) return data Object.keys(data.versions || {}).forEach(function (v) { var eng = data.versions[v].engines if (!eng) return if (!strict) return if (eng.node && !semver.satisfies(nodev, eng.node, true) || eng.npm && !semver.satisfies(npmv, eng.npm, true)) { delete data.versions[v] } }) } function addNameVersion (name, v, data, cb) { var ver = semver.valid(v, true) if (!ver) return cb(new Error('Invalid version: ' + v)) var response if (data) { response = null return next() } getOnceFromRegistry(name, 'addNameVersion', setData, cb) function setData (er, d, json, resp) { if (!er) { er = errorResponse(name, resp) } if (er) return cb(er) data = d && d.versions[ver] if (!data) { er = new Error('version not found: ' + name + '@' + ver) er.package = name er.statusCode = 404 return cb(er) } response = resp next() } function next () { deprCheck(data) var dist = data.dist if (!dist) return cb(new Error('No dist in ' + packageId(data) + ' package')) if (!dist.tarball) { return cb(new Error( 'No dist.tarball in ' + packageId(data) + ' package' )) } if ((response && response.statusCode !== 304) || npm.config.get('force')) { return fetchit() } // we got cached data, so let's see if we have a tarball. var pkgroot = cachedPackageRoot({ name: name, version: ver }) var pkgtgz = path.join(pkgroot, 'package.tgz') var pkgjson = path.join(pkgroot, 'package', 'package.json') fs.stat(pkgtgz, function (er) { if (!er) { readJson(pkgjson, function (er, data) { if (er && er.code !== 'ENOENT' && er.code !== 'ENOTDIR') return cb(er) if (data) { if (!data.name) return cb(new Error('No name provided')) if (!data.version) return cb(new Error('No version provided')) // check the SHA of the package we have, to ensure it wasn't installed // from somewhere other than the registry (eg, a fork) if (data._shasum && dist.shasum && data._shasum !== dist.shasum) { return fetchit() } } if (er) return fetchit() else return cb(null, data) }) } else return fetchit() }) function fetchit () { mapToRegistry(name, npm.config, function (er, _, auth, ruri) { if (er) return cb(er) // Use the same protocol as the registry. 
https registry --> https // tarballs, but only if they're the same hostname, or else detached // tarballs may not work. var tb = url.parse(dist.tarball) var rp = url.parse(ruri) if (tb.hostname === rp.hostname && tb.protocol !== rp.protocol) { tb.protocol = rp.protocol // If a different port is associated with the other protocol // we need to update that as well if (rp.port !== tb.port) { tb.port = rp.port delete tb.host } delete tb.href } tb = url.format(tb) // Only add non-shasum'ed packages if --forced. Only ancient things // would lack this for good reasons nowadays. if (!dist.shasum && !npm.config.get('force')) { return cb(new Error('package lacks shasum: ' + packageId(data))) } addRemoteTarball(tb, data, dist.shasum, auth, cb) }) } } } function addNameRange (name, range, data, cb) { range = semver.validRange(range, true) if (range === null) { return cb(new Error( 'Invalid version range: ' + range )) } log.silly('addNameRange', { name: name, range: range, hasData: !!data }) if (data) return next() getOnceFromRegistry(name, 'addNameRange', setData, cb) function setData (er, d, json, resp) { if (!er) { er = errorResponse(name, resp) } if (er) return cb(er) data = d next() } function next () { log.silly( 'addNameRange', 'number 2', { name: name, range: range, hasData: !!data } ) engineFilter(data) log.silly('addNameRange', 'versions' , [data.name, Object.keys(data.versions || {})]) // if the tagged version satisfies, then use that. var tagged = data['dist-tags'][npm.config.get('tag')] if (tagged && data.versions[tagged] && semver.satisfies(tagged, range, true)) { return addNamed(name, tagged, data.versions[tagged], cb) } // find the max satisfying version. var versions = Object.keys(data.versions || {}) var ms = semver.maxSatisfying(versions, range, true) if (!ms) { if (range === '*' && versions.length) { return addNameTag(name, 'latest', data, cb) } else { return cb(installTargetsError(range, data)) } } // if we don't have a registry connection, try to see if // there's a cached copy that will be ok. addNamed(name, ms, data.versions[ms], cb) } } function installTargetsError (requested, data) { var targets = Object.keys(data['dist-tags']).filter(function (f) { return (data.versions || {}).hasOwnProperty(f) }).concat(Object.keys(data.versions || {})) requested = data.name + (requested ? "@'" + requested + "'" : '') targets = targets.length ? 'Valid install targets:\n' + targets.join(', ') + '\n' : 'No valid targets found.\n' + 'Perhaps not compatible with your version of node?' 
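// (Purely illustrative value, not from the source: at this point `targets`
// might read "Valid install targets:\n1.0.0, 1.0.1\n", and it is folded
// into the ETARGET error constructed below.)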
var er = new Error('No compatible version found: ' +
                     requested + '\n' + targets)
  er.code = 'ETARGET'
  return er
}

function errorResponse (name, response) {
  var er
  if (response.statusCode >= 400) {
    er = new Error(http.STATUS_CODES[response.statusCode])
    er.statusCode = response.statusCode
    er.code = 'E' + er.statusCode
    er.pkgid = name
  }
  return er
}
npm_3.5.2.orig/lib/cache/add-remote-git.js0000644000000000000000000003607412631326456016455 0ustar 00000000000000var assert = require('assert')
var fs = require('graceful-fs')
var path = require('path')
var url = require('url')

var chownr = require('chownr')
var dezalgo = require('dezalgo')
var hostedFromURL = require('hosted-git-info').fromUrl
var inflight = require('inflight')
var log = require('npmlog')
var mkdir = require('mkdirp')
var normalizeGitUrl = require('normalize-git-url')
var npa = require('npm-package-arg')
var realizePackageSpecifier = require('realize-package-specifier')
var uniqueFilename = require('unique-filename')

var addLocal = require('./add-local.js')
var correctMkdir = require('../utils/correct-mkdir.js')
var git = require('../utils/git.js')
var npm = require('../npm.js')
var rm = require('../utils/gently-rm.js')
var tempFilename = require('../utils/temp-filename.js')

var remotes = path.resolve(npm.config.get('cache'), '_git-remotes')
var templates = path.join(remotes, '_templates')

var VALID_VARIABLES = [
  'GIT_ASKPASS',
  'GIT_PROXY_COMMAND',
  'GIT_SSH',
  'GIT_SSH_COMMAND',
  'GIT_SSL_CAINFO',
  'GIT_SSL_NO_VERIFY'
]

module.exports = addRemoteGit
function addRemoteGit (uri, _cb) {
  assert(typeof uri === 'string', 'must have git URL')
  assert(typeof _cb === 'function', 'must have callback')
  var cb = dezalgo(_cb)

  log.verbose('addRemoteGit', 'caching', uri)

  // the URL comes in exactly as it was passed on the command line, or as
  // normalized by normalize-package-data / read-package-json / read-installed,
  // so figure out what to do with it using hosted-git-info
  var parsed = hostedFromURL(uri)
  if (parsed) {
    // normalize GitHub syntax to org/repo (for now)
    var from
    if (parsed.type === 'github' && parsed.getDefaultRepresentation() === 'shortcut') {
      from = parsed.path()
    } else {
      from = parsed.toString()
    }

    log.verbose('addRemoteGit', from, 'is a repository hosted by', parsed.type)

    // prefer explicit URLs to pushing everything through shortcuts
    if (parsed.getDefaultRepresentation() !== 'shortcut') {
      return tryClone(from, parsed.toString(), false, cb)
    }

    // try git:, then git+ssh:, then git+https: before failing
    tryGitProto(from, parsed, cb)
  } else {
    // verify that this is a Git URL before continuing
    parsed = npa(uri)
    if (parsed.type !== 'git') {
      return cb(new Error(uri + ' is not a Git or GitHub URL'))
    }

    tryClone(parsed.rawSpec, uri, false, cb)
  }
}

function tryGitProto (from, hostedInfo, cb) {
  var gitURL = hostedInfo.git()
  if (!gitURL) return tryHTTPS(from, hostedInfo, cb)

  log.silly('tryGitProto', 'attempting to clone', gitURL)
  tryClone(from, gitURL, true, function (er) {
    if (er) return tryHTTPS(from, hostedInfo, cb)

    cb.apply(this, arguments)
  })
}

function tryHTTPS (from, hostedInfo, cb) {
  var httpsURL = hostedInfo.https()
  if (!httpsURL) {
    return cb(new Error(from + ' can not be cloned via Git, SSH, or HTTPS'))
  }

  log.silly('tryHTTPS', 'attempting to clone', httpsURL)
  tryClone(from, httpsURL, true, function (er) {
    if (er) return trySSH(from, hostedInfo, cb)

    cb.apply(this, arguments)
  })
}

function trySSH (from, hostedInfo, cb) {
  var sshURL = hostedInfo.ssh()
  if (!sshURL) return tryHTTPS(from, hostedInfo, cb)
  log.silly('trySSH', 'attempting to
clone', sshURL) tryClone(from, sshURL, false, cb) } function tryClone (from, combinedURL, silent, cb) { log.silly('tryClone', 'cloning', from, 'via', combinedURL) var normalized = normalizeGitUrl(combinedURL) var cloneURL = normalized.url var treeish = normalized.branch // ensure that similarly-named remotes don't collide var cachedRemote = uniqueFilename(remotes, combinedURL.replace(/[^a-zA-Z0-9]+/g, '-'), cloneURL) var repoID = path.relative(remotes, cachedRemote) cachedRemote = path.join(remotes, repoID) cb = inflight(repoID, cb) if (!cb) { return log.verbose('tryClone', repoID, 'already in flight; waiting') } log.verbose('tryClone', repoID, 'not in flight; caching') // initialize the remotes cache with the correct perms getGitDir(function (er) { if (er) return cb(er) fs.stat(cachedRemote, function (er, s) { if (er) return mirrorRemote(from, cloneURL, treeish, cachedRemote, silent, finish) if (!s.isDirectory()) return resetRemote(from, cloneURL, treeish, cachedRemote, finish) validateExistingRemote(from, cloneURL, treeish, cachedRemote, finish) }) // always set permissions on the cached remote function finish (er, data) { if (er) return cb(er, data) addModeRecursive(cachedRemote, npm.modes.file, function (er) { return cb(er, data) }) } }) } // don't try too hard to hold on to a remote function resetRemote (from, cloneURL, treeish, cachedRemote, cb) { log.info('resetRemote', 'resetting', cachedRemote, 'for', from) rm(cachedRemote, function (er) { if (er) return cb(er) mirrorRemote(from, cloneURL, treeish, cachedRemote, false, cb) }) } // reuse a cached remote when possible, but nuke it if it's in an // inconsistent state function validateExistingRemote (from, cloneURL, treeish, cachedRemote, cb) { git.whichAndExec( ['config', '--get', 'remote.origin.url'], { cwd: cachedRemote, env: gitEnv() }, function (er, stdout, stderr) { var originURL if (stdout) { originURL = stdout.trim() log.silly('validateExistingRemote', from, 'remote.origin.url:', originURL) } if (stderr) stderr = stderr.trim() if (stderr || er) { log.warn('addRemoteGit', from, 'resetting remote', cachedRemote, 'because of error:', stderr || er) return resetRemote(from, cloneURL, treeish, cachedRemote, cb) } else if (cloneURL !== originURL) { log.warn( 'addRemoteGit', from, 'pre-existing cached repo', cachedRemote, 'points to', originURL, 'and not', cloneURL ) return resetRemote(from, cloneURL, treeish, cachedRemote, cb) } log.verbose('validateExistingRemote', from, 'is updating existing cached remote', cachedRemote) updateRemote(from, cloneURL, treeish, cachedRemote, cb) } ) } // make a complete bare mirror of the remote repo // NOTE: npm uses a blank template directory to prevent weird inconsistencies // https://github.com/npm/npm/issues/5867 function mirrorRemote (from, cloneURL, treeish, cachedRemote, silent, cb) { mkdir(cachedRemote, function (er) { if (er) return cb(er) var args = [ 'clone', '--template=' + templates, '--mirror', cloneURL, cachedRemote ] git.whichAndExec( ['clone', '--template=' + templates, '--mirror', cloneURL, cachedRemote], { cwd: cachedRemote, env: gitEnv() }, function (er, stdout, stderr) { if (er) { var combined = (stdout + '\n' + stderr).trim() var command = 'git ' + args.join(' ') + ':' if (silent) { log.verbose(command, combined) } else { log.error(command, combined) } return cb(er) } log.verbose('mirrorRemote', from, 'git clone ' + cloneURL, stdout.trim()) setPermissions(from, cloneURL, treeish, cachedRemote, cb) } ) }) } function setPermissions (from, cloneURL, treeish, cachedRemote, cb) { if 
(process.platform === 'win32') { log.verbose('setPermissions', from, 'skipping chownr on Windows') resolveHead(from, cloneURL, treeish, cachedRemote, cb) } else { getGitDir(function (er, cs) { if (er) { log.error('setPermissions', from, 'could not get cache stat') return cb(er) } chownr(cachedRemote, cs.uid, cs.gid, function (er) { if (er) { log.error( 'setPermissions', 'Failed to change git repository ownership under npm cache for', cachedRemote ) return cb(er) } log.verbose('setPermissions', from, 'set permissions on', cachedRemote) resolveHead(from, cloneURL, treeish, cachedRemote, cb) }) }) } } // always fetch the origin, even right after mirroring, because this way // permissions will get set correctly function updateRemote (from, cloneURL, treeish, cachedRemote, cb) { git.whichAndExec( ['fetch', '-a', 'origin'], { cwd: cachedRemote, env: gitEnv() }, function (er, stdout, stderr) { if (er) { var combined = (stdout + '\n' + stderr).trim() log.error('git fetch -a origin (' + cloneURL + ')', combined) return cb(er) } log.verbose('updateRemote', 'git fetch -a origin (' + cloneURL + ')', stdout.trim()) setPermissions(from, cloneURL, treeish, cachedRemote, cb) } ) } // branches and tags are both symbolic labels that can be attached to different // commits, so resolve the commit-ish to the current actual treeish the label // corresponds to // // important for shrinkwrap function resolveHead (from, cloneURL, treeish, cachedRemote, cb) { log.verbose('resolveHead', from, 'original treeish:', treeish) var args = ['rev-list', '-n1', treeish] git.whichAndExec( args, { cwd: cachedRemote, env: gitEnv() }, function (er, stdout, stderr) { if (er) { log.error('git ' + args.join(' ') + ':', stderr) return cb(er) } var resolvedTreeish = stdout.trim() log.silly('resolveHead', from, 'resolved treeish:', resolvedTreeish) var resolvedURL = getResolved(cloneURL, resolvedTreeish) if (!resolvedURL) { return cb(new Error( 'unable to clone ' + from + ' because git clone string ' + cloneURL + ' is in a form npm can\'t handle' )) } log.verbose('resolveHead', from, 'resolved Git URL:', resolvedURL) // generate a unique filename var tmpdir = path.join(tempFilename('git-cache'), resolvedTreeish) log.silly('resolveHead', 'Git working directory:', tmpdir) mkdir(tmpdir, function (er) { if (er) return cb(er) cloneResolved(from, resolvedURL, resolvedTreeish, cachedRemote, tmpdir, cb) }) } ) } // make a clone from the mirrored cache so we have a temporary directory in // which we can check out the resolved treeish function cloneResolved (from, resolvedURL, resolvedTreeish, cachedRemote, tmpdir, cb) { var args = ['clone', cachedRemote, tmpdir] git.whichAndExec( args, { cwd: cachedRemote, env: gitEnv() }, function (er, stdout, stderr) { stdout = (stdout + '\n' + stderr).trim() if (er) { log.error('git ' + args.join(' ') + ':', stderr) return cb(er) } log.verbose('cloneResolved', from, 'clone', stdout) checkoutTreeish(from, resolvedURL, resolvedTreeish, tmpdir, cb) } ) } // there is no safe way to do a one-step clone to a treeish that isn't // guaranteed to be a branch, so explicitly check out the treeish once it's // cloned function checkoutTreeish (from, resolvedURL, resolvedTreeish, tmpdir, cb) { var args = ['checkout', resolvedTreeish] git.whichAndExec( args, { cwd: tmpdir, env: gitEnv() }, function (er, stdout, stderr) { stdout = (stdout + '\n' + stderr).trim() if (er) { log.error('git ' + args.join(' ') + ':', stderr) return cb(er) } log.verbose('checkoutTreeish', from, 'checkout', stdout) // convince addLocal that the 
checkout is a local dependency realizePackageSpecifier(tmpdir, function (er, spec) { if (er) { log.error('addRemoteGit', 'Failed to map', tmpdir, 'to a package specifier') return cb(er) } // ensure pack logic is applied // https://github.com/npm/npm/issues/6400 addLocal(spec, null, function (er, data) { if (data) { if (npm.config.get('save-exact')) { log.verbose('addRemoteGit', 'data._from:', resolvedURL, '(save-exact)') data._from = resolvedURL } else { log.verbose('addRemoteGit', 'data._from:', from) data._from = from } log.verbose('addRemoteGit', 'data._resolved:', resolvedURL) data._resolved = resolvedURL } cb(er, data) }) }) } ) } function getGitDir (cb) { correctMkdir(remotes, function (er, stats) { if (er) return cb(er) // We don't need global templates when cloning. Use an empty directory for // the templates, creating it (and setting its permissions) if necessary. mkdir(templates, function (er) { if (er) return cb(er) // Ensure that both the template and remotes directories have the correct // permissions. fs.chown(templates, stats.uid, stats.gid, function (er) { cb(er, stats) }) }) }) } var gitEnv_ function gitEnv () { // git responds to env vars in some weird ways in post-receive hooks // so don't carry those along. if (gitEnv_) return gitEnv_ // allow users to override npm's insistence on not prompting for // passphrases, but default to just failing when credentials // aren't available gitEnv_ = { GIT_ASKPASS: 'echo' } for (var k in process.env) { if (!~VALID_VARIABLES.indexOf(k) && k.match(/^GIT/)) continue gitEnv_[k] = process.env[k] } return gitEnv_ } addRemoteGit.getResolved = getResolved function getResolved (uri, treeish) { // normalize hosted-git-info clone URLs back into regular URLs // this will only work on URLs that hosted-git-info recognizes // https://github.com/npm/npm/issues/7961 var rehydrated = hostedFromURL(uri) if (rehydrated) uri = rehydrated.toString() var parsed = url.parse(uri) // Checks for known protocols: // http:, https:, ssh:, and git:, with optional git+ prefix. if (!parsed.protocol || !parsed.protocol.match(/^(((git\+)?(https?|ssh))|git|file):$/)) { uri = 'git+ssh://' + uri } if (!/^git[+:]/.test(uri)) { uri = 'git+' + uri } // Not all URIs are actually URIs, so use regex for the treeish. return uri.replace(/(?:#.*)?$/, '#' + treeish) } // similar to chmodr except it add permissions rather than overwriting them // adapted from https://github.com/isaacs/chmodr/blob/master/chmodr.js function addModeRecursive (cachedRemote, mode, cb) { fs.readdir(cachedRemote, function (er, children) { // Any error other than ENOTDIR means it's not readable, or doesn't exist. // Give up. 
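// (ENOTDIR here just means `cachedRemote` is a plain file rather than a
// directory, so it still gets its mode added below; any other readdir
// error aborts the walk.)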
if (er && er.code !== 'ENOTDIR') return cb(er) if (er || !children.length) return addMode(cachedRemote, mode, cb) var len = children.length var errState = null children.forEach(function (child) { addModeRecursive(path.resolve(cachedRemote, child), mode, then) }) function then (er) { if (errState) return undefined if (er) return cb(errState = er) if (--len === 0) return addMode(cachedRemote, dirMode(mode), cb) } }) } function addMode (cachedRemote, mode, cb) { fs.stat(cachedRemote, function (er, stats) { if (er) return cb(er) mode = stats.mode | mode fs.chmod(cachedRemote, mode, cb) }) } // taken from https://github.com/isaacs/chmodr/blob/master/chmodr.js function dirMode (mode) { if (mode & parseInt('0400', 8)) mode |= parseInt('0100', 8) if (mode & parseInt('040', 8)) mode |= parseInt('010', 8) if (mode & parseInt('04', 8)) mode |= parseInt('01', 8) return mode } npm_3.5.2.orig/lib/cache/add-remote-tarball.js0000644000000000000000000000733112631326456017305 0ustar 00000000000000var mkdir = require('mkdirp') var assert = require('assert') var log = require('npmlog') var path = require('path') var sha = require('sha') var retry = require('retry') var createWriteStream = require('fs-write-stream-atomic') var npm = require('../npm.js') var inflight = require('inflight') var addLocalTarball = require('./add-local-tarball.js') var cacheFile = require('npm-cache-filename') var rimraf = require('rimraf') var pulseTillDone = require('../utils/pulse-till-done.js') module.exports = addRemoteTarball function addRemoteTarball (u, pkgData, shasum, auth, cb_) { assert(typeof u === 'string', 'must have module URL') assert(typeof cb_ === 'function', 'must have callback') function cb (er, data) { if (data) { data._from = u data._resolved = u data._shasum = data._shasum || shasum } cb_(er, data) } cb_ = inflight(u, cb_) if (!cb_) return log.verbose('addRemoteTarball', u, 'already in flight; waiting') log.verbose('addRemoteTarball', u, 'not in flight; adding') // XXX Fetch direct to cache location, store tarballs under // ${cache}/registry.npmjs.org/pkg/-/pkg-1.2.3.tgz var tmp = cacheFile(npm.tmp, u) function next (er, resp, shasum) { if (er) return cb(er) addLocalTarball(tmp, pkgData, shasum, cleanup) } function cleanup (er, data) { if (er) return cb(er) rimraf(tmp, function () { cb(er, data) }) } log.verbose('addRemoteTarball', [u, shasum]) mkdir(path.dirname(tmp), function (er) { if (er) return cb(er) addRemoteTarball_(u, tmp, shasum, auth, next) }) } function addRemoteTarball_ (u, tmp, shasum, auth, cb) { // Tuned to spread 3 attempts over about a minute. // See formula at . var operation = retry.operation({ retries: npm.config.get('fetch-retries'), factor: npm.config.get('fetch-retry-factor'), minTimeout: npm.config.get('fetch-retry-mintimeout'), maxTimeout: npm.config.get('fetch-retry-maxtimeout') }) operation.attempt(function (currentAttempt) { log.info( 'retry', 'fetch attempt', currentAttempt, 'at', (new Date()).toLocaleTimeString() ) fetchAndShaCheck(u, tmp, shasum, auth, function (er, response, shasum) { // Only retry on 408, 5xx or no `response`. 
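// (In other words: a 503 from the registry or a dropped socket gets retried
// with the backoff configured above, while a 404 fails immediately -- a
// summary of the status check just below.)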
var sc = response && response.statusCode var statusRetry = !sc || (sc === 408 || sc >= 500) if (er && statusRetry && operation.retry(er)) { log.warn('retry', 'will retry, error on last attempt: ' + er) return } cb(er, response, shasum) }) }) } function fetchAndShaCheck (u, tmp, shasum, auth, cb) { cb = pulseTillDone('fetchTarball', cb) npm.registry.fetch(u, { auth: auth }, function (er, response) { if (er) { log.error('fetch failed', u) return cb(er, response) } var tarball = createWriteStream(tmp, { mode: npm.modes.file }) tarball.on('error', function (er) { cb(er) tarball.destroy() }) tarball.on('finish', function () { if (!shasum) { // Well, we weren't given a shasum, so at least sha what we have // in case we want to compare it to something else later return sha.get(tmp, function (er, shasum) { log.silly('fetchAndShaCheck', 'shasum', shasum) cb(er, response, shasum) }) } // validate that the url we just downloaded matches the expected shasum. log.silly('fetchAndShaCheck', 'shasum', shasum) sha.check(tmp, shasum, function (er) { if (er && er.message) { // add original filename for better debuggability er.message = er.message + '\n' + 'From: ' + u } return cb(er, response, shasum) }) }) response.pipe(tarball) }) } npm_3.5.2.orig/lib/cache/cached-package-root.js0000644000000000000000000000060312631326456017421 0ustar 00000000000000var assert = require('assert') var resolve = require('path').resolve var npm = require('../npm.js') module.exports = getCacheRoot function getCacheRoot (data) { assert(data, 'must pass package metadata') assert(data.name, 'package metadata must include name') assert(data.version, 'package metadata must include version') return resolve(npm.cache, data.name, data.version) } npm_3.5.2.orig/lib/cache/caching-client.js0000644000000000000000000001460012631326456016512 0ustar 00000000000000module.exports = CachingRegistryClient var path = require('path') var fs = require('graceful-fs') var url = require('url') var assert = require('assert') var inherits = require('util').inherits var RegistryClient = require('npm-registry-client') var npm = require('../npm.js') var log = require('npmlog') var getCacheStat = require('./get-stat.js') var cacheFile = require('npm-cache-filename') var mkdirp = require('mkdirp') var rimraf = require('rimraf') var chownr = require('chownr') var writeFile = require('write-file-atomic') var parseJSON = require('../utils/parse-json') function CachingRegistryClient (config) { RegistryClient.call(this, adaptConfig(config)) this._mapToCache = cacheFile(config.get('cache')) // swizzle in our custom cache invalidation logic this._request = this.request this.request = this._invalidatingRequest this.get = get } inherits(CachingRegistryClient, RegistryClient) CachingRegistryClient.prototype._invalidatingRequest = function (uri, params, cb) { var client = this this._request(uri, params, function () { var args = arguments var method = params.method if (method !== 'HEAD' && method !== 'GET') { var invalidated = client._mapToCache(uri) // invalidate cache // // This is irrelevant for commands that do etag / last-modified caching, // but ls and view also have a timed cache, so this keeps the user from // thinking that it didn't work when it did. // Note that failure is an acceptable option here, since the only // result will be a stale cache for some helper commands. 
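// (Illustrative example only: a PUT to https://registry.npmjs.org/foo would
// map to a cache entry somewhere like {cache}/registry.npmjs.org/foo, which
// is what gets rimraf'd below; the exact layout is npm-cache-filename's.)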
log.verbose('request', 'invalidating', invalidated, 'on', method) return rimraf(invalidated, function () { cb.apply(undefined, args) }) } cb.apply(undefined, args) }) } function get (uri, params, cb) { assert(typeof uri === 'string', 'must pass registry URI to get') assert(params && typeof params === 'object', 'must pass params to get') assert(typeof cb === 'function', 'must pass callback to get') var parsed = url.parse(uri) assert( parsed.protocol === 'http:' || parsed.protocol === 'https:', 'must have a URL that starts with http: or https:' ) var cacheBase = cacheFile(npm.config.get('cache'))(uri) var cachePath = path.join(cacheBase, '.cache.json') // If the GET is part of a write operation (PUT or DELETE), then // skip past the cache entirely, but still save the results. if (uri.match(/\?write=true$/)) { log.verbose('get', 'GET as part of write; not caching result') return get_.call(this, uri, cachePath, params, cb) } var client = this fs.stat(cachePath, function (er, stat) { if (!er) { fs.readFile(cachePath, function (er, data) { data = parseJSON.noExceptions(data) params.stat = stat params.data = data get_.call(client, uri, cachePath, params, cb) }) } else { get_.call(client, uri, cachePath, params, cb) } }) } function get_ (uri, cachePath, params, cb) { var staleOk = params.staleOk === undefined ? false : params.staleOk var timeout = params.timeout === undefined ? -1 : params.timeout var data = params.data var stat = params.stat var etag var lastModified timeout = Math.min(timeout, npm.config.get('cache-max') || 0) timeout = Math.max(timeout, npm.config.get('cache-min') || -Infinity) if (process.env.COMP_CWORD !== undefined && process.env.COMP_LINE !== undefined && process.env.COMP_POINT !== undefined) { timeout = Math.max(timeout, 60000) } if (data) { if (data._etag) etag = data._etag if (data._lastModified) lastModified = data._lastModified if (stat && timeout && timeout > 0) { if ((Date.now() - stat.mtime.getTime()) / 1000 < timeout) { log.verbose('get', uri, 'not expired, no request') delete data._etag delete data._lastModified return cb(null, data, JSON.stringify(data), { statusCode: 304 }) } if (staleOk) { log.verbose('get', uri, 'staleOk, background update') delete data._etag delete data._lastModified process.nextTick( cb.bind(null, null, data, JSON.stringify(data), { statusCode: 304 }) ) cb = function () {} } } } var options = { etag: etag, lastModified: lastModified, follow: params.follow, auth: params.auth } this.request(uri, options, function (er, remoteData, raw, response) { // if we get an error talking to the registry, but we have it // from the cache, then just pretend we got it. if (er && cachePath && data && !data.error) { er = null response = { statusCode: 304 } } if (response) { log.silly('get', 'cb', [response.statusCode, response.headers]) if (response.statusCode === 304 && (etag || lastModified)) { remoteData = data log.verbose(etag ? 'etag' : 'lastModified', uri + ' from cache') } } data = remoteData if (!data) er = er || new Error('failed to fetch from registry: ' + uri) if (er) return cb(er, data, raw, response) saveToCache(cachePath, data, saved) // just give the write the old college try. if it fails, whatever. 
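// (_etag and _lastModified are bookkeeping fields npm stashes inside the
// cached JSON document itself; saved() strips them again so callers never
// see them in the returned data.)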
function saved () { delete data._etag delete data._lastModified cb(er, data, raw, response) } function saveToCache (cachePath, data, saved) { log.verbose('get', 'saving', data.name, 'to', cachePath) getCacheStat(function (er, st) { mkdirp(path.dirname(cachePath), function (er, made) { if (er) return saved() writeFile(cachePath, JSON.stringify(data), function (er) { if (er) return saved() chownr(made || cachePath, st.uid, st.gid, saved) }) }) }) } }) } function adaptConfig (config) { return { proxy: { http: config.get('proxy'), https: config.get('https-proxy'), localAddress: config.get('local-address') }, ssl: { certificate: config.get('cert'), key: config.get('key'), ca: config.get('ca'), strict: config.get('strict-ssl') }, retry: { retries: config.get('fetch-retries'), factor: config.get('fetch-retry-factor'), minTimeout: config.get('fetch-retry-mintimeout'), maxTimeout: config.get('fetch-retry-maxtimeout') }, userAgent: config.get('user-agent'), log: log, defaultTag: config.get('tag'), couchToken: config.get('_token') } } npm_3.5.2.orig/lib/cache/get-stat.js0000644000000000000000000000024612631326456015373 0ustar 00000000000000var npm = require('../npm.js') var correctMkdir = require('../utils/correct-mkdir.js') module.exports = function getCacheStat (cb) { correctMkdir(npm.cache, cb) } npm_3.5.2.orig/lib/cache/update-index.js0000644000000000000000000000642112631326456016233 0ustar 00000000000000module.exports = updateIndex var fs = require('graceful-fs') var assert = require('assert') var path = require('path') var mkdir = require('mkdirp') var chownr = require('chownr') var npm = require('../npm.js') var log = require('npmlog') var cacheFile = require('npm-cache-filename') var getCacheStat = require('./get-stat.js') var mapToRegistry = require('../utils/map-to-registry.js') var pulseTillDone = require('../utils/pulse-till-done.js') var parseJSON = require('../utils/parse-json.js') /* /-/all is special. * It uses timestamp-based caching and partial updates, * because it is a monster. 
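 * Subsequent refreshes fetch only the delta, hitting a URL of the form
 *   /-/all/since?stale=update_after&startkey=<last _updated timestamp>
 * as constructed further down in updateIndex().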
*/ function updateIndex (staleness, cb) { assert(typeof cb === 'function', 'must pass callback to updateIndex') mapToRegistry('-/all', npm.config, function (er, uri, auth) { if (er) return cb(er) var params = { timeout: staleness, follow: true, staleOk: true, auth: auth } var cacheBase = cacheFile(npm.config.get('cache'))(uri) var cachePath = path.join(cacheBase, '.cache.json') log.info('updateIndex', cachePath) getCacheStat(function (er, st) { if (er) return cb(er) mkdir(cacheBase, function (er, made) { if (er) return cb(er) fs.readFile(cachePath, function (er, data) { if (er) { log.warn('', 'Building the local index for the first time, please be patient') return updateIndex_(uri, params, {}, cachePath, cb) } chownr(made || cachePath, st.uid, st.gid, function (er) { if (er) return cb(er) data = parseJSON.noExceptions(data) if (!data) { fs.writeFile(cachePath, '{}', function (er) { if (er) return cb(new Error('Broken cache.')) log.warn('', 'Building the local index for the first time, please be patient') return updateIndex_(uri, params, {}, cachePath, cb) }) } var t = +data._updated || 0 // use the cache and update in the background if it's not too old if (Date.now() - t < 60000) { cb(null, data) cb = function () {} } if (t === 0) { log.warn('', 'Building the local index for the first time, please be patient') } else { log.verbose('updateIndex', 'Cached search data present with timestamp', t) uri += '/since?stale=update_after&startkey=' + t } updateIndex_(uri, params, data, cachePath, cb) }) }) }) }) }) } function updateIndex_ (all, params, data, cachePath, cb) { log.silly('update-index', 'fetching', all) npm.registry.request(all, params, pulseTillDone('updateIndex', function (er, updates, _, res) { if (er) return cb(er, data) var headers = res.headers var updated = updates._updated || Date.parse(headers.date) Object.keys(updates).forEach(function (p) { data[p] = updates[p] }) data._updated = updated getCacheStat(function (er, st) { if (er) return cb(er) fs.writeFile(cachePath, JSON.stringify(data), function (er) { delete data._updated if (er) return cb(er) chownr(cachePath, st.uid, st.gid, function (er) { cb(er, data) }) }) }) })) } npm_3.5.2.orig/lib/config/clear-credentials-by-uri.js0000644000000000000000000000064712631326456020640 0ustar 00000000000000var assert = require('assert') var toNerfDart = require('./nerf-dart.js') module.exports = clearCredentialsByURI function clearCredentialsByURI (uri) { assert(uri && typeof uri === 'string', 'registry URL is required') var nerfed = toNerfDart(uri) this.del(nerfed + ':_authToken', 'user') this.del(nerfed + ':_password', 'user') this.del(nerfed + ':username', 'user') this.del(nerfed + ':email', 'user') } npm_3.5.2.orig/lib/config/core.js0000644000000000000000000002600112631326456014772 0ustar 00000000000000var CC = require('config-chain').ConfigChain var inherits = require('inherits') var configDefs = require('./defaults.js') var types = configDefs.types var once = require('once') var fs = require('fs') var path = require('path') var nopt = require('nopt') var ini = require('ini') var Umask = configDefs.Umask var mkdirp = require('mkdirp') var umask = require('../utils/umask') exports.load = load exports.Conf = Conf exports.loaded = false exports.rootConf = null exports.usingBuiltin = false exports.defs = configDefs Object.defineProperty(exports, 'defaults', { get: function () { return configDefs.defaults }, enumerable: true }) Object.defineProperty(exports, 'types', { get: function () { return configDefs.types }, enumerable: true }) 
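// Rough precedence of the chain assembled by load_() below, highest first:
// cli flags > env vars > project .npmrc > user .npmrc > global npmrc >
// builtin npmrc > compiled-in defaults. (Summary comment, not exhaustive.)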
exports.validate = validate var myUid = process.env.SUDO_UID !== undefined ? process.env.SUDO_UID : (process.getuid && process.getuid()) var myGid = process.env.SUDO_GID !== undefined ? process.env.SUDO_GID : (process.getgid && process.getgid()) var loading = false var loadCbs = [] function load () { var cli, builtin, cb for (var i = 0; i < arguments.length; i++) { switch (typeof arguments[i]) { case 'string': builtin = arguments[i]; break case 'object': cli = arguments[i]; break case 'function': cb = arguments[i]; break } } if (!cb) cb = function () {} if (exports.loaded) { var ret = exports.loaded if (cli) { ret = new Conf(ret) ret.unshift(cli) } return process.nextTick(cb.bind(null, null, ret)) } // either a fresh object, or a clone of the passed in obj if (!cli) { cli = {} } else { cli = Object.keys(cli).reduce(function (c, k) { c[k] = cli[k] return c }, {}) } loadCbs.push(cb) if (loading) return loading = true cb = once(function (er, conf) { if (!er) { exports.loaded = conf loading = false } loadCbs.forEach(function (fn) { fn(er, conf) }) loadCbs.length = 0 }) // check for a builtin if provided. exports.usingBuiltin = !!builtin var rc = exports.rootConf = new Conf() if (builtin) { rc.addFile(builtin, 'builtin') } else { rc.add({}, 'builtin') } rc.on('load', function () { load_(builtin, rc, cli, cb) }) rc.on('error', cb) } function load_ (builtin, rc, cli, cb) { var defaults = configDefs.defaults var conf = new Conf(rc) conf.usingBuiltin = !!builtin conf.add(cli, 'cli') conf.addEnv() conf.loadPrefix(function (er) { if (er) return cb(er) // If you're doing `npm --userconfig=~/foo.npmrc` then you'd expect // that ~/.npmrc won't override the stuff in ~/foo.npmrc (or, indeed // be used at all). // // However, if the cwd is ~, then ~/.npmrc is the home for the project // config, and will override the userconfig. // // If you're not setting the userconfig explicitly, then it will be loaded // twice, which is harmless but excessive. If you *are* setting the // userconfig explicitly then it will override your explicit intent, and // that IS harmful and unexpected. // // Solution: Do not load project config file that is the same as either // the default or resolved userconfig value. npm will log a "verbose" // message about this when it happens, but it is a rare enough edge case // that we don't have to be super concerned about it. var projectConf = path.resolve(conf.localPrefix, '.npmrc') var defaultUserConfig = rc.get('userconfig') var resolvedUserConfig = conf.get('userconfig') if (!conf.get('global') && projectConf !== defaultUserConfig && projectConf !== resolvedUserConfig) { conf.addFile(projectConf, 'project') conf.once('load', afterPrefix) } else { conf.add({}, 'project') afterPrefix() } }) function afterPrefix () { conf.addFile(conf.get('userconfig'), 'user') conf.once('error', cb) conf.once('load', afterUser) } function afterUser () { // globalconfig and globalignorefile defaults // need to respond to the 'prefix' setting up to this point. // Eg, `npm config get globalconfig --prefix ~/local` should // return `~/local/etc/npmrc` // annoying humans and their expectations! if (conf.get('prefix')) { var etc = path.resolve(conf.get('prefix'), 'etc') mkdirp(etc, function () { defaults.globalconfig = path.resolve(etc, 'npmrc') defaults.globalignorefile = path.resolve(etc, 'npmignore') afterUserContinuation() }) } else { afterUserContinuation() } } function afterUserContinuation () { conf.addFile(conf.get('globalconfig'), 'global') // move the builtin into the conf stack now. 
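// --- Illustrative note, not part of npm's source: once the builtin layer is
// moved in just below, the finished config-chain reads, highest precedence
// first: cli > env > project .npmrc > userconfig > globalconfig > builtin,
// with conf.root (the defaults object) as the fall-through when no layer
// defines a key.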
conf.root = defaults conf.add(rc.shift(), 'builtin') conf.once('load', function () { conf.loadExtras(afterExtras) }) } function afterExtras (er) { if (er) return cb(er) // warn about invalid bits. validate(conf) var cafile = conf.get('cafile') if (cafile) { return conf.loadCAFile(cafile, finalize) } finalize() } function finalize (er) { if (er) { return cb(er) } exports.loaded = conf cb(er, conf) } } // Basically the same as CC, but: // 1. Always ini // 2. Parses environment variable names in field values // 3. Field values that start with ~/ are replaced with process.env.HOME // 4. Can inherit from another Conf object, using it as the base. inherits(Conf, CC) function Conf (base) { if (!(this instanceof Conf)) return new Conf(base) CC.apply(this) if (base) { if (base instanceof Conf) { this.root = base.list[0] || base.root } else { this.root = base } } else { this.root = configDefs.defaults } } Conf.prototype.loadPrefix = require('./load-prefix.js') Conf.prototype.loadCAFile = require('./load-cafile.js') Conf.prototype.loadUid = require('./load-uid.js') Conf.prototype.setUser = require('./set-user.js') Conf.prototype.findPrefix = require('./find-prefix.js') Conf.prototype.getCredentialsByURI = require('./get-credentials-by-uri.js') Conf.prototype.setCredentialsByURI = require('./set-credentials-by-uri.js') Conf.prototype.clearCredentialsByURI = require('./clear-credentials-by-uri.js') Conf.prototype.loadExtras = function (cb) { this.setUser(function (er) { if (er) return cb(er) this.loadUid(function (er) { if (er) return cb(er) // Without prefix, nothing will ever work mkdirp(this.prefix, cb) }.bind(this)) }.bind(this)) } Conf.prototype.save = function (where, cb) { var target = this.sources[where] if (!target || !(target.path || target.source) || !target.data) { var er if (where !== 'builtin') er = new Error('bad save target: ' + where) if (cb) { process.nextTick(cb.bind(null, er)) return this } return this.emit('error', er) } if (target.source) { var pref = target.prefix || '' Object.keys(target.data).forEach(function (k) { target.source[pref + k] = target.data[k] }) if (cb) process.nextTick(cb) return this } var data = ini.stringify(target.data) var then = function then (er) { if (er) return done(er) fs.chmod(target.path, mode, done) } var done = function done (er) { if (er) { if (cb) return cb(er) else return this.emit('error', er) } this._saving -- if (this._saving === 0) { if (cb) cb() this.emit('save') } } then = then.bind(this) done = done.bind(this) this._saving ++ var mode = where === 'user' ? '0600' : '0666' if (!data.trim()) { fs.unlink(target.path, function () { // ignore the possible error (e.g. the file doesn't exist) done(null) }) } else { mkdirp(path.dirname(target.path), function (er) { if (er) return then(er) fs.writeFile(target.path, data, 'utf8', function (er) { if (er) return then(er) if (where === 'user' && myUid && myGid) { fs.chown(target.path, +myUid, +myGid, then) } else { then() } }) }) } return this } Conf.prototype.addFile = function (file, name) { name = name || file var marker = { __source__: name } this.sources[name] = { path: file, type: 'ini' } this.push(marker) this._await() fs.readFile(file, 'utf8', function (er, data) { // just ignore missing files. if (er) return this.add({}, marker) this.addString(data, file, 'ini', marker) }.bind(this)) return this } // always ini files. 
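// --- Illustrative sketch, not part of npm's source: because parse() below
// always uses 'ini', a config file containing, say,
//   save-prefix=~
//   strict-ssl=false
// arrives as { 'save-prefix': '~', 'strict-ssl': 'false' }, and each string
// value is then coerced by parseField() further down ('false' -> false).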
Conf.prototype.parse = function (content, file) { return CC.prototype.parse.call(this, content, file, 'ini') } Conf.prototype.add = function (data, marker) { try { Object.keys(data).forEach(function (k) { data[k] = parseField(data[k], k) }) } catch (e) { this.emit('error', e) return this } return CC.prototype.add.call(this, data, marker) } Conf.prototype.addEnv = function (env) { env = env || process.env var conf = {} Object.keys(env) .filter(function (k) { return k.match(/^npm_config_/i) }) .forEach(function (k) { if (!env[k]) return // leave first char untouched, even if // it is a '_' - convert all other to '-' var p = k.toLowerCase() .replace(/^npm_config_/, '') .replace(/(?!^)_/g, '-') conf[p] = env[k] }) return CC.prototype.addEnv.call(this, '', conf, 'env') } function parseField (f, k) { if (typeof f !== 'string' && !(f instanceof String)) return f // type can be an array or single thing. var typeList = [].concat(types[k]) var isPath = typeList.indexOf(path) !== -1 var isBool = typeList.indexOf(Boolean) !== -1 var isString = typeList.indexOf(String) !== -1 var isUmask = typeList.indexOf(Umask) !== -1 var isNumber = typeList.indexOf(Number) !== -1 f = ('' + f).trim() if (f.match(/^".*"$/)) { try { f = JSON.parse(f) } catch (e) { throw new Error('Failed parsing JSON config key ' + k + ': ' + f) } } if (isBool && !isString && f === '') return true switch (f) { case 'true': return true case 'false': return false case 'null': return null case 'undefined': return undefined } f = envReplace(f) if (isPath) { var homePattern = process.platform === 'win32' ? /^~(\/|\\)/ : /^~\// if (f.match(homePattern) && process.env.HOME) { f = path.resolve(process.env.HOME, f.substr(2)) } f = path.resolve(f) } if (isUmask) f = umask.fromString(f) if (isNumber && !isNaN(f)) f = +f return f } function envReplace (f) { if (typeof f !== 'string' || !f) return f // replace any ${ENV} values with the appropriate environ. var envExpr = /(\\*)\$\{([^}]+)\}/g return f.replace(envExpr, function (orig, esc, name) { esc = esc.length && esc.length % 2 if (esc) return orig if (undefined === process.env[name]) { throw new Error('Failed to replace env in config: ' + orig) } return process.env[name] }) } function validate (cl) { // warn about invalid configs at every level. cl.list.forEach(function (conf) { nopt.clean(conf, configDefs.types) }) nopt.clean(cl.root, configDefs.types) } npm_3.5.2.orig/lib/config/defaults.js0000644000000000000000000002370312631326456015657 0ustar 00000000000000// defaults, types, and shorthands. 
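// --- Illustrative examples, not part of npm's source and separate from the
// defaults below: envReplace() in core.js above expands ${VAR} references,
// and parseField() resolves ~/ for path-typed keys. Assuming HOME=/home/u
// and NPM_TOKEN=abc123 in the environment:
//   parseField('${NPM_TOKEN}', '_authToken')  // -> 'abc123'
//   parseField('~/cache-dir', 'cache')        // -> '/home/u/cache-dir'
//   parseField('\${NPM_TOKEN}', '_authToken') // escaped: left untouched, backslash and all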
var path = require('path') var url = require('url') var Stream = require('stream').Stream var semver = require('semver') var stableFamily = semver.parse(process.version) var nopt = require('nopt') var os = require('os') var osenv = require('osenv') var umask = require('../utils/umask') var hasUnicode = require('has-unicode') var log try { log = require('npmlog') } catch (er) { var util = require('util') log = { warn: function (m) { console.warn(m + ' ' + util.format.apply(util, [].slice.call(arguments, 1))) } } } exports.Umask = Umask function Umask () {} function validateUmask (data, k, val) { return umask.validate(data, k, val) } function validateSemver (data, k, val) { if (!semver.valid(val)) return false data[k] = semver.valid(val) } function validateStream (data, k, val) { if (!(val instanceof Stream)) return false data[k] = val } nopt.typeDefs.semver = { type: semver, validate: validateSemver } nopt.typeDefs.Stream = { type: Stream, validate: validateStream } nopt.typeDefs.Umask = { type: Umask, validate: validateUmask } nopt.invalidHandler = function (k, val, type) { log.warn('invalid config', k + '=' + JSON.stringify(val)) if (Array.isArray(type)) { if (type.indexOf(url) !== -1) type = url else if (type.indexOf(path) !== -1) type = path } switch (type) { case Umask: log.warn('invalid config', 'Must be umask, octal number in range 0000..0777') break case url: log.warn('invalid config', "Must be a full url with 'http://'") break case path: log.warn('invalid config', 'Must be a valid filesystem path') break case Number: log.warn('invalid config', 'Must be a numeric value') break case Stream: log.warn('invalid config', 'Must be an instance of the Stream class') break } } if (!stableFamily || (+stableFamily.minor % 2)) stableFamily = null else stableFamily = stableFamily.major + '.' + stableFamily.minor var defaults var temp = osenv.tmpdir() var home = osenv.home() var uidOrPid = process.getuid ? process.getuid() : process.pid if (home) process.env.HOME = home else home = path.resolve(temp, 'npm-' + uidOrPid) var cacheExtra = process.platform === 'win32' ? 'npm-cache' : '.npm' var cacheRoot = process.platform === 'win32' && process.env.APPDATA || home var cache = path.resolve(cacheRoot, cacheExtra) var globalPrefix Object.defineProperty(exports, 'defaults', {get: function () { if (defaults) return defaults if (process.env.PREFIX) { globalPrefix = process.env.PREFIX } else if (process.platform === 'win32') { // c:\node\node.exe --> prefix=c:\node\ globalPrefix = path.dirname(process.execPath) } else { // /usr/local/bin/node --> prefix=/usr/local globalPrefix = path.dirname(path.dirname(process.execPath)) // destdir only is respected on Unix if (process.env.DESTDIR) { globalPrefix = path.join(process.env.DESTDIR, globalPrefix) } } defaults = { access: null, 'always-auth': false, also: null, 'bin-links': true, browser: null, ca: null, cafile: null, cache: cache, 'cache-lock-stale': 60000, 'cache-lock-retries': 10, 'cache-lock-wait': 10000, 'cache-max': Infinity, 'cache-min': 10, cert: null, color: true, depth: Infinity, description: true, dev: false, 'dry-run': false, editor: osenv.editor(), 'engine-strict': false, force: false, 'fetch-retries': 2, 'fetch-retry-factor': 10, 'fetch-retry-mintimeout': 10000, 'fetch-retry-maxtimeout': 60000, git: 'git', 'git-tag-version': true, global: false, globalconfig: path.resolve(globalPrefix, 'etc', 'npmrc'), 'global-style': false, group: process.platform === 'win32' ? 
0 : process.env.SUDO_GID || (process.getgid && process.getgid()), heading: 'npm', 'if-present': false, 'ignore-scripts': false, 'init-module': path.resolve(home, '.npm-init.js'), 'init-author-name': '', 'init-author-email': '', 'init-author-url': '', 'init-version': '1.0.0', 'init-license': 'ISC', json: false, key: null, 'legacy-bundling': false, link: false, 'local-address': undefined, loglevel: 'warn', logstream: process.stderr, long: false, message: '%s', 'node-version': process.version, npat: false, 'onload-script': false, only: null, optional: true, parseable: false, prefix: globalPrefix, production: process.env.NODE_ENV === 'production', 'progress': !process.env.TRAVIS && !process.env.CI, 'proprietary-attribs': true, proxy: null, 'https-proxy': null, 'user-agent': 'npm/{npm-version} ' + 'node/{node-version} ' + '{platform} ' + '{arch}', 'rebuild-bundle': true, registry: 'https://registry.npmjs.org/', rollback: true, save: false, 'save-bundle': false, 'save-dev': false, 'save-exact': false, 'save-optional': false, 'save-prefix': '^', scope: '', searchopts: '', searchexclude: null, searchsort: 'name', shell: osenv.shell(), shrinkwrap: true, 'sign-git-tag': false, 'strict-ssl': true, tag: 'latest', 'tag-version-prefix': 'v', tmp: temp, unicode: hasUnicode(), 'unsafe-perm': process.platform === 'win32' || process.platform === 'cygwin' || !(process.getuid && process.setuid && process.getgid && process.setgid) || process.getuid() !== 0, usage: false, user: process.platform === 'win32' ? 0 : 'nobody', userconfig: path.resolve(home, '.npmrc'), umask: process.umask ? process.umask() : umask.fromString('022'), version: false, versions: false, viewer: process.platform === 'win32' ? 'browser' : 'man', _exit: true } return defaults }}) exports.types = { access: [null, 'restricted', 'public'], 'always-auth': Boolean, also: [null, 'dev', 'development'], 'bin-links': Boolean, browser: [null, String], ca: [null, String, Array], cafile: path, cache: path, 'cache-lock-stale': Number, 'cache-lock-retries': Number, 'cache-lock-wait': Number, 'cache-max': Number, 'cache-min': Number, cert: [null, String], color: ['always', Boolean], depth: Number, description: Boolean, dev: Boolean, 'dry-run': Boolean, editor: String, 'engine-strict': Boolean, force: Boolean, 'fetch-retries': Number, 'fetch-retry-factor': Number, 'fetch-retry-mintimeout': Number, 'fetch-retry-maxtimeout': Number, git: String, 'git-tag-version': Boolean, global: Boolean, globalconfig: path, 'global-style': Boolean, group: [Number, String], 'https-proxy': [null, url], 'user-agent': String, 'heading': String, 'if-present': Boolean, 'ignore-scripts': Boolean, 'init-module': path, 'init-author-name': String, 'init-author-email': String, 'init-author-url': ['', url], 'init-license': String, 'init-version': semver, json: Boolean, key: [null, String], 'legacy-bundling': Boolean, link: Boolean, // local-address must be listed as an IP for a local network interface // must be IPv4 due to node bug 'local-address': getLocalAddresses(), loglevel: ['silent', 'error', 'warn', 'http', 'info', 'verbose', 'silly'], logstream: Stream, long: Boolean, message: String, 'node-version': [null, semver], npat: Boolean, 'onload-script': [null, String], only: [null, 'dev', 'development', 'prod', 'production'], optional: Boolean, parseable: Boolean, prefix: path, production: Boolean, progress: Boolean, 'proprietary-attribs': Boolean, proxy: [null, false, url], // allow proxy to be disabled explicitly 'rebuild-bundle': Boolean, registry: [null, url], rollback: 
Boolean, save: Boolean, 'save-bundle': Boolean, 'save-dev': Boolean, 'save-exact': Boolean, 'save-optional': Boolean, 'save-prefix': String, scope: String, searchopts: String, searchexclude: [null, String], searchsort: [ 'name', '-name', 'description', '-description', 'author', '-author', 'date', '-date', 'keywords', '-keywords' ], shell: String, shrinkwrap: Boolean, 'sign-git-tag': Boolean, 'strict-ssl': Boolean, tag: String, tmp: path, unicode: Boolean, 'unsafe-perm': Boolean, usage: Boolean, user: [Number, String], userconfig: path, umask: Umask, version: Boolean, 'tag-version-prefix': String, versions: Boolean, viewer: String, _exit: Boolean } function getLocalAddresses () { var interfaces // #8094: some environments require elevated permissions to enumerate // interfaces, and synchronously throw EPERM when run without // elevated privileges try { interfaces = os.networkInterfaces() } catch (e) { interfaces = {} } return Object.keys(interfaces).map(function (nic) { return interfaces[nic].filter(function (addr) { return addr.family === 'IPv4' }) .map(function (addr) { return addr.address }) }).reduce(function (curr, next) { return curr.concat(next) }, []).concat(undefined) } exports.shorthands = { s: ['--loglevel', 'silent'], d: ['--loglevel', 'info'], dd: ['--loglevel', 'verbose'], ddd: ['--loglevel', 'silly'], noreg: ['--no-registry'], N: ['--no-registry'], reg: ['--registry'], 'no-reg': ['--no-registry'], silent: ['--loglevel', 'silent'], verbose: ['--loglevel', 'verbose'], quiet: ['--loglevel', 'warn'], q: ['--loglevel', 'warn'], h: ['--usage'], H: ['--usage'], '?': ['--usage'], help: ['--usage'], v: ['--version'], f: ['--force'], gangster: ['--force'], gangsta: ['--force'], desc: ['--description'], 'no-desc': ['--no-description'], 'local': ['--no-global'], l: ['--long'], m: ['--message'], p: ['--parseable'], porcelain: ['--parseable'], g: ['--global'], S: ['--save'], D: ['--save-dev'], E: ['--save-exact'], O: ['--save-optional'], y: ['--yes'], n: ['--no-yes'], B: ['--save-bundle'], C: ['--prefix'] } npm_3.5.2.orig/lib/config/find-prefix.js0000644000000000000000000000247212631326456016263 0ustar 00000000000000// try to find the most reasonable prefix to use module.exports = findPrefix var fs = require('fs') var path = require('path') function findPrefix (p, cb_) { function cb (er, p) { process.nextTick(function () { cb_(er, p) }) } p = path.resolve(p) // if there's no node_modules folder, then // walk up until we hopefully find one. // if none anywhere, then use cwd. var walkedUp = false while (path.basename(p) === 'node_modules') { p = path.dirname(p) walkedUp = true } if (walkedUp) return cb(null, p) findPrefix_(p, p, cb) } function findPrefix_ (p, original, cb) { if (p === '/' || (process.platform === 'win32' && p.match(/^[a-zA-Z]:(\\|\/)?$/))) { return cb(null, original) } fs.readdir(p, function (er, files) { // an error right away is a bad sign. // unless the prefix was simply a non // existent directory. if (er && p === original) { if (er.code === 'ENOENT') return cb(null, original) return cb(er) } // walked up too high or something. 
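// --- Illustrative walk, not part of npm's source (the readdir error
// handling continues below): starting findPrefix('/work/app/src') with only
// /work/app/package.json on disk tries /work/app/src first (no node_modules
// or package.json there), then /work/app (package.json present) and answers
// /work/app; if nothing is found all the way up, the original path is used.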
if (er) return cb(null, original) if (files.indexOf('node_modules') !== -1 || files.indexOf('package.json') !== -1) { return cb(null, p) } var d = path.dirname(p) if (d === p) return cb(null, original) return findPrefix_(d, original, cb) }) } npm_3.5.2.orig/lib/config/get-credentials-by-uri.js0000644000000000000000000000376612631326456020336 0ustar 00000000000000var assert = require('assert') var toNerfDart = require('./nerf-dart.js') module.exports = getCredentialsByURI function getCredentialsByURI (uri) { assert(uri && typeof uri === 'string', 'registry URL is required') var nerfed = toNerfDart(uri) var defnerf = toNerfDart(this.get('registry')) // hidden class micro-optimization var c = { scope: nerfed, token: undefined, password: undefined, username: undefined, email: undefined, auth: undefined, alwaysAuth: undefined } if (this.get(nerfed + ':_authToken')) { c.token = this.get(nerfed + ':_authToken') // the bearer token is enough, don't confuse things return c } // Handle the old-style _auth= style for the default // registry, if set. // // XXX(isaacs): Remove when npm 1.4 is no longer relevant var authDef = this.get('_auth') var userDef = this.get('username') var passDef = this.get('_password') if (authDef && !(userDef && passDef)) { authDef = new Buffer(authDef, 'base64').toString() authDef = authDef.split(':') userDef = authDef.shift() passDef = authDef.join(':') } if (this.get(nerfed + ':_password')) { c.password = new Buffer(this.get(nerfed + ':_password'), 'base64').toString('utf8') } else if (nerfed === defnerf && passDef) { c.password = passDef } if (this.get(nerfed + ':username')) { c.username = this.get(nerfed + ':username') } else if (nerfed === defnerf && userDef) { c.username = userDef } if (this.get(nerfed + ':email')) { c.email = this.get(nerfed + ':email') } else if (this.get('email')) { c.email = this.get('email') } if (this.get(nerfed + ':always-auth') !== undefined) { var val = this.get(nerfed + ':always-auth') c.alwaysAuth = val === 'false' ? false : !!val } else if (this.get('always-auth') !== undefined) { c.alwaysAuth = this.get('always-auth') } if (c.username && c.password) { c.auth = new Buffer(c.username + ':' + c.password).toString('base64') } return c } npm_3.5.2.orig/lib/config/load-cafile.js0000644000000000000000000000125212631326456016203 0ustar 00000000000000module.exports = loadCAFile var fs = require('fs') function loadCAFile (cafilePath, cb) { if (!cafilePath) return process.nextTick(cb) fs.readFile(cafilePath, 'utf8', afterCARead.bind(this)) function afterCARead (er, cadata) { if (er) { // previous cafile no longer exists, so just continue on gracefully if (er.code === 'ENOENT') return cb() return cb(er) } var delim = '-----END CERTIFICATE-----' var output output = cadata .split(delim) .filter(function (xs) { return !!xs.trim() }) .map(function (xs) { return xs.trimLeft() + delim }) this.set('ca', output) cb(null) } } npm_3.5.2.orig/lib/config/load-prefix.js0000644000000000000000000000246512631326456016264 0ustar 00000000000000module.exports = loadPrefix var findPrefix = require('./find-prefix.js') var path = require('path') function loadPrefix (cb) { var cli = this.list[0] Object.defineProperty(this, 'prefix', { set: function (prefix) { var g = this.get('global') this[g ? 'globalPrefix' : 'localPrefix'] = prefix }.bind(this), get: function () { var g = this.get('global') return g ? 
this.globalPrefix : this.localPrefix }.bind(this), enumerable: true }) Object.defineProperty(this, 'globalPrefix', { set: function (prefix) { this.set('prefix', prefix) }.bind(this), get: function () { return path.resolve(this.get('prefix')) }.bind(this), enumerable: true }) var p Object.defineProperty(this, 'localPrefix', { set: function (prefix) { p = prefix }, get: function () { return p }, enumerable: true }) // try to guess at a good node_modules location. // If we are *explicitly* given a prefix on the cli, then // always use that. otherwise, infer local prefix from cwd. if (Object.prototype.hasOwnProperty.call(cli, 'prefix')) { p = path.resolve(cli.prefix) process.nextTick(cb) } else { findPrefix(process.cwd(), function (er, found) { p = found cb(er) }) } } npm_3.5.2.orig/lib/config/load-uid.js0000644000000000000000000000060212631326456015537 0ustar 00000000000000module.exports = loadUid var getUid = require('uid-number') // Call in the context of a npmconf object function loadUid (cb) { // if we're not in unsafe-perm mode, then figure out who // to run stuff as. Do this first, to support `npm update npm -g` if (!this.get('unsafe-perm')) { getUid(this.get('user'), this.get('group'), cb) } else { process.nextTick(cb) } } npm_3.5.2.orig/lib/config/nerf-dart.js0000644000000000000000000000072712631326456015733 0ustar 00000000000000var url = require('url') module.exports = toNerfDart /** * Maps a URL to an identifier. * * Name courtesy schiffertronix media LLC, a New Jersey corporation * * @param {String} uri The URL to be nerfed. * * @returns {String} A nerfed URL. */ function toNerfDart (uri) { var parsed = url.parse(uri) delete parsed.protocol delete parsed.auth delete parsed.query delete parsed.search delete parsed.hash return url.resolve(url.format(parsed), '.') } npm_3.5.2.orig/lib/config/set-credentials-by-uri.js0000644000000000000000000000236212631326456020341 0ustar 00000000000000var assert = require('assert') var toNerfDart = require('./nerf-dart.js') module.exports = setCredentialsByURI function setCredentialsByURI (uri, c) { assert(uri && typeof uri === 'string', 'registry URL is required') assert(c && typeof c === 'object', 'credentials are required') var nerfed = toNerfDart(uri) if (c.token) { this.set(nerfed + ':_authToken', c.token, 'user') this.del(nerfed + ':_password', 'user') this.del(nerfed + ':username', 'user') this.del(nerfed + ':email', 'user') this.del(nerfed + ':always-auth', 'user') } else if (c.username || c.password || c.email) { assert(c.username, 'must include username') assert(c.password, 'must include password') assert(c.email, 'must include email address') this.del(nerfed + ':_authToken', 'user') var encoded = new Buffer(c.password, 'utf8').toString('base64') this.set(nerfed + ':_password', encoded, 'user') this.set(nerfed + ':username', c.username, 'user') this.set(nerfed + ':email', c.email, 'user') if (c.alwaysAuth !== undefined) { this.set(nerfed + ':always-auth', c.alwaysAuth, 'user') } else { this.del(nerfed + ':always-auth', 'user') } } else { throw new Error('No credentials to set.') } } npm_3.5.2.orig/lib/config/set-user.js0000644000000000000000000000135512631326456015616 0ustar 00000000000000module.exports = setUser var assert = require('assert') var path = require('path') var fs = require('fs') var mkdirp = require('mkdirp') function setUser (cb) { var defaultConf = this.root assert(defaultConf !== Object.prototype) // If global, leave it as-is. // If not global, then set the user to the owner of the prefix folder. 
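// --- Illustrative note, not part of npm's source: under `sudo npm install`,
// SUDO_UID preserves the invoking user's id, so the branch just below makes
// that the default `user`; otherwise npm stats the prefix directory and
// adopts its owner, which keeps files created during install from ending up
// root-owned.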
// Just set the default, so it can be overridden. if (this.get('global')) return cb() if (process.env.SUDO_UID) { defaultConf.user = +(process.env.SUDO_UID) return cb() } var prefix = path.resolve(this.get('prefix')) mkdirp(prefix, function (er) { if (er) return cb(er) fs.stat(prefix, function (er, st) { defaultConf.user = st && st.uid return cb(er) }) }) } npm_3.5.2.orig/lib/install/access-error.js0000644000000000000000000000031212631326456016630 0ustar 00000000000000'use strict' module.exports = function (dir, er) { if (!er) return var accessEr = new Error("EACCES, access '" + dir + "'", -13) accessEr.code = 'EACCES' accessEr.path = dir return accessEr } npm_3.5.2.orig/lib/install/action/0000755000000000000000000000000012631326456015163 5ustar 00000000000000npm_3.5.2.orig/lib/install/actions.js0000644000000000000000000001111212631326456015700 0ustar 00000000000000'use strict' var path = require('path') var validate = require('aproba') var chain = require('slide').chain var asyncMap = require('slide').asyncMap var log = require('npmlog') var andFinishTracker = require('./and-finish-tracker.js') var andAddParentToErrors = require('./and-add-parent-to-errors.js') var failedDependency = require('./deps.js').failedDependency var packageId = require('../utils/package-id.js') var moduleName = require('../utils/module-name.js') var buildPath = require('./build-path.js') var actions = {} actions.fetch = require('./action/fetch.js') actions.extract = require('./action/extract.js') actions.build = require('./action/build.js') actions.test = require('./action/test.js') actions.preinstall = require('./action/preinstall.js') actions.install = require('./action/install.js') actions.postinstall = require('./action/postinstall.js') actions.prepublish = require('./action/prepublish.js') actions.finalize = require('./action/finalize.js') actions.remove = require('./action/remove.js') actions.move = require('./action/move.js') actions['update-linked'] = require('./action/update-linked.js') actions['global-install'] = require('./action/global-install.js') actions['global-link'] = require('./action/global-link.js') // FIXME: We wrap actions like three ways to sunday here. // Rewrite this to only work one way. 
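// --- Illustrative note, not part of npm's source: the wrapping loop below
// layers several concerns onto every action: it refuses to run against
// packages already marked failed, collects each action's rollback/commit
// hooks onto the package, and threads the callback through
// andFinishTracker -> andAddParentToErrors -> andHandleOptionalDepErrors so
// a failed optional dependency is logged and skipped rather than aborting
// the whole install.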
Object.keys(actions).forEach(function (actionName) { var action = actions[actionName] actions[actionName] = function (top, buildpath, pkg, log, next) { validate('SSOOF', arguments) // refuse to run actions for failed packages if (pkg.failed) return next() if (action.rollback) { if (!pkg.rollback) pkg.rollback = [] pkg.rollback.unshift(action.rollback) } if (action.commit) { if (!pkg.commit) pkg.commit = [] pkg.commit.push(action.commit) } return action(top, buildpath, pkg, log, andFinishTracker(log, andAddParentToErrors(pkg.parent, andHandleOptionalDepErrors(pkg, next)))) } }) function markAsFailed (pkg) { pkg.failed = true pkg.requires.forEach(function (req) { req.requiredBy = req.requiredBy.filter(function (reqReqBy) { return reqReqBy !== pkg }) if (req.requiredBy.length === 0 && !req.userRequired && !req.existing) { markAsFailed(req) } }) } function andHandleOptionalDepErrors (pkg, next) { return function (er) { if (!er) return next.apply(null, arguments) markAsFailed(pkg) var anyFatal = pkg.userRequired || !pkg.parent for (var ii = 0; ii < pkg.requiredBy.length; ++ii) { var parent = pkg.requiredBy[ii] var isFatal = failedDependency(parent, pkg) if (isFatal) anyFatal = true } if (anyFatal) return next.apply(null, arguments) log.warn('install:' + packageId(pkg), er.message) log.verbose('install:' + packageId(pkg), er.stack) next() } } function prepareAction (staging, log) { validate('SO', arguments) return function (action) { validate('SO', action) var cmd = action[0] var pkg = action[1] if (!actions[cmd]) throw new Error('Unknown decomposed command "' + cmd + '" (is it new?)') var top = path.resolve(staging, '../..') var buildpath = buildPath(staging, pkg) return [actions[cmd], top, buildpath, pkg, log.newGroup(cmd + ':' + moduleName(pkg))] } } exports.actions = actions function execAction (todo, done) { validate('AF', arguments) var cmd = todo.shift() todo.push(done) cmd.apply(null, todo) } exports.doOne = function (cmd, staging, pkg, log, next) { validate('SSOOF', arguments) execAction(prepareAction(staging, log)([cmd, pkg]), next) } exports.doSerial = function (type, staging, actionsToRun, log, next) { validate('SSAOF', arguments) actionsToRun = actionsToRun .filter(function (value) { return value[0] === type }) log.silly('doSerial', '%s %d', type, actionsToRun.length) chain(actionsToRun.map(prepareAction(staging, log)), andFinishTracker(log, next)) } exports.doReverseSerial = function (type, staging, actionsToRun, log, next) { validate('SSAOF', arguments) actionsToRun = actionsToRun .filter(function (value) { return value[0] === type }) .reverse() log.silly('doReverseSerial', '%s %d', type, actionsToRun.length) chain(actionsToRun.map(prepareAction(staging, log)), andFinishTracker(log, next)) } exports.doParallel = function (type, staging, actionsToRun, log, next) { validate('SSAOF', arguments) actionsToRun = actionsToRun.filter(function (value) { return value[0] === type }) log.silly('doParallel', type + ' ' + actionsToRun.length) asyncMap(actionsToRun.map(prepareAction(staging, log)), execAction, andFinishTracker(log, next)) } npm_3.5.2.orig/lib/install/and-add-parent-to-errors.js0000644000000000000000000000050412631326456020754 0ustar 00000000000000'use strict' var validate = require('aproba') module.exports = function (parent, cb) { validate('F', [cb]) return function (er) { if (!er) return cb.apply(null, arguments) if (er instanceof Error && parent && parent.package && parent.package.name) { er.parent = parent.package.name } cb(er) } } 
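// --- Usage sketch, not part of npm's source: and-add-parent-to-errors.js
// above tags an error with the name of the package whose dependency failed,
// which later error reporting can surface. Hypothetical caller:
var andAddParentToErrors = require('./and-add-parent-to-errors.js')
var parentNode = { package: { name: 'example-app' } }
var done = andAddParentToErrors(parentNode, function (er) {
  if (er) console.log(er.parent) // prints 'example-app'
})
done(new Error('ENOENT, no such file')) // error gains .parent before cb runs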
npm_3.5.2.orig/lib/install/and-finish-tracker.js0000644000000000000000000000055012631326456017715 0ustar 00000000000000'use strict' var validate = require('aproba') module.exports = function (tracker, cb) { validate('OF', [tracker, cb]) return function () { tracker.finish() cb.apply(null, arguments) } } module.exports.now = function (tracker, cb) { validate('OF', [tracker, cb]) tracker.finish() cb.apply(null, Array.prototype.slice.call(arguments, 2)) } npm_3.5.2.orig/lib/install/and-ignore-errors.js0000644000000000000000000000031412631326456017577 0ustar 00000000000000'use strict' module.exports = function (cb) { return function () { var args = Array.prototype.slice.call(arguments, 1) if (args.length) args.unshift(null) return cb.apply(null, args) } } npm_3.5.2.orig/lib/install/build-path.js0000644000000000000000000000036312631326456016277 0ustar 00000000000000'use strict' var uniqueFilename = require('unique-filename') var moduleName = require('../utils/module-name.js') module.exports = buildPath function buildPath (staging, pkg) { return uniqueFilename(staging, moduleName(pkg), pkg.realpath) } npm_3.5.2.orig/lib/install/check-permissions.js0000644000000000000000000000355112631326456017676 0ustar 00000000000000'use strict' var path = require('path') var log = require('npmlog') var validate = require('aproba') var uniq = require('lodash.uniq') var asyncMap = require('slide').asyncMap var npm = require('../npm.js') var exists = require('./exists.js') var writable = require('./writable.js') module.exports = function (actions, next) { validate('AF', arguments) var errors = [] asyncMap(actions, function (action, done) { var cmd = action[0] var pkg = action[1] switch (cmd) { case 'add': hasAnyWriteAccess(path.resolve(pkg.path, '..'), errors, done) break case 'update': case 'remove': hasWriteAccess(pkg.path, errors, andHasWriteAccess(path.resolve(pkg.path, '..'), errors, done)) break case 'move': hasAnyWriteAccess(pkg.path, errors, andHasWriteAccess(path.resolve(pkg.fromPath, '..'), errors, done)) break default: done() } }, function () { if (!errors.length) return next() uniq(errors.map(function (er) { return 'Missing write access to ' + er.path })).forEach(function (er) { log.warn('checkPermissions', er) }) npm.config.get('force') ? 
next() : next(errors[0]) }) } function andHasWriteAccess (dir, errors, done) { validate('SAF', arguments) return function () { hasWriteAccess(dir, errors, done) } } function hasAnyWriteAccess (dir, errors, done) { validate('SAF', arguments) findNearestDir() function findNearestDir () { var nextDir = path.resolve(dir, '..') exists(dir, function (dirDoesntExist) { if (!dirDoesntExist || nextDir === dir) { return hasWriteAccess(dir, errors, done) } else { dir = nextDir findNearestDir() } }) } } function hasWriteAccess (dir, errors, done) { validate('SAF', arguments) writable(dir, function (er) { if (er) errors.push(er) done() }) } npm_3.5.2.orig/lib/install/copy-tree.js0000644000000000000000000000116112631326456016152 0ustar 00000000000000'use strict' module.exports = function (tree) { return copyTree(tree, {}) } function copyTree (tree, cache) { if (cache[tree.path]) return cache[tree.path] var newTree = cache[tree.path] = Object.create(tree) copyModuleList(newTree, 'children', cache) newTree.children.forEach(function (child) { child.parent = newTree }) copyModuleList(newTree, 'requires', cache) copyModuleList(newTree, 'requiredBy', cache) return newTree } function copyModuleList (tree, key, cache) { var newList = [] tree[key].forEach(function (child) { newList.push(copyTree(child, cache)) }) tree[key] = newList } npm_3.5.2.orig/lib/install/decompose-actions.js0000644000000000000000000000257512631326456017671 0ustar 00000000000000'use strict' var validate = require('aproba') var asyncMap = require('slide').asyncMap var npm = require('../npm.js') module.exports = function (differences, decomposed, next) { validate('AAF', arguments) asyncMap(differences, function (action, done) { var cmd = action[0] var pkg = action[1] switch (cmd) { case 'add': case 'update': addSteps(decomposed, pkg, done) break case 'move': moveSteps(decomposed, pkg, done) break case 'remove': case 'update-linked': default: defaultSteps(decomposed, cmd, pkg, done) } }, next) } function addSteps (decomposed, pkg, done) { if (!pkg.fromBundle) { decomposed.push(['fetch', pkg]) decomposed.push(['extract', pkg]) decomposed.push(['test', pkg]) } if (!pkg.fromBundle || npm.config.get('rebuild-bundle')) { decomposed.push(['preinstall', pkg]) decomposed.push(['build', pkg]) decomposed.push(['install', pkg]) decomposed.push(['postinstall', pkg]) } decomposed.push(['finalize', pkg]) done() } function moveSteps (decomposed, pkg, done) { decomposed.push(['move', pkg]) decomposed.push(['build', pkg]) decomposed.push(['install', pkg]) decomposed.push(['postinstall', pkg]) decomposed.push(['test', pkg]) done() } function defaultSteps (decomposed, cmd, pkg, done) { decomposed.push([cmd, pkg]) done() } npm_3.5.2.orig/lib/install/deps.js0000644000000000000000000005171612631326456015211 0ustar 00000000000000'use strict' var assert = require('assert') var path = require('path') var url = require('url') var semver = require('semver') var asyncMap = require('slide').asyncMap var chain = require('slide').chain var union = require('lodash.union') var iferr = require('iferr') var npa = require('npm-package-arg') var validate = require('aproba') var realizePackageSpecifier = require('realize-package-specifier') var dezalgo = require('dezalgo') var fetchPackageMetadata = require('../fetch-package-metadata.js') var andAddParentToErrors = require('./and-add-parent-to-errors.js') var addShrinkwrap = require('../fetch-package-metadata.js').addShrinkwrap var addBundled = require('../fetch-package-metadata.js').addBundled var readShrinkwrap = 
require('./read-shrinkwrap.js') var inflateShrinkwrap = require('./inflate-shrinkwrap.js') var inflateBundled = require('./inflate-bundled.js') var andFinishTracker = require('./and-finish-tracker.js') var npm = require('../npm.js') var flatName = require('./flatten-tree.js').flatName var createChild = require('./node.js').create var resetMetadata = require('./node.js').reset var andIgnoreErrors = require('./and-ignore-errors.js') var isInstallable = require('./validate-args.js').isInstallable var packageId = require('../utils/package-id.js') var moduleName = require('../utils/module-name.js') // The export functions in this module mutate a dependency tree, adding // items to them. function isDep (tree, child) { if (child.fromShrinkwrap) return true var name = moduleName(child) var requested = isProdDep(tree, name) var matches if (requested) matches = doesChildVersionMatch(child, requested, tree) if (matches) return matches requested = isDevDep(tree, name) if (!requested) return return doesChildVersionMatch(child, requested, tree) } function isDevDep (tree, name) { var devDeps = tree.package.devDependencies || {} var reqVer = devDeps[name] if (reqVer == null) return return npa(name + '@' + reqVer) } function isProdDep (tree, name) { var deps = tree.package.dependencies || {} var reqVer = deps[name] if (reqVer == null) return false return npa(name + '@' + reqVer) } var registryTypes = { range: true, version: true } function doesChildVersionMatch (child, requested, requestor) { // we always consider deps provided by a shrinkwrap as "correct" or else // we'll subvert them if they're intentionally "invalid" if (child.parent === requestor && child.fromShrinkwrap) return true // ranges of * ALWAYS count as a match, because when downloading we allow // prereleases to match * if there are ONLY prereleases if (requested.spec === '*') return true var childReq = child.package._requested if (childReq) { if (childReq.rawSpec === requested.rawSpec) return true if (childReq.type === requested.type && childReq.spec === requested.spec) return true } if (!registryTypes[requested.type]) return requested.rawSpec === child.package._from return semver.satisfies(child.package.version, requested.spec) } exports.recalculateMetadata = function (tree, log, next) { recalculateMetadata(tree, log, {}, next) } function recalculateMetadata (tree, log, seen, next) { validate('OOOF', arguments) if (seen[tree.path]) return next() seen[tree.path] = true if (tree.parent == null) resetMetadata(tree) function markDeps (spec, done) { validate('SF', arguments) realizePackageSpecifier(spec, packageRelativePath(tree), function (er, req) { if (er || !req.name) return done() var child = findRequirement(tree, req.name, req) if (child) { resolveWithExistingModule(child, tree, log, andIgnoreErrors(done)) } else if (tree.package.dependencies[req.name] != null) { tree.missingDeps[req.name] = req.rawSpec done() } else if (tree.package.devDependencies[req.name] != null) { tree.missingDevDeps[req.name] = req.rawSpec done() } else { done() } }) } function specs (deps) { return Object.keys(deps).map(function (depname) { return depname + '@' + deps[depname] }) } var tomark = specs(tree.package.dependencies) if (!tree.parent && (npm.config.get('dev') || !npm.config.get('production'))) { tomark = union(tomark, specs(tree.package.devDependencies)) } tree.children = tree.children.filter(function (child) { return !child.failed }) chain([ [asyncMap, tomark, markDeps], [asyncMap, tree.children, function (child, done) { recalculateMetadata(child, log, 
seen, done) }] ], function () { tree.userRequired = tree.package._requiredBy.some(function (req) { return req === '#USER' }) tree.existing = tree.package._requiredBy.some(function (req) { return req === '#EXISTING' }) tree.package._location = flatNameFromTree(tree) next(null, tree) }) } function addRequiredDep (tree, child) { if (!isDep(tree, child)) return false var name = isProdDep(tree, moduleName(child)) ? flatNameFromTree(tree) : '#DEV:' + flatNameFromTree(tree) replaceModuleName(child.package, '_requiredBy', name) replaceModule(child, 'requiredBy', tree) replaceModule(tree, 'requires', child) return true } exports._removeObsoleteDep = removeObsoleteDep function removeObsoleteDep (child) { if (child.removed) return child.removed = true var requires = child.requires || [] requires.forEach(function (requirement) { requirement.requiredBy = requirement.requiredBy.filter(function (reqBy) { return reqBy !== child }) if (requirement.requiredBy.length === 0) removeObsoleteDep(requirement) }) } function matchingDep (tree, name) { if (tree.package.dependencies && tree.package.dependencies[name]) return tree.package.dependencies[name] if (tree.package.devDependencies && tree.package.devDependencies[name]) return tree.package.devDependencies[name] return } function packageRelativePath (tree) { if (!tree) return '' var requested = tree.package._requested || {} var isLocal = requested.type === 'directory' || requested.type === 'local' return isLocal ? requested.spec : tree.path } function getShrinkwrap (tree, name) { return tree.package._shrinkwrap && tree.package._shrinkwrap.dependencies && tree.package._shrinkwrap.dependencies[name] } exports.getAllMetadata = function (args, tree, next) { asyncMap(args, function (spec, done) { if (tree && spec.lastIndexOf('@') <= 0) { var sw = getShrinkwrap(tree, spec) if (sw) { // FIXME: This is duplicated in inflate-shrinkwrap and should be factored // into a shared function spec = sw.resolved ? spec + '@' + sw.resolved : (sw.from && url.parse(sw.from).protocol) ?
spec + '@' + sw.from : spec + '@' + sw.version } else { var version = matchingDep(tree, spec) if (version != null) { spec += '@' + version } } } fetchPackageMetadata(spec, packageRelativePath(tree), done) }, next) } // Add a list of args to tree's top level dependencies exports.loadRequestedDeps = function (args, tree, saveToDependencies, log, next) { validate('AOOF', [args, tree, log, next]) asyncMap(args, function (pkg, done) { var depLoaded = andAddParentToErrors(tree, done) resolveWithNewModule(pkg, tree, log.newGroup('loadRequestedDeps'), iferr(depLoaded, function (child, tracker) { validate('OO', arguments) if (npm.config.get('global')) { child.isGlobal = true } var childName = moduleName(child) if (saveToDependencies) { tree.package[saveToDependencies][childName] = child.package._requested.rawSpec || child.package._requested.spec } if (saveToDependencies && saveToDependencies !== 'devDependencies') { tree.package.dependencies[childName] = child.package._requested.rawSpec || child.package._requested.spec } child.userRequired = true child.save = saveToDependencies // For things the user asked to install, that aren't a dependency (or // won't be when we're done), flag it as "depending" on the user // themselves, so we don't remove it as a dep that no longer exists if (!addRequiredDep(tree, child)) { replaceModuleName(child.package, '_requiredBy', '#USER') } depLoaded(null, child, tracker) })) }, andForEachChild(loadDeps, andFinishTracker(log, next))) } function moduleNameMatches (name) { return function (child) { return moduleName(child) === name } } function noModuleNameMatches (name) { return function (child) { return moduleName(child) !== name } } // while this implementation does not require async calling, doing so // gives this a consistent interface with loadDeps et al exports.removeDeps = function (args, tree, saveToDependencies, log, next) { validate('AOOF', [args, tree, log, next]) args.forEach(function (pkg) { var pkgName = moduleName(pkg) if (saveToDependencies) { var toRemove = tree.children.filter(moduleNameMatches(pkgName)) replaceModule(tree, 'removed', toRemove[0]) toRemove.forEach(function (parent) { parent.save = saveToDependencies }) } tree.children = tree.children.filter(noModuleNameMatches(pkgName)) }) log.finish() next() } function andForEachChild (load, next) { validate('F', [next]) next = dezalgo(next) return function (er, children, logs) { // when children is empty, logs won't be passed in at all (asyncMap is weird) // so shortcircuit before arg validation if (!er && (!children || children.length === 0)) return next() validate('EAA', arguments) if (er) return next(er) assert(children.length === logs.length) var cmds = [] for (var ii = 0; ii < children.length; ++ii) { cmds.push([load, children[ii], logs[ii]]) } var sortedCmds = cmds.sort(function installOrder (aa, bb) { return moduleName(aa[1]).localeCompare(moduleName(bb[1])) }) chain(sortedCmds, next) } } function isDepOptional (tree, name) { if (!tree.package.optionalDependencies) return false if (tree.package.optionalDependencies[name] != null) return true return false } var failedDependency = exports.failedDependency = function (tree, name_pkg) { var name, pkg if (typeof name_pkg === 'string') { name = name_pkg } else { pkg = name_pkg name = moduleName(pkg) } tree.children = tree.children.filter(noModuleNameMatches(name)) if (isDepOptional(tree, name)) { return false } tree.failed = true if (!tree.parent) return true if (tree.userRequired) return true for (var ii = 0; ii < tree.requiredBy.length; ++ii) { 
var requireParent = tree.requiredBy[ii] if (failedDependency(requireParent, tree.package)) { return true } } return false } function top (tree) { if (tree.parent) return top(tree.parent) return tree } function treeWarn (tree, what, error) { var topTree = top(tree) if (!topTree.warnings) topTree.warnings = [] error.optional = flatNameFromTree(tree) + '/' + what topTree.warnings.push(error) } function andHandleOptionalErrors (log, tree, name, done) { validate('OOSF', arguments) return function (er, child, childLog) { if (!er) validate('OO', [child, childLog]) if (!er) return done(er, child, childLog) var isFatal = failedDependency(tree, name) if (er && !isFatal) { tree.children = tree.children.filter(noModuleNameMatches(name)) treeWarn(tree, name, er) return done() } else { return done(er, child, childLog) } } } // Load any missing dependencies in the given tree exports.loadDeps = loadDeps function loadDeps (tree, log, next) { validate('OOF', arguments) if (tree.loaded || (tree.parent && tree.parent.failed)) return andFinishTracker.now(log, next) if (tree.parent) tree.loaded = true if (!tree.package.dependencies) tree.package.dependencies = {} asyncMap(Object.keys(tree.package.dependencies), function (dep, done) { var version = tree.package.dependencies[dep] if (tree.package.optionalDependencies && tree.package.optionalDependencies[dep] && !npm.config.get('optional')) { return done() } addDependency(dep, version, tree, log.newGroup('loadDep:' + dep), andHandleOptionalErrors(log, tree, dep, done)) }, andForEachChild(loadDeps, andFinishTracker(log, next))) } // Load development dependencies into the given tree exports.loadDevDeps = function (tree, log, next) { validate('OOF', arguments) if (!tree.package.devDependencies) return andFinishTracker.now(log, next) asyncMap(Object.keys(tree.package.devDependencies), function (dep, done) { // things defined as both dev dependencies and regular dependencies are treated // as regular dependencies if (tree.package.dependencies[dep]) return done() var logGroup = log.newGroup('loadDevDep:' + dep) addDependency(dep, tree.package.devDependencies[dep], tree, logGroup, done) }, andForEachChild(loadDeps, andFinishTracker(log, next))) } exports.loadExtraneous = function loadExtraneous (tree, log, next) { var seen = {} function loadExtraneous (tree, log, next) { validate('OOF', arguments) if (seen[tree.path]) return next() seen[tree.path] = true asyncMap(tree.children.filter(function (child) { return !child.loaded }), function (child, done) { resolveWithExistingModule(child, tree, log, done) }, andForEachChild(loadExtraneous, andFinishTracker(log, next))) } loadExtraneous(tree, log, next) } exports.loadExtraneous.andResolveDeps = function (tree, log, next) { validate('OOF', arguments) asyncMap(tree.children.filter(function (child) { return !child.loaded }), function (child, done) { resolveWithExistingModule(child, tree, log, done) }, andForEachChild(loadDeps, andFinishTracker(log, next))) } function addDependency (name, versionSpec, tree, log, done) { validate('SSOOF', arguments) var next = andAddParentToErrors(tree, done) var spec = name + '@' + versionSpec realizePackageSpecifier(spec, packageRelativePath(tree), iferr(done, function (req) { var child = findRequirement(tree, name, req) if (child) { resolveWithExistingModule(child, tree, log, iferr(next, function (child, log) { if (child.package._shrinkwrap === undefined) { readShrinkwrap.andInflate(child, function (er) { next(er, child, log) }) } else { next(null, child, log) } })) } else { resolveWithNewModule(req,
tree, log, next) } })) } function resolveWithExistingModule (child, tree, log, next) { validate('OOOF', arguments) addRequiredDep(tree, child) if (tree.parent && child.parent !== tree) updatePhantomChildren(tree.parent, child) next(null, child, log) } var updatePhantomChildren = exports.updatePhantomChildren = function (current, child) { validate('OO', arguments) while (current && current !== child.parent) { // FIXME: phantomChildren doesn't actually belong in the package.json if (!current.package._phantomChildren) current.package._phantomChildren = {} current.package._phantomChildren[moduleName(child)] = child.package.version current = current.parent } } function flatNameFromTree (tree) { validate('O', arguments) if (!tree.parent) return '/' var path = flatNameFromTree(tree.parent) if (path !== '/') path += '/' return flatName(path, tree) } exports._replaceModuleName = replaceModuleName function replaceModuleName (obj, key, name) { validate('OSS', arguments) obj[key] = union(obj[key] || [], [name]) } exports._replaceModule = replaceModule function replaceModule (obj, key, child) { validate('OSO', arguments) if (!obj[key]) obj[key] = [] // we replace children with a new array object instead of mutating it // because mutating it results in weird failure states. // I would very much like to know _why_ this is. =/ var children = [].concat(obj[key]) var childName = moduleName(child) for (var replaceAt = 0; replaceAt < children.length; ++replaceAt) { if (moduleName(children[replaceAt]) === childName) break } var replacing = children.splice(replaceAt, 1, child) obj[key] = children return replacing[0] } function resolveWithNewModule (pkg, tree, log, next) { validate('OOOF', arguments) if (pkg.type) { return fetchPackageMetadata(pkg, packageRelativePath(tree), log.newItem('fetchMetadata'), iferr(next, function (pkg) { resolveWithNewModule(pkg, tree, log, next) })) } if (!pkg._installable) { log.silly('resolveWithNewModule', packageId(pkg), 'checking installable status') return isInstallable(pkg, iferr(next, function () { pkg._installable = true resolveWithNewModule(pkg, tree, log, next) })) } if (!pkg._from) { pkg._from = pkg._requested.name + '@' + pkg._requested.spec } addShrinkwrap(pkg, iferr(next, function () { addBundled(pkg, iferr(next, function () { var parent = earliestInstallable(tree, tree, pkg) || tree var child = createChild({ package: pkg, parent: parent, path: path.join(parent.path, 'node_modules', pkg.name), realpath: path.resolve(parent.realpath, 'node_modules', pkg.name), children: pkg._bundled || [], isLink: tree.isLink }) var replaced = replaceModule(parent, 'children', child) if (replaced) removeObsoleteDep(replaced) addRequiredDep(tree, child) pkg._location = flatNameFromTree(child) if (tree.parent && parent !== tree) updatePhantomChildren(tree.parent, child) if (pkg._bundled) { inflateBundled(child, child.children) } if (pkg._shrinkwrap && pkg._shrinkwrap.dependencies) { return inflateShrinkwrap(child, pkg._shrinkwrap.dependencies, function (er) { next(er, child, log) }) } next(null, child, log) })) })) } var validatePeerDeps = exports.validatePeerDeps = function (tree, onInvalid) { if (!tree.package.peerDependencies) return Object.keys(tree.package.peerDependencies).forEach(function (pkgname) { var version = tree.package.peerDependencies[pkgname] var match = findRequirement(tree.parent || tree, pkgname, npa(pkgname + '@' + version)) if (!match) onInvalid(tree, pkgname, version) }) } exports.validateAllPeerDeps = function (tree, onInvalid) { validateAllPeerDeps(tree, 
onInvalid, {}) } function validateAllPeerDeps (tree, onInvalid, seen) { validate('OFO', arguments) if (seen[tree.path]) return seen[tree.path] = true validatePeerDeps(tree, onInvalid) tree.children.forEach(function (child) { validateAllPeerDeps(child, onInvalid, seen) }) } // Determine if a module requirement is already met by the tree at or above // our current location in the tree. var findRequirement = exports.findRequirement = function (tree, name, requested, requestor) { validate('OSO', [tree, name, requested]) if (!requestor) requestor = tree var nameMatch = function (child) { return moduleName(child) === name && child.parent && !child.removed } var versionMatch = function (child) { return doesChildVersionMatch(child, requested, requestor) } if (nameMatch(tree)) { // this *is* the module, but it doesn't match the version, so a // new copy will have to be installed return versionMatch(tree) ? tree : null } var matches = tree.children.filter(nameMatch) if (matches.length) { matches = matches.filter(versionMatch) // the module exists as a dependent, but the version doesn't match, so // a new copy will have to be installed above here if (matches.length) return matches[0] return null } if (!tree.parent) return null return findRequirement(tree.parent, name, requested, requestor) } // Find the highest level in the tree that we can install this module in. // If the module isn't installed above us yet, that'd be the very top. // If it is, then it's the level below where its installed. var earliestInstallable = exports.earliestInstallable = function (requiredBy, tree, pkg) { validate('OOO', arguments) function undeletedModuleMatches (child) { return !child.removed && moduleName(child) === pkg.name } if (tree.children.some(undeletedModuleMatches)) return null // If any of the children of this tree have conflicting // binaries then we need to decline to install this package here. var binaryMatches = typeof pkg.bin === 'object' && tree.children.some(function (child) { if (child.removed) return false if (typeof child.package.bin !== 'object') return false return Object.keys(child.package.bin).some(function (bin) { return pkg.bin[bin] }) }) if (binaryMatches) return null // if this tree location requested the same module then we KNOW it // isn't compatible because if it were findRequirement would have // found that version. 
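// --- Illustrative example, not part of npm's source (the dependencies check
// the comment above describes follows right after): if /app declares
// "b": "^1.0.0" and a newly requested module c needs b@2.x, findRequirement()
// returns nothing satisfying, and the walk up from c refuses /app as a home
// for b@2, whether because a b is already installed there or because /app's
// own dependencies request a conflicting one; b@2 therefore nests at
// /app/node_modules/c/node_modules/b instead of being hoisted to the top.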
var deps = tree.package.dependencies || {} if (!tree.removed && requiredBy !== tree && deps[pkg.name]) { return null } // FIXME: phantomChildren doesn't actually belong in the package.json if (tree.package._phantomChildren && tree.package._phantomChildren[pkg.name]) return null if (!tree.parent) return tree if (tree.isGlobal) return tree if (npm.config.get('global-style') && !tree.parent.parent) return tree if (npm.config.get('legacy-bundling')) return tree return (earliestInstallable(requiredBy, tree.parent, pkg) || tree) } npm_3.5.2.orig/lib/install/diff-trees.js0000644000000000000000000001115212631326456016274 0ustar 00000000000000'use strict' var validate = require('aproba') var npa = require('npm-package-arg') var flattenTree = require('./flatten-tree.js') function nonRegistrySource (pkg) { validate('O', arguments) var requested = pkg._requested || (pkg._from && npa(pkg._from)) if (!requested) return false if (requested.type === 'hosted') return true if (requested.type === 'local') return true return false } function pkgAreEquiv (aa, bb) { var aaSha = (aa.dist && aa.dist.shasum) || aa._shasum var bbSha = (bb.dist && bb.dist.shasum) || bb._shasum if (aaSha === bbSha) return true if (aaSha || bbSha) return false if (nonRegistrySource(aa) || nonRegistrySource(bb)) return false if (aa.version === bb.version) return true return false } function getNameAndVersion (pkg) { var versionspec = pkg._shasum if (!versionspec && nonRegistrySource(pkg)) { if (pkg._requested) { versionspec = pkg._requested.spec } else if (pkg._from) { versionspec = npa(pkg._from).spec } } if (!versionspec) { versionspec = pkg.version } return pkg.name + '@' + versionspec } function pushAll (aa, bb) { Array.prototype.push.apply(aa, bb) } module.exports = function (oldTree, newTree, differences, log, next) { validate('OOAOF', arguments) pushAll(differences, sortActions(diffTrees(oldTree, newTree))) log.finish() next() } function isLink (node) { return node && node.isLink } function requiredByAllLinked (node) { if (!node.requiredBy.length) return false return node.requiredBy.filter(isLink).length === node.requiredBy.length } function isNotReqByTop (req) { return req !== '/' && // '/' is the top level itself req !== '#USER' && // #USER req !== '#EXTRANEOUS' } var sortActions = module.exports.sortActions = function (differences) { var actions = {} differences.forEach(function (action) { var child = action[1] actions[child.package._location] = action }) var sorted = [] var added = {} var sortedlocs = Object.keys(actions).sort(sortByLocation) // Do top level deps first, this stops the sorting by required order from // unsorting these deps. var toplocs = sortedlocs.filter(function (location) { var mod = actions[location][1] if (!mod.package._requiredBy) return true // If the module is required by ANY non-top level package // then we don't want to include this. 
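// --- Illustrative note, not part of npm's source (the filter it describes
// continues below): an action survives this filter only when it has no
// _requiredBy at all, or every entry in _requiredBy is the top level
// ('/', '#USER', or '#EXTRANEOUS'). Those direct dependencies seed
// sortByDeps(), which unshifts each module ahead of its requirers, so
// dependencies land in the action list before the packages that need them.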
return !mod.package._requiredBy.some(isNotReqByTop) }) toplocs.concat(sortedlocs).forEach(function (location) { sortByDeps(actions[location]) }) function sortByLocation (aa, bb) { return bb.localeCompare(aa) } function sortByDeps (action) { var mod = action[1] if (added[mod.package._location]) return added[mod.package._location] = action mod.package._requiredBy.sort().forEach(function (location) { if (actions[location]) sortByDeps(actions[location]) }) sorted.unshift(action) } return sorted } function diffTrees (oldTree, newTree) { validate('OO', arguments) var differences = [] var flatOldTree = flattenTree(oldTree) var flatNewTree = flattenTree(newTree) var toRemove = {} var toRemoveByNameAndVer = {} // find differences Object.keys(flatOldTree).forEach(function (flatname) { if (flatNewTree[flatname]) return var pkg = flatOldTree[flatname] toRemove[flatname] = pkg var namever = getNameAndVersion(pkg.package) if (!toRemoveByNameAndVer[namever]) toRemoveByNameAndVer[namever] = [] toRemoveByNameAndVer[namever].push(flatname) }) Object.keys(flatNewTree).forEach(function (path) { var pkg = flatNewTree[path] pkg.oldPkg = flatOldTree[path] pkg.isInLink = (pkg.oldPkg && isLink(pkg.oldPkg.parent)) || (pkg.parent && isLink(pkg.parent)) || requiredByAllLinked(pkg) if (pkg.oldPkg) { if (!pkg.userRequired && pkgAreEquiv(pkg.oldPkg.package, pkg.package)) return if (!pkg.isInLink && (isLink(pkg.oldPkg) || isLink(pkg))) { differences.push(['update-linked', pkg]) } else { differences.push(['update', pkg]) } } else { var vername = getNameAndVersion(pkg.package) if (toRemoveByNameAndVer[vername] && toRemoveByNameAndVer[vername].length && !pkg.fromBundle) { var flatname = toRemoveByNameAndVer[vername].shift() pkg.fromPath = toRemove[flatname].path differences.push(['move', pkg]) delete toRemove[flatname] } else { differences.push(['add', pkg]) } } }) Object .keys(toRemove) .map(function (path) { return toRemove[path] }) .forEach(function (pkg) { differences.push(['remove', pkg]) }) return differences } npm_3.5.2.orig/lib/install/exists.js0000644000000000000000000000140712631326456015565 0ustar 00000000000000'use strict' var fs = require('fs') var inflight = require('inflight') var accessError = require('./access-error.js') var isFsAccessAvailable = require('./is-fs-access-available.js') if (isFsAccessAvailable) { module.exports = fsAccessImplementation } else { module.exports = fsStatImplementation } // exposed only for testing purposes module.exports.fsAccessImplementation = fsAccessImplementation module.exports.fsStatImplementation = fsStatImplementation function fsAccessImplementation (dir, done) { done = inflight('exists:' + dir, done) if (!done) return fs.access(dir, fs.F_OK, done) } function fsStatImplementation (dir, done) { done = inflight('exists:' + dir, done) if (!done) return fs.stat(dir, function (er) { done(accessError(dir, er)) }) } npm_3.5.2.orig/lib/install/filter-invalid-actions.js0000644000000000000000000000175312631326456020621 0ustar 00000000000000'use strict' var path = require('path') var validate = require('aproba') var log = require('npmlog') var packageId = require('../utils/package-id.js') module.exports = function (top, differences, next) { validate('SAF', arguments) var action var keep = [] differences.forEach(function (action) { var cmd = action[0] var pkg = action[1] if (cmd === 'remove') { pkg.removing = true } }) /*eslint no-cond-assign:0*/ while (action = differences.shift()) { var cmd = action[0] var pkg = action[1] if (pkg.isInLink || pkg.parent.target || pkg.parent.isLink) 
{ // we want to skip warning if this is a child of another module that we're removing if (!pkg.parent.removing) { log.warn('skippingAction', 'Module is inside a symlinked module: not running ' + cmd + ' ' + packageId(pkg) + ' ' + path.relative(top, pkg.path)) } } else { keep.push(action) } } differences.push.apply(differences, keep) next() } npm_3.5.2.orig/lib/install/flatten-tree.js0000644000000000000000000000135112631326456016636 0ustar 00000000000000'use strict' var validate = require('aproba') var moduleName = require('../utils/module-name.js') module.exports = function (tree) { validate('O', arguments) var seen = {} var flat = {} var todo = [[tree, '/']] while (todo.length) { var next = todo.shift() var pkg = next[0] seen[pkg.path] = true var path = next[1] flat[path] = pkg if (path !== '/') path += '/' for (var ii = 0; ii < pkg.children.length; ++ii) { var child = pkg.children[ii] if (!seen[child.path]) { todo.push([child, flatName(path, child)]) } } } return flat } var flatName = module.exports.flatName = function (path, child) { validate('SO', arguments) return path + (moduleName(child) || 'TOP') } npm_3.5.2.orig/lib/install/inflate-bundled.js0000644000000000000000000000075412631326456017307 0ustar 00000000000000'use strict' var validate = require('aproba') var childPath = require('../utils/child-path.js') module.exports = function inflateBundled (parent, children) { validate('OA', arguments) children.forEach(function (child) { child.fromBundle = true child.parent = parent child.path = childPath(parent.path, child) child.realpath = childPath(parent.path, child) child.isLink = child.isLink || parent.isLink || parent.target inflateBundled(child, child.children) }) } npm_3.5.2.orig/lib/install/inflate-shrinkwrap.js0000644000000000000000000000443612631326456020063 0ustar 00000000000000'use strict' var url = require('url') var asyncMap = require('slide').asyncMap var validate = require('aproba') var iferr = require('iferr') var fetchPackageMetadata = require('../fetch-package-metadata.js') var addShrinkwrap = require('../fetch-package-metadata.js').addShrinkwrap var addBundled = require('../fetch-package-metadata.js').addBundled var inflateBundled = require('./inflate-bundled.js') var npm = require('../npm.js') var createChild = require('./node.js').create var moduleName = require('../utils/module-name.js') var childPath = require('../utils/child-path.js') var inflateShrinkwrap = module.exports = function (tree, swdeps, finishInflating) { validate('OOF', arguments) if (!npm.config.get('shrinkwrap')) return finishInflating() var onDisk = {} tree.children.forEach(function (child) { onDisk[moduleName(child)] = child }) tree.children = [] asyncMap(Object.keys(swdeps), function (name, next) { var sw = swdeps[name] var spec = sw.resolved ? name + '@' + sw.resolved : (sw.from && url.parse(sw.from).protocol) ? 
name + '@' + sw.from : name + '@' + sw.version var child = onDisk[name] if (child && (child.fromShrinkwrap || (sw.resolved && child.package._resolved === sw.resolved) || (sw.from && url.parse(sw.from).protocol && child.package._from === sw.from) || child.package.version === sw.version)) { if (!child.fromShrinkwrap) child.fromShrinkwrap = spec tree.children.push(child) return next() } fetchPackageMetadata(spec, tree.path, iferr(next, function (pkg) { pkg._from = sw.from || spec addShrinkwrap(pkg, iferr(next, function () { addBundled(pkg, iferr(next, function () { var child = createChild({ package: pkg, loaded: false, parent: tree, fromShrinkwrap: spec, path: childPath(tree.path, pkg), realpath: childPath(tree.realpath, pkg), children: pkg._bundled || [] }) tree.children.push(child) if (pkg._bundled) { inflateBundled(child, child.children) } inflateShrinkwrap(child, sw.dependencies || {}, next) })) })) })) }, finishInflating) } npm_3.5.2.orig/lib/install/is-dev.js0000644000000000000000000000036612631326456015440 0ustar 00000000000000'use strict' var isDev = exports.isDev = function (node) { return node.package._requiredBy.some(function (req) { return req === '#DEV:/' }) } exports.isOnlyDev = function (node) { return node.package._requiredBy.length === 1 && isDev(node) } npm_3.5.2.orig/lib/install/is-extraneous.js0000644000000000000000000000117512631326456017056 0ustar 00000000000000'use strict' var path = require('path') var isDev = require('./is-dev.js').isDev var npm = require('../npm.js') module.exports = function (tree) { var pkg = tree.package var requiredBy = pkg._requiredBy.filter(function (req) { return req[0] !== '#' }) var isTopLevel = tree.parent == null var isChildOfTop = !isTopLevel && tree.parent.parent == null var isTopGlobal = isChildOfTop && tree.parent.path === path.resolve(npm.globalDir, '..') var topHasNoPackageJson = isChildOfTop && tree.parent.error return !isTopLevel && (!isChildOfTop || !topHasNoPackageJson) && !isTopGlobal && requiredBy.length === 0 && !isDev(tree) } npm_3.5.2.orig/lib/install/is-fs-access-available.js0000644000000000000000000000137312631326456020446 0ustar 00000000000000'use strict' var fs = require('fs') var semver = require('semver') var isWindows = process.platform === 'win32' // fs.access first introduced in node 0.12 / io.js if (!fs.access) { module.exports = false } else if (!isWindows) { // fs.access always works on non-Windows OSes module.exports = true } else { // The Windows implementation of `fs.access` has a bug where it will // sometimes return access errors all the time for directories, even // when access is available. As all we actually test ARE directories, this // is a bit of a problem. 
// This was fixed in io.js version 1.5.0 // As of 2015-07-20, it is still unfixed in node: // https://github.com/joyent/node/issues/25657 module.exports = semver.gte(process.version, '1.5.0') } npm_3.5.2.orig/lib/install/mutate-into-logical-tree.js0000644000000000000000000000704212631326456021062 0ustar 00000000000000'use strict' var union = require('lodash.union') var without = require('lodash.without') var validate = require('aproba') var flattenTree = require('./flatten-tree.js') var isExtraneous = require('./is-extraneous.js') var validateAllPeerDeps = require('./deps.js').validateAllPeerDeps var packageId = require('../utils/package-id.js') var moduleName = require('../utils/module-name.js') var mutateIntoLogicalTree = module.exports = function (tree) { validate('O', arguments) validateAllPeerDeps(tree, function (tree, pkgname, version) { if (!tree.missingPeers) tree.missingPeers = {} tree.missingPeers[pkgname] = version }) var flat = flattenTree(tree) function getNode (flatname) { return flatname.substr(0, 5) === '#DEV:' ? flat[flatname.substr(5)] : flat[flatname] } Object.keys(flat).sort().forEach(function (flatname) { var node = flat[flatname] var requiredBy = node.package._requiredBy || [] var requiredByNames = requiredBy.filter(function (parentFlatname) { var parentNode = getNode(parentFlatname) if (!parentNode) return false return parentNode.package.dependencies[moduleName(node)] || (parentNode.package.devDependencies && parentNode.package.devDependencies[moduleName(node)]) }) requiredBy = requiredByNames.map(getNode) node.requiredBy = requiredBy if (!requiredBy.length) return if (node.parent) node.parent.children = without(node.parent.children, node) requiredBy.forEach(function (parentNode) { parentNode.children = union(parentNode.children, [node]) }) if (node.package._requiredBy.some(function (nodename) { return nodename[0] === '#' })) { tree.children = union(tree.children, [node]) } }) return tree } module.exports.asReadInstalled = function (tree) { mutateIntoLogicalTree(tree) return translateTree(tree) } function translateTree (tree) { return translateTree_(tree, {}) } function translateTree_ (tree, seen) { var pkg = tree.package if (seen[tree.path]) return pkg seen[tree.path] = pkg if (pkg._dependencies) return pkg pkg._dependencies = pkg.dependencies pkg.dependencies = {} tree.children.forEach(function (child) { pkg.dependencies[moduleName(child)] = translateTree_(child, seen) }) Object.keys(tree.missingDeps).forEach(function (name) { if (pkg.dependencies[name]) { pkg.dependencies[name].invalid = true pkg.dependencies[name].realName = name pkg.dependencies[name].extraneous = false } else { pkg.dependencies[name] = { requiredBy: tree.missingDeps[name], missing: true, optional: !!pkg.optionalDependencies[name] } } }) var checkForMissingPeers = (tree.parent ? 
[] : [tree]).concat(tree.children)
  checkForMissingPeers.filter(function (child) { return child.missingPeers }).forEach(function (child) {
    Object.keys(child.missingPeers).forEach(function (pkgname) {
      var version = child.missingPeers[pkgname]
      var peerPkg = pkg.dependencies[pkgname]
      if (!peerPkg) {
        peerPkg = pkg.dependencies[pkgname] = {
          _id: pkgname + '@' + version,
          name: pkgname,
          version: version
        }
      }
      if (!peerPkg.peerMissing) peerPkg.peerMissing = []
      peerPkg.peerMissing.push({
        requiredBy: packageId(child),
        requires: pkgname + '@' + version
      })
    })
  })
  pkg.path = tree.path
  pkg.error = tree.error
  pkg.extraneous = isExtraneous(tree)
  if (tree.target && tree.parent && !tree.parent.target) pkg.link = tree.realpath
  return pkg
}
npm_3.5.2.orig/lib/install/node.js0000644000000000000000000000277312631326456015202 0ustar 00000000000000
'use strict'

var defaultTemplate = {
  package: {
    dependencies: {},
    devDependencies: {},
    optionalDependencies: {},
    _requiredBy: [],
    _phantomChildren: {}
  },
  loaded: false,
  children: [],
  requiredBy: [],
  requires: [],
  missingDeps: {},
  missingDevDeps: {},
  path: null,
  realpath: null,
  userRequired: false,
  existing: false
}

function isLink (node) {
  return node && node.isLink
}

var create = exports.create = function (node, template) {
  if (!template) template = defaultTemplate
  Object.keys(template).forEach(function (key) {
    if (template[key] != null && typeof template[key] === 'object' && !(template[key] instanceof Array)) {
      if (!node[key]) node[key] = {}
      return create(node[key], template[key])
    }
    if (node[key] != null) return
    node[key] = template[key]
  })
  if (isLink(node) || isLink(node.parent)) {
    node.isLink = true
  }
  return node
}

exports.reset = function (node) {
  reset(node, {})
}

function reset (node, seen) {
  if (seen[node.path]) return
  seen[node.path] = true
  var child = create(node)
  child.package._requiredBy = child.package._requiredBy.filter(function (req) {
    return req[0] === '#'
  })
  child.requiredBy = []
  child.package._phantomChildren = {}
  // FIXME: cleaning up after read-package-json's mess =(
  if (child.package._id === '@') delete child.package._id
  child.missingDeps = {}
  child.children.forEach(function (child) { reset(child, seen) })
  if (!child.package.version) child.package.version = ''
}
npm_3.5.2.orig/lib/install/prune-tree.js0000644000000000000000000000236112631326456016334 0ustar 00000000000000
'use strict'

var validate = require('aproba')
var flattenTree = require('./flatten-tree.js')

function isNotPackage (mod) {
  return function (parentMod) { return mod !== parentMod }
}

module.exports = function pruneTree (tree) {
  validate('O', arguments)
  var flat = flattenTree(tree)
  // we just do this repeatedly until there are no more orphaned packages
  // which isn't as efficient as it could be on a REALLY big tree
  // but we'll face that if it proves to be an issue
  var removedPackage
  do {
    removedPackage = false
    Object.keys(flat).forEach(function (flatname) {
      var child = flat[flatname]
      if (!child.parent) return
      child.package._requiredBy = (child.package._requiredBy || []).filter(function (req) {
        var isDev = req.substr(0, 4) === '#DEV'
        if (req[0] === '#' && !isDev) return true
        if (flat[req]) return true
        if (!isDev) return false
        var reqChildAsDevDep = flat[req.substr(5)]
        return reqChildAsDevDep && !reqChildAsDevDep.parent
      })
      if (!child.package._requiredBy.length) {
        removedPackage = true
        delete flat[flatname]
        child.parent.children = child.parent.children.filter(isNotPackage(child))
      }
    })
  } while (removedPackage)
}
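To make the fixed-point loop in prune-tree.js concrete, here is a minimal self-contained sketch of the same idea. It is illustrative only: it uses a plain object keyed by flat names where the real code walks module nodes with '#'-prefixed markers and parent links.

'use strict'
// Toy flattened tree: each entry lists who requires it.
var flat = {
  '/a': { requiredBy: ['/'] },      // kept: required by the top level
  '/b': { requiredBy: ['/a'] },     // kept: required by a surviving node
  '/c': { requiredBy: ['/gone'] }   // pruned: its only dependent is absent
}
var removedPackage
do {
  removedPackage = false
  Object.keys(flat).forEach(function (name) {
    // drop references to nodes that no longer exist...
    flat[name].requiredBy = flat[name].requiredBy.filter(function (req) {
      return req === '/' || Boolean(flat[req])
    })
    // ...and delete any node that nothing refers to anymore
    if (!flat[name].requiredBy.length) {
      removedPackage = true
      delete flat[name]
    }
  })
} while (removedPackage) // repeat until no more orphans fall out
console.log(Object.keys(flat)) // => [ '/a', '/b' ]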
npm_3.5.2.orig/lib/install/read-shrinkwrap.js0000644000000000000000000000156412631326456017353 0ustar 00000000000000'use strict' var path = require('path') var fs = require('graceful-fs') var iferr = require('iferr') var inflateShrinkwrap = require('./inflate-shrinkwrap.js') var parseJSON = require('../utils/parse-json.js') var readShrinkwrap = module.exports = function (child, next) { fs.readFile(path.join(child.path, 'npm-shrinkwrap.json'), function (er, data) { if (er) { child.package._shrinkwrap = null return next() } try { child.package._shrinkwrap = parseJSON(data) } catch (ex) { child.package._shrinkwrap = null return next(ex) } return next() }) } module.exports.andInflate = function (child, next) { readShrinkwrap(child, iferr(next, function () { if (child.package._shrinkwrap) { return inflateShrinkwrap(child, child.package._shrinkwrap.dependencies || {}, next) } else { return next() } })) } npm_3.5.2.orig/lib/install/save.js0000644000000000000000000001476112631326456015213 0ustar 00000000000000'use strict' var fs = require('graceful-fs') var path = require('path') var url = require('url') var writeFileAtomic = require('write-file-atomic') var log = require('npmlog') var semver = require('semver') var iferr = require('iferr') var validate = require('aproba') var without = require('lodash.without') var npm = require('../npm.js') var deepSortObject = require('../utils/deep-sort-object.js') var parseJSON = require('../utils/parse-json.js') var moduleName = require('../utils/module-name.js') // if the -S|--save option is specified, then write installed packages // as dependencies to a package.json file. exports.saveRequested = function (args, tree, andReturn) { validate('AOF', arguments) savePackageJson(args, tree, andWarnErrors(andSaveShrinkwrap(tree, andReturn))) } function andSaveShrinkwrap (tree, andReturn) { validate('OF', arguments) return function (er) { validate('E', arguments) saveShrinkwrap(tree, andWarnErrors(andReturn)) } } function andWarnErrors (cb) { validate('F', arguments) return function (er) { if (er) log.warn('saveError', er.message) arguments[0] = null cb.apply(null, arguments) } } function saveShrinkwrap (tree, next) { validate('OF', arguments) var saveTarget = path.resolve(tree.path, 'npm-shrinkwrap.json') fs.stat(saveTarget, function (er, stat) { if (er) return next() var save = npm.config.get('save') var saveDev = npm.config.get('save-dev') var saveOptional = npm.config.get('save-optional') var shrinkwrap = tree.package._shrinkwrap || {dependencies: {}} var hasDevOnlyDeps = tree.requires.filter(function (dep) { var devReqs = dep.package._requiredBy.filter(function (name) { return name.substr(0, 4) === '#DEV' }) return devReqs.length === dep.package._requiredBy.length }).some(function (dep) { return shrinkwrap.dependencies[dep.package.name] != null }) if (!saveOptional && saveDev && !hasDevOnlyDeps) return next() if (saveOptional || !save) return next() if (hasDevOnlyDeps) { var dev = npm.config.get('dev') npm.config.set('dev', true) npm.commands.shrinkwrap([], true, function () { npm.config.set('dev', dev) next.apply(this, arguments) }) } else { npm.commands.shrinkwrap([], true, next) } }) } function savePackageJson (args, tree, next) { validate('AOF', arguments) var saveBundle = npm.config.get('save-bundle') // each item in the tree is a top-level thing that should be saved // to the package.json file. 
// The relevant tree shape is { : {what:} } var saveTarget = path.resolve(tree.path, 'package.json') // don't use readJson, because we don't want to do all the other // tricky npm-specific stuff that's in there. fs.readFile(saveTarget, iferr(next, function (packagejson) { try { packagejson = parseJSON(packagejson) } catch (ex) { return next(ex) } // If we're saving bundled deps, normalize the key before we start if (saveBundle) { var bundle = packagejson.bundleDependencies || packagejson.bundledDependencies delete packagejson.bundledDependencies if (!Array.isArray(bundle)) bundle = [] } var toSave = getThingsToSave(tree) var toRemove = getThingsToRemove(args, tree) var savingTo = {} toSave.forEach(function (pkg) { savingTo[pkg.save] = true }) toRemove.forEach(function (pkg) { savingTo[pkg.save] = true }) Object.keys(savingTo).forEach(function (save) { if (!packagejson[save]) packagejson[save] = {} }) log.verbose('saving', toSave) toSave.forEach(function (pkg) { packagejson[pkg.save][pkg.name] = pkg.spec if (saveBundle) { var ii = bundle.indexOf(pkg.name) if (ii === -1) bundle.push(pkg.name) } }) toRemove.forEach(function (pkg) { delete packagejson[pkg.save][pkg.name] if (saveBundle) { bundle = without(bundle, pkg.name) } }) Object.keys(savingTo).forEach(function (key) { packagejson[key] = deepSortObject(packagejson[key]) }) if (saveBundle) { packagejson.bundledDependencies = deepSortObject(bundle) } var json = JSON.stringify(packagejson, null, 2) + '\n' writeFileAtomic(saveTarget, json, next) })) } var getSaveType = exports.getSaveType = function (args) { validate('A', arguments) var nothingToSave = !args.length var globalInstall = npm.config.get('global') var noSaveFlags = !npm.config.get('save') && !npm.config.get('save-dev') && !npm.config.get('save-optional') if (nothingToSave || globalInstall || noSaveFlags) return null if (npm.config.get('save-optional')) return 'optionalDependencies' else if (npm.config.get('save-dev')) return 'devDependencies' else return 'dependencies' } function computeVersionSpec (child) { validate('O', arguments) var requested = child.package._requested if (!requested || requested.type === 'tag') { requested = { type: 'version', spec: child.package.version } } if (requested.type === 'version' || requested.type === 'range') { var version = child.package.version var rangeDescriptor = '' if (semver.valid(version, true) && semver.gte(version, '0.1.0', true) && !npm.config.get('save-exact')) { rangeDescriptor = npm.config.get('save-prefix') } return rangeDescriptor + version } else if (requested.type === 'directory' || requested.type === 'local') { var relativePath = path.relative(child.parent.path, requested.spec) if (/^[.][.]/.test(relativePath)) { return url.format({ protocol: 'file', slashes: true, pathname: requested.spec }) } else { return url.format({ protocol: 'file', slashes: false, pathname: relativePath }) } } else { return requested.spec } } function getThingsToSave (tree) { validate('O', arguments) var toSave = tree.children.filter(function (child) { return child.save }).map(function (child) { return { name: moduleName(child), spec: computeVersionSpec(child), save: child.save } }) return toSave } function getThingsToRemove (args, tree) { validate('AO', arguments) if (!tree.removed) return [] var toRemove = tree.removed.map(function (child) { return { name: moduleName(child), save: child.save } }) var saveType = getSaveType(args) args.forEach(function (arg) { toRemove.push({ name: arg, save: saveType }) }) return toRemove } 
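The range logic in computeVersionSpec above is easy to check in isolation. A minimal sketch, assuming the semver package is installed; the real function additionally handles tag, directory, local, and other request types via npm's config:

'use strict'
var semver = require('semver')
// Registry installs are saved as save-prefix + version, unless the version
// is unparseable, is below 0.1.0, or --save-exact was requested.
function rangeFor (version, savePrefix, saveExact) {
  if (saveExact) return version
  if (!semver.valid(version, true)) return version
  if (!semver.gte(version, '0.1.0', true)) return version
  return savePrefix + version
}
console.log(rangeFor('1.2.3', '^', false)) // => '^1.2.3'
console.log(rangeFor('1.2.3', '~', false)) // => '~1.2.3'
console.log(rangeFor('1.2.3', '^', true))  // => '1.2.3'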
npm_3.5.2.orig/lib/install/update-package-json.js0000644000000000000000000000147512631326456020075 0ustar 00000000000000'use strict' var path = require('path') var writeFileAtomic = require('write-file-atomic') var deepSortObject = require('../utils/deep-sort-object.js') module.exports = function (pkg, buildpath, next) { // FIXME: This bundled dance is because we're sticking a big tree of bundled // deps into the parsed package.json– it probably doesn't belong there =/ // But the real reason we don't just dump it out is that it's the result // of npm-read-tree, which produces circular data structures, due to the // parent and children keys. var bundled = pkg.package._bundled delete pkg.package._bundled // FIXME var packagejson = deepSortObject(pkg.package) var data = JSON.stringify(packagejson, null, 2) + '\n' pkg.package._bundled = bundled writeFileAtomic(path.resolve(buildpath, 'package.json'), data, next) } npm_3.5.2.orig/lib/install/validate-args.js0000644000000000000000000000253012631326456016767 0ustar 00000000000000'use strict' var validate = require('aproba') var asyncMap = require('slide').asyncMap var chain = require('slide').chain var npmInstallChecks = require('npm-install-checks') var checkEngine = npmInstallChecks.checkEngine var checkPlatform = npmInstallChecks.checkPlatform var npm = require('../npm.js') module.exports = function (idealTree, args, next) { validate('OAF', arguments) var force = npm.config.get('force') asyncMap(args, function (pkg, done) { chain([ [checkSelf, idealTree, pkg, force], [isInstallable, pkg] ], done) }, next) } var isInstallable = module.exports.isInstallable = function (pkg, next) { var force = npm.config.get('force') var nodeVersion = npm.config.get('node-version') var strict = npm.config.get('engine-strict') chain([ [checkEngine, pkg, npm.version, nodeVersion, force, strict], [checkPlatform, pkg, force] ], next) } function checkSelf (idealTree, pkg, force, next) { if (idealTree.package && idealTree.package.name !== pkg.name) return next() if (force) { var warn = new Error("Wouldn't install " + pkg.name + ' as a dependency of itself, but being forced') warn.code = 'ENOSELF' idealTree.warnings.push(warn) next() } else { var er = new Error('Refusing to install ' + pkg.name + ' as a dependency of itself') er.code = 'ENOSELF' next(er) } } npm_3.5.2.orig/lib/install/validate-tree.js0000644000000000000000000000440212631326456016772 0ustar 00000000000000'use strict' var path = require('path') var validate = require('aproba') var asyncMap = require('slide').asyncMap var chain = require('slide').chain var npmInstallChecks = require('npm-install-checks') var checkGit = npmInstallChecks.checkGit var clone = require('lodash.clonedeep') var normalizePackageData = require('normalize-package-data') var npm = require('../npm.js') var andFinishTracker = require('./and-finish-tracker.js') var flattenTree = require('./flatten-tree.js') var validateAllPeerDeps = require('./deps.js').validateAllPeerDeps var packageId = require('../utils/package-id.js') module.exports = function (idealTree, log, next) { validate('OOF', arguments) var moduleMap = flattenTree(idealTree) var modules = Object.keys(moduleMap).map(function (name) { return moduleMap[name] }) chain([ [asyncMap, modules, function (mod, done) { chain([ mod.parent && !mod.isLink && [checkGit, mod.realpath], [checkErrors, mod, idealTree] ], done) }], [thenValidateAllPeerDeps, idealTree], [thenCheckTop, idealTree] ], andFinishTracker(log, next)) } function checkErrors (mod, idealTree, next) { if (mod.error && 
(mod.parent || path.resolve(npm.globalDir, '..') !== mod.path)) idealTree.warnings.push(mod.error) next() } function thenValidateAllPeerDeps (idealTree, next) { validate('OF', arguments) validateAllPeerDeps(idealTree, function (tree, pkgname, version) { var warn = new Error(packageId(tree) + ' requires a peer of ' + pkgname + '@' + version + ' but none was installed.') warn.code = 'EPEERINVALID' idealTree.warnings.push(warn) }) next() } function thenCheckTop (idealTree, next) { validate('OF', arguments) if (idealTree.package.error) return next() // FIXME: when we replace read-package-json with something less magic, // this should done elsewhere. // As it is, the package has already been normalized and thus some // errors are suppressed. var pkg = clone(idealTree.package) try { normalizePackageData(pkg, function (warn) { var warnObj = new Error(packageId(idealTree) + ' ' + warn) warnObj.code = 'EPACKAGEJSON' idealTree.warnings.push(warnObj) }, false) } catch (er) { er.code = 'EPACKAGEJSON' idealTree.warnings.push(er) } next() } npm_3.5.2.orig/lib/install/writable.js0000644000000000000000000000200312631326456016050 0ustar 00000000000000'use strict' var path = require('path') var fs = require('fs') var inflight = require('inflight') var accessError = require('./access-error.js') var andIgnoreErrors = require('./and-ignore-errors.js') var isFsAccessAvailable = require('./is-fs-access-available.js') if (isFsAccessAvailable) { module.exports = fsAccessImplementation } else { module.exports = fsOpenImplementation } // exposed only for testing purposes module.exports.fsAccessImplementation = fsAccessImplementation module.exports.fsOpenImplementation = fsOpenImplementation function fsAccessImplementation (dir, done) { done = inflight('writable:' + dir, done) if (!done) return fs.access(dir, fs.W_OK, done) } function fsOpenImplementation (dir, done) { done = inflight('writable:' + dir, done) if (!done) return var tmp = path.join(dir, '.npm.check.permissions') fs.open(tmp, 'w', function (er, fd) { if (er) return done(accessError(dir, er)) fs.close(fd, function () { fs.unlink(tmp, andIgnoreErrors(done)) }) }) } npm_3.5.2.orig/lib/install/action/build.js0000644000000000000000000000065112631326456016622 0ustar 00000000000000'use strict' var chain = require('slide').chain var build = require('../../build.js') var npm = require('../../npm.js') var packageId = require('../../utils/package-id.js') module.exports = function (top, buildpath, pkg, log, next) { log.silly('build', packageId(pkg)) chain([ [build.linkStuff, pkg.package, pkg.path, npm.config.get('global'), true], [build.writeBuiltinConf, pkg.package, pkg.path] ], next) } npm_3.5.2.orig/lib/install/action/extract.js0000644000000000000000000000524312631326456017177 0ustar 00000000000000'use strict' var path = require('path') var iferr = require('iferr') var asyncMap = require('slide').asyncMap var fs = require('graceful-fs') var rename = require('../../utils/rename.js') var gentlyRm = require('../../utils/gently-rm.js') var updatePackageJson = require('../update-package-json') var npm = require('../../npm.js') var moduleName = require('../../utils/module-name.js') var packageId = require('../../utils/package-id.js') var cache = require('../../cache.js') var buildPath = require('../build-path.js') module.exports = function (top, buildpath, pkg, log, next) { log.silly('extract', packageId(pkg)) var up = npm.config.get('unsafe-perm') var user = up ? null : npm.config.get('user') var group = up ? 
null : npm.config.get('group') cache.unpack(pkg.package.name, pkg.package.version, buildpath, null, null, user, group, andUpdatePackageJson(pkg, buildpath, andStageBundledChildren(pkg, buildpath, log, next))) } function andUpdatePackageJson (pkg, buildpath, next) { return iferr(next, function () { updatePackageJson(pkg, buildpath, next) }) } function andStageBundledChildren (pkg, buildpath, log, next) { var staging = path.resolve(buildpath, '..') return iferr(next, function () { asyncMap(pkg.children, andStageBundledModule(pkg, staging, buildpath), cleanupBundled) }) function cleanupBundled () { gentlyRm(path.join(buildpath, 'node_modules'), next) } } function andStageBundledModule (bundler, staging, parentPath) { return function (child, next) { stageBundledModule(bundler, child, staging, parentPath, next) } } function getTree (pkg) { while (pkg.parent) pkg = pkg.parent return pkg } function warn (pkg, code, msg) { var tree = getTree(pkg) var err = new Error(msg) err.code = code tree.warnings.push(err) } function stageBundledModule (bundler, child, staging, parentPath, next) { var stageFrom = path.join(parentPath, 'node_modules', child.package.name) var stageTo = buildPath(staging, child) asyncMap(child.children, andStageBundledModule(bundler, staging, stageFrom), iferr(next, moveModule)) function moveModule () { if (child.fromBundle) { return rename(stageFrom, stageTo, iferr(next, updateMovedPackageJson)) } else { return fs.stat(stageFrom, function (notExists, exists) { if (exists) { warn(bundler, 'EBUNDLEOVERRIDE', 'In ' + packageId(bundler) + ' replacing bundled version of ' + moduleName(child) + ' with ' + packageId(child)) return gentlyRm(stageFrom, next) } else { return next() } }) } } function updateMovedPackageJson () { updatePackageJson(child, stageTo, next) } } npm_3.5.2.orig/lib/install/action/fetch.js0000644000000000000000000000156212631326456016616 0ustar 00000000000000'use strict' // var cache = require('../../cache.js') // var packageId = require('../../utils/package-id.js') // var moduleName = require('../../utils/module-name.js') module.exports = function (top, buildpath, pkg, log, next) { next() /* // FIXME: Unnecessary as long as we have to have the tarball to resolve all deps, which // is progressively seeming to be likely for the indefinite future. // ALSO fails for local deps specified with relative URLs outside of the top level. var name = moduleName(pkg) var version switch (pkg.package._requested.type) { case 'version': case 'range': version = pkg.package.version break case 'hosted': name = name + '@' + pkg.package._requested.spec break default: name = pkg.package._requested.raw } log.silly('fetch', packageId(pkg)) cache.add(name, version, top, false, next) */ } npm_3.5.2.orig/lib/install/action/finalize.js0000644000000000000000000000536412631326456017332 0ustar 00000000000000'use strict' var path = require('path') var rimraf = require('rimraf') var fs = require('graceful-fs') var mkdirp = require('mkdirp') var asyncMap = require('slide').asyncMap var rename = require('../../utils/rename.js') module.exports = function (top, buildpath, pkg, log, next) { log.silly('finalize', pkg.path) var delpath = path.join(path.dirname(pkg.path), '.' + path.basename(pkg.path) + '.DELETE') mkdirp(path.resolve(pkg.path, '..'), whenParentExists) function whenParentExists (mkdirEr) { if (mkdirEr) return next(mkdirEr) // We stat first, because we can't rely on ENOTEMPTY from Windows. // Windows, by contrast, gives the generic EPERM of a folder already exists. 
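    // Because those error codes are ambiguous, we lstat up front: if the
    // target doesn't exist yet we rename straight into place, and only when
    // something is already there do we fall back to the move-away dance below.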
fs.lstat(pkg.path, destStatted) } function destStatted (doesNotExist) { if (doesNotExist) { rename(buildpath, pkg.path, whenMoved) } else { moveAway() } } function whenMoved (renameEr) { if (!renameEr) return next() if (renameEr.code !== 'ENOTEMPTY') return next(renameEr) moveAway() } function moveAway () { rename(pkg.path, delpath, whenOldMovedAway) } function whenOldMovedAway (renameEr) { if (renameEr) return next(renameEr) rename(buildpath, pkg.path, whenConflictMoved) } function whenConflictMoved (renameEr) { // if we got an error we'll try to put back the original module back, // succeed or fail though we want the original error that caused this if (renameEr) return rename(delpath, pkg.path, function () { next(renameEr) }) fs.readdir(path.join(delpath, 'node_modules'), makeTarget) } function makeTarget (readdirEr, files) { if (readdirEr) return cleanup() if (!files.length) return cleanup() mkdirp(path.join(pkg.path, 'node_modules'), function (mkdirEr) { moveModules(mkdirEr, files) }) } function moveModules (mkdirEr, files) { if (mkdirEr) return next(mkdirEr) asyncMap(files, function (file, done) { var from = path.join(delpath, 'node_modules', file) var to = path.join(pkg.path, 'node_modules', file) rename(from, to, done) }, cleanup) } function cleanup (moveEr) { if (moveEr) return next(moveEr) rimraf(delpath, afterCleanup) } function afterCleanup (rimrafEr) { if (rimrafEr) log.warn('finalize', rimrafEr) next() } } module.exports.rollback = function (buildpath, pkg, next) { var top = path.resolve(buildpath, '..') rimraf(pkg.path, function () { removeEmptyParents(pkg.path) }) function removeEmptyParents (pkgdir) { if (path.relative(top, pkgdir)[0] === '.') return next() fs.rmdir(pkgdir, function (er) { // FIXME: Make sure windows does what we want here if (er && er.code !== 'ENOENT') return next() removeEmptyParents(path.resolve(pkgdir, '..')) }) } } npm_3.5.2.orig/lib/install/action/global-install.js0000644000000000000000000000113612631326456020426 0ustar 00000000000000'use strict' var path = require('path') var npm = require('../../npm.js') var Installer = require('../../install.js').Installer var packageId = require('../../utils/package-id.js') module.exports = function (top, buildpath, pkg, log, next) { log.silly('global-install', packageId(pkg)) var globalRoot = path.resolve(npm.globalDir, '..') npm.config.set('global', true) var install = new Installer(globalRoot, false, [pkg.package.name + '@' + pkg.package._requested.spec]) install.link = false install.run(function () { npm.config.set('global', false) next.apply(null, arguments) }) } npm_3.5.2.orig/lib/install/action/global-link.js0000644000000000000000000000036212631326456017715 0ustar 00000000000000'use strict' var npm = require('../../npm.js') var packageId = require('../../utils/package-id.js') module.exports = function (top, buildpath, pkg, log, next) { log.silly('global-link', packageId(pkg)) npm.link(pkg.package.name, next) } npm_3.5.2.orig/lib/install/action/install.js0000644000000000000000000000045212631326456017170 0ustar 00000000000000'use strict' var lifecycle = require('../../utils/lifecycle.js') var packageId = require('../../utils/package-id.js') module.exports = function (top, buildpath, pkg, log, next) { log.silly('install', packageId(pkg), buildpath) lifecycle(pkg.package, 'install', pkg.path, false, false, next) } npm_3.5.2.orig/lib/install/action/move.js0000644000000000000000000000624712631326456016500 0ustar 00000000000000'use strict' var fs = require('graceful-fs') var path = require('path') var chain = 
require('slide').chain var iferr = require('iferr') var rimraf = require('rimraf') var mkdirp = require('mkdirp') var rmStuff = require('../../unbuild.js').rmStuff var lifecycle = require('../../utils/lifecycle.js') var updatePackageJson = require('../update-package-json.js') var rename = require('../../utils/rename.js') /* Move a module from one point in the node_modules tree to another. Do not disturb either the source or target location's node_modules folders. */ module.exports = function (top, buildpath, pkg, log, next) { log.silly('move', pkg.fromPath, pkg.path) chain([ [lifecycle, pkg.package, 'preuninstall', pkg.fromPath, false, true], [lifecycle, pkg.package, 'uninstall', pkg.fromPath, false, true], [rmStuff, pkg.package, pkg.fromPath], [lifecycle, pkg.package, 'postuninstall', pkg.fromPath, false, true], [moveModuleOnly, pkg.fromPath, pkg.path, log], [lifecycle, pkg.package, 'preinstall', pkg.path, false, true], [removeEmptyParents, path.resolve(pkg.fromPath, '..')], [updatePackageJson, pkg, pkg.path] ], next) } function removeEmptyParents (pkgdir, next) { fs.rmdir(pkgdir, function (er) { // FIXME: Make sure windows does what we want here if (er && er.code !== 'ENOENT') return next() removeEmptyParents(path.resolve(pkgdir, '..'), next) }) } function moveModuleOnly (from, to, log, done) { var fromModules = path.join(from, 'node_modules') var tempFromModules = from + '.node_modules' var toModules = path.join(to, 'node_modules') var tempToModules = to + '.node_modules' log.silly('move', 'move existing destination node_modules away', toModules) rename(toModules, tempToModules, removeDestination(done)) function removeDestination (next) { return function (er) { log.silly('move', 'remove existing destination', to) if (er) { rimraf(to, iferr(next, makeDestination(next))) } else { rimraf(to, iferr(next, makeDestination(iferr(next, moveToModulesBack(next))))) } } } function moveToModulesBack (next) { return function () { log.silly('move', 'move existing destination node_modules back', toModules) rename(tempToModules, toModules, iferr(done, next)) } } function makeDestination (next) { return function () { log.silly('move', 'make sure destination parent exists', path.resolve(to, '..')) mkdirp(path.resolve(to, '..'), iferr(done, moveNodeModules(next))) } } function moveNodeModules (next) { return function () { log.silly('move', 'move source node_modules away', fromModules) rename(fromModules, tempFromModules, iferr(doMove(next), doMove(moveNodeModulesBack(next)))) } } function doMove (next) { return function () { log.silly('move', 'move module dir to final dest', from, to) rename(from, to, iferr(done, next)) } } function moveNodeModulesBack (next) { return function () { mkdirp(from, iferr(done, function () { log.silly('move', 'put source node_modules back', fromModules) rename(tempFromModules, fromModules, iferr(done, next)) })) } } } npm_3.5.2.orig/lib/install/action/postinstall.js0000644000000000000000000000046212631326456020077 0ustar 00000000000000'use strict' var lifecycle = require('../../utils/lifecycle.js') var packageId = require('../../utils/package-id.js') module.exports = function (top, buildpath, pkg, log, next) { log.silly('postinstall', packageId(pkg), buildpath) lifecycle(pkg.package, 'postinstall', pkg.path, false, false, next) } npm_3.5.2.orig/lib/install/action/preinstall.js0000644000000000000000000000046112631326456017677 0ustar 00000000000000'use strict' var lifecycle = require('../../utils/lifecycle.js') var packageId = require('../../utils/package-id.js') module.exports 
= function (top, buildpath, pkg, log, next) {
  log.silly('preinstall', packageId(pkg), buildpath)
  lifecycle(pkg.package, 'preinstall', buildpath, false, false, next)
}
npm_3.5.2.orig/lib/install/action/prepublish.js0000644000000000000000000000046112631326456017677 0ustar 00000000000000
'use strict'
var lifecycle = require('../../utils/lifecycle.js')
var packageId = require('../../utils/package-id.js')
module.exports = function (top, buildpath, pkg, log, next) {
  log.silly('prepublish', packageId(pkg), buildpath)
  lifecycle(pkg.package, 'prepublish', buildpath, false, false, next)
}
npm_3.5.2.orig/lib/install/action/remove.js0000644000000000000000000000465312631326456017026 0ustar 00000000000000
'use strict'

var path = require('path')
var fs = require('graceful-fs')
var rimraf = require('rimraf')
var asyncMap = require('slide').asyncMap
var mkdirp = require('mkdirp')
var npm = require('../../npm.js')
var andIgnoreErrors = require('../and-ignore-errors.js')
var rename = require('../../utils/rename.js')

// This is weird because we want to remove the module but not its node_modules folder;
// allowing for this lets us not worry about the order of operations
module.exports = function (top, buildpath, pkg, log, next) {
  log.silly('remove', pkg.path)
  if (pkg.target) {
    removeLink(pkg, next)
  } else {
    removeDir(pkg, log, next)
  }
}

function removeLink (pkg, next) {
  npm.commands.unbuild(pkg.path, true, next)
}

function removeDir (pkg, log, next) {
  var modpath = path.join(path.dirname(pkg.path), '.' + path.basename(pkg.path) + '.MODULES')

  rename(path.join(pkg.path, 'node_modules'), modpath, unbuildPackage)

  function unbuildPackage (renameEr) {
    npm.commands.unbuild(pkg.path, true, function () {
      rimraf(pkg.path, renameEr ? andRemoveEmptyParents(pkg.path) : moveModulesBack)
    })
  }

  function andRemoveEmptyParents (path) {
    return function (er) {
      if (er) return next(er)
      removeEmptyParents(pkg.path)
    }
  }

  function moveModulesBack () {
    fs.readdir(modpath, makeTarget)
  }

  function makeTarget (readdirEr, files) {
    if (readdirEr) return cleanup()
    if (!files.length) return cleanup()
    mkdirp(path.join(pkg.path, 'node_modules'), function (mkdirEr) { moveModules(mkdirEr, files) })
  }

  function moveModules (mkdirEr, files) {
    if (mkdirEr) return next(mkdirEr)
    asyncMap(files, function (file, done) {
      var from = path.join(modpath, file)
      var to = path.join(pkg.path, 'node_modules', file)
      // we ignore errors here, because they can legitimately happen, for instance,
      // bundled modules will be in both node_modules folders
      rename(from, to, andIgnoreErrors(done))
    }, cleanup)
  }

  function cleanup () {
    rimraf(modpath, afterCleanup)
  }

  function afterCleanup (rimrafEr) {
    if (rimrafEr) log.warn('remove', rimrafEr)
    removeEmptyParents(path.resolve(pkg.path, '..'))
  }

  function removeEmptyParents (pkgdir) {
    fs.rmdir(pkgdir, function (er) {
      // FIXME: Make sure windows does what we want here
      if (er && er.code !== 'ENOENT') return next()
      removeEmptyParents(path.resolve(pkgdir, '..'))
    })
  }
}
npm_3.5.2.orig/lib/install/action/test.js0000644000000000000000000000044512631326456016503 0ustar 00000000000000
'use strict'
var lifecycle = require('../../utils/lifecycle.js')
var packageId = require('../../utils/package-id.js')
module.exports = function (top, buildpath, pkg, log, next) {
  log.silly('test', packageId(pkg), buildpath)
  lifecycle(pkg.package, 'test', buildpath, false, false, next)
}
npm_3.5.2.orig/lib/install/action/update-linked.js0000644000000000000000000000044012631326456020245 0ustar 00000000000000
'use strict'
var path = require('path')
module.exports
= function (top, buildpath, pkg, log, next) { log.warn('update-linked', path.relative(top, pkg.path), 'needs updating to', pkg.package.version, 'from', pkg.oldPkg.package.version, "but we can't, as it's a symlink") next() } npm_3.5.2.orig/lib/utils/child-path.js0000644000000000000000000000044212631326456015753 0ustar 00000000000000'use strict' var path = require('path') var validate = require('aproba') var moduleName = require('../utils/module-name.js') module.exports = childPath function childPath (parentPath, child) { validate('SO', arguments) return path.join(parentPath, 'node_modules', moduleName(child)) } npm_3.5.2.orig/lib/utils/completion/0000755000000000000000000000000012631326456015551 5ustar 00000000000000npm_3.5.2.orig/lib/utils/completion.sh0000755000000000000000000000317712631326456016120 0ustar 00000000000000#!/bin/bash ###-begin-npm-completion-### # # npm command completion script # # Installation: npm completion >> ~/.bashrc (or ~/.zshrc) # Or, maybe: npm completion > /usr/local/etc/bash_completion.d/npm # if type complete &>/dev/null; then _npm_completion () { local words cword if type _get_comp_words_by_ref &>/dev/null; then _get_comp_words_by_ref -n = -n @ -w words -i cword else cword="$COMP_CWORD" words=("${COMP_WORDS[@]}") fi local si="$IFS" IFS=$'\n' COMPREPLY=($(COMP_CWORD="$cword" \ COMP_LINE="$COMP_LINE" \ COMP_POINT="$COMP_POINT" \ npm completion -- "${words[@]}" \ 2>/dev/null)) || return $? IFS="$si" } complete -o default -F _npm_completion npm elif type compdef &>/dev/null; then _npm_completion() { local si=$IFS compadd -- $(COMP_CWORD=$((CURRENT-1)) \ COMP_LINE=$BUFFER \ COMP_POINT=0 \ npm completion -- "${words[@]}" \ 2>/dev/null) IFS=$si } compdef _npm_completion npm elif type compctl &>/dev/null; then _npm_completion () { local cword line point words si read -Ac words read -cn cword let cword-=1 read -l line read -ln point si="$IFS" IFS=$'\n' reply=($(COMP_CWORD="$cword" \ COMP_LINE="$line" \ COMP_POINT="$point" \ npm completion -- "${words[@]}" \ 2>/dev/null)) || return $? 
IFS="$si" } compctl -K _npm_completion npm fi ###-end-npm-completion-### npm_3.5.2.orig/lib/utils/correct-mkdir.js0000644000000000000000000000530012631326456016501 0ustar 00000000000000var chownr = require('chownr') var dezalgo = require('dezalgo') var fs = require('graceful-fs') var inflight = require('inflight') var log = require('npmlog') var mkdirp = require('mkdirp') // memoize the directories created by this step var stats = {} var effectiveOwner module.exports = function correctMkdir (path, cb) { cb = dezalgo(cb) if (stats[path]) return cb(null, stats[path]) fs.stat(path, function (er, st) { if (er) return makeDirectory(path, cb) if (!st.isDirectory()) { log.error('correctMkdir', 'invalid dir %s', path) return cb(er) } var ownerStats = calculateOwner() // there's always a chance the permissions could have been frobbed, so fix if (st.uid !== ownerStats.uid) { stats[path] = ownerStats setPermissions(path, ownerStats, cb) } else { stats[path] = st cb(null, stats[path]) } }) } function calculateOwner () { if (!effectiveOwner) { effectiveOwner = { uid: 0, gid: 0 } if (process.getuid) effectiveOwner.uid = +process.getuid() if (process.getgid) effectiveOwner.gid = +process.getgid() if (effectiveOwner.uid === 0) { if (process.env.SUDO_UID) effectiveOwner.uid = +process.env.SUDO_UID if (process.env.SUDO_GID) effectiveOwner.gid = +process.env.SUDO_GID } } return effectiveOwner } function makeDirectory (path, cb) { cb = inflight('makeDirectory:' + path, cb) if (!cb) { return log.verbose('makeDirectory', path, 'creation already in flight; waiting') } else { log.verbose('makeDirectory', path, 'creation not in flight; initializing') } var owner = calculateOwner() if (!process.getuid) { return mkdirp(path, function (er) { log.verbose('makeCacheDir', 'UID & GID are irrelevant on', process.platform) stats[path] = owner return cb(er, stats[path]) }) } if (owner.uid !== 0 || !process.env.HOME) { log.silly( 'makeDirectory', path, 'uid:', owner.uid, 'gid:', owner.gid ) stats[path] = owner mkdirp(path, afterMkdir) } else { fs.stat(process.env.HOME, function (er, st) { if (er) { log.error('makeDirectory', 'homeless?') return cb(er) } log.silly( 'makeDirectory', path, 'uid:', st.uid, 'gid:', st.gid ) stats[path] = st mkdirp(path, afterMkdir) }) } function afterMkdir (er, made) { if (er || !stats[path] || isNaN(stats[path].uid) || isNaN(stats[path].gid)) { return cb(er, stats[path]) } if (!made) return cb(er, stats[path]) setPermissions(made, stats[path], cb) } } function setPermissions (path, st, cb) { chownr(path, st.uid, st.gid, function (er) { return cb(er, st) }) } npm_3.5.2.orig/lib/utils/deep-sort-object.js0000644000000000000000000000055212631326456017106 0ustar 00000000000000'use strict' var sortedObject = require('sorted-object') module.exports = function deepSortObject (obj, sortBy) { if (obj == null || typeof obj !== 'object') return obj if (obj instanceof Array) return obj.sort(sortBy) obj = sortedObject(obj) Object.keys(obj).forEach(function (key) { obj[key] = deepSortObject(obj[key], sortBy) }) return obj } npm_3.5.2.orig/lib/utils/depr-check.js0000644000000000000000000000060612631326456015745 0ustar 00000000000000var log = require('npmlog') var deprecated = {} var deprWarned = {} module.exports = function deprCheck (data) { if (deprecated[data._id]) data.deprecated = deprecated[data._id] if (data.deprecated) deprecated[data._id] = data.deprecated else return if (!deprWarned[data._id]) { deprWarned[data._id] = true log.warn('deprecated', '%s: %s', data._id, data.deprecated) } } 
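The deep-sort-object helper above normalizes key order recursively so that serialized JSON (such as the package.json writes in lib/install/save.js and update-package-json.js) comes out stable. A dependency-free sketch of the same behavior, for illustration only; the real helper delegates to the sorted-object package and also accepts a comparator for arrays:

'use strict'
// Recursively rebuild an object with its keys in sorted order.
function deepSortSketch (obj) {
  if (obj == null || typeof obj !== 'object') return obj
  if (Array.isArray(obj)) return obj.slice().sort()
  var out = {}
  Object.keys(obj).sort().forEach(function (key) {
    out[key] = deepSortSketch(obj[key])
  })
  return out
}
console.log(JSON.stringify(deepSortSketch({ b: 1, a: { d: 2, c: 3 } })))
// => {"a":{"c":3,"d":2},"b":1}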
npm_3.5.2.orig/lib/utils/error-handler.js0000644000000000000000000001173612631326456016512 0ustar 00000000000000 module.exports = errorHandler var cbCalled = false var log = require('npmlog') var npm = require('../npm.js') var rm = require('rimraf') var itWorked = false var path = require('path') var wroteLogFile = false var exitCode = 0 var rollbacks = npm.rollbacks var chain = require('slide').chain var writeStream = require('fs-write-stream-atomic') var errorMessage = require('./error-message.js') process.on('exit', function (code) { log.disableProgress() if (!npm.config || !npm.config.loaded) return if (code) itWorked = false if (itWorked) log.info('ok') else { if (!cbCalled) { log.error('', 'cb() never called!') } if (wroteLogFile) { // just a line break if (log.levels[log.level] <= log.levels.error) console.error('') log.error( '', [ 'Please include the following file with any support request:', ' ' + path.resolve('npm-debug.log') ].join('\n') ) wroteLogFile = false } if (code) { log.error('code', code) } } var doExit = npm.config.get('_exit') if (doExit) { // actually exit. if (exitCode === 0 && !itWorked) { exitCode = 1 } if (exitCode !== 0) process.exit(exitCode) } else { itWorked = false // ready for next exit } }) function exit (code, noLog) { exitCode = exitCode || process.exitCode || code var doExit = npm.config ? npm.config.get('_exit') : true log.verbose('exit', [code, doExit]) if (log.level === 'silent') noLog = true if (rollbacks.length) { chain(rollbacks.map(function (f) { return function (cb) { npm.commands.unbuild([f], true, cb) } }), function (er) { if (er) { log.error('error rolling back', er) if (!code) errorHandler(er) else if (noLog) rm('npm-debug.log', reallyExit.bind(null, er)) else writeLogFile(reallyExit.bind(this, er)) } else { if (!noLog && code) writeLogFile(reallyExit) else rm('npm-debug.log', reallyExit) } }) rollbacks.length = 0 } else if (code && !noLog) writeLogFile(reallyExit) else rm('npm-debug.log', reallyExit) function reallyExit (er) { if (er && !code) code = typeof er.errno === 'number' ? er.errno : 1 // truncate once it's been written. log.record.length = 0 itWorked = !code // just emit a fake exit event. // if we're really exiting, then let it exit on its own, so that // in-process stuff can finish or clean up first. 
if (!doExit) process.emit('exit', code) } } function errorHandler (er) { log.disableProgress() // console.error('errorHandler', er) if (!npm.config || !npm.config.loaded) { // logging won't work unless we pretend that it's ready er = er || new Error('Exit prior to config file resolving.') console.error(er.stack || er.message) } if (cbCalled) { er = er || new Error('Callback called more than once.') } cbCalled = true if (!er) return exit(0) if (typeof er === 'string') { log.error('', er) return exit(1, true) } else if (!(er instanceof Error)) { log.error('weird error', er) return exit(1, true) } var m = er.code || er.message.match(/^(?:Error: )?(E[A-Z]+)/) if (m && !er.code) { er.code = m } ;[ 'type', 'fstream_path', 'fstream_unc_path', 'fstream_type', 'fstream_class', 'fstream_finish_call', 'fstream_linkpath', 'stack', 'fstream_stack', 'statusCode', 'pkgid' ].forEach(function (k) { var v = er[k] if (!v) return if (k === 'fstream_stack') v = v.join('\n') log.verbose(k, v) }) log.verbose('cwd', process.cwd()) var os = require('os') // log.error('System', os.type() + ' ' + os.release()) // log.error('command', process.argv.map(JSON.stringify).join(' ')) // log.error('node -v', process.version) // log.error('npm -v', npm.version) log.error('', os.type() + ' ' + os.release()) log.error('argv', process.argv.map(JSON.stringify).join(' ')) log.error('node', process.version) log.error('npm ', 'v' + npm.version) ;[ 'file', 'path', 'code', 'errno', 'syscall' ].forEach(function (k) { var v = er[k] if (v) log.error(k, v) }) // just a line break if (log.levels[log.level] <= log.levels.error) console.error('') var msg = errorMessage(er) msg.summary.concat(msg.detail).forEach(function (errline) { log.error.apply(log, errline) }) exit(typeof er.errno === 'number' ? er.errno : 1) } var writingLogFile = false function writeLogFile (cb) { if (writingLogFile) return cb() writingLogFile = true wroteLogFile = true var fstr = writeStream('npm-debug.log') var os = require('os') var out = '' log.record.forEach(function (m) { var pref = [m.id, m.level] if (m.prefix) pref.push(m.prefix) pref = pref.join(' ') m.message.trim().split(/\r?\n/).map(function (line) { return (pref + ' ' + line).trim() }).forEach(function (line) { out += line + os.EOL }) }) fstr.end(out) fstr.on('close', cb) } npm_3.5.2.orig/lib/utils/error-message.js0000644000000000000000000002255412631326456016521 0ustar 00000000000000'use strict' var npm = require('../npm.js') var util = require('util') var nameValidator = require('validate-npm-package-name') module.exports = errorMessage function errorMessage (er) { var short = [] var detail = [] if (er.optional) { short.push(['optional', 'Skipping failed optional dependency ' + er.optional + ':']) } switch (er.code) { case 'ECONNREFUSED': short.push(['', er]) detail.push([ '', [ '\nIf you are behind a proxy, please make sure that the', "'proxy' config is set properly. See: 'npm help config'" ].join('\n') ]) break case 'EACCES': case 'EPERM': short.push(['', er]) detail.push(['', ['\nPlease try running this command again as root/Administrator.' 
].join('\n')]) break case 'ELIFECYCLE': short.push(['', er.message]) detail.push([ '', [ '', 'Failed at the ' + er.pkgid + ' ' + er.stage + " script '" + er.script + "'.", 'Make sure you have the latest version of node.js and npm installed.', 'If you do, this is most likely a problem with the ' + er.pkgname + ' package,', 'not with npm itself.', 'Tell the author that this fails on your system:', ' ' + er.script, 'You can get information on how to open an issue for this project with:', ' npm bugs ' + er.pkgname, 'Or if that isn\'t available, you can get their info via:', ' npm owner ls ' + er.pkgname, 'There is likely additional logging output above.' ].join('\n')] ) break case 'ENOGIT': short.push(['', er.message]) detail.push([ '', [ '', 'Failed using git.', 'This is most likely not a problem with npm itself.', 'Please check if you have git installed and in your PATH.' ].join('\n') ]) break case 'EJSONPARSE': short.push(['', er.message]) short.push(['', 'File: ' + er.file]) detail.push([ '', [ 'Failed to parse package.json data.', 'package.json must be actual JSON, not just JavaScript.', '', 'This is not a bug in npm.', 'Tell the package author to fix their package.json file.' ].join('\n'), 'JSON.parse' ]) break // TODO(isaacs) // Add a special case here for E401 and E403 explaining auth issues? case 'E404': // There's no need to have 404 in the message as well. var msg = er.message.replace(/^404\s+/, '') short.push(['404', msg]) if (er.pkgid && er.pkgid !== '-') { detail.push(['404', '']) detail.push(['404', '', "'" + er.pkgid + "' is not in the npm registry."]) var valResult = nameValidator(er.pkgid) if (valResult.validForNewPackages) { detail.push(['404', 'You should bug the author to publish it (or use the name yourself!)']) } else { detail.push(['404', 'Your package name is not valid, because', '']) var errorsArray = (valResult.errors || []).concat(valResult.warnings || []) errorsArray.forEach(function (item, idx) { detail.push(['404', ' ' + (idx + 1) + '. ' + item]) }) } if (er.parent) { detail.push(['404', "It was specified as a dependency of '" + er.parent + "'"]) } detail.push(['404', '\nNote that you can also install from a']) detail.push(['404', 'tarball, folder, http url, or git url.']) } break case 'EPUBLISHCONFLICT': short.push(['publish fail', 'Cannot publish over existing version.']) detail.push(['publish fail', "Update the 'version' field in package.json and try again."]) detail.push(['publish fail', '']) detail.push(['publish fail', 'To automatically increment version numbers, see:']) detail.push(['publish fail', ' npm help version']) break case 'EISGIT': short.push(['git', er.message]) short.push(['git', ' ' + er.path]) detail.push([ 'git', [ 'Refusing to remove it. Update manually,', 'or move it out of the way first.' ].join('\n') ]) break case 'ECYCLE': short.push([ 'cycle', [ er.message, 'While installing: ' + er.pkgid ].join('\n') ]) detail.push([ 'cycle', [ 'Found a pathological dependency case that npm cannot solve.', 'Please report this to the package author.' ].join('\n') ]) break case 'EBADPLATFORM': short.push([ 'notsup', [ 'Not compatible with your operating system or architecture: ' + er.pkgid ].join('\n') ]) detail.push([ 'notsup', [ 'Valid OS: ' + (er.os.join ? er.os.join(',') : util.inspect(er.os)), 'Valid Arch: ' + (er.cpu.join ? 
er.cpu.join(',') : util.inspect(er.cpu)), 'Actual OS: ' + process.platform, 'Actual Arch: ' + process.arch ].join('\n') ]) break case 'EEXIST': short.push(['', er.message]) short.push(['', 'File exists: ' + er.path]) detail.push(['', 'Move it away, and try again.']) break case 'ENEEDAUTH': short.push(['need auth', er.message]) detail.push(['need auth', 'You need to authorize this machine using `npm adduser`']) break case 'ECONNRESET': case 'ENOTFOUND': case 'ETIMEDOUT': case 'EAI_FAIL': short.push(['network', er.message]) detail.push([ 'network', [ 'This is most likely not a problem with npm itself', 'and is related to network connectivity.', 'In most cases you are behind a proxy or have bad network settings.', '\nIf you are behind a proxy, please make sure that the', "'proxy' config is set properly. See: 'npm help config'" ].join('\n') ]) break case 'ENOPACKAGEJSON': short.push(['package.json', er.message]) detail.push([ 'package.json', [ 'This is most likely not a problem with npm itself.', "npm can't find a package.json file in your current directory." ].join('\n') ]) break case 'ETARGET': short.push(['notarget', er.message]) msg = [ 'This is most likely not a problem with npm itself.', 'In most cases you or one of your dependencies are requesting', "a package version that doesn't exist." ] if (er.parent) { msg.push("\nIt was specified as a dependency of '" + er.parent + "'\n") } detail.push(['notarget', msg.join('\n')]) break case 'ENOTSUP': if (er.required) { short.push(['notsup', er.message]) short.push(['notsup', 'Not compatible with your version of node/npm: ' + er.pkgid]) detail.push([ 'notsup', [ 'Not compatible with your version of node/npm: ' + er.pkgid, 'Required: ' + JSON.stringify(er.required), 'Actual: ' + JSON.stringify({ npm: npm.version, node: npm.config.get('node-version') }) ].join('\n') ]) break } // else passthrough /*eslint no-fallthrough:0*/ case 'ENOSPC': short.push(['nospc', er.message]) detail.push([ 'nospc', [ 'This is most likely not a problem with npm itself', 'and is related to insufficient space on your system.' ].join('\n') ]) break case 'EROFS': short.push(['rofs', er.message]) detail.push([ 'rofs', [ 'This is most likely not a problem with npm itself', 'and is related to the file system being read-only.', '\nOften virtualized file systems, or other file systems', "that don't support symlinks, give this error." ].join('\n') ]) break case 'ENOENT': short.push(['enoent', er.message]) detail.push([ 'enoent', [ er.message, 'This is most likely not a problem with npm itself', 'and is related to npm not being able to find a file.', er.file ? "\nCheck if the file '" + er.file + "' is present." : '' ].join('\n') ]) break case 'EMISSINGARG': case 'EUNKNOWNTYPE': case 'EINVALIDTYPE': case 'ETOOMANYARGS': short.push(['typeerror', er.stack]) detail.push([ 'typeerror', [ 'This is an error with npm itself. Please report this error at:', ' ' ].join('\n') ]) break case 'EISDIR': short.push(['eisdir', er.message]) detail.push([ 'eisdir', [ 'This is most likely not a problem with npm itself', 'and is related to npm not being able to find a package.json in', 'a package you are trying to install.' 
].join('\n') ]) break default: short.push(['', er.message || er]) detail.push([ '', [ '', 'If you need help, you may report this error at:', ' ' ].join('\n') ]) break } return {summary: short, detail: detail} } npm_3.5.2.orig/lib/utils/gently-rm.js0000644000000000000000000001752012631326456015661 0ustar 00000000000000// only remove the thing if it's a symlink into a specific folder. This is // a very common use-case of npm's, but not so common elsewhere. exports = module.exports = gentlyRm var resolve = require('path').resolve var dirname = require('path').dirname var normalize = require('path').normalize var validate = require('aproba') var log = require('npmlog') var lstat = require('graceful-fs').lstat var readlink = require('graceful-fs').readlink var isInside = require('path-is-inside') var vacuum = require('fs-vacuum') var chain = require('slide').chain var asyncMap = require('slide').asyncMap var readCmdShim = require('read-cmd-shim') var iferr = require('iferr') var npm = require('../npm.js') function gentlyRm (target, gently, base, cb) { if (!cb) { cb = base base = undefined } if (!cb) { cb = gently gently = false } log.silly( 'gentlyRm', target, 'is being', gently ? 'gently removed' : 'purged', base ? 'from base ' + base : '' ) // never rm the root, prefix, or bin dirs // // globals included because of `npm link` -- as far as the package // requesting the link is concerned, the linked package is always // installed globally var prefixes = [ npm.prefix, npm.globalPrefix, npm.dir, npm.root, npm.globalDir, npm.bin, npm.globalBin ] var targetPath = normalize(resolve(npm.prefix, target)) if (prefixes.indexOf(targetPath) !== -1) { log.verbose('gentlyRm', targetPath, "is part of npm and can't be removed") return cb(new Error('May not delete: ' + targetPath)) } var options = { log: log.silly.bind(log, 'vacuum-fs') } if (npm.config.get('force') || !gently) options.purge = true if (base) options.base = normalize(resolve(npm.prefix, base)) if (!gently) { log.verbose('gentlyRm', "don't care about contents; nuking", targetPath) return vacuum(targetPath, options, cb) } var parent = options.base = options.base || normalize(npm.prefix) // Do all the async work we'll need to do in order to tell if this is a // safe operation chain([ [isEverInside, parent, prefixes], [readLinkOrShim, targetPath], [isEverInside, targetPath, prefixes], [isEverInside, targetPath, [parent]] ], function (er, results) { if (er) { if (er.code === 'ENOENT') return cb() return cb(er) } var parentInfo = { path: parent, managed: results[0] } var targetInfo = { path: targetPath, symlink: results[1], managed: results[2], inParent: results[3] } isSafeToRm(parentInfo, targetInfo, iferr(cb, thenRemove)) function thenRemove (toRemove, removeBase) { if (!toRemove) return cb() if (removeBase) options.base = removeBase log.verbose('gentlyRm', options.purge ? 
'Purging' : 'Vacuuming', toRemove, 'up to', options.base) return vacuum(toRemove, options, cb) } }) } exports._isSafeToRm = isSafeToRm function isSafeToRm (parent, target, cb) { log.silly('gentlyRm', 'parent.path =', parent.path) log.silly('gentlyRm', 'parent.managed =', parent.managed && parent.managed.target + ' is in ' + parent.managed.path) log.silly('gentlyRm', 'target.path = ', target.path) log.silly('gentlyRm', 'target.symlink =', target.symlink) log.silly('gentlyRm', 'target.managed =', target.managed && target.managed.target + ' is in ' + target.managed.path) log.silly('gentlyRm', 'target.inParent = ', target.inParent) // The parent directory or something it symlinks to must eventually be in // a folder that npm maintains. if (!parent.managed) { log.verbose('gentlyRm', parent.path, 'is not contained in any directory npm is known to control or ' + 'any place they link to') return cb(clobberFail(target.path, 'containing path ' + parent.path + " isn't under npm's control")) } // The target or something it symlinks to must eventually be in the parent // or something the parent symlinks to if (target.inParent) { var actualTarget = target.inParent.target var targetsParent = target.inParent.path // if the target.path was what we found in some version of parent, remove // using that parent as the base if (target.path === actualTarget) { return cb(null, target.path, targetsParent) } else { // If something the target.path links to was what was found, just // remove target.path in the location it was found. return cb(null, target.path, dirname(target.path)) } } // If the target is in a managed directory and is in a symlink, but was // not in our parent that usually means someone else installed a bin file // with the same name as one of our bin files. if (target.managed && target.symlink) { log.warn('gentlyRm', 'not removing', target.path, "as it wasn't installed by", parent.path) return cb() } if (target.symlink) { return cb(clobberFail(target.path, target.symlink + ' symlink target is not controlled by npm ' + parent.path)) } else { return cb(clobberFail(target.path, 'is outside ' + parent.path + ' and not a link')) } } function clobberFail (target, msg) { validate('SS', arguments) var er = new Error('Refusing to delete ' + target + ': ' + msg) er.code = 'EEXIST' er.path = target return er } exports._isEverInside = isEverInside // return the first of paths where target (or anything it symlinks to) // isInside the path (or anything it symlinks to) function isEverInside (target, paths, cb) { validate('SAF', arguments) function skipENOENT (er) { if (er.code === 'ENOENT') return cb(null, false) return cb(er) } asyncMap(paths, readAllLinks, iferr(skipENOENT, function (resolvedPaths) { readAllLinks(target, iferr(skipENOENT, function (targets) { cb(null, areAnyInsideAny(targets, resolvedPaths)) })) })) } exports._areAnyInsideAny = areAnyInsideAny // Return the first path found that any target is inside function areAnyInsideAny (targets, paths) { validate('AA', arguments) var toCheck = [] paths.forEach(function (path) { targets.forEach(function (target) { toCheck.push([target, path]) }) }) for (var ii = 0; ii < toCheck.length; ++ii) { var target = toCheck[ii][0] var path = toCheck[ii][1] var inside = isInside(target, path) if (!inside) log.silly('isEverInside', target, 'is not inside', path) if (inside && path) return {target: target, path: path} } return false } exports._readAllLinks = readAllLinks // resolves chains of symlinks of unlimited depth, returning a list of paths //
it's seen in the process when it hits either a symlink cycle or a // non-symlink function readAllLinks (path, cb) { validate('SF', arguments) var seen = {} _readAllLinks(path) function _readAllLinks (path) { if (seen[path]) return cb(null, Object.keys(seen)) seen[path] = true resolveSymlink(path, iferr(cb, _readAllLinks)) } } exports._resolveSymlink = resolveSymlink var resolvedPaths = {} function resolveSymlink (symlink, cb) { validate('SF', arguments) var cached = resolvedPaths[symlink] if (cached) return cb(null, cached) readLinkOrShim(symlink, iferr(cb, function (symlinkTarget) { if (symlinkTarget) { resolvedPaths[symlink] = resolve(dirname(symlink), symlinkTarget) } else { resolvedPaths[symlink] = symlink } return cb(null, resolvedPaths[symlink]) })) } exports._readLinkOrShim = readLinkOrShim function readLinkOrShim (path, cb) { validate('SF', arguments) lstat(path, iferr(cb, function (stat) { if (stat.isSymbolicLink()) { readlink(path, cb) } else { readCmdShim(path, function (er, source) { if (!er) return cb(null, source) // lstat wouldn't return an error on these, so we don't either. if (er.code === 'ENOTASHIM' || er.code === 'EISDIR') { return cb(null, null) } else { return cb(er) } }) } })) } npm_3.5.2.orig/lib/utils/get-publish-config.js0000644000000000000000000000144712631326456017432 0ustar 00000000000000var Conf = require('../config/core.js').Conf var CachingRegClient = require('../cache/caching-client.js') var log = require('npmlog') module.exports = getPublishConfig function getPublishConfig (publishConfig, defaultConfig, defaultClient) { var config = defaultConfig var client = defaultClient log.verbose('getPublishConfig', publishConfig) if (publishConfig) { config = new Conf(defaultConfig) config.save = defaultConfig.save.bind(defaultConfig) // don't modify the actual publishConfig object, in case we have // to set a login token or some other data. config.unshift(Object.keys(publishConfig).reduce(function (s, k) { s[k] = publishConfig[k] return s }, {})) client = new CachingRegClient(config) } return { config: config, client: client } } npm_3.5.2.orig/lib/utils/git.js0000644000000000000000000000225212631326456014522 0ustar 00000000000000// handle some git configuration for windows exports.spawn = spawnGit exports.chainableExec = chainableExec exports.whichAndExec = whichAndExec var exec = require('child_process').execFile var spawn = require('./spawn') var npm = require('../npm.js') var which = require('which') var git = npm.config.get('git') var assert = require('assert') var log = require('npmlog') function prefixGitArgs () { return process.platform === 'win32' ? 
['-c', 'core.longpaths=true'] : [] } function execGit (args, options, cb) { log.info('git', args) var fullArgs = prefixGitArgs().concat(args || []) return exec(git, fullArgs, options, cb) } function spawnGit (args, options) { log.info('git', args) return spawn(git, prefixGitArgs().concat(args || []), options) } function chainableExec () { var args = Array.prototype.slice.call(arguments) return [execGit].concat(args) } function whichGit (cb) { return which(git, cb) } function whichAndExec (args, options, cb) { assert.equal(typeof cb, 'function', 'no callback provided') // check for git whichGit(function (err) { if (err) { err.code = 'ENOGIT' return cb(err) } execGit(args, options, cb) }) } npm_3.5.2.orig/lib/utils/lifecycle.js0000644000000000000000000002432112631326456015677 0ustar 00000000000000exports = module.exports = lifecycle exports.cmd = cmd exports.makeEnv = makeEnv var log = require('npmlog') var spawn = require('./spawn') var npm = require('../npm.js') var path = require('path') var fs = require('graceful-fs') var chain = require('slide').chain var Stream = require('stream').Stream var PATH = 'PATH' var uidNumber = require('uid-number') var umask = require('./umask') // windows calls it's path 'Path' usually, but this is not guaranteed. if (process.platform === 'win32') { PATH = 'Path' Object.keys(process.env).forEach(function (e) { if (e.match(/^PATH$/i)) { PATH = e } }) } function logid (pkg, stage) { return pkg._id + '~' + stage + ':' } function lifecycle (pkg, stage, wd, unsafe, failOk, cb) { if (typeof cb !== 'function') { cb = failOk failOk = false } if (typeof cb !== 'function') { cb = unsafe unsafe = false } if (typeof cb !== 'function') { cb = wd wd = null } while (pkg && pkg._data) pkg = pkg._data if (!pkg) return cb(new Error('Invalid package data')) log.info('lifecycle', logid(pkg, stage), pkg._id) if (!pkg.scripts || npm.config.get('ignore-scripts')) pkg.scripts = {} validWd(wd || path.resolve(npm.dir, pkg.name), function (er, wd) { if (er) return cb(er) unsafe = unsafe || npm.config.get('unsafe-perm') if ((wd.indexOf(npm.dir) !== 0 || wd.indexOf(pkg.name) !== wd.length - pkg.name.length) && !unsafe && pkg.scripts[stage]) { log.warn('lifecycle', logid(pkg, stage), 'cannot run in wd', '%s %s (wd=%s)', pkg._id, pkg.scripts[stage], wd ) return cb() } // set the env variables, then run scripts as a child process. var env = makeEnv(pkg) env.npm_lifecycle_event = stage env.npm_node_execpath = env.NODE = env.NODE || process.execPath env.npm_execpath = require.main.filename // 'nobody' typically doesn't have permission to write to /tmp // even if it's never used, sh freaks out. if (!npm.config.get('unsafe-perm')) env.TMPDIR = wd lifecycle_(pkg, stage, wd, env, unsafe, failOk, cb) }) } function lifecycle_ (pkg, stage, wd, env, unsafe, failOk, cb) { var pathArr = [] var p = wd.split('node_modules') var acc = path.resolve(p.shift()) p.forEach(function (pp) { pathArr.unshift(path.join(acc, 'node_modules', '.bin')) acc = path.join(acc, 'node_modules', pp) }) pathArr.unshift(path.join(acc, 'node_modules', '.bin')) // we also unshift the bundled node-gyp-bin folder so that // the bundled one will be used for installing things. pathArr.unshift(path.join(__dirname, '..', '..', 'bin', 'node-gyp-bin')) if (env[PATH]) pathArr.push(env[PATH]) env[PATH] = pathArr.join(process.platform === 'win32' ? ';' : ':') var packageLifecycle = pkg.scripts && pkg.scripts.hasOwnProperty(stage) if (packageLifecycle) { // define this here so it's available to all scripts. 
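    // exposing the raw script text is deliberate: the spawned shell, and
    // anything it runs, can inspect it. for illustration, a hypothetical
    // postinstall script could read process.env.npm_lifecycle_event and
    // process.env.npm_lifecycle_script to learn which stage and command
    // invoked it.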
env.npm_lifecycle_script = pkg.scripts[stage] } else { log.silly('lifecycle', logid(pkg, stage), 'no script for ' + stage + ', continuing') } function done (er) { if (er) { if (npm.config.get('force')) { log.info('lifecycle', logid(pkg, stage), 'forced, continuing', er) er = null } else if (failOk) { log.warn('lifecycle', logid(pkg, stage), 'continuing anyway', er.message) er = null } } cb(er) } chain( [ packageLifecycle && [runPackageLifecycle, pkg, env, wd, unsafe], [runHookLifecycle, pkg, env, wd, unsafe] ], done ) } function validWd (d, cb) { fs.stat(d, function (er, st) { if (er || !st.isDirectory()) { var p = path.dirname(d) if (p === d) { return cb(new Error('Could not find suitable wd')) } return validWd(p, cb) } return cb(null, d) }) } function runPackageLifecycle (pkg, env, wd, unsafe, cb) { // run package lifecycle scripts in the package root, or the nearest parent. var stage = env.npm_lifecycle_event var cmd = env.npm_lifecycle_script var note = '\n> ' + pkg._id + ' ' + stage + ' ' + wd + '\n> ' + cmd + '\n' runCmd(note, cmd, pkg, env, stage, wd, unsafe, cb) } var running = false var queue = [] function dequeue () { running = false if (queue.length) { var r = queue.shift() runCmd.apply(null, r) } } function runCmd (note, cmd, pkg, env, stage, wd, unsafe, cb) { if (running) { queue.push([note, cmd, pkg, env, stage, wd, unsafe, cb]) return } running = true log.pause() var user = unsafe ? null : npm.config.get('user') var group = unsafe ? null : npm.config.get('group') if (log.level !== 'silent') { log.clearProgress() console.log(note) log.showProgress() } log.verbose('lifecycle', logid(pkg, stage), 'unsafe-perm in lifecycle', unsafe) if (process.platform === 'win32') { unsafe = true } if (unsafe) { runCmd_(cmd, pkg, env, wd, stage, unsafe, 0, 0, cb) } else { uidNumber(user, group, function (er, uid, gid) { runCmd_(cmd, pkg, env, wd, stage, unsafe, uid, gid, cb) }) } } function runCmd_ (cmd, pkg, env, wd, stage, unsafe, uid, gid, cb_) { function cb (er) { cb_.apply(null, arguments) log.resume() process.nextTick(dequeue) } var conf = { cwd: wd, env: env, stdio: [ 0, 1, 2 ] } if (!unsafe) { conf.uid = uid ^ 0 conf.gid = gid ^ 0 } var sh = 'sh' var shFlag = '-c' if (process.platform === 'win32') { sh = process.env.comspec || 'cmd' shFlag = '/d /s /c' conf.windowsVerbatimArguments = true } log.verbose('lifecycle', logid(pkg, stage), 'PATH:', env[PATH]) log.verbose('lifecycle', logid(pkg, stage), 'CWD:', wd) log.silly('lifecycle', logid(pkg, stage), 'Args:', [shFlag, cmd]) var progressEnabled = log.progressEnabled if (progressEnabled) log.disableProgress() var proc = spawn(sh, [shFlag, cmd], conf) proc.on('error', procError) proc.on('close', function (code, signal) { log.silly('lifecycle', logid(pkg, stage), 'Returned: code:', code, ' signal:', signal) if (signal) { process.kill(process.pid, signal) } else if (code) { var er = new Error('Exit status ' + code) } procError(er) }) function procError (er) { if (progressEnabled) log.enableProgress() if (er) { log.info('lifecycle', logid(pkg, stage), 'Failed to exec ' + stage + ' script') er.message = pkg._id + ' ' + stage + ': `' + cmd + '`\n' + er.message if (er.code !== 'EPERM') { er.code = 'ELIFECYCLE' } er.pkgid = pkg._id er.stage = stage er.script = cmd er.pkgname = pkg.name } return cb(er) } } function runHookLifecycle (pkg, env, wd, unsafe, cb) { // check for a hook script, run if present. 
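  // hooks are stage-named executables under node_modules/.hooks; when one
  // exists it runs for every package at that stage, with the same env as
  // the package's own script. for illustration, a hypothetical executable
  // at node_modules/.hooks/postinstall would run after each package install.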
var stage = env.npm_lifecycle_event var hook = path.join(npm.dir, '.hooks', stage) var cmd = hook fs.stat(hook, function (er) { if (er) return cb() var note = '\n> ' + pkg._id + ' ' + stage + ' ' + wd + '\n> ' + cmd runCmd(note, hook, pkg, env, stage, wd, unsafe, cb) }) } function makeEnv (data, prefix, env) { prefix = prefix || 'npm_package_' if (!env) { env = {} for (var i in process.env) if (!i.match(/^npm_/)) { env[i] = process.env[i] } // npat asks for tap output if (npm.config.get('npat')) env.TAP = 1 // express and others respect the NODE_ENV value. if (npm.config.get('production')) env.NODE_ENV = 'production' } else if (!data.hasOwnProperty('_lifecycleEnv')) { Object.defineProperty(data, '_lifecycleEnv', { value: env, enumerable: false } ) } for (i in data) if (i.charAt(0) !== '_') { var envKey = (prefix + i).replace(/[^a-zA-Z0-9_]/g, '_') if (i === 'readme') { continue } if (data[i] && typeof data[i] === 'object') { try { // quick and dirty detection for cyclical structures JSON.stringify(data[i]) makeEnv(data[i], envKey + '_', env) } catch (ex) { // usually these are package objects. // just get the path and basic details. var d = data[i] makeEnv( { name: d.name, version: d.version, path: d.path }, envKey + '_', env ) } } else { env[envKey] = String(data[i]) env[envKey] = env[envKey].indexOf('\n') !== -1 ? JSON.stringify(env[envKey]) : env[envKey] } } if (prefix !== 'npm_package_') return env prefix = 'npm_config_' var pkgConfig = {} var keys = npm.config.keys var pkgVerConfig = {} var namePref = data.name + ':' var verPref = data.name + '@' + data.version + ':' keys.forEach(function (i) { // in some rare cases (e.g. working with nerf darts), there are segmented // "private" (underscore-prefixed) config names -- don't export if (i.charAt(0) === '_' && i.indexOf('_' + namePref) !== 0 || i.match(/:_/)) { return } var value = npm.config.get(i) if (value instanceof Stream || Array.isArray(value)) return if (i.match(/umask/)) value = umask.toString(value) if (!value) value = '' else if (typeof value === 'number') value = '' + value else if (typeof value !== 'string') value = JSON.stringify(value) value = value.indexOf('\n') !== -1 ? 
JSON.stringify(value) : value i = i.replace(/^_+/, '') var k if (i.indexOf(namePref) === 0) { k = i.substr(namePref.length).replace(/[^a-zA-Z0-9_]/g, '_') pkgConfig[k] = value } else if (i.indexOf(verPref) === 0) { k = i.substr(verPref.length).replace(/[^a-zA-Z0-9_]/g, '_') pkgVerConfig[k] = value } var envKey = (prefix + i).replace(/[^a-zA-Z0-9_]/g, '_') env[envKey] = value }) prefix = 'npm_package_config_' ;[pkgConfig, pkgVerConfig].forEach(function (conf) { for (var i in conf) { var envKey = (prefix + i) env[envKey] = conf[i] } }) return env } function cmd (stage) { function CMD (args, cb) { npm.commands['run-script']([stage].concat(args), cb) } CMD.usage = 'npm ' + stage + ' [-- ]' var installedShallow = require('./completion/installed-shallow.js') CMD.completion = function (opts, cb) { installedShallow(opts, function (d) { return d.scripts && d.scripts[stage] }, cb) } return CMD } npm_3.5.2.orig/lib/utils/link.js0000644000000000000000000000302412631326456014672 0ustar 00000000000000module.exports = link link.ifExists = linkIfExists var fs = require('graceful-fs') var chain = require('slide').chain var mkdir = require('mkdirp') var rm = require('./gently-rm.js') var path = require('path') var npm = require('../npm.js') function linkIfExists (from, to, gently, cb) { fs.stat(from, function (er) { if (er) return cb() fs.readlink(to, function (er, fromOnDisk) { // if the link already exists and matches what we would do, // we don't need to do anything if (!er) { var toDir = path.dirname(to) var absoluteFrom = path.resolve(toDir, from) var absoluteFromOnDisk = path.resolve(toDir, fromOnDisk) if (absoluteFrom === absoluteFromOnDisk) return cb() } link(from, to, gently, cb) }) }) } function link (from, to, gently, abs, cb) { if (typeof cb !== 'function') { cb = abs abs = false } if (typeof cb !== 'function') { cb = gently gently = null } if (npm.config.get('force')) gently = false to = path.resolve(to) var target = from = path.resolve(from) if (!abs && process.platform !== 'win32') { // junctions on windows must be absolute target = path.relative(path.dirname(to), from) // if there is no folder in common, then it will be much // longer, and using a relative link is dumb. 
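    // for illustration (hypothetical paths): linking from = /opt/tools/pkg
    // into to = /home/user/app/node_modules/pkg gives the relative target
    // ../../../../opt/tools/pkg, which is no shorter than the absolute
    // path, so the check below keeps the absolute form.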
if (target.length >= from.length) target = from } chain( [ [fs, 'stat', from], [rm, to, gently], [mkdir, path.dirname(to)], [fs, 'symlink', target, to, 'junction'] ], cb ) } npm_3.5.2.orig/lib/utils/locker.js0000644000000000000000000000347512631326456015226 0ustar 00000000000000var crypto = require('crypto') var resolve = require('path').resolve var lockfile = require('lockfile') var log = require('npmlog') var npm = require('../npm.js') var correctMkdir = require('../utils/correct-mkdir.js') var installLocks = {} function lockFileName (base, name) { var c = name.replace(/[^a-zA-Z0-9]+/g, '-').replace(/^-+|-+$/g, '') var p = resolve(base, name) var h = crypto.createHash('sha1').update(p).digest('hex') var l = resolve(npm.cache, '_locks') return resolve(l, c.substr(0, 24) + '-' + h.substr(0, 16) + '.lock') } function lock (base, name, cb) { var lockDir = resolve(npm.cache, '_locks') correctMkdir(lockDir, function (er) { if (er) return cb(er) var opts = { stale: npm.config.get('cache-lock-stale'), retries: npm.config.get('cache-lock-retries'), wait: npm.config.get('cache-lock-wait') } var lf = lockFileName(base, name) lockfile.lock(lf, opts, function (er) { if (er) log.warn('locking', lf, 'failed', er) if (!er) { log.verbose('lock', 'using', lf, 'for', resolve(base, name)) installLocks[lf] = true } cb(er) }) }) } function unlock (base, name, cb) { var lf = lockFileName(base, name) var locked = installLocks[lf] if (locked === false) { return process.nextTick(cb) } else if (locked === true) { lockfile.unlock(lf, function (er) { if (er) { log.warn('unlocking', lf, 'failed', er) } else { installLocks[lf] = false log.verbose('unlock', 'done using', lf, 'for', resolve(base, name)) } cb(er) }) } else { var notLocked = new Error( 'Attempt to unlock ' + resolve(base, name) + ", which hasn't been locked" ) notLocked.code = 'ENOTLOCKED' throw notLocked } } module.exports = { lock: lock, unlock: unlock } npm_3.5.2.orig/lib/utils/map-to-registry.js0000644000000000000000000000310212631326456016775 0ustar 00000000000000var url = require('url') var log = require('npmlog') var npa = require('npm-package-arg') module.exports = mapToRegistry function mapToRegistry (name, config, cb) { log.silly('mapToRegistry', 'name', name) var registry // the name itself takes precedence var data = npa(name) if (data.scope) { // the name is definitely scoped, so escape now name = name.replace('/', '%2f') log.silly('mapToRegistry', 'scope (from package name)', data.scope) registry = config.get(data.scope + ':registry') if (!registry) { log.verbose('mapToRegistry', 'no registry URL found in name for scope', data.scope) } } // ...then --scope=@scope or --scope=scope var scope = config.get('scope') if (!registry && scope) { // I'm an enabler, sorry if (scope.charAt(0) !== '@') scope = '@' + scope log.silly('mapToRegistry', 'scope (from config)', scope) registry = config.get(scope + ':registry') if (!registry) { log.verbose('mapToRegistry', 'no registry URL found in config for scope', scope) } } // ...and finally use the default registry if (!registry) { log.silly('mapToRegistry', 'using default registry') registry = config.get('registry') } log.silly('mapToRegistry', 'registry', registry) var auth = config.getCredentialsByURI(registry) // normalize registry URL so resolution doesn't drop a piece of registry URL var normalized = registry.slice(-1) !== '/' ? 
registry + '/' : registry var uri = url.resolve(normalized, name) log.silly('mapToRegistry', 'uri', uri) cb(null, uri, auth, normalized) } npm_3.5.2.orig/lib/utils/module-name.js0000644000000000000000000000164112631326456016143 0ustar 00000000000000'use strict' var path = require('path') var validate = require('aproba') module.exports = moduleName module.exports.test = {} module.exports.test.pathToPackageName = pathToPackageName function pathToPackageName (dir) { if (dir == null) return '' if (dir === '') return '' var name = path.relative(path.resolve(dir, '..'), dir) var scoped = path.relative(path.resolve(dir, '../..'), dir) if (scoped[0] === '@') return scoped return name } module.exports.test.isNotEmpty = isNotEmpty function isNotEmpty (str) { return str != null && str !== '' } var unknown = 0 function moduleName (tree) { validate('O', arguments) var pkg = tree.package || tree if (isNotEmpty(pkg.name)) return pkg.name var pkgName = pathToPackageName(tree.path) if (pkgName !== '') return pkgName if (tree._invalidName != null) return tree._invalidName tree._invalidName = '!invalid#' + (++unknown) return tree._invalidName } npm_3.5.2.orig/lib/utils/package-id.js0000644000000000000000000000061712631326456015727 0ustar 00000000000000'use strict' var moduleName = require('./module-name.js') module.exports = function (tree) { var pkg = tree.package || tree // FIXME: Excluding the '@' here is cleaning up after the mess that // read-package-json makes. =( if (pkg._id && pkg._id !== '@') return pkg._id var name = moduleName(tree) if (pkg.version) { return name + '@' + pkg.version } else { return name } } npm_3.5.2.orig/lib/utils/parse-json.js0000644000000000000000000000112012631326456016011 0ustar 00000000000000'use strict' var parseJSON = module.exports = function (content) { return JSON.parse(stripBOM(content)) } parseJSON.noExceptions = function (content) { try { return parseJSON(content) } catch (ex) { return } } // from read-package-json function stripBOM (content) { content = content.toString() // Remove byte order marker. This catches EF BB BF (the UTF-8 BOM) // because the buffer-to-string conversion in `fs.readFileSync()` // translates it to FEFF, the UTF-16 BOM. 
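  // for illustration: stripBOM('\uFEFF{"name":"x"}') returns '{"name":"x"}',
  // so a BOM-prefixed package.json parses instead of making JSON.parse throw.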
if (content.charCodeAt(0) === 0xFEFF) { content = content.slice(1) } return content } npm_3.5.2.orig/lib/utils/pulse-till-done.js0000644000000000000000000000061312631326456016753 0ustar 00000000000000'use strict' var validate = require('aproba') var log = require('npmlog') var pulsers = 0 var pulse module.exports = function (prefix, cb) { validate('SF', [prefix, cb]) if (!pulsers++) { pulse = setInterval(function () { log.gauge.pulse('network') }, 250) } return function () { if (!--pulsers) { clearInterval(pulse) } cb.apply(null, arguments) } } npm_3.5.2.orig/lib/utils/read-local-package.js0000644000000000000000000000051112631326456017327 0ustar 00000000000000exports = module.exports = readLocalPkg var npm = require('../npm.js') var readJson = require('read-package-json') function readLocalPkg (cb) { if (npm.config.get('global')) return cb() var path = require('path') readJson(path.resolve(npm.prefix, 'package.json'), function (er, d) { return cb(er, d && d.name) }) } npm_3.5.2.orig/lib/utils/rename.js0000644000000000000000000000047712631326456015215 0ustar 00000000000000'use strict' var fs = require('graceful-fs') var SaveStack = require('./save-stack.js') module.exports = rename function rename (from, to, cb) { var saved = new SaveStack(rename) fs.rename(from, to, function (er) { if (er) { return cb(saved.completeWith(er)) } else { return cb() } }) } npm_3.5.2.orig/lib/utils/save-stack.js0000644000000000000000000000053512631326456016002 0ustar 00000000000000'use strict' var inherits = require('inherits') module.exports = SaveStack function SaveStack (fn) { Error.call(this) Error.captureStackTrace(this, fn || SaveStack) } inherits(SaveStack, Error) SaveStack.prototype.completeWith = function (er) { this['__' + 'proto' + '__'] = er this.stack = this.stack + '\n\n' + er.stack return this } npm_3.5.2.orig/lib/utils/spawn.js0000644000000000000000000000160012631326456015063 0ustar 00000000000000module.exports = spawn var _spawn = require('child_process').spawn var EventEmitter = require('events').EventEmitter function spawn (cmd, args, options) { var raw = _spawn(cmd, args, options) var cooked = new EventEmitter() raw.on('error', function (er) { er.file = cmd cooked.emit('error', er) }).on('close', function (code, signal) { // Create ENOENT error because Node.js v0.8 will not emit // an `error` event if the command could not be found. 
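    // 127 is the shell's own "command not found" exit status, so it is
    // translated into the same ENOENT error shape that newer node versions
    // emit themselves.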
if (code === 127) { var er = new Error('spawn ENOENT') er.code = 'ENOENT' er.errno = 'ENOENT' er.syscall = 'spawn' er.file = cmd cooked.emit('error', er) } else { cooked.emit('close', code, signal) } }) cooked.stdin = raw.stdin cooked.stdout = raw.stdout cooked.stderr = raw.stderr cooked.kill = function (sig) { return raw.kill(sig) } return cooked } npm_3.5.2.orig/lib/utils/tar.js0000644000000000000000000003252712631326456014535 0ustar 00000000000000// commands for packing and unpacking tarballs // this file is used by lib/cache.js var fs = require('graceful-fs') var path = require('path') var writeFileAtomic = require('write-file-atomic') var writeStreamAtomic = require('fs-write-stream-atomic') var log = require('npmlog') var uidNumber = require('uid-number') var readJson = require('read-package-json') var tar = require('tar') var zlib = require('zlib') var fstream = require('fstream') var Packer = require('fstream-npm') var iferr = require('iferr') var inherits = require('inherits') var npm = require('../npm.js') var rm = require('./gently-rm.js') var myUid = process.getuid && process.getuid() var myGid = process.getgid && process.getgid() var readPackageTree = require('read-package-tree') var union = require('lodash.union') var flattenTree = require('../install/flatten-tree.js') var moduleName = require('./module-name.js') var packageId = require('./package-id.js') if (process.env.SUDO_UID && myUid === 0) { if (!isNaN(process.env.SUDO_UID)) myUid = +process.env.SUDO_UID if (!isNaN(process.env.SUDO_GID)) myGid = +process.env.SUDO_GID } exports.pack = pack exports.unpack = unpack function pack (tarball, folder, pkg, cb) { log.verbose('tar pack', [tarball, folder]) log.verbose('tarball', tarball) log.verbose('folder', folder) readJson(path.join(folder, 'package.json'), function (er, pkg) { if (er || !pkg.bundleDependencies) { pack_(tarball, folder, null, null, pkg, cb) } else { // we require this at runtime due to load-order issues, because recursive // requires fail if you replace the exports object, and we do, not in deps, but // in a dep of it. var recalculateMetadata = require('../install/deps.js').recalculateMetadata readPackageTree(folder, iferr(cb, function (tree) { recalculateMetadata(tree, log.newGroup('pack:' + pkg), iferr(cb, function () { pack_(tarball, folder, tree, flattenTree(tree), pkg, cb) })) })) } }) } function BundledPacker (props) { Packer.call(this, props) } inherits(BundledPacker, Packer) BundledPacker.prototype.applyIgnores = function (entry, partial, entryObj) { // package.json files can never be ignored. if (entry === 'package.json') return true // readme files should never be ignored. if (entry.match(/^readme(\.[^\.]*)$/i)) return true // license files should never be ignored. if (entry.match(/^(license|licence)(\.[^\.]*)?$/i)) return true // changelogs should never be ignored. if (entry.match(/^(changes|changelog|history)(\.[^\.]*)?$/i)) return true // special rules. see below. 
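  // that is: node_modules is admitted here only at the package root, and
  // the bundleMagic branch further down decides which of its entries
  // actually end up in the tarball.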
if (entry === 'node_modules' && this.packageRoot) return true // some files are *never* allowed under any circumstances if (entry === '.git' || entry === '.lock-wscript' || entry.match(/^\.wafpickle-[0-9]+$/) || entry === 'CVS' || entry === '.svn' || entry === '.hg' || entry.match(/^\..*\.swp$/) || entry === '.DS_Store' || entry.match(/^\._/) || entry === 'npm-debug.log' ) { return false } // in a node_modules folder, we only include bundled dependencies // also, prevent packages in node_modules from being affected // by rules set in the containing package, so that // bundles don't get busted. // Also, once in a bundle, everything is installed as-is // To prevent infinite cycles in the case of cyclic deps that are // linked with npm link, even in a bundle, deps are only bundled // if they're not already present at a higher level. if (this.bundleMagic) { // bubbling up. stop here and allow anything the bundled pkg allows if (entry.indexOf('/') !== -1) return true // never include the .bin. It's typically full of platform-specific // stuff like symlinks and .cmd files anyway. if (entry === '.bin') return false // the package root. var p = this.parent // the package before this one. var pp = p && p.parent // if this entry has already been bundled, and is a symlink, // and it is the *same* symlink as this one, then exclude it. if (pp && pp.bundleLinks && this.bundleLinks && pp.bundleLinks[entry] && pp.bundleLinks[entry] === this.bundleLinks[entry]) { return false } // since it's *not* a symbolic link, if we're *already* in a bundle, // then we should include everything. if (pp && pp.package && pp.basename === 'node_modules') { return true } // only include it at this point if it's a bundleDependency return this.isBundled(entry) } // if (this.bundled) return true return Packer.prototype.applyIgnores.call(this, entry, partial, entryObj) } function nameMatch (name) { return function (other) { return name === moduleName(other) } } function pack_ (tarball, folder, tree, flatTree, pkg, cb) { function InstancePacker (props) { BundledPacker.call(this, props) } inherits(InstancePacker, BundledPacker) InstancePacker.prototype.isBundled = function (name) { var bd = this.package && this.package.bundleDependencies if (!bd) return false if (!Array.isArray(bd)) { throw new Error(packageId(this) + '\'s `bundledDependencies` should ' + 'be an array') } if (!tree) return false if (bd.indexOf(name) !== -1) return true var pkg = tree.children.filter(nameMatch(name))[0] if (!pkg) return false var requiredBy = union([], pkg.package._requiredBy) var seen = {} while (requiredBy.length) { var req = requiredBy.shift() if (seen[req]) continue seen[req] = true var reqPkg = flatTree[req] if (!reqPkg) continue if (reqPkg.parent === tree && bd.indexOf(moduleName(reqPkg)) !== -1) { return true } requiredBy = union(requiredBy, reqPkg.package._requiredBy) } return false } new InstancePacker({ path: folder, type: 'Directory', isDirectory: true }) .on('error', function (er) { if (er) log.error('tar pack', 'Error reading ' + folder) return cb(er) }) // By default, npm includes some proprietary attributes in the // package tarball. This is sane, and allowed by the spec. // However, npm *itself* excludes these from its own package, // so that it can be more easily bootstrapped using old and // non-compliant tar implementations. 
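  // The stream below is, roughly, `tar -c <folder> | gzip > <tarball>`
  // with an atomic write at the end; with proprietary attribs disabled the
  // emitted entries should be plain ustar headers.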
.pipe(tar.Pack({ noProprietary: !npm.config.get('proprietary-attribs') })) .on('error', function (er) { if (er) log.error('tar.pack', 'tar creation error', tarball) cb(er) }) .pipe(zlib.Gzip()) .on('error', function (er) { if (er) log.error('tar.pack', 'gzip error ' + tarball) cb(er) }) .pipe(writeStreamAtomic(tarball)) .on('error', function (er) { if (er) log.error('tar.pack', 'Could not write ' + tarball) cb(er) }) .on('close', cb) } function unpack (tarball, unpackTarget, dMode, fMode, uid, gid, cb) { log.verbose('tar', 'unpack', tarball) log.verbose('tar', 'unpacking to', unpackTarget) if (typeof cb !== 'function') { cb = gid gid = null } if (typeof cb !== 'function') { cb = uid uid = null } if (typeof cb !== 'function') { cb = fMode fMode = npm.modes.file } if (typeof cb !== 'function') { cb = dMode dMode = npm.modes.exec } uidNumber(uid, gid, function (er, uid, gid) { if (er) return cb(er) unpack_(tarball, unpackTarget, dMode, fMode, uid, gid, cb) }) } function unpack_ (tarball, unpackTarget, dMode, fMode, uid, gid, cb) { rm(unpackTarget, function (er) { if (er) return cb(er) // gzip {tarball} --decompress --stdout \ // | tar -mvxpf - --strip-components=1 -C {unpackTarget} gunzTarPerm(tarball, unpackTarget, dMode, fMode, uid, gid, function (er, folder) { if (er) return cb(er) readJson(path.resolve(folder, 'package.json'), cb) }) }) } function gunzTarPerm (tarball, target, dMode, fMode, uid, gid, cb_) { if (!dMode) dMode = npm.modes.exec if (!fMode) fMode = npm.modes.file log.silly('gunzTarPerm', 'modes', [dMode.toString(8), fMode.toString(8)]) var cbCalled = false function cb (er) { if (cbCalled) return cbCalled = true cb_(er, target) } var fst = fs.createReadStream(tarball) fst.on('open', function (fd) { fs.fstat(fd, function (er, st) { if (er) return fst.emit('error', er) if (st.size === 0) { er = new Error('0-byte tarball\n' + 'Please run `npm cache clean`') fst.emit('error', er) } }) }) // figure out who we're supposed to be, if we're not pretending // to be a specific user. if (npm.config.get('unsafe-perm') && process.platform !== 'win32') { uid = myUid gid = myGid } function extractEntry (entry) { log.silly('gunzTarPerm', 'extractEntry', entry.path) // never create things that are user-unreadable, // or dirs that are user-un-listable. Only leads to headaches. var originalMode = entry.mode = entry.mode || entry.props.mode entry.mode = entry.mode | (entry.type === 'Directory' ? dMode : fMode) entry.mode = entry.mode & (~npm.modes.umask) entry.props.mode = entry.mode if (originalMode !== entry.mode) { log.silly('gunzTarPerm', 'modified mode', [entry.path, originalMode, entry.mode]) } // if there's a specific owner uid/gid that we want, then set that if (process.platform !== 'win32' && typeof uid === 'number' && typeof gid === 'number') { entry.props.uid = entry.uid = uid entry.props.gid = entry.gid = gid } } var extractOpts = { type: 'Directory', path: target, strip: 1 } if (process.platform !== 'win32' && typeof uid === 'number' && typeof gid === 'number') { extractOpts.uid = uid extractOpts.gid = gid } var sawIgnores = {} extractOpts.filter = function () { // symbolic links are not allowed in packages. if (this.type.match(/^.*Link$/)) { log.warn('excluding symbolic link', this.path.substr(target.length + 1) + ' -> ' + this.linkpath) return false } // Note: This mirrors logic in the fs read operations that are // employed during tarball creation, in the fstream-npm module. 
// It is duplicated here to handle tarballs that are created // using other means, such as system tar or git archive. if (this.type === 'File') { var base = path.basename(this.path) if (base === '.npmignore') { sawIgnores[ this.path ] = true } else if (base === '.gitignore') { var npmignore = this.path.replace(/\.gitignore$/, '.npmignore') if (sawIgnores[npmignore]) { // Skip this one, already seen. return false } else { // Rename, may be clobbered later. this.path = npmignore this._path = npmignore } } } return true } fst .on('error', function (er) { if (er) log.error('tar.unpack', 'error reading ' + tarball) cb(er) }) .on('data', function OD (c) { // detect what it is. // Then, depending on that, we'll figure out whether it's // a single-file module, gzipped tarball, or naked tarball. // gzipped files all start with 1f8b08 if (c[0] === 0x1F && c[1] === 0x8B && c[2] === 0x08) { fst .pipe(zlib.Unzip()) .on('error', function (er) { if (er) log.error('tar.unpack', 'unzip error ' + tarball) cb(er) }) .pipe(tar.Extract(extractOpts)) .on('entry', extractEntry) .on('error', function (er) { if (er) log.error('tar.unpack', 'untar error ' + tarball) cb(er) }) .on('close', cb) } else if (hasTarHeader(c)) { // naked tar fst .pipe(tar.Extract(extractOpts)) .on('entry', extractEntry) .on('error', function (er) { if (er) log.error('tar.unpack', 'untar error ' + tarball) cb(er) }) .on('close', cb) } else { // naked js file var jsOpts = { path: path.resolve(target, 'index.js') } if (process.platform !== 'win32' && typeof uid === 'number' && typeof gid === 'number') { jsOpts.uid = uid jsOpts.gid = gid } fst .pipe(fstream.Writer(jsOpts)) .on('error', function (er) { if (er) log.error('tar.unpack', 'copy error ' + tarball) cb(er) }) .on('close', function () { var j = path.resolve(target, 'package.json') readJson(j, function (er, d) { if (er) { log.error('not a package', tarball) return cb(er) } writeFileAtomic(j, JSON.stringify(d) + '\n', cb) }) }) } // now un-hook, and re-emit the chunk fst.removeListener('data', OD) fst.emit('data', c) }) } function hasTarHeader (c) { return c[257] === 0x75 && // tar archives have 7573746172 at position c[258] === 0x73 && // 257 and 003030 or 202000 at position 262 c[259] === 0x74 && c[260] === 0x61 && c[261] === 0x72 && ((c[262] === 0x00 && c[263] === 0x30 && c[264] === 0x30) || (c[262] === 0x20 && c[263] === 0x20 && c[264] === 0x00)) } npm_3.5.2.orig/lib/utils/temp-filename.js0000644000000000000000000000025512631326456016463 0ustar 00000000000000'use strict' var uniqueFilename = require('unique-filename') var npm = require('../npm.js') module.exports = function (prefix) { return uniqueFilename(npm.tmp, prefix) } npm_3.5.2.orig/lib/utils/umask.js0000644000000000000000000000052712631326456015062 0ustar 00000000000000var umask = require('umask') var npmlog = require('npmlog') var _fromString = umask.fromString module.exports = umask // fromString with logging callback umask.fromString = function (val) { _fromString(val, function (err, result) { if (err) { npmlog.warn('invalid umask', err.message) } val = result }) return val } npm_3.5.2.orig/lib/utils/warn-deprecated.js0000644000000000000000000000105412631326456017003 0ustar 00000000000000module.exports = warnDeprecated var log = require('npmlog') var deprecations = {} function warnDeprecated (type) { return function warn (messages, instance) { if (!instance) { if (!deprecations[type]) { deprecations[type] = {} messages.forEach(function (m) { log.warn(type, m) }) } } else { if (!deprecations[type]) deprecations[type] = {} 
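      // each (type, instance) pair warns only once; later calls for the
      // same pair are silently dropped.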
if (!deprecations[type][instance]) { deprecations[type][instance] = true messages.forEach(function (m) { log.warn(type, m) }) } } } } npm_3.5.2.orig/lib/utils/completion/file-completion.js0000644000000000000000000000132312631326456021174 0ustar 00000000000000module.exports = fileCompletion var mkdir = require('mkdirp') var path = require('path') var glob = require('glob') function fileCompletion (root, req, depth, cb) { if (typeof cb !== 'function') { cb = depth depth = Infinity } mkdir(root, function (er) { if (er) return cb(er) // can be either exactly the req, or a descendent var pattern = root + '/{' + req + ',' + req + '/**/*}' var opts = { mark: true, dot: true, maxDepth: depth } glob(pattern, opts, function (er, files) { if (er) return cb(er) return cb(null, (files || []).map(function (f) { var tail = f.substr(root.length + 1).replace(/^\//, '') return path.join(req, tail) })) }) }) } npm_3.5.2.orig/lib/utils/completion/installed-deep.js0000644000000000000000000000216612631326456021006 0ustar 00000000000000module.exports = installedDeep var npm = require('../../npm.js') var readInstalled = require('read-installed') function installedDeep (opts, cb) { var local var global var depth = npm.config.get('depth') var opt = { depth: depth, dev: true } if (npm.config.get('global')) { local = [] next() } else { readInstalled(npm.prefix, opt, function (er, data) { local = getNames(data || {}) next() }) } readInstalled(npm.config.get('prefix'), opt, function (er, data) { global = getNames(data || {}) next() }) function getNames_ (d, n) { if (d.realName && n) { if (n[d.realName]) return n n[d.realName] = true } if (!n) n = {} Object.keys(d.dependencies || {}).forEach(function (dep) { getNames_(d.dependencies[dep], n) }) return n } function getNames (d) { return Object.keys(getNames_(d)) } function next () { if (!local || !global) return if (!npm.config.get('global')) { global = global.map(function (g) { return [g, '-g'] }) } var names = local.concat(global) return cb(null, names) } } npm_3.5.2.orig/lib/utils/completion/installed-shallow.js0000644000000000000000000000362012631326456021536 0ustar 00000000000000 module.exports = installedShallow var npm = require('../../npm.js') var fs = require('graceful-fs') var path = require('path') var readJson = require('read-package-json') var asyncMap = require('slide').asyncMap function installedShallow (opts, filter, cb) { if (typeof cb !== 'function') { cb = filter filter = null } var conf = opts.conf var args = conf.argv.remain if (args.length > 3) return cb() var local var global var localDir = npm.dir var globalDir = npm.globalDir if (npm.config.get('global')) { local = [] next() } else { fs.readdir(localDir, function (er, pkgs) { local = (pkgs || []).filter(function (p) { return p.charAt(0) !== '.' }) next() }) } fs.readdir(globalDir, function (er, pkgs) { global = (pkgs || []).filter(function (p) { return p.charAt(0) !== '.' 
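      // the dot check above also drops node_modules/.bin, so completion
      // only offers real package directories.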
}) next() }) function next () { if (!local || !global) return filterInstalled(local, global, filter, cb) } } function filterInstalled (local, global, filter, cb) { var fl var fg if (!filter) { fl = local fg = global return next() } asyncMap(local, function (p, cb) { readJson(path.join(npm.dir, p, 'package.json'), function (er, d) { if (!d || !filter(d)) return cb(null, []) return cb(null, d.name) }) }, function (er, local) { fl = local || [] next() }) var globalDir = npm.globalDir asyncMap(global, function (p, cb) { readJson(path.join(globalDir, p, 'package.json'), function (er, d) { if (!d || !filter(d)) return cb(null, []) return cb(null, d.name) }) }, function (er, global) { fg = global || [] next() }) function next () { if (!fg || !fl) return if (!npm.config.get('global')) { fg = fg.map(function (g) { return [g, '-g'] }) } console.error('filtered', fl, fg) return cb(null, fl.concat(fg)) } } npm_3.5.2.orig/node_modules/.bin/0000755000000000000000000000000012631326456014775 5ustar 00000000000000npm_3.5.2.orig/node_modules/abbrev/0000755000000000000000000000000012631326456015410 5ustar 00000000000000npm_3.5.2.orig/node_modules/ansi-regex/0000755000000000000000000000000012631326456016211 5ustar 00000000000000npm_3.5.2.orig/node_modules/ansicolors/0000755000000000000000000000000012631326456016323 5ustar 00000000000000npm_3.5.2.orig/node_modules/ansistyles/0000755000000000000000000000000012631326456016345 5ustar 00000000000000npm_3.5.2.orig/node_modules/aproba/0000755000000000000000000000000012631326456015413 5ustar 00000000000000npm_3.5.2.orig/node_modules/archy/0000755000000000000000000000000012631326456015255 5ustar 00000000000000npm_3.5.2.orig/node_modules/async-some/0000755000000000000000000000000012631326456016225 5ustar 00000000000000npm_3.5.2.orig/node_modules/chownr/0000755000000000000000000000000012631326456015447 5ustar 00000000000000npm_3.5.2.orig/node_modules/cmd-shim/0000755000000000000000000000000012631326456015650 5ustar 00000000000000npm_3.5.2.orig/node_modules/columnify/0000755000000000000000000000000012631326456016154 5ustar 00000000000000npm_3.5.2.orig/node_modules/config-chain/0000755000000000000000000000000012631326456016474 5ustar 00000000000000npm_3.5.2.orig/node_modules/debuglog/0000755000000000000000000000000012631326456015737 5ustar 00000000000000npm_3.5.2.orig/node_modules/dezalgo/0000755000000000000000000000000012631326456015574 5ustar 00000000000000npm_3.5.2.orig/node_modules/editor/0000755000000000000000000000000012631326456015435 5ustar 00000000000000npm_3.5.2.orig/node_modules/fs-vacuum/0000755000000000000000000000000012631326456016055 5ustar 00000000000000npm_3.5.2.orig/node_modules/fs-write-stream-atomic/0000755000000000000000000000000012631326456020452 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream/0000755000000000000000000000000012631326456015610 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/0000755000000000000000000000000012631326456016400 5ustar 00000000000000npm_3.5.2.orig/node_modules/glob/0000755000000000000000000000000012631326456015072 5ustar 00000000000000npm_3.5.2.orig/node_modules/graceful-fs/0000755000000000000000000000000012631326456016345 5ustar 00000000000000npm_3.5.2.orig/node_modules/has-unicode/0000755000000000000000000000000012631326456016346 5ustar 00000000000000npm_3.5.2.orig/node_modules/hosted-git-info/0000755000000000000000000000000012631326456017147 5ustar 00000000000000npm_3.5.2.orig/node_modules/iferr/0000755000000000000000000000000012631326456015256 5ustar 
00000000000000npm_3.5.2.orig/node_modules/imurmurhash/0000755000000000000000000000000012631326456016513 5ustar 00000000000000npm_3.5.2.orig/node_modules/inflight/0000755000000000000000000000000012631326456015753 5ustar 00000000000000npm_3.5.2.orig/node_modules/inherits/0000755000000000000000000000000012631326456015774 5ustar 00000000000000npm_3.5.2.orig/node_modules/ini/0000755000000000000000000000000012631326456014726 5ustar 00000000000000npm_3.5.2.orig/node_modules/init-package-json/0000755000000000000000000000000012631326456017452 5ustar 00000000000000npm_3.5.2.orig/node_modules/lockfile/0000755000000000000000000000000012631326456015737 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash._baseindexof/0000755000000000000000000000000012631326456020046 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash._baseuniq/0000755000000000000000000000000012631326456017366 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash._bindcallback/0000755000000000000000000000000012631326456020150 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash._cacheindexof/0000755000000000000000000000000012631326456020177 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash._createcache/0000755000000000000000000000000012631326456020006 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash._getnative/0000755000000000000000000000000012631326456017545 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/0000755000000000000000000000000012631326456017356 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.isarguments/0000755000000000000000000000000012631326456017761 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.isarray/0000755000000000000000000000000012631326456017072 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.keys/0000755000000000000000000000000012631326456016373 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.restparam/0000755000000000000000000000000012631326456017416 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.union/0000755000000000000000000000000012631326456016550 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/0000755000000000000000000000000012631326456016374 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.without/0000755000000000000000000000000012631326456017123 5ustar 00000000000000npm_3.5.2.orig/node_modules/mkdirp/0000755000000000000000000000000012631326456015435 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/0000755000000000000000000000000012631326456015671 5ustar 00000000000000npm_3.5.2.orig/node_modules/nopt/0000755000000000000000000000000012631326456015127 5ustar 00000000000000npm_3.5.2.orig/node_modules/normalize-git-url/0000755000000000000000000000000012631326456017530 5ustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/0000755000000000000000000000000012631326456020447 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-cache-filename/0000755000000000000000000000000012631326456017560 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/0000755000000000000000000000000012631326456017643 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-package-arg/0000755000000000000000000000000012631326456017101 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/0000755000000000000000000000000012631326456020063 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-user-validate/0000755000000000000000000000000012631326456017504 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/0000755000000000000000000000000012631326456015443 5ustar 
00000000000000npm_3.5.2.orig/node_modules/once/0000755000000000000000000000000012631326456015073 5ustar 00000000000000npm_3.5.2.orig/node_modules/opener/0000755000000000000000000000000012631326456015437 5ustar 00000000000000npm_3.5.2.orig/node_modules/osenv/0000755000000000000000000000000012631326456015301 5ustar 00000000000000npm_3.5.2.orig/node_modules/path-is-inside/0000755000000000000000000000000012631326456016765 5ustar 00000000000000npm_3.5.2.orig/node_modules/read/0000755000000000000000000000000012631326456015062 5ustar 00000000000000npm_3.5.2.orig/node_modules/read-cmd-shim/0000755000000000000000000000000012631326456016561 5ustar 00000000000000npm_3.5.2.orig/node_modules/read-installed/0000755000000000000000000000000012631326456017037 5ustar 00000000000000npm_3.5.2.orig/node_modules/read-package-json/0000755000000000000000000000000012631326456017422 5ustar 00000000000000npm_3.5.2.orig/node_modules/read-package-tree/0000755000000000000000000000000012631326456017410 5ustar 00000000000000npm_3.5.2.orig/node_modules/readdir-scoped-modules/0000755000000000000000000000000012631326456020502 5ustar 00000000000000npm_3.5.2.orig/node_modules/realize-package-specifier/0000755000000000000000000000000012631326456021142 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/0000755000000000000000000000000012631326456015637 5ustar 00000000000000npm_3.5.2.orig/node_modules/retry/0000755000000000000000000000000012631326456015314 5ustar 00000000000000npm_3.5.2.orig/node_modules/rimraf/0000755000000000000000000000000012631326456015427 5ustar 00000000000000npm_3.5.2.orig/node_modules/semver/0000755000000000000000000000000012631326456015450 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/0000755000000000000000000000000012631326456014722 5ustar 00000000000000npm_3.5.2.orig/node_modules/slide/0000755000000000000000000000000012631326456015247 5ustar 00000000000000npm_3.5.2.orig/node_modules/sorted-object/0000755000000000000000000000000012631326456016713 5ustar 00000000000000npm_3.5.2.orig/node_modules/strip-ansi/0000755000000000000000000000000012631326456016240 5ustar 00000000000000npm_3.5.2.orig/node_modules/tar/0000755000000000000000000000000012631326456014735 5ustar 00000000000000npm_3.5.2.orig/node_modules/text-table/0000755000000000000000000000000012631326456016220 5ustar 00000000000000npm_3.5.2.orig/node_modules/uid-number/0000755000000000000000000000000012631326456016216 5ustar 00000000000000npm_3.5.2.orig/node_modules/umask/0000755000000000000000000000000012631326456015267 5ustar 00000000000000npm_3.5.2.orig/node_modules/unique-filename/0000755000000000000000000000000012631326456017233 5ustar 00000000000000npm_3.5.2.orig/node_modules/unpipe/0000755000000000000000000000000012631326456015447 5ustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/0000755000000000000000000000000012631326456021541 5ustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-name/0000755000000000000000000000000012631326456021037 5ustar 00000000000000npm_3.5.2.orig/node_modules/which/0000755000000000000000000000000012631326456015251 5ustar 00000000000000npm_3.5.2.orig/node_modules/wrappy/0000755000000000000000000000000012631326456015471 5ustar 00000000000000npm_3.5.2.orig/node_modules/write-file-atomic/0000755000000000000000000000000012631326456017470 5ustar 00000000000000npm_3.5.2.orig/node_modules/.bin/mkdirp0000755000000000000000000000000012631326456021551 2../mkdirp/bin/cmd.jsustar 
00000000000000npm_3.5.2.orig/node_modules/.bin/node-gyp0000755000000000000000000000000012631326456023220 2../node-gyp/bin/node-gyp.jsustar 00000000000000npm_3.5.2.orig/node_modules/.bin/nopt0000755000000000000000000000000012631326456021152 2../nopt/bin/nopt.jsustar 00000000000000npm_3.5.2.orig/node_modules/.bin/opener0000755000000000000000000000000012631326456021532 2../opener/opener.jsustar 00000000000000npm_3.5.2.orig/node_modules/.bin/rimraf0000755000000000000000000000000012631326456020772 2../rimraf/bin.jsustar 00000000000000npm_3.5.2.orig/node_modules/.bin/semver0000755000000000000000000000000012631326456021722 2../semver/bin/semverustar 00000000000000npm_3.5.2.orig/node_modules/.bin/which0000755000000000000000000000000012631326456021125 2../which/bin/whichustar 00000000000000npm_3.5.2.orig/node_modules/abbrev/.npmignore0000644000000000000000000000005512631326456017407 0ustar 00000000000000.nyc_output nyc_output node_modules coverage npm_3.5.2.orig/node_modules/abbrev/.travis.yml0000644000000000000000000000007412631326456017522 0ustar 00000000000000language: node_js node_js: - '0.10' - '0.12' - 'iojs' npm_3.5.2.orig/node_modules/abbrev/CONTRIBUTING.md0000644000000000000000000000017312631326456017642 0ustar 00000000000000 To get started, sign the Contributor License Agreement. npm_3.5.2.orig/node_modules/abbrev/LICENSE0000644000000000000000000000137512631326456016423 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/abbrev/README.md0000644000000000000000000000076312631326456016675 0ustar 00000000000000# abbrev-js Just like [ruby's Abbrev](http://apidock.com/ruby/Abbrev). Usage: var abbrev = require("abbrev"); abbrev("foo", "fool", "folding", "flop"); // returns: { fl: 'flop' , flo: 'flop' , flop: 'flop' , fol: 'folding' , fold: 'folding' , foldi: 'folding' , foldin: 'folding' , folding: 'folding' , foo: 'foo' , fool: 'fool' } This is handy for command-line scripts, or other cases where you want to be able to accept shorthands. npm_3.5.2.orig/node_modules/abbrev/abbrev.js0000644000000000000000000000334412631326456017213 0ustar 00000000000000 module.exports = exports = abbrev.abbrev = abbrev abbrev.monkeyPatch = monkeyPatch function monkeyPatch () { Object.defineProperty(Array.prototype, 'abbrev', { value: function () { return abbrev(this) }, enumerable: false, configurable: true, writable: true }) Object.defineProperty(Object.prototype, 'abbrev', { value: function () { return abbrev(Object.keys(this)) }, enumerable: false, configurable: true, writable: true }) } function abbrev (list) { if (arguments.length !== 1 || !Array.isArray(list)) { list = Array.prototype.slice.call(arguments, 0) } for (var i = 0, l = list.length, args = [] ; i < l ; i ++) { args[i] = typeof list[i] === "string" ? 
list[i] : String(list[i]) } // sort them lexicographically, so that they're next to their nearest kin args = args.sort(lexSort) // walk through each, seeing how much it has in common with the next and previous var abbrevs = {} , prev = "" for (var i = 0, l = args.length ; i < l ; i ++) { var current = args[i] , next = args[i + 1] || "" , nextMatches = true , prevMatches = true if (current === next) continue for (var j = 0, cl = current.length ; j < cl ; j ++) { var curChar = current.charAt(j) nextMatches = nextMatches && curChar === next.charAt(j) prevMatches = prevMatches && curChar === prev.charAt(j) if (!nextMatches && !prevMatches) { j ++ break } } prev = current if (j === cl) { abbrevs[current] = current continue } for (var a = current.substr(0, j) ; j <= cl ; j ++) { abbrevs[a] = current a += current.charAt(j) } } return abbrevs } function lexSort (a, b) { return a === b ? 0 : a > b ? 1 : -1 } npm_3.5.2.orig/node_modules/abbrev/package.json0000644000000000000000000000245612631326456017705 0ustar 00000000000000{ "name": "abbrev", "version": "1.0.7", "description": "Like ruby's abbrev module, but in js", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me" }, "main": "abbrev.js", "scripts": { "test": "tap test.js --cov" }, "repository": { "type": "git", "url": "git+ssh://git@github.com/isaacs/abbrev-js.git" }, "license": "ISC", "devDependencies": { "tap": "^1.2.0" }, "readme": "# abbrev-js\n\nJust like [ruby's Abbrev](http://apidock.com/ruby/Abbrev).\n\nUsage:\n\n var abbrev = require(\"abbrev\");\n abbrev(\"foo\", \"fool\", \"folding\", \"flop\");\n \n // returns:\n { fl: 'flop'\n , flo: 'flop'\n , flop: 'flop'\n , fol: 'folding'\n , fold: 'folding'\n , foldi: 'folding'\n , foldin: 'folding'\n , folding: 'folding'\n , foo: 'foo'\n , fool: 'fool'\n }\n\nThis is handy for command-line scripts, or other cases where you want to be able to accept shorthands.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/abbrev-js/issues" }, "homepage": "https://github.com/isaacs/abbrev-js#readme", "_id": "abbrev@1.0.7", "_shasum": "5b6035b2ee9d4fb5cf859f08a9be81b208491843", "_resolved": "https://registry.npmjs.org/abbrev/-/abbrev-1.0.7.tgz", "_from": "abbrev@>=1.0.7 <1.1.0" } npm_3.5.2.orig/node_modules/abbrev/test.js0000644000000000000000000000205512631326456016727 0ustar 00000000000000var abbrev = require('./abbrev.js') var assert = require("assert") var util = require("util") console.log("TAP version 13") var count = 0 function test (list, expect) { count++ var actual = abbrev(list) assert.deepEqual(actual, expect, "abbrev("+util.inspect(list)+") === " + util.inspect(expect) + "\n"+ "actual: "+util.inspect(actual)) actual = abbrev.apply(exports, list) assert.deepEqual(abbrev.apply(exports, list), expect, "abbrev("+list.map(JSON.stringify).join(",")+") === " + util.inspect(expect) + "\n"+ "actual: "+util.inspect(actual)) console.log('ok - ' + list.join(' ')) } test([ "ruby", "ruby", "rules", "rules", "rules" ], { rub: 'ruby' , ruby: 'ruby' , rul: 'rules' , rule: 'rules' , rules: 'rules' }) test(["fool", "foom", "pool", "pope"], { fool: 'fool' , foom: 'foom' , poo: 'pool' , pool: 'pool' , pop: 'pope' , pope: 'pope' }) test(["a", "ab", "abc", "abcd", "abcde", "acde"], { a: 'a' , ab: 'ab' , abc: 'abc' , abcd: 'abcd' , abcde: 'abcde' , ac: 'acde' , acd: 'acde' , acde: 'acde' }) console.log("1..%d", count) npm_3.5.2.orig/node_modules/ansi-regex/index.js0000644000000000000000000000020712631326456017655 0ustar 00000000000000'use strict'; module.exports = function () { 
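	// Matches an ESC (\u001b) or single-byte CSI (\u009b) introducer, optional
	// intermediate/parameter bytes ("[", "(", digits, ";" and friends), then a final byte.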
return /[\u001b\u009b][[()#;?]*(?:[0-9]{1,4}(?:;[0-9]{0,4})*)?[0-9A-ORZcf-nqry=><]/g; }; npm_3.5.2.orig/node_modules/ansi-regex/license0000644000000000000000000000213712631326456017561 0ustar 00000000000000The MIT License (MIT) Copyright (c) Sindre Sorhus (sindresorhus.com) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/ansi-regex/package.json0000644000000000000000000000424612631326456020505 0ustar 00000000000000{ "name": "ansi-regex", "version": "2.0.0", "description": "Regular expression for matching ANSI escape codes", "license": "MIT", "repository": { "type": "git", "url": "git+https://github.com/sindresorhus/ansi-regex.git" }, "author": { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "sindresorhus.com" }, "maintainers": [ { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "sindresorhus.com" }, { "name": "Joshua Appelman", "email": "jappelman@xebia.com", "url": "jbnicolai.com" } ], "engines": { "node": ">=0.10.0" }, "scripts": { "test": "mocha test/test.js", "view-supported": "node test/viewCodes.js" }, "files": [ "index.js" ], "keywords": [ "ansi", "styles", "color", "colour", "colors", "terminal", "console", "cli", "string", "tty", "escape", "formatting", "rgb", "256", "shell", "xterm", "command-line", "text", "regex", "regexp", "re", "match", "test", "find", "pattern" ], "devDependencies": { "mocha": "*" }, "readme": "# ansi-regex [![Build Status](https://travis-ci.org/sindresorhus/ansi-regex.svg?branch=master)](https://travis-ci.org/sindresorhus/ansi-regex)\n\n> Regular expression for matching [ANSI escape codes](http://en.wikipedia.org/wiki/ANSI_escape_code)\n\n\n## Install\n\n```\n$ npm install --save ansi-regex\n```\n\n\n## Usage\n\n```js\nvar ansiRegex = require('ansi-regex');\n\nansiRegex().test('\\u001b[4mcake\\u001b[0m');\n//=> true\n\nansiRegex().test('cake');\n//=> false\n\n'\\u001b[4mcake\\u001b[0m'.match(ansiRegex());\n//=> ['\\u001b[4m', '\\u001b[0m']\n```\n\n\n## License\n\nMIT © [Sindre Sorhus](http://sindresorhus.com)\n", "readmeFilename": "readme.md", "bugs": { "url": "https://github.com/sindresorhus/ansi-regex/issues" }, "homepage": "https://github.com/sindresorhus/ansi-regex#readme", "_id": "ansi-regex@2.0.0", "_shasum": "c5061b6e0ef8a81775e50f5d66151bf6bf371107", "_resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-2.0.0.tgz", "_from": "ansi-regex@2.0.0" } npm_3.5.2.orig/node_modules/ansi-regex/readme.md0000644000000000000000000000112112631326456017763 0ustar 00000000000000# ansi-regex [![Build 
Status](https://travis-ci.org/sindresorhus/ansi-regex.svg?branch=master)](https://travis-ci.org/sindresorhus/ansi-regex) > Regular expression for matching [ANSI escape codes](http://en.wikipedia.org/wiki/ANSI_escape_code) ## Install ``` $ npm install --save ansi-regex ``` ## Usage ```js var ansiRegex = require('ansi-regex'); ansiRegex().test('\u001b[4mcake\u001b[0m'); //=> true ansiRegex().test('cake'); //=> false '\u001b[4mcake\u001b[0m'.match(ansiRegex()); //=> ['\u001b[4m', '\u001b[0m'] ``` ## License MIT © [Sindre Sorhus](http://sindresorhus.com) npm_3.5.2.orig/node_modules/ansicolors/LICENSE0000644000000000000000000000206612631326456017334 0ustar 00000000000000Copyright 2013 Thorsten Lorenz. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/ansicolors/README.md0000644000000000000000000000407312631326456017606 0ustar 00000000000000# ansicolors [![build status](https://secure.travis-ci.org/thlorenz/ansicolors.png)](http://next.travis-ci.org/thlorenz/ansicolors) Functions that surround a string with ansicolor codes so it prints in color. In case you need styles, like `bold`, have a look at [ansistyles](https://github.com/thlorenz/ansistyles). ## Installation npm install ansicolors ## Usage ```js var colors = require('ansicolors'); // foreground colors var redHerring = colors.red('herring'); var blueMoon = colors.blue('moon'); var brightBlueMoon = colors.brightBlue('moon'); console.log(redHerring); // this will print 'herring' in red console.log(blueMoon); // this 'moon' in blue console.log(brightBlueMoon); // I think you got the idea // background colors console.log(colors.bgYellow('printed on yellow background')); console.log(colors.bgBrightBlue('printed on bright blue background')); // mixing background and foreground colors // below two lines have same result (order in which bg and fg are combined doesn't matter) console.log(colors.bgYellow(colors.blue('printed on yellow background in blue'))); console.log(colors.blue(colors.bgYellow('printed on yellow background in blue'))); ``` ## Advanced API **ansicolors** allows you to access opening and closing escape sequences separately.
```js var colors = require('ansicolors'); function inspect(obj, depth) { return require('util').inspect(obj, false, depth || 5, true); } console.log('open blue', inspect(colors.open.blue)); console.log('close bgBlack', inspect(colors.close.bgBlack)); // => open blue '\u001b[34m' // close bgBlack '\u001b[49m' ``` ## Tests Look at the [tests](https://github.com/thlorenz/ansicolors/blob/master/test/ansicolors.js) to see more examples and/or run them via: npm explore ansicolors && npm test ## Alternatives **ansicolors** tries to meet simple use cases with a very simple API. However, if you need a more powerful ansi formatting tool, I'd suggest to look at the [features](https://github.com/TooTallNate/ansi.js#features) of the [ansi module](https://github.com/TooTallNate/ansi.js). npm_3.5.2.orig/node_modules/ansicolors/ansicolors.js0000644000000000000000000000303612631326456021037 0ustar 00000000000000// ColorCodes explained: http://www.termsys.demon.co.uk/vtansi.htm 'use strict'; var colorNums = { white : 37 , black : 30 , blue : 34 , cyan : 36 , green : 32 , magenta : 35 , red : 31 , yellow : 33 , brightBlack : 90 , brightRed : 91 , brightGreen : 92 , brightYellow : 93 , brightBlue : 94 , brightMagenta : 95 , brightCyan : 96 , brightWhite : 97 } , backgroundColorNums = { bgBlack : 40 , bgRed : 41 , bgGreen : 42 , bgYellow : 43 , bgBlue : 44 , bgMagenta : 45 , bgCyan : 46 , bgWhite : 47 , bgBrightBlack : 100 , bgBrightRed : 101 , bgBrightGreen : 102 , bgBrightYellow : 103 , bgBrightBlue : 104 , bgBrightMagenta : 105 , bgBrightCyan : 106 , bgBrightWhite : 107 } , open = {} , close = {} , colors = {} ; Object.keys(colorNums).forEach(function (k) { var o = open[k] = '\u001b[' + colorNums[k] + 'm'; var c = close[k] = '\u001b[39m'; colors[k] = function (s) { return o + s + c; }; }); Object.keys(backgroundColorNums).forEach(function (k) { var o = open[k] = '\u001b[' + backgroundColorNums[k] + 'm'; var c = close[k] = '\u001b[49m'; colors[k] = function (s) { return o + s + c; }; }); module.exports = colors; colors.open = open; colors.close = close; npm_3.5.2.orig/node_modules/ansicolors/package.json0000644000000000000000000000657412631326456020625 0ustar 00000000000000{ "name": "ansicolors", "version": "0.3.2", "description": "Functions that surround a string with ansicolor codes so it prints in color.", "main": "ansicolors.js", "scripts": { "test": "node test/*.js" }, "repository": { "type": "git", "url": "git://github.com/thlorenz/ansicolors.git" }, "keywords": [ "ansi", "colors", "highlight", "string" ], "author": { "name": "Thorsten Lorenz", "email": "thlorenz@gmx.de", "url": "thlorenz.com" }, "license": "MIT", "readmeFilename": "README.md", "gitHead": "858847ca28e8b360d9b70eee0592700fa2ab087d", "readme": "# ansicolors [![build status](https://secure.travis-ci.org/thlorenz/ansicolors.png)](http://next.travis-ci.org/thlorenz/ansicolors)\n\nFunctions that surround a string with ansicolor codes so it prints in color.\n\nIn case you need styles, like `bold`, have a look at [ansistyles](https://github.com/thlorenz/ansistyles).\n\n## Installation\n\n npm install ansicolors\n\n## Usage\n\n```js\nvar colors = require('ansicolors');\n\n// foreground colors\nvar redHerring = colors.red('herring');\nvar blueMoon = colors.blue('moon');\nvar brighBlueMoon = colors.brightBlue('moon');\n\nconsole.log(redHerring); // this will print 'herring' in red\nconsole.log(blueMoon); // this 'moon' in blue\nconsole.log(brightBlueMoon); // I think you got the idea\n\n// background colors\nconsole.log(colors.bgYellow('printed 
on yellow background'));\nconsole.log(colors.bgBrightBlue('printed on bright blue background'));\n\n// mixing background and foreground colors\n// below two lines have same result (order in which bg and fg are combined doesn't matter)\nconsole.log(colors.bgYellow(colors.blue('printed on yellow background in blue')));\nconsole.log(colors.blue(colors.bgYellow('printed on yellow background in blue')));\n```\n\n## Advanced API\n\n**ansicolors** allows you to access opening and closing escape sequences separately.\n\n```js\nvar colors = require('ansicolors');\n\nfunction inspect(obj, depth) {\n return require('util').inspect(obj, false, depth || 5, true);\n}\n\nconsole.log('open blue', inspect(colors.open.blue));\nconsole.log('close bgBlack', inspect(colors.close.bgBlack));\n\n// => open blue '\\u001b[34m'\n// close bgBlack '\\u001b[49m'\n```\n\n## Tests\n\nLook at the [tests](https://github.com/thlorenz/ansicolors/blob/master/test/ansicolors.js) to see more examples and/or run them via: \n\n npm explore ansicolors && npm test\n\n## Alternatives\n\n**ansicolors** tries to meet simple use cases with a very simple API. However, if you need a more powerful ansi formatting tool, \nI'd suggest to look at the [features](https://github.com/TooTallNate/ansi.js#features) of the [ansi module](https://github.com/TooTallNate/ansi.js).\n", "bugs": { "url": "https://github.com/thlorenz/ansicolors/issues" }, "_id": "ansicolors@0.3.2", "dist": { "shasum": "665597de86a9ffe3aa9bfbe6cae5c6ea426b4979", "tarball": "http://registry.npmjs.org/ansicolors/-/ansicolors-0.3.2.tgz" }, "_from": "ansicolors@>=0.3.2 <0.4.0", "_npmVersion": "1.3.11", "_npmUser": { "name": "thlorenz", "email": "thlorenz@gmx.de" }, "maintainers": [ { "name": "thlorenz", "email": "thlorenz@gmx.de" } ], "directories": {}, "_shasum": "665597de86a9ffe3aa9bfbe6cae5c6ea426b4979", "_resolved": "https://registry.npmjs.org/ansicolors/-/ansicolors-0.3.2.tgz" } npm_3.5.2.orig/node_modules/ansicolors/test/0000755000000000000000000000000012631326456017302 5ustar 00000000000000npm_3.5.2.orig/node_modules/ansicolors/test/ansicolors.js0000644000000000000000000000414112631326456022014 0ustar 00000000000000'use strict'; var assert = require('assert') , colors = require('..') , open = colors.open , close = colors.close console.log('Foreground colors ..'); assert.equal(colors.white('printed in white'), '\u001b[37mprinted in white\u001b[39m'); assert.equal(colors.black('printed in black'), '\u001b[30mprinted in black\u001b[39m'); assert.equal(colors.brightBlack('printed in bright black'), '\u001b[90mprinted in bright black\u001b[39m'); assert.equal(colors.green('printed in green'), '\u001b[32mprinted in green\u001b[39m'); assert.equal(colors.brightGreen('printed in bright green'), '\u001b[92mprinted in bright green\u001b[39m'); assert.equal(colors.red('printed in red'), '\u001b[31mprinted in red\u001b[39m'); assert.equal(colors.brightRed('printed in bright red'), '\u001b[91mprinted in bright red\u001b[39m'); console.log('OK'); console.log('Background colors ..'); assert.equal( colors.bgBlack('printed with black background') , '\u001b[40mprinted with black background\u001b[49m' ); assert.equal( colors.bgYellow('printed with yellow background') , '\u001b[43mprinted with yellow background\u001b[49m' ); assert.equal( colors.bgBrightYellow('printed with bright yellow background') , '\u001b[103mprinted with bright yellow background\u001b[49m' ); assert.equal( colors.bgWhite('printed with white background') , '\u001b[47mprinted with white background\u001b[49m' ); 
console.log('OK'); console.log('Mixing background and foreground colors ..'); assert.equal( colors.blue(colors.bgYellow('printed in blue with yellow background')) , '\u001b[34m\u001b[43mprinted in blue with yellow background\u001b[49m\u001b[39m' ); assert.equal( colors.bgYellow(colors.blue('printed in blue with yellow background again')) , '\u001b[43m\u001b[34mprinted in blue with yellow background again\u001b[39m\u001b[49m' ); console.log('OK'); console.log('Open ...'); assert.equal(open.black, '\u001b[30m'); assert.equal(open.bgYellow, '\u001b[43m'); console.log('OK'); console.log('Close ...'); assert.equal(close.black, '\u001b[39m'); assert.equal(close.bgYellow, '\u001b[49m'); console.log('OK'); npm_3.5.2.orig/node_modules/ansistyles/LICENSE0000644000000000000000000000206612631326456017356 0ustar 00000000000000Copyright 2013 Thorsten Lorenz. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/ansistyles/README.md0000644000000000000000000000420112631326456017621 0ustar 00000000000000# ansistyles [![build status](https://secure.travis-ci.org/thlorenz/ansistyles.png)](http://next.travis-ci.org/thlorenz/ansistyles) Functions that surround a string with ansistyle codes so it prints in style. In case you need colors, like `red`, have a look at [ansicolors](https://github.com/thlorenz/ansicolors). ## Installation npm install ansistyles ## Usage ```js var styles = require('ansistyles'); console.log(styles.bright('hello world')); // prints hello world in 'bright' white console.log(styles.underline('hello world')); // prints hello world underlined console.log(styles.inverse('hello world')); // prints hello world black on white ``` ## Combining with ansicolors Get the ansicolors module: npm install ansicolors ```js var styles = require('ansistyles') , colors = require('ansicolors'); console.log( // prints hello world underlined in blue on a green background colors.bgGreen(colors.blue(styles.underline('hello world'))) ); ``` ## Tests Look at the [tests](https://github.com/thlorenz/ansistyles/blob/master/test/ansistyles.js) to see more examples and/or run them via: npm explore ansistyles && npm test ## More Styles As you can see from [here](https://github.com/thlorenz/ansistyles/blob/master/ansistyles.js#L4-L15), more styles are available, but didn't have any effect on the terminals that I tested on Mac Lion and Ubuntu Linux. I included them for completeness, but didn't show them in the examples because they seem to have no effect. 
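A quick way to see this for yourself is to print a sample in every exported style. The following is a minimal sketch; the style names are exactly the keys of the `styleNums` table in `ansistyles.js` below:

```js
var styles = require('ansistyles');

// print one sample line per style so you can check which ones
// your own terminal actually renders
['reset', 'bright', 'dim', 'italic', 'underline', 'blink', 'inverse']
  .forEach(function (name) {
    console.log(name + ': ' + styles[name]('sample text'));
  });
```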
### reset A style reset function is also included, please note however that this is not nestable. Therefore the example below underlines `hell` only, but not `world`. ```js console.log(styles.underline('hell' + styles.reset('o') + ' world')); ``` It is essentially the same as: ```js console.log(styles.underline('hell') + styles.reset('') + 'o world'); ``` ## Alternatives **ansistyles** tries to meet simple use cases with a very simple API. However, if you need a more powerful ansi formatting tool, I'd suggest to look at the [features](https://github.com/TooTallNate/ansi.js#features) of the [ansi module](https://github.com/TooTallNate/ansi.js). npm_3.5.2.orig/node_modules/ansistyles/ansistyles.js0000644000000000000000000000172212631326456021103 0ustar 00000000000000'use strict'; /* * Info: http://www.termsys.demon.co.uk/vtansi.htm#colors * Following caveats * bright - brightens the color (bold-blue is same as brightBlue) * dim - nothing on Mac or Linux * italic - nothing on Mac or Linux * underline - underlines string * blink - nothing on Mac or Linux * inverse - background becomes foreground and vice versa * * In summary, the only styles that work are: * - bright, underline and inverse * - the others are only included for completeness */ var styleNums = { reset : [0, 22] , bright : [1, 22] , dim : [2, 22] , italic : [3, 23] , underline : [4, 24] , blink : [5, 25] , inverse : [7, 27] } , styles = {} ; Object.keys(styleNums).forEach(function (k) { styles[k] = function (s) { var open = styleNums[k][0] , close = styleNums[k][1]; return '\u001b[' + open + 'm' + s + '\u001b[' + close + 'm'; }; }); module.exports = styles; npm_3.5.2.orig/node_modules/ansistyles/package.json0000644000000000000000000000672112631326456020641 0ustar 00000000000000{ "name": "ansistyles", "version": "0.1.3", "description": "Functions that surround a string with ansistyle codes so it prints in style.", "main": "ansistyles.js", "scripts": { "test": "node test/ansistyles.js" }, "repository": { "type": "git", "url": "git://github.com/thlorenz/ansistyles.git" }, "keywords": [ "ansi", "style", "terminal", "console" ], "author": { "name": "Thorsten Lorenz", "email": "thlorenz@gmx.de", "url": "thlorenz.com" }, "license": "MIT", "readmeFilename": "README.md", "gitHead": "27bf1bc65231bcc7fd109bf13b13601b51f8cd04", "readme": "# ansistyles [![build status](https://secure.travis-ci.org/thlorenz/ansistyles.png)](http://next.travis-ci.org/thlorenz/ansistyles)\n\nFunctions that surround a string with ansistyle codes so it prints in style.\n\nIn case you need colors, like `red`, have a look at [ansicolors](https://github.com/thlorenz/ansicolors).\n\n## Installation\n\n npm install ansistyles\n\n## Usage\n\n```js\nvar styles = require('ansistyles');\n\nconsole.log(styles.bright('hello world')); // prints hello world in 'bright' white\nconsole.log(styles.underline('hello world')); // prints hello world underlined\nconsole.log(styles.inverse('hello world')); // prints hello world black on white\n```\n\n## Combining with ansicolors\n\nGet the ansicolors module:\n\n npm install ansicolors\n\n```js\nvar styles = require('ansistyles')\n , colors = require('ansicolors');\n\n console.log(\n // prints hello world underlined in blue on a green background\n colors.bgGreen(colors.blue(styles.underline('hello world'))) \n );\n```\n\n## Tests\n\nLook at the [tests](https://github.com/thlorenz/ansistyles/blob/master/test/ansistyles.js) to see more examples and/or run them via: \n\n npm explore ansistyles && npm test\n\n## More Styles\n\nAs you can see 
from [here](https://github.com/thlorenz/ansistyles/blob/master/ansistyles.js#L4-L15), more styles are available,\nbut didn't have any effect on the terminals that I tested on Mac Lion and Ubuntu Linux.\n\nI included them for completeness, but didn't show them in the examples because they seem to have no effect.\n\n### reset\n\nA style reset function is also included, please note however that this is not nestable.\n\nTherefore the below only underlines `hell` only, but not `world`.\n\n```js\nconsole.log(styles.underline('hell' + styles.reset('o') + ' world'));\n```\n\nIt is essentially the same as:\n\n```js\nconsole.log(styles.underline('hell') + styles.reset('') + 'o world');\n```\n\n\n\n## Alternatives\n\n**ansistyles** tries to meet simple use cases with a very simple API. However, if you need a more powerful ansi formatting tool, \nI'd suggest to look at the [features](https://github.com/TooTallNate/ansi.js#features) of the [ansi module](https://github.com/TooTallNate/ansi.js).\n", "bugs": { "url": "https://github.com/thlorenz/ansistyles/issues" }, "_id": "ansistyles@0.1.3", "dist": { "shasum": "5de60415bda071bb37127854c864f41b23254539", "tarball": "http://registry.npmjs.org/ansistyles/-/ansistyles-0.1.3.tgz" }, "_from": "ansistyles@>=0.1.3 <0.2.0", "_npmVersion": "1.3.11", "_npmUser": { "name": "thlorenz", "email": "thlorenz@gmx.de" }, "maintainers": [ { "name": "thlorenz", "email": "thlorenz@gmx.de" } ], "directories": {}, "_shasum": "5de60415bda071bb37127854c864f41b23254539", "_resolved": "https://registry.npmjs.org/ansistyles/-/ansistyles-0.1.3.tgz" } npm_3.5.2.orig/node_modules/ansistyles/test/0000755000000000000000000000000012631326456017324 5ustar 00000000000000npm_3.5.2.orig/node_modules/ansistyles/test/ansistyles.js0000644000000000000000000000104012631326456022053 0ustar 00000000000000'use strict'; /*jshint asi: true */ var assert = require('assert') , styles = require('../') function inspect(obj, depth) { console.log(require('util').inspect(obj, false, depth || 5, true)); } assert.equal(styles.reset('reset'), '\u001b[0mreset\u001b[22m', 'reset') assert.equal(styles.underline('underlined'), '\u001b[4munderlined\u001b[24m', 'underline') assert.equal(styles.bright('bright'), '\u001b[1mbright\u001b[22m', 'bright') assert.equal(styles.inverse('inversed'), '\u001b[7minversed\u001b[27m', 'inverse') console.log('OK'); npm_3.5.2.orig/node_modules/aproba/.npmignore0000644000000000000000000000002412631326456017406 0ustar 00000000000000*~ node_modules .#* npm_3.5.2.orig/node_modules/aproba/LICENSE0000644000000000000000000000136012631326456016420 0ustar 00000000000000Copyright (c) 2015, Rebecca Turner Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
npm_3.5.2.orig/node_modules/aproba/README.md0000644000000000000000000000274512631326456016676 0ustar 00000000000000aproba ====== A ridiculously light-weight function argument validator ``` var validate = require("aproba") function myfunc(a, b, c) { // `a` must be a string, `b` a number, `c` a function validate('SNF', arguments) // [a,b,c] is also valid } myfunc('test', 23, function () {}) // ok myfunc(123, 23, function () {}) // type error myfunc('test', 23) // missing arg error myfunc('test', 23, function () {}, true) // too many args error ``` Valid types are: type | description ---- | ----------- * | matches any type A | instanceof Array OR an arguments object S | typeof == string N | typeof == number F | typeof == function O | typeof == object and not type A and not type E B | typeof == boolean E | instanceof Error OR null Validation failures throw one of three exception types, distinguished by a `code` property of `EMISSINGARG`, `EINVALIDTYPE` or `ETOOMANYARGS` (an unknown type character in the schema itself throws a fourth, `EUNKNOWNTYPE`). If an error argument is found and is not null then the remaining arguments will not be validated. ### Why this exists I wanted a very simple argument validator. It needed to do two things: 1. Be more concise and easier to use than assertions 2. Not encourage an infinite bikeshed of DSLs This is why types are specified by a single character and there's no such thing as an optional argument. This is not intended to validate user data. This is specifically about asserting the interface of your functions. If you need greater validation, I encourage you to write the checks by hand or look elsewhere. npm_3.5.2.orig/node_modules/aproba/index.js0000644000000000000000000000412012631326456017055 0ustar 00000000000000"use strict" var types = { "*": ["any", function () { return true }], A: ["array", function (thingy) { return thingy instanceof Array || (typeof thingy === "object" && thingy.hasOwnProperty("callee")) }], S: ["string", function (thingy) { return typeof thingy === "string" }], N: ["number", function (thingy) { return typeof thingy === "number" }], F: ["function", function (thingy) { return typeof thingy === "function" }], O: ["object", function (thingy) { return typeof thingy === "object" && !types.A[1](thingy) && !types.E[1](thingy) }], B: ["boolean", function (thingy) { return typeof thingy == "boolean" }], E: ["error", function (thingy) { return thingy instanceof Error }] } var validate = module.exports = function (schema, args) { if (!schema) throw missingRequiredArg(0, "schema") if (!args) throw missingRequiredArg(1, "args") if (!types.S[1](schema)) throw invalidType(0, "string", schema) if (!types.A[1](args)) throw invalidType(1, "array", args) for (var ii = 0; ii < schema.length; ++ii) { var type = schema[ii] if (!types[type]) throw unknownType(ii, type) var typeLabel = types[type][0] var typeCheck = types[type][1] if (type === "E" && args[ii] == null) continue if (args[ii] == null) throw missingRequiredArg(ii) if (!typeCheck(args[ii])) throw invalidType(ii, typeLabel, args[ii]) if (type === "E") return } if (schema.length < args.length) throw tooManyArgs(schema.length, args.length) } function missingRequiredArg(num) { return newException("EMISSINGARG", "Missing required argument #"+(num+1)) } function unknownType(num, type) { return newException("EUNKNOWNTYPE", "Unknown type "+type+" in argument #"+(num+1)) } function invalidType(num, type, value) { return newException("EINVALIDTYPE", "Argument #"+(num+1)+": Expected "+type+" but got "+typeof value) } function tooManyArgs(expected, got) { return newException("ETOOMANYARGS", 
"Too many arguments, expected "+expected+" and got "+got) } function newException(code, msg) { var e = new Error(msg) e.code = code Error.captureStackTrace(e, validate) return e } npm_3.5.2.orig/node_modules/aproba/package.json0000644000000000000000000000235412631326456017705 0ustar 00000000000000{ "name": "aproba", "version": "1.0.1", "description": "A rediculously light-weight argument validator", "main": "index.js", "directories": { "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "^0.7.0" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "https://github.com/iarna/aproba" }, "keywords": [ "argument", "validate" ], "author": { "name": "Rebecca Turner", "email": "me@re-becca.org" }, "license": "ISC", "bugs": { "url": "https://github.com/iarna/aproba/issues" }, "homepage": "https://github.com/iarna/aproba", "gitHead": "a2ea029793a14cddb9457afd0a83dc421889c7ad", "_id": "aproba@1.0.1", "_shasum": "c4ac2cc5becfb8b099de7ef9f02790e7d32d99ef", "_from": "aproba@>=1.0.1 <1.1.0", "_npmVersion": "2.7.5", "_nodeVersion": "1.6.2", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" } ], "dist": { "shasum": "c4ac2cc5becfb8b099de7ef9f02790e7d32d99ef", "tarball": "http://registry.npmjs.org/aproba/-/aproba-1.0.1.tgz" }, "_resolved": "https://registry.npmjs.org/aproba/-/aproba-1.0.1.tgz" } npm_3.5.2.orig/node_modules/aproba/test/0000755000000000000000000000000012631326456016372 5ustar 00000000000000npm_3.5.2.orig/node_modules/aproba/test/index.js0000644000000000000000000000402312631326456020036 0ustar 00000000000000"use strict" var test = require("tap").test var validate = require("../index.js") function thrown (t, code, msg, todo) { validate("OSSF", arguments) try { todo() t.fail(msg) } catch (e) { t.is(e.code, code, msg + e.message) } } function notThrown (t, msg, todo) { validate("OSF", arguments) try { todo() t.pass(msg) } catch (e) { t.fail(msg+"\n"+e.stack) } } test("general", function (t) { t.plan(69) var values = { "A": [], "S": "test", "N": 123, "F": function () {}, "O": {}, "B": false, "E": new Error() } Object.keys(values).forEach(function (type) { Object.keys(values).forEach(function (contraType) { if (type === contraType) { notThrown(t, type + " matches " + contraType, function () { validate(type, [values[contraType]]) }) } else { thrown(t, "EINVALIDTYPE", type + " does not match " + contraType, function () { validate(type, [values[contraType]]) }) } }) if (type === "E") { notThrown(t, "null is ok for E", function () { validate(type, [null]) }) } else { thrown(t, "EMISSINGARG", "null not ok for "+type, function () { validate(type, [null]) }) } }) Object.keys(values).forEach(function (contraType) { notThrown(t, "* matches " + contraType, function () { validate("*", [values[contraType]]) }) }) thrown(t, "EMISSINGARG", "not enough args", function () { validate("SNF", ["abc", 123]) }) thrown(t, "ETOOMANYARGS", "too many args", function () { validate("SNF", ["abc", 123, function () {}, true]) }) notThrown(t, "E matches null", function () { validate("E", [null]) }) notThrown(t, "E matches undefined", function () { validate("E", [undefined]) }) notThrown(t, "E w/ error requires nothing else", function () { validate("ESN", [new Error(), "foo"]) }) thrown(t, "EMISSINGARG", "E w/o error works as usual", function () { validate("ESN", [null, "foo"]) }) }) npm_3.5.2.orig/node_modules/archy/.travis.yml0000644000000000000000000000005312631326456017364 0ustar 00000000000000language: node_js 
node_js: - 0.6 - 0.8 npm_3.5.2.orig/node_modules/archy/LICENSE0000644000000000000000000000206112631326456016261 0ustar 00000000000000This software is released under the MIT license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/archy/README.markdown0000644000000000000000000000334112631326456017757 0ustar 00000000000000# archy Render nested hierarchies `npm ls` style with unicode pipes. [![browser support](http://ci.testling.com/substack/node-archy.png)](http://ci.testling.com/substack/node-archy) [![build status](https://secure.travis-ci.org/substack/node-archy.png)](http://travis-ci.org/substack/node-archy) # example ``` js var archy = require('archy'); var s = archy({ label : 'beep', nodes : [ 'ity', { label : 'boop', nodes : [ { label : 'o_O', nodes : [ { label : 'oh', nodes : [ 'hello', 'puny' ] }, 'human' ] }, 'party\ntime!' ] } ] }); console.log(s); ``` output ``` beep ├── ity └─┬ boop ├─┬ o_O │ ├─┬ oh │ │ ├── hello │ │ └── puny │ └── human └── party time! ``` # methods var archy = require('archy') ## archy(obj, prefix='', opts={}) Return a string representation of `obj` with unicode pipe characters like how `npm ls` looks. `obj` should be a tree of nested objects with `'label'` and `'nodes'` fields. `'label'` is a string of text to display at a node level and `'nodes'` is an array of the descendents of the current node. If a node is a string, that string will be used as the `'label'` and an empty array of `'nodes'` will be used. `prefix` gets prepended to all the lines and is used by the algorithm to recursively update. If `'label'` has newlines they will be indented at the present indentation level with the current prefix. To disable unicode results in favor of all-ansi output set `opts.unicode` to `false`. # install With [npm](http://npmjs.org) do: ``` npm install archy ``` # license MIT npm_3.5.2.orig/node_modules/archy/examples/0000755000000000000000000000000012631326456017073 5ustar 00000000000000npm_3.5.2.orig/node_modules/archy/index.js0000644000000000000000000000216412631326456016725 0ustar 00000000000000module.exports = function archy (obj, prefix, opts) { if (prefix === undefined) prefix = ''; if (!opts) opts = {}; var chr = function (s) { var chars = { '│' : '|', '└' : '`', '├' : '+', '─' : '-', '┬' : '-' }; return opts.unicode === false ? chars[s] : s; }; if (typeof obj === 'string') obj = { label : obj }; var nodes = obj.nodes || []; var lines = (obj.label || '').split('\n'); var splitter = '\n' + prefix + (nodes.length ? 
chr('│') : ' ') + ' '; return prefix + lines.join(splitter) + '\n' + nodes.map(function (node, ix) { var last = ix === nodes.length - 1; var more = node.nodes && node.nodes.length; var prefix_ = prefix + (last ? ' ' : chr('│')) + ' '; return prefix + (last ? chr('└') : chr('├')) + chr('─') + (more ? chr('┬') : chr('─')) + ' ' + archy(node, prefix_, opts).slice(prefix.length + 2) ; }).join('') ; }; npm_3.5.2.orig/node_modules/archy/package.json0000644000000000000000000000326412631326456017550 0ustar 00000000000000{ "name": "archy", "version": "1.0.0", "description": "render nested hierarchies `npm ls` style with unicode pipes", "main": "index.js", "devDependencies": { "tap": "~0.3.3", "tape": "~0.1.1" }, "scripts": { "test": "tap test" }, "testling": { "files": "test/*.js", "browsers": { "iexplore": [ "6.0", "7.0", "8.0", "9.0" ], "chrome": [ "20.0" ], "firefox": [ "10.0", "15.0" ], "safari": [ "5.1" ], "opera": [ "12.0" ] } }, "repository": { "type": "git", "url": "git+ssh://git@github.com/substack/node-archy.git" }, "keywords": [ "hierarchy", "npm ls", "unicode", "pretty", "print" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "gitHead": "30223c16191e877bf027b15b12daf077b9b55b84", "bugs": { "url": "https://github.com/substack/node-archy/issues" }, "homepage": "https://github.com/substack/node-archy", "_id": "archy@1.0.0", "_shasum": "f9c8c13757cc1dd7bc379ac77b2c62a5c2868c40", "_from": "archy@>=1.0.0 <1.1.0", "_npmVersion": "1.4.25", "_npmUser": { "name": "substack", "email": "mail@substack.net" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "dist": { "shasum": "f9c8c13757cc1dd7bc379ac77b2c62a5c2868c40", "tarball": "http://registry.npmjs.org/archy/-/archy-1.0.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/archy/-/archy-1.0.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/archy/test/0000755000000000000000000000000012631326456016234 5ustar 00000000000000npm_3.5.2.orig/node_modules/archy/examples/beep.js0000644000000000000000000000060312631326456020343 0ustar 00000000000000var archy = require('../'); var s = archy({ label : 'beep', nodes : [ 'ity', { label : 'boop', nodes : [ { label : 'o_O', nodes : [ { label : 'oh', nodes : [ 'hello', 'puny' ] }, 'human' ] }, 'party\ntime!' ] } ] }); console.log(s); npm_3.5.2.orig/node_modules/archy/examples/multi_line.js0000644000000000000000000000063612631326456021577 0ustar 00000000000000var archy = require('../'); var s = archy({ label : 'beep\none\ntwo', nodes : [ 'ity', { label : 'boop', nodes : [ { label : 'o_O\nwheee', nodes : [ { label : 'oh', nodes : [ 'hello', 'puny\nmeat' ] }, 'creature' ] }, 'party\ntime!' ] } ] }); console.log(s); npm_3.5.2.orig/node_modules/archy/test/beep.js0000644000000000000000000000150212631326456017503 0ustar 00000000000000var test = require('tape'); var archy = require('../'); test('beep', function (t) { var s = archy({ label : 'beep', nodes : [ 'ity', { label : 'boop', nodes : [ { label : 'o_O', nodes : [ { label : 'oh', nodes : [ 'hello', 'puny' ] }, 'human' ] }, 'party!' 
] } ] }); t.equal(s, [ 'beep', '├── ity', '└─┬ boop', ' ├─┬ o_O', ' │ ├─┬ oh', ' │ │ ├── hello', ' │ │ └── puny', ' │ └── human', ' └── party!', '' ].join('\n')); t.end(); }); npm_3.5.2.orig/node_modules/archy/test/multi_line.js0000644000000000000000000000174012631326456020735 0ustar 00000000000000var test = require('tape'); var archy = require('../'); test('multi-line', function (t) { var s = archy({ label : 'beep\none\ntwo', nodes : [ 'ity', { label : 'boop', nodes : [ { label : 'o_O\nwheee', nodes : [ { label : 'oh', nodes : [ 'hello', 'puny\nmeat' ] }, 'creature' ] }, 'party\ntime!' ] } ] }); t.equal(s, [ 'beep', '│ one', '│ two', '├── ity', '└─┬ boop', ' ├─┬ o_O', ' │ │ wheee', ' │ ├─┬ oh', ' │ │ ├── hello', ' │ │ └── puny', ' │ │ meat', ' │ └── creature', ' └── party', ' time!', '' ].join('\n')); t.end(); }); npm_3.5.2.orig/node_modules/archy/test/non_unicode.js0000644000000000000000000000143712631326456021077 0ustar 00000000000000var test = require('tape'); var archy = require('../'); test('beep', function (t) { var s = archy({ label : 'beep', nodes : [ 'ity', { label : 'boop', nodes : [ { label : 'o_O', nodes : [ { label : 'oh', nodes : [ 'hello', 'puny' ] }, 'human' ] }, 'party!' ] } ] }, '', { unicode : false }); t.equal(s, [ 'beep', '+-- ity', '`-- boop', ' +-- o_O', ' | +-- oh', ' | | +-- hello', ' | | `-- puny', ' | `-- human', ' `-- party!', '' ].join('\n')); t.end(); }); npm_3.5.2.orig/node_modules/async-some/.eslintrc0000644000000000000000000000064512631326456020056 0ustar 00000000000000{ "env" : { "node" : true }, "rules" : { "curly" : 0, "no-lonely-if" : 1, "no-mixed-requires" : 0, "no-underscore-dangle" : 0, "no-unused-vars" : [2, {"vars" : "all", "args" : "after-used"}], "no-use-before-define" : [2, "nofunc"], "quotes" : [1, "double", "avoid-escape"], "semi" : [2, "never"], "space-after-keywords" : 1, "space-infix-ops" : 0, "strict" : 0 } } npm_3.5.2.orig/node_modules/async-some/.npmignore0000644000000000000000000000001512631326456020220 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/async-some/LICENSE0000644000000000000000000000134512631326456017235 0ustar 00000000000000Copyright (c) 2014-2015, Forrest L Norvell Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/async-some/README.md0000644000000000000000000000412712631326456017510 0ustar 00000000000000# some Short-circuited async Array.prototype.some implementation. Serially evaluates a list of values from a JS array or arraylike against an asynchronous predicate, terminating on the first truthy value. If the predicate encounters an error, pass it to the completion callback. Otherwise, pass the truthy value passed by the predicate, or `false` if no truthy value was passed. Is [Zalgo](http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony)-proof, browser-safe, and pretty efficient. 
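Here is a minimal sketch of that short-circuit behaviour (a fuller, realistic predicate follows in the usage example below):

```javascript
var some = require("async-some");

// evaluation is serial and stops at the first truthy result,
// so "c" and "d" are never tested
some(["a", "b", "c", "d"], function (value, cb) {
  cb(null, value === "b" && value);
}, function (error, match) {
  if (error) return console.error(error);
  console.log(match); // => "b"
});
```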
## Usage ```javascript var some = require("async-some"); var resolve = require("path").resolve; var stat = require("fs").stat; var readFileSync = require("fs").readFileSync; some(["apple", "seaweed", "ham", "quince"], porkDetector, function (error, match) { if (error) return console.error(error); if (match) return console.dir(JSON.parse(readFileSync(match))); console.error("time to buy more Sporkle™!"); }); var PREFIX = resolve(__dirname, "../pork_store"); function porkDetector(value, cb) { var path = resolve(PREFIX, value + ".json"); stat(path, function (er, stat) { if (er) { if (er.code === "ENOENT") return cb(null, false); return cb(er); } cb(er, path); }); } ``` ### some(list, test, callback) * `list` {Object} An arraylike (either an Array or the arguments arraylike) to be checked. * `test` {Function} The predicate against which the elements of `list` will be tested. Takes two parameters: * `element` {any} The element of the list to be tested. * `callback` {Function} The continuation to be called once the test is complete. Takes (again) two values: * `error` {Error} Any errors that the predicate encountered. * `value` {any} A truthy value. A non-falsy result terminates checking the entire list. * `callback` {Function} The callback to invoke when either a value has been found or the entire input list has been processed with no result. Is invoked with the traditional two parameters: * `error` {Error} Errors that were encountered during the evaluation of some(). * `match` {any} Value successfully matched by `test`, if any. npm_3.5.2.orig/node_modules/async-some/package.json0000644000000000000000000000615312631326456020520 0ustar 00000000000000{ "name": "async-some", "version": "1.0.2", "description": "short-circuited, asynchronous version of Array.protototype.some", "main": "some.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/othiym23/async-some.git" }, "keywords": [ "async", "some", "array", "collections", "fp" ], "author": { "name": "Forrest L Norvell", "email": "ogd@aoaioxxysz.net" }, "license": "ISC", "bugs": { "url": "https://github.com/othiym23/async-some/issues" }, "homepage": "https://github.com/othiym23/async-some", "dependencies": { "dezalgo": "^1.0.2" }, "devDependencies": { "tap": "^1.1.0" }, "readme": "# some\n\nShort-circuited async Array.prototype.some implementation.\n\nSerially evaluates a list of values from a JS array or arraylike\nagainst an asynchronous predicate, terminating on the first truthy\nvalue. If the predicate encounters an error, pass it to the completion\ncallback. 
Otherwise, pass the truthy value passed by the predicate, or\n`false` if no truthy value was passed.\n\nIs\n[Zalgo](http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony)-proof,\nbrowser-safe, and pretty efficient.\n\n## Usage\n\n```javascript\nvar some = require(\"async-some\");\nvar resolve = require(\"path\").resolve;\nvar stat = require(\"fs\").stat;\nvar readFileSync = require(\"fs\").readFileSync;\n\nsome([\"apple\", \"seaweed\", \"ham\", \"quince\"], porkDetector, function (error, match) {\n if (error) return console.error(error);\n\n if (match) return console.dir(JSON.parse(readFileSync(match)));\n\n console.error(\"time to buy more Sporkle™!\");\n});\n\nvar PREFIX = resolve(__dirname, \"../pork_store\");\nfunction porkDetector(value, cb) {\n var path = resolve(PREFIX, value + \".json\");\n stat(path, function (er, stat) {\n if (er) {\n if (er.code === \"ENOENT\") return cb(null, false);\n\n return cb(er);\n }\n\n cb(er, path);\n });\n}\n```\n\n### some(list, test, callback)\n\n* `list` {Object} An arraylike (either an Array or the arguments arraylike) to\n be checked.\n* `test` {Function} The predicate against which the elements of `list` will be\n tested. Takes two parameters:\n * `element` {any} The element of the list to be tested.\n * `callback` {Function} The continuation to be called once the test is\n complete. Takes (again) two values:\n * `error` {Error} Any errors that the predicate encountered.\n * `value` {any} A truthy value. A non-falsy result terminates checking the\n entire list.\n* `callback` {Function} The callback to invoke when either a value has been\n found or the entire input list has been processed with no result. Is invoked\n with the traditional two parameters:\n * `error` {Error} Errors that were encountered during the evaluation of some().\n * `match` {any} Value successfully matched by `test`, if any.\n", "readmeFilename": "README.md", "gitHead": "3a5086ad54739c48b2bbf073f23bcc95658199e3", "_id": "async-some@1.0.2", "_shasum": "4d8a81620d5958791b5b98f802d3207776e95509", "_from": "async-some@>=1.0.2 <1.1.0" } npm_3.5.2.orig/node_modules/async-some/some.js0000644000000000000000000000235712631326456017535 0ustar 00000000000000var assert = require("assert") var dezalgoify = require("dezalgo") module.exports = some /** * short-circuited async Array.prototype.some implementation * * Serially evaluates a list of values from a JS array or arraylike * against an asynchronous predicate, terminating on the first truthy * value. If the predicate encounters an error, pass it to the completion * callback. Otherwise, pass the truthy value passed by the predicate, or * `false` if no truthy value was passed. 
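 *
 * Implementation notes (added commentary): the arraylike is copied up front
 * with slice(), walked one element at a time by the map()/reduce() helpers
 * below, and the completion callback is wrapped with dezalgo so it is never
 * delivered synchronously.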
*/ function some (list, test, cb) { assert("length" in list, "array must be arraylike") assert.equal(typeof test, "function", "predicate must be callable") assert.equal(typeof cb, "function", "callback must be callable") var array = slice(list) , index = 0 , length = array.length , hecomes = dezalgoify(cb) map() function map () { if (index >= length) return hecomes(null, false) test(array[index], reduce) } function reduce (er, result) { if (er) return hecomes(er, false) if (result) return hecomes(null, result) index++ map() } } // Array.prototype.slice on arguments arraylike is expensive function slice(args) { var l = args.length, a = [], i for (i = 0; i < l; i++) a[i] = args[i] return a } npm_3.5.2.orig/node_modules/async-some/test/0000755000000000000000000000000012631326456017204 5ustar 00000000000000npm_3.5.2.orig/node_modules/async-some/test/base-case.js0000644000000000000000000000134212631326456021365 0ustar 00000000000000var test = require("tap").test var some = require("../some.js") test("some() array base case", function (t) { some([], failer, function (error, match) { t.ifError(error, "ran successfully") t.notOk(match, "nothing to find, so nothing found") t.end() }) function failer(value, cb) { cb(new Error("test should never have been called")) } }) test("some() arguments arraylike base case", function (t) { go() function go() { some(arguments, failer, function (error, match) { t.ifError(error, "ran successfully") t.notOk(match, "nothing to find, so nothing found") t.end() }) function failer(value, cb) { cb(new Error("test should never have been called")) } } }) npm_3.5.2.orig/node_modules/async-some/test/parameters.js0000644000000000000000000000146612631326456021714 0ustar 00000000000000var test = require("tap").test var some = require("../some.js") var NOP = function () {} test("some() called with bogus parameters", function (t) { t.throws(function () { some() }, "throws when called with no parameters") t.throws(function () { some(null, NOP, NOP) }, "throws when called with no list") t.throws(function () { some([], null, NOP) }, "throws when called with no predicate") t.throws(function () { some([], NOP, null) }, "throws when called with no callback") t.throws(function () { some({}, NOP, NOP) }, "throws when called with wrong list type") t.throws(function () { some([], "ham", NOP) }, "throws when called with wrong test type") t.throws(function () { some([], NOP, "ham") }, "throws when called with wrong test type") t.end() }) npm_3.5.2.orig/node_modules/async-some/test/simple.js0000644000000000000000000000277712631326456021050 0ustar 00000000000000var test = require("tap").test var some = require("../some.js") test("some() doesn't find anything asynchronously", function (t) { some(["a", "b", "c", "d", "e", "f", "g"], predicate, function (error, match) { t.ifError(error, "ran successfully") t.notOk(match, "nothing to find, so nothing found") t.end() }) function predicate(value, cb) { // dezalgo ensures it's safe to not do this, but just in case setTimeout(function () { cb(null, value > "j" && value) }) } }) test("some() doesn't find anything synchronously", function (t) { some(["a", "b", "c", "d", "e", "f", "g"], predicate, function (error, match) { t.ifError(error, "ran successfully") t.notOk(match, "nothing to find, so nothing found") t.end() }) function predicate(value, cb) { cb(null, value > "j" && value) } }) test("some() doesn't find anything asynchronously", function (t) { some(["a", "b", "c", "d", "e", "f", "g"], predicate, function (error, match) { t.ifError(error, "ran 
successfully") t.equals(match, "d", "found expected element") t.end() }) function predicate(value, cb) { setTimeout(function () { cb(null, value > "c" && value) }) } }) test("some() doesn't find anything synchronously", function (t) { some(["a", "b", "c", "d", "e", "f", "g"], predicate, function (error, match) { t.ifError(error, "ran successfully") t.equals(match, "d", "found expected") t.end() }) function predicate(value, cb) { cb(null, value > "c" && value) } }) npm_3.5.2.orig/node_modules/chownr/LICENSE0000644000000000000000000000137512631326456016462 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/chownr/README.md0000644000000000000000000000007312631326456016726 0ustar 00000000000000Like `chown -R`. Takes the same arguments as `fs.chown()` npm_3.5.2.orig/node_modules/chownr/chownr.js0000644000000000000000000000256312631326456017313 0ustar 00000000000000module.exports = chownr chownr.sync = chownrSync var fs = require("fs") , path = require("path") function chownr (p, uid, gid, cb) { fs.readdir(p, function (er, children) { // any error other than ENOTDIR means it's not readable, or // doesn't exist. give up. if (er && er.code !== "ENOTDIR") return cb(er) if (er || !children.length) return fs.chown(p, uid, gid, cb) var len = children.length , errState = null children.forEach(function (child) { var pathChild = path.resolve(p, child); fs.lstat(pathChild, function(er, stats) { if (er) return cb(er) if (!stats.isSymbolicLink()) chownr(pathChild, uid, gid, then) else then() }) }) function then (er) { if (errState) return if (er) return cb(errState = er) if (-- len === 0) return fs.chown(p, uid, gid, cb) } }) } function chownrSync (p, uid, gid) { var children try { children = fs.readdirSync(p) } catch (er) { if (er && er.code === "ENOTDIR") return fs.chownSync(p, uid, gid) throw er } if (!children.length) return fs.chownSync(p, uid, gid) children.forEach(function (child) { var pathChild = path.resolve(p, child) var stats = fs.lstatSync(pathChild) if (!stats.isSymbolicLink()) chownrSync(pathChild, uid, gid) }) return fs.chownSync(p, uid, gid) } npm_3.5.2.orig/node_modules/chownr/package.json0000644000000000000000000000241212631326456017734 0ustar 00000000000000{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "chownr", "description": "like `chown -R`", "version": "1.0.1", "repository": { "type": "git", "url": "git://github.com/isaacs/chownr.git" }, "main": "chownr.js", "files": [ "chownr.js" ], "devDependencies": { "mkdirp": "0.3", "rimraf": "", "tap": "^1.2.0" }, "scripts": { "test": "tap test/*.js" }, "license": "ISC", "gitHead": "c6c43844e80d7c7045e737a72b9fbb1ba0579a26", "bugs": { "url": "https://github.com/isaacs/chownr/issues" }, "homepage": "https://github.com/isaacs/chownr#readme", "_id": "chownr@1.0.1", "_shasum": "e2a75042a9551908bebd25b8523d5f9769d79181", "_from": "chownr@>=1.0.1 <1.1.0", "_npmVersion": "3.2.2", "_nodeVersion": "2.2.1", "_npmUser": { "name": "isaacs", "email": "isaacs@npmjs.com" }, "dist": { "shasum": "e2a75042a9551908bebd25b8523d5f9769d79181", "tarball": "http://registry.npmjs.org/chownr/-/chownr-1.0.1.tgz" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/chownr/-/chownr-1.0.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/cmd-shim/.npmignore0000644000000000000000000000014212631326456017644 0ustar 00000000000000lib-cov *.seed *.log *.csv *.dat *.out *.pid *.gz pids logs results npm-debug.log node_modules npm_3.5.2.orig/node_modules/cmd-shim/.travis.yml0000644000000000000000000000005712631326456017763 0ustar 00000000000000language: node_js node_js: - "0.10" - "0.8"npm_3.5.2.orig/node_modules/cmd-shim/LICENSE0000644000000000000000000000243612631326456016662 0ustar 00000000000000Copyright (c) Isaac Z. Schlueter ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. npm_3.5.2.orig/node_modules/cmd-shim/README.md0000644000000000000000000000214212631326456017126 0ustar 00000000000000# cmd-shim The cmd-shim used in npm to create executable scripts on Windows, since symlinks are not suitable for this purpose there. On Unix systems, you should use a symbolic link instead. 
[![Build Status](https://img.shields.io/travis/ForbesLindesay/cmd-shim/master.svg)](https://travis-ci.org/ForbesLindesay/cmd-shim) [![Dependency Status](https://img.shields.io/gemnasium/ForbesLindesay/cmd-shim.svg)](https://gemnasium.com/ForbesLindesay/cmd-shim) [![NPM version](https://img.shields.io/npm/v/cmd-shim.svg)](http://badge.fury.io/js/cmd-shim) ## Installation ``` npm install cmd-shim ``` ## API ### cmdShim(from, to, cb) Create a cmd shim at `to` for the command line program at `from`. e.g. ```javascript var cmdShim = require('cmd-shim'); cmdShim(__dirname + '/cli.js', '/usr/bin/command-name', function (err) { if (err) throw err; }); ``` ### cmdShim.ifExists(from, to, cb) The same as above, but will just continue if the file does not exist. Source: ```javascript function cmdShimIfExists (from, to, cb) { fs.stat(from, function (er) { if (er) return cb() cmdShim(from, to, cb) }) } ``` npm_3.5.2.orig/node_modules/cmd-shim/index.js0000644000000000000000000001075212631326456017322 0ustar 00000000000000// On windows, create a .cmd file. // Read the #! in the file to see what it uses. The vast majority // of the time, this will be either: // "#!/usr/bin/env <prog> <args...>" // or: // "#!<prog> <args...>" // // Write a binroot/pkg.bin + ".cmd" file that has this line in it: // @<prog> <args...> %~dp0<target> %* module.exports = cmdShim cmdShim.ifExists = cmdShimIfExists var fs = require("graceful-fs") var mkdir = require("mkdirp") , path = require("path") , shebangExpr = /^#\!\s*(?:\/usr\/bin\/env)?\s*([^ \t]+)(.*)$/ function cmdShimIfExists (from, to, cb) { fs.stat(from, function (er) { if (er) return cb() cmdShim(from, to, cb) }) } // Try to unlink, but ignore errors. // Any problems will surface later. function rm (path, cb) { fs.unlink(path, function(er) { cb() }) } function cmdShim (from, to, cb) { fs.stat(from, function (er, stat) { if (er) return cb(er) cmdShim_(from, to, cb) }) } function cmdShim_ (from, to, cb) { var then = times(2, next, cb) rm(to, then) rm(to + ".cmd", then) function next(er) { writeShim(from, to, cb) } } function writeShim (from, to, cb) { // make a cmd file and a sh script // First, check if the bin is a #! of some sort. // If not, then assume it's something that'll be compiled, or some other // sort of script, and just call it directly.
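// (Flow below: make sure the shim's directory exists, read the source file, sniff its first line for a shebang, and hand any interpreter and arguments found there on to writeShim_.)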
mkdir(path.dirname(to), function (er) { if (er) return cb(er) fs.readFile(from, "utf8", function (er, data) { if (er) return writeShim_(from, to, null, null, cb) var firstLine = data.trim().split(/\r*\n/)[0] , shebang = firstLine.match(shebangExpr) if (!shebang) return writeShim_(from, to, null, null, cb) var prog = shebang[1] , args = shebang[2] || "" return writeShim_(from, to, prog, args, cb) }) }) } function writeShim_ (from, to, prog, args, cb) { var shTarget = path.relative(path.dirname(to), from) , target = shTarget.split("/").join("\\") , longProg , shProg = prog && prog.split("\\").join("/") , shLongProg shTarget = shTarget.split("\\").join("/") args = args || "" if (!prog) { prog = "\"%~dp0\\" + target + "\"" shProg = "\"$basedir/" + shTarget + "\"" args = "" target = "" shTarget = "" } else { longProg = "\"%~dp0\\" + prog + ".exe\"" shLongProg = "\"$basedir/" + prog + "\"" target = "\"%~dp0\\" + target + "\"" shTarget = "\"$basedir/" + shTarget + "\"" } // @IF EXIST "%~dp0\node.exe" ( // "%~dp0\node.exe" "%~dp0\.\node_modules\npm\bin\npm-cli.js" %* // ) ELSE ( // SETLOCAL // SET PATHEXT=%PATHEXT:;.JS;=;% // node "%~dp0\.\node_modules\npm\bin\npm-cli.js" %* // ) var cmd if (longProg) { cmd = "@IF EXIST " + longProg + " (\r\n" + " " + longProg + " " + args + " " + target + " %*\r\n" + ") ELSE (\r\n" + " @SETLOCAL\r\n" + " @SET PATHEXT=%PATHEXT:;.JS;=;%\r\n" + " " + prog + " " + args + " " + target + " %*\r\n" + ")" } else { cmd = prog + " " + args + " " + target + " %*\r\n" } // #!/bin/sh // basedir=`dirname "$0"` // // case `uname` in // *CYGWIN*) basedir=`cygpath -w "$basedir"`;; // esac // // if [ -x "$basedir/node.exe" ]; then // "$basedir/node.exe" "$basedir/node_modules/npm/bin/npm-cli.js" "$@" // ret=$? // else // node "$basedir/node_modules/npm/bin/npm-cli.js" "$@" // ret=$? 
// fi // exit $ret var sh = "#!/bin/sh\n" if (shLongProg) { sh = sh + "basedir=`dirname \"$0\"`\n" + "\n" + "case `uname` in\n" + " *CYGWIN*) basedir=`cygpath -w \"$basedir\"`;;\n" + "esac\n" + "\n" sh = sh + "if [ -x "+shLongProg+" ]; then\n" + " " + shLongProg + " " + args + " " + shTarget + " \"$@\"\n" + " ret=$?\n" + "else \n" + " " + shProg + " " + args + " " + shTarget + " \"$@\"\n" + " ret=$?\n" + "fi\n" + "exit $ret\n" } else { sh = shProg + " " + args + " " + shTarget + " \"$@\"\n" + "exit $?\n" } var then = times(2, next, cb) fs.writeFile(to + ".cmd", cmd, "utf8", then) fs.writeFile(to, sh, "utf8", then) function next () { chmodShim(to, cb) } } function chmodShim (to, cb) { var then = times(2, cb, cb) fs.chmod(to, 0755, then) fs.chmod(to + ".cmd", 0755, then) } function times(n, ok, cb) { var errState = null return function(er) { if (!errState) { if (er) cb(errState = er) else if (--n === 0) ok() } } } npm_3.5.2.orig/node_modules/cmd-shim/node_modules/0000755000000000000000000000000012631326456020325 5ustar 00000000000000npm_3.5.2.orig/node_modules/cmd-shim/package.json0000644000000000000000000000232512631326456020140 0ustar 00000000000000{ "name": "cmd-shim", "version": "2.0.1", "description": "Used in npm for command line application support", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "https://github.com/ForbesLindesay/cmd-shim.git" }, "license": "BSD", "dependencies": { "graceful-fs": ">3.0.1 <4.0.0-0", "mkdirp": "~0.5.0" }, "devDependencies": { "tap": "~0.4.11", "rimraf": "~2.2.8" }, "gitHead": "6f53d506be590fe9ac20c9801512cd1a3aad5974", "bugs": { "url": "https://github.com/ForbesLindesay/cmd-shim/issues" }, "homepage": "https://github.com/ForbesLindesay/cmd-shim", "_id": "cmd-shim@2.0.1", "_shasum": "4512a373d2391679aec51ad1d4733559e9b85d4a", "_from": "cmd-shim@>=2.0.1 <2.1.0", "_npmVersion": "1.5.0-alpha-4", "_npmUser": { "name": "forbeslindesay", "email": "forbes@lindesay.co.uk" }, "maintainers": [ { "name": "forbeslindesay", "email": "forbes@lindesay.co.uk" } ], "dist": { "shasum": "4512a373d2391679aec51ad1d4733559e9b85d4a", "tarball": "http://registry.npmjs.org/cmd-shim/-/cmd-shim-2.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/cmd-shim/-/cmd-shim-2.0.1.tgz" } npm_3.5.2.orig/node_modules/cmd-shim/test/0000755000000000000000000000000012631326456016627 5ustar 00000000000000npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/0000755000000000000000000000000012631326456022523 5ustar 00000000000000npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/.npmignore0000644000000000000000000000001612631326456024517 0ustar 00000000000000node_modules/ npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/LICENSE0000644000000000000000000000137512631326456023536 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/README.md0000644000000000000000000000216212631326456024003 0ustar 00000000000000# graceful-fs graceful-fs functions as a drop-in replacement for the fs module, making various improvements. The improvements are meant to normalize behavior across different platforms and environments, and to make filesystem access more resilient to errors. ## Improvements over [fs module](http://api.nodejs.org/fs.html) graceful-fs: * Queues up `open` and `readdir` calls, and retries them once something closes if there is an EMFILE error from too many file descriptors. * fixes `lchmod` for Node versions prior to 0.6.2. * implements `fs.lutimes` if possible. Otherwise it becomes a noop. * ignores `EINVAL` and `EPERM` errors in `chown`, `fchown` or `lchown` if the user isn't root. * makes `lchmod` and `lchown` become noops, if not available. * retries reading a file if `read` results in EAGAIN error. On Windows, it retries renaming a file for up to one second if `EACCESS` or `EPERM` error occurs, likely because antivirus software has locked the directory. ## USAGE ```javascript // use just like fs var fs = require('graceful-fs') // now go and do stuff with it... fs.readFileSync('some-file-or-whatever') ``` npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/fs.js0000644000000000000000000000057212631326456023475 0ustar 00000000000000// eeeeeevvvvviiiiiiillllll // more evil than monkey-patching the native builtin? // Not sure. var mod = require("module") var pre = '(function (exports, require, module, __filename, __dirname) { ' var post = '});' var src = pre + process.binding('natives').fs + post var vm = require('vm') var fn = vm.runInThisContext(src) fn(exports, require, module, __filename, __dirname) npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/graceful-fs.js0000644000000000000000000000603212631326456025260 0ustar 00000000000000// Monkey-patching the fs module. // It's ugly, but there is simply no other way to do this. 
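// (Strategy: re-export the freshly evaluated fs from fs.js, wrap open() and readdir() in queueing Req objects that retry on EMFILE/ENFILE, and re-dispatch queued requests from onclose() whenever a file descriptor is released.)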
var fs = module.exports = require('./fs.js') var assert = require('assert') // fix up some busted stuff, mostly on windows and old nodes require('./polyfills.js') var util = require('util') function noop () {} var debug = noop if (util.debuglog) debug = util.debuglog('gfs') else if (/\bgfs\b/i.test(process.env.NODE_DEBUG || '')) debug = function() { var m = util.format.apply(util, arguments) m = 'GFS: ' + m.split(/\n/).join('\nGFS: ') console.error(m) } if (/\bgfs\b/i.test(process.env.NODE_DEBUG || '')) { process.on('exit', function() { debug('fds', fds) debug(queue) assert.equal(queue.length, 0) }) } var originalOpen = fs.open fs.open = open function open(path, flags, mode, cb) { if (typeof mode === "function") cb = mode, mode = null if (typeof cb !== "function") cb = noop new OpenReq(path, flags, mode, cb) } function OpenReq(path, flags, mode, cb) { this.path = path this.flags = flags this.mode = mode this.cb = cb Req.call(this) } util.inherits(OpenReq, Req) OpenReq.prototype.process = function() { originalOpen.call(fs, this.path, this.flags, this.mode, this.done) } var fds = {} OpenReq.prototype.done = function(er, fd) { debug('open done', er, fd) if (fd) fds['fd' + fd] = this.path Req.prototype.done.call(this, er, fd) } var originalReaddir = fs.readdir fs.readdir = readdir function readdir(path, cb) { if (typeof cb !== "function") cb = noop new ReaddirReq(path, cb) } function ReaddirReq(path, cb) { this.path = path this.cb = cb Req.call(this) } util.inherits(ReaddirReq, Req) ReaddirReq.prototype.process = function() { originalReaddir.call(fs, this.path, this.done) } ReaddirReq.prototype.done = function(er, files) { if (files && files.sort) files = files.sort() Req.prototype.done.call(this, er, files) onclose() } var originalClose = fs.close fs.close = close function close (fd, cb) { debug('close', fd) if (typeof cb !== "function") cb = noop delete fds['fd' + fd] originalClose.call(fs, fd, function(er) { onclose() cb(er) }) } var originalCloseSync = fs.closeSync fs.closeSync = closeSync function closeSync (fd) { try { return originalCloseSync(fd) } finally { onclose() } } // Req class function Req () { // start processing this.done = this.done.bind(this) this.failures = 0 this.process() } Req.prototype.done = function (er, result) { var tryAgain = false if (er) { var code = er.code var tryAgain = code === "EMFILE" || code === "ENFILE" if (process.platform === "win32") tryAgain = tryAgain || code === "OK" } if (tryAgain) { this.failures ++ enqueue(this) } else { var cb = this.cb cb(er, result) } } var queue = [] function enqueue(req) { queue.push(req) debug('enqueue %d %s', queue.length, req.constructor.name, req) } function onclose() { var req = queue.shift() if (req) { debug('process', req.constructor.name, req) req.process() } } npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/package.json0000644000000000000000000000455112631326456025016 0ustar 00000000000000{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, "name": "graceful-fs", "description": "A drop-in replacement for fs, making various improvements.", "version": "3.0.8", "repository": { "type": "git", "url": "git://github.com/isaacs/node-graceful-fs.git" }, "main": "graceful-fs.js", "engines": { "node": ">=0.4.0" }, "directories": { "test": "test" }, "scripts": { "test": "tap test/*.js" }, "keywords": [ "fs", "module", "reading", "retry", "retries", "queue", "error", "errors", "handling", "EMFILE", "EAGAIN", "EINVAL", "EPERM", "EACCESS" ], "license": "ISC", "devDependencies": { "mkdirp": "^0.5.0", "rimraf": "^2.2.8", "tap": "^1.2.0" }, "readme": "# graceful-fs\n\ngraceful-fs functions as a drop-in replacement for the fs module,\nmaking various improvements.\n\nThe improvements are meant to normalize behavior across different\nplatforms and environments, and to make filesystem access more\nresilient to errors.\n\n## Improvements over [fs module](http://api.nodejs.org/fs.html)\n\ngraceful-fs:\n\n* Queues up `open` and `readdir` calls, and retries them once\n something closes if there is an EMFILE error from too many file\n descriptors.\n* fixes `lchmod` for Node versions prior to 0.6.2.\n* implements `fs.lutimes` if possible. Otherwise it becomes a noop.\n* ignores `EINVAL` and `EPERM` errors in `chown`, `fchown` or\n `lchown` if the user isn't root.\n* makes `lchmod` and `lchown` become noops, if not available.\n* retries reading a file if `read` results in EAGAIN error.\n\nOn Windows, it retries renaming a file for up to one second if `EACCESS`\nor `EPERM` error occurs, likely because antivirus software has locked\nthe directory.\n\n## USAGE\n\n```javascript\n// use just like fs\nvar fs = require('graceful-fs')\n\n// now go and do stuff with it...\nfs.readFileSync('some-file-or-whatever')\n```\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/node-graceful-fs/issues" }, "homepage": "https://github.com/isaacs/node-graceful-fs#readme", "_id": "graceful-fs@3.0.8", "_shasum": "ce813e725fa82f7e6147d51c9a5ca68270551c22", "_resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-3.0.8.tgz", "_from": "graceful-fs@>3.0.1 <4.0.0-0" } npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/polyfills.js0000644000000000000000000001453512631326456025106 0ustar 00000000000000var fs = require('./fs.js') var constants = require('constants') var origCwd = process.cwd var cwd = null process.cwd = function() { if (!cwd) cwd = origCwd.call(process) return cwd } var chdir = process.chdir process.chdir = function(d) { cwd = null chdir.call(process, d) } // (re-)implement some things that are known busted or missing. // lchmod, broken prior to 0.6.2 // back-port the fix here. if (constants.hasOwnProperty('O_SYMLINK') && process.version.match(/^v0\.6\.[0-2]|^v0\.5\./)) { fs.lchmod = function (path, mode, callback) { callback = callback || noop fs.open( path , constants.O_WRONLY | constants.O_SYMLINK , mode , function (err, fd) { if (err) { callback(err) return } // prefer to return the chmod error, if one occurs, // but still try to close, and report closing errors if they occur. fs.fchmod(fd, mode, function (err) { fs.close(fd, function(err2) { callback(err || err2) }) }) }) } fs.lchmodSync = function (path, mode) { var fd = fs.openSync(path, constants.O_WRONLY | constants.O_SYMLINK, mode) // prefer to return the chmod error, if one occurs, // but still try to close, and report closing errors if they occur. 
var err, err2 try { var ret = fs.fchmodSync(fd, mode) } catch (er) { err = er } try { fs.closeSync(fd) } catch (er) { err2 = er } if (err || err2) throw (err || err2) return ret } } // lutimes implementation, or no-op if (!fs.lutimes) { if (constants.hasOwnProperty("O_SYMLINK")) { fs.lutimes = function (path, at, mt, cb) { fs.open(path, constants.O_SYMLINK, function (er, fd) { cb = cb || noop if (er) return cb(er) fs.futimes(fd, at, mt, function (er) { fs.close(fd, function (er2) { return cb(er || er2) }) }) }) } fs.lutimesSync = function (path, at, mt) { var fd = fs.openSync(path, constants.O_SYMLINK) , err , err2 , ret try { var ret = fs.futimesSync(fd, at, mt) } catch (er) { err = er } try { fs.closeSync(fd) } catch (er) { err2 = er } if (err || err2) throw (err || err2) return ret } } else if (fs.utimensat && constants.hasOwnProperty("AT_SYMLINK_NOFOLLOW")) { // maybe utimensat will be bound soonish? fs.lutimes = function (path, at, mt, cb) { fs.utimensat(path, at, mt, constants.AT_SYMLINK_NOFOLLOW, cb) } fs.lutimesSync = function (path, at, mt) { return fs.utimensatSync(path, at, mt, constants.AT_SYMLINK_NOFOLLOW) } } else { fs.lutimes = function (_a, _b, _c, cb) { process.nextTick(cb) } fs.lutimesSync = function () {} } } // https://github.com/isaacs/node-graceful-fs/issues/4 // Chown should not fail on einval or eperm if non-root. // It should not fail on enosys ever, as this just indicates // that a fs doesn't support the intended operation. fs.chown = chownFix(fs.chown) fs.fchown = chownFix(fs.fchown) fs.lchown = chownFix(fs.lchown) fs.chmod = chownFix(fs.chmod) fs.fchmod = chownFix(fs.fchmod) fs.lchmod = chownFix(fs.lchmod) fs.chownSync = chownFixSync(fs.chownSync) fs.fchownSync = chownFixSync(fs.fchownSync) fs.lchownSync = chownFixSync(fs.lchownSync) fs.chmodSync = chownFixSync(fs.chmodSync) fs.fchmodSync = chownFixSync(fs.fchmodSync) fs.lchmodSync = chownFixSync(fs.lchmodSync) function chownFix (orig) { if (!orig) return orig return function (target, uid, gid, cb) { return orig.call(fs, target, uid, gid, function (er, res) { if (chownErOk(er)) er = null cb(er, res) }) } } function chownFixSync (orig) { if (!orig) return orig return function (target, uid, gid) { try { return orig.call(fs, target, uid, gid) } catch (er) { if (!chownErOk(er)) throw er } } } // ENOSYS means that the fs doesn't support the op. Just ignore // that, because it doesn't matter. // // if there's no getuid, or if getuid() is something other // than 0, and the error is EINVAL or EPERM, then just ignore // it. // // This specific case is a silent failure in cp, install, tar, // and most other unix tools that manage permissions. // // When running as root, or if other types of errors are // encountered, then it's strict. function chownErOk (er) { if (!er) return true if (er.code === "ENOSYS") return true var nonroot = !process.getuid || process.getuid() !== 0 if (nonroot) { if (er.code === "EINVAL" || er.code === "EPERM") return true } return false } // if lchmod/lchown do not exist, then make them no-ops if (!fs.lchmod) { fs.lchmod = function (path, mode, cb) { process.nextTick(cb) } fs.lchmodSync = function () {} } if (!fs.lchown) { fs.lchown = function (path, uid, gid, cb) { process.nextTick(cb) } fs.lchownSync = function () {} } // on Windows, A/V software can lock the directory, causing this // to fail with an EACCES or EPERM if the directory contains newly // created files. Try again on failure, for up to 1 second.
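// (The retry below recurses: CB re-invokes rename_ with itself as the callback until the rename succeeds or 1000ms have passed since start.)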
if (process.platform === "win32") { var rename_ = fs.rename fs.rename = function rename (from, to, cb) { var start = Date.now() rename_(from, to, function CB (er) { if (er && (er.code === "EACCES" || er.code === "EPERM") && Date.now() - start < 1000) { return rename_(from, to, CB) } if(cb) cb(er) }) } } // if read() returns EAGAIN, then just try it again. var read = fs.read fs.read = function (fd, buffer, offset, length, position, callback_) { var callback if (callback_ && typeof callback_ === 'function') { var eagCounter = 0 callback = function (er, _, __) { if (er && er.code === 'EAGAIN' && eagCounter < 10) { eagCounter ++ return read.call(fs, fd, buffer, offset, length, position, callback) } callback_.apply(this, arguments) } } return read.call(fs, fd, buffer, offset, length, position, callback) } var readSync = fs.readSync fs.readSync = function (fd, buffer, offset, length, position) { var eagCounter = 0 while (true) { try { return readSync.call(fs, fd, buffer, offset, length, position) } catch (er) { if (er.code === 'EAGAIN' && eagCounter < 10) { eagCounter ++ continue } throw er } } } npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/test/0000755000000000000000000000000012631326456023502 5ustar 00000000000000npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/test/max-open.js0000644000000000000000000000276712631326456025600 0ustar 00000000000000var test = require('tap').test var fs = require('../') test('open lots of stuff', function (t) { // Get around EBADF from libuv by making sure that stderr is opened // Otherwise Darwin will refuse to give us a FD for stderr! process.stderr.write('') // How many parallel open()'s to do var n = 1024 var opens = 0 var fds = [] var going = true var closing = false var doneCalled = 0 for (var i = 0; i < n; i++) { go() } function go() { opens++ fs.open(__filename, 'r', function (er, fd) { if (er) throw er fds.push(fd) if (going) go() }) } // should hit ulimit pretty fast setTimeout(function () { going = false t.equal(opens - fds.length, n) done() }, 100) function done () { if (closing) return doneCalled++ if (fds.length === 0) { console.error('done called %d times', doneCalled) // First because of the timeout // Then to close the fd's opened afterwards // Then this time, to complete. // Might take multiple passes, depending on CPU speed // and ulimit, but at least 3 in every case. 
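    // (done() reschedules itself until every fd opened after the timeout has also been closed; the comment above says at least three passes, but the test only asserts two, presumably to absorb timing variance.)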
t.ok(doneCalled >= 2) return t.end() } closing = true setTimeout(function () { // console.error('do closing again') closing = false done() }, 100) // console.error('closing time') var closes = fds.slice(0) fds.length = 0 closes.forEach(function (fd) { fs.close(fd, function (er) { if (er) throw er }) }) } }) npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/test/open.js0000644000000000000000000000160112631326456024777 0ustar 00000000000000var test = require('tap').test var fs = require('../graceful-fs.js') test('graceful fs is monkeypatched fs', function (t) { t.equal(fs, require('../fs.js')) t.end() }) test('open an existing file works', function (t) { var fd = fs.openSync(__filename, 'r') fs.closeSync(fd) fs.open(__filename, 'r', function (er, fd) { if (er) throw er fs.close(fd, function (er) { if (er) throw er t.pass('works') t.end() }) }) }) test('open a non-existing file throws', function (t) { var er try { var fd = fs.openSync('this file does not exist', 'r') } catch (x) { er = x } t.ok(er, 'should throw') t.notOk(fd, 'should not get an fd') t.equal(er.code, 'ENOENT') fs.open('neither does this file', 'r', function (er, fd) { t.ok(er, 'should throw') t.notOk(fd, 'should not get an fd') t.equal(er.code, 'ENOENT') t.end() }) }) npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/test/readdir-sort.js0000644000000000000000000000060112631326456026434 0ustar 00000000000000var test = require("tap").test var fs = require("../fs.js") var readdir = fs.readdir fs.readdir = function(path, cb) { process.nextTick(function() { cb(null, ["b", "z", "a"]) }) } var g = require("../") test("readdir reorder", function (t) { g.readdir("whatevers", function (er, files) { if (er) throw er t.same(files, [ "a", "b", "z" ]) t.end() }) }) npm_3.5.2.orig/node_modules/cmd-shim/node_modules/graceful-fs/test/write-then-read.js0000644000000000000000000000154712631326456027046 0ustar 00000000000000var fs = require('../'); var rimraf = require('rimraf'); var mkdirp = require('mkdirp'); var test = require('tap').test; var p = require('path').resolve(__dirname, 'files'); process.chdir(__dirname) // Make sure to reserve the stderr fd process.stderr.write(''); var num = 4097; var paths = new Array(num); test('make files', function (t) { rimraf.sync(p); mkdirp.sync(p); for (var i = 0; i < num; ++i) { paths[i] = 'files/file-' + i; fs.writeFileSync(paths[i], 'content'); } t.end(); }) test('read files', function (t) { // now read them var done = 0; for (var i = 0; i < num; ++i) { fs.readFile(paths[i], function(err, data) { if (err) throw err; ++done; if (done === num) { t.pass('success'); t.end() } }); } }); test('cleanup', function (t) { rimraf.sync(p); t.end(); }); npm_3.5.2.orig/node_modules/cmd-shim/test/00-setup.js0000644000000000000000000000153512631326456020546 0ustar 00000000000000var test = require('tap').test var mkdirp = require('mkdirp') var fs = require('fs') var path = require('path') var fixtures = path.resolve(__dirname, 'fixtures') var froms = { 'from.exe': 'exe', 'from.env': '#!/usr/bin/env node\nconsole.log(/hi/)\n', 'from.env.args': '#!/usr/bin/env node --expose_gc\ngc()\n', 'from.sh': '#!/usr/bin/sh\necho hi\n', 'from.sh.args': '#!/usr/bin/sh -x\necho hi\n' } var cmdShim = require('../') test('create fixture', function (t) { mkdirp(fixtures, function (er) { if (er) throw er t.pass('made dir') Object.keys(froms).forEach(function (f) { t.test('write ' + f, function (t) { fs.writeFile(path.resolve(fixtures, f), froms[f], function (er) { if (er) throw er t.pass('wrote ' + f) t.end() }) }) 
}) t.end() }) }) npm_3.5.2.orig/node_modules/cmd-shim/test/basic.js0000755000000000000000000001334212631326456020254 0ustar 00000000000000var test = require('tap').test var mkdirp = require('mkdirp') var fs = require('fs') var path = require('path') var fixtures = path.resolve(__dirname, 'fixtures') var cmdShim = require('../') test('no shebang', function (t) { var from = path.resolve(fixtures, 'from.exe') var to = path.resolve(fixtures, 'exe.shim') cmdShim(from, to, function(er) { if (er) throw er t.equal(fs.readFileSync(to, 'utf8'), "\"$basedir/from.exe\" \"$@\"\nexit $?\n") t.equal(fs.readFileSync(to + '.cmd', 'utf8'), "\"%~dp0\\from.exe\" %*\r\n") t.end() }) }) test('env shebang', function (t) { var from = path.resolve(fixtures, 'from.env') var to = path.resolve(fixtures, 'env.shim') cmdShim(from, to, function(er) { if (er) throw er console.error('%j', fs.readFileSync(to, 'utf8')) console.error('%j', fs.readFileSync(to + '.cmd', 'utf8')) t.equal(fs.readFileSync(to, 'utf8'), "#!/bin/sh"+ "\nbasedir=`dirname \"$0\"`"+ "\n"+ "\ncase `uname` in"+ "\n *CYGWIN*) basedir=`cygpath -w \"$basedir\"`;;"+ "\nesac"+ "\n"+ "\nif [ -x \"$basedir/node\" ]; then"+ "\n \"$basedir/node\" \"$basedir/from.env\" \"$@\""+ "\n ret=$?"+ "\nelse "+ "\n node \"$basedir/from.env\" \"$@\""+ "\n ret=$?"+ "\nfi"+ "\nexit $ret"+ "\n") t.equal(fs.readFileSync(to + '.cmd', 'utf8'), "@IF EXIST \"%~dp0\\node.exe\" (\r"+ "\n \"%~dp0\\node.exe\" \"%~dp0\\from.env\" %*\r"+ "\n) ELSE (\r"+ "\n @SETLOCAL\r"+ "\n @SET PATHEXT=%PATHEXT:;.JS;=;%\r"+ "\n node \"%~dp0\\from.env\" %*\r"+ "\n)") t.end() }) }) test('env shebang with args', function (t) { var from = path.resolve(fixtures, 'from.env.args') var to = path.resolve(fixtures, 'env.args.shim') cmdShim(from, to, function(er) { if (er) throw er console.error('%j', fs.readFileSync(to, 'utf8')) console.error('%j', fs.readFileSync(to + '.cmd', 'utf8')) t.equal(fs.readFileSync(to, 'utf8'), "#!/bin/sh"+ "\nbasedir=`dirname \"$0\"`"+ "\n"+ "\ncase `uname` in"+ "\n *CYGWIN*) basedir=`cygpath -w \"$basedir\"`;;"+ "\nesac"+ "\n"+ "\nif [ -x \"$basedir/node\" ]; then"+ "\n \"$basedir/node\" --expose_gc \"$basedir/from.env.args\" \"$@\""+ "\n ret=$?"+ "\nelse "+ "\n node --expose_gc \"$basedir/from.env.args\" \"$@\""+ "\n ret=$?"+ "\nfi"+ "\nexit $ret"+ "\n") t.equal(fs.readFileSync(to + '.cmd', 'utf8'), "@IF EXIST \"%~dp0\\node.exe\" (\r"+ "\n \"%~dp0\\node.exe\" --expose_gc \"%~dp0\\from.env.args\" %*\r"+ "\n) ELSE (\r"+ "\n @SETLOCAL\r"+ "\n @SET PATHEXT=%PATHEXT:;.JS;=;%\r"+ "\n node --expose_gc \"%~dp0\\from.env.args\" %*\r"+ "\n)") t.end() }) }) test('explicit shebang', function (t) { var from = path.resolve(fixtures, 'from.sh') var to = path.resolve(fixtures, 'sh.shim') cmdShim(from, to, function(er) { if (er) throw er console.error('%j', fs.readFileSync(to, 'utf8')) console.error('%j', fs.readFileSync(to + '.cmd', 'utf8')) t.equal(fs.readFileSync(to, 'utf8'), "#!/bin/sh" + "\nbasedir=`dirname \"$0\"`" + "\n" + "\ncase `uname` in" + "\n *CYGWIN*) basedir=`cygpath -w \"$basedir\"`;;" + "\nesac" + "\n" + "\nif [ -x \"$basedir//usr/bin/sh\" ]; then" + "\n \"$basedir//usr/bin/sh\" \"$basedir/from.sh\" \"$@\"" + "\n ret=$?" + "\nelse " + "\n /usr/bin/sh \"$basedir/from.sh\" \"$@\"" + "\n ret=$?" 
+ "\nfi" + "\nexit $ret" + "\n") t.equal(fs.readFileSync(to + '.cmd', 'utf8'), "@IF EXIST \"%~dp0\\/usr/bin/sh.exe\" (\r" + "\n \"%~dp0\\/usr/bin/sh.exe\" \"%~dp0\\from.sh\" %*\r" + "\n) ELSE (\r" + "\n @SETLOCAL\r"+ "\n @SET PATHEXT=%PATHEXT:;.JS;=;%\r"+ "\n /usr/bin/sh \"%~dp0\\from.sh\" %*\r" + "\n)") t.end() }) }) test('explicit shebang with args', function (t) { var from = path.resolve(fixtures, 'from.sh.args') var to = path.resolve(fixtures, 'sh.args.shim') cmdShim(from, to, function(er) { if (er) throw er console.error('%j', fs.readFileSync(to, 'utf8')) console.error('%j', fs.readFileSync(to + '.cmd', 'utf8')) t.equal(fs.readFileSync(to, 'utf8'), "#!/bin/sh" + "\nbasedir=`dirname \"$0\"`" + "\n" + "\ncase `uname` in" + "\n *CYGWIN*) basedir=`cygpath -w \"$basedir\"`;;" + "\nesac" + "\n" + "\nif [ -x \"$basedir//usr/bin/sh\" ]; then" + "\n \"$basedir//usr/bin/sh\" -x \"$basedir/from.sh.args\" \"$@\"" + "\n ret=$?" + "\nelse " + "\n /usr/bin/sh -x \"$basedir/from.sh.args\" \"$@\"" + "\n ret=$?" + "\nfi" + "\nexit $ret" + "\n") t.equal(fs.readFileSync(to + '.cmd', 'utf8'), "@IF EXIST \"%~dp0\\/usr/bin/sh.exe\" (\r" + "\n \"%~dp0\\/usr/bin/sh.exe\" -x \"%~dp0\\from.sh.args\" %*\r" + "\n) ELSE (\r" + "\n @SETLOCAL\r"+ "\n @SET PATHEXT=%PATHEXT:;.JS;=;%\r"+ "\n /usr/bin/sh -x \"%~dp0\\from.sh.args\" %*\r" + "\n)") t.end() }) }) npm_3.5.2.orig/node_modules/cmd-shim/test/zz-cleanup.js0000644000000000000000000000042512631326456021256 0ustar 00000000000000var test = require('tap').test var path = require('path') var fixtures = path.resolve(__dirname, 'fixtures') var rimraf = require('rimraf') test('cleanup', function(t) { rimraf(fixtures, function(er) { if (er) throw er t.pass('cleaned up') t.end() }) }) npm_3.5.2.orig/node_modules/columnify/LICENSE0000644000000000000000000000206412631326456017163 0ustar 00000000000000The MIT License (MIT) Copyright (c) 2013 Tim Oxley Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
npm_3.5.2.orig/node_modules/columnify/Makefile0000644000000000000000000000020012631326456017610 0ustar 00000000000000 all: columnify.js prepublish: all columnify.js: index.js package.json babel index.js > columnify.js .PHONY: all prepublish npm_3.5.2.orig/node_modules/columnify/Readme.md0000644000000000000000000002636412631326456017702 0ustar 00000000000000# columnify [![NPM](https://nodei.co/npm/columnify.png?downloads=true&downloadRank=true&stars=true&chrome)](https://nodei.co/npm-dl/columnify/) [![NPM](https://nodei.co/npm-dl/columnify.png?months=3&height=3&chrome)](https://nodei.co/npm/columnify/) [![Build Status](https://img.shields.io/travis/timoxley/columnify.svg?style=flat)](https://travis-ci.org/timoxley/columnify) [![NPM Version](https://img.shields.io/npm/v/columnify.svg?style=flat)](https://npmjs.org/package/columnify) [![License](http://img.shields.io/npm/l/columnify.svg?style=flat)](LICENSE) [![Dependency Status](https://david-dm.org/timoxley/columnify.svg)](https://david-dm.org/timoxley/columnify) [![devDependency Status](https://david-dm.org/timoxley/columnify/dev-status.svg)](https://david-dm.org/timoxley/columnify#info=devDependencies) Create text-based columns suitable for console output from objects or arrays of objects. Columns are automatically resized to fit the content of the largest cell. Each cell will be padded with spaces to fill the available space and ensure column contents are left-aligned. Designed to [handle sensible wrapping in npm search results](https://github.com/isaacs/npm/pull/2328). `npm search` before & after integrating columnify: ![npm-tidy-search](https://f.cloud.github.com/assets/43438/1848959/ae02ad04-76a1-11e3-8255-4781debffc26.gif) ## Installation & Update ``` $ npm install --save columnify@latest ``` ## Usage ```javascript var columnify = require('columnify') var columns = columnify(data, options) console.log(columns) ``` ## Examples ### Columnify Objects Objects are converted to a list of key/value pairs: ```javascript var data = { "commander@0.6.1": 1, "minimatch@0.2.14": 3, "mkdirp@0.3.5": 2, "sigmund@1.0.0": 3 } console.log(columnify(data)) ``` #### Output: ``` KEY VALUE commander@0.6.1 1 minimatch@0.2.14 3 mkdirp@0.3.5 2 sigmund@1.0.0 3 ``` ### Custom Column Names ```javascript var data = { "commander@0.6.1": 1, "minimatch@0.2.14": 3, "mkdirp@0.3.5": 2, "sigmund@1.0.0": 3 } console.log(columnify(data, {columns: ['MODULE', 'COUNT']})) ``` #### Output: ``` MODULE COUNT commander@0.6.1 1 minimatch@0.2.14 3 mkdirp@0.3.5 2 sigmund@1.0.0 3 ``` ### Columnify Arrays of Objects Column headings are extracted from the keys in supplied objects. ```javascript var columnify = require('columnify') var columns = columnify([{ name: 'mod1', version: '0.0.1' }, { name: 'module2', version: '0.2.0' }]) console.log(columns) ``` #### Output: ``` NAME VERSION mod1 0.0.1 module2 0.2.0 ``` ### Filtering & Ordering Columns By default, all properties are converted into columns, whether or not they exist on every object. To explicitly specify which columns to include, and in which order, supply a "columns" or "include" array ("include" is just an alias).
```javascript var data = [{ name: 'module1', description: 'some description', version: '0.0.1', }, { name: 'module2', description: 'another description', version: '0.2.0', }] var columns = columnify(data, { columns: ['name', 'version'] }) console.log(columns) ``` #### Output: ``` NAME VERSION module1 0.0.1 module2 0.2.0 ``` ## Global and Per Column Options You can set a number of options at a global level (i.e. for all columns) or on a per column basis. Set options on a per column basis by using the `config` option to specify individual columns: ```javascript var columns = columnify(data, { optionName: optionValue, config: { columnName: {optionName: optionValue}, columnName: {optionName: optionValue}, } }) ``` ### Maximum and Minimum Column Widths As with all options, you can define the `maxWidth` and `minWidth` globally, or for specified columns. By default, wrapping will happen at word boundaries. Empty cells or those which do not fill the `minWidth` will be padded with spaces. ```javascript var columns = columnify([{ name: 'mod1', description: 'some description which happens to be far larger than the max', version: '0.0.1', }, { name: 'module-two', description: 'another description larger than the max', version: '0.2.0', }], { minWidth: 20, config: { description: {maxWidth: 30} } }) console.log(columns) ``` #### Output: ``` NAME DESCRIPTION VERSION mod1 some description which happens 0.0.1 to be far larger than the max module-two another description larger 0.2.0 than the max ``` #### Maximum Line Width You can set a hard maximum line width using the `maxLineWidth` option. Beyond this value data is unceremoniously truncated with no truncation marker. This can either be a number or 'auto' to set the value to the width of stdout. Setting this value to 'auto' prevents TTY-imposed line-wrapping when lines exceed the screen width. #### Truncating Column Cells Instead of Wrapping You can disable wrapping and instead truncate content at the maximum column width by using the `truncate` option. Truncation respects word boundaries. A truncation marker, `…`, will appear next to the last word in any truncated line. ```javascript var columns = columnify(data, { truncate: true, config: { description: { maxWidth: 20 } } }) console.log(columns) ``` #### Output: ``` NAME DESCRIPTION VERSION mod1 some description… 0.0.1 module-two another description… 0.2.0 ``` ### Align Right/Center You can set the alignment of the column data by using the `align` option. ```js var data = { "mocha@1.18.2": 1, "commander@2.0.0": 1, "debug@0.8.1": 1 } columnify(data, {config: {value: {align: 'right'}}}) ``` #### Output: ``` KEY VALUE mocha@1.18.2 1 commander@2.0.0 1 debug@0.8.1 1 ``` `align: 'center'` works in a similar way. ### Padding Character Set a character to fill whitespace within columns with the `paddingChr` option. ```js var data = { "shortKey": "veryVeryVeryVeryVeryLongVal", "veryVeryVeryVeryVeryLongKey": "shortVal" } columnify(data, { paddingChr: '.'}) ``` #### Output: ``` KEY........................ VALUE...................... shortKey................... veryVeryVeryVeryVeryLongVal veryVeryVeryVeryVeryLongKey shortVal................... ``` ### Preserve Existing Newlines By default, `columnify` sanitises text by replacing any occurrence of 1 or more whitespace characters with a single space. `columnify` can be configured to respect existing new line characters using the `preserveNewLines` option. Note this will still collapse all other whitespace.
```javascript var data = [{ name: "glob@3.2.9", paths: [ "node_modules/tap/node_modules/glob", "node_modules/tape/node_modules/glob" ].join('\n') }, { name: "nopt@2.2.1", paths: [ "node_modules/tap/node_modules/nopt" ] }, { name: "runforcover@0.0.2", paths: "node_modules/tap/node_modules/runforcover" }] console.log(columnify(data, {preserveNewLines: true})) ``` #### Output: ``` NAME PATHS glob@3.2.9 node_modules/tap/node_modules/glob node_modules/tape/node_modules/glob nopt@2.2.1 node_modules/tap/node_modules/nopt runforcover@0.0.2 node_modules/tap/node_modules/runforcover ``` Compare this with output without `preserveNewLines`: ```javascript console.log(columnify(data, {preserveNewLines: false})) // or just console.log(columnify(data)) ``` ``` NAME PATHS glob@3.2.9 node_modules/tap/node_modules/glob node_modules/tape/node_modules/glob nopt@2.2.1 node_modules/tap/node_modules/nopt runforcover@0.0.2 node_modules/tap/node_modules/runforcover ``` ### Custom Truncation Marker You can change the truncation marker to something other than the default `…` by using the `truncateMarker` option. ```javascript var columns = columnify(data, { truncate: true, truncateMarker: '>', widths: { description: { maxWidth: 20 } } }) console.log(columns) ``` #### Output: ``` NAME DESCRIPTION VERSION mod1 some description> 0.0.1 module-two another description> 0.2.0 ``` ### Custom Column Splitter If your columns need some bling, you can split columns with custom characters by using the `columnSplitter` option. ```javascript var columns = columnify(data, { columnSplitter: ' | ' }) console.log(columns) ``` #### Output: ``` NAME | DESCRIPTION | VERSION mod1 | some description which happens to be far larger than the max | 0.0.1 module-two | another description larger than the max | 0.2.0 ``` ### Control Header Display Control whether column headers are displayed by using the `showHeaders` option. ```javascript var columns = columnify(data, { showHeaders: false }) ``` This also works well for hiding a single column header, like an `id` column: ```javascript var columns = columnify(data, { config: { id: { showHeaders: false } } }) ``` ### Transforming Column Data and Headers If you need to modify the presentation of column content or heading content there are two useful options for doing that: `dataTransform` and `headerTransform`. Both of these take a function and need to return a valid string. ```javascript var columns = columnify([{ name: 'mod1', description: 'SOME DESCRIPTION TEXT.' }, { name: 'module-two', description: 'SOME SLIGHTLY LONGER DESCRIPTION TEXT.' }], { dataTransform: function(data) { return data.toLowerCase() }, config: { name: { headingTransform: function(heading) { heading = "module " + heading return "*" + heading.toUpperCase() + "*" } } } }) ``` #### Output: ``` *MODULE NAME* DESCRIPTION mod1 some description text. module-two some slightly longer description text. ``` ## Multibyte Character Support `columnify` uses [mycoboco/wcwidth.js](https://github.com/mycoboco/wcwidth.js) to calculate length of multibyte characters: ```javascript var data = [{ name: 'module-one', description: 'some description', version: '0.0.1', }, { name: '这是一个很长的名字的模块', description: '这真的是一个描述的内容这个描述很长', version: "0.3.3" }] console.log(columnify(data)) ``` #### Without multibyte handling: i.e. 
before columnify added this feature ``` NAME DESCRIPTION VERSION module-one some description 0.0.1 这是一个很长的名字的模块 这真的是一个描述的内容这个描述很长 0.3.3 ``` #### With multibyte handling: ``` NAME DESCRIPTION VERSION module-one some description 0.0.1 这是一个很长的名字的模块 这真的是一个描述的内容这个描述很长 0.3.3 ``` ## Contributions ``` project : columnify repo age : 1 year, 2 months active : 32 days commits : 120 files : 54 authors : 90 Tim Oxley 75.0% 8 Tim 6.7% 7 Arjun Mehta 5.8% 6 Dany 5.0% 5 Wei Gao 4.2% 2 Dany Shaanan 1.7% 1 Seth Miller 0.8% 1 Isaac Z. Schlueter 0.8% ``` ## License MIT npm_3.5.2.orig/node_modules/columnify/columnify.js0000644000000000000000000002334012631326456020521 0ustar 00000000000000"use strict"; var wcwidth = require('./width'); var _require = require('./utils'); var padRight = _require.padRight; var padCenter = _require.padCenter; var padLeft = _require.padLeft; var splitIntoLines = _require.splitIntoLines; var splitLongWords = _require.splitLongWords; var truncateString = _require.truncateString; var DEFAULT_HEADING_TRANSFORM = function DEFAULT_HEADING_TRANSFORM(key) { return key.toUpperCase(); }; var DEFAULT_DATA_TRANSFORM = function DEFAULT_DATA_TRANSFORM(cell, column, index) { return cell; }; var DEFAULTS = Object.freeze({ maxWidth: Infinity, minWidth: 0, columnSplitter: ' ', truncate: false, truncateMarker: '…', preserveNewLines: false, paddingChr: ' ', showHeaders: true, headingTransform: DEFAULT_HEADING_TRANSFORM, dataTransform: DEFAULT_DATA_TRANSFORM }); module.exports = function (items) { var options = arguments.length <= 1 || arguments[1] === undefined ? {} : arguments[1]; var columnConfigs = options.config || {}; delete options.config; // remove config so doesn't appear on every column. var maxLineWidth = options.maxLineWidth || Infinity; if (maxLineWidth === 'auto') maxLineWidth = process.stdout.columns || Infinity; delete options.maxLineWidth; // this is a line control option, don't pass it to column // Option defaults inheritance: // options.config[columnName] => options => DEFAULTS options = mixin({}, DEFAULTS, options); options.config = options.config || Object.create(null); options.spacing = options.spacing || '\n'; // probably useless options.preserveNewLines = !!options.preserveNewLines; options.showHeaders = !!options.showHeaders; options.columns = options.columns || options.include; // alias include/columns, prefer columns if supplied var columnNames = options.columns || []; // optional user-supplied columns to include items = toArray(items, columnNames); // if not suppled column names, automatically determine columns from data keys if (!columnNames.length) { items.forEach(function (item) { for (var columnName in item) { if (columnNames.indexOf(columnName) === -1) columnNames.push(columnName); } }); } // initialize column defaults (each column inherits from options.config) var columns = columnNames.reduce(function (columns, columnName) { var column = Object.create(options); columns[columnName] = mixin(column, columnConfigs[columnName]); return columns; }, Object.create(null)); // sanitize column settings columnNames.forEach(function (columnName) { var column = columns[columnName]; column.name = columnName; column.maxWidth = Math.ceil(column.maxWidth); column.minWidth = Math.ceil(column.minWidth); column.truncate = !!column.truncate; column.align = column.align || 'left'; }); // sanitize data items = items.map(function (item) { var result = Object.create(null); columnNames.forEach(function (columnName) { // null/undefined -> '' result[columnName] = item[columnName] != null ? 
item[columnName] : ''; // toString everything result[columnName] = '' + result[columnName]; if (columns[columnName].preserveNewLines) { // merge non-newline whitespace chars result[columnName] = result[columnName].replace(/[^\S\n]/gmi, ' '); } else { // merge all whitespace chars result[columnName] = result[columnName].replace(/\s/gmi, ' '); } }); return result; }); // transform data cells columnNames.forEach(function (columnName) { var column = columns[columnName]; items = items.map(function (item, index) { var col = Object.create(column); item[columnName] = column.dataTransform(item[columnName], col, index); var changedKeys = Object.keys(col); // disable default heading transform if we wrote to column.name if (changedKeys.indexOf('name') !== -1) { if (column.headingTransform !== DEFAULT_HEADING_TRANSFORM) return; column.headingTransform = function (heading) { return heading; }; } changedKeys.forEach(function (key) { return column[key] = col[key]; }); return item; }); }); // add headers var headers = {}; if (options.showHeaders) { columnNames.forEach(function (columnName) { var column = columns[columnName]; if (!column.showHeaders) { headers[columnName] = ''; return; } headers[columnName] = column.headingTransform(column.name); }); items.unshift(headers); } // get actual max-width between min & max // based on length of data in columns columnNames.forEach(function (columnName) { var column = columns[columnName]; column.width = items.map(function (item) { return item[columnName]; }).reduce(function (min, cur) { return Math.max(min, Math.min(column.maxWidth, Math.max(column.minWidth, wcwidth(cur)))); }, 0); }); // split long words so they can break onto multiple lines columnNames.forEach(function (columnName) { var column = columns[columnName]; items = items.map(function (item) { item[columnName] = splitLongWords(item[columnName], column.width, column.truncateMarker); return item; }); }); // wrap long lines. each item is now an array of lines. columnNames.forEach(function (columnName) { var column = columns[columnName]; items = items.map(function (item, index) { var cell = item[columnName]; item[columnName] = splitIntoLines(cell, column.width); // if truncating required, only include first line + add truncation char if (column.truncate && item[columnName].length > 1) { item[columnName] = splitIntoLines(cell, column.width - wcwidth(column.truncateMarker)); var firstLine = item[columnName][0]; if (!endsWith(firstLine, column.truncateMarker)) item[columnName][0] += column.truncateMarker; item[columnName] = item[columnName].slice(0, 1); } return item; }); }); // recalculate column widths from truncated output/lines columnNames.forEach(function (columnName) { var column = columns[columnName]; column.width = items.map(function (item) { return item[columnName].reduce(function (min, cur) { return Math.max(min, Math.min(column.maxWidth, Math.max(column.minWidth, wcwidth(cur)))); }, 0); }).reduce(function (min, cur) { return Math.max(min, Math.min(column.maxWidth, Math.max(column.minWidth, cur))); }, 0); }); var rows = createRows(items, columns, columnNames, options.paddingChr); // merge lines into rows // conceive output return rows.reduce(function (output, row) { return output.concat(row.reduce(function (rowOut, line) { return rowOut.concat(line.join(options.columnSplitter)); }, [])); }, []).map(function (line) { return truncateString(line, maxLineWidth); }).join(options.spacing); }; /** * Convert wrapped lines into rows with padded values. 
* * @param Array items data to process * @param Array columns column width settings for wrapping * @param Array columnNames column ordering * @return Array items wrapped in arrays, corresponding to lines */ function createRows(items, columns, columnNames, paddingChr) { return items.map(function (item) { var row = []; var numLines = 0; columnNames.forEach(function (columnName) { numLines = Math.max(numLines, item[columnName].length); }); // combine matching lines of each rows var _loop = function (i) { row[i] = row[i] || []; columnNames.forEach(function (columnName) { var column = columns[columnName]; var val = item[columnName][i] || ''; // || '' ensures empty columns get padded if (column.align === 'right') row[i].push(padLeft(val, column.width, paddingChr));else if (column.align === 'center' || column.align === 'centre') row[i].push(padCenter(val, column.width, paddingChr));else row[i].push(padRight(val, column.width, paddingChr)); }); }; for (var i = 0; i < numLines; i++) { _loop(i); } return row; }); } /** * Object.assign * * @return Object Object with properties mixed in. */ function mixin() { if (Object.assign) return Object.assign.apply(Object, arguments); return ObjectAssign.apply(undefined, arguments); } function ObjectAssign(target, firstSource) { "use strict"; if (target === undefined || target === null) throw new TypeError("Cannot convert first argument to object"); var to = Object(target); var hasPendingException = false; var pendingException; for (var i = 1; i < arguments.length; i++) { var nextSource = arguments[i]; if (nextSource === undefined || nextSource === null) continue; var keysArray = Object.keys(Object(nextSource)); for (var nextIndex = 0, len = keysArray.length; nextIndex < len; nextIndex++) { var nextKey = keysArray[nextIndex]; try { var desc = Object.getOwnPropertyDescriptor(nextSource, nextKey); if (desc !== undefined && desc.enumerable) to[nextKey] = nextSource[nextKey]; } catch (e) { if (!hasPendingException) { hasPendingException = true; pendingException = e; } } } if (hasPendingException) throw pendingException; } return to; } /** * Adapted from String.prototype.endsWith polyfill. */ function endsWith(target, searchString, position) { position = position || target.length; position = position - searchString.length; var lastIndex = target.lastIndexOf(searchString); return lastIndex !== -1 && lastIndex === position; } function toArray(items, columnNames) { if (Array.isArray(items)) return items; var rows = []; for (var key in items) { var item = {}; item[columnNames[0] || 'key'] = key; item[columnNames[1] || 'value'] = items[key]; rows.push(item); } return rows; } npm_3.5.2.orig/node_modules/columnify/index.js0000644000000000000000000002200212631326456017615 0ustar 00000000000000"use strict" const wcwidth = require('./width') const { padRight, padCenter, padLeft, splitIntoLines, splitLongWords, truncateString } = require('./utils') const DEFAULT_HEADING_TRANSFORM = key => key.toUpperCase() const DEFAULT_DATA_TRANSFORM = (cell, column, index) => cell const DEFAULTS = Object.freeze({ maxWidth: Infinity, minWidth: 0, columnSplitter: ' ', truncate: false, truncateMarker: '…', preserveNewLines: false, paddingChr: ' ', showHeaders: true, headingTransform: DEFAULT_HEADING_TRANSFORM, dataTransform: DEFAULT_DATA_TRANSFORM }) module.exports = function(items, options = {}) { let columnConfigs = options.config || {} delete options.config // remove config so doesn't appear on every column. 
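  // (This ES6 file is the source for the compiled columnify.js shipped alongside it (see the babel step in the Makefile), so the pipeline below mirrors that file: resolve options, sanitize cells, then measure, wrap and pad each column.)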
let maxLineWidth = options.maxLineWidth || Infinity if (maxLineWidth === 'auto') maxLineWidth = process.stdout.columns || Infinity delete options.maxLineWidth // this is a line control option, don't pass it to column // Option defaults inheritance: // options.config[columnName] => options => DEFAULTS options = mixin({}, DEFAULTS, options) options.config = options.config || Object.create(null) options.spacing = options.spacing || '\n' // probably useless options.preserveNewLines = !!options.preserveNewLines options.showHeaders = !!options.showHeaders; options.columns = options.columns || options.include // alias include/columns, prefer columns if supplied let columnNames = options.columns || [] // optional user-supplied columns to include items = toArray(items, columnNames) // if not suppled column names, automatically determine columns from data keys if (!columnNames.length) { items.forEach(function(item) { for (let columnName in item) { if (columnNames.indexOf(columnName) === -1) columnNames.push(columnName) } }) } // initialize column defaults (each column inherits from options.config) let columns = columnNames.reduce((columns, columnName) => { let column = Object.create(options) columns[columnName] = mixin(column, columnConfigs[columnName]) return columns }, Object.create(null)) // sanitize column settings columnNames.forEach(columnName => { let column = columns[columnName] column.name = columnName column.maxWidth = Math.ceil(column.maxWidth) column.minWidth = Math.ceil(column.minWidth) column.truncate = !!column.truncate column.align = column.align || 'left' }) // sanitize data items = items.map(item => { let result = Object.create(null) columnNames.forEach(columnName => { // null/undefined -> '' result[columnName] = item[columnName] != null ? item[columnName] : '' // toString everything result[columnName] = '' + result[columnName] if (columns[columnName].preserveNewLines) { // merge non-newline whitespace chars result[columnName] = result[columnName].replace(/[^\S\n]/gmi, ' ') } else { // merge all whitespace chars result[columnName] = result[columnName].replace(/\s/gmi, ' ') } }) return result }) // transform data cells columnNames.forEach(columnName => { let column = columns[columnName] items = items.map((item, index) => { let col = Object.create(column) item[columnName] = column.dataTransform(item[columnName], col, index) let changedKeys = Object.keys(col) // disable default heading transform if we wrote to column.name if (changedKeys.indexOf('name') !== -1) { if (column.headingTransform !== DEFAULT_HEADING_TRANSFORM) return column.headingTransform = heading => heading } changedKeys.forEach(key => column[key] = col[key]) return item }) }) // add headers let headers = {} if(options.showHeaders) { columnNames.forEach(columnName => { let column = columns[columnName] if(!column.showHeaders){ headers[columnName] = ''; return; } headers[columnName] = column.headingTransform(column.name) }) items.unshift(headers) } // get actual max-width between min & max // based on length of data in columns columnNames.forEach(columnName => { let column = columns[columnName] column.width = items .map(item => item[columnName]) .reduce((min, cur) => { return Math.max(min, Math.min(column.maxWidth, Math.max(column.minWidth, wcwidth(cur)))) }, 0) }) // split long words so they can break onto multiple lines columnNames.forEach(columnName => { let column = columns[columnName] items = items.map(item => { item[columnName] = splitLongWords(item[columnName], column.width, column.truncateMarker) return item }) }) 
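// Illustrative trace (not from the original source): with column.width = 5 and the default truncateMarker '…', splitLongWords('internationalization', 5, '…') yields 'inte… rnat… iona… liza… tion', so the line-wrapper below never sees a single word wider than its column.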
// wrap long lines. each item is now an array of lines. columnNames.forEach(columnName => { let column = columns[columnName] items = items.map((item, index) => { let cell = item[columnName] item[columnName] = splitIntoLines(cell, column.width) // if truncating required, only include first line + add truncation char if (column.truncate && item[columnName].length > 1) { item[columnName] = splitIntoLines(cell, column.width - wcwidth(column.truncateMarker)) let firstLine = item[columnName][0] if (!endsWith(firstLine, column.truncateMarker)) item[columnName][0] += column.truncateMarker item[columnName] = item[columnName].slice(0, 1) } return item }) }) // recalculate column widths from truncated output/lines columnNames.forEach(columnName => { let column = columns[columnName] column.width = items.map(item => { return item[columnName].reduce((min, cur) => { return Math.max(min, Math.min(column.maxWidth, Math.max(column.minWidth, wcwidth(cur)))) }, 0) }).reduce((min, cur) => { return Math.max(min, Math.min(column.maxWidth, Math.max(column.minWidth, cur))) }, 0) }) let rows = createRows(items, columns, columnNames, options.paddingChr) // merge lines into rows // conceive output return rows.reduce((output, row) => { return output.concat(row.reduce((rowOut, line) => { return rowOut.concat(line.join(options.columnSplitter)) }, [])) }, []) .map(line => truncateString(line, maxLineWidth)) .join(options.spacing) } /** * Convert wrapped lines into rows with padded values. * * @param Array items data to process * @param Array columns column width settings for wrapping * @param Array columnNames column ordering * @return Array items wrapped in arrays, corresponding to lines */ function createRows(items, columns, columnNames, paddingChr) { return items.map(item => { let row = [] let numLines = 0 columnNames.forEach(columnName => { numLines = Math.max(numLines, item[columnName].length) }) // combine matching lines of each rows for (let i = 0; i < numLines; i++) { row[i] = row[i] || [] columnNames.forEach(columnName => { let column = columns[columnName] let val = item[columnName][i] || '' // || '' ensures empty columns get padded if (column.align === 'right') row[i].push(padLeft(val, column.width, paddingChr)) else if (column.align === 'center' || column.align === 'centre') row[i].push(padCenter(val, column.width, paddingChr)) else row[i].push(padRight(val, column.width, paddingChr)) }) } return row }) } /** * Object.assign * * @return Object Object with properties mixed in. */ function mixin(...args) { if (Object.assign) return Object.assign(...args) return ObjectAssign(...args) } function ObjectAssign(target, firstSource) { "use strict"; if (target === undefined || target === null) throw new TypeError("Cannot convert first argument to object"); var to = Object(target); var hasPendingException = false; var pendingException; for (var i = 1; i < arguments.length; i++) { var nextSource = arguments[i]; if (nextSource === undefined || nextSource === null) continue; var keysArray = Object.keys(Object(nextSource)); for (var nextIndex = 0, len = keysArray.length; nextIndex < len; nextIndex++) { var nextKey = keysArray[nextIndex]; try { var desc = Object.getOwnPropertyDescriptor(nextSource, nextKey); if (desc !== undefined && desc.enumerable) to[nextKey] = nextSource[nextKey]; } catch (e) { if (!hasPendingException) { hasPendingException = true; pendingException = e; } } } if (hasPendingException) throw pendingException; } return to; } /** * Adapted from String.prototype.endsWith polyfill. 
*/ function endsWith(target, searchString, position) { position = position || target.length; position = position - searchString.length; let lastIndex = target.lastIndexOf(searchString); return lastIndex !== -1 && lastIndex === position; } function toArray(items, columnNames) { if (Array.isArray(items)) return items let rows = [] for (let key in items) { let item = {} item[columnNames[0] || 'key'] = key item[columnNames[1] || 'value'] = items[key] rows.push(item) } return rows } npm_3.5.2.orig/node_modules/columnify/node_modules/0000755000000000000000000000000012631326456020631 5ustar 00000000000000npm_3.5.2.orig/node_modules/columnify/package.json0000644000000000000000000000317212631326456020445 0ustar 00000000000000{ "name": "columnify", "version": "1.5.2", "description": "Render data in text columns. Supports in-column text-wrap.", "main": "columnify.js", "scripts": { "pretest": "npm prune", "test": "make prepublish && tape test/*.js | tap-spec", "bench": "npm test && node bench", "prepublish": "make prepublish" }, "author": { "name": "Tim Oxley" }, "license": "MIT", "devDependencies": { "babel": "^5.8.21", "chalk": "^1.1.0", "tap-spec": "^4.0.2", "tape": "^4.0.3" }, "repository": { "type": "git", "url": "git://github.com/timoxley/columnify.git" }, "keywords": [ "column", "text", "ansi", "console", "terminal", "wrap", "table" ], "bugs": { "url": "https://github.com/timoxley/columnify/issues" }, "homepage": "https://github.com/timoxley/columnify", "dependencies": { "strip-ansi": "^3.0.0", "wcwidth": "^1.0.0" }, "directories": { "test": "test" }, "gitHead": "e7417b78091844ff2f3ba62551a4817c7ae217bd", "_id": "columnify@1.5.2", "_shasum": "6937930d47c22a9bfa20732a7fd619d47eaba65a", "_from": "columnify@>=1.5.2 <1.6.0", "_npmVersion": "2.9.0", "_nodeVersion": "2.0.1", "_npmUser": { "name": "timoxley", "email": "secoif@gmail.com" }, "maintainers": [ { "name": "timoxley", "email": "secoif@gmail.com" } ], "dist": { "shasum": "6937930d47c22a9bfa20732a7fd619d47eaba65a", "tarball": "http://registry.npmjs.org/columnify/-/columnify-1.5.2.tgz" }, "_resolved": "https://registry.npmjs.org/columnify/-/columnify-1.5.2.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/columnify/utils.js0000644000000000000000000001122412631326456017652 0ustar 00000000000000"use strict" var wcwidth = require('./width') /** * repeat string `str` up to total length of `len` * * @param String str string to repeat * @param Number len total length of output string */ function repeatString(str, len) { return Array.apply(null, {length: len + 1}).join(str).slice(0, len) } /** * Pad `str` up to total length `max` with `chr`. * If `str` is longer than `max`, padRight will return `str` unaltered. * * @param String str string to pad * @param Number max total length of output string * @param String chr optional. Character to pad with. default: ' ' * @return String padded str */ function padRight(str, max, chr) { str = str != null ? str : '' str = String(str) var length = max - wcwidth(str) if (length <= 0) return str return str + repeatString(chr || ' ', length) } /** * Pad `str` up to total length `max` with `chr`. * If `str` is longer than `max`, padCenter will return `str` unaltered. * * @param String str string to pad * @param Number max total length of output string * @param String chr optional. Character to pad with. default: ' ' * @return String padded str */ function padCenter(str, max, chr) { str = str != null ? 
str : '' str = String(str) var length = max - wcwidth(str) if (length <= 0) return str var lengthLeft = Math.floor(length/2) var lengthRight = length - lengthLeft return repeatString(chr || ' ', lengthLeft) + str + repeatString(chr || ' ', lengthRight) } /** * Pad `str` up to total length `max` with `chr`, on the left. * If `str` is longer than `max`, padLeft will return `str` unaltered. * * @param String str string to pad * @param Number max total length of output string * @param String chr optional. Character to pad with. default: ' ' * @return String padded str */ function padLeft(str, max, chr) { str = str != null ? str : '' str = String(str) var length = max - wcwidth(str) if (length <= 0) return str return repeatString(chr || ' ', length) + str } /** * Split a String `str` into lines of maximum length `max`. * Splits on word boundaries. Preserves existing new lines. * * @param String str string to split * @param Number max length of each line * @return Array Array containing lines. */ function splitIntoLines(str, max) { function _splitIntoLines(str, max) { return str.trim().split(' ').reduce(function(lines, word) { var line = lines[lines.length - 1] if (line && wcwidth(line.join(' ')) + wcwidth(word) < max) { lines[lines.length - 1].push(word) // add to line } else lines.push([word]) // new line return lines }, []).map(function(l) { return l.join(' ') }) } return str.split('\n').map(function(str) { return _splitIntoLines(str, max) }).reduce(function(lines, line) { return lines.concat(line) }, []) } /** * Add spaces and `truncationChar` between words of * `str` which are longer than `max`. * * @param String str string to split * @param Number max length of each line * @param String truncationChar character to append to split words * @return String */ function splitLongWords(str, max, truncationChar, result) { str = str.trim() result = result || [] if (!str) return result.join(' ') || '' var words = str.split(' ') var word = words.shift() || str if (wcwidth(word) > max) { // slice is based on length, not wcwidth var i = 0 var wwidth = 0 var limit = max - wcwidth(truncationChar) while (i < word.length) { var w = wcwidth(word.charAt(i)) if(w + wwidth > limit) break wwidth += w ++i } var remainder = word.slice(i) // get remainder words.unshift(remainder) // save remainder for next loop word = word.slice(0, i) // grab truncated word word += truncationChar // add trailing … or whatever } result.push(word) return splitLongWords(words.join(' '), max, truncationChar, result) } /** * Truncate `str` into total width `max`. * If `str` is shorter than `max`, will return `str` unaltered. * * @param String str string to truncate * @param Number max total wcwidth of output string * @return String truncated str */ function truncateString(str, max) { str = str != null ?
str : '' str = String(str) if(max == Infinity) return str var i = 0 var wwidth = 0 while (i < str.length) { var w = wcwidth(str.charAt(i)) if(w + wwidth > max) break wwidth += w ++i } return str.slice(0, i) } /** * Exports */ module.exports.padRight = padRight module.exports.padCenter = padCenter module.exports.padLeft = padLeft module.exports.splitIntoLines = splitIntoLines module.exports.splitLongWords = splitLongWords module.exports.truncateString = truncateString npm_3.5.2.orig/node_modules/columnify/width.js0000644000000000000000000000021412631326456017626 0ustar 00000000000000var stripAnsi = require('strip-ansi') var wcwidth = require('wcwidth') module.exports = function(str) { return wcwidth(stripAnsi(str)) } npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/0000755000000000000000000000000012631326456022302 5ustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/.npmignore0000644000000000000000000000001512631326456024275 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/LICENSE0000644000000000000000000000305512631326456023312 0ustar 00000000000000wcwidth.js: JavaScript Porting of Markus Kuhn's wcwidth() Implementation ======================================================================= Copyright (C) 2012 by Jun Woong. This package is a JavaScript port of the `wcwidth()` implementation [by Markus Kuhn](http://www.cl.cam.ac.uk/~mgk25/ucs/wcwidth.c). Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THIS SOFTWARE IS PROVIDED ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/Readme.md0000644000000000000000000000156712631326456024030 0ustar 00000000000000# wcwidth Determine columns needed for a fixed-size wide-character string ---- wcwidth is a simple JavaScript port of [wcwidth](http://man7.org/linux/man-pages/man3/wcswidth.3.html) implemented in C by Markus Kuhn. JavaScript port [originally](https://github.com/mycoboco/wcwidth.js) written by Woong Jun (http://code.woong.org/) ## Example ```js '한'.length // => 1 wcwidth('한'); // => 2 '한글'.length // => 2 wcwidth('한글'); // => 4 ``` `wcwidth()` and its string version, `wcswidth()` are defined by IEEE Std 1003.1-2001, a.k.a. POSIX.1-2001, and return the number of columns used to represent the given wide character and string.
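As a further sketch (drawn from this package's bundled tests rather than the original Readme), widths of mixed-width strings simply add up, and `wcwidth.config()` returns a measuring function with custom widths for NUL and control characters:

```js
wcwidth('abc 字的模块'); // => 12 (3 + 1 + 8)

// control characters count as 0 by default; config() can change that
var strict = wcwidth.config({ control: -1 });
strict('abc\ndef'); // => -1 (a C0 control makes the whole string unrepresentable)
```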
Markus's implementation assumes the wide characters given to those functions are encoded in ISO 10646, which is almost true for JavaScript's characters. [Further explanation here](docs) ## License MIT npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/combining.js0000644000000000000000000000600612631326456024607 0ustar 00000000000000module.exports = [ [ 0x0300, 0x036F ], [ 0x0483, 0x0486 ], [ 0x0488, 0x0489 ], [ 0x0591, 0x05BD ], [ 0x05BF, 0x05BF ], [ 0x05C1, 0x05C2 ], [ 0x05C4, 0x05C5 ], [ 0x05C7, 0x05C7 ], [ 0x0600, 0x0603 ], [ 0x0610, 0x0615 ], [ 0x064B, 0x065E ], [ 0x0670, 0x0670 ], [ 0x06D6, 0x06E4 ], [ 0x06E7, 0x06E8 ], [ 0x06EA, 0x06ED ], [ 0x070F, 0x070F ], [ 0x0711, 0x0711 ], [ 0x0730, 0x074A ], [ 0x07A6, 0x07B0 ], [ 0x07EB, 0x07F3 ], [ 0x0901, 0x0902 ], [ 0x093C, 0x093C ], [ 0x0941, 0x0948 ], [ 0x094D, 0x094D ], [ 0x0951, 0x0954 ], [ 0x0962, 0x0963 ], [ 0x0981, 0x0981 ], [ 0x09BC, 0x09BC ], [ 0x09C1, 0x09C4 ], [ 0x09CD, 0x09CD ], [ 0x09E2, 0x09E3 ], [ 0x0A01, 0x0A02 ], [ 0x0A3C, 0x0A3C ], [ 0x0A41, 0x0A42 ], [ 0x0A47, 0x0A48 ], [ 0x0A4B, 0x0A4D ], [ 0x0A70, 0x0A71 ], [ 0x0A81, 0x0A82 ], [ 0x0ABC, 0x0ABC ], [ 0x0AC1, 0x0AC5 ], [ 0x0AC7, 0x0AC8 ], [ 0x0ACD, 0x0ACD ], [ 0x0AE2, 0x0AE3 ], [ 0x0B01, 0x0B01 ], [ 0x0B3C, 0x0B3C ], [ 0x0B3F, 0x0B3F ], [ 0x0B41, 0x0B43 ], [ 0x0B4D, 0x0B4D ], [ 0x0B56, 0x0B56 ], [ 0x0B82, 0x0B82 ], [ 0x0BC0, 0x0BC0 ], [ 0x0BCD, 0x0BCD ], [ 0x0C3E, 0x0C40 ], [ 0x0C46, 0x0C48 ], [ 0x0C4A, 0x0C4D ], [ 0x0C55, 0x0C56 ], [ 0x0CBC, 0x0CBC ], [ 0x0CBF, 0x0CBF ], [ 0x0CC6, 0x0CC6 ], [ 0x0CCC, 0x0CCD ], [ 0x0CE2, 0x0CE3 ], [ 0x0D41, 0x0D43 ], [ 0x0D4D, 0x0D4D ], [ 0x0DCA, 0x0DCA ], [ 0x0DD2, 0x0DD4 ], [ 0x0DD6, 0x0DD6 ], [ 0x0E31, 0x0E31 ], [ 0x0E34, 0x0E3A ], [ 0x0E47, 0x0E4E ], [ 0x0EB1, 0x0EB1 ], [ 0x0EB4, 0x0EB9 ], [ 0x0EBB, 0x0EBC ], [ 0x0EC8, 0x0ECD ], [ 0x0F18, 0x0F19 ], [ 0x0F35, 0x0F35 ], [ 0x0F37, 0x0F37 ], [ 0x0F39, 0x0F39 ], [ 0x0F71, 0x0F7E ], [ 0x0F80, 0x0F84 ], [ 0x0F86, 0x0F87 ], [ 0x0F90, 0x0F97 ], [ 0x0F99, 0x0FBC ], [ 0x0FC6, 0x0FC6 ], [ 0x102D, 0x1030 ], [ 0x1032, 0x1032 ], [ 0x1036, 0x1037 ], [ 0x1039, 0x1039 ], [ 0x1058, 0x1059 ], [ 0x1160, 0x11FF ], [ 0x135F, 0x135F ], [ 0x1712, 0x1714 ], [ 0x1732, 0x1734 ], [ 0x1752, 0x1753 ], [ 0x1772, 0x1773 ], [ 0x17B4, 0x17B5 ], [ 0x17B7, 0x17BD ], [ 0x17C6, 0x17C6 ], [ 0x17C9, 0x17D3 ], [ 0x17DD, 0x17DD ], [ 0x180B, 0x180D ], [ 0x18A9, 0x18A9 ], [ 0x1920, 0x1922 ], [ 0x1927, 0x1928 ], [ 0x1932, 0x1932 ], [ 0x1939, 0x193B ], [ 0x1A17, 0x1A18 ], [ 0x1B00, 0x1B03 ], [ 0x1B34, 0x1B34 ], [ 0x1B36, 0x1B3A ], [ 0x1B3C, 0x1B3C ], [ 0x1B42, 0x1B42 ], [ 0x1B6B, 0x1B73 ], [ 0x1DC0, 0x1DCA ], [ 0x1DFE, 0x1DFF ], [ 0x200B, 0x200F ], [ 0x202A, 0x202E ], [ 0x2060, 0x2063 ], [ 0x206A, 0x206F ], [ 0x20D0, 0x20EF ], [ 0x302A, 0x302F ], [ 0x3099, 0x309A ], [ 0xA806, 0xA806 ], [ 0xA80B, 0xA80B ], [ 0xA825, 0xA826 ], [ 0xFB1E, 0xFB1E ], [ 0xFE00, 0xFE0F ], [ 0xFE20, 0xFE23 ], [ 0xFEFF, 0xFEFF ], [ 0xFFF9, 0xFFFB ], [ 0x10A01, 0x10A03 ], [ 0x10A05, 0x10A06 ], [ 0x10A0C, 0x10A0F ], [ 0x10A38, 0x10A3A ], [ 0x10A3F, 0x10A3F ], [ 0x1D167, 0x1D169 ], [ 0x1D173, 0x1D182 ], [ 0x1D185, 0x1D18B ], [ 0x1D1AA, 0x1D1AD ], [ 0x1D242, 0x1D244 ], [ 0xE0001, 0xE0001 ], [ 0xE0020, 0xE007F ], [ 0xE0100, 0xE01EF ] ] npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/docs/0000755000000000000000000000000012631326456023232 5ustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/index.js0000644000000000000000000000610512631326456023751 0ustar 00000000000000"use strict" var defaults = require('defaults')
var combining = require('./combining') var DEFAULTS = { nul: 0, control: 0 } module.exports = function wcwidth(str) { return wcswidth(str, DEFAULTS) } module.exports.config = function(opts) { opts = defaults(opts || {}, DEFAULTS) return function wcwidth(str) { return wcswidth(str, opts) } } /* * The following functions define the column width of an ISO 10646 * character as follows: * - The null character (U+0000) has a column width of 0. * - Other C0/C1 control characters and DEL will lead to a return value * of -1. * - Non-spacing and enclosing combining characters (general category * code Mn or Me in the * Unicode database) have a column width of 0. * - SOFT HYPHEN (U+00AD) has a column width of 1. * - Other format characters (general category code Cf in the Unicode * database) and ZERO WIDTH * SPACE (U+200B) have a column width of 0. * - Hangul Jamo medial vowels and final consonants (U+1160-U+11FF) * have a column width of 0. * - Spacing characters in the East Asian Wide (W) or East Asian * Full-width (F) category as * defined in Unicode Technical Report #11 have a column width of 2. * - All remaining characters (including all printable ISO 8859-1 and * WGL4 characters, Unicode control characters, etc.) have a column * width of 1. * This implementation assumes that characters are encoded in ISO 10646. */ function wcswidth(str, opts) { if (typeof str !== 'string') return wcwidth(str, opts) var s = 0 for (var i = 0; i < str.length; i++) { var n = wcwidth(str.charCodeAt(i), opts) if (n < 0) return -1 s += n } return s } function wcwidth(ucs, opts) { // test for 8-bit control characters if (ucs === 0) return opts.nul if (ucs < 32 || (ucs >= 0x7f && ucs < 0xa0)) return opts.control // binary search in table of non-spacing characters if (bisearch(ucs)) return 0 // if we arrive here, ucs is not a combining or C0/C1 control character return 1 + (ucs >= 0x1100 && (ucs <= 0x115f || // Hangul Jamo init. consonants ucs == 0x2329 || ucs == 0x232a || (ucs >= 0x2e80 && ucs <= 0xa4cf && ucs != 0x303f) || // CJK ... 
Yi (ucs >= 0xac00 && ucs <= 0xd7a3) || // Hangul Syllables (ucs >= 0xf900 && ucs <= 0xfaff) || // CJK Compatibility Ideographs (ucs >= 0xfe10 && ucs <= 0xfe19) || // Vertical forms (ucs >= 0xfe30 && ucs <= 0xfe6f) || // CJK Compatibility Forms (ucs >= 0xff00 && ucs <= 0xff60) || // Fullwidth Forms (ucs >= 0xffe0 && ucs <= 0xffe6) || (ucs >= 0x20000 && ucs <= 0x2fffd) || (ucs >= 0x30000 && ucs <= 0x3fffd))); } function bisearch(ucs) { var min = 0 var max = combining.length - 1 var mid if (ucs < combining[0][0] || ucs > combining[max][1]) return false while (max >= min) { mid = Math.floor((min + max) / 2) if (ucs > combining[mid][1]) min = mid + 1 else if (ucs < combining[mid][0]) max = mid - 1 else return true } return false } npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/0000755000000000000000000000000012631326456024757 5ustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/package.json0000644000000000000000000000246312631326456024575 0ustar 00000000000000{ "name": "wcwidth", "version": "1.0.0", "description": "Port of C's wcwidth() and wcswidth()", "author": { "name": "Tim Oxley" }, "contributors": [ { "name": "Woong Jun", "email": "woong.jun@gmail.com", "url": "http://code.woong.org/" } ], "main": "index.js", "dependencies": { "defaults": "^1.0.0" }, "devDependencies": { "tape": "^2.13.4" }, "license": "MIT", "keywords": [ "wide character", "wc", "wide character string", "wcs", "terminal", "width", "wcwidth", "wcswidth" ], "directories": { "doc": "docs", "test": "test" }, "scripts": { "test": "tape test/*.js" }, "gitHead": "5bc3aafd45c89f233c27b9479c18a23ca91ba660", "_id": "wcwidth@1.0.0", "_shasum": "02d059ff7a8fc741e0f6b5da1e69b2b40daeca6f", "_from": "wcwidth@>=1.0.0 <2.0.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "timoxley", "email": "secoif@gmail.com" }, "maintainers": [ { "name": "timoxley", "email": "secoif@gmail.com" } ], "dist": { "shasum": "02d059ff7a8fc741e0f6b5da1e69b2b40daeca6f", "tarball": "http://registry.npmjs.org/wcwidth/-/wcwidth-1.0.0.tgz" }, "_resolved": "https://registry.npmjs.org/wcwidth/-/wcwidth-1.0.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/test/0000755000000000000000000000000012631326456023261 5ustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/docs/index.md0000644000000000000000000000622212631326456024665 0ustar 00000000000000### Javascript porting of Markus Kuhn's wcwidth() implementation The following explanation comes from the original C implementation: This is an implementation of wcwidth() and wcswidth() (defined in IEEE Std 1002.1-2001) for Unicode. http://www.opengroup.org/onlinepubs/007904975/functions/wcwidth.html http://www.opengroup.org/onlinepubs/007904975/functions/wcswidth.html In fixed-width output devices, Latin characters all occupy a single "cell" position of equal width, whereas ideographic CJK characters occupy two such cells. Interoperability between terminal-line applications and (teletype-style) character terminals using the UTF-8 encoding requires agreement on which character should advance the cursor by how many cell positions. No established formal standards exist at present on which Unicode character shall occupy how many cell positions on character terminals. These routines are a first attempt of defining such behavior based on simple rules applied to data provided by the Unicode Consortium. 
For some graphical characters, the Unicode standard explicitly defines a character-cell width via the definition of the East Asian FullWidth (F), Wide (W), Half-width (H), and Narrow (Na) classes. In all these cases, there is no ambiguity about which width a terminal shall use. For characters in the East Asian Ambiguous (A) class, the width choice depends purely on a preference of backward compatibility with either historic CJK or Western practice. Choosing single-width for these characters is easy to justify as the appropriate long-term solution, as the CJK practice of displaying these characters as double-width comes from historic implementation simplicity (8-bit encoded characters were displayed single-width and 16-bit ones double-width, even for Greek, Cyrillic, etc.) and not any typographic considerations. Much less clear is the choice of width for the Not East Asian (Neutral) class. Existing practice does not dictate a width for any of these characters. It would nevertheless make sense typographically to allocate two character cells to characters such as for instance EM SPACE or VOLUME INTEGRAL, which cannot be represented adequately with a single-width glyph. The following routines at present merely assign a single-cell width to all neutral characters, in the interest of simplicity. This is not entirely satisfactory and should be reconsidered before establishing a formal standard in this area. At the moment, the decision which Not East Asian (Neutral) characters should be represented by double-width glyphs cannot yet be answered by applying a simple rule from the Unicode database content. Setting up a proper standard for the behavior of UTF-8 character terminals will require a careful analysis not only of each Unicode character, but also of each presentation form, something the author of these routines has avoided to do so far. http://www.unicode.org/unicode/reports/tr11/ Markus Kuhn -- 2007-05-26 (Unicode 5.0) Permission to use, copy, modify, and distribute this software for any purpose and without fee is hereby granted. The author disclaims all warranties with regard to this software. Latest version: http://www.cl.cam.ac.uk/~mgk25/ucs/wcwidth.c npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/0000755000000000000000000000000012631326456026566 5ustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/.npmignore0000644000000000000000000000001512631326456030561 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/LICENSE0000644000000000000000000000206712631326456027600 0ustar 00000000000000The MIT License (MIT) Copyright (c) 2015 Elijah Insua Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/README.md0000644000000000000000000000147312631326456030052 0ustar 00000000000000# defaults A simple one-level options merge utility ## install `npm install defaults` ## use ```javascript var defaults = require('defaults'); var handle = function(options, fn) { options = defaults(options, { timeout: 100 }); setTimeout(function() { fn(options); }, options.timeout); } handle({ timeout: 1000 }, function() { // we're here 1000 ms later }); handle({ timeout: 10000 }, function() { // we're here 10s later }); ``` ## summary This module exports a function that takes two arguments: `options` and `defaults`. When called, it overrides all `undefined` properties in `options` with clones of the properties defined in `defaults`. Edge case: if called with a falsy `options` value, `options` is initialized to a new object before the defaults are merged onto it. ## license [MIT](LICENSE) npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/index.js0000644000000000000000000000042512631326456030234 0ustar 00000000000000var clone = require('clone'); module.exports = function(options, defaults) { options = options || {}; Object.keys(defaults).forEach(function(key) { if (typeof options[key] === 'undefined') { options[key] = clone(defaults[key]); } }); return options; };npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/0000755000000000000000000000000012631326456031243 5ustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/package.json0000644000000000000000000000242212631326456031054 0ustar 00000000000000{ "name": "defaults", "version": "1.0.3", "description": "merge single level defaults over a config object", "main": "index.js", "scripts": { "test": "node test.js" }, "repository": { "type": "git", "url": "git://github.com/tmpvar/defaults.git" }, "keywords": [ "config", "defaults" ], "author": { "name": "Elijah Insua", "email": "tmpvar@gmail.com" }, "license": "MIT", "dependencies": { "clone": "^1.0.2" }, "devDependencies": { "tap": "^2.0.0" }, "gitHead": "8831ec32a5f999bfae1a8c9bf32880971ed7c6f2", "bugs": { "url": "https://github.com/tmpvar/defaults/issues" }, "homepage": "https://github.com/tmpvar/defaults#readme", "_id": "defaults@1.0.3", "_shasum": "c656051e9817d9ff08ed881477f3fe4019f3ef7d", "_from": "defaults@>=1.0.0 <2.0.0", "_npmVersion": "2.14.4", "_nodeVersion": "4.1.1", "_npmUser": { "name": "tmpvar", "email": "tmpvar@gmail.com" }, "dist": { "shasum": "c656051e9817d9ff08ed881477f3fe4019f3ef7d", "tarball": "http://registry.npmjs.org/defaults/-/defaults-1.0.3.tgz" }, "maintainers": [ { "name": "tmpvar", "email": "tmpvar@gmail.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/defaults/-/defaults-1.0.3.tgz" } npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/test.js0000644000000000000000000000203212631326456030100 0ustar 00000000000000var defaults = require('./'), test = require('tap').test; test("ensure options is an object", function(t) { var options = defaults(false, { a : true }); t.ok(options.a); t.end() }); test("ensure defaults override keys", function(t) { var result = defaults({}, { a: false, b: true });
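// (illustrative comment, not in the original source: both keys are undefined in the empty options object, so each takes its cloned default; b becomes true and a becomes false)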
t.ok(result.b, 'b merges over undefined'); t.equal(result.a, false, 'a merges over undefined'); t.end(); }); test("ensure defined keys are not overwritten", function(t) { var result = defaults({ b: false }, { a: false, b: true }); t.equal(result.b, false, 'b not merged'); t.equal(result.a, false, 'a merges over undefined'); t.end(); }); test("ensure defaults clone nested objects", function(t) { var d = { a: [1,2,3], b: { hello : 'world' } }; var result = defaults({}, d); t.equal(result.a.length, 3, 'objects should be clones'); t.ok(result.a !== d.a, 'objects should be clones'); t.equal(Object.keys(result.b).length, 1, 'objects should be clones'); t.ok(result.b !== d.b, 'objects should be clones'); t.end(); }); npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/0000755000000000000000000000000012631326456032343 5ustar 00000000000000././@LongLink0000000000000000000000000000015700000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/.npmignorenpm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/0000644000000000000000000000001612631326456032342 0ustar 00000000000000node_modules/ ././@LongLink0000000000000000000000000000016000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/.travis.ymlnpm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/0000644000000000000000000000004412631326456032343 0ustar 00000000000000language: node_js node_js: - 0.10 ././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/LICENSEnpm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/0000644000000000000000000000205612631326456032350 0ustar 00000000000000Copyright © 2011-2015 Paul Vorbach Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
././@LongLink0000000000000000000000000000015600000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/README.mdnpm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/0000644000000000000000000000700112631326456032343 0ustar 00000000000000# clone [![build status](https://secure.travis-ci.org/pvorb/node-clone.png)](http://travis-ci.org/pvorb/node-clone) [![info badge](https://nodei.co/npm/clone.png?downloads=true&downloadRank=true&stars=true)](http://npm-stat.com/charts.html?package=clone) offers foolproof _deep cloning_ of objects, arrays, numbers, strings etc. in JavaScript. ## Installation npm install clone (It also works with browserify, ender or standalone.) ## Example ~~~ javascript var clone = require('clone'); var a, b; a = { foo: { bar: 'baz' } }; // initial value of a b = clone(a); // clone a -> b a.foo.bar = 'foo'; // change a console.log(a); // show a console.log(b); // show b ~~~ This will print: ~~~ javascript { foo: { bar: 'foo' } } { foo: { bar: 'baz' } } ~~~ **clone** masters cloning simple objects (even with custom prototype), arrays, Date objects, and RegExp objects. Everything is cloned recursively, so that you can clone dates in arrays in objects, for example. ## API `clone(val, circular, depth)` * `val` -- the value that you want to clone, any type allowed * `circular` -- boolean Call `clone` with `circular` set to `false` if you are certain that `obj` contains no circular references. This will give better performance if needed. There is no error if `undefined` or `null` is passed as `obj`. * `depth` -- depth to which the object is to be cloned (optional, defaults to infinity) `clone.clonePrototype(obj)` * `obj` -- the object that you want to clone Does a prototype clone as [described by Oran Looney](http://oranlooney.com/functional-javascript/). ## Circular References ~~~ javascript var a, b; a = { hello: 'world' }; a.myself = a; b = clone(a); console.log(b); ~~~ This will print: ~~~ javascript { hello: "world", myself: [Circular] } ~~~ So, `b.myself` points to `b`, not `a`. Neat! ## Test npm test ## Caveat Some special objects like a socket or `process.stdout`/`stderr` are known to not be cloneable. If you find other objects that cannot be cloned, please [open an issue](https://github.com/pvorb/node-clone/issues/new). ## Bugs and Issues If you encounter any bugs or issues, feel free to [open an issue at github](https://github.com/pvorb/node-clone/issues) or send me an email to . I also always like to hear from you, if you’re using my code. ## License Copyright © 2011-2015 [Paul Vorbach](http://paul.vorba.ch/) and [contributors](https://github.com/pvorb/node-clone/graphs/contributors). Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000015500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/clone.jsnpm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/0000644000000000000000000001017312631326456032347 0ustar 00000000000000var clone = (function() { 'use strict'; /** * Clones (copies) an Object using deep copying. * * This function supports circular references by default, but if you are certain * there are no circular references in your object, you can save some CPU time * by calling clone(obj, false). * * Caution: if `circular` is false and `parent` contains circular references, * your program may enter an infinite loop and crash. * * @param `parent` - the object to be cloned * @param `circular` - set to true if the object to be cloned may contain * circular references. (optional - true by default) * @param `depth` - set to a number if the object is only to be cloned to * a particular depth. (optional - defaults to Infinity) * @param `prototype` - sets the prototype to be used when cloning an object. * (optional - defaults to parent prototype). */ function clone(parent, circular, depth, prototype) { var filter; if (typeof circular === 'object') { depth = circular.depth; prototype = circular.prototype; filter = circular.filter; circular = circular.circular } // maintain two arrays for circular references, where corresponding parents // and children have the same index var allParents = []; var allChildren = []; var useBuffer = typeof Buffer != 'undefined'; if (typeof circular == 'undefined') circular = true; if (typeof depth == 'undefined') depth = Infinity; // recurse this function so we don't reset allParents and allChildren function _clone(parent, depth) { // cloning null always returns null if (parent === null) return null; if (depth == 0) return parent; var child; var proto; if (typeof parent != 'object') { return parent; } if (clone.__isArray(parent)) { child = []; } else if (clone.__isRegExp(parent)) { child = new RegExp(parent.source, __getRegExpFlags(parent)); if (parent.lastIndex) child.lastIndex = parent.lastIndex; } else if (clone.__isDate(parent)) { child = new Date(parent.getTime()); } else if (useBuffer && Buffer.isBuffer(parent)) { child = new Buffer(parent.length); parent.copy(child); return child; } else { if (typeof prototype == 'undefined') { proto = Object.getPrototypeOf(parent); child = Object.create(proto); } else { child = Object.create(prototype); proto = prototype; } } if (circular) { var index = allParents.indexOf(parent); if (index != -1) { return allChildren[index]; } allParents.push(parent); allChildren.push(child); } for (var i in parent) { var attrs; if (proto) { attrs = Object.getOwnPropertyDescriptor(proto, i); } if (attrs && attrs.set == null) { continue; } child[i] = _clone(parent[i], depth - 1); } return child; } return _clone(parent, depth); } /** * Simple flat clone using prototype, accepts only objects, usefull for property * override on FLAT configuration object (no nested props). * * USE WITH CAUTION! This may not behave as you wish if you do not know how this * works. 
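* (Illustrative note, not from the original source: the returned clone is an empty object whose prototype IS `parent`, so property reads fall through to `parent` while assignments create shadowing own-properties on the clone.)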
*/ clone.clonePrototype = function clonePrototype(parent) { if (parent === null) return null; var c = function () {}; c.prototype = parent; return new c(); }; // private utility functions function __objToStr(o) { return Object.prototype.toString.call(o); }; clone.__objToStr = __objToStr; function __isDate(o) { return typeof o === 'object' && __objToStr(o) === '[object Date]'; }; clone.__isDate = __isDate; function __isArray(o) { return typeof o === 'object' && __objToStr(o) === '[object Array]'; }; clone.__isArray = __isArray; function __isRegExp(o) { return typeof o === 'object' && __objToStr(o) === '[object RegExp]'; }; clone.__isRegExp = __isRegExp; function __getRegExpFlags(re) { var flags = ''; if (re.global) flags += 'g'; if (re.ignoreCase) flags += 'i'; if (re.multiline) flags += 'm'; return flags; }; clone.__getRegExpFlags = __getRegExpFlags; return clone; })(); if (typeof module === 'object' && module.exports) { module.exports = clone; } ././@LongLink0000000000000000000000000000016100000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/package.jsonnpm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/0000644000000000000000000001436312631326456032354 0ustar 00000000000000{ "name": "clone", "description": "deep cloning of objects and arrays", "tags": [ "clone", "object", "array", "function", "date" ], "version": "1.0.2", "repository": { "type": "git", "url": "git://github.com/pvorb/node-clone.git" }, "bugs": { "url": "https://github.com/pvorb/node-clone/issues" }, "main": "clone.js", "author": { "name": "Paul Vorbach", "email": "paul@vorba.ch", "url": "http://paul.vorba.ch/" }, "contributors": [ { "name": "Blake Miner", "email": "miner.blake@gmail.com", "url": "http://www.blakeminer.com/" }, { "name": "Tian You", "email": "axqd001@gmail.com", "url": "http://blog.axqd.net/" }, { "name": "George Stagas", "email": "gstagas@gmail.com", "url": "http://stagas.com/" }, { "name": "Tobiasz Cudnik", "email": "tobiasz.cudnik@gmail.com", "url": "https://github.com/TobiaszCudnik" }, { "name": "Pavel Lang", "email": "langpavel@phpskelet.org", "url": "https://github.com/langpavel" }, { "name": "Dan MacTough", "url": "http://yabfog.com/" }, { "name": "w1nk", "url": "https://github.com/w1nk" }, { "name": "Hugh Kennedy", "url": "http://twitter.com/hughskennedy" }, { "name": "Dustin Diaz", "url": "http://dustindiaz.com" }, { "name": "Ilya Shaisultanov", "url": "https://github.com/diversario" }, { "name": "Nathan MacInnes", "email": "nathan@macinn.es", "url": "http://macinn.es/" }, { "name": "Benjamin E. Coe", "email": "ben@npmjs.com", "url": "https://twitter.com/benjamincoe" }, { "name": "Nathan Zadoks", "url": "https://github.com/nathan7" }, { "name": "Róbert Oroszi", "email": "robert+gh@oroszi.net", "url": "https://github.com/oroce" }, { "name": "Aurélio A. 
Heckert", "url": "http://softwarelivre.org/aurium" }, { "name": "Guy Ellis", "url": "http://www.guyellisrocks.com/" } ], "license": "MIT", "engines": { "node": ">=0.8" }, "dependencies": {}, "devDependencies": { "nodeunit": "~0.9.0" }, "optionalDependencies": {}, "scripts": { "test": "nodeunit test.js" }, "readme": "# clone\n\n[![build status](https://secure.travis-ci.org/pvorb/node-clone.png)](http://travis-ci.org/pvorb/node-clone)\n\n[![info badge](https://nodei.co/npm/clone.png?downloads=true&downloadRank=true&stars=true)](http://npm-stat.com/charts.html?package=clone)\n\noffers foolproof _deep cloning_ of objects, arrays, numbers, strings etc. in JavaScript.\n\n\n## Installation\n\n npm install clone\n\n(It also works with browserify, ender or standalone.)\n\n\n## Example\n\n~~~ javascript\nvar clone = require('clone');\n\nvar a, b;\n\na = { foo: { bar: 'baz' } }; // initial value of a\n\nb = clone(a); // clone a -> b\na.foo.bar = 'foo'; // change a\n\nconsole.log(a); // show a\nconsole.log(b); // show b\n~~~\n\nThis will print:\n\n~~~ javascript\n{ foo: { bar: 'foo' } }\n{ foo: { bar: 'baz' } }\n~~~\n\n**clone** masters cloning simple objects (even with custom prototype), arrays,\nDate objects, and RegExp objects. Everything is cloned recursively, so that you\ncan clone dates in arrays in objects, for example.\n\n\n## API\n\n`clone(val, circular, depth)`\n\n * `val` -- the value that you want to clone, any type allowed\n * `circular` -- boolean\n\n Call `clone` with `circular` set to `false` if you are certain that `obj`\n contains no circular references. This will give better performance if needed.\n There is no error if `undefined` or `null` is passed as `obj`.\n * `depth` -- depth to which the object is to be cloned (optional,\n defaults to infinity)\n\n`clone.clonePrototype(obj)`\n\n * `obj` -- the object that you want to clone\n\nDoes a prototype clone as\n[described by Oran Looney](http://oranlooney.com/functional-javascript/).\n\n\n## Circular References\n\n~~~ javascript\nvar a, b;\n\na = { hello: 'world' };\n\na.myself = a;\nb = clone(a);\n\nconsole.log(b);\n~~~\n\nThis will print:\n\n~~~ javascript\n{ hello: \"world\", myself: [Circular] }\n~~~\n\nSo, `b.myself` points to `b`, not `a`. Neat!\n\n\n## Test\n\n npm test\n\n\n## Caveat\n\nSome special objects like a socket or `process.stdout`/`stderr` are known to not\nbe cloneable. If you find other objects that cannot be cloned, please [open an\nissue](https://github.com/pvorb/node-clone/issues/new).\n\n\n## Bugs and Issues\n\nIf you encounter any bugs or issues, feel free to [open an issue at\ngithub](https://github.com/pvorb/node-clone/issues) or send me an email to\n. 
I also always like to hear from you, if you’re using my code.\n\n## License\n\nCopyright © 2011-2015 [Paul Vorbach](http://paul.vorba.ch/) and\n[contributors](https://github.com/pvorb/node-clone/graphs/contributors).\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the “Software”), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of\nthe Software, and to permit persons to whom the Software is furnished to do so,\nsubject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS\nFOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR\nCOPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER\nIN AN ACTION OF CONTRACT, TORT OR OTHERWISE, OUT OF OR IN CONNECTION WITH THE\nSOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n", "readmeFilename": "README.md", "homepage": "https://github.com/pvorb/node-clone#readme", "_id": "clone@1.0.2", "_shasum": "260b7a99ebb1edfe247538175f783243cb19d149", "_resolved": "https://registry.npmjs.org/clone/-/clone-1.0.2.tgz", "_from": "clone@>=1.0.2 <2.0.0" } ././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/test-apart-ctx.htmlnpm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/0000644000000000000000000000111012631326456032336 0ustar 00000000000000 Clone Test-Suite (Browser) ././@LongLink0000000000000000000000000000015600000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/test.htmlnpm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/0000644000000000000000000001070712631326456032352 0ustar 00000000000000 Clone Test-Suite (Browser)

    Tests started: ; Tests finished: .
      ././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/test.jsnpm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/node_modules/defaults/node_modules/clone/0000644000000000000000000001772012631326456032354 0ustar 00000000000000var clone = require('./'); function inspect(obj) { seen = []; return JSON.stringify(obj, function (key, val) { if (val != null && typeof val == "object") { if (seen.indexOf(val) >= 0) { return '[cyclic]'; } seen.push(val); } return val; }); } // Creates a new VM in node, or an iframe in a browser in order to run the // script function apartContext(context, script, callback) { var vm = require('vm'); if (vm) { var ctx = vm.createContext({ ctx: context }); callback(vm.runInContext(script, ctx)); } else if (document && document.createElement) { var iframe = document.createElement('iframe'); iframe.style.display = 'none'; document.body.appendChild(iframe); var myCtxId = 'tmpCtx' + Math.random(); window[myCtxId] = context; iframe.src = 'test-apart-ctx.html?' + myCtxId + '&' + encodeURIComponent(script); iframe.onload = function() { try { callback(iframe.contentWindow.results); } catch (e) { throw e; } }; } else { console.log('WARNING: cannot create an apart context.'); } } exports["clone string"] = function (test) { test.expect(2); // how many tests? var a = "foo"; test.strictEqual(clone(a), a); a = ""; test.strictEqual(clone(a), a); test.done(); }; exports["clone number"] = function (test) { test.expect(5); // how many tests? var a = 0; test.strictEqual(clone(a), a); a = 1; test.strictEqual(clone(a), a); a = -1000; test.strictEqual(clone(a), a); a = 3.1415927; test.strictEqual(clone(a), a); a = -3.1415927; test.strictEqual(clone(a), a); test.done(); }; exports["clone date"] = function (test) { test.expect(3); // how many tests? var a = new Date; var c = clone(a); test.ok(!!a.getUTCDate && !!a.toUTCString); test.ok(!!c.getUTCDate && !!c.toUTCString); test.equal(a.getTime(), c.getTime()); test.done(); }; exports["clone object"] = function (test) { test.expect(1); // how many tests? var a = { foo: { bar: "baz" } }; var b = clone(a); test.deepEqual(b, a); test.done(); }; exports["clone array"] = function (test) { test.expect(2); // how many tests? var a = [ { foo: "bar" }, "baz" ]; var b = clone(a); test.ok(b instanceof Array); test.deepEqual(b, a); test.done(); }; exports["clone buffer"] = function (test) { if (typeof Buffer == 'undefined') { return test.done(); } test.expect(1); var a = new Buffer("this is a test buffer"); var b = clone(a); // no underscore equal since it has no concept of Buffers test.deepEqual(b, a); test.done(); }; exports["clone regexp"] = function (test) { test.expect(5); var a = /abc123/gi; var b = clone(a); test.deepEqual(b, a); var c = /a/g; test.ok(c.lastIndex === 0); c.exec('123a456a'); test.ok(c.lastIndex === 4); var d = clone(c); test.ok(d.global); test.ok(d.lastIndex === 4); test.done(); }; exports["clone object containing array"] = function (test) { test.expect(1); // how many tests? var a = { arr1: [ { a: '1234', b: '2345' } ], arr2: [ { c: '345', d: '456' } ] }; var b = clone(a); test.deepEqual(b, a); test.done(); }; exports["clone object with circular reference"] = function (test) { test.expect(8); // how many tests? 
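// (illustrative comment, not in the original source: the fixture below wires several self-references; a.loop points back to a and c.aloop points back up to a, so a naive recursive copy would never terminate)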
var c = [1, "foo", {'hello': 'bar'}, function () {}, false, [2]]; var b = [c, 2, 3, 4]; var a = {'b': b, 'c': c}; a.loop = a; a.loop2 = a; c.loop = c; c.aloop = a; var aCopy = clone(a); test.ok(a != aCopy); test.ok(a.c != aCopy.c); test.ok(aCopy.c == aCopy.b[0]); test.ok(aCopy.c.loop.loop.aloop == aCopy); test.ok(aCopy.c[0] == a.c[0]); test.ok(eq(a, aCopy)); aCopy.c[0] = 2; test.ok(!eq(a, aCopy)); aCopy.c = "2"; test.ok(!eq(a, aCopy)); function eq(x, y) { return inspect(x) === inspect(y); } test.done(); }; exports['clone prototype'] = function (test) { test.expect(3); // how many tests? var a = { a: "aaa", x: 123, y: 45.65 }; var b = clone.clonePrototype(a); test.strictEqual(b.a, a.a); test.strictEqual(b.x, a.x); test.strictEqual(b.y, a.y); test.done(); }; exports['clone within an apart context'] = function (test) { var results = apartContext({ clone: clone }, "results = ctx.clone({ a: [1, 2, 3], d: new Date(), r: /^foo$/ig })", function (results) { test.ok(results.a.constructor.toString() === Array.toString()); test.ok(results.d.constructor.toString() === Date.toString()); test.ok(results.r.constructor.toString() === RegExp.toString()); test.done(); }); }; exports['clone object with no constructor'] = function (test) { test.expect(3); var n = null; var a = { foo: 'bar' }; a.__proto__ = n; test.ok(typeof a === 'object'); test.ok(typeof a !== null); var b = clone(a); test.ok(a.foo, b.foo); test.done(); }; exports['clone object with depth argument'] = function (test) { test.expect(6); var a = { foo: { bar : { baz : 'qux' } } }; var b = clone(a, false, 1); test.deepEqual(b, a); test.notEqual(b, a); test.strictEqual(b.foo, a.foo); b = clone(a, true, 2); test.deepEqual(b, a); test.notEqual(b.foo, a.foo); test.strictEqual(b.foo.bar, a.foo.bar); test.done(); }; exports['maintain prototype chain in clones'] = function (test) { test.expect(1); function T() {} var a = new T(); var b = clone(a); test.strictEqual(Object.getPrototypeOf(a), Object.getPrototypeOf(b)); test.done(); }; exports['parent prototype is overriden with prototype provided'] = function (test) { test.expect(1); function T() {} var a = new T(); var b = clone(a, true, Infinity, null); test.strictEqual(b.__defineSetter__, undefined); test.done(); }; exports['clone object with null children'] = function (test) { test.expect(1); var a = { foo: { bar: null, baz: { qux: false } } }; var b = clone(a); test.deepEqual(b, a); test.done(); }; exports['clone instance with getter'] = function (test) { test.expect(1); function Ctor() {}; Object.defineProperty(Ctor.prototype, 'prop', { configurable: true, enumerable: true, get: function() { return 'value'; } }); var a = new Ctor(); var b = clone(a); test.strictEqual(b.prop, 'value'); test.done(); }; exports['get RegExp flags'] = function (test) { test.strictEqual(clone.__getRegExpFlags(/a/), '' ); test.strictEqual(clone.__getRegExpFlags(/a/i), 'i' ); test.strictEqual(clone.__getRegExpFlags(/a/g), 'g' ); test.strictEqual(clone.__getRegExpFlags(/a/gi), 'gi'); test.strictEqual(clone.__getRegExpFlags(/a/m), 'm' ); test.done(); }; exports["recognize Array object"] = function (test) { var results = apartContext(null, "results = [1, 2, 3]", function(alien) { var local = [4, 5, 6]; test.ok(clone.__isArray(alien)); // recognize in other context. test.ok(clone.__isArray(local)); // recognize in local context. 
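// (illustrative comment, not in the original source: the negative checks below guard against cross-realm values being misclassified as the other built-in types)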
test.ok(!clone.__isDate(alien)); test.ok(!clone.__isDate(local)); test.ok(!clone.__isRegExp(alien)); test.ok(!clone.__isRegExp(local)); test.done(); }); }; exports["recognize Date object"] = function (test) { var results = apartContext(null, "results = new Date()", function(alien) { var local = new Date(); test.ok(clone.__isDate(alien)); // recognize in other context. test.ok(clone.__isDate(local)); // recognize in local context. test.ok(!clone.__isArray(alien)); test.ok(!clone.__isArray(local)); test.ok(!clone.__isRegExp(alien)); test.ok(!clone.__isRegExp(local)); test.done(); }); }; exports["recognize RegExp object"] = function (test) { var results = apartContext(null, "results = /foo/", function(alien) { var local = /bar/; test.ok(clone.__isRegExp(alien)); // recognize in other context. test.ok(clone.__isRegExp(local)); // recognize in local context. test.ok(!clone.__isArray(alien)); test.ok(!clone.__isArray(local)); test.ok(!clone.__isDate(alien)); test.ok(!clone.__isDate(local)); test.done(); }); }; npm_3.5.2.orig/node_modules/columnify/node_modules/wcwidth/test/index.js0000644000000000000000000000267712631326456024742 0ustar 00000000000000"use strict" var wcwidth = require('../') var test = require('tape') test('handles regular strings', function(t) { t.strictEqual(wcwidth('abc'), 3) t.end() }) test('handles multibyte strings', function(t) { t.strictEqual(wcwidth('字的模块'), 8) t.end() }) test('handles multibyte characters mixed with regular characters', function(t) { t.strictEqual(wcwidth('abc 字的模块'), 12) t.end() }) test('ignores control characters e.g. \\n', function(t) { t.strictEqual(wcwidth('abc\n字的模块\ndef'), 14) t.end() }) test('ignores bad input', function(t) { t.strictEqual(wcwidth(''), 0) t.strictEqual(wcwidth(3), 0) t.strictEqual(wcwidth({}), 0) t.strictEqual(wcwidth([]), 0) t.strictEqual(wcwidth(), 0) t.end() }) test('ignores nul (charcode 0)', function(t) { t.strictEqual(wcwidth(String.fromCharCode(0)), 0) t.end() }) test('ignores nul mixed with chars', function(t) { t.strictEqual(wcwidth('a' + String.fromCharCode(0) + '\n字的'), 5) t.end() }) test('can have custom value for nul', function(t) { t.strictEqual(wcwidth.config({ nul: 10 })(String.fromCharCode(0) + 'a字的'), 15) t.end() }) test('can have custom control char value', function(t) { t.strictEqual(wcwidth.config({ control: 1 })('abc\n字的模块\ndef'), 16) t.end() }) test('negative custom control chars == -1', function(t) { t.strictEqual(wcwidth.config({ control: -1 })('abc\n字的模块\ndef'), -1) t.end() }) npm_3.5.2.orig/node_modules/config-chain/.npmignore0000644000000000000000000000005212631326456020470 0ustar 00000000000000node_modules node_modules/* npm_debug.log npm_3.5.2.orig/node_modules/config-chain/LICENCE0000644000000000000000000000205612631326456017464 0ustar 00000000000000Copyright (c) 2011 Dominic Tarr Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.npm_3.5.2.orig/node_modules/config-chain/index.js0000755000000000000000000001605012631326456020146 0ustar 00000000000000var ProtoList = require('proto-list') , path = require('path') , fs = require('fs') , ini = require('ini') , EE = require('events').EventEmitter , url = require('url') , http = require('http') var exports = module.exports = function () { var args = [].slice.call(arguments) , conf = new ConfigChain() while(args.length) { var a = args.shift() if(a) conf.push ( 'string' === typeof a ? json(a) : a ) } return conf } //recursively find a file... var find = exports.find = function () { var rel = path.join.apply(null, [].slice.call(arguments)) function find(start, rel) { var file = path.join(start, rel) try { fs.statSync(file) return file } catch (err) { if(path.dirname(start) !== start) // root return find(path.dirname(start), rel) } } return find(__dirname, rel) } var parse = exports.parse = function (content, file, type) { content = '' + content // if we don't know what it is, try json and fall back to ini // if we know what it is, then it must be that. if (!type) { try { return JSON.parse(content) } catch (er) { return ini.parse(content) } } else if (type === 'json') { if (this.emit) { try { return JSON.parse(content) } catch (er) { this.emit('error', er) } } else { return JSON.parse(content) } } else { return ini.parse(content) } } var json = exports.json = function () { var args = [].slice.call(arguments).filter(function (arg) { return arg != null }) var file = path.join.apply(null, args) var content try { content = fs.readFileSync(file,'utf-8') } catch (err) { return } return parse(content, file, 'json') } var env = exports.env = function (prefix, env) { env = env || process.env var obj = {} var l = prefix.length for(var k in env) { if(k.indexOf(prefix) === 0) obj[k.substring(l)] = env[k] } return obj } exports.ConfigChain = ConfigChain function ConfigChain () { EE.apply(this) ProtoList.apply(this, arguments) this._awaiting = 0 this._saving = 0 this.sources = {} } // multi-inheritance-ish var extras = { constructor: { value: ConfigChain } } Object.keys(EE.prototype).forEach(function (k) { extras[k] = Object.getOwnPropertyDescriptor(EE.prototype, k) }) ConfigChain.prototype = Object.create(ProtoList.prototype, extras) ConfigChain.prototype.del = function (key, where) { // if not specified where, then delete from the whole chain, scorched // earth style if (where) { var target = this.sources[where] target = target && target.data if (!target) { return this.emit('error', new Error('not found '+where)) } delete target[key] } else { for (var i = 0, l = this.list.length; i < l; i ++) { delete this.list[i][key] } } return this } ConfigChain.prototype.set = function (key, value, where) { var target if (where) { target = this.sources[where] target = target && target.data if (!target) { return this.emit('error', new Error('not found '+where)) } } else { target = this.list[0] if (!target) { return this.emit('error', new Error('cannot set, no confs!')) } } target[key] = value return this } ConfigChain.prototype.get = function (key, 
where) { if (where) { where = this.sources[where] if (where) where = where.data if (where && Object.hasOwnProperty.call(where, key)) return where[key] return undefined } return this.list[0][key] } ConfigChain.prototype.save = function (where, type, cb) { if (typeof type === 'function') cb = type, type = null var target = this.sources[where] if (!target || !(target.path || target.source) || !target.data) { // TODO: maybe save() to a url target could be a PUT or something? // would be easy to swap out with a reddis type thing, too return this.emit('error', new Error('bad save target: '+where)) } if (target.source) { var pref = target.prefix || '' Object.keys(target.data).forEach(function (k) { target.source[pref + k] = target.data[k] }) return this } var type = type || target.type var data = target.data if (target.type === 'json') { data = JSON.stringify(data) } else { data = ini.stringify(data) } this._saving ++ fs.writeFile(target.path, data, 'utf8', function (er) { this._saving -- if (er) { if (cb) return cb(er) else return this.emit('error', er) } if (this._saving === 0) { if (cb) cb() this.emit('save') } }.bind(this)) return this } ConfigChain.prototype.addFile = function (file, type, name) { name = name || file var marker = {__source__:name} this.sources[name] = { path: file, type: type } this.push(marker) this._await() fs.readFile(file, 'utf8', function (er, data) { if (er) this.emit('error', er) this.addString(data, file, type, marker) }.bind(this)) return this } ConfigChain.prototype.addEnv = function (prefix, env, name) { name = name || 'env' var data = exports.env(prefix, env) this.sources[name] = { data: data, source: env, prefix: prefix } return this.add(data, name) } ConfigChain.prototype.addUrl = function (req, type, name) { this._await() var href = url.format(req) name = name || href var marker = {__source__:name} this.sources[name] = { href: href, type: type } this.push(marker) http.request(req, function (res) { var c = [] var ct = res.headers['content-type'] if (!type) { type = ct.indexOf('json') !== -1 ? 'json' : ct.indexOf('ini') !== -1 ? 'ini' : href.match(/\.json$/) ? 'json' : href.match(/\.ini$/) ? 'ini' : null marker.type = type } res.on('data', c.push.bind(c)) .on('end', function () { this.addString(Buffer.concat(c), href, type, marker) }.bind(this)) .on('error', this.emit.bind(this, 'error')) }.bind(this)) .on('error', this.emit.bind(this, 'error')) .end() return this } ConfigChain.prototype.addString = function (data, file, type, marker) { data = this.parse(data, file, type) this.add(data, marker) return this } ConfigChain.prototype.add = function (data, marker) { if (marker && typeof marker === 'object') { var i = this.list.indexOf(marker) if (i === -1) { return this.emit('error', new Error('bad marker')) } this.splice(i, 1, data) marker = marker.__source__ this.sources[marker] = this.sources[marker] || {} this.sources[marker].data = data // we were waiting for this. maybe emit 'load' this._resolve() } else { if (typeof marker === 'string') { this.sources[marker] = this.sources[marker] || {} this.sources[marker].data = data } // trigger the load event if nothing was already going to do so. 
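// A synchronous add still emits 'load' asynchronously: bump the awaiting counter now and schedule the matching _resolve() on the next tick.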
this._await() this.push(data) process.nextTick(this._resolve.bind(this)) } return this } ConfigChain.prototype.parse = exports.parse ConfigChain.prototype._await = function () { this._awaiting++ } ConfigChain.prototype._resolve = function () { this._awaiting-- if (this._awaiting === 0) this.emit('load', this) } npm_3.5.2.orig/node_modules/config-chain/node_modules/0000755000000000000000000000000012631326456021151 5ustar 00000000000000npm_3.5.2.orig/node_modules/config-chain/package.json0000644000000000000000000001511512631326456020765 0ustar 00000000000000{ "name": "config-chain", "version": "1.1.9", "licenses": [ { "type": "MIT", "url": "https://raw.githubusercontent.com/dominictarr/config-chain/master/LICENCE" } ], "description": "HANDLE CONFIGURATION ONCE AND FOR ALL", "homepage": "http://github.com/dominictarr/config-chain", "repository": { "type": "git", "url": "git+https://github.com/dominictarr/config-chain.git" }, "dependencies": { "proto-list": "~1.2.1", "ini": "1" }, "devDependencies": { "tap": "0.3.0" }, "author": { "name": "Dominic Tarr", "email": "dominic.tarr@gmail.com", "url": "http://dominictarr.com" }, "scripts": { "test": "tap test/" }, "readme": "#config-chain\n\nUSE THIS MODULE TO LOAD ALL YOUR CONFIGURATIONS\n\n``` js\n\n //npm install config-chain\n\n var cc = require('config-chain')\n , opts = require('optimist').argv //ALWAYS USE OPTIMIST FOR COMMAND LINE OPTIONS.\n , env = opts.env || process.env.YOUR_APP_ENV || 'dev' //SET YOUR ENV LIKE THIS.\n\n // EACH ARG TO CONFIGURATOR IS LOADED INTO CONFIGURATION CHAIN\n // EARLIER ITEMS OVERIDE LATER ITEMS\n // PUTS COMMAND LINE OPTS FIRST, AND DEFAULTS LAST!\n\n //strings are interpereted as filenames.\n //will be loaded synchronously\n\n var conf =\n cc(\n //OVERRIDE SETTINGS WITH COMMAND LINE OPTS\n opts,\n\n //ENV VARS IF PREFIXED WITH 'myApp_'\n\n cc.env('myApp_'), //myApp_foo = 'like this'\n\n //FILE NAMED BY ENV\n path.join(__dirname, 'config.' + env + '.json'),\n\n //IF `env` is PRODUCTION\n env === 'prod'\n ? path.join(__dirname, 'special.json') //load a special file\n : null //NULL IS IGNORED!\n\n //SUBDIR FOR ENV CONFIG\n path.join(__dirname, 'config', env, 'config.json'),\n\n //SEARCH PARENT DIRECTORIES FROM CURRENT DIR FOR FILE\n cc.find('config.json'),\n\n //PUT DEFAULTS LAST\n {\n host: 'localhost'\n port: 8000\n })\n\n var host = conf.get('host')\n\n // or\n\n var host = conf.store.host\n\n```\n\nFINALLY, EASY FLEXIBLE CONFIGURATIONS!\n\n##see also: [proto-list](https://github.com/isaacs/proto-list/)\n\nWHATS THAT YOU SAY?\n\nYOU WANT A \"CLASS\" SO THAT YOU CAN DO CRAYCRAY JQUERY CRAPS?\n\nEXTEND WITH YOUR OWN FUNCTIONALTY!?\n\n## CONFIGCHAIN LIVES TO SERVE ONLY YOU!\n\n```javascript\nvar cc = require('config-chain')\n\n// all the stuff you did before\nvar config = cc({\n some: 'object'\n },\n cc.find('config.json'),\n cc.env('myApp_')\n )\n // CONFIGS AS A SERVICE, aka \"CaaS\", aka EVERY DEVOPS DREAM OMG!\n .addUrl('http://configurator:1234/my-configs')\n // ASYNC FTW!\n .addFile('/path/to/file.json')\n\n // OBJECTS ARE OK TOO, they're SYNC but they still ORDER RIGHT\n // BECAUSE PROMISES ARE USED BUT NO, NOT *THOSE* PROMISES, JUST\n // ACTUAL PROMISES LIKE YOU MAKE TO YOUR MOM, KEPT OUT OF LOVE\n .add({ another: 'object' })\n\n // DIE A THOUSAND DEATHS IF THIS EVER HAPPENS!!\n .on('error', function (er) {\n // IF ONLY THERE WAS SOMETHIGN HARDER THAN THROW\n // MY SORROW COULD BE ADEQUATELY EXPRESSED. 
/o\\\n throw er\n })\n\n // THROW A PARTY IN YOUR FACE WHEN ITS ALL LOADED!!\n .on('load', function (config) {\n console.awesome('HOLY SHIT!')\n })\n```\n\n# BORING API DOCS\n\n## cc(...args)\n\nMAKE A CHAIN AND ADD ALL THE ARGS.\n\nIf the arg is a STRING, then it shall be a JSON FILENAME.\n\nSYNC I/O!\n\nRETURN THE CHAIN!\n\n## cc.json(...args)\n\nJoin the args INTO A JSON FILENAME!\n\nSYNC I/O!\n\n## cc.find(relativePath)\n\nSEEK the RELATIVE PATH by climbing the TREE OF DIRECTORIES.\n\nRETURN THE FOUND PATH!\n\nSYNC I/O!\n\n## cc.parse(content, file, type)\n\nParse the content string, and guess the type from either the\nspecified type or the filename.\n\nRETURN THE RESULTING OBJECT!\n\nNO I/O!\n\n## cc.env(prefix, env=process.env)\n\nGet all the keys on the provided env object (or process.env) which are\nprefixed by the specified prefix, and put the values on a new object.\n\nRETURN THE RESULTING OBJECT!\n\nNO I/O!\n\n## cc.ConfigChain()\n\nThe ConfigChain class for CRAY CRAY JQUERY STYLE METHOD CHAINING!\n\nOne of these is returned by the main exported function, as well.\n\nIt inherits (prototypically) from\n[ProtoList](https://github.com/isaacs/proto-list/), and also inherits\n(parasitically) from\n[EventEmitter](http://nodejs.org/api/events.html#events_class_events_eventemitter)\n\nIt has all the methods from both, and except where noted, they are\nunchanged.\n\n### LET IT BE KNOWN THAT chain IS AN INSTANCE OF ConfigChain.\n\n## chain.sources\n\nA list of all the places where it got stuff. The keys are the names\npassed to addFile or addUrl etc, and the value is an object with some\ninfo about the data source.\n\n## chain.addFile(filename, type, [name=filename])\n\nFilename is the name of the file. Name is an arbitrary string to be\nused later if you desire. Type is either 'ini' or 'json', and will\ntry to guess intelligently if omitted.\n\nLoaded files can be saved later.\n\n## chain.addUrl(url, type, [name=url])\n\nSame as the filename thing, but with a url.\n\nCan't be saved later.\n\n## chain.addEnv(prefix, env, [name='env'])\n\nAdd all the keys from the env object that start with the prefix.\n\n## chain.addString(data, file, type, [name])\n\nParse the string and add it to the set. (Mainly used internally.)\n\n## chain.add(object, [name])\n\nAdd the object to the set.\n\n## chain.root {Object}\n\nThe root from which all the other config objects in the set descend\nprototypically.\n\nPut your defaults here.\n\n## chain.set(key, value, name)\n\nSet the key to the value on the named config object. If name is\nunset, then set it on the first config object in the set. 
(That is,\nthe one with the highest priority, which was added first.)\n\n## chain.get(key, [name])\n\nGet the key from the named config object explicitly, or from the\nresolved configs if not specified.\n\n## chain.save(name, type)\n\nWrite the named config object back to its origin.\n\nCurrently only supported for env and file config types.\n\nFor files, encode the data according to the type.\n\n## chain.on('save', function () {})\n\nWhen one or more files are saved, emits `save` event when they're all\nsaved.\n\n## chain.on('load', function (chain) {})\n\nWhen the config chain has loaded all the specified files and urls and\nsuch, the 'load' event fires.\n", "readmeFilename": "readme.markdown", "bugs": { "url": "https://github.com/dominictarr/config-chain/issues" }, "_id": "config-chain@1.1.9", "_shasum": "39ac7d4dca84faad926124c54cff25a53aa8bf6e", "_resolved": "https://registry.npmjs.org/config-chain/-/config-chain-1.1.9.tgz", "_from": "config-chain@>=1.1.9 <1.2.0" } npm_3.5.2.orig/node_modules/config-chain/readme.markdown0000644000000000000000000001251312631326456021477 0ustar 00000000000000# config-chain USE THIS MODULE TO LOAD ALL YOUR CONFIGURATIONS ``` js //npm install config-chain var cc = require('config-chain') , opts = require('optimist').argv //ALWAYS USE OPTIMIST FOR COMMAND LINE OPTIONS. , env = opts.env || process.env.YOUR_APP_ENV || 'dev' //SET YOUR ENV LIKE THIS. // EACH ARG TO CONFIGURATOR IS LOADED INTO CONFIGURATION CHAIN // EARLIER ITEMS OVERRIDE LATER ITEMS // PUTS COMMAND LINE OPTS FIRST, AND DEFAULTS LAST! //strings are interpreted as filenames. //will be loaded synchronously var conf = cc( //OVERRIDE SETTINGS WITH COMMAND LINE OPTS opts, //ENV VARS IF PREFIXED WITH 'myApp_' cc.env('myApp_'), //myApp_foo = 'like this' //FILE NAMED BY ENV path.join(__dirname, 'config.' + env + '.json'), //IF `env` is PRODUCTION env === 'prod' ? path.join(__dirname, 'special.json') //load a special file : null, //NULL IS IGNORED! //SUBDIR FOR ENV CONFIG path.join(__dirname, 'config', env, 'config.json'), //SEARCH PARENT DIRECTORIES FROM CURRENT DIR FOR FILE cc.find('config.json'), //PUT DEFAULTS LAST { host: 'localhost', port: 8000 }) var host = conf.get('host') // or var host = conf.store.host ``` FINALLY, EASY FLEXIBLE CONFIGURATIONS! ## see also: [proto-list](https://github.com/isaacs/proto-list/) WHATS THAT YOU SAY? YOU WANT A "CLASS" SO THAT YOU CAN DO CRAYCRAY JQUERY CRAPS? EXTEND WITH YOUR OWN FUNCTIONALITY!? ## CONFIGCHAIN LIVES TO SERVE ONLY YOU! ```javascript var cc = require('config-chain') // all the stuff you did before var config = cc({ some: 'object' }, cc.find('config.json'), cc.env('myApp_') ) // CONFIGS AS A SERVICE, aka "CaaS", aka EVERY DEVOPS DREAM OMG! .addUrl('http://configurator:1234/my-configs') // ASYNC FTW! .addFile('/path/to/file.json') // OBJECTS ARE OK TOO, they're SYNC but they still ORDER RIGHT // BECAUSE PROMISES ARE USED BUT NO, NOT *THOSE* PROMISES, JUST // ACTUAL PROMISES LIKE YOU MAKE TO YOUR MOM, KEPT OUT OF LOVE .add({ another: 'object' }) // DIE A THOUSAND DEATHS IF THIS EVER HAPPENS!! .on('error', function (er) { // IF ONLY THERE WAS SOMETHING HARDER THAN THROW // MY SORROW COULD BE ADEQUATELY EXPRESSED. /o\ throw er }) // THROW A PARTY IN YOUR FACE WHEN ITS ALL LOADED!! .on('load', function (config) { console.awesome('HOLY SHIT!') }) ``` # BORING API DOCS ## cc(...args) MAKE A CHAIN AND ADD ALL THE ARGS. If the arg is a STRING, then it shall be a JSON FILENAME. SYNC I/O! RETURN THE CHAIN! ## cc.json(...args) Join the args INTO A JSON FILENAME!
SYNC I/O! ## cc.find(relativePath) SEEK the RELATIVE PATH by climbing the TREE OF DIRECTORIES. RETURN THE FOUND PATH! SYNC I/O! ## cc.parse(content, file, type) Parse the content string, and guess the type from either the specified type or the filename. RETURN THE RESULTING OBJECT! NO I/O! ## cc.env(prefix, env=process.env) Get all the keys on the provided env object (or process.env) which are prefixed by the specified prefix, and put the values on a new object. RETURN THE RESULTING OBJECT! NO I/O! ## cc.ConfigChain() The ConfigChain class for CRAY CRAY JQUERY STYLE METHOD CHAINING! One of these is returned by the main exported function, as well. It inherits (prototypically) from [ProtoList](https://github.com/isaacs/proto-list/), and also inherits (parasitically) from [EventEmitter](http://nodejs.org/api/events.html#events_class_events_eventemitter) It has all the methods from both, and except where noted, they are unchanged. ### LET IT BE KNOWN THAT chain IS AN INSTANCE OF ConfigChain. ## chain.sources A list of all the places where it got stuff. The keys are the names passed to addFile or addUrl etc, and the value is an object with some info about the data source. ## chain.addFile(filename, type, [name=filename]) Filename is the name of the file. Name is an arbitrary string to be used later if you desire. Type is either 'ini' or 'json', and will try to guess intelligently if omitted. Loaded files can be saved later. ## chain.addUrl(url, type, [name=url]) Same as the filename thing, but with a url. Can't be saved later. ## chain.addEnv(prefix, env, [name='env']) Add all the keys from the env object that start with the prefix. ## chain.addString(data, file, type, [name]) Parse the string and add it to the set. (Mainly used internally.) ## chain.add(object, [name]) Add the object to the set. ## chain.root {Object} The root from which all the other config objects in the set descend prototypically. Put your defaults here. ## chain.set(key, value, name) Set the key to the value on the named config object. If name is unset, then set it on the first config object in the set. (That is, the one with the highest priority, which was added first.) ## chain.get(key, [name]) Get the key from the named config object explicitly, or from the resolved configs if not specified. ## chain.save(name, type) Write the named config object back to its origin. Currently only supported for env and file config types. For files, encode the data according to the type. ## chain.on('save', function () {}) When one or more files are saved, emits `save` event when they're all saved. ## chain.on('load', function (chain) {}) When the config chain has loaded all the specified files and urls and such, the 'load' event fires. npm_3.5.2.orig/node_modules/config-chain/test/0000755000000000000000000000000012631326456017453 5ustar 00000000000000npm_3.5.2.orig/node_modules/config-chain/node_modules/proto-list/0000755000000000000000000000000012631326456023265 5ustar 00000000000000npm_3.5.2.orig/node_modules/config-chain/node_modules/proto-list/LICENSE0000644000000000000000000000137512631326456024300 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. 
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/config-chain/node_modules/proto-list/README.md0000644000000000000000000000012012631326456024535 0ustar 00000000000000A list of objects, bound by their prototype chain. Used in npm's config stuff. npm_3.5.2.orig/node_modules/config-chain/node_modules/proto-list/package.json0000644000000000000000000000164512631326456025561 0ustar 00000000000000{ "name": "proto-list", "version": "1.2.4", "description": "A utility for managing a prototype chain", "main": "./proto-list.js", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/isaacs/proto-list.git" }, "license": "ISC", "devDependencies": { "tap": "0" }, "readme": "A list of objects, bound by their prototype chain.\n\nUsed in npm's config stuff.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/proto-list/issues" }, "homepage": "https://github.com/isaacs/proto-list#readme", "_id": "proto-list@1.2.4", "_shasum": "212d5bfe1318306a420f6402b8e26ff39647a849", "_resolved": "https://registry.npmjs.org/proto-list/-/proto-list-1.2.4.tgz", "_from": "proto-list@>=1.2.1 <1.3.0" } npm_3.5.2.orig/node_modules/config-chain/node_modules/proto-list/proto-list.js0000644000000000000000000000434312631326456025743 0ustar 00000000000000 module.exports = ProtoList function setProto(obj, proto) { if (typeof Object.setPrototypeOf === "function") return Object.setPrototypeOf(obj, proto) else obj.__proto__ = proto } function ProtoList () { this.list = [] var root = null Object.defineProperty(this, 'root', { get: function () { return root }, set: function (r) { root = r if (this.list.length) { setProto(this.list[this.list.length - 1], r) } }, enumerable: true, configurable: true }) } ProtoList.prototype = { get length () { return this.list.length } , get keys () { var k = [] for (var i in this.list[0]) k.push(i) return k } , get snapshot () { var o = {} this.keys.forEach(function (k) { o[k] = this.get(k) }, this) return o } , get store () { return this.list[0] } , push : function (obj) { if (typeof obj !== "object") obj = {valueOf:obj} if (this.list.length >= 1) { setProto(this.list[this.list.length - 1], obj) } setProto(obj, this.root) return this.list.push(obj) } , pop : function () { if (this.list.length >= 2) { setProto(this.list[this.list.length - 2], this.root) } return this.list.pop() } , unshift : function (obj) { setProto(obj, this.list[0] || this.root) return this.list.unshift(obj) } , shift : function () { if (this.list.length === 1) { setProto(this.list[0], this.root) } return this.list.shift() } , get : function (key) { return this.list[0][key] } , set : function (key, val, save) { if (!this.length) this.push({}) if (save && this.list[0].hasOwnProperty(key)) this.push({}) return this.list[0][key] = val } , forEach : function (fn, thisp) { for (var key in this.list[0]) fn.call(thisp, key, this.list[0][key]) } , slice : function () { return this.list.slice.apply(this.list, arguments) } , 
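// splice (below) must re-link each remaining entry's prototype to its new neighbor, since the list doubles as a prototype chain: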
splice : function () { // handle injections var ret = this.list.splice.apply(this.list, arguments) for (var i = 0, l = this.list.length; i < l; i++) { setProto(this.list[i], this.list[i + 1] || this.root) } return ret } } npm_3.5.2.orig/node_modules/config-chain/node_modules/proto-list/test/0000755000000000000000000000000012631326456024244 5ustar 00000000000000npm_3.5.2.orig/node_modules/config-chain/node_modules/proto-list/test/basic.js0000644000000000000000000000245712631326456025673 0ustar 00000000000000var tap = require("tap") , test = tap.test , ProtoList = require("../proto-list.js") tap.plan(1) tap.test("protoList tests", function (t) { var p = new ProtoList p.push({foo:"bar"}) p.push({}) p.set("foo", "baz") t.equal(p.get("foo"), "baz") var p = new ProtoList p.push({foo:"bar"}) p.set("foo", "baz") t.equal(p.get("foo"), "baz") t.equal(p.length, 1) p.pop() t.equal(p.length, 0) p.set("foo", "asdf") t.equal(p.length, 1) t.equal(p.get("foo"), "asdf") p.push({bar:"baz"}) t.equal(p.length, 2) t.equal(p.get("foo"), "asdf") p.shift() t.equal(p.length, 1) t.equal(p.get("foo"), undefined) p.unshift({foo:"blo", bar:"rab"}) p.unshift({foo:"boo"}) t.equal(p.length, 3) t.equal(p.get("foo"), "boo") t.equal(p.get("bar"), "rab") var ret = p.splice(1, 1, {bar:"bar"}) t.same(ret, [{foo:"blo", bar:"rab"}]) t.equal(p.get("bar"), "bar") // should not inherit default object properties t.equal(p.get('hasOwnProperty'), undefined) // unless we give it those. p.root = {} t.equal(p.get('hasOwnProperty'), {}.hasOwnProperty) p.root = {default:'monkey'} t.equal(p.get('default'), 'monkey') p.push({red:'blue'}) p.push({red:'blue'}) p.push({red:'blue'}) while (p.length) { t.equal(p.get('default'), 'monkey') p.shift() } t.end() }) npm_3.5.2.orig/node_modules/config-chain/test/broken.js0000644000000000000000000000022212631326456021265 0ustar 00000000000000 var cc = require('..') var assert = require('assert') //throw on invalid json assert.throws(function () { cc(__dirname + '/broken.json') }) npm_3.5.2.orig/node_modules/config-chain/test/broken.json0000644000000000000000000000125712631326456021633 0ustar 00000000000000{ "name": "config-chain", "version": "0.3.0", "description": "HANDLE CONFIGURATION ONCE AND FOR ALL", "homepage": "http://github.com/dominictarr/config-chain", "repository": { "type": "git", "url": "https://github.com/dominictarr/config-chain.git" } //missing , and then this comment. this json is intensionally invalid "dependencies": { "proto-list": "1", "ini": "~1.0.2" }, "bundleDependencies": ["ini"], "REM": "REMEMBER TO REMOVE BUNDLING WHEN/IF ISAACS MERGES ini#7", "author": "Dominic Tarr (http://dominictarr.com)", "scripts": { "test": "node test/find-file.js && node test/ini.js && node test/env.js" } } npm_3.5.2.orig/node_modules/config-chain/test/chain-class.js0000644000000000000000000000620212631326456022176 0ustar 00000000000000var test = require('tap').test var CC = require('../index.js').ConfigChain var env = { foo_blaz : 'blzaa', foo_env : 'myenv' } var jsonObj = { blaz: 'json', json: true } var iniObj = { 'x.y.z': 'xyz', blaz: 'ini' } var fs = require('fs') var ini = require('ini') fs.writeFileSync('/tmp/config-chain-class.json', JSON.stringify(jsonObj)) fs.writeFileSync('/tmp/config-chain-class.ini', ini.stringify(iniObj)) var http = require('http') var reqs = 0 http.createServer(function (q, s) { if (++reqs === 2) this.close() if (q.url === '/json') { // make sure that the requests come back from the server // out of order. 
they should still be ordered properly // in the resulting config object set. setTimeout(function () { s.setHeader('content-type', 'application/json') s.end(JSON.stringify({ blaz: 'http', http: true, json: true })) }, 200) } else { s.setHeader('content-type', 'application/ini') s.end(ini.stringify({ blaz: 'http', http: true, ini: true, json: false })) } }).listen(1337) test('basic class test', function (t) { var cc = new CC() var expectlist = [ { blaz: 'json', json: true }, { 'x.y.z': 'xyz', blaz: 'ini' }, { blaz: 'blzaa', env: 'myenv' }, { blaz: 'http', http: true, json: true }, { blaz: 'http', http: true, ini: true, json: false } ] cc.addFile('/tmp/config-chain-class.json') .addFile('/tmp/config-chain-class.ini') .addEnv('foo_', env) .addUrl('http://localhost:1337/json') .addUrl('http://localhost:1337/ini') .on('load', function () { t.same(cc.list, expectlist) t.same(cc.snapshot, { blaz: 'json', json: true, 'x.y.z': 'xyz', env: 'myenv', http: true, ini: true }) cc.del('blaz', '/tmp/config-chain-class.json') t.same(cc.snapshot, { blaz: 'ini', json: true, 'x.y.z': 'xyz', env: 'myenv', http: true, ini: true }) cc.del('blaz') t.same(cc.snapshot, { json: true, 'x.y.z': 'xyz', env: 'myenv', http: true, ini: true }) cc.shift() t.same(cc.snapshot, { 'x.y.z': 'xyz', env: 'myenv', http: true, json: true, ini: true }) cc.shift() t.same(cc.snapshot, { env: 'myenv', http: true, json: true, ini: true }) cc.shift() t.same(cc.snapshot, { http: true, json: true, ini: true }) cc.shift() t.same(cc.snapshot, { http: true, ini: true, json: false }) cc.shift() t.same(cc.snapshot, {}) t.end() }) }) npm_3.5.2.orig/node_modules/config-chain/test/env.js0000644000000000000000000000027012631326456020600 0ustar 00000000000000var cc = require('..') var assert = require('assert') assert.deepEqual({ hello: true }, cc.env('test_', { 'test_hello': true, 'ignore_this': 4, 'ignore_test_this_too': [] })) npm_3.5.2.orig/node_modules/config-chain/test/find-file.js0000644000000000000000000000044612631326456021652 0ustar 00000000000000 var fs = require('fs') , assert = require('assert') , objx = { rand: Math.random() } fs.writeFileSync('/tmp/random-test-config.json', JSON.stringify(objx)) var cc = require('../') var path = cc.find('tmp/random-test-config.json') assert.equal(path, '/tmp/random-test-config.json')npm_3.5.2.orig/node_modules/config-chain/test/get.js0000644000000000000000000000060712631326456020573 0ustar 00000000000000var cc = require("../"); var chain = cc() , name = "forFun"; chain .add({ __sample:"for fun only" }, name) .on("load", function() { //It throw exception here console.log(chain.get("__sample", name)); //But if I drop the name param, it run normally and return as expected: "for fun only" //console.log(chain.get("__sample")); }); npm_3.5.2.orig/node_modules/config-chain/test/ignore-unfound-file.js0000644000000000000000000000011712631326456023664 0ustar 00000000000000 var cc = require('..') //should not throw cc(__dirname, 'non_existing_file') npm_3.5.2.orig/node_modules/config-chain/test/ini.js0000644000000000000000000000045612631326456020575 0ustar 00000000000000 var cc =require('..') var INI = require('ini') var assert = require('assert') function test(obj) { var _json, _ini var json = cc.parse (_json = JSON.stringify(obj)) var ini = cc.parse (_ini = INI.stringify(obj)) console.log(_ini, _json) assert.deepEqual(json, ini) } test({hello: true}) npm_3.5.2.orig/node_modules/config-chain/test/save.js0000644000000000000000000000346112631326456020753 0ustar 00000000000000var CC = 
require('../index.js').ConfigChain var test = require('tap').test var f1 = '/tmp/f1.ini' var f2 = '/tmp/f2.json' var ini = require('ini') var f1data = {foo: {bar: 'baz'}, bloo: 'jaus'} var f2data = {oof: {rab: 'zab'}, oolb: 'suaj'} var fs = require('fs') fs.writeFileSync(f1, ini.stringify(f1data), 'utf8') fs.writeFileSync(f2, JSON.stringify(f2data), 'utf8') test('test saving and loading ini files', function (t) { new CC() .add({grelb:'blerg'}, 'opt') .addFile(f1, 'ini', 'inifile') .addFile(f2, 'json', 'jsonfile') .on('load', function (cc) { t.same(cc.snapshot, { grelb: 'blerg', bloo: 'jaus', foo: { bar: 'baz' }, oof: { rab: 'zab' }, oolb: 'suaj' }) t.same(cc.list, [ { grelb: 'blerg' }, { bloo: 'jaus', foo: { bar: 'baz' } }, { oof: { rab: 'zab' }, oolb: 'suaj' } ]) cc.set('grelb', 'brelg', 'opt') .set('foo', 'zoo', 'inifile') .set('oof', 'ooz', 'jsonfile') .save('inifile') .save('jsonfile') .on('save', function () { t.equal(fs.readFileSync(f1, 'utf8'), "bloo = jaus\nfoo = zoo\n") t.equal(fs.readFileSync(f2, 'utf8'), "{\"oof\":\"ooz\",\"oolb\":\"suaj\"}") t.same(cc.snapshot, { grelb: 'brelg', bloo: 'jaus', foo: 'zoo', oof: 'ooz', oolb: 'suaj' }) t.same(cc.list, [ { grelb: 'brelg' }, { bloo: 'jaus', foo: 'zoo' }, { oof: 'ooz', oolb: 'suaj' } ]) t.pass('ok') t.end() }) }) }) npm_3.5.2.orig/node_modules/debuglog/LICENSE0000644000000000000000000000211112631326456016737 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/debuglog/README.md0000644000000000000000000000240612631326456017220 0ustar 00000000000000# debuglog - backport of util.debuglog() from node v0.11 To facilitate using the `util.debuglog()` function that will be available when node v0.12 is released now, this is a copy extracted from the source. ## require('debuglog') Return `util.debuglog`, if it exists, otherwise it will return an internal copy of the implementation from node v0.11. ## debuglog(section) * `section` {String} The section of the program to be debugged * Returns: {Function} The logging function This is used to create a function which conditionally writes to stderr based on the existence of a `NODE_DEBUG` environment variable. If the `section` name appears in that environment variable, then the returned function will be similar to `console.error()`. If not, then the returned function is a no-op. 
For example: ```javascript var debuglog = util.debuglog('foo'); var bar = 123; debuglog('hello from foo [%d]', bar); ``` If this program is run with `NODE_DEBUG=foo` in the environment, then it will output something like: FOO 3245: hello from foo [123] where `3245` is the process id. If it is not run with that environment variable set, then it will not print anything. You may separate multiple `NODE_DEBUG` environment variables with a comma. For example, `NODE_DEBUG=fs,net,tls`. npm_3.5.2.orig/node_modules/debuglog/debuglog.js0000644000000000000000000000105212631326456020063 0ustar 00000000000000var util = require('util'); module.exports = (util && util.debuglog) || debuglog; var debugs = {}; var debugEnviron = process.env.NODE_DEBUG || ''; function debuglog(set) { set = set.toUpperCase(); if (!debugs[set]) { if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { var pid = process.pid; debugs[set] = function() { var msg = util.format.apply(exports, arguments); console.error('%s %d: %s', set, pid, msg); }; } else { debugs[set] = function() {}; } } return debugs[set]; }; npm_3.5.2.orig/node_modules/debuglog/package.json0000644000000000000000000000220312631326456020222 0ustar 00000000000000{ "name": "debuglog", "version": "1.0.1", "description": "backport of util.debuglog from node v0.11", "license": "MIT", "main": "debuglog.js", "repository": { "type": "git", "url": "git+https://github.com/sam-github/node-debuglog.git" }, "author": { "name": "Sam Roberts", "email": "sam@strongloop.com" }, "engines": { "node": "*" }, "browser": { "util": false }, "bugs": { "url": "https://github.com/sam-github/node-debuglog/issues" }, "homepage": "https://github.com/sam-github/node-debuglog", "_id": "debuglog@1.0.1", "dist": { "shasum": "aa24ffb9ac3df9a2351837cfb2d279360cd78492", "tarball": "http://registry.npmjs.org/debuglog/-/debuglog-1.0.1.tgz" }, "_from": "debuglog@1.0.1", "_npmVersion": "1.4.3", "_npmUser": { "name": "octet", "email": "sam@strongloop.com" }, "maintainers": [ { "name": "octet", "email": "sam@strongloop.com" } ], "directories": {}, "_shasum": "aa24ffb9ac3df9a2351837cfb2d279360cd78492", "_resolved": "https://registry.npmjs.org/debuglog/-/debuglog-1.0.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/dezalgo/.travis.yml0000644000000000000000000000015712631326456017710 0ustar 00000000000000language: node_js before_script: npm install -g npm@latest node_js: - '0.8' - '0.10' - '0.12' - 'iojs' npm_3.5.2.orig/node_modules/dezalgo/LICENSE0000644000000000000000000000137512631326456016607 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
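A minimal usage sketch for the debuglog module above (an illustrative addition, not a shipped file; the `worker` section name is arbitrary):

```javascript
// Run as: NODE_DEBUG=worker node example.js
var debuglog = require('debuglog');

// Returns a console.error-style logger when NODE_DEBUG contains "worker",
// and a no-op function otherwise.
var log = debuglog('worker');

log('processing %d jobs', 3); // prints "WORKER <pid>: processing 3 jobs" when enabled
```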
npm_3.5.2.orig/node_modules/dezalgo/README.md0000644000000000000000000000121312631326456017050 0ustar 00000000000000# dezalgo Contain async insanity so that the dark pony lord doesn't eat souls See [this blog post](http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony). ## USAGE Pass a callback to `dezalgo` and it will ensure that it is *always* called in a future tick, and never in this tick. ```javascript var dz = require('dezalgo') var cache = {} function maybeSync(arg, cb) { cb = dz(cb) // this will actually defer to nextTick if (cache[arg]) cb(null, cache[arg]) fs.readFile(arg, function (er, data) { // since this is *already* defered, it will call immediately if (er) cb(er) cb(null, cache[arg] = data) }) } ``` npm_3.5.2.orig/node_modules/dezalgo/dezalgo.js0000644000000000000000000000056012631326456017560 0ustar 00000000000000var wrappy = require('wrappy') module.exports = wrappy(dezalgo) var asap = require('asap') function dezalgo (cb) { var sync = true asap(function () { sync = false }) return function zalgoSafe() { var args = arguments var me = this if (sync) asap(function() { cb.apply(me, args) }) else cb.apply(me, args) } } npm_3.5.2.orig/node_modules/dezalgo/node_modules/0000755000000000000000000000000012631326456020251 5ustar 00000000000000npm_3.5.2.orig/node_modules/dezalgo/package.json0000644000000000000000000000654312631326456020072 0ustar 00000000000000{ "name": "dezalgo", "version": "1.0.3", "description": "Contain async insanity so that the dark pony lord doesn't eat souls", "main": "dezalgo.js", "directories": { "test": "test" }, "dependencies": { "asap": "^2.0.0", "wrappy": "1" }, "devDependencies": { "tap": "^1.2.0" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/npm/dezalgo.git" }, "keywords": [ "async", "zalgo", "the dark pony", "he comes", "asynchrony of all holy and good", "T̯̪ͅo̯͖̹ ̻̮̖̲͢i̥̖n̢͈͇̝͍v͏͉ok̭̬̝ͅe̞͍̩̫͍̩͝ ̩̮̖̟͇͉́t͔͔͎̗h͏̗̟e̘͉̰̦̠̞͓ ͕h͉̟͎̪̠̱͠ḭ̮̩v̺͉͇̩e̵͖-̺̪m͍i̜n̪̲̲̲̮d̷ ̢r̠̼̯̹̦̦͘ͅe͓̳͓̙p̺̗̫͙͘ͅr͔̰͜e̴͓̞s͉̩̩͟ͅe͏̣n͚͇̗̭̺͍tì͙̣n͏̖̥̗͎̰̪g̞͓̭̱̯̫̕ ̣̱͜ͅc̦̰̰̠̮͎͙̀hao̺̜̻͍͙ͅs͉͓̘.͎̼̺̼͕̹͘", "̠̞̱̰I͖͇̝̻n̦̰͍̰̟v̤̺̫̳̭̼̗͘ò̹̟̩̩͚k̢̥̠͍͉̦̬i̖͓͔̮̱̻͘n̶̳͙̫͎g̖̯̣̲̪͉ ̞͎̗͕͚ͅt̲͕̘̺̯̗̦h̘̦̲̜̻e̳͎͉̬͙ ̴̞̪̲̥f̜̯͓͓̭̭͢e̱̘͔̮e̜̤l̺̱͖̯͓͙͈͢i̵̦̬͉͔̫͚͕n͉g̨͖̙̙̹̹̟̤ ͉̪o̞̠͍̪̰͙ͅf̬̲̺ ͔͕̲͕͕̲̕c̙͉h̝͔̩̙̕ͅa̲͖̻̗̹o̥̼̫s̝̖̜̝͚̫̟.̺͚ ̸̱̲W̶̥̣͖̦i͏̤̬̱̳̣ͅt͉h̗̪̪ ̷̱͚̹̪ǫ͕̗̣̳̦͎u̼̦͔̥̮̕ţ͖͎̻͔͉ ̴͎̩òr̹̰̖͉͈͝d̷̲̦̖͓e̲͓̠r", "̧͚̜͓̰̭̭Ṯ̫̹̜̮̟̮͝h͚̘̩̘̖̰́e ̥̘͓͉͔͙̼N̟̜̣̘͔̪e̞̞̤͢z̰̖̘͇p̠͟e̺̱̣͍͙̝ṛ̘̬͔̙͇̠d͝ḭ̯̱̥̗̩a̛ͅn͏̦ ̷̥hi̥v̖̳̹͉̮̱͝e̹̪̘̖̰̟-̴͙͓͚̜̻mi̗̺̻͙̺ͅn̪̯͈d ͏̘͓̫̳ͅơ̹͔̳̖̣͓f͈̹̘ ͕ͅc̗̤̠̜̮̥̥h̡͍̩̭̫͚̱a̤͉̤͔͜os͕̤̼͍̲̀ͅ.̡̱ ̦Za̯̱̗̭͍̣͚l̗͉̰̤g͏̣̭̬̗̲͖ͅo̶̭̩̳̟͈.̪̦̰̳", "H̴̱̦̗̬̣͓̺e̮ ͉̠̰̞͎̖͟ẁh̛̺̯ͅo̖̫͡ ̢Ẁa̡̗i̸t͖̣͉̀ş͔̯̩ ̤̦̮͇̞̦̲B͎̭͇̦̼e̢hin͏͙̟̪d̴̰͓̻̣̮͕ͅ T͖̮̕h͖e̘̺̰̙͘ ̥Ẁ̦͔̻͚a̞͖̪͉l̪̠̻̰̣̠l̲͎͞", "Z̘͍̼͎̣͔͝Ą̲̜̱̱̹̤͇L̶̝̰̭͔G͍̖͍O̫͜ͅ!̼̤ͅ", "H̝̪̜͓̀̌̂̒E̢̙̠̣ ̴̳͇̥̟̠͍̐C̹̓̑̐̆͝Ó̶̭͓̚M̬̼Ĕ̖̤͔͔̟̹̽̿̊ͥ̍ͫS̻̰̦̻̖̘̱̒ͪ͌̅͟" ], "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/dezalgo/issues" }, "homepage": "https://github.com/npm/dezalgo", "readme": "# dezalgo\n\nContain async insanity so that the dark pony lord doesn't eat souls\n\nSee [this blog\npost](http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony).\n\n## USAGE\n\nPass a callback to `dezalgo` and it will ensure that it is *always*\ncalled in a future tick, and never in this tick.\n\n```javascript\nvar dz = require('dezalgo')\n\nvar cache = {}\nfunction maybeSync(arg, cb) {\n cb = dz(cb)\n\n // this will actually defer to nextTick\n if (cache[arg]) cb(null, cache[arg])\n\n fs.readFile(arg, function (er, data) {\n // since this is *already* defered, it will call immediately\n if (er) cb(er)\n cb(null, cache[arg] = data)\n })\n}\n```\n", "readmeFilename": "README.md", "gitHead": "d4d3f3f6f47b1a326194d5281349c83dde258458", "_id": "dezalgo@1.0.3", "_shasum": "7f742de066fc748bc8db820569dddce49bf0d456", "_from": "dezalgo@>=1.0.3 <1.1.0" } npm_3.5.2.orig/node_modules/dezalgo/test/0000755000000000000000000000000012631326456016553 5ustar 00000000000000npm_3.5.2.orig/node_modules/dezalgo/node_modules/asap/0000755000000000000000000000000012631326456021175 5ustar 00000000000000npm_3.5.2.orig/node_modules/dezalgo/node_modules/asap/CHANGES.md0000644000000000000000000000521112631326456022566 0ustar 00000000000000 ## 2.0.3 Version 2.0.3 fixes a bug when adjusting the capacity of the task queue. ## 2.0.1-2.02 Version 2.0.1 fixes a bug in the way redirects were expressed that affected the function of Browserify, but which Mr would tolerate. ## 2.0.0 Version 2 of ASAP is a full rewrite with a few salient changes. First, the ASAP source is CommonJS only and designed with [Browserify][] and [Browserify-compatible][Mr] module loaders in mind. [Browserify]: https://github.com/substack/node-browserify [Mr]: https://github.com/montagejs/mr The new version has been refactored in two dimensions. Support for Node.js and browsers have been separated, using Browserify redirects and ASAP has been divided into two modules. The "raw" layer depends on the tasks to catch thrown exceptions and unravel Node.js domains. The full implementation of ASAP is loadable as `require("asap")` in both Node.js and browsers. The raw layer that lacks exception handling overhead is loadable as `require("asap/raw")`. The interface is the same for both layers. Tasks are no longer required to be functions, but can rather be any object that implements `task.call()`. With this feature you can recycle task objects to avoid garbage collector churn and avoid closures in general. The implementation has been rigorously documented so that our successors can understand the scope of the problem that this module solves and all of its nuances, ensuring that the next generation of implementations know what details are essential. - [asap.js](https://github.com/kriskowal/asap/blob/master/asap.js) - [raw.js](https://github.com/kriskowal/asap/blob/master/raw.js) - [browser-asap.js](https://github.com/kriskowal/asap/blob/master/browser-asap.js) - [browser-raw.js](https://github.com/kriskowal/asap/blob/master/browser-raw.js) The new version has also been rigorously tested across a broad spectrum of browsers, in both the window and worker context. The following charts capture the browser test results for the most recent release. The first chart shows test results for ASAP running in the main window context. 
The second chart shows test results for ASAP running in a web worker context. Test results are inconclusive (grey) on browsers that do not support web workers. These data are captured automatically by [Continuous Integration][]. ![Browser Compatibility](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-results-matrix.svg) ![Compatibility in Web Workers](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-worker-results-matrix.svg) [Continuous Integration]: https://github.com/kriskowal/asap/blob/master/CONTRIBUTING.md npm_3.5.2.orig/node_modules/dezalgo/node_modules/asap/LICENSE.md0000644000000000000000000000207312631326456022603 0ustar 00000000000000 Copyright 2009–2014 Contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/dezalgo/node_modules/asap/README.md0000644000000000000000000002355112631326456022462 0ustar 00000000000000# ASAP [![Build Status](https://travis-ci.org/kriskowal/asap.png?branch=master)](https://travis-ci.org/kriskowal/asap) Promise and asynchronous observer libraries, as well as hand-rolled callback programs and libraries, often need a mechanism to postpone the execution of a callback until the next available event. (See [Designing API’s for Asynchrony][Zalgo].) The `asap` function executes a task **as soon as possible** but not before it returns, waiting only for the completion of the current event and previously scheduled tasks. ```javascript asap(function () { // ... }); ``` [Zalgo]: http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony This CommonJS package provides an `asap` module that exports a function that executes a task function *as soon as possible*. ASAP strives to schedule events to occur before yielding for IO, reflow, or redrawing. Each event receives an independent stack, with only platform code in parent frames and the events run in the order they are scheduled. ASAP provides a fast event queue that will execute tasks until it is empty before yielding to the JavaScript engine's underlying event-loop. When a task gets added to a previously empty event queue, ASAP schedules a flush event, preferring for that event to occur before the JavaScript engine has an opportunity to perform IO tasks or rendering, thus making the first task and subsequent tasks semantically indistinguishable. ASAP uses a variety of techniques to preserve this invariant on different versions of browsers and Node.js. 
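A short sketch (not from the upstream README) of the ordering guarantee described above:

```javascript
var asap = require("asap");

asap(function () { console.log("first task"); });
asap(function () { console.log("second task"); });
console.log("current event");

// Logs "current event", then "first task", then "second task":
// tasks run after the current event returns, in the order scheduled,
// before the engine yields for IO, reflow, or redraw.
```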
By design, ASAP prevents input events from being handled until the task queue is empty. If the process is busy enough, this may cause incoming connection requests to be dropped, and may cause existing connections to inform the sender to reduce the transmission rate or stall. ASAP allows this on the theory that, if there is enough work to do, there is no sense in looking for trouble. As a consequence, ASAP can interfere with smooth animation. If your task should be tied to the rendering loop, consider using `requestAnimationFrame` instead. A long sequence of tasks can also trigger the browser’s long-running-script dialog. If this is a problem, you may be able to use ASAP’s cousin `setImmediate` to break long processes into shorter intervals and periodically allow the browser to breathe. `setImmediate` will yield for IO, reflow, and repaint events. It also returns a handler and can be canceled. For a `setImmediate` shim, consider [YuzuJS setImmediate][setImmediate]. [setImmediate]: https://github.com/YuzuJS/setImmediate Take care. ASAP can sustain infinite recursive calls without warning. It will not halt from a stack overflow, and it will not consume unbounded memory. This is behaviorally equivalent to an infinite loop. Just as with infinite loops, you can monitor a Node.js process for this behavior with a heart-beat signal. As with infinite loops, a very small amount of caution goes a long way to avoiding problems. ```javascript function loop() { asap(loop); } loop(); ``` In browsers, if a task throws an exception, it will not interrupt the flushing of high-priority tasks. The exception will be postponed to a later, low-priority event to avoid slow-downs. In Node.js, if a task throws an exception, ASAP will resume flushing only if—and only after—the error is handled by `domain.on("error")` or `process.on("uncaughtException")`. ## Raw ASAP Checking for exceptions comes at a cost. The package also provides an `asap/raw` module that exports the underlying implementation, which is faster but stalls if a task throws an exception. This internal version of the ASAP function does not check for errors. If a task does throw an error, it will stall the event queue unless you manually call `rawAsap.requestFlush()` before throwing the error, or any time after. In Node.js, `asap/raw` also runs all tasks outside any domain. If you need a task to be bound to your domain, you will have to do it manually. ```js if (process.domain) { task = process.domain.bind(task); } rawAsap(task); ``` ## Tasks A task may be any object that implements `call()`. A function will suffice, but closures tend not to be reusable and can cause garbage collector churn. Both `asap` and `rawAsap` accept task objects to give you the option of recycling task objects or using higher callable object abstractions. See the `asap` source for an illustration. ## Compatibility ASAP is tested on Node.js v0.10 and in a broad spectrum of web browsers. The following charts capture the browser test results for the most recent release. The first chart shows test results for ASAP running in the main window context. The second chart shows test results for ASAP running in a web worker context. Test results are inconclusive (grey) on browsers that do not support web workers. These data are captured automatically by [Continuous Integration][].
[Continuous Integration]: https://github.com/kriskowal/asap/blob/master/CONTRIBUTING.md ![Browser Compatibility](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-results-matrix.svg) ![Compatibility in Web Workers](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-worker-results-matrix.svg) ## Caveats When a task is added to an empty event queue, it is not always possible to guarantee that the task queue will begin flushing immediately after the current event. However, once the task queue begins flushing, it will not yield until the queue is empty, even if the queue grows while executing tasks. The following browsers allow the use of [DOM mutation observers][] to access the HTML [microtask queue][], and thus begin flushing ASAP's task queue immediately at the end of the current event loop turn, before any rendering or IO: [microtask queue]: http://www.whatwg.org/specs/web-apps/current-work/multipage/webappapis.html#microtask-queue [DOM mutation observers]: http://dom.spec.whatwg.org/#mutation-observers - Android 4–4.3 - Chrome 26–34 - Firefox 14–29 - Internet Explorer 11 - iPad Safari 6–7.1 - iPhone Safari 7–7.1 - Safari 6–7 In the absence of mutation observers, there are a few browsers, and situations like web workers in some of the above browsers, where [message channels][] would be a useful way to avoid falling back to timers. Message channels give direct access to the HTML [task queue][], so the ASAP task queue would flush after any already queued rendering and IO tasks, but without having the minimum delay imposed by timers. However, among these browsers, Internet Explorer 10 and Safari do not reliably dispatch messages, so they are not worth the trouble to implement. [message channels]: http://www.whatwg.org/specs/web-apps/current-work/multipage/web-messaging.html#message-channels [task queue]: http://www.whatwg.org/specs/web-apps/current-work/multipage/webappapis.html#concept-task - Internet Explorer 10 - Safari 5.0–1 - Opera 11–12 In the absence of mutation observers, these browsers and the following browsers all fall back to using `setTimeout` and `setInterval` to ensure that a `flush` occurs. The implementation uses both and cancels whatever handler loses the race, since `setTimeout` tends to occasionally skip tasks in unisolated circumstances. Timers generally delay the flushing of ASAP's task queue for four milliseconds. - Firefox 3–13 - Internet Explorer 6–10 - iPad Safari 4.3 - Lynx 2.8.7 ## Heritage ASAP has been factored out of the [Q][] asynchronous promise library. It originally had a naïve implementation in terms of `setTimeout`, but [Malte Ubl][NonBlocking] provided an insight that `postMessage` might be useful for creating a high-priority, no-delay event dispatch hack. Since then, Internet Explorer proposed and implemented `setImmediate`. Robert Katić began contributing to Q by measuring the performance of the internal implementation of `asap`, paying particular attention to error recovery. Domenic, Robert, and Kris Kowal collectively settled on the current strategy of unrolling the high-priority event queue internally regardless of what strategy we used to dispatch the potentially lower-priority flush event. Domenic went on to make ASAP cooperate with Node.js domains. [Q]: https://github.com/kriskowal/q [NonBlocking]: http://www.nonblocking.io/2011/06/windownexttick.html For further reading, Nicholas Zakas provided a thorough article on [The Case for setImmediate][NCZ].
[NCZ]: http://www.nczonline.net/blog/2013/07/09/the-case-for-setimmediate/ Ember’s RSVP promise implementation later [adopted][RSVP ASAP] the name ASAP but further developed the implementation. In particular, the `MessagePort` implementation was abandoned due to interaction [problems with Mobile Internet Explorer][IE Problems] in favor of an implementation backed by the newer and more reliable DOM `MutationObserver` interface. These changes were back-ported into this library. [IE Problems]: https://github.com/cujojs/when/issues/197 [RSVP ASAP]: https://github.com/tildeio/rsvp.js/blob/cddf7232546a9cf858524b75cde6f9edf72620a7/lib/rsvp/asap.js In addition, ASAP was factored into `asap` and `asap/raw`, such that `asap` remained exception-safe, but `asap/raw` provided a tight kernel that could be used for tasks that guaranteed that they would not throw exceptions. This core is useful for promise implementations that capture thrown errors in rejected promises and do not need a second safety net. At the same time, the exception handling in `asap` was factored into separate implementations for Node.js and browsers, using the [Browserify][Browser Config] `browser` property in `package.json` to instruct browser module loaders and bundlers, including [Browserify][], [Mr][], and [Mop][], to use the browser-only implementation. [Browser Config]: https://gist.github.com/defunctzombie/4339901 [Browserify]: https://github.com/substack/node-browserify [Mr]: https://github.com/montagejs/mr [Mop]: https://github.com/montagejs/mop ## License Copyright 2009-2014 by Contributors MIT License (enclosed) npm_3.5.2.orig/node_modules/dezalgo/node_modules/asap/asap.js0000644000000000000000000000364612631326456022470 0ustar 00000000000000"use strict"; var rawAsap = require("./raw"); var freeTasks = []; /** * Calls a task as soon as possible after returning, in its own event, with * priority over IO events. An exception thrown in a task can be handled by * `process.on("uncaughtException")` or `domain.on("error")`, but will otherwise * crash the process. If the error is handled, all subsequent tasks will * resume. * * @param {{call}} task A callable object, typically a function that takes no * arguments. */ module.exports = asap; function asap(task) { var rawTask; if (freeTasks.length) { rawTask = freeTasks.pop(); } else { rawTask = new RawTask(); } rawTask.task = task; rawTask.domain = process.domain; rawAsap(rawTask); } function RawTask() { this.task = null; this.domain = null; } RawTask.prototype.call = function () { if (this.domain) { this.domain.enter(); } var threw = true; try { this.task.call(); threw = false; // If the task throws an exception, Node.js (presumably) restores the // domain stack for the next event. if (this.domain) { this.domain.exit(); } } finally { // We use try/finally and a threw flag to avoid messing up stack traces // when we catch and release errors. if (threw) { // In Node.js, uncaught exceptions are considered fatal errors. // Re-throw them to interrupt flushing! // Ensure that flushing continues if an uncaught exception is // suppressed by a process.on("uncaughtException") or // domain.on("error") listener. rawAsap.requestFlush(); } // If the task threw an error, we do not want to exit the domain here. // Exiting the domain would prevent the domain from catching the error.
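// Release both references so the task (and any bound domain) can be garbage
// collected, and recycle this wrapper object for a future asap() call.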
this.task = null; this.domain = null; freeTasks.push(this); } }; npm_3.5.2.orig/node_modules/dezalgo/node_modules/asap/browser-asap.js0000644000000000000000000000421012631326456024135 0ustar 00000000000000"use strict"; // rawAsap provides everything we need except exception management. var rawAsap = require("./raw"); // RawTasks are recycled to reduce GC churn. var freeTasks = []; // We queue errors to ensure they are thrown in the right order (FIFO). // Array-as-queue is good enough here, since we are just dealing with exceptions. var pendingErrors = []; var requestErrorThrow = rawAsap.makeRequestCallFromTimer(throwFirstError); function throwFirstError() { if (pendingErrors.length) { throw pendingErrors.shift(); } } /** * Calls a task as soon as possible after returning, in its own event, with priority * over other events like animation, reflow, and repaint. An error thrown from an * event will not interrupt, nor even substantially slow down the processing of * other events, but will rather be postponed to a lower priority event. * @param {{call}} task A callable object, typically a function that takes no * arguments. */ module.exports = asap; function asap(task) { var rawTask; if (freeTasks.length) { rawTask = freeTasks.pop(); } else { rawTask = new RawTask(); } rawTask.task = task; rawAsap(rawTask); } // We wrap tasks with recyclable task objects. A task object implements // `call`, just like a function. function RawTask() { this.task = null; } // The sole purpose of wrapping the task is to catch the exception and recycle // the task object after its single use. RawTask.prototype.call = function () { try { this.task.call(); } catch (error) { if (asap.onerror) { // This hook exists purely for testing purposes. // Its name will be periodically randomized to break any code that // depends on its existence. asap.onerror(error); } else { // In a web browser, exceptions are not fatal. However, to avoid // slowing down the queue of pending tasks, we rethrow the error in a // lower priority turn. pendingErrors.push(error); requestErrorThrow(); } } finally { this.task = null; freeTasks[freeTasks.length] = this; } }; npm_3.5.2.orig/node_modules/dezalgo/node_modules/asap/browser-raw.js0000644000000000000000000002246712631326456024010 0ustar 00000000000000"use strict"; // Use the fastest means possible to execute a task in its own turn, with // priority over other events including IO, animation, reflow, and redraw // events in browsers. // // An exception thrown by a task will permanently interrupt the processing of // subsequent tasks. The higher level `asap` function ensures that, if an // exception is thrown by a task, the task queue will continue flushing as // soon as possible, but if you use `rawAsap` directly, you are responsible // either to ensure that no exceptions are thrown from your task, or to manually // call `rawAsap.requestFlush` if an exception is thrown. module.exports = rawAsap; function rawAsap(task) { if (!queue.length) { requestFlush(); flushing = true; } // Equivalent to push, but avoids a function call. queue[queue.length] = task; } var queue = []; // Once a flush has been requested, no further calls to `requestFlush` are // necessary until the next `flush` completes. var flushing = false; // `requestFlush` is an implementation-specific method that attempts to kick // off a `flush` event as quickly as possible. `flush` will attempt to exhaust // the event queue before yielding to the browser's own event loop.
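// Which strategy backs `requestFlush` is decided by the feature detection
// below: a DOM mutation observer where one is available, otherwise a
// setTimeout/setInterval race (see makeRequestCallFromTimer).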
var requestFlush; // The position of the next task to execute in the task queue. This is // preserved between calls to `flush` so that it can be resumed if // a task throws an exception. var index = 0; // If a task schedules additional tasks recursively, the task queue can grow // unbounded. To prevent memory exhaustion, the task queue will periodically // truncate already-completed tasks. var capacity = 1024; // The flush function processes all tasks that have been scheduled with // `rawAsap` unless and until one of those tasks throws an exception. // If a task throws an exception, `flush` ensures that its state will remain // consistent and will resume where it left off when called again. // However, `flush` does not make any arrangements to be called again if an // exception is thrown. function flush() { while (index < queue.length) { var currentIndex = index; // Advance the index before calling the task. This ensures that we will // begin flushing on the next task if the task throws an error. index = index + 1; queue[currentIndex].call(); // Prevent leaking memory for long chains of recursive calls to `asap`. // If we call `asap` within tasks scheduled by `asap`, the queue will // grow, but to avoid an O(n) walk for every task we execute, we don't // shift tasks off the queue after they have been executed. // Instead, we periodically shift 1024 tasks off the queue. if (index > capacity) { // Manually shift all values starting at the index back to the // beginning of the queue. for (var scan = 0, newLength = queue.length - index; scan < newLength; scan++) { queue[scan] = queue[scan + index]; } queue.length -= index; index = 0; } } queue.length = 0; index = 0; flushing = false; } // `requestFlush` is implemented using a strategy based on data collected from // every available SauceLabs Selenium web driver worker at time of writing. // https://docs.google.com/spreadsheets/d/1mG-5UYGup5qxGdEMWkhP6BWCz053NUb2E1QoUTU16uA/edit#gid=783724593 // Safari 6 and 6.1 for desktop, iPad, and iPhone are the only browsers that // have WebKitMutationObserver but not un-prefixed MutationObserver. // Must use `global` instead of `window` to work in both frames and web // workers. `global` is a provision of Browserify, Mr, Mrs, or Mop. var BrowserMutationObserver = global.MutationObserver || global.WebKitMutationObserver; // MutationObservers are desirable because they have high priority and work // reliably everywhere they are implemented. // They are implemented in all modern browsers. // // - Android 4-4.3 // - Chrome 26-34 // - Firefox 14-29 // - Internet Explorer 11 // - iPad Safari 6-7.1 // - iPhone Safari 7-7.1 // - Safari 6-7 if (typeof BrowserMutationObserver === "function") { requestFlush = makeRequestCallFromMutationObserver(flush); // MessageChannels are desirable because they give direct access to the HTML // task queue, are implemented in Internet Explorer 10, Safari 5.0-1, and Opera // 11-12, and in web workers in many engines. // Although message channels yield to any queued rendering and IO tasks, they // would be better than imposing the 4ms delay of timers. // However, they do not work reliably in Internet Explorer or Safari. // Internet Explorer 10 is the only browser that has setImmediate but does // not have MutationObservers. // Although setImmediate yields to the browser's renderer, it would be // preferable to falling back to setTimeout since it does not have // the minimum 4ms penalty.
// Unfortunately there appears to be a bug in Internet Explorer 10 Mobile (and // Desktop to a lesser extent) that renders both setImmediate and // MessageChannel useless for the purposes of ASAP. // https://github.com/kriskowal/q/issues/396 // Timers are implemented universally. // We fall back to timers in workers in most engines, and in foreground // contexts in the following browsers. // However, note that even this simple case requires nuances to operate in a // broad spectrum of browsers. // // - Firefox 3-13 // - Internet Explorer 6-9 // - iPad Safari 4.3 // - Lynx 2.8.7 } else { requestFlush = makeRequestCallFromTimer(flush); } // `requestFlush` requests that the high priority event queue be flushed as // soon as possible. // This is useful to prevent an error thrown in a task from stalling the event // queue if the exception is handled by Node.js’s // `process.on("uncaughtException")` or by a domain. rawAsap.requestFlush = requestFlush; // To request a high priority event, we induce a mutation observer by toggling // the text of a text node between "1" and "-1". function makeRequestCallFromMutationObserver(callback) { var toggle = 1; var observer = new BrowserMutationObserver(callback); var node = document.createTextNode(""); observer.observe(node, {characterData: true}); return function requestCall() { toggle = -toggle; node.data = toggle; }; } // The message channel technique was discovered by Malte Ubl and was the // original foundation for this library. // http://www.nonblocking.io/2011/06/windownexttick.html // Safari 6.0.5 (at least) intermittently fails to create message ports on a // page's first load. Thankfully, this version of Safari supports // MutationObservers, so we don't need to fall back in that case. // function makeRequestCallFromMessageChannel(callback) { // var channel = new MessageChannel(); // channel.port1.onmessage = callback; // return function requestCall() { // channel.port2.postMessage(0); // }; // } // For reasons explained above, we are also unable to use `setImmediate` // under any circumstances. // Even if we were, there is another bug in Internet Explorer 10. // It is not sufficient to assign `setImmediate` to `requestFlush` because // `setImmediate` must be called *by name* and therefore must be wrapped in a // closure. // Never forget. // function makeRequestCallFromSetImmediate(callback) { // return function requestCall() { // setImmediate(callback); // }; // } // Safari 6.0 has a problem where timers will get lost while the user is // scrolling. This problem does not impact ASAP because Safari 6.0 supports // mutation observers, so that implementation is used instead. // However, if we ever elect to use timers in Safari, the prevalent work-around // is to add a scroll event listener that calls for a flush. // `setTimeout` does not call the passed callback if the delay is less than // approximately 7 milliseconds in web workers in Firefox 8 through 18, and sometimes not // even then. function makeRequestCallFromTimer(callback) { return function requestCall() { // We dispatch a timeout with a specified delay of 0 for engines that // can reliably accommodate that request. This will usually be snapped // to a 4 millisecond delay, but once we're flushing, there's no delay // between events. var timeoutHandle = setTimeout(handleTimer, 0); // However, since this timer gets frequently dropped in Firefox // workers, we enlist an interval handle that will try to fire // an event 20 times per second until it succeeds.
var intervalHandle = setInterval(handleTimer, 50); function handleTimer() { // Whichever timer succeeds will cancel both timers and // execute the callback. clearTimeout(timeoutHandle); clearInterval(intervalHandle); callback(); } }; } // This is for `asap.js` only. // Its name will be periodically randomized to break any code that depends on // its existence. rawAsap.makeRequestCallFromTimer = makeRequestCallFromTimer; // ASAP was originally a nextTick shim included in Q. This was factored out // into this ASAP package. It was later adapted to RSVP which made further // amendments. These decisions, particularly to marginalize MessageChannel and // to capture the MutationObserver implementation in a closure, were integrated // back into ASAP proper. // https://github.com/tildeio/rsvp.js/blob/cddf7232546a9cf858524b75cde6f9edf72620a7/lib/rsvp/asap.js npm_3.5.2.orig/node_modules/dezalgo/node_modules/asap/package.json0000644000000000000000000003040312631326456023463 0ustar 00000000000000{ "name": "asap", "version": "2.0.3", "description": "High-priority task queue for Node.js and browsers", "keywords": [ "event", "task", "queue" ], "license": { "type": "MIT", "url": "https://github.com/kriskowal/asap/raw/master/LICENSE.md" }, "repository": { "type": "git", "url": "git+https://github.com/kriskowal/asap.git" }, "main": "./asap.js", "browser": { "./asap.js": "./browser-asap.js", "./raw.js": "./browser-raw.js", "./test/domain.js": "./test/browser-domain.js" }, "files": [ "raw.js", "asap.js", "browser-raw.js", "browser-asap.js" ], "scripts": { "test": "npm run lint && npm run test-node", "test-travis": "npm run lint && npm run test-node && npm run test-saucelabs && npm run test-saucelabs-worker", "test-node": "node test/asap-test.js", "test-publish": "node scripts/publish-bundle.js test/asap-test.js | pbcopy", "test-browser": "node scripts/publish-bundle.js test/asap-test.js | xargs opener", "test-saucelabs": "node scripts/saucelabs.js test/asap-test.js scripts/saucelabs-spot-configurations.json", "test-saucelabs-all": "node scripts/saucelabs.js test/asap-test.js scripts/saucelabs-all-configurations.json", "test-saucelabs-worker": "node scripts/saucelabs-worker-test.js scripts/saucelabs-spot-configurations.json", "test-saucelabs-worker-all": "node scripts/saucelabs-worker-test.js scripts/saucelabs-all-configurations.json", "lint": "jshint raw.js asap.js browser-raw.js browser-asap.js $(find scripts -name '*.js' | grep -v gauntlet)" }, "devDependencies": { "events": "^1.0.1", "jshint": "^2.5.1", "knox": "^0.8.10", "mr": "^2.0.5", "opener": "^1.3.0", "q": "^2.0.3", "q-io": "^2.0.3", "saucelabs": "^0.1.1", "wd": "^0.2.21", "weak-map": "^1.0.5" }, "readme": "# ASAP\n\n[![Build Status](https://travis-ci.org/kriskowal/asap.png?branch=master)](https://travis-ci.org/kriskowal/asap)\n\nPromise and asynchronous observer libraries, as well as hand-rolled callback\nprograms and libraries, often need a mechanism to postpone the execution of a\ncallback until the next available event.\n(See [Designing API’s for Asynchrony][Zalgo].)\nThe `asap` function executes a task **as soon as possible** but not before it\nreturns, waiting only for the completion of the current event and previously\nscheduled tasks.\n\n```javascript\nasap(function () {\n // ...\n});\n```\n\n[Zalgo]: http://blog.izs.me/post/59142742143/designing-apis-for-asynchrony\n\nThis CommonJS package provides an `asap` module that exports a function that\nexecutes a task function *as soon as possible*.\n\nASAP strives to schedule events to occur before 
yielding for IO, reflow,\nor redrawing.\nEach event receives an independent stack, with only platform code in parent\nframes and the events run in the order they are scheduled.\n\nASAP provides a fast event queue that will execute tasks until it is\nempty before yielding to the JavaScript engine's underlying event-loop.\nWhen a task gets added to a previously empty event queue, ASAP schedules a flush\nevent, preferring for that event to occur before the JavaScript engine has an\nopportunity to perform IO tasks or rendering, thus making the first task and\nsubsequent tasks semantically indistinguishable.\nASAP uses a variety of techniques to preserve this invariant on different\nversions of browsers and Node.js.\n\nBy design, ASAP prevents input events from being handled until the task\nqueue is empty.\nIf the process is busy enough, this may cause incoming connection requests to be\ndropped, and may cause existing connections to inform the sender to reduce the\ntransmission rate or stall.\nASAP allows this on the theory that, if there is enough work to do, there is no\nsense in looking for trouble.\nAs a consequence, ASAP can interfere with smooth animation.\nIf your task should be tied to the rendering loop, consider using\n`requestAnimationFrame` instead.\nA long sequence of tasks can also effect the long running script dialog.\nIf this is a problem, you may be able to use ASAP’s cousin `setImmediate` to\nbreak long processes into shorter intervals and periodically allow the browser\nto breathe.\n`setImmediate` will yield for IO, reflow, and repaint events.\nIt also returns a handler and can be canceled.\nFor a `setImmediate` shim, consider [YuzuJS setImmediate][setImmediate].\n\n[setImmediate]: https://github.com/YuzuJS/setImmediate\n\nTake care.\nASAP can sustain infinite recursive calls without warning.\nIt will not halt from a stack overflow, and it will not consume unbounded\nmemory.\nThis is behaviorally equivalent to an infinite loop.\nJust as with infinite loops, you can monitor a Node.js process for this behavior\nwith a heart-beat signal.\nAs with infinite loops, a very small amount of caution goes a long way to\navoiding problems.\n\n```javascript\nfunction loop() {\n asap(loop);\n}\nloop();\n```\n\nIn browsers, if a task throws an exception, it will not interrupt the flushing\nof high-priority tasks.\nThe exception will be postponed to a later, low-priority event to avoid\nslow-downs.\nIn Node.js, if a task throws an exception, ASAP will resume flushing only if—and\nonly after—the error is handled by `domain.on(\"error\")` or\n`process.on(\"uncaughtException\")`.\n\n## Raw ASAP\n\nChecking for exceptions comes at a cost.\nThe package also provides an `asap/raw` module that exports the underlying\nimplementation which is faster but stalls if a task throws an exception.\nThis internal version of the ASAP function does not check for errors.\nIf a task does throw an error, it will stall the event queue unless you manually\ncall `rawAsap.requestFlush()` before throwing the error, or any time after.\n\nIn Node.js, `asap/raw` also runs all tasks outside any domain.\nIf you need a task to be bound to your domain, you will have to do it manually.\n\n```js\nif (process.domain) {\n task = process.domain.bind(task);\n}\nrawAsap(task);\n```\n\n## Tasks\n\nA task may be any object that implements `call()`.\nA function will suffice, but closures tend not to be reusable and can cause\ngarbage collector churn.\nBoth `asap` and `rawAsap` accept task objects to give you the option of\nrecycling 
task objects or using higher callable object abstractions.\nSee the `asap` source for an illustration.\n\n\n## Compatibility\n\nASAP is tested on Node.js v0.10 and in a broad spectrum of web browsers.\nThe following charts capture the browser test results for the most recent\nrelease.\nThe first chart shows test results for ASAP running in the main window context.\nThe second chart shows test results for ASAP running in a web worker context.\nTest results are inconclusive (grey) on browsers that do not support web\nworkers.\nThese data are captured automatically by [Continuous\nIntegration][].\n\n[Continuous Integration]: https://github.com/kriskowal/asap/blob/master/CONTRIBUTING.md\n\n![Browser Compatibility](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-results-matrix.svg)\n\n![Compatibility in Web Workers](http://kriskowal-asap.s3-website-us-west-2.amazonaws.com/train/integration-2/saucelabs-worker-results-matrix.svg)\n\n## Caveats\n\nWhen a task is added to an empty event queue, it is not always possible to\nguarantee that the task queue will begin flushing immediately after the current\nevent.\nHowever, once the task queue begins flushing, it will not yield until the queue\nis empty, even if the queue grows while executing tasks.\n\nThe following browsers allow the use of [DOM mutation observers][] to access\nthe HTML [microtask queue][], and thus begin flushing ASAP's task queue\nimmediately at the end of the current event loop turn, before any rendering or\nIO:\n\n[microtask queue]: http://www.whatwg.org/specs/web-apps/current-work/multipage/webappapis.html#microtask-queue\n[DOM mutation observers]: http://dom.spec.whatwg.org/#mutation-observers\n\n- Android 4–4.3\n- Chrome 26–34\n- Firefox 14–29\n- Internet Explorer 11\n- iPad Safari 6–7.1\n- iPhone Safari 7–7.1\n- Safari 6–7\n\nIn the absense of mutation observers, there are a few browsers, and situations\nlike web workers in some of the above browsers, where [message channels][]\nwould be a useful way to avoid falling back to timers.\nMessage channels give direct access to the HTML [task queue][], so the ASAP\ntask queue would flush after any already queued rendering and IO tasks, but\nwithout having the minimum delay imposed by timers.\nHowever, among these browsers, Internet Explorer 10 and Safari do not reliably\ndispatch messages, so they are not worth the trouble to implement.\n\n[message channels]: http://www.whatwg.org/specs/web-apps/current-work/multipage/web-messaging.html#message-channels\n[task queue]: http://www.whatwg.org/specs/web-apps/current-work/multipage/webappapis.html#concept-task\n\n- Internet Explorer 10\n- Safair 5.0-1\n- Opera 11-12\n\nIn the absense of mutation observers, these browsers and the following browsers\nall fall back to using `setTimeout` and `setInterval` to ensure that a `flush`\noccurs.\nThe implementation uses both and cancels whatever handler loses the race, since\n`setTimeout` tends to occasionally skip tasks in unisolated circumstances.\nTimers generally delay the flushing of ASAP's task queue for four milliseconds.\n\n- Firefox 3–13\n- Internet Explorer 6–10\n- iPad Safari 4.3\n- Lynx 2.8.7\n\n\n## Heritage\n\nASAP has been factored out of the [Q][] asynchronous promise library.\nIt originally had a naïve implementation in terms of `setTimeout`, but\n[Malte Ubl][NonBlocking] provided an insight that `postMessage` might be\nuseful for creating a high-priority, no-delay event dispatch hack.\nSince then, Internet Explorer proposed and implemented 
`setImmediate`.\nRobert Katić began contributing to Q by measuring the performance of\nthe internal implementation of `asap`, paying particular attention to\nerror recovery.\nDomenic, Robert, and Kris Kowal collectively settled on the current strategy of\nunrolling the high-priority event queue internally regardless of what strategy\nwe used to dispatch the potentially lower-priority flush event.\nDomenic went on to make ASAP cooperate with Node.js domains.\n\n[Q]: https://github.com/kriskowal/q\n[NonBlocking]: http://www.nonblocking.io/2011/06/windownexttick.html\n\nFor further reading, Nicholas Zakas provided a thorough article on [The\nCase for setImmediate][NCZ].\n\n[NCZ]: http://www.nczonline.net/blog/2013/07/09/the-case-for-setimmediate/\n\nEmber’s RSVP promise implementation later [adopted][RSVP ASAP] the name ASAP but\nfurther developed the implentation.\nParticularly, The `MessagePort` implementation was abandoned due to interaction\n[problems with Mobile Internet Explorer][IE Problems] in favor of an\nimplementation backed on the newer and more reliable DOM `MutationObserver`\ninterface.\nThese changes were back-ported into this library.\n\n[IE Problems]: https://github.com/cujojs/when/issues/197\n[RSVP ASAP]: https://github.com/tildeio/rsvp.js/blob/cddf7232546a9cf858524b75cde6f9edf72620a7/lib/rsvp/asap.js\n\nIn addition, ASAP factored into `asap` and `asap/raw`, such that `asap` remained\nexception-safe, but `asap/raw` provided a tight kernel that could be used for\ntasks that guaranteed that they would not throw exceptions.\nThis core is useful for promise implementations that capture thrown errors in\nrejected promises and do not need a second safety net.\nAt the same time, the exception handling in `asap` was factored into separate\nimplementations for Node.js and browsers, using the the [Browserify][Browser\nConfig] `browser` property in `package.json` to instruct browser module loaders\nand bundlers, including [Browserify][], [Mr][], and [Mop][], to use the\nbrowser-only implementation.\n\n[Browser Config]: https://gist.github.com/defunctzombie/4339901\n[Browserify]: https://github.com/substack/node-browserify\n[Mr]: https://github.com/montagejs/mr\n[Mop]: https://github.com/montagejs/mop\n\n## License\n\nCopyright 2009-2014 by Contributors\nMIT License (enclosed)\n\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/kriskowal/asap/issues" }, "homepage": "https://github.com/kriskowal/asap#readme", "_id": "asap@2.0.3", "_shasum": "1fc1d1564ee11620dfca6d67029850913f9f4679", "_resolved": "https://registry.npmjs.org/asap/-/asap-2.0.3.tgz", "_from": "asap@>=2.0.0 <3.0.0" } npm_3.5.2.orig/node_modules/dezalgo/node_modules/asap/raw.js0000644000000000000000000001005312631326456022323 0ustar 00000000000000"use strict"; var domain; // The domain module is executed on demand var hasSetImmediate = typeof setImmediate === "function"; // Use the fastest means possible to execute a task in its own turn, with // priority over other events including network IO events in Node.js. // // An exception thrown by a task will permanently interrupt the processing of // subsequent tasks. The higher level `asap` function ensures that if an // exception is thrown by a task, that the task queue will continue flushing as // soon as possible, but if you use `rawAsap` directly, you are responsible to // either ensure that no exceptions are thrown from your task, or to manually // call `rawAsap.requestFlush` if an exception is thrown. 
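// An illustrative sketch (not part of this file): a caller that cannot rule
// out exceptions can keep the queue draining by requesting a flush up front:
//
//     rawAsap({ call: function () {
//         rawAsap.requestFlush(); // flushing resumes even if we throw below
//         riskyWork(); // stand-in for the caller's own, possibly-throwing code
//     } });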
module.exports = rawAsap function rawAsap(task) { if (!queue.length) { requestFlush() flushing = true } // Avoids a function call queue[queue.length] = task } var queue = [] // Once a flush has been requested, no further calls to `requestFlush` are // necessary until the next `flush` completes. var flushing = false // The position of the next task to execute in the task queue. This is // preserved between calls to `flush` so that it can be resumed if // a task throws an exception. var index = 0 // If a task schedules additional tasks recursively, the task queue can grow // unbounded. To prevent memory exhaustion, the task queue will periodically // truncate already-completed tasks. var capacity = 1024 // The flush function processes all tasks that have been scheduled with // `rawAsap` unless and until one of those tasks throws an exception. // If a task throws an exception, `flush` ensures that its state will remain // consistent and will resume where it left off when called again. // However, `flush` does not make any arrangements to be called again if an // exception is thrown. function flush() { while (index < queue.length) { var currentIndex = index // Advance the index before calling the task. This ensures that we will // begin flushing on the next task if the task throws an error. index = index + 1 queue[currentIndex].call() // Prevent leaking memory for long chains of recursive calls to `asap`. // If we call `asap` within tasks scheduled by `asap`, the queue will // grow, but to avoid an O(n) walk for every task we execute, we don't // shift tasks off the queue after they have been executed. // Instead, we periodically shift 1024 tasks off the queue. if (index > capacity) { // Manually shift all values starting at the index back to the // beginning of the queue. for (var scan = 0, newLength = queue.length - index; scan < newLength; scan++) { queue[scan] = queue[scan + index] } queue.length -= index index = 0 } } queue.length = 0 index = 0 flushing = false } rawAsap.requestFlush = requestFlush function requestFlush() { // Ensure flushing is not bound to any domain. // It is not sufficient to exit the domain, because domains exist on a stack. // To execute code outside of any domain, the following dance is necessary. var parentDomain = process.domain if (parentDomain) { if (!domain) { // Lazy execute the domain module. // Only employed if the user elects to use domains. domain = require("domain") } domain.active = process.domain = null } // `setImmediate` is slower than `process.nextTick`, but `process.nextTick` // cannot handle recursion. // `requestFlush` will only be called recursively from `asap.js`, to resume // flushing after an error is thrown into a domain. // Conveniently, `setImmediate` was introduced in the same version // `process.nextTick` started throwing recursion errors. if (flushing && hasSetImmediate) { setImmediate(flush) } else { process.nextTick(flush) } if (parentDomain) { domain.active = process.domain = parentDomain } } npm_3.5.2.orig/node_modules/dezalgo/test/basic.js0000644000000000000000000000106612631326456020175 0ustar 00000000000000var test = require('tap').test var dz = require('../dezalgo.js') test('the dark pony', function(t) { var n = 0 function foo(i, cb) { cb = dz(cb) if (++n % 2) cb(true, i) else process.nextTick(cb.bind(null, false, i)) } var called = 0 var order = [0, 2, 4, 6, 8, 1, 3, 5, 7, 9] var o = 0 for (var i = 0; i < 10; i++) { foo(i, function(cached, i) { t.equal(i, order[o++]) t.equal(i % 2, cached ?
0 : 1) called++ }) t.equal(called, 0) } setTimeout(function() { t.equal(called, 10) t.end() }) }) npm_3.5.2.orig/node_modules/editor/LICENSE0000644000000000000000000000216112631326456016442 0ustar 00000000000000Copyright 2013 James Halliday (mail@substack.net) This project is free software released under the MIT license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/editor/README.markdown0000644000000000000000000000121512631326456020135 0ustar 00000000000000editor ====== Launch $EDITOR in your program. example ======= ``` js var editor = require('editor'); editor('beep.json', function (code, sig) { console.log('finished editing with code ' + code); }); ``` *** ``` $ node edit.js ``` ![editor](http://substack.net/images/screenshots/editor.png) ``` finished editing with code 0 ``` methods ======= ``` js var editor = require('editor') ``` editor(file, opts={}, cb) ------------------------- Launch the `$EDITOR` (or `opts.editor`) for `file`. When the editor exits, `cb(code, sig)` fires. install ======= With [npm](http://npmjs.org) do: ``` npm install editor ``` license ======= MIT npm_3.5.2.orig/node_modules/editor/example/0000755000000000000000000000000012631326456017070 5ustar 00000000000000npm_3.5.2.orig/node_modules/editor/index.js0000644000000000000000000000112312631326456017077 0ustar 00000000000000var spawn = require('child_process').spawn; module.exports = function (file, opts, cb) { if (typeof opts === 'function') { cb = opts; opts = {}; } if (!opts) opts = {}; var ed = /^win/.test(process.platform) ? 
'notepad' : 'vim'; var editor = opts.editor || process.env.VISUAL || process.env.EDITOR || ed; var args = editor.split(/\s+/); var bin = args.shift(); var ps = spawn(bin, args.concat([ file ]), { stdio: 'inherit' }); ps.on('exit', function (code, sig) { if (typeof cb === 'function') cb(code, sig) }); }; npm_3.5.2.orig/node_modules/editor/package.json0000644000000000000000000000262712631326456017732 0ustar 00000000000000{ "name": "editor", "version": "1.0.0", "description": "launch $EDITOR in your program", "main": "index.js", "directories": { "example": "example", "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "~0.4.4" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/substack/node-editor.git" }, "homepage": "https://github.com/substack/node-editor", "keywords": [ "text", "edit", "shell" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "engine": { "node": ">=0.6" }, "gitHead": "15200af2c417c65a4df153f39f32143dcd476375", "bugs": { "url": "https://github.com/substack/node-editor/issues" }, "_id": "editor@1.0.0", "_shasum": "60c7f87bd62bcc6a894fa8ccd6afb7823a24f742", "_from": "editor@>=1.0.0 <1.1.0", "_npmVersion": "2.7.5", "_nodeVersion": "1.6.3", "_npmUser": { "name": "substack", "email": "substack@gmail.com" }, "dist": { "shasum": "60c7f87bd62bcc6a894fa8ccd6afb7823a24f742", "tarball": "http://registry.npmjs.org/editor/-/editor-1.0.0.tgz" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "_resolved": "https://registry.npmjs.org/editor/-/editor-1.0.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/editor/example/beep.json0000644000000000000000000000004412631326456020674 0ustar 00000000000000{ "a" : 3, "b" : 4, "c" : 5 } npm_3.5.2.orig/node_modules/editor/example/edit.js0000644000000000000000000000022012631326456020345 0ustar 00000000000000var editor = require('../'); editor(__dirname + '/beep.json', function (code, sig) { console.log('finished editing with code ' + code); }); npm_3.5.2.orig/node_modules/fs-vacuum/.eslintrc0000644000000000000000000000064512631326456017706 0ustar 00000000000000{ "env" : { "node" : true }, "rules" : { "curly" : 0, "no-lonely-if" : 1, "no-mixed-requires" : 0, "no-underscore-dangle" : 0, "no-unused-vars" : [2, {"vars" : "all", "args" : "after-used"}], "no-use-before-define" : [2, "nofunc"], "quotes" : [1, "double", "avoid-escape"], "semi" : [2, "never"], "space-after-keywords" : 1, "space-infix-ops" : 0, "strict" : 0 } } npm_3.5.2.orig/node_modules/fs-vacuum/.npmignore0000644000000000000000000000001512631326456020050 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/fs-vacuum/LICENSE0000644000000000000000000000134012631326456017060 0ustar 00000000000000Copyright (c) 2015, Forrest L Norvell Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/fs-vacuum/README.md0000644000000000000000000000226512631326456017341 0ustar 00000000000000# fs-vacuum Remove the empty branches of a directory tree, optionally up to (but not including) a specified base directory. Optionally nukes the leaf directory. ## Usage ```javascript var logger = require("npmlog"); var vacuum = require("fs-vacuum"); var options = { base : "/path/to/my/tree/root", purge : true, log : logger.silly.bind(logger, "myCleanup") }; /* Assuming there are no other files or directories in "out", "to", or "my", * the final path will just be "/path/to/my/tree/root". */ vacuum("/path/to/my/tree/root/out/to/my/files", function (error) { if (error) console.error("Unable to cleanly vacuum:", error.message); }); ``` # vacuum(directory, options, callback) * `directory` {String} Leaf node to remove. **Must be a directory, symlink, or file.** * `options` {Object} * `base` {String} No directories at or above this level of the filesystem will be removed. * `purge` {Boolean} If set, nuke the whole leaf directory, including its contents. * `log` {Function} A logging function that takes `npmlog`-compatible argument lists. * `callback` {Function} Function to call once vacuuming is complete. * `error` {Error} What went wrong along the way, if anything. npm_3.5.2.orig/node_modules/fs-vacuum/package.json0000644000000000000000000000273512631326456020352 0ustar 00000000000000{ "name": "fs-vacuum", "version": "1.2.7", "description": "recursively remove empty directories -- to a point", "main": "vacuum.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/npm/fs-vacuum.git" }, "keywords": [ "rm", "rimraf", "clean" ], "author": { "name": "Forrest L Norvell", "email": "ogd@aoaioxxysz.net" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/fs-vacuum/issues" }, "homepage": "https://github.com/npm/fs-vacuum", "devDependencies": { "mkdirp": "^0.5.0", "tap": "^0.4.11", "tmp": "0.0.24" }, "dependencies": { "graceful-fs": "^4.1.2", "path-is-inside": "^1.0.1", "rimraf": "^2.2.8" }, "gitHead": "498a44d987ee11bc355fe1ec479d55a689fc37ef", "_id": "fs-vacuum@1.2.7", "_shasum": "75e501f9d2889ba2fe9fe12f936ba5dad50ca35a", "_from": "fs-vacuum@>=1.2.7 <1.3.0", "_npmVersion": "2.14.3", "_nodeVersion": "2.2.2", "_npmUser": { "name": "zkat", "email": "kat@sykosomatic.org" }, "dist": { "shasum": "75e501f9d2889ba2fe9fe12f936ba5dad50ca35a", "tarball": "http://registry.npmjs.org/fs-vacuum/-/fs-vacuum-1.2.7.tgz" }, "maintainers": [ { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, { "name": "zkat", "email": "kat@sykosomatic.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/fs-vacuum/-/fs-vacuum-1.2.7.tgz" } npm_3.5.2.orig/node_modules/fs-vacuum/test/0000755000000000000000000000000012631326456017034 5ustar 00000000000000npm_3.5.2.orig/node_modules/fs-vacuum/vacuum.js0000644000000000000000000000611412631326456017715 0ustar 00000000000000var assert = require("assert") var dirname = require("path").dirname var resolve = require("path").resolve var isInside = require("path-is-inside") var rimraf = require("rimraf") var lstat = require("graceful-fs").lstat var readdir = require("graceful-fs").readdir var 
rmdir = require("graceful-fs").rmdir var unlink = require("graceful-fs").unlink module.exports = vacuum function vacuum(leaf, options, cb) { assert(typeof leaf === "string", "must pass in path to remove") assert(typeof cb === "function", "must pass in callback") if (!options) options = {} assert(typeof options === "object", "options must be an object") var log = options.log ? options.log : function () {} leaf = leaf && resolve(leaf) var base = options.base && resolve(options.base) if (base && !isInside(leaf, base)) { return cb(new Error(leaf + " is not a child of " + base)) } lstat(leaf, function (error, stat) { if (error) { if (error.code === "ENOENT") return cb(null) log(error.stack) return cb(error) } if (!(stat && (stat.isDirectory() || stat.isSymbolicLink() || stat.isFile()))) { log(leaf, "is not a directory, file, or link") return cb(new Error(leaf + " is not a directory, file, or link")) } if (options.purge) { log("purging", leaf) rimraf(leaf, function (error) { if (error) return cb(error) next(dirname(leaf)) }) } else if (!stat.isDirectory()) { log("removing", leaf) unlink(leaf, function (error) { if (error) return cb(error) next(dirname(leaf)) }) } else { next(leaf) } }) function next(branch) { branch = branch && resolve(branch) // either we've reached the base or we've reached the root if ((base && branch === base) || branch === dirname(branch)) { log("finished vacuuming up to", branch) return cb(null) } readdir(branch, function (error, files) { if (error) { if (error.code === "ENOENT") return cb(null) log("unable to check directory", branch, "due to", error.message) return cb(error) } if (files.length > 0) { log("quitting because other entries in", branch) return cb(null) } log("removing", branch) lstat(branch, function (error, stat) { if (error) { if (error.code === "ENOENT") return cb(null) log("unable to lstat", branch, "due to", error.message) return cb(error) } var remove = stat.isDirectory() ? 
rmdir : unlink remove(branch, function (error) { if (error) { if (error.code === "ENOENT") { log("quitting because lost the race to remove", branch) return cb(null) } if (error.code === "ENOTEMPTY") { log("quitting because new (racy) entries in", branch) return cb(null) } log("unable to remove", branch, "due to", error.message) return cb(error) } next(dirname(branch)) }) }) }) } } npm_3.5.2.orig/node_modules/fs-vacuum/test/arguments.js0000644000000000000000000000131612631326456021400 0ustar 00000000000000var test = require("tap").test var vacuum = require("../vacuum.js") test("vacuum throws on missing parameters", function (t) { t.throws(vacuum, "called with no parameters") t.throws(function () { vacuum("directory", {}) }, "called with no callback") t.end() }) test('vacuum throws on incorrect types ("Forrest is pedantic" section)', function (t) { t.throws(function () { vacuum({}, {}, function () {}) }, "called with path parameter of incorrect type") t.throws(function () { vacuum("directory", "directory", function () {}) }, "called with options of wrong type") t.throws(function () { vacuum("directory", {}, "whoops") }, "called with callback that isn't callable") t.end() }) npm_3.5.2.orig/node_modules/fs-vacuum/test/base-leaf-mismatch.js0000644000000000000000000000062612631326456023020 0ustar 00000000000000var test = require("tap").test var vacuum = require("../vacuum.js") test("vacuum errors when base is set and path is not under it", function (t) { vacuum("/a/made/up/path", {base : "/root/elsewhere"}, function (er) { t.ok(er, "got an error") t.equal( er.message, "/a/made/up/path is not a child of /root/elsewhere", "got the expected error message" ) t.end() }) }) npm_3.5.2.orig/node_modules/fs-vacuum/test/no-entries-file-no-purge.js0000644000000000000000000000444612631326456024134 0ustar 00000000000000var path = require("path") var test = require("tap").test var statSync = require("graceful-fs").statSync var writeFile = require("graceful-fs").writeFile var readdirSync = require("graceful-fs").readdirSync var mkdtemp = require("tmp").dir var mkdirp = require("mkdirp") var vacuum = require("../vacuum.js") // CONSTANTS var TEMP_OPTIONS = { unsafeCleanup : true, mode : "0700" } var SHORT_PATH = path.join("i", "am", "a", "path") var PARTIAL_PATH = path.join(SHORT_PATH, "that", "ends", "at", "a") var FULL_PATH = path.join(PARTIAL_PATH, "file") var messages = [] function log() { messages.push(Array.prototype.slice.call(arguments).join(" ")) } var testBase, partialPath, fullPath test("xXx setup xXx", function (t) { mkdtemp(TEMP_OPTIONS, function (er, tmpdir) { t.ifError(er, "temp directory exists") testBase = path.resolve(tmpdir, SHORT_PATH) partialPath = path.resolve(tmpdir, PARTIAL_PATH) fullPath = path.resolve(tmpdir, FULL_PATH) mkdirp(partialPath, function (er) { t.ifError(er, "made test path") writeFile(fullPath, new Buffer("hi"), function (er) { t.ifError(er, "made file") t.end() }) }) }) }) test("remove up to a point", function (t) { vacuum(fullPath, {purge : false, base : testBase, log : log}, function (er) { t.ifError(er, "cleaned up to base") t.equal(messages.length, 6, "got 5 removal & 1 finish message") t.equal(messages[5], "finished vacuuming up to " + testBase) var stat var verifyPath = fullPath function verify() { stat = statSync(verifyPath) } // handle the file separately t.throws(verify, verifyPath + " cannot be statted") t.notOk(stat && stat.isFile(), verifyPath + " is totally gone") verifyPath = path.dirname(verifyPath) for (var i = 0; i < 4; i++) { t.throws(verify, verifyPath + 
" cannot be statted") t.notOk(stat && stat.isDirectory(), verifyPath + " is totally gone") verifyPath = path.dirname(verifyPath) } t.doesNotThrow(function () { stat = statSync(testBase) }, testBase + " can be statted") t.ok(stat && stat.isDirectory(), testBase + " is still a directory") var files = readdirSync(testBase) t.equal(files.length, 0, "nothing left in base directory") t.end() }) }) npm_3.5.2.orig/node_modules/fs-vacuum/test/no-entries-link-no-purge.js0000644000000000000000000000464512631326456024153 0ustar 00000000000000var path = require("path") var test = require("tap").test var statSync = require("graceful-fs").statSync var symlinkSync = require("graceful-fs").symlinkSync var readdirSync = require("graceful-fs").readdirSync var mkdtemp = require("tmp").dir var mkdirp = require("mkdirp") var vacuum = require("../vacuum.js") // CONSTANTS var TEMP_OPTIONS = { unsafeCleanup : true, mode : "0700" } var SHORT_PATH = path.join("i", "am", "a", "path") var TARGET_PATH = path.join("target-link", "in", "the", "middle") var PARTIAL_PATH = path.join(SHORT_PATH, "with", "a") var FULL_PATH = path.join(PARTIAL_PATH, "link") var EXPANDO_PATH = path.join(SHORT_PATH, "with", "a", "link", "in", "the", "middle") var messages = [] function log() { messages.push(Array.prototype.slice.call(arguments).join(" ")) } var testBase, targetPath, partialPath, fullPath, expandoPath test("xXx setup xXx", function (t) { mkdtemp(TEMP_OPTIONS, function (er, tmpdir) { t.ifError(er, "temp directory exists") testBase = path.resolve(tmpdir, SHORT_PATH) targetPath = path.resolve(tmpdir, TARGET_PATH) partialPath = path.resolve(tmpdir, PARTIAL_PATH) fullPath = path.resolve(tmpdir, FULL_PATH) expandoPath = path.resolve(tmpdir, EXPANDO_PATH) mkdirp(partialPath, function (er) { t.ifError(er, "made test path") mkdirp(targetPath, function (er) { t.ifError(er, "made target path") symlinkSync(path.join(tmpdir, "target-link"), fullPath) t.end() }) }) }) }) test("remove up to a point", function (t) { vacuum(expandoPath, {purge : false, base : testBase, log : log}, function (er) { t.ifError(er, "cleaned up to base") t.equal(messages.length, 7, "got 6 removal & 1 finish message") t.equal(messages[6], "finished vacuuming up to " + testBase) var stat var verifyPath = expandoPath function verify() { stat = statSync(verifyPath) } for (var i = 0; i < 6; i++) { t.throws(verify, verifyPath + " cannot be statted") t.notOk(stat && stat.isDirectory(), verifyPath + " is totally gone") verifyPath = path.dirname(verifyPath) } t.doesNotThrow(function () { stat = statSync(testBase) }, testBase + " can be statted") t.ok(stat && stat.isDirectory(), testBase + " is still a directory") var files = readdirSync(testBase) t.equal(files.length, 0, "nothing left in base directory") t.end() }) }) npm_3.5.2.orig/node_modules/fs-vacuum/test/no-entries-no-purge.js0000644000000000000000000000323412631326456023211 0ustar 00000000000000var path = require("path") var test = require("tap").test var statSync = require("graceful-fs").statSync var mkdtemp = require("tmp").dir var mkdirp = require("mkdirp") var vacuum = require("../vacuum.js") // CONSTANTS var TEMP_OPTIONS = { unsafeCleanup : true, mode : "0700" } var SHORT_PATH = path.join("i", "am", "a", "path") var LONG_PATH = path.join(SHORT_PATH, "of", "a", "certain", "length") var messages = [] function log() { messages.push(Array.prototype.slice.call(arguments).join(" ")) } var testPath, testBase test("xXx setup xXx", function (t) { mkdtemp(TEMP_OPTIONS, function (er, tmpdir) { t.ifError(er, "temp directory 
exists") testBase = path.resolve(tmpdir, SHORT_PATH) testPath = path.resolve(tmpdir, LONG_PATH) mkdirp(testPath, function (er) { t.ifError(er, "made test path") t.end() }) }) }) test("remove up to a point", function (t) { vacuum(testPath, {purge : false, base : testBase, log : log}, function (er) { t.ifError(er, "cleaned up to base") t.equal(messages.length, 5, "got 4 removal & 1 finish message") t.equal(messages[4], "finished vacuuming up to " + testBase) var stat var verifyPath = testPath function verify() { stat = statSync(verifyPath) } for (var i = 0; i < 4; i++) { t.throws(verify, verifyPath + " cannot be statted") t.notOk(stat && stat.isDirectory(), verifyPath + " is totally gone") verifyPath = path.dirname(verifyPath) } t.doesNotThrow(function () { stat = statSync(testBase) }, testBase + " can be statted") t.ok(stat && stat.isDirectory(), testBase + " is still a directory") t.end() }) }) npm_3.5.2.orig/node_modules/fs-vacuum/test/no-entries-with-link-purge.js0000644000000000000000000000471512631326456024510 0ustar 00000000000000var path = require("path") var test = require("tap").test var statSync = require("graceful-fs").statSync var writeFileSync = require("graceful-fs").writeFileSync var symlinkSync = require("graceful-fs").symlinkSync var mkdtemp = require("tmp").dir var mkdirp = require("mkdirp") var vacuum = require("../vacuum.js") // CONSTANTS var TEMP_OPTIONS = { unsafeCleanup : true, mode : "0700" } var SHORT_PATH = path.join("i", "am", "a", "path") var TARGET_PATH = "link-target" var FIRST_FILE = path.join(TARGET_PATH, "monsieurs") var SECOND_FILE = path.join(TARGET_PATH, "mesdames") var PARTIAL_PATH = path.join(SHORT_PATH, "with", "a", "definite") var FULL_PATH = path.join(PARTIAL_PATH, "target") var messages = [] function log() { messages.push(Array.prototype.slice.call(arguments).join(" ")) } var testBase, partialPath, fullPath, targetPath test("xXx setup xXx", function (t) { mkdtemp(TEMP_OPTIONS, function (er, tmpdir) { t.ifError(er, "temp directory exists") testBase = path.resolve(tmpdir, SHORT_PATH) targetPath = path.resolve(tmpdir, TARGET_PATH) partialPath = path.resolve(tmpdir, PARTIAL_PATH) fullPath = path.resolve(tmpdir, FULL_PATH) mkdirp(partialPath, function (er) { t.ifError(er, "made test path") mkdirp(targetPath, function (er) { t.ifError(er, "made target path") writeFileSync(path.resolve(tmpdir, FIRST_FILE), new Buffer("c'est vraiment joli")) writeFileSync(path.resolve(tmpdir, SECOND_FILE), new Buffer("oui oui")) symlinkSync(targetPath, fullPath) t.end() }) }) }) }) test("remove up to a point", function (t) { vacuum(fullPath, {purge : true, base : testBase, log : log}, function (er) { t.ifError(er, "cleaned up to base") t.equal(messages.length, 5, "got 4 removal & 1 finish message") t.equal(messages[0], "purging " + fullPath) t.equal(messages[4], "finished vacuuming up to " + testBase) var stat var verifyPath = fullPath function verify() { stat = statSync(verifyPath) } for (var i = 0; i < 4; i++) { t.throws(verify, verifyPath + " cannot be statted") t.notOk(stat && stat.isDirectory(), verifyPath + " is totally gone") verifyPath = path.dirname(verifyPath) } t.doesNotThrow(function () { stat = statSync(testBase) }, testBase + " can be statted") t.ok(stat && stat.isDirectory(), testBase + " is still a directory") t.end() }) }) npm_3.5.2.orig/node_modules/fs-vacuum/test/no-entries-with-purge.js0000644000000000000000000000405012631326456023545 0ustar 00000000000000var path = require("path") var test = require("tap").test var statSync = 
require("graceful-fs").statSync var writeFileSync = require("graceful-fs").writeFileSync var mkdtemp = require("tmp").dir var mkdirp = require("mkdirp") var vacuum = require("../vacuum.js") // CONSTANTS var TEMP_OPTIONS = { unsafeCleanup : true, mode : "0700" } var SHORT_PATH = path.join("i", "am", "a", "path") var LONG_PATH = path.join(SHORT_PATH, "of", "a", "certain", "kind") var FIRST_FILE = path.join(LONG_PATH, "monsieurs") var SECOND_FILE = path.join(LONG_PATH, "mesdames") var messages = [] function log() { messages.push(Array.prototype.slice.call(arguments).join(" ")) } var testPath, testBase test("xXx setup xXx", function (t) { mkdtemp(TEMP_OPTIONS, function (er, tmpdir) { t.ifError(er, "temp directory exists") testBase = path.resolve(tmpdir, SHORT_PATH) testPath = path.resolve(tmpdir, LONG_PATH) mkdirp(testPath, function (er) { t.ifError(er, "made test path") writeFileSync(path.resolve(tmpdir, FIRST_FILE), new Buffer("c'est vraiment joli")) writeFileSync(path.resolve(tmpdir, SECOND_FILE), new Buffer("oui oui")) t.end() }) }) }) test("remove up to a point", function (t) { vacuum(testPath, {purge : true, base : testBase, log : log}, function (er) { t.ifError(er, "cleaned up to base") t.equal(messages.length, 5, "got 4 removal & 1 finish message") t.equal(messages[0], "purging " + testPath) t.equal(messages[4], "finished vacuuming up to " + testBase) var stat var verifyPath = testPath function verify() { stat = statSync(verifyPath) } for (var i = 0; i < 4; i++) { t.throws(verify, verifyPath + " cannot be statted") t.notOk(stat && stat.isDirectory(), verifyPath + " is totally gone") verifyPath = path.dirname(verifyPath) } t.doesNotThrow(function () { stat = statSync(testBase) }, testBase + " can be statted") t.ok(stat && stat.isDirectory(), testBase + " is still a directory") t.end() }) }) npm_3.5.2.orig/node_modules/fs-vacuum/test/other-directories-no-purge.js0000644000000000000000000000425312631326456024563 0ustar 00000000000000var path = require("path") var test = require("tap").test var statSync = require("graceful-fs").statSync var mkdtemp = require("tmp").dir var mkdirp = require("mkdirp") var vacuum = require("../vacuum.js") // CONSTANTS var TEMP_OPTIONS = { unsafeCleanup : true, mode : "0700" } var SHORT_PATH = path.join("i", "am", "a", "path") var REMOVE_PATH = path.join(SHORT_PATH, "of", "a", "certain", "length") var OTHER_PATH = path.join(SHORT_PATH, "of", "no", "qualities") var messages = [] function log() { messages.push(Array.prototype.slice.call(arguments).join(" ")) } var testBase, testPath, otherPath test("xXx setup xXx", function (t) { mkdtemp(TEMP_OPTIONS, function (er, tmpdir) { t.ifError(er, "temp directory exists") testBase = path.resolve(tmpdir, SHORT_PATH) testPath = path.resolve(tmpdir, REMOVE_PATH) otherPath = path.resolve(tmpdir, OTHER_PATH) mkdirp(testPath, function (er) { t.ifError(er, "made test path") mkdirp(otherPath, function (er) { t.ifError(er, "made other path") t.end() }) }) }) }) test("remove up to a point", function (t) { vacuum(testPath, {purge : false, base : testBase, log : log}, function (er) { t.ifError(er, "cleaned up to base") t.equal(messages.length, 4, "got 3 removal & 1 finish message") t.equal( messages[3], "quitting because other entries in " + testBase + "/of", "got expected final message" ) var stat var verifyPath = testPath function verify() { stat = statSync(verifyPath) } for (var i = 0; i < 3; i++) { t.throws(verify, verifyPath + " cannot be statted") t.notOk(stat && stat.isDirectory(), verifyPath + " is totally gone") verifyPath 
= path.dirname(verifyPath) } t.doesNotThrow(function () { stat = statSync(otherPath) }, otherPath + " can be statted") t.ok(stat && stat.isDirectory(), otherPath + " is still a directory") var intersection = path.join(testBase, "of") t.doesNotThrow(function () { stat = statSync(intersection) }, intersection + " can be statted") t.ok(stat && stat.isDirectory(), intersection + " is still a directory") t.end() }) }) npm_3.5.2.orig/node_modules/fs-write-stream-atomic/.npmignore0000644000000000000000000000004512631326456022450 0ustar 00000000000000node_modules/ coverage/ .nyc_output/ npm_3.5.2.orig/node_modules/fs-write-stream-atomic/.travis.yml0000644000000000000000000000021512631326456022561 0ustar 00000000000000language: node_js sudo: false before_install: - "npm -g install npm" node_js: - "0.8" - "0.10" - "0.12" - "iojs" - "4" - "5" npm_3.5.2.orig/node_modules/fs-write-stream-atomic/LICENSE0000644000000000000000000000137512631326456021465 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/fs-write-stream-atomic/README.md0000644000000000000000000000201112631326456021723 0ustar 00000000000000# fs-write-stream-atomic Like `fs.createWriteStream(...)`, but atomic. Writes to a tmp file and does an atomic `fs.rename` to move it into place when it's done. First rule of debugging: **It's always a race condition.** ## USAGE ```javascript var fsWriteStreamAtomic = require('fs-write-stream-atomic') // options are optional. var write = fsWriteStreamAtomic('output.txt', options) var read = fs.createReadStream('input.txt') read.pipe(write) // When the write stream emits a 'finish' or 'close' event, // you can be sure that it is moved into place, and contains // all the bytes that were written to it, even if something else // was writing to `output.txt` at the same time. ``` ### `fsWriteStreamAtomic(filename, [options])` * `filename` {String} The file we want to write to * `options` {Object} * `chown` {Object} User and group to set ownership after write * `uid` {Number} * `gid` {Number} * `encoding` {String} default = 'utf8' * `mode` {Number} default = `0666` * `flags` {String} default = `'w'` npm_3.5.2.orig/node_modules/fs-write-stream-atomic/index.js0000644000000000000000000000463512631326456022127 0ustar 00000000000000var fs = require('graceful-fs') var util = require('util') var MurmurHash3 = require('imurmurhash') function murmurhex () { var hash = MurmurHash3('') for (var ii = 0; ii < arguments.length; ++ii) { hash.hash(hash + arguments[ii]) } return hash.result() } var invocations = 0 function getTmpname (filename) { return filename + '.' 
+ murmurhex(__filename, process.pid, ++invocations) } module.exports = WriteStream util.inherits(WriteStream, fs.WriteStream) function WriteStream (path, options) { if (!options) options = {} if (!(this instanceof WriteStream)) { return new WriteStream(path, options) } this.__atomicTarget = path this.__atomicChown = options.chown this.__atomicDidStuff = false this.__atomicTmp = getTmpname(path) fs.WriteStream.call(this, this.__atomicTmp, options) } function cleanup (er) { fs.unlink(this.__atomicTmp, function () { fs.WriteStream.prototype.emit.call(this, 'error', er) }.bind(this)) } function cleanupSync () { try { fs.unlinkSync(this.__atomicTmp) } finally { return } } // When we *would* emit 'close' or 'finish', instead do our stuff WriteStream.prototype.emit = function (ev) { if (ev === 'error') cleanupSync.call(this) if (ev !== 'close' && ev !== 'finish') { return fs.WriteStream.prototype.emit.apply(this, arguments) } // We handle emitting finish and close after the rename. if (ev === 'close' || ev === 'finish') { if (!this.__atomicDidStuff) { atomicDoStuff.call(this, function (er) { if (er) cleanup.call(this, er) }.bind(this)) } } } function atomicDoStuff (cb) { if (this.__atomicDidStuff) { throw new Error('Already did atomic move-into-place') } this.__atomicDidStuff = true if (this.__atomicChown) { var uid = this.__atomicChown.uid var gid = this.__atomicChown.gid return fs.chown(this.__atomicTmp, uid, gid, function (er) { if (er) return cb(er) moveIntoPlace.call(this, cb) }.bind(this)) } else { moveIntoPlace.call(this, cb) } } function moveIntoPlace (cb) { fs.rename(this.__atomicTmp, this.__atomicTarget, function (er) { cb(er) // emit finish, and then close on the next tick // This makes finish/close consistent across Node versions also. fs.WriteStream.prototype.emit.call(this, 'finish') process.nextTick(function () { fs.WriteStream.prototype.emit.call(this, 'close') }.bind(this)) }.bind(this)) } npm_3.5.2.orig/node_modules/fs-write-stream-atomic/package.json0000644000000000000000000000434212631326456022743 0ustar 00000000000000{ "_args": [ [ "fs-write-stream-atomic@^1.0.5", "/Users/ogd/Documents/projects/npm/npm" ] ], "_from": "fs-write-stream-atomic@>=1.0.5 <2.0.0", "_id": "fs-write-stream-atomic@1.0.5", "_inCache": true, "_installable": true, "_location": "/fs-write-stream-atomic", "_nodeVersion": "5.1.0", "_npmUser": { "email": "ogd@aoaioxxysz.net", "name": "othiym23" }, "_npmVersion": "3.5.1", "_phantomChildren": {}, "_requested": { "name": "fs-write-stream-atomic", "raw": "fs-write-stream-atomic@^1.0.5", "rawSpec": "^1.0.5", "scope": null, "spec": ">=1.0.5 <2.0.0", "type": "range" }, "_requiredBy": [ "/" ], "_shasum": "862a4dabdffcafabfc16499458e37310c39925f6", "_shrinkwrap": null, "_spec": "fs-write-stream-atomic@^1.0.5", "_where": "/Users/ogd/Documents/projects/npm/npm", "author": { "email": "i@izs.me", "name": "Isaac Z. 
Schlueter", "url": "http://blog.izs.me/" }, "bugs": { "url": "https://github.com/npm/fs-write-stream-atomic/issues" }, "dependencies": { "graceful-fs": "^4.1.2", "imurmurhash": "^0.1.4" }, "description": "Like `fs.createWriteStream(...)`, but atomic.", "devDependencies": { "standard": "^5.4.1", "tap": "^2.3.1" }, "directories": { "test": "test" }, "dist": { "shasum": "862a4dabdffcafabfc16499458e37310c39925f6", "tarball": "http://registry.npmjs.org/fs-write-stream-atomic/-/fs-write-stream-atomic-1.0.5.tgz" }, "gitHead": "1bc752bf0e0d5b7aaaad7be696dbc0e4ea64258c", "homepage": "https://github.com/npm/fs-write-stream-atomic", "license": "ISC", "main": "index.js", "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" }, { "name": "isaacs", "email": "i@izs.me" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, { "name": "zkat", "email": "kat@sykosomatic.org" } ], "name": "fs-write-stream-atomic", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git+https://github.com/npm/fs-write-stream-atomic.git" }, "scripts": { "test": "standard && tap --coverage test/*.js" }, "version": "1.0.5" } npm_3.5.2.orig/node_modules/fs-write-stream-atomic/test/0000755000000000000000000000000012631326456021431 5ustar 00000000000000npm_3.5.2.orig/node_modules/fs-write-stream-atomic/test/basic.js0000644000000000000000000000462412631326456023056 0ustar 00000000000000var test = require('tap').test var writeStream = require('../index.js') var fs = require('fs') var path = require('path') test('basic', function (t) { // open 10 write streams to the same file. // then write to each of them, and to the target // and verify at the end that each of them does their thing var target = path.resolve(__dirname, 'test.txt') var n = 10 var streams = [] for (var i = 0; i < n; i++) { var s = writeStream(target) s.on('finish', verifier('finish')) s.on('close', verifier('close')) streams.push(s) } var verifierCalled = 0 function verifier (ev) { return function () { if (ev === 'close') { t.equal(this.__emittedFinish, true) } else { this.__emittedFinish = true t.equal(ev, 'finish') } // make sure that one of the atomic streams won. var res = fs.readFileSync(target, 'utf8') var lines = res.trim().split(/\n/) lines.forEach(function (line) { var first = lines[0].match(/\d+$/)[0] var cur = line.match(/\d+$/)[0] t.equal(cur, first) }) var resExpr = /^first write \d+\nsecond write \d+\nthird write \d+\nfinal write \d+\n$/ t.similar(res, resExpr) // should be called once for each close, and each finish if (++verifierCalled === n * 2) { t.end() } } } // now write something to each stream. streams.forEach(function (stream, i) { stream.write('first write ' + i + '\n') }) // wait a sec for those writes to go out. setTimeout(function () { // write something else to the target. fs.writeFileSync(target, 'brutality!\n') // write some more stuff. streams.forEach(function (stream, i) { stream.write('second write ' + i + '\n') }) setTimeout(function () { // Oops! Deleted the file! fs.unlinkSync(target) // write some more stuff. 
streams.forEach(function (stream, i) { stream.write('third write ' + i + '\n') }) setTimeout(function () { fs.writeFileSync(target, 'brutality TWO!\n') streams.forEach(function (stream, i) { stream.end('final write ' + i + '\n') }) }, 50) }, 50) }, 50) }) test('cleanup', function (t) { fs.readdirSync(__dirname).filter(function (f) { return f.match(/^test.txt/) }).forEach(function (file) { fs.unlinkSync(path.resolve(__dirname, file)) }) t.end() }) npm_3.5.2.orig/node_modules/fs-write-stream-atomic/test/toolong.js0000644000000000000000000000142712631326456023454 0ustar 00000000000000var path = require('path') var test = require('tap').test var writeStream = require('../index.js') function repeat (times, string) { var output = '' for (var ii = 0; ii < times; ++ii) { output += string } return output } var target = path.resolve(__dirname, repeat(1000, 'test')) test('name too long', function (t) { // 0.8 streams smh if (process.version.indexOf('v0.8') !== -1) { t.plan(1) } else { t.plan(2) } var stream = writeStream(target) var hadError = false stream.on('error', function (er) { if (!hadError) { t.is(er.code, 'ENAMETOOLONG', target.length + ' character name results in ENAMETOOLONG') hadError = true } }) stream.on('close', function () { t.ok(hadError, 'got error before close') }) stream.end() }) npm_3.5.2.orig/node_modules/fstream/.npmignore0000644000000000000000000000011612631326456017605 0ustar 00000000000000.*.swp node_modules/ examples/deep-copy/ examples/path/ examples/filter-copy/ npm_3.5.2.orig/node_modules/fstream/.travis.yml0000644000000000000000000000021312631326456017715 0ustar 00000000000000language: node_js node_js: - iojs - 0.12 - 0.10 - 0.8 before_install: - "npm config set spin false" - "npm install -g npm/npm" npm_3.5.2.orig/node_modules/fstream/LICENSE0000644000000000000000000000137512631326456016623 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/fstream/README.md0000644000000000000000000000445112631326456017073 0ustar 00000000000000Like FS streams, but with stat on them, and supporting directories and symbolic links, as well as normal files. Also, you can use this to set the stats on a file, even if you don't change its contents, or to create a symlink, etc. So, for example, you can "write" a directory, and it'll call `mkdir`. You can specify a uid and gid, and it'll call `chown`. You can specify a `mtime` and `atime`, and it'll call `utimes`. You can call it a symlink and provide a `linkpath` and it'll call `symlink`. Note that it won't automatically resolve symbolic links. So, if you call `fstream.Reader('/some/symlink')` then you'll get an object that stats and then ends immediately (since it has no data). To follow symbolic links, do this: `fstream.Reader({path:'/some/symlink', follow: true })`. 
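For example, a minimal sketch of reading through a symlink (the path is hypothetical, and `fstream` is assumed to already be required):

```javascript
// With follow: true, the reader stats and streams the file the link
// points at, instead of ending immediately like an unfollowed symlink.
fstream
  .Reader({ path: "/some/symlink", follow: true })
  .on("data", function (chunk) {
    console.error("read %d bytes", chunk.length)
  })
```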
There are various checks to make sure that the bytes emitted are the same as the intended size, if the size is set. ## Examples ```javascript fstream .Writer({ path: "path/to/file" , mode: 0755 , size: 6 }) .write("hello\n") .end() ``` This will create the directories if they're missing, and then write `hello\n` into the file, chmod it to 0755, and assert that 6 bytes have been written when it's done. ```javascript fstream .Writer({ path: "path/to/file" , mode: 0755 , size: 6 , flags: "a" }) .write("hello\n") .end() ``` You can pass flags in, if you want to append to a file. ```javascript fstream .Writer({ path: "path/to/symlink" , linkpath: "./file" , SymbolicLink: true , mode: "0755" // octal strings supported }) .end() ``` If isSymbolicLink is a function, it'll be called, and if it returns true, then it'll treat it as a symlink. If it's not a function, then any truish value will make a symlink, or you can set `type: 'SymbolicLink'`, which does the same thing. Note that the linkpath is relative to the symbolic link location, not the parent dir or cwd. ```javascript fstream .Reader("path/to/dir") .pipe(fstream.Writer("path/to/other/dir")) ``` This will do like `cp -Rp path/to/dir path/to/other/dir`. If the other dir exists and isn't a directory, then it'll emit an error. It'll also set the uid, gid, mode, etc. to be identical. In this way, it's more like `rsync -a` than simply a copy. npm_3.5.2.orig/node_modules/fstream/examples/0000755000000000000000000000000012631326456017426 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream/fstream.js0000644000000000000000000000212512631326456017607 0ustar 00000000000000exports.Abstract = require('./lib/abstract.js') exports.Reader = require('./lib/reader.js') exports.Writer = require('./lib/writer.js') exports.File = { Reader: require('./lib/file-reader.js'), Writer: require('./lib/file-writer.js') } exports.Dir = { Reader: require('./lib/dir-reader.js'), Writer: require('./lib/dir-writer.js') } exports.Link = { Reader: require('./lib/link-reader.js'), Writer: require('./lib/link-writer.js') } exports.Proxy = { Reader: require('./lib/proxy-reader.js'), Writer: require('./lib/proxy-writer.js') } exports.Reader.Dir = exports.DirReader = exports.Dir.Reader exports.Reader.File = exports.FileReader = exports.File.Reader exports.Reader.Link = exports.LinkReader = exports.Link.Reader exports.Reader.Proxy = exports.ProxyReader = exports.Proxy.Reader exports.Writer.Dir = exports.DirWriter = exports.Dir.Writer exports.Writer.File = exports.FileWriter = exports.File.Writer exports.Writer.Link = exports.LinkWriter = exports.Link.Writer exports.Writer.Proxy = exports.ProxyWriter = exports.Proxy.Writer exports.collect = require('./lib/collect.js') npm_3.5.2.orig/node_modules/fstream/lib/0000755000000000000000000000000012631326456016356 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream/package.json0000644000000000000000000000321612631326456020100 0ustar 00000000000000{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "fstream", "description": "Advanced file system stream things", "version": "1.0.8", "repository": { "type": "git", "url": "git://github.com/isaacs/fstream.git" }, "main": "fstream.js", "engines": { "node": ">=0.6" }, "dependencies": { "graceful-fs": "^4.1.2", "inherits": "~2.0.0", "mkdirp": ">=0.5 0", "rimraf": "2" }, "devDependencies": { "standard": "^4.0.0", "tap": "^1.2.0" }, "scripts": { "test": "standard && tap examples/*.js" }, "license": "ISC", "gitHead": "d9f81146c50e687f1df04c1a0e7e4c173eb3dae2", "bugs": { "url": "https://github.com/isaacs/fstream/issues" }, "homepage": "https://github.com/isaacs/fstream#readme", "_id": "fstream@1.0.8", "_shasum": "7e8d7a73abb3647ef36e4b8a15ca801dba03d038", "_from": "fstream@>=1.0.8 <1.1.0", "_npmVersion": "2.14.3", "_nodeVersion": "2.2.2", "_npmUser": { "name": "zkat", "email": "kat@sykosomatic.org" }, "dist": { "shasum": "7e8d7a73abb3647ef36e4b8a15ca801dba03d038", "tarball": "http://registry.npmjs.org/fstream/-/fstream-1.0.8.tgz" }, "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" }, { "name": "isaacs", "email": "isaacs@npmjs.com" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, { "name": "zkat", "email": "kat@sykosomatic.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/fstream/-/fstream-1.0.8.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/fstream/examples/filter-pipe.js0000644000000000000000000000666512631326456022221 0ustar 00000000000000var fstream = require('../fstream.js') var path = require('path') var r = fstream.Reader({ path: path.dirname(__dirname), filter: function () { return !this.basename.match(/^\./) && !this.basename.match(/^node_modules$/) && !this.basename.match(/^deep-copy$/) && !this.basename.match(/^filter-copy$/) } }) // this writer will only write directories var w = fstream.Writer({ path: path.resolve(__dirname, 'filter-copy'), type: 'Directory', filter: function () { return this.type === 'Directory' } }) var indent = '' r.on('entry', appears) r.on('ready', function () { console.error('ready to begin!', r.path) }) function appears (entry) { console.error(indent + 'a %s appears!', entry.type, entry.basename, typeof entry.basename) if (foggy) { console.error('FOGGY!') var p = entry do { console.error(p.depth, p.path, p._paused) p = p.parent } while (p) throw new Error('\u001b[mshould not have entries while foggy') } indent += '\t' entry.on('data', missile(entry)) entry.on('end', runaway(entry)) entry.on('entry', appears) } var foggy function missile (entry) { function liftFog (who) { if (!foggy) return if (who) { console.error('%s breaks the spell!', who && who.path) } else { console.error('the spell expires!') } console.error('\u001b[mthe fog lifts!\n') clearTimeout(foggy) foggy = null if (entry._paused) entry.resume() } if (entry.type === 'Directory') { var ended = false entry.once('end', function () { ended = true }) return function (c) { // throw in some pathological pause()/resume() behavior // just for extra fun. process.nextTick(function () { if (!foggy && !ended) { // && Math.random() < 0.3) { console.error(indent + '%s casts a spell', entry.basename) console.error('\na slowing fog comes over the battlefield...\n\u001b[32m') entry.pause() entry.once('resume', liftFog) foggy = setTimeout(liftFog, 1000) } }) } } return function (c) { var e = Math.random() < 0.5 console.error(indent + '%s %s for %d damage!', entry.basename, e ? 
'is struck' : 'fires a chunk', c.length) } } function runaway (entry) { return function () { var e = Math.random() < 0.5 console.error(indent + '%s %s', entry.basename, e ? 'turns to flee' : 'is vanquished!') indent = indent.slice(0, -1) } } w.on('entry', attacks) // w.on('ready', function () { attacks(w) }) function attacks (entry) { console.error(indent + '%s %s!', entry.basename, entry.type === 'Directory' ? 'calls for backup' : 'attacks') entry.on('entry', attacks) } var ended = false var i = 1 r.on('end', function () { if (foggy) clearTimeout(foggy) console.error("\u001b[mIT'S OVER!!") console.error('A WINNAR IS YOU!') console.log('ok ' + (i++) + ' A WINNAR IS YOU') ended = true // now go through and verify that everything in there is a dir. var p = path.resolve(__dirname, 'filter-copy') var checker = fstream.Reader({ path: p }) checker.checker = true checker.on('child', function (e) { var ok = e.type === 'Directory' console.log((ok ? '' : 'not ') + 'ok ' + (i++) + ' should be a dir: ' + e.path.substr(checker.path.length + 1)) }) }) process.on('exit', function () { console.log((ended ? '' : 'not ') + 'ok ' + (i) + ' ended') console.log('1..' + i) }) r.pipe(w) npm_3.5.2.orig/node_modules/fstream/examples/pipe.js0000644000000000000000000000556312631326456020732 0ustar 00000000000000var fstream = require('../fstream.js') var path = require('path') var r = fstream.Reader({ path: path.dirname(__dirname), filter: function () { return !this.basename.match(/^\./) && !this.basename.match(/^node_modules$/) && !this.basename.match(/^deep-copy$/) } }) var w = fstream.Writer({ path: path.resolve(__dirname, 'deep-copy'), type: 'Directory' }) var indent = '' r.on('entry', appears) r.on('ready', function () { console.error('ready to begin!', r.path) }) function appears (entry) { console.error(indent + 'a %s appears!', entry.type, entry.basename, typeof entry.basename, entry) if (foggy) { console.error('FOGGY!') var p = entry do { console.error(p.depth, p.path, p._paused) p = p.parent } while (p) throw new Error('\u001b[mshould not have entries while foggy') } indent += '\t' entry.on('data', missile(entry)) entry.on('end', runaway(entry)) entry.on('entry', appears) } var foggy function missile (entry) { function liftFog (who) { if (!foggy) return if (who) { console.error('%s breaks the spell!', who && who.path) } else { console.error('the spell expires!') } console.error('\u001b[mthe fog lifts!\n') clearTimeout(foggy) foggy = null if (entry._paused) entry.resume() } if (entry.type === 'Directory') { var ended = false entry.once('end', function () { ended = true }) return function (c) { // throw in some pathological pause()/resume() behavior // just for extra fun. process.nextTick(function () { if (!foggy && !ended) { // && Math.random() < 0.3) { console.error(indent + '%s casts a spell', entry.basename) console.error('\na slowing fog comes over the battlefield...\n\u001b[32m') entry.pause() entry.once('resume', liftFog) foggy = setTimeout(liftFog, 10) } }) } } return function (c) { var e = Math.random() < 0.5 console.error(indent + '%s %s for %d damage!', entry.basename, e ? 'is struck' : 'fires a chunk', c.length) } } function runaway (entry) { return function () { var e = Math.random() < 0.5 console.error(indent + '%s %s', entry.basename, e ? 'turns to flee' : 'is vanquished!') indent = indent.slice(0, -1) } } w.on('entry', attacks) // w.on('ready', function () { attacks(w) }) function attacks (entry) { console.error(indent + '%s %s!', entry.basename, entry.type === 'Directory' ? 
'calls for backup' : 'attacks') entry.on('entry', attacks) } var ended = false r.on('end', function () { if (foggy) clearTimeout(foggy) console.error("\u001b[mIT'S OVER!!") console.error('A WINNAR IS YOU!') console.log('ok 1 A WINNAR IS YOU') ended = true }) process.on('exit', function () { console.log((ended ? '' : 'not ') + 'ok 2 ended') console.log('1..2') }) r.pipe(w) npm_3.5.2.orig/node_modules/fstream/examples/reader.js0000644000000000000000000000271012631326456021226 0ustar 00000000000000var fstream = require('../fstream.js') var tap = require('tap') var fs = require('fs') var path = require('path') var dir = path.dirname(__dirname) tap.test('reader test', function (t) { var children = -1 var gotReady = false var ended = false var r = fstream.Reader({ path: dir, filter: function () { // return this.parent === r return this.parent === r || this === r } }) r.on('ready', function () { gotReady = true children = fs.readdirSync(dir).length console.error('Setting expected children to ' + children) t.equal(r.type, 'Directory', 'should be a directory') }) r.on('entry', function (entry) { children-- if (!gotReady) { t.fail('children before ready!') } t.equal(entry.dirname, r.path, 'basename is parent dir') }) r.on('error', function (er) { t.fail(er) t.end() process.exit(1) }) r.on('end', function () { t.equal(children, 0, 'should have seen all children') ended = true }) var closed = false r.on('close', function () { t.ok(ended, 'saw end before close') t.notOk(closed, 'close should only happen once') closed = true t.end() }) }) tap.test('reader error test', function (t) { // assumes non-root on a *nix system var r = fstream.Reader({ path: '/etc/shadow' }) r.once('error', function (er) { t.ok(true) t.end() }) r.on('end', function () { t.fail('reader ended without error') t.end() }) }) npm_3.5.2.orig/node_modules/fstream/examples/symlink-write.js0000644000000000000000000000135112631326456022602 0ustar 00000000000000var fstream = require('../fstream.js') var notOpen = false process.chdir(__dirname) fstream .Writer({ path: 'path/to/symlink', linkpath: './file', isSymbolicLink: true, mode: '0755' // octal strings supported }) .on('close', function () { notOpen = true var fs = require('fs') var s = fs.lstatSync('path/to/symlink') var isSym = s.isSymbolicLink() console.log((isSym ? '' : 'not ') + 'ok 1 should be symlink') var t = fs.readlinkSync('path/to/symlink') var isTarget = t === './file' console.log((isTarget ? '' : 'not ') + 'ok 2 should link to ./file') }) .end() process.on('exit', function () { console.log((notOpen ? '' : 'not ') + 'ok 3 should be closed') console.log('1..3') }) npm_3.5.2.orig/node_modules/fstream/lib/abstract.js0000644000000000000000000000376012631326456020525 0ustar 00000000000000// the parent class for all fstreams. 
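// It provides the shared plumbing: a 'ready' listener attached after the
// stream is already ready still fires (see on() below), abort() latches
// the _aborted flag, and warn()/error() decorate errors with fstream_*
// properties for debugging.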
module.exports = Abstract var Stream = require('stream').Stream var inherits = require('inherits') function Abstract () { Stream.call(this) } inherits(Abstract, Stream) Abstract.prototype.on = function (ev, fn) { if (ev === 'ready' && this.ready) { process.nextTick(fn.bind(this)) } else { Stream.prototype.on.call(this, ev, fn) } return this } Abstract.prototype.abort = function () { this._aborted = true this.emit('abort') } Abstract.prototype.destroy = function () {} Abstract.prototype.warn = function (msg, code) { var self = this var er = decorate(msg, code, self) if (!self.listeners('warn')) { console.error('%s %s\n' + 'path = %s\n' + 'syscall = %s\n' + 'fstream_type = %s\n' + 'fstream_path = %s\n' + 'fstream_unc_path = %s\n' + 'fstream_class = %s\n' + 'fstream_stack =\n%s\n', code || 'UNKNOWN', er.stack, er.path, er.syscall, er.fstream_type, er.fstream_path, er.fstream_unc_path, er.fstream_class, er.fstream_stack.join('\n')) } else { self.emit('warn', er) } } Abstract.prototype.info = function (msg, code) { this.emit('info', msg, code) } Abstract.prototype.error = function (msg, code, th) { var er = decorate(msg, code, this) if (th) throw er else this.emit('error', er) } function decorate (er, code, self) { if (!(er instanceof Error)) er = new Error(er) er.code = er.code || code er.path = er.path || self.path er.fstream_type = er.fstream_type || self.type er.fstream_path = er.fstream_path || self.path if (self._path !== self.path) { er.fstream_unc_path = er.fstream_unc_path || self._path } if (self.linkpath) { er.fstream_linkpath = er.fstream_linkpath || self.linkpath } er.fstream_class = er.fstream_class || self.constructor.name er.fstream_stack = er.fstream_stack || new Error().stack.split(/\n/).slice(3).map(function (s) { return s.replace(/^ {4}at /, '') }) return er } npm_3.5.2.orig/node_modules/fstream/lib/collect.js0000644000000000000000000000325612631326456020347 0ustar 00000000000000module.exports = collect function collect (stream) { if (stream._collected) return stream._collected = true stream.pause() stream.on('data', save) stream.on('end', save) var buf = [] function save (b) { if (typeof b === 'string') b = new Buffer(b) if (Buffer.isBuffer(b) && !b.length) return buf.push(b) } stream.on('entry', saveEntry) var entryBuffer = [] function saveEntry (e) { collect(e) entryBuffer.push(e) } stream.on('proxy', proxyPause) function proxyPause (p) { p.pause() } // replace the pipe method with a new version that will // unlock the buffered stuff. if you just call .pipe() // without a destination, then it'll re-play the events. stream.pipe = (function (orig) { return function (dest) { // console.error(' === open the pipes', dest && dest.path) // let the entries flow through one at a time. // Once they're all done, then we can resume completely. 
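// Replay buffered entries one at a time: each entry's 'end' event
// advances to the next, and resume() below finally re-emits the
// buffered data/end events.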
var e = 0 ;(function unblockEntry () { var entry = entryBuffer[e++] // console.error(" ==== unblock entry", entry && entry.path) if (!entry) return resume() entry.on('end', unblockEntry) if (dest) dest.add(entry) else stream.emit('entry', entry) })() function resume () { stream.removeListener('entry', saveEntry) stream.removeListener('data', save) stream.removeListener('end', save) stream.pipe = orig if (dest) stream.pipe(dest) buf.forEach(function (b) { if (b) stream.emit('data', b) else stream.emit('end') }) stream.resume() } return dest } })(stream.pipe) } npm_3.5.2.orig/node_modules/fstream/lib/dir-reader.js0000644000000000000000000001454512631326456020743 0ustar 00000000000000// A thing that emits "entry" events with Reader objects // Pausing it causes it to stop emitting entry events, and also // pauses the current entry if there is one. module.exports = DirReader var fs = require('graceful-fs') var inherits = require('inherits') var path = require('path') var Reader = require('./reader.js') var assert = require('assert').ok inherits(DirReader, Reader) function DirReader (props) { var self = this if (!(self instanceof DirReader)) { throw new Error('DirReader must be called as constructor.') } // should already be established as a Directory type if (props.type !== 'Directory' || !props.Directory) { throw new Error('Non-directory type ' + props.type) } self.entries = null self._index = -1 self._paused = false self._length = -1 if (props.sort) { this.sort = props.sort } Reader.call(this, props) } DirReader.prototype._getEntries = function () { var self = this // race condition. might pause() before calling _getEntries, // and then resume, and try to get them a second time. if (self._gotEntries) return self._gotEntries = true fs.readdir(self._path, function (er, entries) { if (er) return self.error(er) self.entries = entries self.emit('entries', entries) if (self._paused) self.once('resume', processEntries) else processEntries() function processEntries () { self._length = self.entries.length if (typeof self.sort === 'function') { self.entries = self.entries.sort(self.sort.bind(self)) } self._read() } }) } // start walking the dir, and emit an "entry" event for each one. DirReader.prototype._read = function () { var self = this if (!self.entries) return self._getEntries() if (self._paused || self._currentEntry || self._aborted) { // console.error('DR paused=%j, current=%j, aborted=%j', self._paused, !!self._currentEntry, self._aborted) return } self._index++ if (self._index >= self.entries.length) { if (!self._ended) { self._ended = true self.emit('end') self.emit('close') } return } // ok, handle this one, then. // save creating a proxy, by stat'ing the thing now. var p = path.resolve(self._path, self.entries[self._index]) assert(p !== self._path) assert(self.entries[self._index]) // set this to prevent trying to _read() again in the stat time. self._currentEntry = p fs[ self.props.follow ? 'stat' : 'lstat' ](p, function (er, stat) { if (er) return self.error(er) var who = self._proxy || self stat.path = p stat.basename = path.basename(p) stat.dirname = path.dirname(p) var childProps = self.getChildProps.call(who, stat) childProps.path = p childProps.basename = path.basename(p) childProps.dirname = path.dirname(p) var entry = Reader(childProps, stat) // console.error("DR Entry", p, stat.size) self._currentEntry = entry // "entry" events are for direct entries in a specific dir. // "child" events are for any and all children at all levels. // This nomenclature is not completely final. 
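// Forward pause/resume from the current entry back up to this reader,
// so backpressure propagates in both directions (pause() below handles
// the reader-to-entry direction).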
entry.on('pause', function (who) { if (!self._paused && !entry._disowned) { self.pause(who) } }) entry.on('resume', function (who) { if (self._paused && !entry._disowned) { self.resume(who) } }) entry.on('stat', function (props) { self.emit('_entryStat', entry, props) if (entry._aborted) return if (entry._paused) { entry.once('resume', function () { self.emit('entryStat', entry, props) }) } else self.emit('entryStat', entry, props) }) entry.on('ready', function EMITCHILD () { // console.error("DR emit child", entry._path) if (self._paused) { // console.error(" DR emit child - try again later") // pause the child, and emit the "entry" event once we drain. // console.error("DR pausing child entry") entry.pause(self) return self.once('resume', EMITCHILD) } // skip over sockets. they can't be piped around properly, // so there's really no sense even acknowledging them. // if someone really wants to see them, they can listen to // the "socket" events. if (entry.type === 'Socket') { self.emit('socket', entry) } else { self.emitEntry(entry) } }) var ended = false entry.on('close', onend) entry.on('disown', onend) function onend () { if (ended) return ended = true self.emit('childEnd', entry) self.emit('entryEnd', entry) self._currentEntry = null if (!self._paused) { self._read() } } // XXX Remove this. Works in node as of 0.6.2 or so. // Long filenames should not break stuff. entry.on('error', function (er) { if (entry._swallowErrors) { self.warn(er) entry.emit('end') entry.emit('close') } else { self.emit('error', er) } }) // proxy up some events. ;[ 'child', 'childEnd', 'warn' ].forEach(function (ev) { entry.on(ev, self.emit.bind(self, ev)) }) }) } DirReader.prototype.disown = function (entry) { entry.emit('beforeDisown') entry._disowned = true entry.parent = entry.root = null if (entry === this._currentEntry) { this._currentEntry = null } entry.emit('disown') } DirReader.prototype.getChildProps = function () { return { depth: this.depth + 1, root: this.root || this, parent: this, follow: this.follow, filter: this.filter, sort: this.props.sort, hardlinks: this.props.hardlinks } } DirReader.prototype.pause = function (who) { var self = this if (self._paused) return who = who || self self._paused = true if (self._currentEntry && self._currentEntry.pause) { self._currentEntry.pause(who) } self.emit('pause', who) } DirReader.prototype.resume = function (who) { var self = this if (!self._paused) return who = who || self self._paused = false // console.error('DR Emit Resume', self._path) self.emit('resume', who) if (self._paused) { // console.error('DR Re-paused', self._path) return } if (self._currentEntry) { if (self._currentEntry.resume) self._currentEntry.resume(who) } else self._read() } DirReader.prototype.emitEntry = function (entry) { this.emit('entry', entry) this.emit('child', entry) } npm_3.5.2.orig/node_modules/fstream/lib/dir-writer.js0000644000000000000000000001062512631326456021010 0ustar 00000000000000// It is expected that, when .add() returns false, the consumer // of the DirWriter will pause until a "drain" event occurs. Note // that this is *almost always going to be the case*, unless the // thing being written is some sort of unsupported type, and thus // skipped over. 
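//
// A minimal sketch of that contract (illustrative only; `reader` and
// `writer` are assumed to be an fstream Reader feeding a DirWriter,
// not anything defined in this module):
//
//   reader.on('entry', function (entry) {
//     if (!writer.add(entry)) {
//       reader.pause()
//       writer.once('drain', function () { reader.resume() })
//     }
//   })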
module.exports = DirWriter var Writer = require('./writer.js') var inherits = require('inherits') var mkdir = require('mkdirp') var path = require('path') var collect = require('./collect.js') inherits(DirWriter, Writer) function DirWriter (props) { var self = this if (!(self instanceof DirWriter)) { self.error('DirWriter must be called as constructor.', null, true) } // should already be established as a Directory type if (props.type !== 'Directory' || !props.Directory) { self.error('Non-directory type ' + props.type + ' ' + JSON.stringify(props), null, true) } Writer.call(this, props) } DirWriter.prototype._create = function () { var self = this mkdir(self._path, Writer.dirmode, function (er) { if (er) return self.error(er) // ready to start getting entries! self.ready = true self.emit('ready') self._process() }) } // a DirWriter has an add(entry) method, but its .write() doesn't // do anything. Why a no-op rather than a throw? Because this // leaves open the door for writing directory metadata for // gnu/solaris style dumpdirs. DirWriter.prototype.write = function () { return true } DirWriter.prototype.end = function () { this._ended = true this._process() } DirWriter.prototype.add = function (entry) { var self = this // console.error('\tadd', entry._path, '->', self._path) collect(entry) if (!self.ready || self._currentEntry) { self._buffer.push(entry) return false } // create a new writer, and pipe the incoming entry into it. if (self._ended) { return self.error('add after end') } self._buffer.push(entry) self._process() return this._buffer.length === 0 } DirWriter.prototype._process = function () { var self = this // console.error('DW Process p=%j', self._processing, self.basename) if (self._processing) return var entry = self._buffer.shift() if (!entry) { // console.error("DW Drain") self.emit('drain') if (self._ended) self._finish() return } self._processing = true // console.error("DW Entry", entry._path) self.emit('entry', entry) // ok, add this entry // // don't allow recursive copying var p = entry var pp do { pp = p._path || p.path if (pp === self.root._path || pp === self._path || (pp && pp.indexOf(self._path) === 0)) { // console.error('DW Exit (recursive)', entry.basename, self._path) self._processing = false if (entry._collected) entry.pipe() return self._process() } p = p.parent } while (p) // console.error("DW not recursive") // chop off the entry's root dir, replace with ours var props = { parent: self, root: self.root || self, type: entry.type, depth: self.depth + 1 } pp = entry._path || entry.path || entry.props.path if (entry.parent) { pp = pp.substr(entry.parent._path.length + 1) } // get rid of any ../../ shenanigans props.path = path.join(self.path, path.join('/', pp)) // if i have a filter, the child should inherit it. props.filter = self.filter // all the rest of the stuff, copy over from the source. Object.keys(entry.props).forEach(function (k) { if (!props.hasOwnProperty(k)) { props[k] = entry.props[k] } }) // not sure at this point what kind of writer this is. var child = self._currentChild = new Writer(props) child.on('ready', function () { // console.error("DW Child Ready", child.type, child._path) // console.error(" resuming", entry._path) entry.pipe(child) entry.resume() }) // XXX Make this work in node. // Long filenames should not break stuff. 
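// Until then: when _swallowErrors is set (e.g. over-long Windows paths),
// downgrade the child's error to a warning and fake end/close so the
// copy can continue past the bad entry.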
child.on('error', function (er) { if (child._swallowErrors) { self.warn(er) child.emit('end') child.emit('close') } else { self.emit('error', er) } }) // we fire _end internally *after* end, so that we don't move on // until any "end" listeners have had their chance to do stuff. child.on('close', onend) var ended = false function onend () { if (ended) return ended = true // console.error("* DW Child end", child.basename) self._currentChild = null self._processing = false self._process() } } npm_3.5.2.orig/node_modules/fstream/lib/file-reader.js0000644000000000000000000000762012631326456021100 0ustar 00000000000000// Basically just a wrapper around an fs.ReadStream module.exports = FileReader var fs = require('graceful-fs') var inherits = require('inherits') var Reader = require('./reader.js') var EOF = {EOF: true} var CLOSE = {CLOSE: true} inherits(FileReader, Reader) function FileReader (props) { // console.error(" FR create", props.path, props.size, new Error().stack) var self = this if (!(self instanceof FileReader)) { throw new Error('FileReader must be called as constructor.') } // should already be established as a File type // XXX Todo: preserve hardlinks by tracking dev+inode+nlink, // with a HardLinkReader class. if (!((props.type === 'Link' && props.Link) || (props.type === 'File' && props.File))) { throw new Error('Non-file type ' + props.type) } self._buffer = [] self._bytesEmitted = 0 Reader.call(self, props) } FileReader.prototype._getStream = function () { var self = this var stream = self._stream = fs.createReadStream(self._path, self.props) if (self.props.blksize) { stream.bufferSize = self.props.blksize } stream.on('open', self.emit.bind(self, 'open')) stream.on('data', function (c) { // console.error('\t\t%d %s', c.length, self.basename) self._bytesEmitted += c.length // no point saving empty chunks if (!c.length) { return } else if (self._paused || self._buffer.length) { self._buffer.push(c) self._read() } else self.emit('data', c) }) stream.on('end', function () { if (self._paused || self._buffer.length) { // console.error('FR Buffering End', self._path) self._buffer.push(EOF) self._read() } else { self.emit('end') } if (self._bytesEmitted !== self.props.size) { self.error("Didn't get expected byte count\n" + 'expect: ' + self.props.size + '\n' + 'actual: ' + self._bytesEmitted) } }) stream.on('close', function () { if (self._paused || self._buffer.length) { // console.error('FR Buffering Close', self._path) self._buffer.push(CLOSE) self._read() } else { // console.error('FR close 1', self._path) self.emit('close') } }) stream.on('error', function (e) { self.emit('error', e) }) self._read() } FileReader.prototype._read = function () { var self = this // console.error('FR _read', self._path) if (self._paused) { // console.error('FR _read paused', self._path) return } if (!self._stream) { // console.error('FR _getStream calling', self._path) return self._getStream() } // clear out the buffer, if there is one. 
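// The buffer may hold data chunks plus the EOF/CLOSE sentinels queued
// while paused; replay them in order, and re-pause mid-replay if a
// handler pauses us again.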
if (self._buffer.length) { // console.error('FR _read has buffer', self._buffer.length, self._path) var buf = self._buffer for (var i = 0, l = buf.length; i < l; i++) { var c = buf[i] if (c === EOF) { // console.error('FR Read emitting buffered end', self._path) self.emit('end') } else if (c === CLOSE) { // console.error('FR Read emitting buffered close', self._path) self.emit('close') } else { // console.error('FR Read emitting buffered data', self._path) self.emit('data', c) } if (self._paused) { // console.error('FR Read Re-pausing at '+i, self._path) self._buffer = buf.slice(i) return } } self._buffer.length = 0 } // console.error("FR _read done") // that's about all there is to it. } FileReader.prototype.pause = function (who) { var self = this // console.error('FR Pause', self._path) if (self._paused) return who = who || self self._paused = true if (self._stream) self._stream.pause() self.emit('pause', who) } FileReader.prototype.resume = function (who) { var self = this // console.error('FR Resume', self._path) if (!self._paused) return who = who || self self.emit('resume', who) self._paused = false if (self._stream) self._stream.resume() self._read() } npm_3.5.2.orig/node_modules/fstream/lib/file-writer.js0000644000000000000000000000504112631326456021145 0ustar 00000000000000module.exports = FileWriter var fs = require('graceful-fs') var Writer = require('./writer.js') var inherits = require('inherits') var EOF = {} inherits(FileWriter, Writer) function FileWriter (props) { var self = this if (!(self instanceof FileWriter)) { throw new Error('FileWriter must be called as constructor.') } // should already be established as a File type if (props.type !== 'File' || !props.File) { throw new Error('Non-file type ' + props.type) } self._buffer = [] self._bytesWritten = 0 Writer.call(this, props) } FileWriter.prototype._create = function () { var self = this if (self._stream) return var so = {} if (self.props.flags) so.flags = self.props.flags so.mode = Writer.filemode if (self._old && self._old.blksize) so.bufferSize = self._old.blksize self._stream = fs.createWriteStream(self._path, so) self._stream.on('open', function () { // console.error("FW open", self._buffer, self._path) self.ready = true self._buffer.forEach(function (c) { if (c === EOF) self._stream.end() else self._stream.write(c) }) self.emit('ready') // give this a kick just in case it needs it. self.emit('drain') }) self._stream.on('error', function (er) { self.emit('error', er) }) self._stream.on('drain', function () { self.emit('drain') }) self._stream.on('close', function () { // console.error('\n\nFW Stream Close', self._path, self.size) self._finish() }) } FileWriter.prototype.write = function (c) { var self = this self._bytesWritten += c.length if (!self.ready) { if (!Buffer.isBuffer(c) && typeof c !== 'string') { throw new Error('invalid write data') } self._buffer.push(c) return false } var ret = self._stream.write(c) // console.error('\t-- fw wrote, _stream says', ret, self._stream._queue.length) // allow 2 buffered writes, because otherwise there's just too // much stop and go bs. 
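// (_stream._queue only exists on older node WriteStreams; when it is
// present, keep reporting "writable" until more than two chunks are
// queued.)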
if (ret === false && self._stream._queue) { return self._stream._queue.length <= 2 } else { return ret } } FileWriter.prototype.end = function (c) { var self = this if (c) self.write(c) if (!self.ready) { self._buffer.push(EOF) return false } return self._stream.end() } FileWriter.prototype._finish = function () { var self = this if (typeof self.size === 'number' && self._bytesWritten !== self.size) { self.error( 'Did not get expected byte count.\n' + 'expect: ' + self.size + '\n' + 'actual: ' + self._bytesWritten) } Writer.prototype._finish.call(self) } npm_3.5.2.orig/node_modules/fstream/lib/get-type.js0000644000000000000000000000114212631326456020450 0ustar 00000000000000module.exports = getType function getType (st) { var types = [ 'Directory', 'File', 'SymbolicLink', 'Link', // special for hardlinks from tarballs 'BlockDevice', 'CharacterDevice', 'FIFO', 'Socket' ] var type if (st.type && types.indexOf(st.type) !== -1) { st[st.type] = true return st.type } for (var i = 0, l = types.length; i < l; i++) { type = types[i] var is = st[type] || st['is' + type] if (typeof is === 'function') is = is.call(st) if (is) { st[type] = true st.type = type return type } } return null } npm_3.5.2.orig/node_modules/fstream/lib/link-reader.js0000644000000000000000000000274312631326456021117 0ustar 00000000000000// Basically just a wrapper around an fs.readlink // // XXX: Enhance this to support the Link type, by keeping // a lookup table of {:}, so that hardlinks // can be preserved in tarballs. module.exports = LinkReader var fs = require('graceful-fs') var inherits = require('inherits') var Reader = require('./reader.js') inherits(LinkReader, Reader) function LinkReader (props) { var self = this if (!(self instanceof LinkReader)) { throw new Error('LinkReader must be called as constructor.') } if (!((props.type === 'Link' && props.Link) || (props.type === 'SymbolicLink' && props.SymbolicLink))) { throw new Error('Non-link type ' + props.type) } Reader.call(self, props) } // When piping a LinkReader into a LinkWriter, we have to // already have the linkpath property set, so that has to // happen *before* the "ready" event, which means we need to // override the _stat method. LinkReader.prototype._stat = function (currentStat) { var self = this fs.readlink(self._path, function (er, linkpath) { if (er) return self.error(er) self.linkpath = self.props.linkpath = linkpath self.emit('linkpath', linkpath) Reader.prototype._stat.call(self, currentStat) }) } LinkReader.prototype._read = function () { var self = this if (self._paused) return // basically just a no-op, since we got all the info we need // from the _stat method if (!self._ended) { self.emit('end') self.emit('close') self._ended = true } } npm_3.5.2.orig/node_modules/fstream/lib/link-writer.js0000644000000000000000000000543612631326456021173 0ustar 00000000000000module.exports = LinkWriter var fs = require('graceful-fs') var Writer = require('./writer.js') var inherits = require('inherits') var path = require('path') var rimraf = require('rimraf') inherits(LinkWriter, Writer) function LinkWriter (props) { var self = this if (!(self instanceof LinkWriter)) { throw new Error('LinkWriter must be called as constructor.') } // should already be established as a Link type if (!((props.type === 'Link' && props.Link) || (props.type === 'SymbolicLink' && props.SymbolicLink))) { throw new Error('Non-link type ' + props.type) } if (props.linkpath === '') props.linkpath = '.' 
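// (an empty-string linkpath was normalized to '.' above, so only a
// genuinely absent linkpath is treated as an error here)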
if (!props.linkpath) { self.error('Need linkpath property to create ' + props.type) } Writer.call(this, props) } LinkWriter.prototype._create = function () { // console.error(" LW _create") var self = this var hard = self.type === 'Link' || process.platform === 'win32' var link = hard ? 'link' : 'symlink' var lp = hard ? path.resolve(self.dirname, self.linkpath) : self.linkpath // can only change the link path by clobbering // For hard links, let's just assume that's always the case, since // there's no good way to read them if we don't already know. if (hard) return clobber(self, lp, link) fs.readlink(self._path, function (er, p) { // only skip creation if it's exactly the same link if (p && p === lp) return finish(self) clobber(self, lp, link) }) } function clobber (self, lp, link) { rimraf(self._path, function (er) { if (er) return self.error(er) create(self, lp, link) }) } function create (self, lp, link) { fs[link](lp, self._path, function (er) { // if this is a hard link, and we're in the process of writing out a // directory, it's very possible that the thing we're linking to // doesn't exist yet (especially if it was intended as a symlink), // so swallow ENOENT errors here and just soldier in. // Additionally, an EPERM or EACCES can happen on win32 if it's trying // to make a link to a directory. Again, just skip it. // A better solution would be to have fs.symlink be supported on // windows in some nice fashion. if (er) { if ((er.code === 'ENOENT' || er.code === 'EACCES' || er.code === 'EPERM') && process.platform === 'win32') { self.ready = true self.emit('ready') self.emit('end') self.emit('close') self.end = self._finish = function () {} } else return self.error(er) } finish(self) }) } function finish (self) { self.ready = true self.emit('ready') if (self._ended && !self._finished) self._finish() } LinkWriter.prototype.end = function () { // console.error("LW finish in end") this._ended = true if (this.ready) { this._finished = true this._finish() } } npm_3.5.2.orig/node_modules/fstream/lib/proxy-reader.js0000644000000000000000000000372412631326456021343 0ustar 00000000000000// A reader for when we don't yet know what kind of thing // the thing is. module.exports = ProxyReader var Reader = require('./reader.js') var getType = require('./get-type.js') var inherits = require('inherits') var fs = require('graceful-fs') inherits(ProxyReader, Reader) function ProxyReader (props) { var self = this if (!(self instanceof ProxyReader)) { throw new Error('ProxyReader must be called as constructor.') } self.props = props self._buffer = [] self.ready = false Reader.call(self, props) } ProxyReader.prototype._stat = function () { var self = this var props = self.props // stat the thing to see what the proxy should be. var stat = props.follow ? 
'stat' : 'lstat' fs[stat](props.path, function (er, current) { var type if (er || !current) { type = 'File' } else { type = getType(current) } props[type] = true props.type = self.type = type self._old = current self._addProxy(Reader(props, current)) }) } ProxyReader.prototype._addProxy = function (proxy) { var self = this if (self._proxyTarget) { return self.error('proxy already set') } self._proxyTarget = proxy proxy._proxy = self ;[ 'error', 'data', 'end', 'close', 'linkpath', 'entry', 'entryEnd', 'child', 'childEnd', 'warn', 'stat' ].forEach(function (ev) { // console.error('~~ proxy event', ev, self.path) proxy.on(ev, self.emit.bind(self, ev)) }) self.emit('proxy', proxy) proxy.on('ready', function () { // console.error("~~ proxy is ready!", self.path) self.ready = true self.emit('ready') }) var calls = self._buffer self._buffer.length = 0 calls.forEach(function (c) { proxy[c[0]].apply(proxy, c[1]) }) } ProxyReader.prototype.pause = function () { return this._proxyTarget ? this._proxyTarget.pause() : false } ProxyReader.prototype.resume = function () { return this._proxyTarget ? this._proxyTarget.resume() : false } npm_3.5.2.orig/node_modules/fstream/lib/proxy-writer.js0000644000000000000000000000461612631326456021416 0ustar 00000000000000// A writer for when we don't know what kind of thing // the thing is. That is, it's not explicitly set, // so we're going to make it whatever the thing already // is, or "File" // // Until then, collect all events. module.exports = ProxyWriter var Writer = require('./writer.js') var getType = require('./get-type.js') var inherits = require('inherits') var collect = require('./collect.js') var fs = require('fs') inherits(ProxyWriter, Writer) function ProxyWriter (props) { var self = this if (!(self instanceof ProxyWriter)) { throw new Error('ProxyWriter must be called as constructor.') } self.props = props self._needDrain = false Writer.call(self, props) } ProxyWriter.prototype._stat = function () { var self = this var props = self.props // stat the thing to see what the proxy should be. var stat = props.follow ? 
'stat' : 'lstat' fs[stat](props.path, function (er, current) { var type if (er || !current) { type = 'File' } else { type = getType(current) } props[type] = true props.type = self.type = type self._old = current self._addProxy(Writer(props, current)) }) } ProxyWriter.prototype._addProxy = function (proxy) { // console.error("~~ set proxy", this.path) var self = this if (self._proxy) { return self.error('proxy already set') } self._proxy = proxy ;[ 'ready', 'error', 'close', 'pipe', 'drain', 'warn' ].forEach(function (ev) { proxy.on(ev, self.emit.bind(self, ev)) }) self.emit('proxy', proxy) var calls = self._buffer calls.forEach(function (c) { // console.error("~~ ~~ proxy buffered call", c[0], c[1]) proxy[c[0]].apply(proxy, c[1]) }) self._buffer.length = 0 if (self._needsDrain) self.emit('drain') } ProxyWriter.prototype.add = function (entry) { // console.error("~~ proxy add") collect(entry) if (!this._proxy) { this._buffer.push(['add', [entry]]) this._needDrain = true return false } return this._proxy.add(entry) } ProxyWriter.prototype.write = function (c) { // console.error('~~ proxy write') if (!this._proxy) { this._buffer.push(['write', [c]]) this._needDrain = true return false } return this._proxy.write(c) } ProxyWriter.prototype.end = function (c) { // console.error('~~ proxy end') if (!this._proxy) { this._buffer.push(['end', [c]]) return false } return this._proxy.end(c) } npm_3.5.2.orig/node_modules/fstream/lib/reader.js0000644000000000000000000001565412631326456020171 0ustar 00000000000000module.exports = Reader var fs = require('graceful-fs') var Stream = require('stream').Stream var inherits = require('inherits') var path = require('path') var getType = require('./get-type.js') var hardLinks = Reader.hardLinks = {} var Abstract = require('./abstract.js') // Must do this *before* loading the child classes inherits(Reader, Abstract) var LinkReader = require('./link-reader.js') function Reader (props, currentStat) { var self = this if (!(self instanceof Reader)) return new Reader(props, currentStat) if (typeof props === 'string') { props = { path: props } } if (!props.path) { self.error('Must provide a path', null, true) } // polymorphism. // call fstream.Reader(dir) to get a DirReader object, etc. // Note that, unlike in the Writer case, ProxyReader is going // to be the *normal* state of affairs, since we rarely know // the type of a file prior to reading it. var type var ClassType if (props.type && typeof props.type === 'function') { type = props.type ClassType = type } else { type = getType(props) ClassType = Reader } if (currentStat && !type) { type = getType(currentStat) props[type] = true props.type = type } switch (type) { case 'Directory': ClassType = require('./dir-reader.js') break case 'Link': // XXX hard links are just files. // However, it would be good to keep track of files' dev+inode // and nlink values, and create a HardLinkReader that emits // a linkpath value of the original copy, so that the tar // writer can preserve them. 
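// For now there is deliberately no break here: 'Link' falls through to
// the 'File' case below, so hard links are read as plain files.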
// ClassType = HardLinkReader // break case 'File': ClassType = require('./file-reader.js') break case 'SymbolicLink': ClassType = LinkReader break case 'Socket': ClassType = require('./socket-reader.js') break case null: ClassType = require('./proxy-reader.js') break } if (!(self instanceof ClassType)) { return new ClassType(props) } Abstract.call(self) self.readable = true self.writable = false self.type = type self.props = props self.depth = props.depth = props.depth || 0 self.parent = props.parent || null self.root = props.root || (props.parent && props.parent.root) || self self._path = self.path = path.resolve(props.path) if (process.platform === 'win32') { self.path = self._path = self.path.replace(/\?/g, '_') if (self._path.length >= 260) { // how DOES one create files on the moon? // if the path has spaces in it, then UNC will fail. self._swallowErrors = true // if (self._path.indexOf(" ") === -1) { self._path = '\\\\?\\' + self.path.replace(/\//g, '\\') // } } } self.basename = props.basename = path.basename(self.path) self.dirname = props.dirname = path.dirname(self.path) // these have served their purpose, and are now just noisy clutter props.parent = props.root = null // console.error("\n\n\n%s setting size to", props.path, props.size) self.size = props.size self.filter = typeof props.filter === 'function' ? props.filter : null if (props.sort === 'alpha') props.sort = alphasort // start the ball rolling. // this will stat the thing, and then call self._read() // to start reading whatever it is. // console.error("calling stat", props.path, currentStat) self._stat(currentStat) } function alphasort (a, b) { return a === b ? 0 : a.toLowerCase() > b.toLowerCase() ? 1 : a.toLowerCase() < b.toLowerCase() ? -1 : a > b ? 1 : -1 } Reader.prototype._stat = function (currentStat) { var self = this var props = self.props var stat = props.follow ? 'stat' : 'lstat' // console.error("Reader._stat", self._path, currentStat) if (currentStat) process.nextTick(statCb.bind(null, null, currentStat)) else fs[stat](self._path, statCb) function statCb (er, props_) { // console.error("Reader._stat, statCb", self._path, props_, props_.nlink) if (er) return self.error(er) Object.keys(props_).forEach(function (k) { props[k] = props_[k] }) // if it's not the expected size, then abort here. if (undefined !== self.size && props.size !== self.size) { return self.error('incorrect size') } self.size = props.size var type = getType(props) var handleHardlinks = props.hardlinks !== false // special little thing for handling hardlinks. if (handleHardlinks && type !== 'Directory' && props.nlink && props.nlink > 1) { var k = props.dev + ':' + props.ino // console.error("Reader has nlink", self._path, k) if (hardLinks[k] === self._path || !hardLinks[k]) { hardLinks[k] = self._path } else { // switch into hardlink mode. type = self.type = self.props.type = 'Link' self.Link = self.props.Link = true self.linkpath = self.props.linkpath = hardLinks[k] // console.error("Hardlink detected, switching mode", self._path, self.linkpath) // Setting __proto__ would arguably be the "correct" // approach here, but that just seems too wrong. self._stat = self._read = LinkReader.prototype._read } } if (self.type && self.type !== type) { self.error('Unexpected type: ' + type) } // if the filter doesn't pass, then just skip over this one. // still have to emit end so that dir-walking can move on. 
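// Filters run with `this` bound to the proxy when one exists, so a user
// filter sees the same object the fstream API originally handed back.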
if (self.filter) { var who = self._proxy || self // special handling for ProxyReaders if (!self.filter.call(who, who, props)) { if (!self._disowned) { self.abort() self.emit('end') self.emit('close') } return } } // last chance to abort or disown before the flow starts! var events = ['_stat', 'stat', 'ready'] var e = 0 ;(function go () { if (self._aborted) { self.emit('end') self.emit('close') return } if (self._paused && self.type !== 'Directory') { self.once('resume', go) return } var ev = events[e++] if (!ev) { return self._read() } self.emit(ev, props) go() })() } } Reader.prototype.pipe = function (dest) { var self = this if (typeof dest.add === 'function') { // piping to a multi-compatible, and we've got directory entries. self.on('entry', function (entry) { var ret = dest.add(entry) if (ret === false) { self.pause() } }) } // console.error("R Pipe apply Stream Pipe") return Stream.prototype.pipe.apply(this, arguments) } Reader.prototype.pause = function (who) { this._paused = true who = who || this this.emit('pause', who) if (this._stream) this._stream.pause(who) } Reader.prototype.resume = function (who) { this._paused = false who = who || this this.emit('resume', who) if (this._stream) this._stream.resume(who) this._read() } Reader.prototype._read = function () { this.error('Cannot read unknown type: ' + this.type) } npm_3.5.2.orig/node_modules/fstream/lib/socket-reader.js0000644000000000000000000000162312631326456021446 0ustar 00000000000000// Just get the stats, and then don't do anything. // You can't really "read" from a socket. You "connect" to it. // Mostly, this is here so that reading a dir with a socket in it // doesn't blow up. module.exports = SocketReader var inherits = require('inherits') var Reader = require('./reader.js') inherits(SocketReader, Reader) function SocketReader (props) { var self = this if (!(self instanceof SocketReader)) { throw new Error('SocketReader must be called as constructor.') } if (!(props.type === 'Socket' && props.Socket)) { throw new Error('Non-socket type ' + props.type) } Reader.call(self, props) } SocketReader.prototype._read = function () { var self = this if (self._paused) return // basically just a no-op, since we got all the info we have // from the _stat method if (!self._ended) { self.emit('end') self.emit('close') self._ended = true } } npm_3.5.2.orig/node_modules/fstream/lib/writer.js0000644000000000000000000002536712631326456020245 0ustar 00000000000000module.exports = Writer var fs = require('graceful-fs') var inherits = require('inherits') var rimraf = require('rimraf') var mkdir = require('mkdirp') var path = require('path') var umask = process.platform === 'win32' ? 0 : process.umask() var getType = require('./get-type.js') var Abstract = require('./abstract.js') // Must do this *before* loading the child classes inherits(Writer, Abstract) Writer.dirmode = parseInt('0777', 8) & (~umask) Writer.filemode = parseInt('0666', 8) & (~umask) var DirWriter = require('./dir-writer.js') var LinkWriter = require('./link-writer.js') var FileWriter = require('./file-writer.js') var ProxyWriter = require('./proxy-writer.js') // props is the desired state. current is optionally the current stat, // provided here so that subclasses can avoid statting the target // more than necessary. function Writer (props, current) { var self = this if (typeof props === 'string') { props = { path: props } } if (!props.path) self.error('Must provide a path', null, true) // polymorphism. // call fstream.Writer(dir) to get a DirWriter object, etc. 
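// (Editor's illustration of the dispatch below, hypothetical usage:
//   Writer({ path: 'out', type: 'Directory' })  // -> DirWriter
//   Writer({ path: 'a.txt', type: 'File' })     // -> FileWriter
//   Writer('unknown-thing')                     // -> ProxyWriter until the target is statted
// )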
var type = getType(props) var ClassType = Writer switch (type) { case 'Directory': ClassType = DirWriter break case 'File': ClassType = FileWriter break case 'Link': case 'SymbolicLink': ClassType = LinkWriter break case null: default: // Don't know yet what type to create, so we wrap in a proxy. ClassType = ProxyWriter break } if (!(self instanceof ClassType)) return new ClassType(props) // now get down to business. Abstract.call(self) // props is what we want to set. // set some convenience properties as well. self.type = props.type self.props = props self.depth = props.depth || 0 self.clobber = props.clobber === false ? props.clobber : true self.parent = props.parent || null self.root = props.root || (props.parent && props.parent.root) || self self._path = self.path = path.resolve(props.path) if (process.platform === 'win32') { self.path = self._path = self.path.replace(/\?/g, '_') if (self._path.length >= 260) { self._swallowErrors = true self._path = '\\\\?\\' + self.path.replace(/\//g, '\\') } } self.basename = path.basename(props.path) self.dirname = path.dirname(props.path) self.linkpath = props.linkpath || null props.parent = props.root = null // console.error("\n\n\n%s setting size to", props.path, props.size) self.size = props.size if (typeof props.mode === 'string') { props.mode = parseInt(props.mode, 8) } self.readable = false self.writable = true // buffer until ready, or while handling another entry self._buffer = [] self.ready = false self.filter = typeof props.filter === 'function' ? props.filter : null // start the ball rolling. // this checks what's there already, and then calls // self._create() to call the impl-specific creation stuff. self._stat(current) } // Calling this means that it's something we can't create. // Just assert that it's already there, otherwise raise a warning. Writer.prototype._create = function () { var self = this fs[self.props.follow ? 'stat' : 'lstat'](self._path, function (er) { if (er) { return self.warn('Cannot create ' + self._path + '\n' + 'Unsupported type: ' + self.type, 'ENOTSUP') } self._finish() }) } Writer.prototype._stat = function (current) { var self = this var props = self.props var stat = props.follow ? 'stat' : 'lstat' var who = self._proxy || self if (current) statCb(null, current) else fs[stat](self._path, statCb) function statCb (er, current) { if (self.filter && !self.filter.call(who, who, current)) { self._aborted = true self.emit('end') self.emit('close') return } // if it's not there, great. We'll just create it. // if it is there, then we'll need to change whatever differs if (er || !current) { return create(self) } self._old = current var currentType = getType(current) // if it's a type change, then we need to clobber or error. // if it's not a type change, then let the impl take care of it. if (currentType !== self.type) { return rimraf(self._path, function (er) { if (er) return self.error(er) self._old = null create(self) }) } // otherwise, just handle in the app-specific way // this creates a fs.WriteStream, or mkdir's, or whatever create(self) } } function create (self) { // console.error("W create", self._path, Writer.dirmode) // XXX Need to clobber non-dirs that are in the way, // unless { clobber: false } in the props. 
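// (Editor's note: `mkdir` here is mkdirp, whose callback's `made` argument is
// the topmost directory it actually had to create -- e.g. mkdirp('/a/b/c')
// with only /a existing yields made === '/a/b'. That value is stashed as
// self._madeDir below and settled later by endMadeDir().)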
mkdir(path.dirname(self._path), Writer.dirmode, function (er, made) { // console.error("W created", path.dirname(self._path), er) if (er) return self.error(er) // later on, we have to set the mode and owner for these self._madeDir = made return self._create() }) } function endChmod (self, want, current, path, cb) { var wantMode = want.mode var chmod = want.follow || self.type !== 'SymbolicLink' ? 'chmod' : 'lchmod' if (!fs[chmod]) return cb() if (typeof wantMode !== 'number') return cb() var curMode = current.mode & parseInt('0777', 8) wantMode = wantMode & parseInt('0777', 8) if (wantMode === curMode) return cb() fs[chmod](path, wantMode, cb) } function endChown (self, want, current, path, cb) { // Don't even try it unless root. Too easy to EPERM. if (process.platform === 'win32') return cb() if (!process.getuid || process.getuid() !== 0) return cb() if (typeof want.uid !== 'number' && typeof want.gid !== 'number') return cb() if (current.uid === want.uid && current.gid === want.gid) return cb() var chown = (self.props.follow || self.type !== 'SymbolicLink') ? 'chown' : 'lchown' if (!fs[chown]) return cb() if (typeof want.uid !== 'number') want.uid = current.uid if (typeof want.gid !== 'number') want.gid = current.gid fs[chown](path, want.uid, want.gid, cb) } function endUtimes (self, want, current, path, cb) { if (!fs.utimes || process.platform === 'win32') return cb() var utimes = (want.follow || self.type !== 'SymbolicLink') ? 'utimes' : 'lutimes' if (utimes === 'lutimes' && !fs[utimes]) { utimes = 'utimes' } if (!fs[utimes]) return cb() var curA = current.atime var curM = current.mtime var meA = want.atime var meM = want.mtime if (meA === undefined) meA = curA if (meM === undefined) meM = curM if (!isDate(meA)) meA = new Date(meA) if (!isDate(meM)) meM = new Date(meM) if (meA.getTime() === curA.getTime() && meM.getTime() === curM.getTime()) return cb() fs[utimes](path, meA, meM, cb) } // XXX This function is beastly. Break it up! Writer.prototype._finish = function () { var self = this if (self._finishing) return self._finishing = true // console.error(" W Finish", self._path, self.size) // set up all the things. // At this point, we're already done writing whatever we've gotta write, // adding files to the dir, etc. var todo = 0 var errState = null var done = false if (self._old) { // the times will almost *certainly* have changed. // adds the utimes syscall, but removes another stat. self._old.atime = new Date(0) self._old.mtime = new Date(0) // console.error(" W Finish Stale Stat", self._path, self.size) setProps(self._old) } else { var stat = self.props.follow ? 'stat' : 'lstat' // console.error(" W Finish Stating", self._path, self.size) fs[stat](self._path, function (er, current) { // console.error(" W Finish Stated", self._path, self.size, current) if (er) { // if we're in the process of writing out a // directory, it's very possible that the thing we're linking to // doesn't exist yet (especially if it was intended as a symlink), // so swallow ENOENT errors here and just soldier on.
if (er.code === 'ENOENT' && (self.type === 'Link' || self.type === 'SymbolicLink') && process.platform === 'win32') { self.ready = true self.emit('ready') self.emit('end') self.emit('close') self.end = self._finish = function () {} return } else return self.error(er) } setProps(self._old = current) }) } return function setProps (current) { todo += 3 endChmod(self, self.props, current, self._path, next('chmod')) endChown(self, self.props, current, self._path, next('chown')) endUtimes(self, self.props, current, self._path, next('utimes')) } function next (what) { return function (er) { // console.error(" W Finish", what, todo) if (errState) return if (er) { er.fstream_finish_call = what return self.error(errState = er) } if (--todo > 0) return if (done) return done = true // we may still need to set the mode/etc. on some parent dirs // that were created previously. delay end/close until then. if (!self._madeDir) return end() else endMadeDir(self, self._path, end) function end (er) { if (er) { er.fstream_finish_call = 'setupMadeDir' return self.error(er) } // all the props have been set, so we're completely done. self.emit('end') self.emit('close') } } } } function endMadeDir (self, p, cb) { var made = self._madeDir // everything *between* made and path.dirname(self._path) // needs to be set up. Note that this may just be one dir. var d = path.dirname(p) endMadeDir_(self, d, function (er) { if (er) return cb(er) if (d === made) { return cb() } endMadeDir(self, d, cb) }) } function endMadeDir_ (self, p, cb) { var dirProps = {} Object.keys(self.props).forEach(function (k) { dirProps[k] = self.props[k] // only make non-readable dirs if explicitly requested. if (k === 'mode' && self.type !== 'Directory') { dirProps[k] = dirProps[k] | parseInt('0111', 8) } }) var todo = 3 var errState = null fs.stat(p, function (er, current) { if (er) return cb(errState = er) endChmod(self, dirProps, current, p, next) endChown(self, dirProps, current, p, next) endUtimes(self, dirProps, current, p, next) }) function next (er) { if (errState) return if (er) return cb(errState = er) if (--todo === 0) return cb() } } Writer.prototype.pipe = function () { this.error("Can't pipe from writable stream") } Writer.prototype.add = function () { this.error("Can't add to non-Directory type") } Writer.prototype.write = function () { return true } function objectToString (d) { return Object.prototype.toString.call(d) } function isDate (d) { return typeof d === 'object' && objectToString(d) === '[object Date]' } npm_3.5.2.orig/node_modules/fstream-npm/.npmignore0000644000000000000000000000011712631326456020376 0ustar 00000000000000# ignore the output junk from the example scripts example/output node_modules/ npm_3.5.2.orig/node_modules/fstream-npm/.travis.yml0000644000000000000000000000036712631326456020517 0ustar 00000000000000language: node_js sudo: false node_js: - "5" - "4" - iojs - "0.12" - "0.10" - "0.8" before_install: - "npm config set spin false" - "npm install -g npm" script: "npm test" notifications: slack: npm-inc:kRqQjto7YbINqHPb1X6nS3g8 npm_3.5.2.orig/node_modules/fstream-npm/LICENSE0000644000000000000000000000137512631326456017413 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. 
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/fstream-npm/README.md0000644000000000000000000000070012631326456017654 0ustar 00000000000000# fstream-npm This is an fstream DirReader class that will read a directory and filter things according to the semantics of what goes in an npm package. For example: ```javascript // This will print out all the files that would be included // by 'npm publish' or 'npm install' of this directory. var FN = require("fstream-npm") FN({ path: "./" }) .on("child", function (e) { console.error(e.path.substr(e.root.path.length + 1)) }) ``` npm_3.5.2.orig/node_modules/fstream-npm/example/0000755000000000000000000000000012631326456020033 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/fstream-npm.js0000644000000000000000000002372312631326456021176 0ustar 00000000000000var Ignore = require('fstream-ignore') var inherits = require('inherits') var path = require('path') var fs = require('fs') module.exports = Packer inherits(Packer, Ignore) function Packer (props) { if (!(this instanceof Packer)) { return new Packer(props) } if (typeof props === 'string') { props = { path: props } } props.ignoreFiles = props.ignoreFiles || [ '.npmignore', '.gitignore', 'package.json' ] Ignore.call(this, props) this.bundled = props.bundled this.bundleLinks = props.bundleLinks this.package = props.package // only do the magic bundling stuff for the node_modules folder that // lives right next to a package.json file. this.bundleMagic = this.parent && this.parent.packageRoot && this.basename === 'node_modules' // in a node_modules folder, resolve symbolic links to // bundled dependencies when creating the package. props.follow = this.follow = this.bundleMagic // console.error("follow?", this.path, props.follow) if (this === this.root || this.parent && this.parent.bundleMagic && this.basename.charAt(0) !== '.') { this.readBundledLinks() } this.on('entryStat', function (entry, props) { // files should *always* get into tarballs // in a user-writable state, even if they're // being installed from some wacky vm-mounted // read-only filesystem. entry.mode = props.mode = props.mode | parseInt('0200', 8) }) } Packer.prototype.readBundledLinks = function () { if (this._paused) { this.once('resume', this.readBundledLinks) return } this.pause() fs.readdir(this.path + '/node_modules', function (er, list) { // no harm if there's no bundle var l = list && list.length if (er || l === 0) return this.resume() var errState = null var then = function then (er) { if (errState) return if (er) { errState = er return this.resume() } if (--l === 0) return this.resume() }.bind(this) list.forEach(function (pkg) { if (pkg.charAt(0) === '.') return then() var pd = this.path + '/node_modules/' + pkg fs.realpath(pd, function (er, rp) { if (er) return then() this.bundleLinks = this.bundleLinks || {} this.bundleLinks[pkg] = rp then() }.bind(this)) }, this) }.bind(this)) } Packer.prototype.applyIgnores = function (entry, partial, entryObj) { if (!entryObj || entryObj.type !== 'Directory') { // package.json files can never be ignored.
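// (Editor's note: the keep-list that follows -- package.json, README*,
// LICENSE/LICENCE, and CHANGES/CHANGELOG/HISTORY -- mirrors npm's documented
// set of files that are always included in a published package, regardless
// of ignore rules.)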
if (entry === 'package.json') return true // readme files should never be ignored. if (entry.match(/^readme(\.[^\.]*)?$/i)) return true // license files should never be ignored. if (entry.match(/^(license|licence)(\.[^\.]*)?$/i)) return true // changelogs should never be ignored. if (entry.match(/^(changes|changelog|history)(\.[^\.]*)?$/i)) return true } // special rules. see below. if (entry === 'node_modules' && this.packageRoot) return true // package.json main file should never be ignored. var mainFile = this.package && this.package.main if (mainFile && path.resolve(this.path, entry) === path.resolve(this.path, mainFile)) return true // some files are *never* allowed under any circumstances // (VCS folders, native build cruft, npm cruft, regular cruft) if (entry === '.git' || entry === 'CVS' || entry === '.svn' || entry === '.hg' || entry === '.lock-wscript' || entry.match(/^\.wafpickle-[0-9]+$/) || (this.parent && this.parent.packageRoot && this.basename === 'build' && entry === 'config.gypi') || entry === 'npm-debug.log' || entry === '.npmrc' || entry.match(/^\..*\.swp$/) || entry === '.DS_Store' || entry.match(/^\._/) ) { return false } // in a node_modules folder, we only include bundled dependencies // also, prevent packages in node_modules from being affected // by rules set in the containing package, so that // bundles don't get busted. // Also, once in a bundle, everything is installed as-is // To prevent infinite cycles in the case of cyclic deps that are // linked with npm link, even in a bundle, deps are only bundled // if they're not already present at a higher level. if (this.bundleMagic) { // bubbling up. stop here and allow anything the bundled pkg allows if (entry.indexOf('/') !== -1) return true // never include the .bin. It's typically full of platform-specific // stuff like symlinks and .cmd files anyway. if (entry === '.bin') return false // the package root. var p = this.parent // the package before this one. var pp = p && p.parent // if this entry has already been bundled, and is a symlink, // and it is the *same* symlink as this one, then exclude it. if (pp && pp.bundleLinks && this.bundleLinks && pp.bundleLinks[entry] && pp.bundleLinks[entry] === this.bundleLinks[entry]) { return false } // since it's *not* a symbolic link, if we're *already* in a bundle, // then we should include everything. if (pp && pp.package && pp.basename === 'node_modules') { return true } // only include it at this point if it's a bundleDependency var bd = this.package && this.package.bundleDependencies if (bd && !Array.isArray(bd)) { throw new Error(this.package.name + '\'s `bundledDependencies` should ' + 'be an array') } var shouldBundle = bd && bd.indexOf(entry) !== -1 // if we're not going to bundle it, then it doesn't count as a bundleLink // if (this.bundleLinks && !shouldBundle) delete this.bundleLinks[entry] return shouldBundle } // if (this.bundled) return true return Ignore.prototype.applyIgnores.call(this, entry, partial, entryObj) } Packer.prototype.addIgnoreFiles = function () { var entries = this.entries // if there's a .npmignore, then we do *not* want to // read the .gitignore.
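// (Editor's note: this matches npm's documented precedence -- when a package
// contains both files, .npmignore wins and .gitignore is not consulted at all.)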
if (entries.indexOf('.npmignore') !== -1) { var i = entries.indexOf('.gitignore') if (i !== -1) { entries.splice(i, 1) } } this.entries = entries Ignore.prototype.addIgnoreFiles.call(this) } Packer.prototype.readRules = function (buf, e) { if (e !== 'package.json') { return Ignore.prototype.readRules.call(this, buf, e) } buf = buf.toString().trim() if (buf.length === 0) return [] try { var p = this.package = JSON.parse(buf) } catch (er) { // just pretend it's a normal old file, not magic at all. return [] } if (this === this.root) { this.bundleLinks = this.bundleLinks || {} this.bundleLinks[p.name] = this._path } this.packageRoot = true this.emit('package', p) // make bundle deps predictable if (p.bundledDependencies && !p.bundleDependencies) { p.bundleDependencies = p.bundledDependencies delete p.bundledDependencies } if (!p.files || !Array.isArray(p.files)) return [] // ignore everything except what's in the files array. return ['*'].concat(p.files.map(function (f) { return '!' + f })).concat(p.files.map(function (f) { return '!' + f.replace(/\/+$/, '') + '/**' })) } Packer.prototype.getChildProps = function (stat) { var props = Ignore.prototype.getChildProps.call(this, stat) props.package = this.package props.bundled = this.bundled && this.bundled.slice(0) props.bundleLinks = this.bundleLinks && Object.create(this.bundleLinks) // Directories have to be read as Packers // otherwise fstream.Reader will create a DirReader instead. if (stat.isDirectory()) { props.type = this.constructor } // only follow symbolic links directly in the node_modules folder. props.follow = false return props } var order = [ 'package.json', '.npmignore', '.gitignore', /^README(\.md)?$/, 'LICENCE', 'LICENSE', /\.js$/ ] Packer.prototype.sort = function (a, b) { for (var i = 0, l = order.length; i < l; i++) { var o = order[i] if (typeof o === 'string') { if (a === o) return -1 if (b === o) return 1 } else { if (a.match(o)) return -1 if (b.match(o)) return 1 } } // deps go in the back if (a === 'node_modules') return 1 if (b === 'node_modules') return -1 return Ignore.prototype.sort.call(this, a, b) } Packer.prototype.emitEntry = function (entry) { if (this._paused) { this.once('resume', this.emitEntry.bind(this, entry)) return } // if there is a .gitignore, then we're going to // rename it to .npmignore in the output. if (entry.basename === '.gitignore') { entry.basename = '.npmignore' entry.path = path.resolve(entry.dirname, entry.basename) } // all *.gyp files are renamed to binding.gyp for node-gyp // but only when they are in the same folder as a package.json file. if (entry.basename.match(/\.gyp$/) && this.entries.indexOf('package.json') !== -1) { entry.basename = 'binding.gyp' entry.path = path.resolve(entry.dirname, entry.basename) } // skip over symbolic links if (entry.type === 'SymbolicLink') { entry.abort() return } if (entry.type !== 'Directory') { // make it so that the folder in the tarball is named "package" var h = path.dirname((entry.root || entry).path) var t = entry.path.substr(h.length + 1).replace(/^[^\/\\]+/, 'package') var p = h + '/' + t entry.path = p entry.dirname = path.dirname(p) return Ignore.prototype.emitEntry.call(this, entry) } // we don't want empty directories to show up in package // tarballs. // don't emit entry events for dirs, but still walk through // and read them. This means that we need to proxy up their // entry events so that those entries won't be missed, since // .pipe() doesn't do anything special with "child" events, only // with "entry" events.
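// (Editor's note: concretely, a Directory entry is walked but never emitted
// itself; its children are re-parented onto this Packer just below, so they
// still reach the tar stream as "entry" events.)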
var me = this entry.on('entry', function (e) { if (e.parent === entry) { e.parent = me me.emit('entry', e) } }) entry.on('package', this.emit.bind(this, 'package')) } npm_3.5.2.orig/node_modules/fstream-npm/node_modules/0000755000000000000000000000000012631326456021055 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/package.json0000644000000000000000000000272612631326456020675 0ustar 00000000000000{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "fstream-npm", "description": "fstream class for creating npm packages", "version": "1.0.7", "repository": { "type": "git", "url": "git+https://github.com/npm/fstream-npm.git" }, "scripts": { "test": "standard && tap test/*.js" }, "main": "./fstream-npm.js", "dependencies": { "fstream-ignore": "^1.0.0", "inherits": "2" }, "devDependencies": { "graceful-fs": "^4.1.2", "mkdirp": "^0.5.1", "rimraf": "^2.4.2", "standard": "^4.3.1", "tap": "^1.3.2" }, "license": "ISC", "readme": "# fstream-npm\n\nThis is an fstream DirReader class that will read a directory and filter\nthings according to the semantics of what goes in an npm package.\n\nFor example:\n\n```javascript\n// This will print out all the files that would be included\n// by 'npm publish' or 'npm install' of this directory.\n\nvar FN = require(\"fstream-npm\")\nFN({ path: \"./\" })\n .on(\"child\", function (e) {\n console.error(e.path.substr(e.root.path.length + 1))\n })\n```\n\n", "readmeFilename": "README.md", "gitHead": "d57b6b24f91156067f73417dd8785c6312bfc75f", "bugs": { "url": "https://github.com/npm/fstream-npm/issues" }, "homepage": "https://github.com/npm/fstream-npm#readme", "_id": "fstream-npm@1.0.7", "_shasum": "7ed0d1ac13d7686dd9e1bf6ceb8be273bf6d2f86", "_from": "fstream-npm@>=1.0.7 <1.1.0" } npm_3.5.2.orig/node_modules/fstream-npm/test/0000755000000000000000000000000012631326456017357 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/example/bundle.js0000644000000000000000000000053112631326456021641 0ustar 00000000000000// this example will bundle every dependency var P = require("../") P({ path: "./" }) .on("package", bundleIt) .on("entry", function (e) { console.error(e.constructor.name, e.path.substr(e.root.dirname.length + 1)) e.on("package", bundleIt) }) function bundleIt (p) { p.bundleDependencies = Object.keys(p.dependencies || {}) } npm_3.5.2.orig/node_modules/fstream-npm/example/dir-tar.js0000644000000000000000000000104412631326456021732 0ustar 00000000000000// this will show what ends up in the fstream-npm package var P = require('fstream').DirReader var tar = require('tar') function f (entry) { return entry.basename !== '.git' } new P({ path: './', type: 'Directory', Directory: true, filter: f }) .on('package', function (p) { console.error('package', p) }) .on('ignoreFile', function (e) { console.error('ignoreFile', e) }) .on('entry', function (e) { console.error(e.constructor.name, e.path.substr(e.root.path.length + 1)) }) .pipe(tar.Pack()) .pipe(process.stdout) npm_3.5.2.orig/node_modules/fstream-npm/example/dir.js0000644000000000000000000000122112631326456021143 0ustar 00000000000000// this will show what ends up in the fstream-npm package var P = require('../') var DW = require('fstream').DirWriter var target = new DW({ Directory: true, type: 'Directory', path: __dirname + '/output'}) function f (entry) { return entry.basename !== '.git' } P({ path: './', type: 'Directory', isDirectory: true, filter: f }) .on('package', function (p) { console.error('package', p) }) .on('ignoreFile', 
function (e) { console.error('ignoreFile', e) }) .on('entry', function (e) { console.error(e.constructor.name, e.path) }) .pipe(target) .on('end', function () { console.error('ended') }) npm_3.5.2.orig/node_modules/fstream-npm/example/example.js0000644000000000000000000000054612631326456022031 0ustar 00000000000000// this will show what ends up in the fstream-npm package var P = require('../') P({ path: './' }) .on('package', function (p) { console.error('package', p) }) .on('ignoreFile', function (e) { console.error('ignoreFile', e) }) .on('entry', function (e) { console.error(e.constructor.name, e.path.substr(e.root.dirname.length + 1)) }) npm_3.5.2.orig/node_modules/fstream-npm/example/ig-tar.js0000644000000000000000000000104112631326456021550 0ustar 00000000000000// this will show what ends up in the fstream-npm package var P = require('fstream-ignore') var tar = require('tar') function f (entry) { return entry.basename !== '.git' } new P({ path: './', type: 'Directory', Directory: true, filter: f }) .on('package', function (p) { console.error('package', p) }) .on('ignoreFile', function (e) { console.error('ignoreFile', e) }) .on('entry', function (e) { console.error(e.constructor.name, e.path.substr(e.root.path.length + 1)) }) .pipe(tar.Pack()) .pipe(process.stdout) npm_3.5.2.orig/node_modules/fstream-npm/example/tar.js0000644000000000000000000000113512631326456021157 0ustar 00000000000000// this will show what ends up in the fstream-npm package var P = require('../') var tar = require('tar') function f () { return true } // function f (entry) { // return entry.basename !== ".git" // } new P({ path: './', type: 'Directory', isDirectory: true, filter: f }) .on('package', function (p) { console.error('package', p) }) .on('ignoreFile', function (e) { console.error('ignoreFile', e) }) .on('entry', function (e) { console.error(e.constructor.name, e.path) }) .on('end', function () { console.error('ended') }) .pipe(tar.Pack()) .pipe(process.stdout) npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/0000755000000000000000000000000012631326456023777 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/.npmignore0000644000000000000000000000001612631326456025773 0ustar 00000000000000test/fixtures npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/LICENSE0000644000000000000000000000137512631326456025012 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/README.md0000644000000000000000000000115512631326456025260 0ustar 00000000000000# fstream-ignore An fstream DirReader that filters out files that match globs in `.ignore` files throughout the tree, like how git ignores files based on a `.gitignore` file.
Here's an example: ```javascript var fs = require("fs") var tar = require("tar") var Ignore = require("fstream-ignore") Ignore({ path: __dirname , ignoreFiles: [".ignore", ".gitignore"] }) .on("child", function (c) { console.error(c.path.substr(c.root.path.length + 1)) }) .pipe(tar.Pack()) .pipe(fs.createWriteStream("foo.tar")) ``` This will tar up the files in __dirname into `foo.tar`, ignoring anything matched by the globs in any .ignore or .gitignore file. npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/example/0000755000000000000000000000000012631326456025432 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/ignore.js0000644000000000000000000001651412631326456025627 0ustar 00000000000000// Essentially, this is an fstream.DirReader class, but with a // bit of special logic to read the specified sort of ignore files, // and a filter that prevents it from picking up anything excluded // by those files. var Minimatch = require("minimatch").Minimatch , fstream = require("fstream") , DirReader = fstream.DirReader , inherits = require("inherits") , path = require("path") , fs = require("fs") module.exports = IgnoreReader inherits(IgnoreReader, DirReader) function IgnoreReader (props) { if (!(this instanceof IgnoreReader)) { return new IgnoreReader(props) } // must be a Directory type if (typeof props === "string") { props = { path: path.resolve(props) } } props.type = "Directory" props.Directory = true if (!props.ignoreFiles) props.ignoreFiles = [".ignore"] this.ignoreFiles = props.ignoreFiles this.ignoreRules = null // ensure that .ignore files always show up at the top of the list // that way, they can be read before proceeding to handle other // entries in that same folder if (props.sort) { this._sort = props.sort === "alpha" ? alphasort : props.sort props.sort = null } this.on("entries", function () { // if there are any ignore files in the list, then // pause and add them. // then, filter the list based on our ignoreRules var hasIg = this.entries.some(this.isIgnoreFile, this) if (!hasIg) return this.filterEntries() this.addIgnoreFiles() }) // we filter entries before we know what they are. // however, directories have to be re-tested against // rules with a "/" appended, because "a/b/" will only // match if "a/b" is a dir, and not otherwise. this.on("_entryStat", function (entry, props) { if (!this.applyIgnores(entry.basename, entry.type === "Directory", entry)) { entry.abort() } }.bind(this)) DirReader.call(this, props) } IgnoreReader.prototype.addIgnoreFiles = function () { if (this._paused) { this.once("resume", this.addIgnoreFiles) return } if (this._ignoreFilesAdded) return this._ignoreFilesAdded = true var newIg = this.entries.filter(this.isIgnoreFile, this) , count = newIg.length , errState = null if (!count) return this.pause() var then = function (er) { if (errState) return if (er) return this.emit("error", errState = er) if (-- count === 0) { this.filterEntries() this.resume() } else { this.addIgnoreFile(newIg[newIg.length - count], then) } }.bind(this) this.addIgnoreFile(newIg[0], then) } IgnoreReader.prototype.isIgnoreFile = function (e) { return e !== "." && e !== ".." && -1 !== this.ignoreFiles.indexOf(e) } IgnoreReader.prototype.getChildProps = function (stat) { var props = DirReader.prototype.getChildProps.call(this, stat) props.ignoreFiles = this.ignoreFiles // Directories have to be read as IgnoreReaders // otherwise fstream.Reader will create a DirReader instead.
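// (Editor's note: this works because fstream.Reader treats a function-valued
// props.type as the class to instantiate -- see the
// `typeof props.type === 'function'` branch in fstream/lib/reader.js -- so
// every subdirectory walk re-enters this same IgnoreReader subclass.)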
if (stat.isDirectory()) { props.type = this.constructor } return props } IgnoreReader.prototype.addIgnoreFile = function (e, cb) { // read the file, and then call addIgnoreRules // if there's an error, then tell the cb about it. var ig = path.resolve(this.path, e) fs.readFile(ig, function (er, data) { if (er) return cb(er) this.emit("ignoreFile", e, data) var rules = this.readRules(data, e) this.addIgnoreRules(rules, e) cb() }.bind(this)) } IgnoreReader.prototype.readRules = function (buf, e) { return buf.toString().split(/\r?\n/) } // Override this to do fancier things, like read the // "files" array from a package.json file or something. IgnoreReader.prototype.addIgnoreRules = function (set, e) { // filter out anything obvious set = set.filter(function (s) { s = s.trim() return s && !s.match(/^#/) }) // no rules to add! if (!set.length) return // now get a minimatch object for each one of these. // Note that we need to allow dot files by default, and // not switch the meaning of their exclusion var mmopt = { matchBase: true, dot: true, flipNegate: true } , mm = set.map(function (s) { var m = new Minimatch(s, mmopt) m.ignoreFile = e return m }) if (!this.ignoreRules) this.ignoreRules = [] this.ignoreRules.push.apply(this.ignoreRules, mm) } IgnoreReader.prototype.filterEntries = function () { // this exclusion is at the point where we know the list of // entries in the dir, but don't know what they are. since // some of them *might* be directories, we have to run the // match in dir-mode as well, so that we'll pick up partials // of files that will be included later. Anything included // at this point will be checked again later once we know // what it is. this.entries = this.entries.filter(function (entry) { // at this point, we don't know if it's a dir or not. return this.applyIgnores(entry) || this.applyIgnores(entry, true) }, this) } IgnoreReader.prototype.applyIgnores = function (entry, partial, obj) { var included = true // this = /a/b/c // entry = d // parent /a/b sees c/d if (this.parent && this.parent.applyIgnores) { var pt = this.basename + "/" + entry included = this.parent.applyIgnores(pt, partial) } // Negated Rules // Since we're *ignoring* things here, negating means that a file // is re-included, if it would have been excluded by a previous // rule. So, negated rules are only relevant if the file // has been excluded. // // Similarly, if a file has been excluded, then there's no point // trying it against rules that have already been applied // // We're using the "flipnegate" flag here, which tells minimatch // to set the "negate" for our information, but still report // whether the core pattern was a hit or a miss. if (!this.ignoreRules) { return included } this.ignoreRules.forEach(function (rule) { // negation means inclusion if (rule.negate && included || !rule.negate && !included) { // unnecessary return } // first, match against /foo/bar var match = rule.match("/" + entry) if (!match) { // try with the leading / trimmed off the test // eg: foo/bar instead of /foo/bar match = rule.match(entry) } // if the entry is a directory, then it will match // with a trailing slash. eg: /foo/bar/ or foo/bar/ if (!match && partial) { match = rule.match("/" + entry + "/") || rule.match(entry + "/") } // When including a file with a negated rule, it's // relevant if a directory partially matches, since // it may then match a file within it. 
// Eg, if you ignore /a, but !/a/b/c if (!match && rule.negate && partial) { match = rule.match("/" + entry, true) || rule.match(entry, true) } if (match) { included = rule.negate } }, this) return included } IgnoreReader.prototype.sort = function (a, b) { var aig = this.ignoreFiles.indexOf(a) !== -1 , big = this.ignoreFiles.indexOf(b) !== -1 if (aig && !big) return -1 if (big && !aig) return 1 return this._sort(a, b) } IgnoreReader.prototype._sort = function (a, b) { return 0 } function alphasort (a, b) { return a === b ? 0 : a.toLowerCase() > b.toLowerCase() ? 1 : a.toLowerCase() < b.toLowerCase() ? -1 : a > b ? 1 : -1 } npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/0000755000000000000000000000000012631326456026454 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/package.json0000644000000000000000000000301012631326456026257 0ustar 00000000000000{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "fstream-ignore", "description": "A thing for ignoring files based on globs", "version": "1.0.3", "repository": { "type": "git", "url": "git://github.com/isaacs/fstream-ignore.git" }, "main": "ignore.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "fstream": "^1.0.0", "inherits": "2", "minimatch": "^3.0.0" }, "devDependencies": { "mkdirp": "", "rimraf": "", "tap": "^2.2.0" }, "license": "ISC", "gitHead": "86c835eef61049496003f6b90c1e6c1236c92d6a", "bugs": { "url": "https://github.com/isaacs/fstream-ignore/issues" }, "homepage": "https://github.com/isaacs/fstream-ignore#readme", "_id": "fstream-ignore@1.0.3", "_shasum": "4c74d91fa88b22b42f4f86a18a2820dd79d8fcdd", "_from": "fstream-ignore@>=1.0.0 <2.0.0", "_npmVersion": "2.14.8", "_nodeVersion": "4.2.1", "_npmUser": { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, "dist": { "shasum": "4c74d91fa88b22b42f4f86a18a2820dd79d8fcdd", "tarball": "http://registry.npmjs.org/fstream-ignore/-/fstream-ignore-1.0.3.tgz" }, "maintainers": [ { "name": "isaacs", "email": "isaacs@npmjs.com" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/fstream-ignore/-/fstream-ignore-1.0.3.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/0000755000000000000000000000000012631326456024756 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/example/basic.js0000644000000000000000000000053712631326456027056 0ustar 00000000000000var Ignore = require("../") Ignore({ path: __dirname , ignoreFiles: [".ignore", ".gitignore"] }) .on("child", function (c) { console.error(c.path.substr(c.root.path.length + 1)) c.on("ignoreFile", onIgnoreFile) }) .on("ignoreFile", onIgnoreFile) function onIgnoreFile (e) { console.error("adding ignore file", e.path) } npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/0000755000000000000000000000000012631326456030425 5ustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/LICENSE0000644000000000000000000000137512631326456031440 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. 
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/README.md0000644000000000000000000001471412631326456031713 0ustar 00000000000000# minimatch A minimal matching utility. [![Build Status](https://secure.travis-ci.org/isaacs/minimatch.png)](http://travis-ci.org/isaacs/minimatch) This is the matching library used internally by npm. It works by converting glob expressions into JavaScript `RegExp` objects. ## Usage ```javascript var minimatch = require("minimatch") minimatch("bar.foo", "*.foo") // true! minimatch("bar.foo", "*.bar") // false! minimatch("bar.foo", "*.+(bar|foo)", { debug: true }) // true, and noisy! ``` ## Features Supports these glob features: * Brace Expansion * Extended glob matching * "Globstar" `**` matching See: * `man sh` * `man bash` * `man 3 fnmatch` * `man 5 gitignore` ## Minimatch Class Create a minimatch object by instantiating the `minimatch.Minimatch` class. ```javascript var Minimatch = require("minimatch").Minimatch var mm = new Minimatch(pattern, options) ``` ### Properties * `pattern` The original pattern the minimatch object represents. * `options` The options supplied to the constructor. * `set` A 2-dimensional array of regexp or string expressions. Each row in the array corresponds to a brace-expanded pattern. Each item in the row corresponds to a single path-part. For example, the pattern `{a,b/c}/d` would expand to a set of patterns like: [ [ a, d ] , [ b, c, d ] ] If a portion of the pattern doesn't have any "magic" in it (that is, it's something like `"foo"` rather than `fo*o?`), then it will be left as a string rather than converted to a regular expression. * `regexp` Created by the `makeRe` method. A single regular expression expressing the entire pattern. This is useful in cases where you wish to use the pattern somewhat like `fnmatch(3)` with `FNM_PATHNAME` enabled. * `negate` True if the pattern is negated. * `comment` True if the pattern is a comment. * `empty` True if the pattern is `""`. ### Methods * `makeRe` Generate the `regexp` member if necessary, and return it. Will return `false` if the pattern is invalid. * `match(fname)` Return true if the filename matches the pattern, or false otherwise. * `matchOne(fileArray, patternArray, partial)` Take a `/`-split filename, and match it against a single row in the `regExpSet`. This method is mainly for internal use, but is exposed so that it can be used by a glob-walker that needs to avoid excessive filesystem calls. All other methods are internal, and will be called as necessary. ## Functions The top-level exported function has a `cache` property, which is an LRU cache set to store 100 items. So, calling these methods repeatedly with the same pattern and options will use the same Minimatch object, saving the cost of parsing it multiple times. ### minimatch(path, pattern, options) Main export. Tests a path against the pattern using the options.
```javascript var isJS = minimatch(file, "*.js", { matchBase: true }) ``` ### minimatch.filter(pattern, options) Returns a function that tests its supplied argument, suitable for use with `Array.filter`. Example: ```javascript var javascripts = fileList.filter(minimatch.filter("*.js", {matchBase: true})) ``` ### minimatch.match(list, pattern, options) Match against the list of files, in the style of fnmatch or glob. If nothing is matched, and options.nonull is set, then return a list containing the pattern itself. ```javascript var javascripts = minimatch.match(fileList, "*.js", {matchBase: true}) ``` ### minimatch.makeRe(pattern, options) Make a regular expression object from the pattern. ## Options All options are `false` by default. ### debug Dump a ton of stuff to stderr. ### nobrace Do not expand `{a,b}` and `{1..3}` brace sets. ### noglobstar Disable `**` matching against multiple folder names. ### dot Allow patterns to match filenames starting with a period, even if the pattern does not explicitly have a period in that spot. Note that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot` is set. ### noext Disable "extglob" style patterns like `+(a|b)`. ### nocase Perform a case-insensitive match. ### nonull When a match is not found by `minimatch.match`, return a list containing the pattern itself if this option is set. When not set, an empty list is returned if there are no matches. ### matchBase If set, then patterns without slashes will be matched against the basename of the path if it contains slashes. For example, `a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`. ### nocomment Suppress the behavior of treating `#` at the start of a pattern as a comment. ### nonegate Suppress the behavior of treating a leading `!` character as negation. ### flipNegate Returns from negate expressions the same as if they were not negated. (Ie, true on a hit, false on a miss.) ## Comparisons to other fnmatch/glob implementations While strict compliance with the existing standards is a worthwhile goal, some discrepancies exist between minimatch and other implementations, and are intentional. If the pattern starts with a `!` character, then it is negated. Set the `nonegate` flag to suppress this behavior, and treat leading `!` characters normally. This is perhaps relevant if you wish to start the pattern with a negative extglob pattern like `!(a|B)`. Multiple `!` characters at the start of a pattern will negate the pattern multiple times. If a pattern starts with `#`, then it is treated as a comment, and will not match anything. Use `\#` to match a literal `#` at the start of a line, or set the `nocomment` flag to suppress this behavior. The double-star character `**` is supported by default, unless the `noglobstar` flag is set. This is supported in the manner of bsdglob and bash 4.1, where `**` only has special significance if it is the only thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but `a/**b` will not. If an escaped pattern has no matches, and the `nonull` flag is set, then minimatch.match returns the pattern as-provided, rather than interpreting the character escapes. For example, `minimatch.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than `"*a?"`. This is akin to setting the `nullglob` option in bash, except that it does not resolve escaped pattern characters. If brace expansion is not disabled, then it is performed before any other interpretation of the glob pattern.
Thus, a pattern like `+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded **first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are checked for validity. Since those two are valid, matching proceeds. ././@LongLink0000000000000000000000000000015000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/minimatch.jsnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/minimatch0000644000000000000000000006044012631326456032325 0ustar 00000000000000module.exports = minimatch minimatch.Minimatch = Minimatch var path = { sep: '/' } try { path = require('path') } catch (er) {} var GLOBSTAR = minimatch.GLOBSTAR = Minimatch.GLOBSTAR = {} var expand = require('brace-expansion') // any single thing other than / // don't need to escape / when using new RegExp() var qmark = '[^/]' // * => any number of characters var star = qmark + '*?' // ** when dots are allowed. Anything goes, except .. and . // not (^ or / followed by one or two dots followed by $ or /), // followed by anything, any number of times. var twoStarDot = '(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?' // not a ^ or / followed by a dot, // followed by anything, any number of times. var twoStarNoDot = '(?:(?!(?:\\\/|^)\\.).)*?' // characters that need to be escaped in RegExp. var reSpecials = charSet('().*{}+?[]^$\\!') // "abc" -> { a:true, b:true, c:true } function charSet (s) { return s.split('').reduce(function (set, c) { set[c] = true return set }, {}) } // normalizes slashes. var slashSplit = /\/+/ minimatch.filter = filter function filter (pattern, options) { options = options || {} return function (p, i, list) { return minimatch(p, pattern, options) } } function ext (a, b) { a = a || {} b = b || {} var t = {} Object.keys(b).forEach(function (k) { t[k] = b[k] }) Object.keys(a).forEach(function (k) { t[k] = a[k] }) return t } minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return minimatch var orig = minimatch var m = function minimatch (p, pattern, options) { return orig.minimatch(p, pattern, ext(def, options)) } m.Minimatch = function Minimatch (pattern, options) { return new orig.Minimatch(pattern, ext(def, options)) } return m } Minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return Minimatch return minimatch.defaults(def).Minimatch } function minimatch (p, pattern, options) { if (typeof pattern !== 'string') { throw new TypeError('glob pattern string required') } if (!options) options = {} // shortcut: comments match nothing. if (!options.nocomment && pattern.charAt(0) === '#') { return false } // "" only matches "" if (pattern.trim() === '') return p === '' return new Minimatch(pattern, options).match(p) } function Minimatch (pattern, options) { if (!(this instanceof Minimatch)) { return new Minimatch(pattern, options) } if (typeof pattern !== 'string') { throw new TypeError('glob pattern string required') } if (!options) options = {} pattern = pattern.trim() // windows support: need to use /, not \ if (path.sep !== '/') { pattern = pattern.split(path.sep).join('/') } this.options = options this.set = [] this.pattern = pattern this.regexp = null this.negate = false this.comment = false this.empty = false // make the set of regexps etc. this.make() } Minimatch.prototype.debug = function () {} Minimatch.prototype.make = make function make () { // don't do it more than once. 
if (this._made) return var pattern = this.pattern var options = this.options // empty patterns and comments match nothing. if (!options.nocomment && pattern.charAt(0) === '#') { this.comment = true return } if (!pattern) { this.empty = true return } // step 1: figure out negation, etc. this.parseNegate() // step 2: expand braces var set = this.globSet = this.braceExpand() if (options.debug) this.debug = console.error this.debug(this.pattern, set) // step 3: now we have a set, so turn each one into a series of path-portion // matching patterns. // These will be regexps, except in the case of "**", which is // set to the GLOBSTAR object for globstar behavior, // and will not contain any / characters set = this.globParts = set.map(function (s) { return s.split(slashSplit) }) this.debug(this.pattern, set) // glob --> regexps set = set.map(function (s, si, set) { return s.map(this.parse, this) }, this) this.debug(this.pattern, set) // filter out everything that didn't compile properly. set = set.filter(function (s) { return s.indexOf(false) === -1 }) this.debug(this.pattern, set) this.set = set } Minimatch.prototype.parseNegate = parseNegate function parseNegate () { var pattern = this.pattern var negate = false var options = this.options var negateOffset = 0 if (options.nonegate) return for (var i = 0, l = pattern.length ; i < l && pattern.charAt(i) === '!' ; i++) { negate = !negate negateOffset++ } if (negateOffset) this.pattern = pattern.substr(negateOffset) this.negate = negate } // Brace expansion: // a{b,c}d -> abd acd // a{b,}c -> abc ac // a{0..3}d -> a0d a1d a2d a3d // a{b,c{d,e}f}g -> abg acdfg acefg // a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg // // Invalid sets are not expanded. // a{2..}b -> a{2..}b // a{b}c -> a{b}c minimatch.braceExpand = function (pattern, options) { return braceExpand(pattern, options) } Minimatch.prototype.braceExpand = braceExpand function braceExpand (pattern, options) { if (!options) { if (this instanceof Minimatch) { options = this.options } else { options = {} } } pattern = typeof pattern === 'undefined' ? this.pattern : pattern if (typeof pattern === 'undefined') { throw new Error('undefined pattern') } if (options.nobrace || !pattern.match(/\{.*\}/)) { // shortcut. no need to expand. return [pattern] } return expand(pattern) } // parse a component of the expanded set. // At this point, no pattern may contain "/" in it // so we're going to return a 2d array, where each entry is the full // pattern, split on '/', and then turned into a regular expression. // A regexp is made at the end which joins each array with an // escaped /, and another full one which joins each regexp with |. // // Following the lead of Bash 4.1, note that "**" only has special meaning // when it is the *only* thing in a path portion. Otherwise, any series // of * is equivalent to a single *. Globstar behavior is enabled by // default, and can be disabled by setting options.noglobstar. Minimatch.prototype.parse = parse var SUBPARSE = {} function parse (pattern, isSub) { var options = this.options // shortcuts if (!options.noglobstar && pattern === '**') return GLOBSTAR if (pattern === '') return '' var re = '' var hasMagic = !!options.nocase var escaping = false // ? => one single character var patternListStack = [] var negativeLists = [] var plType var stateChar var inClass = false var reClassStart = -1 var classStart = -1 // . and .. never match anything that doesn't start with ., // even when options.dot is set. var patternStart = pattern.charAt(0) === '.' ? 
'' // anything // not (start or / followed by . or .. followed by / or end) : options.dot ? '(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))' : '(?!\\.)' var self = this function clearStateChar () { if (stateChar) { // we had some state-tracking character // that wasn't consumed by this pass. switch (stateChar) { case '*': re += star hasMagic = true break case '?': re += qmark hasMagic = true break default: re += '\\' + stateChar break } self.debug('clearStateChar %j %j', stateChar, re) stateChar = false } } for (var i = 0, len = pattern.length, c ; (i < len) && (c = pattern.charAt(i)) ; i++) { this.debug('%s\t%s %s %j', pattern, i, re, c) // skip over any that are escaped. if (escaping && reSpecials[c]) { re += '\\' + c escaping = false continue } switch (c) { case '/': // completely not allowed, even escaped. // Should already be path-split by now. return false case '\\': clearStateChar() escaping = true continue // the various stateChar values // for the "extglob" stuff. case '?': case '*': case '+': case '@': case '!': this.debug('%s\t%s %s %j <-- stateChar', pattern, i, re, c) // all of those are literals inside a class, except that // the glob [!a] means [^a] in regexp if (inClass) { this.debug(' in class') if (c === '!' && i === classStart + 1) c = '^' re += c continue } // if we already have a stateChar, then it means // that there was something like ** or +? in there. // Handle the stateChar, then proceed with this one. self.debug('call clearStateChar %j', stateChar) clearStateChar() stateChar = c // if extglob is disabled, then +(asdf|foo) isn't a thing. // just clear the statechar *now*, rather than even diving into // the patternList stuff. if (options.noext) clearStateChar() continue case '(': if (inClass) { re += '(' continue } if (!stateChar) { re += '\\(' continue } plType = stateChar patternListStack.push({ type: plType, start: i - 1, reStart: re.length }) // negation is (?:(?!js)[^/]*) re += stateChar === '!' ? '(?:(?!(?:' : '(?:' this.debug('plType %j %j', stateChar, re) stateChar = false continue case ')': if (inClass || !patternListStack.length) { re += '\\)' continue } clearStateChar() hasMagic = true re += ')' var pl = patternListStack.pop() plType = pl.type // negation is (?:(?!js)[^/]*) // The others are (?:) switch (plType) { case '!': negativeLists.push(pl) re += ')[^/]*?)' pl.reEnd = re.length break case '?': case '+': case '*': re += plType break case '@': break // the default anyway } continue case '|': if (inClass || !patternListStack.length || escaping) { re += '\\|' escaping = false continue } clearStateChar() re += '|' continue // these are mostly the same in regexp and glob case '[': // swallow any state-tracking char before the [ clearStateChar() if (inClass) { re += '\\' + c continue } inClass = true classStart = i reClassStart = re.length re += c continue case ']': // a right bracket shall lose its special // meaning and represent itself in // a bracket expression if it occurs // first in the list. -- POSIX.2 2.8.3.2 if (i === classStart + 1 || !inClass) { re += '\\' + c escaping = false continue } // handle the case where we left a class open. // "[z-a]" is valid, equivalent to "\[z-a\]" if (inClass) { // split where the last [ was, make sure we don't have // an invalid re. if so, re-walk the contents of the // would-be class to re-translate any characters that // were passed through as-is // TODO: It would probably be faster to determine this // without a try/catch and a new RegExp, but it's tricky // to do safely. For now, this is safe and works. 
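// (Editor's illustration: `new RegExp('[z-a]')` throws a SyntaxError for the
// out-of-order range, so the catch below re-parses the contents and escapes
// the brackets, making a pattern like "[z-a]" match that literal text.)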
var cs = pattern.substring(classStart + 1, i) try { RegExp('[' + cs + ']') } catch (er) { // not a valid class! var sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + '\\[' + sp[0] + '\\]' hasMagic = hasMagic || sp[1] inClass = false continue } } // finish up the class. hasMagic = true inClass = false re += c continue default: // swallow any state char that wasn't consumed clearStateChar() if (escaping) { // no need escaping = false } else if (reSpecials[c] && !(c === '^' && inClass)) { re += '\\' } re += c } // switch } // for // handle the case where we left a class open. // "[abc" is valid, equivalent to "\[abc" if (inClass) { // split where the last [ was, and escape it // this is a huge pita. We now have to re-walk // the contents of the would-be class to re-translate // any characters that were passed through as-is cs = pattern.substr(classStart + 1) sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + '\\[' + sp[0] hasMagic = hasMagic || sp[1] } // handle the case where we had a +( thing at the *end* // of the pattern. // each pattern list stack adds 3 chars, and we need to go through // and escape any | chars that were passed through as-is for the regexp. // Go through and escape them, taking care not to double-escape any // | chars that were already escaped. for (pl = patternListStack.pop(); pl; pl = patternListStack.pop()) { var tail = re.slice(pl.reStart + 3) // maybe some even number of \, then maybe 1 \, followed by a | tail = tail.replace(/((?:\\{2})*)(\\?)\|/g, function (_, $1, $2) { if (!$2) { // the | isn't already escaped, so escape it. $2 = '\\' } // need to escape all those slashes *again*, without escaping the // one that we need for escaping the | character. As it works out, // escaping an even number of slashes can be done by simply repeating // it exactly after itself. That's why this trick works. // // I am sorry that you have to see this. return $1 + $1 + $2 + '|' }) this.debug('tail=%j\n %s', tail, tail) var t = pl.type === '*' ? star : pl.type === '?' ? qmark : '\\' + pl.type hasMagic = true re = re.slice(0, pl.reStart) + t + '\\(' + tail } // handle trailing things that only matter at the very end. clearStateChar() if (escaping) { // trailing \\ re += '\\\\' } // only need to apply the nodot start if the re starts with // something that could conceivably capture a dot var addPatternStart = false switch (re.charAt(0)) { case '.': case '[': case '(': addPatternStart = true } // Hack to work around lack of negative lookbehind in JS // A pattern like: *.!(x).!(y|z) needs to ensure that a name // like 'a.xyz.yz' doesn't match. So, the first negative // lookahead, has to look ALL the way ahead, to the end of // the pattern. for (var n = negativeLists.length - 1; n > -1; n--) { var nl = negativeLists[n] var nlBefore = re.slice(0, nl.reStart) var nlFirst = re.slice(nl.reStart, nl.reEnd - 8) var nlLast = re.slice(nl.reEnd - 8, nl.reEnd) var nlAfter = re.slice(nl.reEnd) nlLast += nlAfter // Handle nested stuff like *(*.js|!(*.json)), where open parens // mean that we should *not* include the ) in the bit that is considered // "after" the negated section. 
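// For instance, with *(*.js|!(*.json)) the nlAfter slice initially contains the ")" that
// closes the outer *( ) group; the loop below strips one ")" (plus any +, *, or ? quantifier)
// from nlAfter for each "(" that appeared before the negative list.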
var openParensBefore = nlBefore.split('(').length - 1 var cleanAfter = nlAfter for (i = 0; i < openParensBefore; i++) { cleanAfter = cleanAfter.replace(/\)[+*?]?/, '') } nlAfter = cleanAfter var dollar = '' if (nlAfter === '' && isSub !== SUBPARSE) { dollar = '$' } var newRe = nlBefore + nlFirst + nlAfter + dollar + nlLast re = newRe } // if the re is not "" at this point, then we need to make sure // it doesn't match against an empty path part. // Otherwise a/* will match a/, which it should not. if (re !== '' && hasMagic) { re = '(?=.)' + re } if (addPatternStart) { re = patternStart + re } // parsing just a piece of a larger pattern. if (isSub === SUBPARSE) { return [re, hasMagic] } // skip the regexp for non-magical patterns // unescape anything in it, though, so that it'll be // an exact match against a file etc. if (!hasMagic) { return globUnescape(pattern) } var flags = options.nocase ? 'i' : '' var regExp = new RegExp('^' + re + '$', flags) regExp._glob = pattern regExp._src = re return regExp } minimatch.makeRe = function (pattern, options) { return new Minimatch(pattern, options || {}).makeRe() } Minimatch.prototype.makeRe = makeRe function makeRe () { if (this.regexp || this.regexp === false) return this.regexp // at this point, this.set is a 2d array of partial // pattern strings, or "**". // // It's better to use .match(). This function shouldn't // be used, really, but it's pretty convenient sometimes, // when you just want to work with a regex. var set = this.set if (!set.length) { this.regexp = false return this.regexp } var options = this.options var twoStar = options.noglobstar ? star : options.dot ? twoStarDot : twoStarNoDot var flags = options.nocase ? 'i' : '' var re = set.map(function (pattern) { return pattern.map(function (p) { return (p === GLOBSTAR) ? twoStar : (typeof p === 'string') ? regExpEscape(p) : p._src }).join('\\\/') }).join('|') // must match entire pattern // ending in a * or ** will make it less strict. re = '^(?:' + re + ')$' // can match anything, as long as it's not this. if (this.negate) re = '^(?!' + re + ').*$' try { this.regexp = new RegExp(re, flags) } catch (ex) { this.regexp = false } return this.regexp } minimatch.match = function (list, pattern, options) { options = options || {} var mm = new Minimatch(pattern, options) list = list.filter(function (f) { return mm.match(f) }) if (mm.options.nonull && !list.length) { list.push(pattern) } return list } Minimatch.prototype.match = match function match (f, partial) { this.debug('match', f, this.pattern) // short-circuit in the case of busted things. // comments, etc. if (this.comment) return false if (this.empty) return f === '' if (f === '/' && partial) return true var options = this.options // windows: need to use /, not \ if (path.sep !== '/') { f = f.split(path.sep).join('/') } // treat the test path as a set of pathparts. f = f.split(slashSplit) this.debug(this.pattern, 'split', f) // just ONE of the pattern sets in this.set needs to match // in order for it to be valid. If negating, then just one // match means that we have failed. // Either way, return on the first hit. 
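// For instance, "{a,b}/c" brace-expands to two pattern sets; a path that matches either one
// is an overall hit (or, for a negated pattern, an overall miss).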
var set = this.set this.debug(this.pattern, 'set', set) // Find the basename of the path by looking for the last non-empty segment var filename var i for (i = f.length - 1; i >= 0; i--) { filename = f[i] if (filename) break } for (i = 0; i < set.length; i++) { var pattern = set[i] var file = f if (options.matchBase && pattern.length === 1) { file = [filename] } var hit = this.matchOne(file, pattern, partial) if (hit) { if (options.flipNegate) return true return !this.negate } } // didn't get any hits. this is success if it's a negative // pattern, failure otherwise. if (options.flipNegate) return false return this.negate } // set partial to true to test if, for example, // "/a/b" matches the start of "/*/b/*/d" // Partial means, if you run out of file before you run // out of pattern, then that's fine, as long as all // the parts match. Minimatch.prototype.matchOne = function (file, pattern, partial) { var options = this.options this.debug('matchOne', { 'this': this, file: file, pattern: pattern }) this.debug('matchOne', file.length, pattern.length) for (var fi = 0, pi = 0, fl = file.length, pl = pattern.length ; (fi < fl) && (pi < pl) ; fi++, pi++) { this.debug('matchOne loop') var p = pattern[pi] var f = file[fi] this.debug(pattern, p, f) // should be impossible. // some invalid regexp stuff in the set. if (p === false) return false if (p === GLOBSTAR) { this.debug('GLOBSTAR', [pattern, p, f]) // "**" // a/**/b/**/c would match the following: // a/b/x/y/z/c // a/x/y/z/b/c // a/b/x/b/x/c // a/b/c // To do this, take the rest of the pattern after // the **, and see if it would match the file remainder. // If so, return success. // If not, the ** "swallows" a segment, and try again. // This is recursively awful. // // a/**/b/**/c matching a/b/x/y/z/c // - a matches a // - doublestar // - matchOne(b/x/y/z/c, b/**/c) // - b matches b // - doublestar // - matchOne(x/y/z/c, c) -> no // - matchOne(y/z/c, c) -> no // - matchOne(z/c, c) -> no // - matchOne(c, c) yes, hit var fr = fi var pr = pi + 1 if (pr === pl) { this.debug('** at the end') // a ** at the end will just swallow the rest. // We have found a match. // however, it will not swallow /.x, unless // options.dot is set. // . and .. are *never* matched by **, for explosively // exponential reasons. for (; fi < fl; fi++) { if (file[fi] === '.' || file[fi] === '..' || (!options.dot && file[fi].charAt(0) === '.')) return false } return true } // ok, let's see if we can swallow whatever we can. while (fr < fl) { var swallowee = file[fr] this.debug('\nglobstar while', file, fr, pattern, pr, swallowee) // XXX remove this slice. Just pass the start index. if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) { this.debug('globstar found match!', fr, fl, swallowee) // found a match. return true } else { // can't swallow "." or ".." ever. // can only swallow ".foo" when explicitly asked. if (swallowee === '.' || swallowee === '..' || (!options.dot && swallowee.charAt(0) === '.')) { this.debug('dot detected!', file, fr, pattern, pr) break } // ** swallows a segment, and continue. this.debug('globstar swallow a segment, and continue') fr++ } } // no match was found. // However, in partial mode, we can't say this is necessarily over. // If there's more *pattern* left, then if (partial) { // ran out of file this.debug('\n>>> no match, partial?', file, fr, pattern, pr) if (fr === fl) return true } return false } // something other than ** // non-magic patterns just have to match exactly // patterns with magic have been turned into regexps. 
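// For instance, a plain portion like "lib" is still a string here and is compared directly
// (case-folded if options.nocase is set), while a magic portion like "*.js" was compiled
// to a regexp by parse().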
var hit if (typeof p === 'string') { if (options.nocase) { hit = f.toLowerCase() === p.toLowerCase() } else { hit = f === p } this.debug('string match', p, f, hit) } else { hit = f.match(p) this.debug('pattern match', p, f, hit) } if (!hit) return false } // Note: ending in / means that we'll get a final "" // at the end of the pattern. This can only match a // corresponding "" at the end of the file. // If the file ends in /, then it can only match a // a pattern that ends in /, unless the pattern just // doesn't have any more for it. But, a/b/ should *not* // match "a/b/*", even though "" matches against the // [^/]*? pattern, except in partial mode, where it might // simply not be reached yet. // However, a/b/ should still satisfy a/* // now either we fell off the end of the pattern, or we're done. if (fi === fl && pi === pl) { // ran out of pattern and filename at the same time. // an exact hit! return true } else if (fi === fl) { // ran out of file, but still had pattern left. // this is ok if we're doing the match as part of // a glob fs traversal. return partial } else if (pi === pl) { // ran out of pattern, still have file left. // this is only acceptable if we're on the very last // empty segment of a file with a trailing slash. // a/* should match a/b/ var emptyFileEnd = (fi === fl - 1) && (file[fi] === '') return emptyFileEnd } // should be unreachable. throw new Error('wtf?') } // replace stuff like \* with * function globUnescape (s) { return s.replace(/\\(.)/g, '$1') } function regExpEscape (s) { return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, '\\$&') } ././@LongLink0000000000000000000000000000015100000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000755000000000000000000000000012631326456032317 5ustar 00000000000000././@LongLink0000000000000000000000000000015000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/package.jsonnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/package.j0000644000000000000000000000267512631326456032205 0ustar 00000000000000{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, "name": "minimatch", "description": "a glob matcher in javascript", "version": "3.0.0", "repository": { "type": "git", "url": "git://github.com/isaacs/minimatch.git" }, "main": "minimatch.js", "scripts": { "posttest": "standard minimatch.js test/*.js", "test": "tap test/*.js" }, "engines": { "node": "*" }, "dependencies": { "brace-expansion": "^1.0.0" }, "devDependencies": { "standard": "^3.7.2", "tap": "^1.2.0" }, "license": "ISC", "files": [ "minimatch.js" ], "gitHead": "270dbea567f0af6918cb18103e98c612aa717a20", "bugs": { "url": "https://github.com/isaacs/minimatch/issues" }, "homepage": "https://github.com/isaacs/minimatch#readme", "_id": "minimatch@3.0.0", "_shasum": "5236157a51e4f004c177fb3c527ff7dd78f0ef83", "_from": "minimatch@>=3.0.0 <4.0.0", "_npmVersion": "3.3.2", "_nodeVersion": "4.0.0", "_npmUser": { "name": "isaacs", "email": "isaacs@npmjs.com" }, "dist": { "shasum": "5236157a51e4f004c177fb3c527ff7dd78f0ef83", "tarball": "http://registry.npmjs.org/minimatch/-/minimatch-3.0.0.tgz" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.0.0.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000755000000000000000000000000012631326456032317 5ustar 00000000000000././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/.npmignorenpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000003412631326456032316 0ustar 00000000000000test .gitignore .travis.yml ././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/README.mdnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000673112631326456032330 0ustar 00000000000000# brace-expansion [Brace expansion](https://www.gnu.org/software/bash/manual/html_node/Brace-Expansion.html), as known from sh/bash, in JavaScript. 
[![build status](https://secure.travis-ci.org/juliangruber/brace-expansion.svg)](http://travis-ci.org/juliangruber/brace-expansion) [![downloads](https://img.shields.io/npm/dm/brace-expansion.svg)](https://www.npmjs.org/package/brace-expansion) [![testling badge](https://ci.testling.com/juliangruber/brace-expansion.png)](https://ci.testling.com/juliangruber/brace-expansion) ## Example ```js var expand = require('brace-expansion'); expand('file-{a,b,c}.jpg') // => ['file-a.jpg', 'file-b.jpg', 'file-c.jpg'] expand('-v{,,}') // => ['-v', '-v', '-v'] expand('file{0..2}.jpg') // => ['file0.jpg', 'file1.jpg', 'file2.jpg'] expand('file-{a..c}.jpg') // => ['file-a.jpg', 'file-b.jpg', 'file-c.jpg'] expand('file{2..0}.jpg') // => ['file2.jpg', 'file1.jpg', 'file0.jpg'] expand('file{0..4..2}.jpg') // => ['file0.jpg', 'file2.jpg', 'file4.jpg'] expand('file-{a..e..2}.jpg') // => ['file-a.jpg', 'file-c.jpg', 'file-e.jpg'] expand('file{00..10..5}.jpg') // => ['file00.jpg', 'file05.jpg', 'file10.jpg'] expand('{{A..C},{a..c}}') // => ['A', 'B', 'C', 'a', 'b', 'c'] expand('ppp{,config,oe{,conf}}') // => ['ppp', 'pppconfig', 'pppoe', 'pppoeconf'] ``` ## API ```js var expand = require('brace-expansion'); ``` ### var expanded = expand(str) Return an array of all possible and valid expansions of `str`. If none are found, `[str]` is returned. Valid expansions are: ```js /^(.*,)+(.+)?$/ // {a,b,...} ``` A comma separated list of options, like `{a,b}` or `{a,{b,c}}` or `{,a,}`. ```js /^-?\d+\.\.-?\d+(\.\.-?\d+)?$/ // {x..y[..incr]} ``` A numeric sequence from `x` to `y` inclusive, with optional increment. If `x` or `y` start with a leading `0`, all the numbers will be padded to have equal length. Negative numbers and backwards iteration work too. ```js /^[a-zA-Z]\.\.[a-zA-Z](\.\.-?\d+)?$/ // {x..y[..incr]} ``` An alphabetic sequence from `x` to `y` inclusive, with optional increment. `x` and `y` must be exactly one character, and if given, `incr` must be a number. For compatibility reasons, the string `${` is not eligible for brace expansion. ## Installation With [npm](https://npmjs.org) do: ```bash npm install brace-expansion ``` ## Contributors - [Julian Gruber](https://github.com/juliangruber) - [Isaac Z. Schlueter](https://github.com/isaacs) ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/example.jsnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000061612631326456032324 0ustar 00000000000000var expand = require('./'); console.log(expand('http://any.org/archive{1996..1999}/vol{1..4}/part{a,b,c}.html')); console.log(expand('http://www.numericals.com/file{1..100..10}.txt')); console.log(expand('http://www.letters.com/file{a..z..2}.txt')); console.log(expand('mkdir /usr/local/src/bash/{old,new,dist,bugs}')); console.log(expand('chown root /usr/{ucb/{ex,edit},lib/{ex?.?*,how_ex}}')); ././@LongLink0000000000000000000000000000020100000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/index.jsnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000001035712631326456032327 0ustar 00000000000000var concatMap = require('concat-map'); var balanced = require('balanced-match'); module.exports = expandTop; var escSlash = '\0SLASH'+Math.random()+'\0'; var escOpen = '\0OPEN'+Math.random()+'\0'; var escClose = '\0CLOSE'+Math.random()+'\0'; var escComma = '\0COMMA'+Math.random()+'\0'; var escPeriod = '\0PERIOD'+Math.random()+'\0'; function numeric(str) { return parseInt(str, 10) == str ? parseInt(str, 10) : str.charCodeAt(0); } function escapeBraces(str) { return str.split('\\\\').join(escSlash) .split('\\{').join(escOpen) .split('\\}').join(escClose) .split('\\,').join(escComma) .split('\\.').join(escPeriod); } function unescapeBraces(str) { return str.split(escSlash).join('\\') .split(escOpen).join('{') .split(escClose).join('}') .split(escComma).join(',') .split(escPeriod).join('.'); } // Basically just str.split(","), but handling cases // where we have nested braced sections, which should be // treated as individual members, like {a,{b,c},d} function parseCommaParts(str) { if (!str) return ['']; var parts = []; var m = balanced('{', '}', str); if (!m) return str.split(','); var pre = m.pre; var body = m.body; var post = m.post; var p = pre.split(','); p[p.length-1] += '{' + body + '}'; var postParts = parseCommaParts(post); if (post.length) { p[p.length-1] += postParts.shift(); p.push.apply(p, postParts); } parts.push.apply(parts, p); return parts; } function expandTop(str) { if (!str) return []; return expand(escapeBraces(str), true).map(unescapeBraces); } function identity(e) { return e; } function embrace(str) { return '{' + str + '}'; } function isPadded(el) { return /^-?0\d/.test(el); } function lte(i, y) { return i <= y; } function gte(i, y) { return i >= y; } function expand(str, isTop) { var expansions = []; var m = balanced('{', '}', str); if (!m || /\$$/.test(m.pre)) return [str]; var isNumericSequence = /^-?\d+\.\.-?\d+(?:\.\.-?\d+)?$/.test(m.body); var isAlphaSequence = /^[a-zA-Z]\.\.[a-zA-Z](?:\.\.-?\d+)?$/.test(m.body); var isSequence = isNumericSequence || isAlphaSequence; var isOptions = /^(.*,)+(.+)?$/.test(m.body); if (!isSequence && !isOptions) { // {a},b} if (m.post.match(/,.*}/)) { str = m.pre + '{' + m.body + escClose + m.post; return expand(str); } return [str]; } var n; if (isSequence) { n = m.body.split(/\.\./); } else { n = parseCommaParts(m.body); if (n.length === 1) { // x{{a,b}}y ==> x{a}y x{b}y n = expand(n[0], 
false).map(embrace); if (n.length === 1) { var post = m.post.length ? expand(m.post, false) : ['']; return post.map(function(p) { return m.pre + n[0] + p; }); } } } // at this point, n is the parts, and we know it's not a comma set // with a single entry. // no need to expand pre, since it is guaranteed to be free of brace-sets var pre = m.pre; var post = m.post.length ? expand(m.post, false) : ['']; var N; if (isSequence) { var x = numeric(n[0]); var y = numeric(n[1]); var width = Math.max(n[0].length, n[1].length) var incr = n.length == 3 ? Math.abs(numeric(n[2])) : 1; var test = lte; var reverse = y < x; if (reverse) { incr *= -1; test = gte; } var pad = n.some(isPadded); N = []; for (var i = x; test(i, y); i += incr) { var c; if (isAlphaSequence) { c = String.fromCharCode(i); if (c === '\\') c = ''; } else { c = String(i); if (pad) { var need = width - c.length; if (need > 0) { var z = new Array(need + 1).join('0'); if (i < 0) c = '-' + z + c.slice(1); else c = z + c; } } } N.push(c); } } else { N = concatMap(n, function(el) { return expand(el, false) }); } for (var j = 0; j < N.length; j++) { for (var k = 0; k < post.length; k++) { var expansion = pre + N[j] + post[k]; if (!isTop || isSequence || expansion) expansions.push(expansion); } } return expansions; } ././@LongLink0000000000000000000000000000020600000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000755000000000000000000000000012631326456032317 5ustar 00000000000000././@LongLink0000000000000000000000000000020500000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/package.jsonnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000365112631326456032326 0ustar 00000000000000{ "name": "brace-expansion", "description": "Brace expansion as known from sh/bash", "version": "1.1.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/brace-expansion.git" }, "homepage": "https://github.com/juliangruber/brace-expansion", "main": "index.js", "scripts": { "test": "tape test/*.js", "gentest": "bash test/generate.sh" }, "dependencies": { "balanced-match": "^0.2.0", "concat-map": "0.0.1" }, "devDependencies": { "tape": "^3.0.3" }, "keywords": [], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "testling": { "files": "test/*.js", "browsers": [ "ie/8..latest", "firefox/20..latest", "firefox/nightly", "chrome/25..latest", "chrome/canary", "opera/12..latest", "opera/next", "safari/5.1..latest", "ipad/6.0..latest", "iphone/6.0..latest", "android-browser/4.2..latest" ] }, "gitHead": "f50da498166d76ea570cf3b30179f01f0f119612", "bugs": { "url": "https://github.com/juliangruber/brace-expansion/issues" }, "_id": "brace-expansion@1.1.1", "_shasum": "da5fb78aef4c44c9e4acf525064fb3208ebab045", "_from": "brace-expansion@>=1.0.0 <2.0.0", "_npmVersion": "2.6.1", "_nodeVersion": "0.10.36", "_npmUser": { "name": "juliangruber", "email": "julian@juliangruber.com" }, "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" }, { "name": "isaacs", "email": "isaacs@npmjs.com" } ], "dist": { "shasum": "da5fb78aef4c44c9e4acf525064fb3208ebab045", "tarball": 
"http://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000022500000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000755000000000000000000000000012631326456032317 5ustar 00000000000000././@LongLink0000000000000000000000000000022100000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000755000000000000000000000000012631326456032317 5ustar 00000000000000././@LongLink0000000000000000000000000000023700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/.npmignorenpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000002712631326456032320 0ustar 00000000000000node_modules .DS_Store ././@LongLink0000000000000000000000000000024000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/.travis.ymlnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000004612631326456032321 0ustar 00000000000000language: node_js node_js: - "0.10" ././@LongLink0000000000000000000000000000023700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/LICENSE.mdnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000211012631326456032313 0ustar 00000000000000(MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/Makefilenpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000007112631326456032317 0ustar 00000000000000 test: @node_modules/.bin/tape test/*.js .PHONY: test ././@LongLink0000000000000000000000000000023600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/README.mdnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000517312631326456032327 0ustar 00000000000000# balanced-match Match balanced string pairs, like `{` and `}` or `<b>` and `</b>`. [![build status](https://secure.travis-ci.org/juliangruber/balanced-match.svg)](http://travis-ci.org/juliangruber/balanced-match) [![downloads](https://img.shields.io/npm/dm/balanced-match.svg)](https://www.npmjs.org/package/balanced-match) [![testling badge](https://ci.testling.com/juliangruber/balanced-match.png)](https://ci.testling.com/juliangruber/balanced-match) ## Example Get the first matching pair of braces: ```js var balanced = require('balanced-match'); console.log(balanced('{', '}', 'pre{in{nested}}post')); console.log(balanced('{', '}', 'pre{first}between{second}post')); ``` The matches are: ```bash $ node example.js { start: 3, end: 14, pre: 'pre', body: 'in{nested}', post: 'post' } { start: 3, end: 9, pre: 'pre', body: 'first', post: 'between{second}post' } ``` ## API ### var m = balanced(a, b, str) For the first non-nested matching pair of `a` and `b` in `str`, return an object with those keys: * **start** the index of the first match of `a` * **end** the index of the matching `b` * **pre** the preamble, `a` and `b` not included * **body** the match, `a` and `b` not included * **post** the postscript, `a` and `b` not included If there's no match, `undefined` will be returned. If the `str` contains more `a` than `b` / there are unmatched pairs, the first match that was closed will be used. For example, `{{a}` will match `['{', 'a', '']`. ## Installation With [npm](https://npmjs.org) do: ```bash npm install balanced-match ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
././@LongLink0000000000000000000000000000023700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/example.jsnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000023212631326456032316 0ustar 00000000000000var balanced = require('./'); console.log(balanced('{', '}', 'pre{in{nested}}post')); console.log(balanced('{', '}', 'pre{first}between{second}post')); ././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/index.jsnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000160612631326456032324 0ustar 00000000000000module.exports = balanced; function balanced(a, b, str) { var bal = 0; var m = {}; var ended = false; for (var i = 0; i < str.length; i++) { if (a == str.substr(i, a.length)) { if (!('start' in m)) m.start = i; bal++; } else if (b == str.substr(i, b.length) && 'start' in m) { ended = true; bal--; if (!bal) { m.end = i; m.pre = str.substr(0, m.start); m.body = (m.end - m.start > 1) ? str.substring(m.start + a.length, m.end) : ''; m.post = str.slice(m.end + b.length); return m; } } } // if we opened more than we closed, find the one we closed if (bal && ended) { var start = m.start + a.length; m = balanced(a, b, str.substr(start)); if (m) { m.start += start; m.end += start; m.pre = str.slice(0, start) + m.pre; } return m; } } ././@LongLink0000000000000000000000000000024100000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/package.jsonnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000350212631326456032321 0ustar 00000000000000{ "name": "balanced-match", "description": "Match balanced character pairs, like \"{\" and \"}\"", "version": "0.2.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/balanced-match.git" }, "homepage": "https://github.com/juliangruber/balanced-match", "main": "index.js", "scripts": { "test": "make test" }, "dependencies": {}, "devDependencies": { "tape": "~1.1.1" }, "keywords": [ "match", "regexp", "test", "balanced", "parse" ], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "testling": { "files": "test/*.js", "browsers": [ "ie/8..latest", "firefox/20..latest", "firefox/nightly", "chrome/25..latest", "chrome/canary", "opera/12..latest", "opera/next", "safari/5.1..latest", "ipad/6.0..latest", "iphone/6.0..latest", "android-browser/4.2..latest" ] }, "gitHead": "d743dd31d7376e0fcf99392a4be7227f2e99bf5d", "bugs": { "url": "https://github.com/juliangruber/balanced-match/issues" }, "_id": "balanced-match@0.2.1", "_shasum": "7bc658b4bed61eee424ad74f75f5c3e2c4df3cc7", "_from": "balanced-match@>=0.2.0 <0.3.0", "_npmVersion": "2.14.7", "_nodeVersion": "4.2.1", "_npmUser": { "name": "juliangruber", "email": "julian@juliangruber.com" }, "dist": { "shasum": "7bc658b4bed61eee424ad74f75f5c3e2c4df3cc7", "tarball": "http://registry.npmjs.org/balanced-match/-/balanced-match-0.2.1.tgz" }, "maintainers": [ { "name": 
"juliangruber", "email": "julian@juliangruber.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-0.2.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000023200000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/test/npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000755000000000000000000000000012631326456032317 5ustar 00000000000000././@LongLink0000000000000000000000000000024500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/test/balanced.jsnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000233312631326456032322 0ustar 00000000000000var test = require('tape'); var balanced = require('..'); test('balanced', function(t) { t.deepEqual(balanced('{', '}', 'pre{in{nest}}post'), { start: 3, end: 12, pre: 'pre', body: 'in{nest}', post: 'post' }); t.deepEqual(balanced('{', '}', '{{{{{{{{{in}post'), { start: 8, end: 11, pre: '{{{{{{{{', body: 'in', post: 'post' }); t.deepEqual(balanced('{', '}', 'pre{body{in}post'), { start: 8, end: 11, pre: 'pre{body', body: 'in', post: 'post' }); t.deepEqual(balanced('{', '}', 'pre}{in{nest}}post'), { start: 4, end: 13, pre: 'pre}', body: 'in{nest}', post: 'post' }); t.deepEqual(balanced('{', '}', 'pre{body}between{body2}post'), { start: 3, end: 8, pre: 'pre', body: 'body', post: 'between{body2}post' }); t.notOk(balanced('{', '}', 'nope'), 'should be notOk'); t.deepEqual(balanced('', '', 'preinnestpost'), { start: 3, end: 19, pre: 'pre', body: 'innest', post: 'post' }); t.deepEqual(balanced('', '', 'pre
      innestpost'), { start: 7, end: 23, pre: 'pre', body: 'innest', post: 'post' }); t.end(); }); ././@LongLink0000000000000000000000000000023400000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/.travis.ymlnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000005312631326456032317 0ustar 00000000000000language: node_js node_js: - 0.4 - 0.6 ././@LongLink0000000000000000000000000000023000000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/LICENSEnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000206112631326456032320 0ustar 00000000000000This software is released under the MIT license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000024000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/README.markdownnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000221512631326456032321 0ustar 00000000000000concat-map ========== Concatenative mapdashery. [![browser support](http://ci.testling.com/substack/node-concat-map.png)](http://ci.testling.com/substack/node-concat-map) [![build status](https://secure.travis-ci.org/substack/node-concat-map.png)](http://travis-ci.org/substack/node-concat-map) example ======= ``` js var concatMap = require('concat-map'); var xs = [ 1, 2, 3, 4, 5, 6 ]; var ys = concatMap(xs, function (x) { return x % 2 ? [ x - 0.1, x, x + 0.1 ] : []; }); console.dir(ys); ``` *** ``` [ 0.9, 1, 1.1, 2.9, 3, 3.1, 4.9, 5, 5.1 ] ``` methods ======= ``` js var concatMap = require('concat-map') ``` concatMap(xs, fn) ----------------- Return an array of concatenated elements by calling `fn(x, i)` for each element `x` and each index `i` in the array `xs`. When `fn(x, i)` returns an array, its result will be concatenated with the result array. If `fn(x, i)` returns anything else, that value will be pushed onto the end of the result array. 
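A small added sketch of the scalar case (not one of the original examples): when `fn` returns a plain value instead of an array, it is appended to the result as-is.

``` js
var concatMap = require('concat-map');

// odd numbers expand to a pair; even numbers pass through as scalars
var ys = concatMap([ 1, 2, 3 ], function (x) {
    return x % 2 ? [ x, x ] : x;
});
console.dir(ys);
// => [ 1, 1, 2, 3, 3 ]
```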
install ======= With [npm](http://npmjs.org) do: ``` npm install concat-map ``` license ======= MIT notes ===== This module was written while sitting high above the ground in a tree. ././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/example/npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000755000000000000000000000000012631326456032317 5ustar 00000000000000././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/index.jsnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000053112631326456032320 0ustar 00000000000000module.exports = function (xs, fn) { var res = []; for (var i = 0; i < xs.length; i++) { var x = fn(xs[i], i); if (isArray(x)) res.push.apply(res, x); else res.push(x); } return res; }; var isArray = Array.isArray || function (xs) { return Object.prototype.toString.call(xs) === '[object Array]'; }; ././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/package.jsonnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000321512631326456032322 0ustar 00000000000000{ "name": "concat-map", "description": "concatenative mapdashery", "version": "0.0.1", "repository": { "type": "git", "url": "git://github.com/substack/node-concat-map.git" }, "main": "index.js", "keywords": [ "concat", "concatMap", "map", "functional", "higher-order" ], "directories": { "example": "example", "test": "test" }, "scripts": { "test": "tape test/*.js" }, "devDependencies": { "tape": "~2.4.0" }, "license": "MIT", "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "testling": { "files": "test/*.js", "browsers": { "ie": [ 6, 7, 8, 9 ], "ff": [ 3.5, 10, 15 ], "chrome": [ 10, 22 ], "safari": [ 5.1 ], "opera": [ 12 ] } }, "bugs": { "url": "https://github.com/substack/node-concat-map/issues" }, "homepage": "https://github.com/substack/node-concat-map", "_id": "concat-map@0.0.1", "dist": { "shasum": "d8a96bd77fd68df7793a73036a3ba0d5405d477b", "tarball": "http://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz" }, "_from": "concat-map@0.0.1", "_npmVersion": "1.3.21", "_npmUser": { "name": "substack", "email": "mail@substack.net" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "_shasum": "d8a96bd77fd68df7793a73036a3ba0d5405d477b", "_resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/test/npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000755000000000000000000000000012631326456032317 5ustar 00000000000000././@LongLink0000000000000000000000000000023700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/example/map.jsnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000025312631326456032321 0ustar 00000000000000var concatMap = require('../'); var xs = [ 1, 2, 3, 4, 5, 6 ]; var ys = concatMap(xs, function (x) { return x % 2 ? [ x - 0.1, x, x + 0.1 ] : []; }); console.dir(ys); ././@LongLink0000000000000000000000000000023400000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/test/map.jsnpm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/node_modules/minimatch/node_modu0000644000000000000000000000206312631326456032322 0ustar 00000000000000var concatMap = require('../'); var test = require('tape'); test('empty or not', function (t) { var xs = [ 1, 2, 3, 4, 5, 6 ]; var ixes = []; var ys = concatMap(xs, function (x, ix) { ixes.push(ix); return x % 2 ? [ x - 0.1, x, x + 0.1 ] : []; }); t.same(ys, [ 0.9, 1, 1.1, 2.9, 3, 3.1, 4.9, 5, 5.1 ]); t.same(ixes, [ 0, 1, 2, 3, 4, 5 ]); t.end(); }); test('always something', function (t) { var xs = [ 'a', 'b', 'c', 'd' ]; var ys = concatMap(xs, function (x) { return x === 'b' ? [ 'B', 'B', 'B' ] : [ x ]; }); t.same(ys, [ 'a', 'B', 'B', 'B', 'c', 'd' ]); t.end(); }); test('scalars', function (t) { var xs = [ 'a', 'b', 'c', 'd' ]; var ys = concatMap(xs, function (x) { return x === 'b' ? [ 'B', 'B', 'B' ] : x; }); t.same(ys, [ 'a', 'B', 'B', 'B', 'c', 'd' ]); t.end(); }); test('undefs', function (t) { var xs = [ 'a', 'b', 'c', 'd' ]; var ys = concatMap(xs, function () {}); t.same(ys, [ undefined, undefined, undefined, undefined ]); t.end(); }); npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/.ignore0000644000000000000000000000002212631326456026234 0ustar 00000000000000.gitignore .*.swp npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/.npmignore0000644000000000000000000000000412631326456026747 0ustar 00000000000000*/a npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/00-setup.js0000644000000000000000000000337512631326456026701 0ustar 00000000000000// The test fixtures work like this: // These dirs are all created: {a,b,c}/{a,b,c}/{a,b,c}/ // in each one, these files are created: // {.,}{a,b,c}{a,b,c}{a,b,c} // // So, there'll be a/b/c/abc, a/b/c/aba, etc., and dot-versions of each. // // Each test then writes their own ignore file rules for their purposes, // and is responsible for removing them afterwards. 
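// For instance, fixtures/a/b/c ends up holding abc, .abc, aba, .aba, and so on
// (54 files in each of the 27 generated directories).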
var mkdirp = require("mkdirp") var path = require("path") var i = 0 var tap = require("tap") var fs = require("fs") var rimraf = require("rimraf") var fixtures = path.resolve(__dirname, "fixtures") var chars = ['a', 'b', 'c'] var dirs = [] for (var i = 0; i < 3; i ++) { for (var j = 0; j < 3; j ++) { for (var k = 0; k < 3; k ++) { dirs.push(chars[i] + '/' + chars[j] + '/' + chars[k]) } } } var files = [] for (var i = 0; i < 3; i ++) { for (var j = 0; j < 3; j ++) { for (var k = 0; k < 3; k ++) { files.push(chars[i] + chars[j] + chars[k]) files.push('.' + chars[i] + chars[j] + chars[k]) } } } tap.test("remove fixtures", function (t) { rimraf(path.resolve(__dirname, "fixtures"), function (er) { t.ifError(er, "remove fixtures") t.end() }) }) tap.test("create fixtures", function (t) { dirs.forEach(function (dir) { dir = path.resolve(fixtures, dir) t.test("mkdir "+dir, function (t) { mkdirp(dir, function (er) { t.ifError(er, "mkdir "+dir) if (er) return t.end() files.forEach(function (file) { file = path.resolve(dir, file) t.test("writeFile "+file, function (t) { fs.writeFile(file, path.basename(file), function (er) { t.ifError(er, "writing "+file) t.end() }) }) }) t.end() }) }) }) t.end() }) npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/basic.js0000644000000000000000000000127712631326456026404 0ustar 00000000000000var IgnoreFile = require("../") // set the ignores just for this test var c = require("./common.js") c.ignores({ "a/.basic-ignore": ["b/", "aca"] }) // the files that we expect to not see var notAllowed = [ /^\/a\/b\/.*/ , /^\/a\/.*\/aca$/ ] require("tap").test("basic ignore rules", function (t) { t.pass("start") IgnoreFile({ path: __dirname + "/fixtures" , ignoreFiles: [".basic-ignore"] }) .on("ignoreFile", function (e) { console.error("ignore file!", e) }) .on("child", function (e) { var p = e.path.substr(e.root.path.length) notAllowed.forEach(function (na) { t.dissimilar(p, na) }) }) .on("close", t.end.bind(t)) }) npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/common.js0000644000000000000000000000157712631326456026616 0ustar 00000000000000if (require.main === module) { console.log("1..1") console.log("ok 1 trivial pass") return } var fs = require("fs") var path = require("path") var rimraf = require("rimraf") exports.ignores = ignores exports.writeIgnoreFile = writeIgnoreFile exports.writeIgnores = writeIgnores exports.clearIgnores = clearIgnores function writeIgnoreFile (file, rules) { file = path.resolve(__dirname, "fixtures", file) if (Array.isArray(rules)) { rules = rules.join("\n") } fs.writeFileSync(file, rules) console.error(file, rules) } function writeIgnores (set) { Object.keys(set).forEach(function (f) { writeIgnoreFile(f, set[f]) }) } function clearIgnores (set) { Object.keys(set).forEach(function (file) { fs.unlinkSync(path.resolve(__dirname, "fixtures", file)) }) } function ignores (set) { writeIgnores(set) process.on("exit", clearIgnores.bind(null, set)) } npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/ignore-most.js0000644000000000000000000000172012631326456027557 0ustar 00000000000000// ignore most things var IgnoreFile = require("../") // set the ignores just for this test var c = require("./common.js") c.ignores({ ".ignore": ["*", "!a/b/c/.abc", "!/c/b/a/cba"] }) // the only files we expect to see var expected = [ "/a/b/c/.abc" , "/a" , "/a/b" , "/a/b/c" , "/c/b/a/cba" , "/c" , "/c/b" , "/c/b/a" ] require("tap").test("basic ignore rules", function (t) { t.pass("start") IgnoreFile({ 
path: __dirname + "/fixtures" , ignoreFiles: [".ignore"] }) .on("ignoreFile", function (e) { console.error("ignore file!", e) }) .on("child", function (e) { var p = e.path.substr(e.root.path.length) var i = expected.indexOf(p) if (i === -1) { t.fail("unexpected file found", {file: p}) } else { t.pass(p) expected.splice(i, 1) } }) .on("close", function () { t.notOk(expected.length, "all expected files should be seen") t.end() }) }) npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/nested-ignores.js0000644000000000000000000000242112631326456030241 0ustar 00000000000000// ignore most things var IgnoreFile = require("../") // set the ignores just for this test var c = require("./common.js") c.ignores( { ".ignore": ["*", "a", "c", "!a/b/c/.abc", "!/c/b/a/cba"] , "a/.ignore": [ "!*", ".ignore" ] // unignore everything , "a/a/.ignore": [ "*" ] // re-ignore everything , "a/b/.ignore": [ "*", "!/c/.abc" ] // original unignore , "a/c/.ignore": [ "*" ] // ignore everything again , "c/b/a/.ignore": [ "!cba", "!.cba", "!/a{bc,cb}" ] }) // the only files we expect to see var expected = [ "/a" , "/a/a" , "/a/b" , "/a/b/c" , "/a/b/c/.abc" , "/a/c" , "/c" , "/c/b" , "/c/b/a" , "/c/b/a/cba" , "/c/b/a/.cba" , "/c/b/a/abc" , "/c/b/a/acb" ] require("tap").test("basic ignore rules", function (t) { t.pass("start") IgnoreFile({ path: __dirname + "/fixtures" , ignoreFiles: [".ignore"] }) .on("child", function (e) { var p = e.path.substr(e.root.path.length) var i = expected.indexOf(p) if (i === -1) { console.log("not ok "+p) t.fail("unexpected file found", {found: p}) } else { t.pass(p) expected.splice(i, 1) } }) .on("close", function () { t.deepEqual(expected, [], "all expected files should be seen") t.end() }) }) npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/read-file-order.js0000644000000000000000000000376712631326456030272 0ustar 00000000000000var IgnoreFile = require("../") , fs = require('fs') // set the ignores just for this test var c = require("./common.js") c.ignores({ ".gitignore": ["a/b/c/abc"] }) c.ignores({ ".ignore": ["*", "!a/b/c/abc"] }) // the only files we expect to see var expected = [ "/a" , "/a/b" , "/a/b/c" , "/a/b/c/abc" ] var originalReadFile = fs.readFile , parallelCount = 0 , firstCall // Overwrite fs.readFile so that when .gitignore and .ignore are read in // parallel, .ignore will always be read first. 
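// How this works: whichever of the two parallel reads reaches its nextTick first is
// stashed in firstCall; when its partner arrives, the two are sequenced so that the
// .ignore data is always delivered before the .gitignore data.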
fs.readFile = function (filename, options, callback) { if (typeof options === 'function') { callback = options options = false } parallelCount++ process.nextTick(function () { if (parallelCount > 1) { if (!firstCall) { return firstCall = function (cb) { originalReadFile(filename, options, function (err, data) { callback(err, data) if (cb) cb() }) } } if (filename.indexOf('.gitignore') !== -1) { firstCall(function () { originalReadFile(filename, options, callback) }) } else { originalReadFile(filename, options, function (err, data) { callback(err, data) firstCall() }) } } else { originalReadFile(filename, options, callback) parallelCount = 0 } }) } require("tap").test("read file order", function (t) { t.pass("start") IgnoreFile({ path: __dirname + "/fixtures" , ignoreFiles: [".gitignore", ".ignore"] }) .on("ignoreFile", function (e) { console.error("ignore file!", e) }) .on("child", function (e) { var p = e.path.substr(e.root.path.length) var i = expected.indexOf(p) if (i === -1) { t.fail("unexpected file found", {f: p}) } else { t.pass(p) expected.splice(i, 1) } }) .on("close", function () { fs.readFile = originalReadFile t.notOk(expected.length, "all expected files should be seen") t.end() }) }) npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/unignore-child.js0000644000000000000000000000160412631326456030224 0ustar 00000000000000// ignore most things var IgnoreFile = require("../") // set the ignores just for this test var c = require("./common.js") c.ignores({ ".ignore": ["*", "a", "c", "!a/b/c/.abc", "!/c/b/a/cba"] }) // the only files we expect to see var expected = [ "/a/b/c/.abc" , "/a" , "/a/b" , "/a/b/c" , "/c/b/a/cba" , "/c" , "/c/b" , "/c/b/a" ] require("tap").test("basic ignore rules", function (t) { t.pass("start") IgnoreFile({ path: __dirname + "/fixtures" , ignoreFiles: [".ignore"] }) .on("child", function (e) { var p = e.path.substr(e.root.path.length) var i = expected.indexOf(p) if (i === -1) { t.fail("unexpected file found", {f: p}) } else { t.pass(p) expected.splice(i, 1) } }) .on("close", function () { t.notOk(expected.length, "all expected files should be seen") t.end() }) }) npm_3.5.2.orig/node_modules/fstream-npm/node_modules/fstream-ignore/test/zz-cleanup.js0000644000000000000000000000036212631326456027405 0ustar 00000000000000var tap = require("tap") , rimraf = require("rimraf") , path = require("path") tap.test("remove fixtures", function (t) { rimraf(path.resolve(__dirname, "fixtures"), function (er) { t.ifError(er, "remove fixtures") t.end() }) }) npm_3.5.2.orig/node_modules/fstream-npm/test/ignores.js0000644000000000000000000000544612631326456021374 0ustar 00000000000000var fs = require('graceful-fs') var join = require('path').join var mkdirp = require('mkdirp') var rimraf = require('rimraf') var test = require('tap').test var Packer = require('..') var pkg = join(__dirname, 'test-package') var elfJS = function () {/* module.exports = function () { console.log("i'm a elf") } */}.toString().split('\n').slice(1, -1).join() var json = { 'name': 'test-package', 'version': '3.1.4', 'main': 'elf.js' } test('setup', function (t) { setup() t.end() }) var included = [ 'package.json', 'elf.js', join('deps', 'foo', 'config', 'config.gypi') ] test('follows npm package ignoring rules', function (t) { var subject = new Packer({ path: pkg, type: 'Directory', isDirectory: true }) var filenames = [] subject.on('entry', function (entry) { t.equal(entry.type, 'File', 'only files in this package') // include relative path in filename var filename = 
entry._path.slice(entry.root._path.length + 1) filenames.push(filename) }) // need to do this so fstream doesn't explode when files are removed from // under it subject.on('end', function () { // ensure we get *exactly* the results we expect by comparing in both // directions filenames.forEach(function (filename) { t.ok( included.indexOf(filename) > -1, filename + ' is included' ) }) included.forEach(function (filename) { t.ok( filenames.indexOf(filename) > -1, filename + ' is not included' ) }) t.end() }) }) test('cleanup', function (t) { // rimraf.sync chokes here for some reason rimraf(pkg, function () { t.end() }) }) function setup () { rimraf.sync(pkg) mkdirp.sync(pkg) fs.writeFileSync( join(pkg, 'package.json'), JSON.stringify(json, null, 2) ) fs.writeFileSync( join(pkg, 'elf.js'), elfJS ) fs.writeFileSync( join(pkg, '.npmrc'), 'packaged=false' ) fs.writeFileSync( join(pkg, '.npmignore'), '.npmignore\ndummy\npackage.json' ) fs.writeFileSync( join(pkg, 'dummy'), 'foo' ) var buildDir = join(pkg, 'build') mkdirp.sync(buildDir) fs.writeFileSync( join(buildDir, 'config.gypi'), "i_wont_be_included_by_fstream='with any luck'" ) var depscfg = join(pkg, 'deps', 'foo', 'config') mkdirp.sync(depscfg) fs.writeFileSync( join(depscfg, 'config.gypi'), "i_will_be_included_by_fstream='with any luck'" ) fs.writeFileSync( join(buildDir, 'npm-debug.log'), '0 lol\n' ) var gitDir = join(pkg, '.git') mkdirp.sync(gitDir) fs.writeFileSync( join(gitDir, 'gitstub'), "won't fool git, also won't be included by fstream" ) var historyDir = join(pkg, 'node_modules/history') mkdirp.sync(historyDir) fs.writeFileSync( join(historyDir, 'README.md'), "please don't include me" ) } npm_3.5.2.orig/node_modules/glob/LICENSE0000644000000000000000000000137512631326456016105 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/glob/README.md0000644000000000000000000003612512631326456016360 0ustar 00000000000000[![Build Status](https://travis-ci.org/isaacs/node-glob.svg?branch=master)](https://travis-ci.org/isaacs/node-glob/) [![Dependency Status](https://david-dm.org/isaacs/node-glob.svg)](https://david-dm.org/isaacs/node-glob) [![devDependency Status](https://david-dm.org/isaacs/node-glob/dev-status.svg)](https://david-dm.org/isaacs/node-glob#info=devDependencies) [![optionalDependency Status](https://david-dm.org/isaacs/node-glob/optional-status.svg)](https://david-dm.org/isaacs/node-glob#info=optionalDependencies) # Glob Match files using the patterns the shell uses, like stars and stuff. This is a glob implementation in JavaScript. It uses the `minimatch` library to do its matching. ![](oh-my-glob.gif) ## Usage ```javascript var glob = require("glob") // options is optional glob("**/*.js", options, function (er, files) { // files is an array of filenames. 
// If the `nonull` option is set, and nothing // was found, then files is ["**/*.js"] // er is an error object or null. }) ``` ## Glob Primer "Globs" are the patterns you type when you do stuff like `ls *.js` on the command line, or put `build/*` in a `.gitignore` file. Before parsing the path part patterns, braced sections are expanded into a set. Braced sections start with `{` and end with `}`, with any number of comma-delimited sections within. Braced sections may contain slash characters, so `a{/b/c,bcd}` would expand into `a/b/c` and `abcd`. The following characters have special magic meaning when used in a path portion: * `*` Matches 0 or more characters in a single path portion * `?` Matches 1 character * `[...]` Matches a range of characters, similar to a RegExp range. If the first character of the range is `!` or `^` then it matches any character not in the range. * `!(pattern|pattern|pattern)` Matches anything that does not match any of the patterns provided. * `?(pattern|pattern|pattern)` Matches zero or one occurrence of the patterns provided. * `+(pattern|pattern|pattern)` Matches one or more occurrences of the patterns provided. * `*(a|b|c)` Matches zero or more occurrences of the patterns provided * `@(pattern|pat*|pat?erN)` Matches exactly one of the patterns provided * `**` If a "globstar" is alone in a path portion, then it matches zero or more directories and subdirectories searching for matches. It does not crawl symlinked directories. ### Dots If a file or directory path portion has a `.` as the first character, then it will not match any glob pattern unless that pattern's corresponding path part also has a `.` as its first character. For example, the pattern `a/.*/c` would match the file at `a/.b/c`. However the pattern `a/*/c` would not, because `*` does not start with a dot character. You can make glob treat dots as normal characters by setting `dot:true` in the options. ### Basename Matching If you set `matchBase:true` in the options, and the pattern has no slashes in it, then it will seek for any file anywhere in the tree with a matching basename. For example, `*.js` would match `test/simple/basic.js`. ### Negation The intent for negation would be for a pattern starting with `!` to match everything that *doesn't* match the supplied pattern. However, the implementation is weird, and for the time being, this should be avoided. The behavior is deprecated in version 5, and will be removed entirely in version 6. ### Empty Sets If no matching files are found, then an empty array is returned. This differs from the shell, where the pattern itself is returned. For example: $ echo a*s*d*f a*s*d*f To get the bash-style behavior, set the `nonull:true` in the options. ### See Also: * `man sh` * `man bash` (Search for "Pattern Matching") * `man 3 fnmatch` * `man 5 gitignore` * [minimatch documentation](https://github.com/isaacs/minimatch) ## glob.hasMagic(pattern, [options]) Returns `true` if there are any special characters in the pattern, and `false` otherwise. Note that the options affect the results. If `noext:true` is set in the options object, then `+(a|b)` will not be considered a magic pattern. If the pattern has a brace expansion, like `a/{b/c,x/y}` then that is considered magical, unless `nobrace:true` is set in the options. ## glob(pattern, [options], cb) * `pattern` {String} Pattern to be matched * `options` {Object} * `cb` {Function} * `err` {Error | null} * `matches` {Array} filenames found matching the pattern Perform an asynchronous glob search. 
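To make the primer above concrete, here is a minimal sketch of the `matchBase`, `dot`, and `nonull` options in action. The file names are hypothetical fixtures, not part of this package:

```javascript
var glob = require("glob")

// matchBase: a slash-free pattern is matched against basenames anywhere
// in the tree, so "*.js" here behaves like "**/*.js".
glob("*.js", { matchBase: true }, function (er, files) {
  if (er) throw er
  console.log(files) // e.g. [ 'a.js', 'test/simple/basic.js' ]
})

// dot: lets the "*" portions match dotfiles, so "a/*/c" can now match
// a hypothetical "a/.b/c".
glob("a/*/c", { dot: true }, function (er, files) {
  if (er) throw er
  console.log(files)
})

// nonull: on no match, echo the pattern back bash-style instead of [].
glob("no/such/path/*", { nonull: true }, function (er, files) {
  if (er) throw er
  console.log(files) // [ 'no/such/path/*' ]
})
```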
## glob.sync(pattern, [options])

* `pattern` {String} Pattern to be matched
* `options` {Object}
* return: {Array} filenames found matching the pattern

Perform a synchronous glob search.

## Class: glob.Glob

Create a Glob object by instantiating the `glob.Glob` class.

```javascript
var Glob = require("glob").Glob
var mg = new Glob(pattern, options, cb)
```

It's an EventEmitter, and starts walking the filesystem to find matches
immediately.

### new glob.Glob(pattern, [options], [cb])

* `pattern` {String} pattern to search for
* `options` {Object}
* `cb` {Function} Called when an error occurs, or matches are found
  * `err` {Error | null}
  * `matches` {Array} filenames found matching the pattern

Note that if the `sync` flag is set in the options, then matches will be
immediately available on the `g.found` member.

### Properties

* `minimatch` The minimatch object that the glob uses.
* `options` The options object passed in.
* `aborted` Boolean which is set to true when calling `abort()`. There
  is no way at this time to continue a glob search after aborting, but
  you can re-use the statCache to avoid having to duplicate syscalls.
* `cache` Convenience object. Each field has the following possible
  values:
  * `false` - Path does not exist
  * `true` - Path exists
  * `'DIR'` - Path exists, and is a directory
  * `'FILE'` - Path exists, and is not a directory
  * `[file, entries, ...]` - Path exists, is a directory, and the array
    value is the results of `fs.readdir`
* `statCache` Cache of `fs.stat` results, to prevent statting the same
  path multiple times.
* `symlinks` A record of which paths are symbolic links, which is
  relevant in resolving `**` patterns.
* `realpathCache` An optional object which is passed to `fs.realpath`
  to minimize unnecessary syscalls. It is stored on the instantiated
  Glob object, and may be re-used.

### Events

* `end` When the matching is finished, this is emitted with all the
  matches found. If the `nonull` option is set, and no match was found,
  then the `matches` list contains the original pattern. The matches
  are sorted, unless the `nosort` flag is set.
* `match` Every time a match is found, this is emitted with the matched
  filename.
* `error` Emitted when an unexpected error is encountered, or whenever
  any fs error occurs if `options.strict` is set.
* `abort` When `abort()` is called, this event is raised.

### Methods

* `pause` Temporarily stop the search
* `resume` Resume the search
* `abort` Stop the search forever

### Options

All the options that can be passed to Minimatch can also be passed to
Glob to change pattern matching behavior. Also, some have been added,
or have glob-specific ramifications.

All options are false by default, unless otherwise noted.

All options are added to the Glob object, as well.

If you are running many `glob` operations, you can pass a Glob object
as the `options` argument to a subsequent operation to shortcut some
`stat` and `readdir` calls. At the very least, you may pass in shared
`symlinks`, `statCache`, `realpathCache`, and `cache` options, so that
parallel glob operations will be sped up by sharing information about
the filesystem. (A short sketch of this cache sharing follows the
options list below.)

* `cwd` The current working directory in which to search. Defaults to
  `process.cwd()`.
* `root` The place where patterns starting with `/` will be mounted
  onto. Defaults to `path.resolve(options.cwd, "/")` (`/` on Unix
  systems, and `C:\` or some such on Windows.)
* `dot` Include `.dot` files in normal matches and `globstar` matches.
  Note that an explicit dot in a portion of the pattern will always
  match dot files.
* `nomount` By default, a pattern starting with a forward-slash will be "mounted" onto the root setting, so that a valid filesystem path is returned. Set this flag to disable that behavior. * `mark` Add a `/` character to directory matches. Note that this requires additional stat calls. * `nosort` Don't sort the results. * `stat` Set to true to stat *all* results. This reduces performance somewhat, and is completely unnecessary, unless `readdir` is presumed to be an untrustworthy indicator of file existence. * `silent` When an unusual error is encountered when attempting to read a directory, a warning will be printed to stderr. Set the `silent` option to true to suppress these warnings. * `strict` When an unusual error is encountered when attempting to read a directory, the process will just continue on in search of other matches. Set the `strict` option to raise an error in these cases. * `cache` See `cache` property above. Pass in a previously generated cache object to save some fs calls. * `statCache` A cache of results of filesystem information, to prevent unnecessary stat calls. While it should not normally be necessary to set this, you may pass the statCache from one glob() call to the options object of another, if you know that the filesystem will not change between calls. (See "Race Conditions" below.) * `symlinks` A cache of known symbolic links. You may pass in a previously generated `symlinks` object to save `lstat` calls when resolving `**` matches. * `sync` DEPRECATED: use `glob.sync(pattern, opts)` instead. * `nounique` In some cases, brace-expanded patterns can result in the same file showing up multiple times in the result set. By default, this implementation prevents duplicates in the result set. Set this flag to disable that behavior. * `nonull` Set to never return an empty set, instead returning a set containing the pattern itself. This is the default in glob(3). * `debug` Set to enable debug logging in minimatch and glob. * `nobrace` Do not expand `{a,b}` and `{1..3}` brace sets. * `noglobstar` Do not match `**` against multiple filenames. (Ie, treat it as a normal `*` instead.) * `noext` Do not match `+(a|b)` "extglob" patterns. * `nocase` Perform a case-insensitive match. Note: on case-insensitive filesystems, non-magic patterns will match by default, since `stat` and `readdir` will not raise errors. * `matchBase` Perform a basename-only match if the pattern does not contain any slash characters. That is, `*.js` would be treated as equivalent to `**/*.js`, matching all js files in all directories. * `nodir` Do not match directories, only files. (Note: to match *only* directories, simply put a `/` at the end of the pattern.) * `ignore` Add a pattern or an array of patterns to exclude matches. * `follow` Follow symlinked directories when expanding `**` patterns. Note that this can result in a lot of duplicate references in the presence of cyclic links. * `realpath` Set to true to call `fs.realpath` on all of the results. In the case of a symlink that cannot be resolved, the full absolute path to the matched entry is returned (though it will usually be a broken symlink) * `nonegate` Suppress deprecated `negate` behavior. (See below.) Default=true * `nocomment` Suppress deprecated `comment` behavior. (See below.) Default=true ## Comparisons to other fnmatch/glob implementations While strict compliance with the existing standards is a worthwhile goal, some discrepancies exist between node-glob and other implementations, and are intentional. 
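Before the comparison notes, here is the promised sketch of the class API and the cache-sharing options described above. The patterns and the `ignore` value are arbitrary examples:

```javascript
var glob = require("glob")
var Glob = require("glob").Glob

// Walking starts as soon as the object is constructed; matches stream
// in as "match" events.
var g = new Glob("**/*.js", { ignore: "node_modules/**", nosort: true })

g.on("match", function (file) {
  console.log("match:", file)
})

g.on("end", function (jsFiles) {
  // Hand the caches to a follow-up search so it can skip repeat
  // stat and readdir syscalls over the same tree.
  glob("**/*.json", {
    cache: g.cache,               // readdir results
    statCache: g.statCache,       // fs.stat results
    symlinks: g.symlinks,         // symlink map used for ** expansion
    realpathCache: g.realpathCache
  }, function (er, jsonFiles) {
    if (er) throw er
    console.log(jsFiles.length, "js /", jsonFiles.length, "json")
  })
})
```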
The double-star character `**` is supported by default, unless the `noglobstar` flag is set. This is supported in the manner of bsdglob and bash 4.3, where `**` only has special significance if it is the only thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but `a/**b` will not. Note that symlinked directories are not crawled as part of a `**`, though their contents may match against subsequent portions of the pattern. This prevents infinite loops and duplicates and the like. If an escaped pattern has no matches, and the `nonull` flag is set, then glob returns the pattern as-provided, rather than interpreting the character escapes. For example, `glob.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than `"*a?"`. This is akin to setting the `nullglob` option in bash, except that it does not resolve escaped pattern characters. If brace expansion is not disabled, then it is performed before any other interpretation of the glob pattern. Thus, a pattern like `+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded **first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are checked for validity. Since those two are valid, matching proceeds. ### Comments and Negation **Note**: In version 5 of this module, negation and comments are **disabled** by default. You can explicitly set `nonegate:false` or `nocomment:false` to re-enable them. They are going away entirely in version 6. The intent for negation would be for a pattern starting with `!` to match everything that *doesn't* match the supplied pattern. However, the implementation is weird. It is better to use the `ignore` option to set a pattern or set of patterns to exclude from matches. If you want the "everything except *x*" type of behavior, you can use `**` as the main pattern, and set an `ignore` for the things to exclude. The comments feature is added in minimatch, primarily to more easily support use cases like ignore files, where a `#` at the start of a line makes the pattern "empty". However, in the context of a straightforward filesystem globber, "comments" don't make much sense. ## Windows **Please only use forward-slashes in glob expressions.** Though windows uses either `/` or `\` as its path separator, only `/` characters are used by this glob implementation. You must use forward-slashes **only** in glob expressions. Back-slashes will always be interpreted as escape characters, not path separators. Results from absolute patterns such as `/foo/*` are mounted onto the root setting using `path.join`. On windows, this will by default result in `/foo/*` matching `C:\foo\bar.txt`. ## Race Conditions Glob searching, by its very nature, is susceptible to race conditions, since it relies on directory walking and such. As a result, it is possible that a file that exists when glob looks for it may have been deleted or modified by the time it returns the result. As part of its internal implementation, this program caches all stat and readdir calls that it makes, in order to cut down on system overhead. However, this also makes it even more susceptible to races, especially if the cache or statCache objects are reused between glob calls. Users are thus advised not to use a glob result as a guarantee of filesystem state in the face of rapid changes. For the vast majority of operations, this is never a problem. ## Contributing Any change to behavior (including bugfixes) must come with a test. Patches that fail tests or reduce performance will be rejected. 
``` # to run tests npm test # to re-generate test fixtures npm run test-regen # to benchmark against bash/zsh npm run bench # to profile javascript npm run prof ``` npm_3.5.2.orig/node_modules/glob/common.js0000644000000000000000000001401312631326456016717 0ustar 00000000000000exports.alphasort = alphasort exports.alphasorti = alphasorti exports.setopts = setopts exports.ownProp = ownProp exports.makeAbs = makeAbs exports.finish = finish exports.mark = mark exports.isIgnored = isIgnored exports.childrenIgnored = childrenIgnored function ownProp (obj, field) { return Object.prototype.hasOwnProperty.call(obj, field) } var path = require("path") var minimatch = require("minimatch") var isAbsolute = require("path-is-absolute") var Minimatch = minimatch.Minimatch function alphasorti (a, b) { return a.toLowerCase().localeCompare(b.toLowerCase()) } function alphasort (a, b) { return a.localeCompare(b) } function setupIgnores (self, options) { self.ignore = options.ignore || [] if (!Array.isArray(self.ignore)) self.ignore = [self.ignore] if (self.ignore.length) { self.ignore = self.ignore.map(ignoreMap) } } function ignoreMap (pattern) { var gmatcher = null if (pattern.slice(-3) === '/**') { var gpattern = pattern.replace(/(\/\*\*)+$/, '') gmatcher = new Minimatch(gpattern) } return { matcher: new Minimatch(pattern), gmatcher: gmatcher } } function setopts (self, pattern, options) { if (!options) options = {} // base-matching: just use globstar for that. if (options.matchBase && -1 === pattern.indexOf("/")) { if (options.noglobstar) { throw new Error("base matching requires globstar") } pattern = "**/" + pattern } self.silent = !!options.silent self.pattern = pattern self.strict = options.strict !== false self.realpath = !!options.realpath self.realpathCache = options.realpathCache || Object.create(null) self.follow = !!options.follow self.dot = !!options.dot self.mark = !!options.mark self.nodir = !!options.nodir if (self.nodir) self.mark = true self.sync = !!options.sync self.nounique = !!options.nounique self.nonull = !!options.nonull self.nosort = !!options.nosort self.nocase = !!options.nocase self.stat = !!options.stat self.noprocess = !!options.noprocess self.maxLength = options.maxLength || Infinity self.cache = options.cache || Object.create(null) self.statCache = options.statCache || Object.create(null) self.symlinks = options.symlinks || Object.create(null) setupIgnores(self, options) self.changedCwd = false var cwd = process.cwd() if (!ownProp(options, "cwd")) self.cwd = cwd else { self.cwd = options.cwd self.changedCwd = path.resolve(options.cwd) !== cwd } self.root = options.root || path.resolve(self.cwd, "/") self.root = path.resolve(self.root) if (process.platform === "win32") self.root = self.root.replace(/\\/g, "/") self.nomount = !!options.nomount // disable comments and negation unless the user explicitly // passes in false as the option. options.nonegate = options.nonegate === false ? false : true options.nocomment = options.nocomment === false ? 
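// (note: glob 5 makes these two minimatch features opt-out -- negation and
//  comments stay disabled unless the caller explicitly passes
//  `nonegate: false` or `nocomment: false`; deprecationWarning() below
//  announces their removal in v6)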
false : true deprecationWarning(options) self.minimatch = new Minimatch(pattern, options) self.options = self.minimatch.options } // TODO(isaacs): remove entirely in v6 // exported to reset in tests exports.deprecationWarned function deprecationWarning(options) { if (!options.nonegate || !options.nocomment) { if (process.noDeprecation !== true && !exports.deprecationWarned) { var msg = 'glob WARNING: comments and negation will be disabled in v6' if (process.throwDeprecation) throw new Error(msg) else if (process.traceDeprecation) console.trace(msg) else console.error(msg) exports.deprecationWarned = true } } } function finish (self) { var nou = self.nounique var all = nou ? [] : Object.create(null) for (var i = 0, l = self.matches.length; i < l; i ++) { var matches = self.matches[i] if (!matches || Object.keys(matches).length === 0) { if (self.nonull) { // do like the shell, and spit out the literal glob var literal = self.minimatch.globSet[i] if (nou) all.push(literal) else all[literal] = true } } else { // had matches var m = Object.keys(matches) if (nou) all.push.apply(all, m) else m.forEach(function (m) { all[m] = true }) } } if (!nou) all = Object.keys(all) if (!self.nosort) all = all.sort(self.nocase ? alphasorti : alphasort) // at *some* point we statted all of these if (self.mark) { for (var i = 0; i < all.length; i++) { all[i] = self._mark(all[i]) } if (self.nodir) { all = all.filter(function (e) { return !(/\/$/.test(e)) }) } } if (self.ignore.length) all = all.filter(function(m) { return !isIgnored(self, m) }) self.found = all } function mark (self, p) { var abs = makeAbs(self, p) var c = self.cache[abs] var m = p if (c) { var isDir = c === 'DIR' || Array.isArray(c) var slash = p.slice(-1) === '/' if (isDir && !slash) m += '/' else if (!isDir && slash) m = m.slice(0, -1) if (m !== p) { var mabs = makeAbs(self, m) self.statCache[mabs] = self.statCache[abs] self.cache[mabs] = self.cache[abs] } } return m } // lotta situps... function makeAbs (self, f) { var abs = f if (f.charAt(0) === '/') { abs = path.join(self.root, f) } else if (isAbsolute(f) || f === '') { abs = f } else if (self.changedCwd) { abs = path.resolve(self.cwd, f) } else { abs = path.resolve(f) } return abs } // Return true, if pattern ends with globstar '**', for the accompanying parent directory. // Ex:- If node_modules/** is the pattern, add 'node_modules' to ignore list along with it's contents function isIgnored (self, path) { if (!self.ignore.length) return false return self.ignore.some(function(item) { return item.matcher.match(path) || !!(item.gmatcher && item.gmatcher.match(path)) }) } function childrenIgnored (self, path) { if (!self.ignore.length) return false return self.ignore.some(function(item) { return !!(item.gmatcher && item.gmatcher.match(path)) }) } npm_3.5.2.orig/node_modules/glob/glob.js0000644000000000000000000004430112631326456016355 0ustar 00000000000000// Approach: // // 1. Get the minimatch set // 2. For each pattern in the set, PROCESS(pattern, false) // 3. Store matches per-set, then uniq them // // PROCESS(pattern, inGlobStar) // Get the first [n] items from pattern that are all strings // Join these together. This is PREFIX. // If there is no more remaining, then stat(PREFIX) and // add to matches if it succeeds. END. 
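// (worked example: for the pattern "a/*/c" the minimatch set is
//  [["a", <regex for *>, "c"]], so PREFIX is "a"; we readdir("a") and
//  test each entry against the middle regex part)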
// // If inGlobStar and PREFIX is symlink and points to dir // set ENTRIES = [] // else readdir(PREFIX) as ENTRIES // If fail, END // // with ENTRIES // If pattern[n] is GLOBSTAR // // handle the case where the globstar match is empty // // by pruning it out, and testing the resulting pattern // PROCESS(pattern[0..n] + pattern[n+1 .. $], false) // // handle other cases. // for ENTRY in ENTRIES (not dotfiles) // // attach globstar + tail onto the entry // // Mark that this entry is a globstar match // PROCESS(pattern[0..n] + ENTRY + pattern[n .. $], true) // // else // not globstar // for ENTRY in ENTRIES (not dotfiles, unless pattern[n] is dot) // Test ENTRY against pattern[n] // If fails, continue // If passes, PROCESS(pattern[0..n] + item + pattern[n+1 .. $]) // // Caveat: // Cache all stats and readdirs results to minimize syscall. Since all // we ever care about is existence and directory-ness, we can just keep // `true` for files, and [children,...] for directories, or `false` for // things that don't exist. module.exports = glob var fs = require('fs') var minimatch = require('minimatch') var Minimatch = minimatch.Minimatch var inherits = require('inherits') var EE = require('events').EventEmitter var path = require('path') var assert = require('assert') var isAbsolute = require('path-is-absolute') var globSync = require('./sync.js') var common = require('./common.js') var alphasort = common.alphasort var alphasorti = common.alphasorti var setopts = common.setopts var ownProp = common.ownProp var inflight = require('inflight') var util = require('util') var childrenIgnored = common.childrenIgnored var isIgnored = common.isIgnored var once = require('once') function glob (pattern, options, cb) { if (typeof options === 'function') cb = options, options = {} if (!options) options = {} if (options.sync) { if (cb) throw new TypeError('callback provided to sync glob') return globSync(pattern, options) } return new Glob(pattern, options, cb) } glob.sync = globSync var GlobSync = glob.GlobSync = globSync.GlobSync // old api surface glob.glob = glob glob.hasMagic = function (pattern, options_) { var options = util._extend({}, options_) options.noprocess = true var g = new Glob(pattern, options) var set = g.minimatch.set if (set.length > 1) return true for (var j = 0; j < set[0].length; j++) { if (typeof set[0][j] !== 'string') return true } return false } glob.Glob = Glob inherits(Glob, EE) function Glob (pattern, options, cb) { if (typeof options === 'function') { cb = options options = null } if (options && options.sync) { if (cb) throw new TypeError('callback provided to sync glob') return new GlobSync(pattern, options) } if (!(this instanceof Glob)) return new Glob(pattern, options, cb) setopts(this, pattern, options) this._didRealPath = false // process each pattern in the minimatch set var n = this.minimatch.set.length // The matches are stored as {: true,...} so that // duplicates are automagically pruned. // Later, we do an Object.keys() on these. // Keep them as a list so we can fill in when nonull is set. 
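// (each this.matches[i] is an object shaped like { <filename>: true, ... },
//  one slot per brace-expanded pattern in the minimatch set)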
this.matches = new Array(n) if (typeof cb === 'function') { cb = once(cb) this.on('error', cb) this.on('end', function (matches) { cb(null, matches) }) } var self = this var n = this.minimatch.set.length this._processing = 0 this.matches = new Array(n) this._emitQueue = [] this._processQueue = [] this.paused = false if (this.noprocess) return this if (n === 0) return done() for (var i = 0; i < n; i ++) { this._process(this.minimatch.set[i], i, false, done) } function done () { --self._processing if (self._processing <= 0) self._finish() } } Glob.prototype._finish = function () { assert(this instanceof Glob) if (this.aborted) return if (this.realpath && !this._didRealpath) return this._realpath() common.finish(this) this.emit('end', this.found) } Glob.prototype._realpath = function () { if (this._didRealpath) return this._didRealpath = true var n = this.matches.length if (n === 0) return this._finish() var self = this for (var i = 0; i < this.matches.length; i++) this._realpathSet(i, next) function next () { if (--n === 0) self._finish() } } Glob.prototype._realpathSet = function (index, cb) { var matchset = this.matches[index] if (!matchset) return cb() var found = Object.keys(matchset) var self = this var n = found.length if (n === 0) return cb() var set = this.matches[index] = Object.create(null) found.forEach(function (p, i) { // If there's a problem with the stat, then it means that // one or more of the links in the realpath couldn't be // resolved. just return the abs value in that case. p = self._makeAbs(p) fs.realpath(p, self.realpathCache, function (er, real) { if (!er) set[real] = true else if (er.syscall === 'stat') set[p] = true else self.emit('error', er) // srsly wtf right here if (--n === 0) { self.matches[index] = set cb() } }) }) } Glob.prototype._mark = function (p) { return common.mark(this, p) } Glob.prototype._makeAbs = function (f) { return common.makeAbs(this, f) } Glob.prototype.abort = function () { this.aborted = true this.emit('abort') } Glob.prototype.pause = function () { if (!this.paused) { this.paused = true this.emit('pause') } } Glob.prototype.resume = function () { if (this.paused) { this.emit('resume') this.paused = false if (this._emitQueue.length) { var eq = this._emitQueue.slice(0) this._emitQueue.length = 0 for (var i = 0; i < eq.length; i ++) { var e = eq[i] this._emitMatch(e[0], e[1]) } } if (this._processQueue.length) { var pq = this._processQueue.slice(0) this._processQueue.length = 0 for (var i = 0; i < pq.length; i ++) { var p = pq[i] this._processing-- this._process(p[0], p[1], p[2], p[3]) } } } } Glob.prototype._process = function (pattern, index, inGlobStar, cb) { assert(this instanceof Glob) assert(typeof cb === 'function') if (this.aborted) return this._processing++ if (this.paused) { this._processQueue.push([pattern, index, inGlobStar, cb]) return } //console.error('PROCESS %d', this._processing, pattern) // Get the first [n] parts of pattern that are all strings. var n = 0 while (typeof pattern[n] === 'string') { n ++ } // now n is the index of the first one that is *not* a string. // see if there's anything else var prefix switch (n) { // if not, then this is rather simple case pattern.length: this._processSimple(pattern.join('/'), index, cb) return case 0: // pattern *starts* with some non-trivial item. // going to readdir(cwd), but not include the prefix in matches. prefix = null break default: // pattern has some string bits in the front. 
// whatever it starts with, whether that's 'absolute' like /foo/bar, // or 'relative' like '../baz' prefix = pattern.slice(0, n).join('/') break } var remain = pattern.slice(n) // get the list of entries. var read if (prefix === null) read = '.' else if (isAbsolute(prefix) || isAbsolute(pattern.join('/'))) { if (!prefix || !isAbsolute(prefix)) prefix = '/' + prefix read = prefix } else read = prefix var abs = this._makeAbs(read) //if ignored, skip _processing if (childrenIgnored(this, read)) return cb() var isGlobStar = remain[0] === minimatch.GLOBSTAR if (isGlobStar) this._processGlobStar(prefix, read, abs, remain, index, inGlobStar, cb) else this._processReaddir(prefix, read, abs, remain, index, inGlobStar, cb) } Glob.prototype._processReaddir = function (prefix, read, abs, remain, index, inGlobStar, cb) { var self = this this._readdir(abs, inGlobStar, function (er, entries) { return self._processReaddir2(prefix, read, abs, remain, index, inGlobStar, entries, cb) }) } Glob.prototype._processReaddir2 = function (prefix, read, abs, remain, index, inGlobStar, entries, cb) { // if the abs isn't a dir, then nothing can match! if (!entries) return cb() // It will only match dot entries if it starts with a dot, or if // dot is set. Stuff like @(.foo|.bar) isn't allowed. var pn = remain[0] var negate = !!this.minimatch.negate var rawGlob = pn._glob var dotOk = this.dot || rawGlob.charAt(0) === '.' var matchedEntries = [] for (var i = 0; i < entries.length; i++) { var e = entries[i] if (e.charAt(0) !== '.' || dotOk) { var m if (negate && !prefix) { m = !e.match(pn) } else { m = e.match(pn) } if (m) matchedEntries.push(e) } } //console.error('prd2', prefix, entries, remain[0]._glob, matchedEntries) var len = matchedEntries.length // If there are no matched entries, then nothing matches. if (len === 0) return cb() // if this is the last remaining pattern bit, then no need for // an additional stat *unless* the user has specified mark or // stat explicitly. We know they exist, since readdir returned // them. if (remain.length === 1 && !this.mark && !this.stat) { if (!this.matches[index]) this.matches[index] = Object.create(null) for (var i = 0; i < len; i ++) { var e = matchedEntries[i] if (prefix) { if (prefix !== '/') e = prefix + '/' + e else e = prefix + e } if (e.charAt(0) === '/' && !this.nomount) { e = path.join(this.root, e) } this._emitMatch(index, e) } // This was the last one, and no stats were needed return cb() } // now test all matched entries as stand-ins for that part // of the pattern. 
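// (e.g. with prefix "src", remain [<regex for x*>, "index.js"], and
//  matched entries ["x1", "x2"], the recursion below processes
//  ["src/x1", "index.js"] and ["src/x2", "index.js"])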
remain.shift() for (var i = 0; i < len; i ++) { var e = matchedEntries[i] var newPattern if (prefix) { if (prefix !== '/') e = prefix + '/' + e else e = prefix + e } this._process([e].concat(remain), index, inGlobStar, cb) } cb() } Glob.prototype._emitMatch = function (index, e) { if (this.aborted) return if (this.matches[index][e]) return if (isIgnored(this, e)) return if (this.paused) { this._emitQueue.push([index, e]) return } var abs = this._makeAbs(e) if (this.nodir) { var c = this.cache[abs] if (c === 'DIR' || Array.isArray(c)) return } if (this.mark) e = this._mark(e) this.matches[index][e] = true var st = this.statCache[abs] if (st) this.emit('stat', e, st) this.emit('match', e) } Glob.prototype._readdirInGlobStar = function (abs, cb) { if (this.aborted) return // follow all symlinked directories forever // just proceed as if this is a non-globstar situation if (this.follow) return this._readdir(abs, false, cb) var lstatkey = 'lstat\0' + abs var self = this var lstatcb = inflight(lstatkey, lstatcb_) if (lstatcb) fs.lstat(abs, lstatcb) function lstatcb_ (er, lstat) { if (er) return cb() var isSym = lstat.isSymbolicLink() self.symlinks[abs] = isSym // If it's not a symlink or a dir, then it's definitely a regular file. // don't bother doing a readdir in that case. if (!isSym && !lstat.isDirectory()) { self.cache[abs] = 'FILE' cb() } else self._readdir(abs, false, cb) } } Glob.prototype._readdir = function (abs, inGlobStar, cb) { if (this.aborted) return cb = inflight('readdir\0'+abs+'\0'+inGlobStar, cb) if (!cb) return //console.error('RD %j %j', +inGlobStar, abs) if (inGlobStar && !ownProp(this.symlinks, abs)) return this._readdirInGlobStar(abs, cb) if (ownProp(this.cache, abs)) { var c = this.cache[abs] if (!c || c === 'FILE') return cb() if (Array.isArray(c)) return cb(null, c) } var self = this fs.readdir(abs, readdirCb(this, abs, cb)) } function readdirCb (self, abs, cb) { return function (er, entries) { if (er) self._readdirError(abs, er, cb) else self._readdirEntries(abs, entries, cb) } } Glob.prototype._readdirEntries = function (abs, entries, cb) { if (this.aborted) return // if we haven't asked to stat everything, then just // assume that everything in there exists, so we can avoid // having to stat it a second time. if (!this.mark && !this.stat) { for (var i = 0; i < entries.length; i ++) { var e = entries[i] if (abs === '/') e = abs + e else e = abs + '/' + e this.cache[e] = true } } this.cache[abs] = entries return cb(null, entries) } Glob.prototype._readdirError = function (f, er, cb) { if (this.aborted) return // handle errors, and cache the information switch (er.code) { case 'ENOTSUP': // https://github.com/isaacs/node-glob/issues/205 case 'ENOTDIR': // totally normal. means it *does* exist. this.cache[this._makeAbs(f)] = 'FILE' break case 'ENOENT': // not terribly unusual case 'ELOOP': case 'ENAMETOOLONG': case 'UNKNOWN': this.cache[this._makeAbs(f)] = false break default: // some unusual error. Treat as failure. 
this.cache[this._makeAbs(f)] = false if (this.strict) { this.emit('error', er) // If the error is handled, then we abort // if not, we threw out of here this.abort() } if (!this.silent) console.error('glob error', er) break } return cb() } Glob.prototype._processGlobStar = function (prefix, read, abs, remain, index, inGlobStar, cb) { var self = this this._readdir(abs, inGlobStar, function (er, entries) { self._processGlobStar2(prefix, read, abs, remain, index, inGlobStar, entries, cb) }) } Glob.prototype._processGlobStar2 = function (prefix, read, abs, remain, index, inGlobStar, entries, cb) { //console.error('pgs2', prefix, remain[0], entries) // no entries means not a dir, so it can never have matches // foo.txt/** doesn't match foo.txt if (!entries) return cb() // test without the globstar, and with every child both below // and replacing the globstar. var remainWithoutGlobStar = remain.slice(1) var gspref = prefix ? [ prefix ] : [] var noGlobStar = gspref.concat(remainWithoutGlobStar) // the noGlobStar pattern exits the inGlobStar state this._process(noGlobStar, index, false, cb) var isSym = this.symlinks[abs] var len = entries.length // If it's a symlink, and we're in a globstar, then stop if (isSym && inGlobStar) return cb() for (var i = 0; i < len; i++) { var e = entries[i] if (e.charAt(0) === '.' && !this.dot) continue // these two cases enter the inGlobStar state var instead = gspref.concat(entries[i], remainWithoutGlobStar) this._process(instead, index, true, cb) var below = gspref.concat(entries[i], remain) this._process(below, index, true, cb) } cb() } Glob.prototype._processSimple = function (prefix, index, cb) { // XXX review this. Shouldn't it be doing the mounting etc // before doing stat? kinda weird? var self = this this._stat(prefix, function (er, exists) { self._processSimple2(prefix, index, er, exists, cb) }) } Glob.prototype._processSimple2 = function (prefix, index, er, exists, cb) { //console.error('ps2', prefix, exists) if (!this.matches[index]) this.matches[index] = Object.create(null) // If it doesn't exist, then just mark the lack of results if (!exists) return cb() if (prefix && isAbsolute(prefix) && !this.nomount) { var trail = /[\/\\]$/.test(prefix) if (prefix.charAt(0) === '/') { prefix = path.join(this.root, prefix) } else { prefix = path.resolve(this.root, prefix) if (trail) prefix += '/' } } if (process.platform === 'win32') prefix = prefix.replace(/\\/g, '/') // Mark this as a match this._emitMatch(index, prefix) cb() } // Returns either 'DIR', 'FILE', or false Glob.prototype._stat = function (f, cb) { var abs = this._makeAbs(f) var needDir = f.slice(-1) === '/' if (f.length > this.maxLength) return cb() if (!this.stat && ownProp(this.cache, abs)) { var c = this.cache[abs] if (Array.isArray(c)) c = 'DIR' // It exists, but maybe not how we need it if (!needDir || c === 'DIR') return cb(null, c) if (needDir && c === 'FILE') return cb() // otherwise we have to stat, because maybe c=true // if we know it exists, but not what it is. } var exists var stat = this.statCache[abs] if (stat !== undefined) { if (stat === false) return cb(null, stat) else { var type = stat.isDirectory() ? 'DIR' : 'FILE' if (needDir && type === 'FILE') return cb() else return cb(null, type, stat) } } var self = this var statcb = inflight('stat\0' + abs, lstatcb_) if (statcb) fs.lstat(abs, statcb) function lstatcb_ (er, lstat) { if (lstat && lstat.isSymbolicLink()) { // If it's a symlink, then treat it as the target, unless // the target does not exist, then treat it as a file. 
return fs.stat(abs, function (er, stat) { if (er) self._stat2(f, abs, null, lstat, cb) else self._stat2(f, abs, er, stat, cb) }) } else { self._stat2(f, abs, er, lstat, cb) } } } Glob.prototype._stat2 = function (f, abs, er, stat, cb) { if (er) { this.statCache[abs] = false return cb() } var needDir = f.slice(-1) === '/' this.statCache[abs] = stat if (abs.slice(-1) === '/' && !stat.isDirectory()) return cb(null, false, stat) var c = stat.isDirectory() ? 'DIR' : 'FILE' this.cache[abs] = this.cache[abs] || c if (needDir && c !== 'DIR') return cb() return cb(null, c, stat) } npm_3.5.2.orig/node_modules/glob/node_modules/0000755000000000000000000000000012631326456017547 5ustar 00000000000000npm_3.5.2.orig/node_modules/glob/package.json0000644000000000000000000000343712631326456017367 0ustar 00000000000000{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "glob", "description": "a little globber", "version": "5.0.15", "repository": { "type": "git", "url": "git://github.com/isaacs/node-glob.git" }, "main": "glob.js", "files": [ "glob.js", "sync.js", "common.js" ], "engines": { "node": "*" }, "dependencies": { "inflight": "^1.0.4", "inherits": "2", "minimatch": "2 || 3", "once": "^1.3.0", "path-is-absolute": "^1.0.0" }, "devDependencies": { "mkdirp": "0", "rimraf": "^2.2.8", "tap": "^1.1.4", "tick": "0.0.6" }, "scripts": { "prepublish": "npm run benchclean", "profclean": "rm -f v8.log profile.txt", "test": "tap test/*.js --cov", "test-regen": "npm run profclean && TEST_REGEN=1 node test/00-setup.js", "bench": "bash benchmark.sh", "prof": "bash prof.sh && cat profile.txt", "benchclean": "node benchclean.js" }, "license": "ISC", "gitHead": "3a7e71d453dd80e75b196fd262dd23ed54beeceb", "bugs": { "url": "https://github.com/isaacs/node-glob/issues" }, "homepage": "https://github.com/isaacs/node-glob#readme", "_id": "glob@5.0.15", "_shasum": "1bc936b9e02f4a603fcc222ecf7633d30b8b93b1", "_from": "glob@>=5.0.15 <5.1.0", "_npmVersion": "3.3.2", "_nodeVersion": "4.0.0", "_npmUser": { "name": "isaacs", "email": "isaacs@npmjs.com" }, "dist": { "shasum": "1bc936b9e02f4a603fcc222ecf7633d30b8b93b1", "tarball": "http://registry.npmjs.org/glob/-/glob-5.0.15.tgz" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/glob/-/glob-5.0.15.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/glob/sync.js0000644000000000000000000002630512631326456016412 0ustar 00000000000000module.exports = globSync globSync.GlobSync = GlobSync var fs = require('fs') var minimatch = require('minimatch') var Minimatch = minimatch.Minimatch var Glob = require('./glob.js').Glob var util = require('util') var path = require('path') var assert = require('assert') var isAbsolute = require('path-is-absolute') var common = require('./common.js') var alphasort = common.alphasort var alphasorti = common.alphasorti var setopts = common.setopts var ownProp = common.ownProp var childrenIgnored = common.childrenIgnored function globSync (pattern, options) { if (typeof options === 'function' || arguments.length === 3) throw new TypeError('callback provided to sync glob\n'+ 'See: https://github.com/isaacs/node-glob/issues/167') return new GlobSync(pattern, options).found } function GlobSync (pattern, options) { if (!pattern) throw new Error('must provide pattern') if (typeof options === 'function' || arguments.length === 3) throw new TypeError('callback provided to sync glob\n'+ 'See: https://github.com/isaacs/node-glob/issues/167') if (!(this instanceof GlobSync)) return new GlobSync(pattern, options) setopts(this, pattern, options) if (this.noprocess) return this var n = this.minimatch.set.length this.matches = new Array(n) for (var i = 0; i < n; i ++) { this._process(this.minimatch.set[i], i, false) } this._finish() } GlobSync.prototype._finish = function () { assert(this instanceof GlobSync) if (this.realpath) { var self = this this.matches.forEach(function (matchset, index) { var set = self.matches[index] = Object.create(null) for (var p in matchset) { try { p = self._makeAbs(p) var real = fs.realpathSync(p, self.realpathCache) set[real] = true } catch (er) { if (er.syscall === 'stat') set[self._makeAbs(p)] = true else throw er } } }) } common.finish(this) } GlobSync.prototype._process = function (pattern, index, inGlobStar) { assert(this instanceof GlobSync) // Get the first [n] parts of pattern that are all strings. var n = 0 while (typeof pattern[n] === 'string') { n ++ } // now n is the index of the first one that is *not* a string. // See if there's anything else var prefix switch (n) { // if not, then this is rather simple case pattern.length: this._processSimple(pattern.join('/'), index) return case 0: // pattern *starts* with some non-trivial item. // going to readdir(cwd), but not include the prefix in matches. prefix = null break default: // pattern has some string bits in the front. // whatever it starts with, whether that's 'absolute' like /foo/bar, // or 'relative' like '../baz' prefix = pattern.slice(0, n).join('/') break } var remain = pattern.slice(n) // get the list of entries. var read if (prefix === null) read = '.' else if (isAbsolute(prefix) || isAbsolute(pattern.join('/'))) { if (!prefix || !isAbsolute(prefix)) prefix = '/' + prefix read = prefix } else read = prefix var abs = this._makeAbs(read) //if ignored, skip processing if (childrenIgnored(this, read)) return var isGlobStar = remain[0] === minimatch.GLOBSTAR if (isGlobStar) this._processGlobStar(prefix, read, abs, remain, index, inGlobStar) else this._processReaddir(prefix, read, abs, remain, index, inGlobStar) } GlobSync.prototype._processReaddir = function (prefix, read, abs, remain, index, inGlobStar) { var entries = this._readdir(abs, inGlobStar) // if the abs isn't a dir, then nothing can match! 
if (!entries) return // It will only match dot entries if it starts with a dot, or if // dot is set. Stuff like @(.foo|.bar) isn't allowed. var pn = remain[0] var negate = !!this.minimatch.negate var rawGlob = pn._glob var dotOk = this.dot || rawGlob.charAt(0) === '.' var matchedEntries = [] for (var i = 0; i < entries.length; i++) { var e = entries[i] if (e.charAt(0) !== '.' || dotOk) { var m if (negate && !prefix) { m = !e.match(pn) } else { m = e.match(pn) } if (m) matchedEntries.push(e) } } var len = matchedEntries.length // If there are no matched entries, then nothing matches. if (len === 0) return // if this is the last remaining pattern bit, then no need for // an additional stat *unless* the user has specified mark or // stat explicitly. We know they exist, since readdir returned // them. if (remain.length === 1 && !this.mark && !this.stat) { if (!this.matches[index]) this.matches[index] = Object.create(null) for (var i = 0; i < len; i ++) { var e = matchedEntries[i] if (prefix) { if (prefix.slice(-1) !== '/') e = prefix + '/' + e else e = prefix + e } if (e.charAt(0) === '/' && !this.nomount) { e = path.join(this.root, e) } this.matches[index][e] = true } // This was the last one, and no stats were needed return } // now test all matched entries as stand-ins for that part // of the pattern. remain.shift() for (var i = 0; i < len; i ++) { var e = matchedEntries[i] var newPattern if (prefix) newPattern = [prefix, e] else newPattern = [e] this._process(newPattern.concat(remain), index, inGlobStar) } } GlobSync.prototype._emitMatch = function (index, e) { var abs = this._makeAbs(e) if (this.mark) e = this._mark(e) if (this.matches[index][e]) return if (this.nodir) { var c = this.cache[this._makeAbs(e)] if (c === 'DIR' || Array.isArray(c)) return } this.matches[index][e] = true if (this.stat) this._stat(e) } GlobSync.prototype._readdirInGlobStar = function (abs) { // follow all symlinked directories forever // just proceed as if this is a non-globstar situation if (this.follow) return this._readdir(abs, false) var entries var lstat var stat try { lstat = fs.lstatSync(abs) } catch (er) { // lstat failed, doesn't exist return null } var isSym = lstat.isSymbolicLink() this.symlinks[abs] = isSym // If it's not a symlink or a dir, then it's definitely a regular file. // don't bother doing a readdir in that case. if (!isSym && !lstat.isDirectory()) this.cache[abs] = 'FILE' else entries = this._readdir(abs, false) return entries } GlobSync.prototype._readdir = function (abs, inGlobStar) { var entries if (inGlobStar && !ownProp(this.symlinks, abs)) return this._readdirInGlobStar(abs) if (ownProp(this.cache, abs)) { var c = this.cache[abs] if (!c || c === 'FILE') return null if (Array.isArray(c)) return c } try { return this._readdirEntries(abs, fs.readdirSync(abs)) } catch (er) { this._readdirError(abs, er) return null } } GlobSync.prototype._readdirEntries = function (abs, entries) { // if we haven't asked to stat everything, then just // assume that everything in there exists, so we can avoid // having to stat it a second time. if (!this.mark && !this.stat) { for (var i = 0; i < entries.length; i ++) { var e = entries[i] if (abs === '/') e = abs + e else e = abs + '/' + e this.cache[e] = true } } this.cache[abs] = entries // mark and cache dir-ness return entries } GlobSync.prototype._readdirError = function (f, er) { // handle errors, and cache the information switch (er.code) { case 'ENOTSUP': // https://github.com/isaacs/node-glob/issues/205 case 'ENOTDIR': // totally normal. 
means it *does* exist. this.cache[this._makeAbs(f)] = 'FILE' break case 'ENOENT': // not terribly unusual case 'ELOOP': case 'ENAMETOOLONG': case 'UNKNOWN': this.cache[this._makeAbs(f)] = false break default: // some unusual error. Treat as failure. this.cache[this._makeAbs(f)] = false if (this.strict) throw er if (!this.silent) console.error('glob error', er) break } } GlobSync.prototype._processGlobStar = function (prefix, read, abs, remain, index, inGlobStar) { var entries = this._readdir(abs, inGlobStar) // no entries means not a dir, so it can never have matches // foo.txt/** doesn't match foo.txt if (!entries) return // test without the globstar, and with every child both below // and replacing the globstar. var remainWithoutGlobStar = remain.slice(1) var gspref = prefix ? [ prefix ] : [] var noGlobStar = gspref.concat(remainWithoutGlobStar) // the noGlobStar pattern exits the inGlobStar state this._process(noGlobStar, index, false) var len = entries.length var isSym = this.symlinks[abs] // If it's a symlink, and we're in a globstar, then stop if (isSym && inGlobStar) return for (var i = 0; i < len; i++) { var e = entries[i] if (e.charAt(0) === '.' && !this.dot) continue // these two cases enter the inGlobStar state var instead = gspref.concat(entries[i], remainWithoutGlobStar) this._process(instead, index, true) var below = gspref.concat(entries[i], remain) this._process(below, index, true) } } GlobSync.prototype._processSimple = function (prefix, index) { // XXX review this. Shouldn't it be doing the mounting etc // before doing stat? kinda weird? var exists = this._stat(prefix) if (!this.matches[index]) this.matches[index] = Object.create(null) // If it doesn't exist, then just mark the lack of results if (!exists) return if (prefix && isAbsolute(prefix) && !this.nomount) { var trail = /[\/\\]$/.test(prefix) if (prefix.charAt(0) === '/') { prefix = path.join(this.root, prefix) } else { prefix = path.resolve(this.root, prefix) if (trail) prefix += '/' } } if (process.platform === 'win32') prefix = prefix.replace(/\\/g, '/') // Mark this as a match this.matches[index][prefix] = true } // Returns either 'DIR', 'FILE', or false GlobSync.prototype._stat = function (f) { var abs = this._makeAbs(f) var needDir = f.slice(-1) === '/' if (f.length > this.maxLength) return false if (!this.stat && ownProp(this.cache, abs)) { var c = this.cache[abs] if (Array.isArray(c)) c = 'DIR' // It exists, but maybe not how we need it if (!needDir || c === 'DIR') return c if (needDir && c === 'FILE') return false // otherwise we have to stat, because maybe c=true // if we know it exists, but not what it is. } var exists var stat = this.statCache[abs] if (!stat) { var lstat try { lstat = fs.lstatSync(abs) } catch (er) { return false } if (lstat.isSymbolicLink()) { try { stat = fs.statSync(abs) } catch (er) { stat = lstat } } else { stat = lstat } } this.statCache[abs] = stat var c = stat.isDirectory() ? 
'DIR' : 'FILE'
  this.cache[abs] = this.cache[abs] || c

  if (needDir && c !== 'DIR')
    return false

  return c
}

GlobSync.prototype._mark = function (p) {
  return common.mark(this, p)
}

GlobSync.prototype._makeAbs = function (f) {
  return common.makeAbs(this, f)
}
npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/0000755000000000000000000000000012631326456021520 5ustar 00000000000000
npm_3.5.2.orig/node_modules/glob/node_modules/path-is-absolute/0000755000000000000000000000000012631326456022730 5ustar 00000000000000
npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/LICENSE0000644000000000000000000000137512631326456022533 0ustar 00000000000000
The ISC License

Copyright (c) Isaac Z. Schlueter and Contributors

Permission to use, copy, modify, and/or distribute this software for any
purpose with or without fee is hereby granted, provided that the above
copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/README.md0000644000000000000000000001471412631326456023006 0ustar 00000000000000
# minimatch

A minimal matching utility.

[![Build Status](https://secure.travis-ci.org/isaacs/minimatch.png)](http://travis-ci.org/isaacs/minimatch)

This is the matching library used internally by npm.

It works by converting glob expressions into JavaScript `RegExp` objects.

## Usage

```javascript
var minimatch = require("minimatch")

minimatch("bar.foo", "*.foo") // true!
minimatch("bar.foo", "*.bar") // false!
minimatch("bar.foo", "*.+(bar|foo)", { debug: true }) // true, and noisy!
```

## Features

Supports these glob features:

* Brace Expansion
* Extended glob matching
* "Globstar" `**` matching

See:

* `man sh`
* `man bash`
* `man 3 fnmatch`
* `man 5 gitignore`

## Minimatch Class

Create a minimatch object by instantiating the `minimatch.Minimatch` class.

```javascript
var Minimatch = require("minimatch").Minimatch
var mm = new Minimatch(pattern, options)
```

### Properties

* `pattern` The original pattern the minimatch object represents.
* `options` The options supplied to the constructor.
* `set` A 2-dimensional array of regexp or string expressions. Each row
  in the array corresponds to a brace-expanded pattern. Each item in the
  row corresponds to a single path-part. For example, the pattern
  `{a,b/c}/d` would expand to a set of patterns like:

        [ [ a, d ]
        , [ b, c, d ] ]

  If a portion of the pattern doesn't have any "magic" in it (that is,
  it's something like `"foo"` rather than `fo*o?`), then it will be left
  as a string rather than converted to a regular expression.
* `regexp` Created by the `makeRe` method. A single regular expression
  expressing the entire pattern. This is useful in cases where you wish
  to use the pattern somewhat like `fnmatch(3)` with `FNM_PATH` enabled.
* `negate` True if the pattern is negated.
* `comment` True if the pattern is a comment.
* `empty` True if the pattern is `""`.

### Methods

* `makeRe` Generate the `regexp` member if necessary, and return it.
  Will return `false` if the pattern is invalid.
* `match(fname)` Return true if the filename matches the pattern, or
  false otherwise.
* `matchOne(fileArray, patternArray, partial)` Take a `/`-split
  filename, and match it against a single row in the `regExpSet`. This
  method is mainly for internal use, but is exposed so that it can be
  used by a glob-walker that needs to avoid excessive filesystem calls.

All other methods are internal, and will be called as necessary.

## Functions

The top-level exported function has a `cache` property, which is an LRU
cache set to store 100 items. So, calling these methods repeatedly with
the same pattern and options will use the same Minimatch object, saving
the cost of parsing it multiple times.

### minimatch(path, pattern, options)

Main export. Tests a path against the pattern using the options.

```javascript
var isJS = minimatch(file, "*.js", { matchBase: true })
```

### minimatch.filter(pattern, options)

Returns a function that tests its supplied argument, suitable for use
with `Array.filter`. Example:

```javascript
var javascripts = fileList.filter(minimatch.filter("*.js", {matchBase: true}))
```

### minimatch.match(list, pattern, options)

Match against the list of files, in the style of fnmatch or glob. If
nothing is matched, and options.nonull is set, then return a list
containing the pattern itself.

```javascript
var javascripts = minimatch.match(fileList, "*.js", {matchBase: true})
```

### minimatch.makeRe(pattern, options)

Make a regular expression object from the pattern.

## Options

All options are `false` by default.

### debug

Dump a ton of stuff to stderr.

### nobrace

Do not expand `{a,b}` and `{1..3}` brace sets.

### noglobstar

Disable `**` matching against multiple folder names.

### dot

Allow patterns to match filenames starting with a period, even if the
pattern does not explicitly have a period in that spot.

Note that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot`
is set.

### noext

Disable "extglob" style patterns like `+(a|b)`.

### nocase

Perform a case-insensitive match.

### nonull

When a match is not found by `minimatch.match`, return a list containing
the pattern itself if this option is set. When not set, an empty list is
returned if there are no matches.

### matchBase

If set, then patterns without slashes will be matched against the
basename of the path if it contains slashes. For example, `a?b` would
match the path `/xyz/123/acb`, but not `/xyz/acb/123`.

### nocomment

Suppress the behavior of treating `#` at the start of a pattern as a
comment.

### nonegate

Suppress the behavior of treating a leading `!` character as negation.

### flipNegate

Returns from negate expressions the same as if they were not negated.
(Ie, true on a hit, false on a miss.)

## Comparisons to other fnmatch/glob implementations

While strict compliance with the existing standards is a worthwhile
goal, some discrepancies exist between minimatch and other
implementations, and are intentional.

If the pattern starts with a `!` character, then it is negated. Set the
`nonegate` flag to suppress this behavior, and treat leading `!`
characters normally. This is perhaps relevant if you wish to start the
pattern with a negative extglob pattern like `!(a|B)`. Multiple `!`
characters at the start of a pattern will negate the pattern multiple
times.

If a pattern starts with `#`, then it is treated as a comment, and will
not match anything. Use `\#` to match a literal `#` at the start of a
line, or set the `nocomment` flag to suppress this behavior.
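Before the remaining comparison notes, here is a minimal sketch pulling the exported functions above together; the `fileList` contents are made up for the example:

```javascript
var minimatch = require("minimatch")

var fileList = ["a.js", "b.txt", "lib/c.js"]   // sample data

// Direct test of one path against one pattern.
minimatch("bar.foo", "*.foo")                          // true

// Array.prototype.filter helper; matchBase compares basenames.
fileList.filter(minimatch.filter("*.js", { matchBase: true }))
// => [ 'a.js', 'lib/c.js' ]

// fnmatch-style list matching, with the shell-like nonull fallback.
minimatch.match([], "*.nope", { nonull: true })        // [ '*.nope' ]

// Compile once and reuse the RegExp for repeated tests.
var re = minimatch.makeRe("*.js")
re.test("x.js")                                        // true

// Comment and negation behavior, as described above.
minimatch("#fff", "#fff")                              // false (comment)
minimatch("bar.foo", "!*.foo")                         // false (negated)
```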
The double-star character `**` is supported by default, unless the `noglobstar` flag is set. This is supported in the manner of bsdglob and bash 4.1, where `**` only has special significance if it is the only thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but `a/**b` will not. If an escaped pattern has no matches, and the `nonull` flag is set, then minimatch.match returns the pattern as-provided, rather than interpreting the character escapes. For example, `minimatch.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than `"*a?"`. This is akin to setting the `nullglob` option in bash, except that it does not resolve escaped pattern characters. If brace expansion is not disabled, then it is performed before any other interpretation of the glob pattern. Thus, a pattern like `+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded **first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are checked for validity. Since those two are valid, matching proceeds. npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/minimatch.js0000644000000000000000000006044012631326456024033 0ustar 00000000000000module.exports = minimatch minimatch.Minimatch = Minimatch var path = { sep: '/' } try { path = require('path') } catch (er) {} var GLOBSTAR = minimatch.GLOBSTAR = Minimatch.GLOBSTAR = {} var expand = require('brace-expansion') // any single thing other than / // don't need to escape / when using new RegExp() var qmark = '[^/]' // * => any number of characters var star = qmark + '*?' // ** when dots are allowed. Anything goes, except .. and . // not (^ or / followed by one or two dots followed by $ or /), // followed by anything, any number of times. var twoStarDot = '(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?' // not a ^ or / followed by a dot, // followed by anything, any number of times. var twoStarNoDot = '(?:(?!(?:\\\/|^)\\.).)*?' // characters that need to be escaped in RegExp. var reSpecials = charSet('().*{}+?[]^$\\!') // "abc" -> { a:true, b:true, c:true } function charSet (s) { return s.split('').reduce(function (set, c) { set[c] = true return set }, {}) } // normalizes slashes. var slashSplit = /\/+/ minimatch.filter = filter function filter (pattern, options) { options = options || {} return function (p, i, list) { return minimatch(p, pattern, options) } } function ext (a, b) { a = a || {} b = b || {} var t = {} Object.keys(b).forEach(function (k) { t[k] = b[k] }) Object.keys(a).forEach(function (k) { t[k] = a[k] }) return t } minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return minimatch var orig = minimatch var m = function minimatch (p, pattern, options) { return orig.minimatch(p, pattern, ext(def, options)) } m.Minimatch = function Minimatch (pattern, options) { return new orig.Minimatch(pattern, ext(def, options)) } return m } Minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return Minimatch return minimatch.defaults(def).Minimatch } function minimatch (p, pattern, options) { if (typeof pattern !== 'string') { throw new TypeError('glob pattern string required') } if (!options) options = {} // shortcut: comments match nothing. 
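// (e.g. minimatch("#fff", "#fff") returns false here, because the pattern
//  is treated as a comment; with { nocomment: true } it matches literally)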
if (!options.nocomment && pattern.charAt(0) === '#') { return false } // "" only matches "" if (pattern.trim() === '') return p === '' return new Minimatch(pattern, options).match(p) } function Minimatch (pattern, options) { if (!(this instanceof Minimatch)) { return new Minimatch(pattern, options) } if (typeof pattern !== 'string') { throw new TypeError('glob pattern string required') } if (!options) options = {} pattern = pattern.trim() // windows support: need to use /, not \ if (path.sep !== '/') { pattern = pattern.split(path.sep).join('/') } this.options = options this.set = [] this.pattern = pattern this.regexp = null this.negate = false this.comment = false this.empty = false // make the set of regexps etc. this.make() } Minimatch.prototype.debug = function () {} Minimatch.prototype.make = make function make () { // don't do it more than once. if (this._made) return var pattern = this.pattern var options = this.options // empty patterns and comments match nothing. if (!options.nocomment && pattern.charAt(0) === '#') { this.comment = true return } if (!pattern) { this.empty = true return } // step 1: figure out negation, etc. this.parseNegate() // step 2: expand braces var set = this.globSet = this.braceExpand() if (options.debug) this.debug = console.error this.debug(this.pattern, set) // step 3: now we have a set, so turn each one into a series of path-portion // matching patterns. // These will be regexps, except in the case of "**", which is // set to the GLOBSTAR object for globstar behavior, // and will not contain any / characters set = this.globParts = set.map(function (s) { return s.split(slashSplit) }) this.debug(this.pattern, set) // glob --> regexps set = set.map(function (s, si, set) { return s.map(this.parse, this) }, this) this.debug(this.pattern, set) // filter out everything that didn't compile properly. set = set.filter(function (s) { return s.indexOf(false) === -1 }) this.debug(this.pattern, set) this.set = set } Minimatch.prototype.parseNegate = parseNegate function parseNegate () { var pattern = this.pattern var negate = false var options = this.options var negateOffset = 0 if (options.nonegate) return for (var i = 0, l = pattern.length ; i < l && pattern.charAt(i) === '!' ; i++) { negate = !negate negateOffset++ } if (negateOffset) this.pattern = pattern.substr(negateOffset) this.negate = negate } // Brace expansion: // a{b,c}d -> abd acd // a{b,}c -> abc ac // a{0..3}d -> a0d a1d a2d a3d // a{b,c{d,e}f}g -> abg acdfg acefg // a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg // // Invalid sets are not expanded. // a{2..}b -> a{2..}b // a{b}c -> a{b}c minimatch.braceExpand = function (pattern, options) { return braceExpand(pattern, options) } Minimatch.prototype.braceExpand = braceExpand function braceExpand (pattern, options) { if (!options) { if (this instanceof Minimatch) { options = this.options } else { options = {} } } pattern = typeof pattern === 'undefined' ? this.pattern : pattern if (typeof pattern === 'undefined') { throw new Error('undefined pattern') } if (options.nobrace || !pattern.match(/\{.*\}/)) { // shortcut. no need to expand. return [pattern] } return expand(pattern) } // parse a component of the expanded set. // At this point, no pattern may contain "/" in it // so we're going to return a 2d array, where each entry is the full // pattern, split on '/', and then turned into a regular expression. // A regexp is made at the end which joins each array with an // escaped /, and another full one which joins each regexp with |. 
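// For example, the single pattern 'a/{b,c}/*' brace-expands to the globSet
// ['a/b/*', 'a/c/*'], which splits into the globParts
// [['a', 'b', '*'], ['a', 'c', '*']]; parse() below then turns each part
// into a literal string (e.g. 'a'), a regexp built on the [^/] classes
// above (e.g. '*'), or the GLOBSTAR marker (a bare '**').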
// // Following the lead of Bash 4.1, note that "**" only has special meaning // when it is the *only* thing in a path portion. Otherwise, any series // of * is equivalent to a single *. Globstar behavior is enabled by // default, and can be disabled by setting options.noglobstar. Minimatch.prototype.parse = parse var SUBPARSE = {} function parse (pattern, isSub) { var options = this.options // shortcuts if (!options.noglobstar && pattern === '**') return GLOBSTAR if (pattern === '') return '' var re = '' var hasMagic = !!options.nocase var escaping = false // ? => one single character var patternListStack = [] var negativeLists = [] var plType var stateChar var inClass = false var reClassStart = -1 var classStart = -1 // . and .. never match anything that doesn't start with ., // even when options.dot is set. var patternStart = pattern.charAt(0) === '.' ? '' // anything // not (start or / followed by . or .. followed by / or end) : options.dot ? '(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))' : '(?!\\.)' var self = this function clearStateChar () { if (stateChar) { // we had some state-tracking character // that wasn't consumed by this pass. switch (stateChar) { case '*': re += star hasMagic = true break case '?': re += qmark hasMagic = true break default: re += '\\' + stateChar break } self.debug('clearStateChar %j %j', stateChar, re) stateChar = false } } for (var i = 0, len = pattern.length, c ; (i < len) && (c = pattern.charAt(i)) ; i++) { this.debug('%s\t%s %s %j', pattern, i, re, c) // skip over any that are escaped. if (escaping && reSpecials[c]) { re += '\\' + c escaping = false continue } switch (c) { case '/': // completely not allowed, even escaped. // Should already be path-split by now. return false case '\\': clearStateChar() escaping = true continue // the various stateChar values // for the "extglob" stuff. case '?': case '*': case '+': case '@': case '!': this.debug('%s\t%s %s %j <-- stateChar', pattern, i, re, c) // all of those are literals inside a class, except that // the glob [!a] means [^a] in regexp if (inClass) { this.debug(' in class') if (c === '!' && i === classStart + 1) c = '^' re += c continue } // if we already have a stateChar, then it means // that there was something like ** or +? in there. // Handle the stateChar, then proceed with this one. self.debug('call clearStateChar %j', stateChar) clearStateChar() stateChar = c // if extglob is disabled, then +(asdf|foo) isn't a thing. // just clear the statechar *now*, rather than even diving into // the patternList stuff. if (options.noext) clearStateChar() continue case '(': if (inClass) { re += '(' continue } if (!stateChar) { re += '\\(' continue } plType = stateChar patternListStack.push({ type: plType, start: i - 1, reStart: re.length }) // negation is (?:(?!js)[^/]*) re += stateChar === '!' ? 
'(?:(?!(?:' : '(?:' this.debug('plType %j %j', stateChar, re) stateChar = false continue case ')': if (inClass || !patternListStack.length) { re += '\\)' continue } clearStateChar() hasMagic = true re += ')' var pl = patternListStack.pop() plType = pl.type // negation is (?:(?!js)[^/]*) // The others are (?:) switch (plType) { case '!': negativeLists.push(pl) re += ')[^/]*?)' pl.reEnd = re.length break case '?': case '+': case '*': re += plType break case '@': break // the default anyway } continue case '|': if (inClass || !patternListStack.length || escaping) { re += '\\|' escaping = false continue } clearStateChar() re += '|' continue // these are mostly the same in regexp and glob case '[': // swallow any state-tracking char before the [ clearStateChar() if (inClass) { re += '\\' + c continue } inClass = true classStart = i reClassStart = re.length re += c continue case ']': // a right bracket shall lose its special // meaning and represent itself in // a bracket expression if it occurs // first in the list. -- POSIX.2 2.8.3.2 if (i === classStart + 1 || !inClass) { re += '\\' + c escaping = false continue } // handle the case where we left a class open. // "[z-a]" is valid, equivalent to "\[z-a\]" if (inClass) { // split where the last [ was, make sure we don't have // an invalid re. if so, re-walk the contents of the // would-be class to re-translate any characters that // were passed through as-is // TODO: It would probably be faster to determine this // without a try/catch and a new RegExp, but it's tricky // to do safely. For now, this is safe and works. var cs = pattern.substring(classStart + 1, i) try { RegExp('[' + cs + ']') } catch (er) { // not a valid class! var sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + '\\[' + sp[0] + '\\]' hasMagic = hasMagic || sp[1] inClass = false continue } } // finish up the class. hasMagic = true inClass = false re += c continue default: // swallow any state char that wasn't consumed clearStateChar() if (escaping) { // no need escaping = false } else if (reSpecials[c] && !(c === '^' && inClass)) { re += '\\' } re += c } // switch } // for // handle the case where we left a class open. // "[abc" is valid, equivalent to "\[abc" if (inClass) { // split where the last [ was, and escape it // this is a huge pita. We now have to re-walk // the contents of the would-be class to re-translate // any characters that were passed through as-is cs = pattern.substr(classStart + 1) sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + '\\[' + sp[0] hasMagic = hasMagic || sp[1] } // handle the case where we had a +( thing at the *end* // of the pattern. // each pattern list stack adds 3 chars, and we need to go through // and escape any | chars that were passed through as-is for the regexp. // Go through and escape them, taking care not to double-escape any // | chars that were already escaped. for (pl = patternListStack.pop(); pl; pl = patternListStack.pop()) { var tail = re.slice(pl.reStart + 3) // maybe some even number of \, then maybe 1 \, followed by a | tail = tail.replace(/((?:\\{2})*)(\\?)\|/g, function (_, $1, $2) { if (!$2) { // the | isn't already escaped, so escape it. $2 = '\\' } // need to escape all those slashes *again*, without escaping the // one that we need for escaping the | character. As it works out, // escaping an even number of slashes can be done by simply repeating // it exactly after itself. That's why this trick works. // // I am sorry that you have to see this. 
return $1 + $1 + $2 + '|' }) this.debug('tail=%j\n %s', tail, tail) var t = pl.type === '*' ? star : pl.type === '?' ? qmark : '\\' + pl.type hasMagic = true re = re.slice(0, pl.reStart) + t + '\\(' + tail } // handle trailing things that only matter at the very end. clearStateChar() if (escaping) { // trailing \\ re += '\\\\' } // only need to apply the nodot start if the re starts with // something that could conceivably capture a dot var addPatternStart = false switch (re.charAt(0)) { case '.': case '[': case '(': addPatternStart = true } // Hack to work around lack of negative lookbehind in JS // A pattern like: *.!(x).!(y|z) needs to ensure that a name // like 'a.xyz.yz' doesn't match. So, the first negative // lookahead, has to look ALL the way ahead, to the end of // the pattern. for (var n = negativeLists.length - 1; n > -1; n--) { var nl = negativeLists[n] var nlBefore = re.slice(0, nl.reStart) var nlFirst = re.slice(nl.reStart, nl.reEnd - 8) var nlLast = re.slice(nl.reEnd - 8, nl.reEnd) var nlAfter = re.slice(nl.reEnd) nlLast += nlAfter // Handle nested stuff like *(*.js|!(*.json)), where open parens // mean that we should *not* include the ) in the bit that is considered // "after" the negated section. var openParensBefore = nlBefore.split('(').length - 1 var cleanAfter = nlAfter for (i = 0; i < openParensBefore; i++) { cleanAfter = cleanAfter.replace(/\)[+*?]?/, '') } nlAfter = cleanAfter var dollar = '' if (nlAfter === '' && isSub !== SUBPARSE) { dollar = '$' } var newRe = nlBefore + nlFirst + nlAfter + dollar + nlLast re = newRe } // if the re is not "" at this point, then we need to make sure // it doesn't match against an empty path part. // Otherwise a/* will match a/, which it should not. if (re !== '' && hasMagic) { re = '(?=.)' + re } if (addPatternStart) { re = patternStart + re } // parsing just a piece of a larger pattern. if (isSub === SUBPARSE) { return [re, hasMagic] } // skip the regexp for non-magical patterns // unescape anything in it, though, so that it'll be // an exact match against a file etc. if (!hasMagic) { return globUnescape(pattern) } var flags = options.nocase ? 'i' : '' var regExp = new RegExp('^' + re + '$', flags) regExp._glob = pattern regExp._src = re return regExp } minimatch.makeRe = function (pattern, options) { return new Minimatch(pattern, options || {}).makeRe() } Minimatch.prototype.makeRe = makeRe function makeRe () { if (this.regexp || this.regexp === false) return this.regexp // at this point, this.set is a 2d array of partial // pattern strings, or "**". // // It's better to use .match(). This function shouldn't // be used, really, but it's pretty convenient sometimes, // when you just want to work with a regex. var set = this.set if (!set.length) { this.regexp = false return this.regexp } var options = this.options var twoStar = options.noglobstar ? star : options.dot ? twoStarDot : twoStarNoDot var flags = options.nocase ? 'i' : '' var re = set.map(function (pattern) { return pattern.map(function (p) { return (p === GLOBSTAR) ? twoStar : (typeof p === 'string') ? regExpEscape(p) : p._src }).join('\\\/') }).join('|') // must match entire pattern // ending in a * or ** will make it less strict. re = '^(?:' + re + ')$' // can match anything, as long as it's not this. if (this.negate) re = '^(?!' 
+ re + ').*$' try { this.regexp = new RegExp(re, flags) } catch (ex) { this.regexp = false } return this.regexp } minimatch.match = function (list, pattern, options) { options = options || {} var mm = new Minimatch(pattern, options) list = list.filter(function (f) { return mm.match(f) }) if (mm.options.nonull && !list.length) { list.push(pattern) } return list } Minimatch.prototype.match = match function match (f, partial) { this.debug('match', f, this.pattern) // short-circuit in the case of busted things. // comments, etc. if (this.comment) return false if (this.empty) return f === '' if (f === '/' && partial) return true var options = this.options // windows: need to use /, not \ if (path.sep !== '/') { f = f.split(path.sep).join('/') } // treat the test path as a set of pathparts. f = f.split(slashSplit) this.debug(this.pattern, 'split', f) // just ONE of the pattern sets in this.set needs to match // in order for it to be valid. If negating, then just one // match means that we have failed. // Either way, return on the first hit. var set = this.set this.debug(this.pattern, 'set', set) // Find the basename of the path by looking for the last non-empty segment var filename var i for (i = f.length - 1; i >= 0; i--) { filename = f[i] if (filename) break } for (i = 0; i < set.length; i++) { var pattern = set[i] var file = f if (options.matchBase && pattern.length === 1) { file = [filename] } var hit = this.matchOne(file, pattern, partial) if (hit) { if (options.flipNegate) return true return !this.negate } } // didn't get any hits. this is success if it's a negative // pattern, failure otherwise. if (options.flipNegate) return false return this.negate } // set partial to true to test if, for example, // "/a/b" matches the start of "/*/b/*/d" // Partial means, if you run out of file before you run // out of pattern, then that's fine, as long as all // the parts match. Minimatch.prototype.matchOne = function (file, pattern, partial) { var options = this.options this.debug('matchOne', { 'this': this, file: file, pattern: pattern }) this.debug('matchOne', file.length, pattern.length) for (var fi = 0, pi = 0, fl = file.length, pl = pattern.length ; (fi < fl) && (pi < pl) ; fi++, pi++) { this.debug('matchOne loop') var p = pattern[pi] var f = file[fi] this.debug(pattern, p, f) // should be impossible. // some invalid regexp stuff in the set. if (p === false) return false if (p === GLOBSTAR) { this.debug('GLOBSTAR', [pattern, p, f]) // "**" // a/**/b/**/c would match the following: // a/b/x/y/z/c // a/x/y/z/b/c // a/b/x/b/x/c // a/b/c // To do this, take the rest of the pattern after // the **, and see if it would match the file remainder. // If so, return success. // If not, the ** "swallows" a segment, and try again. // This is recursively awful. // // a/**/b/**/c matching a/b/x/y/z/c // - a matches a // - doublestar // - matchOne(b/x/y/z/c, b/**/c) // - b matches b // - doublestar // - matchOne(x/y/z/c, c) -> no // - matchOne(y/z/c, c) -> no // - matchOne(z/c, c) -> no // - matchOne(c, c) yes, hit var fr = fi var pr = pi + 1 if (pr === pl) { this.debug('** at the end') // a ** at the end will just swallow the rest. // We have found a match. // however, it will not swallow /.x, unless // options.dot is set. // . and .. are *never* matched by **, for explosively // exponential reasons. for (; fi < fl; fi++) { if (file[fi] === '.' || file[fi] === '..' || (!options.dot && file[fi].charAt(0) === '.')) return false } return true } // ok, let's see if we can swallow whatever we can. 
while (fr < fl) { var swallowee = file[fr] this.debug('\nglobstar while', file, fr, pattern, pr, swallowee) // XXX remove this slice. Just pass the start index. if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) { this.debug('globstar found match!', fr, fl, swallowee) // found a match. return true } else { // can't swallow "." or ".." ever. // can only swallow ".foo" when explicitly asked. if (swallowee === '.' || swallowee === '..' || (!options.dot && swallowee.charAt(0) === '.')) { this.debug('dot detected!', file, fr, pattern, pr) break } // ** swallows a segment, and continue. this.debug('globstar swallow a segment, and continue') fr++ } } // no match was found. // However, in partial mode, we can't say this is necessarily over. // If there's more *pattern* left, then if (partial) { // ran out of file this.debug('\n>>> no match, partial?', file, fr, pattern, pr) if (fr === fl) return true } return false } // something other than ** // non-magic patterns just have to match exactly // patterns with magic have been turned into regexps. var hit if (typeof p === 'string') { if (options.nocase) { hit = f.toLowerCase() === p.toLowerCase() } else { hit = f === p } this.debug('string match', p, f, hit) } else { hit = f.match(p) this.debug('pattern match', p, f, hit) } if (!hit) return false } // Note: ending in / means that we'll get a final "" // at the end of the pattern. This can only match a // corresponding "" at the end of the file. // If the file ends in /, then it can only match a // a pattern that ends in /, unless the pattern just // doesn't have any more for it. But, a/b/ should *not* // match "a/b/*", even though "" matches against the // [^/]*? pattern, except in partial mode, where it might // simply not be reached yet. // However, a/b/ should still satisfy a/* // now either we fell off the end of the pattern, or we're done. if (fi === fl && pi === pl) { // ran out of pattern and filename at the same time. // an exact hit! return true } else if (fi === fl) { // ran out of file, but still had pattern left. // this is ok if we're doing the match as part of // a glob fs traversal. return partial } else if (pi === pl) { // ran out of pattern, still have file left. // this is only acceptable if we're on the very last // empty segment of a file with a trailing slash. // a/* should match a/b/ var emptyFileEnd = (fi === fl - 1) && (file[fi] === '') return emptyFileEnd } // should be unreachable. throw new Error('wtf?') } // replace stuff like \* with * function globUnescape (s) { return s.replace(/\\(.)/g, '$1') } function regExpEscape (s) { return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, '\\$&') } npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/0000755000000000000000000000000012631326456024175 5ustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/package.json0000644000000000000000000000271512631326456024013 0ustar 00000000000000{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, "name": "minimatch", "description": "a glob matcher in javascript", "version": "3.0.0", "repository": { "type": "git", "url": "git://github.com/isaacs/minimatch.git" }, "main": "minimatch.js", "scripts": { "posttest": "standard minimatch.js test/*.js", "test": "tap test/*.js" }, "engines": { "node": "*" }, "dependencies": { "brace-expansion": "^1.0.0" }, "devDependencies": { "standard": "^3.7.2", "tap": "^1.2.0" }, "license": "ISC", "files": [ "minimatch.js" ], "gitHead": "270dbea567f0af6918cb18103e98c612aa717a20", "bugs": { "url": "https://github.com/isaacs/minimatch/issues" }, "homepage": "https://github.com/isaacs/minimatch#readme", "_id": "minimatch@3.0.0", "_shasum": "5236157a51e4f004c177fb3c527ff7dd78f0ef83", "_from": "minimatch@>=2.0.0 <3.0.0||>=3.0.0 <4.0.0", "_npmVersion": "3.3.2", "_nodeVersion": "4.0.0", "_npmUser": { "name": "isaacs", "email": "isaacs@npmjs.com" }, "dist": { "shasum": "5236157a51e4f004c177fb3c527ff7dd78f0ef83", "tarball": "http://registry.npmjs.org/minimatch/-/minimatch-3.0.0.tgz" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.0.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/0000755000000000000000000000000012631326456027253 5ustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/.npmignore0000644000000000000000000000003412631326456031247 0ustar 00000000000000test .gitignore .travis.yml npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/README.md0000644000000000000000000000673112631326456030541 0ustar 00000000000000# brace-expansion [Brace expansion](https://www.gnu.org/software/bash/manual/html_node/Brace-Expansion.html), as known from sh/bash, in JavaScript. [![build status](https://secure.travis-ci.org/juliangruber/brace-expansion.svg)](http://travis-ci.org/juliangruber/brace-expansion) [![downloads](https://img.shields.io/npm/dm/brace-expansion.svg)](https://www.npmjs.org/package/brace-expansion) [![testling badge](https://ci.testling.com/juliangruber/brace-expansion.png)](https://ci.testling.com/juliangruber/brace-expansion) ## Example ```js var expand = require('brace-expansion'); expand('file-{a,b,c}.jpg') // => ['file-a.jpg', 'file-b.jpg', 'file-c.jpg'] expand('-v{,,}') // => ['-v', '-v', '-v'] expand('file{0..2}.jpg') // => ['file0.jpg', 'file1.jpg', 'file2.jpg'] expand('file-{a..c}.jpg') // => ['file-a.jpg', 'file-b.jpg', 'file-c.jpg'] expand('file{2..0}.jpg') // => ['file2.jpg', 'file1.jpg', 'file0.jpg'] expand('file{0..4..2}.jpg') // => ['file0.jpg', 'file2.jpg', 'file4.jpg'] expand('file-{a..e..2}.jpg') // => ['file-a.jpg', 'file-c.jpg', 'file-e.jpg'] expand('file{00..10..5}.jpg') // => ['file00.jpg', 'file05.jpg', 'file10.jpg'] expand('{{A..C},{a..c}}') // => ['A', 'B', 'C', 'a', 'b', 'c'] expand('ppp{,config,oe{,conf}}') // => ['ppp', 'pppconfig', 'pppoe', 'pppoeconf'] ``` ## API ```js var expand = require('brace-expansion'); ``` ### var expanded = expand(str) Return an array of all possible and valid expansions of `str`. If none are found, `[str]` is returned. Valid expansions are: ```js /^(.*,)+(.+)?$/ // {a,b,...} ``` A comma seperated list of options, like `{a,b}` or `{a,{b,c}}` or `{,a,}`. 
```js /^-?\d+\.\.-?\d+(\.\.-?\d+)?$/ // {x..y[..incr]} ``` A numeric sequence from `x` to `y` inclusive, with optional increment. If `x` or `y` start with a leading `0`, all the numbers will be padded to have equal length. Negative numbers and backwards iteration work too. ```js /^[a-zA-Z]\.\.[a-zA-Z](\.\.-?\d+)?$/ // {x..y[..incr]} ``` An alphabetic sequence from `x` to `y` inclusive, with optional increment. `x` and `y` must be exactly one character, and if given, `incr` must be a number. For compatibility reasons, the string `${` is not eligible for brace expansion. ## Installation With [npm](https://npmjs.org) do: ```bash npm install brace-expansion ``` ## Contributors - [Julian Gruber](https://github.com/juliangruber) - [Isaac Z. Schlueter](https://github.com/isaacs) ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/example.js0000644000000000000000000000061612631326456031247 0ustar 00000000000000var expand = require('./'); console.log(expand('http://any.org/archive{1996..1999}/vol{1..4}/part{a,b,c}.html')); console.log(expand('http://www.numericals.com/file{1..100..10}.txt')); console.log(expand('http://www.letters.com/file{a..z..2}.txt')); console.log(expand('mkdir /usr/local/src/bash/{old,new,dist,bugs}')); console.log(expand('chown root /usr/{ucb/{ex,edit},lib/{ex?.?*,how_ex}}')); npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/index.js0000644000000000000000000001035712631326456030726 0ustar 00000000000000var concatMap = require('concat-map'); var balanced = require('balanced-match'); module.exports = expandTop; var escSlash = '\0SLASH'+Math.random()+'\0'; var escOpen = '\0OPEN'+Math.random()+'\0'; var escClose = '\0CLOSE'+Math.random()+'\0'; var escComma = '\0COMMA'+Math.random()+'\0'; var escPeriod = '\0PERIOD'+Math.random()+'\0'; function numeric(str) { return parseInt(str, 10) == str ?
parseInt(str, 10) : str.charCodeAt(0); } function escapeBraces(str) { return str.split('\\\\').join(escSlash) .split('\\{').join(escOpen) .split('\\}').join(escClose) .split('\\,').join(escComma) .split('\\.').join(escPeriod); } function unescapeBraces(str) { return str.split(escSlash).join('\\') .split(escOpen).join('{') .split(escClose).join('}') .split(escComma).join(',') .split(escPeriod).join('.'); } // Basically just str.split(","), but handling cases // where we have nested braced sections, which should be // treated as individual members, like {a,{b,c},d} function parseCommaParts(str) { if (!str) return ['']; var parts = []; var m = balanced('{', '}', str); if (!m) return str.split(','); var pre = m.pre; var body = m.body; var post = m.post; var p = pre.split(','); p[p.length-1] += '{' + body + '}'; var postParts = parseCommaParts(post); if (post.length) { p[p.length-1] += postParts.shift(); p.push.apply(p, postParts); } parts.push.apply(parts, p); return parts; } function expandTop(str) { if (!str) return []; return expand(escapeBraces(str), true).map(unescapeBraces); } function identity(e) { return e; } function embrace(str) { return '{' + str + '}'; } function isPadded(el) { return /^-?0\d/.test(el); } function lte(i, y) { return i <= y; } function gte(i, y) { return i >= y; } function expand(str, isTop) { var expansions = []; var m = balanced('{', '}', str); if (!m || /\$$/.test(m.pre)) return [str]; var isNumericSequence = /^-?\d+\.\.-?\d+(?:\.\.-?\d+)?$/.test(m.body); var isAlphaSequence = /^[a-zA-Z]\.\.[a-zA-Z](?:\.\.-?\d+)?$/.test(m.body); var isSequence = isNumericSequence || isAlphaSequence; var isOptions = /^(.*,)+(.+)?$/.test(m.body); if (!isSequence && !isOptions) { // {a},b} if (m.post.match(/,.*}/)) { str = m.pre + '{' + m.body + escClose + m.post; return expand(str); } return [str]; } var n; if (isSequence) { n = m.body.split(/\.\./); } else { n = parseCommaParts(m.body); if (n.length === 1) { // x{{a,b}}y ==> x{a}y x{b}y n = expand(n[0], false).map(embrace); if (n.length === 1) { var post = m.post.length ? expand(m.post, false) : ['']; return post.map(function(p) { return m.pre + n[0] + p; }); } } } // at this point, n is the parts, and we know it's not a comma set // with a single entry. // no need to expand pre, since it is guaranteed to be free of brace-sets var pre = m.pre; var post = m.post.length ? expand(m.post, false) : ['']; var N; if (isSequence) { var x = numeric(n[0]); var y = numeric(n[1]); var width = Math.max(n[0].length, n[1].length) var incr = n.length == 3 ? 
Math.abs(numeric(n[2])) : 1; var test = lte; var reverse = y < x; if (reverse) { incr *= -1; test = gte; } var pad = n.some(isPadded); N = []; for (var i = x; test(i, y); i += incr) { var c; if (isAlphaSequence) { c = String.fromCharCode(i); if (c === '\\') c = ''; } else { c = String(i); if (pad) { var need = width - c.length; if (need > 0) { var z = new Array(need + 1).join('0'); if (i < 0) c = '-' + z + c.slice(1); else c = z + c; } } } N.push(c); } } else { N = concatMap(n, function(el) { return expand(el, false) }); } for (var j = 0; j < N.length; j++) { for (var k = 0; k < post.length; k++) { var expansion = pre + N[j] + post[k]; if (!isTop || isSequence || expansion) expansions.push(expansion); } } return expansions; } npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/0000755000000000000000000000000012631326456031730 5ustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/package.json0000644000000000000000000000365112631326456031546 0ustar 00000000000000{ "name": "brace-expansion", "description": "Brace expansion as known from sh/bash", "version": "1.1.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/brace-expansion.git" }, "homepage": "https://github.com/juliangruber/brace-expansion", "main": "index.js", "scripts": { "test": "tape test/*.js", "gentest": "bash test/generate.sh" }, "dependencies": { "balanced-match": "^0.2.0", "concat-map": "0.0.1" }, "devDependencies": { "tape": "^3.0.3" }, "keywords": [], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "testling": { "files": "test/*.js", "browsers": [ "ie/8..latest", "firefox/20..latest", "firefox/nightly", "chrome/25..latest", "chrome/canary", "opera/12..latest", "opera/next", "safari/5.1..latest", "ipad/6.0..latest", "iphone/6.0..latest", "android-browser/4.2..latest" ] }, "gitHead": "f50da498166d76ea570cf3b30179f01f0f119612", "bugs": { "url": "https://github.com/juliangruber/brace-expansion/issues" }, "_id": "brace-expansion@1.1.1", "_shasum": "da5fb78aef4c44c9e4acf525064fb3208ebab045", "_from": "brace-expansion@>=1.0.0 <2.0.0", "_npmVersion": "2.6.1", "_nodeVersion": "0.10.36", "_npmUser": { "name": "juliangruber", "email": "julian@juliangruber.com" }, "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" }, { "name": "isaacs", "email": "isaacs@npmjs.com" } ], "dist": { "shasum": "da5fb78aef4c44c9e4acf525064fb3208ebab045", "tarball": "http://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/ba0000755000000000000000000000000012631326456032233 5ustar 00000000000000././@LongLink0000000000000000000000000000015600000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/co0000755000000000000000000000000012631326456032252 5ustar 00000000000000././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/.npmignorenpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/ba0000644000000000000000000000002712631326456032234 0ustar 00000000000000node_modules .DS_Store ././@LongLink0000000000000000000000000000017500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/.travis.ymlnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/ba0000644000000000000000000000006012631326456032231 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10" ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/Makefilenpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/ba0000644000000000000000000000007112631326456032233 0ustar 00000000000000 test: @node_modules/.bin/tape test/*.js .PHONY: test ././@LongLink0000000000000000000000000000017300000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/README.mdnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/ba0000644000000000000000000000517312631326456032243 0ustar 00000000000000# balanced-match Match balanced string pairs, like `{` and `}` or `<b>` and `</b>`.
[![build status](https://secure.travis-ci.org/juliangruber/balanced-match.svg)](http://travis-ci.org/juliangruber/balanced-match) [![downloads](https://img.shields.io/npm/dm/balanced-match.svg)](https://www.npmjs.org/package/balanced-match) [![testling badge](https://ci.testling.com/juliangruber/balanced-match.png)](https://ci.testling.com/juliangruber/balanced-match) ## Example Get the first matching pair of braces: ```js var balanced = require('balanced-match'); console.log(balanced('{', '}', 'pre{in{nested}}post')); console.log(balanced('{', '}', 'pre{first}between{second}post')); ``` The matches are: ```bash $ node example.js { start: 3, end: 14, pre: 'pre', body: 'in{nested}', post: 'post' } { start: 3, end: 9, pre: 'pre', body: 'first', post: 'between{second}post' } ``` ## API ### var m = balanced(a, b, str) For the first non-nested matching pair of `a` and `b` in `str`, return an object with those keys: * **start** the index of the first match of `a` * **end** the index of the matching `b` * **pre** the preamble, `a` and `b` not included * **body** the match, `a` and `b` not included * **post** the postscript, `a` and `b` not included If there's no match, `undefined` will be returned. If the `str` contains more `a` than `b` / there are unmatched pairs, the first match that was closed will be used. For example, `{{a}` will match `['{', 'a', '']`. ## Installation With [npm](https://npmjs.org) do: ```bash npm install balanced-match ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
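As a further illustration of the unmatched-pairs rule in the API section above, a minimal sketch (the expected output follows from the documented return shape and the note about `{{a}`):

```js
var balanced = require('balanced-match');

// '{{a}' opens two braces but closes only one; the innermost closed pair
// wins, so the leading '{' ends up in the preamble.
console.log(balanced('{', '}', '{{a}'));
// => { start: 1, end: 3, pre: '{', body: 'a', post: '' }
```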
././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/example.jsnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/ba0000644000000000000000000000023212631326456032232 0ustar 00000000000000var balanced = require('./'); console.log(balanced('{', '}', 'pre{in{nested}}post')); console.log(balanced('{', '}', 'pre{first}between{second}post')); ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/index.jsnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/ba0000644000000000000000000000160612631326456032240 0ustar 00000000000000module.exports = balanced; function balanced(a, b, str) { var bal = 0; var m = {}; var ended = false; for (var i = 0; i < str.length; i++) { if (a == str.substr(i, a.length)) { if (!('start' in m)) m.start = i; bal++; } else if (b == str.substr(i, b.length) && 'start' in m) { ended = true; bal--; if (!bal) { m.end = i; m.pre = str.substr(0, m.start); m.body = (m.end - m.start > 1) ? str.substring(m.start + a.length, m.end) : ''; m.post = str.slice(m.end + b.length); return m; } } } // if we opened more than we closed, find the one we closed if (bal && ended) { var start = m.start + a.length; m = balanced(a, b, str.substr(start)); if (m) { m.start += start; m.end += start; m.pre = str.slice(0, start) + m.pre; } return m; } } ././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/package.jsonnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/ba0000644000000000000000000001005712631326456032240 0ustar 00000000000000{ "name": "balanced-match", "description": "Match balanced character pairs, like \"{\" and \"}\"", "version": "0.2.0", "repository": { "type": "git", "url": "git://github.com/juliangruber/balanced-match.git" }, "homepage": "https://github.com/juliangruber/balanced-match", "main": "index.js", "scripts": { "test": "make test" }, "dependencies": {}, "devDependencies": { "tape": "~1.1.1" }, "keywords": [ "match", "regexp", "test", "balanced", "parse" ], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "testling": { "files": "test/*.js", "browsers": [ "ie/8..latest", "firefox/20..latest", "firefox/nightly", "chrome/25..latest", "chrome/canary", "opera/12..latest", "opera/next", "safari/5.1..latest", "ipad/6.0..latest", "iphone/6.0..latest", "android-browser/4.2..latest" ] }, "readme": "# balanced-match\n\nMatch balanced string pairs, like `{` and `}` or `<b>` and `</b>`.\n\n[![build status](https://secure.travis-ci.org/juliangruber/balanced-match.svg)](http://travis-ci.org/juliangruber/balanced-match)\n[![downloads](https://img.shields.io/npm/dm/balanced-match.svg)](https://www.npmjs.org/package/balanced-match)\n\n[![testling badge](https://ci.testling.com/juliangruber/balanced-match.png)](https://ci.testling.com/juliangruber/balanced-match)\n\n## Example\n\nGet the first matching pair of braces:\n\n```js\nvar balanced = require('balanced-match');\n\nconsole.log(balanced('{', '}', 'pre{in{nested}}post'));\nconsole.log(balanced('{', '}',
'pre{first}between{second}post'));\n```\n\nThe matches are:\n\n```bash\n$ node example.js\n{ start: 3, end: 14, pre: 'pre', body: 'in{nested}', post: 'post' }\n{ start: 3,\n end: 9,\n pre: 'pre',\n body: 'first',\n post: 'between{second}post' }\n```\n\n## API\n\n### var m = balanced(a, b, str)\n\nFor the first non-nested matching pair of `a` and `b` in `str`, return an\nobject with those keys:\n\n* **start** the index of the first match of `a`\n* **end** the index of the matching `b`\n* **pre** the preamble, `a` and `b` not included\n* **body** the match, `a` and `b` not included\n* **post** the postscript, `a` and `b` not included\n\nIf there's no match, `undefined` will be returned.\n\nIf the `str` contains more `a` than `b` / there are unmatched pairs, the first match that was closed will be used. For example, `{{a}` will match `['{', 'a', '']`.\n\n## Installation\n\nWith [npm](https://npmjs.org) do:\n\n```bash\nnpm install balanced-match\n```\n\n## License\n\n(MIT)\n\nCopyright (c) 2013 Julian Gruber <julian@juliangruber.com>\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/juliangruber/balanced-match/issues" }, "_id": "balanced-match@0.2.0", "_shasum": "38f6730c03aab6d5edbb52bd934885e756d71674", "_resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-0.2.0.tgz", "_from": "balanced-match@>=0.2.0 <0.3.0" } ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/test/npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/ba0000755000000000000000000000000012631326456032233 5ustar 00000000000000././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/test/balanced.jsnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/ba0000644000000000000000000000233312631326456032236 0ustar 00000000000000var test = require('tape'); var balanced = require('..'); test('balanced', function(t) { t.deepEqual(balanced('{', '}', 'pre{in{nest}}post'), { start: 3, end: 12, pre: 'pre', body: 'in{nest}', post: 'post' }); t.deepEqual(balanced('{', '}', '{{{{{{{{{in}post'), { start: 8, end: 11, pre: '{{{{{{{{', body: 'in', post: 'post' }); t.deepEqual(balanced('{', '}', 'pre{body{in}post'), { start: 8, end: 11, pre: 'pre{body', body: 'in', post: 'post' }); t.deepEqual(balanced('{', '}', 'pre}{in{nest}}post'), { start: 4, end: 13, pre: 'pre}', body: 'in{nest}', post: 'post' }); t.deepEqual(balanced('{', '}', 'pre{body}between{body2}post'), { start: 3, end: 8, pre: 'pre', body: 'body', post: 'between{body2}post' }); t.notOk(balanced('{', '}', 'nope'), 'should be notOk'); t.deepEqual(balanced('<b>', '</b>', 'pre<b>in<b>nest</b></b>post'), { start: 3, end: 19, pre: 'pre', body: 'in<b>nest</b>', post: 'post' }); t.deepEqual(balanced('<b>', '</b>', 'pre</b><b>in<b>nest</b></b>post'), { start: 7, end: 23, pre: 'pre</b>', body: 'in<b>nest</b>', post: 'post' }); t.end(); }); ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/.travis.ymlnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/co0000644000000000000000000000005312631326456032252 0ustar 00000000000000language: node_js node_js: - 0.4 - 0.6 ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/LICENSEnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/co0000644000000000000000000000206112631326456032253 0ustar 00000000000000This software is released under the MIT license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/README.markdownnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/co0000644000000000000000000000221512631326456032254 0ustar 00000000000000concat-map ========== Concatenative mapdashery. [![browser support](http://ci.testling.com/substack/node-concat-map.png)](http://ci.testling.com/substack/node-concat-map) [![build status](https://secure.travis-ci.org/substack/node-concat-map.png)](http://travis-ci.org/substack/node-concat-map) example ======= ``` js var concatMap = require('concat-map'); var xs = [ 1, 2, 3, 4, 5, 6 ]; var ys = concatMap(xs, function (x) { return x % 2 ? [ x - 0.1, x, x + 0.1 ] : []; }); console.dir(ys); ``` *** ``` [ 0.9, 1, 1.1, 2.9, 3, 3.1, 4.9, 5, 5.1 ] ``` methods ======= ``` js var concatMap = require('concat-map') ``` concatMap(xs, fn) ----------------- Return an array of concatenated elements by calling `fn(x, i)` for each element `x` and each index `i` in the array `xs`. When `fn(x, i)` returns an array, its result will be concatenated with the result array. If `fn(x, i)` returns anything else, that value will be pushed onto the end of the result array. install ======= With [npm](http://npmjs.org) do: ``` npm install concat-map ``` license ======= MIT notes ===== This module was written while sitting high above the ground in a tree. 
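To make the non-array rule above concrete, a small sketch (input values chosen arbitrarily):

``` js
var concatMap = require('concat-map');

// array results are concatenated in; anything else is pushed as-is
var ys = concatMap([ 1, 2, 3 ], function (x) {
    return x === 2 ? [ 'a', 'b' ] : x * 10;
});
console.dir(ys);
```

***

```
[ 10, 'a', 'b', 30 ]
```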
././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/example/npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/co0000755000000000000000000000000012631326456032252 5ustar 00000000000000././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/index.jsnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/co0000644000000000000000000000053112631326456032253 0ustar 00000000000000module.exports = function (xs, fn) { var res = []; for (var i = 0; i < xs.length; i++) { var x = fn(xs[i], i); if (isArray(x)) res.push.apply(res, x); else res.push(x); } return res; }; var isArray = Array.isArray || function (xs) { return Object.prototype.toString.call(xs) === '[object Array]'; }; ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/package.jsonnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/co0000644000000000000000000000314112631326456032253 0ustar 00000000000000{ "name": "concat-map", "description": "concatenative mapdashery", "version": "0.0.1", "repository": { "type": "git", "url": "git://github.com/substack/node-concat-map.git" }, "main": "index.js", "keywords": [ "concat", "concatMap", "map", "functional", "higher-order" ], "directories": { "example": "example", "test": "test" }, "scripts": { "test": "tape test/*.js" }, "devDependencies": { "tape": "~2.4.0" }, "license": "MIT", "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "testling": { "files": "test/*.js", "browsers": { "ie": [ 6, 7, 8, 9 ], "ff": [ 3.5, 10, 15 ], "chrome": [ 10, 22 ], "safari": [ 5.1 ], "opera": [ 12 ] } }, "bugs": { "url": "https://github.com/substack/node-concat-map/issues" }, "homepage": "https://github.com/substack/node-concat-map", "_id": "concat-map@0.0.1", "dist": { "shasum": "d8a96bd77fd68df7793a73036a3ba0d5405d477b", "tarball": "http://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz" }, "_from": "concat-map@0.0.1", "_npmVersion": "1.3.21", "_npmUser": { "name": "substack", "email": "mail@substack.net" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "_shasum": "d8a96bd77fd68df7793a73036a3ba0d5405d477b", "_resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz" } ././@LongLink0000000000000000000000000000016300000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/test/npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/co0000755000000000000000000000000012631326456032252 5ustar 00000000000000././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/example/map.jsnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/co0000644000000000000000000000025312631326456032254 0ustar 00000000000000var concatMap = require('../'); var xs = [ 1, 2, 3, 4, 5, 6 ]; var ys = 
concatMap(xs, function (x) { return x % 2 ? [ x - 0.1, x, x + 0.1 ] : []; }); console.dir(ys); ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/test/map.jsnpm_3.5.2.orig/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/co0000644000000000000000000000206312631326456032255 0ustar 00000000000000var concatMap = require('../'); var test = require('tape'); test('empty or not', function (t) { var xs = [ 1, 2, 3, 4, 5, 6 ]; var ixes = []; var ys = concatMap(xs, function (x, ix) { ixes.push(ix); return x % 2 ? [ x - 0.1, x, x + 0.1 ] : []; }); t.same(ys, [ 0.9, 1, 1.1, 2.9, 3, 3.1, 4.9, 5, 5.1 ]); t.same(ixes, [ 0, 1, 2, 3, 4, 5 ]); t.end(); }); test('always something', function (t) { var xs = [ 'a', 'b', 'c', 'd' ]; var ys = concatMap(xs, function (x) { return x === 'b' ? [ 'B', 'B', 'B' ] : [ x ]; }); t.same(ys, [ 'a', 'B', 'B', 'B', 'c', 'd' ]); t.end(); }); test('scalars', function (t) { var xs = [ 'a', 'b', 'c', 'd' ]; var ys = concatMap(xs, function (x) { return x === 'b' ? [ 'B', 'B', 'B' ] : x; }); t.same(ys, [ 'a', 'B', 'B', 'B', 'c', 'd' ]); t.end(); }); test('undefs', function (t) { var xs = [ 'a', 'b', 'c', 'd' ]; var ys = concatMap(xs, function () {}); t.same(ys, [ undefined, undefined, undefined, undefined ]); t.end(); }); npm_3.5.2.orig/node_modules/glob/node_modules/path-is-absolute/index.js0000644000000000000000000000112712631326456024376 0ustar 00000000000000'use strict'; function posix(path) { return path.charAt(0) === '/'; }; function win32(path) { // https://github.com/joyent/node/blob/b3fcc245fb25539909ef1d5eaa01dbf92e168633/lib/path.js#L56 var splitDeviceRe = /^([a-zA-Z]:|[\\\/]{2}[^\\\/]+[\\\/]+[^\\\/]+)?([\\\/])?([\s\S]*?)$/; var result = splitDeviceRe.exec(path); var device = result[1] || ''; var isUnc = !!device && device.charAt(1) !== ':'; // UNC paths are always absolute return !!result[2] || isUnc; }; module.exports = process.platform === 'win32' ? win32 : posix; module.exports.posix = posix; module.exports.win32 = win32; npm_3.5.2.orig/node_modules/glob/node_modules/path-is-absolute/license0000644000000000000000000000213712631326456024300 0ustar 00000000000000The MIT License (MIT) Copyright (c) Sindre Sorhus (sindresorhus.com) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
npm_3.5.2.orig/node_modules/glob/node_modules/path-is-absolute/package.json0000644000000000000000000000416712631326456025226 0ustar 00000000000000{ "name": "path-is-absolute", "version": "1.0.0", "description": "Node.js 0.12 path.isAbsolute() ponyfill", "license": "MIT", "repository": { "type": "git", "url": "git+https://github.com/sindresorhus/path-is-absolute.git" }, "author": { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "sindresorhus.com" }, "engines": { "node": ">=0.10.0" }, "scripts": { "test": "node test.js" }, "files": [ "index.js" ], "keywords": [ "path", "paths", "file", "dir", "absolute", "isabsolute", "is-absolute", "built-in", "util", "utils", "core", "ponyfill", "polyfill", "shim", "is", "detect", "check" ], "readme": "# path-is-absolute [![Build Status](https://travis-ci.org/sindresorhus/path-is-absolute.svg?branch=master)](https://travis-ci.org/sindresorhus/path-is-absolute)\n\n> Node.js 0.12 [`path.isAbsolute()`](http://nodejs.org/api/path.html#path_path_isabsolute_path) ponyfill\n\n> Ponyfill: A polyfill that doesn't overwrite the native method\n\n\n## Install\n\n```\n$ npm install --save path-is-absolute\n```\n\n\n## Usage\n\n```js\nvar pathIsAbsolute = require('path-is-absolute');\n\n// Linux\npathIsAbsolute('/home/foo');\n//=> true\n\n// Windows\npathIsAbsolute('C:/Users/');\n//=> true\n\n// Any OS\npathIsAbsolute.posix('/home/foo');\n//=> true\n```\n\n\n## API\n\nSee the [`path.isAbsolute()` docs](http://nodejs.org/api/path.html#path_path_isabsolute_path).\n\n### pathIsAbsolute(path)\n\n### pathIsAbsolute.posix(path)\n\nThe Posix specific version.\n\n### pathIsAbsolute.win32(path)\n\nThe Windows specific version.\n\n\n## License\n\nMIT © [Sindre Sorhus](http://sindresorhus.com)\n", "readmeFilename": "readme.md", "bugs": { "url": "https://github.com/sindresorhus/path-is-absolute/issues" }, "homepage": "https://github.com/sindresorhus/path-is-absolute#readme", "_id": "path-is-absolute@1.0.0", "_shasum": "263dada66ab3f2fb10bf7f9d24dd8f3e570ef912", "_resolved": "https://registry.npmjs.org/path-is-absolute/-/path-is-absolute-1.0.0.tgz", "_from": "path-is-absolute@>=1.0.0 <2.0.0" } npm_3.5.2.orig/node_modules/glob/node_modules/path-is-absolute/readme.md0000644000000000000000000000165112631326456024512 0ustar 00000000000000# path-is-absolute [![Build Status](https://travis-ci.org/sindresorhus/path-is-absolute.svg?branch=master)](https://travis-ci.org/sindresorhus/path-is-absolute) > Node.js 0.12 [`path.isAbsolute()`](http://nodejs.org/api/path.html#path_path_isabsolute_path) ponyfill > Ponyfill: A polyfill that doesn't overwrite the native method ## Install ``` $ npm install --save path-is-absolute ``` ## Usage ```js var pathIsAbsolute = require('path-is-absolute'); // Linux pathIsAbsolute('/home/foo'); //=> true // Windows pathIsAbsolute('C:/Users/'); //=> true // Any OS pathIsAbsolute.posix('/home/foo'); //=> true ``` ## API See the [`path.isAbsolute()` docs](http://nodejs.org/api/path.html#path_path_isabsolute_path). ### pathIsAbsolute(path) ### pathIsAbsolute.posix(path) The Posix specific version. ### pathIsAbsolute.win32(path) The Windows specific version. ## License MIT © [Sindre Sorhus](http://sindresorhus.com) npm_3.5.2.orig/node_modules/graceful-fs/LICENSE0000644000000000000000000000141512631326456017353 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. 
Schlueter, Ben Noordhuis, and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/graceful-fs/README.md0000644000000000000000000000216212631326456017625 0ustar 00000000000000# graceful-fs graceful-fs functions as a drop-in replacement for the fs module, making various improvements. The improvements are meant to normalize behavior across different platforms and environments, and to make filesystem access more resilient to errors. ## Improvements over [fs module](http://api.nodejs.org/fs.html) graceful-fs: * Queues up `open` and `readdir` calls, and retries them once something closes if there is an EMFILE error from too many file descriptors. * fixes `lchmod` for Node versions prior to 0.6.2. * implements `fs.lutimes` if possible. Otherwise it becomes a noop. * ignores `EINVAL` and `EPERM` errors in `chown`, `fchown` or `lchown` if the user isn't root. * makes `lchmod` and `lchown` become noops, if not available. * retries reading a file if `read` results in EAGAIN error. On Windows, it retries renaming a file for up to one second if `EACCESS` or `EPERM` error occurs, likely because antivirus software has locked the directory. ## USAGE ```javascript // use just like fs var fs = require('graceful-fs') // now go and do stuff with it... fs.readFileSync('some-file-or-whatever') ``` npm_3.5.2.orig/node_modules/graceful-fs/fs.js0000644000000000000000000000065512631326456017321 0ustar 00000000000000'use strict' var fs = require('fs') module.exports = clone(fs) function clone (obj) { if (obj === null || typeof obj !== 'object') return obj if (obj instanceof Object) var copy = { __proto__: obj.__proto__ } else var copy = Object.create(null) Object.getOwnPropertyNames(obj).forEach(function (key) { Object.defineProperty(copy, key, Object.getOwnPropertyDescriptor(obj, key)) }) return copy } npm_3.5.2.orig/node_modules/graceful-fs/graceful-fs.js0000644000000000000000000001516612631326456021112 0ustar 00000000000000var fs = require('fs') var polyfills = require('./polyfills.js') var legacy = require('./legacy-streams.js') var queue = [] var util = require('util') function noop () {} var debug = noop if (util.debuglog) debug = util.debuglog('gfs4') else if (/\bgfs4\b/i.test(process.env.NODE_DEBUG || '')) debug = function() { var m = util.format.apply(util, arguments) m = 'GFS4: ' + m.split(/\n/).join('\nGFS4: ') console.error(m) } if (/\bgfs4\b/i.test(process.env.NODE_DEBUG || '')) { process.on('exit', function() { debug(queue) require('assert').equal(queue.length, 0) }) } module.exports = patch(require('./fs.js')) if (process.env.TEST_GRACEFUL_FS_GLOBAL_PATCH) { module.exports = patch(fs) } // Always patch fs.close/closeSync, because we want to // retry() whenever a close happens *anywhere* in the program. // This is essential when multiple graceful-fs instances are // in play at the same time. 
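// Illustrative sketch of the pattern used by every patched operation below
// (a hypothetical helper, not a real graceful-fs export): on EMFILE/ENFILE
// the call is parked via enqueue(), and retry() -- invoked whenever any call
// or close completes -- replays one parked call.
function gracefulStatExample (path, cb) {
  fs.stat(path, function (err) {
    if (err && (err.code === 'EMFILE' || err.code === 'ENFILE'))
      enqueue([gracefulStatExample, [path, cb]]) // out of file descriptors: park the call
    else {
      if (typeof cb === 'function') cb.apply(this, arguments)
      retry() // a call finished; give one queued call its turn
    }
  })
}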
fs.close = (function (fs$close) { return function (fd, cb) { return fs$close.call(fs, fd, function (err) { if (!err) retry() if (typeof cb === 'function') cb.apply(this, arguments) }) }})(fs.close) fs.closeSync = (function (fs$closeSync) { return function (fd) { // Note that graceful-fs also retries when fs.closeSync() fails. // Looks like a bug to me, although it's probably a harmless one. var rval = fs$closeSync.apply(fs, arguments) retry() return rval }})(fs.closeSync) function patch (fs) { // Everything that references the open() function needs to be in here polyfills(fs) fs.gracefulify = patch fs.FileReadStream = ReadStream; // Legacy name. fs.FileWriteStream = WriteStream; // Legacy name. fs.createReadStream = createReadStream fs.createWriteStream = createWriteStream var fs$readFile = fs.readFile fs.readFile = readFile function readFile (path, options, cb) { if (typeof options === 'function') cb = options, options = null return go$readFile(path, options, cb) function go$readFile (path, options, cb) { return fs$readFile(path, options, function (err) { if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) enqueue([go$readFile, [path, options, cb]]) else { if (typeof cb === 'function') cb.apply(this, arguments) retry() } }) } } var fs$writeFile = fs.writeFile fs.writeFile = writeFile function writeFile (path, data, options, cb) { if (typeof options === 'function') cb = options, options = null return go$writeFile(path, data, options, cb) function go$writeFile (path, data, options, cb) { return fs$writeFile(path, data, options, function (err) { if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) enqueue([go$writeFile, [path, data, options, cb]]) else { if (typeof cb === 'function') cb.apply(this, arguments) retry() } }) } } var fs$appendFile = fs.appendFile if (fs$appendFile) fs.appendFile = appendFile function appendFile (path, data, options, cb) { if (typeof options === 'function') cb = options, options = null return go$appendFile(path, data, options, cb) function go$appendFile (path, data, options, cb) { return fs$appendFile(path, data, options, function (err) { if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) enqueue([go$appendFile, [path, data, options, cb]]) else { if (typeof cb === 'function') cb.apply(this, arguments) retry() } }) } } var fs$readdir = fs.readdir fs.readdir = readdir function readdir (path, cb) { return go$readdir(path, cb) function go$readdir () { return fs$readdir(path, function (err, files) { if (files && files.sort) files.sort(); // Backwards compatibility with graceful-fs. 
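// As with readFile/writeFile above, an EMFILE/ENFILE failure re-queues the
// entire readdir call instead of surfacing the error to the caller.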
if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) enqueue([go$readdir, [path, cb]]) else { if (typeof cb === 'function') cb.apply(this, arguments) retry() } }) } } if (process.version.substr(0, 4) === 'v0.8') { var legStreams = legacy(fs) ReadStream = legStreams.ReadStream WriteStream = legStreams.WriteStream } var fs$ReadStream = fs.ReadStream ReadStream.prototype = Object.create(fs$ReadStream.prototype) ReadStream.prototype.open = ReadStream$open var fs$WriteStream = fs.WriteStream WriteStream.prototype = Object.create(fs$WriteStream.prototype) WriteStream.prototype.open = WriteStream$open fs.ReadStream = ReadStream fs.WriteStream = WriteStream function ReadStream (path, options) { if (this instanceof ReadStream) return fs$ReadStream.apply(this, arguments), this else return ReadStream.apply(Object.create(ReadStream.prototype), arguments) } function ReadStream$open () { var that = this open(that.path, that.flags, that.mode, function (err, fd) { if (err) { if (that.autoClose) that.destroy() that.emit('error', err) } else { that.fd = fd that.emit('open', fd) that.read() } }) } function WriteStream (path, options) { if (this instanceof WriteStream) return fs$WriteStream.apply(this, arguments), this else return WriteStream.apply(Object.create(WriteStream.prototype), arguments) } function WriteStream$open () { var that = this open(that.path, that.flags, that.mode, function (err, fd) { if (err) { that.destroy() that.emit('error', err) } else { that.fd = fd that.emit('open', fd) } }) } function createReadStream (path, options) { return new ReadStream(path, options) } function createWriteStream (path, options) { return new WriteStream(path, options) } var fs$open = fs.open fs.open = open function open (path, flags, mode, cb) { if (typeof mode === 'function') cb = mode, mode = null return go$open(path, flags, mode, cb) function go$open (path, flags, mode, cb) { return fs$open(path, flags, mode, function (err, fd) { if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) enqueue([go$open, [path, flags, mode, cb]]) else { if (typeof cb === 'function') cb.apply(this, arguments) retry() } }) } } return fs } function enqueue (elem) { debug('ENQUEUE', elem[0].name, elem[1]) queue.push(elem) } function retry () { var elem = queue.shift() if (elem) { debug('RETRY', elem[0].name, elem[1]) elem[0].apply(null, elem[1]) } } npm_3.5.2.orig/node_modules/graceful-fs/legacy-streams.js0000644000000000000000000000513712631326456021631 0ustar 00000000000000var Stream = require('stream').Stream module.exports = legacy function legacy (fs) { return { ReadStream: ReadStream, WriteStream: WriteStream } function ReadStream (path, options) { if (!(this instanceof ReadStream)) return new ReadStream(path, options); Stream.call(this); var self = this; this.path = path; this.fd = null; this.readable = true; this.paused = false; this.flags = 'r'; this.mode = 438; /*=0666*/ this.bufferSize = 64 * 1024; options = options || {}; // Mixin options into this var keys = Object.keys(options); for (var index = 0, length = keys.length; index < length; index++) { var key = keys[index]; this[key] = options[key]; } if (this.encoding) this.setEncoding(this.encoding); if (this.start !== undefined) { if ('number' !== typeof this.start) { throw TypeError('start must be a Number'); } if (this.end === undefined) { this.end = Infinity; } else if ('number' !== typeof this.end) { throw TypeError('end must be a Number'); } if (this.start > this.end) { throw new Error('start must be <= end'); } this.pos = this.start; } if (this.fd !== 
null) { process.nextTick(function() { self._read(); }); return; } fs.open(this.path, this.flags, this.mode, function (err, fd) { if (err) { self.emit('error', err); self.readable = false; return; } self.fd = fd; self.emit('open', fd); self._read(); }) } function WriteStream (path, options) { if (!(this instanceof WriteStream)) return new WriteStream(path, options); Stream.call(this); this.path = path; this.fd = null; this.writable = true; this.flags = 'w'; this.encoding = 'binary'; this.mode = 438; /*=0666*/ this.bytesWritten = 0; options = options || {}; // Mixin options into this var keys = Object.keys(options); for (var index = 0, length = keys.length; index < length; index++) { var key = keys[index]; this[key] = options[key]; } if (this.start !== undefined) { if ('number' !== typeof this.start) { throw TypeError('start must be a Number'); } if (this.start < 0) { throw new Error('start must be >= zero'); } this.pos = this.start; } this.busy = false; this._queue = []; if (this.fd === null) { this._open = fs.open; this._queue.push([this._open, this.path, this.flags, this.mode, undefined]); this.flush(); } } } npm_3.5.2.orig/node_modules/graceful-fs/package.json0000644000000000000000000000455012631326456020637 0ustar 00000000000000{ "name": "graceful-fs", "description": "A drop-in replacement for fs, making various improvements.", "version": "4.1.2", "repository": { "type": "git", "url": "git+https://github.com/isaacs/node-graceful-fs.git" }, "main": "graceful-fs.js", "engines": { "node": ">=0.4.0" }, "directories": { "test": "test" }, "scripts": { "test": "node test.js | tap -" }, "keywords": [ "fs", "module", "reading", "retry", "retries", "queue", "error", "errors", "handling", "EMFILE", "EAGAIN", "EINVAL", "EPERM", "EACCESS" ], "license": "ISC", "devDependencies": { "mkdirp": "^0.5.0", "rimraf": "^2.2.8", "tap": "^1.2.0" }, "files": [ "fs.js", "graceful-fs.js", "legacy-streams.js", "polyfills.js" ], "readme": "# graceful-fs\n\ngraceful-fs functions as a drop-in replacement for the fs module,\nmaking various improvements.\n\nThe improvements are meant to normalize behavior across different\nplatforms and environments, and to make filesystem access more\nresilient to errors.\n\n## Improvements over [fs module](http://api.nodejs.org/fs.html)\n\ngraceful-fs:\n\n* Queues up `open` and `readdir` calls, and retries them once\n something closes if there is an EMFILE error from too many file\n descriptors.\n* fixes `lchmod` for Node versions prior to 0.6.2.\n* implements `fs.lutimes` if possible. 
Otherwise it becomes a noop.\n* ignores `EINVAL` and `EPERM` errors in `chown`, `fchown` or\n `lchown` if the user isn't root.\n* makes `lchmod` and `lchown` become noops, if not available.\n* retries reading a file if `read` results in EAGAIN error.\n\nOn Windows, it retries renaming a file for up to one second if `EACCESS`\nor `EPERM` error occurs, likely because antivirus software has locked\nthe directory.\n\n## USAGE\n\n```javascript\n// use just like fs\nvar fs = require('graceful-fs')\n\n// now go and do stuff with it...\nfs.readFileSync('some-file-or-whatever')\n```\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/node-graceful-fs/issues" }, "homepage": "https://github.com/isaacs/node-graceful-fs#readme", "_id": "graceful-fs@4.1.2", "_shasum": "fe2239b7574972e67e41f808823f9bfa4a991e37", "_resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.1.2.tgz", "_from": "graceful-fs@>=4.1.2 <4.2.0" } npm_3.5.2.orig/node_modules/graceful-fs/polyfills.js0000644000000000000000000001465512631326456020733 0ustar 00000000000000var fs = require('./fs.js') var constants = require('constants') var origCwd = process.cwd var cwd = null process.cwd = function() { if (!cwd) cwd = origCwd.call(process) return cwd } try { process.cwd() } catch (er) {} var chdir = process.chdir process.chdir = function(d) { cwd = null chdir.call(process, d) } module.exports = patch function patch (fs) { // (re-)implement some things that are known busted or missing. // lchmod, broken prior to 0.6.2 // back-port the fix here. if (constants.hasOwnProperty('O_SYMLINK') && process.version.match(/^v0\.6\.[0-2]|^v0\.5\./)) { patchLchmod(fs) } // lutimes implementation, or no-op if (!fs.lutimes) { patchLutimes(fs) } // https://github.com/isaacs/node-graceful-fs/issues/4 // Chown should not fail on einval or eperm if non-root. // It should not fail on enosys ever, as this just indicates // that a fs doesn't support the intended operation. fs.chown = chownFix(fs.chown) fs.fchown = chownFix(fs.fchown) fs.lchown = chownFix(fs.lchown) fs.chmod = chownFix(fs.chmod) fs.fchmod = chownFix(fs.fchmod) fs.lchmod = chownFix(fs.lchmod) fs.chownSync = chownFixSync(fs.chownSync) fs.fchownSync = chownFixSync(fs.fchownSync) fs.lchownSync = chownFixSync(fs.lchownSync) fs.chmodSync = chownFix(fs.chmodSync) fs.fchmodSync = chownFix(fs.fchmodSync) fs.lchmodSync = chownFix(fs.lchmodSync) // if lchmod/lchown do not exist, then make them no-ops if (!fs.lchmod) { fs.lchmod = function (path, mode, cb) { process.nextTick(cb) } fs.lchmodSync = function () {} } if (!fs.lchown) { fs.lchown = function (path, uid, gid, cb) { process.nextTick(cb) } fs.lchownSync = function () {} } // on Windows, A/V software can lock the directory, causing this // to fail with an EACCES or EPERM if the directory contains newly // created files. Try again on failure, for up to 1 second. if (process.platform === "win32") { fs.rename = (function (fs$rename) { return function (from, to, cb) { var start = Date.now() fs$rename(from, to, function CB (er) { if (er && (er.code === "EACCES" || er.code === "EPERM") && Date.now() - start < 1000) { return fs$rename(from, to, CB) } if (cb) cb(er) }) }})(fs.rename) } // if read() returns EAGAIN, then just try it again. 
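// Both wrappers below cap the retries: after 10 consecutive EAGAIN results
// (tracked by eagCounter) the error is passed through rather than looping
// forever on a persistently unready descriptor.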
fs.read = (function (fs$read) { return function (fd, buffer, offset, length, position, callback_) { var callback if (callback_ && typeof callback_ === 'function') { var eagCounter = 0 callback = function (er, _, __) { if (er && er.code === 'EAGAIN' && eagCounter < 10) { eagCounter ++ return fs$read.call(fs, fd, buffer, offset, length, position, callback) } callback_.apply(this, arguments) } } return fs$read.call(fs, fd, buffer, offset, length, position, callback) }})(fs.read) fs.readSync = (function (fs$readSync) { return function (fd, buffer, offset, length, position) { var eagCounter = 0 while (true) { try { return fs$readSync.call(fs, fd, buffer, offset, length, position) } catch (er) { if (er.code === 'EAGAIN' && eagCounter < 10) { eagCounter ++ continue } throw er } } }})(fs.readSync) } function patchLchmod (fs) { fs.lchmod = function (path, mode, callback) { callback = callback || noop fs.open( path , constants.O_WRONLY | constants.O_SYMLINK , mode , function (err, fd) { if (err) { callback(err) return } // prefer to return the chmod error, if one occurs, // but still try to close, and report closing errors if they occur. fs.fchmod(fd, mode, function (err) { fs.close(fd, function(err2) { callback(err || err2) }) }) }) } fs.lchmodSync = function (path, mode) { var fd = fs.openSync(path, constants.O_WRONLY | constants.O_SYMLINK, mode) // prefer to return the chmod error, if one occurs, // but still try to close, and report closing errors if they occur. var threw = true var ret try { ret = fs.fchmodSync(fd, mode) threw = false } finally { if (threw) { try { fs.closeSync(fd) } catch (er) {} } else { fs.closeSync(fd) } } return ret } } function patchLutimes (fs) { if (constants.hasOwnProperty("O_SYMLINK")) { fs.lutimes = function (path, at, mt, cb) { fs.open(path, constants.O_SYMLINK, function (er, fd) { cb = cb || noop if (er) return cb(er) fs.futimes(fd, at, mt, function (er) { fs.close(fd, function (er2) { return cb(er || er2) }) }) }) } fs.lutimesSync = function (path, at, mt) { var fd = fs.openSync(path, constants.O_SYMLINK) var ret var threw = true try { ret = fs.futimesSync(fd, at, mt) threw = false } finally { if (threw) { try { fs.closeSync(fd) } catch (er) {} } else { fs.closeSync(fd) } } return ret } } else { fs.lutimes = function (_a, _b, _c, cb) { process.nextTick(cb) } fs.lutimesSync = function () {} } } function chownFix (orig) { if (!orig) return orig return function (target, uid, gid, cb) { return orig.call(fs, target, uid, gid, function (er, res) { if (chownErOk(er)) er = null cb(er, res) }) } } function chownFixSync (orig) { if (!orig) return orig return function (target, uid, gid) { try { return orig.call(fs, target, uid, gid) } catch (er) { if (!chownErOk(er)) throw er } } } // ENOSYS means that the fs doesn't support the op. Just ignore // that, because it doesn't matter. // // if there's no getuid, or if getuid() is something other // than 0, and the error is EINVAL or EPERM, then just ignore // it. // // This specific case is a silent failure in cp, install, tar, // and most other unix tools that manage permissions. // // When running as root, or if other types of errors are // encountered, then it's strict. 
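// Equivalently: the error is tolerated when it is absent, when its code is
// ENOSYS, or when the process is non-root and the code is EINVAL or EPERM.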
function chownErOk (er) { if (!er) return true if (er.code === "ENOSYS") return true var nonroot = !process.getuid || process.getuid() !== 0 if (nonroot) { if (er.code === "EINVAL" || er.code === "EPERM") return true } return false } npm_3.5.2.orig/node_modules/has-unicode/.npmignore0000644000000000000000000000114712631326456020350 0ustar 00000000000000# Logs logs *.log # Runtime data pids *.pid *.seed # Directory for instrumented libs generated by jscoverage/JSCover lib-cov # Coverage directory used by tools like istanbul coverage # Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files) .grunt # Compiled binary addons (http://nodejs.org/api/addons.html) build/Release # Dependency directory # Commenting this out is preferred by some people, see # https://www.npmjs.org/doc/misc/npm-faq.html#should-i-check-my-node_modules-folder-into-git- node_modules # Users Environment Variables .lock-wscript # Editor temp files *~ .#* npm_3.5.2.orig/node_modules/has-unicode/LICENSE0000644000000000000000000000136012631326456017353 0ustar 00000000000000Copyright (c) 2014, Rebecca Turner Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/has-unicode/README.md0000644000000000000000000000213212631326456017623 0ustar 00000000000000has-unicode =========== Try to guess if your terminal supports unicode ```javascript var hasUnicode = require("has-unicode") if (hasUnicode()) { // the terminal probably has unicode support } ``` ```javascript var hasUnicode = require("has-unicode").tryHarder hasUnicode(function(unicodeSupported) { if (unicodeSupported) { // the terminal probably has unicode support } }) ``` ## Detecting Unicode What we actually detect is UTF-8 support, as that's what Node itself supports. If you have a UTF-16 locale then you won't be detected as unicode capable. ### Windows Since at least Windows 7, `cmd` and `powershell` have been unicode capable. As such, we report any Windows installation as unicode capable. ### Unix Like Operating Systems We look at the environment variables `LC_ALL`, `LC_CTYPE`, and `LANG` in that order. For `LC_ALL` and `LANG`, it looks for `.UTF-8` in the value. For `LC_CTYPE` it looks to see if the value is `UTF-8`. This is sufficient for most POSIX systems. While locale data can be put in `/etc/locale.conf` as well, AFAIK it's always copied into the environment. npm_3.5.2.orig/node_modules/has-unicode/index.js0000644000000000000000000000065112631326456020015 0ustar 00000000000000"use strict" var os = require("os") var hasUnicode = module.exports = function () { // Supported Win32 platforms (>XP) support unicode in the console, though // font support isn't fantastic. 
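// Non-Windows platforms fall through to the locale environment below:
// LC_ALL and LANG are tested for a ".UTF-8" component, while LC_CTYPE is
// compared against the exact string "UTF-8".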
if (os.type() == "Windows_NT") { return true } var isUTF8 = /[.]UTF-8/ if (isUTF8.test(process.env.LC_ALL) || process.env.LC_CTYPE == 'UTF-8' || isUTF8.test(process.env.LANG)) { return true } return false } npm_3.5.2.orig/node_modules/has-unicode/package.json0000644000000000000000000000250612631326456020637 0ustar 00000000000000{ "name": "has-unicode", "version": "1.0.1", "description": "Try to guess if your terminal supports unicode", "main": "index.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/iarna/has-unicode.git" }, "keywords": [ "unicode", "terminal" ], "author": { "name": "Rebecca Turner", "email": "me@re-becca.org" }, "license": "ISC", "bugs": { "url": "https://github.com/iarna/has-unicode/issues" }, "homepage": "https://github.com/iarna/has-unicode", "devDependencies": { "require-inject": "^1.1.1", "tap": "^0.4.13" }, "gitHead": "d4ad300c67b25c197582e42e936ea928f7935d01", "_id": "has-unicode@1.0.1", "_shasum": "c46fceea053eb8ec789bffbba25fca52dfdcf38e", "_from": "has-unicode@>=1.0.1 <1.1.0", "_npmVersion": "3.3.6", "_nodeVersion": "4.1.1", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "dist": { "shasum": "c46fceea053eb8ec789bffbba25fca52dfdcf38e", "tarball": "http://registry.npmjs.org/has-unicode/-/has-unicode-1.0.1.tgz" }, "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/has-unicode/-/has-unicode-1.0.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/has-unicode/test/0000755000000000000000000000000012631326456017325 5ustar 00000000000000npm_3.5.2.orig/node_modules/has-unicode/test/index.js0000644000000000000000000000156212631326456020776 0ustar 00000000000000"use strict" var test = require("tap").test var requireInject = require("require-inject") test("Windows", function (t) { t.plan(1) var hasUnicode = requireInject("../index.js", { os: { type: function () { return "Windows_NT" } } }) t.is(hasUnicode(), true, "Windows is assumed to be unicode aware") }) test("Unix Env", function (t) { t.plan(3) var hasUnicode = requireInject("../index.js", { os: { type: function () { return "Linux" } }, child_process: { exec: function (cmd,cb) { cb(new Error("not available")) } } }) process.env.LANG = "en_US.UTF-8" process.env.LC_ALL = null t.is(hasUnicode(), true, "Linux with a UTF8 language") process.env.LANG = null process.env.LC_ALL = "en_US.UTF-8" t.is(hasUnicode(), true, "Linux with UTF8 locale") process.env.LC_ALL = null t.is(hasUnicode(), false, "Linux without UTF8 language or locale") }) npm_3.5.2.orig/node_modules/hosted-git-info/.npmignore0000644000000000000000000000002312631326456021141 0ustar 00000000000000*~ .# node_modules npm_3.5.2.orig/node_modules/hosted-git-info/.travis.yml0000644000000000000000000000010412631326456021253 0ustar 00000000000000language: node_js node_js: - "0.11" - "0.10" script: "npm test" npm_3.5.2.orig/node_modules/hosted-git-info/LICENSE0000644000000000000000000000133512631326456020156 0ustar 00000000000000Copyright (c) 2015, Rebecca Turner Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/hosted-git-info/README.md0000644000000000000000000000525512631326456020435 0ustar 00000000000000# hosted-git-info This will let you identify and transform various git hosts URLs between protocols. It also can tell you what the URL is for the raw path for particular file for direct access without git. ## Usage ```javascript var hostedGitInfo = require("hosted-git-info") var info = hostedGitInfo.fromUrl("git@github.com:npm/hosted-git-info.git") /* info looks like: { type: "github", domain: "github.com", user: "npm", project: "hosted-git-info" } */ ``` If the URL can't be matched with a git host, `null` will be returned. We can match git, ssh and https urls. Additionally, we can match ssh connect strings (`git@github.com:npm/hosted-git-info`) and shortcuts (eg, `github:npm/hosted-git-info`). Github specifically, is detected in the case of a third, unprefixed, form: `npm/hosted-git-info`. If it does match, the returned object has properties of: * info.type -- The short name of the service * info.domain -- The domain for git protocol use * info.user -- The name of the user/org on the git host * info.project -- The name of the project on the git host And methods of: * info.file(path) Given the path of a file relative to the repository, returns a URL for directly fetching it from the githost. If no committish was set then `master` will be used as the default. For example `hostedGitInfo.fromUrl("git@github.com:npm/hosted-git-info.git#v1.0.0").file("package.json")` would return `https://raw.githubusercontent.com/npm/hosted-git-info/v1.0.0/package.json` * info.shortcut() eg, `github:npm/hosted-git-info` * info.browse() eg, `https://github.com/npm/hosted-git-info/tree/v1.2.0` * info.bugs() eg, `https://github.com/npm/hosted-git-info/issues` * info.docs() eg, `https://github.com/npm/hosted-git-info/tree/v1.2.0#readme` * info.https() eg, `git+https://github.com/npm/hosted-git-info.git` * info.sshurl() eg, `git+ssh://git@github.com/npm/hosted-git-info.git` * info.ssh() eg, `git@github.com:npm/hosted-git-info.git` * info.path() eg, `npm/hosted-git-info` * info.getDefaultRepresentation() Returns the default output type. The default output type is based on the string you passed in to be parsed * info.toString() Uses the getDefaultRepresentation to call one of the other methods to get a URL for this resource. As such `hostedGitInfo.fromUrl(url).toString()` will give you a normalized version of the URL that still uses the same protocol. Shortcuts will still be returned as shortcuts, but the special case github form of `org/project` will be normalized to `github:org/project`. SSH connect strings will be normalized into `git+ssh` URLs. ## Supported hosts Currently this supports Github, Bitbucket and Gitlab. Pull requests for additional hosts welcome. npm_3.5.2.orig/node_modules/hosted-git-info/git-host-info.js0000644000000000000000000000541412631326456022200 0ustar 00000000000000'use strict' var gitHosts = module.exports = { github: { // First two are insecure and generally shouldn't be used any more, but // they are still supported. 
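// ('git' and 'http', that is: git:// is unauthenticated and unencrypted,
// and http:// sends everything in the clear.)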
'protocols': [ 'git', 'http', 'git+ssh', 'git+https', 'ssh', 'https' ], 'domain': 'github.com', 'treepath': 'tree', 'filetemplate': 'https://{auth@}raw.githubusercontent.com/{user}/{project}/{committish}/{path}', 'bugstemplate': 'https://{domain}/{user}/{project}/issues', 'gittemplate': 'git://{auth@}{domain}/{user}/{project}.git{#committish}' }, bitbucket: { 'protocols': [ 'git+ssh', 'git+https', 'ssh', 'https' ], 'domain': 'bitbucket.org', 'treepath': 'src' }, gitlab: { 'protocols': [ 'git+ssh', 'git+https', 'ssh', 'https' ], 'domain': 'gitlab.com', 'treepath': 'tree', 'docstemplate': 'https://{domain}/{user}/{project}{/tree/committish}#README', 'bugstemplate': 'https://{domain}/{user}/{project}/issues' }, gist: { 'protocols': [ 'git', 'git+ssh', 'git+https', 'ssh', 'https' ], 'domain': 'gist.github.com', 'pathmatch': /^[/](?:([^/]+)[/])?([a-z0-9]+)(?:[.]git)?$/, 'filetemplate': 'https://gist.githubusercontent.com/{user}/{project}/raw{/committish}/{path}', 'bugstemplate': 'https://{domain}/{project}', 'gittemplate': 'git://{domain}/{project}.git{#committish}', 'sshtemplate': 'git@{domain}:/{project}.git{#committish}', 'sshurltemplate': 'git+ssh://git@{domain}/{project}.git{#committish}', 'browsetemplate': 'https://{domain}/{project}{/committish}', 'docstemplate': 'https://{domain}/{project}{/committish}', 'httpstemplate': 'git+https://{domain}/{project}.git{#committish}', 'shortcuttemplate': '{type}:{project}{#committish}', 'pathtemplate': '{project}{#committish}' } } var gitHostDefaults = { 'sshtemplate': 'git@{domain}:{user}/{project}.git{#committish}', 'sshurltemplate': 'git+ssh://git@{domain}/{user}/{project}.git{#committish}', 'browsetemplate': 'https://{domain}/{user}/{project}{/tree/committish}', 'docstemplate': 'https://{domain}/{user}/{project}{/tree/committish}#readme', 'httpstemplate': 'git+https://{auth@}{domain}/{user}/{project}.git{#committish}', 'filetemplate': 'https://{domain}/{user}/{project}/raw/{committish}/{path}', 'shortcuttemplate': '{type}:{user}/{project}{#committish}', 'pathtemplate': '{user}/{project}{#committish}', 'pathmatch': /^[/]([^/]+)[/]([^/]+?)(?:[.]git)?$/ } Object.keys(gitHosts).forEach(function (name) { Object.keys(gitHostDefaults).forEach(function (key) { if (gitHosts[name][key]) return gitHosts[name][key] = gitHostDefaults[key] }) gitHosts[name].protocols_re = RegExp('^(' + gitHosts[name].protocols.map(function (protocol) { return protocol.replace(/([\\+*{}()\[\]$^|])/g, '\\$1') }).join('|') + '):$') }) npm_3.5.2.orig/node_modules/hosted-git-info/git-host.js0000644000000000000000000000503612631326456021247 0ustar 00000000000000'use strict' var gitHosts = require('./git-host-info.js') var GitHost = module.exports = function (type, user, auth, project, committish, defaultRepresentation) { var gitHostInfo = this gitHostInfo.type = type Object.keys(gitHosts[type]).forEach(function (key) { gitHostInfo[key] = gitHosts[type][key] }) gitHostInfo.user = user gitHostInfo.auth = auth gitHostInfo.project = project gitHostInfo.committish = committish gitHostInfo.default = defaultRepresentation } GitHost.prototype = {} GitHost.prototype.hash = function () { return this.committish ? 
'#' + this.committish : '' } GitHost.prototype._fill = function (template, vars) { if (!template) return if (!vars) vars = {} var self = this Object.keys(this).forEach(function (key) { if (self[key] != null && vars[key] == null) vars[key] = self[key] }) var rawAuth = vars.auth var rawComittish = vars.committish Object.keys(vars).forEach(function (key) { vars[key] = encodeURIComponent(vars[key]) }) vars['auth@'] = rawAuth ? rawAuth + '@' : '' vars['#committish'] = rawComittish ? '#' + rawComittish : '' vars['/tree/committish'] = vars.committish ? '/' + vars.treepath + '/' + vars.committish : '' vars['/committish'] = vars.committish ? '/' + vars.committish : '' vars.committish = vars.committish || 'master' var res = template Object.keys(vars).forEach(function (key) { res = res.replace(new RegExp('[{]' + key + '[}]', 'g'), vars[key]) }) return res } GitHost.prototype.ssh = function () { return this._fill(this.sshtemplate) } GitHost.prototype.sshurl = function () { return this._fill(this.sshurltemplate) } GitHost.prototype.browse = function () { return this._fill(this.browsetemplate) } GitHost.prototype.docs = function () { return this._fill(this.docstemplate) } GitHost.prototype.bugs = function () { return this._fill(this.bugstemplate) } GitHost.prototype.https = function () { return this._fill(this.httpstemplate) } GitHost.prototype.git = function () { return this._fill(this.gittemplate) } GitHost.prototype.shortcut = function () { return this._fill(this.shortcuttemplate) } GitHost.prototype.path = function () { return this._fill(this.pathtemplate) } GitHost.prototype.file = function (P) { return this._fill(this.filetemplate, { path: P.replace(/^[/]+/g, '') }) } GitHost.prototype.getDefaultRepresentation = function () { return this.default } GitHost.prototype.toString = function () { return (this[this.default] || this.sshurl).call(this) } npm_3.5.2.orig/node_modules/hosted-git-info/index.js0000644000000000000000000000662012631326456020620 0ustar 00000000000000'use strict' var url = require('url') var gitHosts = require('./git-host-info.js') var GitHost = module.exports = require('./git-host.js') var protocolToRepresentationMap = { 'git+ssh': 'sshurl', 'git+https': 'https', 'ssh': 'sshurl', 'git': 'git' } function protocolToRepresentation (protocol) { if (protocol.substr(-1) === ':') protocol = protocol.slice(0, -1) return protocolToRepresentationMap[protocol] || protocol } var authProtocols = { 'git:': true, 'https:': true, 'git+https:': true, 'http:': true, 'git+http:': true } module.exports.fromUrl = function (giturl) { if (giturl == null || giturl === '') return var url = fixupUnqualifiedGist( isGitHubShorthand(giturl) ? 'github:' + giturl : giturl ) var parsed = parseGitUrl(url) var matches = Object.keys(gitHosts).map(function (gitHostName) { var gitHostInfo = gitHosts[gitHostName] var auth = null if (parsed.auth && authProtocols[parsed.protocol]) { auth = decodeURIComponent(parsed.auth) } var committish = parsed.hash ? 
decodeURIComponent(parsed.hash.substr(1)) : null var user = null var project = null var defaultRepresentation = null if (parsed.protocol === gitHostName + ':') { user = decodeURIComponent(parsed.host) project = parsed.path && decodeURIComponent(parsed.path.replace(/^[/](.*?)(?:[.]git)?$/, '$1')) defaultRepresentation = 'shortcut' } else { if (parsed.host !== gitHostInfo.domain) return if (!gitHostInfo.protocols_re.test(parsed.protocol)) return var pathmatch = gitHostInfo.pathmatch var matched = parsed.path.match(pathmatch) if (!matched) return if (matched[1] != null) user = decodeURIComponent(matched[1]) if (matched[2] != null) project = decodeURIComponent(matched[2]) defaultRepresentation = protocolToRepresentation(parsed.protocol) } return new GitHost(gitHostName, user, auth, project, committish, defaultRepresentation) }).filter(function (gitHostInfo) { return gitHostInfo }) if (matches.length !== 1) return return matches[0] } function isGitHubShorthand (arg) { // Note: This does not fully test the git ref format. // See https://www.kernel.org/pub/software/scm/git/docs/git-check-ref-format.html // // The only way to do this properly would be to shell out to // git-check-ref-format, and as this is a fast sync function, // we don't want to do that. Just let git fail if it turns // out that the commit-ish is invalid. // GH usernames cannot start with . or - return /^[^:@%/\s.-][^:@%/\s]*[/][^:@\s/%]+(?:#.*)?$/.test(arg) } function fixupUnqualifiedGist (giturl) { // necessary for round-tripping gists var parsed = url.parse(giturl) if (parsed.protocol === 'gist:' && parsed.host && !parsed.path) { return parsed.protocol + '/' + parsed.host } else { return giturl } } function parseGitUrl (giturl) { if (typeof giturl !== 'string') giturl = '' + giturl var matched = giturl.match(/^([^@]+)@([^:]+):[/]?((?:[^/]+[/])?[^/]+?)(?:[.]git)?(#.*)?$/) if (!matched) return url.parse(giturl) return { protocol: 'git+ssh:', slashes: true, auth: matched[1], host: matched[2], port: null, hostname: matched[2], hash: matched[4], search: null, query: null, pathname: '/' + matched[3], path: '/' + matched[3], href: 'git+ssh://' + matched[1] + '@' + matched[2] + '/' + matched[3] + (matched[4] || '') } } npm_3.5.2.orig/node_modules/hosted-git-info/package.json0000644000000000000000000000741512631326456021444 0ustar 00000000000000{ "name": "hosted-git-info", "version": "2.1.4", "description": "Provides metadata and conversions from repository urls for Github, Bitbucket and Gitlab", "main": "index.js", "repository": { "type": "git", "url": "git+https://github.com/npm/hosted-git-info.git" }, "keywords": [ "git", "github", "bitbucket", "gitlab" ], "author": { "name": "Rebecca Turner", "email": "me@re-becca.org", "url": "http://re-becca.org" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/hosted-git-info/issues" }, "homepage": "https://github.com/npm/hosted-git-info", "scripts": { "test": "standard && tap test/*.js" }, "devDependencies": { "standard": "^3.3.2", "tap": "^0.4.13" }, "readme": "# hosted-git-info\n\nThis will let you identify and transform various git hosts URLs between\nprotocols. 
It also can tell you what the URL is for the raw path for\nparticular file for direct access without git.\n\n## Usage\n\n```javascript\nvar hostedGitInfo = require(\"hosted-git-info\")\nvar info = hostedGitInfo.fromUrl(\"git@github.com:npm/hosted-git-info.git\")\n/* info looks like:\n{\n type: \"github\",\n domain: \"github.com\",\n user: \"npm\",\n project: \"hosted-git-info\"\n}\n*/\n```\n\nIf the URL can't be matched with a git host, `null` will be returned. We\ncan match git, ssh and https urls. Additionally, we can match ssh connect\nstrings (`git@github.com:npm/hosted-git-info`) and shortcuts (eg,\n`github:npm/hosted-git-info`). Github specifically, is detected in the case\nof a third, unprefixed, form: `npm/hosted-git-info`.\n\nIf it does match, the returned object has properties of:\n\n* info.type -- The short name of the service\n* info.domain -- The domain for git protocol use\n* info.user -- The name of the user/org on the git host\n* info.project -- The name of the project on the git host\n\nAnd methods of:\n\n* info.file(path)\n\nGiven the path of a file relative to the repository, returns a URL for\ndirectly fetching it from the githost. If no committish was set then\n`master` will be used as the default.\n\nFor example `hostedGitInfo.fromUrl(\"git@github.com:npm/hosted-git-info.git#v1.0.0\").file(\"package.json\")`\nwould return `https://raw.githubusercontent.com/npm/hosted-git-info/v1.0.0/package.json`\n\n* info.shortcut()\n\neg, `github:npm/hosted-git-info`\n\n* info.browse()\n\neg, `https://github.com/npm/hosted-git-info/tree/v1.2.0`\n\n* info.bugs()\n\neg, `https://github.com/npm/hosted-git-info/issues`\n\n* info.docs()\n\neg, `https://github.com/npm/hosted-git-info/tree/v1.2.0#readme`\n\n* info.https()\n\neg, `git+https://github.com/npm/hosted-git-info.git`\n\n* info.sshurl()\n\neg, `git+ssh://git@github.com/npm/hosted-git-info.git`\n\n* info.ssh()\n\neg, `git@github.com:npm/hosted-git-info.git`\n\n* info.path()\n\neg, `npm/hosted-git-info`\n\n* info.getDefaultRepresentation()\n\nReturns the default output type. The default output type is based on the\nstring you passed in to be parsed\n\n* info.toString()\n\nUses the getDefaultRepresentation to call one of the other methods to get a URL for\nthis resource. As such `hostedGitInfo.fromUrl(url).toString()` will give\nyou a normalized version of the URL that still uses the same protocol.\n\nShortcuts will still be returned as shortcuts, but the special case github\nform of `org/project` will be normalized to `github:org/project`.\n\nSSH connect strings will be normalized into `git+ssh` URLs.\n\n\n## Supported hosts\n\nCurrently this supports Github, Bitbucket and Gitlab. 
Pull requests for\nadditional hosts welcome.\n\n", "readmeFilename": "README.md", "gitHead": "9e1a36df8eb050a663713c79e56d89dadba2bd8d", "_id": "hosted-git-info@2.1.4", "_shasum": "d9e953b26988be88096c46e926494d9604c300f8", "_from": "hosted-git-info@>=2.1.4 <2.2.0" } npm_3.5.2.orig/node_modules/hosted-git-info/test/0000755000000000000000000000000012631326456020126 5ustar 00000000000000npm_3.5.2.orig/node_modules/hosted-git-info/test/basic.js0000644000000000000000000000174112631326456021550 0ustar 00000000000000'use strict' var HostedGit = require('../index') var test = require('tap').test test('basic', function (t) { t.is(HostedGit.fromUrl('https://google.com'), undefined, 'null on failure') t.is(HostedGit.fromUrl('https://github.com/abc/def').getDefaultRepresentation(), 'https', 'match https urls') t.is(HostedGit.fromUrl('ssh://git@github.com/abc/def').getDefaultRepresentation(), 'sshurl', 'match ssh urls') t.is(HostedGit.fromUrl('git+ssh://git@github.com/abc/def').getDefaultRepresentation(), 'sshurl', 'match git+ssh urls') t.is(HostedGit.fromUrl('git+https://github.com/abc/def').getDefaultRepresentation(), 'https', 'match git+https urls') t.is(HostedGit.fromUrl('git@github.com:abc/def').getDefaultRepresentation(), 'sshurl', 'match ssh connect strings') t.is(HostedGit.fromUrl('git://github.com/abc/def').getDefaultRepresentation(), 'git', 'match git urls') t.is(HostedGit.fromUrl('github:abc/def').getDefaultRepresentation(), 'shortcut', 'match shortcuts') t.end() }) npm_3.5.2.orig/node_modules/hosted-git-info/test/bitbucket-https-with-embedded-auth.js0000644000000000000000000000163112631326456027240 0ustar 00000000000000'use strict' var HostedGit = require('../index') var test = require('tap').test test('Bitbucket HTTPS URLs with embedded auth', function (t) { t.is( HostedGit.fromUrl('https://user:pass@bitbucket.org/user/repo.git').toString(), 'git+https://user:pass@bitbucket.org/user/repo.git', 'credentials were included in URL' ) t.is( HostedGit.fromUrl('https://user:pass@bitbucket.org/user/repo').toString(), 'git+https://user:pass@bitbucket.org/user/repo.git', 'credentials were included in URL' ) t.is( HostedGit.fromUrl('git+https://user:pass@bitbucket.org/user/repo.git').toString(), 'git+https://user:pass@bitbucket.org/user/repo.git', 'credentials were included in URL' ) t.is( HostedGit.fromUrl('git+https://user:pass@bitbucket.org/user/repo').toString(), 'git+https://user:pass@bitbucket.org/user/repo.git', 'credentials were included in URL' ) t.end() }) npm_3.5.2.orig/node_modules/hosted-git-info/test/bitbucket.js0000644000000000000000000000214012631326456022435 0ustar 00000000000000'use strict' var HostedGit = require('../index') var test = require('tap').test test('fromUrl(bitbucket url)', function (t) { function verify (host, label, branch) { var hostinfo = HostedGit.fromUrl(host) var hash = branch ? '#' + branch : '' t.ok(hostinfo, label) if (!hostinfo) return t.is(hostinfo.https(), 'git+https://bitbucket.org/111/222.git' + hash, label + ' -> https') t.is(hostinfo.browse(), 'https://bitbucket.org/111/222' + (branch ? '/src/' + branch : ''), label + ' -> browse') t.is(hostinfo.docs(), 'https://bitbucket.org/111/222' + (branch ? 
'/src/' + branch : '') + '#readme', label + ' -> docs') t.is(hostinfo.ssh(), 'git@bitbucket.org:111/222.git' + hash, label + ' -> ssh') t.is(hostinfo.sshurl(), 'git+ssh://git@bitbucket.org/111/222.git' + hash, label + ' -> sshurl') t.is(hostinfo.shortcut(), 'bitbucket:111/222' + hash, label + ' -> shortcut') t.is(hostinfo.file('C'), 'https://bitbucket.org/111/222/raw/' + (branch || 'master') + '/C', label + ' -> file') } require('./lib/standard-tests')(verify, 'bitbucket.org', 'bitbucket') t.end() }) npm_3.5.2.orig/node_modules/hosted-git-info/test/gist.js0000644000000000000000000000367612631326456021446 0ustar 00000000000000'use strict' var HostedGit = require('../index') var test = require('tap').test test('fromUrl(gist url)', function (t) { function verify (host, label, branch) { var hostinfo = HostedGit.fromUrl(host) var hash = branch ? '#' + branch : '' t.ok(hostinfo, label) if (!hostinfo) return t.is(hostinfo.https(), 'git+https://gist.github.com/222.git' + hash, label + ' -> https') t.is(hostinfo.git(), 'git://gist.github.com/222.git' + hash, label + ' -> git') t.is(hostinfo.browse(), 'https://gist.github.com/222' + (branch ? '/' + branch : ''), label + ' -> browse') t.is(hostinfo.bugs(), 'https://gist.github.com/222', label + ' -> bugs') t.is(hostinfo.docs(), 'https://gist.github.com/222' + (branch ? '/' + branch : ''), label + ' -> docs') t.is(hostinfo.ssh(), 'git@gist.github.com:/222.git' + hash, label + ' -> ssh') t.is(hostinfo.sshurl(), 'git+ssh://git@gist.github.com/222.git' + hash, label + ' -> sshurl') t.is(hostinfo.shortcut(), 'gist:222' + hash, label + ' -> shortcut') if (hostinfo.user) { t.is(hostinfo.file('C'), 'https://gist.githubusercontent.com/111/222/raw/' + (branch ? branch + '/' : '') + 'C', label + ' -> file') } } verify('git@gist.github.com:222.git', 'git@') var hostinfo = HostedGit.fromUrl('git@gist.github.com:/ef860c7z5e0de3179341.git') if (t.ok(hostinfo, 'git@hex')) { t.is(hostinfo.https(), 'git+https://gist.github.com/ef860c7z5e0de3179341.git', 'git@hex -> https') } verify('git@gist.github.com:/222.git', 'git@/') verify('git://gist.github.com/222', 'git') verify('git://gist.github.com/222.git', 'git.git') verify('git://gist.github.com/222#branch', 'git#branch', 'branch') verify('git://gist.github.com/222.git#branch', 'git.git#branch', 'branch') require('./lib/standard-tests')(verify, 'gist.github.com', 'gist') verify(HostedGit.fromUrl('gist:111/222').toString(), 'round-tripped shortcut') verify('gist:222', 'shortened shortcut') t.end() }) npm_3.5.2.orig/node_modules/hosted-git-info/test/github.js0000644000000000000000000000355612631326456021757 0ustar 00000000000000'use strict' var HostedGit = require('../index') var test = require('tap').test test('fromUrl(github url)', function (t) { function verify (host, label, branch) { var hostinfo = HostedGit.fromUrl(host) var hash = branch ? '#' + branch : '' t.ok(hostinfo, label) if (!hostinfo) return t.is(hostinfo.https(), 'git+https://github.com/111/222.git' + hash, label + ' -> https') t.is(hostinfo.git(), 'git://github.com/111/222.git' + hash, label + ' -> git') t.is(hostinfo.browse(), 'https://github.com/111/222' + (branch ? '/tree/' + branch : ''), label + ' -> browse') t.is(hostinfo.bugs(), 'https://github.com/111/222/issues', label + ' -> bugs') t.is(hostinfo.docs(), 'https://github.com/111/222' + (branch ? 
'/tree/' + branch : '') + '#readme', label + ' -> docs') t.is(hostinfo.ssh(), 'git@github.com:111/222.git' + hash, label + ' -> ssh') t.is(hostinfo.sshurl(), 'git+ssh://git@github.com/111/222.git' + hash, label + ' -> sshurl') t.is(hostinfo.shortcut(), 'github:111/222' + hash, label + ' -> shortcut') t.is(hostinfo.file('C'), 'https://raw.githubusercontent.com/111/222/' + (branch || 'master') + '/C', label + ' -> file') } // github shorturls verify('111/222', 'github-short') verify('111/222#branch', 'github-short#branch', 'branch') // insecure protocols verify('git://github.com/111/222', 'git') verify('git://github.com/111/222.git', 'git.git') verify('git://github.com/111/222#branch', 'git#branch', 'branch') verify('git://github.com/111/222.git#branch', 'git.git#branch', 'branch') verify('http://github.com/111/222', 'http') verify('http://github.com/111/222.git', 'http.git') verify('http://github.com/111/222#branch', 'http#branch', 'branch') verify('http://github.com/111/222.git#branch', 'http.git#branch', 'branch') require('./lib/standard-tests')(verify, 'github.com', 'github') t.end() }) npm_3.5.2.orig/node_modules/hosted-git-info/test/gitlab.js0000644000000000000000000000210412631326456021723 0ustar 00000000000000'use strict' var HostedGit = require('../index') var test = require('tap').test test('fromUrl(gitlab url)', function (t) { function verify (host, label, branch) { var hostinfo = HostedGit.fromUrl(host) var hash = branch ? '#' + branch : '' t.ok(hostinfo, label) if (!hostinfo) return t.is(hostinfo.https(), 'git+https://gitlab.com/111/222.git' + hash, label + ' -> https') t.is(hostinfo.browse(), 'https://gitlab.com/111/222' + (branch ? '/tree/' + branch : ''), label + ' -> browse') t.is(hostinfo.docs(), 'https://gitlab.com/111/222' + (branch ? '/tree/' + branch : '') + '#README', label + ' -> docs') t.is(hostinfo.ssh(), 'git@gitlab.com:111/222.git' + hash, label + ' -> ssh') t.is(hostinfo.sshurl(), 'git+ssh://git@gitlab.com/111/222.git' + hash, label + ' -> sshurl') t.is(hostinfo.shortcut(), 'gitlab:111/222' + hash, label + ' -> shortcut') t.is(hostinfo.file('C'), 'https://gitlab.com/111/222/raw/' + (branch || 'master') + '/C', label + ' -> file') } require('./lib/standard-tests')(verify, 'gitlab.com', 'gitlab') t.end() }) npm_3.5.2.orig/node_modules/hosted-git-info/test/https-with-inline-auth.js0000644000000000000000000000417012631326456025014 0ustar 00000000000000'use strict' var HostedGit = require('../index') var test = require('tap').test test('HTTPS GitHub URL with embedded auth -- generally not a good idea', function (t) { function verify (host, label, branch) { var hostinfo = HostedGit.fromUrl(host) var hash = branch ? '#' + branch : '' t.ok(hostinfo, label) if (!hostinfo) return t.is(hostinfo.https(), 'git+https://user:pass@github.com/111/222.git' + hash, label + ' -> https') t.is(hostinfo.git(), 'git://user:pass@github.com/111/222.git' + hash, label + ' -> git') t.is(hostinfo.browse(), 'https://github.com/111/222' + (branch ? '/tree/' + branch : ''), label + ' -> browse') t.is(hostinfo.bugs(), 'https://github.com/111/222/issues', label + ' -> bugs') t.is(hostinfo.docs(), 'https://github.com/111/222' + (branch ? 
'/tree/' + branch : '') + '#readme', label + ' -> docs') t.is(hostinfo.ssh(), 'git@github.com:111/222.git' + hash, label + ' -> ssh') t.is(hostinfo.sshurl(), 'git+ssh://git@github.com/111/222.git' + hash, label + ' -> sshurl') t.is(hostinfo.shortcut(), 'github:111/222' + hash, label + ' -> shortcut') t.is(hostinfo.file('C'), 'https://user:pass@raw.githubusercontent.com/111/222/' + (branch || 'master') + '/C', label + ' -> file') } // insecure protocols verify('git://user:pass@github.com/111/222', 'git') verify('git://user:pass@github.com/111/222.git', 'git.git') verify('git://user:pass@github.com/111/222#branch', 'git#branch', 'branch') verify('git://user:pass@github.com/111/222.git#branch', 'git.git#branch', 'branch') verify('https://user:pass@github.com/111/222', 'https') verify('https://user:pass@github.com/111/222.git', 'https.git') verify('https://user:pass@github.com/111/222#branch', 'https#branch', 'branch') verify('https://user:pass@github.com/111/222.git#branch', 'https.git#branch', 'branch') verify('http://user:pass@github.com/111/222', 'http') verify('http://user:pass@github.com/111/222.git', 'http.git') verify('http://user:pass@github.com/111/222#branch', 'http#branch', 'branch') verify('http://user:pass@github.com/111/222.git#branch', 'http.git#branch', 'branch') t.end() }) npm_3.5.2.orig/node_modules/hosted-git-info/test/lib/0000755000000000000000000000000012631326456020674 5ustar 00000000000000npm_3.5.2.orig/node_modules/hosted-git-info/test/lib/standard-tests.js0000644000000000000000000000263112631326456024174 0ustar 00000000000000'use strict' module.exports = function (verify, domain, shortname) { verify('https://' + domain + '/111/222', 'https') verify('https://' + domain + '/111/222.git', 'https.git') verify('https://' + domain + '/111/222#branch', 'https#branch', 'branch') verify('https://' + domain + '/111/222.git#branch', 'https.git#branch', 'branch') verify('git+https://' + domain + '/111/222', 'git+https') verify('git+https://' + domain + '/111/222.git', 'git+https.git') verify('git+https://' + domain + '/111/222#branch', 'git+https#branch', 'branch') verify('git+https://' + domain + '/111/222.git#branch', 'git+https.git#branch', 'branch') verify('git@' + domain + ':111/222', 'ssh') verify('git@' + domain + ':111/222.git', 'ssh.git') verify('git@' + domain + ':111/222#branch', 'ssh', 'branch') verify('git@' + domain + ':111/222.git#branch', 'ssh.git', 'branch') verify('git+ssh://git@' + domain + '/111/222', 'ssh url') verify('git+ssh://git@' + domain + '/111/222.git', 'ssh url.git') verify('git+ssh://git@' + domain + '/111/222#branch', 'ssh url#branch', 'branch') verify('git+ssh://git@' + domain + '/111/222.git#branch', 'ssh url.git#branch', 'branch') verify(shortname + ':111/222', 'shortcut') verify(shortname + ':111/222.git', 'shortcut.git') verify(shortname + ':111/222#branch', 'shortcut#branch', 'branch') verify(shortname + ':111/222.git#branch', 'shortcut.git#branch', 'branch') } npm_3.5.2.orig/node_modules/iferr/.npmignore0000644000000000000000000000001512631326456017251 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/iferr/LICENSE0000644000000000000000000000206412631326456016265 0ustar 00000000000000The MIT License (MIT) Copyright (c) 2014 Nadav Ivgi Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or 
sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.npm_3.5.2.orig/node_modules/iferr/README.md0000644000000000000000000000124312631326456016535 0ustar 00000000000000# iferr Higher-order functions for easier error handling. `if (err) return cb(err);` be gone! ## Install ```bash npm install iferr ``` ## Use ### JavaScript example ```js var iferr = require('iferr'); function get_friends_count(id, cb) { User.load_user(id, iferr(cb, function(user) { user.load_friends(iferr(cb, function(friends) { cb(null, friends.length); })); })); } ``` ### CoffeeScript example ```coffee iferr = require 'iferr' get_friends_count = (id, cb) -> User.load_user id, iferr cb, (user) -> user.load_friends iferr cb, (friends) -> cb null, friends.length ``` (TODO: document tiferr, throwerr and printerr) ## License MIT npm_3.5.2.orig/node_modules/iferr/index.coffee0000644000000000000000000000144312631326456017540 0ustar 00000000000000# Delegates to `succ` on success or to `fail` on error # ex: Thing.load 123, iferr cb, (thing) -> ... iferr = (fail, succ) -> (err, a...) -> if err? then fail err else succ? a... # Like iferr, but also catches errors thrown from `succ` and passes to `fail` tiferr = (fail, succ) -> iferr fail, (a...) -> try succ a... catch err then fail err # Delegate to the success function on success, or throw the error otherwise # ex: Thing.load 123, throwerr (thing) -> ... throwerr = iferr.bind null, (err) -> throw err # Prints errors when one is passed, or does nothing otherwise # ex: thing.save printerr printerr = iferr (err) -> console.error err.stack or err module.exports = exports = iferr exports.iferr = iferr exports.tiferr = tiferr exports.throwerr = throwerr exports.printerr = printerr npm_3.5.2.orig/node_modules/iferr/index.js0000644000000000000000000000205512631326456016725 0ustar 00000000000000// Generated by CoffeeScript 1.7.1 (function() { var exports, iferr, printerr, throwerr, tiferr, __slice = [].slice; iferr = function(fail, succ) { return function() { var a, err; err = arguments[0], a = 2 <= arguments.length ? __slice.call(arguments, 1) : []; if (err != null) { return fail(err); } else { return typeof succ === "function" ? succ.apply(null, a) : void 0; } }; }; tiferr = function(fail, succ) { return iferr(fail, function() { var a, err; a = 1 <= arguments.length ?
__slice.call(arguments, 0) : []; try { return succ.apply(null, a); } catch (_error) { err = _error; return fail(err); } }); }; throwerr = iferr.bind(null, function(err) { throw err; }); printerr = iferr(function(err) { return console.error(err.stack || err); }); module.exports = exports = iferr; exports.iferr = iferr; exports.tiferr = tiferr; exports.throwerr = throwerr; exports.printerr = printerr; }).call(this); npm_3.5.2.orig/node_modules/iferr/package.json0000644000000000000000000000220412631326456017542 0ustar 00000000000000{ "name": "iferr", "version": "0.1.5", "description": "Higher-order functions for easier error handling", "main": "index.js", "scripts": { "test": "mocha", "prepublish": "coffee -c index.coffee" }, "repository": { "type": "git", "url": "https://github.com/shesek/iferr" }, "keywords": [ "error", "errors" ], "author": { "name": "Nadav Ivgi" }, "license": "MIT", "bugs": { "url": "https://github.com/shesek/iferr/issues" }, "homepage": "https://github.com/shesek/iferr", "devDependencies": { "coffee-script": "^1.7.1", "mocha": "^1.18.2" }, "_id": "iferr@0.1.5", "dist": { "shasum": "c60eed69e6d8fdb6b3104a1fcbca1c192dc5b501", "tarball": "http://registry.npmjs.org/iferr/-/iferr-0.1.5.tgz" }, "_from": "iferr@>=0.1.5 <0.2.0", "_npmVersion": "1.4.4", "_npmUser": { "name": "nadav", "email": "npm@shesek.info" }, "maintainers": [ { "name": "nadav", "email": "npm@shesek.info" } ], "directories": {}, "_shasum": "c60eed69e6d8fdb6b3104a1fcbca1c192dc5b501", "_resolved": "https://registry.npmjs.org/iferr/-/iferr-0.1.5.tgz" } npm_3.5.2.orig/node_modules/iferr/test/0000755000000000000000000000000012631326456016235 5ustar 00000000000000npm_3.5.2.orig/node_modules/iferr/test/index.coffee0000644000000000000000000000216412631326456020520 0ustar 00000000000000{ iferr, tiferr, throwerr } = require '../index.coffee' { equal: eq, throws } = require 'assert' invoke_fail = (cb) -> cb new Error 'callback error' invoke_succ = (cb) -> cb null throw_error = -> throw new Error 'thrown' describe 'iferr', -> it 'calls the error callback on errors', (done) -> invoke_fail iferr( (err) -> eq err.message, 'callback error' do done -> done new Error 'shouldn\'t call the success callback' ) it 'calls the success callback on success', (done) -> invoke_succ iferr( -> done new Error 'shouldn\'t call the error callback' done ) describe 'tiferr', -> it 'catches errors in the success callback', (done) -> invoke_succ tiferr( (err) -> eq err.message, 'thrown' do done throw_error ) describe 'throwerr', -> it 'throws errors passed to the callback', (done)-> try invoke_fail throwerr -> done 'shouldn\'t call the success callback' catch err eq err.message, 'callback error' do done it 'delegates to the success callback otherwise', (done) -> invoke_succ throwerr done npm_3.5.2.orig/node_modules/iferr/test/mocha.opts0000644000000000000000000000007212631326456020232 0ustar 00000000000000--compilers coffee:coffee-script/register --reporter spec npm_3.5.2.orig/node_modules/imurmurhash/README.md0000644000000000000000000001123312631326456017772 0ustar 00000000000000iMurmurHash.js ============== An incremental implementation of the MurmurHash3 (32-bit) hashing algorithm for JavaScript based on [Gary Court's implementation](https://github.com/garycourt/murmurhash-js) with [kazuyukitanimura's modifications](https://github.com/kazuyukitanimura/murmurhash-js). 
This version works significantly faster than the non-incremental version if you need to hash many small strings into a single hash, since string concatenation (to build the single string to pass the non-incremental version) is fairly costly. In one case tested, using the incremental version was about 50% faster than concatenating 5-10 strings and then hashing. Installation ------------ To use iMurmurHash in the browser, [download the latest version](https://raw.github.com/jensyt/imurmurhash-js/master/imurmurhash.min.js) and include it as a script on your site. ```html <script type="text/javascript" src="imurmurhash.min.js"></script> ``` --- To use iMurmurHash in Node.js, install the module using NPM: ```bash npm install imurmurhash ``` Then simply include it in your scripts: ```javascript MurmurHash3 = require('imurmurhash'); ``` Quick Example ------------- ```javascript // Create the initial hash var hashState = MurmurHash3('string'); // Incrementally add text hashState.hash('more strings'); hashState.hash('even more strings'); // All calls can be chained if desired hashState.hash('and').hash('some').hash('more'); // Get a result hashState.result(); // returns 0xe4ccfe6b ``` Functions --------- ### MurmurHash3 ([string], [seed]) Get a hash state object, optionally initialized with the given _string_ and _seed_. _Seed_ must be a positive integer if provided. Calling this function without the `new` keyword will return a cached state object that has been reset. This is safe to use as long as the object is only used from a single thread and no other hashes are created while operating on this one. If this constraint cannot be met, you can use `new` to create a new state object. For example: ```javascript // Use the cached object, calling the function again will return the same // object (but reset, so the current state would be lost) hashState = MurmurHash3(); ... // Create a new object that can be safely used however you wish. Calling the // function again will simply return a new state object, and no state loss // will occur, at the cost of creating more objects. hashState = new MurmurHash3(); ``` Both methods can be mixed however you like if you have different use cases. --- ### MurmurHash3.prototype.hash (string) Incrementally add _string_ to the hash. This can be called as many times as you want for the hash state object, including after a call to `result()`. Returns `this` so calls can be chained. --- ### MurmurHash3.prototype.result () Get the result of the hash as a 32-bit positive integer. This performs the tail and finalizer portions of the algorithm, but does not store the result in the state object. This means that it is perfectly safe to get results and then continue adding strings via `hash`. ```javascript // Do the whole string at once MurmurHash3('this is a test string').result(); // 0x70529328 // Do part of the string, get a result, then the other part var m = MurmurHash3('this is a'); m.result(); // 0xbfc4f834 m.hash(' test string').result(); // 0x70529328 (same as above) ``` --- ### MurmurHash3.prototype.reset ([seed]) Reset the state object for reuse, optionally using the given _seed_ (defaults to 0 like the constructor). Returns `this` so calls can be chained.
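For example, one state object can be reused across several independent hashes (a small sketch; the seed `42` is arbitrary):

```javascript
var MurmurHash3 = require('imurmurhash');

// One reusable state object; reset() clears it between inputs
var state = new MurmurHash3();

['alpha', 'beta', 'gamma'].forEach(function (word) {
  state.reset(42);           // same arbitrary seed for every word
  console.log(word, state.hash(word).result());
});
```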
--- License (MIT) ------------- Copyright (c) 2013 Gary Court, Jens Taylor Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/imurmurhash/imurmurhash.js0000644000000000000000000001047412631326456021423 0ustar 00000000000000/** * @preserve * JS Implementation of incremental MurmurHash3 (r150) (as of May 10, 2013) * * @author Jens Taylor * @see http://github.com/homebrewing/brauhaus-diff * @author Gary Court * @see http://github.com/garycourt/murmurhash-js * @author Austin Appleby * @see http://sites.google.com/site/murmurhash/ */ (function(){ var cache; // Call this function without `new` to use the cached object (good for // single-threaded environments), or with `new` to create a new object. // // @param {string} key A UTF-16 or ASCII string // @param {number} seed An optional positive integer // @return {object} A MurmurHash3 object for incremental hashing function MurmurHash3(key, seed) { var m = this instanceof MurmurHash3 ? this : cache; m.reset(seed) if (typeof key === 'string' && key.length > 0) { m.hash(key); } if (m !== this) { return m; } }; // Incrementally add a string to this hash // // @param {string} key A UTF-16 or ASCII string // @return {object} this MurmurHash3.prototype.hash = function(key) { var h1, k1, i, top, len; len = key.length; this.len += len; k1 = this.k1; i = 0; switch (this.rem) { case 0: k1 ^= len > i ? (key.charCodeAt(i++) & 0xffff) : 0; case 1: k1 ^= len > i ? (key.charCodeAt(i++) & 0xffff) << 8 : 0; case 2: k1 ^= len > i ? (key.charCodeAt(i++) & 0xffff) << 16 : 0; case 3: k1 ^= len > i ? (key.charCodeAt(i) & 0xff) << 24 : 0; k1 ^= len > i ? 
(key.charCodeAt(i++) & 0xff00) >> 8 : 0; } this.rem = (len + this.rem) & 3; // & 3 is same as % 4 len -= this.rem; if (len > 0) { h1 = this.h1; while (1) { k1 = (k1 * 0x2d51 + (k1 & 0xffff) * 0xcc9e0000) & 0xffffffff; k1 = (k1 << 15) | (k1 >>> 17); k1 = (k1 * 0x3593 + (k1 & 0xffff) * 0x1b870000) & 0xffffffff; h1 ^= k1; h1 = (h1 << 13) | (h1 >>> 19); h1 = (h1 * 5 + 0xe6546b64) & 0xffffffff; if (i >= len) { break; } k1 = ((key.charCodeAt(i++) & 0xffff)) ^ ((key.charCodeAt(i++) & 0xffff) << 8) ^ ((key.charCodeAt(i++) & 0xffff) << 16); top = key.charCodeAt(i++); k1 ^= ((top & 0xff) << 24) ^ ((top & 0xff00) >> 8); } k1 = 0; switch (this.rem) { case 3: k1 ^= (key.charCodeAt(i + 2) & 0xffff) << 16; case 2: k1 ^= (key.charCodeAt(i + 1) & 0xffff) << 8; case 1: k1 ^= (key.charCodeAt(i) & 0xffff); } this.h1 = h1; } this.k1 = k1; return this; }; // Get the result of this hash // // @return {number} The 32-bit hash MurmurHash3.prototype.result = function() { var k1, h1; k1 = this.k1; h1 = this.h1; if (k1 > 0) { k1 = (k1 * 0x2d51 + (k1 & 0xffff) * 0xcc9e0000) & 0xffffffff; k1 = (k1 << 15) | (k1 >>> 17); k1 = (k1 * 0x3593 + (k1 & 0xffff) * 0x1b870000) & 0xffffffff; h1 ^= k1; } h1 ^= this.len; h1 ^= h1 >>> 16; h1 = (h1 * 0xca6b + (h1 & 0xffff) * 0x85eb0000) & 0xffffffff; h1 ^= h1 >>> 13; h1 = (h1 * 0xae35 + (h1 & 0xffff) * 0xc2b20000) & 0xffffffff; h1 ^= h1 >>> 16; return h1 >>> 0; }; // Reset the hash object for reuse // // @param {number} seed An optional positive integer MurmurHash3.prototype.reset = function(seed) { this.h1 = typeof seed === 'number' ? seed : 0; this.rem = this.k1 = this.len = 0; return this; }; // A cached object to use. This can be safely used if you're in a single- // threaded environment, otherwise you need to create new hashes to use. cache = new MurmurHash3(); if (typeof(module) != 'undefined') { module.exports = MurmurHash3; } else { this.MurmurHash3 = MurmurHash3; } }()); npm_3.5.2.orig/node_modules/imurmurhash/imurmurhash.min.js0000644000000000000000000000354612631326456022207 0ustar 00000000000000/** * @preserve * JS Implementation of incremental MurmurHash3 (r150) (as of May 10, 2013) * * @author Jens Taylor * @see http://github.com/homebrewing/brauhaus-diff * @author Gary Court * @see http://github.com/garycourt/murmurhash-js * @author Austin Appleby * @see http://sites.google.com/site/murmurhash/ */ !function(){function t(h,r){var s=this instanceof t?this:e;return s.reset(r),"string"==typeof h&&h.length>0&&s.hash(h),s!==this?s:void 0}var e;t.prototype.hash=function(t){var e,h,r,s,i;switch(i=t.length,this.len+=i,h=this.k1,r=0,this.rem){case 0:h^=i>r?65535&t.charCodeAt(r++):0;case 1:h^=i>r?(65535&t.charCodeAt(r++))<<8:0;case 2:h^=i>r?(65535&t.charCodeAt(r++))<<16:0;case 3:h^=i>r?(255&t.charCodeAt(r))<<24:0,h^=i>r?(65280&t.charCodeAt(r++))>>8:0}if(this.rem=3&i+this.rem,i-=this.rem,i>0){for(e=this.h1;;){if(h=4294967295&11601*h+3432906752*(65535&h),h=h<<15|h>>>17,h=4294967295&13715*h+461832192*(65535&h),e^=h,e=e<<13|e>>>19,e=4294967295&5*e+3864292196,r>=i)break;h=65535&t.charCodeAt(r++)^(65535&t.charCodeAt(r++))<<8^(65535&t.charCodeAt(r++))<<16,s=t.charCodeAt(r++),h^=(255&s)<<24^(65280&s)>>8}switch(h=0,this.rem){case 3:h^=(65535&t.charCodeAt(r+2))<<16;case 2:h^=(65535&t.charCodeAt(r+1))<<8;case 1:h^=65535&t.charCodeAt(r)}this.h1=e}return this.k1=h,this},t.prototype.result=function(){var t,e;return 
t=this.k1,e=this.h1,t>0&&(t=4294967295&11601*t+3432906752*(65535&t),t=t<<15|t>>>17,t=4294967295&13715*t+461832192*(65535&t),e^=t),e^=this.len,e^=e>>>16,e=4294967295&51819*e+2246770688*(65535&e),e^=e>>>13,e=4294967295&44597*e+3266445312*(65535&e),e^=e>>>16,e>>>0},t.prototype.reset=function(t){return this.h1="number"==typeof t?t:0,this.rem=this.k1=this.len=0,this},e=new t,"undefined"!=typeof module?module.exports=t:this.MurmurHash3=t}();npm_3.5.2.orig/node_modules/imurmurhash/package.json0000644000000000000000000001551412631326456021007 0ustar 00000000000000{ "_args": [ [ "imurmurhash@^0.1.4", "/Users/ogd/Documents/projects/npm/npm/node_modules/fs-write-stream-atomic" ] ], "_from": "imurmurhash@>=0.1.4 <0.2.0", "_id": "imurmurhash@0.1.4", "_inCache": true, "_installable": true, "_location": "/imurmurhash", "_npmUser": { "email": "jensyt@gmail.com", "name": "jensyt" }, "_npmVersion": "1.3.2", "_phantomChildren": {}, "_requested": { "name": "imurmurhash", "raw": "imurmurhash@^0.1.4", "rawSpec": "^0.1.4", "scope": null, "spec": ">=0.1.4 <0.2.0", "type": "range" }, "_requiredBy": [ "/fs-write-stream-atomic" ], "_resolved": "https://registry.npmjs.org/imurmurhash/-/imurmurhash-0.1.4.tgz", "_shasum": "9218b9b2b928a238b13dc4fb6b6d576f231453ea", "_shrinkwrap": null, "_spec": "imurmurhash@^0.1.4", "_where": "/Users/ogd/Documents/projects/npm/npm/node_modules/fs-write-stream-atomic", "author": { "email": "jensyt@gmail.com", "name": "Jens Taylor", "url": "https://github.com/homebrewing" }, "bugs": { "url": "https://github.com/jensyt/imurmurhash-js/issues" }, "dependencies": {}, "description": "An incremental implementation of MurmurHash3", "devDependencies": {}, "directories": {}, "dist": { "shasum": "9218b9b2b928a238b13dc4fb6b6d576f231453ea", "tarball": "http://registry.npmjs.org/imurmurhash/-/imurmurhash-0.1.4.tgz" }, "engines": { "node": ">=0.8.19" }, "files": [ "README.md", "imurmurhash.js", "imurmurhash.min.js", "package.json" ], "homepage": "https://github.com/jensyt/imurmurhash-js", "keywords": [ "hash", "incremental", "murmur", "murmurhash", "murmurhash3" ], "license": "MIT", "main": "imurmurhash.js", "maintainers": [ { "name": "jensyt", "email": "jensyt@gmail.com" } ], "name": "imurmurhash", "optionalDependencies": {}, "readme": "iMurmurHash.js\n==============\n\nAn incremental implementation of the MurmurHash3 (32-bit) hashing algorithm for JavaScript based on [Gary Court's implementation](https://github.com/garycourt/murmurhash-js) with [kazuyukitanimura's modifications](https://github.com/kazuyukitanimura/murmurhash-js).\n\nThis version works significantly faster than the non-incremental version if you need to hash many small strings into a single hash, since string concatenation (to build the single string to pass the non-incremental version) is fairly costly. 
In one case tested, using the incremental version was about 50% faster than concatenating 5-10 strings and then hashing.\n\nInstallation\n------------\n\nTo use iMurmurHash in the browser, [download the latest version](https://raw.github.com/jensyt/imurmurhash-js/master/imurmurhash.min.js) and include it as a script on your site.\n\n```html\n\n\n```\n\n---\n\nTo use iMurmurHash in Node.js, install the module using NPM:\n\n```bash\nnpm install imurmurhash\n```\n\nThen simply include it in your scripts:\n\n```javascript\nMurmurHash3 = require('imurmurhash');\n```\n\nQuick Example\n-------------\n\n```javascript\n// Create the initial hash\nvar hashState = MurmurHash3('string');\n\n// Incrementally add text\nhashState.hash('more strings');\nhashState.hash('even more strings');\n\n// All calls can be chained if desired\nhashState.hash('and').hash('some').hash('more');\n\n// Get a result\nhashState.result();\n// returns 0xe4ccfe6b\n```\n\nFunctions\n---------\n\n### MurmurHash3 ([string], [seed])\nGet a hash state object, optionally initialized with the given _string_ and _seed_. _Seed_ must be a positive integer if provided. Calling this function without the `new` keyword will return a cached state object that has been reset. This is safe to use as long as the object is only used from a single thread and no other hashes are created while operating on this one. If this constraint cannot be met, you can use `new` to create a new state object. For example:\n\n```javascript\n// Use the cached object, calling the function again will return the same\n// object (but reset, so the current state would be lost)\nhashState = MurmurHash3();\n...\n\n// Create a new object that can be safely used however you wish. Calling the\n// function again will simply return a new state object, and no state loss\n// will occur, at the cost of creating more objects.\nhashState = new MurmurHash3();\n```\n\nBoth methods can be mixed however you like if you have different use cases.\n\n---\n\n### MurmurHash3.prototype.hash (string)\nIncrementally add _string_ to the hash. This can be called as many times as you want for the hash state object, including after a call to `result()`. Returns `this` so calls can be chained.\n\n---\n\n### MurmurHash3.prototype.result ()\nGet the result of the hash as a 32-bit positive integer. This performs the tail and finalizer portions of the algorithm, but does not store the result in the state object. This means that it is perfectly safe to get results and then continue adding strings via `hash`.\n\n```javascript\n// Do the whole string at once\nMurmurHash3('this is a test string').result();\n// 0x70529328\n\n// Do part of the string, get a result, then the other part\nvar m = MurmurHash3('this is a');\nm.result();\n// 0xbfc4f834\nm.hash(' test string').result();\n// 0x70529328 (same as above)\n```\n\n---\n\n### MurmurHash3.prototype.reset ([seed])\nReset the state object for reuse, optionally using the given _seed_ (defaults to 0 like the constructor). 
Returns `this` so calls can be chained.\n\n---\n\nLicense (MIT)\n-------------\nCopyright (c) 2013 Gary Court, Jens Taylor\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of\nthe Software, and to permit persons to whom the Software is furnished to do so,\nsubject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS\nFOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR\nCOPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER\nIN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN\nCONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n", "readmeFilename": "README.md", "repository": { "type": "git", "url": "git+https://github.com/jensyt/imurmurhash-js.git" }, "version": "0.1.4" } npm_3.5.2.orig/node_modules/inflight/LICENSE0000644000000000000000000000135412631326456016763 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/inflight/README.md0000644000000000000000000000173712631326456017242 0ustar 00000000000000# inflight Add callbacks to requests in flight to avoid async duplication ## USAGE ```javascript var inflight = require('inflight') // some request that does some stuff function req(key, callback) { // key is any random string. like a url or filename or whatever. // // will return either a falsey value, indicating that the // request for this key is already in flight, or a new callback // which when called will call all callbacks passed to inflight // with the same key callback = inflight(key, callback) // If we got a falsey value back, then there's already a req going if (!callback) return // this is where you'd fetch the url or whatever // callback is also once()-ified, so it can safely be assigned // to multiple events etc. First call wins.
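// (the setTimeout below stands in for that async work: when it fires,
// every callback that inflight collected under this key is invoked
// with the same arguments)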
setTimeout(function() { callback(null, key) }, 100) } // only assigns a single setTimeout // when it dings, all cbs get called req('foo', cb1) req('foo', cb2) req('foo', cb3) req('foo', cb4) ``` npm_3.5.2.orig/node_modules/inflight/inflight.js0000644000000000000000000000160112631326456020113 0ustar 00000000000000var wrappy = require('wrappy') var reqs = Object.create(null) var once = require('once') module.exports = wrappy(inflight) function inflight (key, cb) { if (reqs[key]) { reqs[key].push(cb) return null } else { reqs[key] = [cb] return makeres(key) } } function makeres (key) { return once(function RES () { var cbs = reqs[key] var len = cbs.length var args = slice(arguments) for (var i = 0; i < len; i++) { cbs[i].apply(null, args) } if (cbs.length > len) { // added more in the interim. // de-zalgo, just in case, but don't call again. cbs.splice(0, len) process.nextTick(function () { RES.apply(null, args) }) } else { delete reqs[key] } }) } function slice (args) { var length = args.length var array = [] for (var i = 0; i < length; i++) array[i] = args[i] return array } npm_3.5.2.orig/node_modules/inflight/package.json0000644000000000000000000000362012631326456020242 0ustar 00000000000000{ "name": "inflight", "version": "1.0.4", "description": "Add callbacks to requests in flight to avoid async duplication", "main": "inflight.js", "dependencies": { "once": "^1.3.0", "wrappy": "1" }, "devDependencies": { "tap": "^0.4.10" }, "scripts": { "test": "tap test.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/inflight.git" }, "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "bugs": { "url": "https://github.com/isaacs/inflight/issues" }, "homepage": "https://github.com/isaacs/inflight", "license": "ISC", "readme": "# inflight\n\nAdd callbacks to requests in flight to avoid async duplication\n\n## USAGE\n\n```javascript\nvar inflight = require('inflight')\n\n// some request that does some stuff\nfunction req(key, callback) {\n // key is any random string. like a url or filename or whatever.\n //\n // will return either a falsey value, indicating that the\n // request for this key is already in flight, or a new callback\n // which when called will call all callbacks passed to inflight\n // with the same key\n callback = inflight(key, callback)\n\n // If we got a falsey value back, then there's already a req going\n if (!callback) return\n\n // this is where you'd fetch the url or whatever\n // callback is also once()-ified, so it can safely be assigned\n // to multiple events etc. 
First call wins.\n setTimeout(function() {\n callback(null, key)\n }, 100)\n}\n\n// only assigns a single setTimeout\n// when it dings, all cbs get called\nreq('foo', cb1)\nreq('foo', cb2)\nreq('foo', cb3)\nreq('foo', cb4)\n```\n", "readmeFilename": "README.md", "_id": "inflight@1.0.4", "_shasum": "6cbb4521ebd51ce0ec0a936bfd7657ef7e9b172a", "_resolved": "https://registry.npmjs.org/inflight/-/inflight-1.0.4.tgz", "_from": "inflight@>=1.0.4 <1.1.0" } npm_3.5.2.orig/node_modules/inflight/test.js0000644000000000000000000000337312631326456017276 0ustar 00000000000000var test = require('tap').test var inf = require('./inflight.js') function req (key, cb) { cb = inf(key, cb) if (cb) setTimeout(function () { cb(key) cb(key) }) return cb } test('basic', function (t) { var calleda = false var a = req('key', function (k) { t.notOk(calleda) calleda = true t.equal(k, 'key') if (calledb) t.end() }) t.ok(a, 'first returned cb function') var calledb = false var b = req('key', function (k) { t.notOk(calledb) calledb = true t.equal(k, 'key') if (calleda) t.end() }) t.notOk(b, 'second should get falsey inflight response') }) test('timing', function (t) { var expect = [ 'method one', 'start one', 'end one', 'two', 'tick', 'three' ] var i = 0 function log (m) { t.equal(m, expect[i], m + ' === ' + expect[i]) ++i if (i === expect.length) t.end() } function method (name, cb) { log('method ' + name) process.nextTick(cb) } var one = inf('foo', function () { log('start one') var three = inf('foo', function () { log('three') }) if (three) method('three', three) log('end one') }) method('one', one) var two = inf('foo', function () { log('two') }) if (two) method('one', two) process.nextTick(log.bind(null, 'tick')) }) test('parameters', function (t) { t.plan(8) var a = inf('key', function (first, second, third) { t.equal(first, 1) t.equal(second, 2) t.equal(third, 3) }) t.ok(a, 'first returned cb function') var b = inf('key', function (first, second, third) { t.equal(first, 1) t.equal(second, 2) t.equal(third, 3) }) t.notOk(b, 'second should get falsey inflight response') setTimeout(function () { a(1, 2, 3) }) }) npm_3.5.2.orig/node_modules/inherits/LICENSE0000644000000000000000000000135512631326456017005 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/inherits/README.md0000644000000000000000000000313112631326456017251 0ustar 00000000000000Browser-friendly inheritance fully compatible with standard node.js [inherits](http://nodejs.org/api/util.html#util_util_inherits_constructor_superconstructor). This package exports standard `inherits` from node.js `util` module in node environment, but also provides alternative browser-friendly implementation through [browser field](https://gist.github.com/shtylman/4339901). 
The alternative implementation is a literal copy of the standard one, located in a standalone module to avoid requiring `util`. It also has a shim for old browsers with no `Object.create` support. While ensuring you are using the standard `inherits` implementation in a node.js environment, it allows bundlers such as [browserify](https://github.com/substack/node-browserify) to not include the full `util` package in your client code if all you need is just the `inherits` function. It's worthwhile, because the browser shim for the `util` package is large and `inherits` is often the single function you need from it. It's recommended to use this package instead of `require('util').inherits` for any code that has a chance to be used not only in node.js but in the browser too. ## usage ```js var inherits = require('inherits'); // then use exactly as the standard one ``` ## note on version ~1.0 Version ~1.0 had completely different motivation and is compatible neither with 2.0 nor with the standard node.js `inherits`. If you are using version ~1.0 and planning to switch to ~2.0, be careful: * new version uses `super_` instead of `super` for referencing superclass * new version overwrites current prototype while old one preserves any existing fields on it npm_3.5.2.orig/node_modules/inherits/inherits.js0000644000000000000000000000005212631326456020154 0ustar 00000000000000module.exports = require('util').inherits npm_3.5.2.orig/node_modules/inherits/inherits_browser.js0000644000000000000000000000124012631326456021717 0ustar 00000000000000if (typeof Object.create === 'function') { // implementation from standard node.js 'util' module module.exports = function inherits(ctor, superCtor) { ctor.super_ = superCtor ctor.prototype = Object.create(superCtor.prototype, { constructor: { value: ctor, enumerable: false, writable: true, configurable: true } }); }; } else { // old school shim for old browsers module.exports = function inherits(ctor, superCtor) { ctor.super_ = superCtor var TempCtor = function () {} TempCtor.prototype = superCtor.prototype ctor.prototype = new TempCtor() ctor.prototype.constructor = ctor } } npm_3.5.2.orig/node_modules/inherits/package.json0000644000000000000000000000477712631326456020301 0ustar 00000000000000{ "name": "inherits", "description": "Browser-friendly inheritance fully compatible with standard node.js inherits()", "version": "2.0.1", "keywords": [ "inheritance", "class", "klass", "oop", "object-oriented", "inherits", "browser", "browserify" ], "main": "./inherits.js", "browser": "./inherits_browser.js", "repository": { "type": "git", "url": "git://github.com/isaacs/inherits.git" }, "license": "ISC", "scripts": { "test": "node test" }, "readme": "Browser-friendly inheritance fully compatible with standard node.js\n[inherits](http://nodejs.org/api/util.html#util_util_inherits_constructor_superconstructor).\n\nThis package exports standard `inherits` from node.js `util` module in\nnode environment, but also provides alternative browser-friendly\nimplementation through [browser\nfield](https://gist.github.com/shtylman/4339901). The alternative\nimplementation is a literal copy of the standard one, located in a standalone\nmodule to avoid requiring `util`. It also has a shim for old\nbrowsers with no `Object.create` support.\n\nWhile ensuring you are using the standard `inherits`\nimplementation in a node.js environment, it allows bundlers such as\n[browserify](https://github.com/substack/node-browserify) to not\ninclude the full `util` package in your client code if all you need is\njust the `inherits` function.
It's worthwhile, because the browser shim for the `util`\npackage is large and `inherits` is often the single function you need\nfrom it.\n\nIt's recommended to use this package instead of\n`require('util').inherits` for any code that has a chance to be used\nnot only in node.js but in the browser too.\n\n## usage\n\n```js\nvar inherits = require('inherits');\n// then use exactly as the standard one\n```\n\n## note on version ~1.0\n\nVersion ~1.0 had completely different motivation and is compatible\nneither with 2.0 nor with the standard node.js `inherits`.\n\nIf you are using version ~1.0 and planning to switch to ~2.0, be\ncareful:\n\n* new version uses `super_` instead of `super` for referencing\n superclass\n* new version overwrites current prototype while old one preserves any\n existing fields on it\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/inherits/issues" }, "homepage": "https://github.com/isaacs/inherits#readme", "_id": "inherits@2.0.1", "_shasum": "b17d08d326b4423e568eff719f91b0b1cbdf69f1", "_resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.1.tgz", "_from": "inherits@>=2.0.1 <2.1.0" } npm_3.5.2.orig/node_modules/inherits/test.js0000644000000000000000000000077612631326456017323 0ustar 00000000000000var inherits = require('./inherits.js') var assert = require('assert') function test(c) { assert(c.constructor === Child) assert(c.constructor.super_ === Parent) assert(Object.getPrototypeOf(c) === Child.prototype) assert(Object.getPrototypeOf(Object.getPrototypeOf(c)) === Parent.prototype) assert(c instanceof Child) assert(c instanceof Parent) } function Child() { Parent.call(this) test(this) } function Parent() {} inherits(Child, Parent) var c = new Child test(c) console.log('ok') npm_3.5.2.orig/node_modules/ini/LICENSE0000644000000000000000000000137512631326456015741 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/ini/README.md0000644000000000000000000000523612631326456016213 0ustar 00000000000000An ini format parser and serializer for node. Sections are treated as nested objects. Items before the first heading are saved on the object directly.
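As a minimal sketch of that nesting (a tiny hypothetical config string; values stay strings unless they are `true`, `false` or `null`):

    var ini = require('ini')
    var config = ini.parse('scope = global\n[a.b]\nc = 1\n')
    // => { scope: 'global', a: { b: { c: '1' } } }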
## Usage Consider an ini-file `config.ini` that looks like this: ; this comment is being ignored scope = global [database] user = dbuser password = dbpassword database = use_this_database [paths.default] datadir = /var/lib/data array[] = first value array[] = second value array[] = third value You can read, manipulate and write the ini-file like so: var fs = require('fs') , ini = require('ini') var config = ini.parse(fs.readFileSync('./config.ini', 'utf-8')) config.scope = 'local' config.database.database = 'use_another_database' config.paths.default.tmpdir = '/tmp' delete config.paths.default.datadir config.paths.default.array.push('fourth value') fs.writeFileSync('./config_modified.ini', ini.stringify(config, { section: 'section' })) This will result in a file called `config_modified.ini` being written to the filesystem with the following content: [section] scope=local [section.database] user=dbuser password=dbpassword database=use_another_database [section.paths.default] tmpdir=/tmp array[]=first value array[]=second value array[]=third value array[]=fourth value ## API ### decode(inistring) Decode the ini-style formatted `inistring` into a nested object. ### parse(inistring) Alias for `decode(inistring)` ### encode(object, [options]) Encode the object `object` into an ini-style formatted string. If the optional parameter `section` is given, then all top-level properties of the object are put into this section and the `section`-string is prepended to all sub-sections, see the usage example above. The `options` object may contain the following: * `section` A string which will be the first `section` in the encoded ini data. Defaults to none. * `whitespace` Boolean to specify whether to put whitespace around the `=` character. By default, whitespace is omitted, to be friendly to some persnickety old parsers that don't tolerate it well. But some find that it's more human-readable and pretty with the whitespace. For backwards compatibility reasons, if a `string` options is passed in, then it is assumed to be the `section` value. ### stringify(object, [options]) Alias for `encode(object, [options])` ### safe(val) Escapes the string `val` such that it is safe to be used as a key or value in an ini-file. Basically escapes quotes. For example ini.safe('"unsafe string"') would result in "\"unsafe string\"" ### unsafe(val) Unescapes the string `val` npm_3.5.2.orig/node_modules/ini/ini.js0000644000000000000000000001132112631326456016041 0ustar 00000000000000 exports.parse = exports.decode = decode exports.stringify = exports.encode = encode exports.safe = safe exports.unsafe = unsafe var eol = process.platform === "win32" ? "\r\n" : "\n" function encode (obj, opt) { var children = [] , out = "" if (typeof opt === "string") { opt = { section: opt, whitespace: false } } else { opt = opt || {} opt.whitespace = opt.whitespace === true } var separator = opt.whitespace ? " = " : "=" Object.keys(obj).forEach(function (k, _, __) { var val = obj[k] if (val && Array.isArray(val)) { val.forEach(function(item) { out += safe(k + "[]") + separator + safe(item) + "\n" }) } else if (val && typeof val === "object") { children.push(k) } else { out += safe(k) + separator + safe(val) + eol } }) if (opt.section && out.length) { out = "[" + safe(opt.section) + "]" + eol + out } children.forEach(function (k, _, __) { var nk = dotSplit(k).join('\\.') var section = (opt.section ? opt.section + "." 
: "") + nk var child = encode(obj[k], { section: section, whitespace: opt.whitespace }) if (out.length && child.length) { out += eol } out += child }) return out } function dotSplit (str) { return str.replace(/\1/g, '\u0002LITERAL\\1LITERAL\u0002') .replace(/\\\./g, '\u0001') .split(/\./).map(function (part) { return part.replace(/\1/g, '\\.') .replace(/\2LITERAL\\1LITERAL\2/g, '\u0001') }) } function decode (str) { var out = {} , p = out , section = null , state = "START" // section |key = value , re = /^\[([^\]]*)\]$|^([^=]+)(=(.*))?$/i , lines = str.split(/[\r\n]+/g) , section = null lines.forEach(function (line, _, __) { if (!line || line.match(/^\s*[;#]/)) return var match = line.match(re) if (!match) return if (match[1] !== undefined) { section = unsafe(match[1]) p = out[section] = out[section] || {} return } var key = unsafe(match[2]) , value = match[3] ? unsafe((match[4] || "")) : true switch (value) { case 'true': case 'false': case 'null': value = JSON.parse(value) } // Convert keys with '[]' suffix to an array if (key.length > 2 && key.slice(-2) === "[]") { key = key.substring(0, key.length - 2) if (!p[key]) { p[key] = [] } else if (!Array.isArray(p[key])) { p[key] = [p[key]] } } // safeguard against resetting a previously defined // array by accidentally forgetting the brackets if (Array.isArray(p[key])) { p[key].push(value) } else { p[key] = value } }) // {a:{y:1},"a.b":{x:2}} --> {a:{y:1,b:{x:2}}} // use a filter to return the keys that have to be deleted. Object.keys(out).filter(function (k, _, __) { if (!out[k] || typeof out[k] !== "object" || Array.isArray(out[k])) return false // see if the parent section is also an object. // if so, add it to that, and mark this one for deletion var parts = dotSplit(k) , p = out , l = parts.pop() , nl = l.replace(/\\\./g, '.') parts.forEach(function (part, _, __) { if (!p[part] || typeof p[part] !== "object") p[part] = {} p = p[part] }) if (p === out && nl === l) return false p[nl] = out[k] return true }).forEach(function (del, _, __) { delete out[del] }) return out } function isQuoted (val) { return (val.charAt(0) === "\"" && val.slice(-1) === "\"") || (val.charAt(0) === "'" && val.slice(-1) === "'") } function safe (val) { return ( typeof val !== "string" || val.match(/[=\r\n]/) || val.match(/^\[/) || (val.length > 1 && isQuoted(val)) || val !== val.trim() ) ? JSON.stringify(val) : val.replace(/;/g, '\\;').replace(/#/g, "\\#") } function unsafe (val, doUnesc) { val = (val || "").trim() if (isQuoted(val)) { // remove the single quotes before calling JSON.parse if (val.charAt(0) === "'") { val = val.substr(1, val.length - 2); } try { val = JSON.parse(val) } catch (_) {} } else { // walk the val to find the first not-escaped ; character var esc = false var unesc = ""; for (var i = 0, l = val.length; i < l; i++) { var c = val.charAt(i) if (esc) { if ("\\;#".indexOf(c) !== -1) unesc += c else unesc += "\\" + c esc = false } else if (";#".indexOf(c) !== -1) { break } else if (c === "\\") { esc = true } else { unesc += c } } if (esc) unesc += "\\" return unesc } return val } npm_3.5.2.orig/node_modules/ini/package.json0000644000000000000000000000715412631326456017223 0ustar 00000000000000{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "ini", "description": "An ini encoder/decoder for node", "version": "1.3.4", "repository": { "type": "git", "url": "git://github.com/isaacs/ini.git" }, "main": "ini.js", "scripts": { "test": "tap test/*.js" }, "engines": { "node": "*" }, "dependencies": {}, "devDependencies": { "tap": "^1.2.0" }, "license": "ISC", "files": [ "ini.js" ], "readme": "An ini format parser and serializer for node.\n\nSections are treated as nested objects. Items before the first\nheading are saved on the object directly.\n\n## Usage\n\nConsider an ini-file `config.ini` that looks like this:\n\n ; this comment is being ignored\n scope = global\n\n [database]\n user = dbuser\n password = dbpassword\n database = use_this_database\n\n [paths.default]\n datadir = /var/lib/data\n array[] = first value\n array[] = second value\n array[] = third value\n\nYou can read, manipulate and write the ini-file like so:\n\n var fs = require('fs')\n , ini = require('ini')\n\n var config = ini.parse(fs.readFileSync('./config.ini', 'utf-8'))\n\n config.scope = 'local'\n config.database.database = 'use_another_database'\n config.paths.default.tmpdir = '/tmp'\n delete config.paths.default.datadir\n config.paths.default.array.push('fourth value')\n\n fs.writeFileSync('./config_modified.ini', ini.stringify(config, { section: 'section' }))\n\nThis will result in a file called `config_modified.ini` being written\nto the filesystem with the following content:\n\n [section]\n scope=local\n [section.database]\n user=dbuser\n password=dbpassword\n database=use_another_database\n [section.paths.default]\n tmpdir=/tmp\n array[]=first value\n array[]=second value\n array[]=third value\n array[]=fourth value\n\n\n## API\n\n### decode(inistring)\n\nDecode the ini-style formatted `inistring` into a nested object.\n\n### parse(inistring)\n\nAlias for `decode(inistring)`\n\n### encode(object, [options])\n\nEncode the object `object` into an ini-style formatted string. If the\noptional parameter `section` is given, then all top-level properties\nof the object are put into this section and the `section`-string is\nprepended to all sub-sections, see the usage example above.\n\nThe `options` object may contain the following:\n\n* `section` A string which will be the first `section` in the encoded\n ini data. Defaults to none.\n* `whitespace` Boolean to specify whether to put whitespace around the\n `=` character. By default, whitespace is omitted, to be friendly to\n some persnickety old parsers that don't tolerate it well. But some\n find that it's more human-readable and pretty with the whitespace.\n\nFor backwards compatibility reasons, if a `string` options is passed\nin, then it is assumed to be the `section` value.\n\n### stringify(object, [options])\n\nAlias for `encode(object, [options])`\n\n### safe(val)\n\nEscapes the string `val` such that it is safe to be used as a key or\nvalue in an ini-file. Basically escapes quotes. 
For example\n\n ini.safe('\"unsafe string\"')\n\nwould result in\n\n \"\\\"unsafe string\\\"\"\n\n### unsafe(val)\n\nUnescapes the string `val`\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/ini/issues" }, "homepage": "https://github.com/isaacs/ini#readme", "_id": "ini@1.3.4", "_shasum": "0537cb79daf59b59a1a517dff706c86ec039162e", "_resolved": "https://registry.npmjs.org/ini/-/ini-1.3.4.tgz", "_from": "ini@>=1.3.4 <1.4.0" } npm_3.5.2.orig/node_modules/init-package-json/.npmignore0000644000000000000000000000003012631326456021442 0ustar 00000000000000node_modules/ .eslintrc npm_3.5.2.orig/node_modules/init-package-json/.travis.yml0000644000000000000000000000007412631326456021564 0ustar 00000000000000language: node_js node_js: - '0.10' - '0.12' - 'iojs' npm_3.5.2.orig/node_modules/init-package-json/LICENSE0000644000000000000000000000135412631326456020462 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/init-package-json/README.md0000644000000000000000000000241012631326456020726 0ustar 00000000000000# init-package-json A node module to get your node module started. [![Build Status](https://secure.travis-ci.org/npm/init-package-json.svg)](http://travis-ci.org/npm/init-package-json) ## Usage ```javascript var init = require('init-package-json') var path = require('path') // a path to a promzard module. In the event that this file is // not found, one will be provided for you. var initFile = path.resolve(process.env.HOME, '.npm-init') // the dir where we're doin stuff. var dir = process.cwd() // extra stuff that gets put into the PromZard module's context. // In npm, this is the resolved config object. Exposed as 'config' // Optional. var configData = { some: 'extra stuff' } // Any existing stuff from the package.json file is also exposed in the // PromZard module as the `package` object. There will also be free // vars for: // * `filename` path to the package.json file // * `basename` the tip of the package dir // * `dirname` the parent of the package dir init(dir, initFile, configData, function (er, data) { // the data's already been written to {dir}/package.json // now you can do stuff with it }) ``` Or from the command line: ``` $ npm-init ``` See [PromZard](https://github.com/isaacs/promzard) for details about what can go in the config file. npm_3.5.2.orig/node_modules/init-package-json/default-input.js0000644000000000000000000001507112631326456022575 0ustar 00000000000000var fs = require('fs') var glob = require('glob') var path = require('path') var validateLicense = require('validate-npm-package-license') var validateName = require('validate-npm-package-name') var npa = require('npm-package-arg') var semver = require('semver') // more popular packages should go here, maybe? 
function isTestPkg (p) { return !!p.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/) } function niceName (n) { return n.replace(/^node-|[.-]js$/g, '').toLowerCase() } function readDeps (test) { return function (cb) { fs.readdir('node_modules', function (er, dir) { if (er) return cb() var deps = {} var n = dir.length if (n === 0) return cb(null, deps) dir.forEach(function (d) { if (d.match(/^\./)) return next() if (test !== isTestPkg(d)) return next() var dp = path.join(dirname, 'node_modules', d, 'package.json') fs.readFile(dp, 'utf8', function (er, p) { if (er) return next() try { p = JSON.parse(p) } catch (e) { return next() } if (!p.version) return next() if (p._requiredBy) { if (!p._requiredBy.some(function (req) { return req === '#USER' })) return next() } deps[d] = config.get('save-exact') ? p.version : config.get('save-prefix') + p.version return next() }) }) function next () { if (--n === 0) return cb(null, deps) } }) }} var name = package.name || basename var spec = npa(name) var scope = config.get('scope') if (scope) { if (scope.charAt(0) !== '@') scope = '@' + scope if (spec.scope) { name = scope + '/' + spec.name.split('/')[1] } else { name = scope + '/' + name } } exports.name = yes ? name : prompt('name', name, function (data) { var its = validateName(data) if (its.validForNewPackages) return data var errors = (its.errors || []).concat(its.warnings || []) var er = new Error('Sorry, ' + errors.join(' and ') + '.') er.notValid = true return er }) var version = package.version || config.get('init.version') || config.get('init-version') || '1.0.0' exports.version = yes ? version : prompt('version', version, function (version) { if (semver.valid(version)) return version var er = new Error('Invalid version: "' + version + '"') er.notValid = true return er }) if (!package.description) { exports.description = yes ? '' : prompt('description') } if (!package.main) { exports.main = function (cb) { fs.readdir(dirname, function (er, f) { if (er) f = [] f = f.filter(function (f) { return f.match(/\.js$/) }) if (f.indexOf('index.js') !== -1) f = 'index.js' else if (f.indexOf('main.js') !== -1) f = 'main.js' else if (f.indexOf(basename + '.js') !== -1) f = basename + '.js' else f = f[0] var index = f || 'index.js' return cb(null, yes ? index : prompt('entry point', index)) }) } } if (!package.bin) { exports.bin = function (cb) { fs.readdir(path.resolve(dirname, 'bin'), function (er, d) { // no bins if (er) return cb() // just take the first js file we find there, or nada return cb(null, d.filter(function (f) { return f.match(/\.js$/) })[0]) }) } } exports.directories = function (cb) { fs.readdir(dirname, function (er, dirs) { if (er) return cb(er) var res = {} dirs.forEach(function (d) { switch (d) { case 'example': case 'examples': return res.example = d case 'test': case 'tests': return res.test = d case 'doc': case 'docs': return res.doc = d case 'man': return res.man = d } }) if (Object.keys(res).length === 0) res = undefined return cb(null, res) }) } if (!package.dependencies) { exports.dependencies = readDeps(false) } if (!package.devDependencies) { exports.devDependencies = readDeps(true) } // MUST have a test script! 
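// If a known framework (tap, expresso, or mocha) is found in node_modules,
// setupScripts suggests a matching test command; otherwise the `notest`
// placeholder below is used.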
var s = package.scripts || {} var notest = 'echo "Error: no test specified" && exit 1' if (!package.scripts) { exports.scripts = function (cb) { fs.readdir(path.join(dirname, 'node_modules'), function (er, d) { setupScripts(d || [], cb) }) } } function setupScripts (d, cb) { // check to see what framework is in use, if any function tx (test) { return test || notest } if (!s.test || s.test === notest) { var commands = { 'tap':'tap test/*.js' , 'expresso':'expresso test' , 'mocha':'mocha' } var command Object.keys(commands).forEach(function (k) { if (d.indexOf(k) !== -1) command = commands[k] }) var ps = 'test command' if (yes) { s.test = command || notest } else { s.test = command ? prompt(ps, command, tx) : prompt(ps, tx) } } return cb(null, s) } if (!package.repository) { exports.repository = function (cb) { fs.readFile('.git/config', 'utf8', function (er, gconf) { if (er || !gconf) { return cb(null, yes ? '' : prompt('git repository')) } gconf = gconf.split(/\r?\n/) var i = gconf.indexOf('[remote "origin"]') if (i !== -1) { var u = gconf[i + 1] if (!u.match(/^\s*url =/)) u = gconf[i + 2] if (!u.match(/^\s*url =/)) u = null else u = u.replace(/^\s*url = /, '') } if (u && u.match(/^git@github.com:/)) u = u.replace(/^git@github.com:/, 'https://github.com/') return cb(null, yes ? u : prompt('git repository', u)) }) } } if (!package.keywords) { exports.keywords = yes ? '' : prompt('keywords', function (s) { if (!s) return undefined if (Array.isArray(s)) s = s.join(' ') if (typeof s !== 'string') return s return s.split(/[\s,]+/) }) } if (!package.author) { exports.author = config.get('init.author.name') || config.get('init-author-name') ? { "name" : config.get('init.author.name') || config.get('init-author-name'), "email" : config.get('init.author.email') || config.get('init-author-email'), "url" : config.get('init.author.url') || config.get('init-author-url') } : yes ? '' : prompt('author') } var license = package.license || config.get('init.license') || config.get('init-license') || 'ISC' exports.license = yes ? license : prompt('license', license, function (data) { var its = validateLicense(data) if (its.validForNewPackages) return data var errors = (its.errors || []).concat(its.warnings || []) var er = new Error('Sorry, ' + errors.join(' and ') + '.') er.notValid = true return er }) npm_3.5.2.orig/node_modules/init-package-json/example/0000755000000000000000000000000012631326456021105 5ustar 00000000000000npm_3.5.2.orig/node_modules/init-package-json/init-package-json.js0000644000000000000000000001005112631326456023310 0ustar 00000000000000 module.exports = init module.exports.yes = yes var PZ = require('promzard').PromZard var path = require('path') var def = require.resolve('./default-input.js') var fs = require('fs') var semver = require('semver') var read = require('read') // to validate the data object at the end as a worthwhile package // and assign default values for things. // readJson.extras(file, data, cb) var readJson = require('read-package-json') function yes (conf) { return !!( conf.get('yes') || conf.get('y') || conf.get('force') || conf.get('f') ) } function init (dir, input, config, cb) { if (typeof config === 'function') cb = config, config = {} // accept either a plain-jane object, or a config object // with a "get" method. 
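// A plain object is wrapped in a minimal shim exposing get() and toJSON(),
// so the rest of this file can always call config.get() regardless of what
// the caller passed in.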
if (typeof config.get !== 'function') { var data = config config = { get: function (k) { return data[k] }, toJSON: function () { return data } } } var package = path.resolve(dir, 'package.json') input = path.resolve(input) var pkg var ctx = { yes: yes(config) } var es = readJson.extraSet readJson.extraSet = es.filter(function (fn) { return fn.name !== 'authors' && fn.name !== 'mans' }) readJson(package, function (er, d) { readJson.extraSet = es if (er) pkg = {} else pkg = d ctx.filename = package ctx.dirname = path.dirname(package) ctx.basename = path.basename(ctx.dirname) if (!pkg.version || !semver.valid(pkg.version)) delete pkg.version ctx.package = pkg ctx.config = config || {} // make sure that the input is valid. // if not, use the default var pz = new PZ(input, ctx) pz.backupFile = def pz.on('error', cb) pz.on('data', function (data) { Object.keys(data).forEach(function (k) { if (data[k] !== undefined && data[k] !== null) pkg[k] = data[k] }) // only do a few of these. // no need for mans or contributors if they're in the files var es = readJson.extraSet readJson.extraSet = es.filter(function (fn) { return fn.name !== 'authors' && fn.name !== 'mans' }) readJson.extras(package, pkg, function (er, pkg) { readJson.extraSet = es if (er) return cb(er, pkg) pkg = unParsePeople(pkg) // no need for the readme now. delete pkg.readme delete pkg.readmeFilename // really don't want to have this lying around in the file delete pkg._id // ditto delete pkg.gitHead // if the repo is empty, remove it. if (!pkg.repository) delete pkg.repository // readJson filters out empty descriptions, but init-package-json // traditionally leaves them alone if (!pkg.description) pkg.description = data.description var d = JSON.stringify(pkg, null, 2) + '\n' function write (yes) { fs.writeFile(package, d, 'utf8', function (er) { if (!er && yes && !config.get('silent')) { console.log('Wrote to %s:\n\n%s\n', package, d) } return cb(er, pkg) }) } if (ctx.yes) { return write(true) } console.log('About to write to %s:\n\n%s\n', package, d) read({prompt:'Is this ok? ', default: 'yes'}, function (er, ok) { if (!ok || ok.toLowerCase().charAt(0) !== 'y') { console.log('Aborted.') } else { return write() } }) }) }) }) } // turn the objects into somewhat more humane strings. function unParsePeople (data) { if (data.author) data.author = unParsePerson(data.author) ;["maintainers", "contributors"].forEach(function (set) { if (!Array.isArray(data[set])) return; data[set] = data[set].map(unParsePerson) }) return data } function unParsePerson (person) { if (typeof person === "string") return person var name = person.name || "" var u = person.url || person.web var url = u ? (" ("+u+")") : "" var e = person.email || person.mail var email = e ? (" <"+e+">") : "" return name+email+url } npm_3.5.2.orig/node_modules/init-package-json/node_modules/0000755000000000000000000000000012631326456022127 5ustar 00000000000000npm_3.5.2.orig/node_modules/init-package-json/package.json0000644000000000000000000000401512631326456021740 0ustar 00000000000000{ "name": "init-package-json", "version": "1.9.1", "main": "init-package-json.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/init-package-json.git" }, "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "description": "A node module to get your node module started", "dependencies": { "glob": "^5.0.3", "npm-package-arg": "^4.0.0", "promzard": "^0.3.0", "read": "~1.0.1", "read-package-json": "1 || 2", "semver": "2.x || 3.x || 4 || 5", "validate-npm-package-license": "^3.0.1", "validate-npm-package-name": "^2.0.1" }, "devDependencies": { "npm": "^2", "rimraf": "^2.1.4", "tap": "^1.2.0" }, "keywords": [ "init", "package.json", "package", "helper", "wizard", "wizerd", "prompt", "start" ], "gitHead": "37c38b4e23189eb5645901fa6851f343fddd4b73", "bugs": { "url": "https://github.com/isaacs/init-package-json/issues" }, "homepage": "https://github.com/isaacs/init-package-json#readme", "_id": "init-package-json@1.9.1", "_shasum": "a28e05b5baeb3363cd473df68d30d3a80523a31c", "_from": "init-package-json@>=1.9.1 <1.10.0", "_npmVersion": "2.14.1", "_nodeVersion": "2.2.2", "_npmUser": { "name": "zkat", "email": "kat@sykosomatic.org" }, "dist": { "shasum": "a28e05b5baeb3363cd473df68d30d3a80523a31c", "tarball": "http://registry.npmjs.org/init-package-json/-/init-package-json-1.9.1.tgz" }, "maintainers": [ { "name": "isaacs", "email": "isaacs@npmjs.com" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, { "name": "iarna", "email": "me@re-becca.org" }, { "name": "zkat", "email": "kat@sykosomatic.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/init-package-json/-/init-package-json-1.9.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/init-package-json/test/0000755000000000000000000000000012631326456020431 5ustar 00000000000000npm_3.5.2.orig/node_modules/init-package-json/example/example-basic.js0000644000000000000000000000033412631326456024155 0ustar 00000000000000var init = require('../init-package-json.js') var dir = process.cwd() var initFile = require.resolve('./init/basic-init.js') init(dir, initFile, function (err, data) { if (!err) console.log('written successfully') }) npm_3.5.2.orig/node_modules/init-package-json/example/example-default.js0000644000000000000000000000026712631326456024525 0ustar 00000000000000var init = require('../init-package-json.js') var dir = process.cwd() init(dir, 'file that does not exist', function (err, data) { if (!err) console.log('written successfully') }) npm_3.5.2.orig/node_modules/init-package-json/example/example-npm.js0000644000000000000000000000044312631326456023667 0ustar 00000000000000var init = require('../init-package-json.js') var dir = process.cwd() var npm = require('npm') npm.load(function (er, npm) { if (er) throw er init(dir, npm.config.get('init-module'), npm.config, function (er, data) { if (er) throw er console.log('written successfully') }) }) npm_3.5.2.orig/node_modules/init-package-json/example/init/0000755000000000000000000000000012631326456022050 5ustar 00000000000000npm_3.5.2.orig/node_modules/init-package-json/example/init/basic-init.js0000644000000000000000000000013512631326456024427 0ustar 00000000000000exports.flavor = prompt("what's your favorite flavor of ice cream buddy?", "I LIKE THEM ALL")npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/0000755000000000000000000000000012631326456023765 5ustar 00000000000000npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/.npmignore0000644000000000000000000000003612631326456025763 0ustar 00000000000000example/npm-init/package.json 
npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/LICENSE0000644000000000000000000000135412631326456024775 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/README.md0000644000000000000000000001057212631326456025251 0ustar 00000000000000# promzard A prompting wizard for building files from specialized PromZard modules. Used by `npm init`. A reimplementation of @SubStack's [prompter](https://github.com/substack/node-prompter), which does not use AST traversal. From another point of view, it's a reimplementation of [@Marak](https://github.com/marak)'s [wizard](https://github.com/Marak/wizard) which doesn't use schemas. The goal is a nice drop-in enhancement for `npm init`. ## Usage ```javascript var promzard = require('promzard') promzard(inputFile, optionalContextAdditions, function (er, data) { // .. you know what you doing .. }) ``` In the `inputFile` you can have something like this: ```javascript var fs = require('fs') module.exports = { "greeting": prompt("Who shall you greet?", "world", function (who) { return "Hello, " + who }), "filename": __filename, "directory": function (cb) { fs.readdir(__dirname, cb) } } ``` When run, promzard will display the prompts and resolve the async functions in order, and then either give you an error, or the resolved data, ready to be dropped into a JSON file or some other place. ### promzard(inputFile, ctx, callback) The inputFile is just a node module. You can require() things, set module.exports, etc. Whatever that module exports is the result, and it is walked over to call any functions as described below. The only caveat is that you must give PromZard the full absolute path to the module (you can get this via Node's `require.resolve`.) Also, the `prompt` function is injected into the context object, so watch out. Whatever you put in that `ctx` will of course also be available in the module. You can get quite fancy with this, passing in existing configs and so on. ### Class: promzard.PromZard(file, ctx) Just like the `promzard` function, but the EventEmitter that makes it all happen. Emits either a `data` event with the data, or an `error` event if it blows up. If `error` is emitted, then `data` never will be. ### prompt(...) In the promzard input module, you can call the `prompt` function. This prompts the user to input some data. The arguments are interpreted based on type: 1. `string` The first string encountered is the prompt. The second is the default value. 2. `function` A transformer function which receives the data and returns something else. More than meets the eye. 3. `object` The `prompt` member is the prompt, the `default` member is the default value, and the `transform` is the transformer.
Whatever the final value is, that's what will be put on the resulting object. ### Functions If there are any functions on the promzard input module's exports, then promzard will call each of them with a callback. This way, your module can do asynchronous actions if necessary to validate or ascertain whatever needs verification. The functions are called in the context of the ctx object, and are given a single argument, which is a callback that should be called with either an error, or the result to assign to that spot. In the async function, you can also call prompt() and return the result of the prompt in the callback. For example, this works fine in a promzard module: ``` exports.asyncPrompt = function (cb) { fs.stat(someFile, function (er, st) { // if there's an error, no prompt, just error // otherwise prompt and use the actual file size as the default cb(er, prompt('file size', st.size)) }) } ``` You can also return other async functions in the async function callback. Though that's a bit silly, it could be a handy way to reuse functionality in some cases. ### Sync vs Async The `prompt()` function is not synchronous, though it appears that way. It just returns a token that is swapped out later, when the data object is walked over asynchronously. For that reason, prompt() calls whose results don't end up on the data object are never shown to the user. For example, this will only prompt once: ``` exports.promptThreeTimes = prompt('prompt me once', 'shame on you') exports.promptThreeTimes = prompt('prompt me twice', 'um....') exports.promptThreeTimes = prompt('you cant prompt me again') ``` ### Isn't this exactly the sort of 'looks sync' that you said was bad about other libraries? Yeah, sorta. I wouldn't use promzard for anything more complicated than a wizard that spits out prompts to set up a config file or something. Maybe there are other use cases I haven't considered. npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/example/0000755000000000000000000000000012631326456025420 5ustar 00000000000000npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/package.json0000644000000000000000000000234112631326456026253 0ustar 00000000000000{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "promzard", "description": "prompting wizardly", "version": "0.3.0", "repository": { "url": "git://github.com/isaacs/promzard.git" }, "dependencies": { "read": "1" }, "devDependencies": { "tap": "~0.2.5" }, "main": "promzard.js", "scripts": { "test": "tap test/*.js" }, "license": "ISC", "gitHead": "780ead051299aa28be2584199ab6fa503a32d354", "bugs": { "url": "https://github.com/isaacs/promzard/issues" }, "homepage": "https://github.com/isaacs/promzard", "_id": "promzard@0.3.0", "_shasum": "26a5d6ee8c7dee4cb12208305acfb93ba382a9ee", "_from": "promzard@>=0.3.0 <0.4.0", "_npmVersion": "2.7.1", "_nodeVersion": "1.4.2", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "26a5d6ee8c7dee4cb12208305acfb93ba382a9ee", "tarball": "http://registry.npmjs.org/promzard/-/promzard-0.3.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/promzard/-/promzard-0.3.0.tgz", "readme": "ERROR: No README data found!"
} npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/promzard.js0000644000000000000000000001365112631326456026167 0ustar 00000000000000module.exports = promzard promzard.PromZard = PromZard var fs = require('fs') var vm = require('vm') var util = require('util') var files = {} var crypto = require('crypto') var EventEmitter = require('events').EventEmitter var read = require('read') var Module = require('module').Module var path = require('path') function promzard (file, ctx, cb) { if (typeof ctx === 'function') cb = ctx, ctx = null; if (!ctx) ctx = {}; var pz = new PromZard(file, ctx) pz.on('error', cb) pz.on('data', function (data) { cb(null, data) }) } promzard.fromBuffer = function (buf, ctx, cb) { var filename = 0 do { filename = '\0' + Math.random(); } while (files[filename]) files[filename] = buf var ret = promzard(filename, ctx, cb) delete files[filename] return ret } function PromZard (file, ctx) { if (!(this instanceof PromZard)) return new PromZard(file, ctx) EventEmitter.call(this) this.file = file this.ctx = ctx this.unique = crypto.randomBytes(8).toString('hex') this.load() } PromZard.prototype = Object.create( EventEmitter.prototype, { constructor: { value: PromZard, readable: true, configurable: true, writable: true, enumerable: false } } ) PromZard.prototype.load = function () { if (files[this.file]) return this.loaded() fs.readFile(this.file, 'utf8', function (er, d) { if (er && this.backupFile) { this.file = this.backupFile delete this.backupFile return this.load() } if (er) return this.emit('error', this.error = er) files[this.file] = d this.loaded() }.bind(this)) } PromZard.prototype.loaded = function () { this.ctx.prompt = this.makePrompt() this.ctx.__filename = this.file this.ctx.__dirname = path.dirname(this.file) this.ctx.__basename = path.basename(this.file) var mod = this.ctx.module = this.makeModule() this.ctx.require = function (path) { return mod.require(path) } this.ctx.require.resolve = function(path) { return Module._resolveFilename(path, mod); } this.ctx.exports = mod.exports this.script = this.wrap(files[this.file]) var fn = vm.runInThisContext(this.script, this.file) var args = Object.keys(this.ctx).map(function (k) { return this.ctx[k] }.bind(this)) try { var res = fn.apply(this.ctx, args) } catch (er) { this.emit('error', er) } if (res && typeof res === 'object' && exports === mod.exports && Object.keys(exports).length === 1) { this.result = res } else { this.result = mod.exports } this.walk() } PromZard.prototype.makeModule = function () { var mod = new Module(this.file, module) mod.loaded = true mod.filename = this.file mod.id = this.file mod.paths = Module._nodeModulePaths(path.dirname(this.file)) return mod } PromZard.prototype.wrap = function (body) { var s = '(function( %s ) { %s\n })' var args = Object.keys(this.ctx).join(', ') return util.format(s, args, body) } PromZard.prototype.makePrompt = function () { this.prompts = [] return prompt.bind(this) function prompt () { var p, d, t for (var i = 0; i < arguments.length; i++) { var a = arguments[i] if (typeof a === 'string' && p) d = a else if (typeof a === 'string') p = a else if (typeof a === 'function') t = a else if (a && typeof a === 'object') { p = a.prompt || p d = a.default || d t = a.transform || t } } try { return this.unique + '-' + this.prompts.length } finally { this.prompts.push([p, d, t]) } } } PromZard.prototype.walk = function (o, cb) { o = o || this.result cb = cb || function (er, res) { if (er) return this.emit('error', this.error = er) this.result = res return 
this.emit('data', res) } cb = cb.bind(this) var keys = Object.keys(o) var i = 0 var len = keys.length L.call(this) function L () { if (this.error) return while (i < len) { var k = keys[i] var v = o[k] i++ if (v && typeof v === 'object') { return this.walk(v, function (er, res) { if (er) return cb(er) o[k] = res L.call(this) }.bind(this)) } else if (v && typeof v === 'string' && v.indexOf(this.unique) === 0) { var n = +v.substr(this.unique.length + 1) var prompt = this.prompts[n] if (isNaN(n) || !prompt) continue // default to the key if (undefined === prompt[0]) prompt[0] = k // default to the ctx value, if there is one if (undefined === prompt[1]) prompt[1] = this.ctx[k] return this.prompt(prompt, function (er, res) { if (er) { if (!er.notValid) { return this.emit('error', this.error = er); } console.log(er.message) i -- return L.call(this) } o[k] = res L.call(this) }.bind(this)) } else if (typeof v === 'function') { try { return v.call(this.ctx, function (er, res) { if (er) return this.emit('error', this.error = er) o[k] = res // back up so that we process this one again. // this is because it might return a prompt() call in the cb. i -- L.call(this) }.bind(this)) } catch (er) { this.emit('error', er) } } } // made it to the end of the loop, maybe if (i >= len) return cb(null, o) } } PromZard.prototype.prompt = function (pdt, cb) { var prompt = pdt[0] var def = pdt[1] var tx = pdt[2] if (tx) { cb = function (cb) { return function (er, data) { try { var res = tx(data) if (!er && res instanceof Error && !!res.notValid) { return cb(res, null) } return cb(er, res) } catch (er) { this.emit('error', er) } }}(cb).bind(this) } read({ prompt: prompt + ':' , default: def }, cb) } npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/0000755000000000000000000000000012631326456024744 5ustar 00000000000000npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/example/buffer.js0000644000000000000000000000051712631326456027232 0ustar 00000000000000var pz = require('../promzard') var path = require('path') var file = path.resolve(__dirname, 'substack-input.js') var buf = require('fs').readFileSync(file) var ctx = { basename: path.basename(path.dirname(file)) } pz.fromBuffer(buf, ctx, function (er, res) { if (er) throw er console.error(JSON.stringify(res, null, 2)) }) npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/example/index.js0000644000000000000000000000043212631326456027064 0ustar 00000000000000var pz = require('../promzard') var path = require('path') var file = path.resolve(__dirname, 'substack-input.js') var ctx = { basename: path.basename(path.dirname(file)) } pz(file, ctx, function (er, res) { if (er) throw er console.error(JSON.stringify(res, null, 2)) }) npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/example/npm-init/0000755000000000000000000000000012631326456027153 5ustar 00000000000000npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/example/substack-input.js0000644000000000000000000000341312631326456030733 0ustar 00000000000000module.exports = { "name" : basename.replace(/^node-/, ''), "version" : "0.0.0", "description" : (function (cb) { var fs = require('fs'); var value; try { var src = fs.readFileSync('README.markdown', 'utf8'); value = src.split('\n').filter(function (line) { return /\s+/.test(line) && line.trim() !== basename.replace(/^node-/, '') ; })[0] .trim() .replace(/^./, function (c) { return c.toLowerCase() }) .replace(/\.$/, '') ; } catch (e) {} return prompt('description', value); 
})(), "main" : prompt('entry point', 'index.js'), "bin" : function (cb) { var path = require('path'); var fs = require('fs'); var exists = fs.exists || path.exists; exists('bin/cmd.js', function (ex) { var bin if (ex) { var bin = {} bin[basename.replace(/^node-/, '')] = 'bin/cmd.js' } cb(null, bin); }); }, "directories" : { "example" : "example", "test" : "test" }, "dependencies" : {}, "devDependencies" : { "tap" : "~0.2.5" }, "scripts" : { "test" : "tap test/*.js" }, "repository" : { "type" : "git", "url" : "git://github.com/substack/" + basename + ".git" }, "homepage" : "https://github.com/substack/" + basename, "keywords" : prompt(function (s) { return s.split(/\s+/) }), "author" : { "name" : "James Halliday", "email" : "mail@substack.net", "url" : "http://substack.net" }, "license" : "MIT", "engine" : { "node" : ">=0.6" } } npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/example/npm-init/README.md0000644000000000000000000000023412631326456030431 0ustar 00000000000000# npm-init An initter you init wit, innit? ## More stuff here Blerp derp herp lerg borgle pop munch efemerate baz foo a gandt synergy jorka chatt slurm. npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/example/npm-init/init-input.js0000644000000000000000000001365212631326456031620 0ustar 00000000000000var fs = require('fs') var path = require('path'); module.exports = { "name" : prompt('name', typeof name === 'undefined' ? basename.replace(/^node-|[.-]js$/g, ''): name), "version" : prompt('version', typeof version !== "undefined" ? version : '0.0.0'), "description" : (function () { if (typeof description !== 'undefined' && description) { return description } var value; try { var src = fs.readFileSync('README.md', 'utf8'); value = src.split('\n').filter(function (line) { return /\s+/.test(line) && line.trim() !== basename.replace(/^node-/, '') && !line.trim().match(/^#/) ; })[0] .trim() .replace(/^./, function (c) { return c.toLowerCase() }) .replace(/\.$/, '') ; } catch (e) { try { // Wouldn't it be nice if that file mattered? var d = fs.readFileSync('.git/description', 'utf8') } catch (e) {} if (d.trim() && !value) value = d } return prompt('description', value); })(), "main" : (function () { var f try { f = fs.readdirSync(dirname).filter(function (f) { return f.match(/\.js$/) }) if (f.indexOf('index.js') !== -1) f = 'index.js' else if (f.indexOf('main.js') !== -1) f = 'main.js' else if (f.indexOf(basename + '.js') !== -1) f = basename + '.js' else f = f[0] } catch (e) {} return prompt('entry point', f || 'index.js') })(), "bin" : function (cb) { fs.readdir(dirname + '/bin', function (er, d) { // no bins if (er) return cb() // just take the first js file we find there, or nada return cb(null, d.filter(function (f) { return f.match(/\.js$/) })[0]) }) }, "directories" : function (cb) { fs.readdir('.', function (er, dirs) { if (er) return cb(er) var res = {} dirs.forEach(function (d) { switch (d) { case 'example': case 'examples': return res.example = d case 'test': case 'tests': return res.test = d case 'doc': case 'docs': return res.doc = d case 'man': return res.man = d } }) if (Object.keys(res).length === 0) res = undefined return cb(null, res) }) }, "dependencies" : typeof dependencies !== 'undefined' ? 
dependencies : function (cb) { fs.readdir('node_modules', function (er, dir) { if (er) return cb() var deps = {} var n = dir.length dir.forEach(function (d) { if (d.match(/^\./)) return next() if (d.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/)) return next() fs.readFile('node_modules/' + d + '/package.json', function (er, p) { if (er) return next() try { p = JSON.parse(p) } catch (e) { return next() } if (!p.version) return next() deps[d] = '~' + p.version return next() }) }) function next () { if (--n === 0) return cb(null, deps) } }) }, "devDependencies" : typeof devDependencies !== 'undefined' ? devDependencies : function (cb) { // same as dependencies but for dev deps fs.readdir('node_modules', function (er, dir) { if (er) return cb() var deps = {} var n = dir.length dir.forEach(function (d) { if (d.match(/^\./)) return next() if (!d.match(/^(expresso|mocha|tap|coffee-script|coco|streamline)$/)) return next() fs.readFile('node_modules/' + d + '/package.json', function (er, p) { if (er) return next() try { p = JSON.parse(p) } catch (e) { return next() } if (!p.version) return next() deps[d] = '~' + p.version return next() }) }) function next () { if (--n === 0) return cb(null, deps) } }) }, "scripts" : (function () { // check to see what framework is in use, if any try { var d = fs.readdirSync('node_modules') } catch (e) { d = [] } var s = typeof scripts === 'undefined' ? {} : scripts if (d.indexOf('coffee-script') !== -1) s.prepublish = prompt('build command', s.prepublish || 'coffee src/*.coffee -o lib') var notest = 'echo "Error: no test specified" && exit 1' function tx (test) { return test || notest } if (!s.test || s.test === notest) { if (d.indexOf('tap') !== -1) s.test = prompt('test command', 'tap test/*.js', tx) else if (d.indexOf('expresso') !== -1) s.test = prompt('test command', 'expresso test', tx) else if (d.indexOf('mocha') !== -1) s.test = prompt('test command', 'mocha', tx) else s.test = prompt('test command', tx) } return s })(), "repository" : (function () { try { var gconf = fs.readFileSync('.git/config') } catch (e) { gconf = null } if (gconf) { gconf = gconf.split(/\r?\n/) var i = gconf.indexOf('[remote "origin"]') if (i !== -1) { var u = gconf[i + 1] if (!u.match(/^\s*url =/)) u = gconf[i + 2] if (!u.match(/^\s*url =/)) u = null else u = u.replace(/^\s*url = /, '') } if (u && u.match(/^git@github.com:/)) u = u.replace(/^git@github.com:/, 'git://github.com/') } return prompt('git repository', u) })(), "keywords" : prompt(function (s) { if (!s) return undefined if (Array.isArray(s)) s = s.join(' ') if (typeof s !== 'string') return s return s.split(/[\s,]+/) }), "author" : config['init.author.name'] ? 
{ "name" : config['init.author.name'], "email" : config['init.author.email'], "url" : config['init.author.url'] } : undefined, "license" : prompt('license', 'BSD') } npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/example/npm-init/init.js0000644000000000000000000000201012631326456030445 0ustar 00000000000000var PZ = require('../../promzard').PromZard var path = require('path') var input = path.resolve(__dirname, 'init-input.js') var fs = require('fs') var package = path.resolve(__dirname, 'package.json') var pkg fs.readFile(package, 'utf8', function (er, d) { if (er) ctx = {} try { ctx = JSON.parse(d); pkg = JSON.parse(d) } catch (e) { ctx = {} } ctx.dirname = path.dirname(package) ctx.basename = path.basename(ctx.dirname) if (!ctx.version) ctx.version = undefined // this should be replaced with the npm conf object ctx.config = {} console.error('ctx=', ctx) var pz = new PZ(input, ctx) pz.on('data', function (data) { console.error('pz data', data) if (!pkg) pkg = {} Object.keys(data).forEach(function (k) { if (data[k] !== undefined && data[k] !== null) pkg[k] = data[k] }) console.error('package data %s', JSON.stringify(data, null, 2)) fs.writeFile(package, JSON.stringify(pkg, null, 2), function (er) { if (er) throw er console.log('ok') }) }) }) npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/example/npm-init/package.json0000644000000000000000000000026312631326456031442 0ustar 00000000000000{ "name": "npm-init", "version": "0.0.0", "description": "an initter you init wit, innit?", "main": "index.js", "scripts": { "test": "asdf" }, "license": "BSD" }npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/basic.js0000644000000000000000000000407212631326456026366 0ustar 00000000000000var tap = require('tap') var pz = require('../promzard.js') var spawn = require('child_process').spawn tap.test('run the example', function (t) { var example = require.resolve('../example/index.js') var node = process.execPath var expect = { "name": "example", "version": "0.0.0", "description": "testing description", "main": "test-entry.js", "directories": { "example": "example", "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "~0.2.5" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/substack/example.git" }, "homepage": "https://github.com/substack/example", "keywords": [ "fugazi", "function", "waiting", "room" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "engine": { "node": ">=0.6" } } console.error('%s %s', node, example) var c = spawn(node, [example], { customFds: [-1,-1,-1] }) var output = '' c.stdout.on('data', function (d) { output += d respond() }) var actual = '' c.stderr.on('data', function (d) { actual += d }) function respond () { console.error('respond', output) if (output.match(/description: $/)) { c.stdin.write('testing description\n') return } if (output.match(/entry point: \(index\.js\) $/)) { c.stdin.write('test-entry.js\n') return } if (output.match(/keywords: $/)) { c.stdin.write('fugazi function waiting room\n') // "read" module is weird on node >= 0.10 when not a TTY // requires explicit ending for reasons. // could dig in, but really just wanna make tests pass, whatever. 
c.stdin.end() return } } c.on('exit', function () { console.error('exit event') }) c.on('close', function () { console.error('actual', actual) actual = JSON.parse(actual) t.deepEqual(actual, expect) t.end() }) }) npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/buffer.js0000644000000000000000000000362612631326456026562 0ustar 00000000000000var tap = require('tap') var pz = require('../promzard.js') var spawn = require('child_process').spawn tap.test('run the example using a buffer', function (t) { var example = require.resolve('../example/buffer.js') var node = process.execPath var expect = { "name": "example", "version": "0.0.0", "description": "testing description", "main": "test-entry.js", "directories": { "example": "example", "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "~0.2.5" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/substack/example.git" }, "homepage": "https://github.com/substack/example", "keywords": [ "fugazi", "function", "waiting", "room" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "engine": { "node": ">=0.6" } } var c = spawn(node, [example], { customFds: [-1,-1,-1] }) var output = '' c.stdout.on('data', function (d) { output += d respond() }) var actual = '' c.stderr.on('data', function (d) { actual += d }) function respond () { if (output.match(/description: $/)) { c.stdin.write('testing description\n') return } if (output.match(/entry point: \(index\.js\) $/)) { c.stdin.write('test-entry.js\n') return } if (output.match(/keywords: $/)) { c.stdin.write('fugazi function waiting room\n') // "read" module is weird on node >= 0.10 when not a TTY // requires explicit ending for reasons. // could dig in, but really just wanna make tests pass, whatever. c.stdin.end() return } } c.on('close', function () { actual = JSON.parse(actual) t.deepEqual(actual, expect) t.end() }) }) npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/exports.input0000644000000000000000000000021612631326456027530 0ustar 00000000000000exports.a = 1 + 2 exports.b = prompt('To be or not to be?', '!2b') exports.c = {} exports.c.x = prompt() exports.c.y = tmpdir + "/y/file.txt" npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/exports.js0000644000000000000000000000164312631326456027012 0ustar 00000000000000var test = require('tap').test; var promzard = require('../'); if (process.argv[2] === 'child') { return child() } test('exports', function (t) { t.plan(1); var spawn = require('child_process').spawn var child = spawn(process.execPath, [__filename, 'child']) var output = '' child.stderr.on('data', function (c) { output += c }) setTimeout(function () { child.stdin.write('\n'); }, 100) setTimeout(function () { child.stdin.end('55\n'); }, 200) child.on('close', function () { console.error('output=%j', output) output = JSON.parse(output) t.same({ a : 3, b : '!2b', c : { x : 55, y : '/tmp/y/file.txt', } }, output); t.end() }) }); function child () { var ctx = { tmpdir : '/tmp' } var file = __dirname + '/exports.input'; promzard(file, ctx, function (err, output) { console.error(JSON.stringify(output)) }); } npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/fn.input0000644000000000000000000000067412631326456026437 0ustar 00000000000000var fs = require('fs') module.exports = { "a": 1 + 2, "b": prompt('To be or not to be?', '!2b', function (s) { return s.toUpperCase() + '...' 
}), "c": { "x": prompt(function (x) { return x * 100 }), "y": tmpdir + "/y/file.txt" }, a_function: function (cb) { fs.readFile(__filename, 'utf8', cb) }, asyncPrompt: function (cb) { return cb(null, prompt('a prompt at any other time would smell as sweet')) } } npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/fn.js0000644000000000000000000000220312631326456025702 0ustar 00000000000000var test = require('tap').test; var promzard = require('../'); var fs = require('fs') var file = __dirname + '/fn.input'; var expect = { a : 3, b : '!2B...', c : { x : 5500, y : '/tmp/y/file.txt', } } expect.a_function = fs.readFileSync(file, 'utf8') expect.asyncPrompt = 'async prompt' if (process.argv[2] === 'child') { return child() } test('prompt callback param', function (t) { t.plan(1); var spawn = require('child_process').spawn var child = spawn(process.execPath, [__filename, 'child']) var output = '' child.stderr.on('data', function (c) { output += c }) child.on('close', function () { console.error('output=%j', output) output = JSON.parse(output) t.same(output, expect); t.end() }) setTimeout(function () { child.stdin.write('\n') }, 100) setTimeout(function () { child.stdin.write('55\n') }, 150) setTimeout(function () { child.stdin.end('async prompt\n') }, 200) }) function child () { var ctx = { tmpdir : '/tmp' } var file = __dirname + '/fn.input'; promzard(file, ctx, function (err, output) { console.error(JSON.stringify(output)) }) } npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/simple.input0000644000000000000000000000022012631326456027310 0ustar 00000000000000module.exports = { "a": 1 + 2, "b": prompt('To be or not to be?', '!2b'), "c": { "x": prompt(), "y": tmpdir + "/y/file.txt" } } npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/simple.js0000644000000000000000000000123512631326456026574 0ustar 00000000000000var test = require('tap').test; var promzard = require('../'); test('simple', function (t) { t.plan(1); var ctx = { tmpdir : '/tmp' } var file = __dirname + '/simple.input'; promzard(file, ctx, function (err, output) { t.same( { a : 3, b : '!2b', c : { x : 55, y : '/tmp/y/file.txt', } }, output ); }); setTimeout(function () { process.stdin.emit('data', '\n'); }, 100); setTimeout(function () { process.stdin.emit('data', '55\n'); }, 200); }); npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/validate.input0000644000000000000000000000026212631326456027616 0ustar 00000000000000module.exports = { "name": prompt("name", function (data) { if (data === 'cool') return data var er = new Error('not cool') er.notValid = true return er }) } npm_3.5.2.orig/node_modules/init-package-json/node_modules/promzard/test/validate.js0000644000000000000000000000072512631326456027077 0ustar 00000000000000 var promzard = require('../') var test = require('tap').test test('validate', function (t) { t.plan(2) var ctx = { tmpdir : '/tmp' } var file = __dirname + '/validate.input' promzard(file, ctx, function (er, found) { t.ok(!er) var wanted = { name: 'cool' } t.same(found, wanted) }) setTimeout(function () { process.stdin.emit('data', 'not cool\n') }, 100) setTimeout(function () { process.stdin.emit('data', 'cool\n') }, 200) }) npm_3.5.2.orig/node_modules/init-package-json/test/basic.input0000644000000000000000000000117312631326456022575 0ustar 00000000000000var assert = require('assert') exports.name = prompt('name', package.name || basename) exports.version = '1.2.5' exports.description = prompt('description', 
package.description) exports.author = 'npmbot (http://npm.im)' exports.scripts = package.scripts || {} exports.scripts.test = 'make test' exports.main = package.main || 'main.js' exports.config = JSON.parse(JSON.stringify(config)) try {delete exports.config.config}catch(e){} try {delete exports.package.config}catch(e){} try {delete exports.package.package}catch(e){} try {delete exports.config.package}catch(e){} exports.package = JSON.parse(JSON.stringify(package)) npm_3.5.2.orig/node_modules/init-package-json/test/basic.js0000644000000000000000000000151712631326456022054 0ustar 00000000000000var common = require('./lib/common') var init = require('../') var path = require('path') var rimraf = require('rimraf') var test = require('tap').test test('the basics', function (t) { var i = path.join(__dirname, 'basic.input') rimraf.sync(__dirname + '/package.json') init(__dirname, i, { foo: 'bar' }, function (er, data) { if (er) throw er var expect = { name: 'the-name', version: '1.2.5', description: 'description', author: 'npmbot (http://npm.im)', scripts: { test: 'make test' }, main: 'main.js', config: { foo: 'bar' }, package: {} } console.log('') t.same(data, expect) t.end() }) common.drive([ 'the-name\n', 'description\n', 'yes\n' ]) }) test('teardown', function (t) { rimraf(__dirname + '/package.json', t.end.bind(t)) }) npm_3.5.2.orig/node_modules/init-package-json/test/lib/0000755000000000000000000000000012631326456021177 5ustar 00000000000000npm_3.5.2.orig/node_modules/init-package-json/test/license.js0000644000000000000000000000141412631326456022411 0ustar 00000000000000var test = require('tap').test var init = require('../') var rimraf = require('rimraf') var common = require('./lib/common') test('license', function (t) { init(__dirname, '', {}, function (er, data) { if (er) throw er var wanted = { name: 'the-name', version: '1.0.0', description: '', scripts: { test: 'echo "Error: no test specified" && exit 1' }, license: 'Apache-2.0', author: '', main: 'basic.js' } console.log('') t.has(data, wanted) t.end() }) common.drive([ 'the-name\n', '\n', '\n', '\n', '\n', '\n', '\n', '\n', 'Apache\n', 'Apache-2.0\n', 'yes\n' ]) }) test('teardown', function (t) { rimraf(__dirname + '/package.json', t.end.bind(t)) }) npm_3.5.2.orig/node_modules/init-package-json/test/name-spaces.js0000644000000000000000000000144612631326456023170 0ustar 00000000000000var test = require('tap').test var init = require('../') var rimraf = require('rimraf') var common = require('./lib/common') test('spaces', function (t) { rimraf.sync(__dirname + '/package.json') init(__dirname, '', {}, function (er, data) { if (er) throw er var wanted = { name: 'the-name', version: '1.0.0', description: '', scripts: { test: 'echo "Error: no test specified" && exit 1' }, license: 'ISC', author: '', main: 'basic.js' } console.log('') t.has(data, wanted) t.end() }) common.drive([ 'the name\n', 'the-name\n', '\n', '\n', '\n', '\n', '\n', '\n', '\n', '\n', 'yes\n' ]) }) test('teardown', function (t) { rimraf(__dirname + '/package.json', t.end.bind(t)) }) npm_3.5.2.orig/node_modules/init-package-json/test/name-uppercase.js0000644000000000000000000000137712631326456023704 0ustar 00000000000000var test = require('tap').test var init = require('../') var rimraf = require('rimraf') var common = require('./lib/common') test('uppercase', function (t) { init(__dirname, '', {}, function (er, data) { if (er) throw er var wanted = { name: 'the-name', version: '1.0.0', description: '', scripts: { test: 'echo "Error: no test specified" && exit 1' }, 
license: 'ISC', author: '', main: 'basic.js' } console.log('') t.has(data, wanted) t.end() }) common.drive([ 'THE-NAME\n', 'the-name\n', '\n', '\n', '\n', '\n', '\n', '\n', '\n', '\n', 'yes\n' ]) }) test('teardown', function (t) { rimraf(__dirname + '/package.json', t.end.bind(t)) }) npm_3.5.2.orig/node_modules/init-package-json/test/npm-defaults.js0000644000000000000000000000552612631326456023376 0ustar 00000000000000var test = require('tap').test var rimraf = require('rimraf') var resolve = require('path').resolve var npm = require('npm') var init = require('../') var EXPECTED = { name: 'test', version: '3.1.4', description: '', main: 'basic.js', scripts: { test: 'echo "Error: no test specified" && exit 1' }, keywords: [], author: 'npmbot (http://npm.im/)', license: 'WTFPL' } test('npm configuration values pulled from environment', function (t) { /*eslint camelcase:0 */ process.env.npm_config_yes = 'yes' process.env.npm_config_init_author_name = 'npmbot' process.env.npm_config_init_author_email = 'n@p.m' process.env.npm_config_init_author_url = 'http://npm.im' process.env.npm_config_init_license = EXPECTED.license process.env.npm_config_init_version = EXPECTED.version npm.load({}, function (err) { t.ifError(err, 'npm loaded successfully') // clear out dotted names from test environment npm.config.del('init.author.name') npm.config.del('init.author.email') npm.config.del('init.author.url') // the following have npm defaults, and need to be explicitly overridden npm.config.set('init.license', '') npm.config.set('init.version', '') process.chdir(resolve(__dirname)) init(__dirname, __dirname, npm.config, function (er, data) { t.ifError(err, 'init ran successfully') t.same(data, EXPECTED, 'got the package data from the environment') t.end() }) }) }) test('npm configuration values pulled from dotted config', function (t) { /*eslint camelcase:0 */ var config = { yes: 'yes', 'init.author.name': 'npmbot', 'init.author.email': 'n@p.m', 'init.author.url': 'http://npm.im', 'init.license': EXPECTED.license, 'init.version': EXPECTED.version } npm.load(config, function (err) { t.ifError(err, 'npm loaded successfully') process.chdir(resolve(__dirname)) init(__dirname, __dirname, npm.config, function (er, data) { t.ifError(err, 'init ran successfully') t.same(data, EXPECTED, 'got the package data from the config') t.end() }) }) }) test('npm configuration values pulled from dashed config', function (t) { /*eslint camelcase:0 */ var config = { yes: 'yes', 'init-author-name': 'npmbot', 'init-author-email': 'n@p.m', 'init-author-url': 'http://npm.im', 'init-license': EXPECTED.license, 'init-version': EXPECTED.version } npm.load(config, function (err) { t.ifError(err, 'npm loaded successfully') process.chdir(resolve(__dirname)) init(__dirname, __dirname, npm.config, function (er, data) { t.ifError(err, 'init ran successfully') t.same(data, EXPECTED, 'got the package data from the config') t.end() }) }) }) test('cleanup', function (t) { rimraf.sync(resolve(__dirname, 'package.json')) t.pass('cleaned up') t.end() }) npm_3.5.2.orig/node_modules/init-package-json/test/scope-in-config-existing-name.js0000644000000000000000000000141612631326456026517 0ustar 00000000000000var fs = require('fs') var path = require('path') var rimraf = require('rimraf') var tap = require('tap') var init = require('../') var json = { name: '@already/scoped', version: '1.0.0' } tap.test('with existing package.json', function (t) { fs.writeFileSync(path.join(__dirname, 'package.json'), JSON.stringify(json, null, 2)) 
console.log(fs.readFileSync(path.join(__dirname, 'package.json'), 'utf8')) console.error('wrote json', json) init(__dirname, __dirname, { yes: 'yes', scope: '@still' }, function (er, data) { if (er) throw er console.log('') t.equal(data.name, '@still/scoped', 'new scope is added, basic name is kept') t.end() }) }) tap.test('teardown', function (t) { rimraf.sync(path.join(__dirname, 'package.json')) t.end() }) npm_3.5.2.orig/node_modules/init-package-json/test/scope-in-config.js0000644000000000000000000000125412631326456023751 0ustar 00000000000000var fs = require('fs') var path = require('path') var rimraf = require('rimraf') var tap = require('tap') var init = require('../') var EXPECT = { name: '@scoped/test', version: '1.0.0', description: '', author: '', scripts: { test: 'echo \"Error: no test specified\" && exit 1' }, main: 'basic.js', keywords: [], license: 'ISC' } tap.test('--yes with scope', function (t) { init(__dirname, __dirname, { yes: 'yes', scope: '@scoped' }, function (er, data) { if (er) throw er console.log('') t.has(data, EXPECT) t.end() }) }) tap.test('teardown', function (t) { rimraf.sync(path.join(__dirname, 'package.json')) t.end() }) npm_3.5.2.orig/node_modules/init-package-json/test/scope.js0000644000000000000000000000160512631326456022102 0ustar 00000000000000var tap = require('tap') var init = require('../') var rimraf = require('rimraf') var EXPECT = { name: '@foo/test', version: '1.2.5', description: 'description', author: 'npmbot (http://npm.im)', scripts: { test: 'make test' }, main: 'main.js', config: { scope: '@foo' }, package: {} } tap.test('the scope', function (t) { var i = __dirname + '/basic.input' var dir = __dirname init(dir, i, {scope: '@foo'}, function (er, data) { if (er) throw er console.log('') t.has(data, EXPECT) t.end() }) setTimeout(function () { process.stdin.emit('data', '@foo/test\n') }, 50) setTimeout(function () { process.stdin.emit('data', 'description\n') }, 100) setTimeout(function () { process.stdin.emit('data', 'yes\n') }, 150) }) tap.test('teardown', function (t) { rimraf(__dirname + '/package.json', t.end.bind(t)) }) npm_3.5.2.orig/node_modules/init-package-json/test/silent.js0000644000000000000000000000075312631326456022272 0ustar 00000000000000var tap = require('tap') var init = require('../') var rimraf = require('rimraf') var log = console.log var logged = false console.log = function () { logged = true } tap.test('silent: true', function (t) { init(__dirname, __dirname, {yes: 'yes', silent: true}, function (er, data) { if (er) throw er t.false(logged, 'did not print anything') t.end() }) }) tap.test('teardown', function (t) { console.log = log rimraf(__dirname + '/package.json', t.end.bind(t)) }) npm_3.5.2.orig/node_modules/init-package-json/test/yes-defaults.js0000644000000000000000000000112412631326456023372 0ustar 00000000000000var tap = require('tap') var init = require('../') var rimraf = require('rimraf') var EXPECT = { name: 'test', version: '1.0.0', description: '', author: '', scripts: { test: 'echo "Error: no test specified" && exit 1' }, main: 'basic.js', keywords: [], license: 'ISC' } tap.test('--yes defaults', function (t) { init(__dirname, __dirname, {yes: 'yes'}, function (er, data) { if (er) throw er t.has(data, EXPECT, 'used the default data') t.end() }) }) tap.test('teardown', function (t) { rimraf(__dirname + '/package.json', t.end.bind(t)) }) npm_3.5.2.orig/node_modules/init-package-json/test/lib/common.js0000644000000000000000000000076612631326456023036 0ustar 00000000000000module.exports.drive = drive var 
semver = require('semver') function drive (input) { var stdin = process.stdin function emit (chunk, ms) { setTimeout(function () { stdin.emit('data', chunk) }, ms) } if (semver.gte(process.versions.node, '0.11.0')) { input.forEach(function (chunk) { stdin.push(chunk) }) } else { stdin.once('readable', function () { var ms = 0 input.forEach(function (chunk) { emit(chunk, ms += 50) }) }) } } npm_3.5.2.orig/node_modules/lockfile/LICENSE0000644000000000000000000000137512631326456016752 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/lockfile/README.md0000644000000000000000000000403612631326456017221 0ustar 00000000000000# lockfile A very polite lock file utility, which endeavors to not litter, and to wait patiently for others. ## Usage ```javascript var lockFile = require('lockfile') // opts is optional, and defaults to {} lockFile.lock('some-file.lock', opts, function (er) { // if the er happens, then it failed to acquire a lock. // if there was not an error, then the file was created, // and won't be deleted until we unlock it. // do my stuff, free of interruptions // then, some time later, do: lockFile.unlock('some-file.lock', function (er) { // er means that an error happened, and is probably bad. }) }) ``` ## Methods Sync methods return the value/throw the error, others don't. Standard node fs stuff. All known locks are removed when the process exits. Of course, it's possible for certain types of failures to cause this to fail, but a best effort is made to not be a litterbug. ### lockFile.lock(path, [opts], cb) Acquire a file lock on the specified path ### lockFile.lockSync(path, [opts]) Acquire a file lock on the specified path ### lockFile.unlock(path, cb) Close and unlink the lockfile. ### lockFile.unlockSync(path) Close and unlink the lockfile. ### lockFile.check(path, [opts], cb) Check if the lockfile is locked and not stale. Callback is called with `cb(error, isLocked)`. ### lockFile.checkSync(path, [opts]) Check if the lockfile is locked and not stale. Returns boolean. ## Options ### opts.wait A number of milliseconds to wait for locks to expire before giving up. Only used by lockFile.lock. Poll for `opts.wait` ms. If the lock is not cleared by the time the wait expires, then it returns with the original error. ### opts.pollPeriod When using `opts.wait`, this is the period in ms in which it polls to check if the lock has expired. Defaults to `100`. ### opts.stale A number of milliseconds before locks are considered to have expired. ### opts.retries Used by lock and lockSync. Retry `n` number of times before giving up. ### opts.retryWait Used by lock. Wait `n` milliseconds before retrying. 
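## Example: combining the options

A minimal sketch (the lock file name and timings are illustrative, not prescriptive) showing how the options above work together — poll while waiting for the holder to finish, treat very old locks as abandoned, and retry the whole acquisition a few times:

```javascript
var lockFile = require('lockfile')

var opts = {
  wait: 10000,      // keep polling for up to 10s before giving up with the original error
  pollPeriod: 250,  // while waiting, re-check every 250ms instead of the default 100ms
  stale: 60000,     // a lock untouched for more than 60s is considered expired
  retries: 3,       // retry the acquisition up to 3 times...
  retryWait: 500    // ...pausing 500ms before each retry
}

lockFile.lock('build.lock', opts, function (er) {
  if (er) return console.error('could not acquire build.lock:', er)
  // ... do the work that needed exclusive access ...
  lockFile.unlock('build.lock', function (er) {
    if (er) console.error('failed to release build.lock:', er)
  })
})
```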
npm_3.5.2.orig/node_modules/lockfile/lockfile.js0000644000000000000000000002004212631326456020063 0ustar 00000000000000var fs = require('fs') var wx = 'wx' if (process.version.match(/^v0\.[0-6]/)) { var c = require('constants') wx = c.O_TRUNC | c.O_CREAT | c.O_WRONLY | c.O_EXCL } var os = require('os') exports.filetime = 'ctime' if (os.platform() == "win32") { exports.filetime = 'mtime' } var debug var util = require('util') if (util.debuglog) debug = util.debuglog('LOCKFILE') else if (/\blockfile\b/i.test(process.env.NODE_DEBUG)) debug = function() { var msg = util.format.apply(util, arguments) console.error('LOCKFILE %d %s', process.pid, msg) } else debug = function() {} var locks = {} function hasOwnProperty (obj, prop) { return Object.prototype.hasOwnProperty.call(obj, prop) } process.on('exit', function () { debug('exit listener') // cleanup Object.keys(locks).forEach(exports.unlockSync) }) // XXX https://github.com/joyent/node/issues/3555 // Remove when node 0.8 is deprecated. if (/^v0\.[0-8]\./.test(process.version)) { debug('uncaughtException, version = %s', process.version) process.on('uncaughtException', function H (er) { debug('uncaughtException') var l = process.listeners('uncaughtException').filter(function (h) { return h !== H }) if (!l.length) { // cleanup try { Object.keys(locks).forEach(exports.unlockSync) } catch (e) {} process.removeListener('uncaughtException', H) throw er } }) } exports.unlock = function (path, cb) { debug('unlock', path) // best-effort. unlocking an already-unlocked lock is a noop delete locks[path] fs.unlink(path, function (unlinkEr) { cb() }) } exports.unlockSync = function (path) { debug('unlockSync', path) // best-effort. unlocking an already-unlocked lock is a noop try { fs.unlinkSync(path) } catch (er) {} delete locks[path] } // if the file can be opened in readonly mode, then it's there. // if the error is something other than ENOENT, then it's not. exports.check = function (path, opts, cb) { if (typeof opts === 'function') cb = opts, opts = {} debug('check', path, opts) fs.open(path, 'r', function (er, fd) { if (er) { if (er.code !== 'ENOENT') return cb(er) return cb(null, false) } if (!opts.stale) { return fs.close(fd, function (er) { return cb(er, true) }) } fs.fstat(fd, function (er, st) { if (er) return fs.close(fd, function (er2) { return cb(er) }) fs.close(fd, function (er) { var age = Date.now() - st[exports.filetime].getTime() return cb(er, age <= opts.stale) }) }) }) } exports.checkSync = function (path, opts) { opts = opts || {} debug('checkSync', path, opts) if (opts.wait) { throw new Error('opts.wait not supported sync for obvious reasons') } try { var fd = fs.openSync(path, 'r') } catch (er) { if (er.code !== 'ENOENT') throw er return false } if (!opts.stale) { try { fs.closeSync(fd) } catch (er) {} return true } // file exists. 
however, might be stale if (opts.stale) { try { var st = fs.fstatSync(fd) } finally { fs.closeSync(fd) } var age = Date.now() - st[exports.filetime].getTime() return (age <= opts.stale) } } var req = 1 exports.lock = function (path, opts, cb) { if (typeof opts === 'function') cb = opts, opts = {} opts.req = opts.req || req++ debug('lock', path, opts) opts.start = opts.start || Date.now() if (typeof opts.retries === 'number' && opts.retries > 0) { debug('has retries', opts.retries) var retries = opts.retries opts.retries = 0 cb = (function (orig) { return function cb (er, fd) { debug('retry-mutated callback') retries -= 1 if (!er || retries < 0) return orig(er, fd) debug('lock retry', path, opts) if (opts.retryWait) setTimeout(retry, opts.retryWait) else retry() function retry () { opts.start = Date.now() debug('retrying', opts.start) exports.lock(path, opts, cb) } }})(cb) } // try to engage the lock. // if this succeeds, then we're in business. fs.open(path, wx, function (er, fd) { if (!er) { debug('locked', path, fd) locks[path] = fd return fs.close(fd, function () { return cb() }) } // something other than "currently locked" // maybe eperm or something. if (er.code !== 'EEXIST') return cb(er) // someone's got this one. see if it's valid. if (!opts.stale) return notStale(er, path, opts, cb) return maybeStale(er, path, opts, false, cb) }) } // Staleness checking algorithm // 1. acquire $lock, fail // 2. stat $lock, find that it is stale // 3. acquire $lock.STALE // 4. stat $lock, assert that it is still stale // 5. unlink $lock // 6. link $lock.STALE $lock // 7. unlink $lock.STALE // On any failure, clean up whatever we've done, and raise the error. function maybeStale (originalEr, path, opts, hasStaleLock, cb) { fs.stat(path, function (statEr, st) { if (statEr) { if (statEr.code === 'ENOENT') { // expired already! opts.stale = false debug('lock stale enoent retry', path, opts) exports.lock(path, opts, cb) return } return cb(statEr) } var age = Date.now() - st[exports.filetime].getTime() if (age <= opts.stale) return notStale(originalEr, path, opts, cb) debug('lock stale', path, opts) if (hasStaleLock) { exports.unlock(path, function (er) { if (er) return cb(er) debug('lock stale retry', path, opts) fs.link(path + '.STALE', path, function (er) { fs.unlink(path + '.STALE', function () { // best effort. if the unlink fails, oh well. 
cb(er) }) }) }) } else { debug('acquire .STALE file lock', opts) exports.lock(path + '.STALE', opts, function (er) { if (er) return cb(er) maybeStale(originalEr, path, opts, true, cb) }) } }) } function notStale (er, path, opts, cb) { debug('notStale', path, opts) // if we can't wait, then just call it a failure if (typeof opts.wait !== 'number' || opts.wait <= 0) return cb(er) // poll for some ms for the lock to clear var now = Date.now() var start = opts.start || now var end = start + opts.wait if (end <= now) return cb(er) debug('now=%d, wait until %d (delta=%d)', start, end, end-start) var wait = Math.min(end - start, opts.pollPeriod || 100) var timer = setTimeout(poll, wait) function poll () { debug('notStale, polling', path, opts) exports.lock(path, opts, cb) } } exports.lockSync = function (path, opts) { opts = opts || {} opts.req = opts.req || req++ debug('lockSync', path, opts) if (opts.wait || opts.retryWait) { throw new Error('opts.wait not supported sync for obvious reasons') } try { var fd = fs.openSync(path, wx) locks[path] = fd try { fs.closeSync(fd) } catch (er) {} debug('locked sync!', path, fd) return } catch (er) { if (er.code !== 'EEXIST') return retryThrow(path, opts, er) if (opts.stale) { var st = fs.statSync(path) var ct = st[exports.filetime].getTime() if (!(ct % 1000) && (opts.stale % 1000)) { // probably don't have subsecond resolution. // round up the staleness indicator. // Yes, this will be wrong 1/1000 times on platforms // with subsecond stat precision, but that's acceptable // in exchange for not mistakenly removing locks on // most other systems. opts.stale = 1000 * Math.ceil(opts.stale / 1000) } var age = Date.now() - ct if (age > opts.stale) { debug('lockSync stale', path, opts, age) exports.unlockSync(path) return exports.lockSync(path, opts) } } // failed to lock! debug('failed to lock', path, opts, er) return retryThrow(path, opts, er) } } function retryThrow (path, opts, er) { if (typeof opts.retries === 'number' && opts.retries > 0) { var newRT = opts.retries - 1 debug('retryThrow', path, opts, newRT) opts.retries = newRT return exports.lockSync(path, opts) } throw er } npm_3.5.2.orig/node_modules/lockfile/package.json0000644000000000000000000000302112631326456020221 0ustar 00000000000000{ "name": "lockfile", "version": "1.0.1", "main": "lockfile.js", "directories": { "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "~0.2.5", "touch": "0" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/lockfile.git" }, "keywords": [ "lockfile", "lock", "file", "fs", "O_EXCL" ], "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "description": "A very polite lock file utility, which endeavors to not litter, and to wait patiently for others.", "gitHead": "9d338ed8e3e3a166955d051f6b5fb6bb1e563ca8", "bugs": { "url": "https://github.com/isaacs/lockfile/issues" }, "homepage": "https://github.com/isaacs/lockfile#readme", "_id": "lockfile@1.0.1", "_shasum": "9d353ecfe3f54d150bb57f89d51746935a39c4f5", "_from": "lockfile@>=1.0.1 <1.1.0", "_npmVersion": "2.10.0", "_nodeVersion": "2.0.1", "_npmUser": { "name": "isaacs", "email": "isaacs@npmjs.com" }, "dist": { "shasum": "9d353ecfe3f54d150bb57f89d51746935a39c4f5", "tarball": "http://registry.npmjs.org/lockfile/-/lockfile-1.0.1.tgz" }, "maintainers": [ { "name": "trevorburnham", "email": "trevorburnham@gmail.com" }, { "name": "isaacs", "email": "i@izs.me" } ], "_resolved": "https://registry.npmjs.org/lockfile/-/lockfile-1.0.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/lockfile/test/0000755000000000000000000000000012631326456016716 5ustar 00000000000000npm_3.5.2.orig/node_modules/lockfile/test/basic.js0000644000000000000000000002063212631326456020340 0ustar 00000000000000var test = require('tap').test var lockFile = require('../lockfile.js') var path = require('path') var fs = require('fs') var touch = require('touch') // On Unix systems, it uses ctime by default for staleness checks, since it's // the most reliable. However, because this test artificially sets some locks // to an earlier time to simulate staleness, we use mtime here. lockFile.filetime = 'mtime' test('setup', function (t) { try { lockFile.unlockSync('basic-lock') } catch (er) {} try { lockFile.unlockSync('sync-lock') } catch (er) {} try { lockFile.unlockSync('never-forget') } catch (er) {} try { lockFile.unlockSync('stale-lock') } catch (er) {} try { lockFile.unlockSync('watch-lock') } catch (er) {} try { lockFile.unlockSync('retry-lock') } catch (er) {} try { lockFile.unlockSync('contentious-lock') } catch (er) {} try { lockFile.unlockSync('stale-wait-lock') } catch (er) {} try { lockFile.unlockSync('stale-windows-lock') } catch (er) {} t.end() }) test('lock contention', function (t) { var gotlocks = 0; var N = 200 var delay = 10 // allow for some time for each lock acquisition and release. // note that raising N higher will mean that the overhead // increases, because we're creating more and more watchers. // irl, you should never have several hundred contenders for a // single lock, so this situation is somewhat pathological. 
var overhead = 200 var wait = N * overhead + delay // first make it locked, so that everyone has to wait lockFile.lock('contentious-lock', function(er, lock) { t.ifError(er, 'acquiring starter') if (er) throw er; t.pass('acquired starter lock') setTimeout(function() { lockFile.unlock('contentious-lock', function (er) { t.ifError(er, 'unlocking starter') if (er) throw er t.pass('unlocked starter') }) }, delay) }) for (var i=0; i < N; i++) lockFile.lock('contentious-lock', { wait: wait }, function(er, lock) { if (er) throw er; lockFile.unlock('contentious-lock', function(er) { if (er) throw er gotlocks++ t.pass('locked and unlocked #' + gotlocks) if (gotlocks === N) { t.pass('got all locks') t.end() } }) }) }) test('basic test', function (t) { lockFile.check('basic-lock', function (er, locked) { if (er) throw er t.notOk(locked) lockFile.lock('basic-lock', function (er) { if (er) throw er lockFile.lock('basic-lock', function (er) { t.ok(er) lockFile.check('basic-lock', function (er, locked) { if (er) throw er t.ok(locked) lockFile.unlock('basic-lock', function (er) { if (er) throw er lockFile.check('basic-lock', function (er, locked) { if (er) throw er t.notOk(locked) t.end() }) }) }) }) }) }) }) test('sync test', function (t) { var locked locked = lockFile.checkSync('sync-lock') t.notOk(locked) lockFile.lockSync('sync-lock') locked = lockFile.checkSync('sync-lock') t.ok(locked) lockFile.unlockSync('sync-lock') locked = lockFile.checkSync('sync-lock') t.notOk(locked) t.end() }) test('exit cleanup test', function (t) { var child = require.resolve('./fixtures/child.js') var node = process.execPath var spawn = require('child_process').spawn spawn(node, [child]).on('exit', function () { setTimeout(function () { var locked = lockFile.checkSync('never-forget') t.notOk(locked) t.end() }, 100) }) }) test('error exit cleanup test', function (t) { var child = require.resolve('./fixtures/bad-child.js') var node = process.execPath var spawn = require('child_process').spawn spawn(node, [child]).on('exit', function () { setTimeout(function () { var locked = lockFile.checkSync('never-forget') t.notOk(locked) t.end() }, 100) }) }) test('staleness test', function (t) { lockFile.lock('stale-lock', function (er) { if (er) throw er // simulate 2s old touch.sync('stale-lock', { time: new Date(Date.now() - 2000) }) var opts = { stale: 1 } lockFile.check('stale-lock', opts, function (er, locked) { if (er) throw er t.notOk(locked) lockFile.lock('stale-lock', opts, function (er) { if (er) throw er lockFile.unlock('stale-lock', function (er) { if (er) throw er t.end() }) }) }) }) }) test('staleness sync test', function (t) { var opts = { stale: 1 } lockFile.lockSync('stale-lock') // simulate 2s old touch.sync('stale-lock', { time: new Date(Date.now() - 2000) }) var locked locked = lockFile.checkSync('stale-lock', opts) t.notOk(locked) lockFile.lockSync('stale-lock', opts) lockFile.unlockSync('stale-lock') t.end() }) test('retries', function (t) { // next 5 opens will fail. var opens = 5 fs._open = fs.open fs.open = function (path, mode, cb) { if (--opens === 0) { fs.open = fs._open return fs.open(path, mode, cb) } var er = new Error('bogus') // to be, or not to be, that is the question. er.code = opens % 2 ? 'EEXIST' : 'ENOENT' process.nextTick(cb.bind(null, er)) } lockFile.lock('retry-lock', { retries: opens }, function (er) { if (er) throw er t.equal(opens, 0) lockFile.unlockSync('retry-lock') t.end() }) }) test('retryWait', function (t) { // next 5 opens will fail. 
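// (Mechanism, for reference: as in the 'retries' test above, fs.open is
// temporarily replaced so each attempt fails with alternating EEXIST/ENOENT
// codes until the budget of failed opens is spent; with retryWait set,
// lockfile additionally pauses 100ms before each retry.)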
var opens = 5 fs._open = fs.open fs.open = function (path, mode, cb) { if (--opens === 0) { fs.open = fs._open return fs.open(path, mode, cb) } var er = new Error('bogus') // to be, or not to be, that is the question. er.code = opens % 2 ? 'EEXIST' : 'ENOENT' process.nextTick(cb.bind(null, er)) } var opts = { retries: opens, retryWait: 100 } lockFile.lock('retry-lock', opts, function (er) { if (er) throw er t.equal(opens, 0) lockFile.unlockSync('retry-lock') t.end() }) }) test('retry sync', function (t) { // next 5 opens will fail. var opens = 5 fs._openSync = fs.openSync fs.openSync = function (path, mode) { if (--opens === 0) { fs.openSync = fs._openSync return fs.openSync(path, mode) } var er = new Error('bogus') // to be, or not to be, that is the question. er.code = opens % 2 ? 'EEXIST' : 'ENOENT' throw er } var opts = { retries: opens } lockFile.lockSync('retry-lock', opts) t.equal(opens, 0) lockFile.unlockSync('retry-lock') t.end() }) test('wait and stale together', function (t) { // first locker. var interval lockFile.lock('stale-wait-lock', function(er) { // keep refreshing the lock, so we keep it forever interval = setInterval(function() { touch.sync('stale-wait-lock') }, 10) // try to get another lock. this must fail! var opt = { stale: 1000, wait: 2000, pollInterval: 1000 } lockFile.lock('stale-wait-lock', opt, function (er) { if (!er) t.fail('got second lock? that unpossible!') else t.pass('second lock failed, as i have foreseen it') clearInterval(interval) t.end() }) }) }) test('stale windows file tunneling test', function (t) { // for windows only // nt file system tunneling feature will make file creation time not updated var opts = { stale: 1000 } lockFile.lockSync('stale-windows-lock') touch.sync('stale-windows-lock', { time: new Date(Date.now() - 3000) }) var locked lockFile.unlockSync('stale-windows-lock') lockFile.lockSync('stale-windows-lock', opts) locked = lockFile.checkSync('stale-windows-lock', opts) t.ok(locked, "should be locked and not stale") lockFile.lock('stale-windows-lock', opts, function (er) { if (!er) t.fail('got second lock? impossible, windows file tunneling problem!') else t.pass('second lock failed, windows file tunneling problem fixed') t.end() }) }) test('cleanup', function (t) { try { lockFile.unlockSync('basic-lock') } catch (er) {} try { lockFile.unlockSync('sync-lock') } catch (er) {} try { lockFile.unlockSync('never-forget') } catch (er) {} try { lockFile.unlockSync('stale-lock') } catch (er) {} try { lockFile.unlockSync('watch-lock') } catch (er) {} try { lockFile.unlockSync('retry-lock') } catch (er) {} try { lockFile.unlockSync('contentious-lock') } catch (er) {} try { lockFile.unlockSync('stale-wait-lock') } catch (er) {} try { lockFile.unlockSync('stale-windows-lock') } catch (er) {} t.end() }) npm_3.5.2.orig/node_modules/lockfile/test/fixtures/0000755000000000000000000000000012631326456020567 5ustar 00000000000000npm_3.5.2.orig/node_modules/lockfile/test/retry-time.js0000644000000000000000000000271112631326456021356 0ustar 00000000000000// In these tests, we do the following: // try for 100ms (rt=2) // wait for 100ms // try for 100ms (rt=1) // wait for 100ms // try for 100ms (rt=0) // fail after 500 // Actual time will be slightly over 500 for setTimeout irregularity // But it should NOT be as slow as the 2000ms failsafe.
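// Worked timing budget, using the constants defined just below:
//   EXPECTTIME = (RETRYWAIT * RETRIES) + (WAIT * (RETRIES + 1))
//              = (100 * 2) + (100 * 3)
//              = 500ms
//   TOOLONG    = EXPECTTIME * 1.1 = 550ms
// Each run asserts EXPECTTIME <= elapsed < TOOLONG.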
var lockFile = require('../') var touch = require('touch') var test = require('tap').test var fs = require('fs') var RETRYWAIT = 100 var WAIT = 100 var RETRIES = 2 var EXPECTTIME = (RETRYWAIT * RETRIES) + (WAIT * (RETRIES + 1)) var TOOLONG = EXPECTTIME * 1.1 test('setup', function (t) { touch.sync('file.lock') t.end() }) var pollPeriods = [10, 100, 10000] pollPeriods.forEach(function (pp) { test('retry+wait, poll=' + pp, function (t) { var ended = false var timer = setTimeout(function() { t.fail('taking too long!') ended = true t.end() }, 2000) timer.unref() var start = Date.now() lockFile.lock('file.lock', { wait: WAIT, retries: RETRIES, retryWait: RETRYWAIT, pollPeriod: pp }, function (er) { if (ended) return var time = Date.now() - start console.error('t=%d', time) t.ok(time >= EXPECTTIME, 'should take at least ' + EXPECTTIME) t.ok(time < TOOLONG, 'should take less than ' + TOOLONG) clearTimeout(timer) t.end() }) }) }) test('cleanup', function (t) { fs.unlinkSync('file.lock') t.end() setTimeout(function() { process.exit(1) }, 500).unref() }) npm_3.5.2.orig/node_modules/lockfile/test/stale-contention.js0000644000000000000000000000407012631326456022543 0ustar 00000000000000var fs = require('fs') var lockFile = require('../') var test = require('tap').test var path = require('path') var lock = path.resolve(__dirname, 'stale.lock') var touch = require('touch') var spawn = require('child_process').spawn var node = process.execPath // We're using a lockfile with an artificially old date, // so make it use that instead of ctime. // Probably you should never do this in production! lockFile.filetime = 'mtime' if (process.argv[2] === 'child') { return child() } function child () { // Make fs.stat take 100ms to return its data // This is important because, in a test scenario where // we're statting the same exact file rapid-fire like this, // it'll end up being cached by the FS, and never trigger // the race condition we're trying to expose. 
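// Exit-code protocol for each spawned child (checked by the 'contenders'
// test further down):
//   0  -> this child reclaimed the stale lock (exactly one should win)
//   17 -> this child saw EEXIST, i.e. it lost the race
//   anything else -> an unexpected error was thrown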
fs.stat = function (stat) { return function () { var args = [].slice.call(arguments) var cb = args.pop() stat.apply(fs, args.concat(function(er, st) { setTimeout(function () { cb(er, st) }, 100) })) }}(fs.stat) lockFile.lock(lock, { stale: 100000 }, function (er) { if (er && er.code !== 'EEXIST') throw er else if (er) process.exit(17) else setTimeout(function(){}, 500) }) } test('create stale file', function (t) { try { fs.unlinkSync(lock) } catch (er) {} touch.sync(lock, { time: '1979-07-01T19:10:00.000Z' }) t.end() }) test('contenders', function (t) { var n = 10 var fails = 0 var wins = 0 var args = [ __filename, 'child' ] var opt = { stdio: [0, "pipe", 2] } for (var i = 0; i < n; i++) { spawn(node, args, opt).on('close', then) } function then (code) { if (code === 17) { fails ++ } else if (code) { t.fail("unexpected failure", code) fails ++ } else { wins ++ } if (fails + wins === n) { done() } } function done () { t.equal(wins, 1, "should have 1 lock winner") t.equal(fails, n - 1, "all others should lose") t.end() } }) test('remove stale file', function (t) { try { fs.unlinkSync(lock) } catch (er) {} t.end() }) npm_3.5.2.orig/node_modules/lockfile/test/fixtures/bad-child.js0000644000000000000000000000015612631326456022736 0ustar 00000000000000var lockFile = require('../../lockfile.js') lockFile.lockSync('never-forget') throw new Error('waaaaaaaaa') npm_3.5.2.orig/node_modules/lockfile/test/fixtures/child.js0000644000000000000000000000013312631326456022205 0ustar 00000000000000var lockFile = require('../../lockfile.js') lockFile.lock('never-forget', function () {}) npm_3.5.2.orig/node_modules/lodash._baseindexof/LICENSE.txt0000644000000000000000000000232712631326456021675 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js 1.7.0, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash._baseindexof/README.md0000644000000000000000000000104512631326456021325 0ustar 00000000000000# lodash._baseindexof v3.1.0 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseIndexOf` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. 
## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._baseindexof ``` In Node.js/io.js: ```js var baseIndexOf = require('lodash._baseindexof'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.1.0-npm-packages/lodash._baseindexof) for more details. npm_3.5.2.orig/node_modules/lodash._baseindexof/index.js0000644000000000000000000000333312631326456021515 0ustar 00000000000000/** * lodash 3.1.0 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.2 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * The base implementation of `_.indexOf` without support for binary searches. * * @private * @param {Array} array The array to search. * @param {*} value The value to search for. * @param {number} fromIndex The index to search from. * @returns {number} Returns the index of the matched value, else `-1`. */ function baseIndexOf(array, value, fromIndex) { if (value !== value) { return indexOfNaN(array, fromIndex); } var index = fromIndex - 1, length = array.length; while (++index < length) { if (array[index] === value) { return index; } } return -1; } /** * Gets the index at which the first occurrence of `NaN` is found in `array`. * If `fromRight` is provided elements of `array` are iterated from right to left. * * @private * @param {Array} array The array to search. * @param {number} fromIndex The index to search from. * @param {boolean} [fromRight] Specify iterating from right to left. * @returns {number} Returns the index of the matched `NaN`, else `-1`. */ function indexOfNaN(array, fromIndex, fromRight) { var length = array.length, index = fromIndex + (fromRight ? 0 : -1); while ((fromRight ? 
index-- : ++index < length)) { var other = array[index]; if (other !== other) { return index; } } return -1; } module.exports = baseIndexOf; npm_3.5.2.orig/node_modules/lodash._baseindexof/package.json0000644000000000000000000000414112631326456022334 0ustar 00000000000000{ "name": "lodash._baseindexof", "version": "3.1.0", "description": "The modern build of lodash’s internal `baseIndexOf` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash._baseindexof v3.1.0\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseIndexOf` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._baseindexof\n```\n\nIn Node.js/io.js:\n\n```js\nvar baseIndexOf = require('lodash._baseindexof');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.1.0-npm-packages/lodash._baseindexof) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._baseindexof@3.1.0", "_shasum": "fe52b53a1c6761e42618d654e4a25789ed61822c", "_resolved": "https://registry.npmjs.org/lodash._baseindexof/-/lodash._baseindexof-3.1.0.tgz", "_from": "lodash._baseindexof@3.1.0" } npm_3.5.2.orig/node_modules/lodash._baseuniq/LICENSE0000644000000000000000000000232112631326456020371 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
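As a quick orientation before the next module's files, here is a small usage sketch of the `baseIndexOf` helper defined in the `lodash._baseindexof` source above (illustrative only; the `NaN` case works because `NaN !== NaN`, which routes the search through `indexOfNaN`):

```js
var baseIndexOf = require('lodash._baseindexof');

baseIndexOf([1, 2, 3], 2, 0);     // => 1  (strict-equality scan)
baseIndexOf([1, NaN, 3], NaN, 0); // => 1  (found via indexOfNaN)
baseIndexOf([1, 2, 3], 4, 0);     // => -1 (not present)
```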
npm_3.5.2.orig/node_modules/lodash._baseuniq/README.md0000644000000000000000000000102312631326456020641 0ustar 00000000000000# lodash._baseuniq v3.0.3 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseUniq` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._baseuniq ``` In Node.js/io.js: ```js var baseUniq = require('lodash._baseuniq'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.3-npm-packages/lodash._baseuniq) for more details. npm_3.5.2.orig/node_modules/lodash._baseuniq/index.js0000644000000000000000000000365512631326456021044 0ustar 00000000000000/** * lodash 3.0.3 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseIndexOf = require('lodash._baseindexof'), cacheIndexOf = require('lodash._cacheindexof'), createCache = require('lodash._createcache'); /** Used as the size to enable large array optimizations. */ var LARGE_ARRAY_SIZE = 200; /** * The base implementation of `_.uniq` without support for callback shorthands * and `this` binding. * * @private * @param {Array} array The array to inspect. * @param {Function} [iteratee] The function invoked per iteration. * @returns {Array} Returns the new duplicate-value-free array. */ function baseUniq(array, iteratee) { var index = -1, indexOf = baseIndexOf, length = array.length, isCommon = true, isLarge = isCommon && length >= LARGE_ARRAY_SIZE, seen = isLarge ? createCache() : null, result = []; if (seen) { indexOf = cacheIndexOf; isCommon = false; } else { isLarge = false; seen = iteratee ? [] : result; } outer: while (++index < length) { var value = array[index], computed = iteratee ? 
iteratee(value, index, array) : value; if (isCommon && value === value) { var seenIndex = seen.length; while (seenIndex--) { if (seen[seenIndex] === computed) { continue outer; } } if (iteratee) { seen.push(computed); } result.push(value); } else if (indexOf(seen, computed, 0) < 0) { if (iteratee || isLarge) { seen.push(computed); } result.push(value); } } return result; } module.exports = baseUniq; npm_3.5.2.orig/node_modules/lodash._baseuniq/package.json0000644000000000000000000000456412631326456021665 0ustar 00000000000000{ "name": "lodash._baseuniq", "version": "3.0.3", "description": "The modern build of lodash’s internal `baseUniq` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._baseindexof": "^3.0.0", "lodash._cacheindexof": "^3.0.0", "lodash._createcache": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._baseuniq@3.0.3", "_shasum": "2123fa0db2d69c28d5beb1c1f36d61522a740234", "_from": "lodash._baseuniq@3.0.3", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" }, { "name": "d10", "email": "demoneaux@gmail.com" } ], "dist": { "shasum": "2123fa0db2d69c28d5beb1c1f36d61522a740234", "tarball": "http://registry.npmjs.org/lodash._baseuniq/-/lodash._baseuniq-3.0.3.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._baseuniq/-/lodash._baseuniq-3.0.3.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/lodash._bindcallback/LICENSE.txt0000644000000000000000000000232112631326456021771 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash._bindcallback/README.md0000644000000000000000000000105312631326456021426 0ustar 00000000000000# lodash._bindcallback v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `bindCallback` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._bindcallback ``` In Node.js/io.js: ```js var bindCallback = require('lodash._bindcallback'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._bindcallback) for more details. npm_3.5.2.orig/node_modules/lodash._bindcallback/index.js0000644000000000000000000000356412631326456021625 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * A specialized version of `baseCallback` which only supports `this` binding * and specifying the number of arguments to provide to `func`. * * @private * @param {Function} func The function to bind. * @param {*} thisArg The `this` binding of `func`. * @param {number} [argCount] The number of arguments to provide to `func`. * @returns {Function} Returns the callback. */ function bindCallback(func, thisArg, argCount) { if (typeof func != 'function') { return identity; } if (thisArg === undefined) { return func; } switch (argCount) { case 1: return function(value) { return func.call(thisArg, value); }; case 3: return function(value, index, collection) { return func.call(thisArg, value, index, collection); }; case 4: return function(accumulator, value, index, collection) { return func.call(thisArg, accumulator, value, index, collection); }; case 5: return function(value, other, key, object, source) { return func.call(thisArg, value, other, key, object, source); }; } return function() { return func.apply(thisArg, arguments); }; } /** * This method returns the first argument provided to it. * * @static * @memberOf _ * @category Utility * @param {*} value Any value. * @returns {*} Returns `value`. 
* @example * * var object = { 'user': 'fred' }; * * _.identity(object) === object; * // => true */ function identity(value) { return value; } module.exports = bindCallback; npm_3.5.2.orig/node_modules/lodash._bindcallback/package.json0000644000000000000000000000415512631326456022443 0ustar 00000000000000{ "name": "lodash._bindcallback", "version": "3.0.1", "description": "The modern build of lodash’s internal `bindCallback` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash._bindcallback v3.0.1\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `bindCallback` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._bindcallback\n```\n\nIn Node.js/io.js:\n\n```js\nvar bindCallback = require('lodash._bindcallback');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._bindcallback) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._bindcallback@3.0.1", "_shasum": "e531c27644cf8b57a99e17ed95b35c748789392e", "_resolved": "https://registry.npmjs.org/lodash._bindcallback/-/lodash._bindcallback-3.0.1.tgz", "_from": "lodash._bindcallback@3.0.1" } npm_3.5.2.orig/node_modules/lodash._cacheindexof/LICENSE.txt0000644000000000000000000000232112631326456022020 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash._cacheindexof/README.md0000644000000000000000000000105312631326456021455 0ustar 00000000000000# lodash._cacheindexof v3.0.2 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `cacheIndexOf` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._cacheindexof ``` In Node.js/io.js: ```js var cacheIndexOf = require('lodash._cacheindexof'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.2-npm-packages/lodash._cacheindexof) for more details. npm_3.5.2.orig/node_modules/lodash._cacheindexof/index.js0000644000000000000000000000316712631326456021653 0ustar 00000000000000/** * lodash 3.0.2 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Checks if `value` is in `cache` mimicking the return signature of * `_.indexOf` by returning `0` if the value is found, else `-1`. * * @private * @param {Object} cache The cache to search. * @param {*} value The value to search for. * @returns {number} Returns `0` if `value` is found, else `-1`. */ function cacheIndexOf(cache, value) { var data = cache.data, result = (typeof value == 'string' || isObject(value)) ? data.set.has(value) : data.hash[value]; return result ? 0 : -1; } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. * @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details. 
var type = typeof value; return !!value && (type == 'object' || type == 'function'); } module.exports = cacheIndexOf; npm_3.5.2.orig/node_modules/lodash._cacheindexof/package.json0000644000000000000000000000415512631326456022472 0ustar 00000000000000{ "name": "lodash._cacheindexof", "version": "3.0.2", "description": "The modern build of lodash’s internal `cacheIndexOf` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash._cacheindexof v3.0.2\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `cacheIndexOf` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._cacheindexof\n```\n\nIn Node.js/io.js:\n\n```js\nvar cacheIndexOf = require('lodash._cacheindexof');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.0.2-npm-packages/lodash._cacheindexof) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._cacheindexof@3.0.2", "_shasum": "3dc69ac82498d2ee5e3ce56091bafd2adc7bde92", "_resolved": "https://registry.npmjs.org/lodash._cacheindexof/-/lodash._cacheindexof-3.0.2.tgz", "_from": "lodash._cacheindexof@3.0.2" } npm_3.5.2.orig/node_modules/lodash._createcache/LICENSE0000644000000000000000000000232112631326456021011 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
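A hedged sketch of how `lodash._cacheindexof` (source above) and `lodash._createcache` (source follows below) cooperate, mirroring their use in `baseUniq`; treat it as illustrative, not canonical:

```js
var createCache = require('lodash._createcache'),
    cacheIndexOf = require('lodash._cacheindexof');

// createCache returns a SetCache only when native Object.create and Set
// are available; otherwise it returns null, so callers must check.
var cache = createCache(['a', 'b']);
if (cache) {
  cacheIndexOf(cache, 'a'); // => 0  ("found", mimicking _.indexOf)
  cacheIndexOf(cache, 'z'); // => -1 (absent)
  cache.push('z');          // add a value, as baseUniq does with `seen`
  cacheIndexOf(cache, 'z'); // => 0
}
```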
npm_3.5.2.orig/node_modules/lodash._createcache/README.md0000644000000000000000000000104512631326456021265 0ustar 00000000000000# lodash._createcache v3.1.2 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createCache` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._createcache ``` In Node.js/io.js: ```js var createCache = require('lodash._createcache'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.1.2-npm-packages/lodash._createcache) for more details. npm_3.5.2.orig/node_modules/lodash._createcache/index.js0000644000000000000000000000457512631326456021466 0ustar 00000000000000/** * lodash 3.1.2 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var getNative = require('lodash._getnative'); /** Native method references. */ var Set = getNative(global, 'Set'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCreate = getNative(Object, 'create'); /** * * Creates a cache object to store unique values. * * @private * @param {Array} [values] The values to cache. */ function SetCache(values) { var length = values ? values.length : 0; this.data = { 'hash': nativeCreate(null), 'set': new Set }; while (length--) { this.push(values[length]); } } /** * Adds `value` to the cache. * * @private * @name push * @memberOf SetCache * @param {*} value The value to cache. */ function cachePush(value) { var data = this.data; if (typeof value == 'string' || isObject(value)) { data.set.add(value); } else { data.hash[value] = true; } } /** * Creates a `Set` cache object to optimize linear searches of large arrays. * * @private * @param {Array} [values] The values to cache. * @returns {null|Object} Returns the new cache object if `Set` is supported, else `null`. */ function createCache(values) { return (nativeCreate && Set) ? new SetCache(values) : null; } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. * @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details. var type = typeof value; return !!value && (type == 'object' || type == 'function'); } // Add functions to the `Set` cache. 
SetCache.prototype.push = cachePush; module.exports = createCache; npm_3.5.2.orig/node_modules/lodash._createcache/package.json0000644000000000000000000000423412631326456022277 0ustar 00000000000000{ "name": "lodash._createcache", "version": "3.1.2", "description": "The modern build of lodash’s internal `createCache` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._getnative": "^3.0.0" }, "readme": "# lodash._createcache v3.1.2\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createCache` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._createcache\n```\n\nIn Node.js/io.js:\n\n```js\nvar createCache = require('lodash._createcache');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.1.2-npm-packages/lodash._createcache) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._createcache@3.1.2", "_shasum": "56d6a064017625e79ebca6b8018e17440bdcf093", "_resolved": "https://registry.npmjs.org/lodash._createcache/-/lodash._createcache-3.1.2.tgz", "_from": "lodash._createcache@3.1.2" } npm_3.5.2.orig/node_modules/lodash._getnative/LICENSE0000644000000000000000000000232112631326456020550 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
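A brief usage sketch of `getNative` (the `lodash._getnative` source follows below); it returns an object's property only when that property is a native function, else `undefined`:

```js
var getNative = require('lodash._getnative');

var nativeCreate = getNative(Object, 'create'); // => the native function
var NativeSet = getNative(global, 'Set');       // => undefined on engines without Set

// User-land functions are rejected:
getNative({ create: function () {} }, 'create'); // => undefined
```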
npm_3.5.2.orig/node_modules/lodash._getnative/README.md0000644000000000000000000000103112631326456021017 0ustar 00000000000000# lodash._getnative v3.9.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `getNative` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._getnative ``` In Node.js/io.js: ```js var getNative = require('lodash._getnative'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.9.1-npm-packages/lodash._getnative) for more details. npm_3.5.2.orig/node_modules/lodash._getnative/index.js0000644000000000000000000000743612631326456021224 0ustar 00000000000000/** * lodash 3.9.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** `Object#toString` result references. */ var funcTag = '[object Function]'; /** Used to detect host constructors (Safari > 5). */ var reIsHostCtor = /^\[object .+?Constructor\]$/; /** * Checks if `value` is object-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is object-like, else `false`. */ function isObjectLike(value) { return !!value && typeof value == 'object'; } /** Used for native method references. */ var objectProto = Object.prototype; /** Used to resolve the decompiled source of functions. */ var fnToString = Function.prototype.toString; /** Used to check objects for own properties. */ var hasOwnProperty = objectProto.hasOwnProperty; /** * Used to resolve the [`toStringTag`](http://ecma-international.org/ecma-262/6.0/#sec-object.prototype.tostring) * of values. */ var objToString = objectProto.toString; /** Used to detect if a method is native. */ var reIsNative = RegExp('^' + fnToString.call(hasOwnProperty).replace(/[\\^$.*+?()[\]{}|]/g, '\\$&') .replace(/hasOwnProperty|(function).*?(?=\\\()| for .+?(?=\\\])/g, '$1.*?') + '$' ); /** * Gets the native function at `key` of `object`. * * @private * @param {Object} object The object to query. * @param {string} key The key of the method to get. * @returns {*} Returns the function if it's native, else `undefined`. */ function getNative(object, key) { var value = object == null ? undefined : object[key]; return isNative(value) ? value : undefined; } /** * Checks if `value` is classified as a `Function` object. * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is correctly classified, else `false`. * @example * * _.isFunction(_); * // => true * * _.isFunction(/abc/); * // => false */ function isFunction(value) { // The use of `Object#toString` avoids issues with the `typeof` operator // in older versions of Chrome and Safari which return 'function' for regexes // and Safari 8 equivalents which return 'object' for typed array constructors. return isObject(value) && objToString.call(value) == funcTag; } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. 
* @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details. var type = typeof value; return !!value && (type == 'object' || type == 'function'); } /** * Checks if `value` is a native function. * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is a native function, else `false`. * @example * * _.isNative(Array.prototype.push); * // => true * * _.isNative(_); * // => false */ function isNative(value) { if (value == null) { return false; } if (isFunction(value)) { return reIsNative.test(fnToString.call(value)); } return isObjectLike(value) && reIsHostCtor.test(value); } module.exports = getNative; npm_3.5.2.orig/node_modules/lodash._getnative/package.json0000644000000000000000000000411112631326456022030 0ustar 00000000000000{ "name": "lodash._getnative", "version": "3.9.1", "description": "The modern build of lodash’s internal `getNative` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash._getnative v3.9.1\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `getNative` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._getnative\n```\n\nIn Node.js/io.js:\n\n```js\nvar getNative = require('lodash._getnative');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.9.1-npm-packages/lodash._getnative) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._getnative@3.9.1", "_shasum": "570bc7dede46d61cdcde687d65d3eecbaa3aaff5", "_resolved": "https://registry.npmjs.org/lodash._getnative/-/lodash._getnative-3.9.1.tgz", "_from": "lodash._getnative@3.9.1" } npm_3.5.2.orig/node_modules/lodash.clonedeep/LICENSE0000644000000000000000000000232112631326456020361 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons 
to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.clonedeep/README.md0000644000000000000000000000110412631326456020631 0ustar 00000000000000# lodash.clonedeep v3.0.2 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.cloneDeep` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.clonedeep ``` In Node.js/io.js: ```js var cloneDeep = require('lodash.clonedeep'); ``` See the [documentation](https://lodash.com/docs#cloneDeep) or [package source](https://github.com/lodash/lodash/blob/3.0.2-npm-packages/lodash.clonedeep) for more details. npm_3.5.2.orig/node_modules/lodash.clonedeep/index.js0000644000000000000000000000422212631326456021023 0ustar 00000000000000/** * lodash 3.0.2 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseClone = require('lodash._baseclone'), bindCallback = require('lodash._bindcallback'); /** * Creates a deep clone of `value`. If `customizer` is provided it's invoked * to produce the cloned values. If `customizer` returns `undefined` cloning * is handled by the method instead. The `customizer` is bound to `thisArg` * and invoked with up to three argument; (value [, index|key, object]). * * **Note:** This method is loosely based on the * [structured clone algorithm](http://www.w3.org/TR/html5/infrastructure.html#internal-structured-cloning-algorithm). * The enumerable properties of `arguments` objects and objects created by * constructors other than `Object` are cloned to plain `Object` objects. An * empty object is returned for uncloneable values such as functions, DOM nodes, * Maps, Sets, and WeakMaps. * * @static * @memberOf _ * @category Lang * @param {*} value The value to deep clone. * @param {Function} [customizer] The function to customize cloning values. * @param {*} [thisArg] The `this` binding of `customizer`. * @returns {*} Returns the deep cloned value. * @example * * var users = [ * { 'user': 'barney' }, * { 'user': 'fred' } * ]; * * var deep = _.cloneDeep(users); * deep[0] === users[0]; * // => false * * // using a customizer callback * var el = _.cloneDeep(document.body, function(value) { * if (_.isElement(value)) { * return value.cloneNode(true); * } * }); * * el === document.body * // => false * el.nodeName * // => BODY * el.childNodes.length; * // => 20 */ function cloneDeep(value, customizer, thisArg) { return typeof customizer == 'function' ? 
baseClone(value, true, bindCallback(customizer, thisArg, 3)) : baseClone(value, true); } module.exports = cloneDeep; npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/0000755000000000000000000000000012631326456022033 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/package.json0000644000000000000000000000445012631326456021647 0ustar 00000000000000{ "name": "lodash.clonedeep", "version": "3.0.2", "description": "The modern build of lodash’s `_.cloneDeep` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._baseclone": "^3.0.0", "lodash._bindcallback": "^3.0.0" }, "readme": "# lodash.clonedeep v3.0.2\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.cloneDeep` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash.clonedeep\n```\n\nIn Node.js/io.js:\n\n```js\nvar cloneDeep = require('lodash.clonedeep');\n```\n\nSee the [documentation](https://lodash.com/docs#cloneDeep) or [package source](https://github.com/lodash/lodash/blob/3.0.2-npm-packages/lodash.clonedeep) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.clonedeep@3.0.2", "_shasum": "a0a1e40d82a5ea89ff5b147b8444ed63d92827db", "_resolved": "https://registry.npmjs.org/lodash.clonedeep/-/lodash.clonedeep-3.0.2.tgz", "_from": "lodash.clonedeep@>=3.0.2 <3.1.0" } npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/0000755000000000000000000000000012631326456025376 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/LICENSE0000644000000000000000000000232112631326456026401 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/README.md0000644000000000000000000000103112631326456026650 0ustar 00000000000000# lodash._baseclone v3.3.0 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseClone` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._baseclone ``` In Node.js/io.js: ```js var baseClone = require('lodash._baseclone'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.3.0-npm-packages/lodash._baseclone) for more details. npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/index.js0000644000000000000000000002026512631326456027050 0ustar 00000000000000/** * lodash 3.3.0 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var arrayCopy = require('lodash._arraycopy'), arrayEach = require('lodash._arrayeach'), baseAssign = require('lodash._baseassign'), baseFor = require('lodash._basefor'), isArray = require('lodash.isarray'), keys = require('lodash.keys'); /** `Object#toString` result references. */ var argsTag = '[object Arguments]', arrayTag = '[object Array]', boolTag = '[object Boolean]', dateTag = '[object Date]', errorTag = '[object Error]', funcTag = '[object Function]', mapTag = '[object Map]', numberTag = '[object Number]', objectTag = '[object Object]', regexpTag = '[object RegExp]', setTag = '[object Set]', stringTag = '[object String]', weakMapTag = '[object WeakMap]'; var arrayBufferTag = '[object ArrayBuffer]', float32Tag = '[object Float32Array]', float64Tag = '[object Float64Array]', int8Tag = '[object Int8Array]', int16Tag = '[object Int16Array]', int32Tag = '[object Int32Array]', uint8Tag = '[object Uint8Array]', uint8ClampedTag = '[object Uint8ClampedArray]', uint16Tag = '[object Uint16Array]', uint32Tag = '[object Uint32Array]'; /** Used to match `RegExp` flags from their coerced string values. */ var reFlags = /\w*$/; /** Used to identify `toStringTag` values supported by `_.clone`. */ var cloneableTags = {}; cloneableTags[argsTag] = cloneableTags[arrayTag] = cloneableTags[arrayBufferTag] = cloneableTags[boolTag] = cloneableTags[dateTag] = cloneableTags[float32Tag] = cloneableTags[float64Tag] = cloneableTags[int8Tag] = cloneableTags[int16Tag] = cloneableTags[int32Tag] = cloneableTags[numberTag] = cloneableTags[objectTag] = cloneableTags[regexpTag] = cloneableTags[stringTag] = cloneableTags[uint8Tag] = cloneableTags[uint8ClampedTag] = cloneableTags[uint16Tag] = cloneableTags[uint32Tag] = true; cloneableTags[errorTag] = cloneableTags[funcTag] = cloneableTags[mapTag] = cloneableTags[setTag] = cloneableTags[weakMapTag] = false; /** Used for native method references. 
*/ var objectProto = Object.prototype; /** Used to check objects for own properties. */ var hasOwnProperty = objectProto.hasOwnProperty; /** * Used to resolve the [`toStringTag`](http://ecma-international.org/ecma-262/6.0/#sec-object.prototype.tostring) * of values. */ var objToString = objectProto.toString; /** Native method references. */ var ArrayBuffer = global.ArrayBuffer, Uint8Array = global.Uint8Array; /** * The base implementation of `_.clone` without support for argument juggling * and `this` binding `customizer` functions. * * @private * @param {*} value The value to clone. * @param {boolean} [isDeep] Specify a deep clone. * @param {Function} [customizer] The function to customize cloning values. * @param {string} [key] The key of `value`. * @param {Object} [object] The object `value` belongs to. * @param {Array} [stackA=[]] Tracks traversed source objects. * @param {Array} [stackB=[]] Associates clones with source counterparts. * @returns {*} Returns the cloned value. */ function baseClone(value, isDeep, customizer, key, object, stackA, stackB) { var result; if (customizer) { result = object ? customizer(value, key, object) : customizer(value); } if (result !== undefined) { return result; } if (!isObject(value)) { return value; } var isArr = isArray(value); if (isArr) { result = initCloneArray(value); if (!isDeep) { return arrayCopy(value, result); } } else { var tag = objToString.call(value), isFunc = tag == funcTag; if (tag == objectTag || tag == argsTag || (isFunc && !object)) { result = initCloneObject(isFunc ? {} : value); if (!isDeep) { return baseAssign(result, value); } } else { return cloneableTags[tag] ? initCloneByTag(value, tag, isDeep) : (object ? value : {}); } } // Check for circular references and return its corresponding clone. stackA || (stackA = []); stackB || (stackB = []); var length = stackA.length; while (length--) { if (stackA[length] == value) { return stackB[length]; } } // Add the source value to the stack of traversed objects and associate it with its clone. stackA.push(value); stackB.push(result); // Recursively populate clone (susceptible to call stack limits). (isArr ? arrayEach : baseForOwn)(value, function(subValue, key) { result[key] = baseClone(subValue, isDeep, customizer, key, value, stackA, stackB); }); return result; } /** * The base implementation of `_.forOwn` without support for callback * shorthands and `this` binding. * * @private * @param {Object} object The object to iterate over. * @param {Function} iteratee The function invoked per iteration. * @returns {Object} Returns `object`. */ function baseForOwn(object, iteratee) { return baseFor(object, iteratee, keys); } /** * Creates a clone of the given array buffer. * * @private * @param {ArrayBuffer} buffer The array buffer to clone. * @returns {ArrayBuffer} Returns the cloned array buffer. */ function bufferClone(buffer) { var result = new ArrayBuffer(buffer.byteLength), view = new Uint8Array(result); view.set(new Uint8Array(buffer)); return result; } /** * Initializes an array clone. * * @private * @param {Array} array The array to clone. * @returns {Array} Returns the initialized clone. */ function initCloneArray(array) { var length = array.length, result = new array.constructor(length); // Add array properties assigned by `RegExp#exec`. if (length && typeof array[0] == 'string' && hasOwnProperty.call(array, 'index')) { result.index = array.index; result.input = array.input; } return result; } /** * Initializes an object clone. 
* * @private * @param {Object} object The object to clone. * @returns {Object} Returns the initialized clone. */ function initCloneObject(object) { var Ctor = object.constructor; if (!(typeof Ctor == 'function' && Ctor instanceof Ctor)) { Ctor = Object; } return new Ctor; } /** * Initializes an object clone based on its `toStringTag`. * * **Note:** This function only supports cloning values with tags of * `Boolean`, `Date`, `Error`, `Number`, `RegExp`, or `String`. * * @private * @param {Object} object The object to clone. * @param {string} tag The `toStringTag` of the object to clone. * @param {boolean} [isDeep] Specify a deep clone. * @returns {Object} Returns the initialized clone. */ function initCloneByTag(object, tag, isDeep) { var Ctor = object.constructor; switch (tag) { case arrayBufferTag: return bufferClone(object); case boolTag: case dateTag: return new Ctor(+object); case float32Tag: case float64Tag: case int8Tag: case int16Tag: case int32Tag: case uint8Tag: case uint8ClampedTag: case uint16Tag: case uint32Tag: var buffer = object.buffer; return new Ctor(isDeep ? bufferClone(buffer) : buffer, object.byteOffset, object.length); case numberTag: case stringTag: return new Ctor(object); case regexpTag: var result = new Ctor(object.source, reFlags.exec(object)); result.lastIndex = object.lastIndex; } return result; } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. * @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details. 
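// Functions count as objects here: `typeof` yields 'function' for them and 'object' for arrays, plain objects, regexes, and boxed primitives; `!!value` filters out `null`.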
var type = typeof value; return !!value && (type == 'object' || type == 'function'); } module.exports = baseClone; npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/0000755000000000000000000000000012631326456030053 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/package.json0000644000000000000000000000446212631326456027672 0ustar 00000000000000{ "name": "lodash._baseclone", "version": "3.3.0", "description": "The modern build of lodash’s internal `baseClone` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._arraycopy": "^3.0.0", "lodash._arrayeach": "^3.0.0", "lodash._baseassign": "^3.0.0", "lodash._basefor": "^3.0.0", "lodash.isarray": "^3.0.0", "lodash.keys": "^3.0.0" }, "readme": "# lodash._baseclone v3.3.0\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseClone` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._baseclone\n```\n\nIn Node.js/io.js:\n\n```js\nvar baseClone = require('lodash._baseclone');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.3.0-npm-packages/lodash._baseclone) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._baseclone@3.3.0", "_shasum": "303519bf6393fe7e42f34d8b630ef7794e3542b7", "_resolved": "https://registry.npmjs.org/lodash._baseclone/-/lodash._baseclone-3.3.0.tgz", "_from": "lodash._baseclone@>=3.0.0 <4.0.0" } ././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arraycopy/npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arr0000755000000000000000000000000012631326456032170 5ustar 00000000000000././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arrayeach/npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arr0000755000000000000000000000000012631326456032170 5ustar 00000000000000././@LongLink0000000000000000000000000000015500000000000011216 Lustar 
00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000755000000000000000000000000012631326456032151 5ustar 00000000000000././@LongLink0000000000000000000000000000015200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._basefor/npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000755000000000000000000000000012631326456032151 5ustar 00000000000000././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arraycopy/LICENSE.txtnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arr0000644000000000000000000000232712631326456032176 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js 1.7.0, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arraycopy/README.mdnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arr0000644000000000000000000000103112631326456032165 0ustar 00000000000000# lodash._arraycopy v3.0.0 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `arrayCopy` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._arraycopy ``` In Node.js/io.js: ```js var arrayCopy = require('lodash._arraycopy'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.0-npm-packages/lodash._arraycopy) for more details. 
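For illustration only (not part of the original README) — a minimal usage sketch; the two-argument destination form follows from the implementation included below:

```js
// Hypothetical usage sketch for lodash._arraycopy; behavior inferred from its source.
var arrayCopy = require('lodash._arraycopy');

var source = [1, 2, 3];

// One argument: allocates a new array of the same length and fills it.
var copy = arrayCopy(source);
console.log(copy);            // => [1, 2, 3]
console.log(copy === source); // => false (a new array)

// Two arguments: copies into the supplied array and returns it.
var target = [];
console.log(arrayCopy(source, target) === target); // => true
```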
././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arraycopy/index.jsnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arr0000644000000000000000000000153712631326456032200 0ustar 00000000000000/** * lodash 3.0.0 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.7.0 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Copies the values of `source` to `array`. * * @private * @param {Array} source The array to copy values from. * @param {Array} [array=[]] The array to copy values to. * @returns {Array} Returns `array`. */ function arrayCopy(source, array) { var index = -1, length = source.length; array || (array = Array(length)); while (++index < length) { array[index] = source[index]; } return array; } module.exports = arrayCopy; ././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arraycopy/package.jsonnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arr0000644000000000000000000000412212631326456032171 0ustar 00000000000000{ "name": "lodash._arraycopy", "version": "3.0.0", "description": "The modern build of lodash’s internal `arrayCopy` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash._arraycopy v3.0.0\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `arrayCopy` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._arraycopy\n```\n\nIn Node.js/io.js:\n\n```js\nvar arrayCopy = require('lodash._arraycopy');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.0.0-npm-packages/lodash._arraycopy) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._arraycopy@3.0.0", "_shasum": "76e7b7c1f1fb92547374878a562ed06a3e50f6e1", "_resolved": "https://registry.npmjs.org/lodash._arraycopy/-/lodash._arraycopy-3.0.0.tgz", "_from": "lodash._arraycopy@>=3.0.0 <4.0.0" } ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 
00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arrayeach/LICENSE.txtnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arr0000644000000000000000000000232712631326456032176 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js 1.7.0, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arrayeach/README.mdnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arr0000644000000000000000000000103112631326456032165 0ustar 00000000000000# lodash._arrayeach v3.0.0 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `arrayEach` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._arrayeach ``` In Node.js/io.js: ```js var arrayEach = require('lodash._arrayeach'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.0-npm-packages/lodash._arrayeach) for more details. ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arrayeach/index.jsnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arr0000644000000000000000000000165612631326456032202 0ustar 00000000000000/** * lodash 3.0.0 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.7.0 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * A specialized version of `_.forEach` for arrays without support for callback * shorthands or `this` binding. * * @private * @param {Array} array The array to iterate over. * @param {Function} iteratee The function invoked per iteration. * @returns {Array} Returns `array`. 
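 * Iteration stops early if `iteratee` explicitly returns `false`.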
*/ function arrayEach(array, iteratee) { var index = -1, length = array.length; while (++index < length) { if (iteratee(array[index], index, array) === false) { break; } } return array; } module.exports = arrayEach; ././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arrayeach/package.jsonnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._arr0000644000000000000000000000412212631326456032171 0ustar 00000000000000{ "name": "lodash._arrayeach", "version": "3.0.0", "description": "The modern build of lodash’s internal `arrayEach` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash._arrayeach v3.0.0\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `arrayEach` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._arrayeach\n```\n\nIn Node.js/io.js:\n\n```js\nvar arrayEach = require('lodash._arrayeach');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.0.0-npm-packages/lodash._arrayeach) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._arrayeach@3.0.0", "_shasum": "bab156b2a90d3f1bbd5c653403349e5e5933ef9e", "_resolved": "https://registry.npmjs.org/lodash._arrayeach/-/lodash._arrayeach-3.0.0.tgz", "_from": "lodash._arrayeach@>=3.0.0 <4.0.0" } ././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/LICENSE.txtnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000232112631326456032151 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all 
copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/README.mdnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000103712631326456032154 0ustar 00000000000000# lodash._baseassign v3.2.0 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseAssign` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._baseassign ``` In Node.js/io.js: ```js var baseAssign = require('lodash._baseassign'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.2.0-npm-packages/lodash._baseassign) for more details. ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/index.jsnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000163712631326456032162 0ustar 00000000000000/** * lodash 3.2.0 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseCopy = require('lodash._basecopy'), keys = require('lodash.keys'); /** * The base implementation of `_.assign` without support for argument juggling, * multiple sources, and `customizer` functions. * * @private * @param {Object} object The destination object. * @param {Object} source The source object. * @returns {Object} Returns `object`. */ function baseAssign(object, source) { return source == null ? 
object : baseCopy(source, keys(source), object); } module.exports = baseAssign; ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/node_modules/npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000755000000000000000000000000012631326456032151 5ustar 00000000000000././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/package.jsonnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000426512631326456032162 0ustar 00000000000000{ "name": "lodash._baseassign", "version": "3.2.0", "description": "The modern build of lodash’s internal `baseAssign` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basecopy": "^3.0.0", "lodash.keys": "^3.0.0" }, "readme": "# lodash._baseassign v3.2.0\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseAssign` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._baseassign\n```\n\nIn Node.js/io.js:\n\n```js\nvar baseAssign = require('lodash._baseassign');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.2.0-npm-packages/lodash._baseassign) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._baseassign@3.2.0", "_shasum": "8c38a099500f215ad09e59f1722fd0c52bfe0a4e", "_resolved": "https://registry.npmjs.org/lodash._baseassign/-/lodash._baseassign-3.2.0.tgz", "_from": "lodash._baseassign@>=3.0.0 <4.0.0" } ././@LongLink0000000000000000000000000000021300000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/node_modules/lodash._basecopy/npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000755000000000000000000000000012631326456032151 5ustar 00000000000000././@LongLink0000000000000000000000000000022600000000000011215 Lustar 
00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/node_modules/lodash._basecopy/LICENSE.txtnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000232112631326456032151 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/node_modules/lodash._basecopy/README.mdnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000102312631326456032147 0ustar 00000000000000# lodash._basecopy v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseCopy` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basecopy ``` In Node.js/io.js: ```js var baseCopy = require('lodash._basecopy'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basecopy) for more details. ././@LongLink0000000000000000000000000000022300000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/node_modules/lodash._basecopy/index.jsnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000167212631326456032161 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Copies properties of `source` to `object`. * * @private * @param {Object} source The object to copy properties from. * @param {Array} props The property names to copy. * @param {Object} [object={}] The object to copy properties to. * @returns {Object} Returns `object`. 
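 * Values are assigned by reference, so only a shallow copy of the listed `props` is made.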
*/ function baseCopy(source, props, object) { object || (object = {}); var index = -1, length = props.length; while (++index < length) { var key = props[index]; object[key] = source[key]; } return object; } module.exports = baseCopy; ././@LongLink0000000000000000000000000000022700000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._baseassign/node_modules/lodash._basecopy/package.jsonnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000410612631326456032154 0ustar 00000000000000{ "name": "lodash._basecopy", "version": "3.0.1", "description": "The modern build of lodash’s internal `baseCopy` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash._basecopy v3.0.1\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseCopy` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._basecopy\n```\n\nIn Node.js/io.js:\n\n```js\nvar baseCopy = require('lodash._basecopy');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basecopy) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basecopy@3.0.1", "_shasum": "8da0e6a876cf344c0ad8a54882111dd3c5c7ca36", "_resolved": "https://registry.npmjs.org/lodash._basecopy/-/lodash._basecopy-3.0.1.tgz", "_from": "lodash._basecopy@>=3.0.0 <4.0.0" } ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._basefor/LICENSE.txtnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000232112631326456032151 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission 
notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016300000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._basefor/README.mdnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000101512631326456032150 0ustar 00000000000000# lodash._basefor v3.0.2 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseFor` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basefor ``` In Node.js/io.js: ```js var baseFor = require('lodash._basefor'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.2-npm-packages/lodash._basefor) for more details. ././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._basefor/index.jsnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000476312631326456032165 0ustar 00000000000000/** * lodash 3.0.2 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * The base implementation of `baseForIn` and `baseForOwn` which iterates * over `object` properties returned by `keysFunc` invoking `iteratee` for * each property. Iteratee functions may exit iteration early by explicitly * returning `false`. * * @private * @param {Object} object The object to iterate over. * @param {Function} iteratee The function invoked per iteration. * @param {Function} keysFunc The function to get the keys of `object`. * @returns {Object} Returns `object`. */ var baseFor = createBaseFor(); /** * Creates a base function for `_.forIn` or `_.forInRight`. * * @private * @param {boolean} [fromRight] Specify iterating from right to left. * @returns {Function} Returns the new base function. */ function createBaseFor(fromRight) { return function(object, iteratee, keysFunc) { var iterable = toObject(object), props = keysFunc(object), length = props.length, index = fromRight ? length : -1; while ((fromRight ? index-- : ++index < length)) { var key = props[index]; if (iteratee(iterable[key], key, iterable) === false) { break; } } return object; }; } /** * Converts `value` to an object if it's not one. * * @private * @param {*} value The value to process. * @returns {Object} Returns the object. */ function toObject(value) { return isObject(value) ? value : Object(value); } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. 
arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. * @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details. var type = typeof value; return !!value && (type == 'object' || type == 'function'); } module.exports = baseFor; ././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._basefor/package.jsonnpm_3.5.2.orig/node_modules/lodash.clonedeep/node_modules/lodash._baseclone/node_modules/lodash._bas0000644000000000000000000000407212631326456032156 0ustar 00000000000000{ "name": "lodash._basefor", "version": "3.0.2", "description": "The modern build of lodash’s internal `baseFor` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash._basefor v3.0.2\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseFor` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._basefor\n```\n\nIn Node.js/io.js:\n\n```js\nvar baseFor = require('lodash._basefor');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.0.2-npm-packages/lodash._basefor) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basefor@3.0.2", "_shasum": "3a4cece5b7031eae78a441c5416b90878eeee5a1", "_resolved": "https://registry.npmjs.org/lodash._basefor/-/lodash._basefor-3.0.2.tgz", "_from": "lodash._basefor@>=3.0.0 <4.0.0" } npm_3.5.2.orig/node_modules/lodash.isarguments/LICENSE0000644000000000000000000000232112631326456020764 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, 
subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.isarguments/README.md0000644000000000000000000000112212631326456021234 0ustar 00000000000000# lodash.isarguments v3.0.4 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.isArguments` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.isarguments ``` In Node.js/io.js: ```js var isArguments = require('lodash.isarguments'); ``` See the [documentation](https://lodash.com/docs#isArguments) or [package source](https://github.com/lodash/lodash/blob/3.0.4-npm-packages/lodash.isarguments) for more details. npm_3.5.2.orig/node_modules/lodash.isarguments/index.js0000644000000000000000000000601412631326456021427 0ustar 00000000000000/** * lodash 3.0.4 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Checks if `value` is object-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is object-like, else `false`. */ function isObjectLike(value) { return !!value && typeof value == 'object'; } /** Used for native method references. */ var objectProto = Object.prototype; /** Used to check objects for own properties. */ var hasOwnProperty = objectProto.hasOwnProperty; /** Native method references. */ var propertyIsEnumerable = objectProto.propertyIsEnumerable; /** * Used as the [maximum length](http://ecma-international.org/ecma-262/6.0/#sec-number.max_safe_integer) * of an array-like value. */ var MAX_SAFE_INTEGER = 9007199254740991; /** * The base implementation of `_.property` without support for deep paths. * * @private * @param {string} key The key of the property to get. * @returns {Function} Returns the new function. */ function baseProperty(key) { return function(object) { return object == null ? undefined : object[key]; }; } /** * Gets the "length" property value of `object`. * * **Note:** This function is used to avoid a [JIT bug](https://bugs.webkit.org/show_bug.cgi?id=142792) * that affects Safari on at least iOS 8.1-8.3 ARM64. * * @private * @param {Object} object The object to query. * @returns {*} Returns the "length" value. */ var getLength = baseProperty('length'); /** * Checks if `value` is array-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is array-like, else `false`. */ function isArrayLike(value) { return value != null && isLength(getLength(value)); } /** * Checks if `value` is a valid array-like length. * * **Note:** This function is based on [`ToLength`](http://ecma-international.org/ecma-262/6.0/#sec-tolength). 
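 * A valid length is a non-negative integer no greater than `Number.MAX_SAFE_INTEGER` (2^53 - 1).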
* * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is a valid length, else `false`. */ function isLength(value) { return typeof value == 'number' && value > -1 && value % 1 == 0 && value <= MAX_SAFE_INTEGER; } /** * Checks if `value` is classified as an `arguments` object. * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is correctly classified, else `false`. * @example * * _.isArguments(function() { return arguments; }()); * // => true * * _.isArguments([1, 2, 3]); * // => false */ function isArguments(value) { return isObjectLike(value) && isArrayLike(value) && hasOwnProperty.call(value, 'callee') && !propertyIsEnumerable.call(value, 'callee'); } module.exports = isArguments; npm_3.5.2.orig/node_modules/lodash.isarguments/package.json0000644000000000000000000000433012631326456022247 0ustar 00000000000000{ "name": "lodash.isarguments", "version": "3.0.4", "description": "The modern build of lodash’s `_.isArguments` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash.isarguments v3.0.4\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.isArguments` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash.isarguments\n```\n\nIn Node.js/io.js:\n\n```js\nvar isArguments = require('lodash.isarguments');\n```\n\nSee the [documentation](https://lodash.com/docs#isArguments) or [package source](https://github.com/lodash/lodash/blob/3.0.4-npm-packages/lodash.isarguments) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.isarguments@3.0.4", "_shasum": "ebbb884c48d27366a44ea6fee57ed7b5a32a81e0", "_resolved": "https://registry.npmjs.org/lodash.isarguments/-/lodash.isarguments-3.0.4.tgz", "_from": "lodash.isarguments@3.0.4" } npm_3.5.2.orig/node_modules/lodash.isarray/LICENSE0000644000000000000000000000232112631326456020075 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, 
and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.isarray/README.md0000644000000000000000000000106612631326456020354 0ustar 00000000000000# lodash.isarray v3.0.4 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.isArray` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.isarray ``` In Node.js/io.js: ```js var isArray = require('lodash.isarray'); ``` See the [documentation](https://lodash.com/docs#isArray) or [package source](https://github.com/lodash/lodash/blob/3.0.4-npm-packages/lodash.isarray) for more details. npm_3.5.2.orig/node_modules/lodash.isarray/index.js0000644000000000000000000001205412631326456020541 0ustar 00000000000000/** * lodash 3.0.4 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** `Object#toString` result references. */ var arrayTag = '[object Array]', funcTag = '[object Function]'; /** Used to detect host constructors (Safari > 5). */ var reIsHostCtor = /^\[object .+?Constructor\]$/; /** * Checks if `value` is object-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is object-like, else `false`. */ function isObjectLike(value) { return !!value && typeof value == 'object'; } /** Used for native method references. */ var objectProto = Object.prototype; /** Used to resolve the decompiled source of functions. */ var fnToString = Function.prototype.toString; /** Used to check objects for own properties. */ var hasOwnProperty = objectProto.hasOwnProperty; /** * Used to resolve the [`toStringTag`](http://ecma-international.org/ecma-262/6.0/#sec-object.prototype.tostring) * of values. */ var objToString = objectProto.toString; /** Used to detect if a method is native. */ var reIsNative = RegExp('^' + fnToString.call(hasOwnProperty).replace(/[\\^$.*+?()[\]{}|]/g, '\\$&') .replace(/hasOwnProperty|(function).*?(?=\\\()| for .+?(?=\\\])/g, '$1.*?') + '$' ); /* Native method references for those with the same name as other `lodash` methods. */ var nativeIsArray = getNative(Array, 'isArray'); /** * Used as the [maximum length](http://ecma-international.org/ecma-262/6.0/#sec-number.max_safe_integer) * of an array-like value. */ var MAX_SAFE_INTEGER = 9007199254740991; /** * Gets the native function at `key` of `object`. * * @private * @param {Object} object The object to query. * @param {string} key The key of the method to get. * @returns {*} Returns the function if it's native, else `undefined`. 
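 * Returning `undefined` guards against built-ins that have been replaced by non-native shims.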
*/ function getNative(object, key) { var value = object == null ? undefined : object[key]; return isNative(value) ? value : undefined; } /** * Checks if `value` is a valid array-like length. * * **Note:** This function is based on [`ToLength`](http://ecma-international.org/ecma-262/6.0/#sec-tolength). * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is a valid length, else `false`. */ function isLength(value) { return typeof value == 'number' && value > -1 && value % 1 == 0 && value <= MAX_SAFE_INTEGER; } /** * Checks if `value` is classified as an `Array` object. * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is correctly classified, else `false`. * @example * * _.isArray([1, 2, 3]); * // => true * * _.isArray(function() { return arguments; }()); * // => false */ var isArray = nativeIsArray || function(value) { return isObjectLike(value) && isLength(value.length) && objToString.call(value) == arrayTag; }; /** * Checks if `value` is classified as a `Function` object. * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is correctly classified, else `false`. * @example * * _.isFunction(_); * // => true * * _.isFunction(/abc/); * // => false */ function isFunction(value) { // The use of `Object#toString` avoids issues with the `typeof` operator // in older versions of Chrome and Safari which return 'function' for regexes // and Safari 8 equivalents which return 'object' for typed array constructors. return isObject(value) && objToString.call(value) == funcTag; } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. * @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details. var type = typeof value; return !!value && (type == 'object' || type == 'function'); } /** * Checks if `value` is a native function. * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is a native function, else `false`. 
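 * Functions are decompiled with `Function#toString` and matched against a pattern derived from a known native method; other values are tested against a host-constructor pattern.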
* @example * * _.isNative(Array.prototype.push); * // => true * * _.isNative(_); * // => false */ function isNative(value) { if (value == null) { return false; } if (isFunction(value)) { return reIsNative.test(fnToString.call(value)); } return isObjectLike(value) && reIsHostCtor.test(value); } module.exports = isArray; npm_3.5.2.orig/node_modules/lodash.isarray/package.json0000644000000000000000000000424412631326456021364 0ustar 00000000000000{ "name": "lodash.isarray", "version": "3.0.4", "description": "The modern build of lodash’s `_.isArray` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash.isarray v3.0.4\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.isArray` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash.isarray\n```\n\nIn Node.js/io.js:\n\n```js\nvar isArray = require('lodash.isarray');\n```\n\nSee the [documentation](https://lodash.com/docs#isArray) or [package source](https://github.com/lodash/lodash/blob/3.0.4-npm-packages/lodash.isarray) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.isarray@3.0.4", "_shasum": "79e4eb88c36a8122af86f844aa9bcd851b5fbb55", "_resolved": "https://registry.npmjs.org/lodash.isarray/-/lodash.isarray-3.0.4.tgz", "_from": "lodash.isarray@3.0.4" } npm_3.5.2.orig/node_modules/lodash.keys/LICENSE0000644000000000000000000000232112631326456017376 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.keys/README.md0000644000000000000000000000104112631326456017646 0ustar 00000000000000# lodash.keys v3.1.2 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.keys` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.keys ``` In Node.js/io.js: ```js var keys = require('lodash.keys'); ``` See the [documentation](https://lodash.com/docs#keys) or [package source](https://github.com/lodash/lodash/blob/3.1.2-npm-packages/lodash.keys) for more details. npm_3.5.2.orig/node_modules/lodash.keys/index.js0000644000000000000000000001473012631326456020045 0ustar 00000000000000/** * lodash 3.1.2 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var getNative = require('lodash._getnative'), isArguments = require('lodash.isarguments'), isArray = require('lodash.isarray'); /** Used to detect unsigned integer values. */ var reIsUint = /^\d+$/; /** Used for native method references. */ var objectProto = Object.prototype; /** Used to check objects for own properties. */ var hasOwnProperty = objectProto.hasOwnProperty; /* Native method references for those with the same name as other `lodash` methods. */ var nativeKeys = getNative(Object, 'keys'); /** * Used as the [maximum length](http://ecma-international.org/ecma-262/6.0/#sec-number.max_safe_integer) * of an array-like value. */ var MAX_SAFE_INTEGER = 9007199254740991; /** * The base implementation of `_.property` without support for deep paths. * * @private * @param {string} key The key of the property to get. * @returns {Function} Returns the new function. */ function baseProperty(key) { return function(object) { return object == null ? undefined : object[key]; }; } /** * Gets the "length" property value of `object`. * * **Note:** This function is used to avoid a [JIT bug](https://bugs.webkit.org/show_bug.cgi?id=142792) * that affects Safari on at least iOS 8.1-8.3 ARM64. * * @private * @param {Object} object The object to query. * @returns {*} Returns the "length" value. */ var getLength = baseProperty('length'); /** * Checks if `value` is array-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is array-like, else `false`. */ function isArrayLike(value) { return value != null && isLength(getLength(value)); } /** * Checks if `value` is a valid array-like index. * * @private * @param {*} value The value to check. * @param {number} [length=MAX_SAFE_INTEGER] The upper bounds of a valid index. * @returns {boolean} Returns `true` if `value` is a valid index, else `false`. */ function isIndex(value, length) { value = (typeof value == 'number' || reIsUint.test(value)) ? +value : -1; length = length == null ? MAX_SAFE_INTEGER : length; return value > -1 && value % 1 == 0 && value < length; } /** * Checks if `value` is a valid array-like length. 
* * **Note:** This function is based on [`ToLength`](http://ecma-international.org/ecma-262/6.0/#sec-tolength). * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is a valid length, else `false`. */ function isLength(value) { return typeof value == 'number' && value > -1 && value % 1 == 0 && value <= MAX_SAFE_INTEGER; } /** * A fallback implementation of `Object.keys` which creates an array of the * own enumerable property names of `object`. * * @private * @param {Object} object The object to query. * @returns {Array} Returns the array of property names. */ function shimKeys(object) { var props = keysIn(object), propsLength = props.length, length = propsLength && object.length; var allowIndexes = !!length && isLength(length) && (isArray(object) || isArguments(object)); var index = -1, result = []; while (++index < propsLength) { var key = props[index]; if ((allowIndexes && isIndex(key, length)) || hasOwnProperty.call(object, key)) { result.push(key); } } return result; } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. * @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details. var type = typeof value; return !!value && (type == 'object' || type == 'function'); } /** * Creates an array of the own enumerable property names of `object`. * * **Note:** Non-object values are coerced to objects. See the * [ES spec](http://ecma-international.org/ecma-262/6.0/#sec-object.keys) * for more details. * * @static * @memberOf _ * @category Object * @param {Object} object The object to query. * @returns {Array} Returns the array of property names. * @example * * function Foo() { * this.a = 1; * this.b = 2; * } * * Foo.prototype.c = 3; * * _.keys(new Foo); * // => ['a', 'b'] (iteration order is not guaranteed) * * _.keys('hi'); * // => ['0', '1'] */ var keys = !nativeKeys ? shimKeys : function(object) { var Ctor = object == null ? undefined : object.constructor; if ((typeof Ctor == 'function' && Ctor.prototype === object) || (typeof object != 'function' && isArrayLike(object))) { return shimKeys(object); } return isObject(object) ? nativeKeys(object) : []; }; /** * Creates an array of the own and inherited enumerable property names of `object`. * * **Note:** Non-object values are coerced to objects. * * @static * @memberOf _ * @category Object * @param {Object} object The object to query. * @returns {Array} Returns the array of property names. 
* @example * * function Foo() { * this.a = 1; * this.b = 2; * } * * Foo.prototype.c = 3; * * _.keysIn(new Foo); * // => ['a', 'b', 'c'] (iteration order is not guaranteed) */ function keysIn(object) { if (object == null) { return []; } if (!isObject(object)) { object = Object(object); } var length = object.length; length = (length && isLength(length) && (isArray(object) || isArguments(object)) && length) || 0; var Ctor = object.constructor, index = -1, isProto = typeof Ctor == 'function' && Ctor.prototype === object, result = Array(length), skipIndexes = length > 0; while (++index < length) { result[index] = (index + ''); } for (var key in object) { if (!(skipIndexes && isIndex(key, length)) && !(key == 'constructor' && (isProto || !hasOwnProperty.call(object, key)))) { result.push(key); } } return result; } module.exports = keys; npm_3.5.2.orig/node_modules/lodash.keys/package.json0000644000000000000000000000437412631326456020671 0ustar 00000000000000{ "name": "lodash.keys", "version": "3.1.2", "description": "The modern build of lodash’s `_.keys` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._getnative": "^3.0.0", "lodash.isarguments": "^3.0.0", "lodash.isarray": "^3.0.0" }, "readme": "# lodash.keys v3.1.2\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.keys` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash.keys\n```\n\nIn Node.js/io.js:\n\n```js\nvar keys = require('lodash.keys');\n```\n\nSee the [documentation](https://lodash.com/docs#keys) or [package source](https://github.com/lodash/lodash/blob/3.1.2-npm-packages/lodash.keys) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.keys@3.1.2", "_shasum": "4dbc0472b156be50a0b286855d1bd0b0c656098a", "_resolved": "https://registry.npmjs.org/lodash.keys/-/lodash.keys-3.1.2.tgz", "_from": "lodash.keys@3.1.2" } npm_3.5.2.orig/node_modules/lodash.restparam/LICENSE.txt0000644000000000000000000000232112631326456021237 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, 
modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.restparam/README.md0000644000000000000000000000110412631326456020671 0ustar 00000000000000# lodash.restparam v3.6.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.restParam` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.restparam ``` In Node.js/io.js: ```js var restParam = require('lodash.restparam'); ``` See the [documentation](https://lodash.com/docs#restParam) or [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash.restparam) for more details. npm_3.5.2.orig/node_modules/lodash.restparam/index.js0000644000000000000000000000441712631326456021071 0ustar 00000000000000/** * lodash 3.6.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** Used as the `TypeError` message for "Functions" methods. */ var FUNC_ERROR_TEXT = 'Expected a function'; /* Native method references for those with the same name as other `lodash` methods. */ var nativeMax = Math.max; /** * Creates a function that invokes `func` with the `this` binding of the * created function and arguments from `start` and beyond provided as an array. * * **Note:** This method is based on the [rest parameter](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/rest_parameters). * * @static * @memberOf _ * @category Function * @param {Function} func The function to apply a rest parameter to. * @param {number} [start=func.length-1] The start position of the rest parameter. * @returns {Function} Returns the new function. * @example * * var say = _.restParam(function(what, names) { * return what + ' ' + _.initial(names).join(', ') + * (_.size(names) > 1 ? ', & ' : '') + _.last(names); * }); * * say('hello', 'fred', 'barney', 'pebbles'); * // => 'hello fred, barney, & pebbles' */ function restParam(func, start) { if (typeof func != 'function') { throw new TypeError(FUNC_ERROR_TEXT); } start = nativeMax(start === undefined ? 
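// (Editor's note, a sketch of the intent: when `start` is omitted it
// defaults to func.length - 1, so the last declared parameter collects the
// rest arguments; a non-numeric `start` coerces to 0 via `+start || 0`,
// and nativeMax clamps any negative value up to 0.)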
(func.length - 1) : (+start || 0), 0); return function() { var args = arguments, index = -1, length = nativeMax(args.length - start, 0), rest = Array(length); while (++index < length) { rest[index] = args[start + index]; } switch (start) { case 0: return func.call(this, rest); case 1: return func.call(this, args[0], rest); case 2: return func.call(this, args[0], args[1], rest); } var otherArgs = Array(start + 1); index = -1; while (++index < start) { otherArgs[index] = args[index]; } otherArgs[start] = rest; return func.apply(this, otherArgs); }; } module.exports = restParam; npm_3.5.2.orig/node_modules/lodash.restparam/package.json0000644000000000000000000000427612631326456021715 0ustar 00000000000000{ "name": "lodash.restparam", "version": "3.6.1", "description": "The modern build of lodash’s `_.restParam` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash.restparam v3.6.1\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.restParam` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash.restparam\n```\n\nIn Node.js/io.js:\n\n```js\nvar restParam = require('lodash.restparam');\n```\n\nSee the [documentation](https://lodash.com/docs#restParam) or [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash.restparam) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.restparam@3.6.1", "_shasum": "936a4e309ef330a7645ed4145986c85ae5b20805", "_resolved": "https://registry.npmjs.org/lodash.restparam/-/lodash.restparam-3.6.1.tgz", "_from": "lodash.restparam@3.6.1" } npm_3.5.2.orig/node_modules/lodash.union/LICENSE.txt0000644000000000000000000000232112631326456020371 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the 
Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.union/README.md0000644000000000000000000000105012631326456020023 0ustar 00000000000000# lodash.union v3.1.0 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.union` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.union ``` In Node.js/io.js: ```js var union = require('lodash.union'); ``` See the [documentation](https://lodash.com/docs#union) or [package source](https://github.com/lodash/lodash/blob/3.1.0-npm-packages/lodash.union) for more details. npm_3.5.2.orig/node_modules/lodash.union/index.js0000644000000000000000000000233212631326456020215 0ustar 00000000000000/** * lodash 3.1.0 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.2 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseFlatten = require('lodash._baseflatten'), baseUniq = require('lodash._baseuniq'), restParam = require('lodash.restparam'); /** * Creates an array of unique values, in order, of the provided arrays using * `SameValueZero` for equality comparisons. * * **Note:** [`SameValueZero`](https://people.mozilla.org/~jorendorff/es6-draft.html#sec-samevaluezero) * comparisons are like strict equality comparisons, e.g. `===`, except that * `NaN` matches `NaN`. * * @static * @memberOf _ * @category Array * @param {...Array} [arrays] The arrays to inspect. * @returns {Array} Returns the new array of combined values. 
* @example * * _.union([1, 2], [4, 2], [2, 1]); * // => [1, 2, 4] */ var union = restParam(function(arrays) { return baseUniq(baseFlatten(arrays, false, true)); }); module.exports = union; npm_3.5.2.orig/node_modules/lodash.union/node_modules/0000755000000000000000000000000012631326456021225 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.union/package.json0000644000000000000000000000456112631326456021044 0ustar 00000000000000{ "name": "lodash.union", "version": "3.1.0", "description": "The modern build of lodash’s `_.union` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "https://github.com/lodash/lodash" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._baseflatten": "^3.0.0", "lodash._baseuniq": "^3.0.0", "lodash.restparam": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.union@3.1.0", "_shasum": "a4a3066fc15d6a7f8151cce9bdfe63dce7f5bcff", "_from": "lodash.union@>=3.1.0 <3.2.0", "_npmVersion": "2.7.3", "_nodeVersion": "0.12.0", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" }, { "name": "d10", "email": "demoneaux@gmail.com" } ], "dist": { "shasum": "a4a3066fc15d6a7f8151cce9bdfe63dce7f5bcff", "tarball": "http://registry.npmjs.org/lodash.union/-/lodash.union-3.1.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.union/-/lodash.union-3.1.0.tgz" } npm_3.5.2.orig/node_modules/lodash.union/node_modules/lodash._baseflatten/0000755000000000000000000000000012631326456025125 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.union/node_modules/lodash._baseflatten/LICENSE0000644000000000000000000000232112631326456026130 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.union/node_modules/lodash._baseflatten/README.md0000644000000000000000000000104512631326456026404 0ustar 00000000000000# lodash._baseflatten v3.1.4 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseFlatten` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._baseflatten ``` In Node.js/io.js: ```js var baseFlatten = require('lodash._baseflatten'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.1.4-npm-packages/lodash._baseflatten) for more details. npm_3.5.2.orig/node_modules/lodash.union/node_modules/lodash._baseflatten/index.js0000644000000000000000000000732412631326456026600 0ustar 00000000000000/** * lodash 3.1.4 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var isArguments = require('lodash.isarguments'), isArray = require('lodash.isarray'); /** * Checks if `value` is object-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is object-like, else `false`. */ function isObjectLike(value) { return !!value && typeof value == 'object'; } /** * Used as the [maximum length](http://ecma-international.org/ecma-262/6.0/#sec-number.max_safe_integer) * of an array-like value. */ var MAX_SAFE_INTEGER = 9007199254740991; /** * Appends the elements of `values` to `array`. * * @private * @param {Array} array The array to modify. * @param {Array} values The values to append. * @returns {Array} Returns `array`. */ function arrayPush(array, values) { var index = -1, length = values.length, offset = array.length; while (++index < length) { array[offset + index] = values[index]; } return array; } /** * The base implementation of `_.flatten` with added support for restricting * flattening and specifying the start index. * * @private * @param {Array} array The array to flatten. * @param {boolean} [isDeep] Specify a deep flatten. * @param {boolean} [isStrict] Restrict flattening to arrays-like objects. * @param {Array} [result=[]] The initial result value. * @returns {Array} Returns the new flattened array. */ function baseFlatten(array, isDeep, isStrict, result) { result || (result = []); var index = -1, length = array.length; while (++index < length) { var value = array[index]; if (isObjectLike(value) && isArrayLike(value) && (isStrict || isArray(value) || isArguments(value))) { if (isDeep) { // Recursively flatten arrays (susceptible to call stack limits). baseFlatten(value, isDeep, isStrict, result); } else { arrayPush(result, value); } } else if (!isStrict) { result[result.length] = value; } } return result; } /** * The base implementation of `_.property` without support for deep paths. 
* * @private * @param {string} key The key of the property to get. * @returns {Function} Returns the new function. */ function baseProperty(key) { return function(object) { return object == null ? undefined : object[key]; }; } /** * Gets the "length" property value of `object`. * * **Note:** This function is used to avoid a [JIT bug](https://bugs.webkit.org/show_bug.cgi?id=142792) * that affects Safari on at least iOS 8.1-8.3 ARM64. * * @private * @param {Object} object The object to query. * @returns {*} Returns the "length" value. */ var getLength = baseProperty('length'); /** * Checks if `value` is array-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is array-like, else `false`. */ function isArrayLike(value) { return value != null && isLength(getLength(value)); } /** * Checks if `value` is a valid array-like length. * * **Note:** This function is based on [`ToLength`](http://ecma-international.org/ecma-262/6.0/#sec-tolength). * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is a valid length, else `false`. */ function isLength(value) { return typeof value == 'number' && value > -1 && value % 1 == 0 && value <= MAX_SAFE_INTEGER; } module.exports = baseFlatten; npm_3.5.2.orig/node_modules/lodash.union/node_modules/lodash._baseflatten/package.json0000644000000000000000000000430612631326456027416 0ustar 00000000000000{ "name": "lodash._baseflatten", "version": "3.1.4", "description": "The modern build of lodash’s internal `baseFlatten` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.isarguments": "^3.0.0", "lodash.isarray": "^3.0.0" }, "readme": "# lodash._baseflatten v3.1.4\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseFlatten` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._baseflatten\n```\n\nIn Node.js/io.js:\n\n```js\nvar baseFlatten = require('lodash._baseflatten');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.1.4-npm-packages/lodash._baseflatten) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._baseflatten@3.1.4", "_shasum": "0770ff80131af6e34f3b511796a7ba5214e65ff7", "_resolved": "https://registry.npmjs.org/lodash._baseflatten/-/lodash._baseflatten-3.1.4.tgz", "_from": "lodash._baseflatten@>=3.0.0 <4.0.0" } 
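Editor's aside: `lodash._baseflatten` is an internal helper and ships without usage examples, so here is a minimal sketch of how its `isDeep` and `isStrict` flags behave; the calls are hypothetical but checked against the implementation above:

```js
var baseFlatten = require('lodash._baseflatten');

// Shallow, non-strict: nested arrays are flattened one level and
// plain values pass straight through.
baseFlatten([1, [2, 3], 4], false, false);
// => [1, 2, 3, 4]

// Deep: recurses into nested arrays (susceptible to call stack limits).
baseFlatten([1, [2, [3, [4]]]], true, false);
// => [1, 2, 3, 4]

// Strict: only array-like values survive; this is the mode lodash.union
// uses, so stray non-array arguments are simply dropped.
baseFlatten([1, [2, 3]], false, true);
// => [2, 3]
```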
npm_3.5.2.orig/node_modules/lodash.uniq/LICENSE0000644000000000000000000000232112631326456017377 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.uniq/README.md0000644000000000000000000000104112631326456017647 0ustar 00000000000000# lodash.uniq v3.2.2 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.uniq` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.uniq ``` In Node.js/io.js: ```js var uniq = require('lodash.uniq'); ``` See the [documentation](https://lodash.com/docs#uniq) or [package source](https://github.com/lodash/lodash/blob/3.2.2-npm-packages/lodash.uniq) for more details. npm_3.5.2.orig/node_modules/lodash.uniq/index.js0000644000000000000000000000706212631326456020046 0ustar 00000000000000/** * lodash 3.2.2 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseCallback = require('lodash._basecallback'), baseUniq = require('lodash._baseuniq'), isIterateeCall = require('lodash._isiterateecall'); /** * An implementation of `_.uniq` optimized for sorted arrays without support * for callback shorthands and `this` binding. * * @private * @param {Array} array The array to inspect. * @param {Function} [iteratee] The function invoked per iteration. * @returns {Array} Returns the new duplicate-value-free array. */ function sortedUniq(array, iteratee) { var seen, index = -1, length = array.length, resIndex = -1, result = []; while (++index < length) { var value = array[index], computed = iteratee ? iteratee(value, index, array) : value; if (!index || seen !== computed) { seen = computed; result[++resIndex] = value; } } return result; } /** * Creates a duplicate-free version of an array, using * [`SameValueZero`](http://ecma-international.org/ecma-262/6.0/#sec-samevaluezero) * for equality comparisons, in which only the first occurence of each element * is kept. Providing `true` for `isSorted` performs a faster search algorithm * for sorted arrays. 
If an iteratee function is provided it is invoked for * each element in the array to generate the criterion by which uniqueness * is computed. The `iteratee` is bound to `thisArg` and invoked with three * arguments: (value, index, array). * * If a property name is provided for `iteratee` the created `_.property` * style callback returns the property value of the given element. * * If a value is also provided for `thisArg` the created `_.matchesProperty` * style callback returns `true` for elements that have a matching property * value, else `false`. * * If an object is provided for `iteratee` the created `_.matches` style * callback returns `true` for elements that have the properties of the given * object, else `false`. * * @static * @memberOf _ * @alias unique * @category Array * @param {Array} array The array to inspect. * @param {boolean} [isSorted] Specify the array is sorted. * @param {Function|Object|string} [iteratee] The function invoked per iteration. * @param {*} [thisArg] The `this` binding of `iteratee`. * @returns {Array} Returns the new duplicate-value-free array. * @example * * _.uniq([2, 1, 2]); * // => [2, 1] * * // using `isSorted` * _.uniq([1, 1, 2], true); * // => [1, 2] * * // using an iteratee function * _.uniq([1, 2.5, 1.5, 2], function(n) { * return this.floor(n); * }, Math); * // => [1, 2.5] * * // using the `_.property` callback shorthand * _.uniq([{ 'x': 1 }, { 'x': 2 }, { 'x': 1 }], 'x'); * // => [{ 'x': 1 }, { 'x': 2 }] */ function uniq(array, isSorted, iteratee, thisArg) { var length = array ? array.length : 0; if (!length) { return []; } if (isSorted != null && typeof isSorted != 'boolean') { thisArg = iteratee; iteratee = isIterateeCall(array, isSorted, thisArg) ? undefined : isSorted; isSorted = false; } iteratee = iteratee == null ? iteratee : baseCallback(iteratee, thisArg, 3); return (isSorted) ? 
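// (Editor's note: by this point a non-boolean `isSorted` has been shifted
// into the iteratee slot and reset to false, and any iteratee has been
// wrapped by baseCallback, so sorted input takes the linear sortedUniq
// scan while everything else falls back to the general baseUniq path.)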
sortedUniq(array, iteratee) : baseUniq(array, iteratee); } module.exports = uniq; npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/0000755000000000000000000000000012631326456021051 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/package.json0000644000000000000000000000475012631326456020670 0ustar 00000000000000{ "name": "lodash.uniq", "version": "3.2.2", "description": "The modern build of lodash’s `_.uniq` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basecallback": "^3.0.0", "lodash._baseuniq": "^3.0.0", "lodash._getnative": "^3.0.0", "lodash._isiterateecall": "^3.0.0", "lodash.isarray": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.uniq@3.2.2", "_shasum": "146c36f25e75d19501ba402e88ba14937f63cd8b", "_from": "lodash.uniq@>=3.2.2 <3.3.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" }, { "name": "d10", "email": "demoneaux@gmail.com" } ], "dist": { "shasum": "146c36f25e75d19501ba402e88ba14937f63cd8b", "tarball": "http://registry.npmjs.org/lodash.uniq/-/lodash.uniq-3.2.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.uniq/-/lodash.uniq-3.2.2.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/0000755000000000000000000000000012631326456025050 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._isiterateecall/0000755000000000000000000000000012631326456025453 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/LICENSE0000644000000000000000000000232112631326456026053 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/README.md0000644000000000000000000000105312631326456026326 0ustar 00000000000000# lodash._basecallback v3.3.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseCallback` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basecallback ``` In Node.js/io.js: ```js var baseCallback = require('lodash._basecallback'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.3.1-npm-packages/lodash._basecallback) for more details. npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/index.js0000644000000000000000000002574112631326456026526 0ustar 00000000000000/** * lodash 3.3.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseIsEqual = require('lodash._baseisequal'), bindCallback = require('lodash._bindcallback'), isArray = require('lodash.isarray'), pairs = require('lodash.pairs'); /** Used to match property names within property paths. */ var reIsDeepProp = /\.|\[(?:[^[\]]*|(["'])(?:(?!\1)[^\n\\]|\\.)*?\1)\]/, reIsPlainProp = /^\w*$/, rePropName = /[^.[\]]+|\[(?:(-?\d+(?:\.\d+)?)|(["'])((?:(?!\2)[^\n\\]|\\.)*?)\2)\]/g; /** Used to match backslashes in property paths. */ var reEscapeChar = /\\(\\)?/g; /** * Converts `value` to a string if it's not one. An empty string is returned * for `null` or `undefined` values. * * @private * @param {*} value The value to process. * @returns {string} Returns the string. */ function baseToString(value) { return value == null ? 
'' : (value + ''); } /** * The base implementation of `_.callback` which supports specifying the * number of arguments to provide to `func`. * * @private * @param {*} [func=_.identity] The value to convert to a callback. * @param {*} [thisArg] The `this` binding of `func`. * @param {number} [argCount] The number of arguments to provide to `func`. * @returns {Function} Returns the callback. */ function baseCallback(func, thisArg, argCount) { var type = typeof func; if (type == 'function') { return thisArg === undefined ? func : bindCallback(func, thisArg, argCount); } if (func == null) { return identity; } if (type == 'object') { return baseMatches(func); } return thisArg === undefined ? property(func) : baseMatchesProperty(func, thisArg); } /** * The base implementation of `get` without support for string paths * and default values. * * @private * @param {Object} object The object to query. * @param {Array} path The path of the property to get. * @param {string} [pathKey] The key representation of path. * @returns {*} Returns the resolved value. */ function baseGet(object, path, pathKey) { if (object == null) { return; } if (pathKey !== undefined && pathKey in toObject(object)) { path = [pathKey]; } var index = 0, length = path.length; while (object != null && index < length) { object = object[path[index++]]; } return (index && index == length) ? object : undefined; } /** * The base implementation of `_.isMatch` without support for callback * shorthands and `this` binding. * * @private * @param {Object} object The object to inspect. * @param {Array} matchData The property names, values, and compare flags to match. * @param {Function} [customizer] The function to customize comparing objects. * @returns {boolean} Returns `true` if `object` is a match, else `false`. */ function baseIsMatch(object, matchData, customizer) { var index = matchData.length, length = index, noCustomizer = !customizer; if (object == null) { return !length; } object = toObject(object); while (index--) { var data = matchData[index]; if ((noCustomizer && data[2]) ? data[1] !== object[data[0]] : !(data[0] in object) ) { return false; } } while (++index < length) { data = matchData[index]; var key = data[0], objValue = object[key], srcValue = data[1]; if (noCustomizer && data[2]) { if (objValue === undefined && !(key in object)) { return false; } } else { var result = customizer ? customizer(objValue, srcValue, key) : undefined; if (!(result === undefined ? baseIsEqual(srcValue, objValue, customizer, true) : result)) { return false; } } } return true; } /** * The base implementation of `_.matches` which does not clone `source`. * * @private * @param {Object} source The object of property values to match. * @returns {Function} Returns the new function. */ function baseMatches(source) { var matchData = getMatchData(source); if (matchData.length == 1 && matchData[0][2]) { var key = matchData[0][0], value = matchData[0][1]; return function(object) { if (object == null) { return false; } return object[key] === value && (value !== undefined || (key in toObject(object))); }; } return function(object) { return baseIsMatch(object, matchData); }; } /** * The base implementation of `_.matchesProperty` which does not clone `srcValue`. * * @private * @param {string} path The path of the property to get. * @param {*} srcValue The value to compare. * @returns {Function} Returns the new function.
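 * @example
 *
 * // editor's sketch, hypothetical call (not in the original source):
 * var pred = baseMatchesProperty('a.b', 2);
 * pred({ 'a': { 'b': 2 } });
 * // => true
 * pred({ 'a': { 'b': 1 } });
 * // => false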
*/ function baseMatchesProperty(path, srcValue) { var isArr = isArray(path), isCommon = isKey(path) && isStrictComparable(srcValue), pathKey = (path + ''); path = toPath(path); return function(object) { if (object == null) { return false; } var key = pathKey; object = toObject(object); if ((isArr || !isCommon) && !(key in object)) { object = path.length == 1 ? object : baseGet(object, baseSlice(path, 0, -1)); if (object == null) { return false; } key = last(path); object = toObject(object); } return object[key] === srcValue ? (srcValue !== undefined || (key in object)) : baseIsEqual(srcValue, object[key], undefined, true); }; } /** * The base implementation of `_.property` without support for deep paths. * * @private * @param {string} key The key of the property to get. * @returns {Function} Returns the new function. */ function baseProperty(key) { return function(object) { return object == null ? undefined : object[key]; }; } /** * A specialized version of `baseProperty` which supports deep paths. * * @private * @param {Array|string} path The path of the property to get. * @returns {Function} Returns the new function. */ function basePropertyDeep(path) { var pathKey = (path + ''); path = toPath(path); return function(object) { return baseGet(object, path, pathKey); }; } /** * The base implementation of `_.slice` without an iteratee call guard. * * @private * @param {Array} array The array to slice. * @param {number} [start=0] The start position. * @param {number} [end=array.length] The end position. * @returns {Array} Returns the slice of `array`. */ function baseSlice(array, start, end) { var index = -1, length = array.length; start = start == null ? 0 : (+start || 0); if (start < 0) { start = -start > length ? 0 : (length + start); } end = (end === undefined || end > length) ? length : (+end || 0); if (end < 0) { end += length; } length = start > end ? 0 : ((end - start) >>> 0); start >>>= 0; var result = Array(length); while (++index < length) { result[index] = array[index + start]; } return result; } /** * Gets the property names, values, and compare flags of `object`. * * @private * @param {Object} object The object to query. * @returns {Array} Returns the match data of `object`. */ function getMatchData(object) { var result = pairs(object), length = result.length; while (length--) { result[length][2] = isStrictComparable(result[length][1]); } return result; } /** * Checks if `value` is a property name and not a property path. * * @private * @param {*} value The value to check. * @param {Object} [object] The object to query keys on. * @returns {boolean} Returns `true` if `value` is a property name, else `false`. */ function isKey(value, object) { var type = typeof value; if ((type == 'string' && reIsPlainProp.test(value)) || type == 'number') { return true; } if (isArray(value)) { return false; } var result = !reIsDeepProp.test(value); return result || (object != null && value in toObject(object)); } /** * Checks if `value` is suitable for strict equality comparisons, i.e. `===`. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is suitable for strict * equality comparisons, else `false`. */ function isStrictComparable(value) { return value === value && !isObject(value); } /** * Converts `value` to an object if it's not one. * * @private * @param {*} value The value to process. * @returns {Object} Returns the object. */ function toObject(value) { return isObject(value) ?
value : Object(value); } /** * Converts `value` to property path array if it's not one. * * @private * @param {*} value The value to process. * @returns {Array} Returns the property path array. */ function toPath(value) { if (isArray(value)) { return value; } var result = []; baseToString(value).replace(rePropName, function(match, number, quote, string) { result.push(quote ? string.replace(reEscapeChar, '$1') : (number || match)); }); return result; } /** * Gets the last element of `array`. * * @static * @memberOf _ * @category Array * @param {Array} array The array to query. * @returns {*} Returns the last element of `array`. * @example * * _.last([1, 2, 3]); * // => 3 */ function last(array) { var length = array ? array.length : 0; return length ? array[length - 1] : undefined; } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. * @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details. var type = typeof value; return !!value && (type == 'object' || type == 'function'); } /** * This method returns the first argument provided to it. * * @static * @memberOf _ * @category Utility * @param {*} value Any value. * @returns {*} Returns `value`. * @example * * var object = { 'user': 'fred' }; * * _.identity(object) === object; * // => true */ function identity(value) { return value; } /** * Creates a function that returns the property value at `path` on a * given object. * * @static * @memberOf _ * @category Utility * @param {Array|string} path The path of the property to get. * @returns {Function} Returns the new function. * @example * * var objects = [ * { 'a': { 'b': { 'c': 2 } } }, * { 'a': { 'b': { 'c': 1 } } } * ]; * * _.map(objects, _.property('a.b.c')); * // => [2, 1] * * _.pluck(_.sortBy(objects, _.property(['a', 'b', 'c'])), 'a.b.c'); * // => [1, 2] */ function property(path) { return isKey(path) ? 
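// (Editor's note: a bare key such as 'a' gets the cheap single-lookup
// accessor, while anything that parses as a deep path, e.g. 'a.b', 'a[0]',
// or an array of keys, takes the basePropertyDeep variant.)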
baseProperty(path) : basePropertyDeep(path); } module.exports = baseCallback; npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/0000755000000000000000000000000012631326456027525 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/package.json0000644000000000000000000000442712631326456027345 0ustar 00000000000000{ "name": "lodash._basecallback", "version": "3.3.1", "description": "The modern build of lodash’s internal `baseCallback` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._baseisequal": "^3.0.0", "lodash._bindcallback": "^3.0.0", "lodash.isarray": "^3.0.0", "lodash.pairs": "^3.0.0" }, "readme": "# lodash._basecallback v3.3.1\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseCallback` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._basecallback\n```\n\nIn Node.js/io.js:\n\n```js\nvar baseCallback = require('lodash._basecallback');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.3.1-npm-packages/lodash._basecallback) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basecallback@3.3.1", "_shasum": "b7b2bb43dc2160424a21ccf26c57e443772a8e27", "_resolved": "https://registry.npmjs.org/lodash._basecallback/-/lodash._basecallback-3.3.1.tgz", "_from": "lodash._basecallback@>=3.0.0 <4.0.0" } ././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000755000000000000000000000000012631326456032141 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash.pairs/0000755000000000000000000000000012631326456032114 5ustar 00000000000000././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/LICENSE.txtnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000644000000000000000000000232112631326456032141 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors 
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/README.mdnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000644000000000000000000000104512631326456032143 0ustar 00000000000000# lodash._baseisequal v3.0.7 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseIsEqual` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._baseisequal ``` In Node.js/io.js: ```js var baseIsEqual = require('lodash._baseisequal'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.7-npm-packages/lodash._baseisequal) for more details. ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/index.jsnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000644000000000000000000002630112631326456032145 0ustar 00000000000000/** * lodash 3.0.7 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var isArray = require('lodash.isarray'), isTypedArray = require('lodash.istypedarray'), keys = require('lodash.keys'); /** `Object#toString` result references. */ var argsTag = '[object Arguments]', arrayTag = '[object Array]', boolTag = '[object Boolean]', dateTag = '[object Date]', errorTag = '[object Error]', numberTag = '[object Number]', objectTag = '[object Object]', regexpTag = '[object RegExp]', stringTag = '[object String]'; /** * Checks if `value` is object-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is object-like, else `false`. */ function isObjectLike(value) { return !!value && typeof value == 'object'; } /** Used for native method references. */ var objectProto = Object.prototype; /** Used to check objects for own properties. 
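 * (Editor's note: cached from Object.prototype so the lookup keeps working
 * for objects that shadow or lack their own `hasOwnProperty`.)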
*/ var hasOwnProperty = objectProto.hasOwnProperty; /** * Used to resolve the [`toStringTag`](https://people.mozilla.org/~jorendorff/es6-draft.html#sec-object.prototype.tostring) * of values. */ var objToString = objectProto.toString; /** * A specialized version of `_.some` for arrays without support for callback * shorthands and `this` binding. * * @private * @param {Array} array The array to iterate over. * @param {Function} predicate The function invoked per iteration. * @returns {boolean} Returns `true` if any element passes the predicate check, * else `false`. */ function arraySome(array, predicate) { var index = -1, length = array.length; while (++index < length) { if (predicate(array[index], index, array)) { return true; } } return false; } /** * The base implementation of `_.isEqual` without support for `this` binding * `customizer` functions. * * @private * @param {*} value The value to compare. * @param {*} other The other value to compare. * @param {Function} [customizer] The function to customize comparing values. * @param {boolean} [isLoose] Specify performing partial comparisons. * @param {Array} [stackA] Tracks traversed `value` objects. * @param {Array} [stackB] Tracks traversed `other` objects. * @returns {boolean} Returns `true` if the values are equivalent, else `false`. */ function baseIsEqual(value, other, customizer, isLoose, stackA, stackB) { if (value === other) { return true; } if (value == null || other == null || (!isObject(value) && !isObjectLike(other))) { return value !== value && other !== other; } return baseIsEqualDeep(value, other, baseIsEqual, customizer, isLoose, stackA, stackB); } /** * A specialized version of `baseIsEqual` for arrays and objects which performs * deep comparisons and tracks traversed objects enabling objects with circular * references to be compared. * * @private * @param {Object} object The object to compare. * @param {Object} other The other object to compare. * @param {Function} equalFunc The function to determine equivalents of values. * @param {Function} [customizer] The function to customize comparing objects. * @param {boolean} [isLoose] Specify performing partial comparisons. * @param {Array} [stackA=[]] Tracks traversed `value` objects. * @param {Array} [stackB=[]] Tracks traversed `other` objects. * @returns {boolean} Returns `true` if the objects are equivalent, else `false`. */ function baseIsEqualDeep(object, other, equalFunc, customizer, isLoose, stackA, stackB) { var objIsArr = isArray(object), othIsArr = isArray(other), objTag = arrayTag, othTag = arrayTag; if (!objIsArr) { objTag = objToString.call(object); if (objTag == argsTag) { objTag = objectTag; } else if (objTag != objectTag) { objIsArr = isTypedArray(object); } } if (!othIsArr) { othTag = objToString.call(other); if (othTag == argsTag) { othTag = objectTag; } else if (othTag != objectTag) { othIsArr = isTypedArray(other); } } var objIsObj = objTag == objectTag, othIsObj = othTag == objectTag, isSameTag = objTag == othTag; if (isSameTag && !(objIsArr || objIsObj)) { return equalByTag(object, other, objTag); } if (!isLoose) { var objIsWrapped = objIsObj && hasOwnProperty.call(object, '__wrapped__'), othIsWrapped = othIsObj && hasOwnProperty.call(other, '__wrapped__'); if (objIsWrapped || othIsWrapped) { return equalFunc(objIsWrapped ? object.value() : object, othIsWrapped ? other.value() : other, customizer, isLoose, stackA, stackB); } } if (!isSameTag) { return false; } // Assume cyclic values are equal. 
// For more information on detecting circular references see https://es5.github.io/#JO. stackA || (stackA = []); stackB || (stackB = []); var length = stackA.length; while (length--) { if (stackA[length] == object) { return stackB[length] == other; } } // Add `object` and `other` to the stack of traversed objects. stackA.push(object); stackB.push(other); var result = (objIsArr ? equalArrays : equalObjects)(object, other, equalFunc, customizer, isLoose, stackA, stackB); stackA.pop(); stackB.pop(); return result; } /** * A specialized version of `baseIsEqualDeep` for arrays with support for * partial deep comparisons. * * @private * @param {Array} array The array to compare. * @param {Array} other The other array to compare. * @param {Function} equalFunc The function to determine equivalents of values. * @param {Function} [customizer] The function to customize comparing arrays. * @param {boolean} [isLoose] Specify performing partial comparisons. * @param {Array} [stackA] Tracks traversed `value` objects. * @param {Array} [stackB] Tracks traversed `other` objects. * @returns {boolean} Returns `true` if the arrays are equivalent, else `false`. */ function equalArrays(array, other, equalFunc, customizer, isLoose, stackA, stackB) { var index = -1, arrLength = array.length, othLength = other.length; if (arrLength != othLength && !(isLoose && othLength > arrLength)) { return false; } // Ignore non-index properties. while (++index < arrLength) { var arrValue = array[index], othValue = other[index], result = customizer ? customizer(isLoose ? othValue : arrValue, isLoose ? arrValue : othValue, index) : undefined; if (result !== undefined) { if (result) { continue; } return false; } // Recursively compare arrays (susceptible to call stack limits). if (isLoose) { if (!arraySome(other, function(othValue) { return arrValue === othValue || equalFunc(arrValue, othValue, customizer, isLoose, stackA, stackB); })) { return false; } } else if (!(arrValue === othValue || equalFunc(arrValue, othValue, customizer, isLoose, stackA, stackB))) { return false; } } return true; } /** * A specialized version of `baseIsEqualDeep` for comparing objects of * the same `toStringTag`. * * **Note:** This function only supports comparing values with tags of * `Boolean`, `Date`, `Error`, `Number`, `RegExp`, or `String`. * * @private * @param {Object} object The object to compare. * @param {Object} other The other object to compare. * @param {string} tag The `toStringTag` of the objects to compare. * @returns {boolean} Returns `true` if the objects are equivalent, else `false`. */ function equalByTag(object, other, tag) { switch (tag) { case boolTag: case dateTag: // Coerce dates and booleans to numbers, dates to milliseconds and booleans // to `1` or `0`, treating invalid dates coerced to `NaN` as not equal. return +object == +other; case errorTag: return object.name == other.name && object.message == other.message; case numberTag: // Treat `NaN` vs. `NaN` as equal. return (object != +object) ? other != +other : object == +other; case regexpTag: case stringTag: // Coerce regexes to strings and treat string primitives and string // objects as equal. See https://es5.github.io/#x15.10.6.4 for more details. return object == (other + ''); } return false; } /** * A specialized version of `baseIsEqualDeep` for objects with support for * partial deep comparisons. * * @private * @param {Object} object The object to compare. * @param {Object} other The other object to compare. * @param {Function} equalFunc The function to determine equivalents of values. * @param {Function} [customizer] The function to customize comparing values. * @param {boolean} [isLoose] Specify performing partial comparisons. * @param {Array} [stackA] Tracks traversed `value` objects. * @param {Array} [stackB] Tracks traversed `other` objects. * @returns {boolean} Returns `true` if the objects are equivalent, else `false`. */ function equalObjects(object, other, equalFunc, customizer, isLoose, stackA, stackB) { var objProps = keys(object), objLength = objProps.length, othProps = keys(other), othLength = othProps.length; if (objLength != othLength && !isLoose) { return false; } var index = objLength; while (index--) { var key = objProps[index]; if (!(isLoose ? key in other : hasOwnProperty.call(other, key))) { return false; } } var skipCtor = isLoose; while (++index < objLength) { key = objProps[index]; var objValue = object[key], othValue = other[key], result = customizer ? customizer(isLoose ? othValue : objValue, isLoose ? objValue : othValue, key) : undefined; // Recursively compare objects (susceptible to call stack limits). if (!(result === undefined ? equalFunc(objValue, othValue, customizer, isLoose, stackA, stackB) : result)) { return false; } skipCtor || (skipCtor = key == 'constructor'); } if (!skipCtor) { var objCtor = object.constructor, othCtor = other.constructor; // Non `Object` object instances with different constructors are not equal. if (objCtor != othCtor && ('constructor' in object && 'constructor' in other) && !(typeof objCtor == 'function' && objCtor instanceof objCtor && typeof othCtor == 'function' && othCtor instanceof othCtor)) { return false; } } return true; } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. * @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details.
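// Note that `typeof null` is 'object', so the `!!value` check below is what rules out `null`.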
var type = typeof value; return !!value && (type == 'object' || type == 'function'); } module.exports = baseIsEqual; ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/node_modules/npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000755000000000000000000000000012631326456032141 5ustar 00000000000000././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/package.jsonnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000644000000000000000000000434412631326456032150 0ustar 00000000000000{ "name": "lodash._baseisequal", "version": "3.0.7", "description": "The modern build of lodash’s internal `baseIsEqual` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.isarray": "^3.0.0", "lodash.istypedarray": "^3.0.0", "lodash.keys": "^3.0.0" }, "readme": "# lodash._baseisequal v3.0.7\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseIsEqual` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._baseisequal\n```\n\nIn Node.js/io.js:\n\n```js\nvar baseIsEqual = require('lodash._baseisequal');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.0.7-npm-packages/lodash._baseisequal) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._baseisequal@3.0.7", "_shasum": "d8025f76339d29342767dcc887ce5cb95a5b51f1", "_resolved": "https://registry.npmjs.org/lodash._baseisequal/-/lodash._baseisequal-3.0.7.tgz", "_from": "lodash._baseisequal@>=3.0.0 <4.0.0" } ././@LongLink0000000000000000000000000000021500000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/node_modules/lodash.istypedarray/npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000755000000000000000000000000012631326456032141 5ustar 00000000000000././@LongLink0000000000000000000000000000023000000000000011210 Lustar 
00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/node_modules/lodash.istypedarray/LICENSE.txtnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000644000000000000000000000232112631326456032141 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/node_modules/lodash.istypedarray/README.mdnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000644000000000000000000000113112631326456032137 0ustar 00000000000000# lodash.istypedarray v3.0.2 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.isTypedArray` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.istypedarray ``` In Node.js/io.js: ```js var isTypedArray = require('lodash.istypedarray'); ``` See the [documentation](https://lodash.com/docs#isTypedArray) or [package source](https://github.com/lodash/lodash/blob/3.0.2-npm-packages/lodash.istypedarray) for more details. ././@LongLink0000000000000000000000000000022500000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/node_modules/lodash.istypedarray/index.jsnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000644000000000000000000000717412631326456032154 0ustar 00000000000000/** * lodash 3.0.2 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** `Object#toString` result references. 
*/ var argsTag = '[object Arguments]', arrayTag = '[object Array]', boolTag = '[object Boolean]', dateTag = '[object Date]', errorTag = '[object Error]', funcTag = '[object Function]', mapTag = '[object Map]', numberTag = '[object Number]', objectTag = '[object Object]', regexpTag = '[object RegExp]', setTag = '[object Set]', stringTag = '[object String]', weakMapTag = '[object WeakMap]'; var arrayBufferTag = '[object ArrayBuffer]', float32Tag = '[object Float32Array]', float64Tag = '[object Float64Array]', int8Tag = '[object Int8Array]', int16Tag = '[object Int16Array]', int32Tag = '[object Int32Array]', uint8Tag = '[object Uint8Array]', uint8ClampedTag = '[object Uint8ClampedArray]', uint16Tag = '[object Uint16Array]', uint32Tag = '[object Uint32Array]'; /** Used to identify `toStringTag` values of typed arrays. */ var typedArrayTags = {}; typedArrayTags[float32Tag] = typedArrayTags[float64Tag] = typedArrayTags[int8Tag] = typedArrayTags[int16Tag] = typedArrayTags[int32Tag] = typedArrayTags[uint8Tag] = typedArrayTags[uint8ClampedTag] = typedArrayTags[uint16Tag] = typedArrayTags[uint32Tag] = true; typedArrayTags[argsTag] = typedArrayTags[arrayTag] = typedArrayTags[arrayBufferTag] = typedArrayTags[boolTag] = typedArrayTags[dateTag] = typedArrayTags[errorTag] = typedArrayTags[funcTag] = typedArrayTags[mapTag] = typedArrayTags[numberTag] = typedArrayTags[objectTag] = typedArrayTags[regexpTag] = typedArrayTags[setTag] = typedArrayTags[stringTag] = typedArrayTags[weakMapTag] = false; /** * Checks if `value` is object-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is object-like, else `false`. */ function isObjectLike(value) { return !!value && typeof value == 'object'; } /** Used for native method references. */ var objectProto = Object.prototype; /** * Used to resolve the [`toStringTag`](https://people.mozilla.org/~jorendorff/es6-draft.html#sec-object.prototype.tostring) * of values. */ var objToString = objectProto.toString; /** * Used as the [maximum length](https://people.mozilla.org/~jorendorff/es6-draft.html#sec-number.max_safe_integer) * of an array-like value. */ var MAX_SAFE_INTEGER = 9007199254740991; /** * Checks if `value` is a valid array-like length. * * **Note:** This function is based on [`ToLength`](https://people.mozilla.org/~jorendorff/es6-draft.html#sec-tolength). * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is a valid length, else `false`. */ function isLength(value) { return typeof value == 'number' && value > -1 && value % 1 == 0 && value <= MAX_SAFE_INTEGER; } /** * Checks if `value` is classified as a typed array. * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is correctly classified, else `false`. 
* @example * * _.isTypedArray(new Uint8Array); * // => true * * _.isTypedArray([]); * // => false */ function isTypedArray(value) { return isObjectLike(value) && isLength(value.length) && !!typedArrayTags[objToString.call(value)]; } module.exports = isTypedArray; ././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._baseisequal/node_modules/lodash.istypedarray/package.jsonnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash._basei0000644000000000000000000000435612631326456032153 0ustar 00000000000000{ "name": "lodash.istypedarray", "version": "3.0.2", "description": "The modern build of lodash’s `_.isTypedArray` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash.istypedarray v3.0.2\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.isTypedArray` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash.istypedarray\n```\n\nIn Node.js/io.js:\n\n```js\nvar isTypedArray = require('lodash.istypedarray');\n```\n\nSee the [documentation](https://lodash.com/docs#isTypedArray) or [package source](https://github.com/lodash/lodash/blob/3.0.2-npm-packages/lodash.istypedarray) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.istypedarray@3.0.2", "_shasum": "9397b113c15f424f320af06caa59cc495e2093ce", "_resolved": "https://registry.npmjs.org/lodash.istypedarray/-/lodash.istypedarray-3.0.2.tgz", "_from": "lodash.istypedarray@>=3.0.0 <4.0.0" } ././@LongLink0000000000000000000000000000016000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash.pairs/LICENSE.txtnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash.pairs/0000644000000000000000000000232112631326456032114 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or 
sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000015600000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash.pairs/README.mdnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash.pairs/0000644000000000000000000000105012631326456032112 0ustar 00000000000000# lodash.pairs v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.pairs` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.pairs ``` In Node.js/io.js: ```js var pairs = require('lodash.pairs'); ``` See the [documentation](https://lodash.com/docs#pairs) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.pairs) for more details. ././@LongLink0000000000000000000000000000015500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash.pairs/index.jsnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash.pairs/0000644000000000000000000000407212631326456032121 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var keys = require('lodash.keys'); /** * Converts `value` to an object if it's not one. * * @private * @param {*} value The value to process. * @returns {Object} Returns the object. */ function toObject(value) { return isObject(value) ? value : Object(value); } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. * @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details. var type = typeof value; return !!value && (type == 'object' || type == 'function'); } /** * Creates a two dimensional array of the key-value pairs for `object`, * e.g. `[[key1, value1], [key2, value2]]`. * * @static * @memberOf _ * @category Object * @param {Object} object The object to query. * @returns {Array} Returns the new array of key-value pairs. 
* @example * * _.pairs({ 'barney': 36, 'fred': 40 }); * // => [['barney', 36], ['fred', 40]] (iteration order is not guaranteed) */ function pairs(object) { object = toObject(object); var index = -1, props = keys(object), length = props.length, result = Array(length); while (++index < length) { var key = props[index]; result[index] = [key, object[key]]; } return result; } module.exports = pairs; ././@LongLink0000000000000000000000000000016100000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash.pairs/package.jsonnpm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._basecallback/node_modules/lodash.pairs/0000644000000000000000000000431012631326456032114 0ustar 00000000000000{ "name": "lodash.pairs", "version": "3.0.1", "description": "The modern build of lodash’s `_.pairs` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.keys": "^3.0.0" }, "readme": "# lodash.pairs v3.0.1\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.pairs` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash.pairs\n```\n\nIn Node.js/io.js:\n\n```js\nvar pairs = require('lodash.pairs');\n```\n\nSee the [documentation](https://lodash.com/docs#pairs) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.pairs) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.pairs@3.0.1", "_shasum": "bbe08d5786eeeaa09a15c91ebf0dcb7d2be326a9", "_resolved": "https://registry.npmjs.org/lodash.pairs/-/lodash.pairs-3.0.1.tgz", "_from": "lodash.pairs@>=3.0.0 <4.0.0" } npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._isiterateecall/LICENSE.txt0000644000000000000000000000232112631326456027274 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and 
this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._isiterateecall/README.md0000644000000000000000000000106712631326456026736 0ustar 00000000000000# lodash._isiterateecall v3.0.9 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `isIterateeCall` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._isiterateecall ``` In Node.js/io.js: ```js var isIterateeCall = require('lodash._isiterateecall'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.9-npm-packages/lodash._isiterateecall) for more details. npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._isiterateecall/index.js0000644000000000000000000000772212631326456027130 0ustar 00000000000000/** * lodash 3.0.9 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** Used to detect unsigned integer values. */ var reIsUint = /^\d+$/; /** * Used as the [maximum length](https://people.mozilla.org/~jorendorff/es6-draft.html#sec-number.max_safe_integer) * of an array-like value. */ var MAX_SAFE_INTEGER = 9007199254740991; /** * The base implementation of `_.property` without support for deep paths. * * @private * @param {string} key The key of the property to get. * @returns {Function} Returns the new function. */ function baseProperty(key) { return function(object) { return object == null ? undefined : object[key]; }; } /** * Gets the "length" property value of `object`. * * **Note:** This function is used to avoid a [JIT bug](https://bugs.webkit.org/show_bug.cgi?id=142792) * that affects Safari on at least iOS 8.1-8.3 ARM64. * * @private * @param {Object} object The object to query. * @returns {*} Returns the "length" value. */ var getLength = baseProperty('length'); /** * Checks if `value` is array-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is array-like, else `false`. */ function isArrayLike(value) { return value != null && isLength(getLength(value)); } /** * Checks if `value` is a valid array-like index. * * @private * @param {*} value The value to check. * @param {number} [length=MAX_SAFE_INTEGER] The upper bounds of a valid index. * @returns {boolean} Returns `true` if `value` is a valid index, else `false`. */ function isIndex(value, length) { value = (typeof value == 'number' || reIsUint.test(value)) ? +value : -1; length = length == null ? MAX_SAFE_INTEGER : length; return value > -1 && value % 1 == 0 && value < length; } /** * Checks if the provided arguments are from an iteratee call. * * @private * @param {*} value The potential iteratee value argument. 
* @param {*} index The potential iteratee index or key argument. * @param {*} object The potential iteratee object argument. * @returns {boolean} Returns `true` if the arguments are from an iteratee call, else `false`. */ function isIterateeCall(value, index, object) { if (!isObject(object)) { return false; } var type = typeof index; if (type == 'number' ? (isArrayLike(object) && isIndex(index, object.length)) : (type == 'string' && index in object)) { var other = object[index]; return value === value ? (value === other) : (other !== other); } return false; } /** * Checks if `value` is a valid array-like length. * * **Note:** This function is based on [`ToLength`](https://people.mozilla.org/~jorendorff/es6-draft.html#sec-tolength). * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is a valid length, else `false`. */ function isLength(value) { return typeof value == 'number' && value > -1 && value % 1 == 0 && value <= MAX_SAFE_INTEGER; } /** * Checks if `value` is the [language type](https://es5.github.io/#x8) of `Object`. * (e.g. arrays, functions, objects, regexes, `new Number(0)`, and `new String('')`) * * @static * @memberOf _ * @category Lang * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is an object, else `false`. * @example * * _.isObject({}); * // => true * * _.isObject([1, 2, 3]); * // => true * * _.isObject(1); * // => false */ function isObject(value) { // Avoid a V8 JIT bug in Chrome 19-20. // See https://code.google.com/p/v8/issues/detail?id=2291 for more details. var type = typeof value; return !!value && (type == 'object' || type == 'function'); } module.exports = isIterateeCall; npm_3.5.2.orig/node_modules/lodash.uniq/node_modules/lodash._isiterateecall/package.json0000644000000000000000000000421612631326456027744 0ustar 00000000000000{ "name": "lodash._isiterateecall", "version": "3.0.9", "description": "The modern build of lodash’s internal `isIterateeCall` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "readme": "# lodash._isiterateecall v3.0.9\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `isIterateeCall` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._isiterateecall\n```\n\nIn Node.js/io.js:\n\n```js\nvar isIterateeCall = require('lodash._isiterateecall');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.0.9-npm-packages/lodash._isiterateecall) for more details.\n", "readmeFilename": "README.md", "bugs": { 
"url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._isiterateecall@3.0.9", "_shasum": "5203ad7ba425fae842460e696db9cf3e6aac057c", "_resolved": "https://registry.npmjs.org/lodash._isiterateecall/-/lodash._isiterateecall-3.0.9.tgz", "_from": "lodash._isiterateecall@>=3.0.0 <4.0.0" } npm_3.5.2.orig/node_modules/lodash.without/LICENSE.txt0000644000000000000000000000232112631326456020744 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.without/README.md0000644000000000000000000000106612631326456020405 0ustar 00000000000000# lodash.without v3.2.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.without` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.without ``` In Node.js/io.js: ```js var without = require('lodash.without'); ``` See the [documentation](https://lodash.com/docs#without) or [package source](https://github.com/lodash/lodash/blob/3.2.1-npm-packages/lodash.without) for more details. npm_3.5.2.orig/node_modules/lodash.without/index.js0000644000000000000000000000521712631326456020575 0ustar 00000000000000/** * lodash 3.2.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseDifference = require('lodash._basedifference'), restParam = require('lodash.restparam'); /** * Used as the [maximum length](https://people.mozilla.org/~jorendorff/es6-draft.html#sec-number.max_safe_integer) * of an array-like value. */ var MAX_SAFE_INTEGER = 9007199254740991; /** * The base implementation of `_.property` without support for deep paths. * * @private * @param {string} key The key of the property to get. * @returns {Function} Returns the new function. */ function baseProperty(key) { return function(object) { return object == null ? undefined : object[key]; }; } /** * Gets the "length" property value of `object`. * * **Note:** This function is used to avoid a [JIT bug](https://bugs.webkit.org/show_bug.cgi?id=142792) * that affects Safari on at least iOS 8.1-8.3 ARM64. * * @private * @param {Object} object The object to query. 
* @returns {*} Returns the "length" value. */ var getLength = baseProperty('length'); /** * Checks if `value` is array-like. * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is array-like, else `false`. */ function isArrayLike(value) { return value != null && isLength(getLength(value)); } /** * Checks if `value` is a valid array-like length. * * **Note:** This function is based on [`ToLength`](https://people.mozilla.org/~jorendorff/es6-draft.html#sec-tolength). * * @private * @param {*} value The value to check. * @returns {boolean} Returns `true` if `value` is a valid length, else `false`. */ function isLength(value) { return typeof value == 'number' && value > -1 && value % 1 == 0 && value <= MAX_SAFE_INTEGER; } /** * Creates an array excluding all provided values using * [`SameValueZero`](https://people.mozilla.org/~jorendorff/es6-draft.html#sec-samevaluezero) * for equality comparisons. * * @static * @memberOf _ * @category Array * @param {Array} array The array to filter. * @param {...*} [values] The values to exclude. * @returns {Array} Returns the new array of filtered values. * @example * * _.without([1, 2, 1, 3], 1, 2); * // => [3] */ var without = restParam(function(array, values) { return isArrayLike(array) ? baseDifference(array, values) : []; }); module.exports = without; npm_3.5.2.orig/node_modules/lodash.without/node_modules/0000755000000000000000000000000012631326456021600 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.without/package.json0000644000000000000000000000455312631326456021420 0ustar 00000000000000{ "name": "lodash.without", "version": "3.2.1", "description": "The modern build of lodash’s `_.without` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basedifference": "^3.0.0", "lodash.restparam": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.without@3.2.1", "_shasum": "d69614b3512e52294b6abab782e7ca96538ce816", "_from": "lodash.without@>=3.2.1 <3.3.0", "_npmVersion": "2.10.0", "_nodeVersion": "0.12.3", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" }, { "name": "d10", "email": "demoneaux@gmail.com" } ], "dist": { "shasum": "d69614b3512e52294b6abab782e7ca96538ce816", "tarball": "http://registry.npmjs.org/lodash.without/-/lodash.without-3.2.1.tgz" }, 
"directories": {}, "_resolved": "https://registry.npmjs.org/lodash.without/-/lodash.without-3.2.1.tgz" } npm_3.5.2.orig/node_modules/lodash.without/node_modules/lodash._basedifference/0000755000000000000000000000000012631326456026135 5ustar 00000000000000npm_3.5.2.orig/node_modules/lodash.without/node_modules/lodash._basedifference/LICENSE0000644000000000000000000000232112631326456027140 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/lodash.without/node_modules/lodash._basedifference/README.md0000644000000000000000000000106712631326456027420 0ustar 00000000000000# lodash._basedifference v3.0.3 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseDifference` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basedifference ``` In Node.js/io.js: ```js var baseDifference = require('lodash._basedifference'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.3-npm-packages/lodash._basedifference) for more details. npm_3.5.2.orig/node_modules/lodash.without/node_modules/lodash._basedifference/index.js0000644000000000000000000000337712631326456027614 0ustar 00000000000000/** * lodash 3.0.3 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseIndexOf = require('lodash._baseindexof'), cacheIndexOf = require('lodash._cacheindexof'), createCache = require('lodash._createcache'); /** Used as the size to enable large array optimizations. */ var LARGE_ARRAY_SIZE = 200; /** * The base implementation of `_.difference` which accepts a single array * of values to exclude. * * @private * @param {Array} array The array to inspect. * @param {Array} values The values to exclude. * @returns {Array} Returns the new array of filtered values. */ function baseDifference(array, values) { var length = array ? array.length : 0, result = []; if (!length) { return result; } var index = -1, indexOf = baseIndexOf, isCommon = true, cache = (isCommon && values.length >= LARGE_ARRAY_SIZE) ? 
createCache(values) : null, valuesLength = values.length; if (cache) { indexOf = cacheIndexOf; isCommon = false; values = cache; } outer: while (++index < length) { var value = array[index]; if (isCommon && value === value) { var valuesIndex = valuesLength; while (valuesIndex--) { if (values[valuesIndex] === value) { continue outer; } } result.push(value); } else if (indexOf(values, value, 0) < 0) { result.push(value); } } return result; } module.exports = baseDifference; npm_3.5.2.orig/node_modules/lodash.without/node_modules/lodash._basedifference/package.json0000644000000000000000000000442612631326456030431 0ustar 00000000000000{ "name": "lodash._basedifference", "version": "3.0.3", "description": "The modern build of lodash’s internal `baseDifference` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._baseindexof": "^3.0.0", "lodash._cacheindexof": "^3.0.0", "lodash._createcache": "^3.0.0" }, "readme": "# lodash._basedifference v3.0.3\n\nThe [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseDifference` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module.\n\n## Installation\n\nUsing npm:\n\n```bash\n$ {sudo -H} npm i -g npm\n$ npm i --save lodash._basedifference\n```\n\nIn Node.js/io.js:\n\n```js\nvar baseDifference = require('lodash._basedifference');\n```\n\nSee the [package source](https://github.com/lodash/lodash/blob/3.0.3-npm-packages/lodash._basedifference) for more details.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basedifference@3.0.3", "_shasum": "f2c204296c2a78e02b389081b6edcac933cf629c", "_resolved": "https://registry.npmjs.org/lodash._basedifference/-/lodash._basedifference-3.0.3.tgz", "_from": "lodash._basedifference@>=3.0.0 <4.0.0" } npm_3.5.2.orig/node_modules/mkdirp/.travis.yml0000644000000000000000000000016412631326456017547 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10" - "0.12" - "iojs" before_install: - npm install -g npm@~1.4.6 npm_3.5.2.orig/node_modules/mkdirp/LICENSE0000644000000000000000000000216512631326456016446 0ustar 00000000000000Copyright 2010 James Halliday (mail@substack.net) This project is free software released under the MIT/X11 license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom 
the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/mkdirp/README.markdown0000644000000000000000000000405712631326456020144 0ustar 00000000000000# mkdirp Like `mkdir -p`, but in node.js! [![build status](https://secure.travis-ci.org/substack/node-mkdirp.png)](http://travis-ci.org/substack/node-mkdirp) # example ## pow.js ```js var mkdirp = require('mkdirp'); mkdirp('/tmp/foo/bar/baz', function (err) { if (err) console.error(err) else console.log('pow!') }); ``` Output ``` pow! ``` And now /tmp/foo/bar/baz exists, huzzah! # methods ```js var mkdirp = require('mkdirp'); ``` ## mkdirp(dir, opts, cb) Create a new directory and any necessary subdirectories at `dir` with octal permission string `opts.mode`. If `opts` is a non-object, it will be treated as the `opts.mode`. If `opts.mode` isn't specified, it defaults to `0777 & (~process.umask())`. `cb(err, made)` fires with the error or the first directory `made` that had to be created, if any. You can optionally pass in an alternate `fs` implementation by passing in `opts.fs`. Your implementation should have `opts.fs.mkdir(path, mode, cb)` and `opts.fs.stat(path, cb)`. ## mkdirp.sync(dir, opts) Synchronously create a new directory and any necessary subdirectories at `dir` with octal permission string `opts.mode`. If `opts` is a non-object, it will be treated as the `opts.mode`. If `opts.mode` isn't specified, it defaults to `0777 & (~process.umask())`. Returns the first directory that had to be created, if any. You can optionally pass in an alternate `fs` implementation by passing in `opts.fs`. Your implementation should have `opts.fs.mkdirSync(path, mode)` and `opts.fs.statSync(path)`. # usage This package also ships with a `mkdirp` command. ``` usage: mkdirp [DIR1,DIR2..] {OPTIONS} Create each supplied directory including any necessary parent directories that don't yet exist. If the directory already exists, do nothing. OPTIONS are: -m, --mode If a directory needs to be created, set the mode as an octal permission string. ``` # install With [npm](http://npmjs.org) do: ``` npm install mkdirp ``` to get the library, or ``` npm install -g mkdirp ``` to get the command. 
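# sync example

A minimal sketch of the synchronous API described under `mkdirp.sync(dir, opts)` above; the path and mode here are illustrative only, not part of the library:

```js
var mkdirp = require('mkdirp');

// Passing a non-object as `opts` treats it as the mode.
// `made` is the first directory that had to be created,
// or null if the whole chain already existed.
var made = mkdirp.sync('/tmp/foo/bar/baz', parseInt('0755', 8));
console.log(made ? 'first created: ' + made : 'already existed');
```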
# license MIT npm_3.5.2.orig/node_modules/mkdirp/bin/0000755000000000000000000000000012631326456016205 5ustar 00000000000000npm_3.5.2.orig/node_modules/mkdirp/examples/0000755000000000000000000000000012631326456017253 5ustar 00000000000000npm_3.5.2.orig/node_modules/mkdirp/index.js0000644000000000000000000000510612631326456017104 0ustar 00000000000000var path = require('path'); var fs = require('fs'); var _0777 = parseInt('0777', 8); module.exports = mkdirP.mkdirp = mkdirP.mkdirP = mkdirP; function mkdirP (p, opts, f, made) { if (typeof opts === 'function') { f = opts; opts = {}; } else if (!opts || typeof opts !== 'object') { opts = { mode: opts }; } var mode = opts.mode; var xfs = opts.fs || fs; if (mode === undefined) { mode = _0777 & (~process.umask()); } if (!made) made = null; var cb = f || function () {}; p = path.resolve(p); xfs.mkdir(p, mode, function (er) { if (!er) { made = made || p; return cb(null, made); } switch (er.code) { case 'ENOENT': mkdirP(path.dirname(p), opts, function (er, made) { if (er) cb(er, made); else mkdirP(p, opts, cb, made); }); break; // In the case of any other error, just see if there's a dir // there already. If so, then hooray! If not, then something // is borked. default: xfs.stat(p, function (er2, stat) { // if the stat fails, then that's super weird. // let the original error be the failure reason. if (er2 || !stat.isDirectory()) cb(er, made) else cb(null, made); }); break; } }); } mkdirP.sync = function sync (p, opts, made) { if (!opts || typeof opts !== 'object') { opts = { mode: opts }; } var mode = opts.mode; var xfs = opts.fs || fs; if (mode === undefined) { mode = _0777 & (~process.umask()); } if (!made) made = null; p = path.resolve(p); try { xfs.mkdirSync(p, mode); made = made || p; } catch (err0) { switch (err0.code) { case 'ENOENT' : made = sync(path.dirname(p), opts, made); sync(p, opts, made); break; // In the case of any other error, just see if there's a dir // there already. If so, then hooray! If not, then something // is borked. 
default: var stat; try { stat = xfs.statSync(p); } catch (err1) { throw err0; } if (!stat.isDirectory()) throw err0; break; } } return made; }; npm_3.5.2.orig/node_modules/mkdirp/node_modules/0000755000000000000000000000000012631326456020112 5ustar 00000000000000npm_3.5.2.orig/node_modules/mkdirp/package.json0000644000000000000000000000620212631326456017723 0ustar 00000000000000{ "name": "mkdirp", "description": "Recursively mkdir, like `mkdir -p`", "version": "0.5.1", "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "main": "index.js", "keywords": [ "mkdir", "directory" ], "repository": { "type": "git", "url": "git+https://github.com/substack/node-mkdirp.git" }, "scripts": { "test": "tap test/*.js" }, "dependencies": { "minimist": "0.0.8" }, "devDependencies": { "tap": "1", "mock-fs": "2 >=2.7.0" }, "bin": { "mkdirp": "bin/cmd.js" }, "license": "MIT", "readme": "# mkdirp\n\nLike `mkdir -p`, but in node.js!\n\n[![build status](https://secure.travis-ci.org/substack/node-mkdirp.png)](http://travis-ci.org/substack/node-mkdirp)\n\n# example\n\n## pow.js\n\n```js\nvar mkdirp = require('mkdirp');\n \nmkdirp('/tmp/foo/bar/baz', function (err) {\n if (err) console.error(err)\n else console.log('pow!')\n});\n```\n\nOutput\n\n```\npow!\n```\n\nAnd now /tmp/foo/bar/baz exists, huzzah!\n\n# methods\n\n```js\nvar mkdirp = require('mkdirp');\n```\n\n## mkdirp(dir, opts, cb)\n\nCreate a new directory and any necessary subdirectories at `dir` with octal\npermission string `opts.mode`. If `opts` is a non-object, it will be treated as\nthe `opts.mode`.\n\nIf `opts.mode` isn't specified, it defaults to `0777 & (~process.umask())`.\n\n`cb(err, made)` fires with the error or the first directory `made`\nthat had to be created, if any.\n\nYou can optionally pass in an alternate `fs` implementation by passing in\n`opts.fs`. Your implementation should have `opts.fs.mkdir(path, mode, cb)` and\n`opts.fs.stat(path, cb)`.\n\n## mkdirp.sync(dir, opts)\n\nSynchronously create a new directory and any necessary subdirectories at `dir`\nwith octal permission string `opts.mode`. If `opts` is a non-object, it will be\ntreated as the `opts.mode`.\n\nIf `opts.mode` isn't specified, it defaults to `0777 & (~process.umask())`.\n\nReturns the first directory that had to be created, if any.\n\nYou can optionally pass in an alternate `fs` implementation by passing in\n`opts.fs`. Your implementation should have `opts.fs.mkdirSync(path, mode)` and\n`opts.fs.statSync(path)`.\n\n# usage\n\nThis package also ships with a `mkdirp` command.\n\n```\nusage: mkdirp [DIR1,DIR2..] 
{OPTIONS}\n\n Create each supplied directory including any necessary parent directories that\n don't yet exist.\n \n If the directory already exists, do nothing.\n\nOPTIONS are:\n\n -m, --mode If a directory needs to be created, set the mode as an octal\n permission string.\n\n```\n\n# install\n\nWith [npm](http://npmjs.org) do:\n\n```\nnpm install mkdirp\n```\n\nto get the library, or\n\n```\nnpm install -g mkdirp\n```\n\nto get the command.\n\n# license\n\nMIT\n", "readmeFilename": "readme.markdown", "bugs": { "url": "https://github.com/substack/node-mkdirp/issues" }, "homepage": "https://github.com/substack/node-mkdirp#readme", "_id": "mkdirp@0.5.1", "_shasum": "30057438eac6cf7f8c4767f38648d6697d75c903", "_resolved": "https://registry.npmjs.org/mkdirp/-/mkdirp-0.5.1.tgz", "_from": "mkdirp@>=0.5.1 <0.6.0" } npm_3.5.2.orig/node_modules/mkdirp/test/0000755000000000000000000000000012631326456016414 5ustar 00000000000000npm_3.5.2.orig/node_modules/mkdirp/bin/cmd.js0000755000000000000000000000133312631326456017311 0ustar 00000000000000#!/usr/bin/env node var mkdirp = require('../'); var minimist = require('minimist'); var fs = require('fs'); var argv = minimist(process.argv.slice(2), { alias: { m: 'mode', h: 'help' }, string: [ 'mode' ] }); if (argv.help) { fs.createReadStream(__dirname + '/usage.txt').pipe(process.stdout); return; } var paths = argv._.slice(); var mode = argv.mode ? parseInt(argv.mode, 8) : undefined; (function next () { if (paths.length === 0) return; var p = paths.shift(); if (mode === undefined) mkdirp(p, cb) else mkdirp(p, mode, cb) function cb (err) { if (err) { console.error(err.message); process.exit(1); } else next(); } })(); npm_3.5.2.orig/node_modules/mkdirp/bin/usage.txt0000644000000000000000000000047312631326456020056 0ustar 00000000000000usage: mkdirp [DIR1,DIR2..] {OPTIONS} Create each supplied directory including any necessary parent directories that don't yet exist. If the directory already exists, do nothing. OPTIONS are: -m, --mode If a directory needs to be created, set the mode as an octal permission string. npm_3.5.2.orig/node_modules/mkdirp/examples/pow.js0000644000000000000000000000021612631326456020415 0ustar 00000000000000var mkdirp = require('mkdirp'); mkdirp('/tmp/foo/bar/baz', function (err) { if (err) console.error(err) else console.log('pow!') }); npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/0000755000000000000000000000000012631326456021743 5ustar 00000000000000npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/.travis.yml0000644000000000000000000000006012631326456024050 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10" npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/LICENSE0000644000000000000000000000206112631326456022747 0ustar 00000000000000This software is released under the MIT license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/example/0000755000000000000000000000000012631326456023376 5ustar 00000000000000npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/index.js0000644000000000000000000001272512631326456023417 0ustar 00000000000000module.exports = function (args, opts) { if (!opts) opts = {}; var flags = { bools : {}, strings : {} }; [].concat(opts['boolean']).filter(Boolean).forEach(function (key) { flags.bools[key] = true; }); [].concat(opts.string).filter(Boolean).forEach(function (key) { flags.strings[key] = true; }); var aliases = {}; Object.keys(opts.alias || {}).forEach(function (key) { aliases[key] = [].concat(opts.alias[key]); aliases[key].forEach(function (x) { aliases[x] = [key].concat(aliases[key].filter(function (y) { return x !== y; })); }); }); var defaults = opts['default'] || {}; var argv = { _ : [] }; Object.keys(flags.bools).forEach(function (key) { setArg(key, defaults[key] === undefined ? false : defaults[key]); }); var notFlags = []; if (args.indexOf('--') !== -1) { notFlags = args.slice(args.indexOf('--')+1); args = args.slice(0, args.indexOf('--')); } function setArg (key, val) { var value = !flags.strings[key] && isNumber(val) ? Number(val) : val ; setKey(argv, key.split('.'), value); (aliases[key] || []).forEach(function (x) { setKey(argv, x.split('.'), value); }); } for (var i = 0; i < args.length; i++) { var arg = args[i]; if (/^--.+=/.test(arg)) { // Using [\s\S] instead of . because js doesn't support the // 'dotall' regex modifier. See: // http://stackoverflow.com/a/1068308/13216 var m = arg.match(/^--([^=]+)=([\s\S]*)$/); setArg(m[1], m[2]); } else if (/^--no-.+/.test(arg)) { var key = arg.match(/^--no-(.+)/)[1]; setArg(key, false); } else if (/^--.+/.test(arg)) { var key = arg.match(/^--(.+)/)[1]; var next = args[i + 1]; if (next !== undefined && !/^-/.test(next) && !flags.bools[key] && (aliases[key] ? !flags.bools[aliases[key]] : true)) { setArg(key, next); i++; } else if (/^(true|false)$/.test(next)) { setArg(key, next === 'true'); i++; } else { setArg(key, flags.strings[key] ? '' : true); } } else if (/^-[^-]+/.test(arg)) { var letters = arg.slice(1,-1).split(''); var broken = false; for (var j = 0; j < letters.length; j++) { var next = arg.slice(j+2); if (next === '-') { setArg(letters[j], next) continue; } if (/[A-Za-z]/.test(letters[j]) && /-?\d+(\.\d*)?(e-?\d+)?$/.test(next)) { setArg(letters[j], next); broken = true; break; } if (letters[j+1] && letters[j+1].match(/\W/)) { setArg(letters[j], arg.slice(j+2)); broken = true; break; } else { setArg(letters[j], flags.strings[letters[j]] ? '' : true); } } var key = arg.slice(-1)[0]; if (!broken && key !== '-') { if (args[i+1] && !/^(-|--)[^-]/.test(args[i+1]) && !flags.bools[key] && (aliases[key] ? !flags.bools[aliases[key]] : true)) { setArg(key, args[i+1]); i++; } else if (args[i+1] && /true|false/.test(args[i+1])) { setArg(key, args[i+1] === 'true'); i++; } else { setArg(key, flags.strings[key] ? '' : true); } } } else { argv._.push( flags.strings['_'] || !isNumber(arg) ? 
arg : Number(arg) ); } } Object.keys(defaults).forEach(function (key) { if (!hasKey(argv, key.split('.'))) { setKey(argv, key.split('.'), defaults[key]); (aliases[key] || []).forEach(function (x) { setKey(argv, x.split('.'), defaults[key]); }); } }); notFlags.forEach(function(key) { argv._.push(key); }); return argv; }; function hasKey (obj, keys) { var o = obj; keys.slice(0,-1).forEach(function (key) { o = (o[key] || {}); }); var key = keys[keys.length - 1]; return key in o; } function setKey (obj, keys, value) { var o = obj; keys.slice(0,-1).forEach(function (key) { if (o[key] === undefined) o[key] = {}; o = o[key]; }); var key = keys[keys.length - 1]; if (o[key] === undefined || typeof o[key] === 'boolean') { o[key] = value; } else if (Array.isArray(o[key])) { o[key].push(value); } else { o[key] = [ o[key], value ]; } } function isNumber (x) { if (typeof x === 'number') return true; if (/^0x[0-9a-f]+$/i.test(x)) return true; return /^[-+]?(?:\d+(?:\.\d*)?|\.\d+)(e[-+]?\d+)?$/.test(x); } function longest (xs) { return Math.max.apply(null, xs.map(function (x) { return x.length })); } npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/package.json0000644000000000000000000000544412631326456024240 0ustar 00000000000000{ "name": "minimist", "version": "0.0.8", "description": "parse argument options", "main": "index.js", "devDependencies": { "tape": "~1.0.4", "tap": "~0.4.0" }, "scripts": { "test": "tap test/*.js" }, "testling": { "files": "test/*.js", "browsers": [ "ie/6..latest", "ff/5", "firefox/latest", "chrome/10", "chrome/latest", "safari/5.1", "safari/latest", "opera/12" ] }, "repository": { "type": "git", "url": "git://github.com/substack/minimist.git" }, "homepage": "https://github.com/substack/minimist", "keywords": [ "argv", "getopt", "parser", "optimist" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "readme": "# minimist\n\nparse argument options\n\nThis module is the guts of optimist's argument parser without all the\nfanciful decoration.\n\n[![browser support](https://ci.testling.com/substack/minimist.png)](http://ci.testling.com/substack/minimist)\n\n[![build status](https://secure.travis-ci.org/substack/minimist.png)](http://travis-ci.org/substack/minimist)\n\n# example\n\n``` js\nvar argv = require('minimist')(process.argv.slice(2));\nconsole.dir(argv);\n```\n\n```\n$ node example/parse.js -a beep -b boop\n{ _: [], a: 'beep', b: 'boop' }\n```\n\n```\n$ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz\n{ _: [ 'foo', 'bar', 'baz' ],\n x: 3,\n y: 4,\n n: 5,\n a: true,\n b: true,\n c: true,\n beep: 'boop' }\n```\n\n# methods\n\n``` js\nvar parseArgs = require('minimist')\n```\n\n## var argv = parseArgs(args, opts={})\n\nReturn an argument object `argv` populated with the array arguments from `args`.\n\n`argv._` contains all the arguments that didn't have an option associated with\nthem.\n\nNumeric-looking arguments will be returned as numbers unless `opts.string` or\n`opts.boolean` is set for that argument name.\n\nAny arguments after `'--'` will not be parsed and will end up in `argv._`.\n\noptions can be:\n\n* `opts.string` - a string or array of strings argument names to always treat as\nstrings\n* `opts.boolean` - a string or array of strings to always treat as booleans\n* `opts.alias` - an object mapping string names to strings or arrays of string\nargument names to use as aliases\n* `opts.default` - an object mapping string argument names to default values\n\n# install\n\nWith 
[npm](https://npmjs.org) do:\n\n```\nnpm install minimist\n```\n\n# license\n\nMIT\n", "readmeFilename": "readme.markdown", "bugs": { "url": "https://github.com/substack/minimist/issues" }, "_id": "minimist@0.0.8", "_shasum": "857fcabfc3397d2625b8228262e86aa7a011b05d", "_resolved": "https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz", "_from": "minimist@0.0.8" } npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/readme.markdown0000644000000000000000000000314712631326456024751 0ustar 00000000000000# minimist parse argument options This module is the guts of optimist's argument parser without all the fanciful decoration. [![browser support](https://ci.testling.com/substack/minimist.png)](http://ci.testling.com/substack/minimist) [![build status](https://secure.travis-ci.org/substack/minimist.png)](http://travis-ci.org/substack/minimist) # example ``` js var argv = require('minimist')(process.argv.slice(2)); console.dir(argv); ``` ``` $ node example/parse.js -a beep -b boop { _: [], a: 'beep', b: 'boop' } ``` ``` $ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz { _: [ 'foo', 'bar', 'baz' ], x: 3, y: 4, n: 5, a: true, b: true, c: true, beep: 'boop' } ``` # methods ``` js var parseArgs = require('minimist') ``` ## var argv = parseArgs(args, opts={}) Return an argument object `argv` populated with the array arguments from `args`. `argv._` contains all the arguments that didn't have an option associated with them. Numeric-looking arguments will be returned as numbers unless `opts.string` or `opts.boolean` is set for that argument name. Any arguments after `'--'` will not be parsed and will end up in `argv._`. options can be: * `opts.string` - a string or array of strings argument names to always treat as strings * `opts.boolean` - a string or array of strings to always treat as booleans * `opts.alias` - an object mapping string names to strings or arrays of string argument names to use as aliases * `opts.default` - an object mapping string argument names to default values # install With [npm](https://npmjs.org) do: ``` npm install minimist ``` # license MIT npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/test/0000755000000000000000000000000012631326456022722 5ustar 00000000000000npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/example/parse.js0000644000000000000000000000010512631326456025042 0ustar 00000000000000var argv = require('../')(process.argv.slice(2)); console.dir(argv); npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/test/dash.js0000644000000000000000000000132612631326456024201 0ustar 00000000000000var parse = require('../'); var test = require('tape'); test('-', function (t) { t.plan(5); t.deepEqual(parse([ '-n', '-' ]), { n: '-', _: [] }); t.deepEqual(parse([ '-' ]), { _: [ '-' ] }); t.deepEqual(parse([ '-f-' ]), { f: '-', _: [] }); t.deepEqual( parse([ '-b', '-' ], { boolean: 'b' }), { b: true, _: [ '-' ] } ); t.deepEqual( parse([ '-s', '-' ], { string: 's' }), { s: '-', _: [] } ); }); test('-a -- b', function (t) { t.plan(3); t.deepEqual(parse([ '-a', '--', 'b' ]), { a: true, _: [ 'b' ] }); t.deepEqual(parse([ '--a', '--', 'b' ]), { a: true, _: [ 'b' ] }); t.deepEqual(parse([ '--a', '--', 'b' ]), { a: true, _: [ 'b' ] }); }); npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/test/default_bool.js0000644000000000000000000000070612631326456025722 0ustar 00000000000000var test = require('tape'); var parse = require('../'); test('boolean default true', function (t) { var argv = parse([], { boolean: 'sometrue', default: { 
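// the explicit default should override the implicit false applied to booleans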
sometrue: true } }); t.equal(argv.sometrue, true); t.end(); }); test('boolean default false', function (t) { var argv = parse([], { boolean: 'somefalse', default: { somefalse: false } }); t.equal(argv.somefalse, false); t.end(); }); npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/test/dotted.js0000644000000000000000000000067112631326456024547 0ustar 00000000000000var parse = require('../'); var test = require('tape'); test('dotted alias', function (t) { var argv = parse(['--a.b', '22'], {default: {'a.b': 11}, alias: {'a.b': 'aa.bb'}}); t.equal(argv.a.b, 22); t.equal(argv.aa.bb, 22); t.end(); }); test('dotted default', function (t) { var argv = parse('', {default: {'a.b': 11}, alias: {'a.b': 'aa.bb'}}); t.equal(argv.a.b, 11); t.equal(argv.aa.bb, 11); t.end(); }); npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/test/long.js0000644000000000000000000000141312631326456024216 0ustar 00000000000000var test = require('tape'); var parse = require('../'); test('long opts', function (t) { t.deepEqual( parse([ '--bool' ]), { bool : true, _ : [] }, 'long boolean' ); t.deepEqual( parse([ '--pow', 'xixxle' ]), { pow : 'xixxle', _ : [] }, 'long capture sp' ); t.deepEqual( parse([ '--pow=xixxle' ]), { pow : 'xixxle', _ : [] }, 'long capture eq' ); t.deepEqual( parse([ '--host', 'localhost', '--port', '555' ]), { host : 'localhost', port : 555, _ : [] }, 'long captures sp' ); t.deepEqual( parse([ '--host=localhost', '--port=555' ]), { host : 'localhost', port : 555, _ : [] }, 'long captures eq' ); t.end(); }); npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/test/parse.js0000644000000000000000000001651212631326456024377 0ustar 00000000000000var parse = require('../'); var test = require('tape'); test('parse args', function (t) { t.deepEqual( parse([ '--no-moo' ]), { moo : false, _ : [] }, 'no' ); t.deepEqual( parse([ '-v', 'a', '-v', 'b', '-v', 'c' ]), { v : ['a','b','c'], _ : [] }, 'multi' ); t.end(); }); test('comprehensive', function (t) { t.deepEqual( parse([ '--name=meowmers', 'bare', '-cats', 'woo', '-h', 'awesome', '--multi=quux', '--key', 'value', '-b', '--bool', '--no-meep', '--multi=baz', '--', '--not-a-flag', 'eek' ]), { c : true, a : true, t : true, s : 'woo', h : 'awesome', b : true, bool : true, key : 'value', multi : [ 'quux', 'baz' ], meep : false, name : 'meowmers', _ : [ 'bare', '--not-a-flag', 'eek' ] } ); t.end(); }); test('nums', function (t) { var argv = parse([ '-x', '1234', '-y', '5.67', '-z', '1e7', '-w', '10f', '--hex', '0xdeadbeef', '789' ]); t.deepEqual(argv, { x : 1234, y : 5.67, z : 1e7, w : '10f', hex : 0xdeadbeef, _ : [ 789 ] }); t.deepEqual(typeof argv.x, 'number'); t.deepEqual(typeof argv.y, 'number'); t.deepEqual(typeof argv.z, 'number'); t.deepEqual(typeof argv.w, 'string'); t.deepEqual(typeof argv.hex, 'number'); t.deepEqual(typeof argv._[0], 'number'); t.end(); }); test('flag boolean', function (t) { var argv = parse([ '-t', 'moo' ], { boolean: 't' }); t.deepEqual(argv, { t : true, _ : [ 'moo' ] }); t.deepEqual(typeof argv.t, 'boolean'); t.end(); }); test('flag boolean value', function (t) { var argv = parse(['--verbose', 'false', 'moo', '-t', 'true'], { boolean: [ 't', 'verbose' ], default: { verbose: true } }); t.deepEqual(argv, { verbose: false, t: true, _: ['moo'] }); t.deepEqual(typeof argv.verbose, 'boolean'); t.deepEqual(typeof argv.t, 'boolean'); t.end(); }); test('flag boolean default false', function (t) { var argv = parse(['moo'], { boolean: ['t', 'verbose'], default: { verbose: false, t: false } }); t.deepEqual(argv, { verbose: 
false, t: false, _: ['moo'] }); t.deepEqual(typeof argv.verbose, 'boolean'); t.deepEqual(typeof argv.t, 'boolean'); t.end(); }); test('boolean groups', function (t) { var argv = parse([ '-x', '-z', 'one', 'two', 'three' ], { boolean: ['x','y','z'] }); t.deepEqual(argv, { x : true, y : false, z : true, _ : [ 'one', 'two', 'three' ] }); t.deepEqual(typeof argv.x, 'boolean'); t.deepEqual(typeof argv.y, 'boolean'); t.deepEqual(typeof argv.z, 'boolean'); t.end(); }); test('newlines in params' , function (t) { var args = parse([ '-s', "X\nX" ]) t.deepEqual(args, { _ : [], s : "X\nX" }); // reproduce in bash: // VALUE="new // line" // node program.js --s="$VALUE" args = parse([ "--s=X\nX" ]) t.deepEqual(args, { _ : [], s : "X\nX" }); t.end(); }); test('strings' , function (t) { var s = parse([ '-s', '0001234' ], { string: 's' }).s; t.equal(s, '0001234'); t.equal(typeof s, 'string'); var x = parse([ '-x', '56' ], { string: 'x' }).x; t.equal(x, '56'); t.equal(typeof x, 'string'); t.end(); }); test('stringArgs', function (t) { var s = parse([ ' ', ' ' ], { string: '_' })._; t.same(s.length, 2); t.same(typeof s[0], 'string'); t.same(s[0], ' '); t.same(typeof s[1], 'string'); t.same(s[1], ' '); t.end(); }); test('empty strings', function(t) { var s = parse([ '-s' ], { string: 's' }).s; t.equal(s, ''); t.equal(typeof s, 'string'); var str = parse([ '--str' ], { string: 'str' }).str; t.equal(str, ''); t.equal(typeof str, 'string'); var letters = parse([ '-art' ], { string: [ 'a', 't' ] }); t.equal(letters.a, ''); t.equal(letters.r, true); t.equal(letters.t, ''); t.end(); }); test('slashBreak', function (t) { t.same( parse([ '-I/foo/bar/baz' ]), { I : '/foo/bar/baz', _ : [] } ); t.same( parse([ '-xyz/foo/bar/baz' ]), { x : true, y : true, z : '/foo/bar/baz', _ : [] } ); t.end(); }); test('alias', function (t) { var argv = parse([ '-f', '11', '--zoom', '55' ], { alias: { z: 'zoom' } }); t.equal(argv.zoom, 55); t.equal(argv.z, argv.zoom); t.equal(argv.f, 11); t.end(); }); test('multiAlias', function (t) { var argv = parse([ '-f', '11', '--zoom', '55' ], { alias: { z: [ 'zm', 'zoom' ] } }); t.equal(argv.zoom, 55); t.equal(argv.z, argv.zoom); t.equal(argv.z, argv.zm); t.equal(argv.f, 11); t.end(); }); test('nested dotted objects', function (t) { var argv = parse([ '--foo.bar', '3', '--foo.baz', '4', '--foo.quux.quibble', '5', '--foo.quux.o_O', '--beep.boop' ]); t.same(argv.foo, { bar : 3, baz : 4, quux : { quibble : 5, o_O : true } }); t.same(argv.beep, { boop : true }); t.end(); }); test('boolean and alias with chainable api', function (t) { var aliased = [ '-h', 'derp' ]; var regular = [ '--herp', 'derp' ]; var opts = { herp: { alias: 'h', boolean: true } }; var aliasedArgv = parse(aliased, { boolean: 'herp', alias: { h: 'herp' } }); var propertyArgv = parse(regular, { boolean: 'herp', alias: { h: 'herp' } }); var expected = { herp: true, h: true, '_': [ 'derp' ] }; t.same(aliasedArgv, expected); t.same(propertyArgv, expected); t.end(); }); test('boolean and alias with options hash', function (t) { var aliased = [ '-h', 'derp' ]; var regular = [ '--herp', 'derp' ]; var opts = { alias: { 'h': 'herp' }, boolean: 'herp' }; var aliasedArgv = parse(aliased, opts); var propertyArgv = parse(regular, opts); var expected = { herp: true, h: true, '_': [ 'derp' ] }; t.same(aliasedArgv, expected); t.same(propertyArgv, expected); t.end(); }); test('boolean and alias using explicit true', function (t) { var aliased = [ '-h', 'true' ]; var regular = [ '--herp', 'true' ]; var opts = { alias: { h: 'herp' }, boolean: 'h' }; 
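// a literal 'true'/'false' following a declared boolean flag is consumed and coerced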
var aliasedArgv = parse(aliased, opts); var propertyArgv = parse(regular, opts); var expected = { herp: true, h: true, '_': [ ] }; t.same(aliasedArgv, expected); t.same(propertyArgv, expected); t.end(); }); // regression, see https://github.com/substack/node-optimist/issues/71 test('boolean and --x=true', function(t) { var parsed = parse(['--boool', '--other=true'], { boolean: 'boool' }); t.same(parsed.boool, true); t.same(parsed.other, 'true'); parsed = parse(['--boool', '--other=false'], { boolean: 'boool' }); t.same(parsed.boool, true); t.same(parsed.other, 'false'); t.end(); }); npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/test/parse_modified.js0000644000000000000000000000036012631326456026231 0ustar 00000000000000var parse = require('../'); var test = require('tape'); test('parse with modifier functions' , function (t) { t.plan(1); var argv = parse([ '-b', '123' ], { boolean: 'b' }); t.deepEqual(argv, { b: true, _: ['123'] }); }); npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/test/short.js0000644000000000000000000000307112631326456024420 0ustar 00000000000000var parse = require('../'); var test = require('tape'); test('numeric short args', function (t) { t.plan(2); t.deepEqual(parse([ '-n123' ]), { n: 123, _: [] }); t.deepEqual( parse([ '-123', '456' ]), { 1: true, 2: true, 3: 456, _: [] } ); }); test('short', function (t) { t.deepEqual( parse([ '-b' ]), { b : true, _ : [] }, 'short boolean' ); t.deepEqual( parse([ 'foo', 'bar', 'baz' ]), { _ : [ 'foo', 'bar', 'baz' ] }, 'bare' ); t.deepEqual( parse([ '-cats' ]), { c : true, a : true, t : true, s : true, _ : [] }, 'group' ); t.deepEqual( parse([ '-cats', 'meow' ]), { c : true, a : true, t : true, s : 'meow', _ : [] }, 'short group next' ); t.deepEqual( parse([ '-h', 'localhost' ]), { h : 'localhost', _ : [] }, 'short capture' ); t.deepEqual( parse([ '-h', 'localhost', '-p', '555' ]), { h : 'localhost', p : 555, _ : [] }, 'short captures' ); t.end(); }); test('mixed short bool and capture', function (t) { t.same( parse([ '-h', 'localhost', '-fp', '555', 'script.js' ]), { f : true, p : 555, h : 'localhost', _ : [ 'script.js' ] } ); t.end(); }); test('short and long', function (t) { t.deepEqual( parse([ '-h', 'localhost', '-fp', '555', 'script.js' ]), { f : true, p : 555, h : 'localhost', _ : [ 'script.js' ] } ); t.end(); }); npm_3.5.2.orig/node_modules/mkdirp/node_modules/minimist/test/whitespace.js0000644000000000000000000000027712631326456025422 0ustar 00000000000000var parse = require('../'); var test = require('tape'); test('whitespace should be whitespace' , function (t) { t.plan(1); var x = parse([ '-x', '\t' ]).x; t.equal(x, '\t'); }); npm_3.5.2.orig/node_modules/mkdirp/test/chmod.js0000644000000000000000000000216112631326456020044 0ustar 00000000000000var mkdirp = require('../').mkdirp; var path = require('path'); var fs = require('fs'); var test = require('tap').test; var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); var _0744 = parseInt('0744', 8); var ps = [ '', 'tmp' ]; for (var i = 0; i < 25; i++) { var dir = Math.floor(Math.random() * Math.pow(16,4)).toString(16); ps.push(dir); } var file = ps.join('/'); test('chmod-pre', function (t) { var mode = _0744 mkdirp(file, mode, function (er) { t.ifError(er, 'should not error'); fs.stat(file, function (er, stat) { t.ifError(er, 'should exist'); t.ok(stat && stat.isDirectory(), 'should be directory'); t.equal(stat && stat.mode & _0777, mode, 'should be 0744'); t.end(); }); }); }); test('chmod', function (t) { var mode = _0755 mkdirp(file, 
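// the directory tree already exists from chmod-pre, so this call should succeed without creating anything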
mode, function (er) { t.ifError(er, 'should not error'); fs.stat(file, function (er, stat) { t.ifError(er, 'should exist'); t.ok(stat && stat.isDirectory(), 'should be directory'); t.end(); }); }); }); npm_3.5.2.orig/node_modules/mkdirp/test/clobber.js0000644000000000000000000000152712631326456020367 0ustar 00000000000000var mkdirp = require('../').mkdirp; var path = require('path'); var fs = require('fs'); var test = require('tap').test; var _0755 = parseInt('0755', 8); var ps = [ '', 'tmp' ]; for (var i = 0; i < 25; i++) { var dir = Math.floor(Math.random() * Math.pow(16,4)).toString(16); ps.push(dir); } var file = ps.join('/'); // a file in the way var itw = ps.slice(0, 3).join('/'); test('clobber-pre', function (t) { console.error("about to write to "+itw) fs.writeFileSync(itw, 'I AM IN THE WAY, THE TRUTH, AND THE LIGHT.'); fs.stat(itw, function (er, stat) { t.ifError(er) t.ok(stat && stat.isFile(), 'should be file') t.end() }) }) test('clobber', function (t) { t.plan(2); mkdirp(file, _0755, function (err) { t.ok(err); t.equal(err.code, 'ENOTDIR'); t.end(); }); }); npm_3.5.2.orig/node_modules/mkdirp/test/mkdirp.js0000644000000000000000000000160412631326456020241 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var fs = require('fs'); var exists = fs.exists || path.exists; var test = require('tap').test; var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); test('woo', function (t) { t.plan(5); var x = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var y = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var z = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var file = '/tmp/' + [x,y,z].join('/'); mkdirp(file, _0755, function (err) { t.ifError(err); exists(file, function (ex) { t.ok(ex, 'file created'); fs.stat(file, function (err, stat) { t.ifError(err); t.equal(stat.mode & _0777, _0755); t.ok(stat.isDirectory(), 'target not a directory'); }) }) }); }); npm_3.5.2.orig/node_modules/mkdirp/test/opts_fs.js0000644000000000000000000000165212631326456020433 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var test = require('tap').test; var mockfs = require('mock-fs'); var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); test('opts.fs', function (t) { t.plan(5); var x = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var y = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var z = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var file = '/beep/boop/' + [x,y,z].join('/'); var xfs = mockfs.fs(); mkdirp(file, { fs: xfs, mode: _0755 }, function (err) { t.ifError(err); xfs.exists(file, function (ex) { t.ok(ex, 'created file'); xfs.stat(file, function (err, stat) { t.ifError(err); t.equal(stat.mode & _0777, _0755); t.ok(stat.isDirectory(), 'target not a directory'); }); }); }); }); npm_3.5.2.orig/node_modules/mkdirp/test/opts_fs_sync.js0000644000000000000000000000154412631326456021467 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var test = require('tap').test; var mockfs = require('mock-fs'); var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); test('opts.fs sync', function (t) { t.plan(4); var x = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var y = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var z = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var file = '/beep/boop/' + [x,y,z].join('/'); var xfs = mockfs.fs(); mkdirp.sync(file, { fs: xfs, mode: _0755 }); xfs.exists(file, 
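// verify through the same mock fs instance that mkdirp.sync wrote to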
function (ex) { t.ok(ex, 'created file'); xfs.stat(file, function (err, stat) { t.ifError(err); t.equal(stat.mode & _0777, _0755); t.ok(stat.isDirectory(), 'target not a directory'); }); }); }); npm_3.5.2.orig/node_modules/mkdirp/test/perm.js0000644000000000000000000000154712631326456017724 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var fs = require('fs'); var exists = fs.exists || path.exists; var test = require('tap').test; var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); test('async perm', function (t) { t.plan(5); var file = '/tmp/' + (Math.random() * (1<<30)).toString(16); mkdirp(file, _0755, function (err) { t.ifError(err); exists(file, function (ex) { t.ok(ex, 'file created'); fs.stat(file, function (err, stat) { t.ifError(err); t.equal(stat.mode & _0777, _0755); t.ok(stat.isDirectory(), 'target not a directory'); }) }) }); }); test('async root perm', function (t) { mkdirp('/tmp', _0755, function (err) { if (err) t.fail(err); t.end(); }); t.end(); }); npm_3.5.2.orig/node_modules/mkdirp/test/perm_sync.js0000644000000000000000000000173712631326456020761 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var fs = require('fs'); var exists = fs.exists || path.exists; var test = require('tap').test; var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); test('sync perm', function (t) { t.plan(4); var file = '/tmp/' + (Math.random() * (1<<30)).toString(16) + '.json'; mkdirp.sync(file, _0755); exists(file, function (ex) { t.ok(ex, 'file created'); fs.stat(file, function (err, stat) { t.ifError(err); t.equal(stat.mode & _0777, _0755); t.ok(stat.isDirectory(), 'target not a directory'); }); }); }); test('sync root perm', function (t) { t.plan(3); var file = '/tmp'; mkdirp.sync(file, _0755); exists(file, function (ex) { t.ok(ex, 'file created'); fs.stat(file, function (err, stat) { t.ifError(err); t.ok(stat.isDirectory(), 'target not a directory'); }) }); }); npm_3.5.2.orig/node_modules/mkdirp/test/race.js0000644000000000000000000000173412631326456017671 0ustar 00000000000000var mkdirp = require('../').mkdirp; var path = require('path'); var fs = require('fs'); var exists = fs.exists || path.exists; var test = require('tap').test; var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); test('race', function (t) { t.plan(10); var ps = [ '', 'tmp' ]; for (var i = 0; i < 25; i++) { var dir = Math.floor(Math.random() * Math.pow(16,4)).toString(16); ps.push(dir); } var file = ps.join('/'); var res = 2; mk(file); mk(file); function mk (file, cb) { mkdirp(file, _0755, function (err) { t.ifError(err); exists(file, function (ex) { t.ok(ex, 'file created'); fs.stat(file, function (err, stat) { t.ifError(err); t.equal(stat.mode & _0777, _0755); t.ok(stat.isDirectory(), 'target not a directory'); }); }) }); } }); npm_3.5.2.orig/node_modules/mkdirp/test/rel.js0000644000000000000000000000173312631326456017540 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var fs = require('fs'); var exists = fs.exists || path.exists; var test = require('tap').test; var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); test('rel', function (t) { t.plan(5); var x = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var y = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var z = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var cwd = process.cwd(); process.chdir('/tmp'); var file = [x,y,z].join('/'); mkdirp(file, _0755, function (err) { t.ifError(err); 
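// the relative path should have been created under the new cwd, /tmp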
exists(file, function (ex) { t.ok(ex, 'file created'); fs.stat(file, function (err, stat) { t.ifError(err); process.chdir(cwd); t.equal(stat.mode & _0777, _0755); t.ok(stat.isDirectory(), 'target not a directory'); }) }) }); }); npm_3.5.2.orig/node_modules/mkdirp/test/return.js0000644000000000000000000000147212631326456020275 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var fs = require('fs'); var test = require('tap').test; test('return value', function (t) { t.plan(4); var x = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var y = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var z = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var file = '/tmp/' + [x,y,z].join('/'); // should return the first dir created. // By this point, it would be profoundly surprising if /tmp didn't // already exist, since every other test makes things in there. mkdirp(file, function (err, made) { t.ifError(err); t.equal(made, '/tmp/' + x); mkdirp(file, function (err, made) { t.ifError(err); t.equal(made, null); }); }); }); npm_3.5.2.orig/node_modules/mkdirp/test/return_sync.js0000644000000000000000000000152712631326456021332 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var fs = require('fs'); var test = require('tap').test; test('return value', function (t) { t.plan(2); var x = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var y = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var z = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var file = '/tmp/' + [x,y,z].join('/'); // should return the first dir created. // By this point, it would be profoundly surprising if /tmp didn't // already exist, since every other test makes things in there. // Note that this will throw on failure, which will fail the test. var made = mkdirp.sync(file); t.equal(made, '/tmp/' + x); // making the same file again should have no effect. made = mkdirp.sync(file); t.equal(made, null); }); npm_3.5.2.orig/node_modules/mkdirp/test/root.js0000644000000000000000000000076012631326456017740 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var fs = require('fs'); var test = require('tap').test; var _0755 = parseInt('0755', 8); test('root', function (t) { // '/' on unix, 'c:/' on windows. 
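// the root always exists, so mkdirp should succeed without creating anything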
var file = path.resolve('/'); mkdirp(file, _0755, function (err) { if (err) throw err fs.stat(file, function (er, stat) { if (er) throw er t.ok(stat.isDirectory(), 'target is a directory'); t.end(); }) }); }); npm_3.5.2.orig/node_modules/mkdirp/test/sync.js0000644000000000000000000000161212631326456017726 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var fs = require('fs'); var exists = fs.exists || path.exists; var test = require('tap').test; var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); test('sync', function (t) { t.plan(4); var x = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var y = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var z = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var file = '/tmp/' + [x,y,z].join('/'); try { mkdirp.sync(file, _0755); } catch (err) { t.fail(err); return t.end(); } exists(file, function (ex) { t.ok(ex, 'file created'); fs.stat(file, function (err, stat) { t.ifError(err); t.equal(stat.mode & _0777, _0755); t.ok(stat.isDirectory(), 'target not a directory'); }); }); }); npm_3.5.2.orig/node_modules/mkdirp/test/umask.js0000644000000000000000000000165012631326456020074 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var fs = require('fs'); var exists = fs.exists || path.exists; var test = require('tap').test; var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); test('implicit mode from umask', function (t) { t.plan(5); var x = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var y = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var z = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var file = '/tmp/' + [x,y,z].join('/'); mkdirp(file, function (err) { t.ifError(err); exists(file, function (ex) { t.ok(ex, 'file created'); fs.stat(file, function (err, stat) { t.ifError(err); t.equal(stat.mode & _0777, _0777 & (~process.umask())); t.ok(stat.isDirectory(), 'target not a directory'); }); }) }); }); npm_3.5.2.orig/node_modules/mkdirp/test/umask_sync.js0000644000000000000000000000164612631326456021135 0ustar 00000000000000var mkdirp = require('../'); var path = require('path'); var fs = require('fs'); var exists = fs.exists || path.exists; var test = require('tap').test; var _0777 = parseInt('0777', 8); var _0755 = parseInt('0755', 8); test('umask sync modes', function (t) { t.plan(4); var x = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var y = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var z = Math.floor(Math.random() * Math.pow(16,4)).toString(16); var file = '/tmp/' + [x,y,z].join('/'); try { mkdirp.sync(file); } catch (err) { t.fail(err); return t.end(); } exists(file, function (ex) { t.ok(ex, 'file created'); fs.stat(file, function (err, stat) { t.ifError(err); t.equal(stat.mode & _0777, (_0777 & (~process.umask()))); t.ok(stat.isDirectory(), 'target not a directory'); }); }); }); npm_3.5.2.orig/node_modules/node-gyp/.jshintrc0000644000000000000000000000013012631326456017510 0ustar 00000000000000{ "asi": true, "laxcomma": true, "es5": true, "node": true, "strict": false } npm_3.5.2.orig/node_modules/node-gyp/.npmignore0000644000000000000000000000004512631326456017667 0ustar 00000000000000gyp/test node_modules test/.node-gyp npm_3.5.2.orig/node_modules/node-gyp/0001-gyp-always-install-into-PRODUCT_DIR.patch0000644000000000000000000000301412631326456025612 0ustar 00000000000000From 9b5e8dc426ada891d67d27b09acc73122ab46849 Mon Sep 17 00:00:00 2001 From: Nathan Rajlich Date: Wed, 14 
Nov 2012 16:48:52 -0800 Subject: [PATCH 1/3] gyp: always install into $PRODUCT_DIR --- gyp/pylib/gyp/generator/make.py | 12 +++++++----- 1 file changed, 7 insertions(+), 5 deletions(-) diff --git a/gyp/pylib/gyp/generator/make.py b/gyp/pylib/gyp/generator/make.py index b88a433..9b3e4e3 100644 --- a/gyp/pylib/gyp/generator/make.py +++ b/gyp/pylib/gyp/generator/make.py @@ -1888,11 +1888,13 @@ $(obj).$(TOOLSET)/$(TARGET)/%%.o: $(obj)/%%%s FORCE_DO_CMD """Returns the location of the final output for an installable target.""" # Xcode puts shared_library results into PRODUCT_DIR, and some gyp files # rely on this. Emulate this behavior for mac. - if (self.type == 'shared_library' and - (self.flavor != 'mac' or self.toolset != 'target')): - # Install all shared libs into a common directory (per toolset) for - # convenient access with LD_LIBRARY_PATH. - return '$(builddir)/lib.%s/%s' % (self.toolset, self.alias) + + # XXX(TooTallNate): disabling this code since we don't want this behavior... + #if (self.type == 'shared_library' and + # (self.flavor != 'mac' or self.toolset != 'target')): + # # Install all shared libs into a common directory (per toolset) for + # # convenient access with LD_LIBRARY_PATH. + # return '$(builddir)/lib.%s/%s' % (self.toolset, self.alias) return '$(builddir)/' + self.alias -- 2.3.2 (Apple Git-55) npm_3.5.2.orig/node_modules/node-gyp/0002-gyp-apply-https-codereview.chromium.org-11361103.patch0000644000000000000000000000306012631326456030011 0ustar 00000000000000From 511840e82116662aa825088fb8a52a9f799f7767 Mon Sep 17 00:00:00 2001 From: Nathan Rajlich Date: Wed, 14 Nov 2012 16:54:04 -0800 Subject: [PATCH 2/3] gyp: apply https://codereview.chromium.org/11361103/ --- gyp/pylib/gyp/generator/msvs.py | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/gyp/pylib/gyp/generator/msvs.py b/gyp/pylib/gyp/generator/msvs.py index d8e0872..c59aea1 100644 --- a/gyp/pylib/gyp/generator/msvs.py +++ b/gyp/pylib/gyp/generator/msvs.py @@ -2720,6 +2720,9 @@ def _GetMSBuildAttributes(spec, config, build_file): product_name = spec.get('product_name', '$(ProjectName)') target_name = prefix + product_name msbuild_attributes['TargetName'] = target_name + if 'TargetExt' not in msbuild_attributes and 'product_extension' in spec: + ext = spec.get('product_extension') + msbuild_attributes['TargetExt'] = '.' 
+ ext if spec.get('msvs_external_builder'): external_out_dir = spec.get('msvs_external_builder_out_dir', '.') @@ -2773,6 +2776,9 @@ def _GetMSBuildConfigurationGlobalProperties(spec, configurations, build_file): attributes['OutputDirectory']) _AddConditionalProperty(properties, condition, 'TargetName', attributes['TargetName']) + if 'TargetExt' in attributes: + _AddConditionalProperty(properties, condition, 'TargetExt', + attributes['TargetExt']) if attributes.get('TargetPath'): _AddConditionalProperty(properties, condition, 'TargetPath', -- 2.3.2 (Apple Git-55) npm_3.5.2.orig/node_modules/node-gyp/0003-gyp-don-t-use-links-at-all-just-copy-the-files-inste.patch0000644000000000000000000000257012631326456031125 0ustar 00000000000000From 0cd9f08a6d4f4be6643001b6c3b5ad40e094bdcc Mon Sep 17 00:00:00 2001 From: Nathan Zadoks Date: Tue, 2 Jul 2013 11:07:16 -0700 Subject: [PATCH 3/3] gyp: don't use links at all, just copy the files instead --- gyp/pylib/gyp/generator/make.py | 2 +- gyp/pylib/gyp/generator/ninja.py | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/gyp/pylib/gyp/generator/make.py b/gyp/pylib/gyp/generator/make.py index 9b3e4e3..b3f8a2b 100644 --- a/gyp/pylib/gyp/generator/make.py +++ b/gyp/pylib/gyp/generator/make.py @@ -372,7 +372,7 @@ cmd_touch = touch $@ quiet_cmd_copy = COPY $@ # send stderr to /dev/null to ignore messages when linking directories. -cmd_copy = ln -f "$<" "$@" 2>/dev/null || (rm -rf "$@" && cp -af "$<" "$@") +cmd_copy = rm -rf "$@" && cp -af "$<" "$@" %(link_commands)s """ diff --git a/gyp/pylib/gyp/generator/ninja.py b/gyp/pylib/gyp/generator/ninja.py index 7461814..c2951a4 100644 --- a/gyp/pylib/gyp/generator/ninja.py +++ b/gyp/pylib/gyp/generator/ninja.py @@ -2020,7 +2020,7 @@ def GenerateOutputForConfig(target_list, target_dicts, data, params, master_ninja.rule( 'copy', description='COPY $in $out', - command='ln -f $in $out 2>/dev/null || (rm -rf $out && cp -af $in $out)') + command='rm -rf $out && cp -af $in $out') master_ninja.newline() all_targets = set() -- 2.3.2 (Apple Git-55) npm_3.5.2.orig/node_modules/node-gyp/CHANGELOG.md0000644000000000000000000000707312631326456017511 0ustar 00000000000000v3.1.0 2015-11-14 * [[`9049241f91`](https://github.com/nodejs/node-gyp/commit/9049241f91)] - **gyp**: don't use links at all, just copy the files instead (Nathan Zadoks) * [[`8ef90348d1`](https://github.com/nodejs/node-gyp/commit/8ef90348d1)] - **gyp**: apply https://codereview.chromium.org/11361103/ (Nathan Rajlich) * [[`a2ed0df84e`](https://github.com/nodejs/node-gyp/commit/a2ed0df84e)] - **gyp**: always install into $PRODUCT_DIR (Nathan Rajlich) * [[`cc8b2fa83e`](https://github.com/nodejs/node-gyp/commit/cc8b2fa83e)] - Update gyp to b3cef02. (Imran Iqbal) [#781](https://github.com/nodejs/node-gyp/pull/781) * [[`f5d86eb84e`](https://github.com/nodejs/node-gyp/commit/f5d86eb84e)] - Update to tar@2.0.0. (Edgar Muentes) [#797](https://github.com/nodejs/node-gyp/pull/797) * [[`2ac7de02c4`](https://github.com/nodejs/node-gyp/commit/2ac7de02c4)] - Fix infinite loop with zero-length options. 
(Ben Noordhuis) [#745](https://github.com/nodejs/node-gyp/pull/745) * [[`101bed639b`](https://github.com/nodejs/node-gyp/commit/101bed639b)] - This platform value came from debian package, and now the value (Jérémy Lal) [#738](https://github.com/nodejs/node-gyp/pull/738) v3.0.3 2015-09-14 * [[`ad827cda30`](https://github.com/nodejs/node-gyp/commit/ad827cda30)] - tarballUrl global and && when checking for iojs (Lars-Magnus Skog) [#729](https://github.com/nodejs/node-gyp/pull/729) v3.0.2 2015-09-12 * [[`6e8c3bf3c6`](https://github.com/nodejs/node-gyp/commit/6e8c3bf3c6)] - add back support for passing additional cmdline args (Rod Vagg) [#723](https://github.com/nodejs/node-gyp/pull/723) * [[`ff82f2f3b9`](https://github.com/nodejs/node-gyp/commit/ff82f2f3b9)] - fixed broken link in docs to Visual Studio 2013 download (simon-p-r) [#722](https://github.com/nodejs/node-gyp/pull/722) v3.0.1 2015-09-08 * [[`846337e36b`](https://github.com/nodejs/node-gyp/commit/846337e36b)] - normalise versions for target == this comparison (Rod Vagg) [#716](https://github.com/nodejs/node-gyp/pull/716) v3.0.0 2015-09-08 * [[`9720d0373c`](https://github.com/nodejs/node-gyp/commit/9720d0373c)] - remove node_modules from tree (Rod Vagg) [#711](https://github.com/nodejs/node-gyp/pull/711) * [[`6dcf220db7`](https://github.com/nodejs/node-gyp/commit/6dcf220db7)] - test version major directly, don't use semver.satisfies() (Rod Vagg) [#711](https://github.com/nodejs/node-gyp/pull/711) * [[`938dd18d1c`](https://github.com/nodejs/node-gyp/commit/938dd18d1c)] - refactor for clarity, fix dist-url, add env var dist-url functionality (Rod Vagg) [#711](https://github.com/nodejs/node-gyp/pull/711) * [[`9e9df66a06`](https://github.com/nodejs/node-gyp/commit/9e9df66a06)] - use process.release, make aware of io.js & node v4 differences (Rod Vagg) [#711](https://github.com/nodejs/node-gyp/pull/711) * [[`1ea7ed01f4`](https://github.com/nodejs/node-gyp/commit/1ea7ed01f4)] - **deps**: update graceful-fs dependency to the latest (Sakthipriyan Vairamani) [#714](https://github.com/nodejs/node-gyp/pull/714) * [[`0fbc387b35`](https://github.com/nodejs/node-gyp/commit/0fbc387b35)] - Update repository URLs. (Ben Noordhuis) [#715](https://github.com/nodejs/node-gyp/pull/715) * [[`bbedb8868b`](https://github.com/nodejs/node-gyp/commit/bbedb8868b)] - **(SEMVER-MAJOR)** **win**: enable delay-load hook by default (Jeremiah Senkpiel) [#708](https://github.com/nodejs/node-gyp/pull/708) * [[`85ed107565`](https://github.com/nodejs/node-gyp/commit/85ed107565)] - Merge pull request #664 from othiym23/othiym23/allow-semver-5 (Nathan Rajlich) * [[`0c720d234c`](https://github.com/nodejs/node-gyp/commit/0c720d234c)] - allow semver@5 (Forrest L Norvell) npm_3.5.2.orig/node_modules/node-gyp/History.md0000644000000000000000000000315012631326456017653 0ustar 00000000000000 2.0.2 / 2015-07-14 ================== * Use HTTPS for dist url (#656, @SonicHedgehog) * Merge pull request #648 from nevosegal/master * Merge pull request #650 from magic890/patch-1 * Updated Installation section on README * Updated link to gyp user documentation * Fix download error message spelling (#643, @tomxtobin) * Merge pull request #637 from lygstate/master * Set NODE_GYP_DIR for addon.gypi to setting absolute path for src/win_delay_load_hook.c, and fixes of the long relative path issue on Win32. Fixes #636 (#637, @lygstate). 
2.0.1 / 2015-05-28 ================== * configure: try/catch the semver range.test() call * README: update for visual studio 2013 (#510, @samccone) 2.0.0 / 2015-05-24 ================== * configure: check for python2 executable by default, fallback to python * configure: don't clobber existing $PYTHONPATH * configure: use "path-array" for PYTHONPATH * gyp: fix for non-ascii userprofile name on Windows * gyp: always install into $PRODUCT_DIR * gyp: apply https://codereview.chromium.org/11361103/ * gyp: don't use links at all, just copy the files instead * gyp: update gyp to e1c8fcf7 * Updated README.md with updated Windows build info * Show URL when a download fails * package: add a "license" field * move HMODULE m declaration to top * Only add "-undefined dynamic_lookup" to loadable_module targets * win: optionally allow node.exe/iojs.exe to be renamed * Avoid downloading shasums if using tarPath * Add target name preprocessor define: `NODE_GYP_MODULE_NAME` * Show better error message in case of bad network settings npm_3.5.2.orig/node_modules/node-gyp/LICENSE0000644000000000000000000000211612631326456016676 0ustar 00000000000000(The MIT License) Copyright (c) 2012 Nathan Rajlich Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/node-gyp/README.md0000644000000000000000000002050112631326456017146 0ustar 00000000000000node-gyp ========= ### Node.js native addon build tool `node-gyp` is a cross-platform command-line tool written in Node.js for compiling native addon modules for Node.js. It bundles the [gyp](https://code.google.com/p/gyp/) project used by the Chromium team and takes away the pain of dealing with the various differences in build platforms. It is the replacement for the `node-waf` program, which was removed in node `v0.8`. If you have a native addon for node that still has a `wscript` file, then you should definitely add a `binding.gyp` file to support the latest versions of node. Multiple target versions of node are supported (e.g. `0.8`, `0.9`, `0.10`, ..., `1.0`, etc.), regardless of what version of node is actually installed on your system (`node-gyp` downloads the necessary development files for the target version).
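For example, the following sketch builds an addon against a node version other than the one installed (it assumes the `--target` and `--arch` options; exact flags may vary between `node-gyp` releases): ``` bash $ node-gyp rebuild --target=0.10.40 --arch=x64 ```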
#### Features: * Easy to use, consistent interface * Same commands to build your module on every platform * Supports multiple target versions of Node Installation ------------ You can install with `npm`: ``` bash $ npm install -g node-gyp ``` You will also need to install: * On Unix: * `python` (`v2.7` recommended, `v3.x.x` is __*not*__ supported) * `make` * A proper C/C++ compiler toolchain, like [GCC](https://gcc.gnu.org) * On Mac OS X: * `python` (`v2.7` recommended, `v3.x.x` is __*not*__ supported) (already installed on Mac OS X) * [Xcode](https://developer.apple.com/xcode/download/) * You also need to install the `Command Line Tools` via Xcode. You can find this under the menu `Xcode -> Preferences -> Downloads` * This step will install `gcc` and the related toolchain containing `make` * On Windows: * Python ([`v2.7.10`][python-v2.7.10] recommended, `v3.x.x` is __*not*__ supported) * Make sure that you have a PYTHON environment variable, and that it is set to `drive:\path\to\python.exe`, not to a folder * Windows XP/Vista/7: * Microsoft Visual Studio C++ 2013 ([Express][msvc2013] version works well) * If the install fails, try uninstalling any C++ 2010 x64&x86 Redistributable that you have installed first * If you get errors that the 64-bit compilers are not installed you may also need the [compiler update for the Windows SDK 7.1] * Windows 7/8: * Microsoft Visual Studio C++ 2013 for Windows Desktop ([Express][msvc2013] version works well) * Windows 10: * Install the latest version of npm (3.3.6 at the time of writing) * Install Python 2.7 from https://www.python.org/download/releases/2.7/ and make sure it's on the System Path * Install Visual Studio Community 2015 Edition. (Custom Install, Select Visual C++ during the installation) * Set the environment variable GYP_MSVS_VERSION=2015 * Run the command prompt as Administrator * $ npm install (--msvs_version=2015) <-- Shouldn't be needed if you have set the GYP_MSVS_VERSION env var * If the above steps have not worked, or you are unsure, please visit http://www.serverpals.com/blog/building-using-node-gyp-with-visual-studio-express-2015-on-windows-10-pro-x64 for a full walkthrough * All Windows Versions * For 64-bit builds of node and native modules you will _**also**_ need the [Windows 7 64-bit SDK][win7sdk] * You may need to run one of the following commands if your build complains about WindowsSDKDir not being set, and you are sure you have already installed the SDK: ``` call "C:\Program Files\Microsoft SDKs\Windows\v7.1\bin\Setenv.cmd" /Release /x86 call "C:\Program Files\Microsoft SDKs\Windows\v7.1\bin\Setenv.cmd" /Release /x64 ``` If you have multiple Python versions installed, you can identify which Python version `node-gyp` uses by setting the `--python` option: ``` bash $ node-gyp --python /path/to/python2.7 ``` If `node-gyp` is called by way of `npm` *and* you have multiple versions of Python installed, then you can set `npm`'s 'python' config key to the appropriate value: ``` bash $ npm config set python /path/to/executable/python2.7 ``` Note that OS X is just a flavour of Unix and so needs `python`, `make`, and C/C++. An easy way to obtain these is to install Xcode from Apple, and then use it to install the command line tools (under Preferences -> Downloads). How to Use ---------- To compile your native addon, first go to its root directory: ``` bash $ cd my_node_addon ``` The next step is to generate the appropriate project build files for the current platform.
Use `configure` for that: ``` bash $ node-gyp configure ``` __Note__: The `configure` step looks for the `binding.gyp` file in the current directory to process. See below for instructions on creating the `binding.gyp` file. Now you will have either a `Makefile` (on Unix platforms) or a `vcxproj` file (on Windows) in the `build/` directory. Next invoke the `build` command: ``` bash $ node-gyp build ``` Now you have your compiled `.node` bindings file! The compiled bindings end up in `build/Debug/` or `build/Release/`, depending on the build mode. At this point you can require the `.node` file with Node and run your tests! __Note:__ To create a _Debug_ build of the bindings file, pass the `--debug` (or `-d`) switch when running either the `configure`, `build` or `rebuild` command. The "binding.gyp" file ---------------------- Previously when node had `node-waf` you had to write a `wscript` file. The replacement for that is the `binding.gyp` file, which describes the configuration to build your module in a JSON-like format. This file gets placed in the root of your package, alongside the `package.json` file. A barebones `gyp` file appropriate for building a node addon looks like: ``` python { "targets": [ { "target_name": "binding", "sources": [ "src/binding.cc" ] } ] } ``` Some additional resources for addons and writing `gyp` files: * ["Going Native" a nodeschool.io tutorial](http://nodeschool.io/#goingnative) * ["Hello World" node addon example](https://github.com/nodejs/node/tree/master/test/addons/hello-world) * [gyp user documentation](https://gyp.gsrc.io/docs/UserDocumentation.md) * [gyp input format reference](https://gyp.gsrc.io/docs/InputFormatReference.md) * [*"binding.gyp" files out in the wild* wiki page](https://github.com/nodejs/node-gyp/wiki/%22binding.gyp%22-files-out-in-the-wild) Commands -------- `node-gyp` responds to the following commands:
| **Command** | **Description**
|:--------------|:---------------------------------------------------------------
| `build` | Invokes `make`/`msbuild.exe` and builds the native addon
| `clean` | Removes the `build` directory if it exists
| `configure` | Generates project build files for the current platform
| `rebuild` | Runs `clean`, `configure` and `build` all in a row
| `install` | Installs node development header files for the given version
| `list` | Lists the currently installed node development file versions
| `remove` | Removes the node development header files for the given version
License ------- (The MIT License) Copyright (c) 2012 Nathan Rajlich <nathan@tootallnate.net> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. [python-v2.7.10]: https://www.python.org/downloads/release/python-2710/ [msvc2013]: https://www.microsoft.com/en-gb/download/details.aspx?id=44914 [win7sdk]: https://www.microsoft.com/en-us/download/details.aspx?id=8279 [compiler update for the Windows SDK 7.1]: https://www.microsoft.com/en-us/download/details.aspx?id=4422 npm_3.5.2.orig/node_modules/node-gyp/addon.gypi0000644000000000000000000000574312631326456017661 0ustar 00000000000000{ 'target_defaults': { 'type': 'loadable_module', 'win_delay_load_hook': 'true', 'product_prefix': '', 'include_dirs': [ '<(node_root_dir)/include/node', '<(node_root_dir)/src', '<(node_root_dir)/deps/uv/include', '<(node_root_dir)/deps/v8/include' ], 'defines': [ 'NODE_GYP_MODULE_NAME=>(_target_name)' ], 'target_conditions': [ ['_type=="loadable_module"', { 'product_extension': 'node', 'defines': [ 'BUILDING_NODE_EXTENSION' ], 'xcode_settings': { 'OTHER_LDFLAGS': [ '-undefined dynamic_lookup' ], }, }], ['_type=="static_library"', { # set to `1` to *disable* the -T thin archive 'ld' flag. # older linkers don't support this flag. 'standalone_static_library': '<(standalone_static_library)' }], ['_win_delay_load_hook=="true"', { # If the addon specifies `'win_delay_load_hook': 'true'` in its # binding.gyp, link a delay-load hook into the DLL. This hook ensures # that the addon will work regardless of whether the node/iojs binary # is named node.exe, iojs.exe, or something else. 'conditions': [ [ 'OS=="win"', { 'sources': [ '<(node_gyp_dir)/src/win_delay_load_hook.c', ], 'msvs_settings': { 'VCLinkerTool': { 'DelayLoadDLLs': [ 'iojs.exe', 'node.exe' ], # Don't print a linker warning when no imports from either .exe # are used. 
'AdditionalOptions': [ '/ignore:4199' ], }, }, }], ], }], ], 'conditions': [ [ 'OS=="mac"', { 'defines': [ '_DARWIN_USE_64_BIT_INODE=1' ], 'xcode_settings': { 'DYLIB_INSTALL_NAME_BASE': '@rpath' }, }], [ 'OS=="aix"', { 'ldflags': [ '-Wl,-bimport:<(node_exp_file)' ], }], [ 'OS=="win"', { 'libraries': [ '-lkernel32.lib', '-luser32.lib', '-lgdi32.lib', '-lwinspool.lib', '-lcomdlg32.lib', '-ladvapi32.lib', '-lshell32.lib', '-lole32.lib', '-loleaut32.lib', '-luuid.lib', '-lodbc32.lib', '-lDelayImp.lib', '-l"<(node_root_dir)/$(ConfigurationName)/<(node_lib_file)"' ], 'msvs_disabled_warnings': [ # warning C4251: 'node::ObjectWrap::handle_' : class 'v8::Persistent' # needs to have dll-interface to be used by # clients of class 'node::ObjectWrap' 4251 ], }, { # OS!="win" 'defines': [ '_LARGEFILE_SOURCE', '_FILE_OFFSET_BITS=64' ], }], [ 'OS=="freebsd" or OS=="openbsd" or OS=="solaris" or (OS=="linux" and target_arch!="ia32")', { 'cflags': [ '-fPIC' ], }] ] } } npm_3.5.2.orig/node_modules/node-gyp/bin/0000755000000000000000000000000012631326456016441 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/0000755000000000000000000000000012631326456016470 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/lib/0000755000000000000000000000000012631326456016437 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/0000755000000000000000000000000012631326456020346 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/package.json0000644000000000000000000000577112631326456020171 0ustar 00000000000000{ "_args": [ [ "node-gyp@~3.2.1", "/Users/ogd/Documents/projects/npm/npm" ] ], "_from": "node-gyp@>=3.2.1 <3.3.0", "_id": "node-gyp@3.2.1", "_inCache": true, "_installable": true, "_location": "/node-gyp", "_nodeVersion": "6.0.0-pre", "_npmUser": { "email": "info@bnoordhuis.nl", "name": "bnoordhuis" }, "_npmVersion": "3.3.12", "_phantomChildren": { "brace-expansion": "1.1.2", "core-util-is": "1.0.2", "debug": "2.2.0", "has-unicode": "1.0.1", "inflight": "1.0.4", "inherits": "2.0.1", "isarray": "0.0.1", "once": "1.3.2", "string_decoder": "0.10.31" }, "_requested": { "name": "node-gyp", "raw": "node-gyp@~3.2.1", "rawSpec": "~3.2.1", "scope": null, "spec": ">=3.2.1 <3.3.0", "type": "range" }, "_requiredBy": [ "/" ], "_resolved": "https://registry.npmjs.org/node-gyp/-/node-gyp-3.2.1.tgz", "_shasum": "f5dd569970a508464cc3c15d7e9e8d2de8638dd5", "_shrinkwrap": null, "_spec": "node-gyp@~3.2.1", "_where": "/Users/ogd/Documents/projects/npm/npm", "author": { "email": "nathan@tootallnate.net", "name": "Nathan Rajlich", "url": "http://tootallnate.net" }, "bin": { "node-gyp": "./bin/node-gyp.js" }, "bugs": { "url": "https://github.com/nodejs/node-gyp/issues" }, "dependencies": { "fstream": "^1.0.0", "glob": "3 || 4", "graceful-fs": "^4.1.2", "minimatch": "1", "mkdirp": "^0.5.0", "nopt": "2 || 3", "npmlog": "0 || 1", "osenv": "0", "path-array": "^1.0.0", "request": "2", "rimraf": "2", "semver": "2.x || 3.x || 4 || 5", "tar": "^2.0.0", "which": "1" }, "description": "Node.js native addon build tool", "devDependencies": { "tape": "~4.2.0" }, "directories": {}, "dist": { "shasum": "f5dd569970a508464cc3c15d7e9e8d2de8638dd5", "tarball": "http://registry.npmjs.org/node-gyp/-/node-gyp-3.2.1.tgz" }, "engines": { "node": ">= 0.8.0" }, "gitHead": "89692c9187e10df944b0bf587ed44381b004a08c", "homepage": "https://github.com/nodejs/node-gyp#readme", "installVersion": 9, "keywords": [ "addon", "bindings", "c", "c++", "gyp", "module", "native" ], "license": "MIT", "main": "./lib/node-gyp.js", "maintainers": 
[ { "name": "TooTallNate", "email": "nathan@tootallnate.net" }, { "name": "bnoordhuis", "email": "info@bnoordhuis.nl" }, { "name": "fishrock123", "email": "fishrock123@rocketmail.com" }, { "name": "isaacs", "email": "i@izs.me" }, { "name": "rvagg", "email": "rod@vagg.org" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "name": "node-gyp", "optionalDependencies": {}, "preferGlobal": true, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/nodejs/node-gyp.git" }, "scripts": { "test": "tape test/test-*" }, "version": "3.2.1" } npm_3.5.2.orig/node_modules/node-gyp/src/0000755000000000000000000000000012631326456016460 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/test/0000755000000000000000000000000012631326456016650 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/bin/node-gyp.js0000755000000000000000000000605512631326456020532 0ustar 00000000000000#!/usr/bin/env node /** * Set the title. */ process.title = 'node-gyp' /** * Module dependencies. */ var gyp = require('../') var log = require('npmlog') /** * Process and execute the selected commands. */ var prog = gyp() var completed = false prog.parseArgv(process.argv) if (prog.todo.length === 0) { if (~process.argv.indexOf('-v') || ~process.argv.indexOf('--version')) { console.log('v%s', prog.version) } else { console.log('%s', prog.usage()) } return process.exit(0) } log.info('it worked if it ends with', 'ok') log.verbose('cli', process.argv) log.info('using', 'node-gyp@%s', prog.version) log.info('using', 'node@%s | %s | %s', process.versions.node, process.platform, process.arch) /** * Change dir if -C/--directory was passed. */ var dir = prog.opts.directory if (dir) { var fs = require('fs') try { var stat = fs.statSync(dir) if (stat.isDirectory()) { log.info('chdir', dir) process.chdir(dir) } else { log.warn('chdir', dir + ' is not a directory') } } catch (e) { if (e.code === 'ENOENT') { log.warn('chdir', dir + ' is not a directory') } else { log.warn('chdir', 'error during chdir() "%s"', e.message) } } } function run () { var command = prog.todo.shift() if (!command) { // done! completed = true log.info('ok') return } prog.commands[command.name](command.args, function (err) { if (err) { log.error(command.name + ' error') log.error('stack', err.stack) errorMessage() log.error('not ok') return process.exit(1) } if (command.name == 'list') { var versions = arguments[1] if (versions.length > 0) { versions.forEach(function (version) { console.log(version) }) } else { console.log('No node development files installed. Use `node-gyp install` to install a version.') } } else if (arguments.length >= 2) { console.log.apply(console, [].slice.call(arguments, 1)) } // now run the next command in the queue process.nextTick(run) }) } process.on('exit', function (code) { if (!completed && !code) { log.error('Completion callback never invoked!') issueMessage() process.exit(6) } }) process.on('uncaughtException', function (err) { log.error('UNCAUGHT EXCEPTION') log.error('stack', err.stack) issueMessage() process.exit(7) }) function errorMessage () { // copied from npm's lib/util/error-handler.js var os = require('os') log.error('System', os.type() + ' ' + os.release()) log.error('command', process.argv .map(JSON.stringify).join(' ')) log.error('cwd', process.cwd()) log.error('node -v', process.version) log.error('node-gyp -v', 'v' + prog.package.version) } function issueMessage () { errorMessage() log.error('', [ 'This is a bug in `node-gyp`.' 
, 'Try to update node-gyp and file an Issue if it does not help:' , ' ' ].join('\n')) } // start running the given commands! run() npm_3.5.2.orig/node_modules/node-gyp/gyp/.npmignore0000644000000000000000000000000612631326456020463 0ustar 00000000000000*.pyc npm_3.5.2.orig/node_modules/node-gyp/gyp/AUTHORS0000644000000000000000000000050712631326456017542 0ustar 00000000000000# Names should be added to this file like so: # Name or Organization Google Inc. Bloomberg Finance L.P. Yandex LLC Steven Knight Ryan Norton David J. Sankel Eric N. Vander Weele Tom Freudenberg npm_3.5.2.orig/node_modules/node-gyp/gyp/DEPS0000644000000000000000000000106612631326456017151 0ustar 00000000000000# DEPS file for gclient use in buildbot execution of gyp tests. # # (You don't need to use gclient for normal GYP development work.) vars = { "chrome_trunk": "http://src.chromium.org/svn/trunk", "googlecode_url": "http://%s.googlecode.com/svn", } deps = { } deps_os = { "win": { "third_party/cygwin": Var("chrome_trunk") + "/deps/third_party/cygwin@66844", "third_party/python_26": Var("chrome_trunk") + "/tools/third_party/python_26@89111", "src/third_party/pefile": (Var("googlecode_url") % "pefile") + "/trunk@63", }, } npm_3.5.2.orig/node_modules/node-gyp/gyp/LICENSE0000644000000000000000000000270312631326456017477 0ustar 00000000000000Copyright (c) 2009 Google Inc. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of Google Inc. nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. npm_3.5.2.orig/node_modules/node-gyp/gyp/OWNERS0000644000000000000000000000000212631326456017420 0ustar 00000000000000* npm_3.5.2.orig/node_modules/node-gyp/gyp/PRESUBMIT.py0000644000000000000000000000711612631326456020421 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Top-level presubmit script for GYP. See http://dev.chromium.org/developers/how-tos/depottools/presubmit-scripts for more details about the presubmit API built into gcl. """ PYLINT_BLACKLIST = [ # TODO: fix me. # From SCons, not done in google style. 
'test/lib/TestCmd.py', 'test/lib/TestCommon.py', 'test/lib/TestGyp.py', ] PYLINT_DISABLED_WARNINGS = [ # TODO: fix me. # Many tests include modules they don't use. 'W0611', # Possible unbalanced tuple unpacking with sequence. 'W0632', # Attempting to unpack a non-sequence. 'W0633', # Include order doesn't properly include local files? 'F0401', # Some use of built-in names. 'W0622', # Some unused variables. 'W0612', # Operator not preceded/followed by space. 'C0323', 'C0322', # Unnecessary semicolon. 'W0301', # Unused argument. 'W0613', # String has no effect (docstring in wrong place). 'W0105', # map/filter on lambda could be replaced by comprehension. 'W0110', # Use of eval. 'W0123', # Comma not followed by space. 'C0324', # Access to a protected member. 'W0212', # Bad indent. 'W0311', # Line too long. 'C0301', # Undefined variable. 'E0602', # No exception type specified. 'W0702', # No member of that name. 'E1101', # Dangerous default {}. 'W0102', # Cyclic import. 'R0401', # Others, too many to sort. 'W0201', 'W0232', 'E1103', 'W0621', 'W0108', 'W0223', 'W0231', 'R0201', 'E0101', 'C0321', # ************* Module copy # W0104:427,12:_test.odict.__setitem__: Statement seems to have no effect 'W0104', ] def CheckChangeOnUpload(input_api, output_api): report = [] report.extend(input_api.canned_checks.PanProjectChecks( input_api, output_api)) return report def CheckChangeOnCommit(input_api, output_api): report = [] # Accept any year number from 2009 to the current year. current_year = int(input_api.time.strftime('%Y')) allowed_years = (str(s) for s in reversed(xrange(2009, current_year + 1))) years_re = '(' + '|'.join(allowed_years) + ')' # The (c) is deprecated, but tolerate it until it's removed from all files. license = ( r'.*? Copyright (\(c\) )?%(year)s Google Inc\. All rights reserved\.\n' r'.*? Use of this source code is governed by a BSD-style license that ' r'can be\n' r'.*? found in the LICENSE file\.\n' ) % { 'year': years_re, } report.extend(input_api.canned_checks.PanProjectChecks( input_api, output_api, license_header=license)) report.extend(input_api.canned_checks.CheckTreeIsOpen( input_api, output_api, 'http://gyp-status.appspot.com/status', 'http://gyp-status.appspot.com/current')) import os import sys old_sys_path = sys.path try: sys.path = ['pylib', 'test/lib'] + sys.path blacklist = PYLINT_BLACKLIST if sys.platform == 'win32': blacklist = [os.path.normpath(x).replace('\\', '\\\\') for x in PYLINT_BLACKLIST] report.extend(input_api.canned_checks.RunPylint( input_api, output_api, black_list=blacklist, disabled_warnings=PYLINT_DISABLED_WARNINGS)) finally: sys.path = old_sys_path return report TRYBOTS = [ 'linux_try', 'mac_try', 'win_try', ] def GetPreferredTryMasters(_, change): return { 'client.gyp': { t: set(['defaulttests']) for t in TRYBOTS }, } npm_3.5.2.orig/node_modules/node-gyp/gyp/buildbot/0000755000000000000000000000000012631326456020274 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/codereview.settings0000644000000000000000000000056512631326456022414 0ustar 00000000000000# This file is used by gcl to get repository specific information.
CODE_REVIEW_SERVER: codereview.chromium.org CC_LIST: gyp-developer@googlegroups.com VIEW_VC: https://chromium.googlesource.com/external/gyp/+/ TRY_ON_UPLOAD: False TRYSERVER_PROJECT: gyp TRYSERVER_PATCHLEVEL: 1 TRYSERVER_ROOT: gyp TRYSERVER_SVN_URL: svn://svn.chromium.org/chrome-try/try-nacl PROJECT: gyp npm_3.5.2.orig/node_modules/node-gyp/gyp/data/0000755000000000000000000000000012631326456017401 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/gyp0000755000000000000000000000036012631326456017214 0ustar 00000000000000#!/bin/sh # Copyright 2013 The Chromium Authors. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. set -e base=$(dirname "$0") exec python "${base}/gyp_main.py" "$@" npm_3.5.2.orig/node_modules/node-gyp/gyp/gyp.bat0000755000000000000000000000031112631326456017755 0ustar 00000000000000@rem Copyright (c) 2009 Google Inc. All rights reserved. @rem Use of this source code is governed by a BSD-style license that can be @rem found in the LICENSE file. @python "%~dp0gyp_main.py" %* npm_3.5.2.orig/node_modules/node-gyp/gyp/gyp_main.py0000755000000000000000000000067712631326456020652 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2009 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import os import sys # Make sure we're using the version of pylib in this repo, not one installed # elsewhere on the system. sys.path.insert(0, os.path.join(os.path.dirname(sys.argv[0]), 'pylib')) import gyp if __name__ == '__main__': sys.exit(gyp.script_main()) npm_3.5.2.orig/node_modules/node-gyp/gyp/gyptest.py0000755000000000000000000001752312631326456020544 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. __doc__ = """ gyptest.py -- test runner for GYP tests. """ import os import optparse import shlex import subprocess import sys class CommandRunner(object): """ Executor class for commands, including "commands" implemented by Python functions. """ verbose = True active = True def __init__(self, dictionary={}): self.subst_dictionary(dictionary) def subst_dictionary(self, dictionary): self._subst_dictionary = dictionary def subst(self, string, dictionary=None): """ Substitutes (via the format operator) the values in the specified dictionary into the specified command. The command can be an (action, string) tuple. In all cases, we perform substitution on strings and don't worry if something isn't a string. (It's probably a Python function to be executed.) """ if dictionary is None: dictionary = self._subst_dictionary if dictionary: try: string = string % dictionary except TypeError: pass return string def display(self, command, stdout=None, stderr=None): if not self.verbose: return if type(command) == type(()): func = command[0] args = command[1:] s = '%s(%s)' % (func.__name__, ', '.join(map(repr, args))) elif type(command) == type([]): # TODO: quote arguments containing spaces # TODO: handle meta characters? s = ' '.join(command) else: s = self.subst(command) if not s.endswith('\n'): s += '\n' sys.stdout.write(s) sys.stdout.flush() def execute(self, command, stdout=None, stderr=None): """ Executes a single command.
""" if not self.active: return 0 if type(command) == type(''): command = self.subst(command) cmdargs = shlex.split(command) if cmdargs[0] == 'cd': command = (os.chdir,) + tuple(cmdargs[1:]) if type(command) == type(()): func = command[0] args = command[1:] return func(*args) else: if stdout is sys.stdout: # Same as passing sys.stdout, except python2.4 doesn't fail on it. subout = None else: # Open pipe for anything else so Popen works on python2.4. subout = subprocess.PIPE if stderr is sys.stderr: # Same as passing sys.stderr, except python2.4 doesn't fail on it. suberr = None elif stderr is None: # Merge with stdout if stderr isn't specified. suberr = subprocess.STDOUT else: # Open pipe for anything else so Popen works on python2.4. suberr = subprocess.PIPE p = subprocess.Popen(command, shell=(sys.platform == 'win32'), stdout=subout, stderr=suberr) p.wait() if stdout is None: self.stdout = p.stdout.read() elif stdout is not sys.stdout: stdout.write(p.stdout.read()) if stderr not in (None, sys.stderr): stderr.write(p.stderr.read()) return p.returncode def run(self, command, display=None, stdout=None, stderr=None): """ Runs a single command, displaying it first. """ if display is None: display = command self.display(display) return self.execute(command, stdout, stderr) class Unbuffered(object): def __init__(self, fp): self.fp = fp def write(self, arg): self.fp.write(arg) self.fp.flush() def __getattr__(self, attr): return getattr(self.fp, attr) sys.stdout = Unbuffered(sys.stdout) sys.stderr = Unbuffered(sys.stderr) def is_test_name(f): return f.startswith('gyptest') and f.endswith('.py') def find_all_gyptest_files(directory): result = [] for root, dirs, files in os.walk(directory): if '.svn' in dirs: dirs.remove('.svn') result.extend([ os.path.join(root, f) for f in files if is_test_name(f) ]) result.sort() return result def main(argv=None): if argv is None: argv = sys.argv usage = "gyptest.py [-ahlnq] [-f formats] [test ...]" parser = optparse.OptionParser(usage=usage) parser.add_option("-a", "--all", action="store_true", help="run all tests") parser.add_option("-C", "--chdir", action="store", default=None, help="chdir to the specified directory") parser.add_option("-f", "--format", action="store", default='', help="run tests with the specified formats") parser.add_option("-G", '--gyp_option', action="append", default=[], help="Add -G options to the gyp command line") parser.add_option("-l", "--list", action="store_true", help="list available tests and exit") parser.add_option("-n", "--no-exec", action="store_true", help="no execute, just print the command line") parser.add_option("--passed", action="store_true", help="report passed tests") parser.add_option("--path", action="append", default=[], help="additional $PATH directory") parser.add_option("-q", "--quiet", action="store_true", help="quiet, don't print test command lines") opts, args = parser.parse_args(argv[1:]) if opts.chdir: os.chdir(opts.chdir) if opts.path: extra_path = [os.path.abspath(p) for p in opts.path] extra_path = os.pathsep.join(extra_path) os.environ['PATH'] = extra_path + os.pathsep + os.environ['PATH'] if not args: if not opts.all: sys.stderr.write('Specify -a to get all tests.\n') return 1 args = ['test'] tests = [] for arg in args: if os.path.isdir(arg): tests.extend(find_all_gyptest_files(os.path.normpath(arg))) else: if not is_test_name(os.path.basename(arg)): print >>sys.stderr, arg, 'is not a valid gyp test name.' 
sys.exit(1) tests.append(arg) if opts.list: for test in tests: print test sys.exit(0) CommandRunner.verbose = not opts.quiet CommandRunner.active = not opts.no_exec cr = CommandRunner() os.environ['PYTHONPATH'] = os.path.abspath('test/lib') if not opts.quiet: sys.stdout.write('PYTHONPATH=%s\n' % os.environ['PYTHONPATH']) passed = [] failed = [] no_result = [] if opts.format: format_list = opts.format.split(',') else: # TODO: not duplicate this mapping from pylib/gyp/__init__.py format_list = { 'aix5': ['make'], 'freebsd7': ['make'], 'freebsd8': ['make'], 'openbsd5': ['make'], 'cygwin': ['msvs'], 'win32': ['msvs', 'ninja'], 'linux2': ['make', 'ninja'], 'linux3': ['make', 'ninja'], 'darwin': ['make', 'ninja', 'xcode', 'xcode-ninja'], }[sys.platform] for format in format_list: os.environ['TESTGYP_FORMAT'] = format if not opts.quiet: sys.stdout.write('TESTGYP_FORMAT=%s\n' % format) gyp_options = [] for option in opts.gyp_option: gyp_options += ['-G', option] if gyp_options and not opts.quiet: sys.stdout.write('Extra Gyp options: %s\n' % gyp_options) for test in tests: status = cr.run([sys.executable, test] + gyp_options, stdout=sys.stdout, stderr=sys.stderr) if status == 2: no_result.append(test) elif status: failed.append(test) else: passed.append(test) if not opts.quiet: def report(description, tests): if tests: if len(tests) == 1: sys.stdout.write("\n%s the following test:\n" % description) else: fmt = "\n%s the following %d tests:\n" sys.stdout.write(fmt % (description, len(tests))) sys.stdout.write("\t" + "\n\t".join(tests) + "\n") if opts.passed: report("Passed", passed) report("Failed", failed) report("No result from", no_result) if failed: return 1 else: return 0 if __name__ == "__main__": sys.exit(main()) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/0000755000000000000000000000000012631326456017607 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/samples/0000755000000000000000000000000012631326456020134 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/setup.py0000755000000000000000000000103012631326456020177 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2009 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. from setuptools import setup setup( name='gyp', version='0.1', description='Generate Your Projects', author='Chromium Authors', author_email='chromium-dev@googlegroups.com', url='http://code.google.com/p/gyp', package_dir = {'': 'pylib'}, packages=['gyp', 'gyp.generator'], entry_points = {'console_scripts': ['gyp=gyp:script_main'] } ) npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/0000755000000000000000000000000012631326456017630 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/buildbot/aosp_manifest.xml0000644000000000000000000017617212631326456023664 0ustar 00000000000000 npm_3.5.2.orig/node_modules/node-gyp/gyp/buildbot/buildbot_run.py0000755000000000000000000001020412631326456023336 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. 
"""Argument-less script to select what to run on the buildbots.""" import os import shutil import subprocess import sys BUILDBOT_DIR = os.path.dirname(os.path.abspath(__file__)) TRUNK_DIR = os.path.dirname(BUILDBOT_DIR) ROOT_DIR = os.path.dirname(TRUNK_DIR) CMAKE_DIR = os.path.join(ROOT_DIR, 'cmake') CMAKE_BIN_DIR = os.path.join(CMAKE_DIR, 'bin') OUT_DIR = os.path.join(TRUNK_DIR, 'out') def CallSubProcess(*args, **kwargs): """Wrapper around subprocess.call which treats errors as build exceptions.""" with open(os.devnull) as devnull_fd: retcode = subprocess.call(stdin=devnull_fd, *args, **kwargs) if retcode != 0: print '@@@STEP_EXCEPTION@@@' sys.exit(1) def PrepareCmake(): """Build CMake 2.8.8 since the version in Precise is 2.8.7.""" if os.environ['BUILDBOT_CLOBBER'] == '1': print '@@@BUILD_STEP Clobber CMake checkout@@@' shutil.rmtree(CMAKE_DIR) # We always build CMake 2.8.8, so no need to do anything # if the directory already exists. if os.path.isdir(CMAKE_DIR): return print '@@@BUILD_STEP Initialize CMake checkout@@@' os.mkdir(CMAKE_DIR) print '@@@BUILD_STEP Sync CMake@@@' CallSubProcess( ['git', 'clone', '--depth', '1', '--single-branch', '--branch', 'v2.8.8', '--', 'git://cmake.org/cmake.git', CMAKE_DIR], cwd=CMAKE_DIR) print '@@@BUILD_STEP Build CMake@@@' CallSubProcess( ['/bin/bash', 'bootstrap', '--prefix=%s' % CMAKE_DIR], cwd=CMAKE_DIR) CallSubProcess( ['make', 'cmake'], cwd=CMAKE_DIR) def GypTestFormat(title, format=None, msvs_version=None, tests=[]): """Run the gyp tests for a given format, emitting annotator tags. See annotator docs at: https://sites.google.com/a/chromium.org/dev/developers/testing/chromium-build-infrastructure/buildbot-annotations Args: format: gyp format to test. Returns: 0 for sucesss, 1 for failure. """ if not format: format = title print '@@@BUILD_STEP ' + title + '@@@' sys.stdout.flush() env = os.environ.copy() if msvs_version: env['GYP_MSVS_VERSION'] = msvs_version command = ' '.join( [sys.executable, 'gyp/gyptest.py', '--all', '--passed', '--format', format, '--path', CMAKE_BIN_DIR, '--chdir', 'gyp'] + tests) retcode = subprocess.call(command, cwd=ROOT_DIR, env=env, shell=True) if retcode: # Emit failure tag, and keep going. print '@@@STEP_FAILURE@@@' return 1 return 0 def GypBuild(): # Dump out/ directory. print '@@@BUILD_STEP cleanup@@@' print 'Removing %s...' % OUT_DIR shutil.rmtree(OUT_DIR, ignore_errors=True) print 'Done.' retcode = 0 if sys.platform.startswith('linux'): retcode += GypTestFormat('ninja') retcode += GypTestFormat('make') PrepareCmake() retcode += GypTestFormat('cmake') elif sys.platform == 'darwin': retcode += GypTestFormat('ninja') retcode += GypTestFormat('xcode') retcode += GypTestFormat('make') elif sys.platform == 'win32': retcode += GypTestFormat('ninja') if os.environ['BUILDBOT_BUILDERNAME'] == 'gyp-win64': retcode += GypTestFormat('msvs-ninja-2013', format='msvs-ninja', msvs_version='2013', tests=[ r'test\generator-output\gyptest-actions.py', r'test\generator-output\gyptest-relocate.py', r'test\generator-output\gyptest-rules.py']) retcode += GypTestFormat('msvs-2013', format='msvs', msvs_version='2013') else: raise Exception('Unknown platform') if retcode: # TODO(bradnelson): once the annotator supports a postscript (section for # after the build proper that could be used for cumulative failures), # use that instead of this. This isolates the final return value so # that it isn't misattributed to the last stage. 
print '@@@BUILD_STEP failures@@@' sys.exit(retcode) if __name__ == '__main__': GypBuild() npm_3.5.2.orig/node_modules/node-gyp/gyp/buildbot/commit_queue/0000755000000000000000000000000012631326456022770 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/buildbot/commit_queue/OWNERS0000644000000000000000000000017212631326456023730 0ustar 00000000000000set noparent bradnelson@chromium.org bradnelson@google.com iannucci@chromium.org scottmg@chromium.org thakis@chromium.org npm_3.5.2.orig/node_modules/node-gyp/gyp/buildbot/commit_queue/README0000644000000000000000000000023012631326456023643 0ustar 00000000000000cq_config.json describes the trybots that must pass in order to land a change through the commit queue. Comments are here as the file is strictly JSON. npm_3.5.2.orig/node_modules/node-gyp/gyp/buildbot/commit_queue/cq_config.json0000644000000000000000000000056712631326456025613 0ustar 00000000000000{ "trybots": { "launched": { "tryserver.nacl": { "gyp-presubmit": ["defaulttests"], "gyp-linux": ["defaulttests"], "gyp-mac": ["defaulttests"], "gyp-win32": ["defaulttests"], "gyp-win64": ["defaulttests"] } }, "triggered": { } } } npm_3.5.2.orig/node_modules/node-gyp/gyp/data/win/0000755000000000000000000000000012631326456020176 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/data/win/large-pdb-shim.cc0000644000000000000000000000121512631326456023277 0ustar 00000000000000// Copyright (c) 2013 Google Inc. All rights reserved. // Use of this source code is governed by a BSD-style license that can be // found in the LICENSE file. // This file is used to generate an empty .pdb -- with a 4KB pagesize -- that is // then used during the final link for modules that have large PDBs. Otherwise, // the linker will generate a pdb with a page size of 1KB, which imposes a limit // of 1GB on the .pdb. By generating an initial empty .pdb with the compiler // (rather than the linker), this limit is avoided. With this in place PDBs may // grow to 2GB. // // This file is referenced by the msvs_large_pdb mechanism in MSVSUtil.py. npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/0000755000000000000000000000000012631326456020406 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/MSVSNew.py0000644000000000000000000002753412631326456022225 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """New implementation of Visual Studio project generation.""" import os import random import gyp.common # hashlib is supplied as of Python 2.5 as the replacement interface for md5 # and other secure hashes. In 2.6, md5 is deprecated. Import hashlib if # available, avoiding a deprecation warning under 2.6. Import md5 otherwise, # preserving 2.4 compatibility. try: import hashlib _new_md5 = hashlib.md5 except ImportError: import md5 _new_md5 = md5.new # Initialize random number generator random.seed() # GUIDs for project types ENTRY_TYPE_GUIDS = { 'project': '{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}', 'folder': '{2150E333-8FDC-42A3-9474-1A3956D46DE8}', } #------------------------------------------------------------------------------ # Helper functions def MakeGuid(name, seed='msvs_new'): """Returns a GUID for the specified target name. Args: name: Target name. seed: Seed for MD5 hash. Returns: A GUID-like string calculated from the name and seed. This generates something which looks like a GUID, but depends only on the name and seed.
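(Illustrative, with a made-up value: MakeGuid('base') might return '{1A2B3C4D-5E6F-7081-92A3-B4C5D6E7F809}', and repeated calls with the same name and seed return that identical string.)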
This means the same name/seed will always generate the same GUID, so that projects and solutions which refer to each other can explicitly determine the GUID to refer to explicitly. It also means that the GUID will not change when the project for a target is rebuilt. """ # Calculate a MD5 signature for the seed and name. d = _new_md5(str(seed) + str(name)).hexdigest().upper() # Convert most of the signature to GUID form (discard the rest) guid = ('{' + d[:8] + '-' + d[8:12] + '-' + d[12:16] + '-' + d[16:20] + '-' + d[20:32] + '}') return guid #------------------------------------------------------------------------------ class MSVSSolutionEntry(object): def __cmp__(self, other): # Sort by name then guid (so things are in order on vs2008). return cmp((self.name, self.get_guid()), (other.name, other.get_guid())) class MSVSFolder(MSVSSolutionEntry): """Folder in a Visual Studio project or solution.""" def __init__(self, path, name = None, entries = None, guid = None, items = None): """Initializes the folder. Args: path: Full path to the folder. name: Name of the folder. entries: List of folder entries to nest inside this folder. May contain Folder or Project objects. May be None, if the folder is empty. guid: GUID to use for folder, if not None. items: List of solution items to include in the folder project. May be None, if the folder does not directly contain items. """ if name: self.name = name else: # Use last layer. self.name = os.path.basename(path) self.path = path self.guid = guid # Copy passed lists (or set to empty lists) self.entries = sorted(list(entries or [])) self.items = list(items or []) self.entry_type_guid = ENTRY_TYPE_GUIDS['folder'] def get_guid(self): if self.guid is None: # Use consistent guids for folders (so things don't regenerate). self.guid = MakeGuid(self.path, seed='msvs_folder') return self.guid #------------------------------------------------------------------------------ class MSVSProject(MSVSSolutionEntry): """Visual Studio project.""" def __init__(self, path, name = None, dependencies = None, guid = None, spec = None, build_file = None, config_platform_overrides = None, fixpath_prefix = None): """Initializes the project. Args: path: Absolute path to the project file. name: Name of project. If None, the name will be the same as the base name of the project file. dependencies: List of other Project objects this project is dependent upon, if not None. guid: GUID to use for project, if not None. spec: Dictionary specifying how to build this project. build_file: Filename of the .gyp file that the vcproj file comes from. config_platform_overrides: optional dict of configuration platforms to used in place of the default for this target. fixpath_prefix: the path used to adjust the behavior of _fixpath """ self.path = path self.guid = guid self.spec = spec self.build_file = build_file # Use project filename if name not specified self.name = name or os.path.splitext(os.path.basename(path))[0] # Copy passed lists (or set to empty lists) self.dependencies = list(dependencies or []) self.entry_type_guid = ENTRY_TYPE_GUIDS['project'] if config_platform_overrides: self.config_platform_overrides = config_platform_overrides else: self.config_platform_overrides = {} self.fixpath_prefix = fixpath_prefix self.msbuild_toolset = None def set_dependencies(self, dependencies): self.dependencies = list(dependencies or []) def get_guid(self): if self.guid is None: # Set GUID from path # TODO(rspangler): This is fragile. # 1. 
We can't just use the project filename sans path, since there could # be multiple projects with the same base name (for example, # foo/unittest.vcproj and bar/unittest.vcproj). # 2. The path needs to be relative to $SOURCE_ROOT, so that the project # GUID is the same whether it's included from base/base.sln or # foo/bar/baz/baz.sln. # 3. The GUID needs to be the same each time this builder is invoked, so # that we don't need to rebuild the solution when the project changes. # 4. We should be able to handle pre-built project files by reading the # GUID from the files. self.guid = MakeGuid(self.name) return self.guid def set_msbuild_toolset(self, msbuild_toolset): self.msbuild_toolset = msbuild_toolset #------------------------------------------------------------------------------ class MSVSSolution(object): """Visual Studio solution.""" def __init__(self, path, version, entries=None, variants=None, websiteProperties=True): """Initializes the solution. Args: path: Path to solution file. version: Format version to emit. entries: List of entries in solution. May contain Folder or Project objects. May be None, if the folder is empty. variants: List of build variant strings. If none, a default list will be used. websiteProperties: Flag to decide if the website properties section is generated. """ self.path = path self.websiteProperties = websiteProperties self.version = version # Copy passed lists (or set to empty lists) self.entries = list(entries or []) if variants: # Copy passed list self.variants = variants[:] else: # Use default self.variants = ['Debug|Win32', 'Release|Win32'] # TODO(rspangler): Need to be able to handle a mapping of solution config # to project config. Should we be able to handle variants being a dict, # or add a separate variant_map variable? If it's a dict, we can't # guarantee the order of variants since dict keys aren't ordered. # TODO(rspangler): Automatically write to disk for now; should delay until # node-evaluation time. self.Write() def Write(self, writer=gyp.common.WriteOnDiff): """Writes the solution file to disk. Raises: IndexError: An entry appears multiple times. """ # Walk the entry tree and collect all the folders and projects. all_entries = set() entries_to_check = self.entries[:] while entries_to_check: e = entries_to_check.pop(0) # If this entry has been visited, nothing to do. if e in all_entries: continue all_entries.add(e) # If this is a folder, check its entries too. if isinstance(e, MSVSFolder): entries_to_check += e.entries all_entries = sorted(all_entries) # Open file and print header f = writer(self.path) f.write('Microsoft Visual Studio Solution File, ' 'Format Version %s\r\n' % self.version.SolutionVersion()) f.write('# %s\r\n' % self.version.Description()) # Project entries sln_root = os.path.split(self.path)[0] for e in all_entries: relative_path = gyp.common.RelativePath(e.path, sln_root) # msbuild does not accept an empty folder_name. # use '.' in case relative_path is empty. folder_name = relative_path.replace('/', '\\') or '.' 
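# Illustrative sketch (example name and GUIDs made up): the f.write() below emits one solution entry per project or folder, of the form # Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "foo", "foo.vcproj", "{...}" # where the first GUID is the ENTRY_TYPE_GUIDS value for the entry's kind and the last is the entry's own deterministic GUID from get_guid().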
f.write('Project("%s") = "%s", "%s", "%s"\r\n' % ( e.entry_type_guid, # Entry type GUID e.name, # Folder name folder_name, # Folder name (again) e.get_guid(), # Entry GUID )) # TODO(rspangler): Need a way to configure this stuff if self.websiteProperties: f.write('\tProjectSection(WebsiteProperties) = preProject\r\n' '\t\tDebug.AspNetCompiler.Debug = "True"\r\n' '\t\tRelease.AspNetCompiler.Debug = "False"\r\n' '\tEndProjectSection\r\n') if isinstance(e, MSVSFolder): if e.items: f.write('\tProjectSection(SolutionItems) = preProject\r\n') for i in e.items: f.write('\t\t%s = %s\r\n' % (i, i)) f.write('\tEndProjectSection\r\n') if isinstance(e, MSVSProject): if e.dependencies: f.write('\tProjectSection(ProjectDependencies) = postProject\r\n') for d in e.dependencies: f.write('\t\t%s = %s\r\n' % (d.get_guid(), d.get_guid())) f.write('\tEndProjectSection\r\n') f.write('EndProject\r\n') # Global section f.write('Global\r\n') # Configurations (variants) f.write('\tGlobalSection(SolutionConfigurationPlatforms) = preSolution\r\n') for v in self.variants: f.write('\t\t%s = %s\r\n' % (v, v)) f.write('\tEndGlobalSection\r\n') # Sort config guids for easier diffing of solution changes. config_guids = [] config_guids_overrides = {} for e in all_entries: if isinstance(e, MSVSProject): config_guids.append(e.get_guid()) config_guids_overrides[e.get_guid()] = e.config_platform_overrides config_guids.sort() f.write('\tGlobalSection(ProjectConfigurationPlatforms) = postSolution\r\n') for g in config_guids: for v in self.variants: nv = config_guids_overrides[g].get(v, v) # Pick which project configuration to build for this solution # configuration. f.write('\t\t%s.%s.ActiveCfg = %s\r\n' % ( g, # Project GUID v, # Solution build configuration nv, # Project build config for that solution config )) # Enable project in this solution configuration. f.write('\t\t%s.%s.Build.0 = %s\r\n' % ( g, # Project GUID v, # Solution build configuration nv, # Project build config for that solution config )) f.write('\tEndGlobalSection\r\n') # TODO(rspangler): Should be able to configure this stuff too (though I've # never seen this be any different) f.write('\tGlobalSection(SolutionProperties) = preSolution\r\n') f.write('\t\tHideSolutionNode = FALSE\r\n') f.write('\tEndGlobalSection\r\n') # Folder mappings # Omit this section if there are no folders if any([e.entries for e in all_entries if isinstance(e, MSVSFolder)]): f.write('\tGlobalSection(NestedProjects) = preSolution\r\n') for e in all_entries: if not isinstance(e, MSVSFolder): continue # Does not apply to projects, only folders for subentry in e.entries: f.write('\t\t%s = %s\r\n' % (subentry.get_guid(), e.get_guid())) f.write('\tEndGlobalSection\r\n') f.write('EndGlobal\r\n') f.close() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/MSVSProject.py0000644000000000000000000001436312631326456023106 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Visual Studio project reader/writer.""" import gyp.common import gyp.easy_xml as easy_xml #------------------------------------------------------------------------------ class Tool(object): """Visual Studio tool.""" def __init__(self, name, attrs=None): """Initializes the tool. Args: name: Tool name. attrs: Dict of tool attributes; may be None. """ self._attrs = attrs or {} self._attrs['Name'] = name def _GetSpecification(self): """Creates an element for the tool. 
Returns: A new xml.dom.Element for the tool. """ return ['Tool', self._attrs] class Filter(object): """Visual Studio filter - that is, a virtual folder.""" def __init__(self, name, contents=None): """Initializes the folder. Args: name: Filter (folder) name. contents: List of filenames and/or Filter objects contained. """ self.name = name self.contents = list(contents or []) #------------------------------------------------------------------------------ class Writer(object): """Visual Studio XML project writer.""" def __init__(self, project_path, version, name, guid=None, platforms=None): """Initializes the project. Args: project_path: Path to the project file. version: Format version to emit. name: Name of the project. guid: GUID to use for project, if not None. platforms: Array of string, the supported platforms. If null, ['Win32'] """ self.project_path = project_path self.version = version self.name = name self.guid = guid # Default to Win32 for platforms. if not platforms: platforms = ['Win32'] # Initialize the specifications of the various sections. self.platform_section = ['Platforms'] for platform in platforms: self.platform_section.append(['Platform', {'Name': platform}]) self.tool_files_section = ['ToolFiles'] self.configurations_section = ['Configurations'] self.files_section = ['Files'] # Keep a dict keyed on filename to speed up access. self.files_dict = dict() def AddToolFile(self, path): """Adds a tool file to the project. Args: path: Relative path from project to tool file. """ self.tool_files_section.append(['ToolFile', {'RelativePath': path}]) def _GetSpecForConfiguration(self, config_type, config_name, attrs, tools): """Returns the specification for a configuration. Args: config_type: Type of configuration node. config_name: Configuration name. attrs: Dict of configuration attributes; may be None. tools: List of tools (strings or Tool objects); may be None. Returns: """ # Handle defaults if not attrs: attrs = {} if not tools: tools = [] # Add configuration node and its attributes node_attrs = attrs.copy() node_attrs['Name'] = config_name specification = [config_type, node_attrs] # Add tool nodes and their attributes if tools: for t in tools: if isinstance(t, Tool): specification.append(t._GetSpecification()) else: specification.append(Tool(t)._GetSpecification()) return specification def AddConfig(self, name, attrs=None, tools=None): """Adds a configuration to the project. Args: name: Configuration name. attrs: Dict of configuration attributes; may be None. tools: List of tools (strings or Tool objects); may be None. """ spec = self._GetSpecForConfiguration('Configuration', name, attrs, tools) self.configurations_section.append(spec) def _AddFilesToNode(self, parent, files): """Adds files and/or filters to the parent node. Args: parent: Destination node files: A list of Filter objects and/or relative paths to files. Will call itself recursively, if the files list contains Filter objects. """ for f in files: if isinstance(f, Filter): node = ['Filter', {'Name': f.name}] self._AddFilesToNode(node, f.contents) else: node = ['File', {'RelativePath': f}] self.files_dict[f] = node parent.append(node) def AddFiles(self, files): """Adds files to the project. Args: files: A list of Filter objects and/or relative paths to files. This makes a copy of the file/filter tree at the time of this call. If you later add files to a Filter object which was passed into a previous call to AddFiles(), it will not be reflected in this project. 
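Illustrative example (hypothetical names, assuming a Writer instance w): w.AddFiles(['main.cc', Filter('src', contents=['a.cc', 'b.cc'])]) adds one top-level file plus a virtual 'src' folder containing two files.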
""" self._AddFilesToNode(self.files_section, files) # TODO(rspangler) This also doesn't handle adding files to an existing # filter. That is, it doesn't merge the trees. def AddFileConfig(self, path, config, attrs=None, tools=None): """Adds a configuration to a file. Args: path: Relative path to the file. config: Name of configuration to add. attrs: Dict of configuration attributes; may be None. tools: List of tools (strings or Tool objects); may be None. Raises: ValueError: Relative path does not match any file added via AddFiles(). """ # Find the file node with the right relative path parent = self.files_dict.get(path) if not parent: raise ValueError('AddFileConfig: file "%s" not in project.' % path) # Add the config to the file node spec = self._GetSpecForConfiguration('FileConfiguration', config, attrs, tools) parent.append(spec) def WriteIfChanged(self): """Writes the project file.""" # First create XML content definition content = [ 'VisualStudioProject', {'ProjectType': 'Visual C++', 'Version': self.version.ProjectVersion(), 'Name': self.name, 'ProjectGUID': self.guid, 'RootNamespace': self.name, 'Keyword': 'Win32Proj' }, self.platform_section, self.tool_files_section, self.configurations_section, ['References'], # empty section self.files_section, ['Globals'] # empty section ] easy_xml.WriteXmlIfChanged(content, self.project_path, encoding="Windows-1252") npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings.py0000644000000000000000000012776512631326456023313 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. r"""Code to validate and convert settings of the Microsoft build tools. This file contains code to validate and convert settings of the Microsoft build tools. The function ConvertToMSBuildSettings(), ValidateMSVSSettings(), and ValidateMSBuildSettings() are the entry points. This file was created by comparing the projects created by Visual Studio 2008 and Visual Studio 2010 for all available settings through the user interface. The MSBuild schemas were also considered. They are typically found in the MSBuild install directory, e.g. c:\Program Files (x86)\MSBuild """ import sys import re # Dictionaries of settings validators. The key is the tool name, the value is # a dictionary mapping setting names to validation functions. _msvs_validators = {} _msbuild_validators = {} # A dictionary of settings converters. The key is the tool name, the value is # a dictionary mapping setting names to conversion functions. _msvs_to_msbuild_converters = {} # Tool name mapping from MSVS to MSBuild. _msbuild_name_of_tool = {} class _Tool(object): """Represents a tool used by MSVS or MSBuild. Attributes: msvs_name: The name of the tool in MSVS. msbuild_name: The name of the tool in MSBuild. """ def __init__(self, msvs_name, msbuild_name): self.msvs_name = msvs_name self.msbuild_name = msbuild_name def _AddTool(tool): """Adds a tool to the four dictionaries used to process settings. This only defines the tool. Each setting also needs to be added. Args: tool: The _Tool object to be added. """ _msvs_validators[tool.msvs_name] = {} _msbuild_validators[tool.msbuild_name] = {} _msvs_to_msbuild_converters[tool.msvs_name] = {} _msbuild_name_of_tool[tool.msvs_name] = tool.msbuild_name def _GetMSBuildToolSettings(msbuild_settings, tool): """Returns an MSBuild tool dictionary. 
Creates it if needed.""" return msbuild_settings.setdefault(tool.msbuild_name, {}) class _Type(object): """Type of settings (Base class).""" def ValidateMSVS(self, value): """Verifies that the value is legal for MSVS. Args: value: the value to check for this type. Raises: ValueError if value is not valid for MSVS. """ def ValidateMSBuild(self, value): """Verifies that the value is legal for MSBuild. Args: value: the value to check for this type. Raises: ValueError if value is not valid for MSBuild. """ def ConvertToMSBuild(self, value): """Returns the MSBuild equivalent of the MSVS value given. Args: value: the MSVS value to convert. Returns: the MSBuild equivalent. Raises: ValueError if value is not valid. """ return value class _String(_Type): """A setting that's just a string.""" def ValidateMSVS(self, value): if not isinstance(value, basestring): raise ValueError('expected string; got %r' % value) def ValidateMSBuild(self, value): if not isinstance(value, basestring): raise ValueError('expected string; got %r' % value) def ConvertToMSBuild(self, value): # Convert the macros return ConvertVCMacrosToMSBuild(value) class _StringList(_Type): """A setting that's a list of strings.""" def ValidateMSVS(self, value): if not isinstance(value, basestring) and not isinstance(value, list): raise ValueError('expected string list; got %r' % value) def ValidateMSBuild(self, value): if not isinstance(value, basestring) and not isinstance(value, list): raise ValueError('expected string list; got %r' % value) def ConvertToMSBuild(self, value): # Convert the macros if isinstance(value, list): return [ConvertVCMacrosToMSBuild(i) for i in value] else: return ConvertVCMacrosToMSBuild(value) class _Boolean(_Type): """Boolean setting; can have the values 'false' or 'true'.""" def _Validate(self, value): if value != 'true' and value != 'false': raise ValueError('expected bool; got %r' % value) def ValidateMSVS(self, value): self._Validate(value) def ValidateMSBuild(self, value): self._Validate(value) def ConvertToMSBuild(self, value): self._Validate(value) return value class _Integer(_Type): """Integer settings.""" def __init__(self, msbuild_base=10): _Type.__init__(self) self._msbuild_base = msbuild_base def ValidateMSVS(self, value): # Try to convert; this will raise ValueError if invalid. self.ConvertToMSBuild(value) def ValidateMSBuild(self, value): # Try to convert; this will raise ValueError if invalid. int(value, self._msbuild_base) def ConvertToMSBuild(self, value): msbuild_format = (self._msbuild_base == 10) and '%d' or '0x%04x' return msbuild_format % int(value) class _Enumeration(_Type): """Type of setting that is an enumeration. In MSVS, the values are indexes like '0', '1', and '2'. MSBuild uses text labels that are more representative, like 'Win32'. Constructor args: label_list: an array of MSBuild labels that correspond to the MSVS index. In the rare cases where MSVS has skipped an index value, None is used in the array to indicate the unused spot. new: an array of labels that are new to MSBuild. """ def __init__(self, label_list, new=None): _Type.__init__(self) self._label_list = label_list self._msbuild_values = set(value for value in label_list if value is not None) if new is not None: self._msbuild_values.update(new) def ValidateMSVS(self, value): # Try to convert. It will raise an exception if not valid.
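# Illustrative example: with the _newly_boolean labels defined below (['', 'false', 'true']), the MSVS value '2' converts to the MSBuild label 'true', while an out-of-range value such as '5' makes ConvertToMSBuild raise ValueError.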
self.ConvertToMSBuild(value) def ValidateMSBuild(self, value): if value not in self._msbuild_values: raise ValueError('unrecognized enumerated value %s' % value) def ConvertToMSBuild(self, value): index = int(value) if index < 0 or index >= len(self._label_list): raise ValueError('index value (%d) not in expected range [0, %d)' % (index, len(self._label_list))) label = self._label_list[index] if label is None: raise ValueError('converted value for %s not specified.' % value) return label # Instantiate the various generic types. _boolean = _Boolean() _integer = _Integer() # For now, we don't do any special validation on these types: _string = _String() _file_name = _String() _folder_name = _String() _file_list = _StringList() _folder_list = _StringList() _string_list = _StringList() # Some boolean settings went from numerical values to boolean. The # mapping is 0: default, 1: false, 2: true. _newly_boolean = _Enumeration(['', 'false', 'true']) def _Same(tool, name, setting_type): """Defines a setting that has the same name in MSVS and MSBuild. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. name: the name of the setting. setting_type: the type of this setting. """ _Renamed(tool, name, name, setting_type) def _Renamed(tool, msvs_name, msbuild_name, setting_type): """Defines a setting for which the name has changed. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. msvs_name: the name of the MSVS setting. msbuild_name: the name of the MSBuild setting. setting_type: the type of this setting. """ def _Translate(value, msbuild_settings): msbuild_tool_settings = _GetMSBuildToolSettings(msbuild_settings, tool) msbuild_tool_settings[msbuild_name] = setting_type.ConvertToMSBuild(value) _msvs_validators[tool.msvs_name][msvs_name] = setting_type.ValidateMSVS _msbuild_validators[tool.msbuild_name][msbuild_name] = ( setting_type.ValidateMSBuild) _msvs_to_msbuild_converters[tool.msvs_name][msvs_name] = _Translate def _Moved(tool, settings_name, msbuild_tool_name, setting_type): _MovedAndRenamed(tool, settings_name, msbuild_tool_name, settings_name, setting_type) def _MovedAndRenamed(tool, msvs_settings_name, msbuild_tool_name, msbuild_settings_name, setting_type): """Defines a setting that may have moved to a new section. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. msvs_settings_name: the MSVS name of the setting. msbuild_tool_name: the name of the MSBuild tool to place the setting under. msbuild_settings_name: the MSBuild name of the setting. setting_type: the type of this setting. """ def _Translate(value, msbuild_settings): tool_settings = msbuild_settings.setdefault(msbuild_tool_name, {}) tool_settings[msbuild_settings_name] = setting_type.ConvertToMSBuild(value) _msvs_validators[tool.msvs_name][msvs_settings_name] = ( setting_type.ValidateMSVS) validator = setting_type.ValidateMSBuild _msbuild_validators[msbuild_tool_name][msbuild_settings_name] = validator _msvs_to_msbuild_converters[tool.msvs_name][msvs_settings_name] = _Translate def _MSVSOnly(tool, name, setting_type): """Defines a setting that is only found in MSVS. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. name: the name of the setting. setting_type: the type of this setting. """ def _Translate(unused_value, unused_msbuild_settings): # Since this is for MSVS only settings, no translation will happen. 
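# For example, 'Detect64BitPortabilityProblems' (registered via _MSVSOnly further below) is validated for MSVS but simply dropped from the MSBuild output.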
pass _msvs_validators[tool.msvs_name][name] = setting_type.ValidateMSVS _msvs_to_msbuild_converters[tool.msvs_name][name] = _Translate def _MSBuildOnly(tool, name, setting_type): """Defines a setting that is only found in MSBuild. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. name: the name of the setting. setting_type: the type of this setting. """ def _Translate(value, msbuild_settings): # Let msbuild-only properties get translated as-is from msvs_settings. tool_settings = msbuild_settings.setdefault(tool.msbuild_name, {}) tool_settings[name] = value _msbuild_validators[tool.msbuild_name][name] = setting_type.ValidateMSBuild _msvs_to_msbuild_converters[tool.msvs_name][name] = _Translate def _ConvertedToAdditionalOption(tool, msvs_name, flag): """Defines a setting that's handled via a command line option in MSBuild. Args: tool: a dictionary that gives the names of the tool for MSVS and MSBuild. msvs_name: the name of the MSVS setting that if 'true' becomes a flag flag: the flag to insert at the end of the AdditionalOptions """ def _Translate(value, msbuild_settings): if value == 'true': tool_settings = _GetMSBuildToolSettings(msbuild_settings, tool) if 'AdditionalOptions' in tool_settings: new_flags = '%s %s' % (tool_settings['AdditionalOptions'], flag) else: new_flags = flag tool_settings['AdditionalOptions'] = new_flags _msvs_validators[tool.msvs_name][msvs_name] = _boolean.ValidateMSVS _msvs_to_msbuild_converters[tool.msvs_name][msvs_name] = _Translate def _CustomGeneratePreprocessedFile(tool, msvs_name): def _Translate(value, msbuild_settings): tool_settings = _GetMSBuildToolSettings(msbuild_settings, tool) if value == '0': tool_settings['PreprocessToFile'] = 'false' tool_settings['PreprocessSuppressLineNumbers'] = 'false' elif value == '1': # /P tool_settings['PreprocessToFile'] = 'true' tool_settings['PreprocessSuppressLineNumbers'] = 'false' elif value == '2': # /EP /P tool_settings['PreprocessToFile'] = 'true' tool_settings['PreprocessSuppressLineNumbers'] = 'true' else: raise ValueError('value must be one of [0, 1, 2]; got %s' % value) # Create a bogus validator that looks for '0', '1', or '2' msvs_validator = _Enumeration(['a', 'b', 'c']).ValidateMSVS _msvs_validators[tool.msvs_name][msvs_name] = msvs_validator msbuild_validator = _boolean.ValidateMSBuild msbuild_tool_validators = _msbuild_validators[tool.msbuild_name] msbuild_tool_validators['PreprocessToFile'] = msbuild_validator msbuild_tool_validators['PreprocessSuppressLineNumbers'] = msbuild_validator _msvs_to_msbuild_converters[tool.msvs_name][msvs_name] = _Translate fix_vc_macro_slashes_regex_list = ('IntDir', 'OutDir') fix_vc_macro_slashes_regex = re.compile( r'(\$\((?:%s)\))(?:[\\/]+)' % "|".join(fix_vc_macro_slashes_regex_list) ) # Regular expression to detect keys that were generated by exclusion lists _EXCLUDED_SUFFIX_RE = re.compile('^(.*)_excluded$') def _ValidateExclusionSetting(setting, settings, error_msg, stderr=sys.stderr): """Verify that 'setting' is valid if it is generated from an exclusion list. If the setting appears to be generated from an exclusion list, the root name is checked. Args: setting: A string that is the setting name to validate settings: A dictionary where the keys are valid settings error_msg: The message to emit in the event of error stderr: The stream receiving the error messages. """ # This may be unrecognized because it's an exclusion list. If the # setting name has the _excluded suffix, then check the root name. 
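# Illustrative example: 'AdditionalIncludeDirectories_excluded' is accepted because its root name 'AdditionalIncludeDirectories' is a known setting, while a hypothetical 'Bogus_excluded' still warns.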
unrecognized = True m = re.match(_EXCLUDED_SUFFIX_RE, setting) if m: root_setting = m.group(1) unrecognized = root_setting not in settings if unrecognized: # We don't know this setting. Give a warning. print >> stderr, error_msg def FixVCMacroSlashes(s): """Replace macros which have excessive following slashes. These macros are known to have a built-in trailing slash. Furthermore, many scripts hiccup on processing paths with extra slashes in the middle. This list is probably not exhaustive. Add as needed. """ if '$' in s: s = fix_vc_macro_slashes_regex.sub(r'\1', s) return s def ConvertVCMacrosToMSBuild(s): """Convert the MSVS macros found in the string to the MSBuild equivalent. This list is probably not exhaustive. Add as needed. """ if '$' in s: replace_map = { '$(ConfigurationName)': '$(Configuration)', '$(InputDir)': '%(RelativeDir)', '$(InputExt)': '%(Extension)', '$(InputFileName)': '%(Filename)%(Extension)', '$(InputName)': '%(Filename)', '$(InputPath)': '%(Identity)', '$(ParentName)': '$(ProjectFileName)', '$(PlatformName)': '$(Platform)', '$(SafeInputName)': '%(Filename)', } for old, new in replace_map.iteritems(): s = s.replace(old, new) s = FixVCMacroSlashes(s) return s def ConvertToMSBuildSettings(msvs_settings, stderr=sys.stderr): """Converts MSVS settings (VS2008 and earlier) to MSBuild settings (VS2010+). Args: msvs_settings: A dictionary. The key is the tool name. The values are themselves dictionaries of settings and their values. stderr: The stream receiving the error messages. Returns: A dictionary of MSBuild settings. The key is either the MSBuild tool name or the empty string (for the global settings). The values are themselves dictionaries of settings and their values. """ msbuild_settings = {} for msvs_tool_name, msvs_tool_settings in msvs_settings.iteritems(): if msvs_tool_name in _msvs_to_msbuild_converters: msvs_tool = _msvs_to_msbuild_converters[msvs_tool_name] for msvs_setting, msvs_value in msvs_tool_settings.iteritems(): if msvs_setting in msvs_tool: # Invoke the translation function. try: msvs_tool[msvs_setting](msvs_value, msbuild_settings) except ValueError, e: print >> stderr, ('Warning: while converting %s/%s to MSBuild, ' '%s' % (msvs_tool_name, msvs_setting, e)) else: _ValidateExclusionSetting(msvs_setting, msvs_tool, ('Warning: unrecognized setting %s/%s ' 'while converting to MSBuild.' % (msvs_tool_name, msvs_setting)), stderr) else: print >> stderr, ('Warning: unrecognized tool %s while converting to ' 'MSBuild.' % msvs_tool_name) return msbuild_settings def ValidateMSVSSettings(settings, stderr=sys.stderr): """Validates that the names of the settings are valid for MSVS. Args: settings: A dictionary. The key is the tool name. The values are themselves dictionaries of settings and their values. stderr: The stream receiving the error messages. """ _ValidateSettings(_msvs_validators, settings, stderr) def ValidateMSBuildSettings(settings, stderr=sys.stderr): """Validates that the names of the settings are valid for MSBuild. Args: settings: A dictionary. The key is the tool name. The values are themselves dictionaries of settings and their values. stderr: The stream receiving the error messages. """ _ValidateSettings(_msbuild_validators, settings, stderr) def _ValidateSettings(validators, settings, stderr): """Validates that the settings are valid for MSBuild or MSVS. We currently only validate the names of the settings, not their values. Args: validators: A dictionary of tools and their validators. settings: A dictionary. The key is the tool name.
The values are themselves dictionaries of settings and their values. stderr: The stream receiving the error messages. """ for tool_name in settings: if tool_name in validators: tool_validators = validators[tool_name] for setting, value in settings[tool_name].iteritems(): if setting in tool_validators: try: tool_validators[setting](value) except ValueError, e: print >> stderr, ('Warning: for %s/%s, %s' % (tool_name, setting, e)) else: _ValidateExclusionSetting(setting, tool_validators, ('Warning: unrecognized setting %s/%s' % (tool_name, setting)), stderr) else: print >> stderr, ('Warning: unrecognized tool %s' % tool_name) # MSVS and MSBuild names of the tools. _compile = _Tool('VCCLCompilerTool', 'ClCompile') _link = _Tool('VCLinkerTool', 'Link') _midl = _Tool('VCMIDLTool', 'Midl') _rc = _Tool('VCResourceCompilerTool', 'ResourceCompile') _lib = _Tool('VCLibrarianTool', 'Lib') _manifest = _Tool('VCManifestTool', 'Manifest') _masm = _Tool('MASM', 'MASM') _AddTool(_compile) _AddTool(_link) _AddTool(_midl) _AddTool(_rc) _AddTool(_lib) _AddTool(_manifest) _AddTool(_masm) # Add sections only found in the MSBuild settings. _msbuild_validators[''] = {} _msbuild_validators['ProjectReference'] = {} _msbuild_validators['ManifestResourceCompile'] = {} # Descriptions of the compiler options, i.e. VCCLCompilerTool in MSVS and # ClCompile in MSBuild. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\cl.xml" for # the schema of the MSBuild ClCompile settings. # Options that have the same name in MSVS and MSBuild _Same(_compile, 'AdditionalIncludeDirectories', _folder_list) # /I _Same(_compile, 'AdditionalOptions', _string_list) _Same(_compile, 'AdditionalUsingDirectories', _folder_list) # /AI _Same(_compile, 'AssemblerListingLocation', _file_name) # /Fa _Same(_compile, 'BrowseInformationFile', _file_name) _Same(_compile, 'BufferSecurityCheck', _boolean) # /GS _Same(_compile, 'DisableLanguageExtensions', _boolean) # /Za _Same(_compile, 'DisableSpecificWarnings', _string_list) # /wd _Same(_compile, 'EnableFiberSafeOptimizations', _boolean) # /GT _Same(_compile, 'EnablePREfast', _boolean) # /analyze Visible='false' _Same(_compile, 'ExpandAttributedSource', _boolean) # /Fx _Same(_compile, 'FloatingPointExceptions', _boolean) # /fp:except _Same(_compile, 'ForceConformanceInForLoopScope', _boolean) # /Zc:forScope _Same(_compile, 'ForcedIncludeFiles', _file_list) # /FI _Same(_compile, 'ForcedUsingFiles', _file_list) # /FU _Same(_compile, 'GenerateXMLDocumentationFiles', _boolean) # /doc _Same(_compile, 'IgnoreStandardIncludePath', _boolean) # /X _Same(_compile, 'MinimalRebuild', _boolean) # /Gm _Same(_compile, 'OmitDefaultLibName', _boolean) # /Zl _Same(_compile, 'OmitFramePointers', _boolean) # /Oy _Same(_compile, 'PreprocessorDefinitions', _string_list) # /D _Same(_compile, 'ProgramDataBaseFileName', _file_name) # /Fd _Same(_compile, 'RuntimeTypeInfo', _boolean) # /GR _Same(_compile, 'ShowIncludes', _boolean) # /showIncludes _Same(_compile, 'SmallerTypeCheck', _boolean) # /RTCc _Same(_compile, 'StringPooling', _boolean) # /GF _Same(_compile, 'SuppressStartupBanner', _boolean) # /nologo _Same(_compile, 'TreatWChar_tAsBuiltInType', _boolean) # /Zc:wchar_t _Same(_compile, 'UndefineAllPreprocessorDefinitions', _boolean) # /u _Same(_compile, 'UndefinePreprocessorDefinitions', _string_list) # /U _Same(_compile, 'UseFullPaths', _boolean) # /FC _Same(_compile, 'WholeProgramOptimization', _boolean) # /GL _Same(_compile, 'XMLDocumentationFileName', _file_name) _Same(_compile, 'AssemblerOutput',
_Same(_compile, 'AssemblerOutput', _Enumeration(['NoListing', 'AssemblyCode', # /FA 'All', # /FAcs 'AssemblyAndMachineCode', # /FAc 'AssemblyAndSourceCode'])) # /FAs _Same(_compile, 'BasicRuntimeChecks', _Enumeration(['Default', 'StackFrameRuntimeCheck', # /RTCs 'UninitializedLocalUsageCheck', # /RTCu 'EnableFastChecks'])) # /RTC1 _Same(_compile, 'BrowseInformation', _Enumeration(['false', 'true', # /FR 'true'])) # /Fr _Same(_compile, 'CallingConvention', _Enumeration(['Cdecl', # /Gd 'FastCall', # /Gr 'StdCall', # /Gz 'VectorCall'])) # /Gv _Same(_compile, 'CompileAs', _Enumeration(['Default', 'CompileAsC', # /TC 'CompileAsCpp'])) # /TP _Same(_compile, 'DebugInformationFormat', _Enumeration(['', # Disabled 'OldStyle', # /Z7 None, 'ProgramDatabase', # /Zi 'EditAndContinue'])) # /ZI _Same(_compile, 'EnableEnhancedInstructionSet', _Enumeration(['NotSet', 'StreamingSIMDExtensions', # /arch:SSE 'StreamingSIMDExtensions2', # /arch:SSE2 'AdvancedVectorExtensions', # /arch:AVX (vs2012+) 'NoExtensions', # /arch:IA32 (vs2012+) # This one only exists in the new msbuild format. 'AdvancedVectorExtensions2', # /arch:AVX2 (vs2013r2+) ])) _Same(_compile, 'ErrorReporting', _Enumeration(['None', # /errorReport:none 'Prompt', # /errorReport:prompt 'Queue'], # /errorReport:queue new=['Send'])) # /errorReport:send _Same(_compile, 'ExceptionHandling', _Enumeration(['false', 'Sync', # /EHsc 'Async'], # /EHa new=['SyncCThrow'])) # /EHs _Same(_compile, 'FavorSizeOrSpeed', _Enumeration(['Neither', 'Speed', # /Ot 'Size'])) # /Os _Same(_compile, 'FloatingPointModel', _Enumeration(['Precise', # /fp:precise 'Strict', # /fp:strict 'Fast'])) # /fp:fast _Same(_compile, 'InlineFunctionExpansion', _Enumeration(['Default', 'OnlyExplicitInline', # /Ob1 'AnySuitable'], # /Ob2 new=['Disabled'])) # /Ob0 _Same(_compile, 'Optimization', _Enumeration(['Disabled', # /Od 'MinSpace', # /O1 'MaxSpeed', # /O2 'Full'])) # /Ox _Same(_compile, 'RuntimeLibrary', _Enumeration(['MultiThreaded', # /MT 'MultiThreadedDebug', # /MTd 'MultiThreadedDLL', # /MD 'MultiThreadedDebugDLL'])) # /MDd _Same(_compile, 'StructMemberAlignment', _Enumeration(['Default', '1Byte', # /Zp1 '2Bytes', # /Zp2 '4Bytes', # /Zp4 '8Bytes', # /Zp8 '16Bytes'])) # /Zp16 _Same(_compile, 'WarningLevel', _Enumeration(['TurnOffAllWarnings', # /W0 'Level1', # /W1 'Level2', # /W2 'Level3', # /W3 'Level4'], # /W4 new=['EnableAllWarnings'])) # /Wall # Options found in MSVS that have been renamed in MSBuild. _Renamed(_compile, 'EnableFunctionLevelLinking', 'FunctionLevelLinking', _boolean) # /Gy _Renamed(_compile, 'EnableIntrinsicFunctions', 'IntrinsicFunctions', _boolean) # /Oi _Renamed(_compile, 'KeepComments', 'PreprocessKeepComments', _boolean) # /C _Renamed(_compile, 'ObjectFile', 'ObjectFileName', _file_name) # /Fo _Renamed(_compile, 'OpenMP', 'OpenMPSupport', _boolean) # /openmp _Renamed(_compile, 'PrecompiledHeaderThrough', 'PrecompiledHeaderFile', _file_name) # Used with /Yc and /Yu _Renamed(_compile, 'PrecompiledHeaderFile', 'PrecompiledHeaderOutputFile', _file_name) # /Fp _Renamed(_compile, 'UsePrecompiledHeader', 'PrecompiledHeader', _Enumeration(['NotUsing', # VS recognized '' for this value too. 'Create', # /Yc 'Use'])) # /Yu _Renamed(_compile, 'WarnAsError', 'TreatWarningAsError', _boolean) # /WX _ConvertedToAdditionalOption(_compile, 'DefaultCharIsUnsigned', '/J') # MSVS options not found in MSBuild. _MSVSOnly(_compile, 'Detect64BitPortabilityProblems', _boolean) _MSVSOnly(_compile, 'UseUnicodeResponseFiles', _boolean) # MSBuild options not found in MSVS.
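# (These names are registered with the MSBuild validator only; naming one of
# them in MSVS-style input, e.g. VCCLCompilerTool/MultiProcessorCompilation,
# draws an 'unrecognized setting' warning from ValidateMSVSSettings.)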
_MSBuildOnly(_compile, 'BuildingInIDE', _boolean) _MSBuildOnly(_compile, 'CompileAsManaged', _Enumeration([], new=['false', 'true'])) # /clr _MSBuildOnly(_compile, 'CreateHotpatchableImage', _boolean) # /hotpatch _MSBuildOnly(_compile, 'MultiProcessorCompilation', _boolean) # /MP _MSBuildOnly(_compile, 'PreprocessOutputPath', _string) # /Fi _MSBuildOnly(_compile, 'ProcessorNumber', _integer) # the number of processors _MSBuildOnly(_compile, 'TrackerLogDirectory', _folder_name) _MSBuildOnly(_compile, 'TreatSpecificWarningsAsErrors', _string_list) # /we _MSBuildOnly(_compile, 'UseUnicodeForAssemblerListing', _boolean) # /FAu # Defines a setting that needs very customized processing _CustomGeneratePreprocessedFile(_compile, 'GeneratePreprocessedFile') # Directives for converting MSVS VCLinkerTool to MSBuild Link. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\link.xml" for # the schema of the MSBuild Link settings. # Options that have the same name in MSVS and MSBuild _Same(_link, 'AdditionalDependencies', _file_list) _Same(_link, 'AdditionalLibraryDirectories', _folder_list) # /LIBPATH # /MANIFESTDEPENDENCY: _Same(_link, 'AdditionalManifestDependencies', _file_list) _Same(_link, 'AdditionalOptions', _string_list) _Same(_link, 'AddModuleNamesToAssembly', _file_list) # /ASSEMBLYMODULE _Same(_link, 'AllowIsolation', _boolean) # /ALLOWISOLATION _Same(_link, 'AssemblyLinkResource', _file_list) # /ASSEMBLYLINKRESOURCE _Same(_link, 'BaseAddress', _string) # /BASE _Same(_link, 'CLRUnmanagedCodeCheck', _boolean) # /CLRUNMANAGEDCODECHECK _Same(_link, 'DelayLoadDLLs', _file_list) # /DELAYLOAD _Same(_link, 'DelaySign', _boolean) # /DELAYSIGN _Same(_link, 'EmbedManagedResourceFile', _file_list) # /ASSEMBLYRESOURCE _Same(_link, 'EnableUAC', _boolean) # /MANIFESTUAC _Same(_link, 'EntryPointSymbol', _string) # /ENTRY _Same(_link, 'ForceSymbolReferences', _file_list) # /INCLUDE _Same(_link, 'FunctionOrder', _file_name) # /ORDER _Same(_link, 'GenerateDebugInformation', _boolean) # /DEBUG _Same(_link, 'GenerateMapFile', _boolean) # /MAP _Same(_link, 'HeapCommitSize', _string) _Same(_link, 'HeapReserveSize', _string) # /HEAP _Same(_link, 'IgnoreAllDefaultLibraries', _boolean) # /NODEFAULTLIB _Same(_link, 'IgnoreEmbeddedIDL', _boolean) # /IGNOREIDL _Same(_link, 'ImportLibrary', _file_name) # /IMPLIB _Same(_link, 'KeyContainer', _file_name) # /KEYCONTAINER _Same(_link, 'KeyFile', _file_name) # /KEYFILE _Same(_link, 'ManifestFile', _file_name) # /ManifestFile _Same(_link, 'MapExports', _boolean) # /MAPINFO:EXPORTS _Same(_link, 'MapFileName', _file_name) _Same(_link, 'MergedIDLBaseFileName', _file_name) # /IDLOUT _Same(_link, 'MergeSections', _string) # /MERGE _Same(_link, 'MidlCommandFile', _file_name) # /MIDL _Same(_link, 'ModuleDefinitionFile', _file_name) # /DEF _Same(_link, 'OutputFile', _file_name) # /OUT _Same(_link, 'PerUserRedirection', _boolean) _Same(_link, 'Profile', _boolean) # /PROFILE _Same(_link, 'ProfileGuidedDatabase', _file_name) # /PGD _Same(_link, 'ProgramDatabaseFile', _file_name) # /PDB _Same(_link, 'RegisterOutput', _boolean) _Same(_link, 'SetChecksum', _boolean) # /RELEASE _Same(_link, 'StackCommitSize', _string) _Same(_link, 'StackReserveSize', _string) # /STACK _Same(_link, 'StripPrivateSymbols', _file_name) # /PDBSTRIPPED _Same(_link, 'SupportUnloadOfDelayLoadedDLL', _boolean) # /DELAY:UNLOAD _Same(_link, 'SuppressStartupBanner', _boolean) # /NOLOGO _Same(_link, 'SwapRunFromCD', _boolean) # /SWAPRUN:CD _Same(_link, 'TurnOffAssemblyGeneration', _boolean) # /NOASSEMBLY 
_Same(_link, 'TypeLibraryFile', _file_name) # /TLBOUT _Same(_link, 'TypeLibraryResourceID', _integer) # /TLBID _Same(_link, 'UACUIAccess', _boolean) # /uiAccess='true' _Same(_link, 'Version', _string) # /VERSION _Same(_link, 'EnableCOMDATFolding', _newly_boolean) # /OPT:ICF _Same(_link, 'FixedBaseAddress', _newly_boolean) # /FIXED _Same(_link, 'LargeAddressAware', _newly_boolean) # /LARGEADDRESSAWARE _Same(_link, 'OptimizeReferences', _newly_boolean) # /OPT:REF _Same(_link, 'RandomizedBaseAddress', _newly_boolean) # /DYNAMICBASE _Same(_link, 'TerminalServerAware', _newly_boolean) # /TSAWARE _subsystem_enumeration = _Enumeration( ['NotSet', 'Console', # /SUBSYSTEM:CONSOLE 'Windows', # /SUBSYSTEM:WINDOWS 'Native', # /SUBSYSTEM:NATIVE 'EFI Application', # /SUBSYSTEM:EFI_APPLICATION 'EFI Boot Service Driver', # /SUBSYSTEM:EFI_BOOT_SERVICE_DRIVER 'EFI ROM', # /SUBSYSTEM:EFI_ROM 'EFI Runtime', # /SUBSYSTEM:EFI_RUNTIME_DRIVER 'WindowsCE'], # /SUBSYSTEM:WINDOWSCE new=['POSIX']) # /SUBSYSTEM:POSIX _target_machine_enumeration = _Enumeration( ['NotSet', 'MachineX86', # /MACHINE:X86 None, 'MachineARM', # /MACHINE:ARM 'MachineEBC', # /MACHINE:EBC 'MachineIA64', # /MACHINE:IA64 None, 'MachineMIPS', # /MACHINE:MIPS 'MachineMIPS16', # /MACHINE:MIPS16 'MachineMIPSFPU', # /MACHINE:MIPSFPU 'MachineMIPSFPU16', # /MACHINE:MIPSFPU16 None, None, None, 'MachineSH4', # /MACHINE:SH4 None, 'MachineTHUMB', # /MACHINE:THUMB 'MachineX64']) # /MACHINE:X64 _Same(_link, 'AssemblyDebug', _Enumeration(['', 'true', # /ASSEMBLYDEBUG 'false'])) # /ASSEMBLYDEBUG:DISABLE _Same(_link, 'CLRImageType', _Enumeration(['Default', 'ForceIJWImage', # /CLRIMAGETYPE:IJW 'ForcePureILImage', # /Switch="CLRIMAGETYPE:PURE 'ForceSafeILImage'])) # /Switch="CLRIMAGETYPE:SAFE _Same(_link, 'CLRThreadAttribute', _Enumeration(['DefaultThreadingAttribute', # /CLRTHREADATTRIBUTE:NONE 'MTAThreadingAttribute', # /CLRTHREADATTRIBUTE:MTA 'STAThreadingAttribute'])) # /CLRTHREADATTRIBUTE:STA _Same(_link, 'DataExecutionPrevention', _Enumeration(['', 'false', # /NXCOMPAT:NO 'true'])) # /NXCOMPAT _Same(_link, 'Driver', _Enumeration(['NotSet', 'Driver', # /Driver 'UpOnly', # /DRIVER:UPONLY 'WDM'])) # /DRIVER:WDM _Same(_link, 'LinkTimeCodeGeneration', _Enumeration(['Default', 'UseLinkTimeCodeGeneration', # /LTCG 'PGInstrument', # /LTCG:PGInstrument 'PGOptimization', # /LTCG:PGOptimize 'PGUpdate'])) # /LTCG:PGUpdate _Same(_link, 'ShowProgress', _Enumeration(['NotSet', 'LinkVerbose', # /VERBOSE 'LinkVerboseLib'], # /VERBOSE:Lib new=['LinkVerboseICF', # /VERBOSE:ICF 'LinkVerboseREF', # /VERBOSE:REF 'LinkVerboseSAFESEH', # /VERBOSE:SAFESEH 'LinkVerboseCLR'])) # /VERBOSE:CLR _Same(_link, 'SubSystem', _subsystem_enumeration) _Same(_link, 'TargetMachine', _target_machine_enumeration) _Same(_link, 'UACExecutionLevel', _Enumeration(['AsInvoker', # /level='asInvoker' 'HighestAvailable', # /level='highestAvailable' 'RequireAdministrator'])) # /level='requireAdministrator' _Same(_link, 'MinimumRequiredVersion', _string) _Same(_link, 'TreatLinkerWarningAsErrors', _boolean) # /WX # Options found in MSVS that have been renamed in MSBuild. 
_Renamed(_link, 'ErrorReporting', 'LinkErrorReporting', _Enumeration(['NoErrorReport', # /ERRORREPORT:NONE 'PromptImmediately', # /ERRORREPORT:PROMPT 'QueueForNextLogin'], # /ERRORREPORT:QUEUE new=['SendErrorReport'])) # /ERRORREPORT:SEND _Renamed(_link, 'IgnoreDefaultLibraryNames', 'IgnoreSpecificDefaultLibraries', _file_list) # /NODEFAULTLIB _Renamed(_link, 'ResourceOnlyDLL', 'NoEntryPoint', _boolean) # /NOENTRY _Renamed(_link, 'SwapRunFromNet', 'SwapRunFromNET', _boolean) # /SWAPRUN:NET _Moved(_link, 'GenerateManifest', '', _boolean) _Moved(_link, 'IgnoreImportLibrary', '', _boolean) _Moved(_link, 'LinkIncremental', '', _newly_boolean) _Moved(_link, 'LinkLibraryDependencies', 'ProjectReference', _boolean) _Moved(_link, 'UseLibraryDependencyInputs', 'ProjectReference', _boolean) # MSVS options not found in MSBuild. _MSVSOnly(_link, 'OptimizeForWindows98', _newly_boolean) _MSVSOnly(_link, 'UseUnicodeResponseFiles', _boolean) # MSBuild options not found in MSVS. _MSBuildOnly(_link, 'BuildingInIDE', _boolean) _MSBuildOnly(_link, 'ImageHasSafeExceptionHandlers', _boolean) # /SAFESEH _MSBuildOnly(_link, 'LinkDLL', _boolean) # /DLL Visible='false' _MSBuildOnly(_link, 'LinkStatus', _boolean) # /LTCG:STATUS _MSBuildOnly(_link, 'PreventDllBinding', _boolean) # /ALLOWBIND _MSBuildOnly(_link, 'SupportNobindOfDelayLoadedDLL', _boolean) # /DELAY:NOBIND _MSBuildOnly(_link, 'TrackerLogDirectory', _folder_name) _MSBuildOnly(_link, 'MSDOSStubFileName', _file_name) # /STUB Visible='false' _MSBuildOnly(_link, 'SectionAlignment', _integer) # /ALIGN _MSBuildOnly(_link, 'SpecifySectionAttributes', _string) # /SECTION _MSBuildOnly(_link, 'ForceFileOutput', _Enumeration([], new=['Enabled', # /FORCE # /FORCE:MULTIPLE 'MultiplyDefinedSymbolOnly', 'UndefinedSymbolOnly'])) # /FORCE:UNRESOLVED _MSBuildOnly(_link, 'CreateHotPatchableImage', _Enumeration([], new=['Enabled', # /FUNCTIONPADMIN 'X86Image', # /FUNCTIONPADMIN:5 'X64Image', # /FUNCTIONPADMIN:6 'ItaniumImage'])) # /FUNCTIONPADMIN:16 _MSBuildOnly(_link, 'CLRSupportLastError', _Enumeration([], new=['Enabled', # /CLRSupportLastError 'Disabled', # /CLRSupportLastError:NO # /CLRSupportLastError:SYSTEMDLL 'SystemDlls'])) # Directives for converting VCResourceCompilerTool to ResourceCompile. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\rc.xml" for # the schema of the MSBuild ResourceCompile settings. _Same(_rc, 'AdditionalOptions', _string_list) _Same(_rc, 'AdditionalIncludeDirectories', _folder_list) # /I _Same(_rc, 'Culture', _Integer(msbuild_base=16)) _Same(_rc, 'IgnoreStandardIncludePath', _boolean) # /X _Same(_rc, 'PreprocessorDefinitions', _string_list) # /D _Same(_rc, 'ResourceOutputFileName', _string) # /fo _Same(_rc, 'ShowProgress', _boolean) # /v # There is no UI in VisualStudio 2008 to set the following properties. # However they are found in CL and other tools. Include them here for # completeness, as they are very likely to have the same usage pattern. _Same(_rc, 'SuppressStartupBanner', _boolean) # /nologo _Same(_rc, 'UndefinePreprocessorDefinitions', _string_list) # /u # MSBuild options not found in MSVS. _MSBuildOnly(_rc, 'NullTerminateStrings', _boolean) # /n _MSBuildOnly(_rc, 'TrackerLogDirectory', _folder_name) # Directives for converting VCMIDLTool to Midl. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\midl.xml" for # the schema of the MSBuild Midl settings. 
_Same(_midl, 'AdditionalIncludeDirectories', _folder_list) # /I _Same(_midl, 'AdditionalOptions', _string_list) _Same(_midl, 'CPreprocessOptions', _string) # /cpp_opt _Same(_midl, 'ErrorCheckAllocations', _boolean) # /error allocation _Same(_midl, 'ErrorCheckBounds', _boolean) # /error bounds_check _Same(_midl, 'ErrorCheckEnumRange', _boolean) # /error enum _Same(_midl, 'ErrorCheckRefPointers', _boolean) # /error ref _Same(_midl, 'ErrorCheckStubData', _boolean) # /error stub_data _Same(_midl, 'GenerateStublessProxies', _boolean) # /Oicf _Same(_midl, 'GenerateTypeLibrary', _boolean) _Same(_midl, 'HeaderFileName', _file_name) # /h _Same(_midl, 'IgnoreStandardIncludePath', _boolean) # /no_def_idir _Same(_midl, 'InterfaceIdentifierFileName', _file_name) # /iid _Same(_midl, 'MkTypLibCompatible', _boolean) # /mktyplib203 _Same(_midl, 'OutputDirectory', _string) # /out _Same(_midl, 'PreprocessorDefinitions', _string_list) # /D _Same(_midl, 'ProxyFileName', _file_name) # /proxy _Same(_midl, 'RedirectOutputAndErrors', _file_name) # /o _Same(_midl, 'SuppressStartupBanner', _boolean) # /nologo _Same(_midl, 'TypeLibraryName', _file_name) # /tlb _Same(_midl, 'UndefinePreprocessorDefinitions', _string_list) # /U _Same(_midl, 'WarnAsError', _boolean) # /WX _Same(_midl, 'DefaultCharType', _Enumeration(['Unsigned', # /char unsigned 'Signed', # /char signed 'Ascii'])) # /char ascii7 _Same(_midl, 'TargetEnvironment', _Enumeration(['NotSet', 'Win32', # /env win32 'Itanium', # /env ia64 'X64'])) # /env x64 _Same(_midl, 'EnableErrorChecks', _Enumeration(['EnableCustom', 'None', # /error none 'All'])) # /error all _Same(_midl, 'StructMemberAlignment', _Enumeration(['NotSet', '1', # Zp1 '2', # Zp2 '4', # Zp4 '8'])) # Zp8 _Same(_midl, 'WarningLevel', _Enumeration(['0', # /W0 '1', # /W1 '2', # /W2 '3', # /W3 '4'])) # /W4 _Renamed(_midl, 'DLLDataFileName', 'DllDataFileName', _file_name) # /dlldata _Renamed(_midl, 'ValidateParameters', 'ValidateAllParameters', _boolean) # /robust # MSBuild options not found in MSVS. _MSBuildOnly(_midl, 'ApplicationConfigurationMode', _boolean) # /app_config _MSBuildOnly(_midl, 'ClientStubFile', _file_name) # /cstub _MSBuildOnly(_midl, 'GenerateClientFiles', _Enumeration([], new=['Stub', # /client stub 'None'])) # /client none _MSBuildOnly(_midl, 'GenerateServerFiles', _Enumeration([], new=['Stub', # /server stub 'None'])) # /server none _MSBuildOnly(_midl, 'LocaleID', _integer) # /lcid DECIMAL _MSBuildOnly(_midl, 'ServerStubFile', _file_name) # /sstub _MSBuildOnly(_midl, 'SuppressCompilerWarnings', _boolean) # /no_warn _MSBuildOnly(_midl, 'TrackerLogDirectory', _folder_name) _MSBuildOnly(_midl, 'TypeLibFormat', _Enumeration([], new=['NewFormat', # /newtlb 'OldFormat'])) # /oldtlb
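# A hypothetical smoke check (illustrative only; _DemoValidateTypeLibFormat
# is not part of gyp), showing how the MSBuild-only Midl directives above
# interact with the validators defined earlier in this file.
def _DemoValidateTypeLibFormat():
  import StringIO
  log = StringIO.StringIO()
  # Accepted: TypeLibFormat is registered for the MSBuild 'Midl' tool.
  ValidateMSBuildSettings({'Midl': {'TypeLibFormat': 'NewFormat'}}, log)
  # Flagged: the same name is unknown to the MSVS validator, so this logs
  # 'Warning: unrecognized setting VCMIDLTool/TypeLibFormat'.
  ValidateMSVSSettings({'VCMIDLTool': {'TypeLibFormat': 'NewFormat'}}, log)
  return log.getvalue()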
# Directives for converting VCLibrarianTool to Lib. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\lib.xml" for # the schema of the MSBuild Lib settings. _Same(_lib, 'AdditionalDependencies', _file_list) _Same(_lib, 'AdditionalLibraryDirectories', _folder_list) # /LIBPATH _Same(_lib, 'AdditionalOptions', _string_list) _Same(_lib, 'ExportNamedFunctions', _string_list) # /EXPORT _Same(_lib, 'ForceSymbolReferences', _string) # /INCLUDE _Same(_lib, 'IgnoreAllDefaultLibraries', _boolean) # /NODEFAULTLIB _Same(_lib, 'IgnoreSpecificDefaultLibraries', _file_list) # /NODEFAULTLIB _Same(_lib, 'ModuleDefinitionFile', _file_name) # /DEF _Same(_lib, 'OutputFile', _file_name) # /OUT _Same(_lib, 'SuppressStartupBanner', _boolean) # /NOLOGO _Same(_lib, 'UseUnicodeResponseFiles', _boolean) _Same(_lib, 'LinkTimeCodeGeneration', _boolean) # /LTCG _Same(_lib, 'TargetMachine', _target_machine_enumeration) # TODO(jeanluc) _link defines the same value that gets moved to # ProjectReference. We may want to validate that they are consistent. _Moved(_lib, 'LinkLibraryDependencies', 'ProjectReference', _boolean) _MSBuildOnly(_lib, 'DisplayLibrary', _string) # /LIST Visible='false' _MSBuildOnly(_lib, 'ErrorReporting', _Enumeration([], new=['PromptImmediately', # /ERRORREPORT:PROMPT 'QueueForNextLogin', # /ERRORREPORT:QUEUE 'SendErrorReport', # /ERRORREPORT:SEND 'NoErrorReport'])) # /ERRORREPORT:NONE _MSBuildOnly(_lib, 'MinimumRequiredVersion', _string) _MSBuildOnly(_lib, 'Name', _file_name) # /NAME _MSBuildOnly(_lib, 'RemoveObjects', _file_list) # /REMOVE _MSBuildOnly(_lib, 'SubSystem', _subsystem_enumeration) _MSBuildOnly(_lib, 'TrackerLogDirectory', _folder_name) _MSBuildOnly(_lib, 'TreatLibWarningAsErrors', _boolean) # /WX _MSBuildOnly(_lib, 'Verbose', _boolean) # Directives for converting VCManifestTool to Mt. # See "c:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\1033\mt.xml" for # the schema of the MSBuild Manifest settings. # Options that have the same name in MSVS and MSBuild _Same(_manifest, 'AdditionalManifestFiles', _file_list) # /manifest _Same(_manifest, 'AdditionalOptions', _string_list) _Same(_manifest, 'AssemblyIdentity', _string) # /identity: _Same(_manifest, 'ComponentFileName', _file_name) # /dll _Same(_manifest, 'GenerateCatalogFiles', _boolean) # /makecdfs _Same(_manifest, 'InputResourceManifests', _string) # /inputresource _Same(_manifest, 'OutputManifestFile', _file_name) # /out _Same(_manifest, 'RegistrarScriptFile', _file_name) # /rgs _Same(_manifest, 'ReplacementsFile', _file_name) # /replacements _Same(_manifest, 'SuppressStartupBanner', _boolean) # /nologo _Same(_manifest, 'TypeLibraryFile', _file_name) # /tlb: _Same(_manifest, 'UpdateFileHashes', _boolean) # /hashupdate _Same(_manifest, 'UpdateFileHashesSearchPath', _file_name) _Same(_manifest, 'VerboseOutput', _boolean) # /verbose # Options that have moved location. _MovedAndRenamed(_manifest, 'ManifestResourceFile', 'ManifestResourceCompile', 'ResourceOutputFileName', _file_name) _Moved(_manifest, 'EmbedManifest', '', _boolean) # MSVS options not found in MSBuild. _MSVSOnly(_manifest, 'DependencyInformationFile', _file_name) _MSVSOnly(_manifest, 'UseFAT32Workaround', _boolean) _MSVSOnly(_manifest, 'UseUnicodeResponseFiles', _boolean) # MSBuild options not found in MSVS.
_MSBuildOnly(_manifest, 'EnableDPIAwareness', _boolean) _MSBuildOnly(_manifest, 'GenerateCategoryTags', _boolean) # /category _MSBuildOnly(_manifest, 'ManifestFromManagedAssembly', _file_name) # /managedassemblyname _MSBuildOnly(_manifest, 'OutputResourceManifests', _string) # /outputresource _MSBuildOnly(_manifest, 'SuppressDependencyElement', _boolean) # /nodependency _MSBuildOnly(_manifest, 'TrackerLogDirectory', _folder_name) # Directives for MASM. # See "$(VCTargetsPath)\BuildCustomizations\masm.xml" for the schema of the # MSBuild MASM settings. # Options that have the same name in MSVS and MSBuild. _Same(_masm, 'UseSafeExceptionHandlers', _boolean) # /safeseh npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/MSVSSettings_test.py0000755000000000000000000020062112631326456024334 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Unit tests for the MSVSSettings.py file.""" import StringIO import unittest import gyp.MSVSSettings as MSVSSettings class TestSequenceFunctions(unittest.TestCase): def setUp(self): self.stderr = StringIO.StringIO() def _ExpectedWarnings(self, expected): """Compares recorded lines to expected warnings.""" self.stderr.seek(0) actual = self.stderr.read().split('\n') actual = [line for line in actual if line] self.assertEqual(sorted(expected), sorted(actual)) def testValidateMSVSSettings_tool_names(self): """Tests that only MSVS tool names are allowed.""" MSVSSettings.ValidateMSVSSettings( {'VCCLCompilerTool': {}, 'VCLinkerTool': {}, 'VCMIDLTool': {}, 'foo': {}, 'VCResourceCompilerTool': {}, 'VCLibrarianTool': {}, 'VCManifestTool': {}, 'ClCompile': {}}, self.stderr) self._ExpectedWarnings([ 'Warning: unrecognized tool foo', 'Warning: unrecognized tool ClCompile']) def testValidateMSVSSettings_settings(self): """Tests the validation of invalid MSVS settings.""" MSVSSettings.ValidateMSVSSettings( {'VCCLCompilerTool': { 'AdditionalIncludeDirectories': 'folder1;folder2', 'AdditionalOptions': ['string1', 'string2'], 'AdditionalUsingDirectories': 'folder1;folder2', 'AssemblerListingLocation': 'a_file_name', 'AssemblerOutput': '0', 'BasicRuntimeChecks': '5', 'BrowseInformation': 'fdkslj', 'BrowseInformationFile': 'a_file_name', 'BufferSecurityCheck': 'true', 'CallingConvention': '-1', 'CompileAs': '1', 'DebugInformationFormat': '2', 'DefaultCharIsUnsigned': 'true', 'Detect64BitPortabilityProblems': 'true', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'string1;string2', 'EnableEnhancedInstructionSet': '1', 'EnableFiberSafeOptimizations': 'true', 'EnableFunctionLevelLinking': 'true', 'EnableIntrinsicFunctions': 'true', 'EnablePREfast': 'true', 'Enableprefast': 'bogus', 'ErrorReporting': '1', 'ExceptionHandling': '1', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': '1', 'FloatingPointExceptions': 'true', 'FloatingPointModel': '1', 'ForceConformanceInForLoopScope': 'true', 'ForcedIncludeFiles': 'file1;file2', 'ForcedUsingFiles': 'file1;file2', 'GeneratePreprocessedFile': '1', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': '1', 'KeepComments': 'true', 'MinimalRebuild': 'true', 'ObjectFile': 'a_file_name', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMP': 'true', 'Optimization': '1', 'PrecompiledHeaderFile': 'a_file_name', 'PrecompiledHeaderThrough': 'a_file_name', 'PreprocessorDefinitions': 'string1;string2', 'ProgramDataBaseFileName':
'a_file_name', 'RuntimeLibrary': '1', 'RuntimeTypeInfo': 'true', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '1', 'SuppressStartupBanner': 'true', 'TreatWChar_tAsBuiltInType': 'true', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'string1;string2', 'UseFullPaths': 'true', 'UsePrecompiledHeader': '1', 'UseUnicodeResponseFiles': 'true', 'WarnAsError': 'true', 'WarningLevel': '1', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': 'a_file_name', 'ZZXYZ': 'bogus'}, 'VCLinkerTool': { 'AdditionalDependencies': 'file1;file2', 'AdditionalDependencies_excluded': 'file3', 'AdditionalLibraryDirectories': 'folder1;folder2', 'AdditionalManifestDependencies': 'file1;file2', 'AdditionalOptions': 'a string1', 'AddModuleNamesToAssembly': 'file1;file2', 'AllowIsolation': 'true', 'AssemblyDebug': '2', 'AssemblyLinkResource': 'file1;file2', 'BaseAddress': 'a string1', 'CLRImageType': '2', 'CLRThreadAttribute': '2', 'CLRUnmanagedCodeCheck': 'true', 'DataExecutionPrevention': '2', 'DelayLoadDLLs': 'file1;file2', 'DelaySign': 'true', 'Driver': '2', 'EmbedManagedResourceFile': 'file1;file2', 'EnableCOMDATFolding': '2', 'EnableUAC': 'true', 'EntryPointSymbol': 'a string1', 'ErrorReporting': '2', 'FixedBaseAddress': '2', 'ForceSymbolReferences': 'file1;file2', 'FunctionOrder': 'a_file_name', 'GenerateDebugInformation': 'true', 'GenerateManifest': 'true', 'GenerateMapFile': 'true', 'HeapCommitSize': 'a string1', 'HeapReserveSize': 'a string1', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreDefaultLibraryNames': 'file1;file2', 'IgnoreEmbeddedIDL': 'true', 'IgnoreImportLibrary': 'true', 'ImportLibrary': 'a_file_name', 'KeyContainer': 'a_file_name', 'KeyFile': 'a_file_name', 'LargeAddressAware': '2', 'LinkIncremental': '2', 'LinkLibraryDependencies': 'true', 'LinkTimeCodeGeneration': '2', 'ManifestFile': 'a_file_name', 'MapExports': 'true', 'MapFileName': 'a_file_name', 'MergedIDLBaseFileName': 'a_file_name', 'MergeSections': 'a string1', 'MidlCommandFile': 'a_file_name', 'ModuleDefinitionFile': 'a_file_name', 'OptimizeForWindows98': '1', 'OptimizeReferences': '2', 'OutputFile': 'a_file_name', 'PerUserRedirection': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': 'a_file_name', 'ProgramDatabaseFile': 'a_file_name', 'RandomizedBaseAddress': '2', 'RegisterOutput': 'true', 'ResourceOnlyDLL': 'true', 'SetChecksum': 'true', 'ShowProgress': '2', 'StackCommitSize': 'a string1', 'StackReserveSize': 'a string1', 'StripPrivateSymbols': 'a_file_name', 'SubSystem': '2', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'true', 'SwapRunFromCD': 'true', 'SwapRunFromNet': 'true', 'TargetMachine': '2', 'TerminalServerAware': '2', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'a_file_name', 'TypeLibraryResourceID': '33', 'UACExecutionLevel': '2', 'UACUIAccess': 'true', 'UseLibraryDependencyInputs': 'true', 'UseUnicodeResponseFiles': 'true', 'Version': 'a string1'}, 'VCMIDLTool': { 'AdditionalIncludeDirectories': 'folder1;folder2', 'AdditionalOptions': 'a string1', 'CPreprocessOptions': 'a string1', 'DefaultCharType': '1', 'DLLDataFileName': 'a_file_name', 'EnableErrorChecks': '1', 'ErrorCheckAllocations': 'true', 'ErrorCheckBounds': 'true', 'ErrorCheckEnumRange': 'true', 'ErrorCheckRefPointers': 'true', 'ErrorCheckStubData': 'true', 'GenerateStublessProxies': 'true', 'GenerateTypeLibrary': 'true', 'HeaderFileName': 'a_file_name', 'IgnoreStandardIncludePath': 'true', 'InterfaceIdentifierFileName': 
'a_file_name', 'MkTypLibCompatible': 'true', 'notgood': 'bogus', 'OutputDirectory': 'a string1', 'PreprocessorDefinitions': 'string1;string2', 'ProxyFileName': 'a_file_name', 'RedirectOutputAndErrors': 'a_file_name', 'StructMemberAlignment': '1', 'SuppressStartupBanner': 'true', 'TargetEnvironment': '1', 'TypeLibraryName': 'a_file_name', 'UndefinePreprocessorDefinitions': 'string1;string2', 'ValidateParameters': 'true', 'WarnAsError': 'true', 'WarningLevel': '1'}, 'VCResourceCompilerTool': { 'AdditionalOptions': 'a string1', 'AdditionalIncludeDirectories': 'folder1;folder2', 'Culture': '1003', 'IgnoreStandardIncludePath': 'true', 'notgood2': 'bogus', 'PreprocessorDefinitions': 'string1;string2', 'ResourceOutputFileName': 'a string1', 'ShowProgress': 'true', 'SuppressStartupBanner': 'true', 'UndefinePreprocessorDefinitions': 'string1;string2'}, 'VCLibrarianTool': { 'AdditionalDependencies': 'file1;file2', 'AdditionalLibraryDirectories': 'folder1;folder2', 'AdditionalOptions': 'a string1', 'ExportNamedFunctions': 'string1;string2', 'ForceSymbolReferences': 'a string1', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreSpecificDefaultLibraries': 'file1;file2', 'LinkLibraryDependencies': 'true', 'ModuleDefinitionFile': 'a_file_name', 'OutputFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'UseUnicodeResponseFiles': 'true'}, 'VCManifestTool': { 'AdditionalManifestFiles': 'file1;file2', 'AdditionalOptions': 'a string1', 'AssemblyIdentity': 'a string1', 'ComponentFileName': 'a_file_name', 'DependencyInformationFile': 'a_file_name', 'GenerateCatalogFiles': 'true', 'InputResourceManifests': 'a string1', 'ManifestResourceFile': 'a_file_name', 'OutputManifestFile': 'a_file_name', 'RegistrarScriptFile': 'a_file_name', 'ReplacementsFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'TypeLibraryFile': 'a_file_name', 'UpdateFileHashes': 'truel', 'UpdateFileHashesSearchPath': 'a_file_name', 'UseFAT32Workaround': 'true', 'UseUnicodeResponseFiles': 'true', 'VerboseOutput': 'true'}}, self.stderr) self._ExpectedWarnings([ 'Warning: for VCCLCompilerTool/BasicRuntimeChecks, ' 'index value (5) not in expected range [0, 4)', 'Warning: for VCCLCompilerTool/BrowseInformation, ' "invalid literal for int() with base 10: 'fdkslj'", 'Warning: for VCCLCompilerTool/CallingConvention, ' 'index value (-1) not in expected range [0, 4)', 'Warning: for VCCLCompilerTool/DebugInformationFormat, ' 'converted value for 2 not specified.', 'Warning: unrecognized setting VCCLCompilerTool/Enableprefast', 'Warning: unrecognized setting VCCLCompilerTool/ZZXYZ', 'Warning: for VCLinkerTool/TargetMachine, ' 'converted value for 2 not specified.', 'Warning: unrecognized setting VCMIDLTool/notgood', 'Warning: unrecognized setting VCResourceCompilerTool/notgood2', 'Warning: for VCManifestTool/UpdateFileHashes, ' "expected bool; got 'truel'" '']) def testValidateMSBuildSettings_settings(self): """Tests the validation of invalid MSBuild settings.""" MSVSSettings.ValidateMSBuildSettings( {'ClCompile': { 'AdditionalIncludeDirectories': 'folder1;folder2', 'AdditionalOptions': ['string1', 'string2'], 'AdditionalUsingDirectories': 'folder1;folder2', 'AssemblerListingLocation': 'a_file_name', 'AssemblerOutput': 'NoListing', 'BasicRuntimeChecks': 'StackFrameRuntimeCheck', 'BrowseInformation': 'false', 'BrowseInformationFile': 'a_file_name', 'BufferSecurityCheck': 'true', 'BuildingInIDE': 'true', 'CallingConvention': 'Cdecl', 'CompileAs': 'CompileAsC', 'CompileAsManaged': 'true', 'CreateHotpatchableImage': 'true', 'DebugInformationFormat':
'ProgramDatabase', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'string1;string2', 'EnableEnhancedInstructionSet': 'StreamingSIMDExtensions', 'EnableFiberSafeOptimizations': 'true', 'EnablePREfast': 'true', 'Enableprefast': 'bogus', 'ErrorReporting': 'Prompt', 'ExceptionHandling': 'SyncCThrow', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': 'Neither', 'FloatingPointExceptions': 'true', 'FloatingPointModel': 'Precise', 'ForceConformanceInForLoopScope': 'true', 'ForcedIncludeFiles': 'file1;file2', 'ForcedUsingFiles': 'file1;file2', 'FunctionLevelLinking': 'false', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': 'OnlyExplicitInline', 'IntrinsicFunctions': 'false', 'MinimalRebuild': 'true', 'MultiProcessorCompilation': 'true', 'ObjectFileName': 'a_file_name', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMPSupport': 'true', 'Optimization': 'Disabled', 'PrecompiledHeader': 'NotUsing', 'PrecompiledHeaderFile': 'a_file_name', 'PrecompiledHeaderOutputFile': 'a_file_name', 'PreprocessKeepComments': 'true', 'PreprocessorDefinitions': 'string1;string2', 'PreprocessOutputPath': 'a string1', 'PreprocessSuppressLineNumbers': 'false', 'PreprocessToFile': 'false', 'ProcessorNumber': '33', 'ProgramDataBaseFileName': 'a_file_name', 'RuntimeLibrary': 'MultiThreaded', 'RuntimeTypeInfo': 'true', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '1Byte', 'SuppressStartupBanner': 'true', 'TrackerLogDirectory': 'a_folder', 'TreatSpecificWarningsAsErrors': 'string1;string2', 'TreatWarningAsError': 'true', 'TreatWChar_tAsBuiltInType': 'true', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'string1;string2', 'UseFullPaths': 'true', 'UseUnicodeForAssemblerListing': 'true', 'WarningLevel': 'TurnOffAllWarnings', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': 'a_file_name', 'ZZXYZ': 'bogus'}, 'Link': { 'AdditionalDependencies': 'file1;file2', 'AdditionalLibraryDirectories': 'folder1;folder2', 'AdditionalManifestDependencies': 'file1;file2', 'AdditionalOptions': 'a string1', 'AddModuleNamesToAssembly': 'file1;file2', 'AllowIsolation': 'true', 'AssemblyDebug': '', 'AssemblyLinkResource': 'file1;file2', 'BaseAddress': 'a string1', 'BuildingInIDE': 'true', 'CLRImageType': 'ForceIJWImage', 'CLRSupportLastError': 'Enabled', 'CLRThreadAttribute': 'MTAThreadingAttribute', 'CLRUnmanagedCodeCheck': 'true', 'CreateHotPatchableImage': 'X86Image', 'DataExecutionPrevention': 'false', 'DelayLoadDLLs': 'file1;file2', 'DelaySign': 'true', 'Driver': 'NotSet', 'EmbedManagedResourceFile': 'file1;file2', 'EnableCOMDATFolding': 'false', 'EnableUAC': 'true', 'EntryPointSymbol': 'a string1', 'FixedBaseAddress': 'false', 'ForceFileOutput': 'Enabled', 'ForceSymbolReferences': 'file1;file2', 'FunctionOrder': 'a_file_name', 'GenerateDebugInformation': 'true', 'GenerateMapFile': 'true', 'HeapCommitSize': 'a string1', 'HeapReserveSize': 'a string1', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreEmbeddedIDL': 'true', 'IgnoreSpecificDefaultLibraries': 'a_file_list', 'ImageHasSafeExceptionHandlers': 'true', 'ImportLibrary': 'a_file_name', 'KeyContainer': 'a_file_name', 'KeyFile': 'a_file_name', 'LargeAddressAware': 'false', 'LinkDLL': 'true', 'LinkErrorReporting': 'SendErrorReport', 'LinkStatus': 'true', 'LinkTimeCodeGeneration': 'UseLinkTimeCodeGeneration', 'ManifestFile': 'a_file_name', 'MapExports': 'true', 'MapFileName': 'a_file_name', 
'MergedIDLBaseFileName': 'a_file_name', 'MergeSections': 'a string1', 'MidlCommandFile': 'a_file_name', 'MinimumRequiredVersion': 'a string1', 'ModuleDefinitionFile': 'a_file_name', 'MSDOSStubFileName': 'a_file_name', 'NoEntryPoint': 'true', 'OptimizeReferences': 'false', 'OutputFile': 'a_file_name', 'PerUserRedirection': 'true', 'PreventDllBinding': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': 'a_file_name', 'ProgramDatabaseFile': 'a_file_name', 'RandomizedBaseAddress': 'false', 'RegisterOutput': 'true', 'SectionAlignment': '33', 'SetChecksum': 'true', 'ShowProgress': 'LinkVerboseREF', 'SpecifySectionAttributes': 'a string1', 'StackCommitSize': 'a string1', 'StackReserveSize': 'a string1', 'StripPrivateSymbols': 'a_file_name', 'SubSystem': 'Console', 'SupportNobindOfDelayLoadedDLL': 'true', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'true', 'SwapRunFromCD': 'true', 'SwapRunFromNET': 'true', 'TargetMachine': 'MachineX86', 'TerminalServerAware': 'false', 'TrackerLogDirectory': 'a_folder', 'TreatLinkerWarningAsErrors': 'true', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'a_file_name', 'TypeLibraryResourceID': '33', 'UACExecutionLevel': 'AsInvoker', 'UACUIAccess': 'true', 'Version': 'a string1'}, 'ResourceCompile': { 'AdditionalIncludeDirectories': 'folder1;folder2', 'AdditionalOptions': 'a string1', 'Culture': '0x236', 'IgnoreStandardIncludePath': 'true', 'NullTerminateStrings': 'true', 'PreprocessorDefinitions': 'string1;string2', 'ResourceOutputFileName': 'a string1', 'ShowProgress': 'true', 'SuppressStartupBanner': 'true', 'TrackerLogDirectory': 'a_folder', 'UndefinePreprocessorDefinitions': 'string1;string2'}, 'Midl': { 'AdditionalIncludeDirectories': 'folder1;folder2', 'AdditionalOptions': 'a string1', 'ApplicationConfigurationMode': 'true', 'ClientStubFile': 'a_file_name', 'CPreprocessOptions': 'a string1', 'DefaultCharType': 'Signed', 'DllDataFileName': 'a_file_name', 'EnableErrorChecks': 'EnableCustom', 'ErrorCheckAllocations': 'true', 'ErrorCheckBounds': 'true', 'ErrorCheckEnumRange': 'true', 'ErrorCheckRefPointers': 'true', 'ErrorCheckStubData': 'true', 'GenerateClientFiles': 'Stub', 'GenerateServerFiles': 'None', 'GenerateStublessProxies': 'true', 'GenerateTypeLibrary': 'true', 'HeaderFileName': 'a_file_name', 'IgnoreStandardIncludePath': 'true', 'InterfaceIdentifierFileName': 'a_file_name', 'LocaleID': '33', 'MkTypLibCompatible': 'true', 'OutputDirectory': 'a string1', 'PreprocessorDefinitions': 'string1;string2', 'ProxyFileName': 'a_file_name', 'RedirectOutputAndErrors': 'a_file_name', 'ServerStubFile': 'a_file_name', 'StructMemberAlignment': 'NotSet', 'SuppressCompilerWarnings': 'true', 'SuppressStartupBanner': 'true', 'TargetEnvironment': 'Itanium', 'TrackerLogDirectory': 'a_folder', 'TypeLibFormat': 'NewFormat', 'TypeLibraryName': 'a_file_name', 'UndefinePreprocessorDefinitions': 'string1;string2', 'ValidateAllParameters': 'true', 'WarnAsError': 'true', 'WarningLevel': '1'}, 'Lib': { 'AdditionalDependencies': 'file1;file2', 'AdditionalLibraryDirectories': 'folder1;folder2', 'AdditionalOptions': 'a string1', 'DisplayLibrary': 'a string1', 'ErrorReporting': 'PromptImmediately', 'ExportNamedFunctions': 'string1;string2', 'ForceSymbolReferences': 'a string1', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreSpecificDefaultLibraries': 'file1;file2', 'LinkTimeCodeGeneration': 'true', 'MinimumRequiredVersion': 'a string1', 'ModuleDefinitionFile': 'a_file_name', 'Name': 'a_file_name', 'OutputFile': 'a_file_name', 'RemoveObjects': 'file1;file2', 
'SubSystem': 'Console', 'SuppressStartupBanner': 'true', 'TargetMachine': 'MachineX86i', 'TrackerLogDirectory': 'a_folder', 'TreatLibWarningAsErrors': 'true', 'UseUnicodeResponseFiles': 'true', 'Verbose': 'true'}, 'Manifest': { 'AdditionalManifestFiles': 'file1;file2', 'AdditionalOptions': 'a string1', 'AssemblyIdentity': 'a string1', 'ComponentFileName': 'a_file_name', 'EnableDPIAwareness': 'fal', 'GenerateCatalogFiles': 'truel', 'GenerateCategoryTags': 'true', 'InputResourceManifests': 'a string1', 'ManifestFromManagedAssembly': 'a_file_name', 'notgood3': 'bogus', 'OutputManifestFile': 'a_file_name', 'OutputResourceManifests': 'a string1', 'RegistrarScriptFile': 'a_file_name', 'ReplacementsFile': 'a_file_name', 'SuppressDependencyElement': 'true', 'SuppressStartupBanner': 'true', 'TrackerLogDirectory': 'a_folder', 'TypeLibraryFile': 'a_file_name', 'UpdateFileHashes': 'true', 'UpdateFileHashesSearchPath': 'a_file_name', 'VerboseOutput': 'true'}, 'ProjectReference': { 'LinkLibraryDependencies': 'true', 'UseLibraryDependencyInputs': 'true'}, 'ManifestResourceCompile': { 'ResourceOutputFileName': 'a_file_name'}, '': { 'EmbedManifest': 'true', 'GenerateManifest': 'true', 'IgnoreImportLibrary': 'true', 'LinkIncremental': 'false'}}, self.stderr) self._ExpectedWarnings([ 'Warning: unrecognized setting ClCompile/Enableprefast', 'Warning: unrecognized setting ClCompile/ZZXYZ', 'Warning: unrecognized setting Manifest/notgood3', 'Warning: for Manifest/GenerateCatalogFiles, ' "expected bool; got 'truel'", 'Warning: for Lib/TargetMachine, unrecognized enumerated value ' 'MachineX86i', "Warning: for Manifest/EnableDPIAwareness, expected bool; got 'fal'"]) def testConvertToMSBuildSettings_empty(self): """Tests an empty conversion.""" msvs_settings = {} expected_msbuild_settings = {} actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) self.assertEqual(expected_msbuild_settings, actual_msbuild_settings) self._ExpectedWarnings([]) def testConvertToMSBuildSettings_minimal(self): """Tests a minimal conversion.""" msvs_settings = { 'VCCLCompilerTool': { 'AdditionalIncludeDirectories': 'dir1', 'AdditionalOptions': '/foo', 'BasicRuntimeChecks': '0', }, 'VCLinkerTool': { 'LinkTimeCodeGeneration': '1', 'ErrorReporting': '1', 'DataExecutionPrevention': '2', }, } expected_msbuild_settings = { 'ClCompile': { 'AdditionalIncludeDirectories': 'dir1', 'AdditionalOptions': '/foo', 'BasicRuntimeChecks': 'Default', }, 'Link': { 'LinkTimeCodeGeneration': 'UseLinkTimeCodeGeneration', 'LinkErrorReporting': 'PromptImmediately', 'DataExecutionPrevention': 'true', }, } actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) self.assertEqual(expected_msbuild_settings, actual_msbuild_settings) self._ExpectedWarnings([]) def testConvertToMSBuildSettings_warnings(self): """Tests conversion that generates warnings.""" msvs_settings = { 'VCCLCompilerTool': { 'AdditionalIncludeDirectories': '1', 'AdditionalOptions': '2', # These are incorrect values: 'BasicRuntimeChecks': '12', 'BrowseInformation': '21', 'UsePrecompiledHeader': '13', 'GeneratePreprocessedFile': '14'}, 'VCLinkerTool': { # These are incorrect values: 'Driver': '10', 'LinkTimeCodeGeneration': '31', 'ErrorReporting': '21', 'FixedBaseAddress': '6'}, 'VCResourceCompilerTool': { # Custom 'Culture': '1003'}} expected_msbuild_settings = { 'ClCompile': { 'AdditionalIncludeDirectories': '1', 'AdditionalOptions': '2'}, 'Link': {}, 'ResourceCompile': { # Custom 'Culture': '0x03eb'}} 
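# Aside on the 'custom' Culture entries above: VCResourceCompilerTool stores
# the culture as a decimal string while MSBuild's ResourceCompile expects
# hexadecimal (_Integer(msbuild_base=16) in MSVSSettings.py), so the
# converter renders 1003 as '0x03eb' (1003 == 0x3eb).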
actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) self.assertEqual(expected_msbuild_settings, actual_msbuild_settings) self._ExpectedWarnings([ 'Warning: while converting VCCLCompilerTool/BasicRuntimeChecks to ' 'MSBuild, index value (12) not in expected range [0, 4)', 'Warning: while converting VCCLCompilerTool/BrowseInformation to ' 'MSBuild, index value (21) not in expected range [0, 3)', 'Warning: while converting VCCLCompilerTool/UsePrecompiledHeader to ' 'MSBuild, index value (13) not in expected range [0, 3)', 'Warning: while converting VCCLCompilerTool/GeneratePreprocessedFile to ' 'MSBuild, value must be one of [0, 1, 2]; got 14', 'Warning: while converting VCLinkerTool/Driver to ' 'MSBuild, index value (10) not in expected range [0, 4)', 'Warning: while converting VCLinkerTool/LinkTimeCodeGeneration to ' 'MSBuild, index value (31) not in expected range [0, 5)', 'Warning: while converting VCLinkerTool/ErrorReporting to ' 'MSBuild, index value (21) not in expected range [0, 3)', 'Warning: while converting VCLinkerTool/FixedBaseAddress to ' 'MSBuild, index value (6) not in expected range [0, 3)', ]) def testConvertToMSBuildSettings_full_synthetic(self): """Tests conversion of all the MSBuild settings.""" msvs_settings = { 'VCCLCompilerTool': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'AdditionalUsingDirectories': 'folder1;folder2;folder3', 'AssemblerListingLocation': 'a_file_name', 'AssemblerOutput': '0', 'BasicRuntimeChecks': '1', 'BrowseInformation': '2', 'BrowseInformationFile': 'a_file_name', 'BufferSecurityCheck': 'true', 'CallingConvention': '0', 'CompileAs': '1', 'DebugInformationFormat': '4', 'DefaultCharIsUnsigned': 'true', 'Detect64BitPortabilityProblems': 'true', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'd1;d2;d3', 'EnableEnhancedInstructionSet': '0', 'EnableFiberSafeOptimizations': 'true', 'EnableFunctionLevelLinking': 'true', 'EnableIntrinsicFunctions': 'true', 'EnablePREfast': 'true', 'ErrorReporting': '1', 'ExceptionHandling': '2', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': '0', 'FloatingPointExceptions': 'true', 'FloatingPointModel': '1', 'ForceConformanceInForLoopScope': 'true', 'ForcedIncludeFiles': 'file1;file2;file3', 'ForcedUsingFiles': 'file1;file2;file3', 'GeneratePreprocessedFile': '1', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': '2', 'KeepComments': 'true', 'MinimalRebuild': 'true', 'ObjectFile': 'a_file_name', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMP': 'true', 'Optimization': '3', 'PrecompiledHeaderFile': 'a_file_name', 'PrecompiledHeaderThrough': 'a_file_name', 'PreprocessorDefinitions': 'd1;d2;d3', 'ProgramDataBaseFileName': 'a_file_name', 'RuntimeLibrary': '0', 'RuntimeTypeInfo': 'true', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '1', 'SuppressStartupBanner': 'true', 'TreatWChar_tAsBuiltInType': 'true', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'd1;d2;d3', 'UseFullPaths': 'true', 'UsePrecompiledHeader': '1', 'UseUnicodeResponseFiles': 'true', 'WarnAsError': 'true', 'WarningLevel': '2', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': 'a_file_name'}, 'VCLinkerTool': { 'AdditionalDependencies': 'file1;file2;file3', 'AdditionalLibraryDirectories': 'folder1;folder2;folder3', 'AdditionalLibraryDirectories_excluded': 
'folder1;folder2;folder3', 'AdditionalManifestDependencies': 'file1;file2;file3', 'AdditionalOptions': 'a_string', 'AddModuleNamesToAssembly': 'file1;file2;file3', 'AllowIsolation': 'true', 'AssemblyDebug': '0', 'AssemblyLinkResource': 'file1;file2;file3', 'BaseAddress': 'a_string', 'CLRImageType': '1', 'CLRThreadAttribute': '2', 'CLRUnmanagedCodeCheck': 'true', 'DataExecutionPrevention': '0', 'DelayLoadDLLs': 'file1;file2;file3', 'DelaySign': 'true', 'Driver': '1', 'EmbedManagedResourceFile': 'file1;file2;file3', 'EnableCOMDATFolding': '0', 'EnableUAC': 'true', 'EntryPointSymbol': 'a_string', 'ErrorReporting': '0', 'FixedBaseAddress': '1', 'ForceSymbolReferences': 'file1;file2;file3', 'FunctionOrder': 'a_file_name', 'GenerateDebugInformation': 'true', 'GenerateManifest': 'true', 'GenerateMapFile': 'true', 'HeapCommitSize': 'a_string', 'HeapReserveSize': 'a_string', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreDefaultLibraryNames': 'file1;file2;file3', 'IgnoreEmbeddedIDL': 'true', 'IgnoreImportLibrary': 'true', 'ImportLibrary': 'a_file_name', 'KeyContainer': 'a_file_name', 'KeyFile': 'a_file_name', 'LargeAddressAware': '2', 'LinkIncremental': '1', 'LinkLibraryDependencies': 'true', 'LinkTimeCodeGeneration': '2', 'ManifestFile': 'a_file_name', 'MapExports': 'true', 'MapFileName': 'a_file_name', 'MergedIDLBaseFileName': 'a_file_name', 'MergeSections': 'a_string', 'MidlCommandFile': 'a_file_name', 'ModuleDefinitionFile': 'a_file_name', 'OptimizeForWindows98': '1', 'OptimizeReferences': '0', 'OutputFile': 'a_file_name', 'PerUserRedirection': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': 'a_file_name', 'ProgramDatabaseFile': 'a_file_name', 'RandomizedBaseAddress': '1', 'RegisterOutput': 'true', 'ResourceOnlyDLL': 'true', 'SetChecksum': 'true', 'ShowProgress': '0', 'StackCommitSize': 'a_string', 'StackReserveSize': 'a_string', 'StripPrivateSymbols': 'a_file_name', 'SubSystem': '2', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'true', 'SwapRunFromCD': 'true', 'SwapRunFromNet': 'true', 'TargetMachine': '3', 'TerminalServerAware': '2', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'a_file_name', 'TypeLibraryResourceID': '33', 'UACExecutionLevel': '1', 'UACUIAccess': 'true', 'UseLibraryDependencyInputs': 'false', 'UseUnicodeResponseFiles': 'true', 'Version': 'a_string'}, 'VCResourceCompilerTool': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'Culture': '1003', 'IgnoreStandardIncludePath': 'true', 'PreprocessorDefinitions': 'd1;d2;d3', 'ResourceOutputFileName': 'a_string', 'ShowProgress': 'true', 'SuppressStartupBanner': 'true', 'UndefinePreprocessorDefinitions': 'd1;d2;d3'}, 'VCMIDLTool': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'CPreprocessOptions': 'a_string', 'DefaultCharType': '0', 'DLLDataFileName': 'a_file_name', 'EnableErrorChecks': '2', 'ErrorCheckAllocations': 'true', 'ErrorCheckBounds': 'true', 'ErrorCheckEnumRange': 'true', 'ErrorCheckRefPointers': 'true', 'ErrorCheckStubData': 'true', 'GenerateStublessProxies': 'true', 'GenerateTypeLibrary': 'true', 'HeaderFileName': 'a_file_name', 'IgnoreStandardIncludePath': 'true', 'InterfaceIdentifierFileName': 'a_file_name', 'MkTypLibCompatible': 'true', 'OutputDirectory': 'a_string', 'PreprocessorDefinitions': 'd1;d2;d3', 'ProxyFileName': 'a_file_name', 'RedirectOutputAndErrors': 'a_file_name', 'StructMemberAlignment': '3', 'SuppressStartupBanner': 'true', 'TargetEnvironment': '1', 'TypeLibraryName': 
'a_file_name', 'UndefinePreprocessorDefinitions': 'd1;d2;d3', 'ValidateParameters': 'true', 'WarnAsError': 'true', 'WarningLevel': '4'}, 'VCLibrarianTool': { 'AdditionalDependencies': 'file1;file2;file3', 'AdditionalLibraryDirectories': 'folder1;folder2;folder3', 'AdditionalLibraryDirectories_excluded': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'ExportNamedFunctions': 'd1;d2;d3', 'ForceSymbolReferences': 'a_string', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreSpecificDefaultLibraries': 'file1;file2;file3', 'LinkLibraryDependencies': 'true', 'ModuleDefinitionFile': 'a_file_name', 'OutputFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'UseUnicodeResponseFiles': 'true'}, 'VCManifestTool': { 'AdditionalManifestFiles': 'file1;file2;file3', 'AdditionalOptions': 'a_string', 'AssemblyIdentity': 'a_string', 'ComponentFileName': 'a_file_name', 'DependencyInformationFile': 'a_file_name', 'EmbedManifest': 'true', 'GenerateCatalogFiles': 'true', 'InputResourceManifests': 'a_string', 'ManifestResourceFile': 'my_name', 'OutputManifestFile': 'a_file_name', 'RegistrarScriptFile': 'a_file_name', 'ReplacementsFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'TypeLibraryFile': 'a_file_name', 'UpdateFileHashes': 'true', 'UpdateFileHashesSearchPath': 'a_file_name', 'UseFAT32Workaround': 'true', 'UseUnicodeResponseFiles': 'true', 'VerboseOutput': 'true'}} expected_msbuild_settings = { 'ClCompile': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string /J', 'AdditionalUsingDirectories': 'folder1;folder2;folder3', 'AssemblerListingLocation': 'a_file_name', 'AssemblerOutput': 'NoListing', 'BasicRuntimeChecks': 'StackFrameRuntimeCheck', 'BrowseInformation': 'true', 'BrowseInformationFile': 'a_file_name', 'BufferSecurityCheck': 'true', 'CallingConvention': 'Cdecl', 'CompileAs': 'CompileAsC', 'DebugInformationFormat': 'EditAndContinue', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'd1;d2;d3', 'EnableEnhancedInstructionSet': 'NotSet', 'EnableFiberSafeOptimizations': 'true', 'EnablePREfast': 'true', 'ErrorReporting': 'Prompt', 'ExceptionHandling': 'Async', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': 'Neither', 'FloatingPointExceptions': 'true', 'FloatingPointModel': 'Strict', 'ForceConformanceInForLoopScope': 'true', 'ForcedIncludeFiles': 'file1;file2;file3', 'ForcedUsingFiles': 'file1;file2;file3', 'FunctionLevelLinking': 'true', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': 'AnySuitable', 'IntrinsicFunctions': 'true', 'MinimalRebuild': 'true', 'ObjectFileName': 'a_file_name', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMPSupport': 'true', 'Optimization': 'Full', 'PrecompiledHeader': 'Create', 'PrecompiledHeaderFile': 'a_file_name', 'PrecompiledHeaderOutputFile': 'a_file_name', 'PreprocessKeepComments': 'true', 'PreprocessorDefinitions': 'd1;d2;d3', 'PreprocessSuppressLineNumbers': 'false', 'PreprocessToFile': 'true', 'ProgramDataBaseFileName': 'a_file_name', 'RuntimeLibrary': 'MultiThreaded', 'RuntimeTypeInfo': 'true', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '1Byte', 'SuppressStartupBanner': 'true', 'TreatWarningAsError': 'true', 'TreatWChar_tAsBuiltInType': 'true', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'd1;d2;d3', 'UseFullPaths': 'true', 'WarningLevel': 'Level2', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': 'a_file_name'}, 
'Link': { 'AdditionalDependencies': 'file1;file2;file3', 'AdditionalLibraryDirectories': 'folder1;folder2;folder3', 'AdditionalManifestDependencies': 'file1;file2;file3', 'AdditionalOptions': 'a_string', 'AddModuleNamesToAssembly': 'file1;file2;file3', 'AllowIsolation': 'true', 'AssemblyDebug': '', 'AssemblyLinkResource': 'file1;file2;file3', 'BaseAddress': 'a_string', 'CLRImageType': 'ForceIJWImage', 'CLRThreadAttribute': 'STAThreadingAttribute', 'CLRUnmanagedCodeCheck': 'true', 'DataExecutionPrevention': '', 'DelayLoadDLLs': 'file1;file2;file3', 'DelaySign': 'true', 'Driver': 'Driver', 'EmbedManagedResourceFile': 'file1;file2;file3', 'EnableCOMDATFolding': '', 'EnableUAC': 'true', 'EntryPointSymbol': 'a_string', 'FixedBaseAddress': 'false', 'ForceSymbolReferences': 'file1;file2;file3', 'FunctionOrder': 'a_file_name', 'GenerateDebugInformation': 'true', 'GenerateMapFile': 'true', 'HeapCommitSize': 'a_string', 'HeapReserveSize': 'a_string', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreEmbeddedIDL': 'true', 'IgnoreSpecificDefaultLibraries': 'file1;file2;file3', 'ImportLibrary': 'a_file_name', 'KeyContainer': 'a_file_name', 'KeyFile': 'a_file_name', 'LargeAddressAware': 'true', 'LinkErrorReporting': 'NoErrorReport', 'LinkTimeCodeGeneration': 'PGInstrument', 'ManifestFile': 'a_file_name', 'MapExports': 'true', 'MapFileName': 'a_file_name', 'MergedIDLBaseFileName': 'a_file_name', 'MergeSections': 'a_string', 'MidlCommandFile': 'a_file_name', 'ModuleDefinitionFile': 'a_file_name', 'NoEntryPoint': 'true', 'OptimizeReferences': '', 'OutputFile': 'a_file_name', 'PerUserRedirection': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': 'a_file_name', 'ProgramDatabaseFile': 'a_file_name', 'RandomizedBaseAddress': 'false', 'RegisterOutput': 'true', 'SetChecksum': 'true', 'ShowProgress': 'NotSet', 'StackCommitSize': 'a_string', 'StackReserveSize': 'a_string', 'StripPrivateSymbols': 'a_file_name', 'SubSystem': 'Windows', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'true', 'SwapRunFromCD': 'true', 'SwapRunFromNET': 'true', 'TargetMachine': 'MachineARM', 'TerminalServerAware': 'true', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'a_file_name', 'TypeLibraryResourceID': '33', 'UACExecutionLevel': 'HighestAvailable', 'UACUIAccess': 'true', 'Version': 'a_string'}, 'ResourceCompile': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'Culture': '0x03eb', 'IgnoreStandardIncludePath': 'true', 'PreprocessorDefinitions': 'd1;d2;d3', 'ResourceOutputFileName': 'a_string', 'ShowProgress': 'true', 'SuppressStartupBanner': 'true', 'UndefinePreprocessorDefinitions': 'd1;d2;d3'}, 'Midl': { 'AdditionalIncludeDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'CPreprocessOptions': 'a_string', 'DefaultCharType': 'Unsigned', 'DllDataFileName': 'a_file_name', 'EnableErrorChecks': 'All', 'ErrorCheckAllocations': 'true', 'ErrorCheckBounds': 'true', 'ErrorCheckEnumRange': 'true', 'ErrorCheckRefPointers': 'true', 'ErrorCheckStubData': 'true', 'GenerateStublessProxies': 'true', 'GenerateTypeLibrary': 'true', 'HeaderFileName': 'a_file_name', 'IgnoreStandardIncludePath': 'true', 'InterfaceIdentifierFileName': 'a_file_name', 'MkTypLibCompatible': 'true', 'OutputDirectory': 'a_string', 'PreprocessorDefinitions': 'd1;d2;d3', 'ProxyFileName': 'a_file_name', 'RedirectOutputAndErrors': 'a_file_name', 'StructMemberAlignment': '4', 'SuppressStartupBanner': 'true', 'TargetEnvironment': 'Win32', 'TypeLibraryName': 'a_file_name', 
'UndefinePreprocessorDefinitions': 'd1;d2;d3', 'ValidateAllParameters': 'true', 'WarnAsError': 'true', 'WarningLevel': '4'}, 'Lib': { 'AdditionalDependencies': 'file1;file2;file3', 'AdditionalLibraryDirectories': 'folder1;folder2;folder3', 'AdditionalOptions': 'a_string', 'ExportNamedFunctions': 'd1;d2;d3', 'ForceSymbolReferences': 'a_string', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreSpecificDefaultLibraries': 'file1;file2;file3', 'ModuleDefinitionFile': 'a_file_name', 'OutputFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'UseUnicodeResponseFiles': 'true'}, 'Manifest': { 'AdditionalManifestFiles': 'file1;file2;file3', 'AdditionalOptions': 'a_string', 'AssemblyIdentity': 'a_string', 'ComponentFileName': 'a_file_name', 'GenerateCatalogFiles': 'true', 'InputResourceManifests': 'a_string', 'OutputManifestFile': 'a_file_name', 'RegistrarScriptFile': 'a_file_name', 'ReplacementsFile': 'a_file_name', 'SuppressStartupBanner': 'true', 'TypeLibraryFile': 'a_file_name', 'UpdateFileHashes': 'true', 'UpdateFileHashesSearchPath': 'a_file_name', 'VerboseOutput': 'true'}, 'ManifestResourceCompile': { 'ResourceOutputFileName': 'my_name'}, 'ProjectReference': { 'LinkLibraryDependencies': 'true', 'UseLibraryDependencyInputs': 'false'}, '': { 'EmbedManifest': 'true', 'GenerateManifest': 'true', 'IgnoreImportLibrary': 'true', 'LinkIncremental': 'false'}} actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) self.assertEqual(expected_msbuild_settings, actual_msbuild_settings) self._ExpectedWarnings([]) def testConvertToMSBuildSettings_actual(self): """Tests the conversion of an actual project. A VS2008 project with most of the options defined was created through the VS2008 IDE. It was then converted to VS2010. The tool settings found in the .vcproj and .vcxproj files were converted to the two dictionaries msvs_settings and expected_msbuild_settings. Note that for many settings, the VS2010 converter adds macros like %(AdditionalIncludeDirectories) to make sure that inherited values are included. Since the Gyp projects we generate do not use inheritance, we removed these macros.
They were: ClCompile: AdditionalIncludeDirectories: ';%(AdditionalIncludeDirectories)' AdditionalOptions: ' %(AdditionalOptions)' AdditionalUsingDirectories: ';%(AdditionalUsingDirectories)' DisableSpecificWarnings: ';%(DisableSpecificWarnings)', ForcedIncludeFiles: ';%(ForcedIncludeFiles)', ForcedUsingFiles: ';%(ForcedUsingFiles)', PreprocessorDefinitions: ';%(PreprocessorDefinitions)', UndefinePreprocessorDefinitions: ';%(UndefinePreprocessorDefinitions)', Link: AdditionalDependencies: ';%(AdditionalDependencies)', AdditionalLibraryDirectories: ';%(AdditionalLibraryDirectories)', AdditionalManifestDependencies: ';%(AdditionalManifestDependencies)', AdditionalOptions: ' %(AdditionalOptions)', AddModuleNamesToAssembly: ';%(AddModuleNamesToAssembly)', AssemblyLinkResource: ';%(AssemblyLinkResource)', DelayLoadDLLs: ';%(DelayLoadDLLs)', EmbedManagedResourceFile: ';%(EmbedManagedResourceFile)', ForceSymbolReferences: ';%(ForceSymbolReferences)', IgnoreSpecificDefaultLibraries: ';%(IgnoreSpecificDefaultLibraries)', ResourceCompile: AdditionalIncludeDirectories: ';%(AdditionalIncludeDirectories)', AdditionalOptions: ' %(AdditionalOptions)', PreprocessorDefinitions: ';%(PreprocessorDefinitions)', Manifest: AdditionalManifestFiles: ';%(AdditionalManifestFiles)', AdditionalOptions: ' %(AdditionalOptions)', InputResourceManifests: ';%(InputResourceManifests)', """ msvs_settings = { 'VCCLCompilerTool': { 'AdditionalIncludeDirectories': 'dir1', 'AdditionalOptions': '/more', 'AdditionalUsingDirectories': 'test', 'AssemblerListingLocation': '$(IntDir)\\a', 'AssemblerOutput': '1', 'BasicRuntimeChecks': '3', 'BrowseInformation': '1', 'BrowseInformationFile': '$(IntDir)\\e', 'BufferSecurityCheck': 'false', 'CallingConvention': '1', 'CompileAs': '1', 'DebugInformationFormat': '4', 'DefaultCharIsUnsigned': 'true', 'Detect64BitPortabilityProblems': 'true', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'abc', 'EnableEnhancedInstructionSet': '1', 'EnableFiberSafeOptimizations': 'true', 'EnableFunctionLevelLinking': 'true', 'EnableIntrinsicFunctions': 'true', 'EnablePREfast': 'true', 'ErrorReporting': '2', 'ExceptionHandling': '2', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': '2', 'FloatingPointExceptions': 'true', 'FloatingPointModel': '1', 'ForceConformanceInForLoopScope': 'false', 'ForcedIncludeFiles': 'def', 'ForcedUsingFiles': 'ge', 'GeneratePreprocessedFile': '2', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': '1', 'KeepComments': 'true', 'MinimalRebuild': 'true', 'ObjectFile': '$(IntDir)\\b', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMP': 'true', 'Optimization': '3', 'PrecompiledHeaderFile': '$(IntDir)\\$(TargetName).pche', 'PrecompiledHeaderThrough': 'StdAfx.hd', 'PreprocessorDefinitions': 'WIN32;_DEBUG;_CONSOLE', 'ProgramDataBaseFileName': '$(IntDir)\\vc90b.pdb', 'RuntimeLibrary': '3', 'RuntimeTypeInfo': 'false', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '3', 'SuppressStartupBanner': 'false', 'TreatWChar_tAsBuiltInType': 'false', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'wer', 'UseFullPaths': 'true', 'UsePrecompiledHeader': '0', 'UseUnicodeResponseFiles': 'false', 'WarnAsError': 'true', 'WarningLevel': '3', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': '$(IntDir)\\c'}, 'VCLinkerTool': { 'AdditionalDependencies': 'zx', 'AdditionalLibraryDirectories': 'asd', 
'AdditionalManifestDependencies': 's2', 'AdditionalOptions': '/mor2', 'AddModuleNamesToAssembly': 'd1', 'AllowIsolation': 'false', 'AssemblyDebug': '1', 'AssemblyLinkResource': 'd5', 'BaseAddress': '23423', 'CLRImageType': '3', 'CLRThreadAttribute': '1', 'CLRUnmanagedCodeCheck': 'true', 'DataExecutionPrevention': '0', 'DelayLoadDLLs': 'd4', 'DelaySign': 'true', 'Driver': '2', 'EmbedManagedResourceFile': 'd2', 'EnableCOMDATFolding': '1', 'EnableUAC': 'false', 'EntryPointSymbol': 'f5', 'ErrorReporting': '2', 'FixedBaseAddress': '1', 'ForceSymbolReferences': 'd3', 'FunctionOrder': 'fssdfsd', 'GenerateDebugInformation': 'true', 'GenerateManifest': 'false', 'GenerateMapFile': 'true', 'HeapCommitSize': '13', 'HeapReserveSize': '12', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreDefaultLibraryNames': 'flob;flok', 'IgnoreEmbeddedIDL': 'true', 'IgnoreImportLibrary': 'true', 'ImportLibrary': 'f4', 'KeyContainer': 'f7', 'KeyFile': 'f6', 'LargeAddressAware': '2', 'LinkIncremental': '0', 'LinkLibraryDependencies': 'false', 'LinkTimeCodeGeneration': '1', 'ManifestFile': '$(IntDir)\\$(TargetFileName).2intermediate.manifest', 'MapExports': 'true', 'MapFileName': 'd5', 'MergedIDLBaseFileName': 'f2', 'MergeSections': 'f5', 'MidlCommandFile': 'f1', 'ModuleDefinitionFile': 'sdsd', 'OptimizeForWindows98': '2', 'OptimizeReferences': '2', 'OutputFile': '$(OutDir)\\$(ProjectName)2.exe', 'PerUserRedirection': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': '$(TargetDir)$(TargetName).pgdd', 'ProgramDatabaseFile': 'Flob.pdb', 'RandomizedBaseAddress': '1', 'RegisterOutput': 'true', 'ResourceOnlyDLL': 'true', 'SetChecksum': 'false', 'ShowProgress': '1', 'StackCommitSize': '15', 'StackReserveSize': '14', 'StripPrivateSymbols': 'd3', 'SubSystem': '1', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'false', 'SwapRunFromCD': 'true', 'SwapRunFromNet': 'true', 'TargetMachine': '1', 'TerminalServerAware': '1', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'f3', 'TypeLibraryResourceID': '12', 'UACExecutionLevel': '2', 'UACUIAccess': 'true', 'UseLibraryDependencyInputs': 'true', 'UseUnicodeResponseFiles': 'false', 'Version': '333'}, 'VCResourceCompilerTool': { 'AdditionalIncludeDirectories': 'f3', 'AdditionalOptions': '/more3', 'Culture': '3084', 'IgnoreStandardIncludePath': 'true', 'PreprocessorDefinitions': '_UNICODE;UNICODE2', 'ResourceOutputFileName': '$(IntDir)/$(InputName)3.res', 'ShowProgress': 'true'}, 'VCManifestTool': { 'AdditionalManifestFiles': 'sfsdfsd', 'AdditionalOptions': 'afdsdafsd', 'AssemblyIdentity': 'sddfdsadfsa', 'ComponentFileName': 'fsdfds', 'DependencyInformationFile': '$(IntDir)\\mt.depdfd', 'EmbedManifest': 'false', 'GenerateCatalogFiles': 'true', 'InputResourceManifests': 'asfsfdafs', 'ManifestResourceFile': '$(IntDir)\\$(TargetFileName).embed.manifest.resfdsf', 'OutputManifestFile': '$(TargetPath).manifestdfs', 'RegistrarScriptFile': 'sdfsfd', 'ReplacementsFile': 'sdffsd', 'SuppressStartupBanner': 'false', 'TypeLibraryFile': 'sfsd', 'UpdateFileHashes': 'true', 'UpdateFileHashesSearchPath': 'sfsd', 'UseFAT32Workaround': 'true', 'UseUnicodeResponseFiles': 'false', 'VerboseOutput': 'true'}} expected_msbuild_settings = { 'ClCompile': { 'AdditionalIncludeDirectories': 'dir1', 'AdditionalOptions': '/more /J', 'AdditionalUsingDirectories': 'test', 'AssemblerListingLocation': '$(IntDir)a', 'AssemblerOutput': 'AssemblyCode', 'BasicRuntimeChecks': 'EnableFastChecks', 'BrowseInformation': 'true', 'BrowseInformationFile': '$(IntDir)e', 'BufferSecurityCheck': 'false', 
'CallingConvention': 'FastCall', 'CompileAs': 'CompileAsC', 'DebugInformationFormat': 'EditAndContinue', 'DisableLanguageExtensions': 'true', 'DisableSpecificWarnings': 'abc', 'EnableEnhancedInstructionSet': 'StreamingSIMDExtensions', 'EnableFiberSafeOptimizations': 'true', 'EnablePREfast': 'true', 'ErrorReporting': 'Queue', 'ExceptionHandling': 'Async', 'ExpandAttributedSource': 'true', 'FavorSizeOrSpeed': 'Size', 'FloatingPointExceptions': 'true', 'FloatingPointModel': 'Strict', 'ForceConformanceInForLoopScope': 'false', 'ForcedIncludeFiles': 'def', 'ForcedUsingFiles': 'ge', 'FunctionLevelLinking': 'true', 'GenerateXMLDocumentationFiles': 'true', 'IgnoreStandardIncludePath': 'true', 'InlineFunctionExpansion': 'OnlyExplicitInline', 'IntrinsicFunctions': 'true', 'MinimalRebuild': 'true', 'ObjectFileName': '$(IntDir)b', 'OmitDefaultLibName': 'true', 'OmitFramePointers': 'true', 'OpenMPSupport': 'true', 'Optimization': 'Full', 'PrecompiledHeader': 'NotUsing', # Actual conversion gives '' 'PrecompiledHeaderFile': 'StdAfx.hd', 'PrecompiledHeaderOutputFile': '$(IntDir)$(TargetName).pche', 'PreprocessKeepComments': 'true', 'PreprocessorDefinitions': 'WIN32;_DEBUG;_CONSOLE', 'PreprocessSuppressLineNumbers': 'true', 'PreprocessToFile': 'true', 'ProgramDataBaseFileName': '$(IntDir)vc90b.pdb', 'RuntimeLibrary': 'MultiThreadedDebugDLL', 'RuntimeTypeInfo': 'false', 'ShowIncludes': 'true', 'SmallerTypeCheck': 'true', 'StringPooling': 'true', 'StructMemberAlignment': '4Bytes', 'SuppressStartupBanner': 'false', 'TreatWarningAsError': 'true', 'TreatWChar_tAsBuiltInType': 'false', 'UndefineAllPreprocessorDefinitions': 'true', 'UndefinePreprocessorDefinitions': 'wer', 'UseFullPaths': 'true', 'WarningLevel': 'Level3', 'WholeProgramOptimization': 'true', 'XMLDocumentationFileName': '$(IntDir)c'}, 'Link': { 'AdditionalDependencies': 'zx', 'AdditionalLibraryDirectories': 'asd', 'AdditionalManifestDependencies': 's2', 'AdditionalOptions': '/mor2', 'AddModuleNamesToAssembly': 'd1', 'AllowIsolation': 'false', 'AssemblyDebug': 'true', 'AssemblyLinkResource': 'd5', 'BaseAddress': '23423', 'CLRImageType': 'ForceSafeILImage', 'CLRThreadAttribute': 'MTAThreadingAttribute', 'CLRUnmanagedCodeCheck': 'true', 'DataExecutionPrevention': '', 'DelayLoadDLLs': 'd4', 'DelaySign': 'true', 'Driver': 'UpOnly', 'EmbedManagedResourceFile': 'd2', 'EnableCOMDATFolding': 'false', 'EnableUAC': 'false', 'EntryPointSymbol': 'f5', 'FixedBaseAddress': 'false', 'ForceSymbolReferences': 'd3', 'FunctionOrder': 'fssdfsd', 'GenerateDebugInformation': 'true', 'GenerateMapFile': 'true', 'HeapCommitSize': '13', 'HeapReserveSize': '12', 'IgnoreAllDefaultLibraries': 'true', 'IgnoreEmbeddedIDL': 'true', 'IgnoreSpecificDefaultLibraries': 'flob;flok', 'ImportLibrary': 'f4', 'KeyContainer': 'f7', 'KeyFile': 'f6', 'LargeAddressAware': 'true', 'LinkErrorReporting': 'QueueForNextLogin', 'LinkTimeCodeGeneration': 'UseLinkTimeCodeGeneration', 'ManifestFile': '$(IntDir)$(TargetFileName).2intermediate.manifest', 'MapExports': 'true', 'MapFileName': 'd5', 'MergedIDLBaseFileName': 'f2', 'MergeSections': 'f5', 'MidlCommandFile': 'f1', 'ModuleDefinitionFile': 'sdsd', 'NoEntryPoint': 'true', 'OptimizeReferences': 'true', 'OutputFile': '$(OutDir)$(ProjectName)2.exe', 'PerUserRedirection': 'true', 'Profile': 'true', 'ProfileGuidedDatabase': '$(TargetDir)$(TargetName).pgdd', 'ProgramDatabaseFile': 'Flob.pdb', 'RandomizedBaseAddress': 'false', 'RegisterOutput': 'true', 'SetChecksum': 'false', 'ShowProgress': 'LinkVerbose', 'StackCommitSize': '15', 'StackReserveSize': 
'14', 'StripPrivateSymbols': 'd3', 'SubSystem': 'Console', 'SupportUnloadOfDelayLoadedDLL': 'true', 'SuppressStartupBanner': 'false', 'SwapRunFromCD': 'true', 'SwapRunFromNET': 'true', 'TargetMachine': 'MachineX86', 'TerminalServerAware': 'false', 'TurnOffAssemblyGeneration': 'true', 'TypeLibraryFile': 'f3', 'TypeLibraryResourceID': '12', 'UACExecutionLevel': 'RequireAdministrator', 'UACUIAccess': 'true', 'Version': '333'}, 'ResourceCompile': { 'AdditionalIncludeDirectories': 'f3', 'AdditionalOptions': '/more3', 'Culture': '0x0c0c', 'IgnoreStandardIncludePath': 'true', 'PreprocessorDefinitions': '_UNICODE;UNICODE2', 'ResourceOutputFileName': '$(IntDir)%(Filename)3.res', 'ShowProgress': 'true'}, 'Manifest': { 'AdditionalManifestFiles': 'sfsdfsd', 'AdditionalOptions': 'afdsdafsd', 'AssemblyIdentity': 'sddfdsadfsa', 'ComponentFileName': 'fsdfds', 'GenerateCatalogFiles': 'true', 'InputResourceManifests': 'asfsfdafs', 'OutputManifestFile': '$(TargetPath).manifestdfs', 'RegistrarScriptFile': 'sdfsfd', 'ReplacementsFile': 'sdffsd', 'SuppressStartupBanner': 'false', 'TypeLibraryFile': 'sfsd', 'UpdateFileHashes': 'true', 'UpdateFileHashesSearchPath': 'sfsd', 'VerboseOutput': 'true'}, 'ProjectReference': { 'LinkLibraryDependencies': 'false', 'UseLibraryDependencyInputs': 'true'}, '': { 'EmbedManifest': 'false', 'GenerateManifest': 'false', 'IgnoreImportLibrary': 'true', 'LinkIncremental': '' }, 'ManifestResourceCompile': { 'ResourceOutputFileName': '$(IntDir)$(TargetFileName).embed.manifest.resfdsf'} } actual_msbuild_settings = MSVSSettings.ConvertToMSBuildSettings( msvs_settings, self.stderr) self.assertEqual(expected_msbuild_settings, actual_msbuild_settings) self._ExpectedWarnings([]) if __name__ == '__main__': unittest.main() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/MSVSToolFile.py0000644000000000000000000000341412631326456023210 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Visual Studio project reader/writer.""" import gyp.common import gyp.easy_xml as easy_xml class Writer(object): """Visual Studio XML tool file writer.""" def __init__(self, tool_file_path, name): """Initializes the tool file. Args: tool_file_path: Path to the tool file. name: Name of the tool file. """ self.tool_file_path = tool_file_path self.name = name self.rules_section = ['Rules'] def AddCustomBuildRule(self, name, cmd, description, additional_dependencies, outputs, extensions): """Adds a rule to the tool file. Args: name: Name of the rule. description: Description of the rule. cmd: Command line of the rule. additional_dependencies: other files which may trigger the rule. outputs: outputs of the rule. extensions: extensions handled by the rule. """ rule = ['CustomBuildRule', {'Name': name, 'ExecutionDescription': description, 'CommandLine': cmd, 'Outputs': ';'.join(outputs), 'FileExtensions': ';'.join(extensions), 'AdditionalDependencies': ';'.join(additional_dependencies) }] self.rules_section.append(rule) def WriteIfChanged(self): """Writes the tool file.""" content = ['VisualStudioToolFile', {'Version': '8.00', 'Name': self.name }, self.rules_section ] easy_xml.WriteXmlIfChanged(content, self.tool_file_path, encoding="Windows-1252") npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/MSVSUserFile.py0000644000000000000000000001174612631326456023220 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. 
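For orientation, the fixtures above all exercise MSVSSettings.ConvertToMSBuildSettings, which maps VS2008-era tool and option names onto their MSBuild spellings. Below is a minimal, illustrative sketch of driving the converter directly, assuming gyp's pylib directory is on sys.path; the option/value pairs and their expected conversions are taken straight from the test data above.

import sys
import gyp.MSVSSettings as MSVSSettings

# VS2008-style input; expected MSBuild spellings mirror the fixtures above.
msvs_settings = {
    'VCCLCompilerTool': {
        'Optimization': '3',          # numeric enum used in .vcproj files
        'WarningLevel': '3',
        'PreprocessorDefinitions': 'WIN32;_DEBUG;_CONSOLE',  # passes through
    },
}

msbuild = MSVSSettings.ConvertToMSBuildSettings(msvs_settings, sys.stderr)
print msbuild['ClCompile']['Optimization']    # 'Full'
print msbuild['ClCompile']['WarningLevel']    # 'Level3'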
# Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Visual Studio user preferences file writer.""" import os import re import socket # for gethostname import gyp.common import gyp.easy_xml as easy_xml #------------------------------------------------------------------------------ def _FindCommandInPath(command): """If there are no slashes in the command given, this function searches the PATH env to find the given command, and converts it to an absolute path. We have to do this because MSVS is looking for an actual file to launch a debugger on, not just a command line. Note that this happens at GYP time, so anything needing to be built needs to have a full path.""" if '/' in command or '\\' in command: # If the command already has path elements (either relative or # absolute), then assume it is constructed properly. return command else: # Search through the path list and find an existing file that # we can access. paths = os.environ.get('PATH','').split(os.pathsep) for path in paths: item = os.path.join(path, command) if os.path.isfile(item) and os.access(item, os.X_OK): return item return command def _QuoteWin32CommandLineArgs(args): new_args = [] for arg in args: # Replace all double-quotes with double-double-quotes to escape # them for cmd shell, and then quote the whole thing if there # are any. if arg.find('"') != -1: arg = '""'.join(arg.split('"')) arg = '"%s"' % arg # Otherwise, if there are any spaces, quote the whole arg. elif re.search(r'[ \t\n]', arg): arg = '"%s"' % arg new_args.append(arg) return new_args class Writer(object): """Visual Studio XML user file writer.""" def __init__(self, user_file_path, version, name): """Initializes the user file. Args: user_file_path: Path to the user file. version: Version info. name: Name of the user file. """ self.user_file_path = user_file_path self.version = version self.name = name self.configurations = {} def AddConfig(self, name): """Adds a configuration to the project. Args: name: Configuration name. """ self.configurations[name] = ['Configuration', {'Name': name}] def AddDebugSettings(self, config_name, command, environment = {}, working_directory=""): """Adds a DebugSettings node to the user file for a particular config. Args: config_name: Name of the configuration to add the debug settings to. command: command line to run. First element in the list is the executable. All elements of the command will be quoted if necessary. environment: dictionary of environment variables to set. (optional) working_directory: working directory in which to run the command. (optional) """ command = _QuoteWin32CommandLineArgs(command) abs_command = _FindCommandInPath(command[0]) if environment and isinstance(environment, dict): env_list = ['%s="%s"' % (key, val) for (key,val) in environment.iteritems()] environment = ' '.join(env_list) else: environment = '' n_cmd = ['DebugSettings', {'Command': abs_command, 'WorkingDirectory': working_directory, 'CommandArguments': " ".join(command[1:]), 'RemoteMachine': socket.gethostname(), 'Environment': environment, 'EnvironmentMerge': 'true', # Currently these are all "dummy" values that we're just setting # in the default manner that MSVS does it. We could use some of # these to add additional capabilities, I suppose, but they might # not have parity with other platforms then.
'Attach': 'false', 'DebuggerType': '3', # 'auto' debugger 'Remote': '1', 'RemoteCommand': '', 'HttpUrl': '', 'PDBPath': '', 'SQLDebugging': '', 'DebuggerFlavor': '0', 'MPIRunCommand': '', 'MPIRunArguments': '', 'MPIRunWorkingDirectory': '', 'ApplicationCommand': '', 'ApplicationArguments': '', 'ShimCommand': '', 'MPIAcceptMode': '', 'MPIAcceptFilter': '' }] # Find the config, and add it if it doesn't exist. if config_name not in self.configurations: self.AddConfig(config_name) # Add the DebugSettings onto the appropriate config. self.configurations[config_name].append(n_cmd) def WriteIfChanged(self): """Writes the user file.""" configs = ['Configurations'] for config, spec in sorted(self.configurations.iteritems()): configs.append(spec) content = ['VisualStudioUserFile', {'Version': self.version.ProjectVersion(), 'Name': self.name }, configs] easy_xml.WriteXmlIfChanged(content, self.user_file_path, encoding="Windows-1252") npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/MSVSUtil.py0000644000000000000000000002250112631326456022406 0ustar 00000000000000# Copyright (c) 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Utility functions shared amongst the Windows generators.""" import copy import os # A dictionary mapping supported target types to extensions. TARGET_TYPE_EXT = { 'executable': 'exe', 'loadable_module': 'dll', 'shared_library': 'dll', 'static_library': 'lib', } def _GetLargePdbShimCcPath(): """Returns the path of the large_pdb_shim.cc file.""" this_dir = os.path.abspath(os.path.dirname(__file__)) src_dir = os.path.abspath(os.path.join(this_dir, '..', '..')) win_data_dir = os.path.join(src_dir, 'data', 'win') large_pdb_shim_cc = os.path.join(win_data_dir, 'large-pdb-shim.cc') return large_pdb_shim_cc def _DeepCopySomeKeys(in_dict, keys): """Performs a partial deep-copy on |in_dict|, only copying the keys in |keys|. Arguments: in_dict: The dictionary to copy. keys: The keys to be copied. If a key is in this list and doesn't exist in |in_dict| this is not an error. Returns: The partially deep-copied dictionary. """ d = {} for key in keys: if key not in in_dict: continue d[key] = copy.deepcopy(in_dict[key]) return d def _SuffixName(name, suffix): """Add a suffix to the end of a target. Arguments: name: name of the target (foo#target) suffix: the suffix to be added Returns: Target name with suffix added (foo_suffix#target) """ parts = name.rsplit('#', 1) parts[0] = '%s_%s' % (parts[0], suffix) return '#'.join(parts) def _ShardName(name, number): """Add a shard number to the end of a target. Arguments: name: name of the target (foo#target) number: shard number Returns: Target name with shard added (foo_1#target) """ return _SuffixName(name, str(number)) def ShardTargets(target_list, target_dicts): """Shard some targets apart to work around the linker's limits. Arguments: target_list: List of target pairs: 'base/base.gyp:base'. target_dicts: Dict of target properties keyed on target pair. Returns: Tuple of the new sharded versions of the inputs. """ # Gather the targets to shard, and how many pieces. targets_to_shard = {} for t in target_dicts: shards = int(target_dicts[t].get('msvs_shard', 0)) if shards: targets_to_shard[t] = shards # Shard target_list. new_target_list = [] for t in target_list: if t in targets_to_shard: for i in range(targets_to_shard[t]): new_target_list.append(_ShardName(t, i)) else: new_target_list.append(t) # Shard target_dict.
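The naming helpers above are easiest to see with concrete values. A small illustrative check (these are private helpers, imported here purely for demonstration, assuming gyp's pylib is on sys.path):

from gyp.MSVSUtil import _ShardName, _SuffixName

# Qualified names keep their '#toolset' tail; only the target part of the
# name gets the suffix or shard number appended.
assert _SuffixName('base/base.gyp:base#target', 'copy') == \
    'base/base.gyp:base_copy#target'
assert _ShardName('base/base.gyp:base#target', 1) == \
    'base/base.gyp:base_1#target'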
new_target_dicts = {} for t in target_dicts: if t in targets_to_shard: for i in range(targets_to_shard[t]): name = _ShardName(t, i) new_target_dicts[name] = copy.copy(target_dicts[t]) new_target_dicts[name]['target_name'] = _ShardName( new_target_dicts[name]['target_name'], i) sources = new_target_dicts[name].get('sources', []) new_sources = [] for pos in range(i, len(sources), targets_to_shard[t]): new_sources.append(sources[pos]) new_target_dicts[name]['sources'] = new_sources else: new_target_dicts[t] = target_dicts[t] # Shard dependencies. for t in new_target_dicts: for deptype in ('dependencies', 'dependencies_original'): dependencies = copy.copy(new_target_dicts[t].get(deptype, [])) new_dependencies = [] for d in dependencies: if d in targets_to_shard: for i in range(targets_to_shard[d]): new_dependencies.append(_ShardName(d, i)) else: new_dependencies.append(d) new_target_dicts[t][deptype] = new_dependencies return (new_target_list, new_target_dicts) def _GetPdbPath(target_dict, config_name, vars): """Returns the path to the PDB file that will be generated by a given configuration. The lookup proceeds as follows: - Look for an explicit path in the VCLinkerTool configuration block. - Look for an 'msvs_large_pdb_path' variable. - Use '<(PRODUCT_DIR)/<(product_name).(exe|dll).pdb' if 'product_name' is specified. - Use '<(PRODUCT_DIR)/<(target_name).(exe|dll).pdb'. Arguments: target_dict: The target dictionary to be searched. config_name: The name of the configuration of interest. vars: A dictionary of common GYP variables with generator-specific values. Returns: The path of the corresponding PDB file. """ config = target_dict['configurations'][config_name] msvs = config.setdefault('msvs_settings', {}) linker = msvs.get('VCLinkerTool', {}) pdb_path = linker.get('ProgramDatabaseFile') if pdb_path: return pdb_path variables = target_dict.get('variables', {}) pdb_path = variables.get('msvs_large_pdb_path', None) if pdb_path: return pdb_path pdb_base = target_dict.get('product_name', target_dict['target_name']) pdb_base = '%s.%s.pdb' % (pdb_base, TARGET_TYPE_EXT[target_dict['type']]) pdb_path = vars['PRODUCT_DIR'] + '/' + pdb_base return pdb_path def InsertLargePdbShims(target_list, target_dicts, vars): """Insert a shim target that forces the linker to use 4KB pagesize PDBs. This is a workaround for targets with PDBs greater than 1GB in size, the limit for the 1KB pagesize PDBs created by the linker by default. Arguments: target_list: List of target pairs: 'base/base.gyp:base'. target_dicts: Dict of target properties keyed on target pair. vars: A dictionary of common GYP variables with generator-specific values. Returns: Tuple of the shimmed version of the inputs. """ # Determine which targets need shimming. targets_to_shim = [] for t in target_dicts: target_dict = target_dicts[t] # We only want to shim targets that have msvs_large_pdb enabled. if not int(target_dict.get('msvs_large_pdb', 0)): continue # This is intended for executable, shared_library and loadable_module # targets where every configuration is set up to produce a PDB output. # If any of these conditions is not true then the shim logic will fail # below. 
targets_to_shim.append(t) large_pdb_shim_cc = _GetLargePdbShimCcPath() for t in targets_to_shim: target_dict = target_dicts[t] target_name = target_dict.get('target_name') base_dict = _DeepCopySomeKeys(target_dict, ['configurations', 'default_configuration', 'toolset']) # This is the dict for copying the source file (part of the GYP tree) # to the intermediate directory of the project. This is necessary because # we can't always build a relative path to the shim source file (on Windows # GYP and the project may be on different drives), and Ninja hates absolute # paths (it ends up generating the .obj and .obj.d alongside the source # file, polluting GYPs tree). copy_suffix = 'large_pdb_copy' copy_target_name = target_name + '_' + copy_suffix full_copy_target_name = _SuffixName(t, copy_suffix) shim_cc_basename = os.path.basename(large_pdb_shim_cc) shim_cc_dir = vars['SHARED_INTERMEDIATE_DIR'] + '/' + copy_target_name shim_cc_path = shim_cc_dir + '/' + shim_cc_basename copy_dict = copy.deepcopy(base_dict) copy_dict['target_name'] = copy_target_name copy_dict['type'] = 'none' copy_dict['sources'] = [ large_pdb_shim_cc ] copy_dict['copies'] = [{ 'destination': shim_cc_dir, 'files': [ large_pdb_shim_cc ] }] # This is the dict for the PDB generating shim target. It depends on the # copy target. shim_suffix = 'large_pdb_shim' shim_target_name = target_name + '_' + shim_suffix full_shim_target_name = _SuffixName(t, shim_suffix) shim_dict = copy.deepcopy(base_dict) shim_dict['target_name'] = shim_target_name shim_dict['type'] = 'static_library' shim_dict['sources'] = [ shim_cc_path ] shim_dict['dependencies'] = [ full_copy_target_name ] # Set up the shim to output its PDB to the same location as the final linker # target. for config_name, config in shim_dict.get('configurations').iteritems(): pdb_path = _GetPdbPath(target_dict, config_name, vars) # A few keys that we don't want to propagate. for key in ['msvs_precompiled_header', 'msvs_precompiled_source', 'test']: config.pop(key, None) msvs = config.setdefault('msvs_settings', {}) # Update the compiler directives in the shim target. compiler = msvs.setdefault('VCCLCompilerTool', {}) compiler['DebugInformationFormat'] = '3' compiler['ProgramDataBaseFileName'] = pdb_path # Set the explicit PDB path in the appropriate configuration of the # original target. config = target_dict['configurations'][config_name] msvs = config.setdefault('msvs_settings', {}) linker = msvs.setdefault('VCLinkerTool', {}) linker['GenerateDebugInformation'] = 'true' linker['ProgramDatabaseFile'] = pdb_path # Add the new targets. They must go to the beginning of the list so that # the dependency generation works as expected in ninja. target_list.insert(0, full_copy_target_name) target_list.insert(0, full_shim_target_name) target_dicts[full_copy_target_name] = copy_dict target_dicts[full_shim_target_name] = shim_dict # Update the original target to depend on the shim target. target_dict.setdefault('dependencies', []).append(full_shim_target_name) return (target_list, target_dicts) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/MSVSVersion.py0000644000000000000000000004141512631326456023123 0ustar 00000000000000# Copyright (c) 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. 
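To make the shim flow above concrete, here is a hedged sketch with a hypothetical one-target input; the keys shown are the ones the code actually reads, and everything else is omitted. The generator variables are illustrative placeholders.

import gyp.MSVSUtil

# Hypothetical minimal target set: one executable opting in to the shim.
target_list = ['a.gyp:app#target']
target_dicts = {
    'a.gyp:app#target': {
        'target_name': 'app',
        'type': 'executable',
        'msvs_large_pdb': 1,                 # opt in; checked via int(...)
        'configurations': {'Debug': {}},
    },
}
generator_vars = {'PRODUCT_DIR': '$(OutDir)',
                  'SHARED_INTERMEDIATE_DIR': '$(IntDir)gen'}

target_list, target_dicts = gyp.MSVSUtil.InsertLargePdbShims(
    target_list, target_dicts, generator_vars)
# The copy and shim targets are prepended so dependency generation sees them
# first, and the original target now depends on the shim:
print target_list[:2]
# ['a.gyp:app_large_pdb_shim#target', 'a.gyp:app_large_pdb_copy#target']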
"""Handle version information related to Visual Stuio.""" import errno import os import re import subprocess import sys import gyp import glob class VisualStudioVersion(object): """Information regarding a version of Visual Studio.""" def __init__(self, short_name, description, solution_version, project_version, flat_sln, uses_vcxproj, path, sdk_based, default_toolset=None): self.short_name = short_name self.description = description self.solution_version = solution_version self.project_version = project_version self.flat_sln = flat_sln self.uses_vcxproj = uses_vcxproj self.path = path self.sdk_based = sdk_based self.default_toolset = default_toolset def ShortName(self): return self.short_name def Description(self): """Get the full description of the version.""" return self.description def SolutionVersion(self): """Get the version number of the sln files.""" return self.solution_version def ProjectVersion(self): """Get the version number of the vcproj or vcxproj files.""" return self.project_version def FlatSolution(self): return self.flat_sln def UsesVcxproj(self): """Returns true if this version uses a vcxproj file.""" return self.uses_vcxproj def ProjectExtension(self): """Returns the file extension for the project.""" return self.uses_vcxproj and '.vcxproj' or '.vcproj' def Path(self): """Returns the path to Visual Studio installation.""" return self.path def ToolPath(self, tool): """Returns the path to a given compiler tool. """ return os.path.normpath(os.path.join(self.path, "VC/bin", tool)) def DefaultToolset(self): """Returns the msbuild toolset version that will be used in the absence of a user override.""" return self.default_toolset def SetupScript(self, target_arch): """Returns a command (with arguments) to be used to set up the environment.""" # Check if we are running in the SDK command line environment and use # the setup script from the SDK if so. |target_arch| should be either # 'x86' or 'x64'. assert target_arch in ('x86', 'x64') sdk_dir = os.environ.get('WindowsSDKDir') if self.sdk_based and sdk_dir: return [os.path.normpath(os.path.join(sdk_dir, 'Bin/SetEnv.Cmd')), '/' + target_arch] else: # We don't use VC/vcvarsall.bat for x86 because vcvarsall calls # vcvars32, which it can only find if VS??COMNTOOLS is set, which it # isn't always. if target_arch == 'x86': if self.short_name >= '2013' and self.short_name[-1] != 'e' and ( os.environ.get('PROCESSOR_ARCHITECTURE') == 'AMD64' or os.environ.get('PROCESSOR_ARCHITEW6432') == 'AMD64'): # VS2013 and later, non-Express have a x64-x86 cross that we want # to prefer. return [os.path.normpath( os.path.join(self.path, 'VC/vcvarsall.bat')), 'amd64_x86'] # Otherwise, the standard x86 compiler. return [os.path.normpath( os.path.join(self.path, 'Common7/Tools/vsvars32.bat'))] else: assert target_arch == 'x64' arg = 'x86_amd64' # Use the 64-on-64 compiler if we're not using an express # edition and we're running on a 64bit OS. if self.short_name[-1] != 'e' and ( os.environ.get('PROCESSOR_ARCHITECTURE') == 'AMD64' or os.environ.get('PROCESSOR_ARCHITEW6432') == 'AMD64'): arg = 'amd64' return [os.path.normpath( os.path.join(self.path, 'VC/vcvarsall.bat')), arg] def _RegistryQueryBase(sysdir, key, value): """Use reg.exe to read a particular key. While ideally we might use the win32 module, we would like gyp to be python neutral, so for instance cygwin python lacks this module. Arguments: sysdir: The system subdirectory to attempt to launch reg.exe from. key: The registry key to read from. value: The particular value to read. 
Return: stdout from reg.exe, or None for failure. """ # Skip if not on Windows or Python Win32 setup issue if sys.platform not in ('win32', 'cygwin'): return None # Set up params to pass to reg.exe and attempt to launch it cmd = [os.path.join(os.environ.get('WINDIR', ''), sysdir, 'reg.exe'), 'query', key] if value: cmd.extend(['/v', value]) p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE) # Obtain the stdout from reg.exe, reading to the end so p.returncode is valid # Note that the error text may be in [1] in some cases text = p.communicate()[0] # Check return code from reg.exe; officially 0==success and 1==error if p.returncode: return None return text def _RegistryQuery(key, value=None): r"""Use reg.exe to read a particular key through _RegistryQueryBase. First tries to launch from %WinDir%\Sysnative to avoid WoW64 redirection. If that fails, it falls back to System32. Sysnative is available on Vista and up and available on Windows Server 2003 and XP through KB patch 942589. Note that Sysnative will always fail if using 64-bit python, since Sysnative is a virtual directory; in that case System32 works correctly in the first place. KB 942589 - http://support.microsoft.com/kb/942589/en-us. Arguments: key: The registry key. value: The particular registry value to read (optional). Return: stdout from reg.exe, or None for failure. """ text = None try: text = _RegistryQueryBase('Sysnative', key, value) except OSError, e: if e.errno == errno.ENOENT: text = _RegistryQueryBase('System32', key, value) else: raise return text def _RegistryGetValueUsingWinReg(key, value): """Use the _winreg module to obtain the value of a registry key. Args: key: The registry key. value: The particular registry value to read. Return: contents of the registry key's value, or None on failure. Throws ImportError if _winreg is unavailable. """ import _winreg try: root, subkey = key.split('\\', 1) assert root == 'HKLM' # Only need HKLM for now. with _winreg.OpenKey(_winreg.HKEY_LOCAL_MACHINE, subkey) as hkey: return _winreg.QueryValueEx(hkey, value)[0] except WindowsError: return None def _RegistryGetValue(key, value): """Use _winreg or reg.exe to obtain the value of a registry key. Using _winreg is preferable because it solves an issue in some corporate environments where access to reg.exe is locked down. However, we still need to fall back to reg.exe for the case where the _winreg module is not available (for example in cygwin python). Args: key: The registry key. value: The particular registry value to read. Return: contents of the registry key's value, or None on failure. """ try: return _RegistryGetValueUsingWinReg(key, value) except ImportError: pass # Fallback to reg.exe if we fail to import _winreg. text = _RegistryQuery(key, value) if not text: return None # Extract value. match = re.search(r'REG_\w+\s+([^\r]+)\r\n', text) if not match: return None return match.group(1) def _CreateVersion(name, path, sdk_based=False): """Sets up MSVS project generation. Setup is based off the GYP_MSVS_VERSION environment variable or whatever is autodetected if GYP_MSVS_VERSION is not explicitly specified. If a version is passed in that doesn't match a value in versions Python will throw an error.
""" if path: path = os.path.normpath(path) versions = { '2015': VisualStudioVersion('2015', 'Visual Studio 2015', solution_version='12.00', project_version='14.0', flat_sln=False, uses_vcxproj=True, path=path, sdk_based=sdk_based, default_toolset='v140'), '2013': VisualStudioVersion('2013', 'Visual Studio 2013', solution_version='13.00', project_version='12.0', flat_sln=False, uses_vcxproj=True, path=path, sdk_based=sdk_based, default_toolset='v120'), '2013e': VisualStudioVersion('2013e', 'Visual Studio 2013', solution_version='13.00', project_version='12.0', flat_sln=True, uses_vcxproj=True, path=path, sdk_based=sdk_based, default_toolset='v120'), '2012': VisualStudioVersion('2012', 'Visual Studio 2012', solution_version='12.00', project_version='4.0', flat_sln=False, uses_vcxproj=True, path=path, sdk_based=sdk_based, default_toolset='v110'), '2012e': VisualStudioVersion('2012e', 'Visual Studio 2012', solution_version='12.00', project_version='4.0', flat_sln=True, uses_vcxproj=True, path=path, sdk_based=sdk_based, default_toolset='v110'), '2010': VisualStudioVersion('2010', 'Visual Studio 2010', solution_version='11.00', project_version='4.0', flat_sln=False, uses_vcxproj=True, path=path, sdk_based=sdk_based), '2010e': VisualStudioVersion('2010e', 'Visual C++ Express 2010', solution_version='11.00', project_version='4.0', flat_sln=True, uses_vcxproj=True, path=path, sdk_based=sdk_based), '2008': VisualStudioVersion('2008', 'Visual Studio 2008', solution_version='10.00', project_version='9.00', flat_sln=False, uses_vcxproj=False, path=path, sdk_based=sdk_based), '2008e': VisualStudioVersion('2008e', 'Visual Studio 2008', solution_version='10.00', project_version='9.00', flat_sln=True, uses_vcxproj=False, path=path, sdk_based=sdk_based), '2005': VisualStudioVersion('2005', 'Visual Studio 2005', solution_version='9.00', project_version='8.00', flat_sln=False, uses_vcxproj=False, path=path, sdk_based=sdk_based), '2005e': VisualStudioVersion('2005e', 'Visual Studio 2005', solution_version='9.00', project_version='8.00', flat_sln=True, uses_vcxproj=False, path=path, sdk_based=sdk_based), } return versions[str(name)] def _ConvertToCygpath(path): """Convert to cygwin path if we are using cygwin.""" if sys.platform == 'cygwin': p = subprocess.Popen(['cygpath', path], stdout=subprocess.PIPE) path = p.communicate()[0].strip() return path def _DetectVisualStudioVersions(versions_to_check, force_express): """Collect the list of installed visual studio versions. Returns: A list of visual studio versions installed in descending order of usage preference. Base this on the registry and a quick check if devenv.exe exists. Only versions 8-10 are considered. Possibilities are: 2005(e) - Visual Studio 2005 (8) 2008(e) - Visual Studio 2008 (9) 2010(e) - Visual Studio 2010 (10) 2012(e) - Visual Studio 2012 (11) 2013(e) - Visual Studio 2013 (12) 2015 - Visual Studio 2015 (14) Where (e) is e for express editions of MSVS and blank otherwise. """ version_to_year = { '8.0': '2005', '9.0': '2008', '10.0': '2010', '11.0': '2012', '12.0': '2013', '14.0': '2015', } versions = [] for version in versions_to_check: # Old method of searching for which VS version is installed # We don't use the 2010-encouraged-way because we also want to get the # path to the binaries, which it doesn't offer. 
keys = [r'HKLM\Software\Microsoft\VisualStudio\%s' % version, r'HKLM\Software\Wow6432Node\Microsoft\VisualStudio\%s' % version, r'HKLM\Software\Microsoft\VCExpress\%s' % version, r'HKLM\Software\Wow6432Node\Microsoft\VCExpress\%s' % version] for index in range(len(keys)): path = _RegistryGetValue(keys[index], 'InstallDir') if not path: continue path = _ConvertToCygpath(path) # Check for full. full_path = os.path.join(path, 'devenv.exe') express_path = os.path.join(path, '*express.exe') if not force_express and os.path.exists(full_path): # Add this one. versions.append(_CreateVersion(version_to_year[version], os.path.join(path, '..', '..'))) # Check for express. elif glob.glob(express_path): # Add this one. versions.append(_CreateVersion(version_to_year[version] + 'e', os.path.join(path, '..', '..'))) # The old method above does not work when only SDK is installed. keys = [r'HKLM\Software\Microsoft\VisualStudio\SxS\VC7', r'HKLM\Software\Wow6432Node\Microsoft\VisualStudio\SxS\VC7'] for index in range(len(keys)): path = _RegistryGetValue(keys[index], version) if not path: continue path = _ConvertToCygpath(path) if version != '14.0': # There is no Express edition for 2015. versions.append(_CreateVersion(version_to_year[version] + 'e', os.path.join(path, '..'), sdk_based=True)) return versions def SelectVisualStudioVersion(version='auto', allow_fallback=True): """Select which version of Visual Studio projects to generate. Arguments: version: Hook to allow caller to force a particular version (vs auto). Returns: An object representing a visual studio project format version. """ # In auto mode, check environment variable for override. if version == 'auto': version = os.environ.get('GYP_MSVS_VERSION', 'auto') version_map = { 'auto': ('14.0', '12.0', '10.0', '9.0', '8.0', '11.0'), '2005': ('8.0',), '2005e': ('8.0',), '2008': ('9.0',), '2008e': ('9.0',), '2010': ('10.0',), '2010e': ('10.0',), '2012': ('11.0',), '2012e': ('11.0',), '2013': ('12.0',), '2013e': ('12.0',), '2015': ('14.0',), } override_path = os.environ.get('GYP_MSVS_OVERRIDE_PATH') if override_path: msvs_version = os.environ.get('GYP_MSVS_VERSION') if not msvs_version: raise ValueError('GYP_MSVS_OVERRIDE_PATH requires GYP_MSVS_VERSION to be ' 'set to a particular version (e.g. 2010e).') return _CreateVersion(msvs_version, override_path, sdk_based=True) version = str(version) versions = _DetectVisualStudioVersions(version_map[version], 'e' in version) if not versions: if not allow_fallback: raise ValueError('Could not locate Visual Studio installation.') if version == 'auto': # Default to 2005 if we couldn't find anything return _CreateVersion('2005', None) else: return _CreateVersion(version, None) return versions[0] npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/__init__.py0000755000000000000000000005324212631326456022530 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import copy import gyp.input import optparse import os.path import re import shlex import sys import traceback from gyp.common import GypError # Default debug modes for GYP debug = {} # List of "official" debug modes, but you can use anything you like. 
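As a rough usage sketch for the selector just defined: with GYP_MSVS_VERSION unset and no Visual Studio installation found (for example on a non-Windows machine), 'auto' falls back to a 2005 version object. The probe chain is the registry lookups shown above.

import gyp.MSVSVersion

version = gyp.MSVSVersion.SelectVisualStudioVersion('auto')
# On a machine with VS installed this reflects the preferred detected
# install; otherwise the 2005 fallback object is returned.
print version.ShortName(), version.ProjectExtension()   # e.g. 2005 .vcproj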
DEBUG_GENERAL = 'general' DEBUG_VARIABLES = 'variables' DEBUG_INCLUDES = 'includes' def DebugOutput(mode, message, *args): if 'all' in gyp.debug or mode in gyp.debug: ctx = ('unknown', 0, 'unknown') try: f = traceback.extract_stack(limit=2) if f: ctx = f[0][:3] except: pass if args: message %= args print '%s:%s:%d:%s %s' % (mode.upper(), os.path.basename(ctx[0]), ctx[1], ctx[2], message) def FindBuildFiles(): extension = '.gyp' files = os.listdir(os.getcwd()) build_files = [] for file in files: if file.endswith(extension): build_files.append(file) return build_files def Load(build_files, format, default_variables={}, includes=[], depth='.', params=None, check=False, circular_check=True, duplicate_basename_check=True): """ Loads one or more specified build files. default_variables and includes will be copied before use. Returns the generator for the specified format and the data returned by loading the specified build files. """ if params is None: params = {} if '-' in format: format, params['flavor'] = format.split('-', 1) default_variables = copy.copy(default_variables) # Default variables provided by this program and its modules should be # named WITH_CAPITAL_LETTERS to provide a distinct "best practice" namespace, # avoiding collisions with user and automatic variables. default_variables['GENERATOR'] = format default_variables['GENERATOR_FLAVOR'] = params.get('flavor', '') # Format can be a custom python file, or by default the name of a module # within gyp.generator. if format.endswith('.py'): generator_name = os.path.splitext(format)[0] path, generator_name = os.path.split(generator_name) # Make sure the path to the custom generator is in sys.path # Don't worry about removing it once we are done. Keeping the path # to each generator that is used in sys.path is likely harmless and # arguably a good idea. path = os.path.abspath(path) if path not in sys.path: sys.path.insert(0, path) else: generator_name = 'gyp.generator.' + format # These parameters are passed in order (as opposed to by key) # because ActivePython cannot handle key parameters to __import__. generator = __import__(generator_name, globals(), locals(), generator_name) for (key, val) in generator.generator_default_variables.items(): default_variables.setdefault(key, val) # Give the generator the opportunity to set additional variables based on # the params it will receive in the output phase. if getattr(generator, 'CalculateVariables', None): generator.CalculateVariables(default_variables, params) # Give the generator the opportunity to set generator_input_info based on # the params it will receive in the output phase. if getattr(generator, 'CalculateGeneratorInputInfo', None): generator.CalculateGeneratorInputInfo(params) # Fetch the generator specific info that gets fed to input, we use getattr # so we can default things and the generators only have to provide what # they need. 
generator_input_info = { 'non_configuration_keys': getattr(generator, 'generator_additional_non_configuration_keys', []), 'path_sections': getattr(generator, 'generator_additional_path_sections', []), 'extra_sources_for_rules': getattr(generator, 'generator_extra_sources_for_rules', []), 'generator_supports_multiple_toolsets': getattr(generator, 'generator_supports_multiple_toolsets', False), 'generator_wants_static_library_dependencies_adjusted': getattr(generator, 'generator_wants_static_library_dependencies_adjusted', True), 'generator_wants_sorted_dependencies': getattr(generator, 'generator_wants_sorted_dependencies', False), 'generator_filelist_paths': getattr(generator, 'generator_filelist_paths', None), } # Process the input specific to this generator. result = gyp.input.Load(build_files, default_variables, includes[:], depth, generator_input_info, check, circular_check, duplicate_basename_check, params['parallel'], params['root_targets']) return [generator] + result def NameValueListToDict(name_value_list): """ Takes an array of strings of the form 'NAME=VALUE' and creates a dictionary of the pairs. If a string is simply NAME, then the value in the dictionary is set to True. If VALUE can be converted to an integer, it is. """ result = { } for item in name_value_list: tokens = item.split('=', 1) if len(tokens) == 2: # If we can make it an int, use that, otherwise, use the string. try: token_value = int(tokens[1]) except ValueError: token_value = tokens[1] # Set the variable to the supplied value. result[tokens[0]] = token_value else: # No value supplied, treat it as a boolean and set it. result[tokens[0]] = True return result def ShlexEnv(env_name): flags = os.environ.get(env_name, []) if flags: flags = shlex.split(flags) return flags def FormatOpt(opt, value): if opt.startswith('--'): return '%s=%s' % (opt, value) return opt + value def RegenerateAppendFlag(flag, values, predicate, env_name, options): """Regenerate a list of command line flags, for an option of action='append'. The |env_name|, if given, is checked in the environment and used to generate an initial list of options, then the options that were specified on the command line (given in |values|) are appended. This matches the handling of environment variables and command line flags where command line flags override the environment, while not requiring the environment to be set when the flags are used again. """ flags = [] if options.use_environment and env_name: for flag_value in ShlexEnv(env_name): value = FormatOpt(flag, predicate(flag_value)) if value in flags: flags.remove(value) flags.append(value) if values: for flag_value in values: flags.append(FormatOpt(flag, predicate(flag_value))) return flags def RegenerateFlags(options): """Given a parsed options object, and taking the environment variables into account, returns a list of flags that should regenerate an equivalent options object (even in the absence of the environment variables.) Any path options will be normalized relative to depth. The format flag is not included, as it is assumed the calling generator will set that as appropriate. """ def FixPath(path): path = gyp.common.FixIfRelativePath(path, options.depth) if not path: return os.path.curdir return path def Noop(value): return value # We always want to ignore the environment when regenerating, to avoid # duplicate or changed flags in the environment at the time of regeneration. 
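A quick illustration of the 'NAME=VALUE' parsing defined above; note the integer coercion and the bare-name-becomes-True rule (assuming gyp's pylib is importable):

import gyp

assert gyp.NameValueListToDict(['OS=win', 'depth=2', 'chromeos']) == {
    'OS': 'win', 'depth': 2, 'chromeos': True}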
flags = ['--ignore-environment'] for name, metadata in options._regeneration_metadata.iteritems(): opt = metadata['opt'] value = getattr(options, name) value_predicate = metadata['type'] == 'path' and FixPath or Noop action = metadata['action'] env_name = metadata['env_name'] if action == 'append': flags.extend(RegenerateAppendFlag(opt, value, value_predicate, env_name, options)) elif action in ('store', None): # None is a synonym for 'store'. if value: flags.append(FormatOpt(opt, value_predicate(value))) elif options.use_environment and env_name and os.environ.get(env_name): flags.append(FormatOpt(opt, value_predicate(os.environ.get(env_name)))) elif action in ('store_true', 'store_false'): if ((action == 'store_true' and value) or (action == 'store_false' and not value)): flags.append(opt) elif options.use_environment and env_name: print >>sys.stderr, ('Warning: environment regeneration unimplemented ' 'for %s flag %r env_name %r' % (action, opt, env_name)) else: print >>sys.stderr, ('Warning: regeneration unimplemented for action %r ' 'flag %r' % (action, opt)) return flags class RegeneratableOptionParser(optparse.OptionParser): def __init__(self): self.__regeneratable_options = {} optparse.OptionParser.__init__(self) def add_option(self, *args, **kw): """Add an option to the parser. This accepts the same arguments as OptionParser.add_option, plus the following: regenerate: can be set to False to prevent this option from being included in regeneration. env_name: name of environment variable that additional values for this option come from. type: adds type='path', to tell the regenerator that the values of this option need to be made relative to options.depth """ env_name = kw.pop('env_name', None) if 'dest' in kw and kw.pop('regenerate', True): dest = kw['dest'] # The path type is needed for regenerating, for optparse we can just treat # it as a string. type = kw.get('type') if type == 'path': kw['type'] = 'string' self.__regeneratable_options[dest] = { 'action': kw.get('action'), 'type': type, 'env_name': env_name, 'opt': args[0], } optparse.OptionParser.add_option(self, *args, **kw) def parse_args(self, *args): values, args = optparse.OptionParser.parse_args(self, *args) values._regeneration_metadata = self.__regeneratable_options return values, args def gyp_main(args): my_name = os.path.basename(sys.argv[0]) parser = RegeneratableOptionParser() usage = 'usage: %s [options ...] [build_file ...]' parser.set_usage(usage.replace('%s', '%prog')) parser.add_option('--build', dest='configs', action='append', help='configuration for build after project generation') parser.add_option('--check', dest='check', action='store_true', help='check format of gyp files') parser.add_option('--config-dir', dest='config_dir', action='store', env_name='GYP_CONFIG_DIR', default=None, help='The location for configuration files like ' 'include.gypi.') parser.add_option('-d', '--debug', dest='debug', metavar='DEBUGMODE', action='append', default=[], help='turn on a debugging ' 'mode for debugging GYP. 
Supported modes are "variables", ' '"includes" and "general" or "all" for all of them.') parser.add_option('-D', dest='defines', action='append', metavar='VAR=VAL', env_name='GYP_DEFINES', help='sets variable VAR to value VAL') parser.add_option('--depth', dest='depth', metavar='PATH', type='path', help='set DEPTH gyp variable to a relative path to PATH') parser.add_option('-f', '--format', dest='formats', action='append', env_name='GYP_GENERATORS', regenerate=False, help='output formats to generate') parser.add_option('-G', dest='generator_flags', action='append', default=[], metavar='FLAG=VAL', env_name='GYP_GENERATOR_FLAGS', help='sets generator flag FLAG to VAL') parser.add_option('--generator-output', dest='generator_output', action='store', default=None, metavar='DIR', type='path', env_name='GYP_GENERATOR_OUTPUT', help='puts generated build files under DIR') parser.add_option('--ignore-environment', dest='use_environment', action='store_false', default=True, regenerate=False, help='do not read options from environment variables') parser.add_option('-I', '--include', dest='includes', action='append', metavar='INCLUDE', type='path', help='files to include in all loaded .gyp files') # --no-circular-check disables the check for circular relationships between # .gyp files. These relationships should not exist, but they've only been # observed to be harmful with the Xcode generator. Chromium's .gyp files # currently have some circular relationships on non-Mac platforms, so this # option allows the strict behavior to be used on Macs and the lenient # behavior to be used elsewhere. # TODO(mark): Remove this option when http://crbug.com/35878 is fixed. parser.add_option('--no-circular-check', dest='circular_check', action='store_false', default=True, regenerate=False, help="don't check for circular relationships between files") # --no-duplicate-basename-check disables the check for duplicate basenames # in a static_library/shared_library project. Visual C++ 2008 generator # doesn't support this configuration. Libtool on Mac also generates warnings # when duplicate basenames are passed into Make generator on Mac. # TODO(yukawa): Remove this option when these legacy generators are # deprecated. 
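The regeneration metadata recorded by the parser above can be inspected directly. A small sketch (the -D option mirrors the real one registered in gyp_main; the inputs are made up):

import gyp

parser = gyp.RegeneratableOptionParser()
# env_name ties the flag to GYP_DEFINES so RegenerateFlags can reproduce
# environment-supplied values on the regenerated command line.
parser.add_option('-D', dest='defines', action='append',
                  env_name='GYP_DEFINES', help='sets variable VAR to value VAL')
options, args = parser.parse_args(['-D', 'OS=linux'])
print options.defines                                        # ['OS=linux']
print options._regeneration_metadata['defines']['env_name']  # 'GYP_DEFINES'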
parser.add_option('--no-duplicate-basename-check', dest='duplicate_basename_check', action='store_false', default=True, regenerate=False, help="don't check for duplicate basenames") parser.add_option('--no-parallel', action='store_true', default=False, help='Disable multiprocessing') parser.add_option('-S', '--suffix', dest='suffix', default='', help='suffix to add to generated files') parser.add_option('--toplevel-dir', dest='toplevel_dir', action='store', default=None, metavar='DIR', type='path', help='directory to use as the root of the source tree') parser.add_option('-R', '--root-target', dest='root_targets', action='append', metavar='TARGET', help='include only TARGET and its deep dependencies') options, build_files_arg = parser.parse_args(args) build_files = build_files_arg # Set up the configuration directory (defaults to ~/.gyp) if not options.config_dir: home = None home_dot_gyp = None if options.use_environment: home_dot_gyp = os.environ.get('GYP_CONFIG_DIR', None) if home_dot_gyp: home_dot_gyp = os.path.expanduser(home_dot_gyp) if not home_dot_gyp: home_vars = ['HOME'] if sys.platform in ('cygwin', 'win32'): home_vars.append('USERPROFILE') for home_var in home_vars: home = os.getenv(home_var) if home != None: home_dot_gyp = os.path.join(home, '.gyp') if not os.path.exists(home_dot_gyp): home_dot_gyp = None else: break else: home_dot_gyp = os.path.expanduser(options.config_dir) if home_dot_gyp and not os.path.exists(home_dot_gyp): home_dot_gyp = None if not options.formats: # If no format was given on the command line, then check the env variable. generate_formats = [] if options.use_environment: generate_formats = os.environ.get('GYP_GENERATORS', []) if generate_formats: generate_formats = re.split(r'[\s,]', generate_formats) if generate_formats: options.formats = generate_formats else: # Nothing in the variable, default based on platform. if sys.platform == 'darwin': options.formats = ['xcode'] elif sys.platform in ('win32', 'cygwin'): options.formats = ['msvs'] else: options.formats = ['make'] if not options.generator_output and options.use_environment: g_o = os.environ.get('GYP_GENERATOR_OUTPUT') if g_o: options.generator_output = g_o options.parallel = not options.no_parallel for mode in options.debug: gyp.debug[mode] = 1 # Do an extra check to avoid work when we're not debugging. if DEBUG_GENERAL in gyp.debug: DebugOutput(DEBUG_GENERAL, 'running with these options:') for option, value in sorted(options.__dict__.items()): if option[0] == '_': continue if isinstance(value, basestring): DebugOutput(DEBUG_GENERAL, " %s: '%s'", option, value) else: DebugOutput(DEBUG_GENERAL, " %s: %s", option, value) if not build_files: build_files = FindBuildFiles() if not build_files: raise GypError((usage + '\n\n%s: error: no build_file') % (my_name, my_name)) # TODO(mark): Chromium-specific hack! # For Chromium, the gyp "depth" variable should always be a relative path # to Chromium's top-level "src" directory. If no depth variable was set # on the command line, try to find a "src" directory by looking at the # absolute path to each build file's directory. The first "src" component # found will be treated as though it were the path used for --depth. 
if not options.depth: for build_file in build_files: build_file_dir = os.path.abspath(os.path.dirname(build_file)) build_file_dir_components = build_file_dir.split(os.path.sep) components_len = len(build_file_dir_components) for index in xrange(components_len - 1, -1, -1): if build_file_dir_components[index] == 'src': options.depth = os.path.sep.join(build_file_dir_components) break del build_file_dir_components[index] # If the inner loop found something, break without advancing to another # build file. if options.depth: break if not options.depth: raise GypError('Could not automatically locate src directory. This is ' 'a temporary Chromium feature that will be removed. Use ' '--depth as a workaround.') # If toplevel-dir is not set, we assume that depth is the root of our source # tree. if not options.toplevel_dir: options.toplevel_dir = options.depth # -D on the command line sets variable defaults - D isn't just for define, # it's for default. Perhaps there should be a way to force (-F?) a # variable's value so that it can't be overridden by anything else. cmdline_default_variables = {} defines = [] if options.use_environment: defines += ShlexEnv('GYP_DEFINES') if options.defines: defines += options.defines cmdline_default_variables = NameValueListToDict(defines) if DEBUG_GENERAL in gyp.debug: DebugOutput(DEBUG_GENERAL, "cmdline_default_variables: %s", cmdline_default_variables) # Set up includes. includes = [] # If ~/.gyp/include.gypi exists, it'll be forcibly included into every # .gyp file that's loaded, before anything else is included. if home_dot_gyp != None: default_include = os.path.join(home_dot_gyp, 'include.gypi') if os.path.exists(default_include): print 'Using overrides found in ' + default_include includes.append(default_include) # Command-line --include files come after the default include. if options.includes: includes.extend(options.includes) # Generator flags should be prefixed with the target generator since they # are global across all generator runs. gen_flags = [] if options.use_environment: gen_flags += ShlexEnv('GYP_GENERATOR_FLAGS') if options.generator_flags: gen_flags += options.generator_flags generator_flags = NameValueListToDict(gen_flags) if DEBUG_GENERAL in gyp.debug.keys(): DebugOutput(DEBUG_GENERAL, "generator_flags: %s", generator_flags) # Generate all requested formats (use a set in case we got one format request # twice) for format in set(options.formats): params = {'options': options, 'build_files': build_files, 'generator_flags': generator_flags, 'cwd': os.getcwd(), 'build_files_arg': build_files_arg, 'gyp_binary': sys.argv[0], 'home_dot_gyp': home_dot_gyp, 'parallel': options.parallel, 'root_targets': options.root_targets, 'target_arch': cmdline_default_variables.get('target_arch', '')} # Start with the default variables from the command line. [generator, flat_list, targets, data] = Load( build_files, format, cmdline_default_variables, includes, options.depth, params, options.check, options.circular_check, options.duplicate_basename_check) # TODO(mark): Pass |data| for now because the generator needs a list of # build files that came in. In the future, maybe it should just accept # a list, and not the whole data dict. # NOTE: flat_list is the flattened dependency graph specifying the order # that targets may be built. Build systems that operate serially or that # need to have dependencies defined before dependents reference them should # generate targets in the order specified in flat_list.
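# For instance, if app depends on lib and lib depends on base, flat_list
# arrives ordered so that base precedes lib and lib precedes app; a serial
# generator that emits targets in flat_list order therefore always defines
# a dependency before anything that references it.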
generator.GenerateOutput(flat_list, targets, data, params) if options.configs: valid_configs = targets[flat_list[0]]['configurations'].keys() for conf in options.configs: if conf not in valid_configs: raise GypError('Invalid config specified via --build: %s' % conf) generator.PerformBuild(data, options.configs, params) # Done return 0 def main(args): try: return gyp_main(args) except GypError, e: sys.stderr.write("gyp: %s\n" % e) return 1 # NOTE: setuptools generated console_scripts calls function with no arguments def script_main(): return main(sys.argv[1:]) if __name__ == '__main__': sys.exit(script_main()) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/common.py0000644000000000000000000004713712631326456022264 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. from __future__ import with_statement import collections import errno import filecmp import os.path import re import tempfile import sys # A minimal memoizing decorator. It'll blow up if the args aren't immutable, # among other "problems". class memoize(object): def __init__(self, func): self.func = func self.cache = {} def __call__(self, *args): try: return self.cache[args] except KeyError: result = self.func(*args) self.cache[args] = result return result class GypError(Exception): """Error class representing an error, which is to be presented to the user. The main entry point will catch and display this. """ pass def ExceptionAppend(e, msg): """Append a message to the given exception's message.""" if not e.args: e.args = (msg,) elif len(e.args) == 1: e.args = (str(e.args[0]) + ' ' + msg,) else: e.args = (str(e.args[0]) + ' ' + msg,) + e.args[1:] def FindQualifiedTargets(target, qualified_list): """ Given a list of qualified targets, return the qualified targets for the specified |target|. """ return [t for t in qualified_list if ParseQualifiedTarget(t)[1] == target] def ParseQualifiedTarget(target): # Splits a qualified target into a build file, target name and toolset. # NOTE: rsplit is used to disambiguate the Windows drive letter separator. target_split = target.rsplit(':', 1) if len(target_split) == 2: [build_file, target] = target_split else: build_file = None target_split = target.rsplit('#', 1) if len(target_split) == 2: [target, toolset] = target_split else: toolset = None return [build_file, target, toolset] def ResolveTarget(build_file, target, toolset): # This function resolves a target into a canonical form: # - a fully defined build file, either absolute or relative to the current # directory # - a target name # - a toolset # # build_file is the file relative to which 'target' is defined. # target is the qualified target. # toolset is the default toolset for that target. [parsed_build_file, target, parsed_toolset] = ParseQualifiedTarget(target) if parsed_build_file: if build_file: # If a relative path, parsed_build_file is relative to the directory # containing build_file. If build_file is not in the current directory, # parsed_build_file is not a usable path as-is. Resolve it by # interpreting it as relative to build_file. If parsed_build_file is # absolute, it is usable as a path regardless of the current directory, # and os.path.join will return it as-is. 
build_file = os.path.normpath(os.path.join(os.path.dirname(build_file), parsed_build_file)) # Further (to handle cases like ../cwd), make it relative to cwd. if not os.path.isabs(build_file): build_file = RelativePath(build_file, '.') else: build_file = parsed_build_file if parsed_toolset: toolset = parsed_toolset return [build_file, target, toolset] def BuildFile(fully_qualified_target): # Extracts the build file from the fully qualified target. return ParseQualifiedTarget(fully_qualified_target)[0] def GetEnvironFallback(var_list, default): """Look up a key in the environment, with fallback to secondary keys and finally falling back to a default value.""" for var in var_list: if var in os.environ: return os.environ[var] return default def QualifiedTarget(build_file, target, toolset): # "Qualified" means the file that a target was defined in and the target # name, separated by a colon, suffixed by a # and the toolset name: # /path/to/file.gyp:target_name#toolset fully_qualified = build_file + ':' + target if toolset: fully_qualified = fully_qualified + '#' + toolset return fully_qualified @memoize def RelativePath(path, relative_to, follow_path_symlink=True): # Assuming both |path| and |relative_to| are relative to the current # directory, returns a relative path that identifies path relative to # relative_to. # If |follow_path_symlink| is true (default) and |path| is a symlink, then # this method returns a path to the real file represented by |path|. If it is # false, this method returns a path to the symlink. If |path| is not a # symlink, this option has no effect. # Convert to normalized (and therefore absolute) paths. if follow_path_symlink: path = os.path.realpath(path) else: path = os.path.abspath(path) relative_to = os.path.realpath(relative_to) # On Windows, we can't create a relative path to a different drive, so just # use the absolute path. if sys.platform == 'win32': if (os.path.splitdrive(path)[0].lower() != os.path.splitdrive(relative_to)[0].lower()): return path # Split the paths into components. path_split = path.split(os.path.sep) relative_to_split = relative_to.split(os.path.sep) # Determine how much of the prefix the two paths share. prefix_len = len(os.path.commonprefix([path_split, relative_to_split])) # Put enough ".." components to back up out of relative_to to the common # prefix, and then append the part of path_split after the common prefix. relative_split = [os.path.pardir] * (len(relative_to_split) - prefix_len) + \ path_split[prefix_len:] if len(relative_split) == 0: # The paths were the same. return '' # Turn it back into a string and we're done. return os.path.join(*relative_split) @memoize def InvertRelativePath(path, toplevel_dir=None): """Given a path like foo/bar that is relative to toplevel_dir, return the inverse relative path back to the toplevel_dir. E.g. os.path.normpath(os.path.join(path, InvertRelativePath(path))) should always produce the empty string, unless the path contains symlinks. """ if not path: return path toplevel_dir = '.' if toplevel_dir is None else toplevel_dir return RelativePath(toplevel_dir, os.path.join(toplevel_dir, path)) def FixIfRelativePath(path, relative_to): # Like RelativePath but returns |path| unchanged if it is absolute.
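# --- Illustrative aside (not part of the gyp source): the round-trip that
# InvertRelativePath's docstring above promises, checked with plain os.path
# arithmetic on a hypothetical path.
import os
_path = 'foo/bar'                                        # relative to the toplevel directory
_inverse = os.path.join(os.path.pardir, os.path.pardir)  # i.e. '../..'
assert os.path.normpath(os.path.join(_path, _inverse)) == '.'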
if os.path.isabs(path): return path return RelativePath(path, relative_to) def UnrelativePath(path, relative_to): # Assuming that |relative_to| is relative to the current directory, and |path| # is a path relative to the dirname of |relative_to|, returns a path that # identifies |path| relative to the current directory. rel_dir = os.path.dirname(relative_to) return os.path.normpath(os.path.join(rel_dir, path)) # re objects used by EncodePOSIXShellArgument. See IEEE 1003.1 XCU.2.2 at # http://www.opengroup.org/onlinepubs/009695399/utilities/xcu_chap02.html#tag_02_02 # and the documentation for various shells. # _quote is a pattern that should match any argument that needs to be quoted # with double-quotes by EncodePOSIXShellArgument. It matches the following # characters appearing anywhere in an argument: # \t, \n, space parameter separators # # comments # $ expansions (quoted to always expand within one argument) # % called out by IEEE 1003.1 XCU.2.2 # & job control # ' quoting # (, ) subshell execution # *, ?, [ pathname expansion # ; command delimiter # <, >, | redirection # = assignment # {, } brace expansion (bash) # ~ tilde expansion # It also matches the empty string, because "" (or '') is the only way to # represent an empty string literal argument to a POSIX shell. # # This does not match the characters in _escape, because those need to be # backslash-escaped regardless of whether they appear in a double-quoted # string. _quote = re.compile('[\t\n #$%&\'()*;<=>?[{|}~]|^$') # _escape is a pattern that should match any character that needs to be # escaped with a backslash, whether or not the argument matched the _quote # pattern. _escape is used with re.sub to backslash anything in _escape's # first match group, hence the (parentheses) in the regular expression. # # _escape matches the following characters appearing anywhere in an argument: # " to prevent POSIX shells from interpreting this character for quoting # \ to prevent POSIX shells from interpreting this character for escaping # ` to prevent POSIX shells from interpreting this character for command # substitution # Missing from this list is $, because the desired behavior of # EncodePOSIXShellArgument is to permit parameter (variable) expansion. # # Also missing from this list is !, which bash will interpret as the history # expansion character when history is enabled. bash does not enable history # by default in non-interactive shells, so this is not thought to be a problem. # ! was omitted from this list because bash interprets "\!" as a literal string # including the backslash character (avoiding history expansion but retaining # the backslash), which would not be correct for argument encoding. Handling # this case properly would also be problematic because bash allows the history # character to be changed with the histchars shell variable. Fortunately, # as history is not enabled in non-interactive shells and # EncodePOSIXShellArgument is only expected to encode for non-interactive # shells, there is no room for error here by ignoring !. _escape = re.compile(r'(["\\`])') def EncodePOSIXShellArgument(argument): """Encodes |argument| suitably for consumption by POSIX shells. argument may be quoted and escaped as necessary to ensure that POSIX shells treat the returned value as a literal representing the argument passed to this function. Parameter (variable) expansions beginning with $ are allowed to remain intact without escaping the $, to allow the argument to contain references to variables to be expanded by the shell. 
""" if not isinstance(argument, str): argument = str(argument) if _quote.search(argument): quote = '"' else: quote = '' encoded = quote + re.sub(_escape, r'\\\1', argument) + quote return encoded def EncodePOSIXShellList(list): """Encodes |list| suitably for consumption by POSIX shells. Returns EncodePOSIXShellArgument for each item in list, and joins them together using the space character as an argument separator. """ encoded_arguments = [] for argument in list: encoded_arguments.append(EncodePOSIXShellArgument(argument)) return ' '.join(encoded_arguments) def DeepDependencyTargets(target_dicts, roots): """Returns the recursive list of target dependencies.""" dependencies = set() pending = set(roots) while pending: # Pluck out one. r = pending.pop() # Skip if visited already. if r in dependencies: continue # Add it. dependencies.add(r) # Add its children. spec = target_dicts[r] pending.update(set(spec.get('dependencies', []))) pending.update(set(spec.get('dependencies_original', []))) return list(dependencies - set(roots)) def BuildFileTargets(target_list, build_file): """From a target_list, returns the subset from the specified build_file. """ return [p for p in target_list if BuildFile(p) == build_file] def AllTargets(target_list, target_dicts, build_file): """Returns all targets (direct and dependencies) for the specified build_file. """ bftargets = BuildFileTargets(target_list, build_file) deptargets = DeepDependencyTargets(target_dicts, bftargets) return bftargets + deptargets def WriteOnDiff(filename): """Write to a file only if the new contents differ. Arguments: filename: name of the file to potentially write to. Returns: A file like object which will write to temporary file and only overwrite the target if it differs (on close). """ class Writer(object): """Wrapper around file which only covers the target if it differs.""" def __init__(self): # Pick temporary file. tmp_fd, self.tmp_path = tempfile.mkstemp( suffix='.tmp', prefix=os.path.split(filename)[1] + '.gyp.', dir=os.path.split(filename)[0]) try: self.tmp_file = os.fdopen(tmp_fd, 'wb') except Exception: # Don't leave turds behind. os.unlink(self.tmp_path) raise def __getattr__(self, attrname): # Delegate everything else to self.tmp_file return getattr(self.tmp_file, attrname) def close(self): try: # Close tmp file. self.tmp_file.close() # Determine if different. same = False try: same = filecmp.cmp(self.tmp_path, filename, False) except OSError, e: if e.errno != errno.ENOENT: raise if same: # The new file is identical to the old one, just get rid of the new # one. os.unlink(self.tmp_path) else: # The new file is different from the old one, or there is no old one. # Rename the new file to the permanent name. # # tempfile.mkstemp uses an overly restrictive mode, resulting in a # file that can only be read by the owner, regardless of the umask. # There's no reason to not respect the umask here, which means that # an extra hoop is required to fetch it and reset the new file's mode. # # No way to get the umask without setting a new one? Set a safe one # and then set it back to the old value. umask = os.umask(077) os.umask(umask) os.chmod(self.tmp_path, 0666 & ~umask) if sys.platform == 'win32' and os.path.exists(filename): # NOTE: on windows (but not cygwin) rename will not replace an # existing file, so it must be preceded with a remove. Sadly there # is no way to make the switch atomic. os.remove(filename) os.rename(self.tmp_path, filename) except Exception: # Don't leave turds behind. 
os.unlink(self.tmp_path) raise return Writer() def EnsureDirExists(path): """Make sure the directory for |path| exists.""" try: os.makedirs(os.path.dirname(path)) except OSError: pass def GetFlavor(params): """Returns |params.flavor| if it's set, the system's default flavor else.""" flavors = { 'cygwin': 'win', 'win32': 'win', 'darwin': 'mac', } if 'flavor' in params: return params['flavor'] if sys.platform in flavors: return flavors[sys.platform] if sys.platform.startswith('sunos'): return 'solaris' if sys.platform.startswith('freebsd'): return 'freebsd' if sys.platform.startswith('openbsd'): return 'openbsd' if sys.platform.startswith('netbsd'): return 'netbsd' if sys.platform.startswith('aix'): return 'aix' return 'linux' def CopyTool(flavor, out_path): """Finds (flock|mac|win)_tool.gyp in the gyp directory and copies it to |out_path|.""" # aix and solaris just need flock emulation. mac and win use more complicated # support scripts. prefix = { 'aix': 'flock', 'solaris': 'flock', 'mac': 'mac', 'win': 'win' }.get(flavor, None) if not prefix: return # Slurp input file. source_path = os.path.join( os.path.dirname(os.path.abspath(__file__)), '%s_tool.py' % prefix) with open(source_path) as source_file: source = source_file.readlines() # Add header and write it out. tool_path = os.path.join(out_path, 'gyp-%s-tool' % prefix) with open(tool_path, 'w') as tool_file: tool_file.write( ''.join([source[0], '# Generated by gyp. Do not edit.\n'] + source[1:])) # Make file executable. os.chmod(tool_path, 0755) # From Alex Martelli, # http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/52560 # ASPN: Python Cookbook: Remove duplicates from a sequence # First comment, dated 2001/10/13. # (Also in the printed Python Cookbook.) def uniquer(seq, idfun=None): if idfun is None: idfun = lambda x: x seen = {} result = [] for item in seq: marker = idfun(item) if marker in seen: continue seen[marker] = 1 result.append(item) return result # Based on http://code.activestate.com/recipes/576694/. class OrderedSet(collections.MutableSet): def __init__(self, iterable=None): self.end = end = [] end += [None, end, end] # sentinel node for doubly linked list self.map = {} # key --> [key, prev, next] if iterable is not None: self |= iterable def __len__(self): return len(self.map) def __contains__(self, key): return key in self.map def add(self, key): if key not in self.map: end = self.end curr = end[1] curr[2] = end[1] = self.map[key] = [key, curr, end] def discard(self, key): if key in self.map: key, prev_item, next_item = self.map.pop(key) prev_item[2] = next_item next_item[1] = prev_item def __iter__(self): end = self.end curr = end[2] while curr is not end: yield curr[0] curr = curr[2] def __reversed__(self): end = self.end curr = end[1] while curr is not end: yield curr[0] curr = curr[1] # The second argument is an addition that causes a pylint warning. def pop(self, last=True): # pylint: disable=W0221 if not self: raise KeyError('set is empty') key = self.end[1][0] if last else self.end[2][0] self.discard(key) return key def __repr__(self): if not self: return '%s()' % (self.__class__.__name__,) return '%s(%r)' % (self.__class__.__name__, list(self)) def __eq__(self, other): if isinstance(other, OrderedSet): return len(self) == len(other) and list(self) == list(other) return set(self) == set(other) # Extensions to the recipe. 
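# --- Illustrative aside (not part of the gyp source): expected OrderedSet
# behaviour, doctest-style (kept in a comment because this point sits inside
# the class body):
#   >>> s = OrderedSet(['b', 'a', 'b'])
#   >>> s.add('c')
#   >>> list(s)
#   ['b', 'a', 'c']
#   >>> s.pop(), s.pop(last=False)
#   ('c', 'b')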
def update(self, iterable): for i in iterable: if i not in self: self.add(i) class CycleError(Exception): """An exception raised when an unexpected cycle is detected.""" def __init__(self, nodes): self.nodes = nodes def __str__(self): return 'CycleError: cycle involving: ' + str(self.nodes) def TopologicallySorted(graph, get_edges): r"""Topologically sort based on a user provided edge definition. Args: graph: A list of node names. get_edges: A function mapping from node name to a hashable collection of node names which this node has outgoing edges to. Returns: A list containing all of the nodes in graph in topological order. It is assumed that calling get_edges once for each node and caching is cheaper than repeatedly calling get_edges. Raises: CycleError in the event of a cycle. Example: graph = {'a': '$(b) $(c)', 'b': 'hi', 'c': '$(b)'} def GetEdges(node): return re.findall(r'\$\(([^)]+)\)', graph[node]) print TopologicallySorted(graph.keys(), GetEdges) ==> ['a', 'c', 'b'] """ get_edges = memoize(get_edges) visited = set() visiting = set() ordered_nodes = [] def Visit(node): if node in visiting: raise CycleError(visiting) if node in visited: return visited.add(node) visiting.add(node) for neighbor in get_edges(node): Visit(neighbor) visiting.remove(node) ordered_nodes.insert(0, node) for node in sorted(graph): Visit(node) return ordered_nodes def CrossCompileRequested(): # TODO: figure out how to not build extra host objects in the # non-cross-compile case when this is enabled, and enable unconditionally. return (os.environ.get('GYP_CROSSCOMPILE') or os.environ.get('AR_host') or os.environ.get('CC_host') or os.environ.get('CXX_host') or os.environ.get('AR_target') or os.environ.get('CC_target') or os.environ.get('CXX_target')) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/common_test.py0000755000000000000000000000366212631326456023311 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file.
"""Unit tests for the common.py file.""" import gyp.common import unittest import sys class TestTopologicallySorted(unittest.TestCase): def test_Valid(self): """Test that sorting works on a valid graph with one possible order.""" graph = { 'a': ['b', 'c'], 'b': [], 'c': ['d'], 'd': ['b'], } def GetEdge(node): return tuple(graph[node]) self.assertEqual( gyp.common.TopologicallySorted(graph.keys(), GetEdge), ['a', 'c', 'd', 'b']) def test_Cycle(self): """Test that an exception is thrown on a cyclic graph.""" graph = { 'a': ['b'], 'b': ['c'], 'c': ['d'], 'd': ['a'], } def GetEdge(node): return tuple(graph[node]) self.assertRaises( gyp.common.CycleError, gyp.common.TopologicallySorted, graph.keys(), GetEdge) class TestGetFlavor(unittest.TestCase): """Test that gyp.common.GetFlavor works as intended""" original_platform = '' def setUp(self): self.original_platform = sys.platform def tearDown(self): sys.platform = self.original_platform def assertFlavor(self, expected, argument, param): sys.platform = argument self.assertEqual(expected, gyp.common.GetFlavor(param)) def test_platform_default(self): self.assertFlavor('freebsd', 'freebsd9' , {}) self.assertFlavor('freebsd', 'freebsd10', {}) self.assertFlavor('openbsd', 'openbsd5' , {}) self.assertFlavor('solaris', 'sunos5' , {}); self.assertFlavor('solaris', 'sunos' , {}); self.assertFlavor('linux' , 'linux2' , {}); self.assertFlavor('linux' , 'linux3' , {}); def test_param(self): self.assertFlavor('foobar', 'linux2' , {'flavor': 'foobar'}) if __name__ == '__main__': unittest.main() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/easy_xml.py0000644000000000000000000001152112631326456022601 0ustar 00000000000000# Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import re import os def XmlToString(content, encoding='utf-8', pretty=False): """ Writes the XML content to disk, touching the file only if it has changed. Visual Studio files have a lot of pre-defined structures. This function makes it easy to represent these structures as Python data structures, instead of having to create a lot of function calls. Each XML element of the content is represented as a list composed of: 1. The name of the element, a string, 2. The attributes of the element, a dictionary (optional), and 3+. The content of the element, if any. Strings are simple text nodes and lists are child elements. Example 1: becomes ['test'] Example 2: This is it! becomes ['myelement', {'a':'value1', 'b':'value2'}, ['childtype', 'This is'], ['childtype', 'it!'], ] Args: content: The structured content to be converted. encoding: The encoding to report on the first XML line. pretty: True if we want pretty printing with indents and new lines. Returns: The XML content as a string. """ # We create a huge list of all the elements of the file. xml_parts = ['' % encoding] if pretty: xml_parts.append('\n') _ConstructContentList(xml_parts, content, pretty) # Convert it to a string return ''.join(xml_parts) def _ConstructContentList(xml_parts, specification, pretty, level=0): """ Appends the XML parts corresponding to the specification. Args: xml_parts: A list of XML parts to be appended to. specification: The specification of the element. See EasyXml docs. pretty: True if we want pretty printing with indents and new lines. level: Indentation level. """ # The first item in a specification is the name of the element. 
if pretty: indentation = '  ' * level new_line = '\n' else: indentation = '' new_line = '' name = specification[0] if not isinstance(name, str): raise Exception('The first item of an EasyXml specification should be ' 'a string. Specification was ' + str(specification)) xml_parts.append(indentation + '<' + name) # Optionally in second position is a dictionary of the attributes. rest = specification[1:] if rest and isinstance(rest[0], dict): for at, val in sorted(rest[0].iteritems()): xml_parts.append(' %s="%s"' % (at, _XmlEscape(val, attr=True))) rest = rest[1:] if rest: xml_parts.append('>') all_strings = reduce(lambda x, y: x and isinstance(y, str), rest, True) multi_line = not all_strings if multi_line and new_line: xml_parts.append(new_line) for child_spec in rest: # If it's a string, append a text node. # Otherwise recurse over that child definition if isinstance(child_spec, str): xml_parts.append(_XmlEscape(child_spec)) else: _ConstructContentList(xml_parts, child_spec, pretty, level + 1) if multi_line and indentation: xml_parts.append(indentation) xml_parts.append('</%s>%s' % (name, new_line)) else: xml_parts.append('/>%s' % new_line) def WriteXmlIfChanged(content, path, encoding='utf-8', pretty=False, win32=False): """ Writes the XML content to disk, touching the file only if it has changed. Args: content: The structured content to be written. path: Location of the file. encoding: The encoding to report on the first line of the XML file. pretty: True if we want pretty printing with indents and new lines. """ xml_string = XmlToString(content, encoding, pretty) if win32 and os.linesep != '\r\n': xml_string = xml_string.replace('\n', '\r\n') try: xml_string = xml_string.encode(encoding) except Exception: xml_string = unicode(xml_string, 'latin-1').encode(encoding) # Get the old content try: f = open(path, 'r') existing = f.read() f.close() except: existing = None # It has changed, write it if existing != xml_string: f = open(path, 'w') f.write(xml_string) f.close() _xml_escape_map = { '"': '&quot;', "'": '&apos;', '<': '&lt;', '>': '&gt;', '&': '&amp;', '\n': '&#xA;', '\r': '&#xD;', } _xml_escape_re = re.compile( "(%s)" % "|".join(map(re.escape, _xml_escape_map.keys()))) def _XmlEscape(value, attr=False): """ Escape a string for inclusion in XML.""" def replace(match): m = match.string[match.start() : match.end()] # don't replace single quotes in attrs if attr and m == "'": return m return _xml_escape_map[m] return _xml_escape_re.sub(replace, value) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/easy_xml_test.py0000755000000000000000000000630612631326456023650 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ Unit tests for the easy_xml.py file.
""" import gyp.easy_xml as easy_xml import unittest import StringIO class TestSequenceFunctions(unittest.TestCase): def setUp(self): self.stderr = StringIO.StringIO() def test_EasyXml_simple(self): self.assertEqual( easy_xml.XmlToString(['test']), '') self.assertEqual( easy_xml.XmlToString(['test'], encoding='Windows-1252'), '') def test_EasyXml_simple_with_attributes(self): self.assertEqual( easy_xml.XmlToString(['test2', {'a': 'value1', 'b': 'value2'}]), '') def test_EasyXml_escaping(self): original = '\'"\r&\nfoo' converted = '<test>\'" & foo' converted_apos = converted.replace("'", ''') self.assertEqual( easy_xml.XmlToString(['test3', {'a': original}, original]), '%s' % (converted, converted_apos)) def test_EasyXml_pretty(self): self.assertEqual( easy_xml.XmlToString( ['test3', ['GrandParent', ['Parent1', ['Child'] ], ['Parent2'] ] ], pretty=True), '\n' '\n' ' \n' ' \n' ' \n' ' \n' ' \n' ' \n' '\n') def test_EasyXml_complex(self): # We want to create: target = ( '' '' '' '{D2250C20-3A94-4FB9-AF73-11BC5B73884B}' 'Win32Proj' 'automated_ui_tests' '' '' '' 'Application' 'Unicode' '' '') xml = easy_xml.XmlToString( ['Project', ['PropertyGroup', {'Label': 'Globals'}, ['ProjectGuid', '{D2250C20-3A94-4FB9-AF73-11BC5B73884B}'], ['Keyword', 'Win32Proj'], ['RootNamespace', 'automated_ui_tests'] ], ['Import', {'Project': '$(VCTargetsPath)\\Microsoft.Cpp.props'}], ['PropertyGroup', {'Condition': "'$(Configuration)|$(Platform)'=='Debug|Win32'", 'Label': 'Configuration'}, ['ConfigurationType', 'Application'], ['CharacterSet', 'Unicode'] ] ]) self.assertEqual(xml, target) if __name__ == '__main__': unittest.main() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/flock_tool.py0000755000000000000000000000332412631326456023120 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """These functions are executed via gyp-flock-tool when using the Makefile generator. Used on systems that don't have a built-in flock.""" import fcntl import os import struct import subprocess import sys def main(args): executor = FlockTool() executor.Dispatch(args) class FlockTool(object): """This class emulates the 'flock' command.""" def Dispatch(self, args): """Dispatches a string command to a method.""" if len(args) < 1: raise Exception("Not enough arguments") method = "Exec%s" % self._CommandifyName(args[0]) getattr(self, method)(*args[1:]) def _CommandifyName(self, name_string): """Transforms a tool name like copy-info-plist to CopyInfoPlist""" return name_string.title().replace('-', '') def ExecFlock(self, lockfile, *cmd_list): """Emulates the most basic behavior of Linux's flock(1).""" # Rely on exception handling to report errors. # Note that the stock python on SunOS has a bug # where fcntl.flock(fd, LOCK_EX) always fails # with EBADF, that's why we use this F_SETLK # hack instead. fd = os.open(lockfile, os.O_WRONLY|os.O_NOCTTY|os.O_CREAT, 0666) if sys.platform.startswith('aix'): # Python on AIX is compiled with LARGEFILE support, which changes the # struct size. 
op = struct.pack('hhIllqq', fcntl.F_WRLCK, 0, 0, 0, 0, 0, 0) else: op = struct.pack('hhllhhl', fcntl.F_WRLCK, 0, 0, 0, 0, 0, 0) fcntl.fcntl(fd, fcntl.F_SETLK, op) return subprocess.call(cmd_list) if __name__ == '__main__': sys.exit(main(sys.argv[1:])) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/0000755000000000000000000000000012631326456022374 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/input.py0000644000000000000000000034225012631326456022125 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. from compiler.ast import Const from compiler.ast import Dict from compiler.ast import Discard from compiler.ast import List from compiler.ast import Module from compiler.ast import Node from compiler.ast import Stmt import compiler import gyp.common import gyp.simple_copy import multiprocessing import optparse import os.path import re import shlex import signal import subprocess import sys import threading import time import traceback from gyp.common import GypError from gyp.common import OrderedSet # A list of types that are treated as linkable. linkable_types = [ 'executable', 'shared_library', 'loadable_module', 'mac_kernel_extension', ] # A list of sections that contain links to other targets. dependency_sections = ['dependencies', 'export_dependent_settings'] # base_path_sections is a list of sections defined by GYP that contain # pathnames. The generators can provide more keys; the two lists are merged # into path_sections, but you should call IsPathSection instead of using either # list directly. base_path_sections = [ 'destination', 'files', 'include_dirs', 'inputs', 'libraries', 'outputs', 'sources', ] path_sections = set() # These per-process dictionaries are used to cache build file data when loading # in parallel mode. per_process_data = {} per_process_aux_data = {} def IsPathSection(section): # If section ends in one of the '=+?!' characters, it's applied to a section # without the trailing characters. '/' is notably absent from this list, # because there's no way for a regular expression to be treated as a path. while section and section[-1:] in '=+?!': section = section[:-1] if section in path_sections: return True # Sections matching the regexp '_(dir|file|path)s?$' are also # considered PathSections. Using manual string matching since that # is much faster than the regexp and this can be called hundreds of # thousands of times so micro performance matters. if "_" in section: tail = section[-6:] if tail[-1] == 's': tail = tail[:-1] if tail[-5:] in ('_file', '_path'): return True return tail[-4:] == '_dir' return False # base_non_configuration_keys is a list of key names that belong in the target # itself and should not be propagated into its configurations. It is merged # with a list that can come from the generator to # create non_configuration_keys. base_non_configuration_keys = [ # Sections that must exist inside targets and not configurations. 'actions', 'configurations', 'copies', 'default_configuration', 'dependencies', 'dependencies_original', 'libraries', 'postbuilds', 'product_dir', 'product_extension', 'product_name', 'product_prefix', 'rules', 'run_as', 'sources', 'standalone_static_library', 'suppress_wildcard', 'target_name', 'toolset', 'toolsets', 'type', # Sections that can be found inside targets or configurations, but that # should not be propagated from targets into their configurations.
'variables', ] non_configuration_keys = [] # Keys that do not belong inside a configuration dictionary. invalid_configuration_keys = [ 'actions', 'all_dependent_settings', 'configurations', 'dependencies', 'direct_dependent_settings', 'libraries', 'link_settings', 'sources', 'standalone_static_library', 'target_name', 'type', ] # Controls whether or not the generator supports multiple toolsets. multiple_toolsets = False # Paths for converting filelist paths to output paths: { # toplevel, # qualified_output_dir, # } generator_filelist_paths = None def GetIncludedBuildFiles(build_file_path, aux_data, included=None): """Return a list of all build files included into build_file_path. The returned list will contain build_file_path as well as all other files that it included, either directly or indirectly. Note that the list may contain files that were included into a conditional section that evaluated to false and was not merged into build_file_path's dict. aux_data is a dict containing a key for each build file or included build file. Those keys provide access to dicts whose "included" keys contain lists of all other files included by the build file. included should be left at its default None value by external callers. It is used for recursion. The returned list will not contain any duplicate entries. Each build file in the list will be relative to the current directory. """ if included == None: included = [] if build_file_path in included: return included included.append(build_file_path) for included_build_file in aux_data[build_file_path].get('included', []): GetIncludedBuildFiles(included_build_file, aux_data, included) return included def CheckedEval(file_contents): """Return the eval of a gyp file. The gyp file is restricted to dictionaries and lists only, and repeated keys are not allowed. Note that this is slower than eval() is. """ ast = compiler.parse(file_contents) assert isinstance(ast, Module) c1 = ast.getChildren() assert c1[0] is None assert isinstance(c1[1], Stmt) c2 = c1[1].getChildren() assert isinstance(c2[0], Discard) c3 = c2[0].getChildren() assert len(c3) == 1 return CheckNode(c3[0], []) def CheckNode(node, keypath): if isinstance(node, Dict): c = node.getChildren() dict = {} for n in range(0, len(c), 2): assert isinstance(c[n], Const) key = c[n].getChildren()[0] if key in dict: raise GypError("Key '" + key + "' repeated at level " + repr(len(keypath) + 1) + " with key path '" + '.'.join(keypath) + "'") kp = list(keypath) # Make a copy of the list for descending this node. kp.append(key) dict[key] = CheckNode(c[n + 1], kp) return dict elif isinstance(node, List): c = node.getChildren() children = [] for index, child in enumerate(c): kp = list(keypath) # Copy list. 
kp.append(repr(index)) children.append(CheckNode(child, kp)) return children elif isinstance(node, Const): return node.getChildren()[0] else: raise TypeError("Unknown AST node at key path '" + '.'.join(keypath) + "': " + repr(node)) def LoadOneBuildFile(build_file_path, data, aux_data, includes, is_target, check): if build_file_path in data: return data[build_file_path] if os.path.exists(build_file_path): build_file_contents = open(build_file_path).read() else: raise GypError("%s not found (cwd: %s)" % (build_file_path, os.getcwd())) build_file_data = None try: if check: build_file_data = CheckedEval(build_file_contents) else: build_file_data = eval(build_file_contents, {'__builtins__': None}, None) except SyntaxError, e: e.filename = build_file_path raise except Exception, e: gyp.common.ExceptionAppend(e, 'while reading ' + build_file_path) raise if type(build_file_data) is not dict: raise GypError("%s does not evaluate to a dictionary." % build_file_path) data[build_file_path] = build_file_data aux_data[build_file_path] = {} # Scan for includes and merge them in. if ('skip_includes' not in build_file_data or not build_file_data['skip_includes']): try: if is_target: LoadBuildFileIncludesIntoDict(build_file_data, build_file_path, data, aux_data, includes, check) else: LoadBuildFileIncludesIntoDict(build_file_data, build_file_path, data, aux_data, None, check) except Exception, e: gyp.common.ExceptionAppend(e, 'while reading includes of ' + build_file_path) raise return build_file_data def LoadBuildFileIncludesIntoDict(subdict, subdict_path, data, aux_data, includes, check): includes_list = [] if includes != None: includes_list.extend(includes) if 'includes' in subdict: for include in subdict['includes']: # "include" is specified relative to subdict_path, so compute the real # path to include by appending the provided "include" to the directory # in which subdict_path resides. relative_include = \ os.path.normpath(os.path.join(os.path.dirname(subdict_path), include)) includes_list.append(relative_include) # Unhook the includes list, it's no longer needed. del subdict['includes'] # Merge in the included files. for include in includes_list: if not 'included' in aux_data[subdict_path]: aux_data[subdict_path]['included'] = [] aux_data[subdict_path]['included'].append(include) gyp.DebugOutput(gyp.DEBUG_INCLUDES, "Loading Included File: '%s'", include) MergeDicts(subdict, LoadOneBuildFile(include, data, aux_data, None, False, check), subdict_path, include) # Recurse into subdictionaries. for k, v in subdict.iteritems(): if type(v) is dict: LoadBuildFileIncludesIntoDict(v, subdict_path, data, aux_data, None, check) elif type(v) is list: LoadBuildFileIncludesIntoList(v, subdict_path, data, aux_data, check) # This recurses into lists so that it can look for dicts. def LoadBuildFileIncludesIntoList(sublist, sublist_path, data, aux_data, check): for item in sublist: if type(item) is dict: LoadBuildFileIncludesIntoDict(item, sublist_path, data, aux_data, None, check) elif type(item) is list: LoadBuildFileIncludesIntoList(item, sublist_path, data, aux_data, check) # Processes toolsets in all the targets. This recurses into condition entries # since they can contain toolsets as well. def ProcessToolsetsInDict(data): if 'targets' in data: target_list = data['targets'] new_target_list = [] for target in target_list: # If this target already has an explicit 'toolset', and no 'toolsets' # list, don't modify it further. 
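# --- Illustrative aside (not part of the gyp source): with multiple_toolsets
# enabled, the expansion below turns one declaration into one copy per
# toolset, e.g.
#   {'target_name': 'foo', 'toolsets': ['target', 'host']}
# becomes (copies for toolsets[1:] are appended first)
#   {'target_name': 'foo', 'toolset': 'host'}
#   {'target_name': 'foo', 'toolset': 'target'}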
if 'toolset' in target and 'toolsets' not in target: new_target_list.append(target) continue if multiple_toolsets: toolsets = target.get('toolsets', ['target']) else: toolsets = ['target'] # Make sure this 'toolsets' definition is only processed once. if 'toolsets' in target: del target['toolsets'] if len(toolsets) > 0: # Optimization: only do copies if more than one toolset is specified. for build in toolsets[1:]: new_target = gyp.simple_copy.deepcopy(target) new_target['toolset'] = build new_target_list.append(new_target) target['toolset'] = toolsets[0] new_target_list.append(target) data['targets'] = new_target_list if 'conditions' in data: for condition in data['conditions']: if type(condition) is list: for condition_dict in condition[1:]: if type(condition_dict) is dict: ProcessToolsetsInDict(condition_dict) # TODO(mark): I don't love this name. It just means that it's going to load # a build file that contains targets and is expected to provide a targets dict # that contains the targets... def LoadTargetBuildFile(build_file_path, data, aux_data, variables, includes, depth, check, load_dependencies): # If depth is set, predefine the DEPTH variable to be a relative path from # this build file's directory to the directory identified by depth. if depth: # TODO(dglazkov) The backslash/forward-slash replacement at the end is a # temporary measure. This should really be addressed by keeping all paths # in POSIX until actual project generation. d = gyp.common.RelativePath(depth, os.path.dirname(build_file_path)) if d == '': variables['DEPTH'] = '.' else: variables['DEPTH'] = d.replace('\\', '/') # The 'target_build_files' key is only set when loading target build files in # the non-parallel code path, where LoadTargetBuildFile is called # recursively. In the parallel code path, we don't need to check whether the # |build_file_path| has already been loaded, because the 'scheduled' set in # ParallelState guarantees that we never load the same |build_file_path| # twice. if 'target_build_files' in data: if build_file_path in data['target_build_files']: # Already loaded. return False data['target_build_files'].add(build_file_path) gyp.DebugOutput(gyp.DEBUG_INCLUDES, "Loading Target Build File '%s'", build_file_path) build_file_data = LoadOneBuildFile(build_file_path, data, aux_data, includes, True, check) # Store DEPTH for later use in generators. build_file_data['_DEPTH'] = depth # Set up the included_files key indicating which .gyp files contributed to # this target dict. if 'included_files' in build_file_data: raise GypError(build_file_path + ' must not contain included_files key') included = GetIncludedBuildFiles(build_file_path, aux_data) build_file_data['included_files'] = [] for included_file in included: # included_file is relative to the current directory, but it needs to # be made relative to build_file_path's directory. included_relative = \ gyp.common.RelativePath(included_file, os.path.dirname(build_file_path)) build_file_data['included_files'].append(included_relative) # Do a first round of toolsets expansion so that conditions can be defined # per toolset. ProcessToolsetsInDict(build_file_data) # Apply "pre"/"early" variable expansions and condition evaluations. ProcessVariablesAndConditionsInDict( build_file_data, PHASE_EARLY, variables, build_file_path) # Since some toolsets might have been defined conditionally, perform # a second round of toolsets expansion now. 
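# --- Illustrative aside (not part of the gyp source): sketch of how the
# target_defaults merge below is meant to behave; each target is merged on
# top of a deep copy of the defaults, and list values are concatenated by
# MergeDicts. The sample dicts are hypothetical.
_defaults = {'type': 'static_library', 'defines': ['COMMON']}
_target = {'target_name': 'foo', 'defines': ['FOO']}
# Expected merged result: {'type': 'static_library',
#                          'target_name': 'foo',
#                          'defines': ['COMMON', 'FOO']}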
ProcessToolsetsInDict(build_file_data) # Look at each project's target_defaults dict, and merge settings into # targets. if 'target_defaults' in build_file_data: if 'targets' not in build_file_data: raise GypError("Unable to find targets in build file %s" % build_file_path) index = 0 while index < len(build_file_data['targets']): # This procedure needs to give the impression that target_defaults is # used as defaults, and the individual targets inherit from that. # The individual targets need to be merged into the defaults. Make # a deep copy of the defaults for each target, merge the target dict # as found in the input file into that copy, and then hook up the # copy with the target-specific data merged into it as the replacement # target dict. old_target_dict = build_file_data['targets'][index] new_target_dict = gyp.simple_copy.deepcopy( build_file_data['target_defaults']) MergeDicts(new_target_dict, old_target_dict, build_file_path, build_file_path) build_file_data['targets'][index] = new_target_dict index += 1 # No longer needed. del build_file_data['target_defaults'] # Look for dependencies. This means that dependency resolution occurs # after "pre" conditionals and variable expansion, but before "post" - # in other words, you can't put a "dependencies" section inside a "post" # conditional within a target. dependencies = [] if 'targets' in build_file_data: for target_dict in build_file_data['targets']: if 'dependencies' not in target_dict: continue for dependency in target_dict['dependencies']: dependencies.append( gyp.common.ResolveTarget(build_file_path, dependency, None)[0]) if load_dependencies: for dependency in dependencies: try: LoadTargetBuildFile(dependency, data, aux_data, variables, includes, depth, check, load_dependencies) except Exception, e: gyp.common.ExceptionAppend( e, 'while loading dependencies of %s' % build_file_path) raise else: return (build_file_path, dependencies) def CallLoadTargetBuildFile(global_flags, build_file_path, variables, includes, depth, check, generator_input_info): """Wrapper around LoadTargetBuildFile for parallel processing. This wrapper is used when LoadTargetBuildFile is executed in a worker process. """ try: signal.signal(signal.SIGINT, signal.SIG_IGN) # Apply globals so that the worker process behaves the same. for key, value in global_flags.iteritems(): globals()[key] = value SetGeneratorGlobals(generator_input_info) result = LoadTargetBuildFile(build_file_path, per_process_data, per_process_aux_data, variables, includes, depth, check, False) if not result: return result (build_file_path, dependencies) = result # We can safely pop the build_file_data from per_process_data because it # will never be referenced by this process again, so we don't need to keep # it in the cache. build_file_data = per_process_data.pop(build_file_path) # This gets serialized and sent back to the main process via a pipe. # It's handled in LoadTargetBuildFileCallback. return (build_file_path, build_file_data, dependencies) except GypError, e: sys.stderr.write("gyp: %s\n" % e) return None except Exception, e: print >>sys.stderr, 'Exception:', e print >>sys.stderr, traceback.format_exc() return None class ParallelProcessingError(Exception): pass class ParallelState(object): """Class to keep track of state when processing input files in parallel. If build files are loaded in parallel, use this to keep track of state during farming out and processing parallel jobs. It's stored in a global so that the callback function can have access to it. 
""" def __init__(self): # The multiprocessing pool. self.pool = None # The condition variable used to protect this object and notify # the main loop when there might be more data to process. self.condition = None # The "data" dict that was passed to LoadTargetBuildFileParallel self.data = None # The number of parallel calls outstanding; decremented when a response # was received. self.pending = 0 # The set of all build files that have been scheduled, so we don't # schedule the same one twice. self.scheduled = set() # A list of dependency build file paths that haven't been scheduled yet. self.dependencies = [] # Flag to indicate if there was an error in a child process. self.error = False def LoadTargetBuildFileCallback(self, result): """Handle the results of running LoadTargetBuildFile in another process. """ self.condition.acquire() if not result: self.error = True self.condition.notify() self.condition.release() return (build_file_path0, build_file_data0, dependencies0) = result self.data[build_file_path0] = build_file_data0 self.data['target_build_files'].add(build_file_path0) for new_dependency in dependencies0: if new_dependency not in self.scheduled: self.scheduled.add(new_dependency) self.dependencies.append(new_dependency) self.pending -= 1 self.condition.notify() self.condition.release() def LoadTargetBuildFilesParallel(build_files, data, variables, includes, depth, check, generator_input_info): parallel_state = ParallelState() parallel_state.condition = threading.Condition() # Make copies of the build_files argument that we can modify while working. parallel_state.dependencies = list(build_files) parallel_state.scheduled = set(build_files) parallel_state.pending = 0 parallel_state.data = data try: parallel_state.condition.acquire() while parallel_state.dependencies or parallel_state.pending: if parallel_state.error: break if not parallel_state.dependencies: parallel_state.condition.wait() continue dependency = parallel_state.dependencies.pop() parallel_state.pending += 1 global_flags = { 'path_sections': globals()['path_sections'], 'non_configuration_keys': globals()['non_configuration_keys'], 'multiple_toolsets': globals()['multiple_toolsets']} if not parallel_state.pool: parallel_state.pool = multiprocessing.Pool(multiprocessing.cpu_count()) parallel_state.pool.apply_async( CallLoadTargetBuildFile, args = (global_flags, dependency, variables, includes, depth, check, generator_input_info), callback = parallel_state.LoadTargetBuildFileCallback) except KeyboardInterrupt, e: parallel_state.pool.terminate() raise e parallel_state.condition.release() parallel_state.pool.close() parallel_state.pool.join() parallel_state.pool = None if parallel_state.error: sys.exit(1) # Look for the bracket that matches the first bracket seen in a # string, and return the start and end as a tuple. For example, if # the input is something like "<(foo <(bar)) blah", then it would # return (1, 13), indicating the entire string except for the leading # "<" and trailing " blah". LBRACKETS= set('{[(') BRACKETS = {'}': '{', ']': '[', ')': '('} def FindEnclosingBracketGroup(input_str): stack = [] start = -1 for index, char in enumerate(input_str): if char in LBRACKETS: stack.append(char) if start == -1: start = index elif char in BRACKETS: if not stack: return (-1, -1) if stack.pop() != BRACKETS[char]: return (-1, -1) if not stack: return (start, index + 1) return (-1, -1) def IsStrCanonicalInt(string): """Returns True if |string| is in its canonical integer form. 
The canonical form is such that str(int(string)) == string. """ if type(string) is str: # This function is called a lot so for maximum performance, avoid # involving regexps which would otherwise make the code much # shorter. Regexps would need twice the time of this function. if string: if string == "0": return True if string[0] == "-": string = string[1:] if not string: return False if '1' <= string[0] <= '9': return string.isdigit() return False # This matches things like "<(asdf)", "<!(cmd)", "<!@(cmd)", "<|(list)". early_variable_re = re.compile( r'(?P<replace>(?P<type><(?:(?:!?@?)|\|)?)' r'(?P<command_string>[-a-zA-Z0-9_.]+)?' r'\((?P<is_array>\s*\[?)' r'(?P<content>.*?)(\]?)\))') # This matches the same as early_variable_re, but with '>' instead of '<'. late_variable_re = re.compile( r'(?P<replace>(?P<type>>(?:(?:!?@?)|\|)?)' r'(?P<command_string>[-a-zA-Z0-9_.]+)?' r'\((?P<is_array>\s*\[?)' r'(?P<content>.*?)(\]?)\))') # This matches the same as early_variable_re, but with '^' instead of '<'. latelate_variable_re = re.compile( r'(?P<replace>(?P<type>[\^](?:(?:!?@?)|\|)?)' r'(?P<command_string>[-a-zA-Z0-9_.]+)?' r'\((?P<is_array>\s*\[?)' r'(?P<content>.*?)(\]?)\))') # Global cache of results from running commands so they don't have to be run # more than once. cached_command_results = {} def FixupPlatformCommand(cmd): if sys.platform == 'win32': if type(cmd) is list: cmd = [re.sub('^cat ', 'type ', cmd[0])] + cmd[1:] else: cmd = re.sub('^cat ', 'type ', cmd) return cmd PHASE_EARLY = 0 PHASE_LATE = 1 PHASE_LATELATE = 2 def ExpandVariables(input, phase, variables, build_file): # Look for the pattern that gets expanded into variables if phase == PHASE_EARLY: variable_re = early_variable_re expansion_symbol = '<' elif phase == PHASE_LATE: variable_re = late_variable_re expansion_symbol = '>' elif phase == PHASE_LATELATE: variable_re = latelate_variable_re expansion_symbol = '^' else: assert False input_str = str(input) if IsStrCanonicalInt(input_str): return int(input_str) # Do a quick scan to determine if an expensive regex search is warranted. if expansion_symbol not in input_str: return input_str # Get the entire list of matches as a list of MatchObject instances. # (using findall here would return strings instead of MatchObjects). matches = list(variable_re.finditer(input_str)) if not matches: return input_str output = input_str # Reverse the list of matches so that replacements are done right-to-left. # That ensures that earlier replacements won't mess up the string in a # way that causes later calls to find the earlier substituted text instead # of what's intended for replacement. matches.reverse() for match_group in matches: match = match_group.groupdict() gyp.DebugOutput(gyp.DEBUG_VARIABLES, "Matches: %r", match) # match['replace'] is the substring to look for, match['type'] # is the character code for the replacement type (< > <! >! <| >| <@ # >@ <!@ >!@), match['is_array'] contains a '[' for command # arrays, and match['content'] is the name of the variable (< >) # or command to run (<! >!). match['command_string'] is an optional # command string. Currently, only 'pymod_do_main' is supported. # run_command is true if a ! variant is used. run_command = '!' in match['type'] command_string = match['command_string'] # file_list is true if a | variant is used. file_list = '|' in match['type'] # Capture these now so we can adjust them later. replace_start = match_group.start('replace') replace_end = match_group.end('replace') # Find the ending paren, and re-evaluate the contained string.
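# --- Illustrative aside (not part of the gyp source): what
# FindEnclosingBracketGroup returns for the nested expansion from its own
# comment further up, doctest-style:
#   >>> FindEnclosingBracketGroup('<(foo <(bar)) blah')
#   (1, 13)
# i.e. the slice covering '(foo <(bar))', which is then re-evaluated below.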
(c_start, c_end) = FindEnclosingBracketGroup(input_str[replace_start:]) # Adjust the replacement range to match the entire command # found by FindEnclosingBracketGroup (since the variable_re # probably doesn't match the entire command if it contained # nested variables). replace_end = replace_start + c_end # Find the "real" replacement, matching the appropriate closing # paren, and adjust the replacement start and end. replacement = input_str[replace_start:replace_end] # Figure out what the contents of the variable parens are. contents_start = replace_start + c_start + 1 contents_end = replace_end - 1 contents = input_str[contents_start:contents_end] # Do filter substitution now for <|(). # Admittedly, this is different than the evaluation order in other # contexts. However, since filtration has no chance to run on <|(), # this seems like the only obvious way to give them access to filters. if file_list: processed_variables = gyp.simple_copy.deepcopy(variables) ProcessListFiltersInDict(contents, processed_variables) # Recurse to expand variables in the contents contents = ExpandVariables(contents, phase, processed_variables, build_file) else: # Recurse to expand variables in the contents contents = ExpandVariables(contents, phase, variables, build_file) # Strip off leading/trailing whitespace so that variable matches are # simpler below (and because they are rarely needed). contents = contents.strip() # expand_to_list is true if an @ variant is used. In that case, # the expansion should result in a list. Note that the caller # is to be expecting a list in return, and not all callers do # because not all are working in list context. Also, for list # expansions, there can be no other text besides the variable # expansion in the input string. expand_to_list = '@' in match['type'] and input_str == replacement if run_command or file_list: # Find the build file's directory, so commands can be run or file lists # generated relative to it. build_file_dir = os.path.dirname(build_file) if build_file_dir == '' and not file_list: # If build_file is just a leaf filename indicating a file in the # current directory, build_file_dir might be an empty string. Set # it to None to signal to subprocess.Popen that it should run the # command in the current directory. build_file_dir = None # Support <|(listfile.txt ...) which generates a file # containing items from a gyp list, generated at gyp time. # This works around actions/rules which have more inputs than will # fit on the command line. 
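# --- Illustrative aside (not part of the gyp source): given a hypothetical
# '<|(objs.txt a.o b.o)' expansion, the file_list branch below writes 'a.o'
# and 'b.o' to objs.txt (one item per line, via WriteOnDiff) and replaces the
# whole expansion with the path to objs.txt relative to the build file.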
    if file_list:
      if type(contents) is list:
        contents_list = contents
      else:
        contents_list = contents.split(' ')
      replacement = contents_list[0]
      if os.path.isabs(replacement):
        raise GypError('| cannot handle absolute paths, got "%s"' % replacement)

      if not generator_filelist_paths:
        path = os.path.join(build_file_dir, replacement)
      else:
        if os.path.isabs(build_file_dir):
          toplevel = generator_filelist_paths['toplevel']
          rel_build_file_dir = gyp.common.RelativePath(build_file_dir, toplevel)
        else:
          rel_build_file_dir = build_file_dir
        qualified_out_dir = generator_filelist_paths['qualified_out_dir']
        path = os.path.join(qualified_out_dir, rel_build_file_dir, replacement)
        gyp.common.EnsureDirExists(path)

      replacement = gyp.common.RelativePath(path, build_file_dir)
      f = gyp.common.WriteOnDiff(path)
      for i in contents_list[1:]:
        f.write('%s\n' % i)
      f.close()

    elif run_command:
      use_shell = True
      if match['is_array']:
        contents = eval(contents)
        use_shell = False

      # Check for a cached value to avoid executing commands, or generating
      # file lists more than once. The cache key contains the command to be
      # run as well as the directory to run it from, to account for commands
      # that depend on their current directory.
      # TODO(http://code.google.com/p/gyp/issues/detail?id=111): In theory,
      # someone could author a set of GYP files where each time the command
      # is invoked it produces different output by design. When the need
      # arises, the syntax should be extended to support disabling the cache
      # for a command's output so it is run every time.
      cache_key = (str(contents), build_file_dir)
      cached_value = cached_command_results.get(cache_key, None)
      if cached_value is None:
        gyp.DebugOutput(gyp.DEBUG_VARIABLES,
                        "Executing command '%s' in directory '%s'",
                        contents, build_file_dir)

        replacement = ''

        if command_string == 'pymod_do_main':
          # <(pymod_do_main modulename param eters) loads |modulename| as a
          # python module and then calls that module's DoMain() function,
          # passing ["param", "eters"] as a single list argument. For modules
          # that don't load quickly, this can be faster than
          # <!(python modulename param eters). Do this in |build_file_dir|.
          oldwd = os.getcwd()  # Python doesn't like os.open('.'): no fchdir.
          if build_file_dir:  # build_file_dir may be None (see above).
            os.chdir(build_file_dir)
          try:
            parsed_contents = shlex.split(contents)
            try:
              py_module = __import__(parsed_contents[1])
            except ImportError as e:
              raise GypError("Error importing pymod_do_main "
                             "module (%s): %s" % (parsed_contents[1], e))
            replacement = str(py_module.DoMain(parsed_contents[2:])).rstrip()
          finally:
            os.chdir(oldwd)
          assert replacement != None
        elif command_string:
          raise GypError("Unknown command string '%s' in '%s'." %
                         (command_string, contents))
        else:
          # Fix up command with platform specific workarounds.
          contents = FixupPlatformCommand(contents)
          try:
            p = subprocess.Popen(contents, shell=use_shell,
                                 stdout=subprocess.PIPE,
                                 stderr=subprocess.PIPE,
                                 stdin=subprocess.PIPE,
                                 cwd=build_file_dir)
          except Exception, e:
            raise GypError("%s while executing command '%s' in %s" %
                           (e, contents, build_file))

          p_stdout, p_stderr = p.communicate('')

          if p.wait() != 0 or p_stderr:
            sys.stderr.write(p_stderr)
            # Simulate check_call behavior, since check_call only exists
            # in python 2.5 and later.
            raise GypError("Call to '%s' returned exit status %d while in %s." %
                           (contents, p.returncode, build_file))
          replacement = p_stdout.rstrip()

        cached_command_results[cache_key] = replacement
      else:
        gyp.DebugOutput(gyp.DEBUG_VARIABLES,
                        "Had cache value for command '%s' in directory '%s'",
                        contents, build_file_dir)
        replacement = cached_value

    else:
      if contents not in variables:
        if contents[-1] in ['!', '/']:
          # In order to allow cross-compiles to happen more naturally, we
          # will allow references to >(sources/) etc. to resolve to an
          # empty list if undefined. This allows actions to:
          # 'action!': [
          #   '>@(_sources!)',
          # ],
          # 'action/': [
          #   '>@(_sources/)',
          # ],
          replacement = []
        else:
          raise GypError('Undefined variable ' + contents +
                         ' in ' + build_file)
      else:
        replacement = variables[contents]

    if type(replacement) is list:
      for item in replacement:
        if not contents[-1] == '/' and type(item) not in (str, int):
          raise GypError('Variable ' + contents +
                         ' must expand to a string or list of strings; ' +
                         'list contains a ' +
                         item.__class__.__name__)
      # Run through the list and handle variable expansions in it. Since
      # the list is guaranteed not to contain dicts, this won't do anything
      # with conditions sections.
      ProcessVariablesAndConditionsInList(replacement, phase, variables,
                                          build_file)
    elif type(replacement) not in (str, int):
      raise GypError('Variable ' + contents +
                     ' must expand to a string or list of strings; ' +
                     'found a ' + replacement.__class__.__name__)

    if expand_to_list:
      # Expanding in list context. It's guaranteed that there's only one
      # replacement to do in |input_str| and that it's this replacement. See
      # above.
      if type(replacement) is list:
        # If it's already a list, make a copy.
        output = replacement[:]
      else:
        # Split it the same way sh would split arguments.
        output = shlex.split(str(replacement))
    else:
      # Expanding in string context.
      encoded_replacement = ''
      if type(replacement) is list:
        # When expanding a list into string context, turn the list items
        # into a string in a way that will work with a subprocess call.
        #
        # TODO(mark): This isn't completely correct. This should
        # call a generator-provided function that observes the
        # proper list-to-argument quoting rules on a specific
        # platform instead of just calling the POSIX encoding
        # routine.
        encoded_replacement = gyp.common.EncodePOSIXShellList(replacement)
      else:
        encoded_replacement = replacement

    output = output[:replace_start] + str(encoded_replacement) + \
             output[replace_end:]
    # Prepare for the next match iteration.
    input_str = output

  if output == input:
    gyp.DebugOutput(gyp.DEBUG_VARIABLES,
                    "Found only identity matches on %r, avoiding infinite "
                    "recursion.",
                    output)
  else:
    # Look for more matches now that we've replaced some, to deal with
    # expanding local variables (variables defined in the same
    # variables block as this one).
    gyp.DebugOutput(gyp.DEBUG_VARIABLES, "Found output %r, recursing.", output)
    if type(output) is list:
      if output and type(output[0]) is list:
        # Leave output alone if it's a list of lists.
        # We don't want such lists to be stringified.
        pass
      else:
        new_output = []
        for item in output:
          new_output.append(
              ExpandVariables(item, phase, variables, build_file))
        output = new_output
    else:
      output = ExpandVariables(output, phase, variables, build_file)

  # Convert all strings that are canonically-represented integers into
  # integers.
  if type(output) is list:
    for index in xrange(0, len(output)):
      if IsStrCanonicalInt(output[index]):
        output[index] = int(output[index])
  elif IsStrCanonicalInt(output):
    output = int(output)

  return output


# The same condition is often evaluated over and over again so it
# makes sense to cache as much as possible between evaluations.
cached_conditions_asts = {}


def EvalCondition(condition, conditions_key, phase, variables, build_file):
  """Returns the dict that should be used or None if the result was
  that nothing should be used."""
  if type(condition) is not list:
    raise GypError(conditions_key + ' must be a list')
  if len(condition) < 2:
    # It's possible that condition[0] won't work in which case this
    # attempt will raise its own IndexError. That's probably fine.
    raise GypError(conditions_key + ' ' + condition[0] +
                   ' must be at least length 2, not ' + str(len(condition)))

  i = 0
  result = None
  while i < len(condition):
    cond_expr = condition[i]
    true_dict = condition[i + 1]
    if type(true_dict) is not dict:
      raise GypError('{} {} must be followed by a dictionary, not {}'.format(
          conditions_key, cond_expr, type(true_dict)))
    if len(condition) > i + 2 and type(condition[i + 2]) is dict:
      false_dict = condition[i + 2]
      i = i + 3
      if i != len(condition):
        raise GypError('{} {} has {} unexpected trailing items'.format(
            conditions_key, cond_expr, len(condition) - i))
    else:
      false_dict = None
      i = i + 2
    if result == None:
      result = EvalSingleCondition(
          cond_expr, true_dict, false_dict, phase, variables, build_file)

  return result


def EvalSingleCondition(
    cond_expr, true_dict, false_dict, phase, variables, build_file):
  """Returns true_dict if cond_expr evaluates to true, and false_dict
  otherwise."""
  # Do expansions on the condition itself. Since the condition can naturally
  # contain variable references without needing to resort to GYP expansion
  # syntax, this is of dubious value for variables, but someone might want to
  # use a command expansion directly inside a condition.
  cond_expr_expanded = ExpandVariables(cond_expr, phase, variables,
                                       build_file)
  if type(cond_expr_expanded) not in (str, int):
    raise ValueError(
        'Variable expansion in this context permits str and int ' + \
        'only, found ' + cond_expr_expanded.__class__.__name__)

  try:
    if cond_expr_expanded in cached_conditions_asts:
      ast_code = cached_conditions_asts[cond_expr_expanded]
    else:
      ast_code = compile(cond_expr_expanded, '<string>', 'eval')
      cached_conditions_asts[cond_expr_expanded] = ast_code
    if eval(ast_code, {'__builtins__': None}, variables):
      return true_dict
    return false_dict
  except SyntaxError, e:
    syntax_error = SyntaxError('%s while evaluating condition \'%s\' in %s '
                               'at character %d.' %
                               (str(e.args[0]), e.text, build_file, e.offset),
                               e.filename, e.lineno, e.offset, e.text)
    raise syntax_error
  except NameError, e:
    gyp.common.ExceptionAppend(e, 'while evaluating condition \'%s\' in %s' %
                               (cond_expr_expanded, build_file))
    raise GypError(e)


def ProcessConditionsInDict(the_dict, phase, variables, build_file):
  # Process a 'conditions' or 'target_conditions' section in the_dict,
  # depending on phase.
  # early -> conditions
  # late -> target_conditions
  # latelate -> no conditions
  #
  # Each item in a conditions list consists of cond_expr, a string expression
  # evaluated as the condition, and true_dict, a dict that will be merged
  # into the_dict if cond_expr evaluates to true. Optionally, a third item,
  # false_dict, may be present. false_dict is merged into the_dict if
  # cond_expr evaluates to false.
  #
  # Any dict merged into the_dict will be recursively processed for nested
  # conditionals and other expansions, also according to phase, immediately
  # prior to being merged.

  if phase == PHASE_EARLY:
    conditions_key = 'conditions'
  elif phase == PHASE_LATE:
    conditions_key = 'target_conditions'
  elif phase == PHASE_LATELATE:
    return
  else:
    assert False

  if not conditions_key in the_dict:
    return

  conditions_list = the_dict[conditions_key]
  # Unhook the conditions list, it's no longer needed.
  del the_dict[conditions_key]

  for condition in conditions_list:
    merge_dict = EvalCondition(condition, conditions_key, phase, variables,
                               build_file)

    if merge_dict != None:
      # Expand variables and nested conditionals in the merge_dict before
      # merging it.
      ProcessVariablesAndConditionsInDict(merge_dict, phase,
                                          variables, build_file)

      MergeDicts(the_dict, merge_dict, build_file, build_file)


def LoadAutomaticVariablesFromDict(variables, the_dict):
  # Any keys with plain string values in the_dict become automatic variables.
  # The variable name is the key name with a "_" character prepended.
  for key, value in the_dict.iteritems():
    if type(value) in (str, int, list):
      variables['_' + key] = value


def LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key):
  # Any keys in the_dict's "variables" dict, if it has one, become variables.
  # The variable name is the key name in the "variables" dict. Variables
  # that end with the % character are set only if they are unset in the
  # variables dict. the_dict_key is the name of the key that accesses
  # the_dict in the_dict's parent dict. If the_dict's parent is not a dict
  # (it could be a list or it could be parentless because it is a root dict),
  # the_dict_key will be None.
  for key, value in the_dict.get('variables', {}).iteritems():
    if type(value) not in (str, int, list):
      continue

    if key.endswith('%'):
      variable_name = key[:-1]
      if variable_name in variables:
        # If the variable is already set, don't set it.
        continue
      if the_dict_key == 'variables' and variable_name in the_dict:
        # If the variable is set without a % in the_dict, and the_dict is a
        # variables dict (making |variables| a variables sub-dict of a
        # variables dict), use the_dict's definition.
        value = the_dict[variable_name]
    else:
      variable_name = key

    variables[variable_name] = value


def ProcessVariablesAndConditionsInDict(the_dict, phase, variables_in,
                                        build_file, the_dict_key=None):
  """Handle all variable and command expansion and conditional evaluation.

  This function is the public entry point for all variable expansions and
  conditional evaluations. The variables_in dictionary will not be modified
  by this function.
  """

  # Make a copy of the variables_in dict that can be modified during the
  # loading of automatics and the loading of the variables dict.
  variables = variables_in.copy()
  LoadAutomaticVariablesFromDict(variables, the_dict)

  if 'variables' in the_dict:
    # Make sure all the local variables are added to the variables
    # list before we process them so that you can reference one
    # variable from another. They will be fully expanded by recursion
    # in ExpandVariables.
    for key, value in the_dict['variables'].iteritems():
      variables[key] = value

    # Handle the associated variables dict first, so that any variable
    # references within can be resolved prior to using them as variables.
    # Pass a copy of the variables dict to avoid having it be tainted.
    # Otherwise, it would have extra automatics added for everything that
    # should just be an ordinary variable in this scope.
    ProcessVariablesAndConditionsInDict(the_dict['variables'], phase,
                                        variables, build_file, 'variables')

  LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key)

  for key, value in the_dict.iteritems():
    # Skip "variables", which was already processed if present.
    if key != 'variables' and type(value) is str:
      expanded = ExpandVariables(value, phase, variables, build_file)
      if type(expanded) not in (str, int):
        raise ValueError(
            'Variable expansion in this context permits str and int ' + \
            'only, found ' + expanded.__class__.__name__ + ' for ' + key)
      the_dict[key] = expanded

  # Variable expansion may have resulted in changes to automatics. Reload.
  # TODO(mark): Optimization: only reload if no changes were made.
  variables = variables_in.copy()
  LoadAutomaticVariablesFromDict(variables, the_dict)
  LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key)

  # Process conditions in this dict. This is done after variable expansion
  # so that conditions may take advantage of expanded variables. For example,
  # if the_dict contains:
  #   {'type':       '<(library_type)',
  #    'conditions': [['_type=="static_library"', { ... }]]},
  # _type, as used in the condition, will only be set to the value of
  # library_type if variable expansion is performed before condition
  # processing. However, condition processing should occur prior to recursion
  # so that variables (both automatic and "variables" dict type) may be
  # adjusted by conditions sections, merged into the_dict, and have the
  # intended impact on contained dicts.
  #
  # This arrangement means that a "conditions" section containing a
  # "variables" section will only have those variables effective in subdicts,
  # not in the_dict. The workaround is to put a "conditions" section within
  # a "variables" section. For example:
  #   {'conditions': [['os=="mac"', {'variables': {'define': 'IS_MAC'}}]],
  #    'defines':    ['<(define)'],
  #    'my_subdict': {'defines': ['<(define)']}},
  # will not result in "IS_MAC" being appended to the "defines" list in the
  # current scope but would result in it being appended to the "defines"
  # list within "my_subdict". By comparison:
  #   {'variables': {'conditions': [['os=="mac"', {'define': 'IS_MAC'}]]},
  #    'defines':    ['<(define)'],
  #    'my_subdict': {'defines': ['<(define)']}},
  # will append "IS_MAC" to both "defines" lists.

  # Evaluate conditions sections, allowing variable expansions within them
  # as well as nested conditionals. This will process a 'conditions' or
  # 'target_conditions' section, perform appropriate merging and recursive
  # conditional and variable processing, and then remove the conditions
  # section from the_dict if it is present.
  ProcessConditionsInDict(the_dict, phase, variables, build_file)

  # Conditional processing may have resulted in changes to automatics or the
  # variables dict. Reload.
  variables = variables_in.copy()
  LoadAutomaticVariablesFromDict(variables, the_dict)
  LoadVariablesFromVariablesDict(variables, the_dict, the_dict_key)

  # Recurse into child dicts, or process child lists which may result in
  # further recursion into descendant dicts.
  for key, value in the_dict.iteritems():
    # Skip "variables" and string values, which were already processed if
    # present.
    if key == 'variables' or type(value) is str:
      continue
    if type(value) is dict:
      # Pass a copy of the variables dict so that subdicts can't influence
      # parents.
      ProcessVariablesAndConditionsInDict(value, phase, variables,
                                          build_file, key)
    elif type(value) is list:
      # The list itself can't influence the variables dict, and
      # ProcessVariablesAndConditionsInList will make copies of the variables
      # dict if it needs to pass it to something that can influence it. No
      # copy is necessary here.
      ProcessVariablesAndConditionsInList(value, phase, variables, build_file)
    elif type(value) is not int:
      raise TypeError('Unknown type ' + value.__class__.__name__ + \
                      ' for ' + key)


def ProcessVariablesAndConditionsInList(the_list, phase, variables,
                                        build_file):
  # Iterate using an index so that new values can be assigned into the_list.
  index = 0
  while index < len(the_list):
    item = the_list[index]
    if type(item) is dict:
      # Make a copy of the variables dict so that it won't influence anything
      # outside of its own scope.
      ProcessVariablesAndConditionsInDict(item, phase, variables, build_file)
    elif type(item) is list:
      ProcessVariablesAndConditionsInList(item, phase, variables, build_file)
    elif type(item) is str:
      expanded = ExpandVariables(item, phase, variables, build_file)
      if type(expanded) in (str, int):
        the_list[index] = expanded
      elif type(expanded) is list:
        the_list[index:index+1] = expanded
        index += len(expanded)

        # index now identifies the next item to examine. Continue right now
        # without falling into the index increment below.
        continue
      else:
        raise ValueError(
            'Variable expansion in this context permits strings and ' + \
            'lists only, found ' + expanded.__class__.__name__ + ' at ' + \
            str(index))
    elif type(item) is not int:
      raise TypeError('Unknown type ' + item.__class__.__name__ + \
                      ' at index ' + str(index))
    index = index + 1


def BuildTargetsDict(data):
  """Builds a dict mapping fully-qualified target names to their target dicts.

  |data| is a dict mapping loaded build files by pathname relative to the
  current directory. Values in |data| are build file contents.
For each |data| value with a "targets" key, the value of the "targets" key is taken as a list containing target dicts. Each target's fully-qualified name is constructed from the pathname of the build file (|data| key) and its "target_name" property. These fully-qualified names are used as the keys in the returned dict. These keys provide access to the target dicts, the dicts in the "targets" lists. """ targets = {} for build_file in data['target_build_files']: for target in data[build_file].get('targets', []): target_name = gyp.common.QualifiedTarget(build_file, target['target_name'], target['toolset']) if target_name in targets: raise GypError('Duplicate target definitions for ' + target_name) targets[target_name] = target return targets def QualifyDependencies(targets): """Make dependency links fully-qualified relative to the current directory. |targets| is a dict mapping fully-qualified target names to their target dicts. For each target in this dict, keys known to contain dependency links are examined, and any dependencies referenced will be rewritten so that they are fully-qualified and relative to the current directory. All rewritten dependencies are suitable for use as keys to |targets| or a similar dict. """ all_dependency_sections = [dep + op for dep in dependency_sections for op in ('', '!', '/')] for target, target_dict in targets.iteritems(): target_build_file = gyp.common.BuildFile(target) toolset = target_dict['toolset'] for dependency_key in all_dependency_sections: dependencies = target_dict.get(dependency_key, []) for index in xrange(0, len(dependencies)): dep_file, dep_target, dep_toolset = gyp.common.ResolveTarget( target_build_file, dependencies[index], toolset) if not multiple_toolsets: # Ignore toolset specification in the dependency if it is specified. dep_toolset = toolset dependency = gyp.common.QualifiedTarget(dep_file, dep_target, dep_toolset) dependencies[index] = dependency # Make sure anything appearing in a list other than "dependencies" also # appears in the "dependencies" list. if dependency_key != 'dependencies' and \ dependency not in target_dict['dependencies']: raise GypError('Found ' + dependency + ' in ' + dependency_key + ' of ' + target + ', but not in dependencies') def ExpandWildcardDependencies(targets, data): """Expands dependencies specified as build_file:*. For each target in |targets|, examines sections containing links to other targets. If any such section contains a link of the form build_file:*, it is taken as a wildcard link, and is expanded to list each target in build_file. The |data| dict provides access to build file dicts. Any target that does not wish to be included by wildcard can provide an optional "suppress_wildcard" key in its target dict. When present and true, a wildcard dependency link will not include such targets. All dependency names, including the keys to |targets| and the values in each dependency list, must be qualified when this function is called. """ for target, target_dict in targets.iteritems(): toolset = target_dict['toolset'] target_build_file = gyp.common.BuildFile(target) for dependency_key in dependency_sections: dependencies = target_dict.get(dependency_key, []) # Loop this way instead of "for dependency in" or "for index in xrange" # because the dependencies list will be modified within the loop body. 
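# --- Editor's illustrative sketch (not part of gyp) ------------------------
# The index-based loop pattern used below: items may be deleted from, or
# inserted into, the list mid-iteration, so a plain "for x in lst" would
# skip or revisit entries. A minimal standalone demo of the same pattern;
# all demo_* names are hypothetical.
def demo_expand_in_place(lst, is_wildcard, expansions):
  index = 0
  while index < len(lst):
    if not is_wildcard(lst[index]):
      index += 1
      continue
    wildcard = lst.pop(index)          # remove the placeholder
    for item in expansions(wildcard):  # splice in its expansion
      lst.insert(index, item)
      index += 1
# demo_expand_in_place(['a', '*'], lambda x: x == '*', lambda _: ['b', 'c'])
# turns ['a', '*'] into ['a', 'b', 'c'].
# ----------------------------------------------------------------------------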
index = 0 while index < len(dependencies): (dependency_build_file, dependency_target, dependency_toolset) = \ gyp.common.ParseQualifiedTarget(dependencies[index]) if dependency_target != '*' and dependency_toolset != '*': # Not a wildcard. Keep it moving. index = index + 1 continue if dependency_build_file == target_build_file: # It's an error for a target to depend on all other targets in # the same file, because a target cannot depend on itself. raise GypError('Found wildcard in ' + dependency_key + ' of ' + target + ' referring to same build file') # Take the wildcard out and adjust the index so that the next # dependency in the list will be processed the next time through the # loop. del dependencies[index] index = index - 1 # Loop through the targets in the other build file, adding them to # this target's list of dependencies in place of the removed # wildcard. dependency_target_dicts = data[dependency_build_file]['targets'] for dependency_target_dict in dependency_target_dicts: if int(dependency_target_dict.get('suppress_wildcard', False)): continue dependency_target_name = dependency_target_dict['target_name'] if (dependency_target != '*' and dependency_target != dependency_target_name): continue dependency_target_toolset = dependency_target_dict['toolset'] if (dependency_toolset != '*' and dependency_toolset != dependency_target_toolset): continue dependency = gyp.common.QualifiedTarget(dependency_build_file, dependency_target_name, dependency_target_toolset) index = index + 1 dependencies.insert(index, dependency) index = index + 1 def Unify(l): """Removes duplicate elements from l, keeping the first element.""" seen = {} return [seen.setdefault(e, e) for e in l if e not in seen] def RemoveDuplicateDependencies(targets): """Makes sure every dependency appears only once in all targets's dependency lists.""" for target_name, target_dict in targets.iteritems(): for dependency_key in dependency_sections: dependencies = target_dict.get(dependency_key, []) if dependencies: target_dict[dependency_key] = Unify(dependencies) def Filter(l, item): """Removes item from l.""" res = {} return [res.setdefault(e, e) for e in l if e != item] def RemoveSelfDependencies(targets): """Remove self dependencies from targets that have the prune_self_dependency variable set.""" for target_name, target_dict in targets.iteritems(): for dependency_key in dependency_sections: dependencies = target_dict.get(dependency_key, []) if dependencies: for t in dependencies: if t == target_name: if targets[t].get('variables', {}).get('prune_self_dependency', 0): target_dict[dependency_key] = Filter(dependencies, target_name) def RemoveLinkDependenciesFromNoneTargets(targets): """Remove dependencies having the 'link_dependency' attribute from the 'none' targets.""" for target_name, target_dict in targets.iteritems(): for dependency_key in dependency_sections: dependencies = target_dict.get(dependency_key, []) if dependencies: for t in dependencies: if target_dict.get('type', None) == 'none': if targets[t].get('variables', {}).get('link_dependency', 0): target_dict[dependency_key] = \ Filter(target_dict[dependency_key], t) class DependencyGraphNode(object): """ Attributes: ref: A reference to an object that this DependencyGraphNode represents. dependencies: List of DependencyGraphNodes on which this one depends. dependents: List of DependencyGraphNodes that depend on this one. 
""" class CircularException(GypError): pass def __init__(self, ref): self.ref = ref self.dependencies = [] self.dependents = [] def __repr__(self): return '' % self.ref def FlattenToList(self): # flat_list is the sorted list of dependencies - actually, the list items # are the "ref" attributes of DependencyGraphNodes. Every target will # appear in flat_list after all of its dependencies, and before all of its # dependents. flat_list = OrderedSet() # in_degree_zeros is the list of DependencyGraphNodes that have no # dependencies not in flat_list. Initially, it is a copy of the children # of this node, because when the graph was built, nodes with no # dependencies were made implicit dependents of the root node. in_degree_zeros = set(self.dependents[:]) while in_degree_zeros: # Nodes in in_degree_zeros have no dependencies not in flat_list, so they # can be appended to flat_list. Take these nodes out of in_degree_zeros # as work progresses, so that the next node to process from the list can # always be accessed at a consistent position. node = in_degree_zeros.pop() flat_list.add(node.ref) # Look at dependents of the node just added to flat_list. Some of them # may now belong in in_degree_zeros. for node_dependent in node.dependents: is_in_degree_zero = True # TODO: We want to check through the # node_dependent.dependencies list but if it's long and we # always start at the beginning, then we get O(n^2) behaviour. for node_dependent_dependency in node_dependent.dependencies: if not node_dependent_dependency.ref in flat_list: # The dependent one or more dependencies not in flat_list. There # will be more chances to add it to flat_list when examining # it again as a dependent of those other dependencies, provided # that there are no cycles. is_in_degree_zero = False break if is_in_degree_zero: # All of the dependent's dependencies are already in flat_list. Add # it to in_degree_zeros where it will be processed in a future # iteration of the outer loop. in_degree_zeros.add(node_dependent) return list(flat_list) def FindCycles(self): """ Returns a list of cycles in the graph, where each cycle is its own list. """ results = [] visited = set() def Visit(node, path): for child in node.dependents: if child in path: results.append([child] + path[:path.index(child) + 1]) elif not child in visited: visited.add(child) Visit(child, [child] + path) visited.add(self) Visit(self, [self]) return results def DirectDependencies(self, dependencies=None): """Returns a list of just direct dependencies.""" if dependencies == None: dependencies = [] for dependency in self.dependencies: # Check for None, corresponding to the root node. if dependency.ref != None and dependency.ref not in dependencies: dependencies.append(dependency.ref) return dependencies def _AddImportedDependencies(self, targets, dependencies=None): """Given a list of direct dependencies, adds indirect dependencies that other dependencies have declared to export their settings. This method does not operate on self. Rather, it operates on the list of dependencies in the |dependencies| argument. For each dependency in that list, if any declares that it exports the settings of one of its own dependencies, those dependencies whose settings are "passed through" are added to the list. As new items are added to the list, they too will be processed, so it is possible to import settings through multiple levels of dependencies. 
This method is not terribly useful on its own, it depends on being "primed" with a list of direct dependencies such as one provided by DirectDependencies. DirectAndImportedDependencies is intended to be the public entry point. """ if dependencies == None: dependencies = [] index = 0 while index < len(dependencies): dependency = dependencies[index] dependency_dict = targets[dependency] # Add any dependencies whose settings should be imported to the list # if not already present. Newly-added items will be checked for # their own imports when the list iteration reaches them. # Rather than simply appending new items, insert them after the # dependency that exported them. This is done to more closely match # the depth-first method used by DeepDependencies. add_index = 1 for imported_dependency in \ dependency_dict.get('export_dependent_settings', []): if imported_dependency not in dependencies: dependencies.insert(index + add_index, imported_dependency) add_index = add_index + 1 index = index + 1 return dependencies def DirectAndImportedDependencies(self, targets, dependencies=None): """Returns a list of a target's direct dependencies and all indirect dependencies that a dependency has advertised settings should be exported through the dependency for. """ dependencies = self.DirectDependencies(dependencies) return self._AddImportedDependencies(targets, dependencies) def DeepDependencies(self, dependencies=None): """Returns an OrderedSet of all of a target's dependencies, recursively.""" if dependencies is None: # Using a list to get ordered output and a set to do fast "is it # already added" checks. dependencies = OrderedSet() for dependency in self.dependencies: # Check for None, corresponding to the root node. if dependency.ref is None: continue if dependency.ref not in dependencies: dependency.DeepDependencies(dependencies) dependencies.add(dependency.ref) return dependencies def _LinkDependenciesInternal(self, targets, include_shared_libraries, dependencies=None, initial=True): """Returns an OrderedSet of dependency targets that are linked into this target. This function has a split personality, depending on the setting of |initial|. Outside callers should always leave |initial| at its default setting. When adding a target to the list of dependencies, this function will recurse into itself with |initial| set to False, to collect dependencies that are linked into the linkable target for which the list is being built. If |include_shared_libraries| is False, the resulting dependencies will not include shared_library targets that are linked into this target. """ if dependencies is None: # Using a list to get ordered output and a set to do fast "is it # already added" checks. dependencies = OrderedSet() # Check for None, corresponding to the root node. if self.ref is None: return dependencies # It's kind of sucky that |targets| has to be passed into this function, # but that's presently the easiest way to access the target dicts so that # this function can find target types. 
if 'target_name' not in targets[self.ref]: raise GypError("Missing 'target_name' field in target.") if 'type' not in targets[self.ref]: raise GypError("Missing 'type' field in target %s" % targets[self.ref]['target_name']) target_type = targets[self.ref]['type'] is_linkable = target_type in linkable_types if initial and not is_linkable: # If this is the first target being examined and it's not linkable, # return an empty list of link dependencies, because the link # dependencies are intended to apply to the target itself (initial is # True) and this target won't be linked. return dependencies # Don't traverse 'none' targets if explicitly excluded. if (target_type == 'none' and not targets[self.ref].get('dependencies_traverse', True)): dependencies.add(self.ref) return dependencies # Executables, mac kernel extensions and loadable modules are already fully # and finally linked. Nothing else can be a link dependency of them, there # can only be dependencies in the sense that a dependent target might run # an executable or load the loadable_module. if not initial and target_type in ('executable', 'loadable_module', 'mac_kernel_extension'): return dependencies # Shared libraries are already fully linked. They should only be included # in |dependencies| when adjusting static library dependencies (in order to # link against the shared_library's import lib), but should not be included # in |dependencies| when propagating link_settings. # The |include_shared_libraries| flag controls which of these two cases we # are handling. if (not initial and target_type == 'shared_library' and not include_shared_libraries): return dependencies # The target is linkable, add it to the list of link dependencies. if self.ref not in dependencies: dependencies.add(self.ref) if initial or not is_linkable: # If this is a subsequent target and it's linkable, don't look any # further for linkable dependencies, as they'll already be linked into # this target linkable. Always look at dependencies of the initial # target, and always look at dependencies of non-linkables. for dependency in self.dependencies: dependency._LinkDependenciesInternal(targets, include_shared_libraries, dependencies, False) return dependencies def DependenciesForLinkSettings(self, targets): """ Returns a list of dependency targets whose link_settings should be merged into this target. """ # TODO(sbaig) Currently, chrome depends on the bug that shared libraries' # link_settings are propagated. So for now, we will allow it, unless the # 'allow_sharedlib_linksettings_propagation' flag is explicitly set to # False. Once chrome is fixed, we can remove this flag. include_shared_libraries = \ targets[self.ref].get('allow_sharedlib_linksettings_propagation', True) return self._LinkDependenciesInternal(targets, include_shared_libraries) def DependenciesToLinkAgainst(self, targets): """ Returns a list of dependency targets that are linked into this target. """ return self._LinkDependenciesInternal(targets, True) def BuildDependencyList(targets): # Create a DependencyGraphNode for each target. Put it into a dict for easy # access. dependency_nodes = {} for target, spec in targets.iteritems(): if target not in dependency_nodes: dependency_nodes[target] = DependencyGraphNode(target) # Set up the dependency links. Targets that have no dependencies are treated # as dependent on root_node. 
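# --- Editor's illustrative sketch (not part of gyp) ------------------------
# FlattenToList above is Kahn's algorithm in miniature: repeatedly emit a
# node whose dependencies have all been emitted. A standalone version over a
# plain {node: [deps]} dict, for intuition (demo_toposort is hypothetical):
def demo_toposort(deps):
  emitted, order = set(), []
  ready = [n for n, ds in deps.items() if not ds]
  while ready:
    node = ready.pop()
    emitted.add(node)
    order.append(node)
    for n, ds in deps.items():
      if n not in emitted and n not in ready and \
         all(d in emitted for d in ds):
        ready.append(n)
  return order  # shorter than len(deps) exactly when there is a cycle
# demo_toposort({'a': [], 'b': ['a'], 'c': ['a', 'b']}) -> ['a', 'b', 'c']
# ----------------------------------------------------------------------------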
  root_node = DependencyGraphNode(None)
  for target, spec in targets.iteritems():
    target_node = dependency_nodes[target]
    target_build_file = gyp.common.BuildFile(target)
    dependencies = spec.get('dependencies')
    if not dependencies:
      target_node.dependencies = [root_node]
      root_node.dependents.append(target_node)
    else:
      for dependency in dependencies:
        dependency_node = dependency_nodes.get(dependency)
        if not dependency_node:
          raise GypError("Dependency '%s' not found while "
                         "trying to load target %s" % (dependency, target))
        target_node.dependencies.append(dependency_node)
        dependency_node.dependents.append(target_node)

  flat_list = root_node.FlattenToList()

  # If there's anything left unvisited, there must be a circular dependency
  # (cycle).
  if len(flat_list) != len(targets):
    if not root_node.dependents:
      # If all targets have dependencies, add the first target as a dependent
      # of root_node so that the cycle can be discovered from root_node.
      target = targets.keys()[0]
      target_node = dependency_nodes[target]
      target_node.dependencies.append(root_node)
      root_node.dependents.append(target_node)

    cycles = []
    for cycle in root_node.FindCycles():
      paths = [node.ref for node in cycle]
      cycles.append('Cycle: %s' % ' -> '.join(paths))
    raise DependencyGraphNode.CircularException(
        'Cycles in dependency graph detected:\n' + '\n'.join(cycles))

  return [dependency_nodes, flat_list]


def VerifyNoGYPFileCircularDependencies(targets):
  # Create a DependencyGraphNode for each gyp file containing a target. Put
  # it into a dict for easy access.
  dependency_nodes = {}
  for target in targets.iterkeys():
    build_file = gyp.common.BuildFile(target)
    if not build_file in dependency_nodes:
      dependency_nodes[build_file] = DependencyGraphNode(build_file)

  # Set up the dependency links.
  for target, spec in targets.iteritems():
    build_file = gyp.common.BuildFile(target)
    build_file_node = dependency_nodes[build_file]
    target_dependencies = spec.get('dependencies', [])
    for dependency in target_dependencies:
      try:
        dependency_build_file = gyp.common.BuildFile(dependency)
      except GypError, e:
        gyp.common.ExceptionAppend(
            e, 'while computing dependencies of .gyp file %s' % build_file)
        raise

      if dependency_build_file == build_file:
        # A .gyp file is allowed to refer back to itself.
        continue
      dependency_node = dependency_nodes.get(dependency_build_file)
      if not dependency_node:
        raise GypError("Dependency '%s' not found" % dependency_build_file)
      if dependency_node not in build_file_node.dependencies:
        build_file_node.dependencies.append(dependency_node)
        dependency_node.dependents.append(build_file_node)

  # Files that have no dependencies are treated as dependent on root_node.
  root_node = DependencyGraphNode(None)
  for build_file_node in dependency_nodes.itervalues():
    if len(build_file_node.dependencies) == 0:
      build_file_node.dependencies.append(root_node)
      root_node.dependents.append(build_file_node)

  flat_list = root_node.FlattenToList()

  # If there's anything left unvisited, there must be a circular dependency
  # (cycle).
  if len(flat_list) != len(dependency_nodes):
    if not root_node.dependents:
      # If all files have dependencies, add the first file as a dependent
      # of root_node so that the cycle can be discovered from root_node.
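# --- Editor's illustrative sketch (not part of gyp) ------------------------
# How a cycle surfaces: with {'a': ['b'], 'b': ['a']} no node is ever "in
# degree zero", so FlattenToList returns fewer entries than there are nodes,
# and FindCycles then walks dependents to report e.g. a -> b -> a. Using
# demo_toposort from the earlier sketch:
#   demo_toposort({'a': ['b'], 'b': ['a']}) -> []   (0 != 2, so: cycle)
# ----------------------------------------------------------------------------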
file_node = dependency_nodes.values()[0] file_node.dependencies.append(root_node) root_node.dependents.append(file_node) cycles = [] for cycle in root_node.FindCycles(): paths = [node.ref for node in cycle] cycles.append('Cycle: %s' % ' -> '.join(paths)) raise DependencyGraphNode.CircularException( 'Cycles in .gyp file dependency graph detected:\n' + '\n'.join(cycles)) def DoDependentSettings(key, flat_list, targets, dependency_nodes): # key should be one of all_dependent_settings, direct_dependent_settings, # or link_settings. for target in flat_list: target_dict = targets[target] build_file = gyp.common.BuildFile(target) if key == 'all_dependent_settings': dependencies = dependency_nodes[target].DeepDependencies() elif key == 'direct_dependent_settings': dependencies = \ dependency_nodes[target].DirectAndImportedDependencies(targets) elif key == 'link_settings': dependencies = \ dependency_nodes[target].DependenciesForLinkSettings(targets) else: raise GypError("DoDependentSettings doesn't know how to determine " 'dependencies for ' + key) for dependency in dependencies: dependency_dict = targets[dependency] if not key in dependency_dict: continue dependency_build_file = gyp.common.BuildFile(dependency) MergeDicts(target_dict, dependency_dict[key], build_file, dependency_build_file) def AdjustStaticLibraryDependencies(flat_list, targets, dependency_nodes, sort_dependencies): # Recompute target "dependencies" properties. For each static library # target, remove "dependencies" entries referring to other static libraries, # unless the dependency has the "hard_dependency" attribute set. For each # linkable target, add a "dependencies" entry referring to all of the # target's computed list of link dependencies (including static libraries # if no such entry is already present. for target in flat_list: target_dict = targets[target] target_type = target_dict['type'] if target_type == 'static_library': if not 'dependencies' in target_dict: continue target_dict['dependencies_original'] = target_dict.get( 'dependencies', [])[:] # A static library should not depend on another static library unless # the dependency relationship is "hard," which should only be done when # a dependent relies on some side effect other than just the build # product, like a rule or action output. Further, if a target has a # non-hard dependency, but that dependency exports a hard dependency, # the non-hard dependency can safely be removed, but the exported hard # dependency must be added to the target to keep the same dependency # ordering. dependencies = \ dependency_nodes[target].DirectAndImportedDependencies(targets) index = 0 while index < len(dependencies): dependency = dependencies[index] dependency_dict = targets[dependency] # Remove every non-hard static library dependency and remove every # non-static library dependency that isn't a direct dependency. if (dependency_dict['type'] == 'static_library' and \ not dependency_dict.get('hard_dependency', False)) or \ (dependency_dict['type'] != 'static_library' and \ not dependency in target_dict['dependencies']): # Take the dependency out of the list, and don't increment index # because the next dependency to analyze will shift into the index # formerly occupied by the one being removed. del dependencies[index] else: index = index + 1 # Update the dependencies. If the dependencies list is empty, it's not # needed, so unhook it. 
if len(dependencies) > 0: target_dict['dependencies'] = dependencies else: del target_dict['dependencies'] elif target_type in linkable_types: # Get a list of dependency targets that should be linked into this # target. Add them to the dependencies list if they're not already # present. link_dependencies = \ dependency_nodes[target].DependenciesToLinkAgainst(targets) for dependency in link_dependencies: if dependency == target: continue if not 'dependencies' in target_dict: target_dict['dependencies'] = [] if not dependency in target_dict['dependencies']: target_dict['dependencies'].append(dependency) # Sort the dependencies list in the order from dependents to dependencies. # e.g. If A and B depend on C and C depends on D, sort them in A, B, C, D. # Note: flat_list is already sorted in the order from dependencies to # dependents. if sort_dependencies and 'dependencies' in target_dict: target_dict['dependencies'] = [dep for dep in reversed(flat_list) if dep in target_dict['dependencies']] # Initialize this here to speed up MakePathRelative. exception_re = re.compile(r'''["']?[-/$<>^]''') def MakePathRelative(to_file, fro_file, item): # If item is a relative path, it's relative to the build file dict that it's # coming from. Fix it up to make it relative to the build file dict that # it's going into. # Exception: any |item| that begins with these special characters is # returned without modification. # / Used when a path is already absolute (shortcut optimization; # such paths would be returned as absolute anyway) # $ Used for build environment variables # - Used for some build environment flags (such as -lapr-1 in a # "libraries" section) # < Used for our own variable and command expansions (see ExpandVariables) # > Used for our own variable and command expansions (see ExpandVariables) # ^ Used for our own variable and command expansions (see ExpandVariables) # # "/' Used when a value is quoted. If these are present, then we # check the second character instead. # if to_file == fro_file or exception_re.match(item): return item else: # TODO(dglazkov) The backslash/forward-slash replacement at the end is a # temporary measure. This should really be addressed by keeping all paths # in POSIX until actual project generation. ret = os.path.normpath(os.path.join( gyp.common.RelativePath(os.path.dirname(fro_file), os.path.dirname(to_file)), item)).replace('\\', '/') if item[-1] == '/': ret += '/' return ret def MergeLists(to, fro, to_file, fro_file, is_paths=False, append=True): # Python documentation recommends objects which do not support hash # set this value to None. Python library objects follow this rule. is_hashable = lambda val: val.__hash__ # If x is hashable, returns whether x is in s. Else returns whether x is in l. def is_in_set_or_list(x, s, l): if is_hashable(x): return x in s return x in l prepend_index = 0 # Make membership testing of hashables in |to| (in particular, strings) # faster. hashable_to_set = set(x for x in to if is_hashable(x)) for item in fro: singleton = False if type(item) in (str, int): # The cheap and easy case. if is_paths: to_item = MakePathRelative(to_file, fro_file, item) else: to_item = item if not (type(item) is str and item.startswith('-')): # Any string that doesn't begin with a "-" is a singleton - it can # only appear once in a list, to be enforced by the list merge append # or prepend. singleton = True elif type(item) is dict: # Make a copy of the dictionary, continuing to look for paths to fix. 
# The other intelligent aspects of merge processing won't apply because # item is being merged into an empty dict. to_item = {} MergeDicts(to_item, item, to_file, fro_file) elif type(item) is list: # Recurse, making a copy of the list. If the list contains any # descendant dicts, path fixing will occur. Note that here, custom # values for is_paths and append are dropped; those are only to be # applied to |to| and |fro|, not sublists of |fro|. append shouldn't # matter anyway because the new |to_item| list is empty. to_item = [] MergeLists(to_item, item, to_file, fro_file) else: raise TypeError( 'Attempt to merge list item of unsupported type ' + \ item.__class__.__name__) if append: # If appending a singleton that's already in the list, don't append. # This ensures that the earliest occurrence of the item will stay put. if not singleton or not is_in_set_or_list(to_item, hashable_to_set, to): to.append(to_item) if is_hashable(to_item): hashable_to_set.add(to_item) else: # If prepending a singleton that's already in the list, remove the # existing instance and proceed with the prepend. This ensures that the # item appears at the earliest possible position in the list. while singleton and to_item in to: to.remove(to_item) # Don't just insert everything at index 0. That would prepend the new # items to the list in reverse order, which would be an unwelcome # surprise. to.insert(prepend_index, to_item) if is_hashable(to_item): hashable_to_set.add(to_item) prepend_index = prepend_index + 1 def MergeDicts(to, fro, to_file, fro_file): # I wanted to name the parameter "from" but it's a Python keyword... for k, v in fro.iteritems(): # It would be nice to do "if not k in to: to[k] = v" but that wouldn't give # copy semantics. Something else may want to merge from the |fro| dict # later, and having the same dict ref pointed to twice in the tree isn't # what anyone wants considering that the dicts may subsequently be # modified. if k in to: bad_merge = False if type(v) in (str, int): if type(to[k]) not in (str, int): bad_merge = True elif type(v) is not type(to[k]): bad_merge = True if bad_merge: raise TypeError( 'Attempt to merge dict value of type ' + v.__class__.__name__ + \ ' into incompatible type ' + to[k].__class__.__name__ + \ ' for key ' + k) if type(v) in (str, int): # Overwrite the existing value, if any. Cheap and easy. is_path = IsPathSection(k) if is_path: to[k] = MakePathRelative(to_file, fro_file, v) else: to[k] = v elif type(v) is dict: # Recurse, guaranteeing copies will be made of objects that require it. if not k in to: to[k] = {} MergeDicts(to[k], v, to_file, fro_file) elif type(v) is list: # Lists in dicts can be merged with different policies, depending on # how the key in the "from" dict (k, the from-key) is written. # # If the from-key has ...the to-list will have this action # this character appended:... applied when receiving the from-list: # = replace # + prepend # ? set, only if to-list does not yet exist # (none) append # # This logic is list-specific, but since it relies on the associated # dict key, it's checked in this dict-oriented function. 
ext = k[-1] append = True if ext == '=': list_base = k[:-1] lists_incompatible = [list_base, list_base + '?'] to[list_base] = [] elif ext == '+': list_base = k[:-1] lists_incompatible = [list_base + '=', list_base + '?'] append = False elif ext == '?': list_base = k[:-1] lists_incompatible = [list_base, list_base + '=', list_base + '+'] else: list_base = k lists_incompatible = [list_base + '=', list_base + '?'] # Some combinations of merge policies appearing together are meaningless. # It's stupid to replace and append simultaneously, for example. Append # and prepend are the only policies that can coexist. for list_incompatible in lists_incompatible: if list_incompatible in fro: raise GypError('Incompatible list policies ' + k + ' and ' + list_incompatible) if list_base in to: if ext == '?': # If the key ends in "?", the list will only be merged if it doesn't # already exist. continue elif type(to[list_base]) is not list: # This may not have been checked above if merging in a list with an # extension character. raise TypeError( 'Attempt to merge dict value of type ' + v.__class__.__name__ + \ ' into incompatible type ' + to[list_base].__class__.__name__ + \ ' for key ' + list_base + '(' + k + ')') else: to[list_base] = [] # Call MergeLists, which will make copies of objects that require it. # MergeLists can recurse back into MergeDicts, although this will be # to make copies of dicts (with paths fixed), there will be no # subsequent dict "merging" once entering a list because lists are # always replaced, appended to, or prepended to. is_paths = IsPathSection(list_base) MergeLists(to[list_base], v, to_file, fro_file, is_paths, append) else: raise TypeError( 'Attempt to merge dict value of unsupported type ' + \ v.__class__.__name__ + ' for key ' + k) def MergeConfigWithInheritance(new_configuration_dict, build_file, target_dict, configuration, visited): # Skip if previously visted. if configuration in visited: return # Look at this configuration. configuration_dict = target_dict['configurations'][configuration] # Merge in parents. for parent in configuration_dict.get('inherit_from', []): MergeConfigWithInheritance(new_configuration_dict, build_file, target_dict, parent, visited + [configuration]) # Merge it into the new config. MergeDicts(new_configuration_dict, configuration_dict, build_file, build_file) # Drop abstract. if 'abstract' in new_configuration_dict: del new_configuration_dict['abstract'] def SetUpConfigurations(target, target_dict): # key_suffixes is a list of key suffixes that might appear on key names. # These suffixes are handled in conditional evaluations (for =, +, and ?) # and rules/exclude processing (for ! and /). Keys with these suffixes # should be treated the same as keys without. key_suffixes = ['=', '+', '?', '!', '/'] build_file = gyp.common.BuildFile(target) # Provide a single configuration by default if none exists. # TODO(mark): Signal an error if default_configurations exists but # configurations does not. if not 'configurations' in target_dict: target_dict['configurations'] = {'Default': {}} if not 'default_configuration' in target_dict: concrete = [i for (i, config) in target_dict['configurations'].iteritems() if not config.get('abstract')] target_dict['default_configuration'] = sorted(concrete)[0] merged_configurations = {} configs = target_dict['configurations'] for (configuration, old_configuration_dict) in configs.iteritems(): # Skip abstract configurations (saves work only). 
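# --- Editor's illustrative sketch (not part of gyp) ------------------------
# The list-merge policies handled by MergeDicts above, exercised concretely.
# demo_merge_policies is hypothetical; the file names passed to MergeDicts
# are placeholders (no path sections are involved for 'defines').
def demo_merge_policies():
  results = {}
  for key in ('defines', 'defines+', 'defines=', 'defines?'):
    to = {'defines': ['A']}
    MergeDicts(to, {key: ['B']}, 'to.gyp', 'fro.gyp')
    results[key] = to['defines']
  return results
# -> {'defines':  ['A', 'B'],   # (none) appends
#     'defines+': ['B', 'A'],   # + prepends
#     'defines=': ['B'],        # = replaces
#     'defines?': ['A']}        # ? sets only if the to-list doesn't exist
# ----------------------------------------------------------------------------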
if old_configuration_dict.get('abstract'): continue # Configurations inherit (most) settings from the enclosing target scope. # Get the inheritance relationship right by making a copy of the target # dict. new_configuration_dict = {} for (key, target_val) in target_dict.iteritems(): key_ext = key[-1:] if key_ext in key_suffixes: key_base = key[:-1] else: key_base = key if not key_base in non_configuration_keys: new_configuration_dict[key] = gyp.simple_copy.deepcopy(target_val) # Merge in configuration (with all its parents first). MergeConfigWithInheritance(new_configuration_dict, build_file, target_dict, configuration, []) merged_configurations[configuration] = new_configuration_dict # Put the new configurations back into the target dict as a configuration. for configuration in merged_configurations.keys(): target_dict['configurations'][configuration] = ( merged_configurations[configuration]) # Now drop all the abstract ones. for configuration in target_dict['configurations'].keys(): old_configuration_dict = target_dict['configurations'][configuration] if old_configuration_dict.get('abstract'): del target_dict['configurations'][configuration] # Now that all of the target's configurations have been built, go through # the target dict's keys and remove everything that's been moved into a # "configurations" section. delete_keys = [] for key in target_dict: key_ext = key[-1:] if key_ext in key_suffixes: key_base = key[:-1] else: key_base = key if not key_base in non_configuration_keys: delete_keys.append(key) for key in delete_keys: del target_dict[key] # Check the configurations to see if they contain invalid keys. for configuration in target_dict['configurations'].keys(): configuration_dict = target_dict['configurations'][configuration] for key in configuration_dict.keys(): if key in invalid_configuration_keys: raise GypError('%s not allowed in the %s configuration, found in ' 'target %s' % (key, configuration, target)) def ProcessListFiltersInDict(name, the_dict): """Process regular expression and exclusion-based filters on lists. An exclusion list is in a dict key named with a trailing "!", like "sources!". Every item in such a list is removed from the associated main list, which in this example, would be "sources". Removed items are placed into a "sources_excluded" list in the dict. Regular expression (regex) filters are contained in dict keys named with a trailing "/", such as "sources/" to operate on the "sources" list. Regex filters in a dict take the form: 'sources/': [ ['exclude', '_(linux|mac|win)\\.cc$'], ['include', '_mac\\.cc$'] ], The first filter says to exclude all files ending in _linux.cc, _mac.cc, and _win.cc. The second filter then includes all files ending in _mac.cc that are now or were once in the "sources" list. Items matching an "exclude" filter are subject to the same processing as would occur if they were listed by name in an exclusion list (ending in "!"). Items matching an "include" filter are brought back into the main list if previously excluded by an exclusion list or exclusion regex filter. Subsequent matching "exclude" patterns can still cause items to be excluded after matching an "include". """ # Look through the dictionary for any lists whose keys end in "!" or "/". # These are lists that will be treated as exclude lists and regular # expression-based exclude/include lists. Collect the lists that are # needed first, looking for the lists that they operate on, and assemble # then into |lists|. 
This is done in a separate loop up front, because # the _included and _excluded keys need to be added to the_dict, and that # can't be done while iterating through it. lists = [] del_lists = [] for key, value in the_dict.iteritems(): operation = key[-1] if operation != '!' and operation != '/': continue if type(value) is not list: raise ValueError(name + ' key ' + key + ' must be list, not ' + \ value.__class__.__name__) list_key = key[:-1] if list_key not in the_dict: # This happens when there's a list like "sources!" but no corresponding # "sources" list. Since there's nothing for it to operate on, queue up # the "sources!" list for deletion now. del_lists.append(key) continue if type(the_dict[list_key]) is not list: value = the_dict[list_key] raise ValueError(name + ' key ' + list_key + \ ' must be list, not ' + \ value.__class__.__name__ + ' when applying ' + \ {'!': 'exclusion', '/': 'regex'}[operation]) if not list_key in lists: lists.append(list_key) # Delete the lists that are known to be unneeded at this point. for del_list in del_lists: del the_dict[del_list] for list_key in lists: the_list = the_dict[list_key] # Initialize the list_actions list, which is parallel to the_list. Each # item in list_actions identifies whether the corresponding item in # the_list should be excluded, unconditionally preserved (included), or # whether no exclusion or inclusion has been applied. Items for which # no exclusion or inclusion has been applied (yet) have value -1, items # excluded have value 0, and items included have value 1. Includes and # excludes override previous actions. All items in list_actions are # initialized to -1 because no excludes or includes have been processed # yet. list_actions = list((-1,) * len(the_list)) exclude_key = list_key + '!' if exclude_key in the_dict: for exclude_item in the_dict[exclude_key]: for index in xrange(0, len(the_list)): if exclude_item == the_list[index]: # This item matches the exclude_item, so set its action to 0 # (exclude). list_actions[index] = 0 # The "whatever!" list is no longer needed, dump it. del the_dict[exclude_key] regex_key = list_key + '/' if regex_key in the_dict: for regex_item in the_dict[regex_key]: [action, pattern] = regex_item pattern_re = re.compile(pattern) if action == 'exclude': # This item matches an exclude regex, so set its value to 0 (exclude). action_value = 0 elif action == 'include': # This item matches an include regex, so set its value to 1 (include). action_value = 1 else: # This is an action that doesn't make any sense. raise ValueError('Unrecognized action ' + action + ' in ' + name + \ ' key ' + regex_key) for index in xrange(0, len(the_list)): list_item = the_list[index] if list_actions[index] == action_value: # Even if the regex matches, nothing will change so continue (regex # searches are expensive). continue if pattern_re.search(list_item): # Regular expression match. list_actions[index] = action_value # The "whatever/" list is no longer needed, dump it. del the_dict[regex_key] # Add excluded items to the excluded list. # # Note that exclude_key ("sources!") is different from excluded_key # ("sources_excluded"). The exclude_key list is input and it was already # processed and deleted; the excluded_key list is output and it's about # to be created. 
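# --- Editor's illustrative sketch (not part of gyp) ------------------------
# The net effect of the exclusion ("sources!") and regex ("sources/") filter
# semantics described above, once this function has run to completion.
# demo_list_filters is hypothetical.
def demo_list_filters():
  d = {'sources':  ['a_mac.cc', 'b_win.cc', 'c.cc'],
       'sources!': ['c.cc'],
       'sources/': [['exclude', '_(mac|win)\\.cc$'],
                    ['include', '_mac\\.cc$']]}
  ProcessListFiltersInDict('demo', d)
  return d
# -> {'sources':          ['a_mac.cc'],
#     'sources_excluded': ['b_win.cc', 'c.cc']}
# a_mac.cc is first excluded by the regex, then brought back by the later
# 'include' pattern; the "sources!" and "sources/" keys are consumed.
# ----------------------------------------------------------------------------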
excluded_key = list_key + '_excluded' if excluded_key in the_dict: raise GypError(name + ' key ' + excluded_key + ' must not be present prior ' ' to applying exclusion/regex filters for ' + list_key) excluded_list = [] # Go backwards through the list_actions list so that as items are deleted, # the indices of items that haven't been seen yet don't shift. That means # that things need to be prepended to excluded_list to maintain them in the # same order that they existed in the_list. for index in xrange(len(list_actions) - 1, -1, -1): if list_actions[index] == 0: # Dump anything with action 0 (exclude). Keep anything with action 1 # (include) or -1 (no include or exclude seen for the item). excluded_list.insert(0, the_list[index]) del the_list[index] # If anything was excluded, put the excluded list into the_dict at # excluded_key. if len(excluded_list) > 0: the_dict[excluded_key] = excluded_list # Now recurse into subdicts and lists that may contain dicts. for key, value in the_dict.iteritems(): if type(value) is dict: ProcessListFiltersInDict(key, value) elif type(value) is list: ProcessListFiltersInList(key, value) def ProcessListFiltersInList(name, the_list): for item in the_list: if type(item) is dict: ProcessListFiltersInDict(name, item) elif type(item) is list: ProcessListFiltersInList(name, item) def ValidateTargetType(target, target_dict): """Ensures the 'type' field on the target is one of the known types. Arguments: target: string, name of target. target_dict: dict, target spec. Raises an exception on error. """ VALID_TARGET_TYPES = ('executable', 'loadable_module', 'static_library', 'shared_library', 'mac_kernel_extension', 'none') target_type = target_dict.get('type', None) if target_type not in VALID_TARGET_TYPES: raise GypError("Target %s has an invalid target type '%s'. " "Must be one of %s." % (target, target_type, '/'.join(VALID_TARGET_TYPES))) if (target_dict.get('standalone_static_library', 0) and not target_type == 'static_library'): raise GypError('Target %s has type %s but standalone_static_library flag is' ' only valid for static_library type.' % (target, target_type)) def ValidateSourcesInTarget(target, target_dict, build_file, duplicate_basename_check): if not duplicate_basename_check: return if target_dict.get('type', None) != 'static_library': return sources = target_dict.get('sources', []) basenames = {} for source in sources: name, ext = os.path.splitext(source) is_compiled_file = ext in [ '.c', '.cc', '.cpp', '.cxx', '.m', '.mm', '.s', '.S'] if not is_compiled_file: continue basename = os.path.basename(name) # Don't include extension. basenames.setdefault(basename, []).append(source) error = '' for basename, files in basenames.iteritems(): if len(files) > 1: error += ' %s: %s\n' % (basename, ' '.join(files)) if error: print('static library %s has several files with the same basename:\n' % target + error + 'libtool on Mac cannot handle that. Use ' '--no-duplicate-basename-check to disable this validation.') raise GypError('Duplicate basenames in sources section, see list above') def ValidateRulesInTarget(target, target_dict, extra_sources_for_rules): """Ensures that the rules sections in target_dict are valid and consistent, and determines which sources they apply to. Arguments: target: string, name of target. target_dict: dict, target spec containing "rules" and "sources" lists. extra_sources_for_rules: a list of keys to scan for rule matches in addition to 'sources'. 
""" # Dicts to map between values found in rules' 'rule_name' and 'extension' # keys and the rule dicts themselves. rule_names = {} rule_extensions = {} rules = target_dict.get('rules', []) for rule in rules: # Make sure that there's no conflict among rule names and extensions. rule_name = rule['rule_name'] if rule_name in rule_names: raise GypError('rule %s exists in duplicate, target %s' % (rule_name, target)) rule_names[rule_name] = rule rule_extension = rule['extension'] if rule_extension.startswith('.'): rule_extension = rule_extension[1:] if rule_extension in rule_extensions: raise GypError(('extension %s associated with multiple rules, ' + 'target %s rules %s and %s') % (rule_extension, target, rule_extensions[rule_extension]['rule_name'], rule_name)) rule_extensions[rule_extension] = rule # Make sure rule_sources isn't already there. It's going to be # created below if needed. if 'rule_sources' in rule: raise GypError( 'rule_sources must not exist in input, target %s rule %s' % (target, rule_name)) rule_sources = [] source_keys = ['sources'] source_keys.extend(extra_sources_for_rules) for source_key in source_keys: for source in target_dict.get(source_key, []): (source_root, source_extension) = os.path.splitext(source) if source_extension.startswith('.'): source_extension = source_extension[1:] if source_extension == rule_extension: rule_sources.append(source) if len(rule_sources) > 0: rule['rule_sources'] = rule_sources def ValidateRunAsInTarget(target, target_dict, build_file): target_name = target_dict.get('target_name') run_as = target_dict.get('run_as') if not run_as: return if type(run_as) is not dict: raise GypError("The 'run_as' in target %s from file %s should be a " "dictionary." % (target_name, build_file)) action = run_as.get('action') if not action: raise GypError("The 'run_as' in target %s from file %s must have an " "'action' section." % (target_name, build_file)) if type(action) is not list: raise GypError("The 'action' for 'run_as' in target %s from file %s " "must be a list." % (target_name, build_file)) working_directory = run_as.get('working_directory') if working_directory and type(working_directory) is not str: raise GypError("The 'working_directory' for 'run_as' in target %s " "in file %s should be a string." % (target_name, build_file)) environment = run_as.get('environment') if environment and type(environment) is not dict: raise GypError("The 'environment' for 'run_as' in target %s " "in file %s should be a dictionary." % (target_name, build_file)) def ValidateActionsInTarget(target, target_dict, build_file): '''Validates the inputs to the actions in a target.''' target_name = target_dict.get('target_name') actions = target_dict.get('actions', []) for action in actions: action_name = action.get('action_name') if not action_name: raise GypError("Anonymous action in target %s. " "An action must have an 'action_name' field." % target_name) inputs = action.get('inputs', None) if inputs is None: raise GypError('Action in target %s has no inputs.' % target_name) action_command = action.get('action') if action_command and not action_command[0]: raise GypError("Empty action as command in target %s." % target_name) def TurnIntIntoStrInDict(the_dict): """Given dict the_dict, recursively converts all integers into strings. """ # Use items instead of iteritems because there's no need to try to look at # reinserted keys and their associated values. 
for k, v in the_dict.items(): if type(v) is int: v = str(v) the_dict[k] = v elif type(v) is dict: TurnIntIntoStrInDict(v) elif type(v) is list: TurnIntIntoStrInList(v) if type(k) is int: del the_dict[k] the_dict[str(k)] = v def TurnIntIntoStrInList(the_list): """Given list the_list, recursively converts all integers into strings. """ for index in xrange(0, len(the_list)): item = the_list[index] if type(item) is int: the_list[index] = str(item) elif type(item) is dict: TurnIntIntoStrInDict(item) elif type(item) is list: TurnIntIntoStrInList(item) def PruneUnwantedTargets(targets, flat_list, dependency_nodes, root_targets, data): """Return only the targets that are deep dependencies of |root_targets|.""" qualified_root_targets = [] for target in root_targets: target = target.strip() qualified_targets = gyp.common.FindQualifiedTargets(target, flat_list) if not qualified_targets: raise GypError("Could not find target %s" % target) qualified_root_targets.extend(qualified_targets) wanted_targets = {} for target in qualified_root_targets: wanted_targets[target] = targets[target] for dependency in dependency_nodes[target].DeepDependencies(): wanted_targets[dependency] = targets[dependency] wanted_flat_list = [t for t in flat_list if t in wanted_targets] # Prune unwanted targets from each build_file's data dict. for build_file in data['target_build_files']: if not 'targets' in data[build_file]: continue new_targets = [] for target in data[build_file]['targets']: qualified_name = gyp.common.QualifiedTarget(build_file, target['target_name'], target['toolset']) if qualified_name in wanted_targets: new_targets.append(target) data[build_file]['targets'] = new_targets return wanted_targets, wanted_flat_list def VerifyNoCollidingTargets(targets): """Verify that no two targets in the same directory share the same name. Arguments: targets: A list of targets in the form 'path/to/file.gyp:target_name'. """ # Keep a dict going from 'subdirectory:target_name' to 'foo.gyp'. used = {} for target in targets: # Separate out 'path/to/file.gyp, 'target_name' from # 'path/to/file.gyp:target_name'. path, name = target.rsplit(':', 1) # Separate out 'path/to', 'file.gyp' from 'path/to/file.gyp'. subdir, gyp = os.path.split(path) # Use '.' for the current directory '', so that the error messages make # more sense. if not subdir: subdir = '.' # Prepare a key like 'path/to:target_name'. key = subdir + ':' + name if key in used: # Complain if this target is already used. raise GypError('Duplicate target name "%s" in directory "%s" used both ' 'in "%s" and "%s".' % (name, subdir, gyp, used[key])) used[key] = gyp def SetGeneratorGlobals(generator_input_info): # Set up path_sections and non_configuration_keys with the default data plus # the generator-specific data. global path_sections path_sections = set(base_path_sections) path_sections.update(generator_input_info['path_sections']) global non_configuration_keys non_configuration_keys = base_non_configuration_keys[:] non_configuration_keys.extend(generator_input_info['non_configuration_keys']) global multiple_toolsets multiple_toolsets = generator_input_info[ 'generator_supports_multiple_toolsets'] global generator_filelist_paths generator_filelist_paths = generator_input_info['generator_filelist_paths'] def Load(build_files, variables, includes, depth, generator_input_info, check, circular_check, duplicate_basename_check, parallel, root_targets): SetGeneratorGlobals(generator_input_info) # A generator can have other lists (in addition to sources) be processed # for rules. 
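  # For illustration only (an assumption, not read from this file): a
  # Mac-oriented generator might set
  #   generator_input_info['extra_sources_for_rules'] = ['mac_bundle_resources']
  # so that rules fire for bundle resources as well as plain 'sources'.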
extra_sources_for_rules = generator_input_info['extra_sources_for_rules'] # Load build files. This loads every target-containing build file into # the |data| dictionary such that the keys to |data| are build file names, # and the values are the entire build file contents after "early" or "pre" # processing has been done and includes have been resolved. # NOTE: data contains both "target" files (.gyp) and "includes" (.gypi), as # well as meta-data (e.g. 'included_files' key). 'target_build_files' keeps # track of the keys corresponding to "target" files. data = {'target_build_files': set()} # Normalize paths everywhere. This is important because paths will be # used as keys to the data dict and for references between input files. build_files = set(map(os.path.normpath, build_files)) if parallel: LoadTargetBuildFilesParallel(build_files, data, variables, includes, depth, check, generator_input_info) else: aux_data = {} for build_file in build_files: try: LoadTargetBuildFile(build_file, data, aux_data, variables, includes, depth, check, True) except Exception, e: gyp.common.ExceptionAppend(e, 'while trying to load %s' % build_file) raise # Build a dict to access each target's subdict by qualified name. targets = BuildTargetsDict(data) # Fully qualify all dependency links. QualifyDependencies(targets) # Remove self-dependencies from targets that have 'prune_self_dependencies' # set to 1. RemoveSelfDependencies(targets) # Expand dependencies specified as build_file:*. ExpandWildcardDependencies(targets, data) # Remove all dependencies marked as 'link_dependency' from the targets of # type 'none'. RemoveLinkDependenciesFromNoneTargets(targets) # Apply exclude (!) and regex (/) list filters only for dependency_sections. for target_name, target_dict in targets.iteritems(): tmp_dict = {} for key_base in dependency_sections: for op in ('', '!', '/'): key = key_base + op if key in target_dict: tmp_dict[key] = target_dict[key] del target_dict[key] ProcessListFiltersInDict(target_name, tmp_dict) # Write the results back to |target_dict|. for key in tmp_dict: target_dict[key] = tmp_dict[key] # Make sure every dependency appears at most once. RemoveDuplicateDependencies(targets) if circular_check: # Make sure that any targets in a.gyp don't contain dependencies in other # .gyp files that further depend on a.gyp. VerifyNoGYPFileCircularDependencies(targets) [dependency_nodes, flat_list] = BuildDependencyList(targets) if root_targets: # Remove, from |targets| and |flat_list|, the targets that are not deep # dependencies of the targets specified in |root_targets|. targets, flat_list = PruneUnwantedTargets( targets, flat_list, dependency_nodes, root_targets, data) # Check that no two targets in the same directory have the same name. VerifyNoCollidingTargets(flat_list) # Handle dependent settings of various types. for settings_type in ['all_dependent_settings', 'direct_dependent_settings', 'link_settings']: DoDependentSettings(settings_type, flat_list, targets, dependency_nodes) # Take out the dependent settings now that they've been published to all # of the targets that require them. for target in flat_list: if settings_type in targets[target]: del targets[target][settings_type] # Make sure static libraries don't declare dependencies on other static # libraries, but that linkables depend on all unlinked static libraries # that they need so that their link steps will be correct. 
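  # Sketch of the intended adjustment (illustrative, restating the comment
  # above with assumed targets): given
  #   executable foo -> static_library A -> static_library B
  # A's link-time dependency on B is dropped, and foo is made to depend on
  # (and link) both A and B directly.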
gii = generator_input_info if gii['generator_wants_static_library_dependencies_adjusted']: AdjustStaticLibraryDependencies(flat_list, targets, dependency_nodes, gii['generator_wants_sorted_dependencies']) # Apply "post"/"late"/"target" variable expansions and condition evaluations. for target in flat_list: target_dict = targets[target] build_file = gyp.common.BuildFile(target) ProcessVariablesAndConditionsInDict( target_dict, PHASE_LATE, variables, build_file) # Move everything that can go into a "configurations" section into one. for target in flat_list: target_dict = targets[target] SetUpConfigurations(target, target_dict) # Apply exclude (!) and regex (/) list filters. for target in flat_list: target_dict = targets[target] ProcessListFiltersInDict(target, target_dict) # Apply "latelate" variable expansions and condition evaluations. for target in flat_list: target_dict = targets[target] build_file = gyp.common.BuildFile(target) ProcessVariablesAndConditionsInDict( target_dict, PHASE_LATELATE, variables, build_file) # Make sure that the rules make sense, and build up rule_sources lists as # needed. Not all generators will need to use the rule_sources lists, but # some may, and it seems best to build the list in a common spot. # Also validate actions and run_as elements in targets. for target in flat_list: target_dict = targets[target] build_file = gyp.common.BuildFile(target) ValidateTargetType(target, target_dict) ValidateSourcesInTarget(target, target_dict, build_file, duplicate_basename_check) ValidateRulesInTarget(target, target_dict, extra_sources_for_rules) ValidateRunAsInTarget(target, target_dict, build_file) ValidateActionsInTarget(target, target_dict, build_file) # Generators might not expect ints. Turn them into strs. TurnIntIntoStrInDict(data) # TODO(mark): Return |data| for now because the generator needs a list of # build files that came in. In the future, maybe it should just accept # a list, and not the whole data dict. return [flat_list, targets, data] npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/input_test.py0000755000000000000000000000620712631326456023166 0ustar 00000000000000#!/usr/bin/env python # Copyright 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. 
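# The suite below can be run directly; a sketch, assuming gyp/pylib is on
# PYTHONPATH:
#   python input_test.py -v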
"""Unit tests for the input.py file.""" import gyp.input import unittest import sys class TestFindCycles(unittest.TestCase): def setUp(self): self.nodes = {} for x in ('a', 'b', 'c', 'd', 'e'): self.nodes[x] = gyp.input.DependencyGraphNode(x) def _create_dependency(self, dependent, dependency): dependent.dependencies.append(dependency) dependency.dependents.append(dependent) def test_no_cycle_empty_graph(self): for label, node in self.nodes.iteritems(): self.assertEquals([], node.FindCycles()) def test_no_cycle_line(self): self._create_dependency(self.nodes['a'], self.nodes['b']) self._create_dependency(self.nodes['b'], self.nodes['c']) self._create_dependency(self.nodes['c'], self.nodes['d']) for label, node in self.nodes.iteritems(): self.assertEquals([], node.FindCycles()) def test_no_cycle_dag(self): self._create_dependency(self.nodes['a'], self.nodes['b']) self._create_dependency(self.nodes['a'], self.nodes['c']) self._create_dependency(self.nodes['b'], self.nodes['c']) for label, node in self.nodes.iteritems(): self.assertEquals([], node.FindCycles()) def test_cycle_self_reference(self): self._create_dependency(self.nodes['a'], self.nodes['a']) self.assertEquals([[self.nodes['a'], self.nodes['a']]], self.nodes['a'].FindCycles()) def test_cycle_two_nodes(self): self._create_dependency(self.nodes['a'], self.nodes['b']) self._create_dependency(self.nodes['b'], self.nodes['a']) self.assertEquals([[self.nodes['a'], self.nodes['b'], self.nodes['a']]], self.nodes['a'].FindCycles()) self.assertEquals([[self.nodes['b'], self.nodes['a'], self.nodes['b']]], self.nodes['b'].FindCycles()) def test_two_cycles(self): self._create_dependency(self.nodes['a'], self.nodes['b']) self._create_dependency(self.nodes['b'], self.nodes['a']) self._create_dependency(self.nodes['b'], self.nodes['c']) self._create_dependency(self.nodes['c'], self.nodes['b']) cycles = self.nodes['a'].FindCycles() self.assertTrue( [self.nodes['a'], self.nodes['b'], self.nodes['a']] in cycles) self.assertTrue( [self.nodes['b'], self.nodes['c'], self.nodes['b']] in cycles) self.assertEquals(2, len(cycles)) def test_big_cycle(self): self._create_dependency(self.nodes['a'], self.nodes['b']) self._create_dependency(self.nodes['b'], self.nodes['c']) self._create_dependency(self.nodes['c'], self.nodes['d']) self._create_dependency(self.nodes['d'], self.nodes['e']) self._create_dependency(self.nodes['e'], self.nodes['a']) self.assertEquals([[self.nodes['a'], self.nodes['b'], self.nodes['c'], self.nodes['d'], self.nodes['e'], self.nodes['a']]], self.nodes['a'].FindCycles()) if __name__ == '__main__': unittest.main() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/mac_tool.py0000755000000000000000000005547212631326456022575 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Utility functions to perform Xcode-style build steps. These functions are executed via gyp-mac-tool when using the Makefile generator. """ import fcntl import fnmatch import glob import json import os import plistlib import re import shutil import string import subprocess import sys import tempfile def main(args): executor = MacTool() exit_code = executor.Dispatch(args) if exit_code is not None: sys.exit(exit_code) class MacTool(object): """This class performs all the Mac tooling steps. 
The methods can either be executed directly, or dispatched from an argument list.""" def Dispatch(self, args): """Dispatches a string command to a method.""" if len(args) < 1: raise Exception("Not enough arguments") method = "Exec%s" % self._CommandifyName(args[0]) return getattr(self, method)(*args[1:]) def _CommandifyName(self, name_string): """Transforms a tool name like copy-info-plist to CopyInfoPlist""" return name_string.title().replace('-', '') def ExecCopyBundleResource(self, source, dest, convert_to_binary): """Copies a resource file to the bundle/Resources directory, performing any necessary compilation on each resource.""" extension = os.path.splitext(source)[1].lower() if os.path.isdir(source): # Copy tree. # TODO(thakis): This copies file attributes like mtime, while the # single-file branch below doesn't. This should probably be changed to # be consistent with the single-file branch. if os.path.exists(dest): shutil.rmtree(dest) shutil.copytree(source, dest) elif extension == '.xib': return self._CopyXIBFile(source, dest) elif extension == '.storyboard': return self._CopyXIBFile(source, dest) elif extension == '.strings': self._CopyStringsFile(source, dest, convert_to_binary) else: shutil.copy(source, dest) def _CopyXIBFile(self, source, dest): """Compiles a XIB file with ibtool into a binary plist in the bundle.""" # ibtool sometimes crashes with relative paths. See crbug.com/314728. base = os.path.dirname(os.path.realpath(__file__)) if os.path.relpath(source): source = os.path.join(base, source) if os.path.relpath(dest): dest = os.path.join(base, dest) args = ['xcrun', 'ibtool', '--errors', '--warnings', '--notices', '--output-format', 'human-readable-text', '--compile', dest, source] ibtool_section_re = re.compile(r'/\*.*\*/') ibtool_re = re.compile(r'.*note:.*is clipping its content') ibtoolout = subprocess.Popen(args, stdout=subprocess.PIPE) current_section_header = None for line in ibtoolout.stdout: if ibtool_section_re.match(line): current_section_header = line elif not ibtool_re.match(line): if current_section_header: sys.stdout.write(current_section_header) current_section_header = None sys.stdout.write(line) return ibtoolout.returncode def _ConvertToBinary(self, dest): subprocess.check_call([ 'xcrun', 'plutil', '-convert', 'binary1', '-o', dest, dest]) def _CopyStringsFile(self, source, dest, convert_to_binary): """Copies a .strings file using iconv to reconvert the input into UTF-16.""" input_code = self._DetectInputEncoding(source) or "UTF-8" # Xcode's CpyCopyStringsFile / builtin-copyStrings seems to call # CFPropertyListCreateFromXMLData() behind the scenes; at least it prints # CFPropertyListCreateFromXMLData(): Old-style plist parser: missing # semicolon in dictionary. # on invalid files. Do the same kind of validation. import CoreFoundation s = open(source, 'rb').read() d = CoreFoundation.CFDataCreate(None, s, len(s)) _, error = CoreFoundation.CFPropertyListCreateFromXMLData(None, d, 0, None) if error: return fp = open(dest, 'wb') fp.write(s.decode(input_code).encode('UTF-16')) fp.close() if convert_to_binary == 'True': self._ConvertToBinary(dest) def _DetectInputEncoding(self, file_name): """Reads the first few bytes from file_name and tries to guess the text encoding. 
Returns None as a guess if it can't detect it.""" fp = open(file_name, 'rb') try: header = fp.read(3) except e: fp.close() return None fp.close() if header.startswith("\xFE\xFF"): return "UTF-16" elif header.startswith("\xFF\xFE"): return "UTF-16" elif header.startswith("\xEF\xBB\xBF"): return "UTF-8" else: return None def ExecCopyInfoPlist(self, source, dest, convert_to_binary, *keys): """Copies the |source| Info.plist to the destination directory |dest|.""" # Read the source Info.plist into memory. fd = open(source, 'r') lines = fd.read() fd.close() # Insert synthesized key/value pairs (e.g. BuildMachineOSBuild). plist = plistlib.readPlistFromString(lines) if keys: plist = dict(plist.items() + json.loads(keys[0]).items()) lines = plistlib.writePlistToString(plist) # Go through all the environment variables and replace them as variables in # the file. IDENT_RE = re.compile(r'[/\s]') for key in os.environ: if key.startswith('_'): continue evar = '${%s}' % key evalue = os.environ[key] lines = string.replace(lines, evar, evalue) # Xcode supports various suffices on environment variables, which are # all undocumented. :rfc1034identifier is used in the standard project # template these days, and :identifier was used earlier. They are used to # convert non-url characters into things that look like valid urls -- # except that the replacement character for :identifier, '_' isn't valid # in a URL either -- oops, hence :rfc1034identifier was born. evar = '${%s:identifier}' % key evalue = IDENT_RE.sub('_', os.environ[key]) lines = string.replace(lines, evar, evalue) evar = '${%s:rfc1034identifier}' % key evalue = IDENT_RE.sub('-', os.environ[key]) lines = string.replace(lines, evar, evalue) # Remove any keys with values that haven't been replaced. lines = lines.split('\n') for i in range(len(lines)): if lines[i].strip().startswith("${"): lines[i] = None lines[i - 1] = None lines = '\n'.join(filter(lambda x: x is not None, lines)) # Write out the file with variables replaced. fd = open(dest, 'w') fd.write(lines) fd.close() # Now write out PkgInfo file now that the Info.plist file has been # "compiled". self._WritePkgInfo(dest) if convert_to_binary == 'True': self._ConvertToBinary(dest) def _WritePkgInfo(self, info_plist): """This writes the PkgInfo file from the data stored in Info.plist.""" plist = plistlib.readPlist(info_plist) if not plist: return # Only create PkgInfo for executable types. package_type = plist['CFBundlePackageType'] if package_type != 'APPL': return # The format of PkgInfo is eight characters, representing the bundle type # and bundle signature, each four characters. If that is missing, four # '?' characters are used instead. signature_code = plist.get('CFBundleSignature', '????') if len(signature_code) != 4: # Wrong length resets everything, too. signature_code = '?' * 4 dest = os.path.join(os.path.dirname(info_plist), 'PkgInfo') fp = open(dest, 'w') fp.write('%s%s' % (package_type, signature_code)) fp.close() def ExecFlock(self, lockfile, *cmd_list): """Emulates the most basic behavior of Linux's flock(1).""" # Rely on exception handling to report errors. 
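    # Illustrative invocation, assuming the gyp-mac-tool wrapper named in the
    # module docstring dispatches here via Dispatch()/_CommandifyName:
    #   gyp-mac-tool flock /tmp/build.lock libtool -static -o out.a a.o
    # runs ExecFlock('/tmp/build.lock', 'libtool', '-static', '-o',
    # 'out.a', 'a.o').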
fd = os.open(lockfile, os.O_RDONLY|os.O_NOCTTY|os.O_CREAT, 0o666) fcntl.flock(fd, fcntl.LOCK_EX) return subprocess.call(cmd_list) def ExecFilterLibtool(self, *cmd_list): """Calls libtool and filters out '/path/to/libtool: file: foo.o has no symbols'.""" libtool_re = re.compile(r'^.*libtool: file: .* has no symbols$') libtool_re5 = re.compile( r'^.*libtool: warning for library: ' + r'.* the table of contents is empty ' + r'\(no object file members in the library define global symbols\)$') env = os.environ.copy() # Ref: # http://www.opensource.apple.com/source/cctools/cctools-809/misc/libtool.c # The problem with this flag is that it resets the file mtime on the file to # epoch=0, e.g. 1970-1-1 or 1969-12-31 depending on timezone. env['ZERO_AR_DATE'] = '1' libtoolout = subprocess.Popen(cmd_list, stderr=subprocess.PIPE, env=env) _, err = libtoolout.communicate() for line in err.splitlines(): if not libtool_re.match(line) and not libtool_re5.match(line): print >>sys.stderr, line # Unconditionally touch the output .a file on the command line if present # and the command succeeded. A bit hacky. if not libtoolout.returncode: for i in range(len(cmd_list) - 1): if cmd_list[i] == "-o" and cmd_list[i+1].endswith('.a'): os.utime(cmd_list[i+1], None) break return libtoolout.returncode def ExecPackageFramework(self, framework, version): """Takes a path to Something.framework and the Current version of that and sets up all the symlinks.""" # Find the name of the binary based on the part before the ".framework". binary = os.path.basename(framework).split('.')[0] CURRENT = 'Current' RESOURCES = 'Resources' VERSIONS = 'Versions' if not os.path.exists(os.path.join(framework, VERSIONS, version, binary)): # Binary-less frameworks don't seem to contain symlinks (see e.g. # chromium's out/Debug/org.chromium.Chromium.manifest/ bundle). return # Move into the framework directory to set the symlinks correctly. pwd = os.getcwd() os.chdir(framework) # Set up the Current version. self._Relink(version, os.path.join(VERSIONS, CURRENT)) # Set up the root symlinks. self._Relink(os.path.join(VERSIONS, CURRENT, binary), binary) self._Relink(os.path.join(VERSIONS, CURRENT, RESOURCES), RESOURCES) # Back to where we were before! os.chdir(pwd) def _Relink(self, dest, link): """Creates a symlink to |dest| named |link|. If |link| already exists, it is overwritten.""" if os.path.lexists(link): os.remove(link) os.symlink(dest, link) def ExecCompileXcassets(self, keys, *inputs): """Compiles multiple .xcassets files into a single .car file. This invokes 'actool' to compile all the inputs .xcassets files. The |keys| arguments is a json-encoded dictionary of extra arguments to pass to 'actool' when the asset catalogs contains an application icon or a launch image. Note that 'actool' does not create the Assets.car file if the asset catalogs does not contains imageset. 
""" command_line = [ 'xcrun', 'actool', '--output-format', 'human-readable-text', '--compress-pngs', '--notices', '--warnings', '--errors', ] is_iphone_target = 'IPHONEOS_DEPLOYMENT_TARGET' in os.environ if is_iphone_target: platform = os.environ['CONFIGURATION'].split('-')[-1] if platform not in ('iphoneos', 'iphonesimulator'): platform = 'iphonesimulator' command_line.extend([ '--platform', platform, '--target-device', 'iphone', '--target-device', 'ipad', '--minimum-deployment-target', os.environ['IPHONEOS_DEPLOYMENT_TARGET'], '--compile', os.path.abspath(os.environ['CONTENTS_FOLDER_PATH']), ]) else: command_line.extend([ '--platform', 'macosx', '--target-device', 'mac', '--minimum-deployment-target', os.environ['MACOSX_DEPLOYMENT_TARGET'], '--compile', os.path.abspath(os.environ['UNLOCALIZED_RESOURCES_FOLDER_PATH']), ]) if keys: keys = json.loads(keys) for key, value in keys.iteritems(): arg_name = '--' + key if isinstance(value, bool): if value: command_line.append(arg_name) elif isinstance(value, list): for v in value: command_line.append(arg_name) command_line.append(str(v)) else: command_line.append(arg_name) command_line.append(str(value)) # Note: actool crashes if inputs path are relative, so use os.path.abspath # to get absolute path name for inputs. command_line.extend(map(os.path.abspath, inputs)) subprocess.check_call(command_line) def ExecMergeInfoPlist(self, output, *inputs): """Merge multiple .plist files into a single .plist file.""" merged_plist = {} for path in inputs: plist = self._LoadPlistMaybeBinary(path) self._MergePlist(merged_plist, plist) plistlib.writePlist(merged_plist, output) def ExecCodeSignBundle(self, key, resource_rules, entitlements, provisioning): """Code sign a bundle. This function tries to code sign an iOS bundle, following the same algorithm as Xcode: 1. copy ResourceRules.plist from the user or the SDK into the bundle, 2. pick the provisioning profile that best match the bundle identifier, and copy it into the bundle as embedded.mobileprovision, 3. copy Entitlements.plist from user or SDK next to the bundle, 4. code sign the bundle. """ resource_rules_path = self._InstallResourceRules(resource_rules) substitutions, overrides = self._InstallProvisioningProfile( provisioning, self._GetCFBundleIdentifier()) entitlements_path = self._InstallEntitlements( entitlements, substitutions, overrides) subprocess.check_call([ 'codesign', '--force', '--sign', key, '--resource-rules', resource_rules_path, '--entitlements', entitlements_path, os.path.join( os.environ['TARGET_BUILD_DIR'], os.environ['FULL_PRODUCT_NAME'])]) def _InstallResourceRules(self, resource_rules): """Installs ResourceRules.plist from user or SDK into the bundle. Args: resource_rules: string, optional, path to the ResourceRules.plist file to use, default to "${SDKROOT}/ResourceRules.plist" Returns: Path to the copy of ResourceRules.plist into the bundle. """ source_path = resource_rules target_path = os.path.join( os.environ['BUILT_PRODUCTS_DIR'], os.environ['CONTENTS_FOLDER_PATH'], 'ResourceRules.plist') if not source_path: source_path = os.path.join( os.environ['SDKROOT'], 'ResourceRules.plist') shutil.copy2(source_path, target_path) return target_path def _InstallProvisioningProfile(self, profile, bundle_identifier): """Installs embedded.mobileprovision into the bundle. 
Args: profile: string, optional, short name of the .mobileprovision file to use, if empty or the file is missing, the best file installed will be used bundle_identifier: string, value of CFBundleIdentifier from Info.plist Returns: A tuple containing two dictionary: variables substitutions and values to overrides when generating the entitlements file. """ source_path, provisioning_data, team_id = self._FindProvisioningProfile( profile, bundle_identifier) target_path = os.path.join( os.environ['BUILT_PRODUCTS_DIR'], os.environ['CONTENTS_FOLDER_PATH'], 'embedded.mobileprovision') shutil.copy2(source_path, target_path) substitutions = self._GetSubstitutions(bundle_identifier, team_id + '.') return substitutions, provisioning_data['Entitlements'] def _FindProvisioningProfile(self, profile, bundle_identifier): """Finds the .mobileprovision file to use for signing the bundle. Checks all the installed provisioning profiles (or if the user specified the PROVISIONING_PROFILE variable, only consult it) and select the most specific that correspond to the bundle identifier. Args: profile: string, optional, short name of the .mobileprovision file to use, if empty or the file is missing, the best file installed will be used bundle_identifier: string, value of CFBundleIdentifier from Info.plist Returns: A tuple of the path to the selected provisioning profile, the data of the embedded plist in the provisioning profile and the team identifier to use for code signing. Raises: SystemExit: if no .mobileprovision can be used to sign the bundle. """ profiles_dir = os.path.join( os.environ['HOME'], 'Library', 'MobileDevice', 'Provisioning Profiles') if not os.path.isdir(profiles_dir): print >>sys.stderr, ( 'cannot find mobile provisioning for %s' % bundle_identifier) sys.exit(1) provisioning_profiles = None if profile: profile_path = os.path.join(profiles_dir, profile + '.mobileprovision') if os.path.exists(profile_path): provisioning_profiles = [profile_path] if not provisioning_profiles: provisioning_profiles = glob.glob( os.path.join(profiles_dir, '*.mobileprovision')) valid_provisioning_profiles = {} for profile_path in provisioning_profiles: profile_data = self._LoadProvisioningProfile(profile_path) app_id_pattern = profile_data.get( 'Entitlements', {}).get('application-identifier', '') for team_identifier in profile_data.get('TeamIdentifier', []): app_id = '%s.%s' % (team_identifier, bundle_identifier) if fnmatch.fnmatch(app_id, app_id_pattern): valid_provisioning_profiles[app_id_pattern] = ( profile_path, profile_data, team_identifier) if not valid_provisioning_profiles: print >>sys.stderr, ( 'cannot find mobile provisioning for %s' % bundle_identifier) sys.exit(1) # If the user has multiple provisioning profiles installed that can be # used for ${bundle_identifier}, pick the most specific one (ie. the # provisioning profile whose pattern is the longest). selected_key = max(valid_provisioning_profiles, key=lambda v: len(v)) return valid_provisioning_profiles[selected_key] def _LoadProvisioningProfile(self, profile_path): """Extracts the plist embedded in a provisioning profile. Args: profile_path: string, path to the .mobileprovision file Returns: Content of the plist embedded in the provisioning profile as a dictionary. 
""" with tempfile.NamedTemporaryFile() as temp: subprocess.check_call([ 'security', 'cms', '-D', '-i', profile_path, '-o', temp.name]) return self._LoadPlistMaybeBinary(temp.name) def _MergePlist(self, merged_plist, plist): """Merge |plist| into |merged_plist|.""" for key, value in plist.iteritems(): if isinstance(value, dict): merged_value = merged_plist.get(key, {}) if isinstance(merged_value, dict): self._MergePlist(merged_value, value) merged_plist[key] = merged_value else: merged_plist[key] = value else: merged_plist[key] = value def _LoadPlistMaybeBinary(self, plist_path): """Loads into a memory a plist possibly encoded in binary format. This is a wrapper around plistlib.readPlist that tries to convert the plist to the XML format if it can't be parsed (assuming that it is in the binary format). Args: plist_path: string, path to a plist file, in XML or binary format Returns: Content of the plist as a dictionary. """ try: # First, try to read the file using plistlib that only supports XML, # and if an exception is raised, convert a temporary copy to XML and # load that copy. return plistlib.readPlist(plist_path) except: pass with tempfile.NamedTemporaryFile() as temp: shutil.copy2(plist_path, temp.name) subprocess.check_call(['plutil', '-convert', 'xml1', temp.name]) return plistlib.readPlist(temp.name) def _GetSubstitutions(self, bundle_identifier, app_identifier_prefix): """Constructs a dictionary of variable substitutions for Entitlements.plist. Args: bundle_identifier: string, value of CFBundleIdentifier from Info.plist app_identifier_prefix: string, value for AppIdentifierPrefix Returns: Dictionary of substitutions to apply when generating Entitlements.plist. """ return { 'CFBundleIdentifier': bundle_identifier, 'AppIdentifierPrefix': app_identifier_prefix, } def _GetCFBundleIdentifier(self): """Extracts CFBundleIdentifier value from Info.plist in the bundle. Returns: Value of CFBundleIdentifier in the Info.plist located in the bundle. """ info_plist_path = os.path.join( os.environ['TARGET_BUILD_DIR'], os.environ['INFOPLIST_PATH']) info_plist_data = self._LoadPlistMaybeBinary(info_plist_path) return info_plist_data['CFBundleIdentifier'] def _InstallEntitlements(self, entitlements, substitutions, overrides): """Generates and install the ${BundleName}.xcent entitlements file. Expands variables "$(variable)" pattern in the source entitlements file, add extra entitlements defined in the .mobileprovision file and the copy the generated plist to "${BundlePath}.xcent". Args: entitlements: string, optional, path to the Entitlements.plist template to use, defaults to "${SDKROOT}/Entitlements.plist" substitutions: dictionary, variable substitutions overrides: dictionary, values to add to the entitlements Returns: Path to the generated entitlements file. """ source_path = entitlements target_path = os.path.join( os.environ['BUILT_PRODUCTS_DIR'], os.environ['PRODUCT_NAME'] + '.xcent') if not source_path: source_path = os.path.join( os.environ['SDKROOT'], 'Entitlements.plist') shutil.copy2(source_path, target_path) data = self._LoadPlistMaybeBinary(target_path) data = self._ExpandVariables(data, substitutions) if overrides: for key in overrides: if key not in data: data[key] = overrides[key] plistlib.writePlist(data, target_path) return target_path def _ExpandVariables(self, data, substitutions): """Expands variables "$(variable)" in data. 
Args: data: object, can be either string, list or dictionary substitutions: dictionary, variable substitutions to perform Returns: Copy of data where each references to "$(variable)" has been replaced by the corresponding value found in substitutions, or left intact if the key was not found. """ if isinstance(data, str): for key, value in substitutions.iteritems(): data = data.replace('$(%s)' % key, value) return data if isinstance(data, list): return [self._ExpandVariables(v, substitutions) for v in data] if isinstance(data, dict): return {k: self._ExpandVariables(data[k], substitutions) for k in data} return data if __name__ == '__main__': sys.exit(main(sys.argv[1:])) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/msvs_emulation.py0000644000000000000000000013512112631326456024030 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ This module helps emulate Visual Studio 2008 behavior on top of other build systems, primarily ninja. """ import os import re import subprocess import sys from gyp.common import OrderedSet import gyp.MSVSUtil import gyp.MSVSVersion windows_quoter_regex = re.compile(r'(\\*)"') def QuoteForRspFile(arg): """Quote a command line argument so that it appears as one argument when processed via cmd.exe and parsed by CommandLineToArgvW (as is typical for Windows programs).""" # See http://goo.gl/cuFbX and http://goo.gl/dhPnp including the comment # threads. This is actually the quoting rules for CommandLineToArgvW, not # for the shell, because the shell doesn't do anything in Windows. This # works more or less because most programs (including the compiler, etc.) # use that function to handle command line arguments. # For a literal quote, CommandLineToArgvW requires 2n+1 backslashes # preceding it, and results in n backslashes + the quote. So we substitute # in 2* what we match, +1 more, plus the quote. arg = windows_quoter_regex.sub(lambda mo: 2 * mo.group(1) + '\\"', arg) # %'s also need to be doubled otherwise they're interpreted as batch # positional arguments. Also make sure to escape the % so that they're # passed literally through escaping so they can be singled to just the # original %. Otherwise, trying to pass the literal representation that # looks like an environment variable to the shell (e.g. %PATH%) would fail. arg = arg.replace('%', '%%') # These commands are used in rsp files, so no escaping for the shell (via ^) # is necessary. # Finally, wrap the whole thing in quotes so that the above quote rule # applies and whitespace isn't a word break. return '"' + arg + '"' def EncodeRspFileList(args): """Process a list of arguments using QuoteCmdExeArgument.""" # Note that the first argument is assumed to be the command. Don't add # quotes around it because then built-ins like 'echo', etc. won't work. # Take care to normpath only the path in the case of 'call ../x.bat' because # otherwise the whole thing is incorrectly interpreted as a path and not # normalized correctly. 
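  # Illustrative sketch under assumed inputs (not from the original source),
  # with Windows os.path.normpath turning '../x.bat' into '..\\x.bat':
  #   EncodeRspFileList(['call ../x.bat', 'a b'])  ->  call ..\x.bat "a b"
  # Only the program path is normalized; the remaining arguments are quoted
  # by QuoteForRspFile.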
if not args: return '' if args[0].startswith('call '): call, program = args[0].split(' ', 1) program = call + ' ' + os.path.normpath(program) else: program = os.path.normpath(args[0]) return program + ' ' + ' '.join(QuoteForRspFile(arg) for arg in args[1:]) def _GenericRetrieve(root, default, path): """Given a list of dictionary keys |path| and a tree of dicts |root|, find value at path, or return |default| if any of the path doesn't exist.""" if not root: return default if not path: return root return _GenericRetrieve(root.get(path[0]), default, path[1:]) def _AddPrefix(element, prefix): """Add |prefix| to |element| or each subelement if element is iterable.""" if element is None: return element # Note, not Iterable because we don't want to handle strings like that. if isinstance(element, list) or isinstance(element, tuple): return [prefix + e for e in element] else: return prefix + element def _DoRemapping(element, map): """If |element| then remap it through |map|. If |element| is iterable then each item will be remapped. Any elements not found will be removed.""" if map is not None and element is not None: if not callable(map): map = map.get # Assume it's a dict, otherwise a callable to do the remap. if isinstance(element, list) or isinstance(element, tuple): element = filter(None, [map(elem) for elem in element]) else: element = map(element) return element def _AppendOrReturn(append, element): """If |append| is None, simply return |element|. If |append| is not None, then add |element| to it, adding each item in |element| if it's a list or tuple.""" if append is not None and element is not None: if isinstance(element, list) or isinstance(element, tuple): append.extend(element) else: append.append(element) else: return element def _FindDirectXInstallation(): """Try to find an installation location for the DirectX SDK. Check for the standard environment variable, and if that doesn't exist, try to find via the registry. May return None if not found in either location.""" # Return previously calculated value, if there is one if hasattr(_FindDirectXInstallation, 'dxsdk_dir'): return _FindDirectXInstallation.dxsdk_dir dxsdk_dir = os.environ.get('DXSDK_DIR') if not dxsdk_dir: # Setup params to pass to and attempt to launch reg.exe. cmd = ['reg.exe', 'query', r'HKLM\Software\Microsoft\DirectX', '/s'] p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE) for line in p.communicate()[0].splitlines(): if 'InstallPath' in line: dxsdk_dir = line.split(' ')[3] + "\\" # Cache return value _FindDirectXInstallation.dxsdk_dir = dxsdk_dir return dxsdk_dir def GetGlobalVSMacroEnv(vs_version): """Get a dict of variables mapping internal VS macro names to their gyp equivalents. Returns all variables that are independent of the target.""" env = {} # '$(VSInstallDir)' and '$(VCInstallDir)' are available when and only when # Visual Studio is actually installed. if vs_version.Path(): env['$(VSInstallDir)'] = vs_version.Path() env['$(VCInstallDir)'] = os.path.join(vs_version.Path(), 'VC') + '\\' # Chromium uses DXSDK_DIR in include/lib paths, but it may or may not be # set. This happens when the SDK is sync'd via src-internal, rather than # by typical end-user installation of the SDK. If it's not set, we don't # want to leave the unexpanded variable in the path, so simply strip it. dxsdk_dir = _FindDirectXInstallation() env['$(DXSDK_DIR)'] = dxsdk_dir if dxsdk_dir else '' # Try to find an installation location for the Windows DDK by checking # the WDK_DIR environment variable, may be None. 
env['$(WDK_DIR)'] = os.environ.get('WDK_DIR', '') return env def ExtractSharedMSVSSystemIncludes(configs, generator_flags): """Finds msvs_system_include_dirs that are common to all targets, removes them from all targets, and returns an OrderedSet containing them.""" all_system_includes = OrderedSet( configs[0].get('msvs_system_include_dirs', [])) for config in configs[1:]: system_includes = config.get('msvs_system_include_dirs', []) all_system_includes = all_system_includes & OrderedSet(system_includes) if not all_system_includes: return None # Expand macros in all_system_includes. env = GetGlobalVSMacroEnv(GetVSVersion(generator_flags)) expanded_system_includes = OrderedSet([ExpandMacros(include, env) for include in all_system_includes]) if any(['$' in include for include in expanded_system_includes]): # Some path relies on target-specific variables, bail. return None # Remove system includes shared by all targets from the targets. for config in configs: includes = config.get('msvs_system_include_dirs', []) if includes: # Don't insert a msvs_system_include_dirs key if not needed. # This must check the unexpanded includes list: new_includes = [i for i in includes if i not in all_system_includes] config['msvs_system_include_dirs'] = new_includes return expanded_system_includes class MsvsSettings(object): """A class that understands the gyp 'msvs_...' values (especially the msvs_settings field). They largely correpond to the VS2008 IDE DOM. This class helps map those settings to command line options.""" def __init__(self, spec, generator_flags): self.spec = spec self.vs_version = GetVSVersion(generator_flags) supported_fields = [ ('msvs_configuration_attributes', dict), ('msvs_settings', dict), ('msvs_system_include_dirs', list), ('msvs_disabled_warnings', list), ('msvs_precompiled_header', str), ('msvs_precompiled_source', str), ('msvs_configuration_platform', str), ('msvs_target_platform', str), ] configs = spec['configurations'] for field, default in supported_fields: setattr(self, field, {}) for configname, config in configs.iteritems(): getattr(self, field)[configname] = config.get(field, default()) self.msvs_cygwin_dirs = spec.get('msvs_cygwin_dirs', ['.']) unsupported_fields = [ 'msvs_prebuild', 'msvs_postbuild', ] unsupported = [] for field in unsupported_fields: for config in configs.values(): if field in config: unsupported += ["%s not supported (target %s)." % (field, spec['target_name'])] if unsupported: raise Exception('\n'.join(unsupported)) def GetExtension(self): """Returns the extension for the target, with no leading dot. Uses 'product_extension' if specified, otherwise uses MSVS defaults based on the target type. """ ext = self.spec.get('product_extension', None) if ext: return ext return gyp.MSVSUtil.TARGET_TYPE_EXT.get(self.spec['type'], '') def GetVSMacroEnv(self, base_to_build=None, config=None): """Get a dict of variables mapping internal VS macro names to their gyp equivalents.""" target_platform = 'Win32' if self.GetArch(config) == 'x86' else 'x64' target_name = self.spec.get('product_prefix', '') + \ self.spec.get('product_name', self.spec['target_name']) target_dir = base_to_build + '\\' if base_to_build else '' target_ext = '.' 
+ self.GetExtension() target_file_name = target_name + target_ext replacements = { '$(InputName)': '${root}', '$(InputPath)': '${source}', '$(IntDir)': '$!INTERMEDIATE_DIR', '$(OutDir)\\': target_dir, '$(PlatformName)': target_platform, '$(ProjectDir)\\': '', '$(ProjectName)': self.spec['target_name'], '$(TargetDir)\\': target_dir, '$(TargetExt)': target_ext, '$(TargetFileName)': target_file_name, '$(TargetName)': target_name, '$(TargetPath)': os.path.join(target_dir, target_file_name), } replacements.update(GetGlobalVSMacroEnv(self.vs_version)) return replacements def ConvertVSMacros(self, s, base_to_build=None, config=None): """Convert from VS macro names to something equivalent.""" env = self.GetVSMacroEnv(base_to_build, config=config) return ExpandMacros(s, env) def AdjustLibraries(self, libraries): """Strip -l from library if it's specified with that.""" libs = [lib[2:] if lib.startswith('-l') else lib for lib in libraries] return [lib + '.lib' if not lib.endswith('.lib') else lib for lib in libs] def _GetAndMunge(self, field, path, default, prefix, append, map): """Retrieve a value from |field| at |path| or return |default|. If |append| is specified, and the item is found, it will be appended to that object instead of returned. If |map| is specified, results will be remapped through |map| before being returned or appended.""" result = _GenericRetrieve(field, default, path) result = _DoRemapping(result, map) result = _AddPrefix(result, prefix) return _AppendOrReturn(append, result) class _GetWrapper(object): def __init__(self, parent, field, base_path, append=None): self.parent = parent self.field = field self.base_path = [base_path] self.append = append def __call__(self, name, map=None, prefix='', default=None): return self.parent._GetAndMunge(self.field, self.base_path + [name], default=default, prefix=prefix, append=self.append, map=map) def GetArch(self, config): """Get architecture based on msvs_configuration_platform and msvs_target_platform. Returns either 'x86' or 'x64'.""" configuration_platform = self.msvs_configuration_platform.get(config, '') platform = self.msvs_target_platform.get(config, '') if not platform: # If no specific override, use the configuration's. platform = configuration_platform # Map from platform to architecture. return {'Win32': 'x86', 'x64': 'x64'}.get(platform, 'x86') def _TargetConfig(self, config): """Returns the target-specific configuration.""" # There's two levels of architecture/platform specification in VS. The # first level is globally for the configuration (this is what we consider # "the" config at the gyp level, which will be something like 'Debug' or # 'Release_x64'), and a second target-specific configuration, which is an # override for the global one. |config| is remapped here to take into # account the local target-specific overrides to the global configuration. 
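    # Illustrative sketch under assumed settings (not from the original
    # source): with msvs_target_platform 'x64' for config 'Debug',
    # _TargetConfig('Debug') returns 'Debug_x64'; conversely, an 'x86'
    # override maps 'Release_x64' back to 'Release'.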
arch = self.GetArch(config) if arch == 'x64' and not config.endswith('_x64'): config += '_x64' if arch == 'x86' and config.endswith('_x64'): config = config.rsplit('_', 1)[0] return config def _Setting(self, path, config, default=None, prefix='', append=None, map=None): """_GetAndMunge for msvs_settings.""" return self._GetAndMunge( self.msvs_settings[config], path, default, prefix, append, map) def _ConfigAttrib(self, path, config, default=None, prefix='', append=None, map=None): """_GetAndMunge for msvs_configuration_attributes.""" return self._GetAndMunge( self.msvs_configuration_attributes[config], path, default, prefix, append, map) def AdjustIncludeDirs(self, include_dirs, config): """Updates include_dirs to expand VS specific paths, and adds the system include dirs used for platform SDK and similar.""" config = self._TargetConfig(config) includes = include_dirs + self.msvs_system_include_dirs[config] includes.extend(self._Setting( ('VCCLCompilerTool', 'AdditionalIncludeDirectories'), config, default=[])) return [self.ConvertVSMacros(p, config=config) for p in includes] def AdjustMidlIncludeDirs(self, midl_include_dirs, config): """Updates midl_include_dirs to expand VS specific paths, and adds the system include dirs used for platform SDK and similar.""" config = self._TargetConfig(config) includes = midl_include_dirs + self.msvs_system_include_dirs[config] includes.extend(self._Setting( ('VCMIDLTool', 'AdditionalIncludeDirectories'), config, default=[])) return [self.ConvertVSMacros(p, config=config) for p in includes] def GetComputedDefines(self, config): """Returns the set of defines that are injected to the defines list based on other VS settings.""" config = self._TargetConfig(config) defines = [] if self._ConfigAttrib(['CharacterSet'], config) == '1': defines.extend(('_UNICODE', 'UNICODE')) if self._ConfigAttrib(['CharacterSet'], config) == '2': defines.append('_MBCS') defines.extend(self._Setting( ('VCCLCompilerTool', 'PreprocessorDefinitions'), config, default=[])) return defines def GetCompilerPdbName(self, config, expand_special): """Get the pdb file name that should be used for compiler invocations, or None if there's no explicit name specified.""" config = self._TargetConfig(config) pdbname = self._Setting( ('VCCLCompilerTool', 'ProgramDataBaseFileName'), config) if pdbname: pdbname = expand_special(self.ConvertVSMacros(pdbname)) return pdbname def GetMapFileName(self, config, expand_special): """Gets the explicitly overriden map file name for a target or returns None if it's not set.""" config = self._TargetConfig(config) map_file = self._Setting(('VCLinkerTool', 'MapFileName'), config) if map_file: map_file = expand_special(self.ConvertVSMacros(map_file, config=config)) return map_file def GetOutputName(self, config, expand_special): """Gets the explicitly overridden output name for a target or returns None if it's not overridden.""" config = self._TargetConfig(config) type = self.spec['type'] root = 'VCLibrarianTool' if type == 'static_library' else 'VCLinkerTool' # TODO(scottmg): Handle OutputDirectory without OutputFile. 
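    # Illustrative sketch under assumed settings (not from the original
    # source): a shared_library target whose msvs_settings contain
    #   {'VCLinkerTool': {'OutputFile': '$(OutDir)\\foo.dll'}}
    # makes the lookup below return that path with the VS macros expanded.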
output_file = self._Setting((root, 'OutputFile'), config) if output_file: output_file = expand_special(self.ConvertVSMacros( output_file, config=config)) return output_file def GetPDBName(self, config, expand_special, default): """Gets the explicitly overridden pdb name for a target or returns default if it's not overridden, or if no pdb will be generated.""" config = self._TargetConfig(config) output_file = self._Setting(('VCLinkerTool', 'ProgramDatabaseFile'), config) generate_debug_info = self._Setting( ('VCLinkerTool', 'GenerateDebugInformation'), config) if generate_debug_info == 'true': if output_file: return expand_special(self.ConvertVSMacros(output_file, config=config)) else: return default else: return None def GetNoImportLibrary(self, config): """If NoImportLibrary: true, ninja will not expect the output to include an import library.""" config = self._TargetConfig(config) noimplib = self._Setting(('NoImportLibrary',), config) return noimplib == 'true' def GetAsmflags(self, config): """Returns the flags that need to be added to ml invocations.""" config = self._TargetConfig(config) asmflags = [] safeseh = self._Setting(('MASM', 'UseSafeExceptionHandlers'), config) if safeseh == 'true': asmflags.append('/safeseh') return asmflags def GetCflags(self, config): """Returns the flags that need to be added to .c and .cc compilations.""" config = self._TargetConfig(config) cflags = [] cflags.extend(['/wd' + w for w in self.msvs_disabled_warnings[config]]) cl = self._GetWrapper(self, self.msvs_settings[config], 'VCCLCompilerTool', append=cflags) cl('Optimization', map={'0': 'd', '1': '1', '2': '2', '3': 'x'}, prefix='/O', default='2') cl('InlineFunctionExpansion', prefix='/Ob') cl('DisableSpecificWarnings', prefix='/wd') cl('StringPooling', map={'true': '/GF'}) cl('EnableFiberSafeOptimizations', map={'true': '/GT'}) cl('OmitFramePointers', map={'false': '-', 'true': ''}, prefix='/Oy') cl('EnableIntrinsicFunctions', map={'false': '-', 'true': ''}, prefix='/Oi') cl('FavorSizeOrSpeed', map={'1': 't', '2': 's'}, prefix='/O') cl('FloatingPointModel', map={'0': 'precise', '1': 'strict', '2': 'fast'}, prefix='/fp:', default='0') cl('CompileAsManaged', map={'false': '', 'true': '/clr'}) cl('WholeProgramOptimization', map={'true': '/GL'}) cl('WarningLevel', prefix='/W') cl('WarnAsError', map={'true': '/WX'}) cl('CallingConvention', map={'0': 'd', '1': 'r', '2': 'z', '3': 'v'}, prefix='/G') cl('DebugInformationFormat', map={'1': '7', '3': 'i', '4': 'I'}, prefix='/Z') cl('RuntimeTypeInfo', map={'true': '/GR', 'false': '/GR-'}) cl('EnableFunctionLevelLinking', map={'true': '/Gy', 'false': '/Gy-'}) cl('MinimalRebuild', map={'true': '/Gm'}) cl('BufferSecurityCheck', map={'true': '/GS', 'false': '/GS-'}) cl('BasicRuntimeChecks', map={'1': 's', '2': 'u', '3': '1'}, prefix='/RTC') cl('RuntimeLibrary', map={'0': 'T', '1': 'Td', '2': 'D', '3': 'Dd'}, prefix='/M') cl('ExceptionHandling', map={'1': 'sc','2': 'a'}, prefix='/EH') cl('DefaultCharIsUnsigned', map={'true': '/J'}) cl('TreatWChar_tAsBuiltInType', map={'false': '-', 'true': ''}, prefix='/Zc:wchar_t') cl('EnablePREfast', map={'true': '/analyze'}) cl('AdditionalOptions', prefix='') cl('EnableEnhancedInstructionSet', map={'1': 'SSE', '2': 'SSE2', '3': 'AVX', '4': 'IA32', '5': 'AVX2'}, prefix='/arch:') cflags.extend(['/FI' + f for f in self._Setting( ('VCCLCompilerTool', 'ForcedIncludeFiles'), config, default=[])]) if self.vs_version.short_name in ('2013', '2013e', '2015'): # New flag required in 2013 to maintain previous PDB behavior. 
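      # (/FS forces PDB writes to be serialized through mspdbsrv, so that
      # parallel cl.exe invocations can safely share one .pdb file.)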
cflags.append('/FS') # ninja handles parallelism by itself, don't have the compiler do it too. cflags = filter(lambda x: not x.startswith('/MP'), cflags) return cflags def _GetPchFlags(self, config, extension): """Get the flags to be added to the cflags for precompiled header support. """ config = self._TargetConfig(config) # The PCH is only built once by a particular source file. Usage of PCH must # only be for the same language (i.e. C vs. C++), so only include the pch # flags when the language matches. if self.msvs_precompiled_header[config]: source_ext = os.path.splitext(self.msvs_precompiled_source[config])[1] if _LanguageMatchesForPch(source_ext, extension): pch = os.path.split(self.msvs_precompiled_header[config])[1] return ['/Yu' + pch, '/FI' + pch, '/Fp${pchprefix}.' + pch + '.pch'] return [] def GetCflagsC(self, config): """Returns the flags that need to be added to .c compilations.""" config = self._TargetConfig(config) return self._GetPchFlags(config, '.c') def GetCflagsCC(self, config): """Returns the flags that need to be added to .cc compilations.""" config = self._TargetConfig(config) return ['/TP'] + self._GetPchFlags(config, '.cc') def _GetAdditionalLibraryDirectories(self, root, config, gyp_to_build_path): """Get and normalize the list of paths in AdditionalLibraryDirectories setting.""" config = self._TargetConfig(config) libpaths = self._Setting((root, 'AdditionalLibraryDirectories'), config, default=[]) libpaths = [os.path.normpath( gyp_to_build_path(self.ConvertVSMacros(p, config=config))) for p in libpaths] return ['/LIBPATH:"' + p + '"' for p in libpaths] def GetLibFlags(self, config, gyp_to_build_path): """Returns the flags that need to be added to lib commands.""" config = self._TargetConfig(config) libflags = [] lib = self._GetWrapper(self, self.msvs_settings[config], 'VCLibrarianTool', append=libflags) libflags.extend(self._GetAdditionalLibraryDirectories( 'VCLibrarianTool', config, gyp_to_build_path)) lib('LinkTimeCodeGeneration', map={'true': '/LTCG'}) lib('TargetMachine', map={'1': 'X86', '17': 'X64', '3': 'ARM'}, prefix='/MACHINE:') lib('AdditionalOptions') return libflags def GetDefFile(self, gyp_to_build_path): """Returns the .def file from sources, if any. Otherwise returns None.""" spec = self.spec if spec['type'] in ('shared_library', 'loadable_module', 'executable'): def_files = [s for s in spec.get('sources', []) if s.endswith('.def')] if len(def_files) == 1: return gyp_to_build_path(def_files[0]) elif len(def_files) > 1: raise Exception("Multiple .def files") return None def _GetDefFileAsLdflags(self, ldflags, gyp_to_build_path): """.def files get implicitly converted to a ModuleDefinitionFile for the linker in the VS generator. 
Emulate that behaviour here.""" def_file = self.GetDefFile(gyp_to_build_path) if def_file: ldflags.append('/DEF:"%s"' % def_file) def GetPGDName(self, config, expand_special): """Gets the explicitly overridden pgd name for a target or returns None if it's not overridden.""" config = self._TargetConfig(config) output_file = self._Setting( ('VCLinkerTool', 'ProfileGuidedDatabase'), config) if output_file: output_file = expand_special(self.ConvertVSMacros( output_file, config=config)) return output_file def GetLdflags(self, config, gyp_to_build_path, expand_special, manifest_base_name, output_name, is_executable, build_dir): """Returns the flags that need to be added to link commands, and the manifest files.""" config = self._TargetConfig(config) ldflags = [] ld = self._GetWrapper(self, self.msvs_settings[config], 'VCLinkerTool', append=ldflags) self._GetDefFileAsLdflags(ldflags, gyp_to_build_path) ld('GenerateDebugInformation', map={'true': '/DEBUG'}) ld('TargetMachine', map={'1': 'X86', '17': 'X64', '3': 'ARM'}, prefix='/MACHINE:') ldflags.extend(self._GetAdditionalLibraryDirectories( 'VCLinkerTool', config, gyp_to_build_path)) ld('DelayLoadDLLs', prefix='/DELAYLOAD:') ld('TreatLinkerWarningAsErrors', prefix='/WX', map={'true': '', 'false': ':NO'}) out = self.GetOutputName(config, expand_special) if out: ldflags.append('/OUT:' + out) pdb = self.GetPDBName(config, expand_special, output_name + '.pdb') if pdb: ldflags.append('/PDB:' + pdb) pgd = self.GetPGDName(config, expand_special) if pgd: ldflags.append('/PGD:' + pgd) map_file = self.GetMapFileName(config, expand_special) ld('GenerateMapFile', map={'true': '/MAP:' + map_file if map_file else '/MAP'}) ld('MapExports', map={'true': '/MAPINFO:EXPORTS'}) ld('AdditionalOptions', prefix='') minimum_required_version = self._Setting( ('VCLinkerTool', 'MinimumRequiredVersion'), config, default='') if minimum_required_version: minimum_required_version = ',' + minimum_required_version ld('SubSystem', map={'1': 'CONSOLE%s' % minimum_required_version, '2': 'WINDOWS%s' % minimum_required_version}, prefix='/SUBSYSTEM:') stack_reserve_size = self._Setting( ('VCLinkerTool', 'StackReserveSize'), config, default='') if stack_reserve_size: stack_commit_size = self._Setting( ('VCLinkerTool', 'StackCommitSize'), config, default='') if stack_commit_size: stack_commit_size = ',' + stack_commit_size ldflags.append('/STACK:%s%s' % (stack_reserve_size, stack_commit_size)) ld('TerminalServerAware', map={'1': ':NO', '2': ''}, prefix='/TSAWARE') ld('LinkIncremental', map={'1': ':NO', '2': ''}, prefix='/INCREMENTAL') ld('BaseAddress', prefix='/BASE:') ld('FixedBaseAddress', map={'1': ':NO', '2': ''}, prefix='/FIXED') ld('RandomizedBaseAddress', map={'1': ':NO', '2': ''}, prefix='/DYNAMICBASE') ld('DataExecutionPrevention', map={'1': ':NO', '2': ''}, prefix='/NXCOMPAT') ld('OptimizeReferences', map={'1': 'NOREF', '2': 'REF'}, prefix='/OPT:') ld('ForceSymbolReferences', prefix='/INCLUDE:') ld('EnableCOMDATFolding', map={'1': 'NOICF', '2': 'ICF'}, prefix='/OPT:') ld('LinkTimeCodeGeneration', map={'1': '', '2': ':PGINSTRUMENT', '3': ':PGOPTIMIZE', '4': ':PGUPDATE'}, prefix='/LTCG') ld('IgnoreDefaultLibraryNames', prefix='/NODEFAULTLIB:') ld('ResourceOnlyDLL', map={'true': '/NOENTRY'}) ld('EntryPointSymbol', prefix='/ENTRY:') ld('Profile', map={'true': '/PROFILE'}) ld('LargeAddressAware', map={'1': ':NO', '2': ''}, prefix='/LARGEADDRESSAWARE') # TODO(scottmg): This should sort of be somewhere else (not really a flag). 
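# The /STACK handling above composes 'reserve[,commit]' from two separate
# VCLinkerTool settings. The same composition in isolation, with sample
# values (the helper name is illustrative only):
def _stack_flag(reserve, commit=''):
  # The commit size is appended only when present, mirroring the code above.
  return '/STACK:%s%s' % (reserve, ',' + commit if commit else '')

assert _stack_flag('1048576') == '/STACK:1048576'
assert _stack_flag('1048576', '4096') == '/STACK:1048576,4096'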
ld('AdditionalDependencies', prefix='') if self.GetArch(config) == 'x86': safeseh_default = 'true' else: safeseh_default = None ld('ImageHasSafeExceptionHandlers', map={'false': ':NO', 'true': ''}, prefix='/SAFESEH', default=safeseh_default) # If the base address is not specifically controlled, DYNAMICBASE should # be on by default. base_flags = filter(lambda x: 'DYNAMICBASE' in x or x == '/FIXED', ldflags) if not base_flags: ldflags.append('/DYNAMICBASE') # If the NXCOMPAT flag has not been specified, default to on. Despite the # documentation that says this only defaults to on when the subsystem is # Vista or greater (which applies to the linker), the IDE defaults it on # unless it's explicitly off. if not filter(lambda x: 'NXCOMPAT' in x, ldflags): ldflags.append('/NXCOMPAT') have_def_file = filter(lambda x: x.startswith('/DEF:'), ldflags) manifest_flags, intermediate_manifest, manifest_files = \ self._GetLdManifestFlags(config, manifest_base_name, gyp_to_build_path, is_executable and not have_def_file, build_dir) ldflags.extend(manifest_flags) return ldflags, intermediate_manifest, manifest_files def _GetLdManifestFlags(self, config, name, gyp_to_build_path, allow_isolation, build_dir): """Returns a 3-tuple: - the set of flags that need to be added to the link to generate a default manifest - the intermediate manifest that the linker will generate that should be used to assert it doesn't add anything to the merged one. - the list of all the manifest files to be merged by the manifest tool and included into the link.""" generate_manifest = self._Setting(('VCLinkerTool', 'GenerateManifest'), config, default='true') if generate_manifest != 'true': # This means not only that the linker should not generate the intermediate # manifest but also that the manifest tool should do nothing even when # additional manifests are specified. return ['/MANIFEST:NO'], [], [] output_name = name + '.intermediate.manifest' flags = [ '/MANIFEST', '/ManifestFile:' + output_name, ] # Instead of using the MANIFESTUAC flags, we generate a .manifest to # include into the list of manifests. This allows us to avoid the need to # do two passes during linking. The /MANIFEST flag and /ManifestFile are # still used, and the intermediate manifest is used to assert that the # final manifest we get from merging all the additional manifest files # (plus the one we generate here) isn't modified by merging the # intermediate into it. # Always NO, because we generate a manifest file that has what we want. flags.append('/MANIFESTUAC:NO') config = self._TargetConfig(config) enable_uac = self._Setting(('VCLinkerTool', 'EnableUAC'), config, default='true') manifest_files = [] generated_manifest_outer = \ "<?xml version='1.0' encoding='UTF-8' standalone='yes'?>" \ "<assembly xmlns='urn:schemas-microsoft-com:asm.v1' manifestVersion='1.0'>%s" \ "</assembly>" if enable_uac == 'true': execution_level = self._Setting(('VCLinkerTool', 'UACExecutionLevel'), config, default='0') execution_level_map = { '0': 'asInvoker', '1': 'highestAvailable', '2': 'requireAdministrator' } ui_access = self._Setting(('VCLinkerTool', 'UACUIAccess'), config, default='false') inner = ''' <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3"> <security> <requestedPrivileges> <requestedExecutionLevel level='%s' uiAccess='%s' /> </requestedPrivileges> </security> </trustInfo>''' % (execution_level_map[execution_level], ui_access) else: inner = '' generated_manifest_contents = generated_manifest_outer % inner generated_name = name + '.generated.manifest' # Need to join with the build_dir here as we're writing it during # generation time, but we return the un-joined version because the build # will occur in that directory. We only write the file if the contents # have changed so that simply regenerating the project files doesn't # cause a relink.
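# gyp.common.WriteOnDiff below only rewrites the generated manifest when
# its contents actually change, which is what keeps project regeneration
# from causing relinks. A minimal sketch of that write-if-changed idea
# using only the stdlib (not gyp's actual WriteOnDiff implementation):
import os

def _write_if_changed(path, contents):
  # Leave the file (and its mtime) untouched when nothing changed.
  if os.path.exists(path):
    with open(path, 'r') as f:
      if f.read() == contents:
        return False
  with open(path, 'w') as f:
    f.write(contents)
  return True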
build_dir_generated_name = os.path.join(build_dir, generated_name) gyp.common.EnsureDirExists(build_dir_generated_name) f = gyp.common.WriteOnDiff(build_dir_generated_name) f.write(generated_manifest_contents) f.close() manifest_files = [generated_name] if allow_isolation: flags.append('/ALLOWISOLATION') manifest_files += self._GetAdditionalManifestFiles(config, gyp_to_build_path) return flags, output_name, manifest_files def _GetAdditionalManifestFiles(self, config, gyp_to_build_path): """Gets additional manifest files that are added to the default one generated by the linker.""" files = self._Setting(('VCManifestTool', 'AdditionalManifestFiles'), config, default=[]) if isinstance(files, str): files = files.split(';') return [os.path.normpath( gyp_to_build_path(self.ConvertVSMacros(f, config=config))) for f in files] def IsUseLibraryDependencyInputs(self, config): """Returns whether the target should be linked via Use Library Dependency Inputs (using component .objs of a given .lib).""" config = self._TargetConfig(config) uldi = self._Setting(('VCLinkerTool', 'UseLibraryDependencyInputs'), config) return uldi == 'true' def IsEmbedManifest(self, config): """Returns whether manifest should be linked into binary.""" config = self._TargetConfig(config) embed = self._Setting(('VCManifestTool', 'EmbedManifest'), config, default='true') return embed == 'true' def IsLinkIncremental(self, config): """Returns whether the target should be linked incrementally.""" config = self._TargetConfig(config) link_inc = self._Setting(('VCLinkerTool', 'LinkIncremental'), config) return link_inc != '1' def GetRcflags(self, config, gyp_to_ninja_path): """Returns the flags that need to be added to invocations of the resource compiler.""" config = self._TargetConfig(config) rcflags = [] rc = self._GetWrapper(self, self.msvs_settings[config], 'VCResourceCompilerTool', append=rcflags) rc('AdditionalIncludeDirectories', map=gyp_to_ninja_path, prefix='/I') rcflags.append('/I' + gyp_to_ninja_path('.')) rc('PreprocessorDefinitions', prefix='/d') # /l arg must be in hex without leading '0x' rc('Culture', prefix='/l', map=lambda x: hex(int(x))[2:]) return rcflags def BuildCygwinBashCommandLine(self, args, path_to_base): """Build a command line that runs args via cygwin bash. We assume that all incoming paths are in Windows normpath'd form, so they need to be converted to posix style for the part of the command line that's passed to bash. We also have to do some Visual Studio macro emulation here because various rules use magic VS names for things. Also note that rules that contain ninja variables cannot be fixed here (for example ${source}), so the outer generator needs to make sure that the paths that are written out are in posix style, if the command line will be used here.""" cygwin_dir = os.path.normpath( os.path.join(path_to_base, self.msvs_cygwin_dirs[0])) cd = ('cd %s' % path_to_base).replace('\\', '/') args = [a.replace('\\', '/').replace('"', '\\"') for a in args] args = ["'%s'" % a.replace("'", "'\\''") for a in args] bash_cmd = ' '.join(args) cmd = ( 'call "%s\\setup_env.bat" && set CYGWIN=nontsec && ' % cygwin_dir + 'bash -c "%s ; %s"' % (cd, bash_cmd)) return cmd def IsRuleRunUnderCygwin(self, rule): """Determine if an action should be run under cygwin. 
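# BuildCygwinBashCommandLine above wraps every argument in single quotes
# for bash and escapes embedded quotes with the '\'' idiom. That quoting
# step on its own (helper name and sample argument are illustrative):
def _bash_quote(arg):
  # Close the quote, emit an escaped quote, then reopen: it's -> 'it'\''s'
  return "'%s'" % arg.replace("'", "'\\''")

assert _bash_quote("it's") == "'it'\\''s'"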
If the variable is unset, or set to 1 we use cygwin.""" return int(rule.get('msvs_cygwin_shell', self.spec.get('msvs_cygwin_shell', 1))) != 0 def _HasExplicitRuleForExtension(self, spec, extension): """Determine if there's an explicit rule for a particular extension.""" for rule in spec.get('rules', []): if rule['extension'] == extension: return True return False def _HasExplicitIdlActions(self, spec): """Determine if an action should not run midl for .idl files.""" return any([action.get('explicit_idl_action', 0) for action in spec.get('actions', [])]) def HasExplicitIdlRulesOrActions(self, spec): """Determine if there's an explicit rule or action for idl files. When there isn't we need to generate implicit rules to build MIDL .idl files.""" return (self._HasExplicitRuleForExtension(spec, 'idl') or self._HasExplicitIdlActions(spec)) def HasExplicitAsmRules(self, spec): """Determine if there's an explicit rule for asm files. When there isn't we need to generate implicit rules to assemble .asm files.""" return self._HasExplicitRuleForExtension(spec, 'asm') def GetIdlBuildData(self, source, config): """Determine the implicit outputs for an idl file. Returns output directory, outputs, and variables and flags that are required.""" config = self._TargetConfig(config) midl_get = self._GetWrapper(self, self.msvs_settings[config], 'VCMIDLTool') def midl(name, default=None): return self.ConvertVSMacros(midl_get(name, default=default), config=config) tlb = midl('TypeLibraryName', default='${root}.tlb') header = midl('HeaderFileName', default='${root}.h') dlldata = midl('DLLDataFileName', default='dlldata.c') iid = midl('InterfaceIdentifierFileName', default='${root}_i.c') proxy = midl('ProxyFileName', default='${root}_p.c') # Note that .tlb is not included in the outputs as it is not always # generated depending on the content of the input idl file. outdir = midl('OutputDirectory', default='') output = [header, dlldata, iid, proxy] variables = [('tlb', tlb), ('h', header), ('dlldata', dlldata), ('iid', iid), ('proxy', proxy)] # TODO(scottmg): Are there configuration settings to set these flags? target_platform = 'win32' if self.GetArch(config) == 'x86' else 'x64' flags = ['/char', 'signed', '/env', target_platform, '/Oicf'] return outdir, output, variables, flags def _LanguageMatchesForPch(source_ext, pch_source_ext): c_exts = ('.c',) cc_exts = ('.cc', '.cxx', '.cpp') return ((source_ext in c_exts and pch_source_ext in c_exts) or (source_ext in cc_exts and pch_source_ext in cc_exts)) class PrecompiledHeader(object): """Helper to generate dependencies and build rules to handle generation of precompiled headers. Interface matches the GCH handler in xcode_emulation.py. """ def __init__( self, settings, config, gyp_to_build_path, gyp_to_unique_output, obj_ext): self.settings = settings self.config = config pch_source = self.settings.msvs_precompiled_source[self.config] self.pch_source = gyp_to_build_path(pch_source) filename, _ = os.path.splitext(pch_source) self.output_obj = gyp_to_unique_output(filename + obj_ext).lower() def _PchHeader(self): """Get the header that will appear in an #include line for all source files.""" return os.path.split(self.settings.msvs_precompiled_header[self.config])[1] def GetObjDependencies(self, sources, objs, arch): """Given a list of sources files and the corresponding object files, returns a list of the pch files that should be depended upon. 
The additional wrapping in the return value is for interface compatibility with make.py on Mac, and xcode_emulation.py.""" assert arch is None if not self._PchHeader(): return [] pch_ext = os.path.splitext(self.pch_source)[1] for source in sources: if _LanguageMatchesForPch(os.path.splitext(source)[1], pch_ext): return [(None, None, self.output_obj)] return [] def GetPchBuildCommands(self, arch): """Not used on Windows as there are no additional build steps required (instead, existing steps are modified in GetFlagsModifications below).""" return [] def GetFlagsModifications(self, input, output, implicit, command, cflags_c, cflags_cc, expand_special): """Get the modified cflags and implicit dependencies that should be used for the pch compilation step.""" if input == self.pch_source: pch_output = ['/Yc' + self._PchHeader()] if command == 'cxx': return ([('cflags_cc', map(expand_special, cflags_cc + pch_output))], self.output_obj, []) elif command == 'cc': return ([('cflags_c', map(expand_special, cflags_c + pch_output))], self.output_obj, []) return [], output, implicit vs_version = None def GetVSVersion(generator_flags): global vs_version if not vs_version: vs_version = gyp.MSVSVersion.SelectVisualStudioVersion( generator_flags.get('msvs_version', 'auto'), allow_fallback=False) return vs_version def _GetVsvarsSetupArgs(generator_flags, arch): vs = GetVSVersion(generator_flags) return vs.SetupScript() def ExpandMacros(string, expansions): """Expand $(Variable) per expansions dict. See MsvsSettings.GetVSMacroEnv for the canonical way to retrieve a suitable dict.""" if '$' in string: for old, new in expansions.iteritems(): assert '$(' not in new, new string = string.replace(old, new) return string def _ExtractImportantEnvironment(output_of_set): """Extracts environment variables required for the toolchain to run from a textual dump output by the cmd.exe 'set' command.""" envvars_to_save = ( 'goma_.*', # TODO(scottmg): This is ugly, but needed for goma. 'include', 'lib', 'libpath', 'path', 'pathext', 'systemroot', 'temp', 'tmp', ) env = {} for line in output_of_set.splitlines(): for envvar in envvars_to_save: if re.match(envvar + '=', line.lower()): var, setting = line.split('=', 1) if envvar == 'path': # Our own rules (for running gyp-win-tool) and other actions in # Chromium rely on python being in the path. Add the path to this # python here so that if it's not in the path when ninja is run # later, python will still be found. setting = os.path.dirname(sys.executable) + os.pathsep + setting env[var.upper()] = setting break for required in ('SYSTEMROOT', 'TEMP', 'TMP'): if required not in env: raise Exception('Environment variable "%s" ' 'required to be set to valid path' % required) return env def _FormatAsEnvironmentBlock(envvar_dict): """Format as an 'environment block' directly suitable for CreateProcess. Briefly this is a list of key=value\0, terminated by an additional \0. See CreateProcess documentation for more details.""" block = '' nul = '\0' for key, value in envvar_dict.iteritems(): block += key + '=' + value + nul block += nul return block def _ExtractCLPath(output_of_where): """Gets the path to cl.exe based on the output of calling the environment setup batch file, followed by the equivalent of `where`.""" # Take the first line, as that's the first found in the PATH. 
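# _FormatAsEnvironmentBlock above serializes the environment as
# 'KEY=value\0' pairs plus a trailing NUL, the exact layout win_tool.py's
# _GetEnv later reads back. A round-trip sketch of the format (the parser
# name is hypothetical; the wire format matches the code above):
def _parse_environment_block(block):
  # Drop the final two NULs (last pair's terminator + block terminator),
  # then split the remaining 'KEY=value' entries.
  return dict(item.split('=', 1) for item in block[:-2].split('\0'))

_blk = _FormatAsEnvironmentBlock({'TEMP': r'C:\tmp', 'TMP': r'C:\tmp'})
assert _parse_environment_block(_blk) == {'TEMP': r'C:\tmp', 'TMP': r'C:\tmp'}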
for line in output_of_where.strip().splitlines(): if line.startswith('LOC:'): return line[len('LOC:'):].strip() def GenerateEnvironmentFiles(toplevel_build_dir, generator_flags, system_includes, open_out): """It's not sufficient to have the absolute path to the compiler, linker, etc. on Windows, as those tools rely on .dlls being in the PATH. We also need to support both x86 and x64 compilers within the same build (to support msvs_target_platform hackery). Different architectures require a different compiler binary, and different supporting environment variables (INCLUDE, LIB, LIBPATH). So, we extract the environment here, wrap all invocations of compiler tools (cl, link, lib, rc, midl, etc.) via win_tool.py which sets up the environment, and then we do not prefix the compiler with an absolute path, instead preferring something like "cl.exe" in the rule which will then run whichever the environment setup has put in the path. When the following procedure to generate environment files does not meet your requirement (e.g. for custom toolchains), you can pass "-G ninja_use_custom_environment_files" to the gyp to suppress file generation and use custom environment files prepared by yourself.""" archs = ('x86', 'x64') if generator_flags.get('ninja_use_custom_environment_files', 0): cl_paths = {} for arch in archs: cl_paths[arch] = 'cl.exe' return cl_paths vs = GetVSVersion(generator_flags) cl_paths = {} for arch in archs: # Extract environment variables for subprocesses. args = vs.SetupScript(arch) args.extend(('&&', 'set')) popen = subprocess.Popen( args, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) variables, _ = popen.communicate() env = _ExtractImportantEnvironment(variables) # Inject system includes from gyp files into INCLUDE. if system_includes: system_includes = system_includes | OrderedSet( env.get('INCLUDE', '').split(';')) env['INCLUDE'] = ';'.join(system_includes) env_block = _FormatAsEnvironmentBlock(env) f = open_out(os.path.join(toplevel_build_dir, 'environment.' + arch), 'wb') f.write(env_block) f.close() # Find cl.exe location for this architecture. args = vs.SetupScript(arch) args.extend(('&&', 'for', '%i', 'in', '(cl.exe)', 'do', '@echo', 'LOC:%~$PATH:i')) popen = subprocess.Popen(args, shell=True, stdout=subprocess.PIPE) output, _ = popen.communicate() cl_paths[arch] = _ExtractCLPath(output) return cl_paths def VerifyMissingSources(sources, build_dir, generator_flags, gyp_to_ninja): """Emulate behavior of msvs_error_on_missing_sources present in the msvs generator: Check that all regular source files, i.e. not created at run time, exist on disk. Missing files cause needless recompilation when building via VS, and we want this check to match for people/bots that build using ninja, so they're not surprised when the VS build fails.""" if int(generator_flags.get('msvs_error_on_missing_sources', 0)): no_specials = filter(lambda x: '$' not in x, sources) relative = [os.path.join(build_dir, gyp_to_ninja(s)) for s in no_specials] missing = filter(lambda x: not os.path.exists(x), relative) if missing: # They'll look like out\Release\..\..\stuff\things.cc, so normalize the # path for a slightly less crazy looking output. cleaned_up = [os.path.normpath(x) for x in missing] raise Exception('Missing input files:\n%s' % '\n'.join(cleaned_up)) # Sets some values in default_variables, which are required for many # generators, run on Windows. 
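# GenerateEnvironmentFiles above obtains the toolchain environment by
# chaining the VS setup script with `set` and parsing the dump, one file
# per architecture. The capture step in miniature (the script path is a
# placeholder, and check_output needs Python 2.7+):
import subprocess

def _capture_env(setup_cmd):
  # e.g. setup_cmd = r'"C:\path\to\vcvarsall.bat" x86'  (placeholder path)
  output = subprocess.check_output(setup_cmd + ' && set', shell=True)
  env = {}
  for line in output.splitlines():
    if '=' in line:
      key, value = line.split('=', 1)
      env[key.upper()] = value
  return env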
def CalculateCommonVariables(default_variables, params): generator_flags = params.get('generator_flags', {}) # Set a variable so conditions can be based on msvs_version. msvs_version = gyp.msvs_emulation.GetVSVersion(generator_flags) default_variables['MSVS_VERSION'] = msvs_version.ShortName() # To determine processor word size on Windows, in addition to checking # PROCESSOR_ARCHITECTURE (which reflects the word size of the current # process), it is also necessary to check PROCESSOR_ARCHITEW6432 (which # contains the actual word size of the system when running thru WOW64). if ('64' in os.environ.get('PROCESSOR_ARCHITECTURE', '') or '64' in os.environ.get('PROCESSOR_ARCHITEW6432', '')): default_variables['MSVS_OS_BITS'] = 64 else: default_variables['MSVS_OS_BITS'] = 32 npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/ninja_syntax.py0000644000000000000000000001264012631326456023470 0ustar 00000000000000# This file comes from # https://github.com/martine/ninja/blob/master/misc/ninja_syntax.py # Do not edit! Edit the upstream one instead. """Python module for generating .ninja files. Note that this is emphatically not a required piece of Ninja; it's just a helpful utility for build-file-generation systems that already use Python. """ import textwrap import re def escape_path(word): return word.replace('$ ','$$ ').replace(' ','$ ').replace(':', '$:') class Writer(object): def __init__(self, output, width=78): self.output = output self.width = width def newline(self): self.output.write('\n') def comment(self, text): for line in textwrap.wrap(text, self.width - 2): self.output.write('# ' + line + '\n') def variable(self, key, value, indent=0): if value is None: return if isinstance(value, list): value = ' '.join(filter(None, value)) # Filter out empty strings. 
self._line('%s = %s' % (key, value), indent) def pool(self, name, depth): self._line('pool %s' % name) self.variable('depth', depth, indent=1) def rule(self, name, command, description=None, depfile=None, generator=False, pool=None, restat=False, rspfile=None, rspfile_content=None, deps=None): self._line('rule %s' % name) self.variable('command', command, indent=1) if description: self.variable('description', description, indent=1) if depfile: self.variable('depfile', depfile, indent=1) if generator: self.variable('generator', '1', indent=1) if pool: self.variable('pool', pool, indent=1) if restat: self.variable('restat', '1', indent=1) if rspfile: self.variable('rspfile', rspfile, indent=1) if rspfile_content: self.variable('rspfile_content', rspfile_content, indent=1) if deps: self.variable('deps', deps, indent=1) def build(self, outputs, rule, inputs=None, implicit=None, order_only=None, variables=None): outputs = self._as_list(outputs) all_inputs = self._as_list(inputs)[:] out_outputs = list(map(escape_path, outputs)) all_inputs = list(map(escape_path, all_inputs)) if implicit: implicit = map(escape_path, self._as_list(implicit)) all_inputs.append('|') all_inputs.extend(implicit) if order_only: order_only = map(escape_path, self._as_list(order_only)) all_inputs.append('||') all_inputs.extend(order_only) self._line('build %s: %s' % (' '.join(out_outputs), ' '.join([rule] + all_inputs))) if variables: if isinstance(variables, dict): iterator = iter(variables.items()) else: iterator = iter(variables) for key, val in iterator: self.variable(key, val, indent=1) return outputs def include(self, path): self._line('include %s' % path) def subninja(self, path): self._line('subninja %s' % path) def default(self, paths): self._line('default %s' % ' '.join(self._as_list(paths))) def _count_dollars_before_index(self, s, i): """Returns the number of '$' characters right in front of s[i].""" dollar_count = 0 dollar_index = i - 1 while dollar_index > 0 and s[dollar_index] == '$': dollar_count += 1 dollar_index -= 1 return dollar_count def _line(self, text, indent=0): """Write 'text' word-wrapped at self.width characters.""" leading_space = ' ' * indent while len(leading_space) + len(text) > self.width: # The text is too wide; wrap if possible. # Find the rightmost space that would obey our width constraint and # that's not an escaped space. available_space = self.width - len(leading_space) - len(' $') space = available_space while True: space = text.rfind(' ', 0, space) if space < 0 or \ self._count_dollars_before_index(text, space) % 2 == 0: break if space < 0: # No such space; just use the first unescaped space we can find. space = available_space - 1 while True: space = text.find(' ', space + 1) if space < 0 or \ self._count_dollars_before_index(text, space) % 2 == 0: break if space < 0: # Give up on breaking. break self.output.write(leading_space + text[0:space] + ' $\n') text = text[space+1:] # Subsequent lines are continuations, so indent them. leading_space = ' ' * (indent+2) self.output.write(leading_space + text + '\n') def _as_list(self, input): if input is None: return [] if isinstance(input, list): return input return [input] def escape(string): """Escape a string such that it can be embedded into a Ninja file without further interpretation.""" assert '\n' not in string, 'Ninja syntax does not allow newlines' # We only have one special metacharacter: '$'. 
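# A short usage sketch of the Writer class above, writing a toy .ninja
# file into an in-memory buffer (cStringIO is the Python 2 stdlib module
# this code targets; the rule and build edge below are sample data):
import cStringIO

_buf = cStringIO.StringIO()
_w = Writer(_buf)
_w.comment('Toy build file')
_w.rule('cc', command='cl /nologo /c $in /Fo$out', description='CC $out')
_w.build('foo.obj', 'cc', inputs='foo.c')
# _buf.getvalue() now holds the wrapped, '$'-escaped ninja text.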
return string.replace('$', '$$') npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/ordered_dict.py0000644000000000000000000002417612631326456023421 0ustar 00000000000000# Unmodified from http://code.activestate.com/recipes/576693/ # other than to add MIT license header (as specified on page, but not in code). # Linked from Python documentation here: # http://docs.python.org/2/library/collections.html#collections.OrderedDict # # This should be deleted once Py2.7 is available on all bots, see # http://crbug.com/241769. # # Copyright (c) 2009 Raymond Hettinger. # # Permission is hereby granted, free of charge, to any person obtaining a copy # of this software and associated documentation files (the "Software"), to deal # in the Software without restriction, including without limitation the rights # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell # copies of the Software, and to permit persons to whom the Software is # furnished to do so, subject to the following conditions: # # The above copyright notice and this permission notice shall be included in # all copies or substantial portions of the Software. # # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN # THE SOFTWARE. # Backport of OrderedDict() class that runs on Python 2.4, 2.5, 2.6, 2.7 and pypy. # Passes Python2.7's test suite and incorporates all the latest updates. try: from thread import get_ident as _get_ident except ImportError: from dummy_thread import get_ident as _get_ident try: from _abcoll import KeysView, ValuesView, ItemsView except ImportError: pass class OrderedDict(dict): 'Dictionary that remembers insertion order' # An inherited dict maps keys to values. # The inherited dict provides __getitem__, __len__, __contains__, and get. # The remaining methods are order-aware. # Big-O running times for all methods are the same as for regular dictionaries. # The internal self.__map dictionary maps keys to links in a doubly linked list. # The circular doubly linked list starts and ends with a sentinel element. # The sentinel element never gets deleted (this simplifies the algorithm). # Each link is stored as a list of length three: [PREV, NEXT, KEY]. def __init__(self, *args, **kwds): '''Initialize an ordered dictionary. Signature is the same as for regular dictionaries, but keyword arguments are not recommended because their insertion order is arbitrary. ''' if len(args) > 1: raise TypeError('expected at most 1 arguments, got %d' % len(args)) try: self.__root except AttributeError: self.__root = root = [] # sentinel node root[:] = [root, root, None] self.__map = {} self.__update(*args, **kwds) def __setitem__(self, key, value, dict_setitem=dict.__setitem__): 'od.__setitem__(i, y) <==> od[i]=y' # Setting a new item creates a new link which goes at the end of the linked # list, and the inherited dictionary is updated with the new key/value pair. 
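# The append described above, in isolation: the circular list's sentinel
# (`root`) makes end-insertion O(1) with no None checks. A toy run with
# the same [PREV, NEXT, KEY] link layout used by this class:
_root = []
_root[:] = [_root, _root, None]        # empty circular list: sentinel only
_links = {}

def _append_link(root, key, link_map):
  last = root[0]                       # current last link (or the sentinel)
  last[1] = root[0] = link_map[key] = [last, root, key]

_append_link(_root, 'a', _links)
_append_link(_root, 'b', _links)
assert _links['a'][1] is _links['b']   # 'a' now points forward to 'b'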
if key not in self: root = self.__root last = root[0] last[1] = root[0] = self.__map[key] = [last, root, key] dict_setitem(self, key, value) def __delitem__(self, key, dict_delitem=dict.__delitem__): 'od.__delitem__(y) <==> del od[y]' # Deleting an existing item uses self.__map to find the link which is # then removed by updating the links in the predecessor and successor nodes. dict_delitem(self, key) link_prev, link_next, key = self.__map.pop(key) link_prev[1] = link_next link_next[0] = link_prev def __iter__(self): 'od.__iter__() <==> iter(od)' root = self.__root curr = root[1] while curr is not root: yield curr[2] curr = curr[1] def __reversed__(self): 'od.__reversed__() <==> reversed(od)' root = self.__root curr = root[0] while curr is not root: yield curr[2] curr = curr[0] def clear(self): 'od.clear() -> None. Remove all items from od.' try: for node in self.__map.itervalues(): del node[:] root = self.__root root[:] = [root, root, None] self.__map.clear() except AttributeError: pass dict.clear(self) def popitem(self, last=True): '''od.popitem() -> (k, v), return and remove a (key, value) pair. Pairs are returned in LIFO order if last is true or FIFO order if false. ''' if not self: raise KeyError('dictionary is empty') root = self.__root if last: link = root[0] link_prev = link[0] link_prev[1] = root root[0] = link_prev else: link = root[1] link_next = link[1] root[1] = link_next link_next[0] = root key = link[2] del self.__map[key] value = dict.pop(self, key) return key, value # -- the following methods do not depend on the internal structure -- def keys(self): 'od.keys() -> list of keys in od' return list(self) def values(self): 'od.values() -> list of values in od' return [self[key] for key in self] def items(self): 'od.items() -> list of (key, value) pairs in od' return [(key, self[key]) for key in self] def iterkeys(self): 'od.iterkeys() -> an iterator over the keys in od' return iter(self) def itervalues(self): 'od.itervalues -> an iterator over the values in od' for k in self: yield self[k] def iteritems(self): 'od.iteritems -> an iterator over the (key, value) items in od' for k in self: yield (k, self[k]) # Suppress 'OrderedDict.update: Method has no argument': # pylint: disable=E0211 def update(*args, **kwds): '''od.update(E, **F) -> None. Update od from dict/iterable E and F. If E is a dict instance, does: for k in E: od[k] = E[k] If E has a .keys() method, does: for k in E.keys(): od[k] = E[k] Or if E is an iterable of items, does: for k, v in E: od[k] = v In either case, this is followed by: for k, v in F.items(): od[k] = v ''' if len(args) > 2: raise TypeError('update() takes at most 2 positional ' 'arguments (%d given)' % (len(args),)) elif not args: raise TypeError('update() takes at least 1 argument (0 given)') self = args[0] # Make progressively weaker assumptions about "other" other = () if len(args) == 2: other = args[1] if isinstance(other, dict): for key in other: self[key] = other[key] elif hasattr(other, 'keys'): for key in other.keys(): self[key] = other[key] else: for key, value in other: self[key] = value for key, value in kwds.items(): self[key] = value __update = update # let subclasses override update without breaking __init__ __marker = object() def pop(self, key, default=__marker): '''od.pop(k[,d]) -> v, remove specified key and return the corresponding value. If key is not found, d is returned if given, otherwise KeyError is raised. 
''' if key in self: result = self[key] del self[key] return result if default is self.__marker: raise KeyError(key) return default def setdefault(self, key, default=None): 'od.setdefault(k[,d]) -> od.get(k,d), also set od[k]=d if k not in od' if key in self: return self[key] self[key] = default return default def __repr__(self, _repr_running={}): 'od.__repr__() <==> repr(od)' call_key = id(self), _get_ident() if call_key in _repr_running: return '...' _repr_running[call_key] = 1 try: if not self: return '%s()' % (self.__class__.__name__,) return '%s(%r)' % (self.__class__.__name__, self.items()) finally: del _repr_running[call_key] def __reduce__(self): 'Return state information for pickling' items = [[k, self[k]] for k in self] inst_dict = vars(self).copy() for k in vars(OrderedDict()): inst_dict.pop(k, None) if inst_dict: return (self.__class__, (items,), inst_dict) return self.__class__, (items,) def copy(self): 'od.copy() -> a shallow copy of od' return self.__class__(self) @classmethod def fromkeys(cls, iterable, value=None): '''OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S and values equal to v (which defaults to None). ''' d = cls() for key in iterable: d[key] = value return d def __eq__(self, other): '''od.__eq__(y) <==> od==y. Comparison to another OD is order-sensitive while comparison to a regular mapping is order-insensitive. ''' if isinstance(other, OrderedDict): return len(self)==len(other) and self.items() == other.items() return dict.__eq__(self, other) def __ne__(self, other): return not self == other # -- the following methods are only used in Python 2.7 -- def viewkeys(self): "od.viewkeys() -> a set-like object providing a view on od's keys" return KeysView(self) def viewvalues(self): "od.viewvalues() -> an object providing a view on od's values" return ValuesView(self) def viewitems(self): "od.viewitems() -> a set-like object providing a view on od's items" return ItemsView(self) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/simple_copy.py0000644000000000000000000000233712631326456023310 0ustar 00000000000000# Copyright 2014 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """A clone of the default copy.deepcopy that doesn't handle cyclic structures or complex types except for dicts and lists. This is because gyp copies such large structures that even small per-copy overhead ends up taking seconds in a project the size of Chromium.""" class Error(Exception): pass __all__ = ["Error", "deepcopy"] def deepcopy(x): """Deep copy operation on gyp objects such as strings, ints, dicts and lists. More than twice as fast as copy.deepcopy but much less generic.""" try: return _deepcopy_dispatch[type(x)](x) except KeyError: raise Error('Unsupported type %s for deepcopy. Use copy.deepcopy ' 'or expand simple_copy support.' % type(x)) _deepcopy_dispatch = d = {} def _deepcopy_atomic(x): return x for x in (type(None), int, long, float, bool, str, unicode, type): d[x] = _deepcopy_atomic def _deepcopy_list(x): return [deepcopy(a) for a in x] d[list] = _deepcopy_list def _deepcopy_dict(x): y = {} for key, value in x.iteritems(): y[deepcopy(key)] = deepcopy(value) return y d[dict] = _deepcopy_dict del d npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/win_tool.py0000755000000000000000000003071712631326456022615 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Utility functions for Windows builds. These functions are executed via gyp-win-tool when using the ninja generator. """ import os import re import shutil import subprocess import stat import string import sys BASE_DIR = os.path.dirname(os.path.abspath(__file__)) # A regex matching an argument corresponding to the output filename passed to # link.exe. _LINK_EXE_OUT_ARG = re.compile('/OUT:(?P<out>.+)$', re.IGNORECASE) def main(args): executor = WinTool() exit_code = executor.Dispatch(args) if exit_code is not None: sys.exit(exit_code) class WinTool(object): """This class performs all the Windows tooling steps. The methods can either be executed directly, or dispatched from an argument list.""" def _UseSeparateMspdbsrv(self, env, args): """Allows using a unique instance of mspdbsrv.exe per linker instead of a shared one.""" if len(args) < 1: raise Exception("Not enough arguments") if args[0] != 'link.exe': return # Use the output filename passed to the linker to generate an endpoint name # for mspdbsrv.exe. endpoint_name = None for arg in args: m = _LINK_EXE_OUT_ARG.match(arg) if m: endpoint_name = re.sub(r'\W+', '', '%s_%d' % (m.group('out'), os.getpid())) break if endpoint_name is None: return # Adds the appropriate environment variable. This will be read by link.exe # to know which instance of mspdbsrv.exe it should connect to (if it's # not set then the default endpoint is used). env['_MSPDBSRV_ENDPOINT_'] = endpoint_name def Dispatch(self, args): """Dispatches a string command to a method.""" if len(args) < 1: raise Exception("Not enough arguments") method = "Exec%s" % self._CommandifyName(args[0]) return getattr(self, method)(*args[1:]) def _CommandifyName(self, name_string): """Transforms a tool name like recursive-mirror to RecursiveMirror.""" return name_string.title().replace('-', '') def _GetEnv(self, arch): """Gets the saved environment from a file for a given architecture.""" # The environment is saved as an "environment block" (see CreateProcess # and msvs_emulation for details). We convert to a dict here. # Drop last 2 NULs, one for list terminator, one for trailing vs. separator. pairs = open(arch).read()[:-2].split('\0') kvs = [item.split('=', 1) for item in pairs] return dict(kvs) def ExecStamp(self, path): """Simple stamp command.""" open(path, 'w').close() def ExecRecursiveMirror(self, source, dest): """Emulation of rm -rf out && cp -af in out.""" if os.path.exists(dest): if os.path.isdir(dest): def _on_error(fn, path, excinfo): # The operation failed, possibly because the file is set to # read-only. If that's why, make it writable and try the op again. if not os.access(path, os.W_OK): os.chmod(path, stat.S_IWRITE) fn(path) shutil.rmtree(dest, onerror=_on_error) else: if not os.access(dest, os.W_OK): # Attempt to make the file writable before deleting it. os.chmod(dest, stat.S_IWRITE) os.unlink(dest) if os.path.isdir(source): shutil.copytree(source, dest) else: shutil.copy2(source, dest) def ExecLinkWrapper(self, arch, use_separate_mspdbsrv, *args): """Filter diagnostic output from link that looks like: ' Creating library ui.dll.lib and object ui.dll.exp' This happens when there are exports from the dll or exe.
""" env = self._GetEnv(arch) if use_separate_mspdbsrv == 'True': self._UseSeparateMspdbsrv(env, args) link = subprocess.Popen([args[0].replace('/', '\\')] + list(args[1:]), shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = link.communicate() for line in out.splitlines(): if (not line.startswith(' Creating library ') and not line.startswith('Generating code') and not line.startswith('Finished generating code')): print line return link.returncode def ExecLinkWithManifests(self, arch, embed_manifest, out, ldcmd, resname, mt, rc, intermediate_manifest, *manifests): """A wrapper for handling creating a manifest resource and then executing a link command.""" # The 'normal' way to do manifests is to have link generate a manifest # based on gathering dependencies from the object files, then merge that # manifest with other manifests supplied as sources, convert the merged # manifest to a resource, and then *relink*, including the compiled # version of the manifest resource. This breaks incremental linking, and # is generally overly complicated. Instead, we merge all the manifests # provided (along with one that includes what would normally be in the # linker-generated one, see msvs_emulation.py), and include that into the # first and only link. We still tell link to generate a manifest, but we # only use that to assert that our simpler process did not miss anything. variables = { 'python': sys.executable, 'arch': arch, 'out': out, 'ldcmd': ldcmd, 'resname': resname, 'mt': mt, 'rc': rc, 'intermediate_manifest': intermediate_manifest, 'manifests': ' '.join(manifests), } add_to_ld = '' if manifests: subprocess.check_call( '%(python)s gyp-win-tool manifest-wrapper %(arch)s %(mt)s -nologo ' '-manifest %(manifests)s -out:%(out)s.manifest' % variables) if embed_manifest == 'True': subprocess.check_call( '%(python)s gyp-win-tool manifest-to-rc %(arch)s %(out)s.manifest' ' %(out)s.manifest.rc %(resname)s' % variables) subprocess.check_call( '%(python)s gyp-win-tool rc-wrapper %(arch)s %(rc)s ' '%(out)s.manifest.rc' % variables) add_to_ld = ' %(out)s.manifest.res' % variables subprocess.check_call(ldcmd + add_to_ld) # Run mt.exe on the theoretically complete manifest we generated, merging # it with the one the linker generated to confirm that the linker # generated one does not add anything. This is strictly unnecessary for # correctness, it's only to verify that e.g. /MANIFESTDEPENDENCY was not # used in a #pragma comment. if manifests: # Merge the intermediate one with ours to .assert.manifest, then check # that .assert.manifest is identical to ours. subprocess.check_call( '%(python)s gyp-win-tool manifest-wrapper %(arch)s %(mt)s -nologo ' '-manifest %(out)s.manifest %(intermediate_manifest)s ' '-out:%(out)s.assert.manifest' % variables) assert_manifest = '%(out)s.assert.manifest' % variables our_manifest = '%(out)s.manifest' % variables # Load and normalize the manifests. mt.exe sometimes removes whitespace, # and sometimes doesn't unfortunately. 
with open(our_manifest, 'rb') as our_f: with open(assert_manifest, 'rb') as assert_f: our_data = our_f.read().translate(None, string.whitespace) assert_data = assert_f.read().translate(None, string.whitespace) if our_data != assert_data: os.unlink(out) def dump(filename): sys.stderr.write('%s\n-----\n' % filename) with open(filename, 'rb') as f: sys.stderr.write(f.read() + '\n-----\n') dump(intermediate_manifest) dump(our_manifest) dump(assert_manifest) sys.stderr.write( 'Linker generated manifest "%s" added to final manifest "%s" ' '(result in "%s"). ' 'Were /MANIFEST switches used in #pragma statements? ' % ( intermediate_manifest, our_manifest, assert_manifest)) return 1 def ExecManifestWrapper(self, arch, *args): """Run manifest tool with environment set. Strip out undesirable warning (some XML blocks are recognized by the OS loader, but not the manifest tool).""" env = self._GetEnv(arch) popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() for line in out.splitlines(): if line and 'manifest authoring warning 81010002' not in line: print line return popen.returncode def ExecManifestToRc(self, arch, *args): """Creates a resource file pointing at a SxS assembly manifest. |args| is a tuple containing the path to the resource file, the path to the manifest file, and the resource name, which can be "1" (for executables) or "2" (for DLLs).""" manifest_path, resource_path, resource_name = args with open(resource_path, 'wb') as output: output.write('#include <windows.h>\n%s RT_MANIFEST "%s"' % ( resource_name, os.path.abspath(manifest_path).replace('\\', '/'))) def ExecMidlWrapper(self, arch, outdir, tlb, h, dlldata, iid, proxy, idl, *flags): """Filter noisy filename output from the MIDL compile step that isn't quietable via command-line flags. """ args = ['midl', '/nologo'] + list(flags) + [ '/out', outdir, '/tlb', tlb, '/h', h, '/dlldata', dlldata, '/iid', iid, '/proxy', proxy, idl] env = self._GetEnv(arch) popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() # Filter junk out of stdout, and write filtered versions. Output we want # to filter is pairs of lines that look like this: # Processing C:\Program Files (x86)\Microsoft SDKs\...\include\objidl.idl # objidl.idl lines = out.splitlines() prefixes = ('Processing ', '64 bit Processing ') processing = set(os.path.basename(x) for x in lines if x.startswith(prefixes)) for line in lines: if not line.startswith(prefixes) and line not in processing: print line return popen.returncode def ExecAsmWrapper(self, arch, *args): """Filter logo banner from invocations of asm.exe.""" env = self._GetEnv(arch) popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() for line in out.splitlines(): if (not line.startswith('Copyright (C) Microsoft Corporation') and not line.startswith('Microsoft (R) Macro Assembler') and not line.startswith(' Assembling: ') and line): print line return popen.returncode def ExecRcWrapper(self, arch, *args): """Filter logo banner from invocations of rc.exe.
Older versions of RC don't support the /nologo flag.""" env = self._GetEnv(arch) popen = subprocess.Popen(args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT) out, _ = popen.communicate() for line in out.splitlines(): if (not line.startswith('Microsoft (R) Windows (R) Resource Compiler') and not line.startswith('Copyright (C) Microsoft Corporation') and line): print line return popen.returncode def ExecActionWrapper(self, arch, rspfile, *dir): """Runs an action command line from a response file using the environment for |arch|. If |dir| is supplied, use that as the working directory.""" env = self._GetEnv(arch) # TODO(scottmg): This is a temporary hack to get some specific variables # through to actions that are set after gyp-time. http://crbug.com/333738. for k, v in os.environ.iteritems(): if k not in env: env[k] = v args = open(rspfile).read() dir = dir[0] if dir else None return subprocess.call(args, shell=True, env=env, cwd=dir) def ExecClCompile(self, project_dir, selected_files): """Executed by msvs-ninja projects when the 'ClCompile' target is used to build selected C/C++ files.""" project_dir = os.path.relpath(project_dir, BASE_DIR) selected_files = selected_files.split(';') ninja_targets = [os.path.join(project_dir, filename) + '^^' for filename in selected_files] cmd = ['ninja.exe'] cmd.extend(ninja_targets) return subprocess.call(cmd, shell=True, cwd=BASE_DIR) if __name__ == '__main__': sys.exit(main(sys.argv[1:])) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/xcode_emulation.py0000644000000000000000000017707612631326456024161 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ This module contains classes that help to emulate xcodebuild behavior on top of other build systems, such as make and ninja. """ import copy import gyp.common import os import os.path import re import shlex import subprocess import sys import tempfile from gyp.common import GypError # Populated lazily by XcodeVersion, for efficiency, and to fix an issue when # "xcodebuild" is called too quickly (it has been found to return incorrect # version number). XCODE_VERSION_CACHE = None # Populated lazily by GetXcodeArchsDefault, to an |XcodeArchsDefault| instance # corresponding to the installed version of Xcode. XCODE_ARCHS_DEFAULT_CACHE = None def XcodeArchsVariableMapping(archs, archs_including_64_bit=None): """Constructs a dictionary with expansion for $(ARCHS_STANDARD) variable, and optionally for $(ARCHS_STANDARD_INCLUDING_64_BIT).""" mapping = {'$(ARCHS_STANDARD)': archs} if archs_including_64_bit: mapping['$(ARCHS_STANDARD_INCLUDING_64_BIT)'] = archs_including_64_bit return mapping class XcodeArchsDefault(object): """A class to resolve ARCHS variable from xcode_settings, resolving Xcode macros and implementing filtering by VALID_ARCHS. The expansion of macros depends on the SDKROOT used ("macosx", "iphoneos", "iphonesimulator") and on the version of Xcode. """ # Match variable like $(ARCHS_STANDARD). 
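# The pattern compiled just below accepts exactly one $(NAME)-style macro
# spanning the whole value. For example, with the same expression and a
# couple of sample strings:
import re

_p = re.compile(r'\$\([a-zA-Z_][a-zA-Z0-9_]*\)$')
assert _p.match('$(ARCHS_STANDARD)')
assert not _p.match('i386')
assert not _p.match('$(ARCHS_STANDARD) extra')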
variable_pattern = re.compile(r'\$\([a-zA-Z_][a-zA-Z0-9_]*\)$') def __init__(self, default, mac, iphonesimulator, iphoneos): self._default = (default,) self._archs = {'mac': mac, 'ios': iphoneos, 'iossim': iphonesimulator} def _VariableMapping(self, sdkroot): """Returns the dictionary of variable mapping depending on the SDKROOT.""" sdkroot = sdkroot.lower() if 'iphoneos' in sdkroot: return self._archs['ios'] elif 'iphonesimulator' in sdkroot: return self._archs['iossim'] else: return self._archs['mac'] def _ExpandArchs(self, archs, sdkroot): """Expands variable references in ARCHS, and removes duplicates.""" variable_mapping = self._VariableMapping(sdkroot) expanded_archs = [] for arch in archs: if self.variable_pattern.match(arch): variable = arch try: variable_expansion = variable_mapping[variable] for arch in variable_expansion: if arch not in expanded_archs: expanded_archs.append(arch) except KeyError as e: print 'Warning: Ignoring unsupported variable "%s".' % variable elif arch not in expanded_archs: expanded_archs.append(arch) return expanded_archs def ActiveArchs(self, archs, valid_archs, sdkroot): """Expands variable references in ARCHS, and filters by VALID_ARCHS if it is defined (if VALID_ARCHS is not set, Xcode accepts any value in ARCHS; otherwise only values present in VALID_ARCHS are kept).""" expanded_archs = self._ExpandArchs(archs or self._default, sdkroot or '') if valid_archs: filtered_archs = [] for arch in expanded_archs: if arch in valid_archs: filtered_archs.append(arch) expanded_archs = filtered_archs return expanded_archs def GetXcodeArchsDefault(): """Returns the |XcodeArchsDefault| object to use to expand ARCHS for the installed version of Xcode. The default values used by Xcode for ARCHS, and the expansion of the variables, depend on the version of Xcode used. All versions prior to Xcode 5.0, as well as Xcode 5.1 and later, use $(ARCHS_STANDARD) if ARCHS is unset, while Xcode 5.0 through 5.0.2 use $(ARCHS_STANDARD_INCLUDING_64_BIT). That variable was added in Xcode 5.0 and deprecated in Xcode 5.1. For the "macosx" SDKROOT, every version from Xcode 5.0 onward includes the 64-bit architecture in $(ARCHS_STANDARD) and defaults to building only it. For the "iphoneos" and "iphonesimulator" SDKROOTs, 64-bit architectures are part of $(ARCHS_STANDARD_INCLUDING_64_BIT) from Xcode 5.0, and from Xcode 5.1 they are also part of $(ARCHS_STANDARD). All those rules are encoded in the construction of the |XcodeArchsDefault| object used for the detected version of Xcode.
The object is cached for performance reasons.""" global XCODE_ARCHS_DEFAULT_CACHE if XCODE_ARCHS_DEFAULT_CACHE: return XCODE_ARCHS_DEFAULT_CACHE xcode_version, _ = XcodeVersion() if xcode_version < '0500': XCODE_ARCHS_DEFAULT_CACHE = XcodeArchsDefault( '$(ARCHS_STANDARD)', XcodeArchsVariableMapping(['i386']), XcodeArchsVariableMapping(['i386']), XcodeArchsVariableMapping(['armv7'])) elif xcode_version < '0510': XCODE_ARCHS_DEFAULT_CACHE = XcodeArchsDefault( '$(ARCHS_STANDARD_INCLUDING_64_BIT)', XcodeArchsVariableMapping(['x86_64'], ['x86_64']), XcodeArchsVariableMapping(['i386'], ['i386', 'x86_64']), XcodeArchsVariableMapping( ['armv7', 'armv7s'], ['armv7', 'armv7s', 'arm64'])) else: XCODE_ARCHS_DEFAULT_CACHE = XcodeArchsDefault( '$(ARCHS_STANDARD)', XcodeArchsVariableMapping(['x86_64'], ['x86_64']), XcodeArchsVariableMapping(['i386', 'x86_64'], ['i386', 'x86_64']), XcodeArchsVariableMapping( ['armv7', 'armv7s', 'arm64'], ['armv7', 'armv7s', 'arm64'])) return XCODE_ARCHS_DEFAULT_CACHE class XcodeSettings(object): """A class that understands the gyp 'xcode_settings' object.""" # Populated lazily by _SdkPath(). Shared by all XcodeSettings, so cached # at class-level for efficiency. _sdk_path_cache = {} _sdk_root_cache = {} # Populated lazily by GetExtraPlistItems(). Shared by all XcodeSettings, so # cached at class-level for efficiency. _plist_cache = {} # Populated lazily by GetIOSPostbuilds. Shared by all XcodeSettings, so # cached at class-level for efficiency. _codesigning_key_cache = {} def __init__(self, spec): self.spec = spec self.isIOS = False # Per-target 'xcode_settings' are pushed down into configs earlier by gyp. # This means self.xcode_settings[config] always contains all settings # for that config -- the per-target settings as well. Settings that are # the same for all configs are implicitly per-target settings. self.xcode_settings = {} configs = spec['configurations'] for configname, config in configs.iteritems(): self.xcode_settings[configname] = config.get('xcode_settings', {}) self._ConvertConditionalKeys(configname) if self.xcode_settings[configname].get('IPHONEOS_DEPLOYMENT_TARGET', None): self.isIOS = True # This is only non-None temporarily during the execution of some methods. self.configname = None # Used by _AdjustLibrary to match .a and .dylib entries in libraries. self.library_re = re.compile(r'^lib([^/]+)\.(a|dylib)$') def _ConvertConditionalKeys(self, configname): """Converts or warns on conditional keys. Xcode supports conditional keys, such as CODE_SIGN_IDENTITY[sdk=iphoneos*].
This is a partial implementation with some keys converted while the rest force a warning.""" settings = self.xcode_settings[configname] conditional_keys = [key for key in settings if key.endswith(']')] for key in conditional_keys: # If you need more, speak up at http://crbug.com/122592 if key.endswith("[sdk=iphoneos*]"): if configname.endswith("iphoneos"): new_key = key.split("[")[0] settings[new_key] = settings[key] else: print 'Warning: Conditional keys not implemented, ignoring:', \ ' '.join(conditional_keys) del settings[key] def _Settings(self): assert self.configname return self.xcode_settings[self.configname] def _Test(self, test_key, cond_key, default): return self._Settings().get(test_key, default) == cond_key def _Appendf(self, lst, test_key, format_str, default=None): if test_key in self._Settings(): lst.append(format_str % str(self._Settings()[test_key])) elif default: lst.append(format_str % str(default)) def _WarnUnimplemented(self, test_key): if test_key in self._Settings(): print 'Warning: Ignoring not yet implemented key "%s".' % test_key def IsBinaryOutputFormat(self, configname): default = "binary" if self.isIOS else "xml" format = self.xcode_settings[configname].get('INFOPLIST_OUTPUT_FORMAT', default) return format == "binary" def _IsBundle(self): return int(self.spec.get('mac_bundle', 0)) != 0 def _IsIosAppExtension(self): return int(self.spec.get('ios_app_extension', 0)) != 0 def _IsIosWatchKitExtension(self): return int(self.spec.get('ios_watchkit_extension', 0)) != 0 def _IsIosWatchApp(self): return int(self.spec.get('ios_watch_app', 0)) != 0 def GetFrameworkVersion(self): """Returns the framework version of the current target. Only valid for bundles.""" assert self._IsBundle() return self.GetPerTargetSetting('FRAMEWORK_VERSION', default='A') def GetWrapperExtension(self): """Returns the bundle extension (.app, .framework, .plugin, etc). Only valid for bundles.""" assert self._IsBundle() if self.spec['type'] in ('loadable_module', 'shared_library'): default_wrapper_extension = { 'loadable_module': 'bundle', 'shared_library': 'framework', }[self.spec['type']] wrapper_extension = self.GetPerTargetSetting( 'WRAPPER_EXTENSION', default=default_wrapper_extension) return '.' + self.spec.get('product_extension', wrapper_extension) elif self.spec['type'] == 'executable': if self._IsIosAppExtension() or self._IsIosWatchKitExtension(): return '.' + self.spec.get('product_extension', 'appex') else: return '.' + self.spec.get('product_extension', 'app') else: assert False, "Don't know extension for '%s', target '%s'" % ( self.spec['type'], self.spec['target_name']) def GetProductName(self): """Returns PRODUCT_NAME.""" return self.spec.get('product_name', self.spec['target_name']) def GetFullProductName(self): """Returns FULL_PRODUCT_NAME.""" if self._IsBundle(): return self.GetWrapperName() else: return self._GetStandaloneBinaryPath() def GetWrapperName(self): """Returns the directory name of the bundle represented by this target. Only valid for bundles.""" assert self._IsBundle() return self.GetProductName() + self.GetWrapperExtension() def GetBundleContentsFolderPath(self): """Returns the qualified path to the bundle's contents folder. E.g. Chromium.app/Contents or Foo.bundle/Versions/A. Only valid for bundles.""" if self.isIOS: return self.GetWrapperName() assert self._IsBundle() if self.spec['type'] == 'shared_library': return os.path.join( self.GetWrapperName(), 'Versions', self.GetFrameworkVersion()) else: # loadable_modules have a 'Contents' folder like executables. 
  def GetBundleContentsFolderPath(self):
    """Returns the qualified path to the bundle's contents folder. E.g.
    Chromium.app/Contents or Foo.bundle/Versions/A. Only valid for bundles."""
    if self.isIOS:
      return self.GetWrapperName()
    assert self._IsBundle()
    if self.spec['type'] == 'shared_library':
      return os.path.join(
          self.GetWrapperName(), 'Versions', self.GetFrameworkVersion())
    else:
      # loadable_modules have a 'Contents' folder like executables.
      return os.path.join(self.GetWrapperName(), 'Contents')

  def GetBundleResourceFolder(self):
    """Returns the qualified path to the bundle's resource folder. E.g.
    Chromium.app/Contents/Resources. Only valid for bundles."""
    assert self._IsBundle()
    if self.isIOS:
      return self.GetBundleContentsFolderPath()
    return os.path.join(self.GetBundleContentsFolderPath(), 'Resources')

  def GetBundlePlistPath(self):
    """Returns the qualified path to the bundle's plist file. E.g.
    Chromium.app/Contents/Info.plist. Only valid for bundles."""
    assert self._IsBundle()
    if self.spec['type'] in ('executable', 'loadable_module'):
      return os.path.join(self.GetBundleContentsFolderPath(), 'Info.plist')
    else:
      return os.path.join(self.GetBundleContentsFolderPath(),
                          'Resources', 'Info.plist')

  def GetProductType(self):
    """Returns the PRODUCT_TYPE of this target."""
    if self._IsIosAppExtension():
      assert self._IsBundle(), ('ios_app_extension flag requires mac_bundle '
          '(target %s)' % self.spec['target_name'])
      return 'com.apple.product-type.app-extension'
    if self._IsIosWatchKitExtension():
      assert self._IsBundle(), ('ios_watchkit_extension flag requires '
          'mac_bundle (target %s)' % self.spec['target_name'])
      return 'com.apple.product-type.watchkit-extension'
    if self._IsIosWatchApp():
      assert self._IsBundle(), ('ios_watch_app flag requires mac_bundle '
          '(target %s)' % self.spec['target_name'])
      return 'com.apple.product-type.application.watchapp'
    if self._IsBundle():
      return {
        'executable': 'com.apple.product-type.application',
        'loadable_module': 'com.apple.product-type.bundle',
        'shared_library': 'com.apple.product-type.framework',
      }[self.spec['type']]
    else:
      return {
        'executable': 'com.apple.product-type.tool',
        'loadable_module': 'com.apple.product-type.library.dynamic',
        'shared_library': 'com.apple.product-type.library.dynamic',
        'static_library': 'com.apple.product-type.library.static',
      }[self.spec['type']]

  def GetMachOType(self):
    """Returns the MACH_O_TYPE of this target."""
    # Weird, but matches Xcode.
    if not self._IsBundle() and self.spec['type'] == 'executable':
      return ''
    return {
      'executable': 'mh_execute',
      'static_library': 'staticlib',
      'shared_library': 'mh_dylib',
      'loadable_module': 'mh_bundle',
    }[self.spec['type']]

  def _GetBundleBinaryPath(self):
    """Returns the name of the bundle binary of this target.
    E.g. Chromium.app/Contents/MacOS/Chromium. Only valid for bundles."""
    assert self._IsBundle()
    if self.spec['type'] in ('shared_library') or self.isIOS:
      path = self.GetBundleContentsFolderPath()
    elif self.spec['type'] in ('executable', 'loadable_module'):
      path = os.path.join(self.GetBundleContentsFolderPath(), 'MacOS')
    return os.path.join(path, self.GetExecutableName())

  def _GetStandaloneExecutableSuffix(self):
    if 'product_extension' in self.spec:
      return '.' + self.spec['product_extension']
    return {
      'executable': '',
      'static_library': '.a',
      'shared_library': '.dylib',
      'loadable_module': '.so',
    }[self.spec['type']]

  def _GetStandaloneExecutablePrefix(self):
    return self.spec.get('product_prefix', {
      'executable': '',
      'static_library': 'lib',
      'shared_library': 'lib',
      # Non-bundled loadable_modules are called foo.so for some reason
      # (that is, .so and no prefix) with the xcode build -- match that.
      'loadable_module': '',
    }[self.spec['type']])

  def _GetStandaloneBinaryPath(self):
    """Returns the name of the non-bundle binary represented by this target.
    E.g. hello_world. Only valid for non-bundles."""
    assert not self._IsBundle()
    assert self.spec['type'] in (
        'executable', 'shared_library', 'static_library', 'loadable_module'), (
        'Unexpected type %s' % self.spec['type'])
    target = self.spec['target_name']
    if self.spec['type'] == 'static_library':
      if target[:3] == 'lib':
        target = target[3:]
    elif self.spec['type'] in ('loadable_module', 'shared_library'):
      if target[:3] == 'lib':
        target = target[3:]

    target_prefix = self._GetStandaloneExecutablePrefix()
    target = self.spec.get('product_name', target)
    target_ext = self._GetStandaloneExecutableSuffix()
    return target_prefix + target + target_ext

  def GetExecutableName(self):
    """Returns the executable name of the bundle represented by this target.
    E.g. Chromium."""
    if self._IsBundle():
      return self.spec.get('product_name', self.spec['target_name'])
    else:
      return self._GetStandaloneBinaryPath()

  def GetExecutablePath(self):
    """Returns the qualified path to the primary executable of the bundle
    represented by this target. E.g. Chromium.app/Contents/MacOS/Chromium."""
    if self._IsBundle():
      return self._GetBundleBinaryPath()
    else:
      return self._GetStandaloneBinaryPath()

  def GetActiveArchs(self, configname):
    """Returns the architectures this target should be built for."""
    config_settings = self.xcode_settings[configname]
    xcode_archs_default = GetXcodeArchsDefault()
    return xcode_archs_default.ActiveArchs(
        config_settings.get('ARCHS'),
        config_settings.get('VALID_ARCHS'),
        config_settings.get('SDKROOT'))

  def _GetSdkVersionInfoItem(self, sdk, infoitem):
    # xcodebuild requires Xcode and can't run on Command Line Tools-only
    # systems from 10.7 onward.
    # Since the CLT has no SDK paths anyway, returning None is the
    # most sensible route and should still do the right thing.
    try:
      return GetStdout(['xcodebuild', '-version', '-sdk', sdk, infoitem])
    except:
      pass

  def _SdkRoot(self, configname):
    if configname is None:
      configname = self.configname
    return self.GetPerConfigSetting('SDKROOT', configname, default='')

  def _SdkPath(self, configname=None):
    sdk_root = self._SdkRoot(configname)
    if sdk_root.startswith('/'):
      return sdk_root
    return self._XcodeSdkPath(sdk_root)

  def _XcodeSdkPath(self, sdk_root):
    if sdk_root not in XcodeSettings._sdk_path_cache:
      sdk_path = self._GetSdkVersionInfoItem(sdk_root, 'Path')
      XcodeSettings._sdk_path_cache[sdk_root] = sdk_path
      if sdk_root:
        XcodeSettings._sdk_root_cache[sdk_path] = sdk_root
    return XcodeSettings._sdk_path_cache[sdk_root]

  def _AppendPlatformVersionMinFlags(self, lst):
    self._Appendf(lst, 'MACOSX_DEPLOYMENT_TARGET', '-mmacosx-version-min=%s')
    if 'IPHONEOS_DEPLOYMENT_TARGET' in self._Settings():
      # TODO: Implement this better?
      sdk_path_basename = os.path.basename(self._SdkPath())
      if sdk_path_basename.lower().startswith('iphonesimulator'):
        self._Appendf(lst, 'IPHONEOS_DEPLOYMENT_TARGET',
                      '-mios-simulator-version-min=%s')
      else:
        self._Appendf(lst, 'IPHONEOS_DEPLOYMENT_TARGET',
                      '-miphoneos-version-min=%s')
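  # Illustrative sketch (not part of gyp): the paths the bundle helpers above
  # would produce for a hypothetical mac framework target named 'Foo'.
  #
  #   GetBundleContentsFolderPath()  # -> 'Foo.framework/Versions/A'
  #   GetBundleResourceFolder()      # -> 'Foo.framework/Versions/A/Resources'
  #   GetBundlePlistPath()
  #       # -> 'Foo.framework/Versions/A/Resources/Info.plist'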
  def GetCflags(self, configname, arch=None):
    """Returns flags that need to be added to .c, .cc, .m, and .mm
    compilations."""
    # These functions (and the similar ones below) do not offer complete
    # emulation of all xcode_settings keys. They're implemented on demand.

    self.configname = configname
    cflags = []

    sdk_root = self._SdkPath()
    if 'SDKROOT' in self._Settings() and sdk_root:
      cflags.append('-isysroot %s' % sdk_root)

    if self._Test('CLANG_WARN_CONSTANT_CONVERSION', 'YES', default='NO'):
      cflags.append('-Wconstant-conversion')

    if self._Test('GCC_CHAR_IS_UNSIGNED_CHAR', 'YES', default='NO'):
      cflags.append('-funsigned-char')

    if self._Test('GCC_CW_ASM_SYNTAX', 'YES', default='YES'):
      cflags.append('-fasm-blocks')

    if 'GCC_DYNAMIC_NO_PIC' in self._Settings():
      if self._Settings()['GCC_DYNAMIC_NO_PIC'] == 'YES':
        cflags.append('-mdynamic-no-pic')
    else:
      pass
      # TODO: In this case, it depends on the target. xcode passes
      # mdynamic-no-pic by default for executable and possibly static lib
      # according to mento

    if self._Test('GCC_ENABLE_PASCAL_STRINGS', 'YES', default='YES'):
      cflags.append('-mpascal-strings')

    self._Appendf(cflags, 'GCC_OPTIMIZATION_LEVEL', '-O%s', default='s')

    if self._Test('GCC_GENERATE_DEBUGGING_SYMBOLS', 'YES', default='YES'):
      dbg_format = self._Settings().get('DEBUG_INFORMATION_FORMAT', 'dwarf')
      if dbg_format == 'dwarf':
        cflags.append('-gdwarf-2')
      elif dbg_format == 'stabs':
        raise NotImplementedError('stabs debug format is not supported yet.')
      elif dbg_format == 'dwarf-with-dsym':
        cflags.append('-gdwarf-2')
      else:
        raise NotImplementedError('Unknown debug format %s' % dbg_format)

    if self._Settings().get('GCC_STRICT_ALIASING') == 'YES':
      cflags.append('-fstrict-aliasing')
    elif self._Settings().get('GCC_STRICT_ALIASING') == 'NO':
      cflags.append('-fno-strict-aliasing')

    if self._Test('GCC_SYMBOLS_PRIVATE_EXTERN', 'YES', default='NO'):
      cflags.append('-fvisibility=hidden')

    if self._Test('GCC_TREAT_WARNINGS_AS_ERRORS', 'YES', default='NO'):
      cflags.append('-Werror')

    if self._Test('GCC_WARN_ABOUT_MISSING_NEWLINE', 'YES', default='NO'):
      cflags.append('-Wnewline-eof')

    # In Xcode, this is only activated when GCC_COMPILER_VERSION is clang or
    # llvm-gcc. It also requires a fairly recent libtool, and
    # if the system clang isn't used, DYLD_LIBRARY_PATH needs to contain the
    # path to the libLTO.dylib that matches the used clang.
    if self._Test('LLVM_LTO', 'YES', default='NO'):
      cflags.append('-flto')

    self._AppendPlatformVersionMinFlags(cflags)

    # TODO:
    if self._Test('COPY_PHASE_STRIP', 'YES', default='NO'):
      self._WarnUnimplemented('COPY_PHASE_STRIP')
    self._WarnUnimplemented('GCC_DEBUGGING_SYMBOLS')
    self._WarnUnimplemented('GCC_ENABLE_OBJC_EXCEPTIONS')

    # TODO: This is exported correctly, but assigning to it is not supported.
    self._WarnUnimplemented('MACH_O_TYPE')
    self._WarnUnimplemented('PRODUCT_TYPE')

    if arch is not None:
      archs = [arch]
    else:
      assert self.configname
      archs = self.GetActiveArchs(self.configname)
    if len(archs) != 1:
      # TODO: Supporting fat binaries will be annoying.
      self._WarnUnimplemented('ARCHS')
      archs = ['i386']
    cflags.append('-arch ' + archs[0])

    if archs[0] in ('i386', 'x86_64'):
      if self._Test('GCC_ENABLE_SSE3_EXTENSIONS', 'YES', default='NO'):
        cflags.append('-msse3')
      if self._Test('GCC_ENABLE_SUPPLEMENTAL_SSE3_INSTRUCTIONS', 'YES',
                    default='NO'):
        cflags.append('-mssse3')  # Note 3rd 's'.
      if self._Test('GCC_ENABLE_SSE41_EXTENSIONS', 'YES', default='NO'):
        cflags.append('-msse4.1')
      if self._Test('GCC_ENABLE_SSE42_EXTENSIONS', 'YES', default='NO'):
        cflags.append('-msse4.2')

    cflags += self._Settings().get('WARNING_CFLAGS', [])

    if sdk_root:
      framework_root = sdk_root
    else:
      framework_root = ''
    config = self.spec['configurations'][self.configname]
    framework_dirs = config.get('mac_framework_dirs', [])
    for directory in framework_dirs:
      cflags.append('-F' + directory.replace('$(SDKROOT)', framework_root))

    self.configname = None
    return cflags

  def GetCflagsC(self, configname):
    """Returns flags that need to be added to .c, and .m compilations."""
    self.configname = configname
    cflags_c = []
    if self._Settings().get('GCC_C_LANGUAGE_STANDARD', '') == 'ansi':
      cflags_c.append('-ansi')
    else:
      self._Appendf(cflags_c, 'GCC_C_LANGUAGE_STANDARD', '-std=%s')
    cflags_c += self._Settings().get('OTHER_CFLAGS', [])
    self.configname = None
    return cflags_c
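  # Illustrative sketch (not part of gyp): how _Test and _Appendf translate
  # xcode_settings into flags; hypothetical settings, hand-evaluated.
  #
  #   settings = {'GCC_OPTIMIZATION_LEVEL': '0',
  #               'GCC_TREAT_WARNINGS_AS_ERRORS': 'YES'}
  #   _Test('GCC_TREAT_WARNINGS_AS_ERRORS', 'YES', default='NO')
  #       # -> True, so GetCflags() appends '-Werror'.
  #   _Appendf(cflags, 'GCC_OPTIMIZATION_LEVEL', '-O%s', default='s')
  #       # appends '-O0'; the explicit setting wins over the 's' default.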
  def GetCflagsCC(self, configname):
    """Returns flags that need to be added to .cc, and .mm compilations."""
    self.configname = configname
    cflags_cc = []

    clang_cxx_language_standard = self._Settings().get(
        'CLANG_CXX_LANGUAGE_STANDARD')
    # Note: Don't map c++0x to c++11 so that c++0x can be used with older
    # clangs that don't understand c++11 yet (like Xcode 4.2's).
    if clang_cxx_language_standard:
      cflags_cc.append('-std=%s' % clang_cxx_language_standard)

    self._Appendf(cflags_cc, 'CLANG_CXX_LIBRARY', '-stdlib=%s')

    if self._Test('GCC_ENABLE_CPP_RTTI', 'NO', default='YES'):
      cflags_cc.append('-fno-rtti')
    if self._Test('GCC_ENABLE_CPP_EXCEPTIONS', 'NO', default='YES'):
      cflags_cc.append('-fno-exceptions')
    if self._Test('GCC_INLINES_ARE_PRIVATE_EXTERN', 'YES', default='NO'):
      cflags_cc.append('-fvisibility-inlines-hidden')
    if self._Test('GCC_THREADSAFE_STATICS', 'NO', default='YES'):
      cflags_cc.append('-fno-threadsafe-statics')
    # Note: This flag is a no-op for clang, it only has an effect for gcc.
    if self._Test('GCC_WARN_ABOUT_INVALID_OFFSETOF_MACRO', 'NO',
                  default='YES'):
      cflags_cc.append('-Wno-invalid-offsetof')

    other_ccflags = []

    for flag in self._Settings().get('OTHER_CPLUSPLUSFLAGS', ['$(inherited)']):
      # TODO: More general variable expansion. Missing in many other places
      # too.
      if flag in ('$inherited', '$(inherited)', '${inherited}'):
        flag = '$OTHER_CFLAGS'
      if flag in ('$OTHER_CFLAGS', '$(OTHER_CFLAGS)', '${OTHER_CFLAGS}'):
        other_ccflags += self._Settings().get('OTHER_CFLAGS', [])
      else:
        other_ccflags.append(flag)
    cflags_cc += other_ccflags

    self.configname = None
    return cflags_cc

  def _AddObjectiveCGarbageCollectionFlags(self, flags):
    gc_policy = self._Settings().get('GCC_ENABLE_OBJC_GC', 'unsupported')
    if gc_policy == 'supported':
      flags.append('-fobjc-gc')
    elif gc_policy == 'required':
      flags.append('-fobjc-gc-only')

  def _AddObjectiveCARCFlags(self, flags):
    if self._Test('CLANG_ENABLE_OBJC_ARC', 'YES', default='NO'):
      flags.append('-fobjc-arc')

  def _AddObjectiveCMissingPropertySynthesisFlags(self, flags):
    if self._Test('CLANG_WARN_OBJC_MISSING_PROPERTY_SYNTHESIS',
                  'YES', default='NO'):
      flags.append('-Wobjc-missing-property-synthesis')

  def GetCflagsObjC(self, configname):
    """Returns flags that need to be added to .m compilations."""
    self.configname = configname
    cflags_objc = []
    self._AddObjectiveCGarbageCollectionFlags(cflags_objc)
    self._AddObjectiveCARCFlags(cflags_objc)
    self._AddObjectiveCMissingPropertySynthesisFlags(cflags_objc)
    self.configname = None
    return cflags_objc

  def GetCflagsObjCC(self, configname):
    """Returns flags that need to be added to .mm compilations."""
    self.configname = configname
    cflags_objcc = []
    self._AddObjectiveCGarbageCollectionFlags(cflags_objcc)
    self._AddObjectiveCARCFlags(cflags_objcc)
    self._AddObjectiveCMissingPropertySynthesisFlags(cflags_objcc)
    if self._Test('GCC_OBJC_CALL_CXX_CDTORS', 'YES', default='NO'):
      cflags_objcc.append('-fobjc-call-cxx-cdtors')
    self.configname = None
    return cflags_objcc

  def GetInstallNameBase(self):
    """Return DYLIB_INSTALL_NAME_BASE for this target."""
    # Xcode sets this for shared_libraries, and for nonbundled
    # loadable_modules.
    if (self.spec['type'] != 'shared_library' and
        (self.spec['type'] != 'loadable_module' or self._IsBundle())):
      return None
    install_base = self.GetPerTargetSetting(
        'DYLIB_INSTALL_NAME_BASE',
        default='/Library/Frameworks' if self._IsBundle()
                else '/usr/local/lib')
    return install_base

  def _StandardizePath(self, path):
    """Do :standardizepath processing for path."""
    # I'm not quite sure what :standardizepath does. Just call normpath(),
    # but don't let @executable_path/../foo collapse to foo.
    if '/' in path:
      prefix, rest = '', path
      if path.startswith('@'):
        prefix, rest = path.split('/', 1)
      rest = os.path.normpath(rest)  # :standardizepath
      path = os.path.join(prefix, rest)
    return path

  def GetInstallName(self):
    """Return LD_DYLIB_INSTALL_NAME for this target."""
    # Xcode sets this for shared_libraries, and for nonbundled
    # loadable_modules.
    if (self.spec['type'] != 'shared_library' and
        (self.spec['type'] != 'loadable_module' or self._IsBundle())):
      return None

    default_install_name = \
        '$(DYLIB_INSTALL_NAME_BASE:standardizepath)/$(EXECUTABLE_PATH)'
    install_name = self.GetPerTargetSetting(
        'LD_DYLIB_INSTALL_NAME', default=default_install_name)

    # Hardcode support for the variables used in chromium for now, to
    # unblock people using the make build.
    if '$' in install_name:
      assert install_name in ('$(DYLIB_INSTALL_NAME_BASE:standardizepath)/'
          '$(WRAPPER_NAME)/$(PRODUCT_NAME)', default_install_name), (
          'Variables in LD_DYLIB_INSTALL_NAME are not generally supported '
          'yet in target \'%s\' (got \'%s\')' %
              (self.spec['target_name'], install_name))

      install_name = install_name.replace(
          '$(DYLIB_INSTALL_NAME_BASE:standardizepath)',
          self._StandardizePath(self.GetInstallNameBase()))
      if self._IsBundle():
        # These are only valid for bundles, hence the |if|.
        install_name = install_name.replace(
            '$(WRAPPER_NAME)', self.GetWrapperName())
        install_name = install_name.replace(
            '$(PRODUCT_NAME)', self.GetProductName())
      else:
        assert '$(WRAPPER_NAME)' not in install_name
        assert '$(PRODUCT_NAME)' not in install_name

      install_name = install_name.replace(
          '$(EXECUTABLE_PATH)', self.GetExecutablePath())
    return install_name

  def _MapLinkerFlagFilename(self, ldflag, gyp_to_build_path):
    """Checks if ldflag contains a filename and if so remaps it from
    gyp-directory-relative to build-directory-relative."""
    # This list is expanded on demand.
    # They get matched as:
    #   -exported_symbols_list file
    #   -Wl,exported_symbols_list file
    #   -Wl,exported_symbols_list,file
    LINKER_FILE = r'(\S+)'
    WORD = r'\S+'
    linker_flags = [
      ['-exported_symbols_list', LINKER_FILE],    # Needed for NaCl.
      ['-unexported_symbols_list', LINKER_FILE],
      ['-reexported_symbols_list', LINKER_FILE],
      ['-sectcreate', WORD, WORD, LINKER_FILE],   # Needed for remoting.
    ]
    for flag_pattern in linker_flags:
      regex = re.compile('(?:-Wl,)?' + '[ ,]'.join(flag_pattern))
      m = regex.match(ldflag)
      if m:
        ldflag = ldflag[:m.start(1)] + gyp_to_build_path(m.group(1)) + \
                 ldflag[m.end(1):]
    # Required for ffmpeg (no idea why they don't use LIBRARY_SEARCH_PATHS,
    # TODO(thakis): Update ffmpeg.gyp):
    if ldflag.startswith('-L'):
      ldflag = '-L' + gyp_to_build_path(ldflag[len('-L'):])
    return ldflag
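  # Illustrative sketch (not part of gyp): what _MapLinkerFlagFilename does
  # to a flag, assuming a hypothetical gyp_to_build_path that prepends
  # '../../' to its argument.
  #
  #   _MapLinkerFlagFilename('-Wl,-exported_symbols_list,foo.syms', f)
  #       # -> '-Wl,-exported_symbols_list,../../foo.syms'
  #   _MapLinkerFlagFilename('-Lthird_party/libs', f)
  #       # -> '-L../../third_party/libs'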
  def GetLdflags(self, configname, product_dir, gyp_to_build_path, arch=None):
    """Returns flags that need to be passed to the linker.

    Args:
        configname: The name of the configuration to get ld flags for.
        product_dir: The directory where products such as static and dynamic
            libraries are placed. This is added to the library search path.
        gyp_to_build_path: A function that converts paths relative to the
            current gyp file to paths relative to the build directory.
    """
    self.configname = configname
    ldflags = []

    # The xcode build is relative to a gyp file's directory, and OTHER_LDFLAGS
    # can contain entries that depend on this. Explicitly absolutify these.
    for ldflag in self._Settings().get('OTHER_LDFLAGS', []):
      ldflags.append(self._MapLinkerFlagFilename(ldflag, gyp_to_build_path))

    if self._Test('DEAD_CODE_STRIPPING', 'YES', default='NO'):
      ldflags.append('-Wl,-dead_strip')

    if self._Test('PREBINDING', 'YES', default='NO'):
      ldflags.append('-Wl,-prebind')

    self._Appendf(
        ldflags, 'DYLIB_COMPATIBILITY_VERSION', '-compatibility_version %s')
    self._Appendf(
        ldflags, 'DYLIB_CURRENT_VERSION', '-current_version %s')

    self._AppendPlatformVersionMinFlags(ldflags)

    if 'SDKROOT' in self._Settings() and self._SdkPath():
      ldflags.append('-isysroot ' + self._SdkPath())

    for library_path in self._Settings().get('LIBRARY_SEARCH_PATHS', []):
      ldflags.append('-L' + gyp_to_build_path(library_path))

    if 'ORDER_FILE' in self._Settings():
      ldflags.append('-Wl,-order_file ' +
                     '-Wl,' + gyp_to_build_path(
                                  self._Settings()['ORDER_FILE']))

    if arch is not None:
      archs = [arch]
    else:
      assert self.configname
      archs = self.GetActiveArchs(self.configname)
    if len(archs) != 1:
      # TODO: Supporting fat binaries will be annoying.
      self._WarnUnimplemented('ARCHS')
      archs = ['i386']
    ldflags.append('-arch ' + archs[0])

    # Xcode adds the product directory by default.
    ldflags.append('-L' + product_dir)

    install_name = self.GetInstallName()
    if install_name and self.spec['type'] != 'loadable_module':
      ldflags.append('-install_name ' + install_name.replace(' ', r'\ '))

    for rpath in self._Settings().get('LD_RUNPATH_SEARCH_PATHS', []):
      ldflags.append('-Wl,-rpath,' + rpath)

    sdk_root = self._SdkPath()
    if not sdk_root:
      sdk_root = ''
    config = self.spec['configurations'][self.configname]
    framework_dirs = config.get('mac_framework_dirs', [])
    for directory in framework_dirs:
      ldflags.append('-F' + directory.replace('$(SDKROOT)', sdk_root))

    is_extension = self._IsIosAppExtension() or self._IsIosWatchKitExtension()
    if sdk_root and is_extension:
      # Adds the link flags for extensions. These flags are common for all
      # extensions and provide loader and main function.
      # These flags reflect the compilation options used by xcode to compile
      # extensions.
      ldflags.append('-lpkstart')
      if XcodeVersion() < '0900':
        ldflags.append(sdk_root +
            '/System/Library/PrivateFrameworks/PlugInKit.framework/PlugInKit')
      ldflags.append('-fapplication-extension')
      ldflags.append('-Xlinker -rpath '
          '-Xlinker @executable_path/../../Frameworks')

    self._Appendf(ldflags, 'CLANG_CXX_LIBRARY', '-stdlib=%s')

    self.configname = None
    return ldflags

  def GetLibtoolflags(self, configname):
    """Returns flags that need to be passed to the static linker.

    Args:
        configname: The name of the configuration to get ld flags for.
    """
    self.configname = configname
    libtoolflags = []

    for libtoolflag in self._Settings().get('OTHER_LDFLAGS', []):
      libtoolflags.append(libtoolflag)
    # TODO(thakis): ARCHS?

    self.configname = None
    return libtoolflags

  def GetPerTargetSettings(self):
    """Gets a list of all the per-target settings. This will only fetch keys
    whose values are the same across all configurations."""
    first_pass = True
    result = {}
    for configname in sorted(self.xcode_settings.keys()):
      if first_pass:
        result = dict(self.xcode_settings[configname])
        first_pass = False
      else:
        for key, value in self.xcode_settings[configname].iteritems():
          if key not in result:
            continue
          elif result[key] != value:
            del result[key]
    return result

  def GetPerConfigSetting(self, setting, configname, default=None):
    if configname in self.xcode_settings:
      return self.xcode_settings[configname].get(setting, default)
    else:
      return self.GetPerTargetSetting(setting, default)

  def GetPerTargetSetting(self, setting, default=None):
    """Tries to get xcode_settings.setting from spec. Assumes that the setting
       has the same value in all configurations and throws otherwise."""
    is_first_pass = True
    result = None
    for configname in sorted(self.xcode_settings.keys()):
      if is_first_pass:
        result = self.xcode_settings[configname].get(setting, None)
        is_first_pass = False
      else:
        assert result == self.xcode_settings[configname].get(setting, None), (
            "Expected per-target setting for '%s', got per-config setting "
            "(target %s)" % (setting, self.spec['target_name']))
    if result is None:
      return default
    return result
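  # Illustrative sketch (not part of gyp): GetPerTargetSettings() keeps only
  # keys that agree across all configurations. Hypothetical input:
  #
  #   xcode_settings = {
  #     'Debug':   {'SDKROOT': 'macosx', 'GCC_OPTIMIZATION_LEVEL': '0'},
  #     'Release': {'SDKROOT': 'macosx', 'GCC_OPTIMIZATION_LEVEL': '2'},
  #   }
  #   GetPerTargetSettings()  # -> {'SDKROOT': 'macosx'}
  #       # The optimization level differs per config, so it is dropped.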
  def _GetStripPostbuilds(self, configname, output_binary, quiet):
    """Returns a list of shell commands necessary to strip this target's
    binary. These should be run as postbuilds before the actual postbuilds
    run."""
    self.configname = configname

    result = []
    if (self._Test('DEPLOYMENT_POSTPROCESSING', 'YES', default='NO') and
        self._Test('STRIP_INSTALLED_PRODUCT', 'YES', default='NO')):

      default_strip_style = 'debugging'
      if self.spec['type'] == 'loadable_module' and self._IsBundle():
        default_strip_style = 'non-global'
      elif self.spec['type'] == 'executable':
        default_strip_style = 'all'

      strip_style = self._Settings().get('STRIP_STYLE', default_strip_style)
      strip_flags = {
        'all': '',
        'non-global': '-x',
        'debugging': '-S',
      }[strip_style]

      explicit_strip_flags = self._Settings().get('STRIPFLAGS', '')
      if explicit_strip_flags:
        strip_flags += ' ' + _NormalizeEnvVarReferences(explicit_strip_flags)

      if not quiet:
        result.append('echo STRIP\\(%s\\)' % self.spec['target_name'])
      result.append('strip %s %s' % (strip_flags, output_binary))

    self.configname = None
    return result

  def _GetDebugInfoPostbuilds(self, configname, output, output_binary, quiet):
    """Returns a list of shell commands necessary to massage this target's
    debug information. These should be run as postbuilds before the actual
    postbuilds run."""
    self.configname = configname

    # For static libraries, no dSYMs are created.
    result = []
    if (self._Test('GCC_GENERATE_DEBUGGING_SYMBOLS', 'YES', default='YES') and
        self._Test(
            'DEBUG_INFORMATION_FORMAT', 'dwarf-with-dsym', default='dwarf') and
        self.spec['type'] != 'static_library'):
      if not quiet:
        result.append('echo DSYMUTIL\\(%s\\)' % self.spec['target_name'])
      result.append('dsymutil %s -o %s' % (output_binary, output + '.dSYM'))

    self.configname = None
    return result

  def _GetTargetPostbuilds(self, configname, output, output_binary,
                           quiet=False):
    """Returns a list of shell commands to run as postbuilds for this target,
    before the actual postbuilds."""
    # dSYMs need to build before stripping happens.
    return (
        self._GetDebugInfoPostbuilds(configname, output, output_binary,
                                     quiet) +
        self._GetStripPostbuilds(configname, output_binary, quiet))

  def _GetIOSPostbuilds(self, configname, output_binary):
    """Return a shell command to codesign the iOS output binary so it can
    be deployed to a device. This should be run as the very last step of the
    build."""
    if not (self.isIOS and self.spec['type'] == 'executable'):
      return []

    settings = self.xcode_settings[configname]
    key = self._GetIOSCodeSignIdentityKey(settings)
    if not key:
      return []

    # Warn for any unimplemented signing xcode keys.
    unimpl = ['OTHER_CODE_SIGN_FLAGS']
    unimpl = set(unimpl) & set(self.xcode_settings[configname].keys())
    if unimpl:
      print 'Warning: Some codesign keys not implemented, ignoring: %s' % (
          ', '.join(sorted(unimpl)))

    return ['%s code-sign-bundle "%s" "%s" "%s" "%s"' % (
        os.path.join('${TARGET_BUILD_DIR}', 'gyp-mac-tool'), key,
        settings.get('CODE_SIGN_RESOURCE_RULES_PATH', ''),
        settings.get('CODE_SIGN_ENTITLEMENTS', ''),
        settings.get('PROVISIONING_PROFILE', ''))
    ]

  def _GetIOSCodeSignIdentityKey(self, settings):
    identity = settings.get('CODE_SIGN_IDENTITY')
    if not identity:
      return None
    if identity not in XcodeSettings._codesigning_key_cache:
      output = subprocess.check_output(
          ['security', 'find-identity', '-p', 'codesigning', '-v'])
      for line in output.splitlines():
        if identity in line:
          fingerprint = line.split()[1]
          cache = XcodeSettings._codesigning_key_cache
          assert identity not in cache or fingerprint == cache[identity], (
              "Multiple codesigning fingerprints for identity: %s" % identity)
          XcodeSettings._codesigning_key_cache[identity] = fingerprint
    return XcodeSettings._codesigning_key_cache.get(identity, '')

  def AddImplicitPostbuilds(self, configname, output, output_binary,
                            postbuilds=[], quiet=False):
    """Returns a list of shell commands that should run before and after
    |postbuilds|."""
    assert output_binary is not None
    pre = self._GetTargetPostbuilds(configname, output, output_binary, quiet)
    post = self._GetIOSPostbuilds(configname, output_binary)
    return pre + postbuilds + post

  def _AdjustLibrary(self, library, config_name=None):
    if library.endswith('.framework'):
      l = '-framework ' + os.path.splitext(os.path.basename(library))[0]
    else:
      m = self.library_re.match(library)
      if m:
        l = '-l' + m.group(1)
      else:
        l = library

    sdk_root = self._SdkPath(config_name)
    if not sdk_root:
      sdk_root = ''
    # Xcode 7 started shipping with ".tbd" (text based stubs) files instead of
    # ".dylib" without providing real support for them. What it does, for
    # "/usr/lib" libraries, is do "-L/usr/lib -lname" which is dependent on
    # the library order and causes collisions when building Chrome.
    #
    # Instead, substitute ".tbd" for ".dylib" in the generated project when
    # the following conditions are both true:
    # - library is referenced in the gyp file as "$(SDKROOT)/**/*.dylib",
    # - the ".dylib" file does not exist but a ".tbd" file does.
    library = l.replace('$(SDKROOT)', sdk_root)
    if l.startswith('$(SDKROOT)'):
      basename, ext = os.path.splitext(library)
      if ext == '.dylib' and not os.path.exists(library):
        tbd_library = basename + '.tbd'
        if os.path.exists(tbd_library):
          library = tbd_library
    return library

  def AdjustLibraries(self, libraries, config_name=None):
    """Transforms entries like 'Cocoa.framework' in libraries into entries
    like '-framework Cocoa', 'libcrypto.dylib' into '-lcrypto', etc.
    """
    libraries = [self._AdjustLibrary(library, config_name)
                 for library in libraries]
    return libraries

  def _BuildMachineOSBuild(self):
    return GetStdout(['sw_vers', '-buildVersion'])

  def _XcodeIOSDeviceFamily(self, configname):
    family = self.xcode_settings[configname].get('TARGETED_DEVICE_FAMILY', '1')
    return [int(x) for x in family.split(',')]

  def GetExtraPlistItems(self, configname=None):
    """Returns a dictionary with extra items to insert into Info.plist."""
    if configname not in XcodeSettings._plist_cache:
      cache = {}
      cache['BuildMachineOSBuild'] = self._BuildMachineOSBuild()

      xcode, xcode_build = XcodeVersion()
      cache['DTXcode'] = xcode
      cache['DTXcodeBuild'] = xcode_build

      sdk_root = self._SdkRoot(configname)
      if not sdk_root:
        sdk_root = self._DefaultSdkRoot()
      cache['DTSDKName'] = sdk_root
      if xcode >= '0430':
        cache['DTSDKBuild'] = self._GetSdkVersionInfoItem(
            sdk_root, 'ProductBuildVersion')
      else:
        cache['DTSDKBuild'] = cache['BuildMachineOSBuild']

      if self.isIOS:
        cache['DTPlatformName'] = cache['DTSDKName']
        if configname.endswith("iphoneos"):
          cache['DTPlatformVersion'] = self._GetSdkVersionInfoItem(
              sdk_root, 'ProductVersion')
          cache['CFBundleSupportedPlatforms'] = ['iPhoneOS']
        else:
          cache['CFBundleSupportedPlatforms'] = ['iPhoneSimulator']
      XcodeSettings._plist_cache[configname] = cache

    # Include extra plist items that are per-target, not per global
    # XcodeSettings.
    items = dict(XcodeSettings._plist_cache[configname])
    if self.isIOS:
      items['UIDeviceFamily'] = self._XcodeIOSDeviceFamily(configname)
    return items

  def _DefaultSdkRoot(self):
    """Returns the default SDKROOT to use.

    Prior to version 5.0.0, if SDKROOT was not explicitly set in the Xcode
    project, then the environment variable was empty. Starting with this
    version, Xcode uses the name of the newest SDK installed.
    """
    xcode_version, xcode_build = XcodeVersion()
    if xcode_version < '0500':
      return ''
    default_sdk_path = self._XcodeSdkPath('')
    default_sdk_root = XcodeSettings._sdk_root_cache.get(default_sdk_path)
    if default_sdk_root:
      return default_sdk_root
    try:
      all_sdks = GetStdout(['xcodebuild', '-showsdks'])
    except:
      # If xcodebuild fails, there will be no valid SDKs
      return ''
    for line in all_sdks.splitlines():
      items = line.split()
      if len(items) >= 3 and items[-2] == '-sdk':
        sdk_root = items[-1]
        sdk_path = self._XcodeSdkPath(sdk_root)
        if sdk_path == default_sdk_path:
          return sdk_root
    return ''
""" def __init__(self, xcode_settings, gyp_path_to_build_path, gyp_path_to_build_output): """If xcode_settings is None, all methods on this class are no-ops. Args: gyp_path_to_build_path: A function that takes a gyp-relative path, and returns a path relative to the build directory. gyp_path_to_build_output: A function that takes a gyp-relative path and a language code ('c', 'cc', 'm', or 'mm'), and that returns a path to where the output of precompiling that path for that language should be placed (without the trailing '.gch'). """ # This doesn't support per-configuration prefix headers. Good enough # for now. self.header = None self.compile_headers = False if xcode_settings: self.header = xcode_settings.GetPerTargetSetting('GCC_PREFIX_HEADER') self.compile_headers = xcode_settings.GetPerTargetSetting( 'GCC_PRECOMPILE_PREFIX_HEADER', default='NO') != 'NO' self.compiled_headers = {} if self.header: if self.compile_headers: for lang in ['c', 'cc', 'm', 'mm']: self.compiled_headers[lang] = gyp_path_to_build_output( self.header, lang) self.header = gyp_path_to_build_path(self.header) def _CompiledHeader(self, lang, arch): assert self.compile_headers h = self.compiled_headers[lang] if arch: h += '.' + arch return h def GetInclude(self, lang, arch=None): """Gets the cflags to include the prefix header for language |lang|.""" if self.compile_headers and lang in self.compiled_headers: return '-include %s' % self._CompiledHeader(lang, arch) elif self.header: return '-include %s' % self.header else: return '' def _Gch(self, lang, arch): """Returns the actual file name of the prefix header for language |lang|.""" assert self.compile_headers return self._CompiledHeader(lang, arch) + '.gch' def GetObjDependencies(self, sources, objs, arch=None): """Given a list of source files and the corresponding object files, returns a list of (source, object, gch) tuples, where |gch| is the build-directory relative path to the gch file each object file depends on. |compilable[i]| has to be the source file belonging to |objs[i]|.""" if not self.header or not self.compile_headers: return [] result = [] for source, obj in zip(sources, objs): ext = os.path.splitext(source)[1] lang = { '.c': 'c', '.cpp': 'cc', '.cc': 'cc', '.cxx': 'cc', '.m': 'm', '.mm': 'mm', }.get(ext, None) if lang: result.append((source, obj, self._Gch(lang, arch))) return result def GetPchBuildCommands(self, arch=None): """Returns [(path_to_gch, language_flag, language, header)]. |path_to_gch| and |header| are relative to the build directory. """ if not self.header or not self.compile_headers: return [] return [ (self._Gch('c', arch), '-x c-header', 'c', self.header), (self._Gch('cc', arch), '-x c++-header', 'cc', self.header), (self._Gch('m', arch), '-x objective-c-header', 'm', self.header), (self._Gch('mm', arch), '-x objective-c++-header', 'mm', self.header), ] def XcodeVersion(): """Returns a tuple of version and build version of installed Xcode.""" # `xcodebuild -version` output looks like # Xcode 4.6.3 # Build version 4H1503 # or like # Xcode 3.2.6 # Component versions: DevToolsCore-1809.0; DevToolsSupport-1806.0 # BuildVersion: 10M2518 # Convert that to '0463', '4H1503'. 
global XCODE_VERSION_CACHE if XCODE_VERSION_CACHE: return XCODE_VERSION_CACHE try: version_list = GetStdout(['xcodebuild', '-version']).splitlines() # In some circumstances xcodebuild exits 0 but doesn't return # the right results; for example, a user on 10.7 or 10.8 with # a bogus path set via xcode-select # In that case this may be a CLT-only install so fall back to # checking that version. if len(version_list) < 2: raise GypError("xcodebuild returned unexpected results") except: version = CLTVersion() if version: version = re.match(r'(\d\.\d\.?\d*)', version).groups()[0] else: raise GypError("No Xcode or CLT version detected!") # The CLT has no build information, so we return an empty string. version_list = [version, ''] version = version_list[0] build = version_list[-1] # Be careful to convert "4.2" to "0420": version = version.split()[-1].replace('.', '') version = (version + '0' * (3 - len(version))).zfill(4) if build: build = build.split()[-1] XCODE_VERSION_CACHE = (version, build) return XCODE_VERSION_CACHE # This function ported from the logic in Homebrew's CLT version check def CLTVersion(): """Returns the version of command-line tools from pkgutil.""" # pkgutil output looks like # package-id: com.apple.pkg.CLTools_Executables # version: 5.0.1.0.1.1382131676 # volume: / # location: / # install-time: 1382544035 # groups: com.apple.FindSystemFiles.pkg-group com.apple.DevToolsBoth.pkg-group com.apple.DevToolsNonRelocatableShared.pkg-group STANDALONE_PKG_ID = "com.apple.pkg.DeveloperToolsCLILeo" FROM_XCODE_PKG_ID = "com.apple.pkg.DeveloperToolsCLI" MAVERICKS_PKG_ID = "com.apple.pkg.CLTools_Executables" regex = re.compile('version: (?P.+)') for key in [MAVERICKS_PKG_ID, STANDALONE_PKG_ID, FROM_XCODE_PKG_ID]: try: output = GetStdout(['/usr/sbin/pkgutil', '--pkg-info', key]) return re.search(regex, output).groupdict()['version'] except: continue def GetStdout(cmdlist): """Returns the content of standard output returned by invoking |cmdlist|. Raises |GypError| if the command return with a non-zero return code.""" job = subprocess.Popen(cmdlist, stdout=subprocess.PIPE) out = job.communicate()[0] if job.returncode != 0: sys.stderr.write(out + '\n') raise GypError('Error %d running %s' % (job.returncode, cmdlist[0])) return out.rstrip('\n') def MergeGlobalXcodeSettingsToSpec(global_dict, spec): """Merges the global xcode_settings dictionary into each configuration of the target represented by spec. For keys that are both in the global and the local xcode_settings dict, the local key gets precendence. """ # The xcode generator special-cases global xcode_settings and does something # that amounts to merging in the global xcode_settings into each local # xcode_settings dict. global_xcode_settings = global_dict.get('xcode_settings', {}) for config in spec['configurations'].values(): if 'xcode_settings' in config: new_settings = global_xcode_settings.copy() new_settings.update(config['xcode_settings']) config['xcode_settings'] = new_settings def IsMacBundle(flavor, spec): """Returns if |spec| should be treated as a bundle. Bundles are directories with a certain subdirectory structure, instead of just a single file. 
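# A minimal standalone sketch (not part of gyp) of the version normalization
# done in XcodeVersion() above: "4.2" becomes "0420" and "4.6.3" becomes
# "0463", so zero-padded version strings compare correctly with plain string
# comparison (e.g. against '0500').
def _NormalizeXcodeVersion(version):
  """E.g. '4.6.3' -> '0463'."""
  digits = version.replace('.', '')
  # Pad to three digits on the right, then to four on the left.
  digits = digits + '0' * (3 - len(digits))
  return digits.zfill(4)

assert _NormalizeXcodeVersion('4.2') == '0420'
assert _NormalizeXcodeVersion('4.6.3') == '0463'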
def MergeGlobalXcodeSettingsToSpec(global_dict, spec):
  """Merges the global xcode_settings dictionary into each configuration of
  the target represented by spec. For keys that are both in the global and
  the local xcode_settings dict, the local key gets precedence.
  """
  # The xcode generator special-cases global xcode_settings and does something
  # that amounts to merging in the global xcode_settings into each local
  # xcode_settings dict.
  global_xcode_settings = global_dict.get('xcode_settings', {})
  for config in spec['configurations'].values():
    if 'xcode_settings' in config:
      new_settings = global_xcode_settings.copy()
      new_settings.update(config['xcode_settings'])
      config['xcode_settings'] = new_settings


def IsMacBundle(flavor, spec):
  """Returns whether |spec| should be treated as a bundle.

  Bundles are directories with a certain subdirectory structure, instead of
  just a single file. Bundle rules do not produce a binary but also package
  resources into that directory."""
  is_mac_bundle = (int(spec.get('mac_bundle', 0)) != 0 and flavor == 'mac')
  if is_mac_bundle:
    assert spec['type'] != 'none', (
        'mac_bundle targets cannot have type none (target "%s")' %
        spec['target_name'])
  return is_mac_bundle


def GetMacBundleResources(product_dir, xcode_settings, resources):
  """Yields (output, resource) pairs for every resource in |resources|.
  Only call this for mac bundle targets.

  Args:
      product_dir: Path to the directory containing the output bundle,
          relative to the build directory.
      xcode_settings: The XcodeSettings of the current target.
      resources: A list of bundle resources, relative to the build directory.
  """
  dest = os.path.join(product_dir,
                      xcode_settings.GetBundleResourceFolder())
  for res in resources:
    output = dest

    # The make generator doesn't support it, so forbid it everywhere
    # to keep the generators more interchangeable.
    assert ' ' not in res, (
      "Spaces in resource filenames not supported (%s)" % res)

    # Split into (path,file).
    res_parts = os.path.split(res)

    # Now split the path into (prefix,maybe.lproj).
    lproj_parts = os.path.split(res_parts[0])
    # If the resource lives in a .lproj bundle, add that to the destination.
    if lproj_parts[1].endswith('.lproj'):
      output = os.path.join(output, lproj_parts[1])

    output = os.path.join(output, res_parts[1])
    # Compiled XIB files are referred to by .nib.
    if output.endswith('.xib'):
      output = os.path.splitext(output)[0] + '.nib'
    # Compiled storyboard files are referred to by .storyboardc.
    if output.endswith('.storyboard'):
      output = os.path.splitext(output)[0] + '.storyboardc'

    yield output, res


def GetMacInfoPlist(product_dir, xcode_settings, gyp_path_to_build_path):
  """Returns (info_plist, dest_plist, defines, extra_env), where:
  * |info_plist| is the source plist path, relative to the
    build directory,
  * |dest_plist| is the destination plist path, relative to the
    build directory,
  * |defines| is a list of preprocessor defines (empty if the plist
    shouldn't be preprocessed),
  * |extra_env| is a dict of env variables that should be exported when
    invoking |mac_tool copy-info-plist|.

  Only call this for mac bundle targets.

  Args:
      product_dir: Path to the directory containing the output bundle,
          relative to the build directory.
      xcode_settings: The XcodeSettings of the current target.
      gyp_path_to_build_path: A function that converts paths relative to the
          current gyp file to paths relative to the build directory.
  """
  info_plist = xcode_settings.GetPerTargetSetting('INFOPLIST_FILE')
  if not info_plist:
    return None, None, [], {}

  # The make generator doesn't support it, so forbid it everywhere
  # to keep the generators more interchangeable.
  assert ' ' not in info_plist, (
    "Spaces in Info.plist filenames not supported (%s)" % info_plist)

  info_plist = gyp_path_to_build_path(info_plist)

  # If explicitly set to preprocess the plist, invoke the C preprocessor and
  # specify any defines as -D flags.
  if xcode_settings.GetPerTargetSetting(
      'INFOPLIST_PREPROCESS', default='NO') == 'YES':
    # Create an intermediate file based on the path.
    defines = shlex.split(xcode_settings.GetPerTargetSetting(
        'INFOPLIST_PREPROCESSOR_DEFINITIONS', default=''))
  else:
    defines = []

  dest_plist = os.path.join(product_dir, xcode_settings.GetBundlePlistPath())
  extra_env = xcode_settings.GetPerTargetSettings()

  return info_plist, dest_plist, defines, extra_env
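# A minimal standalone sketch (not part of gyp) of the resource-placement
# rules in GetMacBundleResources() above: keep the trailing .lproj folder and
# rename compiled .xib outputs to .nib. All paths here are hypothetical.
import os.path

def _BundleResourceDest(resource_dir, res):
  """Maps a source resource path to its path inside the Resources folder."""
  output = resource_dir
  head, filename = os.path.split(res)
  lproj = os.path.split(head)[1]
  if lproj.endswith('.lproj'):   # e.g. 'en.lproj/Localizable.strings'
    output = os.path.join(output, lproj)
  output = os.path.join(output, filename)
  if output.endswith('.xib'):    # compiled XIBs are referenced as .nib
    output = os.path.splitext(output)[0] + '.nib'
  return output

# _BundleResourceDest('App.app/Contents/Resources', 'res/en.lproj/Main.xib')
#   -> 'App.app/Contents/Resources/en.lproj/Main.nib'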
def _GetXcodeEnv(xcode_settings, built_products_dir, srcroot, configuration,
                 additional_settings=None):
  """Return the environment variables that Xcode would set. See
  http://developer.apple.com/library/mac/#documentation/DeveloperTools/Reference/XcodeBuildSettingRef/1-Build_Setting_Reference/build_setting_ref.html#//apple_ref/doc/uid/TP40003931-CH3-SW153
  for a full list.

  Args:
      xcode_settings: An XcodeSettings object. If this is None, this function
          returns an empty dict.
      built_products_dir: Absolute path to the built products dir.
      srcroot: Absolute path to the source root.
      configuration: The build configuration name.
      additional_settings: An optional dict with more values to add to the
          result.
  """
  if not xcode_settings: return {}

  # This function is considered a friend of XcodeSettings, so let it reach
  # into its implementation details.
  spec = xcode_settings.spec

  # These are filled in on an as-needed basis.
  env = {
    'BUILT_FRAMEWORKS_DIR' : built_products_dir,
    'BUILT_PRODUCTS_DIR' : built_products_dir,
    'CONFIGURATION' : configuration,
    'PRODUCT_NAME' : xcode_settings.GetProductName(),
    # See /Developer/Platforms/MacOSX.platform/Developer/Library/Xcode/Specifications/MacOSX\ Product\ Types.xcspec for FULL_PRODUCT_NAME
    'SRCROOT' : srcroot,
    'SOURCE_ROOT': '${SRCROOT}',
    # This is not true for static libraries, but currently the env is only
    # written for bundles:
    'TARGET_BUILD_DIR' : built_products_dir,
    'TEMP_DIR' : '${TMPDIR}',
  }
  if xcode_settings.GetPerConfigSetting('SDKROOT', configuration):
    env['SDKROOT'] = xcode_settings._SdkPath(configuration)
  else:
    env['SDKROOT'] = ''

  if spec['type'] in (
      'executable', 'static_library', 'shared_library', 'loadable_module'):
    env['EXECUTABLE_NAME'] = xcode_settings.GetExecutableName()
    env['EXECUTABLE_PATH'] = xcode_settings.GetExecutablePath()
    env['FULL_PRODUCT_NAME'] = xcode_settings.GetFullProductName()
    mach_o_type = xcode_settings.GetMachOType()
    if mach_o_type:
      env['MACH_O_TYPE'] = mach_o_type
    env['PRODUCT_TYPE'] = xcode_settings.GetProductType()
  if xcode_settings._IsBundle():
    env['CONTENTS_FOLDER_PATH'] = \
      xcode_settings.GetBundleContentsFolderPath()
    env['UNLOCALIZED_RESOURCES_FOLDER_PATH'] = \
        xcode_settings.GetBundleResourceFolder()
    env['INFOPLIST_PATH'] = xcode_settings.GetBundlePlistPath()
    env['WRAPPER_NAME'] = xcode_settings.GetWrapperName()

  install_name = xcode_settings.GetInstallName()
  if install_name:
    env['LD_DYLIB_INSTALL_NAME'] = install_name
  install_name_base = xcode_settings.GetInstallNameBase()
  if install_name_base:
    env['DYLIB_INSTALL_NAME_BASE'] = install_name_base
  if XcodeVersion() >= '0500' and not env.get('SDKROOT'):
    sdk_root = xcode_settings._SdkRoot(configuration)
    if not sdk_root:
      sdk_root = xcode_settings._XcodeSdkPath('')
    if sdk_root is None:
      sdk_root = ''
    env['SDKROOT'] = sdk_root

  if not additional_settings:
    additional_settings = {}
  else:
    # Flatten lists to strings.
    for k in additional_settings:
      if not isinstance(additional_settings[k], str):
        additional_settings[k] = ' '.join(additional_settings[k])
  additional_settings.update(env)

  for k in additional_settings:
    additional_settings[k] = _NormalizeEnvVarReferences(additional_settings[k])

  return additional_settings


def _NormalizeEnvVarReferences(str):
  """Takes a string containing variable references in the form ${FOO}, $(FOO),
  or $FOO, and returns a string with all variable references in the form
  ${FOO}.
  """
  # $FOO -> ${FOO}
  str = re.sub(r'\$([a-zA-Z_][a-zA-Z0-9_]*)', r'${\1}', str)

  # $(FOO) -> ${FOO}
  matches = re.findall(r'(\$\(([a-zA-Z0-9\-_]+)\))', str)
  for match in matches:
    to_replace, variable = match
    assert '$(' not in match, '$($(FOO)) variables not supported: ' + match
    str = str.replace(to_replace, '${' + variable + '}')

  return str


def ExpandEnvVars(string, expansions):
  """Expands ${VARIABLES}, $(VARIABLES), and $VARIABLES in string per the
  expansions list. If the variable expands to something that references
  another variable, this variable is expanded as well if it's in env --
  until no variables present in env are left."""
  for k, v in reversed(expansions):
    string = string.replace('${' + k + '}', v)
    string = string.replace('$(' + k + ')', v)
    string = string.replace('$' + k, v)
  return string


def _TopologicallySortedEnvVarKeys(env):
  """Takes a dict |env| whose values are strings that can refer to other keys,
  for example env['foo'] = '$(bar) and $(baz)'. Returns a list L of all keys
  of env such that key2 is after key1 in L if env[key2] refers to env[key1].

  Throws an Exception in case of dependency cycles.
  """
  # Since environment variables can refer to other variables, the evaluation
  # order is important. Below is the logic to compute the dependency graph
  # and sort it.
  regex = re.compile(r'\$\{([a-zA-Z0-9\-_]+)\}')
  def GetEdges(node):
    # Use a definition of edges such that user_of_variable -> used_variable.
    # This happens to be easier in this case, since a variable's
    # definition contains all variables it references in a single string.
    # We can then reverse the result of the topological sort at the end.
    # Since: reverse(topsort(DAG)) = topsort(reverse_edges(DAG))
    matches = set([v for v in regex.findall(env[node]) if v in env])
    for dependee in matches:
      assert '${' not in dependee, 'Nested variables not supported: ' + \
          dependee
    return matches

  try:
    # Topologically sort, and then reverse, because we used an edge definition
    # that's inverted from the expected result of this function (see comment
    # above).
    order = gyp.common.TopologicallySorted(env.keys(), GetEdges)
    order.reverse()
    return order
  except gyp.common.CycleError, e:
    raise GypError(
        'Xcode environment variables are cyclically dependent: ' +
        str(e.nodes))


def GetSortedXcodeEnv(xcode_settings, built_products_dir, srcroot,
                      configuration, additional_settings=None):
  env = _GetXcodeEnv(xcode_settings, built_products_dir, srcroot,
                     configuration, additional_settings)
  return [(key, env[key]) for key in _TopologicallySortedEnvVarKeys(env)]


def GetSpecPostbuildCommands(spec, quiet=False):
  """Returns the list of postbuilds explicitly defined on |spec|, in a form
  executable by a shell."""
  postbuilds = []
  for postbuild in spec.get('postbuilds', []):
    if not quiet:
      postbuilds.append('echo POSTBUILD\\(%s\\) %s' % (
            spec['target_name'], postbuild['postbuild_name']))
    postbuilds.append(gyp.common.EncodePOSIXShellList(postbuild['action']))
  return postbuilds


def _HasIOSTarget(targets):
  """Returns true if any target contains the iOS specific key
  IPHONEOS_DEPLOYMENT_TARGET."""
  for target_dict in targets.values():
    for config in target_dict['configurations'].values():
      if config.get('xcode_settings', {}).get('IPHONEOS_DEPLOYMENT_TARGET'):
        return True
  return False


def _AddIOSDeviceConfigurations(targets):
  """Clone all targets and append -iphoneos to the name. Configure these
  targets to build for iOS devices and use correct architectures for those
  builds."""
  for target_dict in targets.itervalues():
    toolset = target_dict['toolset']
    configs = target_dict['configurations']
    for config_name, config_dict in dict(configs).iteritems():
      iphoneos_config_dict = copy.deepcopy(config_dict)
      configs[config_name + '-iphoneos'] = iphoneos_config_dict
      configs[config_name + '-iphonesimulator'] = config_dict
      if toolset == 'target':
        iphoneos_config_dict['xcode_settings']['SDKROOT'] = 'iphoneos'
  return targets


def CloneConfigurationForDeviceAndEmulator(target_dicts):
  """If |target_dicts| contains any iOS targets, automatically create
  -iphoneos targets for iOS device builds."""
  if _HasIOSTarget(target_dicts):
    return _AddIOSDeviceConfigurations(target_dicts)
  return target_dicts
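# A minimal standalone sketch (not part of gyp) of why the env keys above are
# topologically sorted before expansion: a variable's definition must be
# resolved before anything that references it. The variable names below are
# made up.
import re

def _SortAndExpand(env):
  """Expands ${VAR} references in env values, in dependency order."""
  expanded = {}
  pending = dict(env)
  while pending:
    for key, value in list(pending.items()):
      if not re.search(r'\$\{(\w+)\}', value):  # no unresolved references
        expanded[key] = value
        del pending[key]
        # Substitute this now-final value everywhere it is referenced.
        for other in pending:
          pending[other] = pending[other].replace('${%s}' % key, value)
        break
    else:
      raise ValueError('cyclic environment variables: %r' % pending.keys())
  return expanded

# _SortAndExpand({'SRCROOT': '/src', 'SOURCE_ROOT': '${SRCROOT}'})
#   -> {'SRCROOT': '/src', 'SOURCE_ROOT': '/src'}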
""" (build_file_root, build_file_ext) = os.path.splitext(main_gyp) workspace_path = build_file_root + '.xcworkspace' options = params['options'] if options.generator_output: workspace_path = os.path.join(options.generator_output, workspace_path) try: os.makedirs(workspace_path) except OSError, e: if e.errno != errno.EEXIST: raise output_string = '\n' + \ '\n' for gyp_name in [main_gyp, sources_gyp]: name = os.path.splitext(os.path.basename(gyp_name))[0] + '.xcodeproj' name = xml.sax.saxutils.quoteattr("group:" + name) output_string += ' \n' % name output_string += '\n' workspace_file = os.path.join(workspace_path, "contents.xcworkspacedata") try: with open(workspace_file, 'r') as input_file: input_string = input_file.read() if input_string == output_string: return except IOError: # Ignore errors if the file doesn't exist. pass with open(workspace_file, 'w') as output_file: output_file.write(output_string) def _TargetFromSpec(old_spec, params): """ Create fake target for xcode-ninja wrapper. """ # Determine ninja top level build dir (e.g. /path/to/out). ninja_toplevel = None jobs = 0 if params: options = params['options'] ninja_toplevel = \ os.path.join(options.toplevel_dir, gyp.generator.ninja.ComputeOutputDir(params)) jobs = params.get('generator_flags', {}).get('xcode_ninja_jobs', 0) target_name = old_spec.get('target_name') product_name = old_spec.get('product_name', target_name) product_extension = old_spec.get('product_extension') ninja_target = {} ninja_target['target_name'] = target_name ninja_target['product_name'] = product_name if product_extension: ninja_target['product_extension'] = product_extension ninja_target['toolset'] = old_spec.get('toolset') ninja_target['default_configuration'] = old_spec.get('default_configuration') ninja_target['configurations'] = {} # Tell Xcode to look in |ninja_toplevel| for build products. new_xcode_settings = {} if ninja_toplevel: new_xcode_settings['CONFIGURATION_BUILD_DIR'] = \ "%s/$(CONFIGURATION)$(EFFECTIVE_PLATFORM_NAME)" % ninja_toplevel if 'configurations' in old_spec: for config in old_spec['configurations'].iterkeys(): old_xcode_settings = \ old_spec['configurations'][config].get('xcode_settings', {}) if 'IPHONEOS_DEPLOYMENT_TARGET' in old_xcode_settings: new_xcode_settings['CODE_SIGNING_REQUIRED'] = "NO" new_xcode_settings['IPHONEOS_DEPLOYMENT_TARGET'] = \ old_xcode_settings['IPHONEOS_DEPLOYMENT_TARGET'] ninja_target['configurations'][config] = {} ninja_target['configurations'][config]['xcode_settings'] = \ new_xcode_settings ninja_target['mac_bundle'] = old_spec.get('mac_bundle', 0) ninja_target['ios_app_extension'] = old_spec.get('ios_app_extension', 0) ninja_target['ios_watchkit_extension'] = \ old_spec.get('ios_watchkit_extension', 0) ninja_target['ios_watchkit_app'] = old_spec.get('ios_watchkit_app', 0) ninja_target['type'] = old_spec['type'] if ninja_toplevel: ninja_target['actions'] = [ { 'action_name': 'Compile and copy %s via ninja' % target_name, 'inputs': [], 'outputs': [], 'action': [ 'env', 'PATH=%s' % os.environ['PATH'], 'ninja', '-C', new_xcode_settings['CONFIGURATION_BUILD_DIR'], target_name, ], 'message': 'Compile and copy %s via ninja' % target_name, }, ] if jobs > 0: ninja_target['actions'][0]['action'].extend(('-j', jobs)) return ninja_target def IsValidTargetForWrapper(target_extras, executable_target_pattern, spec): """Limit targets for Xcode wrapper. Xcode sometimes performs poorly with too many targets, so only include proper executable targets, with filters to customize. 
Arguments: target_extras: Regular expression to always add, matching any target. executable_target_pattern: Regular expression limiting executable targets. spec: Specifications for target. """ target_name = spec.get('target_name') # Always include targets matching target_extras. if target_extras is not None and re.search(target_extras, target_name): return True # Otherwise just show executable targets. if spec.get('type', '') == 'executable' and \ spec.get('product_extension', '') != 'bundle': # If there is a filter and the target does not match, exclude the target. if executable_target_pattern is not None: if not re.search(executable_target_pattern, target_name): return False return True return False def CreateWrapper(target_list, target_dicts, data, params): """Initialize targets for the ninja wrapper. This sets up the necessary variables in the targets to generate Xcode projects that use ninja as an external builder. Arguments: target_list: List of target pairs: 'base/base.gyp:base'. target_dicts: Dict of target properties keyed on target pair. data: Dict of flattened build files keyed on gyp path. params: Dict of global options for gyp. """ orig_gyp = params['build_files'][0] for gyp_name, gyp_dict in data.iteritems(): if gyp_name == orig_gyp: depth = gyp_dict['_DEPTH'] # Check for custom main gyp name, otherwise use the default CHROMIUM_GYP_FILE # and prepend .ninja before the .gyp extension. generator_flags = params.get('generator_flags', {}) main_gyp = generator_flags.get('xcode_ninja_main_gyp', None) if main_gyp is None: (build_file_root, build_file_ext) = os.path.splitext(orig_gyp) main_gyp = build_file_root + ".ninja" + build_file_ext # Create new |target_list|, |target_dicts| and |data| data structures. new_target_list = [] new_target_dicts = {} new_data = {} # Set base keys needed for |data|. new_data[main_gyp] = {} new_data[main_gyp]['included_files'] = [] new_data[main_gyp]['targets'] = [] new_data[main_gyp]['xcode_settings'] = \ data[orig_gyp].get('xcode_settings', {}) # Normally the xcode-ninja generator includes only valid executable targets. # If |xcode_ninja_executable_target_pattern| is set, that list is reduced to # executable targets that match the pattern. (Default all) executable_target_pattern = \ generator_flags.get('xcode_ninja_executable_target_pattern', None) # For including other non-executable targets, add the matching target name # to the |xcode_ninja_target_pattern| regular expression. (Default none) target_extras = generator_flags.get('xcode_ninja_target_pattern', None) for old_qualified_target in target_list: spec = target_dicts[old_qualified_target] if IsValidTargetForWrapper(target_extras, executable_target_pattern, spec): # Add to new_target_list. target_name = spec.get('target_name') new_target_name = '%s:%s#target' % (main_gyp, target_name) new_target_list.append(new_target_name) # Add to new_target_dicts. new_target_dicts[new_target_name] = _TargetFromSpec(spec, params) # Add to new_data. for old_target in data[old_qualified_target.split(':')[0]]['targets']: if old_target['target_name'] == target_name: new_data_target = {} new_data_target['target_name'] = old_target['target_name'] new_data_target['toolset'] = old_target['toolset'] new_data[main_gyp]['targets'].append(new_data_target) # Create sources target. 
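# A quick illustrative check (not part of gyp) of the filtering rules in
# IsValidTargetForWrapper() above; the specs here are hypothetical.
assert IsValidTargetForWrapper(
    None, None, {'target_name': 'app', 'type': 'executable'})
assert not IsValidTargetForWrapper(
    None, None, {'target_name': 'base', 'type': 'static_library'})
# A target_extras pattern whitelists non-executables by name:
assert IsValidTargetForWrapper(
    r'^base$', None, {'target_name': 'base', 'type': 'static_library'})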
def CreateWrapper(target_list, target_dicts, data, params):
  """Initialize targets for the ninja wrapper.

  This sets up the necessary variables in the targets to generate Xcode
  projects that use ninja as an external builder.
  Arguments:
    target_list: List of target pairs: 'base/base.gyp:base'.
    target_dicts: Dict of target properties keyed on target pair.
    data: Dict of flattened build files keyed on gyp path.
    params: Dict of global options for gyp.
  """
  orig_gyp = params['build_files'][0]
  for gyp_name, gyp_dict in data.iteritems():
    if gyp_name == orig_gyp:
      depth = gyp_dict['_DEPTH']

  # Check for custom main gyp name, otherwise use the default CHROMIUM_GYP_FILE
  # and prepend .ninja before the .gyp extension.
  generator_flags = params.get('generator_flags', {})
  main_gyp = generator_flags.get('xcode_ninja_main_gyp', None)
  if main_gyp is None:
    (build_file_root, build_file_ext) = os.path.splitext(orig_gyp)
    main_gyp = build_file_root + ".ninja" + build_file_ext

  # Create new |target_list|, |target_dicts| and |data| data structures.
  new_target_list = []
  new_target_dicts = {}
  new_data = {}

  # Set base keys needed for |data|.
  new_data[main_gyp] = {}
  new_data[main_gyp]['included_files'] = []
  new_data[main_gyp]['targets'] = []
  new_data[main_gyp]['xcode_settings'] = \
      data[orig_gyp].get('xcode_settings', {})

  # Normally the xcode-ninja generator includes only valid executable targets.
  # If |xcode_ninja_executable_target_pattern| is set, that list is reduced to
  # executable targets that match the pattern. (Default all)
  executable_target_pattern = \
      generator_flags.get('xcode_ninja_executable_target_pattern', None)

  # For including other non-executable targets, add the matching target name
  # to the |xcode_ninja_target_pattern| regular expression. (Default none)
  target_extras = generator_flags.get('xcode_ninja_target_pattern', None)

  for old_qualified_target in target_list:
    spec = target_dicts[old_qualified_target]
    if IsValidTargetForWrapper(target_extras, executable_target_pattern, spec):
      # Add to new_target_list.
      target_name = spec.get('target_name')
      new_target_name = '%s:%s#target' % (main_gyp, target_name)
      new_target_list.append(new_target_name)

      # Add to new_target_dicts.
      new_target_dicts[new_target_name] = _TargetFromSpec(spec, params)

      # Add to new_data.
      for old_target in data[old_qualified_target.split(':')[0]]['targets']:
        if old_target['target_name'] == target_name:
          new_data_target = {}
          new_data_target['target_name'] = old_target['target_name']
          new_data_target['toolset'] = old_target['toolset']
          new_data[main_gyp]['targets'].append(new_data_target)

  # Create sources target.
  sources_target_name = 'sources_for_indexing'
  sources_target = _TargetFromSpec(
    { 'target_name' : sources_target_name,
      'toolset': 'target',
      'default_configuration': 'Default',
      'mac_bundle': '0',
      'type': 'executable'
    }, None)

  # Tell Xcode to look everywhere for headers.
  sources_target['configurations'] = {'Default': { 'include_dirs': [ depth ] } }

  sources = []
  for target, target_dict in target_dicts.iteritems():
    base = os.path.dirname(target)
    files = target_dict.get('sources', []) + \
            target_dict.get('mac_bundle_resources', [])
    for action in target_dict.get('actions', []):
      files.extend(action.get('inputs', []))
    # Remove files starting with $. These are mostly intermediate files for
    # the build system.
    files = [ file for file in files if not file.startswith('$')]

    # Make sources relative to root build file.
    relative_path = os.path.dirname(main_gyp)
    sources += [ os.path.relpath(os.path.join(base, file), relative_path)
                 for file in files ]

  sources_target['sources'] = sorted(set(sources))

  # Put sources_to_index in its own gyp.
  sources_gyp = \
      os.path.join(os.path.dirname(main_gyp), sources_target_name + ".gyp")
  fully_qualified_target_name = \
      '%s:%s#target' % (sources_gyp, sources_target_name)

  # Add to new_target_list, new_target_dicts and new_data.
  new_target_list.append(fully_qualified_target_name)
  new_target_dicts[fully_qualified_target_name] = sources_target
  new_data_target = {}
  new_data_target['target_name'] = sources_target['target_name']
  new_data_target['_DEPTH'] = depth
  new_data_target['toolset'] = "target"
  new_data[sources_gyp] = {}
  new_data[sources_gyp]['targets'] = []
  new_data[sources_gyp]['included_files'] = []
  new_data[sources_gyp]['xcode_settings'] = \
      data[orig_gyp].get('xcode_settings', {})
  new_data[sources_gyp]['targets'].append(new_data_target)

  # Write workspace to file.
  _WriteWorkspace(main_gyp, sources_gyp, params)
  return (new_target_list, new_target_dicts, new_data)
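# A tiny illustrative sketch (not part of gyp) of the naming convention
# CreateWrapper() applies: foo/foo.gyp becomes foo/foo.ninja.gyp, and targets
# are requalified against the new build file. The names are hypothetical.
import os.path

def _NinjaWrapperNames(orig_gyp, target_name):
  root, ext = os.path.splitext(orig_gyp)
  main_gyp = root + '.ninja' + ext
  return main_gyp, '%s:%s#target' % (main_gyp, target_name)

# _NinjaWrapperNames('foo/foo.gyp', 'app')
#   -> ('foo/foo.ninja.gyp', 'foo/foo.ninja.gyp:app#target')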
The root element in the property list is a dictionary that contains several properties of minimal interest, and two properties of immense interest. The most important property is a dictionary named "objects". The entire structure of the project is represented by the children of this property. The objects dictionary is keyed by unique 96-bit values represented by 24 uppercase hexadecimal characters. Each value in the objects dictionary is itself a dictionary, describing an individual object. Each object in the dictionary is a member of a class, which is identified by the "isa" property of each object. A variety of classes are represented in a project file. Objects can refer to other objects by ID, using the 24-character hexadecimal object key. A project's objects form a tree, with a root object of class PBXProject at the root. As an example, the PBXProject object serves as parent to an XCConfigurationList object defining the build configurations used in the project, a PBXGroup object serving as a container for all files referenced in the project, and a list of target objects, each of which defines a target in the project. There are several different types of target object, such as PBXNativeTarget and PBXAggregateTarget. In this module, this relationship is expressed by having each target type derive from an abstract base named XCTarget. The project.pbxproj file's root dictionary also contains a property, sibling to the "objects" dictionary, named "rootObject". The value of rootObject is a 24-character object key referring to the root PBXProject object in the objects dictionary. In Xcode, every file used as input to a target or produced as a final product of a target must appear somewhere in the hierarchy rooted at the PBXGroup object referenced by the PBXProject's mainGroup property. A PBXGroup is generally represented as a folder in the Xcode application. PBXGroups can contain other PBXGroups as well as PBXFileReferences, which are pointers to actual files. Each XCTarget contains a list of build phases, represented in this module by the abstract base XCBuildPhase. Examples of concrete XCBuildPhase derivations are PBXSourcesBuildPhase and PBXFrameworksBuildPhase, which correspond to the "Compile Sources" and "Link Binary With Libraries" phases displayed in the Xcode application. Files used as input to these phases (for example, source files in the former case and libraries and frameworks in the latter) are represented by PBXBuildFile objects, referenced by elements of "files" lists in XCTarget objects. Each PBXBuildFile object refers to a PBXFileReference object as a "weak" reference: it does not "own" the PBXFileReference, which is owned by the root object's mainGroup or a descendant group. In most cases, the layer of indirection between an XCBuildPhase and a PBXFileReference via a PBXBuildFile appears extraneous, but there's actually one reason for this: file-specific compiler flags are added to the PBXBuildFile object so as to allow a single file to be a member of multiple targets while having distinct compiler flags for each. These flags can be modified in the Xcode application in the "Build" tab of a File Info window. When a project is open in the Xcode application, Xcode will rewrite it. As such, this module is careful to adhere to the formatting used by Xcode, to avoid insignificant changes appearing in the file when it is used in the Xcode application.
This will keep version control repositories happy, and make it possible to compare a project file used in Xcode to one generated by this module to determine if any significant changes were made in the application. Xcode has its own way of assigning 24-character identifiers to each object, which is not duplicated here. Because the identifier is only generated once, when an object is created, and is then left unchanged, there is no need to attempt to duplicate Xcode's behavior in this area. The generator is free to select any identifier, even at random, to refer to the objects it creates, and Xcode will retain those identifiers and use them when subsequently rewriting the project file. However, the generator would choose new random identifiers each time the project files are generated, leading to difficulties comparing "used" project files to "pristine" ones produced by this module, and causing the appearance of changes as every object identifier is changed when updated projects are checked in to a version control repository. To mitigate this problem, this module chooses identifiers in a more deterministic way, by hashing a description of each object as well as its parent and ancestor objects. This strategy should result in minimal "shift" in IDs as successive generations of project files are produced. THIS MODULE This module introduces several classes, all derived from the XCObject class. Nearly all of the "brains" are built into the XCObject class, which understands how to create and modify objects, maintain the proper tree structure, compute identifiers, and print objects. For the most part, classes derived from XCObject need only provide a _schema class object, a dictionary that expresses what properties objects of the class may contain. Given this structure, it's possible to build a minimal project file by creating objects of the appropriate types and making the proper connections: config_list = XCConfigurationList() group = PBXGroup() project = PBXProject({'buildConfigurationList': config_list, 'mainGroup': group}) With the project object set up, it can be added to an XCProjectFile object. XCProjectFile is a pseudo-class in the sense that it is a concrete XCObject subclass that does not actually correspond to a class type found in a project file. Rather, it is used to represent the project file's root dictionary. Printing an XCProjectFile will print the entire project file, including the full "objects" dictionary. project_file = XCProjectFile({'rootObject': project}) project_file.ComputeIDs() project_file.Print() Xcode project files are always encoded in UTF-8. This module will accept strings of either the str class or the unicode class. Strings of class str are assumed to already be encoded in UTF-8. Obviously, if you're just using ASCII, you won't encounter difficulties because ASCII is a UTF-8 subset. Strings of class unicode are handled properly and encoded in UTF-8 when a project file is output. """ import gyp.common import posixpath import re import struct import sys # hashlib is supplied as of Python 2.5 as the replacement interface for sha # and other secure hashes. In 2.6, sha is deprecated. Import hashlib if # available, avoiding a deprecation warning under 2.6. Import sha otherwise, # preserving 2.4 compatibility. try: import hashlib _new_sha1 = hashlib.sha1 except ImportError: import sha _new_sha1 = sha.new # See XCObject._EncodeString. This pattern is used to determine when a string # can be printed unquoted. Strings that match this pattern may be printed # unquoted.
Strings that do not match must be quoted and may be further # transformed to be properly encoded. Note that this expression matches the # characters listed with "+", for 1 or more occurrences: if a string is empty, # it must not match this pattern, because it needs to be encoded as "". _unquoted = re.compile('^[A-Za-z0-9$./_]+$') # Strings that match this pattern are quoted regardless of what _unquoted says. # Oddly, Xcode will quote any string with a run of three or more underscores. _quoted = re.compile('___') # This pattern should match any character that needs to be escaped by # XCObject._EncodeString. See that function. _escaped = re.compile('[\\\\"]|[\x00-\x1f]') # Used by SourceTreeAndPathFromPath _path_leading_variable = re.compile(r'^\$\((.*?)\)(/(.*))?$') def SourceTreeAndPathFromPath(input_path): """Given input_path, returns a tuple with sourceTree and path values. Examples: input_path (source_tree, output_path) '$(VAR)/path' ('VAR', 'path') '$(VAR)' ('VAR', None) 'path' (None, 'path') """ source_group_match = _path_leading_variable.match(input_path) if source_group_match: source_tree = source_group_match.group(1) output_path = source_group_match.group(3) # This may be None. else: source_tree = None output_path = input_path return (source_tree, output_path) def ConvertVariablesToShellSyntax(input_string): return re.sub(r'\$\((.*?)\)', '${\\1}', input_string) class XCObject(object): """The abstract base of all class types used in Xcode project files. Class variables: _schema: A dictionary defining the properties of this class. The keys to _schema are string property keys as used in project files. Values are a list of four or five elements: [ is_list, property_type, is_strong, is_required, default ] is_list: True if the property described is a list, as opposed to a single element. property_type: The type to use as the value of the property, or if is_list is True, the type to use for each element of the value's list. property_type must be an XCObject subclass, or one of the built-in types str, int, or dict. is_strong: If property_type is an XCObject subclass, is_strong is True to assert that this class "owns," or serves as parent, to the property value (or, if is_list is True, values). is_strong must be False if property_type is not an XCObject subclass. is_required: True if the property is required for the class. Note that is_required being True does not preclude an empty string ("", in the case of property_type str) or list ([], in the case of is_list True) from being set for the property. default: Optional. If is_required is True, default may be set to provide a default value for objects that do not supply their own value. If is_required is True and default is not provided, users of the class must supply their own value for the property. Note that although the values of the array are expressed in boolean terms, subclasses provide values as integers to conserve horizontal space. _should_print_single_line: False in XCObject. Subclasses whose objects should be written to the project file in the alternate single-line format, such as PBXFileReference and PBXBuildFile, should set this to True. _encode_transforms: Used by _EncodeString to encode unprintable characters. The index into this list is the ordinal of the character to transform; each value is a string used to represent the character in the output. XCObject provides an _encode_transforms list suitable for most XCObject subclasses.
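To make the _schema format above concrete, a hypothetical subclass (not one of the classes in this module) could declare:

class XCExample(XCObject):
  _schema = XCObject._schema.copy()
  _schema.update({
    # [ is_list, property_type, is_strong, is_required, default ]
    'name':     [0, str,      0, 1],       # required single string, no default
    'children': [1, XCObject, 1, 1, []],   # strong-owned list, defaults to []
    'settings': [0, dict,     0, 0],       # optional dict (never strong)
  })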
_alternate_encode_transforms: Provided for subclasses that wish to use the alternate encoding rules. Xcode seems to use these rules when printing objects in single-line format. Subclasses that desire this behavior should set _encode_transforms to _alternate_encode_transforms. _hashables: A list of XCObject subclasses that can be hashed by ComputeIDs to construct this object's ID. Most classes that need custom hashing behavior should do it by overriding Hashables, but in some cases an object's parent may wish to push a hashable value into its child, and it can do so by appending to _hashables. Attributes: id: The object's identifier, a 24-character uppercase hexadecimal string. Usually, objects being created should not set id until the entire project file structure is built. At that point, UpdateIDs() should be called on the root object to assign deterministic values for id to each object in the tree. parent: The object's parent. This is set by a parent XCObject when a child object is added to it. _properties: The object's property dictionary. An object's properties are described by its class' _schema variable. """ _schema = {} _should_print_single_line = False # See _EncodeString. _encode_transforms = [] i = 0 while i < ord(' '): _encode_transforms.append('\\U%04x' % i) i = i + 1 _encode_transforms[7] = '\\a' _encode_transforms[8] = '\\b' _encode_transforms[9] = '\\t' _encode_transforms[10] = '\\n' _encode_transforms[11] = '\\v' _encode_transforms[12] = '\\f' _encode_transforms[13] = '\\n' _alternate_encode_transforms = list(_encode_transforms) _alternate_encode_transforms[9] = chr(9) _alternate_encode_transforms[10] = chr(10) _alternate_encode_transforms[11] = chr(11) def __init__(self, properties=None, id=None, parent=None): self.id = id self.parent = parent self._properties = {} self._hashables = [] self._SetDefaultsFromSchema() self.UpdateProperties(properties) def __repr__(self): try: name = self.Name() except NotImplementedError: return '<%s at 0x%x>' % (self.__class__.__name__, id(self)) return '<%s %r at 0x%x>' % (self.__class__.__name__, name, id(self)) def Copy(self): """Make a copy of this object. The new object will have its own copy of lists and dicts. Any XCObject objects owned by this object (marked "strong") will be copied in the new object, even those found in lists. If this object has any weak references to other XCObjects, the same references are added to the new object without making a copy. """ that = self.__class__(id=self.id, parent=self.parent) for key, value in self._properties.iteritems(): is_strong = self._schema[key][2] if isinstance(value, XCObject): if is_strong: new_value = value.Copy() new_value.parent = that that._properties[key] = new_value else: that._properties[key] = value elif isinstance(value, str) or isinstance(value, unicode) or \ isinstance(value, int): that._properties[key] = value elif isinstance(value, list): if is_strong: # If is_strong is True, each element is an XCObject, so it's safe to # call Copy. that._properties[key] = [] for item in value: new_item = item.Copy() new_item.parent = that that._properties[key].append(new_item) else: that._properties[key] = value[:] elif isinstance(value, dict): # dicts are never strong. 
if is_strong: raise TypeError('Strong dict for key ' + key + ' in ' + \ self.__class__.__name__) else: that._properties[key] = value.copy() else: raise TypeError('Unexpected type ' + value.__class__.__name__ + \ ' for key ' + key + ' in ' + self.__class__.__name__) return that def Name(self): """Return the name corresponding to an object. Not all objects necessarily need to be nameable, and not all that do have a "name" property. Override as needed. """ # If the schema indicates that "name" is required, try to access the # property even if it doesn't exist. This will result in a KeyError # being raised for the property that should be present, which seems more # appropriate than NotImplementedError in this case. if 'name' in self._properties or \ ('name' in self._schema and self._schema['name'][3]): return self._properties['name'] raise NotImplementedError(self.__class__.__name__ + ' must implement Name') def Comment(self): """Return a comment string for the object. Most objects just use their name as the comment, but PBXProject uses different values. The returned comment is not escaped and does not have any comment marker strings applied to it. """ return self.Name() def Hashables(self): hashables = [self.__class__.__name__] name = self.Name() if name != None: hashables.append(name) hashables.extend(self._hashables) return hashables def HashablesForChild(self): return None def ComputeIDs(self, recursive=True, overwrite=True, seed_hash=None): """Set "id" properties deterministically. An object's "id" property is set based on a hash of its class type and name, as well as the class type and name of all ancestor objects. As such, it is only advisable to call ComputeIDs once an entire project file tree is built. If recursive is True, recurse into all descendant objects and update their hashes. If overwrite is True, any existing value set in the "id" property will be replaced. """ def _HashUpdate(hash, data): """Update hash with data's length and contents. If the hash were updated only with the value of data, it would be possible for clowns to induce collisions by manipulating the names of their objects. By adding the length, it's exceedingly less likely that ID collisions will be encountered, intentionally or not. """ hash.update(struct.pack('>i', len(data))) hash.update(data) if seed_hash is None: seed_hash = _new_sha1() hash = seed_hash.copy() hashables = self.Hashables() assert len(hashables) > 0 for hashable in hashables: _HashUpdate(hash, hashable) if recursive: hashables_for_child = self.HashablesForChild() if hashables_for_child is None: child_hash = hash else: assert len(hashables_for_child) > 0 child_hash = seed_hash.copy() for hashable in hashables_for_child: _HashUpdate(child_hash, hashable) for child in self.Children(): child.ComputeIDs(recursive, overwrite, child_hash) if overwrite or self.id is None: # Xcode IDs are only 96 bits (24 hex characters), but a SHA-1 digest is # 160 bits. Instead of throwing out 64 bits of the digest, xor them # into the portion that gets used. assert hash.digest_size % 4 == 0 digest_int_count = hash.digest_size / 4 digest_ints = struct.unpack('>' + 'I' * digest_int_count, hash.digest()) id_ints = [0, 0, 0] for index in xrange(0, digest_int_count): id_ints[index % 3] ^= digest_ints[index] self.id = '%08X%08X%08X' % tuple(id_ints) def EnsureNoIDCollisions(self): """Verifies that no two objects have the same ID. Checks all descendants.
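The xor-folding in the comment above is easiest to see in isolation. A minimal sketch of the same arithmetic (Python 2, like the rest of this module; the function name is illustrative, not part of the module):

import hashlib
import struct

def _fold_to_xcode_id(data):
  digest = hashlib.sha1(data).digest()  # 20 bytes: five 32-bit words
  words = struct.unpack('>5I', digest)
  id_ints = [0, 0, 0]
  for index in xrange(5):
    id_ints[index % 3] ^= words[index]  # fold 160 bits into 96
  return '%08X%08X%08X' % tuple(id_ints)

# Always 24 uppercase hexadecimal characters, and stable for a given input:
assert len(_fold_to_xcode_id('PBXGroup.Source')) == 24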
""" ids = {} descendants = self.Descendants() for descendant in descendants: if descendant.id in ids: other = ids[descendant.id] raise KeyError( 'Duplicate ID %s, objects "%s" and "%s" in "%s"' % \ (descendant.id, str(descendant._properties), str(other._properties), self._properties['rootObject'].Name())) ids[descendant.id] = descendant def Children(self): """Returns a list of all of this object's owned (strong) children.""" children = [] for property, attributes in self._schema.iteritems(): (is_list, property_type, is_strong) = attributes[0:3] if is_strong and property in self._properties: if not is_list: children.append(self._properties[property]) else: children.extend(self._properties[property]) return children def Descendants(self): """Returns a list of all of this object's descendants, including this object. """ children = self.Children() descendants = [self] for child in children: descendants.extend(child.Descendants()) return descendants def PBXProjectAncestor(self): # The base case for recursion is defined at PBXProject.PBXProjectAncestor. if self.parent: return self.parent.PBXProjectAncestor() return None def _EncodeComment(self, comment): """Encodes a comment to be placed in the project file output, mimicing Xcode behavior. """ # This mimics Xcode behavior by wrapping the comment in "/*" and "*/". If # the string already contains a "*/", it is turned into "(*)/". This keeps # the file writer from outputting something that would be treated as the # end of a comment in the middle of something intended to be entirely a # comment. return '/* ' + comment.replace('*/', '(*)/') + ' */' def _EncodeTransform(self, match): # This function works closely with _EncodeString. It will only be called # by re.sub with match.group(0) containing a character matched by the # the _escaped expression. char = match.group(0) # Backslashes (\) and quotation marks (") are always replaced with a # backslash-escaped version of the same. Everything else gets its # replacement from the class' _encode_transforms array. if char == '\\': return '\\\\' if char == '"': return '\\"' return self._encode_transforms[ord(char)] def _EncodeString(self, value): """Encodes a string to be placed in the project file output, mimicing Xcode behavior. """ # Use quotation marks when any character outside of the range A-Z, a-z, 0-9, # $ (dollar sign), . (period), and _ (underscore) is present. Also use # quotation marks to represent empty strings. # # Escape " (double-quote) and \ (backslash) by preceding them with a # backslash. # # Some characters below the printable ASCII range are encoded specially: # 7 ^G BEL is encoded as "\a" # 8 ^H BS is encoded as "\b" # 11 ^K VT is encoded as "\v" # 12 ^L NP is encoded as "\f" # 127 ^? DEL is passed through as-is without escaping # - In PBXFileReference and PBXBuildFile objects: # 9 ^I HT is passed through as-is without escaping # 10 ^J NL is passed through as-is without escaping # 13 ^M CR is passed through as-is without escaping # - In other objects: # 9 ^I HT is encoded as "\t" # 10 ^J NL is encoded as "\n" # 13 ^M CR is encoded as "\n" rendering it indistinguishable from # 10 ^J NL # All other characters within the ASCII control character range (0 through # 31 inclusive) are encoded as "\U001f" referring to the Unicode code point # in hexadecimal. For example, character 14 (^N SO) is encoded as "\U000e". # Characters above the ASCII range are passed through to the output encoded # as UTF-8 without any escaping. These mappings are contained in the # class' _encode_transforms list. 
if _unquoted.search(value) and not _quoted.search(value): return value return '"' + _escaped.sub(self._EncodeTransform, value) + '"' def _XCPrint(self, file, tabs, line): file.write('\t' * tabs + line) def _XCPrintableValue(self, tabs, value, flatten_list=False): """Returns a representation of value that may be printed in a project file, mimicking Xcode's behavior. _XCPrintableValue can handle str and int values, XCObjects (which are made printable by returning their id property), and list and dict objects composed of any of the above types. When printing a list or dict, and _should_print_single_line is False, the tabs parameter is used to determine how much to indent the lines corresponding to the items in the list or dict. If flatten_list is True, single-element lists will be transformed into strings. """ printable = '' comment = None if self._should_print_single_line: sep = ' ' element_tabs = '' end_tabs = '' else: sep = '\n' element_tabs = '\t' * (tabs + 1) end_tabs = '\t' * tabs if isinstance(value, XCObject): printable += value.id comment = value.Comment() elif isinstance(value, str): printable += self._EncodeString(value) elif isinstance(value, unicode): printable += self._EncodeString(value.encode('utf-8')) elif isinstance(value, int): printable += str(value) elif isinstance(value, list): if flatten_list and len(value) <= 1: if len(value) == 0: printable += self._EncodeString('') else: printable += self._EncodeString(value[0]) else: printable = '(' + sep for item in value: printable += element_tabs + \ self._XCPrintableValue(tabs + 1, item, flatten_list) + \ ',' + sep printable += end_tabs + ')' elif isinstance(value, dict): printable = '{' + sep for item_key, item_value in sorted(value.iteritems()): printable += element_tabs + \ self._XCPrintableValue(tabs + 1, item_key, flatten_list) + ' = ' + \ self._XCPrintableValue(tabs + 1, item_value, flatten_list) + ';' + \ sep printable += end_tabs + '}' else: raise TypeError("Can't make " + value.__class__.__name__ + ' printable') if comment != None: printable += ' ' + self._EncodeComment(comment) return printable def _XCKVPrint(self, file, tabs, key, value): """Prints a key and value, members of an XCObject's _properties dictionary, to file. tabs is an int identifying the indentation level. If the class' _should_print_single_line variable is True, tabs is ignored and the key-value pair will be followed by a space instead of a newline. """ if self._should_print_single_line: printable = '' after_kv = ' ' else: printable = '\t' * tabs after_kv = '\n' # Xcode usually prints remoteGlobalIDString values in PBXContainerItemProxy # objects without comments. Sometimes it prints them with comments, but # the majority of the time, it doesn't. To avoid unnecessary changes to # the project file after Xcode opens it, don't write comments for # remoteGlobalIDString. This is a sucky hack and it would certainly be # cleaner to extend the schema to indicate whether or not a comment should # be printed, but since this is the only case where the problem occurs and # Xcode itself can't seem to make up its mind, the hack will suffice. # # Also see PBXContainerItemProxy._schema['remoteGlobalIDString']. if key == 'remoteGlobalIDString' and isinstance(self, PBXContainerItemProxy): value_to_print = value.id else: value_to_print = value # PBXBuildFile's settings property is represented in the output as a dict, # but a hack here has it represented as a string. Arrange to strip off the # quotes so that it shows up in the output as expected.
if key == 'settings' and isinstance(self, PBXBuildFile): strip_value_quotes = True else: strip_value_quotes = False # In another one-off, let's set flatten_list on buildSettings properties # of XCBuildConfiguration objects, because that's how Xcode treats them. if key == 'buildSettings' and isinstance(self, XCBuildConfiguration): flatten_list = True else: flatten_list = False try: printable_key = self._XCPrintableValue(tabs, key, flatten_list) printable_value = self._XCPrintableValue(tabs, value_to_print, flatten_list) if strip_value_quotes and len(printable_value) > 1 and \ printable_value[0] == '"' and printable_value[-1] == '"': printable_value = printable_value[1:-1] printable += printable_key + ' = ' + printable_value + ';' + after_kv except TypeError, e: gyp.common.ExceptionAppend(e, 'while printing key "%s"' % key) raise self._XCPrint(file, 0, printable) def Print(self, file=sys.stdout): """Prints a representation of this object to file, adhering to Xcode output formatting. """ self.VerifyHasRequiredProperties() if self._should_print_single_line: # When printing an object in a single line, Xcode doesn't put any space # between the beginning of a dictionary (or presumably a list) and the # first contained item, so you wind up with snippets like # ...CDEF = {isa = PBXFileReference; fileRef = 0123... # If it were me, I would have put a space in there after the opening # curly, but I guess this is just another one of those inconsistencies # between how Xcode prints PBXFileReference and PBXBuildFile objects as # compared to other objects. Mimic Xcode's behavior here by using an # empty string for sep. sep = '' end_tabs = 0 else: sep = '\n' end_tabs = 2 # Start the object. For example, '\t\tPBXProject = {\n'. self._XCPrint(file, 2, self._XCPrintableValue(2, self) + ' = {' + sep) # "isa" isn't in the _properties dictionary, it's an intrinsic property # of the class which the object belongs to. Xcode always outputs "isa" # as the first element of an object dictionary. self._XCKVPrint(file, 3, 'isa', self.__class__.__name__) # The remaining elements of an object dictionary are sorted alphabetically. for property, value in sorted(self._properties.iteritems()): self._XCKVPrint(file, 3, property, value) # End the object. self._XCPrint(file, end_tabs, '};\n') def UpdateProperties(self, properties, do_copy=False): """Merge the supplied properties into the _properties dictionary. The input properties must adhere to the class schema or a KeyError or TypeError exception will be raised. If adding an object of an XCObject subclass and the schema indicates a strong relationship, the object's parent will be set to this object. If do_copy is True, then lists, dicts, strong-owned XCObjects, and strong-owned XCObjects in lists will be copied instead of having their references added. """ if properties is None: return for property, value in properties.iteritems(): # Make sure the property is in the schema. if not property in self._schema: raise KeyError(property + ' not in ' + self.__class__.__name__) # Make sure the property conforms to the schema. (is_list, property_type, is_strong) = self._schema[property][0:3] if is_list: if value.__class__ != list: raise TypeError( property + ' of ' + self.__class__.__name__ + \ ' must be list, not ' + value.__class__.__name__) for item in value: if not isinstance(item, property_type) and \ not (item.__class__ == unicode and property_type == str): # Accept unicode where str is specified. str is treated as # UTF-8-encoded.
raise TypeError( 'item of ' + property + ' of ' + self.__class__.__name__ + \ ' must be ' + property_type.__name__ + ', not ' + \ item.__class__.__name__) elif not isinstance(value, property_type) and \ not (value.__class__ == unicode and property_type == str): # Accept unicode where str is specified. str is treated as # UTF-8-encoded. raise TypeError( property + ' of ' + self.__class__.__name__ + ' must be ' + \ property_type.__name__ + ', not ' + value.__class__.__name__) # Checks passed, perform the assignment. if do_copy: if isinstance(value, XCObject): if is_strong: self._properties[property] = value.Copy() else: self._properties[property] = value elif isinstance(value, str) or isinstance(value, unicode) or \ isinstance(value, int): self._properties[property] = value elif isinstance(value, list): if is_strong: # If is_strong is True, each element is an XCObject, so it's safe # to call Copy. self._properties[property] = [] for item in value: self._properties[property].append(item.Copy()) else: self._properties[property] = value[:] elif isinstance(value, dict): self._properties[property] = value.copy() else: raise TypeError("Don't know how to copy a " + \ value.__class__.__name__ + ' object for ' + \ property + ' in ' + self.__class__.__name__) else: self._properties[property] = value # Set up the child's back-reference to this object. Don't use |value| # any more because it may not be right if do_copy is true. if is_strong: if not is_list: self._properties[property].parent = self else: for item in self._properties[property]: item.parent = self def HasProperty(self, key): return key in self._properties def GetProperty(self, key): return self._properties[key] def SetProperty(self, key, value): self.UpdateProperties({key: value}) def DelProperty(self, key): if key in self._properties: del self._properties[key] def AppendProperty(self, key, value): # TODO(mark): Support ExtendProperty too (and make this call that)? # Schema validation. if not key in self._schema: raise KeyError(key + ' not in ' + self.__class__.__name__) (is_list, property_type, is_strong) = self._schema[key][0:3] if not is_list: raise TypeError(key + ' of ' + self.__class__.__name__ + ' must be list') if not isinstance(value, property_type): raise TypeError('item of ' + key + ' of ' + self.__class__.__name__ + \ ' must be ' + property_type.__name__ + ', not ' + \ value.__class__.__name__) # If the property doesn't exist yet, create a new empty list to receive the # item. if not key in self._properties: self._properties[key] = [] # Set up the ownership link. if is_strong: value.parent = self # Store the item. self._properties[key].append(value) def VerifyHasRequiredProperties(self): """Ensure that all properties identified as required by the schema are set. """ # TODO(mark): A stronger verification mechanism is needed. Some # subclasses need to perform validation beyond what the schema can enforce. for property, attributes in self._schema.iteritems(): (is_list, property_type, is_strong, is_required) = attributes[0:4] if is_required and not property in self._properties: raise KeyError(self.__class__.__name__ + ' requires ' + property) def _SetDefaultsFromSchema(self): """Assign object default values according to the schema. 
This will not overwrite properties that have already been set.""" defaults = {} for property, attributes in self._schema.iteritems(): (is_list, property_type, is_strong, is_required) = attributes[0:4] if is_required and len(attributes) >= 5 and \ not property in self._properties: default = attributes[4] defaults[property] = default if len(defaults) > 0: # Use do_copy=True so that each new object gets its own copy of strong # objects, lists, and dicts. self.UpdateProperties(defaults, do_copy=True) class XCHierarchicalElement(XCObject): """Abstract base for PBXGroup and PBXFileReference. Not represented in a project file.""" # TODO(mark): Do name and path belong here? Probably so. # If path is set and name is not, name may have a default value. Name will # be set to the basename of path, if the basename of path is different from # the full value of path. If path is already just a leaf name, name will # not be set. _schema = XCObject._schema.copy() _schema.update({ 'comments': [0, str, 0, 0], 'fileEncoding': [0, str, 0, 0], 'includeInIndex': [0, int, 0, 0], 'indentWidth': [0, int, 0, 0], 'lineEnding': [0, int, 0, 0], 'sourceTree': [0, str, 0, 1, ''], 'tabWidth': [0, int, 0, 0], 'usesTabs': [0, int, 0, 0], 'wrapsLines': [0, int, 0, 0], }) def __init__(self, properties=None, id=None, parent=None): # super XCObject.__init__(self, properties, id, parent) if 'path' in self._properties and not 'name' in self._properties: path = self._properties['path'] name = posixpath.basename(path) if name != '' and path != name: self.SetProperty('name', name) if 'path' in self._properties and \ (not 'sourceTree' in self._properties or \ self._properties['sourceTree'] == ''): # If the pathname begins with an Xcode variable like "$(SDKROOT)/", take # the variable out and make the path be relative to that variable by # assigning the variable name as the sourceTree. (source_tree, path) = SourceTreeAndPathFromPath(self._properties['path']) if source_tree != None: self._properties['sourceTree'] = source_tree if path != None: self._properties['path'] = path if source_tree != None and path is None and \ not 'name' in self._properties: # The path was of the form "$(SDKROOT)" with no path following it. # This object is now relative to that variable, so it has no path # attribute of its own. It does, however, keep a name. del self._properties['path'] self._properties['name'] = source_tree def Name(self): if 'name' in self._properties: return self._properties['name'] elif 'path' in self._properties: return self._properties['path'] else: # This happens in the case of the root PBXGroup. return None def Hashables(self): """Custom hashables for XCHierarchicalElements. XCHierarchicalElements are special. Generally, their hashes shouldn't change if the paths don't change. The normal XCObject implementation of Hashables adds a hashable for each object, which means that if the hierarchical structure changes (possibly due to changes caused when TakeOverOnlyChild runs and encounters slight changes in the hierarchy), the hashes will change. For example, if a project file initially contains a/b/f1 and a/b becomes collapsed into a/b, f1 will have a single parent a/b. If someone later adds a/f2 to the project file, a/b can no longer be collapsed, and f1 winds up with parent b and grandparent a. That would be sufficient to change f1's hash. To counteract this problem, hashables for all XCHierarchicalElements except for the main group (which has neither a name nor a path) are taken to be just the set of path components. 
Because hashables are inherited from parents, this provides assurance that a/b/f1 has the same set of hashables whether its parent is b or a/b. The main group is a special case. As it is permitted to have no name or path, it is permitted to use the standard XCObject hash mechanism. This is not considered a problem because there can be only one main group. """ if self == self.PBXProjectAncestor()._properties['mainGroup']: # super return XCObject.Hashables(self) hashables = [] # Put the name in first, ensuring that if TakeOverOnlyChild collapses # children into a top-level group like "Source", the name always goes # into the list of hashables without interfering with path components. if 'name' in self._properties: # Make it less likely for people to manipulate hashes by following the # pattern of always pushing an object type value onto the list first. hashables.append(self.__class__.__name__ + '.name') hashables.append(self._properties['name']) # NOTE: This still has the problem that if an absolute path is encountered, # including paths with a sourceTree, they'll still inherit their parents' # hashables, even though the paths aren't relative to their parents. This # is not expected to be much of a problem in practice. path = self.PathFromSourceTreeAndPath() if path != None: components = path.split(posixpath.sep) for component in components: hashables.append(self.__class__.__name__ + '.path') hashables.append(component) hashables.extend(self._hashables) return hashables def Compare(self, other): # Allow comparison of these types. PBXGroup has the highest sort rank; # PBXVariantGroup is treated as equal to PBXFileReference. valid_class_types = { PBXFileReference: 'file', PBXGroup: 'group', PBXVariantGroup: 'file', } self_type = valid_class_types[self.__class__] other_type = valid_class_types[other.__class__] if self_type == other_type: # If the two objects are of the same sort rank, compare their names. return cmp(self.Name(), other.Name()) # Otherwise, sort groups before everything else. if self_type == 'group': return -1 return 1 def CompareRootGroup(self, other): # This function should be used only to compare direct children of the # containing PBXProject's mainGroup. These groups should appear in the # listed order. # TODO(mark): "Build" is used by gyp.generator.xcode, perhaps the # generator should have a way of influencing this list rather than having # to hardcode for the generator here. order = ['Source', 'Intermediates', 'Projects', 'Frameworks', 'Products', 'Build'] # If the groups aren't in the listed order, do a name comparison. # Otherwise, groups in the listed order should come before those that # aren't. self_name = self.Name() other_name = other.Name() self_in = isinstance(self, PBXGroup) and self_name in order other_in = isinstance(other, PBXGroup) and other_name in order if not self_in and not other_in: return self.Compare(other) if self_name in order and not other_name in order: return -1 if other_name in order and not self_name in order: return 1 # If both groups are in the listed order, go by the defined order. self_index = order.index(self_name) other_index = order.index(other_name) if self_index < other_index: return -1 if self_index > other_index: return 1 return 0 def PathFromSourceTreeAndPath(self): # Turn the object's sourceTree and path properties into a single flat # string of a form comparable to the path parameter. If there's a # sourceTree property other than "", wrap it in $(...) for the # comparison.
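A quick check of the ordering rules above (a sketch; assumes the module has been imported):

group = PBXGroup({'name': 'zlib'})
file_ref = PBXFileReference({'path': 'aaa.c'})
assert group.Compare(file_ref) == -1  # groups sort first, despite the later name
assert file_ref.Compare(group) == 1
# Objects of the same sort rank fall back to a name comparison:
assert PBXGroup({'name': 'a'}).Compare(PBXGroup({'name': 'b'})) == -1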
components = [] if self._properties['sourceTree'] != '': components.append('$(' + self._properties['sourceTree'] + ')') if 'path' in self._properties: components.append(self._properties['path']) if len(components) > 0: return posixpath.join(*components) return None def FullPath(self): # Returns a full path to self relative to the project file, or relative # to some other source tree. Start with self, and walk up the chain of # parents prepending their paths, if any, until no more parents are # available (project-relative path) or until a path relative to some # source tree is found. xche = self path = None while isinstance(xche, XCHierarchicalElement) and \ (path is None or \ (not path.startswith('/') and not path.startswith('$'))): this_path = xche.PathFromSourceTreeAndPath() if this_path != None and path != None: path = posixpath.join(this_path, path) elif this_path != None: path = this_path xche = xche.parent return path class PBXGroup(XCHierarchicalElement): """ Attributes: _children_by_path: Maps pathnames of children of this PBXGroup to the actual child XCHierarchicalElement objects. _variant_children_by_name_and_path: Maps (name, path) tuples of PBXVariantGroup children to the actual child PBXVariantGroup objects. """ _schema = XCHierarchicalElement._schema.copy() _schema.update({ 'children': [1, XCHierarchicalElement, 1, 1, []], 'name': [0, str, 0, 0], 'path': [0, str, 0, 0], }) def __init__(self, properties=None, id=None, parent=None): # super XCHierarchicalElement.__init__(self, properties, id, parent) self._children_by_path = {} self._variant_children_by_name_and_path = {} for child in self._properties.get('children', []): self._AddChildToDicts(child) def Hashables(self): # super hashables = XCHierarchicalElement.Hashables(self) # It is not sufficient to just rely on name and parent to build a unique # hashable : a node could have two child PBXGroup sharing a common name. # To add entropy the hashable is enhanced with the names of all its # children. for child in self._properties.get('children', []): child_name = child.Name() if child_name != None: hashables.append(child_name) return hashables def HashablesForChild(self): # To avoid a circular reference the hashables used to compute a child id do # not include the child names. return XCHierarchicalElement.Hashables(self) def _AddChildToDicts(self, child): # Sets up this PBXGroup object's dicts to reference the child properly. child_path = child.PathFromSourceTreeAndPath() if child_path: if child_path in self._children_by_path: raise ValueError('Found multiple children with path ' + child_path) self._children_by_path[child_path] = child if isinstance(child, PBXVariantGroup): child_name = child._properties.get('name', None) key = (child_name, child_path) if key in self._variant_children_by_name_and_path: raise ValueError('Found multiple PBXVariantGroup children with ' + \ 'name ' + str(child_name) + ' and path ' + \ str(child_path)) self._variant_children_by_name_and_path[key] = child def AppendChild(self, child): # Callers should use this instead of calling # AppendProperty('children', child) directly because this function # maintains the group's dicts. self.AppendProperty('children', child) self._AddChildToDicts(child) def GetChildByName(self, name): # This is not currently optimized with a dict as GetChildByPath is because # it has few callers. Most callers probably want GetChildByPath. This # function is only useful to get children that have names but no paths, # which is rare. 
The children of the main group ("Source", "Products", # etc.) are pretty much the only case where this is likely to come up. # # TODO(mark): Maybe this should raise an error if more than one child is # present with the same name. if not 'children' in self._properties: return None for child in self._properties['children']: if child.Name() == name: return child return None def GetChildByPath(self, path): if not path: return None if path in self._children_by_path: return self._children_by_path[path] return None def GetChildByRemoteObject(self, remote_object): # This method is a little bit esoteric. Given a remote_object, which # should be a PBXFileReference in another project file, this method will # return this group's PBXReferenceProxy object serving as a local proxy # for the remote PBXFileReference. # # This function might benefit from a dict optimization as GetChildByPath # for some workloads, but profiling shows that it's not currently a # problem. if not 'children' in self._properties: return None for child in self._properties['children']: if not isinstance(child, PBXReferenceProxy): continue container_proxy = child._properties['remoteRef'] if container_proxy._properties['remoteGlobalIDString'] == remote_object: return child return None def AddOrGetFileByPath(self, path, hierarchical): """Returns an existing or new file reference corresponding to path. If hierarchical is True, this method will create or use the necessary hierarchical group structure corresponding to path. Otherwise, it will look in and create an item in the current group only. If an existing matching reference is found, it is returned, otherwise, a new one will be created, added to the correct group, and returned. If path identifies a directory by virtue of carrying a trailing slash, this method returns a PBXFileReference of "folder" type. If path identifies a variant, by virtue of it identifying a file inside a directory with an ".lproj" extension, this method returns a PBXVariantGroup containing the variant named by path, and possibly other variants. For all other paths, a "normal" PBXFileReference will be returned. """ # Adding or getting a directory? Directories end with a trailing slash. is_dir = False if path.endswith('/'): is_dir = True path = posixpath.normpath(path) if is_dir: path = path + '/' # Adding or getting a variant? Variants are files inside directories # with an ".lproj" extension. Xcode uses variants for localization. For # a variant path/to/Language.lproj/MainMenu.nib, put a variant group named # MainMenu.nib inside path/to, and give it a variant named Language. In # this example, grandparent would be set to path/to and parent_root would # be set to Language. variant_name = None parent = posixpath.dirname(path) grandparent = posixpath.dirname(parent) parent_basename = posixpath.basename(parent) (parent_root, parent_ext) = posixpath.splitext(parent_basename) if parent_ext == '.lproj': variant_name = parent_root if grandparent == '': grandparent = None # Putting a directory inside a variant group is not currently supported. assert not is_dir or variant_name is None path_split = path.split(posixpath.sep) if len(path_split) == 1 or \ ((is_dir or variant_name != None) and len(path_split) == 2) or \ not hierarchical: # The PBXFileReference or PBXVariantGroup will be added to or gotten from # this PBXGroup, no recursion necessary. if variant_name is None: # Add or get a PBXFileReference.
file_ref = self.GetChildByPath(path) if file_ref != None: assert file_ref.__class__ == PBXFileReference else: file_ref = PBXFileReference({'path': path}) self.AppendChild(file_ref) else: # Add or get a PBXVariantGroup. The variant group name is the same # as the basename (MainMenu.nib in the example above). grandparent # specifies the path to the variant group itself, and path_split[-2:] # is the path of the specific variant relative to its group. variant_group_name = posixpath.basename(path) variant_group_ref = self.AddOrGetVariantGroupByNameAndPath( variant_group_name, grandparent) variant_path = posixpath.sep.join(path_split[-2:]) variant_ref = variant_group_ref.GetChildByPath(variant_path) if variant_ref != None: assert variant_ref.__class__ == PBXFileReference else: variant_ref = PBXFileReference({'name': variant_name, 'path': variant_path}) variant_group_ref.AppendChild(variant_ref) # The caller is interested in the variant group, not the specific # variant file. file_ref = variant_group_ref return file_ref else: # Hierarchical recursion. Add or get a PBXGroup corresponding to the # outermost path component, and then recurse into it, chopping off that # path component. next_dir = path_split[0] group_ref = self.GetChildByPath(next_dir) if group_ref != None: assert group_ref.__class__ == PBXGroup else: group_ref = PBXGroup({'path': next_dir}) self.AppendChild(group_ref) return group_ref.AddOrGetFileByPath(posixpath.sep.join(path_split[1:]), hierarchical) def AddOrGetVariantGroupByNameAndPath(self, name, path): """Returns an existing or new PBXVariantGroup for name and path. If a PBXVariantGroup identified by the name and path arguments is already present as a child of this object, it is returned. Otherwise, a new PBXVariantGroup with the correct properties is created, added as a child, and returned. This method will generally be called by AddOrGetFileByPath, which knows when to create a variant group based on the structure of the pathnames passed to it. """ key = (name, path) if key in self._variant_children_by_name_and_path: variant_group_ref = self._variant_children_by_name_and_path[key] assert variant_group_ref.__class__ == PBXVariantGroup return variant_group_ref variant_group_properties = {'name': name} if path != None: variant_group_properties['path'] = path variant_group_ref = PBXVariantGroup(variant_group_properties) self.AppendChild(variant_group_ref) return variant_group_ref def TakeOverOnlyChild(self, recurse=False): """If this PBXGroup has only one child and it's also a PBXGroup, take it over by making all of its children this object's children. This function will continue to take over only children when those children are groups. If there are three PBXGroups representing a, b, and c, with c inside b and b inside a, and a and b have no other children, this will result in a taking over both b and c, forming a PBXGroup for a/b/c. If recurse is True, this function will recurse into children and ask them to collapse themselves by taking over only children as well. Assuming an example hierarchy with files at a/b/c/d1, a/b/c/d2, and a/b/c/d3/e/f (d1, d2, and f are files, the rest are groups), recursion will result in a group for a/b/c containing a group for d3/e. """ # At this stage, check that child class types are PBXGroup exactly, # instead of using isinstance. The only subclass of PBXGroup, # PBXVariantGroup, should not participate in reparenting in the same way: # reparenting by merging different object types would be wrong. 
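The AddOrGetFileByPath behavior described above can be summarized in a short sketch (assumes the module has been imported; the paths are illustrative):

group = PBXGroup({'name': 'Source'})
ref = group.AddOrGetFileByPath('a/b/c.cc', hierarchical=True)
# Nested PBXGroups "a" and "b" were created; ref is the PBXFileReference.
assert group.AddOrGetFileByPath('a/b/c.cc', hierarchical=True) is ref
nib = group.AddOrGetFileByPath('en.lproj/MainMenu.nib', hierarchical=True)
assert isinstance(nib, PBXVariantGroup)  # the variant group, not the "en" variant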
while len(self._properties['children']) == 1 and \ self._properties['children'][0].__class__ == PBXGroup: # Loop to take over the innermost only-child group possible. child = self._properties['children'][0] # Assume the child's properties, including its children. Save a copy # of this object's old properties, because they'll still be needed. # This object retains its existing id and parent attributes. old_properties = self._properties self._properties = child._properties self._children_by_path = child._children_by_path if not 'sourceTree' in self._properties or \ self._properties['sourceTree'] == '': # The child was relative to its parent. Fix up the path. Note that # children with a sourceTree other than "" are not relative to # their parents, so no path fix-up is needed in that case. if 'path' in old_properties: if 'path' in self._properties: # Both the original parent and child have paths set. self._properties['path'] = posixpath.join(old_properties['path'], self._properties['path']) else: # Only the original parent has a path, use it. self._properties['path'] = old_properties['path'] if 'sourceTree' in old_properties: # The original parent had a sourceTree set, use it. self._properties['sourceTree'] = old_properties['sourceTree'] # If the original parent had a name set, keep using it. If the original # parent didn't have a name but the child did, let the child's name # live on. If the name attribute seems unnecessary now, get rid of it. if 'name' in old_properties and old_properties['name'] != None and \ old_properties['name'] != self.Name(): self._properties['name'] = old_properties['name'] if 'name' in self._properties and 'path' in self._properties and \ self._properties['name'] == self._properties['path']: del self._properties['name'] # Notify all children of their new parent. for child in self._properties['children']: child.parent = self # If asked to recurse, recurse. if recurse: for child in self._properties['children']: if child.__class__ == PBXGroup: child.TakeOverOnlyChild(recurse) def SortGroup(self): self._properties['children'] = \ sorted(self._properties['children'], cmp=lambda x,y: x.Compare(y)) # Recurse. for child in self._properties['children']: if isinstance(child, PBXGroup): child.SortGroup() class XCFileLikeElement(XCHierarchicalElement): # Abstract base for objects that can be used as the fileRef property of # PBXBuildFile. def PathHashables(self): # A PBXBuildFile that refers to this object will call this method to # obtain additional hashables specific to this XCFileLikeElement. Don't # just use this object's hashables, they're not specific and unique enough # on their own (without access to the parent hashables.) Instead, provide # hashables that identify this object by path by getting its hashables as # well as the hashables of ancestor XCHierarchicalElement objects. hashables = [] xche = self while xche != None and isinstance(xche, XCHierarchicalElement): xche_hashables = xche.Hashables() for index in xrange(0, len(xche_hashables)): hashables.insert(index, xche_hashables[index]) xche = xche.parent return hashables class XCContainerPortal(XCObject): # Abstract base for objects that can be used as the containerPortal property # of PBXContainerItemProxy. pass class XCRemoteObject(XCObject): # Abstract base for objects that can be used as the remoteGlobalIDString # property of PBXContainerItemProxy. 
pass class PBXFileReference(XCFileLikeElement, XCContainerPortal, XCRemoteObject): _schema = XCFileLikeElement._schema.copy() _schema.update({ 'explicitFileType': [0, str, 0, 0], 'lastKnownFileType': [0, str, 0, 0], 'name': [0, str, 0, 0], 'path': [0, str, 0, 1], }) # Weird output rules for PBXFileReference. _should_print_single_line = True # super _encode_transforms = XCFileLikeElement._alternate_encode_transforms def __init__(self, properties=None, id=None, parent=None): # super XCFileLikeElement.__init__(self, properties, id, parent) if 'path' in self._properties and self._properties['path'].endswith('/'): self._properties['path'] = self._properties['path'][:-1] is_dir = True else: is_dir = False if 'path' in self._properties and \ not 'lastKnownFileType' in self._properties and \ not 'explicitFileType' in self._properties: # TODO(mark): This is the replacement for a replacement for a quick hack. # It is no longer incredibly sucky, but this list needs to be extended. extension_map = { 'a': 'archive.ar', 'app': 'wrapper.application', 'bdic': 'file', 'bundle': 'wrapper.cfbundle', 'c': 'sourcecode.c.c', 'cc': 'sourcecode.cpp.cpp', 'cpp': 'sourcecode.cpp.cpp', 'css': 'text.css', 'cxx': 'sourcecode.cpp.cpp', 'dart': 'sourcecode', 'dylib': 'compiled.mach-o.dylib', 'framework': 'wrapper.framework', 'gyp': 'sourcecode', 'gypi': 'sourcecode', 'h': 'sourcecode.c.h', 'hxx': 'sourcecode.cpp.h', 'icns': 'image.icns', 'java': 'sourcecode.java', 'js': 'sourcecode.javascript', 'kext': 'wrapper.kext', 'm': 'sourcecode.c.objc', 'mm': 'sourcecode.cpp.objcpp', 'nib': 'wrapper.nib', 'o': 'compiled.mach-o.objfile', 'pdf': 'image.pdf', 'pl': 'text.script.perl', 'plist': 'text.plist.xml', 'pm': 'text.script.perl', 'png': 'image.png', 'py': 'text.script.python', 'r': 'sourcecode.rez', 'rez': 'sourcecode.rez', 's': 'sourcecode.asm', 'storyboard': 'file.storyboard', 'strings': 'text.plist.strings', 'swift': 'sourcecode.swift', 'ttf': 'file', 'xcassets': 'folder.assetcatalog', 'xcconfig': 'text.xcconfig', 'xcdatamodel': 'wrapper.xcdatamodel', 'xcdatamodeld':'wrapper.xcdatamodeld', 'xib': 'file.xib', 'y': 'sourcecode.yacc', } prop_map = { 'dart': 'explicitFileType', 'gyp': 'explicitFileType', 'gypi': 'explicitFileType', } if is_dir: file_type = 'folder' prop_name = 'lastKnownFileType' else: basename = posixpath.basename(self._properties['path']) (root, ext) = posixpath.splitext(basename) # Check the map using a lowercase extension. # TODO(mark): Maybe it should try with the original case first and fall # back to lowercase, in case there are any instances where case # matters. There currently aren't. if ext != '': ext = ext[1:].lower() # TODO(mark): "text" is the default value, but "file" is appropriate # for unrecognized files not containing text. Xcode seems to choose # based on content. file_type = extension_map.get(ext, 'text') prop_name = prop_map.get(ext, 'lastKnownFileType') self._properties[prop_name] = file_type class PBXVariantGroup(PBXGroup, XCFileLikeElement): """PBXVariantGroup is used by Xcode to represent localizations.""" # No additions to the schema relative to PBXGroup. pass # PBXReferenceProxy is also an XCFileLikeElement subclass. It is defined below # because it uses PBXContainerItemProxy, defined below. 
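The extension map above is consulted automatically at construction time, for example (a sketch; assumes the module has been imported):

ref = PBXFileReference({'path': 'lib/foo.mm'})
assert ref.GetProperty('lastKnownFileType') == 'sourcecode.cpp.objcpp'
assert ref.GetProperty('name') == 'foo.mm'  # name derived from path's basename
gyp_ref = PBXFileReference({'path': 'foo.gyp'})
assert gyp_ref.GetProperty('explicitFileType') == 'sourcecode'  # via prop_map
folder = PBXFileReference({'path': 'assets/'})  # trailing slash: folder type
assert folder.GetProperty('lastKnownFileType') == 'folder'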
class XCBuildConfiguration(XCObject): _schema = XCObject._schema.copy() _schema.update({ 'baseConfigurationReference': [0, PBXFileReference, 0, 0], 'buildSettings': [0, dict, 0, 1, {}], 'name': [0, str, 0, 1], }) def HasBuildSetting(self, key): return key in self._properties['buildSettings'] def GetBuildSetting(self, key): return self._properties['buildSettings'][key] def SetBuildSetting(self, key, value): # TODO(mark): If a list, copy? self._properties['buildSettings'][key] = value def AppendBuildSetting(self, key, value): if not key in self._properties['buildSettings']: self._properties['buildSettings'][key] = [] self._properties['buildSettings'][key].append(value) def DelBuildSetting(self, key): if key in self._properties['buildSettings']: del self._properties['buildSettings'][key] def SetBaseConfiguration(self, value): self._properties['baseConfigurationReference'] = value class XCConfigurationList(XCObject): # _configs is the default list of configurations. _configs = [ XCBuildConfiguration({'name': 'Debug'}), XCBuildConfiguration({'name': 'Release'}) ] _schema = XCObject._schema.copy() _schema.update({ 'buildConfigurations': [1, XCBuildConfiguration, 1, 1, _configs], 'defaultConfigurationIsVisible': [0, int, 0, 1, 1], 'defaultConfigurationName': [0, str, 0, 1, 'Release'], }) def Name(self): return 'Build configuration list for ' + \ self.parent.__class__.__name__ + ' "' + self.parent.Name() + '"' def ConfigurationNamed(self, name): """Convenience accessor to obtain an XCBuildConfiguration by name.""" for configuration in self._properties['buildConfigurations']: if configuration._properties['name'] == name: return configuration raise KeyError(name) def DefaultConfiguration(self): """Convenience accessor to obtain the default XCBuildConfiguration.""" return self.ConfigurationNamed(self._properties['defaultConfigurationName']) def HasBuildSetting(self, key): """Determines the state of a build setting in all XCBuildConfiguration child objects. If all child objects have key in their build settings, and the value is the same in all child objects, returns 1. If no child objects have the key in their build settings, returns 0. If some, but not all, child objects have the key in their build settings, or if any children have different values for the key, returns -1. """ has = None value = None for configuration in self._properties['buildConfigurations']: configuration_has = configuration.HasBuildSetting(key) if has is None: has = configuration_has elif has != configuration_has: return -1 if configuration_has: configuration_value = configuration.GetBuildSetting(key) if value is None: value = configuration_value elif value != configuration_value: return -1 if not has: return 0 return 1 def GetBuildSetting(self, key): """Gets the build setting for key. All child XCConfiguration objects must have the same value set for the setting, or a ValueError will be raised. """ # TODO(mark): This is wrong for build settings that are lists. The list # contents should be compared (and a list copy returned?) value = None for configuration in self._properties['buildConfigurations']: configuration_value = configuration.GetBuildSetting(key) if value is None: value = configuration_value else: if value != configuration_value: raise ValueError('Variant values for ' + key) return value def SetBuildSetting(self, key, value): """Sets the build setting for key to value in all child XCBuildConfiguration objects. 
""" for configuration in self._properties['buildConfigurations']: configuration.SetBuildSetting(key, value) def AppendBuildSetting(self, key, value): """Appends value to the build setting for key, which is treated as a list, in all child XCBuildConfiguration objects. """ for configuration in self._properties['buildConfigurations']: configuration.AppendBuildSetting(key, value) def DelBuildSetting(self, key): """Deletes the build setting key from all child XCBuildConfiguration objects. """ for configuration in self._properties['buildConfigurations']: configuration.DelBuildSetting(key) def SetBaseConfiguration(self, value): """Sets the build configuration in all child XCBuildConfiguration objects. """ for configuration in self._properties['buildConfigurations']: configuration.SetBaseConfiguration(value) class PBXBuildFile(XCObject): _schema = XCObject._schema.copy() _schema.update({ 'fileRef': [0, XCFileLikeElement, 0, 1], 'settings': [0, str, 0, 0], # hack, it's a dict }) # Weird output rules for PBXBuildFile. _should_print_single_line = True _encode_transforms = XCObject._alternate_encode_transforms def Name(self): # Example: "main.cc in Sources" return self._properties['fileRef'].Name() + ' in ' + self.parent.Name() def Hashables(self): # super hashables = XCObject.Hashables(self) # It is not sufficient to just rely on Name() to get the # XCFileLikeElement's name, because that is not a complete pathname. # PathHashables returns hashables unique enough that no two # PBXBuildFiles should wind up with the same set of hashables, unless # someone adds the same file multiple times to the same target. That # would be considered invalid anyway. hashables.extend(self._properties['fileRef'].PathHashables()) return hashables class XCBuildPhase(XCObject): """Abstract base for build phase classes. Not represented in a project file. Attributes: _files_by_path: A dict mapping each path of a child in the files list by path (keys) to the corresponding PBXBuildFile children (values). _files_by_xcfilelikeelement: A dict mapping each XCFileLikeElement (keys) to the corresponding PBXBuildFile children (values). """ # TODO(mark): Some build phase types, like PBXShellScriptBuildPhase, don't # actually have a "files" list. XCBuildPhase should not have "files" but # another abstract subclass of it should provide this, and concrete build # phase types that do have "files" lists should be derived from that new # abstract subclass. XCBuildPhase should only provide buildActionMask and # runOnlyForDeploymentPostprocessing, and not files or the various # file-related methods and attributes. _schema = XCObject._schema.copy() _schema.update({ 'buildActionMask': [0, int, 0, 1, 0x7fffffff], 'files': [1, PBXBuildFile, 1, 1, []], 'runOnlyForDeploymentPostprocessing': [0, int, 0, 1, 0], }) def __init__(self, properties=None, id=None, parent=None): # super XCObject.__init__(self, properties, id, parent) self._files_by_path = {} self._files_by_xcfilelikeelement = {} for pbxbuildfile in self._properties.get('files', []): self._AddBuildFileToDicts(pbxbuildfile) def FileGroup(self, path): # Subclasses must override this by returning a two-element tuple. The # first item in the tuple should be the PBXGroup to which "path" should be # added, either as a child or deeper descendant. The second item should # be a boolean indicating whether files should be added into hierarchical # groups or one single flat group. 
raise NotImplementedError( self.__class__.__name__ + ' must implement FileGroup') def _AddPathToDict(self, pbxbuildfile, path): """Adds path to the dict tracking paths belonging to this build phase. If the path is already a member of this build phase, raises an exception. """ if path in self._files_by_path: raise ValueError('Found multiple build files with path ' + path) self._files_by_path[path] = pbxbuildfile def _AddBuildFileToDicts(self, pbxbuildfile, path=None): """Maintains the _files_by_path and _files_by_xcfilelikeelement dicts. If path is specified, then it is the path that is being added to the phase, and pbxbuildfile must contain either a PBXFileReference directly referencing that path, or it must contain a PBXVariantGroup that itself contains a PBXFileReference referencing the path. If path is not specified, either the PBXFileReference's path or the paths of all children of the PBXVariantGroup are taken as being added to the phase. If the path is already present in the phase, raises an exception. If the PBXFileReference or PBXVariantGroup referenced by pbxbuildfile are already present in the phase, referenced by a different PBXBuildFile object, raises an exception. This does not raise an exception when a PBXFileReference or PBXVariantGroup reappear and are referenced by the same PBXBuildFile that has already introduced them, because in the case of PBXVariantGroup objects, they may correspond to multiple paths that are not all added simultaneously. When this situation occurs, the path needs to be added to _files_by_path, but nothing needs to change in _files_by_xcfilelikeelement, and the caller should have avoided adding the PBXBuildFile if it is already present in the list of children. """ xcfilelikeelement = pbxbuildfile._properties['fileRef'] paths = [] if path != None: # It's best when the caller provides the path. if isinstance(xcfilelikeelement, PBXVariantGroup): paths.append(path) else: # If the caller didn't provide a path, there can be either multiple # paths (PBXVariantGroup) or one. if isinstance(xcfilelikeelement, PBXVariantGroup): for variant in xcfilelikeelement._properties['children']: paths.append(variant.FullPath()) else: paths.append(xcfilelikeelement.FullPath()) # Add the paths first, because if something's going to raise, the # messages provided by _AddPathToDict are more useful owing to its # having access to a real pathname and not just an object's Name(). for a_path in paths: self._AddPathToDict(pbxbuildfile, a_path) # If another PBXBuildFile references this XCFileLikeElement, there's a # problem. if xcfilelikeelement in self._files_by_xcfilelikeelement and \ self._files_by_xcfilelikeelement[xcfilelikeelement] != pbxbuildfile: raise ValueError('Found multiple build files for ' + \ xcfilelikeelement.Name()) self._files_by_xcfilelikeelement[xcfilelikeelement] = pbxbuildfile def AppendBuildFile(self, pbxbuildfile, path=None): # Callers should use this instead of calling # AppendProperty('files', pbxbuildfile) directly because this function # maintains the object's dicts. Better yet, callers can just call AddFile # with a pathname and not worry about building their own PBXBuildFile # objects. 
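# A usage sketch (hypothetical path):
#   phase.AddFile('src/main.cc')   # builds and registers the PBXBuildFile
# versus the manual route when a PBXBuildFile already exists:
#   phase.AppendBuildFile(pbxbuildfile, 'src/main.cc')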
self.AppendProperty('files', pbxbuildfile) self._AddBuildFileToDicts(pbxbuildfile, path) def AddFile(self, path, settings=None): (file_group, hierarchical) = self.FileGroup(path) file_ref = file_group.AddOrGetFileByPath(path, hierarchical) if file_ref in self._files_by_xcfilelikeelement and \ isinstance(file_ref, PBXVariantGroup): # There's already a PBXBuildFile in this phase corresponding to the # PBXVariantGroup. path just provides a new variant that belongs to # the group. Add the path to the dict. pbxbuildfile = self._files_by_xcfilelikeelement[file_ref] self._AddBuildFileToDicts(pbxbuildfile, path) else: # Add a new PBXBuildFile to get file_ref into the phase. if settings is None: pbxbuildfile = PBXBuildFile({'fileRef': file_ref}) else: pbxbuildfile = PBXBuildFile({'fileRef': file_ref, 'settings': settings}) self.AppendBuildFile(pbxbuildfile, path) class PBXHeadersBuildPhase(XCBuildPhase): # No additions to the schema relative to XCBuildPhase. def Name(self): return 'Headers' def FileGroup(self, path): return self.PBXProjectAncestor().RootGroupForPath(path) class PBXResourcesBuildPhase(XCBuildPhase): # No additions to the schema relative to XCBuildPhase. def Name(self): return 'Resources' def FileGroup(self, path): return self.PBXProjectAncestor().RootGroupForPath(path) class PBXSourcesBuildPhase(XCBuildPhase): # No additions to the schema relative to XCBuildPhase. def Name(self): return 'Sources' def FileGroup(self, path): return self.PBXProjectAncestor().RootGroupForPath(path) class PBXFrameworksBuildPhase(XCBuildPhase): # No additions to the schema relative to XCBuildPhase. def Name(self): return 'Frameworks' def FileGroup(self, path): (root, ext) = posixpath.splitext(path) if ext != '': ext = ext[1:].lower() if ext == 'o': # .o files are added to Xcode Frameworks phases, but conceptually aren't # frameworks, they're more like sources or intermediates. Redirect them # to show up in one of those other groups. return self.PBXProjectAncestor().RootGroupForPath(path) else: return (self.PBXProjectAncestor().FrameworksGroup(), False) class PBXShellScriptBuildPhase(XCBuildPhase): _schema = XCBuildPhase._schema.copy() _schema.update({ 'inputPaths': [1, str, 0, 1, []], 'name': [0, str, 0, 0], 'outputPaths': [1, str, 0, 1, []], 'shellPath': [0, str, 0, 1, '/bin/sh'], 'shellScript': [0, str, 0, 1], 'showEnvVarsInLog': [0, int, 0, 0], }) def Name(self): if 'name' in self._properties: return self._properties['name'] return 'ShellScript' class PBXCopyFilesBuildPhase(XCBuildPhase): _schema = XCBuildPhase._schema.copy() _schema.update({ 'dstPath': [0, str, 0, 1], 'dstSubfolderSpec': [0, int, 0, 1], 'name': [0, str, 0, 0], }) # path_tree_re matches "$(DIR)/path" or just "$(DIR)". Match group 1 is # "DIR", match group 3 is "path" or None. path_tree_re = re.compile('^\\$\\((.*)\\)(/(.*)|)$') # path_tree_to_subfolder maps names of Xcode variables to the associated # dstSubfolderSpec property value used in a PBXCopyFilesBuildPhase object. path_tree_to_subfolder = { 'BUILT_FRAMEWORKS_DIR': 10, # Frameworks Directory 'BUILT_PRODUCTS_DIR': 16, # Products Directory # Other types that can be chosen via the Xcode UI. # TODO(mark): Map Xcode variable names to these. 
# : 1, # Wrapper # : 6, # Executables: 6 # : 7, # Resources # : 15, # Java Resources # : 11, # Shared Frameworks # : 12, # Shared Support # : 13, # PlugIns } def Name(self): if 'name' in self._properties: return self._properties['name'] return 'CopyFiles' def FileGroup(self, path): return self.PBXProjectAncestor().RootGroupForPath(path) def SetDestination(self, path): """Set the dstSubfolderSpec and dstPath properties from path. path may be specified in the same notation used for XCHierarchicalElements, specifically, "$(DIR)/path". """ path_tree_match = self.path_tree_re.search(path) if path_tree_match: # Everything else needs to be relative to an Xcode variable. path_tree = path_tree_match.group(1) relative_path = path_tree_match.group(3) if path_tree in self.path_tree_to_subfolder: subfolder = self.path_tree_to_subfolder[path_tree] if relative_path is None: relative_path = '' else: # The path starts with an unrecognized Xcode variable # name like $(SRCROOT). Xcode will still handle this # as an "absolute path" that starts with the variable. subfolder = 0 relative_path = path elif path.startswith('/'): # Special case. Absolute paths are in dstSubfolderSpec 0. subfolder = 0 relative_path = path[1:] else: raise ValueError('Can\'t use path %s in a %s' % \ (path, self.__class__.__name__)) self._properties['dstPath'] = relative_path self._properties['dstSubfolderSpec'] = subfolder class PBXBuildRule(XCObject): _schema = XCObject._schema.copy() _schema.update({ 'compilerSpec': [0, str, 0, 1], 'filePatterns': [0, str, 0, 0], 'fileType': [0, str, 0, 1], 'isEditable': [0, int, 0, 1, 1], 'outputFiles': [1, str, 0, 1, []], 'script': [0, str, 0, 0], }) def Name(self): # Not very inspired, but it's what Xcode uses. return self.__class__.__name__ def Hashables(self): # super hashables = XCObject.Hashables(self) # Use the hashables of the weak objects that this object refers to. hashables.append(self._properties['fileType']) if 'filePatterns' in self._properties: hashables.append(self._properties['filePatterns']) return hashables class PBXContainerItemProxy(XCObject): # When referencing an item in this project file, containerPortal is the # PBXProject root object of this project file. When referencing an item in # another project file, containerPortal is a PBXFileReference identifying # the other project file. # # When serving as a proxy to an XCTarget (in this project file or another), # proxyType is 1. When serving as a proxy to a PBXFileReference (in another # project file), proxyType is 2. Type 2 is used for references to the # products of the other project file's targets. # # Xcode is weird about remoteGlobalIDString. Usually, it's printed without # a comment, indicating that it's tracked internally simply as a string, but # sometimes it's printed with a comment (usually when the object is initially # created), indicating that it's tracked as a project file object at least # sometimes. This module always tracks it as an object, but contains a hack # to prevent it from printing the comment in the project file output. See # _XCKVPrint.
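# For instance, XCTarget.AddDependency below builds a type-1 proxy for a
# target in the same project file roughly like this (sketch):
#   PBXContainerItemProxy({'containerPortal': pbxproject,
#                          'proxyType': 1,
#                          'remoteGlobalIDString': other_target,
#                          'remoteInfo': other_target.Name()})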
_schema = XCObject._schema.copy() _schema.update({ 'containerPortal': [0, XCContainerPortal, 0, 1], 'proxyType': [0, int, 0, 1], 'remoteGlobalIDString': [0, XCRemoteObject, 0, 1], 'remoteInfo': [0, str, 0, 1], }) def __repr__(self): props = self._properties name = '%s.gyp:%s' % (props['containerPortal'].Name(), props['remoteInfo']) return '<%s %r at 0x%x>' % (self.__class__.__name__, name, id(self)) def Name(self): # Admittedly not the best name, but it's what Xcode uses. return self.__class__.__name__ def Hashables(self): # super hashables = XCObject.Hashables(self) # Use the hashables of the weak objects that this object refers to. hashables.extend(self._properties['containerPortal'].Hashables()) hashables.extend(self._properties['remoteGlobalIDString'].Hashables()) return hashables class PBXTargetDependency(XCObject): # The "target" property accepts an XCTarget object, and obviously not # NoneType. But XCTarget is defined below, so it can't be put into the # schema yet. The definition of PBXTargetDependency can't be moved below # XCTarget because XCTarget's own schema references PBXTargetDependency. # Python doesn't deal well with this circular relationship, and doesn't have # a real way to do forward declarations. To work around, the type of # the "target" property is reset below, after XCTarget is defined. # # At least one of "name" and "target" is required. _schema = XCObject._schema.copy() _schema.update({ 'name': [0, str, 0, 0], 'target': [0, None.__class__, 0, 0], 'targetProxy': [0, PBXContainerItemProxy, 1, 1], }) def __repr__(self): name = self._properties.get('name') or self._properties['target'].Name() return '<%s %r at 0x%x>' % (self.__class__.__name__, name, id(self)) def Name(self): # Admittedly not the best name, but it's what Xcode uses. return self.__class__.__name__ def Hashables(self): # super hashables = XCObject.Hashables(self) # Use the hashables of the weak objects that this object refers to. hashables.extend(self._properties['targetProxy'].Hashables()) return hashables class PBXReferenceProxy(XCFileLikeElement): _schema = XCFileLikeElement._schema.copy() _schema.update({ 'fileType': [0, str, 0, 1], 'path': [0, str, 0, 1], 'remoteRef': [0, PBXContainerItemProxy, 1, 1], }) class XCTarget(XCRemoteObject): # An XCTarget is really just an XCObject, the XCRemoteObject thing is just # to allow PBXProject to be used in the remoteGlobalIDString property of # PBXContainerItemProxy. # # Setting a "name" property at instantiation may also affect "productName", # which may in turn affect the "PRODUCT_NAME" build setting in children of # "buildConfigurationList". See __init__ below. _schema = XCRemoteObject._schema.copy() _schema.update({ 'buildConfigurationList': [0, XCConfigurationList, 1, 1, XCConfigurationList()], 'buildPhases': [1, XCBuildPhase, 1, 1, []], 'dependencies': [1, PBXTargetDependency, 1, 1, []], 'name': [0, str, 0, 1], 'productName': [0, str, 0, 1], }) def __init__(self, properties=None, id=None, parent=None, force_outdir=None, force_prefix=None, force_extension=None): # super XCRemoteObject.__init__(self, properties, id, parent) # Set up additional defaults not expressed in the schema. If a "name" # property was supplied, set "productName" if it is not present. Also set # the "PRODUCT_NAME" build setting in each configuration, but only if # the setting is not present in any build configuration. 
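# As a sketch, XCTarget({'name': 'hello'}) ends up with productName ==
# 'hello' and, because the default Debug and Release configurations define
# no PRODUCT_NAME of their own, PRODUCT_NAME set to 'hello' in both.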
if 'name' in self._properties: if not 'productName' in self._properties: self.SetProperty('productName', self._properties['name']) if 'productName' in self._properties: if 'buildConfigurationList' in self._properties: configs = self._properties['buildConfigurationList'] if configs.HasBuildSetting('PRODUCT_NAME') == 0: configs.SetBuildSetting('PRODUCT_NAME', self._properties['productName']) def AddDependency(self, other): pbxproject = self.PBXProjectAncestor() other_pbxproject = other.PBXProjectAncestor() if pbxproject == other_pbxproject: # Add a dependency to another target in the same project file. container = PBXContainerItemProxy({'containerPortal': pbxproject, 'proxyType': 1, 'remoteGlobalIDString': other, 'remoteInfo': other.Name()}) dependency = PBXTargetDependency({'target': other, 'targetProxy': container}) self.AppendProperty('dependencies', dependency) else: # Add a dependency to a target in a different project file. other_project_ref = \ pbxproject.AddOrGetProjectReference(other_pbxproject)[1] container = PBXContainerItemProxy({ 'containerPortal': other_project_ref, 'proxyType': 1, 'remoteGlobalIDString': other, 'remoteInfo': other.Name(), }) dependency = PBXTargetDependency({'name': other.Name(), 'targetProxy': container}) self.AppendProperty('dependencies', dependency) # Proxy all of these through to the build configuration list. def ConfigurationNamed(self, name): return self._properties['buildConfigurationList'].ConfigurationNamed(name) def DefaultConfiguration(self): return self._properties['buildConfigurationList'].DefaultConfiguration() def HasBuildSetting(self, key): return self._properties['buildConfigurationList'].HasBuildSetting(key) def GetBuildSetting(self, key): return self._properties['buildConfigurationList'].GetBuildSetting(key) def SetBuildSetting(self, key, value): return self._properties['buildConfigurationList'].SetBuildSetting(key, \ value) def AppendBuildSetting(self, key, value): return self._properties['buildConfigurationList'].AppendBuildSetting(key, \ value) def DelBuildSetting(self, key): return self._properties['buildConfigurationList'].DelBuildSetting(key) # Redefine the type of the "target" property. See PBXTargetDependency._schema # above. PBXTargetDependency._schema['target'][1] = XCTarget class PBXNativeTarget(XCTarget): # buildPhases is overridden in the schema to be able to set defaults. # # NOTE: Contrary to most objects, it is advisable to set parent when # constructing PBXNativeTarget. A parent of an XCTarget must be a PBXProject # object. A parent reference is required for a PBXNativeTarget during # construction to be able to set up the target defaults for productReference, # because a PBXBuildFile object must be created for the target and it must # be added to the PBXProject's mainGroup hierarchy. _schema = XCTarget._schema.copy() _schema.update({ 'buildPhases': [1, XCBuildPhase, 1, 1, [PBXSourcesBuildPhase(), PBXFrameworksBuildPhase()]], 'buildRules': [1, PBXBuildRule, 1, 1, []], 'productReference': [0, PBXFileReference, 0, 1], 'productType': [0, str, 0, 1], }) # Mapping from Xcode product-types to settings. 
The settings are: # filetype : used for explicitFileType in the project file # prefix : the prefix for the file name # suffix : the suffix for the file name _product_filetypes = { 'com.apple.product-type.application': ['wrapper.application', '', '.app'], 'com.apple.product-type.application.watchapp': ['wrapper.application', '', '.app'], 'com.apple.product-type.watchkit-extension': ['wrapper.app-extension', '', '.appex'], 'com.apple.product-type.app-extension': ['wrapper.app-extension', '', '.appex'], 'com.apple.product-type.bundle': ['wrapper.cfbundle', '', '.bundle'], 'com.apple.product-type.framework': ['wrapper.framework', '', '.framework'], 'com.apple.product-type.library.dynamic': ['compiled.mach-o.dylib', 'lib', '.dylib'], 'com.apple.product-type.library.static': ['archive.ar', 'lib', '.a'], 'com.apple.product-type.tool': ['compiled.mach-o.executable', '', ''], 'com.apple.product-type.bundle.unit-test': ['wrapper.cfbundle', '', '.xctest'], 'com.googlecode.gyp.xcode.bundle': ['compiled.mach-o.dylib', '', '.so'], 'com.apple.product-type.kernel-extension': ['wrapper.kext', '', '.kext'], } def __init__(self, properties=None, id=None, parent=None, force_outdir=None, force_prefix=None, force_extension=None): # super XCTarget.__init__(self, properties, id, parent) if 'productName' in self._properties and \ 'productType' in self._properties and \ not 'productReference' in self._properties and \ self._properties['productType'] in self._product_filetypes: products_group = None pbxproject = self.PBXProjectAncestor() if pbxproject != None: products_group = pbxproject.ProductsGroup() if products_group != None: (filetype, prefix, suffix) = \ self._product_filetypes[self._properties['productType']] # Xcode does not have a distinct type for loadable modules that are # pure BSD targets (not in a bundle wrapper). GYP allows such modules # to be specified by setting a target type to loadable_module without # having mac_bundle set. These are mapped to the pseudo-product type # com.googlecode.gyp.xcode.bundle. # # By picking up this special type and converting it to a dynamic # library (com.apple.product-type.library.dynamic) with fix-ups, # single-file loadable modules can be produced. # # MACH_O_TYPE is changed to mh_bundle to produce the proper file type # (as opposed to mh_dylib). In order for linking to succeed, # DYLIB_CURRENT_VERSION and DYLIB_COMPATIBILITY_VERSION must be # cleared. They are meaningless for type mh_bundle. # # Finally, the .so extension is forcibly applied over the default # (.dylib), unless another forced extension is already selected. # .dylib is plainly wrong, and .bundle is used by loadable_modules in # bundle wrappers (com.apple.product-type.bundle). .so seems an odd # choice because it's used as the extension on many other systems that # don't distinguish between linkable shared libraries and non-linkable # loadable modules, but there's precedent: Python loadable modules on # Mac OS X use an .so extension. 
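# Concretely (sketch): a gyp loadable_module named 'foo' without mac_bundle
# arrives here as com.googlecode.gyp.xcode.bundle and leaves as
# com.apple.product-type.library.dynamic with MACH_O_TYPE mh_bundle, empty
# DYLIB_CURRENT_VERSION/DYLIB_COMPATIBILITY_VERSION, and a forced '.so'
# extension, so the product reference ends up as 'foo.so'.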
if self._properties['productType'] == 'com.googlecode.gyp.xcode.bundle': self._properties['productType'] = \ 'com.apple.product-type.library.dynamic' self.SetBuildSetting('MACH_O_TYPE', 'mh_bundle') self.SetBuildSetting('DYLIB_CURRENT_VERSION', '') self.SetBuildSetting('DYLIB_COMPATIBILITY_VERSION', '') if force_extension is None: force_extension = suffix[1:] if self._properties['productType'] == \ 'com.apple.product-type.bundle.unit-test': if force_extension is None: force_extension = suffix[1:] if force_extension is not None: # If it's a wrapper (bundle), set WRAPPER_EXTENSION. # Extension override. suffix = '.' + force_extension if filetype.startswith('wrapper.'): self.SetBuildSetting('WRAPPER_EXTENSION', force_extension) else: self.SetBuildSetting('EXECUTABLE_EXTENSION', force_extension) if filetype.startswith('compiled.mach-o.executable'): product_name = self._properties['productName'] product_name += suffix suffix = '' self.SetProperty('productName', product_name) self.SetBuildSetting('PRODUCT_NAME', product_name) # Xcode handles most prefixes based on the target type, however there # are exceptions. If a "BSD Dynamic Library" target is added in the # Xcode UI, Xcode sets EXECUTABLE_PREFIX. This check duplicates that # behavior. if force_prefix is not None: prefix = force_prefix if filetype.startswith('wrapper.'): self.SetBuildSetting('WRAPPER_PREFIX', prefix) else: self.SetBuildSetting('EXECUTABLE_PREFIX', prefix) if force_outdir is not None: self.SetBuildSetting('TARGET_BUILD_DIR', force_outdir) # TODO(tvl): Remove the below hack. # http://code.google.com/p/gyp/issues/detail?id=122 # Some targets include the prefix in the target_name. These targets # really should just add a product_name setting that doesn't include # the prefix. For example: # target_name = 'libevent', product_name = 'event' # This check cleans up for them. product_name = self._properties['productName'] prefix_len = len(prefix) if prefix_len and (product_name[:prefix_len] == prefix): product_name = product_name[prefix_len:] self.SetProperty('productName', product_name) self.SetBuildSetting('PRODUCT_NAME', product_name) ref_props = { 'explicitFileType': filetype, 'includeInIndex': 0, 'path': prefix + product_name + suffix, 'sourceTree': 'BUILT_PRODUCTS_DIR', } file_ref = PBXFileReference(ref_props) products_group.AppendChild(file_ref) self.SetProperty('productReference', file_ref) def GetBuildPhaseByType(self, type): if not 'buildPhases' in self._properties: return None the_phase = None for phase in self._properties['buildPhases']: if isinstance(phase, type): # Some phases may be present in multiples in a well-formed project file, # but phases like PBXSourcesBuildPhase may only be present singly, and # this function is intended as an aid to the singular phase accessors # below. Loop over the entire list of phases and assert if more than # one of the desired type is found. assert the_phase is None the_phase = phase return the_phase def HeadersPhase(self): headers_phase = self.GetBuildPhaseByType(PBXHeadersBuildPhase) if headers_phase is None: headers_phase = PBXHeadersBuildPhase() # The headers phase should come before the resources, sources, and # frameworks phases, if any.
insert_at = len(self._properties['buildPhases']) for index in xrange(0, len(self._properties['buildPhases'])): phase = self._properties['buildPhases'][index] if isinstance(phase, PBXResourcesBuildPhase) or \ isinstance(phase, PBXSourcesBuildPhase) or \ isinstance(phase, PBXFrameworksBuildPhase): insert_at = index break self._properties['buildPhases'].insert(insert_at, headers_phase) headers_phase.parent = self return headers_phase def ResourcesPhase(self): resources_phase = self.GetBuildPhaseByType(PBXResourcesBuildPhase) if resources_phase is None: resources_phase = PBXResourcesBuildPhase() # The resources phase should come before the sources and frameworks # phases, if any. insert_at = len(self._properties['buildPhases']) for index in xrange(0, len(self._properties['buildPhases'])): phase = self._properties['buildPhases'][index] if isinstance(phase, PBXSourcesBuildPhase) or \ isinstance(phase, PBXFrameworksBuildPhase): insert_at = index break self._properties['buildPhases'].insert(insert_at, resources_phase) resources_phase.parent = self return resources_phase def SourcesPhase(self): sources_phase = self.GetBuildPhaseByType(PBXSourcesBuildPhase) if sources_phase is None: sources_phase = PBXSourcesBuildPhase() self.AppendProperty('buildPhases', sources_phase) return sources_phase def FrameworksPhase(self): frameworks_phase = self.GetBuildPhaseByType(PBXFrameworksBuildPhase) if frameworks_phase is None: frameworks_phase = PBXFrameworksBuildPhase() self.AppendProperty('buildPhases', frameworks_phase) return frameworks_phase def AddDependency(self, other): # super XCTarget.AddDependency(self, other) static_library_type = 'com.apple.product-type.library.static' shared_library_type = 'com.apple.product-type.library.dynamic' framework_type = 'com.apple.product-type.framework' if isinstance(other, PBXNativeTarget) and \ 'productType' in self._properties and \ self._properties['productType'] != static_library_type and \ 'productType' in other._properties and \ (other._properties['productType'] == static_library_type or \ ((other._properties['productType'] == shared_library_type or \ other._properties['productType'] == framework_type) and \ ((not other.HasBuildSetting('MACH_O_TYPE')) or other.GetBuildSetting('MACH_O_TYPE') != 'mh_bundle'))): file_ref = other.GetProperty('productReference') pbxproject = self.PBXProjectAncestor() other_pbxproject = other.PBXProjectAncestor() if pbxproject != other_pbxproject: other_project_product_group = \ pbxproject.AddOrGetProjectReference(other_pbxproject)[0] file_ref = other_project_product_group.GetChildByRemoteObject(file_ref) self.FrameworksPhase().AppendProperty('files', PBXBuildFile({'fileRef': file_ref})) class PBXAggregateTarget(XCTarget): pass class PBXProject(XCContainerPortal): # A PBXProject is really just an XCObject, the XCContainerPortal thing is # just to allow PBXProject to be used in the containerPortal property of # PBXContainerItemProxy. """ Attributes: path: "sample.xcodeproj". TODO(mark) Document me! _other_pbxprojects: A dictionary, keyed by other PBXProject objects. Each value is a reference to the dict in the projectReferences list associated with the keyed PBXProject. 
""" _schema = XCContainerPortal._schema.copy() _schema.update({ 'attributes': [0, dict, 0, 0], 'buildConfigurationList': [0, XCConfigurationList, 1, 1, XCConfigurationList()], 'compatibilityVersion': [0, str, 0, 1, 'Xcode 3.2'], 'hasScannedForEncodings': [0, int, 0, 1, 1], 'mainGroup': [0, PBXGroup, 1, 1, PBXGroup()], 'projectDirPath': [0, str, 0, 1, ''], 'projectReferences': [1, dict, 0, 0], 'projectRoot': [0, str, 0, 1, ''], 'targets': [1, XCTarget, 1, 1, []], }) def __init__(self, properties=None, id=None, parent=None, path=None): self.path = path self._other_pbxprojects = {} # super return XCContainerPortal.__init__(self, properties, id, parent) def Name(self): name = self.path if name[-10:] == '.xcodeproj': name = name[:-10] return posixpath.basename(name) def Path(self): return self.path def Comment(self): return 'Project object' def Children(self): # super children = XCContainerPortal.Children(self) # Add children that the schema doesn't know about. Maybe there's a more # elegant way around this, but this is the only case where we need to own # objects in a dictionary (that is itself in a list), and three lines for # a one-off isn't that big a deal. if 'projectReferences' in self._properties: for reference in self._properties['projectReferences']: children.append(reference['ProductGroup']) return children def PBXProjectAncestor(self): return self def _GroupByName(self, name): if not 'mainGroup' in self._properties: self.SetProperty('mainGroup', PBXGroup()) main_group = self._properties['mainGroup'] group = main_group.GetChildByName(name) if group is None: group = PBXGroup({'name': name}) main_group.AppendChild(group) return group # SourceGroup and ProductsGroup are created by default in Xcode's own # templates. def SourceGroup(self): return self._GroupByName('Source') def ProductsGroup(self): return self._GroupByName('Products') # IntermediatesGroup is used to collect source-like files that are generated # by rules or script phases and are placed in intermediate directories such # as DerivedSources. def IntermediatesGroup(self): return self._GroupByName('Intermediates') # FrameworksGroup and ProjectsGroup are top-level groups used to collect # frameworks and projects. def FrameworksGroup(self): return self._GroupByName('Frameworks') def ProjectsGroup(self): return self._GroupByName('Projects') def RootGroupForPath(self, path): """Returns a PBXGroup child of this object to which path should be added. This method is intended to choose between SourceGroup and IntermediatesGroup on the basis of whether path is present in a source directory or an intermediates directory. For the purposes of this determination, any path located within a derived file directory such as PROJECT_DERIVED_FILE_DIR is treated as being in an intermediates directory. The returned value is a two-element tuple. The first element is the PBXGroup, and the second element specifies whether that group should be organized hierarchically (True) or as a single flat list (False). """ # TODO(mark): make this a class variable and bind to self on call? # Also, this list is nowhere near exhaustive. # INTERMEDIATE_DIR and SHARED_INTERMEDIATE_DIR are used by # gyp.generator.xcode. There should probably be some way for that module # to push the names in, rather than having to hard-code them here. 
source_tree_groups = { 'DERIVED_FILE_DIR': (self.IntermediatesGroup, True), 'INTERMEDIATE_DIR': (self.IntermediatesGroup, True), 'PROJECT_DERIVED_FILE_DIR': (self.IntermediatesGroup, True), 'SHARED_INTERMEDIATE_DIR': (self.IntermediatesGroup, True), } (source_tree, path) = SourceTreeAndPathFromPath(path) if source_tree != None and source_tree in source_tree_groups: (group_func, hierarchical) = source_tree_groups[source_tree] group = group_func() return (group, hierarchical) # TODO(mark): make additional choices based on file extension. return (self.SourceGroup(), True) def AddOrGetFileInRootGroup(self, path): """Returns a PBXFileReference corresponding to path in the correct group according to RootGroupForPath's heuristics. If an existing PBXFileReference for path exists, it will be returned. Otherwise, one will be created and returned. """ (group, hierarchical) = self.RootGroupForPath(path) return group.AddOrGetFileByPath(path, hierarchical) def RootGroupsTakeOverOnlyChildren(self, recurse=False): """Calls TakeOverOnlyChild for all groups in the main group.""" for group in self._properties['mainGroup']._properties['children']: if isinstance(group, PBXGroup): group.TakeOverOnlyChild(recurse) def SortGroups(self): # Sort the children of the mainGroup (like "Source" and "Products") # according to their defined order. self._properties['mainGroup']._properties['children'] = \ sorted(self._properties['mainGroup']._properties['children'], cmp=lambda x,y: x.CompareRootGroup(y)) # Sort everything else by putting group before files, and going # alphabetically by name within sections of groups and files. SortGroup # is recursive. for group in self._properties['mainGroup']._properties['children']: if not isinstance(group, PBXGroup): continue if group.Name() == 'Products': # The Products group is a special case. Instead of sorting # alphabetically, sort things in the order of the targets that # produce the products. To do this, just build up a new list of # products based on the targets. products = [] for target in self._properties['targets']: if not isinstance(target, PBXNativeTarget): continue product = target._properties['productReference'] # Make sure that the product is already in the products group. assert product in group._properties['children'] products.append(product) # Make sure that this process doesn't miss anything that was already # in the products group. assert len(products) == len(group._properties['children']) group._properties['children'] = products else: group.SortGroup() def AddOrGetProjectReference(self, other_pbxproject): """Add a reference to another project file (via PBXProject object) to this one. Returns [ProductGroup, ProjectRef]. ProductGroup is a PBXGroup object in this project file that contains a PBXReferenceProxy object for each product of each PBXNativeTarget in the other project file. ProjectRef is a PBXFileReference to the other project file. If this project file already references the other project file, the existing ProductGroup and ProjectRef are returned. The ProductGroup will still be updated if necessary. """ if not 'projectReferences' in self._properties: self._properties['projectReferences'] = [] product_group = None project_ref = None if not other_pbxproject in self._other_pbxprojects: # This project file isn't yet linked to the other one. Establish the # link. product_group = PBXGroup({'name': 'Products'}) # ProductGroup is strong. 
product_group.parent = self # There's nothing unique about this PBXGroup, and if left alone, it will # wind up with the same set of hashables as all other PBXGroup objects # owned by the projectReferences list. Add the hashables of the # remote PBXProject that it's related to. product_group._hashables.extend(other_pbxproject.Hashables()) # The other project reports its path as relative to the same directory # that this project's path is relative to. The other project's path # is not necessarily already relative to this project. Figure out the # pathname that this project needs to use to refer to the other one. this_path = posixpath.dirname(self.Path()) projectDirPath = self.GetProperty('projectDirPath') if projectDirPath: if posixpath.isabs(projectDirPath[0]): this_path = projectDirPath else: this_path = posixpath.join(this_path, projectDirPath) other_path = gyp.common.RelativePath(other_pbxproject.Path(), this_path) # ProjectRef is weak (it's owned by the mainGroup hierarchy). project_ref = PBXFileReference({ 'lastKnownFileType': 'wrapper.pb-project', 'path': other_path, 'sourceTree': 'SOURCE_ROOT', }) self.ProjectsGroup().AppendChild(project_ref) ref_dict = {'ProductGroup': product_group, 'ProjectRef': project_ref} self._other_pbxprojects[other_pbxproject] = ref_dict self.AppendProperty('projectReferences', ref_dict) # Xcode seems to sort this list case-insensitively. self._properties['projectReferences'] = \ sorted(self._properties['projectReferences'], cmp=lambda x,y: cmp(x['ProjectRef'].Name().lower(), y['ProjectRef'].Name().lower())) else: # The link already exists. Pull out the relevant data. project_ref_dict = self._other_pbxprojects[other_pbxproject] product_group = project_ref_dict['ProductGroup'] project_ref = project_ref_dict['ProjectRef'] self._SetUpProductReferences(other_pbxproject, product_group, project_ref) inherit_unique_symroot = self._AllSymrootsUnique(other_pbxproject, False) targets = other_pbxproject.GetProperty('targets') if all(self._AllSymrootsUnique(t, inherit_unique_symroot) for t in targets): dir_path = project_ref._properties['path'] product_group._hashables.extend(dir_path) return [product_group, project_ref] def _AllSymrootsUnique(self, target, inherit_unique_symroot): # Returns True if all configurations have a unique 'SYMROOT' attribute. # The value of inherit_unique_symroot decides whether a configuration is # assumed to inherit a unique 'SYMROOT' attribute from its parent when it # doesn't define an explicit value for 'SYMROOT'. symroots = self._DefinedSymroots(target) for s in symroots: if (s is not None and not self._IsUniqueSymrootForTarget(s) or s is None and not inherit_unique_symroot): return False return True if symroots else inherit_unique_symroot def _DefinedSymroots(self, target): # Returns all values for the 'SYMROOT' attribute defined in all # configurations for this target. If any configuration doesn't define the # 'SYMROOT' attribute, None is added to the returned set. If all # configurations don't define the 'SYMROOT' attribute, an empty set is # returned.
config_list = target.GetProperty('buildConfigurationList') symroots = set() for config in config_list.GetProperty('buildConfigurations'): setting = config.GetProperty('buildSettings') if 'SYMROOT' in setting: symroots.add(setting['SYMROOT']) else: symroots.add(None) if len(symroots) == 1 and None in symroots: return set() return symroots def _IsUniqueSymrootForTarget(self, symroot): # This method returns True if all configurations in target contain a # 'SYMROOT' attribute that is unique for the given target. A value is # unique if the Xcode macro '$SRCROOT' appears in it in any form. uniquifier = ['$SRCROOT', '$(SRCROOT)'] if any(x in symroot for x in uniquifier): return True return False def _SetUpProductReferences(self, other_pbxproject, product_group, project_ref): # TODO(mark): This only adds references to products in other_pbxproject # when they don't exist in this pbxproject. Perhaps it should also # remove references from this pbxproject that are no longer present in # other_pbxproject. Perhaps it should update various properties if they # change. for target in other_pbxproject._properties['targets']: if not isinstance(target, PBXNativeTarget): continue other_fileref = target._properties['productReference'] if product_group.GetChildByRemoteObject(other_fileref) is None: # Xcode sets remoteInfo to the name of the target and not the name # of its product, despite this proxy being a reference to the product. container_item = PBXContainerItemProxy({ 'containerPortal': project_ref, 'proxyType': 2, 'remoteGlobalIDString': other_fileref, 'remoteInfo': target.Name() }) # TODO(mark): Does sourceTree get copied straight over from the other # project? Can the other project ever have lastKnownFileType here # instead of explicitFileType? (Use it if so?) Can path ever be # unset? (I don't think so.) Can other_fileref have name set, and # does it impact the PBXReferenceProxy if so? These are the questions # that perhaps will be answered one day. reference_proxy = PBXReferenceProxy({ 'fileType': other_fileref._properties['explicitFileType'], 'path': other_fileref._properties['path'], 'sourceTree': other_fileref._properties['sourceTree'], 'remoteRef': container_item, }) product_group.AppendChild(reference_proxy) def SortRemoteProductReferences(self): # For each remote project file, sort the associated ProductGroup in the # same order that the targets are sorted in the remote project file. This # is the sort order used by Xcode. def CompareProducts(x, y, remote_products): # x and y are PBXReferenceProxy objects. Go through their associated # PBXContainerItemProxy to get the remote PBXFileReference, which will be # present in the remote_products list. x_remote = x._properties['remoteRef']._properties['remoteGlobalIDString'] y_remote = y._properties['remoteRef']._properties['remoteGlobalIDString'] x_index = remote_products.index(x_remote) y_index = remote_products.index(y_remote) # Use the order of each remote PBXFileReference in remote_products to # determine the sort order. return cmp(x_index, y_index) for other_pbxproject, ref_dict in self._other_pbxprojects.iteritems(): # Build up a list of products in the remote project file, ordered the # same as the targets that produce them. remote_products = [] for target in other_pbxproject._properties['targets']: if not isinstance(target, PBXNativeTarget): continue remote_products.append(target._properties['productReference']) # Sort the PBXReferenceProxy children according to the list of remote # products.
product_group = ref_dict['ProductGroup'] product_group._properties['children'] = sorted( product_group._properties['children'], cmp=lambda x, y, rp=remote_products: CompareProducts(x, y, rp)) class XCProjectFile(XCObject): _schema = XCObject._schema.copy() _schema.update({ 'archiveVersion': [0, int, 0, 1, 1], 'classes': [0, dict, 0, 1, {}], 'objectVersion': [0, int, 0, 1, 46], 'rootObject': [0, PBXProject, 1, 1], }) def ComputeIDs(self, recursive=True, overwrite=True, hash=None): # Although XCProjectFile is implemented here as an XCObject, it's not a # proper object in the Xcode sense, and it certainly doesn't have its own # ID. Pass through an attempt to update IDs to the real root object. if recursive: self._properties['rootObject'].ComputeIDs(recursive, overwrite, hash) def Print(self, file=sys.stdout): self.VerifyHasRequiredProperties() # Add the special "objects" property, which will be caught and handled # separately during printing. This structure allows a fairly standard # loop to do the normal printing. self._properties['objects'] = {} self._XCPrint(file, 0, '// !$*UTF8*$!\n') if self._should_print_single_line: self._XCPrint(file, 0, '{ ') else: self._XCPrint(file, 0, '{\n') for property, value in sorted(self._properties.iteritems(), cmp=lambda x, y: cmp(x, y)): if property == 'objects': self._PrintObjects(file) else: self._XCKVPrint(file, 1, property, value) self._XCPrint(file, 0, '}\n') del self._properties['objects'] def _PrintObjects(self, file): if self._should_print_single_line: self._XCPrint(file, 0, 'objects = {') else: self._XCPrint(file, 1, 'objects = {\n') objects_by_class = {} for object in self.Descendants(): if object == self: continue class_name = object.__class__.__name__ if not class_name in objects_by_class: objects_by_class[class_name] = [] objects_by_class[class_name].append(object) for class_name in sorted(objects_by_class): self._XCPrint(file, 0, '\n') self._XCPrint(file, 0, '/* Begin ' + class_name + ' section */\n') for object in sorted(objects_by_class[class_name], cmp=lambda x, y: cmp(x.id, y.id)): object.Print(file) self._XCPrint(file, 0, '/* End ' + class_name + ' section */\n') if self._should_print_single_line: self._XCPrint(file, 0, '}; ') else: self._XCPrint(file, 1, '};\n') npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/xml_fix.py0000644000000000000000000000417612631326456022436 0ustar 00000000000000# Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Applies a fix to CR LF TAB handling in xml.dom. Fixes this: http://code.google.com/p/chromium/issues/detail?id=76293 Working around this: http://bugs.python.org/issue5752 TODO(bradnelson): Consider dropping this when we drop XP support.
""" import xml.dom.minidom def _Replacement_write_data(writer, data, is_attrib=False): """Writes datachars to writer.""" data = data.replace("&", "&").replace("<", "<") data = data.replace("\"", """).replace(">", ">") if is_attrib: data = data.replace( "\r", " ").replace( "\n", " ").replace( "\t", " ") writer.write(data) def _Replacement_writexml(self, writer, indent="", addindent="", newl=""): # indent = current indentation # addindent = indentation to add to higher levels # newl = newline string writer.write(indent+"<" + self.tagName) attrs = self._get_attributes() a_names = attrs.keys() a_names.sort() for a_name in a_names: writer.write(" %s=\"" % a_name) _Replacement_write_data(writer, attrs[a_name].value, is_attrib=True) writer.write("\"") if self.childNodes: writer.write(">%s" % newl) for node in self.childNodes: node.writexml(writer, indent + addindent, addindent, newl) writer.write("%s%s" % (indent, self.tagName, newl)) else: writer.write("/>%s" % newl) class XmlFix(object): """Object to manage temporary patching of xml.dom.minidom.""" def __init__(self): # Preserve current xml.dom.minidom functions. self.write_data = xml.dom.minidom._write_data self.writexml = xml.dom.minidom.Element.writexml # Inject replacement versions of a function and a method. xml.dom.minidom._write_data = _Replacement_write_data xml.dom.minidom.Element.writexml = _Replacement_writexml def Cleanup(self): if self.write_data: xml.dom.minidom._write_data = self.write_data xml.dom.minidom.Element.writexml = self.writexml self.write_data = None def __del__(self): self.Cleanup() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/__init__.py0000644000000000000000000000000012631326456024473 0ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/analyzer.py0000644000000000000000000007354712631326456024613 0ustar 00000000000000# Copyright (c) 2014 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ This script is intended for use as a GYP_GENERATOR. It takes as input (by way of the generator flag config_path) the path of a json file that dictates the files and targets to search for. The following keys are supported: files: list of paths (relative) of the files to search for. test_targets: unqualified target names to search for. Any target in this list that depends upon a file in |files| is output regardless of the type of target or chain of dependencies. additional_compile_targets: Unqualified targets to search for in addition to test_targets. Targets in the combined list that depend upon a file in |files| are not necessarily output. For example, if the target is of type none then the target is not output (but one of the descendants of the target will be). The following is output: error: only supplied if there is an error. compile_targets: minimal set of targets that directly or indirectly (for targets of type none) depend on the files in |files| and is one of the supplied targets or a target that one of the supplied targets depends on. The expectation is this set of targets is passed into a build step. This list always contains the output of test_targets as well. test_targets: set of targets from the supplied |test_targets| that either directly or indirectly depend upon a file in |files|. This list if useful if additional processing needs to be done for certain targets after the build, such as running tests. 
status: outputs one of three values: none of the supplied files were found, one of the include files changed so that it should be assumed everything changed (in this case test_targets and compile_targets are not output) or at least one file was found. invalid_targets: list of supplied targets that were not found. Example: Consider a graph like the following: A D / \ B C A depends upon both B and C, A is of type none and B and C are executables. D is an executable, has no dependencies and nothing depends on it. If |additional_compile_targets| = ["A"], |test_targets| = ["B", "C"] and files = ["b.cc", "d.cc"] (B depends upon b.cc and D depends upon d.cc), then the following is output: |compile_targets| = ["B"] B must built as it depends upon the changed file b.cc and the supplied target A depends upon it. A is not output as a build_target as it is of type none with no rules and actions. |test_targets| = ["B"] B directly depends upon the change file b.cc. Even though the file d.cc, which D depends upon, has changed D is not output as it was not supplied by way of |additional_compile_targets| or |test_targets|. If the generator flag analyzer_output_path is specified, output is written there. Otherwise output is written to stdout. In Gyp the "all" target is shorthand for the root targets in the files passed to gyp. For example, if file "a.gyp" contains targets "a1" and "a2", and file "b.gyp" contains targets "b1" and "b2" and "a2" has a dependency on "b2" and gyp is supplied "a.gyp" then "all" consists of "a1" and "a2". Notice that "b1" and "b2" are not in the "all" target as "b.gyp" was not directly supplied to gyp. OTOH if both "a.gyp" and "b.gyp" are supplied to gyp then the "all" target includes "b1" and "b2". """ import gyp.common import gyp.ninja_syntax as ninja_syntax import json import os import posixpath import sys debug = False found_dependency_string = 'Found dependency' no_dependency_string = 'No dependencies' # Status when it should be assumed that everything has changed. all_changed_string = 'Found dependency (all)' # MatchStatus is used indicate if and how a target depends upon the supplied # sources. # The target's sources contain one of the supplied paths. MATCH_STATUS_MATCHES = 1 # The target has a dependency on another target that contains one of the # supplied paths. MATCH_STATUS_MATCHES_BY_DEPENDENCY = 2 # The target's sources weren't in the supplied paths and none of the target's # dependencies depend upon a target that matched. MATCH_STATUS_DOESNT_MATCH = 3 # The target doesn't contain the source, but the dependent targets have not yet # been visited to determine a more specific status yet. MATCH_STATUS_TBD = 4 generator_supports_multiple_toolsets = gyp.common.CrossCompileRequested() generator_wants_static_library_dependencies_adjusted = False generator_default_variables = { } for dirname in ['INTERMEDIATE_DIR', 'SHARED_INTERMEDIATE_DIR', 'PRODUCT_DIR', 'LIB_DIR', 'SHARED_LIB_DIR']: generator_default_variables[dirname] = '!!!' 
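# The '!!!' marker set just above lets _AddSources() (defined below) recognize
# and discard sources that live under generated directories once gyp has
# expanded these variables, e.g. (sketch, assuming expansion happens upstream
# of this generator):
#   '$(PRODUCT_DIR)/gen/foo.cc' -> '!!!/gen/foo.cc' -> skipped by _AddSources.
# The variables below, by contrast, are expanded to the empty string.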
for unused in ['RULE_INPUT_PATH', 'RULE_INPUT_ROOT', 'RULE_INPUT_NAME', 'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT', 'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX', 'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX', 'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX', 'CONFIGURATION_NAME']: generator_default_variables[unused] = '' def _ToGypPath(path): """Converts a path to the format used by gyp.""" if os.sep == '\\' and os.altsep == '/': return path.replace('\\', '/') return path def _ResolveParent(path, base_path_components): """Resolves |path|, which starts with at least one '../'. Returns an empty string if the path shouldn't be considered. See _AddSources() for a description of |base_path_components|.""" depth = 0 while path.startswith('../'): depth += 1 path = path[3:] # Relative includes may go outside the source tree. For example, an action may # have inputs in /usr/include, which are not in the source tree. if depth > len(base_path_components): return '' if depth == len(base_path_components): return path return '/'.join(base_path_components[0:len(base_path_components) - depth]) + \ '/' + path def _AddSources(sources, base_path, base_path_components, result): """Extracts valid sources from |sources| and adds them to |result|. Each source file is relative to |base_path|, but may contain '..'. To make resolving '..' easier |base_path_components| contains each of the directories in |base_path|. Additionally each source may contain variables. Such sources are ignored as it is assumed dependencies on them are expressed and tracked in some other means.""" # NOTE: gyp paths are always posix style. for source in sources: if not len(source) or source.startswith('!!!') or source.startswith('$'): continue # variable expansion may lead to //. org_source = source source = source[0] + source[1:].replace('//', '/') if source.startswith('../'): source = _ResolveParent(source, base_path_components) if len(source): result.append(source) continue result.append(base_path + source) if debug: print 'AddSource', org_source, result[len(result) - 1] def _ExtractSourcesFromAction(action, base_path, base_path_components, results): if 'inputs' in action: _AddSources(action['inputs'], base_path, base_path_components, results) def _ToLocalPath(toplevel_dir, path): """Converts |path| to a path relative to |toplevel_dir|.""" if path == toplevel_dir: return '' if path.startswith(toplevel_dir + '/'): return path[len(toplevel_dir) + len('/'):] return path def _ExtractSources(target, target_dict, toplevel_dir): # |target| is either absolute or relative and in the format of the OS. Gyp # source paths are always posix. Convert |target| to a posix path relative to # |toplevel_dir_|. This is done to make it easy to build source paths. base_path = posixpath.dirname(_ToLocalPath(toplevel_dir, _ToGypPath(target))) base_path_components = base_path.split('/') # Add a trailing '/' so that _AddSources() can easily build paths. if len(base_path): base_path += '/' if debug: print 'ExtractSources', target, base_path results = [] if 'sources' in target_dict: _AddSources(target_dict['sources'], base_path, base_path_components, results) # Include the inputs from any actions. Any changes to these affect the # resulting output. 
if 'actions' in target_dict: for action in target_dict['actions']: _ExtractSourcesFromAction(action, base_path, base_path_components, results) if 'rules' in target_dict: for rule in target_dict['rules']: _ExtractSourcesFromAction(rule, base_path, base_path_components, results) return results class Target(object): """Holds information about a particular target: deps: set of Targets this Target depends upon. This is not recursive, only the direct dependent Targets. match_status: one of the MatchStatus values. back_deps: set of Targets that have a dependency on this Target. visited: used during iteration to indicate whether we've visited this target. This is used for two iterations, once in building the set of Targets and again in _GetBuildTargets(). name: fully qualified name of the target. requires_build: True if the target type is such that it needs to be built. See _DoesTargetTypeRequireBuild for details. added_to_compile_targets: used when determining if the target was added to the set of targets that needs to be built. in_roots: true if this target is a descendant of one of the root nodes. is_executable: true if the type of target is executable. is_static_library: true if the type of target is static_library. is_or_has_linked_ancestor: true if the target does a link (eg executable), or if there is a target in back_deps that does a link.""" def __init__(self, name): self.deps = set() self.match_status = MATCH_STATUS_TBD self.back_deps = set() self.name = name # TODO(sky): I don't like hanging this off Target. This state is specific # to certain functions and should be isolated there. self.visited = False self.requires_build = False self.added_to_compile_targets = False self.in_roots = False self.is_executable = False self.is_static_library = False self.is_or_has_linked_ancestor = False class Config(object): """Details what we're looking for files: set of files to search for targets: see file description for details.""" def __init__(self): self.files = [] self.targets = set() self.additional_compile_target_names = set() self.test_target_names = set() def Init(self, params): """Initializes Config. This is a separate method as it raises an exception if there is a parse error.""" generator_flags = params.get('generator_flags', {}) config_path = generator_flags.get('config_path', None) if not config_path: return try: f = open(config_path, 'r') config = json.load(f) f.close() except IOError: raise Exception('Unable to open file ' + config_path) except ValueError as e: raise Exception('Unable to parse config file ' + config_path + str(e)) if not isinstance(config, dict): raise Exception('config_path must be a JSON file containing a dictionary') self.files = config.get('files', []) self.additional_compile_target_names = set( config.get('additional_compile_targets', [])) self.test_target_names = set(config.get('test_targets', [])) def _WasBuildFileModified(build_file, data, files, toplevel_dir): """Returns true if the build file |build_file| is either in |files| or one of the files included by |build_file| is in |files|. |toplevel_dir| is the root of the source tree.""" if _ToLocalPath(toplevel_dir, _ToGypPath(build_file)) in files: if debug: print 'gyp file modified', build_file return True # First element of included_files is the file itself. if len(data[build_file]['included_files']) <= 1: return False for include_file in data[build_file]['included_files'][1:]: # |included_files| are relative to the directory of the |build_file|. 
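# For example (sketch; assuming UnrelativePath joins the include against the
# build file's directory): build_file 'chrome/chrome.gyp' with included file
# 'common.gypi' resolves to 'chrome/common.gypi' before the |files| lookup
# below.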
rel_include_file = \ _ToGypPath(gyp.common.UnrelativePath(include_file, build_file)) if _ToLocalPath(toplevel_dir, rel_include_file) in files: if debug: print 'included gyp file modified, gyp_file=', build_file, \ 'included file=', rel_include_file return True return False def _GetOrCreateTargetByName(targets, target_name): """Creates or returns the Target at targets[target_name]. If there is no Target for |target_name| one is created. Returns a tuple of whether a new Target was created and the Target.""" if target_name in targets: return False, targets[target_name] target = Target(target_name) targets[target_name] = target return True, target def _DoesTargetTypeRequireBuild(target_dict): """Returns true if the target type is such that it needs to be built.""" # If a 'none' target has rules or actions we assume it requires a build. return bool(target_dict['type'] != 'none' or target_dict.get('actions') or target_dict.get('rules')) def _GenerateTargets(data, target_list, target_dicts, toplevel_dir, files, build_files): """Returns a tuple of the following: . A dictionary mapping from fully qualified name to Target. . A list of the targets that have a source file in |files|. . Targets that constitute the 'all' target. See description at top of file for details on the 'all' target. This sets the |match_status| of the targets that contain any of the source files in |files| to MATCH_STATUS_MATCHES. |toplevel_dir| is the root of the source tree.""" # Maps from target name to Target. name_to_target = {} # Targets that matched. matching_targets = [] # Queue of targets to visit. targets_to_visit = target_list[:] # Maps from build file to a boolean indicating whether the build file is in # |files|. build_file_in_files = {} # Root targets across all files. roots = set() # Set of Targets in |build_files|. build_file_targets = set() while len(targets_to_visit) > 0: target_name = targets_to_visit.pop() created_target, target = _GetOrCreateTargetByName(name_to_target, target_name) if created_target: roots.add(target) elif target.visited: continue target.visited = True target.requires_build = _DoesTargetTypeRequireBuild( target_dicts[target_name]) target_type = target_dicts[target_name]['type'] target.is_executable = target_type == 'executable' target.is_static_library = target_type == 'static_library' target.is_or_has_linked_ancestor = (target_type == 'executable' or target_type == 'shared_library') build_file = gyp.common.ParseQualifiedTarget(target_name)[0] if not build_file in build_file_in_files: build_file_in_files[build_file] = \ _WasBuildFileModified(build_file, data, files, toplevel_dir) if build_file in build_files: build_file_targets.add(target) # If a build file (or any of its included files) is modified we assume all # targets in the file are modified. if build_file_in_files[build_file]: print 'matching target from modified build file', target_name target.match_status = MATCH_STATUS_MATCHES matching_targets.append(target) else: sources = _ExtractSources(target_name, target_dicts[target_name], toplevel_dir) for source in sources: if _ToGypPath(os.path.normpath(source)) in files: print 'target', target_name, 'matches', source target.match_status = MATCH_STATUS_MATCHES matching_targets.append(target) break # Add dependencies to visit as well as updating back pointers for deps. 
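# A minimal sketch of the root-detection idea inside _GenerateTargets() above:
# a target counts as a root if it is created during the walk but never later
# turns up as someone else's dependency (see the roots.discard() call below).
# find_roots() is a hypothetical distillation, not gyp API.
def find_roots(target_deps):
    """target_deps: dict mapping target name -> list of dependency names."""
    roots = set(target_deps)
    for deps in target_deps.values():
        roots.difference_update(deps)
    return roots

graph = {'all': ['app', 'tests'], 'app': ['lib'], 'tests': ['lib'], 'lib': []}
assert find_roots(graph) == set(['all'])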
for dep in target_dicts[target_name].get('dependencies', []): targets_to_visit.append(dep) created_dep_target, dep_target = _GetOrCreateTargetByName(name_to_target, dep) if not created_dep_target: roots.discard(dep_target) target.deps.add(dep_target) dep_target.back_deps.add(target) return name_to_target, matching_targets, roots & build_file_targets def _GetUnqualifiedToTargetMapping(all_targets, to_find): """Returns a tuple of the following: . mapping (dictionary) from unqualified name to Target for all the Targets in |to_find|. . any target names not found. If this is empty all targets were found.""" result = {} if not to_find: return {}, [] to_find = set(to_find) for target_name in all_targets.keys(): extracted = gyp.common.ParseQualifiedTarget(target_name) if len(extracted) > 1 and extracted[1] in to_find: to_find.remove(extracted[1]) result[extracted[1]] = all_targets[target_name] if not to_find: return result, [] return result, [x for x in to_find] def _DoesTargetDependOnMatchingTargets(target): """Returns true if |target| or any of its dependencies is one of the targets containing the files supplied as input to analyzer. This updates |match_status| of the Targets as it recurses. target: the Target to look for.""" if target.match_status == MATCH_STATUS_DOESNT_MATCH: return False if target.match_status == MATCH_STATUS_MATCHES or \ target.match_status == MATCH_STATUS_MATCHES_BY_DEPENDENCY: return True for dep in target.deps: if _DoesTargetDependOnMatchingTargets(dep): target.match_status = MATCH_STATUS_MATCHES_BY_DEPENDENCY print '\t', target.name, 'matches by dep', dep.name return True target.match_status = MATCH_STATUS_DOESNT_MATCH return False def _GetTargetsDependingOnMatchingTargets(possible_targets): """Returns the list of Targets in |possible_targets| that depend (either directly or indirectly) on at least one of the targets containing the files supplied as input to analyzer. possible_targets: targets to search from.""" found = [] print 'Targets that matched by dependency:' for target in possible_targets: if _DoesTargetDependOnMatchingTargets(target): found.append(target) return found def _AddCompileTargets(target, roots, add_if_no_ancestor, result): """Recurses through all targets that depend on |target|, adding all targets that need to be built (and are in |roots|) to |result|. roots: set of root targets. add_if_no_ancestor: If true and there are no ancestors of |target| then add |target| to |result|. |target| must still be in |roots|. result: targets that need to be built are added here.""" if target.visited: return target.visited = True target.in_roots = target in roots for back_dep_target in target.back_deps: _AddCompileTargets(back_dep_target, roots, False, result) target.added_to_compile_targets |= back_dep_target.added_to_compile_targets target.in_roots |= back_dep_target.in_roots target.is_or_has_linked_ancestor |= ( back_dep_target.is_or_has_linked_ancestor) # Always add 'executable' targets. Even though they may be built by other # targets that depend upon them it makes detection of what is going to be # built easier. # And always add static_libraries that have no dependencies on them from # linkables. This is necessary as the other dependencies on them may be # static libraries themselves, which are not compile time dependencies.
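# A self-contained sketch of the memoized recursion in
# _DoesTargetDependOnMatchingTargets() above: each node caches its verdict in
# match_status, so shared subgraphs are only walked once. The Node class and
# the status constants here are hypothetical stand-ins for gyp's Target and
# MATCH_STATUS_* values.
MATCHES, MATCHES_BY_DEP, DOESNT_MATCH, TBD = range(4)

class Node(object):
    def __init__(self, status=TBD, deps=()):
        self.match_status = status
        self.deps = list(deps)

def depends_on_match(node):
    if node.match_status == DOESNT_MATCH:
        return False
    if node.match_status in (MATCHES, MATCHES_BY_DEP):
        return True
    for dep in node.deps:
        if depends_on_match(dep):
            node.match_status = MATCHES_BY_DEP
            return True
    node.match_status = DOESNT_MATCH
    return False

changed = Node(status=MATCHES)
test_target = Node(deps=[Node(deps=[changed])])
assert depends_on_match(test_target)
assert test_target.match_status == MATCHES_BY_DEP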
if target.in_roots and \ (target.is_executable or (not target.added_to_compile_targets and (add_if_no_ancestor or target.requires_build)) or (target.is_static_library and add_if_no_ancestor and not target.is_or_has_linked_ancestor)): print '\t\tadding to compile targets', target.name, 'executable', \ target.is_executable, 'added_to_compile_targets', \ target.added_to_compile_targets, 'add_if_no_ancestor', \ add_if_no_ancestor, 'requires_build', target.requires_build, \ 'is_static_library', target.is_static_library, \ 'is_or_has_linked_ancestor', target.is_or_has_linked_ancestor result.add(target) target.added_to_compile_targets = True def _GetCompileTargets(matching_targets, supplied_targets): """Returns the set of Targets that require a build. matching_targets: targets that changed and need to be built. supplied_targets: set of targets supplied to analyzer to search from.""" result = set() for target in matching_targets: print 'finding compile targets for match', target.name _AddCompileTargets(target, supplied_targets, True, result) return result def _WriteOutput(params, **values): """Writes the output, either to stdout or to a file if one is specified.""" if 'error' in values: print 'Error:', values['error'] if 'status' in values: print values['status'] if 'targets' in values: values['targets'].sort() print 'Supplied targets that depend on changed files:' for target in values['targets']: print '\t', target if 'invalid_targets' in values: values['invalid_targets'].sort() print 'The following targets were not found:' for target in values['invalid_targets']: print '\t', target if 'build_targets' in values: values['build_targets'].sort() print 'Targets that require a build:' for target in values['build_targets']: print '\t', target if 'compile_targets' in values: values['compile_targets'].sort() print 'Targets that need to be built:' for target in values['compile_targets']: print '\t', target if 'test_targets' in values: values['test_targets'].sort() print 'Test targets:' for target in values['test_targets']: print '\t', target output_path = params.get('generator_flags', {}).get( 'analyzer_output_path', None) if not output_path: print json.dumps(values) return try: f = open(output_path, 'w') f.write(json.dumps(values) + '\n') f.close() except IOError as e: print 'Error writing to output file', output_path, str(e) def _WasGypIncludeFileModified(params, files): """Returns true if one of the files in |files| is in the set of included files.""" if params['options'].includes: for include in params['options'].includes: if _ToGypPath(os.path.normpath(include)) in files: print 'Include file modified, assuming all changed', include return True return False def _NamesNotIn(names, mapping): """Returns a list of the values in |names| that are not in |mapping|.""" return [name for name in names if name not in mapping] def _LookupTargets(names, mapping): """Returns a list of the mapping[name] for each value in |names| that is in |mapping|.""" return [mapping[name] for name in names if name in mapping] def CalculateVariables(default_variables, params): """Calculate additional variables for use in the build (called by gyp).""" flavor = gyp.common.GetFlavor(params) if flavor == 'mac': default_variables.setdefault('OS', 'mac') elif flavor == 'win': default_variables.setdefault('OS', 'win') # Copy additional generator configuration data from VS, which is shared # by the Windows Ninja generator.
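# For reference, the analyzer's result is also emitted as JSON by
# _WriteOutput() above (to stdout, or to the file named by the
# analyzer_output_path generator flag). A typical success payload looks
# roughly like the dictionary below; the target names are hypothetical, and
# the status string is the assumed value of found_dependency_string defined
# near the top of this file.
import json

example_result = {
    'status': 'Found dependency',  # assumed value of found_dependency_string
    'test_targets': ['foo_unittests'],
    'compile_targets': ['foo', 'foo_unittests'],
}
print(json.dumps(example_result))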
import gyp.generator.msvs as msvs_generator generator_additional_non_configuration_keys = getattr(msvs_generator, 'generator_additional_non_configuration_keys', []) generator_additional_path_sections = getattr(msvs_generator, 'generator_additional_path_sections', []) gyp.msvs_emulation.CalculateCommonVariables(default_variables, params) else: operating_system = flavor if flavor == 'android': operating_system = 'linux' # Keep this legacy behavior for now. default_variables.setdefault('OS', operating_system) class TargetCalculator(object): """Calculates the matching test_targets and matching compile_targets.""" def __init__(self, files, additional_compile_target_names, test_target_names, data, target_list, target_dicts, toplevel_dir, build_files): self._additional_compile_target_names = set(additional_compile_target_names) self._test_target_names = set(test_target_names) self._name_to_target, self._changed_targets, self._root_targets = ( _GenerateTargets(data, target_list, target_dicts, toplevel_dir, frozenset(files), build_files)) self._unqualified_mapping, self.invalid_targets = ( _GetUnqualifiedToTargetMapping(self._name_to_target, self._supplied_target_names_no_all())) def _supplied_target_names(self): return self._additional_compile_target_names | self._test_target_names def _supplied_target_names_no_all(self): """Returns the supplied test targets without 'all'.""" result = self._supplied_target_names(); result.discard('all') return result def is_build_impacted(self): """Returns true if the supplied files impact the build at all.""" return self._changed_targets def find_matching_test_target_names(self): """Returns the set of output test targets.""" assert self.is_build_impacted() # Find the test targets first. 'all' is special cased to mean all the # root targets. To deal with 'all', the supplied |test_targets| are expanded # to include the root targets during lookup. If any of the root targets # match, we remove it and replace it with 'all'. test_target_names_no_all = set(self._test_target_names) test_target_names_no_all.discard('all') test_targets_no_all = _LookupTargets(test_target_names_no_all, self._unqualified_mapping) test_target_names_contains_all = 'all' in self._test_target_names if test_target_names_contains_all: test_targets = [x for x in (set(test_targets_no_all) | set(self._root_targets))] else: test_targets = [x for x in test_targets_no_all] print 'supplied test_targets' for target_name in self._test_target_names: print '\t', target_name print 'found test_targets' for target in test_targets: print '\t', target.name print 'searching for matching test targets' matching_test_targets = _GetTargetsDependingOnMatchingTargets(test_targets) matching_test_targets_contains_all = (test_target_names_contains_all and set(matching_test_targets) & set(self._root_targets)) if matching_test_targets_contains_all: # Remove any of the targets for all that were not explicitly supplied, # 'all' is subsequently added to the matching names below.
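# A runnable distillation of the 'all' handling described above: supplied
# test target names are looked up by unqualified name, and 'all' expands to
# the root targets. expand_all() is a hypothetical helper, not gyp API.
def expand_all(test_target_names, unqualified_mapping, root_targets):
    names = set(test_target_names)
    targets = [unqualified_mapping[n] for n in names - set(['all'])
               if n in unqualified_mapping]
    if 'all' in names:
        targets = list(set(targets) | set(root_targets))
    return targets

mapping = {'foo_unittests': 'src/foo/foo.gyp:foo_unittests#target'}
roots = ['src/all.gyp:All#target']
expanded = expand_all(['foo_unittests', 'all'], mapping, roots)
assert set(expanded) == set(mapping.values()) | set(roots)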
matching_test_targets = [x for x in (set(matching_test_targets) & set(test_targets_no_all))] print 'matched test_targets' for target in matching_test_targets: print '\t', target.name matching_target_names = [gyp.common.ParseQualifiedTarget(target.name)[1] for target in matching_test_targets] if matching_test_targets_contains_all: matching_target_names.append('all') print '\tall' return matching_target_names def find_matching_compile_target_names(self): """Returns the set of output compile targets.""" assert self.is_build_impacted(); # Compile targets are found by searching up from changed targets. # Reset the visited status for _GetBuildTargets. for target in self._name_to_target.itervalues(): target.visited = False supplied_targets = _LookupTargets(self._supplied_target_names_no_all(), self._unqualified_mapping) if 'all' in self._supplied_target_names(): supplied_targets = [x for x in (set(supplied_targets) | set(self._root_targets))] print 'Supplied test_targets & compile_targets' for target in supplied_targets: print '\t', target.name print 'Finding compile targets' compile_targets = _GetCompileTargets(self._changed_targets, supplied_targets) return [gyp.common.ParseQualifiedTarget(target.name)[1] for target in compile_targets] def GenerateOutput(target_list, target_dicts, data, params): """Called by gyp as the final stage. Outputs results.""" config = Config() try: config.Init(params) if not config.files: raise Exception('Must specify files to analyze via config_path generator ' 'flag') toplevel_dir = _ToGypPath(os.path.abspath(params['options'].toplevel_dir)) if debug: print 'toplevel_dir', toplevel_dir if _WasGypIncludeFileModified(params, config.files): result_dict = { 'status': all_changed_string, 'test_targets': list(config.test_target_names), 'compile_targets': list( config.additional_compile_target_names | config.test_target_names) } _WriteOutput(params, **result_dict) return calculator = TargetCalculator(config.files, config.additional_compile_target_names, config.test_target_names, data, target_list, target_dicts, toplevel_dir, params['build_files']) if not calculator.is_build_impacted(): result_dict = { 'status': no_dependency_string, 'test_targets': [], 'compile_targets': [] } if calculator.invalid_targets: result_dict['invalid_targets'] = calculator.invalid_targets _WriteOutput(params, **result_dict) return test_target_names = calculator.find_matching_test_target_names() compile_target_names = calculator.find_matching_compile_target_names() found_at_least_one_target = compile_target_names or test_target_names result_dict = { 'test_targets': test_target_names, 'status': found_dependency_string if found_at_least_one_target else no_dependency_string, 'compile_targets': list( set(compile_target_names) | set(test_target_names)) } if calculator.invalid_targets: result_dict['invalid_targets'] = calculator.invalid_targets _WriteOutput(params, **result_dict) except Exception as e: _WriteOutput(params, error=str(e)) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/android.py0000644000000000000000000013032612631326456024373 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. # Notes: # # This generates makefiles suitable for inclusion into the Android build system # via an Android.mk file. It is based on make.py, the standard makefile # generator. 
# # The code below generates a separate .mk file for each target, but # all are sourced by the top-level GypAndroid.mk. This means that all # variables in .mk-files clobber one another, and furthermore that any # variables set potentially clash with other Android build system variables. # Try to avoid setting global variables where possible. import gyp import gyp.common import gyp.generator.make as make # Reuse global functions from make backend. import os import re import subprocess generator_default_variables = { 'OS': 'android', 'EXECUTABLE_PREFIX': '', 'EXECUTABLE_SUFFIX': '', 'STATIC_LIB_PREFIX': 'lib', 'SHARED_LIB_PREFIX': 'lib', 'STATIC_LIB_SUFFIX': '.a', 'SHARED_LIB_SUFFIX': '.so', 'INTERMEDIATE_DIR': '$(gyp_intermediate_dir)', 'SHARED_INTERMEDIATE_DIR': '$(gyp_shared_intermediate_dir)', 'PRODUCT_DIR': '$(gyp_shared_intermediate_dir)', 'SHARED_LIB_DIR': '$(builddir)/lib.$(TOOLSET)', 'LIB_DIR': '$(obj).$(TOOLSET)', 'RULE_INPUT_ROOT': '%(INPUT_ROOT)s', # This gets expanded by Python. 'RULE_INPUT_DIRNAME': '%(INPUT_DIRNAME)s', # This gets expanded by Python. 'RULE_INPUT_PATH': '$(RULE_SOURCES)', 'RULE_INPUT_EXT': '$(suffix $<)', 'RULE_INPUT_NAME': '$(notdir $<)', 'CONFIGURATION_NAME': '$(GYP_CONFIGURATION)', } # Make supports multiple toolsets generator_supports_multiple_toolsets = True # Generator-specific gyp specs. generator_additional_non_configuration_keys = [ # Boolean to declare that this target does not want its name mangled. 'android_unmangled_name', # Map of android build system variables to set. 'aosp_build_settings', ] generator_additional_path_sections = [] generator_extra_sources_for_rules = [] ALL_MODULES_FOOTER = """\ # "gyp_all_modules" is a concatenation of the "gyp_all_modules" targets from # all the included sub-makefiles. This is just here to clarify. gyp_all_modules: """ header = """\ # This file is generated by gyp; do not edit. """ # Map gyp target types to Android module classes. MODULE_CLASSES = { 'static_library': 'STATIC_LIBRARIES', 'shared_library': 'SHARED_LIBRARIES', 'executable': 'EXECUTABLES', } def IsCPPExtension(ext): return make.COMPILABLE_EXTENSIONS.get(ext) == 'cxx' def Sourceify(path): """Convert a path to its source directory form. The Android backend does not support options.generator_output, so this function is a noop.""" return path # Map from qualified target to path to output. # For Android, the target of these maps is a tuple ('static', 'modulename'), # ('dynamic', 'modulename'), or ('path', 'some/path') instead of a string, # since we link by module. target_outputs = {} # Map from qualified target to any linkable output. A subset # of target_outputs. E.g. when mybinary depends on liba, we want to # include liba in the linker line; when otherbinary depends on # mybinary, we just want to build mybinary first. target_link_deps = {} class AndroidMkWriter(object): """AndroidMkWriter packages up the writing of one target-specific Android.mk. Its only real entry point is Write(), and is mostly used for namespacing. """ def __init__(self, android_top_dir): self.android_top_dir = android_top_dir def Write(self, qualified_target, relative_target, base_path, output_filename, spec, configs, part_of_all, write_alias_target, sdk_version): """The main entry point: writes a .mk file for a single target. 
Arguments: qualified_target: target we're generating relative_target: qualified target name relative to the root base_path: path relative to source root we're building in, used to resolve target-relative paths output_filename: output .mk file name to write spec, configs: gyp info part_of_all: flag indicating this target is part of 'all' write_alias_target: flag indicating whether to create short aliases for this target sdk_version: what to emit for LOCAL_SDK_VERSION in output """ gyp.common.EnsureDirExists(output_filename) self.fp = open(output_filename, 'w') self.fp.write(header) self.qualified_target = qualified_target self.relative_target = relative_target self.path = base_path self.target = spec['target_name'] self.type = spec['type'] self.toolset = spec['toolset'] deps, link_deps = self.ComputeDeps(spec) # Some of the generation below can add extra output, sources, or # link dependencies. All of the out params of the functions that # follow use names like extra_foo. extra_outputs = [] extra_sources = [] self.android_class = MODULE_CLASSES.get(self.type, 'GYP') self.android_module = self.ComputeAndroidModule(spec) (self.android_stem, self.android_suffix) = self.ComputeOutputParts(spec) self.output = self.output_binary = self.ComputeOutput(spec) # Standard header. self.WriteLn('include $(CLEAR_VARS)\n') # Module class and name. self.WriteLn('LOCAL_MODULE_CLASS := ' + self.android_class) self.WriteLn('LOCAL_MODULE := ' + self.android_module) # Only emit LOCAL_MODULE_STEM if it's different to LOCAL_MODULE. # The library module classes fail if the stem is set. ComputeOutputParts # makes sure that stem == modulename in these cases. if self.android_stem != self.android_module: self.WriteLn('LOCAL_MODULE_STEM := ' + self.android_stem) self.WriteLn('LOCAL_MODULE_SUFFIX := ' + self.android_suffix) if self.toolset == 'host': self.WriteLn('LOCAL_IS_HOST_MODULE := true') self.WriteLn('LOCAL_MULTILIB := $(GYP_HOST_MULTILIB)') else: self.WriteLn('LOCAL_MODULE_TARGET_ARCH := ' '$(TARGET_$(GYP_VAR_PREFIX)ARCH)') self.WriteLn('LOCAL_SDK_VERSION := %s' % sdk_version) # Grab output directories; needed for Actions and Rules. if self.toolset == 'host': self.WriteLn('gyp_intermediate_dir := ' '$(call local-intermediates-dir,,$(GYP_HOST_VAR_PREFIX))') else: self.WriteLn('gyp_intermediate_dir := ' '$(call local-intermediates-dir,,$(GYP_VAR_PREFIX))') self.WriteLn('gyp_shared_intermediate_dir := ' '$(call intermediates-dir-for,GYP,shared,,,$(GYP_VAR_PREFIX))') self.WriteLn() # List files this target depends on so that actions/rules/copies/sources # can depend on the list. # TODO: doesn't pull in things through transitive link deps; needed? target_dependencies = [x[1] for x in deps if x[0] == 'path'] self.WriteLn('# Make sure our deps are built first.') self.WriteList(target_dependencies, 'GYP_TARGET_DEPENDENCIES', local_pathify=True) # Actions must come first, since they can generate more OBJs for use below. if 'actions' in spec: self.WriteActions(spec['actions'], extra_sources, extra_outputs) # Rules must be early like actions. if 'rules' in spec: self.WriteRules(spec['rules'], extra_sources, extra_outputs) if 'copies' in spec: self.WriteCopies(spec['copies'], extra_outputs) # GYP generated outputs. self.WriteList(extra_outputs, 'GYP_GENERATED_OUTPUTS', local_pathify=True) # Set LOCAL_ADDITIONAL_DEPENDENCIES so that Android's build rules depend # on both our dependency targets and our generated files. 
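# Hedged illustration of the module prologue the code above emits for a
# hypothetical static_library target 'foo' living in third_party/foo (the
# real module name comes from ComputeAndroidModule(), defined further down,
# and the SDK version from the aosp_sdk_version generator flag):
example_prologue = '\n'.join([
    'include $(CLEAR_VARS)',
    '',
    'LOCAL_MODULE_CLASS := STATIC_LIBRARIES',
    'LOCAL_MODULE := third_party_foo_foo_gyp',
    'LOCAL_MODULE_SUFFIX := .a',
    'LOCAL_MODULE_TARGET_ARCH := $(TARGET_$(GYP_VAR_PREFIX)ARCH)',
    'LOCAL_SDK_VERSION := 19',
])
print(example_prologue)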
self.WriteLn('# Make sure our deps and generated files are built first.') self.WriteLn('LOCAL_ADDITIONAL_DEPENDENCIES := $(GYP_TARGET_DEPENDENCIES) ' '$(GYP_GENERATED_OUTPUTS)') self.WriteLn() # Sources. if spec.get('sources', []) or extra_sources: self.WriteSources(spec, configs, extra_sources) self.WriteTarget(spec, configs, deps, link_deps, part_of_all, write_alias_target) # Update global list of target outputs, used in dependency tracking. target_outputs[qualified_target] = ('path', self.output_binary) # Update global list of link dependencies. if self.type == 'static_library': target_link_deps[qualified_target] = ('static', self.android_module) elif self.type == 'shared_library': target_link_deps[qualified_target] = ('shared', self.android_module) self.fp.close() return self.android_module def WriteActions(self, actions, extra_sources, extra_outputs): """Write Makefile code for any 'actions' from the gyp input. extra_sources: a list that will be filled in with newly generated source files, if any extra_outputs: a list that will be filled in with any outputs of these actions (used to make other pieces dependent on these actions) """ for action in actions: name = make.StringToMakefileVariable('%s_%s' % (self.relative_target, action['action_name'])) self.WriteLn('### Rules for action "%s":' % action['action_name']) inputs = action['inputs'] outputs = action['outputs'] # Build up a list of outputs. # Collect the output dirs we'll need. dirs = set() for out in outputs: if not out.startswith('$'): print ('WARNING: Action for target "%s" writes output to local path ' '"%s".' % (self.target, out)) dir = os.path.split(out)[0] if dir: dirs.add(dir) if int(action.get('process_outputs_as_sources', False)): extra_sources += outputs # Prepare the actual command. command = gyp.common.EncodePOSIXShellList(action['action']) if 'message' in action: quiet_cmd = 'Gyp action: %s ($@)' % action['message'] else: quiet_cmd = 'Gyp action: %s ($@)' % name if len(dirs) > 0: command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command cd_action = 'cd $(gyp_local_path)/%s; ' % self.path command = cd_action + command # The makefile rules are all relative to the top dir, but the gyp actions # are defined relative to their containing dir. This replaces the gyp_* # variables for the action rule with an absolute version so that the # output goes in the right place. # Only write the gyp_* rules for the "primary" output (:1); # it's superfluous for the "extra outputs", and this avoids accidentally # writing duplicate dummy rules for those outputs. main_output = make.QuoteSpaces(self.LocalPathify(outputs[0])) self.WriteLn('%s: gyp_local_path := $(LOCAL_PATH)' % main_output) self.WriteLn('%s: gyp_var_prefix := $(GYP_VAR_PREFIX)' % main_output) self.WriteLn('%s: gyp_intermediate_dir := ' '$(abspath $(gyp_intermediate_dir))' % main_output) self.WriteLn('%s: gyp_shared_intermediate_dir := ' '$(abspath $(gyp_shared_intermediate_dir))' % main_output) # Android's envsetup.sh adds a number of directories to the path including # the built host binary directory. This causes actions/rules invoked by # gyp to sometimes use these instead of system versions, e.g. bison. # The built host binaries may not be suitable, and can cause errors. # So, we remove them from the PATH using the ANDROID_BUILD_PATHS variable # set by envsetup. 
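# A simplified, runnable sketch of the command assembly in WriteActions()
# above: the action runs from its containing directory, after creating any
# output directories. The real code quotes arguments via
# gyp.common.EncodePOSIXShellList; plain joining is used here for brevity,
# and build_action_command() is a hypothetical stand-in.
def build_action_command(action_args, local_path, out_dirs):
    command = ' '.join(action_args)
    if out_dirs:
        command = 'mkdir -p %s; ' % ' '.join(sorted(out_dirs)) + command
    return 'cd $(gyp_local_path)/%s; ' % local_path + command

print(build_action_command(['python', 'gen.py', '$@'], 'foo',
                           ['$(gyp_intermediate_dir)']))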
self.WriteLn('%s: export PATH := $(subst $(ANDROID_BUILD_PATHS),,$(PATH))' % main_output) # Don't allow spaces in input/output filenames, but make an exception for # filenames which start with '$(' since it's okay for there to be spaces # inside of make function/macro invocations. for input in inputs: if not input.startswith('$(') and ' ' in input: raise gyp.common.GypError( 'Action input filename "%s" in target %s contains a space' % (input, self.target)) for output in outputs: if not output.startswith('$(') and ' ' in output: raise gyp.common.GypError( 'Action output filename "%s" in target %s contains a space' % (output, self.target)) self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES)' % (main_output, ' '.join(map(self.LocalPathify, inputs)))) self.WriteLn('\t@echo "%s"' % quiet_cmd) self.WriteLn('\t$(hide)%s\n' % command) for output in outputs[1:]: # Make each output depend on the main output, with an empty command # to force make to notice that the mtime has changed. self.WriteLn('%s: %s ;' % (self.LocalPathify(output), main_output)) extra_outputs += outputs self.WriteLn() self.WriteLn() def WriteRules(self, rules, extra_sources, extra_outputs): """Write Makefile code for any 'rules' from the gyp input. extra_sources: a list that will be filled in with newly generated source files, if any extra_outputs: a list that will be filled in with any outputs of these rules (used to make other pieces dependent on these rules) """ if len(rules) == 0: return for rule in rules: if len(rule.get('rule_sources', [])) == 0: continue name = make.StringToMakefileVariable('%s_%s' % (self.relative_target, rule['rule_name'])) self.WriteLn('\n### Generated for rule "%s":' % name) self.WriteLn('# "%s":' % rule) inputs = rule.get('inputs') for rule_source in rule.get('rule_sources', []): (rule_source_dirname, rule_source_basename) = os.path.split(rule_source) (rule_source_root, rule_source_ext) = \ os.path.splitext(rule_source_basename) outputs = [self.ExpandInputRoot(out, rule_source_root, rule_source_dirname) for out in rule['outputs']] dirs = set() for out in outputs: if not out.startswith('$'): print ('WARNING: Rule for target %s writes output to local path %s' % (self.target, out)) dir = os.path.dirname(out) if dir: dirs.add(dir) extra_outputs += outputs if int(rule.get('process_outputs_as_sources', False)): extra_sources.extend(outputs) components = [] for component in rule['action']: component = self.ExpandInputRoot(component, rule_source_root, rule_source_dirname) if '$(RULE_SOURCES)' in component: component = component.replace('$(RULE_SOURCES)', rule_source) components.append(component) command = gyp.common.EncodePOSIXShellList(components) cd_action = 'cd $(gyp_local_path)/%s; ' % self.path command = cd_action + command if dirs: command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command # We set up a rule to build the first output, and then set up # a rule for each additional output to depend on the first. outputs = map(self.LocalPathify, outputs) main_output = outputs[0] self.WriteLn('%s: gyp_local_path := $(LOCAL_PATH)' % main_output) self.WriteLn('%s: gyp_var_prefix := $(GYP_VAR_PREFIX)' % main_output) self.WriteLn('%s: gyp_intermediate_dir := ' '$(abspath $(gyp_intermediate_dir))' % main_output) self.WriteLn('%s: gyp_shared_intermediate_dir := ' '$(abspath $(gyp_shared_intermediate_dir))' % main_output) # See explanation in WriteActions. 
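# A runnable sketch of the per-source placeholder expansion that WriteRules()
# above applies to each rule output (cf. ExpandInputRoot() further down).
# expand_outputs() is a hypothetical distillation.
import os

def expand_outputs(rule_outputs, rule_source):
    dirname, basename = os.path.split(rule_source)
    root = os.path.splitext(basename)[0]
    return [out % {'INPUT_ROOT': root, 'INPUT_DIRNAME': dirname}
            for out in rule_outputs]

outs = expand_outputs(['$(gyp_shared_intermediate_dir)/%(INPUT_ROOT)s.cc'],
                      'src/parser.y')
assert outs == ['$(gyp_shared_intermediate_dir)/parser.cc']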
self.WriteLn('%s: export PATH := ' '$(subst $(ANDROID_BUILD_PATHS),,$(PATH))' % main_output) main_output_deps = self.LocalPathify(rule_source) if inputs: main_output_deps += ' ' main_output_deps += ' '.join([self.LocalPathify(f) for f in inputs]) self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES)' % (main_output, main_output_deps)) self.WriteLn('\t%s\n' % command) for output in outputs[1:]: # Make each output depend on the main output, with an empty command # to force make to notice that the mtime has changed. self.WriteLn('%s: %s ;' % (output, main_output)) self.WriteLn() self.WriteLn() def WriteCopies(self, copies, extra_outputs): """Write Makefile code for any 'copies' from the gyp input. extra_outputs: a list that will be filled in with any outputs of this action (used to make other pieces dependent on this action) """ self.WriteLn('### Generated for copy rule.') variable = make.StringToMakefileVariable(self.relative_target + '_copies') outputs = [] for copy in copies: for path in copy['files']: # The Android build system does not allow generation of files into the # source tree. The destination should start with a variable, which will # typically be $(gyp_intermediate_dir) or # $(gyp_shared_intermediate_dir). Note that we can't use an assertion # because some of the gyp tests depend on this. if not copy['destination'].startswith('$'): print ('WARNING: Copy rule for target %s writes output to ' 'local path %s' % (self.target, copy['destination'])) # LocalPathify() calls normpath, stripping trailing slashes. path = Sourceify(self.LocalPathify(path)) filename = os.path.split(path)[1] output = Sourceify(self.LocalPathify(os.path.join(copy['destination'], filename))) self.WriteLn('%s: %s $(GYP_TARGET_DEPENDENCIES) | $(ACP)' % (output, path)) self.WriteLn('\t@echo Copying: $@') self.WriteLn('\t$(hide) mkdir -p $(dir $@)') self.WriteLn('\t$(hide) $(ACP) -rpf $< $@') self.WriteLn() outputs.append(output) self.WriteLn('%s = %s' % (variable, ' '.join(map(make.QuoteSpaces, outputs)))) extra_outputs.append('$(%s)' % variable) self.WriteLn() def WriteSourceFlags(self, spec, configs): """Write out the flags and include paths used to compile source files for the current target. Args: spec, configs: input from gyp. """ for configname, config in sorted(configs.iteritems()): extracted_includes = [] self.WriteLn('\n# Flags passed to both C and C++ files.') cflags, includes_from_cflags = self.ExtractIncludesFromCFlags( config.get('cflags', []) + config.get('cflags_c', [])) extracted_includes.extend(includes_from_cflags) self.WriteList(cflags, 'MY_CFLAGS_%s' % configname) self.WriteList(config.get('defines'), 'MY_DEFS_%s' % configname, prefix='-D', quoter=make.EscapeCppDefine) self.WriteLn('\n# Include paths placed before CFLAGS/CPPFLAGS') includes = list(config.get('include_dirs', [])) includes.extend(extracted_includes) includes = map(Sourceify, map(self.LocalPathify, includes)) includes = self.NormalizeIncludePaths(includes) self.WriteList(includes, 'LOCAL_C_INCLUDES_%s' % configname) self.WriteLn('\n# Flags passed to only C++ (and not C) files.') self.WriteList(config.get('cflags_cc'), 'LOCAL_CPPFLAGS_%s' % configname) self.WriteLn('\nLOCAL_CFLAGS := $(MY_CFLAGS_$(GYP_CONFIGURATION)) ' '$(MY_DEFS_$(GYP_CONFIGURATION))') # Undefine ANDROID for host modules # TODO: the source code should not use macro ANDROID to tell if it's host # or target module. 
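# A standalone sketch of the -I extraction that WriteSourceFlags() above
# relies on (the real helper, ExtractIncludesFromCFlags(), is defined later
# in this class); split_cflags() is a hypothetical stand-in.
def split_cflags(cflags):
    clean, includes = [], []
    for flag in cflags:
        if flag.startswith('-I'):
            includes.append(flag[2:])
        else:
            clean.append(flag)
    return clean, includes

assert split_cflags(['-Wall', '-Iinclude', '-O2']) == \
    (['-Wall', '-O2'], ['include'])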
if self.toolset == 'host': self.WriteLn('# Undefine ANDROID for host modules') self.WriteLn('LOCAL_CFLAGS += -UANDROID') self.WriteLn('LOCAL_C_INCLUDES := $(GYP_COPIED_SOURCE_ORIGIN_DIRS) ' '$(LOCAL_C_INCLUDES_$(GYP_CONFIGURATION))') self.WriteLn('LOCAL_CPPFLAGS := $(LOCAL_CPPFLAGS_$(GYP_CONFIGURATION))') # Android uses separate flags for assembly file invocations, but gyp expects # the same CFLAGS to be applied: self.WriteLn('LOCAL_ASFLAGS := $(LOCAL_CFLAGS)') def WriteSources(self, spec, configs, extra_sources): """Write Makefile code for any 'sources' from the gyp input. These are source files necessary to build the current target. We need to handle shared_intermediate directory source files as a special case by copying them to the intermediate directory and treating them as generated sources. Otherwise the Android build rules won't pick them up. Args: spec, configs: input from gyp. extra_sources: Sources generated from Actions or Rules. """ sources = filter(make.Compilable, spec.get('sources', [])) generated_not_sources = [x for x in extra_sources if not make.Compilable(x)] extra_sources = filter(make.Compilable, extra_sources) # Determine and output the C++ extension used by these sources. # We simply find the first C++ file and use that extension. all_sources = sources + extra_sources local_cpp_extension = '.cpp' for source in all_sources: (root, ext) = os.path.splitext(source) if IsCPPExtension(ext): local_cpp_extension = ext break if local_cpp_extension != '.cpp': self.WriteLn('LOCAL_CPP_EXTENSION := %s' % local_cpp_extension) # We need to move any non-generated sources that are coming from the # shared intermediate directory out of LOCAL_SRC_FILES and put them # into LOCAL_GENERATED_SOURCES. We also need to move over any C++ files # that don't match our local_cpp_extension, since Android will only # generate Makefile rules for a single LOCAL_CPP_EXTENSION. local_files = [] for source in sources: (root, ext) = os.path.splitext(source) if '$(gyp_shared_intermediate_dir)' in source: extra_sources.append(source) elif '$(gyp_intermediate_dir)' in source: extra_sources.append(source) elif IsCPPExtension(ext) and ext != local_cpp_extension: extra_sources.append(source) else: local_files.append(os.path.normpath(os.path.join(self.path, source))) # For any generated source, if it is coming from the shared intermediate # directory then we add a Make rule to copy them to the local intermediate # directory first. This is because the Android LOCAL_GENERATED_SOURCES # must be in the local module intermediate directory for the compile rules # to work properly. If the file has the wrong C++ extension, then we add # a rule to copy that to intermediates and use the new version. final_generated_sources = [] # If a source file gets copied, we still need to add the original source # directory as a header search path, since GCC searches for headers in the # directory that contains the source file by default.
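# A runnable distillation of the source partitioning WriteSources() above
# performs: intermediate-dir sources and C++ files with a mismatched
# extension become generated sources; everything else stays a local source.
# partition_sources() is a hypothetical stand-in (the real code consults
# make.COMPILABLE_EXTENSIONS via IsCPPExtension()).
import os

def partition_sources(sources, base_path, local_cpp_ext='.cpp'):
    local, generated = [], []
    for source in sources:
        ext = os.path.splitext(source)[1]
        if ('$(gyp_shared_intermediate_dir)' in source or
                '$(gyp_intermediate_dir)' in source or
                (ext in ('.cc', '.cpp', '.cxx') and ext != local_cpp_ext)):
            generated.append(source)
        else:
            local.append(os.path.normpath(os.path.join(base_path, source)))
    return local, generated

local, generated = partition_sources(
    ['a.cpp', 'b.cc', '$(gyp_shared_intermediate_dir)/c.cpp'], 'foo')
assert local == [os.path.join('foo', 'a.cpp')] and len(generated) == 2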
origin_src_dirs = [] for source in extra_sources: local_file = source if not '$(gyp_intermediate_dir)/' in local_file: basename = os.path.basename(local_file) local_file = '$(gyp_intermediate_dir)/' + basename (root, ext) = os.path.splitext(local_file) if IsCPPExtension(ext) and ext != local_cpp_extension: local_file = root + local_cpp_extension if local_file != source: self.WriteLn('%s: %s' % (local_file, self.LocalPathify(source))) self.WriteLn('\tmkdir -p $(@D); cp $< $@') origin_src_dirs.append(os.path.dirname(source)) final_generated_sources.append(local_file) # We add back in all of the non-compilable stuff to make sure that the # make rules have dependencies on them. final_generated_sources.extend(generated_not_sources) self.WriteList(final_generated_sources, 'LOCAL_GENERATED_SOURCES') origin_src_dirs = gyp.common.uniquer(origin_src_dirs) origin_src_dirs = map(Sourceify, map(self.LocalPathify, origin_src_dirs)) self.WriteList(origin_src_dirs, 'GYP_COPIED_SOURCE_ORIGIN_DIRS') self.WriteList(local_files, 'LOCAL_SRC_FILES') # Write out the flags used to compile the source; this must be done last # so that GYP_COPIED_SOURCE_ORIGIN_DIRS can be used as an include path. self.WriteSourceFlags(spec, configs) def ComputeAndroidModule(self, spec): """Return the Android module name used for a gyp spec. We use the complete qualified target name to avoid collisions between duplicate targets in different directories. We also add a suffix to distinguish gyp-generated module names. """ if int(spec.get('android_unmangled_name', 0)): assert self.type != 'shared_library' or self.target.startswith('lib') return self.target if self.type == 'shared_library': # For reasons of convention, the Android build system requires that all # shared library modules are named 'libfoo' when generating -l flags. prefix = 'lib_' else: prefix = '' if spec['toolset'] == 'host': suffix = '_$(TARGET_$(GYP_VAR_PREFIX)ARCH)_host_gyp' else: suffix = '_gyp' if self.path: middle = make.StringToMakefileVariable('%s_%s' % (self.path, self.target)) else: middle = make.StringToMakefileVariable(self.target) return ''.join([prefix, middle, suffix]) def ComputeOutputParts(self, spec): """Return the 'output basename' of a gyp spec, split into filename + ext. Android libraries must be named the same thing as their module name, otherwise the linker can't find them, so product_name and so on must be ignored if we are building a library, and the "lib" prepending is not done for Android. """ assert self.type != 'loadable_module' # TODO: not supported? target = spec['target_name'] target_prefix = '' target_ext = '' if self.type == 'static_library': target = self.ComputeAndroidModule(spec) target_ext = '.a' elif self.type == 'shared_library': target = self.ComputeAndroidModule(spec) target_ext = '.so' elif self.type == 'none': target_ext = '.stamp' elif self.type != 'executable': print ("ERROR: What output file should be generated?", "type", self.type, "target", target) if self.type != 'static_library' and self.type != 'shared_library': target_prefix = spec.get('product_prefix', target_prefix) target = spec.get('product_name', target) product_ext = spec.get('product_extension') if product_ext: target_ext = '.' + product_ext target_stem = target_prefix + target return (target_stem, target_ext) def ComputeOutputBasename(self, spec): """Return the 'output basename' of a gyp spec. 
E.g., the loadable module 'foobar' in directory 'baz' will produce 'libfoobar.so' """ return ''.join(self.ComputeOutputParts(spec)) def ComputeOutput(self, spec): """Return the 'output' (full output path) of a gyp spec. E.g., the loadable module 'foobar' in directory 'baz' will produce '$(obj)/baz/libfoobar.so' """ if self.type == 'executable': # We install host executables into shared_intermediate_dir so they can be # run by gyp rules that refer to PRODUCT_DIR. path = '$(gyp_shared_intermediate_dir)' elif self.type == 'shared_library': if self.toolset == 'host': path = '$($(GYP_HOST_VAR_PREFIX)HOST_OUT_INTERMEDIATE_LIBRARIES)' else: path = '$($(GYP_VAR_PREFIX)TARGET_OUT_INTERMEDIATE_LIBRARIES)' else: # Other targets just get built into their intermediate dir. if self.toolset == 'host': path = ('$(call intermediates-dir-for,%s,%s,true,,' '$(GYP_HOST_VAR_PREFIX))' % (self.android_class, self.android_module)) else: path = ('$(call intermediates-dir-for,%s,%s,,,$(GYP_VAR_PREFIX))' % (self.android_class, self.android_module)) assert spec.get('product_dir') is None # TODO: not supported? return os.path.join(path, self.ComputeOutputBasename(spec)) def NormalizeIncludePaths(self, include_paths): """ Normalize include_paths. Convert absolute paths to relative to the Android top directory. Args: include_paths: A list of unprocessed include paths. Returns: A list of normalized include paths. """ normalized = [] for path in include_paths: if path[0] == '/': path = gyp.common.RelativePath(path, self.android_top_dir) normalized.append(path) return normalized def ExtractIncludesFromCFlags(self, cflags): """Extract includes "-I..." out from cflags Args: cflags: A list of compiler flags, which may be mixed with "-I.." Returns: A tuple of lists: (clean_cflags, include_paths). "-I.." is trimmed. """ clean_cflags = [] include_paths = [] for flag in cflags: if flag.startswith('-I'): include_paths.append(flag[2:]) else: clean_cflags.append(flag) return (clean_cflags, include_paths) def FilterLibraries(self, libraries): """Filter the 'libraries' key to separate things that shouldn't be ldflags. Library entries that look like filenames should be converted to android module names instead of being passed to the linker as flags. Args: libraries: the value of spec.get('libraries') Returns: A tuple (static_lib_modules, dynamic_lib_modules, ldflags) """ static_lib_modules = [] dynamic_lib_modules = [] ldflags = [] for libs in libraries: # Libs can have multiple words. for lib in libs.split(): # Filter the system libraries, which are added by default by the Android # build system. if (lib == '-lc' or lib == '-lstdc++' or lib == '-lm' or lib.endswith('libgcc.a')): continue match = re.search(r'([^/]+)\.a$', lib) if match: static_lib_modules.append(match.group(1)) continue match = re.search(r'([^/]+)\.so$', lib) if match: dynamic_lib_modules.append(match.group(1)) continue if lib.startswith('-l'): ldflags.append(lib) return (static_lib_modules, dynamic_lib_modules, ldflags) def ComputeDeps(self, spec): """Compute the dependencies of a gyp spec. Returns a tuple (deps, link_deps), where each is a list of filenames that will need to be put in front of make for either building (deps) or linking (link_deps).
""" deps = [] link_deps = [] if 'dependencies' in spec: deps.extend([target_outputs[dep] for dep in spec['dependencies'] if target_outputs[dep]]) for dep in spec['dependencies']: if dep in target_link_deps: link_deps.append(target_link_deps[dep]) deps.extend(link_deps) return (gyp.common.uniquer(deps), gyp.common.uniquer(link_deps)) def WriteTargetFlags(self, spec, configs, link_deps): """Write Makefile code to specify the link flags and library dependencies. spec, configs: input from gyp. link_deps: link dependency list; see ComputeDeps() """ # Libraries (i.e. -lfoo) # These must be included even for static libraries as some of them provide # implicit include paths through the build system. libraries = gyp.common.uniquer(spec.get('libraries', [])) static_libs, dynamic_libs, ldflags_libs = self.FilterLibraries(libraries) if self.type != 'static_library': for configname, config in sorted(configs.iteritems()): ldflags = list(config.get('ldflags', [])) self.WriteLn('') self.WriteList(ldflags, 'LOCAL_LDFLAGS_%s' % configname) self.WriteList(ldflags_libs, 'LOCAL_GYP_LIBS') self.WriteLn('LOCAL_LDFLAGS := $(LOCAL_LDFLAGS_$(GYP_CONFIGURATION)) ' '$(LOCAL_GYP_LIBS)') # Link dependencies (i.e. other gyp targets this target depends on) # These need not be included for static libraries as within the gyp build # we do not use the implicit include path mechanism. if self.type != 'static_library': static_link_deps = [x[1] for x in link_deps if x[0] == 'static'] shared_link_deps = [x[1] for x in link_deps if x[0] == 'shared'] else: static_link_deps = [] shared_link_deps = [] # Only write the lists if they are non-empty. if static_libs or static_link_deps: self.WriteLn('') self.WriteList(static_libs + static_link_deps, 'LOCAL_STATIC_LIBRARIES') self.WriteLn('# Enable grouping to fix circular references') self.WriteLn('LOCAL_GROUP_STATIC_LIBRARIES := true') if dynamic_libs or shared_link_deps: self.WriteLn('') self.WriteList(dynamic_libs + shared_link_deps, 'LOCAL_SHARED_LIBRARIES') def WriteTarget(self, spec, configs, deps, link_deps, part_of_all, write_alias_target): """Write Makefile code to produce the final target of the gyp spec. spec, configs: input from gyp. deps, link_deps: dependency lists; see ComputeDeps() part_of_all: flag indicating this target is part of 'all' write_alias_target: flag indicating whether to create short aliases for this target """ self.WriteLn('### Rules for final target.') if self.type != 'none': self.WriteTargetFlags(spec, configs, link_deps) settings = spec.get('aosp_build_settings', {}) if settings: self.WriteLn('### Set directly by aosp_build_settings.') for k, v in settings.iteritems(): if isinstance(v, list): self.WriteList(v, k) else: self.WriteLn('%s := %s' % (k, make.QuoteIfNecessary(v))) self.WriteLn('') # Add to the set of targets which represent the gyp 'all' target. We use the # name 'gyp_all_modules' as the Android build system doesn't allow the use # of the Make target 'all' and because 'all_modules' is the equivalent of # the Make target 'all' on Android. if part_of_all and write_alias_target: self.WriteLn('# Add target alias to "gyp_all_modules" target.') self.WriteLn('.PHONY: gyp_all_modules') self.WriteLn('gyp_all_modules: %s' % self.android_module) self.WriteLn('') # Add an alias from the gyp target name to the Android module name. This # simplifies manual builds of the target, and is required by the test # framework. 
if self.target != self.android_module and write_alias_target: self.WriteLn('# Alias gyp target name.') self.WriteLn('.PHONY: %s' % self.target) self.WriteLn('%s: %s' % (self.target, self.android_module)) self.WriteLn('') # Add the command to trigger build of the target type depending # on the toolset. Ex: BUILD_STATIC_LIBRARY vs. BUILD_HOST_STATIC_LIBRARY # NOTE: This has to come last! modifier = '' if self.toolset == 'host': modifier = 'HOST_' if self.type == 'static_library': self.WriteLn('include $(BUILD_%sSTATIC_LIBRARY)' % modifier) elif self.type == 'shared_library': self.WriteLn('LOCAL_PRELINK_MODULE := false') self.WriteLn('include $(BUILD_%sSHARED_LIBRARY)' % modifier) elif self.type == 'executable': # Executables are for build and test purposes only, so they're installed # to a directory that doesn't get included in the system image. self.WriteLn('LOCAL_MODULE_PATH := $(gyp_shared_intermediate_dir)') self.WriteLn('include $(BUILD_%sEXECUTABLE)' % modifier) else: self.WriteLn('LOCAL_MODULE_PATH := $(PRODUCT_OUT)/gyp_stamp') self.WriteLn('LOCAL_UNINSTALLABLE_MODULE := true') if self.toolset == 'target': self.WriteLn('LOCAL_2ND_ARCH_VAR_PREFIX := $(GYP_VAR_PREFIX)') else: self.WriteLn('LOCAL_2ND_ARCH_VAR_PREFIX := $(GYP_HOST_VAR_PREFIX)') self.WriteLn() self.WriteLn('include $(BUILD_SYSTEM)/base_rules.mk') self.WriteLn() self.WriteLn('$(LOCAL_BUILT_MODULE): $(LOCAL_ADDITIONAL_DEPENDENCIES)') self.WriteLn('\t$(hide) echo "Gyp timestamp: $@"') self.WriteLn('\t$(hide) mkdir -p $(dir $@)') self.WriteLn('\t$(hide) touch $@') self.WriteLn() self.WriteLn('LOCAL_2ND_ARCH_VAR_PREFIX :=') def WriteList(self, value_list, variable=None, prefix='', quoter=make.QuoteIfNecessary, local_pathify=False): """Write a variable definition that is a list of values. E.g. WriteList(['a','b'], 'foo', prefix='blah') writes out foo = blaha blahb but in a pretty-printed style. """ values = '' if value_list: value_list = [quoter(prefix + l) for l in value_list] if local_pathify: value_list = [self.LocalPathify(l) for l in value_list] values = ' \\\n\t' + ' \\\n\t'.join(value_list) self.fp.write('%s :=%s\n\n' % (variable, values)) def WriteLn(self, text=''): self.fp.write(text + '\n') def LocalPathify(self, path): """Convert a subdirectory-relative path into a normalized path which starts with the make variable $(LOCAL_PATH) (i.e. the top of the project tree). Absolute paths, or paths that contain variables, are just normalized.""" if '$(' in path or os.path.isabs(path): # path is not a file in the project tree in this case, but calling # normpath is still important for trimming trailing slashes. return os.path.normpath(path) local_path = os.path.join('$(LOCAL_PATH)', self.path, path) local_path = os.path.normpath(local_path) # Check that normalizing the path didn't ../ itself out of $(LOCAL_PATH) # - i.e. that the resulting path is still inside the project tree. The # path may legitimately have ended up containing just $(LOCAL_PATH), though, # so we don't look for a slash. assert local_path.startswith('$(LOCAL_PATH)'), ( 'Path %s attempts to escape from gyp path %s !)' % (path, self.path)) return local_path def ExpandInputRoot(self, template, expansion, dirname): if '%(INPUT_ROOT)s' not in template and '%(INPUT_DIRNAME)s' not in template: return template path = template % { 'INPUT_ROOT': expansion, 'INPUT_DIRNAME': dirname, } return os.path.normpath(path) def PerformBuild(data, configurations, params): # The android backend only supports the default configuration. 
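# A standalone sketch of LocalPathify() above: project-relative paths are
# re-rooted under $(LOCAL_PATH) and must not normalize their way out of the
# project tree. posixpath is used here for a deterministic demo; the real
# method uses os.path.
import os
import posixpath

def local_pathify(base, path):
    if '$(' in path or os.path.isabs(path):
        return posixpath.normpath(path)
    local = posixpath.normpath(posixpath.join('$(LOCAL_PATH)', base, path))
    assert local.startswith('$(LOCAL_PATH)'), (
        'Path %s attempts to escape from gyp path %s !)' % (path, base))
    return local

assert local_pathify('foo', 'bar/a.c') == '$(LOCAL_PATH)/foo/bar/a.c'
assert local_pathify('foo', '../foo2/a.c') == '$(LOCAL_PATH)/foo2/a.c'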
options = params['options'] makefile = os.path.abspath(os.path.join(options.toplevel_dir, 'GypAndroid.mk')) env = dict(os.environ) env['ONE_SHOT_MAKEFILE'] = makefile arguments = ['make', '-C', os.environ['ANDROID_BUILD_TOP'], 'gyp_all_modules'] print 'Building: %s' % arguments subprocess.check_call(arguments, env=env) def GenerateOutput(target_list, target_dicts, data, params): options = params['options'] generator_flags = params.get('generator_flags', {}) builddir_name = generator_flags.get('output_dir', 'out') limit_to_target_all = generator_flags.get('limit_to_target_all', False) write_alias_targets = generator_flags.get('write_alias_targets', True) sdk_version = generator_flags.get('aosp_sdk_version', 19) android_top_dir = os.environ.get('ANDROID_BUILD_TOP') assert android_top_dir, '$ANDROID_BUILD_TOP not set; you need to run lunch.' def CalculateMakefilePath(build_file, base_name): """Determine where to write a Makefile for a given gyp file.""" # Paths in gyp files are relative to the .gyp file, but we want # paths relative to the source root for the master makefile. Grab # the path of the .gyp file as the base to relativize against. # E.g. "foo/bar" when we're constructing targets for "foo/bar/baz.gyp". base_path = gyp.common.RelativePath(os.path.dirname(build_file), options.depth) # We write the file in the base_path directory. output_file = os.path.join(options.depth, base_path, base_name) assert not options.generator_output, ( 'The Android backend does not support options.generator_output.') base_path = gyp.common.RelativePath(os.path.dirname(build_file), options.toplevel_dir) return base_path, output_file # TODO: search for the first non-'Default' target. This can go # away when we add verification that all targets have the # necessary configurations. default_configuration = None toolsets = set([target_dicts[target]['toolset'] for target in target_list]) for target in target_list: spec = target_dicts[target] if spec['default_configuration'] != 'Default': default_configuration = spec['default_configuration'] break if not default_configuration: default_configuration = 'Default' srcdir = '.' makefile_name = 'GypAndroid' + options.suffix + '.mk' makefile_path = os.path.join(options.toplevel_dir, makefile_name) assert not options.generator_output, ( 'The Android backend does not support options.generator_output.') gyp.common.EnsureDirExists(makefile_path) root_makefile = open(makefile_path, 'w') root_makefile.write(header) # We set LOCAL_PATH just once, here, to the top of the project tree. This # allows all the other paths we use to be relative to the Android.mk file, # as the Android build system expects. root_makefile.write('\nLOCAL_PATH := $(call my-dir)\n') # Find the list of targets that derive from the gyp file(s) being built. needed_targets = set() for build_file in params['build_files']: for target in gyp.common.AllTargets(target_list, target_dicts, build_file): needed_targets.add(target) build_files = set() include_list = set() android_modules = {} for qualified_target in target_list: build_file, target, toolset = gyp.common.ParseQualifiedTarget( qualified_target) relative_build_file = gyp.common.RelativePath(build_file, options.toplevel_dir) build_files.add(relative_build_file) included_files = data[build_file]['included_files'] for included_file in included_files: # The included_files entries are relative to the dir of the build file # that included them, so we have to undo that and then make them relative # to the root dir. 
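# A small sketch of what CalculateMakefilePath() above computes for a
# hypothetical build file: the per-target .mk lands next to its .gyp file,
# expressed relative to the source root (gyp.common.RelativePath is
# approximated with os.path.relpath here).
import os

def makefile_path(build_file, depth, base_name):
    base_path = os.path.relpath(os.path.dirname(build_file), depth)
    return base_path, os.path.join(depth, base_path, base_name)

print(makefile_path('src/foo/foo.gyp', '.', 'foo.target.mk'))
# -> ('src/foo', './src/foo/foo.target.mk') on a POSIX host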
relative_include_file = gyp.common.RelativePath( gyp.common.UnrelativePath(included_file, build_file), options.toplevel_dir) abs_include_file = os.path.abspath(relative_include_file) # If the include file is from the ~/.gyp dir, we should use absolute path # so that relocating the src dir doesn't break the path. if (params['home_dot_gyp'] and abs_include_file.startswith(params['home_dot_gyp'])): build_files.add(abs_include_file) else: build_files.add(relative_include_file) base_path, output_file = CalculateMakefilePath(build_file, target + '.' + toolset + options.suffix + '.mk') spec = target_dicts[qualified_target] configs = spec['configurations'] part_of_all = qualified_target in needed_targets if limit_to_target_all and not part_of_all: continue relative_target = gyp.common.QualifiedTarget(relative_build_file, target, toolset) writer = AndroidMkWriter(android_top_dir) android_module = writer.Write(qualified_target, relative_target, base_path, output_file, spec, configs, part_of_all=part_of_all, write_alias_target=write_alias_targets, sdk_version=sdk_version) if android_module in android_modules: print ('ERROR: Android module names must be unique. The following ' 'targets both generate Android module name %s.\n %s\n %s' % (android_module, android_modules[android_module], qualified_target)) return android_modules[android_module] = qualified_target # Our root_makefile lives at the source root. Compute the relative path # from there to the output_file for including. mkfile_rel_path = gyp.common.RelativePath(output_file, os.path.dirname(makefile_path)) include_list.add(mkfile_rel_path) root_makefile.write('GYP_CONFIGURATION ?= %s\n' % default_configuration) root_makefile.write('GYP_VAR_PREFIX ?=\n') root_makefile.write('GYP_HOST_VAR_PREFIX ?=\n') root_makefile.write('GYP_HOST_MULTILIB ?=\n') # Write out the sorted list of includes. root_makefile.write('\n') for include_file in sorted(include_list): root_makefile.write('include $(LOCAL_PATH)/' + include_file + '\n') root_makefile.write('\n') if write_alias_targets: root_makefile.write(ALL_MODULES_FOOTER) root_makefile.close() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/cmake.py0000644000000000000000000012707412631326456024041 0ustar 00000000000000# Copyright (c) 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """cmake output module This module is under development and should be considered experimental. This module produces cmake (2.8.8+) input as its output. One CMakeLists.txt is created for each configuration. This module's original purpose was to support editing in IDEs like KDevelop which use CMake for project management. It is also possible to use CMake to generate projects for other IDEs such as eclipse cdt and code::blocks. QtCreator will convert the CMakeLists.txt to a code::blocks cbp for the editor to read, but build using CMake. As a result QtCreator editor is unaware of compiler defines. The generated CMakeLists.txt can also be used to build on Linux. There is currently no support for building on platforms other than Linux. The generated CMakeLists.txt should properly compile all projects. However, there is a mismatch between gyp and cmake with regard to linking. All attempts are made to work around this, but CMake sometimes sees -Wl,--start-group as a library and incorrectly repeats it. As a result the output of this generator should not be relied on for building. When using with kdevelop, use version 4.4+. 
Previous versions of kdevelop will not be able to find the header file directories described in the generated CMakeLists.txt file. """ import multiprocessing import os import signal import string import subprocess import gyp.common generator_default_variables = { 'EXECUTABLE_PREFIX': '', 'EXECUTABLE_SUFFIX': '', 'STATIC_LIB_PREFIX': 'lib', 'STATIC_LIB_SUFFIX': '.a', 'SHARED_LIB_PREFIX': 'lib', 'SHARED_LIB_SUFFIX': '.so', 'SHARED_LIB_DIR': '${builddir}/lib.${TOOLSET}', 'LIB_DIR': '${obj}.${TOOLSET}', 'INTERMEDIATE_DIR': '${obj}.${TOOLSET}/${TARGET}/geni', 'SHARED_INTERMEDIATE_DIR': '${obj}/gen', 'PRODUCT_DIR': '${builddir}', 'RULE_INPUT_PATH': '${RULE_INPUT_PATH}', 'RULE_INPUT_DIRNAME': '${RULE_INPUT_DIRNAME}', 'RULE_INPUT_NAME': '${RULE_INPUT_NAME}', 'RULE_INPUT_ROOT': '${RULE_INPUT_ROOT}', 'RULE_INPUT_EXT': '${RULE_INPUT_EXT}', 'CONFIGURATION_NAME': '${configuration}', } FULL_PATH_VARS = ('${CMAKE_CURRENT_LIST_DIR}', '${builddir}', '${obj}') generator_supports_multiple_toolsets = True generator_wants_static_library_dependencies_adjusted = True COMPILABLE_EXTENSIONS = { '.c': 'cc', '.cc': 'cxx', '.cpp': 'cxx', '.cxx': 'cxx', '.s': 's', # cc '.S': 's', # cc } def RemovePrefix(a, prefix): """Returns 'a' without 'prefix' if it starts with 'prefix'.""" return a[len(prefix):] if a.startswith(prefix) else a def CalculateVariables(default_variables, params): """Calculate additional variables for use in the build (called by gyp).""" default_variables.setdefault('OS', gyp.common.GetFlavor(params)) def Compilable(filename): """Return true if the file is compilable (should be in OBJS).""" return any(filename.endswith(e) for e in COMPILABLE_EXTENSIONS) def Linkable(filename): """Return true if the file is linkable (should be on the link line).""" return filename.endswith('.o') def NormjoinPathForceCMakeSource(base_path, rel_path): """Resolves rel_path against base_path and returns the result. If rel_path is an absolute path it is returned unchanged. Otherwise it is resolved against base_path and normalized. If the result is a relative path, it is forced to be relative to the CMakeLists.txt. """ if os.path.isabs(rel_path): return rel_path if any([rel_path.startswith(var) for var in FULL_PATH_VARS]): return rel_path # TODO: do we need to check base_path for absolute variables as well? return os.path.join('${CMAKE_CURRENT_LIST_DIR}', os.path.normpath(os.path.join(base_path, rel_path))) def NormjoinPath(base_path, rel_path): """Resolves rel_path against base_path and returns the result. TODO: what is this really used for? If rel_path begins with '$' it is returned unchanged. Otherwise it is resolved against base_path if relative, then normalized. """ if rel_path.startswith('$') and not rel_path.startswith('${configuration}'): return rel_path return os.path.normpath(os.path.join(base_path, rel_path)) def CMakeStringEscape(a): """Escapes the string 'a' for use inside a CMake string. 
This means escaping '\' otherwise it may be seen as modifying the next character '"' otherwise it will end the string ';' otherwise the string becomes a list The following do not need to be escaped '#' when the lexer is in string state, this does not start a comment The following are yet unknown '$' generator variables (like ${obj}) must not be escaped, but text $ should be escaped what is wanted is to know which $ come from generator variables """ return a.replace('\\', '\\\\').replace(';', '\\;').replace('"', '\\"') def SetFileProperty(output, source_name, property_name, values, sep): """Given a source file, sets the given property on it.""" output.write('set_source_files_properties(') output.write(source_name) output.write(' PROPERTIES ') output.write(property_name) output.write(' "') for value in values: output.write(CMakeStringEscape(value)) output.write(sep) output.write('")\n') def SetFilesProperty(output, variable, property_name, values, sep): """Given a set of source files, sets the given property on them.""" output.write('set_source_files_properties(') WriteVariable(output, variable) output.write(' PROPERTIES ') output.write(property_name) output.write(' "') for value in values: output.write(CMakeStringEscape(value)) output.write(sep) output.write('")\n') def SetTargetProperty(output, target_name, property_name, values, sep=''): """Given a target, sets the given property.""" output.write('set_target_properties(') output.write(target_name) output.write(' PROPERTIES ') output.write(property_name) output.write(' "') for value in values: output.write(CMakeStringEscape(value)) output.write(sep) output.write('")\n') def SetVariable(output, variable_name, value): """Sets a CMake variable.""" output.write('set(') output.write(variable_name) output.write(' "') output.write(CMakeStringEscape(value)) output.write('")\n') def SetVariableList(output, variable_name, values): """Sets a CMake variable to a list.""" if not values: return SetVariable(output, variable_name, "") if len(values) == 1: return SetVariable(output, variable_name, values[0]) output.write('list(APPEND ') output.write(variable_name) output.write('\n "') output.write('"\n "'.join([CMakeStringEscape(value) for value in values])) output.write('")\n') def UnsetVariable(output, variable_name): """Unsets a CMake variable.""" output.write('unset(') output.write(variable_name) output.write(')\n') def WriteVariable(output, variable_name, prepend=None): if prepend: output.write(prepend) output.write('${') output.write(variable_name) output.write('}') class CMakeTargetType(object): def __init__(self, command, modifier, property_modifier): self.command = command self.modifier = modifier self.property_modifier = property_modifier cmake_target_type_from_gyp_target_type = { 'executable': CMakeTargetType('add_executable', None, 'RUNTIME'), 'static_library': CMakeTargetType('add_library', 'STATIC', 'ARCHIVE'), 'shared_library': CMakeTargetType('add_library', 'SHARED', 'LIBRARY'), 'loadable_module': CMakeTargetType('add_library', 'MODULE', 'LIBRARY'), 'none': CMakeTargetType('add_custom_target', 'SOURCES', None), } def StringToCMakeTargetName(a): """Converts the given string 'a' to a valid CMake target name. All invalid characters are replaced by '_'. Invalid for cmake: ' ', '/', '(', ')', '"' Invalid for make: ':' Invalid for unknown reasons but cause failures: '.'
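Example added for illustration (doctest-style; not in the original docstring):

  >>> StringToCMakeTargetName('foo/bar (test).gyp')
  'foo_bar__test__gyp'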
""" return a.translate(string.maketrans(' /():."', '_______')) def WriteActions(target_name, actions, extra_sources, extra_deps, path_to_gyp, output): """Write CMake for the 'actions' in the target. Args: target_name: the name of the CMake target being generated. actions: the Gyp 'actions' dict for this target. extra_sources: [(, )] to append with generated source files. extra_deps: [] to append with generated targets. path_to_gyp: relative path from CMakeLists.txt being generated to the Gyp file in which the target being generated is defined. """ for action in actions: action_name = StringToCMakeTargetName(action['action_name']) action_target_name = '%s__%s' % (target_name, action_name) inputs = action['inputs'] inputs_name = action_target_name + '__input' SetVariableList(output, inputs_name, [NormjoinPathForceCMakeSource(path_to_gyp, dep) for dep in inputs]) outputs = action['outputs'] cmake_outputs = [NormjoinPathForceCMakeSource(path_to_gyp, out) for out in outputs] outputs_name = action_target_name + '__output' SetVariableList(output, outputs_name, cmake_outputs) # Build up a list of outputs. # Collect the output dirs we'll need. dirs = set(dir for dir in (os.path.dirname(o) for o in outputs) if dir) if int(action.get('process_outputs_as_sources', False)): extra_sources.extend(zip(cmake_outputs, outputs)) # add_custom_command output.write('add_custom_command(OUTPUT ') WriteVariable(output, outputs_name) output.write('\n') if len(dirs) > 0: for directory in dirs: output.write(' COMMAND ${CMAKE_COMMAND} -E make_directory ') output.write(directory) output.write('\n') output.write(' COMMAND ') output.write(gyp.common.EncodePOSIXShellList(action['action'])) output.write('\n') output.write(' DEPENDS ') WriteVariable(output, inputs_name) output.write('\n') output.write(' WORKING_DIRECTORY ${CMAKE_CURRENT_LIST_DIR}/') output.write(path_to_gyp) output.write('\n') output.write(' COMMENT ') if 'message' in action: output.write(action['message']) else: output.write(action_target_name) output.write('\n') output.write(' VERBATIM\n') output.write(')\n') # add_custom_target output.write('add_custom_target(') output.write(action_target_name) output.write('\n DEPENDS ') WriteVariable(output, outputs_name) output.write('\n SOURCES ') WriteVariable(output, inputs_name) output.write('\n)\n') extra_deps.append(action_target_name) def NormjoinRulePathForceCMakeSource(base_path, rel_path, rule_source): if rel_path.startswith(("${RULE_INPUT_PATH}","${RULE_INPUT_DIRNAME}")): if any([rule_source.startswith(var) for var in FULL_PATH_VARS]): return rel_path return NormjoinPathForceCMakeSource(base_path, rel_path) def WriteRules(target_name, rules, extra_sources, extra_deps, path_to_gyp, output): """Write CMake for the 'rules' in the target. Args: target_name: the name of the CMake target being generated. actions: the Gyp 'actions' dict for this target. extra_sources: [(, )] to append with generated source files. extra_deps: [] to append with generated targets. path_to_gyp: relative path from CMakeLists.txt being generated to the Gyp file in which the target being generated is defined. 
""" for rule in rules: rule_name = StringToCMakeTargetName(target_name + '__' + rule['rule_name']) inputs = rule.get('inputs', []) inputs_name = rule_name + '__input' SetVariableList(output, inputs_name, [NormjoinPathForceCMakeSource(path_to_gyp, dep) for dep in inputs]) outputs = rule['outputs'] var_outputs = [] for count, rule_source in enumerate(rule.get('rule_sources', [])): action_name = rule_name + '_' + str(count) rule_source_dirname, rule_source_basename = os.path.split(rule_source) rule_source_root, rule_source_ext = os.path.splitext(rule_source_basename) SetVariable(output, 'RULE_INPUT_PATH', rule_source) SetVariable(output, 'RULE_INPUT_DIRNAME', rule_source_dirname) SetVariable(output, 'RULE_INPUT_NAME', rule_source_basename) SetVariable(output, 'RULE_INPUT_ROOT', rule_source_root) SetVariable(output, 'RULE_INPUT_EXT', rule_source_ext) # Build up a list of outputs. # Collect the output dirs we'll need. dirs = set(dir for dir in (os.path.dirname(o) for o in outputs) if dir) # Create variables for the output, as 'local' variable will be unset. these_outputs = [] for output_index, out in enumerate(outputs): output_name = action_name + '_' + str(output_index) SetVariable(output, output_name, NormjoinRulePathForceCMakeSource(path_to_gyp, out, rule_source)) if int(rule.get('process_outputs_as_sources', False)): extra_sources.append(('${' + output_name + '}', out)) these_outputs.append('${' + output_name + '}') var_outputs.append('${' + output_name + '}') # add_custom_command output.write('add_custom_command(OUTPUT\n') for out in these_outputs: output.write(' ') output.write(out) output.write('\n') for directory in dirs: output.write(' COMMAND ${CMAKE_COMMAND} -E make_directory ') output.write(directory) output.write('\n') output.write(' COMMAND ') output.write(gyp.common.EncodePOSIXShellList(rule['action'])) output.write('\n') output.write(' DEPENDS ') WriteVariable(output, inputs_name) output.write(' ') output.write(NormjoinPath(path_to_gyp, rule_source)) output.write('\n') # CMAKE_CURRENT_LIST_DIR is where the CMakeLists.txt lives. # The cwd is the current build directory. output.write(' WORKING_DIRECTORY ${CMAKE_CURRENT_LIST_DIR}/') output.write(path_to_gyp) output.write('\n') output.write(' COMMENT ') if 'message' in rule: output.write(rule['message']) else: output.write(action_name) output.write('\n') output.write(' VERBATIM\n') output.write(')\n') UnsetVariable(output, 'RULE_INPUT_PATH') UnsetVariable(output, 'RULE_INPUT_DIRNAME') UnsetVariable(output, 'RULE_INPUT_NAME') UnsetVariable(output, 'RULE_INPUT_ROOT') UnsetVariable(output, 'RULE_INPUT_EXT') # add_custom_target output.write('add_custom_target(') output.write(rule_name) output.write(' DEPENDS\n') for out in var_outputs: output.write(' ') output.write(out) output.write('\n') output.write('SOURCES ') WriteVariable(output, inputs_name) output.write('\n') for rule_source in rule.get('rule_sources', []): output.write(' ') output.write(NormjoinPath(path_to_gyp, rule_source)) output.write('\n') output.write(')\n') extra_deps.append(rule_name) def WriteCopies(target_name, copies, extra_deps, path_to_gyp, output): """Write CMake for the 'copies' in the target. Args: target_name: the name of the CMake target being generated. actions: the Gyp 'actions' dict for this target. extra_deps: [] to append with generated targets. path_to_gyp: relative path from CMakeLists.txt being generated to the Gyp file in which the target being generated is defined. 
""" copy_name = target_name + '__copies' # CMake gets upset with custom targets with OUTPUT which specify no output. have_copies = any(copy['files'] for copy in copies) if not have_copies: output.write('add_custom_target(') output.write(copy_name) output.write(')\n') extra_deps.append(copy_name) return class Copy(object): def __init__(self, ext, command): self.cmake_inputs = [] self.cmake_outputs = [] self.gyp_inputs = [] self.gyp_outputs = [] self.ext = ext self.inputs_name = None self.outputs_name = None self.command = command file_copy = Copy('', 'copy') dir_copy = Copy('_dirs', 'copy_directory') for copy in copies: files = copy['files'] destination = copy['destination'] for src in files: path = os.path.normpath(src) basename = os.path.split(path)[1] dst = os.path.join(destination, basename) copy = file_copy if os.path.basename(src) else dir_copy copy.cmake_inputs.append(NormjoinPathForceCMakeSource(path_to_gyp, src)) copy.cmake_outputs.append(NormjoinPathForceCMakeSource(path_to_gyp, dst)) copy.gyp_inputs.append(src) copy.gyp_outputs.append(dst) for copy in (file_copy, dir_copy): if copy.cmake_inputs: copy.inputs_name = copy_name + '__input' + copy.ext SetVariableList(output, copy.inputs_name, copy.cmake_inputs) copy.outputs_name = copy_name + '__output' + copy.ext SetVariableList(output, copy.outputs_name, copy.cmake_outputs) # add_custom_command output.write('add_custom_command(\n') output.write('OUTPUT') for copy in (file_copy, dir_copy): if copy.outputs_name: WriteVariable(output, copy.outputs_name, ' ') output.write('\n') for copy in (file_copy, dir_copy): for src, dst in zip(copy.gyp_inputs, copy.gyp_outputs): # 'cmake -E copy src dst' will create the 'dst' directory if needed. output.write('COMMAND ${CMAKE_COMMAND} -E %s ' % copy.command) output.write(src) output.write(' ') output.write(dst) output.write("\n") output.write('DEPENDS') for copy in (file_copy, dir_copy): if copy.inputs_name: WriteVariable(output, copy.inputs_name, ' ') output.write('\n') output.write('WORKING_DIRECTORY ${CMAKE_CURRENT_LIST_DIR}/') output.write(path_to_gyp) output.write('\n') output.write('COMMENT Copying for ') output.write(target_name) output.write('\n') output.write('VERBATIM\n') output.write(')\n') # add_custom_target output.write('add_custom_target(') output.write(copy_name) output.write('\n DEPENDS') for copy in (file_copy, dir_copy): if copy.outputs_name: WriteVariable(output, copy.outputs_name, ' ') output.write('\n SOURCES') if file_copy.inputs_name: WriteVariable(output, file_copy.inputs_name, ' ') output.write('\n)\n') extra_deps.append(copy_name) def CreateCMakeTargetBaseName(qualified_target): """This is the name we would like the target to have.""" _, gyp_target_name, gyp_target_toolset = ( gyp.common.ParseQualifiedTarget(qualified_target)) cmake_target_base_name = gyp_target_name if gyp_target_toolset and gyp_target_toolset != 'target': cmake_target_base_name += '_' + gyp_target_toolset return StringToCMakeTargetName(cmake_target_base_name) def CreateCMakeTargetFullName(qualified_target): """An unambiguous name for the target.""" gyp_file, gyp_target_name, gyp_target_toolset = ( gyp.common.ParseQualifiedTarget(qualified_target)) cmake_target_full_name = gyp_file + ':' + gyp_target_name if gyp_target_toolset and gyp_target_toolset != 'target': cmake_target_full_name += '_' + gyp_target_toolset return StringToCMakeTargetName(cmake_target_full_name) class CMakeNamer(object): """Converts Gyp target names into CMake target names. CMake requires that target names be globally unique. 
One way to ensure this is to fully qualify the names of the targets. Unfortunately, this ends up with all targets looking like "chrome_chrome_gyp_chrome" instead of just "chrome". If this generator were only interested in building, it would be possible to fully qualify all target names, then create unqualified target names which depend on all qualified targets which should have had that name. This is more or less what the 'make' generator does with aliases. However, one goal of this generator is to create CMake files for use with IDEs, and fully qualified names are not as user friendly. Since target name collision is rare, we do the above only when required. Toolset variants are always qualified from the base, as this is required for building. However, it also makes sense for an IDE, as it is possible for defines to be different. """ def __init__(self, target_list): self.cmake_target_base_names_conflicting = set() cmake_target_base_names_seen = set() for qualified_target in target_list: cmake_target_base_name = CreateCMakeTargetBaseName(qualified_target) if cmake_target_base_name not in cmake_target_base_names_seen: cmake_target_base_names_seen.add(cmake_target_base_name) else: self.cmake_target_base_names_conflicting.add(cmake_target_base_name) def CreateCMakeTargetName(self, qualified_target): base_name = CreateCMakeTargetBaseName(qualified_target) if base_name in self.cmake_target_base_names_conflicting: return CreateCMakeTargetFullName(qualified_target) return base_name def WriteTarget(namer, qualified_target, target_dicts, build_dir, config_to_use, options, generator_flags, all_qualified_targets, output): # The make generator does this always. # TODO: It would be nice to be able to tell CMake all dependencies. circular_libs = generator_flags.get('circular', True) if not generator_flags.get('standalone', False): output.write('\n#') output.write(qualified_target) output.write('\n') gyp_file, _, _ = gyp.common.ParseQualifiedTarget(qualified_target) rel_gyp_file = gyp.common.RelativePath(gyp_file, options.toplevel_dir) rel_gyp_dir = os.path.dirname(rel_gyp_file) # Relative path from build dir to top dir. build_to_top = gyp.common.InvertRelativePath(build_dir, options.toplevel_dir) # Relative path from build dir to gyp dir. build_to_gyp = os.path.join(build_to_top, rel_gyp_dir) path_from_cmakelists_to_gyp = build_to_gyp spec = target_dicts.get(qualified_target, {}) config = spec.get('configurations', {}).get(config_to_use, {}) target_name = spec.get('target_name', '') target_type = spec.get('type', '') target_toolset = spec.get('toolset') cmake_target_type = cmake_target_type_from_gyp_target_type.get(target_type) if cmake_target_type is None: print ('Target %s has unknown target type %s, skipping.' % ( target_name, target_type ) ) return SetVariable(output, 'TARGET', target_name) SetVariable(output, 'TOOLSET', target_toolset) cmake_target_name = namer.CreateCMakeTargetName(qualified_target) extra_sources = [] extra_deps = [] # Actions must come first, since they can generate more OBJs for use below. if 'actions' in spec: WriteActions(cmake_target_name, spec['actions'], extra_sources, extra_deps, path_from_cmakelists_to_gyp, output) # Rules must be early like actions.
if 'rules' in spec: WriteRules(cmake_target_name, spec['rules'], extra_sources, extra_deps, path_from_cmakelists_to_gyp, output) # Copies if 'copies' in spec: WriteCopies(cmake_target_name, spec['copies'], extra_deps, path_from_cmakelists_to_gyp, output) # Target and sources srcs = spec.get('sources', []) # Gyp separates the sheep from the goats based on file extensions. # A full separation is done here because of flag handling (see below). s_sources = [] c_sources = [] cxx_sources = [] linkable_sources = [] other_sources = [] for src in srcs: _, ext = os.path.splitext(src) src_type = COMPILABLE_EXTENSIONS.get(ext, None) src_norm_path = NormjoinPath(path_from_cmakelists_to_gyp, src) if src_type == 's': s_sources.append(src_norm_path) elif src_type == 'cc': c_sources.append(src_norm_path) elif src_type == 'cxx': cxx_sources.append(src_norm_path) elif Linkable(ext): linkable_sources.append(src_norm_path) else: other_sources.append(src_norm_path) for extra_source in extra_sources: src, real_source = extra_source _, ext = os.path.splitext(real_source) src_type = COMPILABLE_EXTENSIONS.get(ext, None) if src_type == 's': s_sources.append(src) elif src_type == 'cc': c_sources.append(src) elif src_type == 'cxx': cxx_sources.append(src) elif Linkable(ext): linkable_sources.append(src) else: other_sources.append(src) s_sources_name = None if s_sources: s_sources_name = cmake_target_name + '__asm_srcs' SetVariableList(output, s_sources_name, s_sources) c_sources_name = None if c_sources: c_sources_name = cmake_target_name + '__c_srcs' SetVariableList(output, c_sources_name, c_sources) cxx_sources_name = None if cxx_sources: cxx_sources_name = cmake_target_name + '__cxx_srcs' SetVariableList(output, cxx_sources_name, cxx_sources) linkable_sources_name = None if linkable_sources: linkable_sources_name = cmake_target_name + '__linkable_srcs' SetVariableList(output, linkable_sources_name, linkable_sources) other_sources_name = None if other_sources: other_sources_name = cmake_target_name + '__other_srcs' SetVariableList(output, other_sources_name, other_sources) # CMake gets upset when executable targets provide no sources. # http://www.cmake.org/pipermail/cmake/2010-July/038461.html dummy_sources_name = None has_sources = (s_sources_name or c_sources_name or cxx_sources_name or linkable_sources_name or other_sources_name) if target_type == 'executable' and not has_sources: dummy_sources_name = cmake_target_name + '__dummy_srcs' SetVariable(output, dummy_sources_name, "${obj}.${TOOLSET}/${TARGET}/genc/dummy.c") output.write('if(NOT EXISTS "') WriteVariable(output, dummy_sources_name) output.write('")\n') output.write(' file(WRITE "') WriteVariable(output, dummy_sources_name) output.write('" "")\n') output.write("endif()\n") # CMake is opposed to setting linker directories and considers the practice # of setting linker directories dangerous. Instead, it favors the use of # find_library and passing absolute paths to target_link_libraries. # However, CMake does provide the command link_directories, which adds # link directories to targets defined after it is called. # As a result, link_directories must come before the target definition. # CMake unfortunately has no means of removing entries from LINK_DIRECTORIES.
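# (Added sketch with a hypothetical directory: the next block emits a fragment
# of roughly this shape, ahead of the add_* command, since link_directories
# only affects targets defined after the call:
#   link_directories( ../../mylibs )
# )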
library_dirs = config.get('library_dirs') if library_dirs is not None: output.write('link_directories(') for library_dir in library_dirs: output.write(' ') output.write(NormjoinPath(path_from_cmakelists_to_gyp, library_dir)) output.write('\n') output.write(')\n') output.write(cmake_target_type.command) output.write('(') output.write(cmake_target_name) if cmake_target_type.modifier is not None: output.write(' ') output.write(cmake_target_type.modifier) if s_sources_name: WriteVariable(output, s_sources_name, ' ') if c_sources_name: WriteVariable(output, c_sources_name, ' ') if cxx_sources_name: WriteVariable(output, cxx_sources_name, ' ') if linkable_sources_name: WriteVariable(output, linkable_sources_name, ' ') if other_sources_name: WriteVariable(output, other_sources_name, ' ') if dummy_sources_name: WriteVariable(output, dummy_sources_name, ' ') output.write(')\n') # Let CMake know if the 'all' target should depend on this target. exclude_from_all = ('TRUE' if qualified_target not in all_qualified_targets else 'FALSE') SetTargetProperty(output, cmake_target_name, 'EXCLUDE_FROM_ALL', exclude_from_all) for extra_target_name in extra_deps: SetTargetProperty(output, extra_target_name, 'EXCLUDE_FROM_ALL', exclude_from_all) # Output name and location. if target_type != 'none': # Link as 'C' if there are no other files if not c_sources and not cxx_sources: SetTargetProperty(output, cmake_target_name, 'LINKER_LANGUAGE', ['C']) # Mark uncompiled sources as uncompiled. if other_sources_name: output.write('set_source_files_properties(') WriteVariable(output, other_sources_name, '') output.write(' PROPERTIES HEADER_FILE_ONLY "TRUE")\n') # Mark object sources as linkable. if linkable_sources_name: output.write('set_source_files_properties(') WriteVariable(output, linkable_sources_name, '') output.write(' PROPERTIES EXTERNAL_OBJECT "TRUE")\n') # Output directory target_output_directory = spec.get('product_dir') if target_output_directory is None: if target_type in ('executable', 'loadable_module'): target_output_directory = generator_default_variables['PRODUCT_DIR'] elif target_type == 'shared_library': target_output_directory = '${builddir}/lib.${TOOLSET}' elif spec.get('standalone_static_library', False): target_output_directory = generator_default_variables['PRODUCT_DIR'] else: base_path = gyp.common.RelativePath(os.path.dirname(gyp_file), options.toplevel_dir) target_output_directory = '${obj}.${TOOLSET}' target_output_directory = ( os.path.join(target_output_directory, base_path)) cmake_target_output_directory = NormjoinPathForceCMakeSource( path_from_cmakelists_to_gyp, target_output_directory) SetTargetProperty(output, cmake_target_name, cmake_target_type.property_modifier + '_OUTPUT_DIRECTORY', cmake_target_output_directory) # Output name default_product_prefix = '' default_product_name = target_name default_product_ext = '' if target_type == 'static_library': static_library_prefix = generator_default_variables['STATIC_LIB_PREFIX'] default_product_name = RemovePrefix(default_product_name, static_library_prefix) default_product_prefix = static_library_prefix default_product_ext = generator_default_variables['STATIC_LIB_SUFFIX'] elif target_type in ('loadable_module', 'shared_library'): shared_library_prefix = generator_default_variables['SHARED_LIB_PREFIX'] default_product_name = RemovePrefix(default_product_name, shared_library_prefix) default_product_prefix = shared_library_prefix default_product_ext = generator_default_variables['SHARED_LIB_SUFFIX'] elif target_type != 'executable': print
('ERROR: What output file should be generated?', 'type', target_type, 'target', target_name) product_prefix = spec.get('product_prefix', default_product_prefix) product_name = spec.get('product_name', default_product_name) product_ext = spec.get('product_extension') if product_ext: product_ext = '.' + product_ext else: product_ext = default_product_ext SetTargetProperty(output, cmake_target_name, 'PREFIX', product_prefix) SetTargetProperty(output, cmake_target_name, cmake_target_type.property_modifier + '_OUTPUT_NAME', product_name) SetTargetProperty(output, cmake_target_name, 'SUFFIX', product_ext) # Make the output of this target referenceable as a source. cmake_target_output_basename = product_prefix + product_name + product_ext cmake_target_output = os.path.join(cmake_target_output_directory, cmake_target_output_basename) SetFileProperty(output, cmake_target_output, 'GENERATED', ['TRUE'], '') # Includes includes = config.get('include_dirs') if includes: # This (target include directories) is what requires CMake 2.8.8 includes_name = cmake_target_name + '__include_dirs' SetVariableList(output, includes_name, [NormjoinPathForceCMakeSource(path_from_cmakelists_to_gyp, include) for include in includes]) output.write('set_property(TARGET ') output.write(cmake_target_name) output.write(' APPEND PROPERTY INCLUDE_DIRECTORIES ') WriteVariable(output, includes_name, '') output.write(')\n') # Defines defines = config.get('defines') if defines is not None: SetTargetProperty(output, cmake_target_name, 'COMPILE_DEFINITIONS', defines, ';') # Compile Flags - http://www.cmake.org/Bug/view.php?id=6493 # CMake currently does not have target C and CXX flags. # So, instead of doing... # cflags_c = config.get('cflags_c') # if cflags_c is not None: # SetTargetProperty(output, cmake_target_name, # 'C_COMPILE_FLAGS', cflags_c, ' ') # cflags_cc = config.get('cflags_cc') # if cflags_cc is not None: # SetTargetProperty(output, cmake_target_name, # 'CXX_COMPILE_FLAGS', cflags_cc, ' ') # Instead we must... cflags = config.get('cflags', []) cflags_c = config.get('cflags_c', []) cflags_cxx = config.get('cflags_cc', []) if (not cflags_c or not c_sources) and (not cflags_cxx or not cxx_sources): SetTargetProperty(output, cmake_target_name, 'COMPILE_FLAGS', cflags, ' ') elif c_sources and not (s_sources or cxx_sources): flags = [] flags.extend(cflags) flags.extend(cflags_c) SetTargetProperty(output, cmake_target_name, 'COMPILE_FLAGS', flags, ' ') elif cxx_sources and not (s_sources or c_sources): flags = [] flags.extend(cflags) flags.extend(cflags_cxx) SetTargetProperty(output, cmake_target_name, 'COMPILE_FLAGS', flags, ' ') else: # TODO: This is broken, one cannot generally set properties on files, # as other targets may require different properties on the same files. if s_sources and cflags: SetFilesProperty(output, s_sources_name, 'COMPILE_FLAGS', cflags, ' ') if c_sources and (cflags or cflags_c): flags = [] flags.extend(cflags) flags.extend(cflags_c) SetFilesProperty(output, c_sources_name, 'COMPILE_FLAGS', flags, ' ') if cxx_sources and (cflags or cflags_cxx): flags = [] flags.extend(cflags) flags.extend(cflags_cxx) SetFilesProperty(output, cxx_sources_name, 'COMPILE_FLAGS', flags, ' ') # Linker flags ldflags = config.get('ldflags') if ldflags is not None: SetTargetProperty(output, cmake_target_name, 'LINK_FLAGS', ldflags, ' ') # Note on Dependencies and Libraries: # CMake wants to handle link order, resolving the link line up front. # Gyp does not retain or enforce specifying enough information to do so. 
# So do as other gyp generators and use --start-group and --end-group. # Give CMake as little information as possible so that it doesn't mess it up. # Dependencies rawDeps = spec.get('dependencies', []) static_deps = [] shared_deps = [] other_deps = [] for rawDep in rawDeps: dep_cmake_name = namer.CreateCMakeTargetName(rawDep) dep_spec = target_dicts.get(rawDep, {}) dep_target_type = dep_spec.get('type', None) if dep_target_type == 'static_library': static_deps.append(dep_cmake_name) elif dep_target_type == 'shared_library': shared_deps.append(dep_cmake_name) else: other_deps.append(dep_cmake_name) # ensure all external dependencies are complete before internal dependencies # extra_deps currently only depend on their own deps, so otherwise run early if static_deps or shared_deps or other_deps: for extra_dep in extra_deps: output.write('add_dependencies(') output.write(extra_dep) output.write('\n') for deps in (static_deps, shared_deps, other_deps): for dep in gyp.common.uniquer(deps): output.write(' ') output.write(dep) output.write('\n') output.write(')\n') linkable = target_type in ('executable', 'loadable_module', 'shared_library') other_deps.extend(extra_deps) if other_deps or (not linkable and (static_deps or shared_deps)): output.write('add_dependencies(') output.write(cmake_target_name) output.write('\n') for dep in gyp.common.uniquer(other_deps): output.write(' ') output.write(dep) output.write('\n') if not linkable: for deps in (static_deps, shared_deps): for lib_dep in gyp.common.uniquer(deps): output.write(' ') output.write(lib_dep) output.write('\n') output.write(')\n') # Libraries if linkable: external_libs = [lib for lib in spec.get('libraries', []) if len(lib) > 0] if external_libs or static_deps or shared_deps: output.write('target_link_libraries(') output.write(cmake_target_name) output.write('\n') if static_deps: write_group = circular_libs and len(static_deps) > 1 if write_group: output.write('-Wl,--start-group\n') for dep in gyp.common.uniquer(static_deps): output.write(' ') output.write(dep) output.write('\n') if write_group: output.write('-Wl,--end-group\n') if shared_deps: for dep in gyp.common.uniquer(shared_deps): output.write(' ') output.write(dep) output.write('\n') if external_libs: for lib in gyp.common.uniquer(external_libs): output.write(' ') output.write(lib) output.write('\n') output.write(')\n') UnsetVariable(output, 'TOOLSET') UnsetVariable(output, 'TARGET') def GenerateOutputForConfig(target_list, target_dicts, data, params, config_to_use): options = params['options'] generator_flags = params['generator_flags'] # generator_dir: relative path from pwd to where make puts build files. # Makes migrating from make to cmake easier, cmake doesn't put anything here. # Each Gyp configuration creates a different CMakeLists.txt file # to avoid incompatibilities between Gyp and CMake configurations. generator_dir = os.path.relpath(options.generator_output or '.') # output_dir: relative path from generator_dir to the build directory. output_dir = generator_flags.get('output_dir', 'out') # build_dir: relative path from source root to our output files. # e.g. 
"out/Debug" build_dir = os.path.normpath(os.path.join(generator_dir, output_dir, config_to_use)) toplevel_build = os.path.join(options.toplevel_dir, build_dir) output_file = os.path.join(toplevel_build, 'CMakeLists.txt') gyp.common.EnsureDirExists(output_file) output = open(output_file, 'w') output.write('cmake_minimum_required(VERSION 2.8.8 FATAL_ERROR)\n') output.write('cmake_policy(VERSION 2.8.8)\n') gyp_file, project_target, _ = gyp.common.ParseQualifiedTarget(target_list[-1]) output.write('project(') output.write(project_target) output.write(')\n') SetVariable(output, 'configuration', config_to_use) ar = None cc = None cxx = None make_global_settings = data[gyp_file].get('make_global_settings', []) build_to_top = gyp.common.InvertRelativePath(build_dir, options.toplevel_dir) for key, value in make_global_settings: if key == 'AR': ar = os.path.join(build_to_top, value) if key == 'CC': cc = os.path.join(build_to_top, value) if key == 'CXX': cxx = os.path.join(build_to_top, value) ar = gyp.common.GetEnvironFallback(['AR_target', 'AR'], ar) cc = gyp.common.GetEnvironFallback(['CC_target', 'CC'], cc) cxx = gyp.common.GetEnvironFallback(['CXX_target', 'CXX'], cxx) if ar: SetVariable(output, 'CMAKE_AR', ar) if cc: SetVariable(output, 'CMAKE_C_COMPILER', cc) if cxx: SetVariable(output, 'CMAKE_CXX_COMPILER', cxx) # The following appears to be as-yet undocumented. # http://public.kitware.com/Bug/view.php?id=8392 output.write('enable_language(ASM)\n') # ASM-ATT does not support .S files. # output.write('enable_language(ASM-ATT)\n') if cc: SetVariable(output, 'CMAKE_ASM_COMPILER', cc) SetVariable(output, 'builddir', '${CMAKE_CURRENT_BINARY_DIR}') SetVariable(output, 'obj', '${builddir}/obj') output.write('\n') # TODO: Undocumented/unsupported (the CMake Java generator depends on it). # CMake by default names the object resulting from foo.c to be foo.c.o. # Gyp traditionally names the object resulting from foo.c foo.o. # This should be irrelevant, but some targets extract .o files from .a # and depend on the name of the extracted .o files. output.write('set(CMAKE_C_OUTPUT_EXTENSION_REPLACE 1)\n') output.write('set(CMAKE_CXX_OUTPUT_EXTENSION_REPLACE 1)\n') output.write('\n') # Force ninja to use rsp files. Otherwise link and ar lines can get too long, # resulting in 'Argument list too long' errors. output.write('set(CMAKE_NINJA_FORCE_RESPONSE_FILE 1)\n') output.write('\n') namer = CMakeNamer(target_list) # The list of targets upon which the 'all' target should depend. # CMake has it's own implicit 'all' target, one is not created explicitly. all_qualified_targets = set() for build_file in params['build_files']: for qualified_target in gyp.common.AllTargets(target_list, target_dicts, os.path.normpath(build_file)): all_qualified_targets.add(qualified_target) for qualified_target in target_list: WriteTarget(namer, qualified_target, target_dicts, build_dir, config_to_use, options, generator_flags, all_qualified_targets, output) output.close() def PerformBuild(data, configurations, params): options = params['options'] generator_flags = params['generator_flags'] # generator_dir: relative path from pwd to where make puts build files. # Makes migrating from make to cmake easier, cmake doesn't put anything here. generator_dir = os.path.relpath(options.generator_output or '.') # output_dir: relative path from generator_dir to the build directory. output_dir = generator_flags.get('output_dir', 'out') for config_name in configurations: # build_dir: relative path from source root to our output files. # e.g. 
"out/Debug" build_dir = os.path.normpath(os.path.join(generator_dir, output_dir, config_name)) arguments = ['cmake', '-G', 'Ninja'] print 'Generating [%s]: %s' % (config_name, arguments) subprocess.check_call(arguments, cwd=build_dir) arguments = ['ninja', '-C', build_dir] print 'Building [%s]: %s' % (config_name, arguments) subprocess.check_call(arguments) def CallGenerateOutputForConfig(arglist): # Ignore the interrupt signal so that the parent process catches it and # kills all multiprocessing children. signal.signal(signal.SIGINT, signal.SIG_IGN) target_list, target_dicts, data, params, config_name = arglist GenerateOutputForConfig(target_list, target_dicts, data, params, config_name) def GenerateOutput(target_list, target_dicts, data, params): user_config = params.get('generator_flags', {}).get('config', None) if user_config: GenerateOutputForConfig(target_list, target_dicts, data, params, user_config) else: config_names = target_dicts[target_list[0]]['configurations'].keys() if params['parallel']: try: pool = multiprocessing.Pool(len(config_names)) arglists = [] for config_name in config_names: arglists.append((target_list, target_dicts, data, params, config_name)) pool.map(CallGenerateOutputForConfig, arglists) except KeyboardInterrupt, e: pool.terminate() raise e else: for config_name in config_names: GenerateOutputForConfig(target_list, target_dicts, data, params, config_name) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/dump_dependency_json.py0000644000000000000000000000654212631326456027151 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import collections import os import gyp import gyp.common import gyp.msvs_emulation import json import sys generator_supports_multiple_toolsets = True generator_wants_static_library_dependencies_adjusted = False generator_filelist_paths = { } generator_default_variables = { } for dirname in ['INTERMEDIATE_DIR', 'SHARED_INTERMEDIATE_DIR', 'PRODUCT_DIR', 'LIB_DIR', 'SHARED_LIB_DIR']: # Some gyp steps fail if these are empty(!). generator_default_variables[dirname] = 'dir' for unused in ['RULE_INPUT_PATH', 'RULE_INPUT_ROOT', 'RULE_INPUT_NAME', 'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT', 'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX', 'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX', 'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX', 'CONFIGURATION_NAME']: generator_default_variables[unused] = '' def CalculateVariables(default_variables, params): generator_flags = params.get('generator_flags', {}) for key, val in generator_flags.items(): default_variables.setdefault(key, val) default_variables.setdefault('OS', gyp.common.GetFlavor(params)) flavor = gyp.common.GetFlavor(params) if flavor =='win': # Copy additional generator configuration data from VS, which is shared # by the Windows Ninja generator. 
import gyp.generator.msvs as msvs_generator generator_additional_non_configuration_keys = getattr(msvs_generator, 'generator_additional_non_configuration_keys', []) generator_additional_path_sections = getattr(msvs_generator, 'generator_additional_path_sections', []) gyp.msvs_emulation.CalculateCommonVariables(default_variables, params) def CalculateGeneratorInputInfo(params): """Calculate the generator specific info that gets fed to input (called by gyp).""" generator_flags = params.get('generator_flags', {}) if generator_flags.get('adjust_static_libraries', False): global generator_wants_static_library_dependencies_adjusted generator_wants_static_library_dependencies_adjusted = True toplevel = params['options'].toplevel_dir generator_dir = os.path.relpath(params['options'].generator_output or '.') # output_dir: relative path from generator_dir to the build directory. output_dir = generator_flags.get('output_dir', 'out') qualified_out_dir = os.path.normpath(os.path.join( toplevel, generator_dir, output_dir, 'gypfiles')) global generator_filelist_paths generator_filelist_paths = { 'toplevel': toplevel, 'qualified_out_dir': qualified_out_dir, } def GenerateOutput(target_list, target_dicts, data, params): # Map of target -> list of targets it depends on. edges = {} # Queue of targets to visit. targets_to_visit = target_list[:] while len(targets_to_visit) > 0: target = targets_to_visit.pop() if target in edges: continue edges[target] = [] for dep in target_dicts[target].get('dependencies', []): edges[target].append(dep) targets_to_visit.append(dep) try: filepath = params['generator_flags']['output_dir'] except KeyError: filepath = '.' filename = os.path.join(filepath, 'dump.json') f = open(filename, 'w') json.dump(edges, f) f.close() print 'Wrote json to %s.' % filename npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/eclipse.py0000644000000000000000000004116612631326456024402 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """GYP backend that generates Eclipse CDT settings files. This backend DOES NOT generate Eclipse CDT projects. Instead, it generates XML files that can be imported into an Eclipse CDT project. The XML file contains a list of include paths and symbols (i.e. defines). Because a full .cproject definition is not created by this generator, it's not possible to properly define the include dirs and symbols for each file individually. Instead, one set of includes/symbols is generated for the entire project. This works fairly well (and is a vast improvement in general), but may still result in a few indexer issues here and there. This generator has no automated tests, so expect it to be broken. 
""" from xml.sax.saxutils import escape import os.path import subprocess import gyp import gyp.common import gyp.msvs_emulation import shlex import xml.etree.cElementTree as ET generator_wants_static_library_dependencies_adjusted = False generator_default_variables = { } for dirname in ['INTERMEDIATE_DIR', 'PRODUCT_DIR', 'LIB_DIR', 'SHARED_LIB_DIR']: # Some gyp steps fail if these are empty(!), so we convert them to variables generator_default_variables[dirname] = '$' + dirname for unused in ['RULE_INPUT_PATH', 'RULE_INPUT_ROOT', 'RULE_INPUT_NAME', 'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT', 'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX', 'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX', 'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX', 'CONFIGURATION_NAME']: generator_default_variables[unused] = '' # Include dirs will occasionally use the SHARED_INTERMEDIATE_DIR variable as # part of the path when dealing with generated headers. This value will be # replaced dynamically for each configuration. generator_default_variables['SHARED_INTERMEDIATE_DIR'] = \ '$SHARED_INTERMEDIATE_DIR' def CalculateVariables(default_variables, params): generator_flags = params.get('generator_flags', {}) for key, val in generator_flags.items(): default_variables.setdefault(key, val) flavor = gyp.common.GetFlavor(params) default_variables.setdefault('OS', flavor) if flavor == 'win': # Copy additional generator configuration data from VS, which is shared # by the Eclipse generator. import gyp.generator.msvs as msvs_generator generator_additional_non_configuration_keys = getattr(msvs_generator, 'generator_additional_non_configuration_keys', []) generator_additional_path_sections = getattr(msvs_generator, 'generator_additional_path_sections', []) gyp.msvs_emulation.CalculateCommonVariables(default_variables, params) def CalculateGeneratorInputInfo(params): """Calculate the generator specific info that gets fed to input (called by gyp).""" generator_flags = params.get('generator_flags', {}) if generator_flags.get('adjust_static_libraries', False): global generator_wants_static_library_dependencies_adjusted generator_wants_static_library_dependencies_adjusted = True def GetAllIncludeDirectories(target_list, target_dicts, shared_intermediate_dirs, config_name, params, compiler_path): """Calculate the set of include directories to be used. Returns: A list including all the include_dir's specified for every target followed by any include directories that were added as cflag compiler options. """ gyp_includes_set = set() compiler_includes_list = [] # Find compiler's default include dirs. if compiler_path: command = shlex.split(compiler_path) command.extend(['-E', '-xc++', '-v', '-']) proc = subprocess.Popen(args=command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE) output = proc.communicate()[1] # Extract the list of include dirs from the output, which has this format: # ... # #include "..." search starts here: # #include <...> search starts here: # /usr/include/c++/4.6 # /usr/local/include # End of search list. # ... 
in_include_list = False for line in output.splitlines(): if line.startswith('#include'): in_include_list = True continue if line.startswith('End of search list.'): break if in_include_list: include_dir = line.strip() if include_dir not in compiler_includes_list: compiler_includes_list.append(include_dir) flavor = gyp.common.GetFlavor(params) if flavor == 'win': generator_flags = params.get('generator_flags', {}) for target_name in target_list: target = target_dicts[target_name] if config_name in target['configurations']: config = target['configurations'][config_name] # Look for any include dirs that were explicitly added via cflags. This # may be done in gyp files to force certain includes to come at the end. # TODO(jgreenwald): Change the gyp files to not abuse cflags for this, and # remove this. if flavor == 'win': msvs_settings = gyp.msvs_emulation.MsvsSettings(target, generator_flags) cflags = msvs_settings.GetCflags(config_name) else: cflags = config['cflags'] for cflag in cflags: if cflag.startswith('-I'): include_dir = cflag[2:] if include_dir not in compiler_includes_list: compiler_includes_list.append(include_dir) # Find standard gyp include dirs. if config.has_key('include_dirs'): include_dirs = config['include_dirs'] for shared_intermediate_dir in shared_intermediate_dirs: for include_dir in include_dirs: include_dir = include_dir.replace('$SHARED_INTERMEDIATE_DIR', shared_intermediate_dir) if not os.path.isabs(include_dir): base_dir = os.path.dirname(target_name) include_dir = base_dir + '/' + include_dir include_dir = os.path.abspath(include_dir) gyp_includes_set.add(include_dir) # Generate a list that has all the include dirs. all_includes_list = list(gyp_includes_set) all_includes_list.sort() for compiler_include in compiler_includes_list: if compiler_include not in gyp_includes_set: all_includes_list.append(compiler_include) # All done. return all_includes_list def GetCompilerPath(target_list, data, options): """Determine a command that can be used to invoke the compiler. Returns: If this is a gyp project that has explicit make settings, try to determine the compiler from that. Otherwise, see if a compiler was specified via the CC_target environment variable. """ # First, see if the compiler is configured in make's settings. build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0]) make_global_settings_dict = data[build_file].get('make_global_settings', {}) for key, value in make_global_settings_dict: if key in ['CC', 'CXX']: return os.path.join(options.toplevel_dir, value) # Check to see if the compiler was specified as an environment variable. for key in ['CC_target', 'CC', 'CXX']: compiler = os.environ.get(key) if compiler: return compiler return 'gcc' def GetAllDefines(target_list, target_dicts, data, config_name, params, compiler_path): """Calculate the defines for a project. Returns: A dict that includes explicit defines declared in gyp files along with all of the default defines that the compiler uses. """ # Get defines declared in the gyp files.
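# (Editor's sketch, not used by the code below: the normalization the loop
# applies to each 'defines' entry; a bare 'NDEBUG' becomes NDEBUG=1.)
def _split_define_example(define):
  parts = define.split('=', 1)
  if len(parts) == 1:
    parts.append('1')
  return parts[0].strip(), parts[1].strip()
assert _split_define_example('NDEBUG') == ('NDEBUG', '1')
assert _split_define_example('VERSION="1.2"') == ('VERSION', '"1.2"')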
all_defines = {} flavor = gyp.common.GetFlavor(params) if flavor == 'win': generator_flags = params.get('generator_flags', {}) for target_name in target_list: target = target_dicts[target_name] if flavor == 'win': msvs_settings = gyp.msvs_emulation.MsvsSettings(target, generator_flags) extra_defines = msvs_settings.GetComputedDefines(config_name) else: extra_defines = [] if config_name in target['configurations']: config = target['configurations'][config_name] target_defines = config['defines'] else: target_defines = [] for define in target_defines + extra_defines: split_define = define.split('=', 1) if len(split_define) == 1: split_define.append('1') if split_define[0].strip() in all_defines: # Already defined continue all_defines[split_define[0].strip()] = split_define[1].strip() # Get default compiler defines (if possible). if flavor == 'win': return all_defines # Default defines already processed in the loop above. if compiler_path: command = shlex.split(compiler_path) command.extend(['-E', '-dM', '-']) cpp_proc = subprocess.Popen(args=command, cwd='.', stdin=subprocess.PIPE, stdout=subprocess.PIPE) cpp_output = cpp_proc.communicate()[0] cpp_lines = cpp_output.split('\n') for cpp_line in cpp_lines: if not cpp_line.strip(): continue cpp_line_parts = cpp_line.split(' ', 2) key = cpp_line_parts[1] if len(cpp_line_parts) >= 3: val = cpp_line_parts[2] else: val = '1' all_defines[key] = val return all_defines def WriteIncludePaths(out, eclipse_langs, include_dirs): """Write the includes section of a CDT settings export file.""" out.write('
  <section name="org.eclipse.cdt.internal.ui.wizards.settingswizards.IncludePaths">\n') out.write('    <language name="holder for library settings"></language>\n') for lang in eclipse_langs: out.write('    <language name="%s">\n' % lang) for include_dir in include_dirs: out.write('      <includepath workspace_path="false">%s</includepath>\n' % include_dir) out.write('    </language>\n') out.write('
  </section>\n') def WriteMacros(out, eclipse_langs, defines): """Write the macros section of a CDT settings export file.""" out.write('
  <section name="org.eclipse.cdt.internal.ui.wizards.settingswizards.Macros">\n') out.write('    <language name="holder for library settings"></language>\n') for lang in eclipse_langs: out.write('    <language name="%s">\n' % lang) for key in sorted(defines.iterkeys()): out.write('      <macro><name>%s</name><value>%s</value></macro>\n' % (escape(key), escape(defines[key]))) out.write('    </language>\n') out.write('
  </section>\n') def GenerateOutputForConfig(target_list, target_dicts, data, params, config_name): options = params['options'] generator_flags = params.get('generator_flags', {}) # build_dir: relative path from source root to our output files. # e.g. "out/Debug" build_dir = os.path.join(generator_flags.get('output_dir', 'out'), config_name) toplevel_build = os.path.join(options.toplevel_dir, build_dir) # Ninja uses out/Debug/gen while make uses out/Debug/obj/gen as the # SHARED_INTERMEDIATE_DIR. Include both possible locations. shared_intermediate_dirs = [os.path.join(toplevel_build, 'obj', 'gen'), os.path.join(toplevel_build, 'gen')] GenerateCdtSettingsFile(target_list, target_dicts, data, params, config_name, os.path.join(toplevel_build, 'eclipse-cdt-settings.xml'), options, shared_intermediate_dirs) GenerateClasspathFile(target_list, target_dicts, options.toplevel_dir, toplevel_build, os.path.join(toplevel_build, 'eclipse-classpath.xml')) def GenerateCdtSettingsFile(target_list, target_dicts, data, params, config_name, out_name, options, shared_intermediate_dirs): gyp.common.EnsureDirExists(out_name) with open(out_name, 'w') as out: out.write('<?xml version="1.0" encoding="UTF-8"?>\n') out.write('<cdtprojectproperties>\n') eclipse_langs = ['C++ Source File', 'C Source File', 'Assembly Source File', 'GNU C++', 'GNU C', 'Assembly'] compiler_path = GetCompilerPath(target_list, data, options) include_dirs = GetAllIncludeDirectories(target_list, target_dicts, shared_intermediate_dirs, config_name, params, compiler_path) WriteIncludePaths(out, eclipse_langs, include_dirs) defines = GetAllDefines(target_list, target_dicts, data, config_name, params, compiler_path) WriteMacros(out, eclipse_langs, defines) out.write('</cdtprojectproperties>\n') def GenerateClasspathFile(target_list, target_dicts, toplevel_dir, toplevel_build, out_name): '''Generates a classpath file suitable for symbol navigation and code completion of Java code (such as in Android projects) by finding all .java and .jar files used as action inputs.''' gyp.common.EnsureDirExists(out_name) result = ET.Element('classpath') def AddElements(kind, paths): # First, we need to normalize the paths so they are all relative to the # toplevel dir. rel_paths = set() for path in paths: if os.path.isabs(path): rel_paths.add(os.path.relpath(path, toplevel_dir)) else: rel_paths.add(path) for path in sorted(rel_paths): entry_element = ET.SubElement(result, 'classpathentry') entry_element.set('kind', kind) entry_element.set('path', path) AddElements('lib', GetJavaJars(target_list, target_dicts, toplevel_dir)) AddElements('src', GetJavaSourceDirs(target_list, target_dicts, toplevel_dir)) # Include the standard JRE container and a dummy out folder AddElements('con', ['org.eclipse.jdt.launching.JRE_CONTAINER']) # Include a dummy out folder so that Eclipse doesn't use the default /bin # folder in the root of the project.
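# (Added illustration with hypothetical paths: the finished
# eclipse-classpath.xml written below has entries of this shape:
#   <classpath>
#     <classpathentry kind="lib" path="third_party/foo.jar"/>
#     <classpathentry kind="src" path="java/src"/>
#     <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
#     <classpathentry kind="output" path="out/Debug/.eclipse-java-build"/>
#   </classpath>
# )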
AddElements('output', [os.path.join(toplevel_build, '.eclipse-java-build')]) ET.ElementTree(result).write(out_name) def GetJavaJars(target_list, target_dicts, toplevel_dir): '''Generates a sequence of all .jars used as inputs.''' for target_name in target_list: target = target_dicts[target_name] for action in target.get('actions', []): for input_ in action['inputs']: if os.path.splitext(input_)[1] == '.jar' and not input_.startswith('$'): if os.path.isabs(input_): yield input_ else: yield os.path.join(os.path.dirname(target_name), input_) def GetJavaSourceDirs(target_list, target_dicts, toplevel_dir): '''Generates a sequence of all likely java package root directories.''' for target_name in target_list: target = target_dicts[target_name] for action in target.get('actions', []): for input_ in action['inputs']: if (os.path.splitext(input_)[1] == '.java' and not input_.startswith('$')): dir_ = os.path.dirname(os.path.join(os.path.dirname(target_name), input_)) # If there is a parent 'src' or 'java' folder, navigate up to it - # these are canonical package root names in Chromium. This will # break if 'src' or 'java' exists in the package structure. This # could be further improved by inspecting the java file for the # package name if this proves to be too fragile in practice. parent_search = dir_ while os.path.basename(parent_search) not in ['src', 'java']: parent_search, _ = os.path.split(parent_search) if not parent_search or parent_search == toplevel_dir: # Didn't find a known root, just return the original path yield dir_ break else: yield parent_search def GenerateOutput(target_list, target_dicts, data, params): """Generate an XML settings file that can be imported into a CDT project.""" if params['options'].generator_output: raise NotImplementedError("--generator_output not implemented for eclipse") user_config = params.get('generator_flags', {}).get('config', None) if user_config: GenerateOutputForConfig(target_list, target_dicts, data, params, user_config) else: config_names = target_dicts[target_list[0]]['configurations'].keys() for config_name in config_names: GenerateOutputForConfig(target_list, target_dicts, data, params, config_name) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/gypd.py0000644000000000000000000000662212631326456023717 0ustar 00000000000000# Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """gypd output module This module produces gyp input as its output. Output files are given the .gypd extension to avoid overwriting the .gyp files that they are generated from. Internal references to .gyp files (such as those found in "dependencies" sections) are not adjusted to point to .gypd files instead; unlike other paths, which are relative to the .gyp or .gypd file, such paths are relative to the directory from which gyp was run to create the .gypd file. This generator module is intended to be a sample and a debugging aid, hence the "d" for "debug" in .gypd. It is useful to inspect the results of the various merges, expansions, and conditional evaluations performed by gyp and to see a representation of what would be fed to a generator module. 
It's not advisable to rename .gypd files produced by this module to .gyp, because they will have all merges, expansions, and evaluations already performed and the relevant constructs not present in the output; paths to dependencies may be wrong; and various sections that do not belong in .gyp files such as "included_files" and "*_excluded" will be present. Output will also be stripped of comments. This is not intended to be a general-purpose gyp pretty-printer; for that, you probably just want to run "pprint.pprint(eval(open('source.gyp').read()))", which will still strip comments but won't do all of the other things done to this module's output. The specific formatting of the output generated by this module is subject to change. """ import gyp.common import errno import os import pprint # These variables should just be spit back out as variable references. _generator_identity_variables = [ 'CONFIGURATION_NAME', 'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX', 'INTERMEDIATE_DIR', 'LIB_DIR', 'PRODUCT_DIR', 'RULE_INPUT_ROOT', 'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT', 'RULE_INPUT_NAME', 'RULE_INPUT_PATH', 'SHARED_INTERMEDIATE_DIR', 'SHARED_LIB_DIR', 'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX', 'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX', ] # gypd doesn't define a default value for OS like many other generator # modules. Specify "-D OS=whatever" on the command line to provide a value. generator_default_variables = { } # gypd supports multiple toolsets generator_supports_multiple_toolsets = True # TODO(mark): This always uses <, which isn't right. The input module should # notify the generator to tell it which phase it is operating in, and this # module should use < for the early phase and then switch to > for the late # phase. Bonus points for carrying @ back into the output too. for v in _generator_identity_variables: generator_default_variables[v] = '<(%s)' % v def GenerateOutput(target_list, target_dicts, data, params): output_files = {} for qualified_target in target_list: [input_file, target] = \ gyp.common.ParseQualifiedTarget(qualified_target)[0:2] if input_file[-4:] != '.gyp': continue input_file_stem = input_file[:-4] output_file = input_file_stem + params['options'].suffix + '.gypd' if not output_file in output_files: output_files[output_file] = input_file for output_file, input_file in output_files.iteritems(): output = open(output_file, 'w') pprint.pprint(data[input_file], output) output.close() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/gypsh.py0000644000000000000000000000320112631326456024076 0ustar 00000000000000# Copyright (c) 2011 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """gypsh output module gypsh is a GYP shell. It's not really a generator per se. All it does is fire up an interactive Python session with a few local variables set to the variables passed to the generator. Like gypd, it's intended as a debugging aid, to facilitate the exploration of .gyp structures after being processed by the input module. The expected usage is "gyp -f gypsh -D OS=desired_os". """ import code import sys # All of this stuff about generator variables was lovingly ripped from gypd.py. # That module has a much better description of what's going on and why.
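# (Editor's sketch of the identity mapping built just below, with one
# hypothetical entry: each variable is echoed back as an early-phase gyp
# reference rather than being expanded.)
_identity_demo = dict((v, '<(%s)' % v) for v in ['PRODUCT_DIR'])
assert _identity_demo['PRODUCT_DIR'] == '<(PRODUCT_DIR)'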
_generator_identity_variables = [
  'EXECUTABLE_PREFIX',
  'EXECUTABLE_SUFFIX',
  'INTERMEDIATE_DIR',
  'PRODUCT_DIR',
  'RULE_INPUT_ROOT',
  'RULE_INPUT_DIRNAME',
  'RULE_INPUT_EXT',
  'RULE_INPUT_NAME',
  'RULE_INPUT_PATH',
  'SHARED_INTERMEDIATE_DIR',
]

generator_default_variables = {
}
for v in _generator_identity_variables:
  generator_default_variables[v] = '<(%s)' % v


def GenerateOutput(target_list, target_dicts, data, params):
  locals = {
      'target_list': target_list,
      'target_dicts': target_dicts,
      'data': data,
  }

  # Use a banner that looks like the stock Python one and like what
  # code.interact uses by default, but tack on something to indicate what
  # locals are available, and identify gypsh.
  banner = 'Python %s on %s\nlocals.keys() = %s\ngypsh' % \
           (sys.version, sys.platform, repr(sorted(locals.keys())))

  code.interact(banner, local=locals)
npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/make.py0000644000000000000000000026167512631326456023674 0ustar 00000000000000
# Copyright (c) 2013 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

# Notes:
#
# This is all roughly based on the Makefile system used by the Linux
# kernel, but is a non-recursive make -- we put the entire dependency graph
# in front of make and let it figure it out.
#
# The code below generates a separate .mk file for each target, but
# all are sourced by the top-level Makefile.  This means that all
# variables in .mk-files clobber one another.  Be careful to use :=
# where appropriate for immediate evaluation, and similarly to watch
# that you're not relying on a variable value to last between different
# .mk files.
#
# TODOs:
#
# Global settings and utility functions are currently stuffed in the
# toplevel Makefile.  It may make sense to generate some .mk files on
# the side to keep the files readable.

import os
import re
import sys
import subprocess
import gyp
import gyp.common
import gyp.xcode_emulation
from gyp.common import GetEnvironFallback
from gyp.common import GypError

generator_default_variables = {
  'EXECUTABLE_PREFIX': '',
  'EXECUTABLE_SUFFIX': '',
  'STATIC_LIB_PREFIX': 'lib',
  'SHARED_LIB_PREFIX': 'lib',
  'STATIC_LIB_SUFFIX': '.a',
  'INTERMEDIATE_DIR': '$(obj).$(TOOLSET)/$(TARGET)/geni',
  'SHARED_INTERMEDIATE_DIR': '$(obj)/gen',
  'PRODUCT_DIR': '$(builddir)',
  'RULE_INPUT_ROOT': '%(INPUT_ROOT)s',  # This gets expanded by Python.
  'RULE_INPUT_DIRNAME': '%(INPUT_DIRNAME)s',  # This gets expanded by Python.
  'RULE_INPUT_PATH': '$(abspath $<)',
  'RULE_INPUT_EXT': '$(suffix $<)',
  'RULE_INPUT_NAME': '$(notdir $<)',
  'CONFIGURATION_NAME': '$(BUILDTYPE)',
}

# Make supports multiple toolsets
generator_supports_multiple_toolsets = True

# Request sorted dependencies in the order from dependents to dependencies.
generator_wants_sorted_dependencies = False

# Placates pylint.
generator_additional_non_configuration_keys = []
generator_additional_path_sections = []
generator_extra_sources_for_rules = []
generator_filelist_paths = None


def CalculateVariables(default_variables, params):
  """Calculate additional variables for use in the build (called by gyp)."""
  flavor = gyp.common.GetFlavor(params)
  if flavor == 'mac':
    default_variables.setdefault('OS', 'mac')
    default_variables.setdefault('SHARED_LIB_SUFFIX', '.dylib')
    default_variables.setdefault('SHARED_LIB_DIR',
                                 generator_default_variables['PRODUCT_DIR'])
    default_variables.setdefault('LIB_DIR',
                                 generator_default_variables['PRODUCT_DIR'])

    # Copy additional generator configuration data from Xcode, which is shared
    # by the Mac Make generator.
    import gyp.generator.xcode as xcode_generator
    global generator_additional_non_configuration_keys
    generator_additional_non_configuration_keys = getattr(xcode_generator,
        'generator_additional_non_configuration_keys', [])
    global generator_additional_path_sections
    generator_additional_path_sections = getattr(xcode_generator,
        'generator_additional_path_sections', [])
    global generator_extra_sources_for_rules
    generator_extra_sources_for_rules = getattr(xcode_generator,
        'generator_extra_sources_for_rules', [])
    COMPILABLE_EXTENSIONS.update({'.m': 'objc', '.mm' : 'objcxx'})
  else:
    operating_system = flavor
    if flavor == 'android':
      operating_system = 'linux'  # Keep this legacy behavior for now.
    default_variables.setdefault('OS', operating_system)
    default_variables.setdefault('SHARED_LIB_SUFFIX', '.so')
    default_variables.setdefault('SHARED_LIB_DIR','$(builddir)/lib.$(TOOLSET)')
    default_variables.setdefault('LIB_DIR', '$(obj).$(TOOLSET)')


def CalculateGeneratorInputInfo(params):
  """Calculate the generator specific info that gets fed to input (called by
  gyp)."""
  generator_flags = params.get('generator_flags', {})
  android_ndk_version = generator_flags.get('android_ndk_version', None)
  # Android NDK requires a strict link order.
  if android_ndk_version:
    global generator_wants_sorted_dependencies
    generator_wants_sorted_dependencies = True

  output_dir = params['options'].generator_output or \
               params['options'].toplevel_dir
  builddir_name = generator_flags.get('output_dir', 'out')
  qualified_out_dir = os.path.normpath(os.path.join(
      output_dir, builddir_name, 'gypfiles'))

  global generator_filelist_paths
  generator_filelist_paths = {
    'toplevel': params['options'].toplevel_dir,
    'qualified_out_dir': qualified_out_dir,
  }


# The .d checking code below uses these functions:
# wildcard, sort, foreach, shell, wordlist
# wildcard can handle spaces, the rest can't.
# Since I could find no way to make foreach work with spaces in filenames
# correctly, the .d files have spaces replaced with another character.  The .d
# file for
#   Chromium\ Framework.framework/foo
# is for example
#   out/Release/.deps/out/Release/Chromium?Framework.framework/foo
# This is the replacement character.
SPACE_REPLACEMENT = '?'


LINK_COMMANDS_LINUX = """\
quiet_cmd_alink = AR($(TOOLSET)) $@
cmd_alink = rm -f $@ && $(AR.$(TOOLSET)) crs $@ $(filter %.o,$^)

quiet_cmd_alink_thin = AR($(TOOLSET)) $@
cmd_alink_thin = rm -f $@ && $(AR.$(TOOLSET)) crsT $@ $(filter %.o,$^)

# Due to circular dependencies between libraries :(, we wrap the
# special "figure out circular dependencies" flags around the entire
# input list during linking.
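# (Illustrative, not part of the original commands: with objects a.o b.o and
# archive libfoo.a, the expanded link line looks roughly like
#   g++ ... -o out -Wl,--start-group a.o b.o libfoo.a -Wl,--end-group -lm
# and the linker re-scans the group until no new undefined symbols are
# resolved, which tolerates cycles between the archives.)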
quiet_cmd_link = LINK($(TOOLSET)) $@
cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ -Wl,--start-group $(LD_INPUTS) -Wl,--end-group $(LIBS)

# We support two kinds of shared objects (.so):
# 1) shared_library, which is just bundling together many dependent libraries
# into a link line.
# 2) loadable_module, which is generating a module intended for dlopen().
#
# They differ only slightly:
# In the former case, we want to package all dependent code into the .so.
# In the latter case, we want to package just the API exposed by the
# outermost module.
# This means shared_library uses --whole-archive, while loadable_module
# doesn't.
# (Note that --whole-archive is incompatible with the --start-group used in
# normal linking.)
# Other shared-object link notes:
# - Set SONAME to the library filename so our binaries don't reference
#   the local, absolute paths used on the link command-line.
quiet_cmd_solink = SOLINK($(TOOLSET)) $@
cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--whole-archive $(LD_INPUTS) -Wl,--no-whole-archive $(LIBS)

quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@
cmd_solink_module = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--start-group $(filter-out FORCE_DO_CMD, $^) -Wl,--end-group $(LIBS)
"""

LINK_COMMANDS_MAC = """\
quiet_cmd_alink = LIBTOOL-STATIC $@
cmd_alink = rm -f $@ && ./gyp-mac-tool filter-libtool libtool $(GYP_LIBTOOLFLAGS) -static -o $@ $(filter %.o,$^)

quiet_cmd_link = LINK($(TOOLSET)) $@
cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o "$@" $(LD_INPUTS) $(LIBS)

quiet_cmd_solink = SOLINK($(TOOLSET)) $@
cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o "$@" $(LD_INPUTS) $(LIBS)

quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@
cmd_solink_module = $(LINK.$(TOOLSET)) -bundle $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS)
"""

LINK_COMMANDS_ANDROID = """\
quiet_cmd_alink = AR($(TOOLSET)) $@
cmd_alink = rm -f $@ && $(AR.$(TOOLSET)) crs $@ $(filter %.o,$^)

quiet_cmd_alink_thin = AR($(TOOLSET)) $@
cmd_alink_thin = rm -f $@ && $(AR.$(TOOLSET)) crsT $@ $(filter %.o,$^)

# Due to circular dependencies between libraries :(, we wrap the
# special "figure out circular dependencies" flags around the entire
# input list during linking.
quiet_cmd_link = LINK($(TOOLSET)) $@
quiet_cmd_link_host = LINK($(TOOLSET)) $@
cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ -Wl,--start-group $(LD_INPUTS) -Wl,--end-group $(LIBS)
cmd_link_host = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(LD_INPUTS) $(LIBS)

# Other shared-object link notes:
# - Set SONAME to the library filename so our binaries don't reference
#   the local, absolute paths used on the link command-line.
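# (Illustrative, not part of the original commands: -Wl,-soname=$(@F) records
# only the basename, e.g. "libfoo.so", so a dependent binary's DT_NEEDED
# entry says "libfoo.so" instead of the absolute build-tree path.)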
quiet_cmd_solink = SOLINK($(TOOLSET)) $@
cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--whole-archive $(LD_INPUTS) -Wl,--no-whole-archive $(LIBS)

quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@
cmd_solink_module = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ -Wl,--start-group $(filter-out FORCE_DO_CMD, $^) -Wl,--end-group $(LIBS)

quiet_cmd_solink_module_host = SOLINK_MODULE($(TOOLSET)) $@
cmd_solink_module_host = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -Wl,-soname=$(@F) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS)
"""

LINK_COMMANDS_AIX = """\
quiet_cmd_alink = AR($(TOOLSET)) $@
cmd_alink = rm -f $@ && $(AR.$(TOOLSET)) -X32_64 crs $@ $(filter %.o,$^)

quiet_cmd_alink_thin = AR($(TOOLSET)) $@
cmd_alink_thin = rm -f $@ && $(AR.$(TOOLSET)) -X32_64 crs $@ $(filter %.o,$^)

quiet_cmd_link = LINK($(TOOLSET)) $@
cmd_link = $(LINK.$(TOOLSET)) $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(LD_INPUTS) $(LIBS)

quiet_cmd_solink = SOLINK($(TOOLSET)) $@
cmd_solink = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(LD_INPUTS) $(LIBS)

quiet_cmd_solink_module = SOLINK_MODULE($(TOOLSET)) $@
cmd_solink_module = $(LINK.$(TOOLSET)) -shared $(GYP_LDFLAGS) $(LDFLAGS.$(TOOLSET)) -o $@ $(filter-out FORCE_DO_CMD, $^) $(LIBS)
"""


# Header of toplevel Makefile.
# This should go into the build tree, but it's easier to keep it here for now.
SHARED_HEADER = ("""\
# We borrow heavily from the kernel build setup, though we are simpler since
# we don't have Kconfig tweaking settings on us.

# The implicit make rules have it looking for RCS files, among other things.
# We instead explicitly write all the rules we care about.
# It's even quicker (saves ~200ms) to pass -r on the command line.
MAKEFLAGS=-r

# The source directory tree.
srcdir := %(srcdir)s
abs_srcdir := $(abspath $(srcdir))

# The name of the builddir.
builddir_name ?= %(builddir)s

# The V=1 flag on command line makes us verbosely print command lines.
ifdef V
  quiet=
else
  quiet=quiet_
endif

# Specify BUILDTYPE=Release on the command line for a release build.
BUILDTYPE ?= %(default_configuration)s

# Directory all our build output goes into.
# Note that this must be two directories beneath src/ for unit tests to pass,
# as they reach into the src/ directory for data with relative paths.
builddir ?= $(builddir_name)/$(BUILDTYPE)
abs_builddir := $(abspath $(builddir))
depsdir := $(builddir)/.deps

# Object output directory.
obj := $(builddir)/obj
abs_obj := $(abspath $(obj))

# We build up a list of every single one of the targets so we can slurp in the
# generated dependency rule Makefiles in one pass.
all_deps :=

%(make_global_settings)s

CC.target ?= %(CC.target)s
CFLAGS.target ?= $(CPPFLAGS) $(CFLAGS)
CXX.target ?= %(CXX.target)s
CXXFLAGS.target ?= $(CPPFLAGS) $(CXXFLAGS)
LINK.target ?= %(LINK.target)s
LDFLAGS.target ?= $(LDFLAGS)
AR.target ?= $(AR)

# C++ apps need to be linked with g++.
LINK ?= $(CXX.target)

# TODO(evan): move all cross-compilation logic to gyp-time so we don't need
# to replicate this environment fallback in make as well.
CC.host ?= %(CC.host)s
CFLAGS.host ?= $(CPPFLAGS_host) $(CFLAGS_host)
CXX.host ?= %(CXX.host)s
CXXFLAGS.host ?= $(CPPFLAGS_host) $(CXXFLAGS_host)
LINK.host ?= %(LINK.host)s
LDFLAGS.host ?=
AR.host ?= %(AR.host)s

# Define a dir function that can handle spaces.
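# (Illustrative, not part of the original header: with the helpers below,
#   $(call dirx,out/Chromium Framework.framework/foo)
# temporarily rewrites the space, applies $(dir), and restores it, yielding
#   out/Chromium Framework.framework/
# where plain $(dir) would split the path at the space.)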
# http://www.gnu.org/software/make/manual/make.html#Syntax-of-Functions
# "leading spaces cannot appear in the text of the first argument as written.
# These characters can be put into the argument value by variable
# substitution."
empty :=
space := $(empty) $(empty)

# http://stackoverflow.com/questions/1189781/using-make-dir-or-notdir-on-a-path-with-spaces
replace_spaces = $(subst $(space),""" + SPACE_REPLACEMENT + """,$1)
unreplace_spaces = $(subst """ + SPACE_REPLACEMENT + """,$(space),$1)
dirx = $(call unreplace_spaces,$(dir $(call replace_spaces,$1)))

# Flags to make gcc output dependency info.  Note that you need to be
# careful here to use the flags that ccache and distcc can understand.
# We write to a dep file on the side first and then rename at the end
# so we can't end up with a broken dep file.
depfile = $(depsdir)/$(call replace_spaces,$@).d
DEPFLAGS = -MMD -MF $(depfile).raw

# We have to fixup the deps output in a few ways.
# (1) the file output should mention the proper .o file.
# ccache or distcc lose the path to the target, so we convert a rule of
# the form:
#   foobar.o: DEP1 DEP2
# into
#   path/to/foobar.o: DEP1 DEP2
# (2) we want missing files not to cause us to fail to build.
# We want to rewrite
#   foobar.o: DEP1 DEP2 \\
#               DEP3
# to
#   DEP1:
#   DEP2:
#   DEP3:
# so if the files are missing, they're just considered phony rules.
# We have to do some pretty insane escaping to get those backslashes
# and dollar signs past make, the shell, and sed at the same time.
# Doesn't work with spaces, but that's fine: .d files have spaces in
# their names replaced with other characters.""" r"""
define fixup_dep
# The depfile may not exist if the input file didn't have any #includes.
touch $(depfile).raw
# Fixup path as in (1).
sed -e "s|^$(notdir $@)|$@|" $(depfile).raw >> $(depfile)
# Add extra rules as in (2).
# We remove slashes and replace spaces with new lines;
# remove blank lines;
# delete the first line and append a colon to the remaining lines.
sed -e 's|\\||' -e 'y| |\n|' $(depfile).raw |\
  grep -v '^$$'                             |\
  sed -e 1d -e 's|$$|:|'                     \
    >> $(depfile)
rm $(depfile).raw
endef
""" """
# Command definitions:
# - cmd_foo is the actual command to run;
# - quiet_cmd_foo is the brief-output summary of the command.

quiet_cmd_cc = CC($(TOOLSET)) $@
cmd_cc = $(CC.$(TOOLSET)) $(GYP_CFLAGS) $(DEPFLAGS) $(CFLAGS.$(TOOLSET)) -c -o $@ $<

quiet_cmd_cxx = CXX($(TOOLSET)) $@
cmd_cxx = $(CXX.$(TOOLSET)) $(GYP_CXXFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $<
%(extra_commands)s
quiet_cmd_touch = TOUCH $@
cmd_touch = touch $@

quiet_cmd_copy = COPY $@
# send stderr to /dev/null to ignore messages when linking directories.
cmd_copy = rm -rf "$@" && cp %(copy_archive_args)s "$<" "$@"

%(link_commands)s
""" r"""
# Define an escape_quotes function to escape single quotes.
# This allows us to handle quotes properly as long as we always
# use single quotes and escape_quotes.
escape_quotes = $(subst ','\'',$(1))
# This comment is here just to include a ' to unconfuse syntax highlighting.
# Define an escape_vars function to escape '$' variable syntax.
# This allows us to read/write command lines with shell variables (e.g.
# $LD_LIBRARY_PATH), without triggering make substitution.
escape_vars = $(subst $$,$$$$,$(1))
# Helper that expands to a shell command to echo a string exactly as it is in
# make.  This uses printf instead of echo because printf's behaviour with
# respect to escape sequences is more portable than echo's across different
# shells (e.g., dash, bash).
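# (Illustrative note, not part of the original header: bash's echo prints
# 'a\nc' with a literal backslash-n while dash's echo expands it to a
# newline; printf '%%s\n' gives the same output under both shells.)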
exact_echo = printf '%%s\n' '$(call escape_quotes,$(1))'
"""
"""
# Helper to compare the command we're about to run against the command
# we logged the last time we ran the command.  Produces an empty
# string (false) when the commands match.
# Tricky point: Make has no string-equality test function.
# The kernel uses the following, but it seems like it would have false
# positives, where one string reordered its arguments.
#   arg_check = $(strip $(filter-out $(cmd_$(1)), $(cmd_$@)) \\
#                       $(filter-out $(cmd_$@), $(cmd_$(1))))
# We instead substitute each for the empty string into the other, and
# say they're equal if both substitutions produce the empty string.
# .d files contain """ + SPACE_REPLACEMENT + \
""" instead of spaces, take that into account.
command_changed = $(or $(subst $(cmd_$(1)),,$(cmd_$(call replace_spaces,$@))),\\
                       $(subst $(cmd_$(call replace_spaces,$@)),,$(cmd_$(1))))

# Helper that is non-empty when a prerequisite changes.
# Normally make does this implicitly, but we force rules to always run
# so we can check their command lines.
#   $? -- new prerequisites
#   $| -- order-only dependencies
prereq_changed = $(filter-out FORCE_DO_CMD,$(filter-out $|,$?))

# Helper that executes all postbuilds until one fails.
define do_postbuilds
  @E=0;\\
  for p in $(POSTBUILDS); do\\
    eval $$p;\\
    E=$$?;\\
    if [ $$E -ne 0 ]; then\\
      break;\\
    fi;\\
  done;\\
  if [ $$E -ne 0 ]; then\\
    rm -rf "$@";\\
    exit $$E;\\
  fi
endef

# do_cmd: run a command via the above cmd_foo names, if necessary.
# Should always run for a given target to handle command-line changes.
# Second argument, if non-zero, makes it do asm/C/C++ dependency munging.
# Third argument, if non-zero, makes it do POSTBUILDS processing.
# Note: We intentionally do NOT call dirx for depfile, since it contains """ + \
SPACE_REPLACEMENT + """ for
# spaces already and dirx strips the """ + SPACE_REPLACEMENT + \
""" characters.
define do_cmd
$(if $(or $(command_changed),$(prereq_changed)),
  @$(call exact_echo, $($(quiet)cmd_$(1)))
  @mkdir -p "$(call dirx,$@)" "$(dir $(depfile))"
  $(if $(findstring flock,$(word %(flock_index)d,$(cmd_$1))),
    @$(cmd_$(1))
    @echo "  $(quiet_cmd_$(1)): Finished",
    @$(cmd_$(1))
  )
  @$(call exact_echo,$(call escape_vars,cmd_$(call replace_spaces,$@) := $(cmd_$(1)))) > $(depfile)
  @$(if $(2),$(fixup_dep))
  $(if $(and $(3), $(POSTBUILDS)),
    $(call do_postbuilds)
  )
)
endef

# Declare the "%(default_target)s" target first so it is the default,
# even though we don't have the deps yet.
.PHONY: %(default_target)s
%(default_target)s:

# make looks for ways to re-generate included makefiles, but in our case, we
# don't have a direct way.  Explicitly telling make that it has nothing to do
# for them makes it go faster.
%%.d: ;

# Use FORCE_DO_CMD to force a target to run.  Should be coupled with
# do_cmd.
.PHONY: FORCE_DO_CMD
FORCE_DO_CMD:

""")


SHARED_HEADER_MAC_COMMANDS = """
quiet_cmd_objc = CXX($(TOOLSET)) $@
cmd_objc = $(CC.$(TOOLSET)) $(GYP_OBJCFLAGS) $(DEPFLAGS) -c -o $@ $<

quiet_cmd_objcxx = CXX($(TOOLSET)) $@
cmd_objcxx = $(CXX.$(TOOLSET)) $(GYP_OBJCXXFLAGS) $(DEPFLAGS) -c -o $@ $<

# Commands for precompiled header files.
quiet_cmd_pch_c = CXX($(TOOLSET)) $@
cmd_pch_c = $(CC.$(TOOLSET)) $(GYP_PCH_CFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $<
quiet_cmd_pch_cc = CXX($(TOOLSET)) $@
cmd_pch_cc = $(CC.$(TOOLSET)) $(GYP_PCH_CXXFLAGS) $(DEPFLAGS) $(CXXFLAGS.$(TOOLSET)) -c -o $@ $<
quiet_cmd_pch_m = CXX($(TOOLSET)) $@
cmd_pch_m = $(CC.$(TOOLSET)) $(GYP_PCH_OBJCFLAGS) $(DEPFLAGS) -c -o $@ $<
quiet_cmd_pch_mm = CXX($(TOOLSET)) $@
cmd_pch_mm = $(CC.$(TOOLSET)) $(GYP_PCH_OBJCXXFLAGS) $(DEPFLAGS) -c -o $@ $<

# gyp-mac-tool is written next to the root Makefile by gyp.
# Use $(4) for the command, since $(2) and $(3) are used as flag by do_cmd
# already.
quiet_cmd_mac_tool = MACTOOL $(4) $<
cmd_mac_tool = ./gyp-mac-tool $(4) $< "$@"

quiet_cmd_mac_package_framework = PACKAGE FRAMEWORK $@
cmd_mac_package_framework = ./gyp-mac-tool package-framework "$@" $(4)

quiet_cmd_infoplist = INFOPLIST $@
cmd_infoplist = $(CC.$(TOOLSET)) -E -P -Wno-trigraphs -x c $(INFOPLIST_DEFINES) "$<" -o "$@"
"""


def WriteRootHeaderSuffixRules(writer):
  extensions = sorted(COMPILABLE_EXTENSIONS.keys(), key=str.lower)

  writer.write('# Suffix rules, putting all outputs into $(obj).\n')
  for ext in extensions:
    writer.write('$(obj).$(TOOLSET)/%%.o: $(srcdir)/%%%s FORCE_DO_CMD\n' % ext)
    writer.write('\t@$(call do_cmd,%s,1)\n' % COMPILABLE_EXTENSIONS[ext])

  writer.write('\n# Try building from generated source, too.\n')
  for ext in extensions:
    writer.write(
        '$(obj).$(TOOLSET)/%%.o: $(obj).$(TOOLSET)/%%%s FORCE_DO_CMD\n' % ext)
    writer.write('\t@$(call do_cmd,%s,1)\n' % COMPILABLE_EXTENSIONS[ext])
  writer.write('\n')
  for ext in extensions:
    writer.write('$(obj).$(TOOLSET)/%%.o: $(obj)/%%%s FORCE_DO_CMD\n' % ext)
    writer.write('\t@$(call do_cmd,%s,1)\n' % COMPILABLE_EXTENSIONS[ext])
  writer.write('\n')


SHARED_HEADER_SUFFIX_RULES_COMMENT1 = ("""\
# Suffix rules, putting all outputs into $(obj).
""")


SHARED_HEADER_SUFFIX_RULES_COMMENT2 = ("""\
# Try building from generated source, too.
""")


SHARED_FOOTER = """\
# "all" is a concatenation of the "all" targets from all the included
# sub-makefiles. This is just here to clarify.
all:

# Add in dependency-tracking rules.  $(all_deps) is the list of every single
# target in our tree.  Only consider the ones with .d (dependency) info:
d_files := $(wildcard $(foreach f,$(all_deps),$(depsdir)/$(f).d))
ifneq ($(d_files),)
  include $(d_files)
endif
"""

header = """\
# This file is generated by gyp; do not edit.

"""

# Maps every compilable file extension to the do_cmd that compiles it.
COMPILABLE_EXTENSIONS = {
  '.c': 'cc',
  '.cc': 'cxx',
  '.cpp': 'cxx',
  '.cxx': 'cxx',
  '.s': 'cc',
  '.S': 'cc',
}


def Compilable(filename):
  """Return true if the file is compilable (should be in OBJS)."""
  for res in (filename.endswith(e) for e in COMPILABLE_EXTENSIONS):
    if res:
      return True
  return False


def Linkable(filename):
  """Return true if the file is linkable (should be on the link line)."""
  return filename.endswith('.o')


def Target(filename):
  """Translate a compilable filename to its .o target."""
  return os.path.splitext(filename)[0] + '.o'


def EscapeShellArgument(s):
  """Quotes an argument so that it will be interpreted literally by a POSIX
  shell. Taken from
  http://stackoverflow.com/questions/35817/whats-the-best-way-to-escape-ossystem-calls-in-python
  """
  return "'" + s.replace("'", "'\\''") + "'"


def EscapeMakeVariableExpansion(s):
  """Make has its own variable expansion syntax using $.
  We must escape it for the string to be interpreted literally."""
  return s.replace('$', '$$')


def EscapeCppDefine(s):
  """Escapes a CPP define so that it will reach the compiler unaltered."""
  s = EscapeShellArgument(s)
  s = EscapeMakeVariableExpansion(s)
  # '#' characters must be escaped even embedded in a string, else Make will
  # treat it as the start of a comment.
  return s.replace('#', r'\#')


def QuoteIfNecessary(string):
  """TODO: Should this ideally be replaced with one or more of the above
     functions?"""
  if '"' in string:
    string = '"' + string.replace('"', '\\"') + '"'
  return string


def StringToMakefileVariable(string):
  """Convert a string to a value that is acceptable as a make variable name."""
  return re.sub('[^a-zA-Z0-9_]', '_', string)


srcdir_prefix = ''
def Sourceify(path):
  """Convert a path to its source directory form."""
  if '$(' in path:
    return path
  if os.path.isabs(path):
    return path
  return srcdir_prefix + path


def QuoteSpaces(s, quote=r'\ '):
  return s.replace(' ', quote)


# TODO: Avoid code duplication with _ValidateSourcesForMSVSProject in msvs.py.
def _ValidateSourcesForOSX(spec, all_sources):
  """Makes sure that duplicate basenames are not specified in the source list.

  Arguments:
    spec: The target dictionary containing the properties of the target.
  """
  if spec.get('type', None) != 'static_library':
    return

  basenames = {}
  for source in all_sources:
    name, ext = os.path.splitext(source)
    is_compiled_file = ext in [
        '.c', '.cc', '.cpp', '.cxx', '.m', '.mm', '.s', '.S']
    if not is_compiled_file:
      continue
    basename = os.path.basename(name)  # Don't include extension.
    basenames.setdefault(basename, []).append(source)

  error = ''
  for basename, files in basenames.iteritems():
    if len(files) > 1:
      error += '  %s: %s\n' % (basename, ' '.join(files))

  if error:
    print('static library %s has several files with the same basename:\n' %
          spec['target_name'] + error + 'libtool on OS X will generate' +
          ' warnings for them.')
    raise GypError('Duplicate basenames in sources section, see list above')


# Map from qualified target to path to output.
target_outputs = {}
# Map from qualified target to any linkable output.  A subset
# of target_outputs.  E.g. when mybinary depends on liba, we want to
# include liba in the linker line; when otherbinary depends on
# mybinary, we just want to build mybinary first.
target_link_deps = {}


class MakefileWriter(object):
  """MakefileWriter packages up the writing of one target-specific foobar.mk.

  Its only real entry point is Write(), and is mostly used for namespacing.
  """

  def __init__(self, generator_flags, flavor):
    self.generator_flags = generator_flags
    self.flavor = flavor

    self.suffix_rules_srcdir = {}
    self.suffix_rules_objdir1 = {}
    self.suffix_rules_objdir2 = {}

    # Generate suffix rules for all compilable extensions.
    for ext in COMPILABLE_EXTENSIONS.keys():
      # Suffix rules for source folder.
      self.suffix_rules_srcdir.update({ext: ("""\
$(obj).$(TOOLSET)/$(TARGET)/%%.o: $(srcdir)/%%%s FORCE_DO_CMD
\t@$(call do_cmd,%s,1)
""" % (ext, COMPILABLE_EXTENSIONS[ext]))})
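      # (Illustrative, not part of the original source: for ext '.cc' the
      # template above expands to
      #   $(obj).$(TOOLSET)/$(TARGET)/%.o: $(srcdir)/%.cc FORCE_DO_CMD
      #           @$(call do_cmd,cxx,1)
      # since '.cc' maps to the 'cxx' command in COMPILABLE_EXTENSIONS; the
      # two objdir variants below differ only in which tree holds the source.)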
      # Suffix rules for generated source files.
      self.suffix_rules_objdir1.update({ext: ("""\
$(obj).$(TOOLSET)/$(TARGET)/%%.o: $(obj).$(TOOLSET)/%%%s FORCE_DO_CMD
\t@$(call do_cmd,%s,1)
""" % (ext, COMPILABLE_EXTENSIONS[ext]))})
      self.suffix_rules_objdir2.update({ext: ("""\
$(obj).$(TOOLSET)/$(TARGET)/%%.o: $(obj)/%%%s FORCE_DO_CMD
\t@$(call do_cmd,%s,1)
""" % (ext, COMPILABLE_EXTENSIONS[ext]))})

  def Write(self, qualified_target, base_path, output_filename, spec, configs,
            part_of_all):
    """The main entry point: writes a .mk file for a single target.

    Arguments:
      qualified_target: target we're generating
      base_path: path relative to source root we're building in, used to
                 resolve target-relative paths
      output_filename: output .mk file name to write
      spec, configs: gyp info
      part_of_all: flag indicating this target is part of 'all'
    """
    gyp.common.EnsureDirExists(output_filename)

    self.fp = open(output_filename, 'w')

    self.fp.write(header)

    self.qualified_target = qualified_target
    self.path = base_path
    self.target = spec['target_name']
    self.type = spec['type']
    self.toolset = spec['toolset']

    self.is_mac_bundle = gyp.xcode_emulation.IsMacBundle(self.flavor, spec)
    if self.flavor == 'mac':
      self.xcode_settings = gyp.xcode_emulation.XcodeSettings(spec)
    else:
      self.xcode_settings = None

    deps, link_deps = self.ComputeDeps(spec)

    # Some of the generation below can add extra output, sources, or
    # link dependencies.  All of the out params of the functions that
    # follow use names like extra_foo.
    extra_outputs = []
    extra_sources = []
    extra_link_deps = []
    extra_mac_bundle_resources = []
    mac_bundle_deps = []

    if self.is_mac_bundle:
      self.output = self.ComputeMacBundleOutput(spec)
      self.output_binary = self.ComputeMacBundleBinaryOutput(spec)
    else:
      self.output = self.output_binary = self.ComputeOutput(spec)

    self.is_standalone_static_library = bool(
        spec.get('standalone_static_library', 0))
    self._INSTALLABLE_TARGETS = ('executable', 'loadable_module',
                                 'shared_library')
    if (self.is_standalone_static_library or
        self.type in self._INSTALLABLE_TARGETS):
      self.alias = os.path.basename(self.output)
      install_path = self._InstallableTargetInstallPath()
    else:
      self.alias = self.output
      install_path = self.output

    self.WriteLn("TOOLSET := " + self.toolset)
    self.WriteLn("TARGET := " + self.target)

    # Actions must come first, since they can generate more OBJs for use below.
    if 'actions' in spec:
      self.WriteActions(spec['actions'], extra_sources, extra_outputs,
                        extra_mac_bundle_resources, part_of_all)

    # Rules must be early like actions.
    if 'rules' in spec:
      self.WriteRules(spec['rules'], extra_sources, extra_outputs,
                      extra_mac_bundle_resources, part_of_all)

    if 'copies' in spec:
      self.WriteCopies(spec['copies'], extra_outputs, part_of_all)

    # Bundle resources.
    if self.is_mac_bundle:
      all_mac_bundle_resources = (
          spec.get('mac_bundle_resources', []) + extra_mac_bundle_resources)
      self.WriteMacBundleResources(all_mac_bundle_resources, mac_bundle_deps)
      self.WriteMacInfoPlist(mac_bundle_deps)

    # Sources.
    all_sources = spec.get('sources', []) + extra_sources
    if all_sources:
      if self.flavor == 'mac':
        # libtool on OS X generates warnings for duplicate basenames in the
        # same target.
        _ValidateSourcesForOSX(spec, all_sources)
      self.WriteSources(
          configs, deps, all_sources, extra_outputs,
          extra_link_deps, part_of_all,
          gyp.xcode_emulation.MacPrefixHeader(
              self.xcode_settings, lambda p: Sourceify(self.Absolutify(p)),
              self.Pchify))
      sources = filter(Compilable, all_sources)
      if sources:
        self.WriteLn(SHARED_HEADER_SUFFIX_RULES_COMMENT1)
        extensions = set([os.path.splitext(s)[1] for s in sources])
        for ext in extensions:
          if ext in self.suffix_rules_srcdir:
            self.WriteLn(self.suffix_rules_srcdir[ext])
        self.WriteLn(SHARED_HEADER_SUFFIX_RULES_COMMENT2)
        for ext in extensions:
          if ext in self.suffix_rules_objdir1:
            self.WriteLn(self.suffix_rules_objdir1[ext])
        for ext in extensions:
          if ext in self.suffix_rules_objdir2:
            self.WriteLn(self.suffix_rules_objdir2[ext])
        self.WriteLn('# End of this set of suffix rules')

        # Add dependency from bundle to bundle binary.
        if self.is_mac_bundle:
          mac_bundle_deps.append(self.output_binary)

    self.WriteTarget(spec, configs, deps, extra_link_deps + link_deps,
                     mac_bundle_deps, extra_outputs, part_of_all)

    # Update global list of target outputs, used in dependency tracking.
    target_outputs[qualified_target] = install_path

    # Update global list of link dependencies.
    if self.type in ('static_library', 'shared_library'):
      target_link_deps[qualified_target] = self.output_binary

    # Currently any versions have the same effect, but in the future the
    # behavior could be different.
    if self.generator_flags.get('android_ndk_version', None):
      self.WriteAndroidNdkModuleRule(self.target, all_sources, link_deps)

    self.fp.close()

  def WriteSubMake(self, output_filename, makefile_path, targets, build_dir):
    """Write a "sub-project" Makefile.

    This is a small wrapper Makefile that calls the top-level Makefile to
    build the targets from a single gyp file (i.e. a sub-project).

    Arguments:
      output_filename: sub-project Makefile name to write
      makefile_path: path to the top-level Makefile
      targets: list of "all" targets for this sub-project
      build_dir: build output directory, relative to the sub-project
    """
    gyp.common.EnsureDirExists(output_filename)
    self.fp = open(output_filename, 'w')
    self.fp.write(header)
    # For consistency with other builders, put sub-project build output in the
    # sub-project dir (see test/subdirectory/gyptest-subdir-all.py).
    self.WriteLn('export builddir_name ?= %s' %
                 os.path.join(os.path.dirname(output_filename), build_dir))
    self.WriteLn('.PHONY: all')
    self.WriteLn('all:')
    if makefile_path:
      makefile_path = ' -C ' + makefile_path
    self.WriteLn('\t$(MAKE)%s %s' % (makefile_path, ' '.join(targets)))
    self.fp.close()

  def WriteActions(self, actions, extra_sources, extra_outputs,
                   extra_mac_bundle_resources, part_of_all):
    """Write Makefile code for any 'actions' from the gyp input.

    extra_sources: a list that will be filled in with newly generated source
                   files, if any
    extra_outputs: a list that will be filled in with any outputs of these
                   actions (used to make other pieces dependent on these
                   actions)
    part_of_all: flag indicating this target is part of 'all'
    """
    env = self.GetSortedXcodeEnv()
    for action in actions:
      name = StringToMakefileVariable('%s_%s' % (self.qualified_target,
                                                 action['action_name']))
      self.WriteLn('### Rules for action "%s":' % action['action_name'])
      inputs = action['inputs']
      outputs = action['outputs']

      # Build up a list of outputs.
      # Collect the output dirs we'll need.
      dirs = set()
      for out in outputs:
        dir = os.path.split(out)[0]
        if dir:
          dirs.add(dir)
      if int(action.get('process_outputs_as_sources', False)):
        extra_sources += outputs
      if int(action.get('process_outputs_as_mac_bundle_resources', False)):
        extra_mac_bundle_resources += outputs

      # Write the actual command.
      action_commands = action['action']
      if self.flavor == 'mac':
        action_commands = [gyp.xcode_emulation.ExpandEnvVars(command, env)
                           for command in action_commands]
      command = gyp.common.EncodePOSIXShellList(action_commands)
      if 'message' in action:
        self.WriteLn('quiet_cmd_%s = ACTION %s $@' % (name, action['message']))
      else:
        self.WriteLn('quiet_cmd_%s = ACTION %s $@' % (name, name))
      if len(dirs) > 0:
        command = 'mkdir -p %s' % ' '.join(dirs) + '; ' + command

      cd_action = 'cd %s; ' % Sourceify(self.path or '.')

      # command and cd_action get written to a toplevel variable called
      # cmd_foo.  Toplevel variables can't handle things that change per
      # makefile like $(TARGET), so hardcode the target.
      command = command.replace('$(TARGET)', self.target)
      cd_action = cd_action.replace('$(TARGET)', self.target)

      # Set LD_LIBRARY_PATH in case the action runs an executable from this
      # build which links to shared libs from this build.
      # actions run on the host, so they should in theory only use host
      # libraries, but until everything is made cross-compile safe, also use
      # target libraries.
      # TODO(piman): when everything is cross-compile safe, remove lib.target
      self.WriteLn('cmd_%s = LD_LIBRARY_PATH=$(builddir)/lib.host:'
                   '$(builddir)/lib.target:$$LD_LIBRARY_PATH; '
                   'export LD_LIBRARY_PATH; '
                   '%s%s'
                   % (name, cd_action, command))
      self.WriteLn()
      outputs = map(self.Absolutify, outputs)

      # The makefile rules are all relative to the top dir, but the gyp actions
      # are defined relative to their containing dir.  This replaces the obj
      # variable for the action rule with an absolute version so that the
      # output goes in the right place.
      # Only write the 'obj' and 'builddir' rules for the "primary" output
      # (:1); it's superfluous for the "extra outputs", and this avoids
      # accidentally writing duplicate dummy rules for those outputs.
      # Same for environment.
      self.WriteLn("%s: obj := $(abs_obj)" % QuoteSpaces(outputs[0]))
      self.WriteLn("%s: builddir := $(abs_builddir)" % QuoteSpaces(outputs[0]))
      self.WriteSortedXcodeEnv(outputs[0], self.GetSortedXcodeEnv())

      for input in inputs:
        assert ' ' not in input, (
            "Spaces in action input filenames not supported (%s)" % input)
      for output in outputs:
        assert ' ' not in output, (
            "Spaces in action output filenames not supported (%s)" % output)

      # See the comment in WriteCopies about expanding env vars.
      outputs = [gyp.xcode_emulation.ExpandEnvVars(o, env) for o in outputs]
      inputs = [gyp.xcode_emulation.ExpandEnvVars(i, env) for i in inputs]

      self.WriteDoCmd(outputs, map(Sourceify, map(self.Absolutify, inputs)),
                      part_of_all=part_of_all, command=name)

      # Stuff the outputs in a variable so we can refer to them later.
      outputs_variable = 'action_%s_outputs' % name
      self.WriteLn('%s := %s' % (outputs_variable, ' '.join(outputs)))
      extra_outputs.append('$(%s)' % outputs_variable)
      self.WriteLn()

    self.WriteLn()

  def WriteRules(self, rules, extra_sources, extra_outputs,
                 extra_mac_bundle_resources, part_of_all):
    """Write Makefile code for any 'rules' from the gyp input.
    extra_sources: a list that will be filled in with newly generated source
                   files, if any
    extra_outputs: a list that will be filled in with any outputs of these
                   rules (used to make other pieces dependent on these rules)
    part_of_all: flag indicating this target is part of 'all'
    """
    env = self.GetSortedXcodeEnv()
    for rule in rules:
      name = StringToMakefileVariable('%s_%s' % (self.qualified_target,
                                                 rule['rule_name']))
      count = 0
      self.WriteLn('### Generated for rule %s:' % name)

      all_outputs = []

      for rule_source in rule.get('rule_sources', []):
        dirs = set()
        (rule_source_dirname, rule_source_basename) = \
            os.path.split(rule_source)
        (rule_source_root, rule_source_ext) = \
            os.path.splitext(rule_source_basename)

        outputs = [self.ExpandInputRoot(out, rule_source_root,
                                        rule_source_dirname)
                   for out in rule['outputs']]

        for out in outputs:
          dir = os.path.dirname(out)
          if dir:
            dirs.add(dir)
        if int(rule.get('process_outputs_as_sources', False)):
          extra_sources += outputs
        if int(rule.get('process_outputs_as_mac_bundle_resources', False)):
          extra_mac_bundle_resources += outputs
        inputs = map(Sourceify, map(self.Absolutify, [rule_source] +
                                    rule.get('inputs', [])))
        actions = ['$(call do_cmd,%s_%d)' % (name, count)]

        if name == 'resources_grit':
          # HACK: This is ugly.  Grit intentionally doesn't touch the
          # timestamp of its output file when the file doesn't change,
          # which is fine in hash-based dependency systems like scons
          # and forge, but not kosher in the make world.  After some
          # discussion, hacking around it here seems like the least
          # amount of pain.
          actions += ['@touch --no-create $@']

        # See the comment in WriteCopies about expanding env vars.
        outputs = [gyp.xcode_emulation.ExpandEnvVars(o, env) for o in outputs]
        inputs = [gyp.xcode_emulation.ExpandEnvVars(i, env) for i in inputs]

        outputs = map(self.Absolutify, outputs)
        all_outputs += outputs
        # Only write the 'obj' and 'builddir' rules for the "primary" output
        # (:1); it's superfluous for the "extra outputs", and this avoids
        # accidentally writing duplicate dummy rules for those outputs.
        self.WriteLn('%s: obj := $(abs_obj)' % outputs[0])
        self.WriteLn('%s: builddir := $(abs_builddir)' % outputs[0])
        self.WriteMakeRule(outputs, inputs, actions,
                           command="%s_%d" % (name, count))
        # Spaces in rule filenames are not supported, but rule variables have
        # spaces in them (e.g. RULE_INPUT_PATH expands to '$(abspath $<)').
        # The spaces within the variables are valid, so remove the variables
        # before checking.
        variables_with_spaces = re.compile(r'\$\([^ ]* \$<\)')
        for output in outputs:
          output = re.sub(variables_with_spaces, '', output)
          assert ' ' not in output, (
              "Spaces in rule filenames not yet supported (%s)" % output)
        self.WriteLn('all_deps += %s' % ' '.join(outputs))

        action = [self.ExpandInputRoot(ac, rule_source_root,
                                       rule_source_dirname)
                  for ac in rule['action']]
        mkdirs = ''
        if len(dirs) > 0:
          mkdirs = 'mkdir -p %s; ' % ' '.join(dirs)
        cd_action = 'cd %s; ' % Sourceify(self.path or '.')

        # action, cd_action, and mkdirs get written to a toplevel variable
        # called cmd_foo.  Toplevel variables can't handle things that change
        # per makefile like $(TARGET), so hardcode the target.
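        # (Illustrative, not part of the original source: a rule action like
        #   'python gen.py --out=$(obj)/$(TARGET)'
        # is stored in a global cmd_... variable, so '$(TARGET)' is replaced
        # below with the concrete name, e.g. '--out=$(obj)/my_target'.)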
        if self.flavor == 'mac':
          action = [gyp.xcode_emulation.ExpandEnvVars(command, env)
                    for command in action]
        action = gyp.common.EncodePOSIXShellList(action)
        action = action.replace('$(TARGET)', self.target)
        cd_action = cd_action.replace('$(TARGET)', self.target)
        mkdirs = mkdirs.replace('$(TARGET)', self.target)

        # Set LD_LIBRARY_PATH in case the rule runs an executable from this
        # build which links to shared libs from this build.
        # rules run on the host, so they should in theory only use host
        # libraries, but until everything is made cross-compile safe, also use
        # target libraries.
        # TODO(piman): when everything is cross-compile safe, remove lib.target
        self.WriteLn(
            "cmd_%(name)s_%(count)d = LD_LIBRARY_PATH="
            "$(builddir)/lib.host:$(builddir)/lib.target:$$LD_LIBRARY_PATH; "
            "export LD_LIBRARY_PATH; "
            "%(cd_action)s%(mkdirs)s%(action)s" % {
                'action': action,
                'cd_action': cd_action,
                'count': count,
                'mkdirs': mkdirs,
                'name': name,
            })
        self.WriteLn(
            'quiet_cmd_%(name)s_%(count)d = RULE %(name)s_%(count)d $@' % {
                'count': count,
                'name': name,
            })
        self.WriteLn()
        count += 1

      outputs_variable = 'rule_%s_outputs' % name
      self.WriteList(all_outputs, outputs_variable)
      extra_outputs.append('$(%s)' % outputs_variable)

      self.WriteLn('### Finished generating for rule: %s' % name)
      self.WriteLn()
    self.WriteLn('### Finished generating for all rules')
    self.WriteLn('')

  def WriteCopies(self, copies, extra_outputs, part_of_all):
    """Write Makefile code for any 'copies' from the gyp input.

    extra_outputs: a list that will be filled in with any outputs of this
                   action (used to make other pieces dependent on this action)
    part_of_all: flag indicating this target is part of 'all'
    """
    self.WriteLn('### Generated for copy rule.')

    variable = StringToMakefileVariable(self.qualified_target + '_copies')
    outputs = []
    for copy in copies:
      for path in copy['files']:
        # Absolutify() may call normpath, and will strip trailing slashes.
        path = Sourceify(self.Absolutify(path))
        filename = os.path.split(path)[1]
        output = Sourceify(self.Absolutify(os.path.join(copy['destination'],
                                                        filename)))

        # If the output path has variables in it, which happens in practice for
        # 'copies', writing the environment as target-local doesn't work,
        # because the variables are already needed for the target name.
        # Copying the environment variables into global make variables doesn't
        # work either, because then the .d files will potentially contain
        # spaces after variable expansion, and .d file handling cannot handle
        # spaces.
        # As a workaround, manually expand variables at gyp time.  Since
        # 'copies' can't run scripts, there's no need to write the env then.
        # WriteDoCmd() will escape spaces for .d files.
        env = self.GetSortedXcodeEnv()
        output = gyp.xcode_emulation.ExpandEnvVars(output, env)
        path = gyp.xcode_emulation.ExpandEnvVars(path, env)
        self.WriteDoCmd([output], [path], 'copy', part_of_all)
        outputs.append(output)
    self.WriteLn('%s = %s' % (variable, ' '.join(map(QuoteSpaces, outputs))))
    extra_outputs.append('$(%s)' % variable)
    self.WriteLn()

  def WriteMacBundleResources(self, resources, bundle_deps):
    """Writes Makefile code for 'mac_bundle_resources'."""
    self.WriteLn('### Generated for mac_bundle_resources')

    for output, res in gyp.xcode_emulation.GetMacBundleResources(
        generator_default_variables['PRODUCT_DIR'], self.xcode_settings,
        map(Sourceify, map(self.Absolutify, resources))):
      _, ext = os.path.splitext(output)
      if ext != '.xcassets':
        # Make does not support '.xcassets' emulation.
        self.WriteDoCmd([output], [res], 'mac_tool,,,copy-bundle-resource',
                        part_of_all=True)
        bundle_deps.append(output)

  def WriteMacInfoPlist(self, bundle_deps):
    """Write Makefile code for bundle Info.plist files."""
    info_plist, out, defines, extra_env = gyp.xcode_emulation.GetMacInfoPlist(
        generator_default_variables['PRODUCT_DIR'], self.xcode_settings,
        lambda p: Sourceify(self.Absolutify(p)))
    if not info_plist:
      return
    if defines:
      # Create an intermediate file to store preprocessed results.
      intermediate_plist = ('$(obj).$(TOOLSET)/$(TARGET)/' +
          os.path.basename(info_plist))
      self.WriteList(defines, intermediate_plist + ': INFOPLIST_DEFINES', '-D',
          quoter=EscapeCppDefine)
      self.WriteMakeRule([intermediate_plist], [info_plist],
          ['$(call do_cmd,infoplist)',
           # "Convert" the plist so that any weird whitespace changes from the
           # preprocessor do not affect the XML parser in mac_tool.
           '@plutil -convert xml1 $@ $@'])
      info_plist = intermediate_plist
    # plists can contain envvars; substitute them into the file.
    self.WriteSortedXcodeEnv(
        out, self.GetSortedXcodeEnv(additional_settings=extra_env))
    self.WriteDoCmd([out], [info_plist], 'mac_tool,,,copy-info-plist',
                    part_of_all=True)
    bundle_deps.append(out)

  def WriteSources(self, configs, deps, sources,
                   extra_outputs, extra_link_deps,
                   part_of_all, precompiled_header):
    """Write Makefile code for any 'sources' from the gyp input.
    These are source files necessary to build the current target.

    configs, deps, sources: input from gyp.
    extra_outputs: a list of extra outputs this action should be dependent on;
                   used to serialize action/rules before compilation
    extra_link_deps: a list that will be filled in with any outputs of
                     compilation (to be used in link lines)
    part_of_all: flag indicating this target is part of 'all'
    """

    # Write configuration-specific variables for CFLAGS, etc.
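    # (Illustrative, not part of the original source: for a config named
    # 'Release' with defines=['NDEBUG'] and cflags=['-O2'], the loop below
    # emits roughly
    #   DEFS_Release := -DNDEBUG
    #   CFLAGS_Release := -O2
    # which the target-local GYP_CFLAGS assignment later selects via
    # $(BUILDTYPE).)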
    for configname in sorted(configs.keys()):
      config = configs[configname]
      self.WriteList(config.get('defines'), 'DEFS_%s' % configname,
          prefix='-D', quoter=EscapeCppDefine)

      if self.flavor == 'mac':
        cflags = self.xcode_settings.GetCflags(configname)
        cflags_c = self.xcode_settings.GetCflagsC(configname)
        cflags_cc = self.xcode_settings.GetCflagsCC(configname)
        cflags_objc = self.xcode_settings.GetCflagsObjC(configname)
        cflags_objcc = self.xcode_settings.GetCflagsObjCC(configname)
      else:
        cflags = config.get('cflags')
        cflags_c = config.get('cflags_c')
        cflags_cc = config.get('cflags_cc')

      self.WriteLn("# Flags passed to all source files.")
      self.WriteList(cflags, 'CFLAGS_%s' % configname)
      self.WriteLn("# Flags passed to only C files.")
      self.WriteList(cflags_c, 'CFLAGS_C_%s' % configname)
      self.WriteLn("# Flags passed to only C++ files.")
      self.WriteList(cflags_cc, 'CFLAGS_CC_%s' % configname)
      if self.flavor == 'mac':
        self.WriteLn("# Flags passed to only ObjC files.")
        self.WriteList(cflags_objc, 'CFLAGS_OBJC_%s' % configname)
        self.WriteLn("# Flags passed to only ObjC++ files.")
        self.WriteList(cflags_objcc, 'CFLAGS_OBJCC_%s' % configname)
      includes = config.get('include_dirs')
      if includes:
        includes = map(Sourceify, map(self.Absolutify, includes))
      self.WriteList(includes, 'INCS_%s' % configname, prefix='-I')

    compilable = filter(Compilable, sources)
    objs = map(self.Objectify, map(self.Absolutify, map(Target, compilable)))
    self.WriteList(objs, 'OBJS')

    for obj in objs:
      assert ' ' not in obj, (
          "Spaces in object filenames not supported (%s)" % obj)
    self.WriteLn('# Add to the list of files we specially track '
                 'dependencies for.')
    self.WriteLn('all_deps += $(OBJS)')
    self.WriteLn()

    # Make sure our dependencies are built first.
    if deps:
      self.WriteMakeRule(['$(OBJS)'], deps,
                         comment = 'Make sure our dependencies are built '
                                   'before any of us.',
                         order_only = True)

    # Make sure the actions and rules run first.
    # If they generate any extra headers etc., the per-.o file dep tracking
    # will catch the proper rebuilds, so order only is still ok here.
    if extra_outputs:
      self.WriteMakeRule(['$(OBJS)'], extra_outputs,
                         comment = 'Make sure our actions/rules run '
                                   'before any of us.',
                         order_only = True)

    pchdeps = precompiled_header.GetObjDependencies(compilable, objs)
    if pchdeps:
      self.WriteLn('# Dependencies from obj files to their precompiled headers')
      for source, obj, gch in pchdeps:
        self.WriteLn('%s: %s' % (obj, gch))
      self.WriteLn('# End precompiled header dependencies')

    if objs:
      extra_link_deps.append('$(OBJS)')
      self.WriteLn("""\
# CFLAGS et al overrides must be target-local.
# See "Target-specific Variable Values" in the GNU Make manual.""") self.WriteLn("$(OBJS): TOOLSET := $(TOOLSET)") self.WriteLn("$(OBJS): GYP_CFLAGS := " "$(DEFS_$(BUILDTYPE)) " "$(INCS_$(BUILDTYPE)) " "%s " % precompiled_header.GetInclude('c') + "$(CFLAGS_$(BUILDTYPE)) " "$(CFLAGS_C_$(BUILDTYPE))") self.WriteLn("$(OBJS): GYP_CXXFLAGS := " "$(DEFS_$(BUILDTYPE)) " "$(INCS_$(BUILDTYPE)) " "%s " % precompiled_header.GetInclude('cc') + "$(CFLAGS_$(BUILDTYPE)) " "$(CFLAGS_CC_$(BUILDTYPE))") if self.flavor == 'mac': self.WriteLn("$(OBJS): GYP_OBJCFLAGS := " "$(DEFS_$(BUILDTYPE)) " "$(INCS_$(BUILDTYPE)) " "%s " % precompiled_header.GetInclude('m') + "$(CFLAGS_$(BUILDTYPE)) " "$(CFLAGS_C_$(BUILDTYPE)) " "$(CFLAGS_OBJC_$(BUILDTYPE))") self.WriteLn("$(OBJS): GYP_OBJCXXFLAGS := " "$(DEFS_$(BUILDTYPE)) " "$(INCS_$(BUILDTYPE)) " "%s " % precompiled_header.GetInclude('mm') + "$(CFLAGS_$(BUILDTYPE)) " "$(CFLAGS_CC_$(BUILDTYPE)) " "$(CFLAGS_OBJCC_$(BUILDTYPE))") self.WritePchTargets(precompiled_header.GetPchBuildCommands()) # If there are any object files in our input file list, link them into our # output. extra_link_deps += filter(Linkable, sources) self.WriteLn() def WritePchTargets(self, pch_commands): """Writes make rules to compile prefix headers.""" if not pch_commands: return for gch, lang_flag, lang, input in pch_commands: extra_flags = { 'c': '$(CFLAGS_C_$(BUILDTYPE))', 'cc': '$(CFLAGS_CC_$(BUILDTYPE))', 'm': '$(CFLAGS_C_$(BUILDTYPE)) $(CFLAGS_OBJC_$(BUILDTYPE))', 'mm': '$(CFLAGS_CC_$(BUILDTYPE)) $(CFLAGS_OBJCC_$(BUILDTYPE))', }[lang] var_name = { 'c': 'GYP_PCH_CFLAGS', 'cc': 'GYP_PCH_CXXFLAGS', 'm': 'GYP_PCH_OBJCFLAGS', 'mm': 'GYP_PCH_OBJCXXFLAGS', }[lang] self.WriteLn("%s: %s := %s " % (gch, var_name, lang_flag) + "$(DEFS_$(BUILDTYPE)) " "$(INCS_$(BUILDTYPE)) " "$(CFLAGS_$(BUILDTYPE)) " + extra_flags) self.WriteLn('%s: %s FORCE_DO_CMD' % (gch, input)) self.WriteLn('\t@$(call do_cmd,pch_%s,1)' % lang) self.WriteLn('') assert ' ' not in gch, ( "Spaces in gch filenames not supported (%s)" % gch) self.WriteLn('all_deps += %s' % gch) self.WriteLn('') def ComputeOutputBasename(self, spec): """Return the 'output basename' of a gyp spec. E.g., the loadable module 'foobar' in directory 'baz' will produce 'libfoobar.so' """ assert not self.is_mac_bundle if self.flavor == 'mac' and self.type in ( 'static_library', 'executable', 'shared_library', 'loadable_module'): return self.xcode_settings.GetExecutablePath() target = spec['target_name'] target_prefix = '' target_ext = '' if self.type == 'static_library': if target[:3] == 'lib': target = target[3:] target_prefix = 'lib' target_ext = '.a' elif self.type in ('loadable_module', 'shared_library'): if target[:3] == 'lib': target = target[3:] target_prefix = 'lib' target_ext = '.so' elif self.type == 'none': target = '%s.stamp' % target elif self.type != 'executable': print ("ERROR: What output file should be generated?", "type", self.type, "target", target) target_prefix = spec.get('product_prefix', target_prefix) target = spec.get('product_name', target) product_ext = spec.get('product_extension') if product_ext: target_ext = '.' + product_ext return target_prefix + target + target_ext def _InstallImmediately(self): return self.toolset == 'target' and self.flavor == 'mac' and self.type in ( 'static_library', 'executable', 'shared_library', 'loadable_module') def ComputeOutput(self, spec): """Return the 'output' (full output path) of a gyp spec. 
    E.g., the loadable module 'foobar' in directory 'baz' will produce
      '$(obj)/baz/libfoobar.so'
    """
    assert not self.is_mac_bundle

    path = os.path.join('$(obj).' + self.toolset, self.path)
    if self.type == 'executable' or self._InstallImmediately():
      path = '$(builddir)'
    path = spec.get('product_dir', path)
    return os.path.join(path, self.ComputeOutputBasename(spec))

  def ComputeMacBundleOutput(self, spec):
    """Return the 'output' (full output path) to a bundle output directory."""
    assert self.is_mac_bundle
    path = generator_default_variables['PRODUCT_DIR']
    return os.path.join(path, self.xcode_settings.GetWrapperName())

  def ComputeMacBundleBinaryOutput(self, spec):
    """Return the 'output' (full output path) to the binary in a bundle."""
    path = generator_default_variables['PRODUCT_DIR']
    return os.path.join(path, self.xcode_settings.GetExecutablePath())

  def ComputeDeps(self, spec):
    """Compute the dependencies of a gyp spec.

    Returns a tuple (deps, link_deps), where each is a list of
    filenames that will need to be put in front of make for either
    building (deps) or linking (link_deps).
    """
    deps = []
    link_deps = []
    if 'dependencies' in spec:
      deps.extend([target_outputs[dep] for dep in spec['dependencies']
                   if target_outputs[dep]])
      for dep in spec['dependencies']:
        if dep in target_link_deps:
          link_deps.append(target_link_deps[dep])
      deps.extend(link_deps)
      # TODO: It seems we need to transitively link in libraries (e.g. -lfoo)?
      # This hack makes it work:
      # link_deps.extend(spec.get('libraries', []))
    return (gyp.common.uniquer(deps), gyp.common.uniquer(link_deps))

  def WriteDependencyOnExtraOutputs(self, target, extra_outputs):
    self.WriteMakeRule([self.output_binary], extra_outputs,
                       comment = 'Build our special outputs first.',
                       order_only = True)

  def WriteTarget(self, spec, configs, deps, link_deps, bundle_deps,
                  extra_outputs, part_of_all):
    """Write Makefile code to produce the final target of the gyp spec.

    spec, configs: input from gyp.
    deps, link_deps: dependency lists; see ComputeDeps()
    extra_outputs: any extra outputs that our target should depend on
    part_of_all: flag indicating this target is part of 'all'
    """
    self.WriteLn('### Rules for final target.')

    if extra_outputs:
      self.WriteDependencyOnExtraOutputs(self.output_binary, extra_outputs)
      self.WriteMakeRule(extra_outputs, deps,
                         comment=('Preserve order dependency of '
                                  'special output on deps.'),
                         order_only = True)

    target_postbuilds = {}
    if self.type != 'none':
      for configname in sorted(configs.keys()):
        config = configs[configname]
        if self.flavor == 'mac':
          ldflags = self.xcode_settings.GetLdflags(configname,
              generator_default_variables['PRODUCT_DIR'],
              lambda p: Sourceify(self.Absolutify(p)))

          # TARGET_POSTBUILDS_$(BUILDTYPE) is added to postbuilds later on.
          gyp_to_build = gyp.common.InvertRelativePath(self.path)
          target_postbuild = self.xcode_settings.AddImplicitPostbuilds(
              configname,
              QuoteSpaces(os.path.normpath(os.path.join(gyp_to_build,
                                                        self.output))),
              QuoteSpaces(os.path.normpath(os.path.join(gyp_to_build,
                                                        self.output_binary))))
          if target_postbuild:
            target_postbuilds[configname] = target_postbuild
        else:
          ldflags = config.get('ldflags', [])

          # Compute an rpath for this output if needed.
          if any(dep.endswith('.so') or '.so.' in dep for dep in deps):
            # We want to get the literal string "$ORIGIN" into the link
            # command, so we need lots of escaping.
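            # (Illustrative, not part of the original source: in the raw
            # string below, make collapses '$$' to '$', leaving '\$ORIGIN'
            # for the shell, which in turn passes the literal '$ORIGIN'
            # through to the linker.)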
            ldflags.append(r'-Wl,-rpath=\$$ORIGIN/lib.%s/' % self.toolset)
            ldflags.append(r'-Wl,-rpath-link=\$(builddir)/lib.%s/' %
                           self.toolset)
        library_dirs = config.get('library_dirs', [])
        ldflags += [('-L%s' % library_dir) for library_dir in library_dirs]
        self.WriteList(ldflags, 'LDFLAGS_%s' % configname)
        if self.flavor == 'mac':
          self.WriteList(self.xcode_settings.GetLibtoolflags(configname),
                         'LIBTOOLFLAGS_%s' % configname)
      libraries = spec.get('libraries')
      if libraries:
        # Remove duplicate entries
        libraries = gyp.common.uniquer(libraries)
        if self.flavor == 'mac':
          libraries = self.xcode_settings.AdjustLibraries(libraries)
      self.WriteList(libraries, 'LIBS')
      self.WriteLn('%s: GYP_LDFLAGS := $(LDFLAGS_$(BUILDTYPE))' %
          QuoteSpaces(self.output_binary))
      self.WriteLn('%s: LIBS := $(LIBS)' % QuoteSpaces(self.output_binary))

      if self.flavor == 'mac':
        self.WriteLn('%s: GYP_LIBTOOLFLAGS := $(LIBTOOLFLAGS_$(BUILDTYPE))' %
            QuoteSpaces(self.output_binary))

    # Postbuild actions.  Like actions, but implicitly depend on the target's
    # output.
    postbuilds = []
    if self.flavor == 'mac':
      if target_postbuilds:
        postbuilds.append('$(TARGET_POSTBUILDS_$(BUILDTYPE))')
      postbuilds.extend(
          gyp.xcode_emulation.GetSpecPostbuildCommands(spec))

    if postbuilds:
      # Envvars may be referenced by TARGET_POSTBUILDS_$(BUILDTYPE),
      # so we must output its definition first, since we declare variables
      # using ":=".
      self.WriteSortedXcodeEnv(self.output, self.GetSortedXcodePostbuildEnv())

      for configname in target_postbuilds:
        self.WriteLn('%s: TARGET_POSTBUILDS_%s := %s' %
            (QuoteSpaces(self.output),
             configname,
             gyp.common.EncodePOSIXShellList(target_postbuilds[configname])))

      # Postbuilds expect to be run in the gyp file's directory, so insert an
      # implicit postbuild to cd to there.
      postbuilds.insert(0, gyp.common.EncodePOSIXShellList(['cd', self.path]))
      for i in xrange(len(postbuilds)):
        if not postbuilds[i].startswith('$'):
          postbuilds[i] = EscapeShellArgument(postbuilds[i])
      self.WriteLn('%s: builddir := $(abs_builddir)' %
                   QuoteSpaces(self.output))
      self.WriteLn('%s: POSTBUILDS := %s' % (
          QuoteSpaces(self.output), ' '.join(postbuilds)))

    # A bundle directory depends on its dependencies such as bundle resources
    # and bundle binary.  When all dependencies have been built, the bundle
    # needs to be packaged.
    if self.is_mac_bundle:
      # If the framework doesn't contain a binary, then nothing depends
      # on the actions -- make the framework depend on them directly too.
      self.WriteDependencyOnExtraOutputs(self.output, extra_outputs)

      # Bundle dependencies. Note that the code below adds actions to this
      # target, so if you move these two lines, move the lines below as well.
      self.WriteList(map(QuoteSpaces, bundle_deps), 'BUNDLE_DEPS')
      self.WriteLn('%s: $(BUNDLE_DEPS)' % QuoteSpaces(self.output))

      # After the framework is built, package it. Needs to happen before
      # postbuilds, since postbuilds depend on this.
      if self.type in ('shared_library', 'loadable_module'):
        self.WriteLn('\t@$(call do_cmd,mac_package_framework,,,%s)' %
            self.xcode_settings.GetFrameworkVersion())

      # Bundle postbuilds can depend on the whole bundle, so run them after
      # the bundle is packaged, not already after the bundle binary is done.
      if postbuilds:
        self.WriteLn('\t@$(call do_postbuilds)')
        postbuilds = []  # Don't write postbuilds for target's output.

      # Needed by test/mac/gyptest-rebuild.py.
      self.WriteLn('\t@true  # No-op, used by tests')

      # Since this target depends on binary and resources which are in
      # nested subfolders, the framework directory will be older than
      # its dependencies usually.
      # To prevent this rule from executing on every build (expensive,
      # especially with postbuilds), explicitly update the time on the
      # framework directory.
      self.WriteLn('\t@touch -c %s' % QuoteSpaces(self.output))

    if postbuilds:
      assert not self.is_mac_bundle, ('Postbuilds for bundles should be done '
          'on the bundle, not the binary (target \'%s\')' % self.target)
      assert 'product_dir' not in spec, ('Postbuilds do not work with '
          'custom product_dir')

    if self.type == 'executable':
      self.WriteLn('%s: LD_INPUTS := %s' % (
          QuoteSpaces(self.output_binary),
          ' '.join(map(QuoteSpaces, link_deps))))
      if self.toolset == 'host' and self.flavor == 'android':
        self.WriteDoCmd([self.output_binary], link_deps, 'link_host',
                        part_of_all, postbuilds=postbuilds)
      else:
        self.WriteDoCmd([self.output_binary], link_deps, 'link', part_of_all,
                        postbuilds=postbuilds)

    elif self.type == 'static_library':
      for link_dep in link_deps:
        assert ' ' not in link_dep, (
            "Spaces in alink input filenames not supported (%s)" % link_dep)
      if (self.flavor not in ('mac', 'openbsd', 'netbsd', 'win') and not
          self.is_standalone_static_library):
        self.WriteDoCmd([self.output_binary], link_deps, 'alink_thin',
                        part_of_all, postbuilds=postbuilds)
      else:
        self.WriteDoCmd([self.output_binary], link_deps, 'alink', part_of_all,
                        postbuilds=postbuilds)
    elif self.type == 'shared_library':
      self.WriteLn('%s: LD_INPUTS := %s' % (
          QuoteSpaces(self.output_binary),
          ' '.join(map(QuoteSpaces, link_deps))))
      self.WriteDoCmd([self.output_binary], link_deps, 'solink', part_of_all,
                      postbuilds=postbuilds)
    elif self.type == 'loadable_module':
      for link_dep in link_deps:
        assert ' ' not in link_dep, (
            "Spaces in module input filenames not supported (%s)" % link_dep)
      if self.toolset == 'host' and self.flavor == 'android':
        self.WriteDoCmd([self.output_binary], link_deps, 'solink_module_host',
                        part_of_all, postbuilds=postbuilds)
      else:
        self.WriteDoCmd(
            [self.output_binary], link_deps, 'solink_module', part_of_all,
            postbuilds=postbuilds)
    elif self.type == 'none':
      # Write a stamp line.
      self.WriteDoCmd([self.output_binary], deps, 'touch', part_of_all,
                      postbuilds=postbuilds)
    else:
      print "WARNING: no output for", self.type, self.target

    # Add an alias for each target (if there are any outputs).
    # Installable target aliases are created below.
    if ((self.output and self.output != self.target) and
        (self.type not in self._INSTALLABLE_TARGETS)):
      self.WriteMakeRule([self.target], [self.output],
                         comment='Add target alias', phony = True)
      if part_of_all:
        self.WriteMakeRule(['all'], [self.target],
                           comment = 'Add target alias to "all" target.',
                           phony = True)

    # Add special-case rules for our installable targets.
    # 1) They need to install to the build dir or "product" dir.
    # 2) They get shortcuts for building (e.g. "make chrome").
    # 3) They are part of "make all".
    if (self.type in self._INSTALLABLE_TARGETS or
        self.is_standalone_static_library):
      if self.type == 'shared_library':
        file_desc = 'shared library'
      elif self.type == 'static_library':
        file_desc = 'static library'
      else:
        file_desc = 'executable'
      install_path = self._InstallableTargetInstallPath()
      installable_deps = [self.output]
      if (self.flavor == 'mac' and not 'product_dir' in spec and
          self.toolset == 'target'):
        # On mac, products are created in install_path immediately.
        assert install_path == self.output, '%s != %s' % (
            install_path, self.output)

      # Point the target alias to the final binary output.
self.WriteMakeRule([self.target], [install_path], comment='Add target alias', phony = True) if install_path != self.output: assert not self.is_mac_bundle # See comment a few lines above. self.WriteDoCmd([install_path], [self.output], 'copy', comment = 'Copy this to the %s output path.' % file_desc, part_of_all=part_of_all) installable_deps.append(install_path) if self.output != self.alias and self.alias != self.target: self.WriteMakeRule([self.alias], installable_deps, comment = 'Short alias for building this %s.' % file_desc, phony = True) if part_of_all: self.WriteMakeRule(['all'], [install_path], comment = 'Add %s to "all" target.' % file_desc, phony = True) def WriteList(self, value_list, variable=None, prefix='', quoter=QuoteIfNecessary): """Write a variable definition that is a list of values. E.g. WriteList(['a','b'], 'foo', prefix='blah') writes out foo = blaha blahb but in a pretty-printed style. """ values = '' if value_list: value_list = [quoter(prefix + l) for l in value_list] values = ' \\\n\t' + ' \\\n\t'.join(value_list) self.fp.write('%s :=%s\n\n' % (variable, values)) def WriteDoCmd(self, outputs, inputs, command, part_of_all, comment=None, postbuilds=False): """Write a Makefile rule that uses do_cmd. This makes the outputs dependent on the command line that was run, as well as support the V= make command line flag. """ suffix = '' if postbuilds: assert ',' not in command suffix = ',,1' # Tell do_cmd to honor $POSTBUILDS self.WriteMakeRule(outputs, inputs, actions = ['$(call do_cmd,%s%s)' % (command, suffix)], comment = comment, command = command, force = True) # Add our outputs to the list of targets we read depfiles from. # all_deps is only used for deps file reading, and for deps files we replace # spaces with ? because escaping doesn't work with make's $(sort) and # other functions. outputs = [QuoteSpaces(o, SPACE_REPLACEMENT) for o in outputs] self.WriteLn('all_deps += %s' % ' '.join(outputs)) def WriteMakeRule(self, outputs, inputs, actions=None, comment=None, order_only=False, force=False, phony=False, command=None): """Write a Makefile rule, with some extra tricks. outputs: a list of outputs for the rule (note: this is not directly supported by make; see comments below) inputs: a list of inputs for the rule actions: a list of shell commands to run for the rule comment: a comment to put in the Makefile above the rule (also useful for making this Python script's code self-documenting) order_only: if true, makes the dependency order-only force: if true, include FORCE_DO_CMD as an order-only dep phony: if true, the rule does not actually generate the named output, the output is just a name to run the rule command: (optional) command name to generate unambiguous labels """ outputs = map(QuoteSpaces, outputs) inputs = map(QuoteSpaces, inputs) if comment: self.WriteLn('# ' + comment) if phony: self.WriteLn('.PHONY: ' + ' '.join(outputs)) if actions: self.WriteLn("%s: TOOLSET := $(TOOLSET)" % outputs[0]) force_append = ' FORCE_DO_CMD' if force else '' if order_only: # Order only rule: Just write a simple rule. # TODO(evanm): just make order_only a list of deps instead of this hack. self.WriteLn('%s: | %s%s' % (' '.join(outputs), ' '.join(inputs), force_append)) elif len(outputs) == 1: # Regular rule, one output: Just write a simple rule. self.WriteLn('%s: %s%s' % (outputs[0], ' '.join(inputs), force_append)) else: # Regular rule, more than one output: Multiple outputs are tricky in # make. We will write three rules: # - All outputs depend on an intermediate file. 
# - Make .INTERMEDIATE depend on the intermediate. # - The intermediate file depends on the inputs and executes the # actual command. # - The intermediate recipe will 'touch' the intermediate file. # - The multi-output rule will have a do-nothing recipe. intermediate = "%s.intermediate" % (command if command else self.target) self.WriteLn('%s: %s' % (' '.join(outputs), intermediate)) self.WriteLn('\t%s' % '@:') self.WriteLn('%s: %s' % ('.INTERMEDIATE', intermediate)) self.WriteLn('%s: %s%s' % (intermediate, ' '.join(inputs), force_append)) actions.insert(0, '$(call do_cmd,touch)') if actions: for action in actions: self.WriteLn('\t%s' % action) self.WriteLn() def WriteAndroidNdkModuleRule(self, module_name, all_sources, link_deps): """Write a set of LOCAL_XXX definitions for Android NDK. These variable definitions will be used by Android NDK but do nothing for non-Android applications. Arguments: module_name: Android NDK module name, which must be unique among all module names. all_sources: A list of source files (will be filtered by Compilable). link_deps: A list of link dependencies, which must be sorted in the order from dependencies to dependents. """ if self.type not in ('executable', 'shared_library', 'static_library'): return self.WriteLn('# Variable definitions for Android applications') self.WriteLn('include $(CLEAR_VARS)') self.WriteLn('LOCAL_MODULE := ' + module_name) self.WriteLn('LOCAL_CFLAGS := $(CFLAGS_$(BUILDTYPE)) ' '$(DEFS_$(BUILDTYPE)) ' # LOCAL_CFLAGS is applied to both C and C++. There is # no way to specify $(CFLAGS_C_$(BUILDTYPE)) only for C # sources. '$(CFLAGS_C_$(BUILDTYPE)) ' # $(INCS_$(BUILDTYPE)) includes the prefix '-I' while # LOCAL_C_INCLUDES does not expect it. So put it in # LOCAL_CFLAGS. '$(INCS_$(BUILDTYPE))') # LOCAL_CXXFLAGS is obsolete and LOCAL_CPPFLAGS is preferred. self.WriteLn('LOCAL_CPPFLAGS := $(CFLAGS_CC_$(BUILDTYPE))') self.WriteLn('LOCAL_C_INCLUDES :=') self.WriteLn('LOCAL_LDLIBS := $(LDFLAGS_$(BUILDTYPE)) $(LIBS)') # Detect the C++ extension. cpp_ext = {'.cc': 0, '.cpp': 0, '.cxx': 0} default_cpp_ext = '.cpp' for filename in all_sources: ext = os.path.splitext(filename)[1] if ext in cpp_ext: cpp_ext[ext] += 1 if cpp_ext[ext] > cpp_ext[default_cpp_ext]: default_cpp_ext = ext self.WriteLn('LOCAL_CPP_EXTENSION := ' + default_cpp_ext) self.WriteList(map(self.Absolutify, filter(Compilable, all_sources)), 'LOCAL_SRC_FILES') # Filter out those which do not match prefix and suffix and produce # the resulting list without prefix and suffix.
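# (Editorial sketch, not from the upstream source: assuming the usual Linux
# defaults SHARED_LIB_PREFIX='lib' and SHARED_LIB_SUFFIX='.so', the helper
# below behaves like
#   DepsToModules(['out/lib.target/libfoo.so', 'out/obj/bar.a'], 'lib', '.so')
#   --> ['foo']
# i.e. only basenames carrying both the prefix and the suffix survive,
# stripped down to the bare module name that LOCAL_SHARED_LIBRARIES expects.)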
def DepsToModules(deps, prefix, suffix): modules = [] for filepath in deps: filename = os.path.basename(filepath) if filename.startswith(prefix) and filename.endswith(suffix): modules.append(filename[len(prefix):-len(suffix)]) return modules # Retrieve the default value of 'SHARED_LIB_SUFFIX' params = {'flavor': 'linux'} default_variables = {} CalculateVariables(default_variables, params) self.WriteList( DepsToModules(link_deps, generator_default_variables['SHARED_LIB_PREFIX'], default_variables['SHARED_LIB_SUFFIX']), 'LOCAL_SHARED_LIBRARIES') self.WriteList( DepsToModules(link_deps, generator_default_variables['STATIC_LIB_PREFIX'], generator_default_variables['STATIC_LIB_SUFFIX']), 'LOCAL_STATIC_LIBRARIES') if self.type == 'executable': self.WriteLn('include $(BUILD_EXECUTABLE)') elif self.type == 'shared_library': self.WriteLn('include $(BUILD_SHARED_LIBRARY)') elif self.type == 'static_library': self.WriteLn('include $(BUILD_STATIC_LIBRARY)') self.WriteLn() def WriteLn(self, text=''): self.fp.write(text + '\n') def GetSortedXcodeEnv(self, additional_settings=None): return gyp.xcode_emulation.GetSortedXcodeEnv( self.xcode_settings, "$(abs_builddir)", os.path.join("$(abs_srcdir)", self.path), "$(BUILDTYPE)", additional_settings) def GetSortedXcodePostbuildEnv(self): # CHROMIUM_STRIP_SAVE_FILE is a chromium-specific hack. # TODO(thakis): It would be nice to have some general mechanism instead. strip_save_file = self.xcode_settings.GetPerTargetSetting( 'CHROMIUM_STRIP_SAVE_FILE', '') # Even if strip_save_file is empty, explicitly write it. Else a postbuild # might pick up an export from an earlier target. return self.GetSortedXcodeEnv( additional_settings={'CHROMIUM_STRIP_SAVE_FILE': strip_save_file}) def WriteSortedXcodeEnv(self, target, env): for k, v in env: # For # foo := a\ b # the escaped space does the right thing. For # export foo := a\ b # it does not -- the backslash is written to the env as literal character. # So don't escape spaces in |env[k]|. self.WriteLn('%s: export %s := %s' % (QuoteSpaces(target), k, v)) def Objectify(self, path): """Convert a path to its output directory form.""" if '$(' in path: path = path.replace('$(obj)/', '$(obj).%s/$(TARGET)/' % self.toolset) if not '$(obj)' in path: path = '$(obj).%s/$(TARGET)/%s' % (self.toolset, path) return path def Pchify(self, path, lang): """Convert a prefix header path to its output directory form.""" path = self.Absolutify(path) if '$(' in path: path = path.replace('$(obj)/', '$(obj).%s/$(TARGET)/pch-%s' % (self.toolset, lang)) return path return '$(obj).%s/$(TARGET)/pch-%s/%s' % (self.toolset, lang, path) def Absolutify(self, path): """Convert a subdirectory-relative path into a base-relative path. Skips over paths that contain variables.""" if '$(' in path: # Don't call normpath in this case, as it might collapse the # path too aggressively if it features '..'. However it's still # important to strip trailing slashes. return path.rstrip('/') return os.path.normpath(os.path.join(self.path, path)) def ExpandInputRoot(self, template, expansion, dirname): if '%(INPUT_ROOT)s' not in template and '%(INPUT_DIRNAME)s' not in template: return template path = template % { 'INPUT_ROOT': expansion, 'INPUT_DIRNAME': dirname, } return path def _InstallableTargetInstallPath(self): """Returns the location of the final output for an installable target.""" # Xcode puts shared_library results into PRODUCT_DIR, and some gyp files # rely on this. Emulate this behavior for mac. 
# XXX(TooTallNate): disabling this code since we don't want this behavior... #if (self.type == 'shared_library' and # (self.flavor != 'mac' or self.toolset != 'target')): # # Install all shared libs into a common directory (per toolset) for # # convenient access with LD_LIBRARY_PATH. # return '$(builddir)/lib.%s/%s' % (self.toolset, self.alias) return '$(builddir)/' + self.alias def WriteAutoRegenerationRule(params, root_makefile, makefile_name, build_files): """Write the target to regenerate the Makefile.""" options = params['options'] build_files_args = [gyp.common.RelativePath(filename, options.toplevel_dir) for filename in params['build_files_arg']] gyp_binary = gyp.common.FixIfRelativePath(params['gyp_binary'], options.toplevel_dir) if not gyp_binary.startswith(os.sep): gyp_binary = os.path.join('.', gyp_binary) root_makefile.write( "quiet_cmd_regen_makefile = ACTION Regenerating $@\n" "cmd_regen_makefile = cd $(srcdir); %(cmd)s\n" "%(makefile_name)s: %(deps)s\n" "\t$(call do_cmd,regen_makefile)\n\n" % { 'makefile_name': makefile_name, 'deps': ' '.join(map(Sourceify, build_files)), 'cmd': gyp.common.EncodePOSIXShellList( [gyp_binary, '-fmake'] + gyp.RegenerateFlags(options) + build_files_args)}) def PerformBuild(data, configurations, params): options = params['options'] for config in configurations: arguments = ['make'] if options.toplevel_dir and options.toplevel_dir != '.': arguments += '-C', options.toplevel_dir arguments.append('BUILDTYPE=' + config) print 'Building [%s]: %s' % (config, arguments) subprocess.check_call(arguments) def GenerateOutput(target_list, target_dicts, data, params): options = params['options'] flavor = gyp.common.GetFlavor(params) generator_flags = params.get('generator_flags', {}) builddir_name = generator_flags.get('output_dir', 'out') android_ndk_version = generator_flags.get('android_ndk_version', None) default_target = generator_flags.get('default_target', 'all') def CalculateMakefilePath(build_file, base_name): """Determine where to write a Makefile for a given gyp file.""" # Paths in gyp files are relative to the .gyp file, but we want # paths relative to the source root for the master makefile. Grab # the path of the .gyp file as the base to relativize against. # E.g. "foo/bar" when we're constructing targets for "foo/bar/baz.gyp". base_path = gyp.common.RelativePath(os.path.dirname(build_file), options.depth) # We write the file in the base_path directory. output_file = os.path.join(options.depth, base_path, base_name) if options.generator_output: output_file = os.path.join( options.depth, options.generator_output, base_path, base_name) base_path = gyp.common.RelativePath(os.path.dirname(build_file), options.toplevel_dir) return base_path, output_file # TODO: search for the first non-'Default' target. This can go # away when we add verification that all targets have the # necessary configurations. default_configuration = None toolsets = set([target_dicts[target]['toolset'] for target in target_list]) for target in target_list: spec = target_dicts[target] if spec['default_configuration'] != 'Default': default_configuration = spec['default_configuration'] break if not default_configuration: default_configuration = 'Default' srcdir = '.' 
makefile_name = 'Makefile' + options.suffix makefile_path = os.path.join(options.toplevel_dir, makefile_name) if options.generator_output: global srcdir_prefix makefile_path = os.path.join( options.toplevel_dir, options.generator_output, makefile_name) srcdir = gyp.common.RelativePath(srcdir, options.generator_output) srcdir_prefix = '$(srcdir)/' flock_command= 'flock' copy_archive_arguments = '-af' header_params = { 'default_target': default_target, 'builddir': builddir_name, 'default_configuration': default_configuration, 'flock': flock_command, 'flock_index': 1, 'link_commands': LINK_COMMANDS_LINUX, 'extra_commands': '', 'srcdir': srcdir, 'copy_archive_args': copy_archive_arguments, } if flavor == 'mac': flock_command = './gyp-mac-tool flock' header_params.update({ 'flock': flock_command, 'flock_index': 2, 'link_commands': LINK_COMMANDS_MAC, 'extra_commands': SHARED_HEADER_MAC_COMMANDS, }) elif flavor == 'android': header_params.update({ 'link_commands': LINK_COMMANDS_ANDROID, }) elif flavor == 'solaris': header_params.update({ 'flock': './gyp-flock-tool flock', 'flock_index': 2, }) elif flavor == 'freebsd': # Note: OpenBSD has sysutils/flock. lockf seems to be FreeBSD specific. header_params.update({ 'flock': 'lockf', }) elif flavor == 'openbsd': copy_archive_arguments = '-pPRf' header_params.update({ 'copy_archive_args': copy_archive_arguments, }) elif flavor == 'aix': copy_archive_arguments = '-pPRf' header_params.update({ 'copy_archive_args': copy_archive_arguments, 'link_commands': LINK_COMMANDS_AIX, 'flock': './gyp-flock-tool flock', 'flock_index': 2, }) header_params.update({ 'CC.target': GetEnvironFallback(('CC_target', 'CC'), '$(CC)'), 'AR.target': GetEnvironFallback(('AR_target', 'AR'), '$(AR)'), 'CXX.target': GetEnvironFallback(('CXX_target', 'CXX'), '$(CXX)'), 'LINK.target': GetEnvironFallback(('LINK_target', 'LINK'), '$(LINK)'), 'CC.host': GetEnvironFallback(('CC_host',), 'gcc'), 'AR.host': GetEnvironFallback(('AR_host',), 'ar'), 'CXX.host': GetEnvironFallback(('CXX_host',), 'g++'), 'LINK.host': GetEnvironFallback(('LINK_host',), '$(CXX.host)'), }) build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0]) make_global_settings_array = data[build_file].get('make_global_settings', []) wrappers = {} for key, value in make_global_settings_array: if key.endswith('_wrapper'): wrappers[key[:-len('_wrapper')]] = '$(abspath %s)' % value make_global_settings = '' for key, value in make_global_settings_array: if re.match('.*_wrapper', key): continue if value[0] != '$': value = '$(abspath %s)' % value wrapper = wrappers.get(key) if wrapper: value = '%s %s' % (wrapper, value) del wrappers[key] if key in ('CC', 'CC.host', 'CXX', 'CXX.host'): make_global_settings += ( 'ifneq (,$(filter $(origin %s), undefined default))\n' % key) # Let gyp-time envvars win over global settings. env_key = key.replace('.', '_') # CC.host -> CC_host if env_key in os.environ: value = os.environ[env_key] make_global_settings += ' %s = %s\n' % (key, value) make_global_settings += 'endif\n' else: make_global_settings += '%s ?= %s\n' % (key, value) # TODO(ukai): define cmd when only wrapper is specified in # make_global_settings. header_params['make_global_settings'] = make_global_settings gyp.common.EnsureDirExists(makefile_path) root_makefile = open(makefile_path, 'w') root_makefile.write(SHARED_HEADER % header_params) # Currently any versions have the same effect, but in future the behavior # could be different. 
if android_ndk_version: root_makefile.write( '# Define LOCAL_PATH for build of Android applications.\n' 'LOCAL_PATH := $(call my-dir)\n' '\n') for toolset in toolsets: root_makefile.write('TOOLSET := %s\n' % toolset) WriteRootHeaderSuffixRules(root_makefile) # Put build-time support tools next to the root Makefile. dest_path = os.path.dirname(makefile_path) gyp.common.CopyTool(flavor, dest_path) # Find the list of targets that derive from the gyp file(s) being built. needed_targets = set() for build_file in params['build_files']: for target in gyp.common.AllTargets(target_list, target_dicts, build_file): needed_targets.add(target) build_files = set() include_list = set() for qualified_target in target_list: build_file, target, toolset = gyp.common.ParseQualifiedTarget( qualified_target) this_make_global_settings = data[build_file].get('make_global_settings', []) assert make_global_settings_array == this_make_global_settings, ( "make_global_settings needs to be the same for all targets. %s vs. %s" % (this_make_global_settings, make_global_settings)) build_files.add(gyp.common.RelativePath(build_file, options.toplevel_dir)) included_files = data[build_file]['included_files'] for included_file in included_files: # The included_files entries are relative to the dir of the build file # that included them, so we have to undo that and then make them relative # to the root dir. relative_include_file = gyp.common.RelativePath( gyp.common.UnrelativePath(included_file, build_file), options.toplevel_dir) abs_include_file = os.path.abspath(relative_include_file) # If the include file is from the ~/.gyp dir, we should use absolute path # so that relocating the src dir doesn't break the path. if (params['home_dot_gyp'] and abs_include_file.startswith(params['home_dot_gyp'])): build_files.add(abs_include_file) else: build_files.add(relative_include_file) base_path, output_file = CalculateMakefilePath(build_file, target + '.' + toolset + options.suffix + '.mk') spec = target_dicts[qualified_target] configs = spec['configurations'] if flavor == 'mac': gyp.xcode_emulation.MergeGlobalXcodeSettingsToSpec(data[build_file], spec) writer = MakefileWriter(generator_flags, flavor) writer.Write(qualified_target, base_path, output_file, spec, configs, part_of_all=qualified_target in needed_targets) # Our root_makefile lives at the source root. Compute the relative path # from there to the output_file for including. mkfile_rel_path = gyp.common.RelativePath(output_file, os.path.dirname(makefile_path)) include_list.add(mkfile_rel_path) # Write out per-gyp (sub-project) Makefiles. depth_rel_path = gyp.common.RelativePath(options.depth, os.getcwd()) for build_file in build_files: # The paths in build_files were relativized above, so undo that before # testing against the non-relativized items in target_list and before # calculating the Makefile path. build_file = os.path.join(depth_rel_path, build_file) gyp_targets = [target_dicts[target]['target_name'] for target in target_list if target.startswith(build_file) and target in needed_targets] # Only generate Makefiles for gyp files with targets. if not gyp_targets: continue base_path, output_file = CalculateMakefilePath(build_file, os.path.splitext(os.path.basename(build_file))[0] + '.Makefile') makefile_rel_path = gyp.common.RelativePath(os.path.dirname(makefile_path), os.path.dirname(output_file)) writer.WriteSubMake(output_file, makefile_rel_path, gyp_targets, builddir_name) # Write out the sorted list of includes. 
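# (Editorial note on the guard emitted below: running e.g. `make NO_LOAD=foo`
# skips every included .mk path that begins with "foo"; joining a literal '^'
# onto both the prefix and the include path is what anchors $(findstring) to
# the start of the filename.)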
root_makefile.write('\n') for include_file in sorted(include_list): # We wrap each .mk include in an if statement so users can tell make to # not load a file by setting NO_LOAD. The below make code says, only # load the .mk file if the .mk filename doesn't start with a token in # NO_LOAD. root_makefile.write( "ifeq ($(strip $(foreach prefix,$(NO_LOAD),\\\n" " $(findstring $(join ^,$(prefix)),\\\n" " $(join ^," + include_file + ")))),)\n") root_makefile.write(" include " + include_file + "\n") root_makefile.write("endif\n") root_makefile.write('\n') if (not generator_flags.get('standalone') and generator_flags.get('auto_regeneration', True)): WriteAutoRegenerationRule(params, root_makefile, makefile_name, build_files) root_makefile.write(SHARED_FOOTER) root_makefile.close() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs.py0000644000000000000000000037773612631326456023765 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import copy import ntpath import os import posixpath import re import subprocess import sys import gyp.common import gyp.easy_xml as easy_xml import gyp.generator.ninja as ninja_generator import gyp.MSVSNew as MSVSNew import gyp.MSVSProject as MSVSProject import gyp.MSVSSettings as MSVSSettings import gyp.MSVSToolFile as MSVSToolFile import gyp.MSVSUserFile as MSVSUserFile import gyp.MSVSUtil as MSVSUtil import gyp.MSVSVersion as MSVSVersion from gyp.common import GypError from gyp.common import OrderedSet # TODO: Remove once bots are on 2.7, http://crbug.com/241769 def _import_OrderedDict(): import collections try: return collections.OrderedDict except AttributeError: import gyp.ordered_dict return gyp.ordered_dict.OrderedDict OrderedDict = _import_OrderedDict() # Regular expression for validating Visual Studio GUIDs. If the GUID # contains lowercase hex letters, MSVS will be fine. However, # IncrediBuild BuildConsole will parse the solution file, but then # silently skip building the target causing hard to track down errors. # Note that this only happens with the BuildConsole, and does not occur # if IncrediBuild is executed from inside Visual Studio. This regex # validates that the string looks like a GUID with all uppercase hex # letters. 
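# For example (editorial, illustrative value only):
# 'E5027CE2-4B95-4F3F-8BF8-EA2D05C7C4D9' passes this check, while the same
# GUID written with lowercase hex digits would be rejected here even though
# Visual Studio itself tolerates it.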
VALID_MSVS_GUID_CHARS = re.compile(r'^[A-F0-9\-]+$') generator_default_variables = { 'EXECUTABLE_PREFIX': '', 'EXECUTABLE_SUFFIX': '.exe', 'STATIC_LIB_PREFIX': '', 'SHARED_LIB_PREFIX': '', 'STATIC_LIB_SUFFIX': '.lib', 'SHARED_LIB_SUFFIX': '.dll', 'INTERMEDIATE_DIR': '$(IntDir)', 'SHARED_INTERMEDIATE_DIR': '$(OutDir)obj/global_intermediate', 'OS': 'win', 'PRODUCT_DIR': '$(OutDir)', 'LIB_DIR': '$(OutDir)lib', 'RULE_INPUT_ROOT': '$(InputName)', 'RULE_INPUT_DIRNAME': '$(InputDir)', 'RULE_INPUT_EXT': '$(InputExt)', 'RULE_INPUT_NAME': '$(InputFileName)', 'RULE_INPUT_PATH': '$(InputPath)', 'CONFIGURATION_NAME': '$(ConfigurationName)', } # The msvs specific sections that hold paths. generator_additional_path_sections = [ 'msvs_cygwin_dirs', 'msvs_props', ] generator_additional_non_configuration_keys = [ 'msvs_cygwin_dirs', 'msvs_cygwin_shell', 'msvs_large_pdb', 'msvs_shard', 'msvs_external_builder', 'msvs_external_builder_out_dir', 'msvs_external_builder_build_cmd', 'msvs_external_builder_clean_cmd', 'msvs_external_builder_clcompile_cmd', 'msvs_enable_winrt', 'msvs_requires_importlibrary', 'msvs_enable_winphone', 'msvs_application_type_revision', 'msvs_target_platform_version', 'msvs_target_platform_minversion', ] # List of precompiled header related keys. precomp_keys = [ 'msvs_precompiled_header', 'msvs_precompiled_source', ] cached_username = None cached_domain = None # TODO(gspencer): Switch the os.environ calls to be # win32api.GetDomainName() and win32api.GetUserName() once the # python version in depot_tools has been updated to work on Vista # 64-bit. def _GetDomainAndUserName(): if sys.platform not in ('win32', 'cygwin'): return ('DOMAIN', 'USERNAME') global cached_username global cached_domain if not cached_domain or not cached_username: domain = os.environ.get('USERDOMAIN') username = os.environ.get('USERNAME') if not domain or not username: call = subprocess.Popen(['net', 'config', 'Workstation'], stdout=subprocess.PIPE) config = call.communicate()[0] username_re = re.compile(r'^User name\s+(\S+)', re.MULTILINE) username_match = username_re.search(config) if username_match: username = username_match.group(1) domain_re = re.compile(r'^Logon domain\s+(\S+)', re.MULTILINE) domain_match = domain_re.search(config) if domain_match: domain = domain_match.group(1) cached_domain = domain cached_username = username return (cached_domain, cached_username) fixpath_prefix = None def _NormalizedSource(source): """Normalize the path. But not if that gets rid of a variable, as this may expand to something larger than one directory. Arguments: source: The path to be normalized. Returns: The normalized path. """ normalized = os.path.normpath(source) if source.count('$') == normalized.count('$'): source = normalized return source def _FixPath(path): """Convert paths to a form that will make sense in a vcproj file. Arguments: path: The path to convert, may contain / etc. Returns: The path with all slashes made into backslashes. """ if fixpath_prefix and path and not os.path.isabs(path) and not path[0] == '$': path = os.path.join(fixpath_prefix, path) path = path.replace('/', '\\') path = _NormalizedSource(path) if path and path[-1] == '\\': path = path[:-1] return path def _FixPaths(paths): """Fix each of the paths of the list.""" return [_FixPath(i) for i in paths] def _ConvertSourcesToFilterHierarchy(sources, prefix=None, excluded=None, list_excluded=True, msvs_version=None): """Converts a list of split source file paths into a vcproj folder hierarchy. Arguments: sources: A list of source file paths split.
prefix: A list of source file path layers meant to apply to each of sources. excluded: A set of excluded files. msvs_version: A MSVSVersion object. Returns: A hierarchy of filenames and MSVSProject.Filter objects that matches the layout of the source tree. For example: _ConvertSourcesToFilterHierarchy([['a', 'bob1.c'], ['b', 'bob2.c']], prefix=['joe']) --> [MSVSProject.Filter('a', contents=['joe\\a\\bob1.c']), MSVSProject.Filter('b', contents=['joe\\b\\bob2.c'])] """ if not prefix: prefix = [] result = [] excluded_result = [] folders = OrderedDict() # Gather files into the final result, excluded, or folders. for s in sources: if len(s) == 1: filename = _NormalizedSource('\\'.join(prefix + s)) if filename in excluded: excluded_result.append(filename) else: result.append(filename) elif msvs_version and not msvs_version.UsesVcxproj(): # For MSVS 2008 and earlier, we need to process all files before walking # the sub folders. if not folders.get(s[0]): folders[s[0]] = [] folders[s[0]].append(s[1:]) else: contents = _ConvertSourcesToFilterHierarchy([s[1:]], prefix + [s[0]], excluded=excluded, list_excluded=list_excluded, msvs_version=msvs_version) contents = MSVSProject.Filter(s[0], contents=contents) result.append(contents) # Add a folder for excluded files. if excluded_result and list_excluded: excluded_folder = MSVSProject.Filter('_excluded_files', contents=excluded_result) result.append(excluded_folder) if msvs_version and msvs_version.UsesVcxproj(): return result # Populate all the folders. for f in folders: contents = _ConvertSourcesToFilterHierarchy(folders[f], prefix=prefix + [f], excluded=excluded, list_excluded=list_excluded, msvs_version=msvs_version) contents = MSVSProject.Filter(f, contents=contents) result.append(contents) return result def _ToolAppend(tools, tool_name, setting, value, only_if_unset=False): if not value: return _ToolSetOrAppend(tools, tool_name, setting, value, only_if_unset) def _ToolSetOrAppend(tools, tool_name, setting, value, only_if_unset=False): # TODO(bradnelson): ugly hack, fix this more generally!!! if 'Directories' in setting or 'Dependencies' in setting: if type(value) == str: value = value.replace('/', '\\') else: value = [i.replace('/', '\\') for i in value] if not tools.get(tool_name): tools[tool_name] = dict() tool = tools[tool_name] if tool.get(setting): if only_if_unset: return if type(tool[setting]) == list and type(value) == list: tool[setting] += value else: raise TypeError( 'Appending "%s" to a non-list setting "%s" for tool "%s" is ' 'not allowed, previous value: %s' % ( value, setting, tool_name, str(tool[setting]))) else: tool[setting] = value def _ConfigPlatform(config_data): return config_data.get('msvs_configuration_platform', 'Win32') def _ConfigBaseName(config_name, platform_name): if config_name.endswith('_' + platform_name): return config_name[0:-len(platform_name) - 1] else: return config_name def _ConfigFullName(config_name, config_data): platform_name = _ConfigPlatform(config_data) return '%s|%s' % (_ConfigBaseName(config_name, platform_name), platform_name) def _BuildCommandLineForRuleRaw(spec, cmd, cygwin_shell, has_input_path, quote_cmd, do_setup_env): if [x for x in cmd if '$(InputDir)' in x]: input_dir_preamble = ( 'set INPUTDIR=$(InputDir)\n' 'if NOT DEFINED INPUTDIR set INPUTDIR=.\\\n' 'set INPUTDIR=%INPUTDIR:~0,-1%\n' ) else: input_dir_preamble = '' if cygwin_shell: # Find path to cygwin. cygwin_dir = _FixPath(spec.get('msvs_cygwin_dirs', ['.'])[0]) # Prepare command. 
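# (Editorial sketch of the substitutions below: a macro reference such as
# '$(OutDir)foo.h' becomes '`cygpath -m "${OUTDIR}"`foo.h', so the Windows
# path is converted by cygpath when the command actually runs rather than
# when the project is generated.)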
direct_cmd = cmd direct_cmd = [i.replace('$(IntDir)', '`cygpath -m "${INTDIR}"`') for i in direct_cmd] direct_cmd = [i.replace('$(OutDir)', '`cygpath -m "${OUTDIR}"`') for i in direct_cmd] direct_cmd = [i.replace('$(InputDir)', '`cygpath -m "${INPUTDIR}"`') for i in direct_cmd] if has_input_path: direct_cmd = [i.replace('$(InputPath)', '`cygpath -m "${INPUTPATH}"`') for i in direct_cmd] direct_cmd = ['\\"%s\\"' % i.replace('"', '\\\\\\"') for i in direct_cmd] # direct_cmd = gyp.common.EncodePOSIXShellList(direct_cmd) direct_cmd = ' '.join(direct_cmd) # TODO(quote): regularize quoting path names throughout the module cmd = '' if do_setup_env: cmd += 'call "$(ProjectDir)%(cygwin_dir)s\\setup_env.bat" && ' cmd += 'set CYGWIN=nontsec&& ' if direct_cmd.find('NUMBER_OF_PROCESSORS') >= 0: cmd += 'set /a NUMBER_OF_PROCESSORS_PLUS_1=%%NUMBER_OF_PROCESSORS%%+1&& ' if direct_cmd.find('INTDIR') >= 0: cmd += 'set INTDIR=$(IntDir)&& ' if direct_cmd.find('OUTDIR') >= 0: cmd += 'set OUTDIR=$(OutDir)&& ' if has_input_path and direct_cmd.find('INPUTPATH') >= 0: cmd += 'set INPUTPATH=$(InputPath) && ' cmd += 'bash -c "%(cmd)s"' cmd = cmd % {'cygwin_dir': cygwin_dir, 'cmd': direct_cmd} return input_dir_preamble + cmd else: # Convert cat --> type to mimic unix. if cmd[0] == 'cat': command = ['type'] else: command = [cmd[0].replace('/', '\\')] # Add call before command to ensure that commands can be tied together one # after the other without aborting in Incredibuild, since IB makes a bat # file out of the raw command string, and some commands (like python) are # actually batch files themselves. command.insert(0, 'call') # Fix the paths # TODO(quote): This is a really ugly heuristic, and will miss path fixing # for arguments like "--arg=path" or "/opt:path". # If the argument starts with a slash or dash, it's probably a command line # switch arguments = [i if (i[:1] in "/-") else _FixPath(i) for i in cmd[1:]] arguments = [i.replace('$(InputDir)', '%INPUTDIR%') for i in arguments] arguments = [MSVSSettings.FixVCMacroSlashes(i) for i in arguments] if quote_cmd: # Support a mode for using cmd directly. # Convert any paths to native form (first element is used directly). # TODO(quote): regularize quoting path names throughout the module arguments = ['"%s"' % i for i in arguments] # Collapse into a single command. return input_dir_preamble + ' '.join(command + arguments) def _BuildCommandLineForRule(spec, rule, has_input_path, do_setup_env): # Currently this weird argument munging is used to duplicate the way a # python script would need to be run as part of the chrome tree. # Eventually we should add some sort of rule_default option to set this # per project. For now the behavior chrome needs is the default. mcs = rule.get('msvs_cygwin_shell') if mcs is None: mcs = int(spec.get('msvs_cygwin_shell', 1)) elif isinstance(mcs, str): mcs = int(mcs) quote_cmd = int(rule.get('msvs_quote_cmd', 1)) return _BuildCommandLineForRuleRaw(spec, rule['action'], mcs, has_input_path, quote_cmd, do_setup_env=do_setup_env) def _AddActionStep(actions_dict, inputs, outputs, description, command): """Merge action into an existing list of actions. Care must be taken so that actions which have overlapping inputs either don't get assigned to the same input, or get collapsed into one. Arguments: actions_dict: dictionary keyed on input name, which maps to a list of dicts describing the actions attached to that input file. 
inputs: list of inputs outputs: list of outputs description: description of the action command: command line to execute """ # Require there to be at least one input (call sites will ensure this). assert inputs action = { 'inputs': inputs, 'outputs': outputs, 'description': description, 'command': command, } # Pick where to stick this action. # While less than optimal in terms of build time, attach them to the first # input for now. chosen_input = inputs[0] # Add it there. if chosen_input not in actions_dict: actions_dict[chosen_input] = [] actions_dict[chosen_input].append(action) def _AddCustomBuildToolForMSVS(p, spec, primary_input, inputs, outputs, description, cmd): """Add a custom build tool to execute something. Arguments: p: the target project spec: the target project dict primary_input: input file to attach the build tool to inputs: list of inputs outputs: list of outputs description: description of the action cmd: command line to execute """ inputs = _FixPaths(inputs) outputs = _FixPaths(outputs) tool = MSVSProject.Tool( 'VCCustomBuildTool', {'Description': description, 'AdditionalDependencies': ';'.join(inputs), 'Outputs': ';'.join(outputs), 'CommandLine': cmd, }) # Add to the properties of primary input for each config. for config_name, c_data in spec['configurations'].iteritems(): p.AddFileConfig(_FixPath(primary_input), _ConfigFullName(config_name, c_data), tools=[tool]) def _AddAccumulatedActionsToMSVS(p, spec, actions_dict): """Add actions accumulated into an actions_dict, merging as needed. Arguments: p: the target project spec: the target project dict actions_dict: dictionary keyed on input name, which maps to a list of dicts describing the actions attached to that input file. """ for primary_input in actions_dict: inputs = OrderedSet() outputs = OrderedSet() descriptions = [] commands = [] for action in actions_dict[primary_input]: inputs.update(OrderedSet(action['inputs'])) outputs.update(OrderedSet(action['outputs'])) descriptions.append(action['description']) commands.append(action['command']) # Add the custom build step for one input file. description = ', and also '.join(descriptions) command = '\r\n'.join(commands) _AddCustomBuildToolForMSVS(p, spec, primary_input=primary_input, inputs=inputs, outputs=outputs, description=description, cmd=command) def _RuleExpandPath(path, input_file): """Given the input file to which a rule applied, string substitute a path. Arguments: path: a path to string expand input_file: the file to which the rule applied. Returns: The string substituted path. """ path = path.replace('$(InputName)', os.path.splitext(os.path.split(input_file)[1])[0]) path = path.replace('$(InputDir)', os.path.dirname(input_file)) path = path.replace('$(InputExt)', os.path.splitext(os.path.split(input_file)[1])[1]) path = path.replace('$(InputFileName)', os.path.split(input_file)[1]) path = path.replace('$(InputPath)', input_file) return path def _FindRuleTriggerFiles(rule, sources): """Find the list of files which a particular rule applies to. Arguments: rule: the rule in question sources: the set of all known source files for this project Returns: The list of sources that trigger a particular rule. """ return rule.get('rule_sources', []) def _RuleInputsAndOutputs(rule, trigger_file): """Find the inputs and outputs generated by a rule. Arguments: rule: the rule in question. trigger_file: the main trigger for this rule. Returns: The pair of (inputs, outputs) involved in this rule. 
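Example (editorial, assuming fixpath_prefix is unset): a rule with
inputs=['$(InputName).tpl'] and outputs=['$(InputName).h'] triggered by
'foo/bar.idl' returns roughly
(OrderedSet(['foo/bar.idl', 'bar.tpl']), OrderedSet(['bar.h'])).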
""" raw_inputs = _FixPaths(rule.get('inputs', [])) raw_outputs = _FixPaths(rule.get('outputs', [])) inputs = OrderedSet() outputs = OrderedSet() inputs.add(trigger_file) for i in raw_inputs: inputs.add(_RuleExpandPath(i, trigger_file)) for o in raw_outputs: outputs.add(_RuleExpandPath(o, trigger_file)) return (inputs, outputs) def _GenerateNativeRulesForMSVS(p, rules, output_dir, spec, options): """Generate a native rules file. Arguments: p: the target project rules: the set of rules to include output_dir: the directory in which the project/gyp resides spec: the project dict options: global generator options """ rules_filename = '%s%s.rules' % (spec['target_name'], options.suffix) rules_file = MSVSToolFile.Writer(os.path.join(output_dir, rules_filename), spec['target_name']) # Add each rule. for r in rules: rule_name = r['rule_name'] rule_ext = r['extension'] inputs = _FixPaths(r.get('inputs', [])) outputs = _FixPaths(r.get('outputs', [])) # Skip a rule with no action and no inputs. if 'action' not in r and not r.get('rule_sources', []): continue cmd = _BuildCommandLineForRule(spec, r, has_input_path=True, do_setup_env=True) rules_file.AddCustomBuildRule(name=rule_name, description=r.get('message', rule_name), extensions=[rule_ext], additional_dependencies=inputs, outputs=outputs, cmd=cmd) # Write out rules file. rules_file.WriteIfChanged() # Add rules file to project. p.AddToolFile(rules_filename) def _Cygwinify(path): path = path.replace('$(OutDir)', '$(OutDirCygwin)') path = path.replace('$(IntDir)', '$(IntDirCygwin)') return path def _GenerateExternalRules(rules, output_dir, spec, sources, options, actions_to_add): """Generate an external makefile to do a set of rules. Arguments: rules: the list of rules to include output_dir: path containing project and gyp files spec: project specification data sources: set of sources known options: global generator options actions_to_add: The list of actions we will add to. """ filename = '%s_rules%s.mk' % (spec['target_name'], options.suffix) mk_file = gyp.common.WriteOnDiff(os.path.join(output_dir, filename)) # Find cygwin style versions of some paths. mk_file.write('OutDirCygwin:=$(shell cygpath -u "$(OutDir)")\n') mk_file.write('IntDirCygwin:=$(shell cygpath -u "$(IntDir)")\n') # Gather stuff needed to emit all: target. all_inputs = OrderedSet() all_outputs = OrderedSet() all_output_dirs = OrderedSet() first_outputs = [] for rule in rules: trigger_files = _FindRuleTriggerFiles(rule, sources) for tf in trigger_files: inputs, outputs = _RuleInputsAndOutputs(rule, tf) all_inputs.update(OrderedSet(inputs)) all_outputs.update(OrderedSet(outputs)) # Only use one target from each rule as the dependency for # 'all' so we don't try to build each rule multiple times. first_outputs.append(list(outputs)[0]) # Get the unique output directories for this rule. output_dirs = [os.path.split(i)[0] for i in outputs] for od in output_dirs: all_output_dirs.add(od) first_outputs_cyg = [_Cygwinify(i) for i in first_outputs] # Write out all: target, including mkdir for each output directory. mk_file.write('all: %s\n' % ' '.join(first_outputs_cyg)) for od in all_output_dirs: if od: mk_file.write('\tmkdir -p `cygpath -u "%s"`\n' % od) mk_file.write('\n') # Define how each output is generated. for rule in rules: trigger_files = _FindRuleTriggerFiles(rule, sources) for tf in trigger_files: # Get all the inputs and outputs for this rule for this trigger file. 
inputs, outputs = _RuleInputsAndOutputs(rule, tf) inputs = [_Cygwinify(i) for i in inputs] outputs = [_Cygwinify(i) for i in outputs] # Prepare the command line for this rule. cmd = [_RuleExpandPath(c, tf) for c in rule['action']] cmd = ['"%s"' % i for i in cmd] cmd = ' '.join(cmd) # Add it to the makefile. mk_file.write('%s: %s\n' % (' '.join(outputs), ' '.join(inputs))) mk_file.write('\t%s\n\n' % cmd) # Close up the file. mk_file.close() # Add makefile to list of sources. sources.add(filename) # Add a build action to call makefile. cmd = ['make', 'OutDir=$(OutDir)', 'IntDir=$(IntDir)', '-j', '${NUMBER_OF_PROCESSORS_PLUS_1}', '-f', filename] cmd = _BuildCommandLineForRuleRaw(spec, cmd, True, False, True, True) # Insert makefile as 0'th input, so it gets the action attached there, # as this is easier to understand from in the IDE. all_inputs = list(all_inputs) all_inputs.insert(0, filename) _AddActionStep(actions_to_add, inputs=_FixPaths(all_inputs), outputs=_FixPaths(all_outputs), description='Running external rules for %s' % spec['target_name'], command=cmd) def _EscapeEnvironmentVariableExpansion(s): """Escapes % characters. Escapes any % characters so that Windows-style environment variable expansions will leave them alone. See http://connect.microsoft.com/VisualStudio/feedback/details/106127/cl-d-name-text-containing-percentage-characters-doesnt-compile to understand why we have to do this. Args: s: The string to be escaped. Returns: The escaped string. """ s = s.replace('%', '%%') return s quote_replacer_regex = re.compile(r'(\\*)"') def _EscapeCommandLineArgumentForMSVS(s): """Escapes a Windows command-line argument. So that the Win32 CommandLineToArgv function will turn the escaped result back into the original string. See http://msdn.microsoft.com/en-us/library/17w5ykft.aspx ("Parsing C++ Command-Line Arguments") to understand why we have to do this. Args: s: the string to be escaped. Returns: the escaped string. """ def _Replace(match): # For a literal quote, CommandLineToArgv requires an odd number of # backslashes preceding it, and it produces half as many literal backslashes # (rounded down). So we need to produce 2n+1 backslashes. return 2 * match.group(1) + '\\"' # Escape all quotes so that they are interpreted literally. s = quote_replacer_regex.sub(_Replace, s) # Now add unescaped quotes so that any whitespace is interpreted literally. s = '"' + s + '"' return s delimiters_replacer_regex = re.compile(r'(\\*)([,;]+)') def _EscapeVCProjCommandLineArgListItem(s): """Escapes command line arguments for MSVS. The VCProj format stores string lists in a single string using commas and semi-colons as separators, which must be quoted if they are to be interpreted literally. However, command-line arguments may already have quotes, and the VCProj parser is ignorant of the backslash escaping convention used by CommandLineToArgv, so the command-line quotes and the VCProj quotes may not be the same quotes. So to store a general command-line argument in a VCProj list, we need to parse the existing quoting according to VCProj's convention and quote any delimiters that are not already quoted by that convention. The quotes that we add will also be seen by CommandLineToArgv, so if backslashes precede them then we also have to escape those backslashes according to the CommandLineToArgv convention. Args: s: the string to be escaped. Returns: the escaped string. 
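Example (editorial): an argument like say "hi" comes back as
"say \"hi\"" -- each literal quote gains a preceding backslash (with any
existing backslashes doubled) and the whole argument is wrapped in
unescaped quotes.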
""" def _Replace(match): # For a non-literal quote, CommandLineToArgv requires an even number of # backslashes preceding it, and it produces half as many literal # backslashes. So we need to produce 2n backslashes. return 2 * match.group(1) + '"' + match.group(2) + '"' segments = s.split('"') # The unquoted segments are at the even-numbered indices. for i in range(0, len(segments), 2): segments[i] = delimiters_replacer_regex.sub(_Replace, segments[i]) # Concatenate back into a single string s = '"'.join(segments) if len(segments) % 2 == 0: # String ends while still quoted according to VCProj's convention. This # means the delimiter and the next list item that follow this one in the # .vcproj file will be misinterpreted as part of this item. There is nothing # we can do about this. Adding an extra quote would correct the problem in # the VCProj but cause the same problem on the final command-line. Moving # the item to the end of the list does works, but that's only possible if # there's only one such item. Let's just warn the user. print >> sys.stderr, ('Warning: MSVS may misinterpret the odd number of ' + 'quotes in ' + s) return s def _EscapeCppDefineForMSVS(s): """Escapes a CPP define so that it will reach the compiler unaltered.""" s = _EscapeEnvironmentVariableExpansion(s) s = _EscapeCommandLineArgumentForMSVS(s) s = _EscapeVCProjCommandLineArgListItem(s) # cl.exe replaces literal # characters with = in preprocesor definitions for # some reason. Octal-encode to work around that. s = s.replace('#', '\\%03o' % ord('#')) return s quote_replacer_regex2 = re.compile(r'(\\+)"') def _EscapeCommandLineArgumentForMSBuild(s): """Escapes a Windows command-line argument for use by MSBuild.""" def _Replace(match): return (len(match.group(1)) / 2 * 4) * '\\' + '\\"' # Escape all quotes so that they are interpreted literally. s = quote_replacer_regex2.sub(_Replace, s) return s def _EscapeMSBuildSpecialCharacters(s): escape_dictionary = { '%': '%25', '$': '%24', '@': '%40', "'": '%27', ';': '%3B', '?': '%3F', '*': '%2A' } result = ''.join([escape_dictionary.get(c, c) for c in s]) return result def _EscapeCppDefineForMSBuild(s): """Escapes a CPP define so that it will reach the compiler unaltered.""" s = _EscapeEnvironmentVariableExpansion(s) s = _EscapeCommandLineArgumentForMSBuild(s) s = _EscapeMSBuildSpecialCharacters(s) # cl.exe replaces literal # characters with = in preprocesor definitions for # some reason. Octal-encode to work around that. s = s.replace('#', '\\%03o' % ord('#')) return s def _GenerateRulesForMSVS(p, output_dir, options, spec, sources, excluded_sources, actions_to_add): """Generate all the rules for a particular project. Arguments: p: the project output_dir: directory to emit rules to options: global options passed to the generator spec: the specification for this project sources: the set of all known source files in this project excluded_sources: the set of sources excluded from normal processing actions_to_add: deferred list of actions to add in """ rules = spec.get('rules', []) rules_native = [r for r in rules if not int(r.get('msvs_external_rule', 0))] rules_external = [r for r in rules if int(r.get('msvs_external_rule', 0))] # Handle rules that use a native rules file. if rules_native: _GenerateNativeRulesForMSVS(p, rules_native, output_dir, spec, options) # Handle external rules (non-native rules). 
if rules_external: _GenerateExternalRules(rules_external, output_dir, spec, sources, options, actions_to_add) _AdjustSourcesForRules(rules, sources, excluded_sources, False) def _AdjustSourcesForRules(rules, sources, excluded_sources, is_msbuild): # Add outputs generated by each rule (if applicable). for rule in rules: # Add in the outputs from this rule. trigger_files = _FindRuleTriggerFiles(rule, sources) for trigger_file in trigger_files: # Remove trigger_file from excluded_sources to let the rule be triggered # (e.g. rule trigger ax_enums.idl is added to excluded_sources # because it's also in an action's inputs in the same project) excluded_sources.discard(_FixPath(trigger_file)) # Done if not processing outputs as sources. if int(rule.get('process_outputs_as_sources', False)): inputs, outputs = _RuleInputsAndOutputs(rule, trigger_file) inputs = OrderedSet(_FixPaths(inputs)) outputs = OrderedSet(_FixPaths(outputs)) inputs.remove(_FixPath(trigger_file)) sources.update(inputs) if not is_msbuild: excluded_sources.update(inputs) sources.update(outputs) def _FilterActionsFromExcluded(excluded_sources, actions_to_add): """Take inputs with actions attached out of the list of exclusions. Arguments: excluded_sources: list of source files not to be built. actions_to_add: dict of actions keyed on source file they're attached to. Returns: excluded_sources with files that have actions attached removed. """ must_keep = OrderedSet(_FixPaths(actions_to_add.keys())) return [s for s in excluded_sources if s not in must_keep] def _GetDefaultConfiguration(spec): return spec['configurations'][spec['default_configuration']] def _GetGuidOfProject(proj_path, spec): """Get the guid for the project. Arguments: proj_path: Path of the vcproj or vcxproj file to generate. spec: The target dictionary containing the properties of the target. Returns: the guid. Raises: ValueError: if the specified GUID is invalid. """ # Pluck out the default configuration. default_config = _GetDefaultConfiguration(spec) # Decide the guid of the project. guid = default_config.get('msvs_guid') if guid: if VALID_MSVS_GUID_CHARS.match(guid) is None: raise ValueError('Invalid MSVS guid: "%s". Must match regex: "%s".' % (guid, VALID_MSVS_GUID_CHARS.pattern)) guid = '{%s}' % guid guid = guid or MSVSNew.MakeGuid(proj_path) return guid def _GetMsbuildToolsetOfProject(proj_path, spec, version): """Get the platform toolset for the project. Arguments: proj_path: Path of the vcproj or vcxproj file to generate. spec: The target dictionary containing the properties of the target. version: The MSVSVersion object. Returns: the platform toolset string or None. """ # Pluck out the default configuration. default_config = _GetDefaultConfiguration(spec) toolset = default_config.get('msbuild_toolset') if not toolset and version.DefaultToolset(): toolset = version.DefaultToolset() return toolset def _GenerateProject(project, options, version, generator_flags): """Generates a vcproj file. Arguments: project: the MSVSProject object. options: global generator options. version: the MSVSVersion object. generator_flags: dict of generator-specific flags. Returns: A list of source files that cannot be found on disk. """ default_config = _GetDefaultConfiguration(project.spec) # Skip emitting anything if told to with msvs_existing_vcproj option. 
if default_config.get('msvs_existing_vcproj'): return [] if version.UsesVcxproj(): return _GenerateMSBuildProject(project, options, version, generator_flags) else: return _GenerateMSVSProject(project, options, version, generator_flags) # TODO: Avoid code duplication with _ValidateSourcesForOSX in make.py. def _ValidateSourcesForMSVSProject(spec, version): """Makes sure that duplicate basenames are not specified in the source list. Arguments: spec: The target dictionary containing the properties of the target. version: The VisualStudioVersion object. """ # This validation should not be applied to MSVC2010 and later. assert not version.UsesVcxproj() # TODO: Check if MSVC allows this for loadable_module targets. if spec.get('type', None) not in ('static_library', 'shared_library'): return sources = spec.get('sources', []) basenames = {} for source in sources: name, ext = os.path.splitext(source) is_compiled_file = ext in [ '.c', '.cc', '.cpp', '.cxx', '.m', '.mm', '.s', '.S'] if not is_compiled_file: continue basename = os.path.basename(name) # Don't include extension. basenames.setdefault(basename, []).append(source) error = '' for basename, files in basenames.iteritems(): if len(files) > 1: error += ' %s: %s\n' % (basename, ' '.join(files)) if error: print('static library %s has several files with the same basename:\n' % spec['target_name'] + error + 'MSVC08 cannot handle that.') raise GypError('Duplicate basenames in sources section, see list above') def _GenerateMSVSProject(project, options, version, generator_flags): """Generates a .vcproj file. It may create .rules and .user files too. Arguments: project: The project object we will generate the file for. options: Global options passed to the generator. version: The VisualStudioVersion object. generator_flags: dict of generator-specific flags. """ spec = project.spec gyp.common.EnsureDirExists(project.path) platforms = _GetUniquePlatforms(spec) p = MSVSProject.Writer(project.path, version, spec['target_name'], project.guid, platforms) # Get directory project file is in. project_dir = os.path.split(project.path)[0] gyp_path = _NormalizedSource(project.build_file) relative_path_of_gyp_file = gyp.common.RelativePath(gyp_path, project_dir) config_type = _GetMSVSConfigurationType(spec, project.build_file) for config_name, config in spec['configurations'].iteritems(): _AddConfigurationToMSVSProject(p, spec, config_type, config_name, config) # MSVC08 and prior versions cannot handle duplicate basenames in the same # target. # TODO: Take excluded sources into consideration if possible. _ValidateSourcesForMSVSProject(spec, version) # Prepare list of sources and excluded sources. gyp_file = os.path.split(project.build_file)[1] sources, excluded_sources = _PrepareListOfSources(spec, generator_flags, gyp_file) # Add rules. actions_to_add = {} _GenerateRulesForMSVS(p, project_dir, options, spec, sources, excluded_sources, actions_to_add) list_excluded = generator_flags.get('msvs_list_excluded_files', True) sources, excluded_sources, excluded_idl = ( _AdjustSourcesAndConvertToFilterHierarchy(spec, options, project_dir, sources, excluded_sources, list_excluded, version)) # Add in files. missing_sources = _VerifySourcesExist(sources, project_dir) p.AddFiles(sources) _AddToolFilesToMSVS(p, spec) _HandlePreCompiledHeaders(p, sources, spec) _AddActions(actions_to_add, spec, relative_path_of_gyp_file) _AddCopies(actions_to_add, spec) _WriteMSVSUserFile(project.path, version, spec) # NOTE: this stanza must appear after all actions have been decided.
# Don't exclude sources with actions attached, or they won't run. excluded_sources = _FilterActionsFromExcluded( excluded_sources, actions_to_add) _ExcludeFilesFromBeingBuilt(p, spec, excluded_sources, excluded_idl, list_excluded) _AddAccumulatedActionsToMSVS(p, spec, actions_to_add) # Write it out. p.WriteIfChanged() return missing_sources def _GetUniquePlatforms(spec): """Returns the list of unique platforms for this spec, e.g. ['Win32', ...]. Arguments: spec: The target dictionary containing the properties of the target. Returns: The list of unique platform names. """ # Gather list of unique platforms. platforms = OrderedSet() for configuration in spec['configurations']: platforms.add(_ConfigPlatform(spec['configurations'][configuration])) platforms = list(platforms) return platforms def _CreateMSVSUserFile(proj_path, version, spec): """Generates a .user file for the user running this Gyp program. Arguments: proj_path: The path of the project file being created. The .user file shares the same path (with an appropriate suffix). version: The VisualStudioVersion object. spec: The target dictionary containing the properties of the target. Returns: The MSVSUserFile object created. """ (domain, username) = _GetDomainAndUserName() vcuser_filename = '.'.join([proj_path, domain, username, 'user']) user_file = MSVSUserFile.Writer(vcuser_filename, version, spec['target_name']) return user_file def _GetMSVSConfigurationType(spec, build_file): """Returns the configuration type for this project. It's a number defined by Microsoft. May raise an exception. Args: spec: The target dictionary containing the properties of the target. build_file: The path of the gyp file. Returns: A string containing the configuration type number. """ try: config_type = { 'executable': '1', # .exe 'shared_library': '2', # .dll 'loadable_module': '2', # .dll 'static_library': '4', # .lib 'none': '10', # Utility type }[spec['type']] except KeyError: if spec.get('type'): raise GypError('Target type %s is not a valid target type for ' 'target %s in %s.' % (spec['type'], spec['target_name'], build_file)) else: raise GypError('Missing type field for target %s in %s.' % (spec['target_name'], build_file)) return config_type def _AddConfigurationToMSVSProject(p, spec, config_type, config_name, config): """Adds a configuration to the MSVS project. Many settings in a vcproj file are specific to a configuration. This function adds the main part of the vcproj file that is configuration specific. Arguments: p: The target project being generated. spec: The target dictionary containing the properties of the target. config_type: The configuration type, a number as defined by Microsoft. config_name: The name of the configuration. config: The dictionary that defines the special processing to be done for this configuration. """ # Get the information for this configuration. include_dirs, midl_include_dirs, resource_include_dirs = \ _GetIncludeDirs(config) libraries = _GetLibraries(spec) library_dirs = _GetLibraryDirs(config) out_file, vc_tool, _ = _GetOutputFilePathAndTool(spec, msbuild=False) defines = _GetDefines(config) defines = [_EscapeCppDefineForMSVS(d) for d in defines] disabled_warnings = _GetDisabledWarnings(config) prebuild = config.get('msvs_prebuild') postbuild = config.get('msvs_postbuild') def_file = _GetModuleDefinition(spec) precompiled_header = config.get('msvs_precompiled_header') # Prepare the list of tools as a dictionary. tools = dict() # Add in user specified msvs_settings.
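# (Editorial example: a configuration carrying
#   'msvs_settings': {'VCLinkerTool': {'SubSystem': '2'}}
# is fed tool-by-tool and setting-by-setting through _ToolAppend below,
# landing in the same tools dict as the generated defaults.)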
msvs_settings = config.get('msvs_settings', {}) MSVSSettings.ValidateMSVSSettings(msvs_settings) # Prevent default library inheritance from the environment. _ToolAppend(tools, 'VCLinkerTool', 'AdditionalDependencies', ['$(NOINHERIT)']) for tool in msvs_settings: settings = config['msvs_settings'][tool] for setting in settings: _ToolAppend(tools, tool, setting, settings[setting]) # Add the information to the appropriate tool _ToolAppend(tools, 'VCCLCompilerTool', 'AdditionalIncludeDirectories', include_dirs) _ToolAppend(tools, 'VCMIDLTool', 'AdditionalIncludeDirectories', midl_include_dirs) _ToolAppend(tools, 'VCResourceCompilerTool', 'AdditionalIncludeDirectories', resource_include_dirs) # Add in libraries. _ToolAppend(tools, 'VCLinkerTool', 'AdditionalDependencies', libraries) _ToolAppend(tools, 'VCLinkerTool', 'AdditionalLibraryDirectories', library_dirs) if out_file: _ToolAppend(tools, vc_tool, 'OutputFile', out_file, only_if_unset=True) # Add defines. _ToolAppend(tools, 'VCCLCompilerTool', 'PreprocessorDefinitions', defines) _ToolAppend(tools, 'VCResourceCompilerTool', 'PreprocessorDefinitions', defines) # Change program database directory to prevent collisions. _ToolAppend(tools, 'VCCLCompilerTool', 'ProgramDataBaseFileName', '$(IntDir)$(ProjectName)\\vc80.pdb', only_if_unset=True) # Add disabled warnings. _ToolAppend(tools, 'VCCLCompilerTool', 'DisableSpecificWarnings', disabled_warnings) # Add Pre-build. _ToolAppend(tools, 'VCPreBuildEventTool', 'CommandLine', prebuild) # Add Post-build. _ToolAppend(tools, 'VCPostBuildEventTool', 'CommandLine', postbuild) # Turn on precompiled headers if appropriate. if precompiled_header: precompiled_header = os.path.split(precompiled_header)[1] _ToolAppend(tools, 'VCCLCompilerTool', 'UsePrecompiledHeader', '2') _ToolAppend(tools, 'VCCLCompilerTool', 'PrecompiledHeaderThrough', precompiled_header) _ToolAppend(tools, 'VCCLCompilerTool', 'ForcedIncludeFiles', precompiled_header) # Loadable modules don't generate import libraries; # tell dependent projects to not expect one. if spec['type'] == 'loadable_module': _ToolAppend(tools, 'VCLinkerTool', 'IgnoreImportLibrary', 'true') # Set the module definition file if any. if def_file: _ToolAppend(tools, 'VCLinkerTool', 'ModuleDefinitionFile', def_file) _AddConfigurationToMSVS(p, spec, tools, config, config_type, config_name) def _GetIncludeDirs(config): """Returns the list of directories to be used for #include directives. Arguments: config: The dictionary that defines the special processing to be done for this configuration. Returns: The list of directory paths. """ # TODO(bradnelson): include_dirs should really be flexible enough not to # require this sort of thing. include_dirs = ( config.get('include_dirs', []) + config.get('msvs_system_include_dirs', [])) midl_include_dirs = ( config.get('midl_include_dirs', []) + config.get('msvs_system_include_dirs', [])) resource_include_dirs = config.get('resource_include_dirs', include_dirs) include_dirs = _FixPaths(include_dirs) midl_include_dirs = _FixPaths(midl_include_dirs) resource_include_dirs = _FixPaths(resource_include_dirs) return include_dirs, midl_include_dirs, resource_include_dirs def _GetLibraryDirs(config): """Returns the list of directories to be used for library search paths. Arguments: config: The dictionary that defines the special processing to be done for this configuration. Returns: The list of directory paths. 
""" library_dirs = config.get('library_dirs', []) library_dirs = _FixPaths(library_dirs) return library_dirs def _GetLibraries(spec): """Returns the list of libraries for this configuration. Arguments: spec: The target dictionary containing the properties of the target. Returns: The list of directory paths. """ libraries = spec.get('libraries', []) # Strip out -l, as it is not used on windows (but is needed so we can pass # in libraries that are assumed to be in the default library path). # Also remove duplicate entries, leaving only the last duplicate, while # preserving order. found = OrderedSet() unique_libraries_list = [] for entry in reversed(libraries): library = re.sub(r'^\-l', '', entry) if not os.path.splitext(library)[1]: library += '.lib' if library not in found: found.add(library) unique_libraries_list.append(library) unique_libraries_list.reverse() return unique_libraries_list def _GetOutputFilePathAndTool(spec, msbuild): """Returns the path and tool to use for this target. Figures out the path of the file this spec will create and the name of the VC tool that will create it. Arguments: spec: The target dictionary containing the properties of the target. Returns: A triple of (file path, name of the vc tool, name of the msbuild tool) """ # Select a name for the output file. out_file = '' vc_tool = '' msbuild_tool = '' output_file_map = { 'executable': ('VCLinkerTool', 'Link', '$(OutDir)', '.exe'), 'shared_library': ('VCLinkerTool', 'Link', '$(OutDir)', '.dll'), 'loadable_module': ('VCLinkerTool', 'Link', '$(OutDir)', '.dll'), 'static_library': ('VCLibrarianTool', 'Lib', '$(OutDir)lib\\', '.lib'), } output_file_props = output_file_map.get(spec['type']) if output_file_props and int(spec.get('msvs_auto_output_file', 1)): vc_tool, msbuild_tool, out_dir, suffix = output_file_props if spec.get('standalone_static_library', 0): out_dir = '$(OutDir)' out_dir = spec.get('product_dir', out_dir) product_extension = spec.get('product_extension') if product_extension: suffix = '.' + product_extension elif msbuild: suffix = '$(TargetExt)' prefix = spec.get('product_prefix', '') product_name = spec.get('product_name', '$(ProjectName)') out_file = ntpath.join(out_dir, prefix + product_name + suffix) return out_file, vc_tool, msbuild_tool def _GetOutputTargetExt(spec): """Returns the extension for this target, including the dot If product_extension is specified, set target_extension to this to avoid MSB8012, returns None otherwise. Ignores any target_extension settings in the input files. Arguments: spec: The target dictionary containing the properties of the target. Returns: A string with the extension, or None """ target_extension = spec.get('product_extension') if target_extension: return '.' + target_extension return None def _GetDefines(config): """Returns the list of preprocessor definitions for this configuation. Arguments: config: The dictionary that defines the special processing to be done for this configuration. Returns: The list of preprocessor definitions. 
""" defines = [] for d in config.get('defines', []): if type(d) == list: fd = '='.join([str(dpart) for dpart in d]) else: fd = str(d) defines.append(fd) return defines def _GetDisabledWarnings(config): return [str(i) for i in config.get('msvs_disabled_warnings', [])] def _GetModuleDefinition(spec): def_file = '' if spec['type'] in ['shared_library', 'loadable_module', 'executable']: def_files = [s for s in spec.get('sources', []) if s.endswith('.def')] if len(def_files) == 1: def_file = _FixPath(def_files[0]) elif def_files: raise ValueError( 'Multiple module definition files in one target, target %s lists ' 'multiple .def files: %s' % ( spec['target_name'], ' '.join(def_files))) return def_file def _ConvertToolsToExpectedForm(tools): """Convert tools to a form expected by Visual Studio. Arguments: tools: A dictionary of settings; the tool name is the key. Returns: A list of Tool objects. """ tool_list = [] for tool, settings in tools.iteritems(): # Collapse settings with lists. settings_fixed = {} for setting, value in settings.iteritems(): if type(value) == list: if ((tool == 'VCLinkerTool' and setting == 'AdditionalDependencies') or setting == 'AdditionalOptions'): settings_fixed[setting] = ' '.join(value) else: settings_fixed[setting] = ';'.join(value) else: settings_fixed[setting] = value # Add in this tool. tool_list.append(MSVSProject.Tool(tool, settings_fixed)) return tool_list def _AddConfigurationToMSVS(p, spec, tools, config, config_type, config_name): """Add to the project file the configuration specified by config. Arguments: p: The target project being generated. spec: the target project dict. tools: A dictionary of settings; the tool name is the key. config: The dictionary that defines the special processing to be done for this configuration. config_type: The configuration type, a number as defined by Microsoft. config_name: The name of the configuration. """ attributes = _GetMSVSAttributes(spec, config, config_type) # Add in this configuration. tool_list = _ConvertToolsToExpectedForm(tools) p.AddConfig(_ConfigFullName(config_name, config), attrs=attributes, tools=tool_list) def _GetMSVSAttributes(spec, config, config_type): # Prepare configuration attributes. prepared_attrs = {} source_attrs = config.get('msvs_configuration_attributes', {}) for a in source_attrs: prepared_attrs[a] = source_attrs[a] # Add props files. vsprops_dirs = config.get('msvs_props', []) vsprops_dirs = _FixPaths(vsprops_dirs) if vsprops_dirs: prepared_attrs['InheritedPropertySheets'] = ';'.join(vsprops_dirs) # Set configuration type. prepared_attrs['ConfigurationType'] = config_type output_dir = prepared_attrs.get('OutputDirectory', '$(SolutionDir)$(ConfigurationName)') prepared_attrs['OutputDirectory'] = _FixPath(output_dir) + '\\' if 'IntermediateDirectory' not in prepared_attrs: intermediate = '$(ConfigurationName)\\obj\\$(ProjectName)' prepared_attrs['IntermediateDirectory'] = _FixPath(intermediate) + '\\' else: intermediate = _FixPath(prepared_attrs['IntermediateDirectory']) + '\\' intermediate = MSVSSettings.FixVCMacroSlashes(intermediate) prepared_attrs['IntermediateDirectory'] = intermediate return prepared_attrs def _AddNormalizedSources(sources_set, sources_array): sources_set.update(_NormalizedSource(s) for s in sources_array) def _PrepareListOfSources(spec, generator_flags, gyp_file): """Prepare list of sources and excluded sources. Besides the sources specified directly in the spec, adds the gyp file so that a change to it will cause a re-compile. 
Also adds appropriate sources for actions and copies. Assumes later stage will un-exclude files which have custom build steps attached. Arguments: spec: The target dictionary containing the properties of the target. gyp_file: The name of the gyp file. Returns: A pair of (list of sources, list of excluded sources). The sources will be relative to the gyp file. """ sources = OrderedSet() _AddNormalizedSources(sources, spec.get('sources', [])) excluded_sources = OrderedSet() # Add in the gyp file. if not generator_flags.get('standalone'): sources.add(gyp_file) # Add in 'action' inputs and outputs. for a in spec.get('actions', []): inputs = a['inputs'] inputs = [_NormalizedSource(i) for i in inputs] # Add all inputs to sources and excluded sources. inputs = OrderedSet(inputs) sources.update(inputs) if not spec.get('msvs_external_builder'): excluded_sources.update(inputs) if int(a.get('process_outputs_as_sources', False)): _AddNormalizedSources(sources, a.get('outputs', [])) # Add in 'copies' inputs and outputs. for cpy in spec.get('copies', []): _AddNormalizedSources(sources, cpy.get('files', [])) return (sources, excluded_sources) def _AdjustSourcesAndConvertToFilterHierarchy( spec, options, gyp_dir, sources, excluded_sources, list_excluded, version): """Adjusts the list of sources and excluded sources. Also converts the sets to lists. Arguments: spec: The target dictionary containing the properties of the target. options: Global generator options. gyp_dir: The path to the gyp file being processed. sources: A set of sources to be included for this project. excluded_sources: A set of sources to be excluded for this project. version: A MSVSVersion object. Returns: A trio of (list of sources, list of excluded sources, path of excluded IDL file) """ # Exclude excluded sources coming into the generator. excluded_sources.update(OrderedSet(spec.get('sources_excluded', []))) # Add excluded sources into sources for good measure. sources.update(excluded_sources) # Convert to proper windows form. # NOTE: sources goes from being a set to a list here. # NOTE: excluded_sources goes from being a set to a list here. sources = _FixPaths(sources) # Convert to proper windows form. excluded_sources = _FixPaths(excluded_sources) excluded_idl = _IdlFilesHandledNonNatively(spec, sources) precompiled_related = _GetPrecompileRelatedFiles(spec) # Find the excluded ones, minus the precompiled header related ones. fully_excluded = [i for i in excluded_sources if i not in precompiled_related] # Convert to folders and the right slashes. sources = [i.split('\\') for i in sources] sources = _ConvertSourcesToFilterHierarchy(sources, excluded=fully_excluded, list_excluded=list_excluded, msvs_version=version) # Prune filters with a single child to flatten ugly directory structures # such as ../../src/modules/module1 etc. if version.UsesVcxproj(): while all([isinstance(s, MSVSProject.Filter) for s in sources]) \ and len(set([s.name for s in sources])) == 1: assert all([len(s.contents) == 1 for s in sources]) sources = [s.contents[0] for s in sources] else: while len(sources) == 1 and isinstance(sources[0], MSVSProject.Filter): sources = sources[0].contents return sources, excluded_sources, excluded_idl def _IdlFilesHandledNonNatively(spec, sources): # If any non-native rules use 'idl' as an extension exclude idl files. # Gather a list here to use later. 
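# For example, a hypothetical rule such as
#   {'rule_name': 'widl', 'extension': 'idl', 'msvs_external_rule': 1, ...}
# routes .idl files through a custom build step instead of the native MIDL
# tool, so every .idl source must be collected here and excluded from the
# native build.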
using_idl = False for rule in spec.get('rules', []): if rule['extension'] == 'idl' and int(rule.get('msvs_external_rule', 0)): using_idl = True break if using_idl: excluded_idl = [i for i in sources if i.endswith('.idl')] else: excluded_idl = [] return excluded_idl def _GetPrecompileRelatedFiles(spec): # Gather a list of precompiled header related sources. precompiled_related = [] for _, config in spec['configurations'].iteritems(): for k in precomp_keys: f = config.get(k) if f: precompiled_related.append(_FixPath(f)) return precompiled_related def _ExcludeFilesFromBeingBuilt(p, spec, excluded_sources, excluded_idl, list_excluded): exclusions = _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl) for file_name, excluded_configs in exclusions.iteritems(): if (not list_excluded and len(excluded_configs) == len(spec['configurations'])): # If we're not listing excluded files, then they won't appear in the # project, so don't try to configure them to be excluded. pass else: for config_name, config in excluded_configs: p.AddFileConfig(file_name, _ConfigFullName(config_name, config), {'ExcludedFromBuild': 'true'}) def _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl): exclusions = {} # Exclude excluded sources from being built. for f in excluded_sources: excluded_configs = [] for config_name, config in spec['configurations'].iteritems(): precomped = [_FixPath(config.get(i, '')) for i in precomp_keys] # Don't do this for ones that are precompiled header related. if f not in precomped: excluded_configs.append((config_name, config)) exclusions[f] = excluded_configs # If any non-native rules use 'idl' as an extension, exclude idl files. # Exclude them now. for f in excluded_idl: excluded_configs = [] for config_name, config in spec['configurations'].iteritems(): excluded_configs.append((config_name, config)) exclusions[f] = excluded_configs return exclusions def _AddToolFilesToMSVS(p, spec): # Add in tool files (rules). tool_files = OrderedSet() for _, config in spec['configurations'].iteritems(): for f in config.get('msvs_tool_files', []): tool_files.add(f) for f in tool_files: p.AddToolFile(f) def _HandlePreCompiledHeaders(p, sources, spec): # Pre-compiled header source stubs need a different compiler flag # (generate precompiled header) and any source file not of the same # kind (i.e. C vs. C++) as the precompiled header source stub needs # to have use of precompiled headers disabled. extensions_excluded_from_precompile = [] for config_name, config in spec['configurations'].iteritems(): source = config.get('msvs_precompiled_source') if source: source = _FixPath(source) # UsePrecompiledHeader=1 if using precompiled headers.
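# Sketch of the data involved (file names invented): a configuration with
#   'msvs_precompiled_header': 'pch.h', 'msvs_precompiled_source': 'pch.cc'
# marks the stub pch.cc below with UsePrecompiledHeader=1 ('create'), while
# sources of the other language (.c vs. .cc/.cpp/.cxx) get precompiled
# headers disabled further down.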
tool = MSVSProject.Tool('VCCLCompilerTool', {'UsePrecompiledHeader': '1'}) p.AddFileConfig(source, _ConfigFullName(config_name, config), {}, tools=[tool]) basename, extension = os.path.splitext(source) if extension == '.c': extensions_excluded_from_precompile = ['.cc', '.cpp', '.cxx'] else: extensions_excluded_from_precompile = ['.c'] def DisableForSourceTree(source_tree): for source in source_tree: if isinstance(source, MSVSProject.Filter): DisableForSourceTree(source.contents) else: basename, extension = os.path.splitext(source) if extension in extensions_excluded_from_precompile: for config_name, config in spec['configurations'].iteritems(): tool = MSVSProject.Tool('VCCLCompilerTool', {'UsePrecompiledHeader': '0', 'ForcedIncludeFiles': '$(NOINHERIT)'}) p.AddFileConfig(_FixPath(source), _ConfigFullName(config_name, config), {}, tools=[tool]) # Do nothing if there was no precompiled source. if extensions_excluded_from_precompile: DisableForSourceTree(sources) def _AddActions(actions_to_add, spec, relative_path_of_gyp_file): # Add actions. actions = spec.get('actions', []) # Don't setup_env every time. When all the actions are run together in one # batch file in VS, the PATH will grow too long. # Membership in this set means that the cygwin environment has been set up, # and does not need to be set up again. have_setup_env = set() for a in actions: # Attach actions to the gyp file if nothing else is there. inputs = a.get('inputs') or [relative_path_of_gyp_file] attached_to = inputs[0] need_setup_env = attached_to not in have_setup_env cmd = _BuildCommandLineForRule(spec, a, has_input_path=False, do_setup_env=need_setup_env) have_setup_env.add(attached_to) # Add the action. _AddActionStep(actions_to_add, inputs=inputs, outputs=a.get('outputs', []), description=a.get('message', a['action_name']), command=cmd) def _WriteMSVSUserFile(project_path, version, spec): # Add run_as and test targets. if 'run_as' in spec: run_as = spec['run_as'] action = run_as.get('action', []) environment = run_as.get('environment', []) working_directory = run_as.get('working_directory', '.') elif int(spec.get('test', 0)): action = ['$(TargetPath)', '--gtest_print_time'] environment = [] working_directory = '.' else: return # Nothing to add # Write out the user file. user_file = _CreateMSVSUserFile(project_path, version, spec) for config_name, c_data in spec['configurations'].iteritems(): user_file.AddDebugSettings(_ConfigFullName(config_name, c_data), action, environment, working_directory) user_file.WriteIfChanged() def _AddCopies(actions_to_add, spec): copies = _GetCopies(spec) for inputs, outputs, cmd, description in copies: _AddActionStep(actions_to_add, inputs=inputs, outputs=outputs, description=description, command=cmd) def _GetCopies(spec): copies = [] # Add copies. for cpy in spec.get('copies', []): for src in cpy.get('files', []): dst = os.path.join(cpy['destination'], os.path.basename(src)) # _AddCustomBuildToolForMSVS() will call _FixPath() on the inputs and # outputs, so do the same for our generated command line. 
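# For a rough sketch with invented paths: copying 'data/img.png' into 'out'
# produces a command along the lines of
#   mkdir "out" 2>nul & set ERRORLEVEL=0 & copy /Y "data\img.png" "out\img.png"
# whereas a trailing-slash source such as 'data/icons/' is mirrored
# recursively with xcopy, as the two branches below show.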
if src.endswith('/'): src_bare = src[:-1] base_dir = posixpath.split(src_bare)[0] outer_dir = posixpath.split(src_bare)[1] cmd = 'cd "%s" && xcopy /e /f /y "%s" "%s\\%s\\"' % ( _FixPath(base_dir), outer_dir, _FixPath(dst), outer_dir) copies.append(([src], ['dummy_copies', dst], cmd, 'Copying %s to %s' % (src, dst))) else: cmd = 'mkdir "%s" 2>nul & set ERRORLEVEL=0 & copy /Y "%s" "%s"' % ( _FixPath(cpy['destination']), _FixPath(src), _FixPath(dst)) copies.append(([src], [dst], cmd, 'Copying %s to %s' % (src, dst))) return copies def _GetPathDict(root, path): # |path| will eventually be empty (in the recursive calls) if it was initially # relative; otherwise it will eventually end up as '\', 'D:\', etc. if not path or path.endswith(os.sep): return root parent, folder = os.path.split(path) parent_dict = _GetPathDict(root, parent) if folder not in parent_dict: parent_dict[folder] = dict() return parent_dict[folder] def _DictsToFolders(base_path, bucket, flat): # Convert to folders recursively. children = [] for folder, contents in bucket.iteritems(): if type(contents) == dict: folder_children = _DictsToFolders(os.path.join(base_path, folder), contents, flat) if flat: children += folder_children else: folder_children = MSVSNew.MSVSFolder(os.path.join(base_path, folder), name='(' + folder + ')', entries=folder_children) children.append(folder_children) else: children.append(contents) return children def _CollapseSingles(parent, node): # Recursively explore the tree of dicts looking for projects which are # the sole item in a folder which has the same name as the project. Bring # such projects up one level. if (type(node) == dict and len(node) == 1 and node.keys()[0] == parent + '.vcproj'): return node[node.keys()[0]] if type(node) != dict: return node for child in node: node[child] = _CollapseSingles(child, node[child]) return node def _GatherSolutionFolders(sln_projects, project_objects, flat): root = {} # Convert into a tree of dicts on path. for p in sln_projects: gyp_file, target = gyp.common.ParseQualifiedTarget(p)[0:2] gyp_dir = os.path.dirname(gyp_file) path_dict = _GetPathDict(root, gyp_dir) path_dict[target + '.vcproj'] = project_objects[p] # Walk down from the top until we hit a folder that has more than one entry. # In practice, this strips the top-level "src/" dir from the hierarchy in # the solution. while len(root) == 1 and type(root[root.keys()[0]]) == dict: root = root[root.keys()[0]] # Collapse singles. root = _CollapseSingles('', root) # Merge buckets until everything is a root entry. return _DictsToFolders('', root, flat) def _GetPathOfProject(qualified_target, spec, options, msvs_version): default_config = _GetDefaultConfiguration(spec) proj_filename = default_config.get('msvs_existing_vcproj') if not proj_filename: proj_filename = (spec['target_name'] + options.suffix + msvs_version.ProjectExtension()) build_file = gyp.common.BuildFile(qualified_target) proj_path = os.path.join(os.path.dirname(build_file), proj_filename) fix_prefix = None if options.generator_output: project_dir_path = os.path.dirname(os.path.abspath(proj_path)) proj_path = os.path.join(options.generator_output, proj_path) fix_prefix = gyp.common.RelativePath(project_dir_path, os.path.dirname(proj_path)) return proj_path, fix_prefix def _GetPlatformOverridesOfProject(spec): # Prepare a dict indicating which project configurations are used for which # solution configurations for this target.
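# For example (names invented): a configuration 'Debug' carrying
#   'msvs_target_platform': 'ARM'
# produces an override entry like {'Debug|Win32': 'Debug|ARM'}, meaning the
# solution's Debug|Win32 row builds this project's Debug|ARM configuration.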
config_platform_overrides = {} for config_name, c in spec['configurations'].iteritems(): config_fullname = _ConfigFullName(config_name, c) platform = c.get('msvs_target_platform', _ConfigPlatform(c)) fixed_config_fullname = '%s|%s' % ( _ConfigBaseName(config_name, _ConfigPlatform(c)), platform) config_platform_overrides[config_fullname] = fixed_config_fullname return config_platform_overrides def _CreateProjectObjects(target_list, target_dicts, options, msvs_version): """Create an MSVSProject object for each target found in the target list. Arguments: target_list: the list of targets to generate project objects for. target_dicts: the dictionary of specifications. options: global generator options. msvs_version: the MSVSVersion object. Returns: A dictionary of created projects, keyed by qualified target. """ global fixpath_prefix # Generate each project. projects = {} for qualified_target in target_list: spec = target_dicts[qualified_target] if spec['toolset'] != 'target': raise GypError( 'Multiple toolsets not supported in msvs build (target %s)' % qualified_target) proj_path, fixpath_prefix = _GetPathOfProject(qualified_target, spec, options, msvs_version) guid = _GetGuidOfProject(proj_path, spec) overrides = _GetPlatformOverridesOfProject(spec) build_file = gyp.common.BuildFile(qualified_target) # Create object for this project. obj = MSVSNew.MSVSProject( proj_path, name=spec['target_name'], guid=guid, spec=spec, build_file=build_file, config_platform_overrides=overrides, fixpath_prefix=fixpath_prefix) # Set project toolset if any (MS build only). if msvs_version.UsesVcxproj(): obj.set_msbuild_toolset( _GetMsbuildToolsetOfProject(proj_path, spec, msvs_version)) projects[qualified_target] = obj # Set all the dependencies, but not if we are using an external builder like # ninja. for project in projects.values(): if not project.spec.get('msvs_external_builder'): deps = project.spec.get('dependencies', []) deps = [projects[d] for d in deps] project.set_dependencies(deps) return projects def _InitNinjaFlavor(params, target_list, target_dicts): """Initialize targets for the ninja flavor. This sets up the necessary variables in the targets to generate msvs projects that use ninja as an external builder. The variables in the spec are only set if they have not been set. This allows individual specs to override the default values initialized here. Arguments: params: Params provided to the generator. target_list: List of target pairs: 'base/base.gyp:base'. target_dicts: Dict of target properties keyed on target pair. """ for qualified_target in target_list: spec = target_dicts[qualified_target] if spec.get('msvs_external_builder'): # The spec explicitly defined an external builder, so don't change it.
continue path_to_ninja = spec.get('msvs_path_to_ninja', 'ninja.exe') spec['msvs_external_builder'] = 'ninja' if not spec.get('msvs_external_builder_out_dir'): gyp_file, _, _ = gyp.common.ParseQualifiedTarget(qualified_target) gyp_dir = os.path.dirname(gyp_file) configuration = '$(Configuration)' if params.get('target_arch') == 'x64': configuration += '_x64' spec['msvs_external_builder_out_dir'] = os.path.join( gyp.common.RelativePath(params['options'].toplevel_dir, gyp_dir), ninja_generator.ComputeOutputDir(params), configuration) if not spec.get('msvs_external_builder_build_cmd'): spec['msvs_external_builder_build_cmd'] = [ path_to_ninja, '-C', '$(OutDir)', '$(ProjectName)', ] if not spec.get('msvs_external_builder_clean_cmd'): spec['msvs_external_builder_clean_cmd'] = [ path_to_ninja, '-C', '$(OutDir)', '-tclean', '$(ProjectName)', ] def CalculateVariables(default_variables, params): """Generated variables that require params to be known.""" generator_flags = params.get('generator_flags', {}) # Select project file format version (if unset, default to auto detecting). msvs_version = MSVSVersion.SelectVisualStudioVersion( generator_flags.get('msvs_version', 'auto')) # Stash msvs_version for later (so we don't have to probe the system twice). params['msvs_version'] = msvs_version # Set a variable so conditions can be based on msvs_version. default_variables['MSVS_VERSION'] = msvs_version.ShortName() # To determine processor word size on Windows, in addition to checking # PROCESSOR_ARCHITECTURE (which reflects the word size of the current # process), it is also necessary to check PROCESSOR_ARCHITEW6432 (which # contains the actual word size of the system when running through WOW64). if (os.environ.get('PROCESSOR_ARCHITECTURE', '').find('64') >= 0 or os.environ.get('PROCESSOR_ARCHITEW6432', '').find('64') >= 0): default_variables['MSVS_OS_BITS'] = 64 else: default_variables['MSVS_OS_BITS'] = 32 if gyp.common.GetFlavor(params) == 'ninja': default_variables['SHARED_INTERMEDIATE_DIR'] = '$(OutDir)gen' def PerformBuild(data, configurations, params): options = params['options'] msvs_version = params['msvs_version'] devenv = os.path.join(msvs_version.path, 'Common7', 'IDE', 'devenv.com') for build_file, build_file_dict in data.iteritems(): (build_file_root, build_file_ext) = os.path.splitext(build_file) if build_file_ext != '.gyp': continue sln_path = build_file_root + options.suffix + '.sln' if options.generator_output: sln_path = os.path.join(options.generator_output, sln_path) for config in configurations: arguments = [devenv, sln_path, '/Build', config] print 'Building [%s]: %s' % (config, arguments) rtn = subprocess.check_call(arguments) def GenerateOutput(target_list, target_dicts, data, params): """Generate .sln and .vcproj files. This is the entry point for this generator. Arguments: target_list: List of target pairs: 'base/base.gyp:base'. target_dicts: Dict of target properties keyed on target pair. data: Dictionary containing per .gyp data. """ global fixpath_prefix options = params['options'] # Get the project file format version back out of where we stashed it in # GeneratorCalculatedVariables. msvs_version = params['msvs_version'] generator_flags = params.get('generator_flags', {}) # Optionally shard targets marked with 'msvs_shard': SHARD_COUNT. (target_list, target_dicts) = MSVSUtil.ShardTargets(target_list, target_dicts) # Optionally use the large PDB workaround for targets marked with # 'msvs_large_pdb': 1.
(target_list, target_dicts) = MSVSUtil.InsertLargePdbShims( target_list, target_dicts, generator_default_variables) # Optionally configure each spec to use ninja as the external builder. if params.get('flavor') == 'ninja': _InitNinjaFlavor(params, target_list, target_dicts) # Prepare the set of configurations. configs = set() for qualified_target in target_list: spec = target_dicts[qualified_target] for config_name, config in spec['configurations'].iteritems(): configs.add(_ConfigFullName(config_name, config)) configs = list(configs) # Figure out all the projects that will be generated and their guids. project_objects = _CreateProjectObjects(target_list, target_dicts, options, msvs_version) # Generate each project. missing_sources = [] for project in project_objects.values(): fixpath_prefix = project.fixpath_prefix missing_sources.extend(_GenerateProject(project, options, msvs_version, generator_flags)) fixpath_prefix = None for build_file in data: # Validate build_file extension. if not build_file.endswith('.gyp'): continue sln_path = os.path.splitext(build_file)[0] + options.suffix + '.sln' if options.generator_output: sln_path = os.path.join(options.generator_output, sln_path) # Get projects in the solution, and their dependents. sln_projects = gyp.common.BuildFileTargets(target_list, build_file) sln_projects += gyp.common.DeepDependencyTargets(target_dicts, sln_projects) # Create folder hierarchy. root_entries = _GatherSolutionFolders( sln_projects, project_objects, flat=msvs_version.FlatSolution()) # Create solution. sln = MSVSNew.MSVSSolution(sln_path, entries=root_entries, variants=configs, websiteProperties=False, version=msvs_version) sln.Write() if missing_sources: error_message = "Missing input files:\n" + \ '\n'.join(set(missing_sources)) if generator_flags.get('msvs_error_on_missing_sources', False): raise GypError(error_message) else: print >> sys.stdout, "Warning: " + error_message def _GenerateMSBuildFiltersFile(filters_path, source_files, rule_dependencies, extension_to_rule_name): """Generate the filters file. This file is used by Visual Studio to organize the presentation of source files into folders. Arguments: filters_path: The path of the file to be created. source_files: The hierarchical structure of all the sources. rule_dependencies: The set of additional files that rules depend on. extension_to_rule_name: A dictionary mapping file extensions to rules. """ filter_group = [] source_group = [] _AppendFiltersForMSBuild('', source_files, rule_dependencies, extension_to_rule_name, filter_group, source_group) if filter_group: content = ['Project', {'ToolsVersion': '4.0', 'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003' }, ['ItemGroup'] + filter_group, ['ItemGroup'] + source_group ] easy_xml.WriteXmlIfChanged(content, filters_path, pretty=True, win32=True) elif os.path.exists(filters_path): # We don't need this filter anymore. Delete the old filter file. os.unlink(filters_path) def _AppendFiltersForMSBuild(parent_filter_name, sources, rule_dependencies, extension_to_rule_name, filter_group, source_group): """Creates the list of filters and sources to be added in the filter file. Args: parent_filter_name: The name of the filter under which the sources are found. sources: The hierarchy of filters and sources to process. rule_dependencies: The set of additional files that rules depend on. extension_to_rule_name: A dictionary mapping file extensions to rules. filter_group: The list to which filter entries will be appended. source_group: The list to which source entries will be appended. """ for source in sources: if isinstance(source, MSVSProject.Filter): # We have a sub-filter.
Create the name of that sub-filter. if not parent_filter_name: filter_name = source.name else: filter_name = '%s\\%s' % (parent_filter_name, source.name) # Add the filter to the group. filter_group.append( ['Filter', {'Include': filter_name}, ['UniqueIdentifier', MSVSNew.MakeGuid(source.name)]]) # Recurse and add its dependents. _AppendFiltersForMSBuild(filter_name, source.contents, rule_dependencies, extension_to_rule_name, filter_group, source_group) else: # It's a source. Create a source entry. _, element = _MapFileToMsBuildSourceType(source, rule_dependencies, extension_to_rule_name) source_entry = [element, {'Include': source}] # Specify the filter it is part of, if any. if parent_filter_name: source_entry.append(['Filter', parent_filter_name]) source_group.append(source_entry) def _MapFileToMsBuildSourceType(source, rule_dependencies, extension_to_rule_name): """Returns the group and element type of the source file. Arguments: source: The source file name. extension_to_rule_name: A dictionary mapping file extensions to rules. Returns: A pair of (group this file should be part of, the label of element) """ _, ext = os.path.splitext(source) if ext in extension_to_rule_name: group = 'rule' element = extension_to_rule_name[ext] elif ext in ['.cc', '.cpp', '.c', '.cxx']: group = 'compile' element = 'ClCompile' elif ext in ['.h', '.hxx']: group = 'include' element = 'ClInclude' elif ext == '.rc': group = 'resource' element = 'ResourceCompile' elif ext == '.asm': group = 'masm' element = 'MASM' elif ext == '.idl': group = 'midl' element = 'Midl' elif source in rule_dependencies: group = 'rule_dependency' element = 'CustomBuild' else: group = 'none' element = 'None' return (group, element) def _GenerateRulesForMSBuild(output_dir, options, spec, sources, excluded_sources, props_files_of_rules, targets_files_of_rules, actions_to_add, rule_dependencies, extension_to_rule_name): # MSBuild rules are implemented using three files: an XML file, a .targets # file and a .props file. # See http://blogs.msdn.com/b/vcblog/archive/2010/04/21/quick-help-on-vs2010-custom-build-rule.aspx # for more details. rules = spec.get('rules', []) rules_native = [r for r in rules if not int(r.get('msvs_external_rule', 0))] rules_external = [r for r in rules if int(r.get('msvs_external_rule', 0))] msbuild_rules = [] for rule in rules_native: # Skip a rule with no action and no inputs. if 'action' not in rule and not rule.get('rule_sources', []): continue msbuild_rule = MSBuildRule(rule, spec) msbuild_rules.append(msbuild_rule) rule_dependencies.update(msbuild_rule.additional_dependencies.split(';')) extension_to_rule_name[msbuild_rule.extension] = msbuild_rule.rule_name if msbuild_rules: base = spec['target_name'] + options.suffix props_name = base + '.props' targets_name = base + '.targets' xml_name = base + '.xml' props_files_of_rules.add(props_name) targets_files_of_rules.add(targets_name) props_path = os.path.join(output_dir, props_name) targets_path = os.path.join(output_dir, targets_name) xml_path = os.path.join(output_dir, xml_name) _GenerateMSBuildRulePropsFile(props_path, msbuild_rules) _GenerateMSBuildRuleTargetsFile(targets_path, msbuild_rules) _GenerateMSBuildRuleXmlFile(xml_path, msbuild_rules) if rules_external: _GenerateExternalRules(rules_external, output_dir, spec, sources, options, actions_to_add) _AdjustSourcesForRules(rules, sources, excluded_sources, True) class MSBuildRule(object): """Used to store information used to generate an MSBuild rule. 
Attributes: rule_name: The rule name, sanitized to use in XML. target_name: The name of the target. after_targets: The name of the AfterTargets element. before_targets: The name of the BeforeTargets element. depends_on: The name of the DependsOn element. compute_output: The name of the ComputeOutput element. dirs_to_make: The name of the DirsToMake element. inputs: The name of the _inputs element. tlog: The name of the _tlog element. extension: The extension this rule applies to. description: The message displayed when this rule is invoked. additional_dependencies: A string listing additional dependencies. outputs: The outputs of this rule. command: The command used to run the rule. """ def __init__(self, rule, spec): self.display_name = rule['rule_name'] # Assure that the rule name is only characters and numbers self.rule_name = re.sub(r'\W', '_', self.display_name) # Create the various element names, following the example set by the # Visual Studio 2008 to 2010 conversion. I don't know if VS2010 # is sensitive to the exact names. self.target_name = '_' + self.rule_name self.after_targets = self.rule_name + 'AfterTargets' self.before_targets = self.rule_name + 'BeforeTargets' self.depends_on = self.rule_name + 'DependsOn' self.compute_output = 'Compute%sOutput' % self.rule_name self.dirs_to_make = self.rule_name + 'DirsToMake' self.inputs = self.rule_name + '_inputs' self.tlog = self.rule_name + '_tlog' self.extension = rule['extension'] if not self.extension.startswith('.'): self.extension = '.' + self.extension self.description = MSVSSettings.ConvertVCMacrosToMSBuild( rule.get('message', self.rule_name)) old_additional_dependencies = _FixPaths(rule.get('inputs', [])) self.additional_dependencies = ( ';'.join([MSVSSettings.ConvertVCMacrosToMSBuild(i) for i in old_additional_dependencies])) old_outputs = _FixPaths(rule.get('outputs', [])) self.outputs = ';'.join([MSVSSettings.ConvertVCMacrosToMSBuild(i) for i in old_outputs]) old_command = _BuildCommandLineForRule(spec, rule, has_input_path=True, do_setup_env=True) self.command = MSVSSettings.ConvertVCMacrosToMSBuild(old_command) def _GenerateMSBuildRulePropsFile(props_path, msbuild_rules): """Generate the .props file.""" content = ['Project', {'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003'}] for rule in msbuild_rules: content.extend([ ['PropertyGroup', {'Condition': "'$(%s)' == '' and '$(%s)' == '' and " "'$(ConfigurationType)' != 'Makefile'" % (rule.before_targets, rule.after_targets) }, [rule.before_targets, 'Midl'], [rule.after_targets, 'CustomBuild'], ], ['PropertyGroup', [rule.depends_on, {'Condition': "'$(ConfigurationType)' != 'Makefile'"}, '_SelectedFiles;$(%s)' % rule.depends_on ], ], ['ItemDefinitionGroup', [rule.rule_name, ['CommandLineTemplate', rule.command], ['Outputs', rule.outputs], ['ExecutionDescription', rule.description], ['AdditionalDependencies', rule.additional_dependencies], ], ] ]) easy_xml.WriteXmlIfChanged(content, props_path, pretty=True, win32=True) def _GenerateMSBuildRuleTargetsFile(targets_path, msbuild_rules): """Generate the .targets file.""" content = ['Project', {'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003' } ] item_group = [ 'ItemGroup', ['PropertyPageSchema', {'Include': '$(MSBuildThisFileDirectory)$(MSBuildThisFileName).xml'} ] ] for rule in msbuild_rules: item_group.append( ['AvailableItemName', {'Include': rule.rule_name}, ['Targets', rule.target_name], ]) content.append(item_group) for rule in msbuild_rules: content.append( ['UsingTask', {'TaskName': 
rule.rule_name, 'TaskFactory': 'XamlTaskFactory', 'AssemblyName': 'Microsoft.Build.Tasks.v4.0' }, ['Task', '$(MSBuildThisFileDirectory)$(MSBuildThisFileName).xml'], ]) for rule in msbuild_rules: rule_name = rule.rule_name target_outputs = '%%(%s.Outputs)' % rule_name target_inputs = ('%%(%s.Identity);%%(%s.AdditionalDependencies);' '$(MSBuildProjectFile)') % (rule_name, rule_name) rule_inputs = '%%(%s.Identity)' % rule_name extension_condition = ("'%(Extension)'=='.obj' or " "'%(Extension)'=='.res' or " "'%(Extension)'=='.rsc' or " "'%(Extension)'=='.lib'") remove_section = [ 'ItemGroup', {'Condition': "'@(SelectedFiles)' != ''"}, [rule_name, {'Remove': '@(%s)' % rule_name, 'Condition': "'%(Identity)' != '@(SelectedFiles)'" } ] ] inputs_section = [ 'ItemGroup', [rule.inputs, {'Include': '%%(%s.AdditionalDependencies)' % rule_name}] ] logging_section = [ 'ItemGroup', [rule.tlog, {'Include': '%%(%s.Outputs)' % rule_name, 'Condition': ("'%%(%s.Outputs)' != '' and " "'%%(%s.ExcludedFromBuild)' != 'true'" % (rule_name, rule_name)) }, ['Source', "@(%s, '|')" % rule_name], ['Inputs', "@(%s -> '%%(Fullpath)', ';')" % rule.inputs], ], ] message_section = [ 'Message', {'Importance': 'High', 'Text': '%%(%s.ExecutionDescription)' % rule_name } ] write_tlog_section = [ 'WriteLinesToFile', {'Condition': "'@(%s)' != '' and '%%(%s.ExcludedFromBuild)' != " "'true'" % (rule.tlog, rule.tlog), 'File': '$(IntDir)$(ProjectName).write.1.tlog', 'Lines': "^%%(%s.Source);@(%s->'%%(Fullpath)')" % (rule.tlog, rule.tlog) } ] read_tlog_section = [ 'WriteLinesToFile', {'Condition': "'@(%s)' != '' and '%%(%s.ExcludedFromBuild)' != " "'true'" % (rule.tlog, rule.tlog), 'File': '$(IntDir)$(ProjectName).read.1.tlog', 'Lines': "^%%(%s.Source);%%(%s.Inputs)" % (rule.tlog, rule.tlog) } ] command_and_input_section = [ rule_name, {'Condition': "'@(%s)' != '' and '%%(%s.ExcludedFromBuild)' != " "'true'" % (rule_name, rule_name), 'EchoOff': 'true', 'StandardOutputImportance': 'High', 'StandardErrorImportance': 'High', 'CommandLineTemplate': '%%(%s.CommandLineTemplate)' % rule_name, 'AdditionalOptions': '%%(%s.AdditionalOptions)' % rule_name, 'Inputs': rule_inputs } ] content.extend([ ['Target', {'Name': rule.target_name, 'BeforeTargets': '$(%s)' % rule.before_targets, 'AfterTargets': '$(%s)' % rule.after_targets, 'Condition': "'@(%s)' != ''" % rule_name, 'DependsOnTargets': '$(%s);%s' % (rule.depends_on, rule.compute_output), 'Outputs': target_outputs, 'Inputs': target_inputs }, remove_section, inputs_section, logging_section, message_section, write_tlog_section, read_tlog_section, command_and_input_section, ], ['PropertyGroup', ['ComputeLinkInputsTargets', '$(ComputeLinkInputsTargets);', '%s;' % rule.compute_output ], ['ComputeLibInputsTargets', '$(ComputeLibInputsTargets);', '%s;' % rule.compute_output ], ], ['Target', {'Name': rule.compute_output, 'Condition': "'@(%s)' != ''" % rule_name }, ['ItemGroup', [rule.dirs_to_make, {'Condition': "'@(%s)' != '' and " "'%%(%s.ExcludedFromBuild)' != 'true'" % (rule_name, rule_name), 'Include': '%%(%s.Outputs)' % rule_name } ], ['Link', {'Include': '%%(%s.Identity)' % rule.dirs_to_make, 'Condition': extension_condition } ], ['Lib', {'Include': '%%(%s.Identity)' % rule.dirs_to_make, 'Condition': extension_condition } ], ['ImpLib', {'Include': '%%(%s.Identity)' % rule.dirs_to_make, 'Condition': extension_condition } ], ], ['MakeDir', {'Directories': ("@(%s->'%%(RootDir)%%(Directory)')" % rule.dirs_to_make) } ] ], ]) easy_xml.WriteXmlIfChanged(content, targets_path, pretty=True, win32=True) 
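# Taken together (names illustrative, assuming a target 'foo' and an empty
# generator suffix), a native MSBuild rule is emitted as three sibling
# files:
#   foo.props   - default property values for the rule
#   foo.targets - the targets that actually invoke the rule
#   foo.xml     - the property-page schema produced below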
def _GenerateMSBuildRuleXmlFile(xml_path, msbuild_rules): # Generate the .xml file content = [ 'ProjectSchemaDefinitions', {'xmlns': ('clr-namespace:Microsoft.Build.Framework.XamlTypes;' 'assembly=Microsoft.Build.Framework'), 'xmlns:x': 'http://schemas.microsoft.com/winfx/2006/xaml', 'xmlns:sys': 'clr-namespace:System;assembly=mscorlib', 'xmlns:transformCallback': 'Microsoft.Cpp.Dev10.ConvertPropertyCallback' } ] for rule in msbuild_rules: content.extend([ ['Rule', {'Name': rule.rule_name, 'PageTemplate': 'tool', 'DisplayName': rule.display_name, 'Order': '200' }, ['Rule.DataSource', ['DataSource', {'Persistence': 'ProjectFile', 'ItemType': rule.rule_name } ] ], ['Rule.Categories', ['Category', {'Name': 'General'}, ['Category.DisplayName', ['sys:String', 'General'], ], ], ['Category', {'Name': 'Command Line', 'Subtype': 'CommandLine' }, ['Category.DisplayName', ['sys:String', 'Command Line'], ], ], ], ['StringListProperty', {'Name': 'Inputs', 'Category': 'Command Line', 'IsRequired': 'true', 'Switch': ' ' }, ['StringListProperty.DataSource', ['DataSource', {'Persistence': 'ProjectFile', 'ItemType': rule.rule_name, 'SourceType': 'Item' } ] ], ], ['StringProperty', {'Name': 'CommandLineTemplate', 'DisplayName': 'Command Line', 'Visible': 'False', 'IncludeInCommandLine': 'False' } ], ['DynamicEnumProperty', {'Name': rule.before_targets, 'Category': 'General', 'EnumProvider': 'Targets', 'IncludeInCommandLine': 'False' }, ['DynamicEnumProperty.DisplayName', ['sys:String', 'Execute Before'], ], ['DynamicEnumProperty.Description', ['sys:String', 'Specifies the targets for the build customization' ' to run before.' ], ], ['DynamicEnumProperty.ProviderSettings', ['NameValuePair', {'Name': 'Exclude', 'Value': '^%s|^Compute' % rule.before_targets } ] ], ['DynamicEnumProperty.DataSource', ['DataSource', {'Persistence': 'ProjectFile', 'HasConfigurationCondition': 'true' } ] ], ], ['DynamicEnumProperty', {'Name': rule.after_targets, 'Category': 'General', 'EnumProvider': 'Targets', 'IncludeInCommandLine': 'False' }, ['DynamicEnumProperty.DisplayName', ['sys:String', 'Execute After'], ], ['DynamicEnumProperty.Description', ['sys:String', ('Specifies the targets for the build customization' ' to run after.') ], ], ['DynamicEnumProperty.ProviderSettings', ['NameValuePair', {'Name': 'Exclude', 'Value': '^%s|^Compute' % rule.after_targets } ] ], ['DynamicEnumProperty.DataSource', ['DataSource', {'Persistence': 'ProjectFile', 'ItemType': '', 'HasConfigurationCondition': 'true' } ] ], ], ['StringListProperty', {'Name': 'Outputs', 'DisplayName': 'Outputs', 'Visible': 'False', 'IncludeInCommandLine': 'False' } ], ['StringProperty', {'Name': 'ExecutionDescription', 'DisplayName': 'Execution Description', 'Visible': 'False', 'IncludeInCommandLine': 'False' } ], ['StringListProperty', {'Name': 'AdditionalDependencies', 'DisplayName': 'Additional Dependencies', 'IncludeInCommandLine': 'False', 'Visible': 'false' } ], ['StringProperty', {'Subtype': 'AdditionalOptions', 'Name': 'AdditionalOptions', 'Category': 'Command Line' }, ['StringProperty.DisplayName', ['sys:String', 'Additional Options'], ], ['StringProperty.Description', ['sys:String', 'Additional Options'], ], ], ], ['ItemType', {'Name': rule.rule_name, 'DisplayName': rule.display_name } ], ['FileExtension', {'Name': '*' + rule.extension, 'ContentType': rule.rule_name } ], ['ContentType', {'Name': rule.rule_name, 'DisplayName': '', 'ItemType': rule.rule_name } ] ]) easy_xml.WriteXmlIfChanged(content, xml_path, pretty=True, win32=True) def 
_GetConfigurationAndPlatform(name, settings): configuration = name.rsplit('_', 1)[0] platform = settings.get('msvs_configuration_platform', 'Win32') return (configuration, platform) def _GetConfigurationCondition(name, settings): return (r"'$(Configuration)|$(Platform)'=='%s|%s'" % _GetConfigurationAndPlatform(name, settings)) def _GetMSBuildProjectConfigurations(configurations): group = ['ItemGroup', {'Label': 'ProjectConfigurations'}] for (name, settings) in sorted(configurations.iteritems()): configuration, platform = _GetConfigurationAndPlatform(name, settings) designation = '%s|%s' % (configuration, platform) group.append( ['ProjectConfiguration', {'Include': designation}, ['Configuration', configuration], ['Platform', platform]]) return [group] def _GetMSBuildGlobalProperties(spec, guid, gyp_file_name): namespace = os.path.splitext(gyp_file_name)[0] properties = [ ['PropertyGroup', {'Label': 'Globals'}, ['ProjectGuid', guid], ['Keyword', 'Win32Proj'], ['RootNamespace', namespace], ['IgnoreWarnCompileDuplicatedFilename', 'true'], ] ] if os.environ.get('PROCESSOR_ARCHITECTURE') == 'AMD64' or \ os.environ.get('PROCESSOR_ARCHITEW6432') == 'AMD64': properties[0].append(['PreferredToolArchitecture', 'x64']) if spec.get('msvs_enable_winrt'): properties[0].append(['DefaultLanguage', 'en-US']) properties[0].append(['AppContainerApplication', 'true']) if spec.get('msvs_application_type_revision'): app_type_revision = spec.get('msvs_application_type_revision') properties[0].append(['ApplicationTypeRevision', app_type_revision]) else: properties[0].append(['ApplicationTypeRevision', '8.1']) if spec.get('msvs_target_platform_version'): target_platform_version = spec.get('msvs_target_platform_version') properties[0].append(['WindowsTargetPlatformVersion', target_platform_version]) if spec.get('msvs_target_platform_minversion'): target_platform_minversion = spec.get('msvs_target_platform_minversion') properties[0].append(['WindowsTargetPlatformMinVersion', target_platform_minversion]) else: properties[0].append(['WindowsTargetPlatformMinVersion', target_platform_version]) if spec.get('msvs_enable_winphone'): properties[0].append(['ApplicationType', 'Windows Phone']) else: properties[0].append(['ApplicationType', 'Windows Store']) return properties def _GetMSBuildConfigurationDetails(spec, build_file): properties = {} for name, settings in spec['configurations'].iteritems(): msbuild_attributes = _GetMSBuildAttributes(spec, settings, build_file) condition = _GetConfigurationCondition(name, settings) character_set = msbuild_attributes.get('CharacterSet') _AddConditionalProperty(properties, condition, 'ConfigurationType', msbuild_attributes['ConfigurationType']) if character_set: if 'msvs_enable_winrt' not in spec : _AddConditionalProperty(properties, condition, 'CharacterSet', character_set) return _GetMSBuildPropertyGroup(spec, 'Configuration', properties) def _GetMSBuildLocalProperties(msbuild_toolset): # Currently the only local property we support is PlatformToolset properties = {} if msbuild_toolset: properties = [ ['PropertyGroup', {'Label': 'Locals'}, ['PlatformToolset', msbuild_toolset], ] ] return properties def _GetMSBuildPropertySheets(configurations): user_props = r'$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props' additional_props = {} props_specified = False for name, settings in sorted(configurations.iteritems()): configuration = _GetConfigurationCondition(name, settings) if settings.has_key('msbuild_props'): additional_props[configuration] = _FixPaths(settings['msbuild_props']) 
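# (For instance, a hypothetical 'msbuild_props': ['..\\common.props'] entry
# is recorded here so that an extra <Import> of that sheet is emitted under
# this configuration's condition below.)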
props_specified = True else: additional_props[configuration] = '' if not props_specified: return [ ['ImportGroup', {'Label': 'PropertySheets'}, ['Import', {'Project': user_props, 'Condition': "exists('%s')" % user_props, 'Label': 'LocalAppDataPlatform' } ] ] ] else: sheets = [] for condition, props in additional_props.iteritems(): import_group = [ 'ImportGroup', {'Label': 'PropertySheets', 'Condition': condition }, ['Import', {'Project': user_props, 'Condition': "exists('%s')" % user_props, 'Label': 'LocalAppDataPlatform' } ] ] for props_file in props: import_group.append(['Import', {'Project':props_file}]) sheets.append(import_group) return sheets def _ConvertMSVSBuildAttributes(spec, config, build_file): config_type = _GetMSVSConfigurationType(spec, build_file) msvs_attributes = _GetMSVSAttributes(spec, config, config_type) msbuild_attributes = {} for a in msvs_attributes: if a in ['IntermediateDirectory', 'OutputDirectory']: directory = MSVSSettings.ConvertVCMacrosToMSBuild(msvs_attributes[a]) if not directory.endswith('\\'): directory += '\\' msbuild_attributes[a] = directory elif a == 'CharacterSet': msbuild_attributes[a] = _ConvertMSVSCharacterSet(msvs_attributes[a]) elif a == 'ConfigurationType': msbuild_attributes[a] = _ConvertMSVSConfigurationType(msvs_attributes[a]) else: print 'Warning: Do not know how to convert MSVS attribute ' + a return msbuild_attributes def _ConvertMSVSCharacterSet(char_set): if char_set.isdigit(): char_set = { '0': 'MultiByte', '1': 'Unicode', '2': 'MultiByte', }[char_set] return char_set def _ConvertMSVSConfigurationType(config_type): if config_type.isdigit(): config_type = { '1': 'Application', '2': 'DynamicLibrary', '4': 'StaticLibrary', '10': 'Utility' }[config_type] return config_type def _GetMSBuildAttributes(spec, config, build_file): if 'msbuild_configuration_attributes' not in config: msbuild_attributes = _ConvertMSVSBuildAttributes(spec, config, build_file) else: config_type = _GetMSVSConfigurationType(spec, build_file) config_type = _ConvertMSVSConfigurationType(config_type) msbuild_attributes = config.get('msbuild_configuration_attributes', {}) msbuild_attributes.setdefault('ConfigurationType', config_type) output_dir = msbuild_attributes.get('OutputDirectory', '$(SolutionDir)$(Configuration)') msbuild_attributes['OutputDirectory'] = _FixPath(output_dir) + '\\' if 'IntermediateDirectory' not in msbuild_attributes: intermediate = _FixPath('$(Configuration)') + '\\' msbuild_attributes['IntermediateDirectory'] = intermediate if 'CharacterSet' in msbuild_attributes: msbuild_attributes['CharacterSet'] = _ConvertMSVSCharacterSet( msbuild_attributes['CharacterSet']) if 'TargetName' not in msbuild_attributes: prefix = spec.get('product_prefix', '') product_name = spec.get('product_name', '$(ProjectName)') target_name = prefix + product_name msbuild_attributes['TargetName'] = target_name if 'TargetExt' not in msbuild_attributes and 'product_extension' in spec: ext = spec.get('product_extension') msbuild_attributes['TargetExt'] = '.' + ext if spec.get('msvs_external_builder'): external_out_dir = spec.get('msvs_external_builder_out_dir', '.') msbuild_attributes['OutputDirectory'] = _FixPath(external_out_dir) + '\\' # Make sure that 'TargetPath' matches 'Lib.OutputFile' or 'Link.OutputFile' # (depending on the tool used) to avoid MSB8012 warning. 
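# For instance, a 'shared_library' target maps to the 'Link' tool in the
# table below, so its TargetPath mirrors Link.OutputFile; a 'static_library'
# maps to 'Lib' and follows Lib.OutputFile instead.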
msbuild_tool_map = { 'executable': 'Link', 'shared_library': 'Link', 'loadable_module': 'Link', 'static_library': 'Lib', } msbuild_tool = msbuild_tool_map.get(spec['type']) if msbuild_tool: msbuild_settings = config['finalized_msbuild_settings'] out_file = msbuild_settings[msbuild_tool].get('OutputFile') if out_file: msbuild_attributes['TargetPath'] = _FixPath(out_file) target_ext = msbuild_settings[msbuild_tool].get('TargetExt') if target_ext: msbuild_attributes['TargetExt'] = target_ext return msbuild_attributes def _GetMSBuildConfigurationGlobalProperties(spec, configurations, build_file): # TODO(jeanluc) We could optimize out the following and do it only if # there are actions. # TODO(jeanluc) Handle the equivalent of setting 'CYGWIN=nontsec'. new_paths = [] cygwin_dirs = spec.get('msvs_cygwin_dirs', ['.'])[0] if cygwin_dirs: cyg_path = '$(MSBuildProjectDirectory)\\%s\\bin\\' % _FixPath(cygwin_dirs) new_paths.append(cyg_path) # TODO(jeanluc) Change the convention to have both a cygwin_dir and a # python_dir. python_path = cyg_path.replace('cygwin\\bin', 'python_26') new_paths.append(python_path) if new_paths: new_paths = '$(ExecutablePath);' + ';'.join(new_paths) properties = {} for (name, configuration) in sorted(configurations.iteritems()): condition = _GetConfigurationCondition(name, configuration) attributes = _GetMSBuildAttributes(spec, configuration, build_file) msbuild_settings = configuration['finalized_msbuild_settings'] _AddConditionalProperty(properties, condition, 'IntDir', attributes['IntermediateDirectory']) _AddConditionalProperty(properties, condition, 'OutDir', attributes['OutputDirectory']) _AddConditionalProperty(properties, condition, 'TargetName', attributes['TargetName']) if 'TargetExt' in attributes: _AddConditionalProperty(properties, condition, 'TargetExt', attributes['TargetExt']) if attributes.get('TargetPath'): _AddConditionalProperty(properties, condition, 'TargetPath', attributes['TargetPath']) if attributes.get('TargetExt'): _AddConditionalProperty(properties, condition, 'TargetExt', attributes['TargetExt']) if new_paths: _AddConditionalProperty(properties, condition, 'ExecutablePath', new_paths) tool_settings = msbuild_settings.get('', {}) for name, value in sorted(tool_settings.iteritems()): formatted_value = _GetValueFormattedForMSBuild('', name, value) _AddConditionalProperty(properties, condition, name, formatted_value) return _GetMSBuildPropertyGroup(spec, None, properties) def _AddConditionalProperty(properties, condition, name, value): """Adds a property / conditional value pair to a dictionary. Arguments: properties: The dictionary to be modified. The key is the name of the property. The value is itself a dictionary; its key is the value and the value a list of condition for which this value is true. condition: The condition under which the named property has the value. name: The name of the property. value: The value of the property. """ if name not in properties: properties[name] = {} values = properties[name] if value not in values: values[value] = [] conditions = values[value] conditions.append(condition) # Regex for msvs variable references ( i.e. $(FOO) ). MSVS_VARIABLE_REFERENCE = re.compile(r'\$\(([a-zA-Z_][a-zA-Z0-9_]*)\)') def _GetMSBuildPropertyGroup(spec, label, properties): """Returns a PropertyGroup definition for the specified properties. Arguments: spec: The target project dict. label: An optional label for the PropertyGroup. properties: The dictionary to be converted. The key is the name of the property. 
The value is itself a dictionary; its key is the value and the value a list of conditions for which this value is true. """ group = ['PropertyGroup'] if label: group.append({'Label': label}) num_configurations = len(spec['configurations']) def GetEdges(node): # Use a definition of edges such that user_of_variable -> used_variable. # This happens to be easier in this case, since a variable's # definition contains all variables it references in a single string. edges = set() for value in sorted(properties[node].keys()): # Add to edges all $(...) references to variables. # # Variable references that refer to names not in properties are excluded. # These can exist for instance to refer to built-in definitions like # $(SolutionDir). # # Self references are ignored. Self reference is used in a few places to # append to the default value. I.e. PATH=$(PATH);other_path edges.update(set([v for v in MSVS_VARIABLE_REFERENCE.findall(value) if v in properties and v != node])) return edges properties_ordered = gyp.common.TopologicallySorted( properties.keys(), GetEdges) # Walk properties in the reverse of a topological sort on # user_of_variable -> used_variable as this ensures variables are # defined before they are used. # NOTE: reverse(topsort(DAG)) = topsort(reverse_edges(DAG)) for name in reversed(properties_ordered): values = properties[name] for value, conditions in sorted(values.iteritems()): if len(conditions) == num_configurations: # If the value is the same for all configurations, # just add one unconditional entry. group.append([name, value]) else: for condition in conditions: group.append([name, {'Condition': condition}, value]) return [group] def _GetMSBuildToolSettingsSections(spec, configurations): groups = [] for (name, configuration) in sorted(configurations.iteritems()): msbuild_settings = configuration['finalized_msbuild_settings'] group = ['ItemDefinitionGroup', {'Condition': _GetConfigurationCondition(name, configuration)} ] for tool_name, tool_settings in sorted(msbuild_settings.iteritems()): # Skip the tool named '' which is a holder of global settings handled # by _GetMSBuildConfigurationGlobalProperties. if tool_name: if tool_settings: tool = [tool_name] for name, value in sorted(tool_settings.iteritems()): formatted_value = _GetValueFormattedForMSBuild(tool_name, name, value) tool.append([name, formatted_value]) group.append(tool) groups.append(group) return groups def _FinalizeMSBuildSettings(spec, configuration): if 'msbuild_settings' in configuration: converted = False msbuild_settings = configuration['msbuild_settings'] MSVSSettings.ValidateMSBuildSettings(msbuild_settings) else: converted = True msvs_settings = configuration.get('msvs_settings', {}) msbuild_settings = MSVSSettings.ConvertToMSBuildSettings(msvs_settings) include_dirs, midl_include_dirs, resource_include_dirs = \ _GetIncludeDirs(configuration) libraries = _GetLibraries(spec) library_dirs = _GetLibraryDirs(configuration) out_file, _, msbuild_tool = _GetOutputFilePathAndTool(spec, msbuild=True) target_ext = _GetOutputTargetExt(spec) defines = _GetDefines(configuration) if converted: # Visual Studio 2010 has TR1. defines = [d for d in defines if d != '_HAS_TR1=0'] # Warn of ignored settings. ignored_settings = ['msvs_tool_files'] for ignored_setting in ignored_settings: value = configuration.get(ignored_setting) if value: print ('Warning: The automatic conversion to MSBuild does not handle ' '%s.
Ignoring setting of %s' % (ignored_setting, str(value))) defines = [_EscapeCppDefineForMSBuild(d) for d in defines] disabled_warnings = _GetDisabledWarnings(configuration) prebuild = configuration.get('msvs_prebuild') postbuild = configuration.get('msvs_postbuild') def_file = _GetModuleDefinition(spec) precompiled_header = configuration.get('msvs_precompiled_header') # Add the information to the appropriate tool. # TODO(jeanluc) We could optimize and generate these settings only if # the corresponding files are found, e.g. don't generate ResourceCompile # if you don't have any resources. _ToolAppend(msbuild_settings, 'ClCompile', 'AdditionalIncludeDirectories', include_dirs) _ToolAppend(msbuild_settings, 'Midl', 'AdditionalIncludeDirectories', midl_include_dirs) _ToolAppend(msbuild_settings, 'ResourceCompile', 'AdditionalIncludeDirectories', resource_include_dirs) # Add in libraries, note that even for empty libraries, we want this # set, to prevent inheriting default libraries from the environment. _ToolSetOrAppend(msbuild_settings, 'Link', 'AdditionalDependencies', libraries) _ToolAppend(msbuild_settings, 'Link', 'AdditionalLibraryDirectories', library_dirs) if out_file: _ToolAppend(msbuild_settings, msbuild_tool, 'OutputFile', out_file, only_if_unset=True) if target_ext: _ToolAppend(msbuild_settings, msbuild_tool, 'TargetExt', target_ext, only_if_unset=True) # Add defines. _ToolAppend(msbuild_settings, 'ClCompile', 'PreprocessorDefinitions', defines) _ToolAppend(msbuild_settings, 'ResourceCompile', 'PreprocessorDefinitions', defines) # Add disabled warnings. _ToolAppend(msbuild_settings, 'ClCompile', 'DisableSpecificWarnings', disabled_warnings) # Turn on precompiled headers if appropriate. if precompiled_header: precompiled_header = os.path.split(precompiled_header)[1] _ToolAppend(msbuild_settings, 'ClCompile', 'PrecompiledHeader', 'Use') _ToolAppend(msbuild_settings, 'ClCompile', 'PrecompiledHeaderFile', precompiled_header) _ToolAppend(msbuild_settings, 'ClCompile', 'ForcedIncludeFiles', [precompiled_header]) else: _ToolAppend(msbuild_settings, 'ClCompile', 'PrecompiledHeader', 'NotUsing') # Turn off WinRT compilation. _ToolAppend(msbuild_settings, 'ClCompile', 'CompileAsWinRT', 'false') # Turn on import libraries if appropriate. if spec.get('msvs_requires_importlibrary'): _ToolAppend(msbuild_settings, '', 'IgnoreImportLibrary', 'false') # Loadable modules don't generate import libraries; # tell dependent projects to not expect one. if spec['type'] == 'loadable_module': _ToolAppend(msbuild_settings, '', 'IgnoreImportLibrary', 'true') # Set the module definition file if any. if def_file: _ToolAppend(msbuild_settings, 'Link', 'ModuleDefinitionFile', def_file) configuration['finalized_msbuild_settings'] = msbuild_settings if prebuild: _ToolAppend(msbuild_settings, 'PreBuildEvent', 'Command', prebuild) if postbuild: _ToolAppend(msbuild_settings, 'PostBuildEvent', 'Command', postbuild) def _GetValueFormattedForMSBuild(tool_name, name, value): if type(value) == list: # For some settings, VS2010 does not automatically extend the settings. # TODO(jeanluc) Is this what we want? if name in ['AdditionalIncludeDirectories', 'AdditionalLibraryDirectories', 'AdditionalOptions', 'DelayLoadDLLs', 'DisableSpecificWarnings', 'PreprocessorDefinitions']: value.append('%%(%s)' % name) # For most tools, entries in a list should be separated with ';' but some # settings use a space. Check for those first.
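# Two illustrative conversions (option values invented):
#   ('ClCompile', 'AdditionalOptions', ['/bigobj', '/Zm200'])
#     -> '/bigobj /Zm200 %(AdditionalOptions)'        (space-separated)
#   ('ClCompile', 'DisableSpecificWarnings', ['4018', '4244'])
#     -> '4018;4244;%(DisableSpecificWarnings)'       (';'-separated)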
exceptions = { 'ClCompile': ['AdditionalOptions'], 'Link': ['AdditionalOptions'], 'Lib': ['AdditionalOptions']} if tool_name in exceptions and name in exceptions[tool_name]: char = ' ' else: char = ';' formatted_value = char.join( [MSVSSettings.ConvertVCMacrosToMSBuild(i) for i in value]) else: formatted_value = MSVSSettings.ConvertVCMacrosToMSBuild(value) return formatted_value def _VerifySourcesExist(sources, root_dir): """Verifies that all source files exist on disk. Checks that all regular source files, i.e. not created at run time, exist on disk. Missing files cause needless recompilation but no otherwise visible errors. Arguments: sources: A recursive list of Filter/file names. root_dir: The root directory for the relative path names. Returns: A list of source files that cannot be found on disk. """ missing_sources = [] for source in sources: if isinstance(source, MSVSProject.Filter): missing_sources.extend(_VerifySourcesExist(source.contents, root_dir)) else: if '$' not in source: full_path = os.path.join(root_dir, source) if not os.path.exists(full_path): missing_sources.append(full_path) return missing_sources def _GetMSBuildSources(spec, sources, exclusions, rule_dependencies, extension_to_rule_name, actions_spec, sources_handled_by_action, list_excluded): groups = ['none', 'masm', 'midl', 'include', 'compile', 'resource', 'rule', 'rule_dependency'] grouped_sources = {} for g in groups: grouped_sources[g] = [] _AddSources2(spec, sources, exclusions, grouped_sources, rule_dependencies, extension_to_rule_name, sources_handled_by_action, list_excluded) sources = [] for g in groups: if grouped_sources[g]: sources.append(['ItemGroup'] + grouped_sources[g]) if actions_spec: sources.append(['ItemGroup'] + actions_spec) return sources def _AddSources2(spec, sources, exclusions, grouped_sources, rule_dependencies, extension_to_rule_name, sources_handled_by_action, list_excluded): extensions_excluded_from_precompile = [] for source in sources: if isinstance(source, MSVSProject.Filter): _AddSources2(spec, source.contents, exclusions, grouped_sources, rule_dependencies, extension_to_rule_name, sources_handled_by_action, list_excluded) else: if not source in sources_handled_by_action: detail = [] excluded_configurations = exclusions.get(source, []) if len(excluded_configurations) == len(spec['configurations']): detail.append(['ExcludedFromBuild', 'true']) else: for config_name, configuration in sorted(excluded_configurations): condition = _GetConfigurationCondition(config_name, configuration) detail.append(['ExcludedFromBuild', {'Condition': condition}, 'true']) # Add precompile if needed for config_name, configuration in spec['configurations'].iteritems(): precompiled_source = configuration.get('msvs_precompiled_source', '') if precompiled_source != '': precompiled_source = _FixPath(precompiled_source) if not extensions_excluded_from_precompile: # If the precompiled header is generated by a C source, we must # not try to use it for C++ sources, and vice versa. basename, extension = os.path.splitext(precompiled_source) if extension == '.c': extensions_excluded_from_precompile = ['.cc', '.cpp', '.cxx'] else: extensions_excluded_from_precompile = ['.c'] if precompiled_source == source: condition = _GetConfigurationCondition(config_name, configuration) detail.append(['PrecompiledHeader', {'Condition': condition}, 'Create' ]) else: # Turn off precompiled header usage for source files of a # different type than the file that generated the # precompiled header. 
              for extension in extensions_excluded_from_precompile:
                if source.endswith(extension):
                  detail.append(['PrecompiledHeader', ''])
                  detail.append(['ForcedIncludeFiles', ''])

      group, element = _MapFileToMsBuildSourceType(source, rule_dependencies,
                                                   extension_to_rule_name)
      grouped_sources[group].append([element, {'Include': source}] + detail)


def _GetMSBuildProjectReferences(project):
  references = []
  if project.dependencies:
    group = ['ItemGroup']
    for dependency in project.dependencies:
      guid = dependency.guid
      project_dir = os.path.split(project.path)[0]
      relative_path = gyp.common.RelativePath(dependency.path, project_dir)
      project_ref = ['ProjectReference',
          {'Include': relative_path},
          ['Project', guid],
          ['ReferenceOutputAssembly', 'false']
          ]
      for config in dependency.spec.get('configurations', {}).itervalues():
        # If it's disabled in any config, turn it off in the reference.
        if config.get('msvs_2010_disable_uldi_when_referenced', 0):
          project_ref.append(['UseLibraryDependencyInputs', 'false'])
          break
      group.append(project_ref)
    references.append(group)
  return references


def _GenerateMSBuildProject(project, options, version, generator_flags):
  spec = project.spec
  configurations = spec['configurations']
  project_dir, project_file_name = os.path.split(project.path)
  gyp.common.EnsureDirExists(project.path)
  # Prepare list of sources and excluded sources.
  gyp_path = _NormalizedSource(project.build_file)
  relative_path_of_gyp_file = gyp.common.RelativePath(gyp_path, project_dir)
  gyp_file = os.path.split(project.build_file)[1]
  sources, excluded_sources = _PrepareListOfSources(spec, generator_flags,
                                                    gyp_file)
  # Add rules.
  actions_to_add = {}
  props_files_of_rules = set()
  targets_files_of_rules = set()
  rule_dependencies = set()
  extension_to_rule_name = {}
  list_excluded = generator_flags.get('msvs_list_excluded_files', True)

  # Don't generate rules if we are using an external builder like ninja.
  if not spec.get('msvs_external_builder'):
    _GenerateRulesForMSBuild(project_dir, options, spec,
                             sources, excluded_sources,
                             props_files_of_rules, targets_files_of_rules,
                             actions_to_add, rule_dependencies,
                             extension_to_rule_name)
  else:
    rules = spec.get('rules', [])
    _AdjustSourcesForRules(rules, sources, excluded_sources, True)

  sources, excluded_sources, excluded_idl = (
      _AdjustSourcesAndConvertToFilterHierarchy(spec, options,
                                                project_dir, sources,
                                                excluded_sources,
                                                list_excluded, version))

  # Don't add actions if we are using an external builder like ninja.
  if not spec.get('msvs_external_builder'):
    _AddActions(actions_to_add, spec, project.build_file)
    _AddCopies(actions_to_add, spec)

    # NOTE: this stanza must appear after all actions have been decided.
    # Don't exclude sources with actions attached, or they won't run.
excluded_sources = _FilterActionsFromExcluded( excluded_sources, actions_to_add) exclusions = _GetExcludedFilesFromBuild(spec, excluded_sources, excluded_idl) actions_spec, sources_handled_by_action = _GenerateActionsForMSBuild( spec, actions_to_add) _GenerateMSBuildFiltersFile(project.path + '.filters', sources, rule_dependencies, extension_to_rule_name) missing_sources = _VerifySourcesExist(sources, project_dir) for configuration in configurations.itervalues(): _FinalizeMSBuildSettings(spec, configuration) # Add attributes to root element import_default_section = [ ['Import', {'Project': r'$(VCTargetsPath)\Microsoft.Cpp.Default.props'}]] import_cpp_props_section = [ ['Import', {'Project': r'$(VCTargetsPath)\Microsoft.Cpp.props'}]] import_cpp_targets_section = [ ['Import', {'Project': r'$(VCTargetsPath)\Microsoft.Cpp.targets'}]] import_masm_props_section = [ ['Import', {'Project': r'$(VCTargetsPath)\BuildCustomizations\masm.props'}]] import_masm_targets_section = [ ['Import', {'Project': r'$(VCTargetsPath)\BuildCustomizations\masm.targets'}]] macro_section = [['PropertyGroup', {'Label': 'UserMacros'}]] content = [ 'Project', {'xmlns': 'http://schemas.microsoft.com/developer/msbuild/2003', 'ToolsVersion': version.ProjectVersion(), 'DefaultTargets': 'Build' }] content += _GetMSBuildProjectConfigurations(configurations) content += _GetMSBuildGlobalProperties(spec, project.guid, project_file_name) content += import_default_section content += _GetMSBuildConfigurationDetails(spec, project.build_file) if spec.get('msvs_enable_winphone'): content += _GetMSBuildLocalProperties('v120_wp81') else: content += _GetMSBuildLocalProperties(project.msbuild_toolset) content += import_cpp_props_section content += import_masm_props_section content += _GetMSBuildExtensions(props_files_of_rules) content += _GetMSBuildPropertySheets(configurations) content += macro_section content += _GetMSBuildConfigurationGlobalProperties(spec, configurations, project.build_file) content += _GetMSBuildToolSettingsSections(spec, configurations) content += _GetMSBuildSources( spec, sources, exclusions, rule_dependencies, extension_to_rule_name, actions_spec, sources_handled_by_action, list_excluded) content += _GetMSBuildProjectReferences(project) content += import_cpp_targets_section content += import_masm_targets_section content += _GetMSBuildExtensionTargets(targets_files_of_rules) if spec.get('msvs_external_builder'): content += _GetMSBuildExternalBuilderTargets(spec) # TODO(jeanluc) File a bug to get rid of runas. We had in MSVS: # has_run_as = _WriteMSVSUserFile(project.path, version, spec) easy_xml.WriteXmlIfChanged(content, project.path, pretty=True, win32=True) return missing_sources def _GetMSBuildExternalBuilderTargets(spec): """Return a list of MSBuild targets for external builders. The "Build" and "Clean" targets are always generated. If the spec contains 'msvs_external_builder_clcompile_cmd', then the "ClCompile" target will also be generated, to support building selected C/C++ files. Arguments: spec: The gyp target spec. Returns: List of MSBuild 'Target' specs. 
""" build_cmd = _BuildCommandLineForRuleRaw( spec, spec['msvs_external_builder_build_cmd'], False, False, False, False) build_target = ['Target', {'Name': 'Build'}] build_target.append(['Exec', {'Command': build_cmd}]) clean_cmd = _BuildCommandLineForRuleRaw( spec, spec['msvs_external_builder_clean_cmd'], False, False, False, False) clean_target = ['Target', {'Name': 'Clean'}] clean_target.append(['Exec', {'Command': clean_cmd}]) targets = [build_target, clean_target] if spec.get('msvs_external_builder_clcompile_cmd'): clcompile_cmd = _BuildCommandLineForRuleRaw( spec, spec['msvs_external_builder_clcompile_cmd'], False, False, False, False) clcompile_target = ['Target', {'Name': 'ClCompile'}] clcompile_target.append(['Exec', {'Command': clcompile_cmd}]) targets.append(clcompile_target) return targets def _GetMSBuildExtensions(props_files_of_rules): extensions = ['ImportGroup', {'Label': 'ExtensionSettings'}] for props_file in props_files_of_rules: extensions.append(['Import', {'Project': props_file}]) return [extensions] def _GetMSBuildExtensionTargets(targets_files_of_rules): targets_node = ['ImportGroup', {'Label': 'ExtensionTargets'}] for targets_file in sorted(targets_files_of_rules): targets_node.append(['Import', {'Project': targets_file}]) return [targets_node] def _GenerateActionsForMSBuild(spec, actions_to_add): """Add actions accumulated into an actions_to_add, merging as needed. Arguments: spec: the target project dict actions_to_add: dictionary keyed on input name, which maps to a list of dicts describing the actions attached to that input file. Returns: A pair of (action specification, the sources handled by this action). """ sources_handled_by_action = OrderedSet() actions_spec = [] for primary_input, actions in actions_to_add.iteritems(): inputs = OrderedSet() outputs = OrderedSet() descriptions = [] commands = [] for action in actions: inputs.update(OrderedSet(action['inputs'])) outputs.update(OrderedSet(action['outputs'])) descriptions.append(action['description']) cmd = action['command'] # For most actions, add 'call' so that actions that invoke batch files # return and continue executing. msbuild_use_call provides a way to # disable this but I have not seen any adverse effect from doing that # for everything. if action.get('msbuild_use_call', True): cmd = 'call ' + cmd commands.append(cmd) # Add the custom build action for one input file. description = ', and also '.join(descriptions) # We can't join the commands simply with && because the command line will # get too long. See also _AddActions: cygwin's setup_env mustn't be called # for every invocation or the command that sets the PATH will grow too # long. command = '\r\n'.join([c + '\r\nif %errorlevel% neq 0 exit /b %errorlevel%' for c in commands]) _AddMSBuildAction(spec, primary_input, inputs, outputs, command, description, sources_handled_by_action, actions_spec) return actions_spec, sources_handled_by_action def _AddMSBuildAction(spec, primary_input, inputs, outputs, cmd, description, sources_handled_by_action, actions_spec): command = MSVSSettings.ConvertVCMacrosToMSBuild(cmd) primary_input = _FixPath(primary_input) inputs_array = _FixPaths(inputs) outputs_array = _FixPaths(outputs) additional_inputs = ';'.join([i for i in inputs_array if i != primary_input]) outputs = ';'.join(outputs_array) sources_handled_by_action.add(primary_input) action_spec = ['CustomBuild', {'Include': primary_input}] action_spec.extend( # TODO(jeanluc) 'Document' for all or just if as_sources? 
[['FileType', 'Document'], ['Command', command], ['Message', description], ['Outputs', outputs] ]) if additional_inputs: action_spec.append(['AdditionalInputs', additional_inputs]) actions_spec.append(action_spec) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/msvs_test.py0000755000000000000000000000177212631326456025007 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ Unit tests for the msvs.py file. """ import gyp.generator.msvs as msvs import unittest import StringIO class TestSequenceFunctions(unittest.TestCase): def setUp(self): self.stderr = StringIO.StringIO() def test_GetLibraries(self): self.assertEqual( msvs._GetLibraries({}), []) self.assertEqual( msvs._GetLibraries({'libraries': []}), []) self.assertEqual( msvs._GetLibraries({'other':'foo', 'libraries': ['a.lib']}), ['a.lib']) self.assertEqual( msvs._GetLibraries({'libraries': ['-la']}), ['a.lib']) self.assertEqual( msvs._GetLibraries({'libraries': ['a.lib', 'b.lib', 'c.lib', '-lb.lib', '-lb.lib', 'd.lib', 'a.lib']}), ['c.lib', 'b.lib', 'd.lib', 'a.lib']) if __name__ == '__main__': unittest.main() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja.py0000644000000000000000000030375112631326456024056 0ustar 00000000000000# Copyright (c) 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. import collections import copy import hashlib import json import multiprocessing import os.path import re import signal import subprocess import sys import gyp import gyp.common from gyp.common import OrderedSet import gyp.msvs_emulation import gyp.MSVSUtil as MSVSUtil import gyp.xcode_emulation from cStringIO import StringIO from gyp.common import GetEnvironFallback import gyp.ninja_syntax as ninja_syntax generator_default_variables = { 'EXECUTABLE_PREFIX': '', 'EXECUTABLE_SUFFIX': '', 'STATIC_LIB_PREFIX': 'lib', 'STATIC_LIB_SUFFIX': '.a', 'SHARED_LIB_PREFIX': 'lib', # Gyp expects the following variables to be expandable by the build # system to the appropriate locations. Ninja prefers paths to be # known at gyp time. To resolve this, introduce special # variables starting with $! and $| (which begin with a $ so gyp knows it # should be treated specially, but is otherwise an invalid # ninja/shell variable) that are passed to gyp here but expanded # before writing out into the target .ninja files; see # ExpandSpecial. # $! is used for variables that represent a path and that can only appear at # the start of a string, while $| is used for variables that can appear # anywhere in a string. 'INTERMEDIATE_DIR': '$!INTERMEDIATE_DIR', 'SHARED_INTERMEDIATE_DIR': '$!PRODUCT_DIR/gen', 'PRODUCT_DIR': '$!PRODUCT_DIR', 'CONFIGURATION_NAME': '$|CONFIGURATION_NAME', # Special variables that may be used by gyp 'rule' targets. # We generate definitions for these variables on the fly when processing a # rule. 'RULE_INPUT_ROOT': '${root}', 'RULE_INPUT_DIRNAME': '${dirname}', 'RULE_INPUT_PATH': '${source}', 'RULE_INPUT_EXT': '${ext}', 'RULE_INPUT_NAME': '${name}', } # Placates pylint. 
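
# Illustrative aside (not part of gyp): a minimal sketch, assuming a product
# directory of 'out/Default' and a configuration named 'Default', of how the
# special $! and $| variables declared above get substituted before the
# .ninja files are written.  See NinjaWriter.ExpandSpecial below for the
# real logic; the helper name here is hypothetical.
def _ExpandSpecialSketch(path, product_dir='out/Default', config='Default'):
  # $!-prefixed variables mark a leading path component; $|-prefixed ones
  # may appear anywhere in the string.
  path = path.replace('$!PRODUCT_DIR', product_dir)
  return path.replace('$|CONFIGURATION_NAME', config)

# _ExpandSpecialSketch('$!PRODUCT_DIR/gen/foo_$|CONFIGURATION_NAME.h')
#   == 'out/Default/gen/foo_Default.h'
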
generator_additional_non_configuration_keys = []
generator_additional_path_sections = []
generator_extra_sources_for_rules = []
generator_filelist_paths = None

generator_supports_multiple_toolsets = gyp.common.CrossCompileRequested()


def StripPrefix(arg, prefix):
  if arg.startswith(prefix):
    return arg[len(prefix):]
  return arg


def QuoteShellArgument(arg, flavor):
  """Quote a string such that it will be interpreted as a single argument
  by the shell."""
  # Rather than attempting to enumerate the bad shell characters, just
  # whitelist common OK ones and quote anything else.
  if re.match(r'^[a-zA-Z0-9_=.\\/-]+$', arg):
    return arg  # No quoting necessary.
  if flavor == 'win':
    return gyp.msvs_emulation.QuoteForRspFile(arg)
  return "'" + arg.replace("'", "'" + '"\'"' + "'") + "'"


def Define(d, flavor):
  """Takes a preprocessor define and returns a -D parameter that's ninja- and
  shell-escaped."""
  if flavor == 'win':
    # cl.exe replaces literal # characters with = in preprocessor definitions
    # for some reason. Octal-encode to work around that.
    d = d.replace('#', '\\%03o' % ord('#'))
  return QuoteShellArgument(ninja_syntax.escape('-D' + d), flavor)


def AddArch(output, arch):
  """Adds an arch string to an output path."""
  output, extension = os.path.splitext(output)
  return '%s.%s%s' % (output, arch, extension)


class Target(object):
  """Target represents the paths used within a single gyp target.

  Conceptually, building a single target A is a series of steps:

  1) actions/rules/copies  generates source/resources/etc.
  2) compiles              generates .o files
  3) link                  generates a binary (library/executable)
  4) bundle                merges the above in a mac bundle

  (Any of these steps can be optional.)

  From a build ordering perspective, a dependent target B could just
  depend on the last output of this series of steps.  But some dependent
  commands sometimes need to reach inside the box.  For example, when
  linking B it needs to get the path to the static library generated by A.

  This object stores those paths.  To keep things simple, member
  variables only store concrete paths to single files, while methods
  compute derived values like "the last output of the target".
  """
  def __init__(self, type):
    # Gyp type ("static_library", etc.) of this target.
    self.type = type
    # File representing whether any input dependencies necessary for
    # dependent actions have completed.
    self.preaction_stamp = None
    # File representing whether any input dependencies necessary for
    # dependent compiles have completed.
    self.precompile_stamp = None
    # File representing the completion of actions/rules/copies, if any.
    self.actions_stamp = None
    # Path to the output of the link step, if any.
    self.binary = None
    # Path to the file representing the completion of building the bundle,
    # if any.
    self.bundle = None
    # On Windows, incremental linking requires linking against all the .objs
    # that compose a .lib (rather than the .lib itself).  That list is stored
    # here.  In this case, we also need to save the compile_deps for the
    # target, so that the target that directly depends on the .objs can also
    # depend on those.
    self.component_objs = None
    self.compile_deps = None
    # Windows only.  The import .lib is the output of a build step, but
    # because dependents only link against the lib (not both the lib and the
    # dll) we keep track of the import library here.
self.import_lib = None def Linkable(self): """Return true if this is a target that can be linked against.""" return self.type in ('static_library', 'shared_library') def UsesToc(self, flavor): """Return true if the target should produce a restat rule based on a TOC file.""" # For bundles, the .TOC should be produced for the binary, not for # FinalOutput(). But the naive approach would put the TOC file into the # bundle, so don't do this for bundles for now. if flavor == 'win' or self.bundle: return False return self.type in ('shared_library', 'loadable_module') def PreActionInput(self, flavor): """Return the path, if any, that should be used as a dependency of any dependent action step.""" if self.UsesToc(flavor): return self.FinalOutput() + '.TOC' return self.FinalOutput() or self.preaction_stamp def PreCompileInput(self): """Return the path, if any, that should be used as a dependency of any dependent compile step.""" return self.actions_stamp or self.precompile_stamp def FinalOutput(self): """Return the last output of the target, which depends on all prior steps.""" return self.bundle or self.binary or self.actions_stamp # A small discourse on paths as used within the Ninja build: # All files we produce (both at gyp and at build time) appear in the # build directory (e.g. out/Debug). # # Paths within a given .gyp file are always relative to the directory # containing the .gyp file. Call these "gyp paths". This includes # sources as well as the starting directory a given gyp rule/action # expects to be run from. We call the path from the source root to # the gyp file the "base directory" within the per-.gyp-file # NinjaWriter code. # # All paths as written into the .ninja files are relative to the build # directory. Call these paths "ninja paths". # # We translate between these two notions of paths with two helper # functions: # # - GypPathToNinja translates a gyp path (i.e. relative to the .gyp file) # into the equivalent ninja path. # # - GypPathToUniqueOutput translates a gyp path into a ninja path to write # an output file; the result can be namespaced such that it is unique # to the input file name as well as the output target name. class NinjaWriter(object): def __init__(self, hash_for_rules, target_outputs, base_dir, build_dir, output_file, toplevel_build, output_file_name, flavor, toplevel_dir=None): """ base_dir: path from source root to directory containing this gyp file, by gyp semantics, all input paths are relative to this build_dir: path from source root to build output toplevel_dir: path to the toplevel directory """ self.hash_for_rules = hash_for_rules self.target_outputs = target_outputs self.base_dir = base_dir self.build_dir = build_dir self.ninja = ninja_syntax.Writer(output_file) self.toplevel_build = toplevel_build self.output_file_name = output_file_name self.flavor = flavor self.abs_build_dir = None if toplevel_dir is not None: self.abs_build_dir = os.path.abspath(os.path.join(toplevel_dir, build_dir)) self.obj_ext = '.obj' if flavor == 'win' else '.o' if flavor == 'win': # See docstring of msvs_emulation.GenerateEnvironmentFiles(). self.win_env = {} for arch in ('x86', 'x64'): self.win_env[arch] = 'environment.' + arch # Relative path from build output dir to base dir. build_to_top = gyp.common.InvertRelativePath(build_dir, toplevel_dir) self.build_to_base = os.path.join(build_to_top, base_dir) # Relative path from base dir to build dir. 
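    # For example (hypothetical layout): with build_dir 'out/Debug' and
    # base_dir 'third_party/foo', build_to_base is '../../third_party/foo'
    # and base_to_build is '../../out/Debug'.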
base_to_top = gyp.common.InvertRelativePath(base_dir, toplevel_dir) self.base_to_build = os.path.join(base_to_top, build_dir) def ExpandSpecial(self, path, product_dir=None): """Expand specials like $!PRODUCT_DIR in |path|. If |product_dir| is None, assumes the cwd is already the product dir. Otherwise, |product_dir| is the relative path to the product dir. """ PRODUCT_DIR = '$!PRODUCT_DIR' if PRODUCT_DIR in path: if product_dir: path = path.replace(PRODUCT_DIR, product_dir) else: path = path.replace(PRODUCT_DIR + '/', '') path = path.replace(PRODUCT_DIR + '\\', '') path = path.replace(PRODUCT_DIR, '.') INTERMEDIATE_DIR = '$!INTERMEDIATE_DIR' if INTERMEDIATE_DIR in path: int_dir = self.GypPathToUniqueOutput('gen') # GypPathToUniqueOutput generates a path relative to the product dir, # so insert product_dir in front if it is provided. path = path.replace(INTERMEDIATE_DIR, os.path.join(product_dir or '', int_dir)) CONFIGURATION_NAME = '$|CONFIGURATION_NAME' path = path.replace(CONFIGURATION_NAME, self.config_name) return path def ExpandRuleVariables(self, path, root, dirname, source, ext, name): if self.flavor == 'win': path = self.msvs_settings.ConvertVSMacros( path, config=self.config_name) path = path.replace(generator_default_variables['RULE_INPUT_ROOT'], root) path = path.replace(generator_default_variables['RULE_INPUT_DIRNAME'], dirname) path = path.replace(generator_default_variables['RULE_INPUT_PATH'], source) path = path.replace(generator_default_variables['RULE_INPUT_EXT'], ext) path = path.replace(generator_default_variables['RULE_INPUT_NAME'], name) return path def GypPathToNinja(self, path, env=None): """Translate a gyp path to a ninja path, optionally expanding environment variable references in |path| with |env|. See the above discourse on path conversions.""" if env: if self.flavor == 'mac': path = gyp.xcode_emulation.ExpandEnvVars(path, env) elif self.flavor == 'win': path = gyp.msvs_emulation.ExpandMacros(path, env) if path.startswith('$!'): expanded = self.ExpandSpecial(path) if self.flavor == 'win': expanded = os.path.normpath(expanded) return expanded if '$|' in path: path = self.ExpandSpecial(path) assert '$' not in path, path return os.path.normpath(os.path.join(self.build_to_base, path)) def GypPathToUniqueOutput(self, path, qualified=True): """Translate a gyp path to a ninja path for writing output. If qualified is True, qualify the resulting filename with the name of the target. This is necessary when e.g. compiling the same path twice for two separate output targets. See the above discourse on path conversions.""" path = self.ExpandSpecial(path) assert not path.startswith('$'), path # Translate the path following this scheme: # Input: foo/bar.gyp, target targ, references baz/out.o # Output: obj/foo/baz/targ.out.o (if qualified) # obj/foo/baz/out.o (otherwise) # (and obj.host instead of obj for cross-compiles) # # Why this scheme and not some other one? # 1) for a given input, you can compute all derived outputs by matching # its path, even if the input is brought via a gyp file with '..'. # 2) simple files like libraries and stamps have a simple filename. obj = 'obj' if self.toolset != 'target': obj += '.' + self.toolset path_dir, path_basename = os.path.split(path) assert not os.path.isabs(path_dir), ( "'%s' can not be absolute path (see crbug.com/462153)." % path_dir) if qualified: path_basename = self.name + '.' 
+ path_basename return os.path.normpath(os.path.join(obj, self.base_dir, path_dir, path_basename)) def WriteCollapsedDependencies(self, name, targets, order_only=None): """Given a list of targets, return a path for a single file representing the result of building all the targets or None. Uses a stamp file if necessary.""" assert targets == filter(None, targets), targets if len(targets) == 0: assert not order_only return None if len(targets) > 1 or order_only: stamp = self.GypPathToUniqueOutput(name + '.stamp') targets = self.ninja.build(stamp, 'stamp', targets, order_only=order_only) self.ninja.newline() return targets[0] def _SubninjaNameForArch(self, arch): output_file_base = os.path.splitext(self.output_file_name)[0] return '%s.%s.ninja' % (output_file_base, arch) def WriteSpec(self, spec, config_name, generator_flags): """The main entry point for NinjaWriter: write the build rules for a spec. Returns a Target object, which represents the output paths for this spec. Returns None if there are no outputs (e.g. a settings-only 'none' type target).""" self.config_name = config_name self.name = spec['target_name'] self.toolset = spec['toolset'] config = spec['configurations'][config_name] self.target = Target(spec['type']) self.is_standalone_static_library = bool( spec.get('standalone_static_library', 0)) # Track if this target contains any C++ files, to decide if gcc or g++ # should be used for linking. self.uses_cpp = False self.is_mac_bundle = gyp.xcode_emulation.IsMacBundle(self.flavor, spec) self.xcode_settings = self.msvs_settings = None if self.flavor == 'mac': self.xcode_settings = gyp.xcode_emulation.XcodeSettings(spec) if self.flavor == 'win': self.msvs_settings = gyp.msvs_emulation.MsvsSettings(spec, generator_flags) arch = self.msvs_settings.GetArch(config_name) self.ninja.variable('arch', self.win_env[arch]) self.ninja.variable('cc', '$cl_' + arch) self.ninja.variable('cxx', '$cl_' + arch) self.ninja.variable('cc_host', '$cl_' + arch) self.ninja.variable('cxx_host', '$cl_' + arch) self.ninja.variable('asm', '$ml_' + arch) if self.flavor == 'mac': self.archs = self.xcode_settings.GetActiveArchs(config_name) if len(self.archs) > 1: self.arch_subninjas = dict( (arch, ninja_syntax.Writer( OpenOutput(os.path.join(self.toplevel_build, self._SubninjaNameForArch(arch)), 'w'))) for arch in self.archs) # Compute predepends for all rules. # actions_depends is the dependencies this target depends on before running # any of its action/rule/copy steps. # compile_depends is the dependencies this target depends on before running # any of its compile steps. actions_depends = [] compile_depends = [] # TODO(evan): it is rather confusing which things are lists and which # are strings. Fix these. if 'dependencies' in spec: for dep in spec['dependencies']: if dep in self.target_outputs: target = self.target_outputs[dep] actions_depends.append(target.PreActionInput(self.flavor)) compile_depends.append(target.PreCompileInput()) actions_depends = filter(None, actions_depends) compile_depends = filter(None, compile_depends) actions_depends = self.WriteCollapsedDependencies('actions_depends', actions_depends) compile_depends = self.WriteCollapsedDependencies('compile_depends', compile_depends) self.target.preaction_stamp = actions_depends self.target.precompile_stamp = compile_depends # Write out actions, rules, and copies. These must happen before we # compile any sources, so compute a list of predependencies for sources # while we do it. 
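    # (Aside, for reference: WriteCollapsedDependencies above folds a list of
    # N dependency paths into a single ninja 'stamp' edge, so each of the M
    # build edges written below depends on one path instead of N, keeping the
    # generated file closer to M + N entries than M * N.)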
extra_sources = [] mac_bundle_depends = [] self.target.actions_stamp = self.WriteActionsRulesCopies( spec, extra_sources, actions_depends, mac_bundle_depends) # If we have actions/rules/copies, we depend directly on those, but # otherwise we depend on dependent target's actions/rules/copies etc. # We never need to explicitly depend on previous target's link steps, # because no compile ever depends on them. compile_depends_stamp = (self.target.actions_stamp or compile_depends) # Write out the compilation steps, if any. link_deps = [] sources = extra_sources + spec.get('sources', []) if sources: if self.flavor == 'mac' and len(self.archs) > 1: # Write subninja file containing compile and link commands scoped to # a single arch if a fat binary is being built. for arch in self.archs: self.ninja.subninja(self._SubninjaNameForArch(arch)) pch = None if self.flavor == 'win': gyp.msvs_emulation.VerifyMissingSources( sources, self.abs_build_dir, generator_flags, self.GypPathToNinja) pch = gyp.msvs_emulation.PrecompiledHeader( self.msvs_settings, config_name, self.GypPathToNinja, self.GypPathToUniqueOutput, self.obj_ext) else: pch = gyp.xcode_emulation.MacPrefixHeader( self.xcode_settings, self.GypPathToNinja, lambda path, lang: self.GypPathToUniqueOutput(path + '-' + lang)) link_deps = self.WriteSources( self.ninja, config_name, config, sources, compile_depends_stamp, pch, spec) # Some actions/rules output 'sources' that are already object files. obj_outputs = [f for f in sources if f.endswith(self.obj_ext)] if obj_outputs: if self.flavor != 'mac' or len(self.archs) == 1: link_deps += [self.GypPathToNinja(o) for o in obj_outputs] else: print "Warning: Actions/rules writing object files don't work with " \ "multiarch targets, dropping. (target %s)" % spec['target_name'] elif self.flavor == 'mac' and len(self.archs) > 1: link_deps = collections.defaultdict(list) compile_deps = self.target.actions_stamp or actions_depends if self.flavor == 'win' and self.target.type == 'static_library': self.target.component_objs = link_deps self.target.compile_deps = compile_deps # Write out a link step, if needed. output = None is_empty_bundle = not link_deps and not mac_bundle_depends if link_deps or self.target.actions_stamp or actions_depends: output = self.WriteTarget(spec, config_name, config, link_deps, compile_deps) if self.is_mac_bundle: mac_bundle_depends.append(output) # Bundle all of the above together, if needed. if self.is_mac_bundle: output = self.WriteMacBundle(spec, mac_bundle_depends, is_empty_bundle) if not output: return None assert self.target.FinalOutput(), output return self.target def _WinIdlRule(self, source, prebuild, outputs): """Handle the implicit VS .idl rule for one source file. 
Fills |outputs| with files that are generated.""" outdir, output, vars, flags = self.msvs_settings.GetIdlBuildData( source, self.config_name) outdir = self.GypPathToNinja(outdir) def fix_path(path, rel=None): path = os.path.join(outdir, path) dirname, basename = os.path.split(source) root, ext = os.path.splitext(basename) path = self.ExpandRuleVariables( path, root, dirname, source, ext, basename) if rel: path = os.path.relpath(path, rel) return path vars = [(name, fix_path(value, outdir)) for name, value in vars] output = [fix_path(p) for p in output] vars.append(('outdir', outdir)) vars.append(('idlflags', flags)) input = self.GypPathToNinja(source) self.ninja.build(output, 'idl', input, variables=vars, order_only=prebuild) outputs.extend(output) def WriteWinIdlFiles(self, spec, prebuild): """Writes rules to match MSVS's implicit idl handling.""" assert self.flavor == 'win' if self.msvs_settings.HasExplicitIdlRulesOrActions(spec): return [] outputs = [] for source in filter(lambda x: x.endswith('.idl'), spec['sources']): self._WinIdlRule(source, prebuild, outputs) return outputs def WriteActionsRulesCopies(self, spec, extra_sources, prebuild, mac_bundle_depends): """Write out the Actions, Rules, and Copies steps. Return a path representing the outputs of these steps.""" outputs = [] if self.is_mac_bundle: mac_bundle_resources = spec.get('mac_bundle_resources', [])[:] else: mac_bundle_resources = [] extra_mac_bundle_resources = [] if 'actions' in spec: outputs += self.WriteActions(spec['actions'], extra_sources, prebuild, extra_mac_bundle_resources) if 'rules' in spec: outputs += self.WriteRules(spec['rules'], extra_sources, prebuild, mac_bundle_resources, extra_mac_bundle_resources) if 'copies' in spec: outputs += self.WriteCopies(spec['copies'], prebuild, mac_bundle_depends) if 'sources' in spec and self.flavor == 'win': outputs += self.WriteWinIdlFiles(spec, prebuild) stamp = self.WriteCollapsedDependencies('actions_rules_copies', outputs) if self.is_mac_bundle: xcassets = self.WriteMacBundleResources( extra_mac_bundle_resources + mac_bundle_resources, mac_bundle_depends) partial_info_plist = self.WriteMacXCassets(xcassets, mac_bundle_depends) self.WriteMacInfoPlist(partial_info_plist, mac_bundle_depends) return stamp def GenerateDescription(self, verb, message, fallback): """Generate and return a description of a build step. |verb| is the short summary, e.g. ACTION or RULE. |message| is a hand-written description, or None if not available. |fallback| is the gyp-level name of the step, usable as a fallback. """ if self.toolset != 'target': verb += '(%s)' % self.toolset if message: return '%s %s' % (verb, self.ExpandSpecial(message)) else: return '%s %s: %s' % (verb, self.name, fallback) def WriteActions(self, actions, extra_sources, prebuild, extra_mac_bundle_resources): # Actions cd into the base directory. env = self.GetToolchainEnv() all_outputs = [] for action in actions: # First write out a rule for the action. 
name = '%s_%s' % (action['action_name'], self.hash_for_rules) description = self.GenerateDescription('ACTION', action.get('message', None), name) is_cygwin = (self.msvs_settings.IsRuleRunUnderCygwin(action) if self.flavor == 'win' else False) args = action['action'] depfile = action.get('depfile', None) if depfile: depfile = self.ExpandSpecial(depfile, self.base_to_build) pool = 'console' if int(action.get('ninja_use_console', 0)) else None rule_name, _ = self.WriteNewNinjaRule(name, args, description, is_cygwin, env, pool, depfile=depfile) inputs = [self.GypPathToNinja(i, env) for i in action['inputs']] if int(action.get('process_outputs_as_sources', False)): extra_sources += action['outputs'] if int(action.get('process_outputs_as_mac_bundle_resources', False)): extra_mac_bundle_resources += action['outputs'] outputs = [self.GypPathToNinja(o, env) for o in action['outputs']] # Then write out an edge using the rule. self.ninja.build(outputs, rule_name, inputs, order_only=prebuild) all_outputs += outputs self.ninja.newline() return all_outputs def WriteRules(self, rules, extra_sources, prebuild, mac_bundle_resources, extra_mac_bundle_resources): env = self.GetToolchainEnv() all_outputs = [] for rule in rules: # Skip a rule with no action and no inputs. if 'action' not in rule and not rule.get('rule_sources', []): continue # First write out a rule for the rule action. name = '%s_%s' % (rule['rule_name'], self.hash_for_rules) args = rule['action'] description = self.GenerateDescription( 'RULE', rule.get('message', None), ('%s ' + generator_default_variables['RULE_INPUT_PATH']) % name) is_cygwin = (self.msvs_settings.IsRuleRunUnderCygwin(rule) if self.flavor == 'win' else False) pool = 'console' if int(rule.get('ninja_use_console', 0)) else None rule_name, args = self.WriteNewNinjaRule( name, args, description, is_cygwin, env, pool) # TODO: if the command references the outputs directly, we should # simplify it to just use $out. # Rules can potentially make use of some special variables which # must vary per source file. # Compute the list of variables we'll need to provide. special_locals = ('source', 'root', 'dirname', 'ext', 'name') needed_variables = set(['source']) for argument in args: for var in special_locals: if '${%s}' % var in argument: needed_variables.add(var) def cygwin_munge(path): # pylint: disable=cell-var-from-loop if is_cygwin: return path.replace('\\', '/') return path inputs = [self.GypPathToNinja(i, env) for i in rule.get('inputs', [])] # If there are n source files matching the rule, and m additional rule # inputs, then adding 'inputs' to each build edge written below will # write m * n inputs. Collapsing reduces this to m + n. sources = rule.get('rule_sources', []) num_inputs = len(inputs) if prebuild: num_inputs += 1 if num_inputs > 2 and len(sources) > 2: inputs = [self.WriteCollapsedDependencies( rule['rule_name'], inputs, order_only=prebuild)] prebuild = [] # For each source file, write an edge that generates all the outputs. for source in sources: source = os.path.normpath(source) dirname, basename = os.path.split(source) root, ext = os.path.splitext(basename) # Gather the list of inputs and outputs, expanding $vars if possible. 
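      # For example (hypothetical rule source): 'foo/bar.proto' splits into
      # dirname='foo', root='bar', ext='.proto', name='bar.proto', so an
      # output template '${dirname}/${root}.pb.cc' expands to 'foo/bar.pb.cc'.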
outputs = [self.ExpandRuleVariables(o, root, dirname, source, ext, basename) for o in rule['outputs']] if int(rule.get('process_outputs_as_sources', False)): extra_sources += outputs was_mac_bundle_resource = source in mac_bundle_resources if was_mac_bundle_resource or \ int(rule.get('process_outputs_as_mac_bundle_resources', False)): extra_mac_bundle_resources += outputs # Note: This is n_resources * n_outputs_in_rule. Put to-be-removed # items in a set and remove them all in a single pass if this becomes # a performance issue. if was_mac_bundle_resource: mac_bundle_resources.remove(source) extra_bindings = [] for var in needed_variables: if var == 'root': extra_bindings.append(('root', cygwin_munge(root))) elif var == 'dirname': # '$dirname' is a parameter to the rule action, which means # it shouldn't be converted to a Ninja path. But we don't # want $!PRODUCT_DIR in there either. dirname_expanded = self.ExpandSpecial(dirname, self.base_to_build) extra_bindings.append(('dirname', cygwin_munge(dirname_expanded))) elif var == 'source': # '$source' is a parameter to the rule action, which means # it shouldn't be converted to a Ninja path. But we don't # want $!PRODUCT_DIR in there either. source_expanded = self.ExpandSpecial(source, self.base_to_build) extra_bindings.append(('source', cygwin_munge(source_expanded))) elif var == 'ext': extra_bindings.append(('ext', ext)) elif var == 'name': extra_bindings.append(('name', cygwin_munge(basename))) else: assert var == None, repr(var) outputs = [self.GypPathToNinja(o, env) for o in outputs] if self.flavor == 'win': # WriteNewNinjaRule uses unique_name for creating an rsp file on win. extra_bindings.append(('unique_name', hashlib.md5(outputs[0]).hexdigest())) self.ninja.build(outputs, rule_name, self.GypPathToNinja(source), implicit=inputs, order_only=prebuild, variables=extra_bindings) all_outputs.extend(outputs) return all_outputs def WriteCopies(self, copies, prebuild, mac_bundle_depends): outputs = [] env = self.GetToolchainEnv() for copy in copies: for path in copy['files']: # Normalize the path so trailing slashes don't confuse us. path = os.path.normpath(path) basename = os.path.split(path)[1] src = self.GypPathToNinja(path, env) dst = self.GypPathToNinja(os.path.join(copy['destination'], basename), env) outputs += self.ninja.build(dst, 'copy', src, order_only=prebuild) if self.is_mac_bundle: # gyp has mac_bundle_resources to copy things into a bundle's # Resources folder, but there's no built-in way to copy files to other # places in the bundle. Hence, some targets use copies for this. Check # if this file is copied into the current bundle, and if so add it to # the bundle depends so that dependent targets get rebuilt if the copy # input changes. 
        if dst.startswith(self.xcode_settings.GetBundleContentsFolderPath()):
          mac_bundle_depends.append(dst)

    return outputs

  def WriteMacBundleResources(self, resources, bundle_depends):
    """Writes ninja edges for 'mac_bundle_resources'."""
    xcassets = []
    for output, res in gyp.xcode_emulation.GetMacBundleResources(
        generator_default_variables['PRODUCT_DIR'],
        self.xcode_settings, map(self.GypPathToNinja, resources)):
      output = self.ExpandSpecial(output)
      if os.path.splitext(output)[-1] != '.xcassets':
        isBinary = self.xcode_settings.IsBinaryOutputFormat(self.config_name)
        self.ninja.build(output, 'mac_tool', res,
                         variables=[('mactool_cmd', 'copy-bundle-resource'),
                                    ('binary', isBinary)])
        bundle_depends.append(output)
      else:
        xcassets.append(res)
    return xcassets

  def WriteMacXCassets(self, xcassets, bundle_depends):
    """Writes ninja edges for 'mac_bundle_resources' .xcassets files.

    This adds an invocation of 'actool' via the 'mac_tool.py' helper script.
    It assumes that the asset catalogs define at least one imageset and
    thus an Assets.car file will be generated in the application resources
    directory.  If this is not the case, then the build will probably be done
    at each invocation of ninja.
    """
    if not xcassets:
      return

    extra_arguments = {}
    settings_to_arg = {
        'XCASSETS_APP_ICON': 'app-icon',
        'XCASSETS_LAUNCH_IMAGE': 'launch-image',
    }
    settings = self.xcode_settings.xcode_settings[self.config_name]
    for settings_key, arg_name in settings_to_arg.iteritems():
      value = settings.get(settings_key)
      if value:
        extra_arguments[arg_name] = value

    partial_info_plist = None
    if extra_arguments:
      partial_info_plist = self.GypPathToUniqueOutput(
          'assetcatalog_generated_info.plist')
      extra_arguments['output-partial-info-plist'] = partial_info_plist

    outputs = []
    outputs.append(
        os.path.join(
            self.xcode_settings.GetBundleResourceFolder(),
            'Assets.car'))
    if partial_info_plist:
      outputs.append(partial_info_plist)

    keys = QuoteShellArgument(json.dumps(extra_arguments), self.flavor)
    extra_env = self.xcode_settings.GetPerTargetSettings()
    env = self.GetSortedXcodeEnv(additional_settings=extra_env)
    env = self.ComputeExportEnvString(env)

    bundle_depends.extend(self.ninja.build(
        outputs, 'compile_xcassets', xcassets,
        variables=[('env', env), ('keys', keys)]))
    return partial_info_plist

  def WriteMacInfoPlist(self, partial_info_plist, bundle_depends):
    """Write build rules for bundle Info.plist files."""
    info_plist, out, defines, extra_env = gyp.xcode_emulation.GetMacInfoPlist(
        generator_default_variables['PRODUCT_DIR'],
        self.xcode_settings, self.GypPathToNinja)
    if not info_plist:
      return
    out = self.ExpandSpecial(out)
    if defines:
      # Create an intermediate file to store preprocessed results.
intermediate_plist = self.GypPathToUniqueOutput( os.path.basename(info_plist)) defines = ' '.join([Define(d, self.flavor) for d in defines]) info_plist = self.ninja.build( intermediate_plist, 'preprocess_infoplist', info_plist, variables=[('defines',defines)]) env = self.GetSortedXcodeEnv(additional_settings=extra_env) env = self.ComputeExportEnvString(env) if partial_info_plist: intermediate_plist = self.GypPathToUniqueOutput('merged_info.plist') info_plist = self.ninja.build( intermediate_plist, 'merge_infoplist', [partial_info_plist, info_plist]) keys = self.xcode_settings.GetExtraPlistItems(self.config_name) keys = QuoteShellArgument(json.dumps(keys), self.flavor) isBinary = self.xcode_settings.IsBinaryOutputFormat(self.config_name) self.ninja.build(out, 'copy_infoplist', info_plist, variables=[('env', env), ('keys', keys), ('binary', isBinary)]) bundle_depends.append(out) def WriteSources(self, ninja_file, config_name, config, sources, predepends, precompiled_header, spec): """Write build rules to compile all of |sources|.""" if self.toolset == 'host': self.ninja.variable('ar', '$ar_host') self.ninja.variable('cc', '$cc_host') self.ninja.variable('cxx', '$cxx_host') self.ninja.variable('ld', '$ld_host') self.ninja.variable('ldxx', '$ldxx_host') self.ninja.variable('nm', '$nm_host') self.ninja.variable('readelf', '$readelf_host') if self.flavor != 'mac' or len(self.archs) == 1: return self.WriteSourcesForArch( self.ninja, config_name, config, sources, predepends, precompiled_header, spec) else: return dict((arch, self.WriteSourcesForArch( self.arch_subninjas[arch], config_name, config, sources, predepends, precompiled_header, spec, arch=arch)) for arch in self.archs) def WriteSourcesForArch(self, ninja_file, config_name, config, sources, predepends, precompiled_header, spec, arch=None): """Write build rules to compile all of |sources|.""" extra_defines = [] if self.flavor == 'mac': cflags = self.xcode_settings.GetCflags(config_name, arch=arch) cflags_c = self.xcode_settings.GetCflagsC(config_name) cflags_cc = self.xcode_settings.GetCflagsCC(config_name) cflags_objc = ['$cflags_c'] + \ self.xcode_settings.GetCflagsObjC(config_name) cflags_objcc = ['$cflags_cc'] + \ self.xcode_settings.GetCflagsObjCC(config_name) elif self.flavor == 'win': asmflags = self.msvs_settings.GetAsmflags(config_name) cflags = self.msvs_settings.GetCflags(config_name) cflags_c = self.msvs_settings.GetCflagsC(config_name) cflags_cc = self.msvs_settings.GetCflagsCC(config_name) extra_defines = self.msvs_settings.GetComputedDefines(config_name) # See comment at cc_command for why there's two .pdb files. pdbpath_c = pdbpath_cc = self.msvs_settings.GetCompilerPdbName( config_name, self.ExpandSpecial) if not pdbpath_c: obj = 'obj' if self.toolset != 'target': obj += '.' + self.toolset pdbpath = os.path.normpath(os.path.join(obj, self.base_dir, self.name)) pdbpath_c = pdbpath + '.c.pdb' pdbpath_cc = pdbpath + '.cc.pdb' self.WriteVariableList(ninja_file, 'pdbname_c', [pdbpath_c]) self.WriteVariableList(ninja_file, 'pdbname_cc', [pdbpath_cc]) self.WriteVariableList(ninja_file, 'pchprefix', [self.name]) else: cflags = config.get('cflags', []) cflags_c = config.get('cflags_c', []) cflags_cc = config.get('cflags_cc', []) # Respect environment variables related to build, but target-specific # flags can still override them. 
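    # For example, CPPFLAGS='-DNDEBUG' and CFLAGS='-O2' with a gyp-level
    # 'cflags_c': ['-O0'] yield ['-DNDEBUG', '-O2', '-O0']; since the
    # target's -O0 comes last, it wins over the environment's -O2.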
if self.toolset == 'target': cflags_c = (os.environ.get('CPPFLAGS', '').split() + os.environ.get('CFLAGS', '').split() + cflags_c) cflags_cc = (os.environ.get('CPPFLAGS', '').split() + os.environ.get('CXXFLAGS', '').split() + cflags_cc) elif self.toolset == 'host': cflags_c = (os.environ.get('CPPFLAGS_host', '').split() + os.environ.get('CFLAGS_host', '').split() + cflags_c) cflags_cc = (os.environ.get('CPPFLAGS_host', '').split() + os.environ.get('CXXFLAGS_host', '').split() + cflags_cc) defines = config.get('defines', []) + extra_defines self.WriteVariableList(ninja_file, 'defines', [Define(d, self.flavor) for d in defines]) if self.flavor == 'win': self.WriteVariableList(ninja_file, 'asmflags', map(self.ExpandSpecial, asmflags)) self.WriteVariableList(ninja_file, 'rcflags', [QuoteShellArgument(self.ExpandSpecial(f), self.flavor) for f in self.msvs_settings.GetRcflags(config_name, self.GypPathToNinja)]) include_dirs = config.get('include_dirs', []) env = self.GetToolchainEnv() if self.flavor == 'win': include_dirs = self.msvs_settings.AdjustIncludeDirs(include_dirs, config_name) self.WriteVariableList(ninja_file, 'includes', [QuoteShellArgument('-I' + self.GypPathToNinja(i, env), self.flavor) for i in include_dirs]) if self.flavor == 'win': midl_include_dirs = config.get('midl_include_dirs', []) midl_include_dirs = self.msvs_settings.AdjustMidlIncludeDirs( midl_include_dirs, config_name) self.WriteVariableList(ninja_file, 'midl_includes', [QuoteShellArgument('-I' + self.GypPathToNinja(i, env), self.flavor) for i in midl_include_dirs]) pch_commands = precompiled_header.GetPchBuildCommands(arch) if self.flavor == 'mac': # Most targets use no precompiled headers, so only write these if needed. for ext, var in [('c', 'cflags_pch_c'), ('cc', 'cflags_pch_cc'), ('m', 'cflags_pch_objc'), ('mm', 'cflags_pch_objcc')]: include = precompiled_header.GetInclude(ext, arch) if include: ninja_file.variable(var, include) arflags = config.get('arflags', []) self.WriteVariableList(ninja_file, 'cflags', map(self.ExpandSpecial, cflags)) self.WriteVariableList(ninja_file, 'cflags_c', map(self.ExpandSpecial, cflags_c)) self.WriteVariableList(ninja_file, 'cflags_cc', map(self.ExpandSpecial, cflags_cc)) if self.flavor == 'mac': self.WriteVariableList(ninja_file, 'cflags_objc', map(self.ExpandSpecial, cflags_objc)) self.WriteVariableList(ninja_file, 'cflags_objcc', map(self.ExpandSpecial, cflags_objcc)) self.WriteVariableList(ninja_file, 'arflags', map(self.ExpandSpecial, arflags)) ninja_file.newline() outputs = [] has_rc_source = False for source in sources: filename, ext = os.path.splitext(source) ext = ext[1:] obj_ext = self.obj_ext if ext in ('cc', 'cpp', 'cxx'): command = 'cxx' self.uses_cpp = True elif ext == 'c' or (ext == 'S' and self.flavor != 'win'): command = 'cc' elif ext == 's' and self.flavor != 'win': # Doesn't generate .o.d files. command = 'cc_s' elif (self.flavor == 'win' and ext == 'asm' and not self.msvs_settings.HasExplicitAsmRules(spec)): command = 'asm' # Add the _asm suffix as msvs is capable of handling .cc and # .asm files of the same name without collision. obj_ext = '_asm.obj' elif self.flavor == 'mac' and ext == 'm': command = 'objc' elif self.flavor == 'mac' and ext == 'mm': command = 'objcxx' self.uses_cpp = True elif self.flavor == 'win' and ext == 'rc': command = 'rc' obj_ext = '.res' has_rc_source = True else: # Ignore unhandled extensions. 
continue input = self.GypPathToNinja(source) output = self.GypPathToUniqueOutput(filename + obj_ext) if arch is not None: output = AddArch(output, arch) implicit = precompiled_header.GetObjDependencies([input], [output], arch) variables = [] if self.flavor == 'win': variables, output, implicit = precompiled_header.GetFlagsModifications( input, output, implicit, command, cflags_c, cflags_cc, self.ExpandSpecial) ninja_file.build(output, command, input, implicit=[gch for _, _, gch in implicit], order_only=predepends, variables=variables) outputs.append(output) if has_rc_source: resource_include_dirs = config.get('resource_include_dirs', include_dirs) self.WriteVariableList(ninja_file, 'resource_includes', [QuoteShellArgument('-I' + self.GypPathToNinja(i, env), self.flavor) for i in resource_include_dirs]) self.WritePchTargets(ninja_file, pch_commands) ninja_file.newline() return outputs def WritePchTargets(self, ninja_file, pch_commands): """Writes ninja rules to compile prefix headers.""" if not pch_commands: return for gch, lang_flag, lang, input in pch_commands: var_name = { 'c': 'cflags_pch_c', 'cc': 'cflags_pch_cc', 'm': 'cflags_pch_objc', 'mm': 'cflags_pch_objcc', }[lang] map = { 'c': 'cc', 'cc': 'cxx', 'm': 'objc', 'mm': 'objcxx', } cmd = map.get(lang) ninja_file.build(gch, cmd, input, variables=[(var_name, lang_flag)]) def WriteLink(self, spec, config_name, config, link_deps): """Write out a link step. Fills out target.binary. """ if self.flavor != 'mac' or len(self.archs) == 1: return self.WriteLinkForArch( self.ninja, spec, config_name, config, link_deps) else: output = self.ComputeOutput(spec) inputs = [self.WriteLinkForArch(self.arch_subninjas[arch], spec, config_name, config, link_deps[arch], arch=arch) for arch in self.archs] extra_bindings = [] build_output = output if not self.is_mac_bundle: self.AppendPostbuildVariable(extra_bindings, spec, output, output) # TODO(yyanagisawa): more work needed to fix: # https://code.google.com/p/gyp/issues/detail?id=411 if (spec['type'] in ('shared_library', 'loadable_module') and not self.is_mac_bundle): extra_bindings.append(('lib', output)) self.ninja.build([output, output + '.TOC'], 'solipo', inputs, variables=extra_bindings) else: self.ninja.build(build_output, 'lipo', inputs, variables=extra_bindings) return output def WriteLinkForArch(self, ninja_file, spec, config_name, config, link_deps, arch=None): """Write out a link step. Fills out target.binary. """ command = { 'executable': 'link', 'loadable_module': 'solink_module', 'shared_library': 'solink', }[spec['type']] command_suffix = '' implicit_deps = set() solibs = set() order_deps = set() if 'dependencies' in spec: # Two kinds of dependencies: # - Linkable dependencies (like a .a or a .so): add them to the link line. 
# - Non-linkable dependencies (like a rule that generates a file # and writes a stamp file): add them to implicit_deps extra_link_deps = set() for dep in spec['dependencies']: target = self.target_outputs.get(dep) if not target: continue linkable = target.Linkable() if linkable: new_deps = [] if (self.flavor == 'win' and target.component_objs and self.msvs_settings.IsUseLibraryDependencyInputs(config_name)): new_deps = target.component_objs if target.compile_deps: order_deps.add(target.compile_deps) elif self.flavor == 'win' and target.import_lib: new_deps = [target.import_lib] elif target.UsesToc(self.flavor): solibs.add(target.binary) implicit_deps.add(target.binary + '.TOC') else: new_deps = [target.binary] for new_dep in new_deps: if new_dep not in extra_link_deps: extra_link_deps.add(new_dep) link_deps.append(new_dep) final_output = target.FinalOutput() if not linkable or final_output != target.binary: implicit_deps.add(final_output) extra_bindings = [] if self.uses_cpp and self.flavor != 'win': extra_bindings.append(('ld', '$ldxx')) output = self.ComputeOutput(spec, arch) if arch is None and not self.is_mac_bundle: self.AppendPostbuildVariable(extra_bindings, spec, output, output) is_executable = spec['type'] == 'executable' # The ldflags config key is not used on mac or win. On those platforms # linker flags are set via xcode_settings and msvs_settings, respectively. env_ldflags = os.environ.get('LDFLAGS', '').split() if self.flavor == 'mac': ldflags = self.xcode_settings.GetLdflags(config_name, self.ExpandSpecial(generator_default_variables['PRODUCT_DIR']), self.GypPathToNinja, arch) ldflags = env_ldflags + ldflags elif self.flavor == 'win': manifest_base_name = self.GypPathToUniqueOutput( self.ComputeOutputFileName(spec)) ldflags, intermediate_manifest, manifest_files = \ self.msvs_settings.GetLdflags(config_name, self.GypPathToNinja, self.ExpandSpecial, manifest_base_name, output, is_executable, self.toplevel_build) ldflags = env_ldflags + ldflags self.WriteVariableList(ninja_file, 'manifests', manifest_files) implicit_deps = implicit_deps.union(manifest_files) if intermediate_manifest: self.WriteVariableList( ninja_file, 'intermediatemanifest', [intermediate_manifest]) command_suffix = _GetWinLinkRuleNameSuffix( self.msvs_settings.IsEmbedManifest(config_name)) def_file = self.msvs_settings.GetDefFile(self.GypPathToNinja) if def_file: implicit_deps.add(def_file) else: # Respect environment variables related to build, but target-specific # flags can still override them. 
ldflags = env_ldflags + config.get('ldflags', []) if is_executable and len(solibs): rpath = 'lib/' if self.toolset != 'target': rpath += self.toolset ldflags.append(r'-Wl,-rpath=\$$ORIGIN/%s' % rpath) ldflags.append('-Wl,-rpath-link=%s' % rpath) self.WriteVariableList(ninja_file, 'ldflags', map(self.ExpandSpecial, ldflags)) library_dirs = config.get('library_dirs', []) if self.flavor == 'win': library_dirs = [self.msvs_settings.ConvertVSMacros(l, config_name) for l in library_dirs] library_dirs = ['/LIBPATH:' + QuoteShellArgument(self.GypPathToNinja(l), self.flavor) for l in library_dirs] else: library_dirs = [QuoteShellArgument('-L' + self.GypPathToNinja(l), self.flavor) for l in library_dirs] libraries = gyp.common.uniquer(map(self.ExpandSpecial, spec.get('libraries', []))) if self.flavor == 'mac': libraries = self.xcode_settings.AdjustLibraries(libraries, config_name) elif self.flavor == 'win': libraries = self.msvs_settings.AdjustLibraries(libraries) self.WriteVariableList(ninja_file, 'libs', library_dirs + libraries) linked_binary = output if command in ('solink', 'solink_module'): extra_bindings.append(('soname', os.path.split(output)[1])) extra_bindings.append(('lib', gyp.common.EncodePOSIXShellArgument(output))) if self.flavor != 'win': link_file_list = output if self.is_mac_bundle: # 'Dependency Framework.framework/Versions/A/Dependency Framework' -> # 'Dependency Framework.framework.rsp' link_file_list = self.xcode_settings.GetWrapperName() if arch: link_file_list += '.' + arch link_file_list += '.rsp' # If an rspfile contains spaces, ninja surrounds the filename with # quotes around it and then passes it to open(), creating a file with # quotes in its name (and when looking for the rsp file, the name # makes it through bash which strips the quotes) :-/ link_file_list = link_file_list.replace(' ', '_') extra_bindings.append( ('link_file_list', gyp.common.EncodePOSIXShellArgument(link_file_list))) if self.flavor == 'win': extra_bindings.append(('binary', output)) if ('/NOENTRY' not in ldflags and not self.msvs_settings.GetNoImportLibrary(config_name)): self.target.import_lib = output + '.lib' extra_bindings.append(('implibflag', '/IMPLIB:%s' % self.target.import_lib)) pdbname = self.msvs_settings.GetPDBName( config_name, self.ExpandSpecial, output + '.pdb') output = [output, self.target.import_lib] if pdbname: output.append(pdbname) elif not self.is_mac_bundle: output = [output, output + '.TOC'] else: command = command + '_notoc' elif self.flavor == 'win': extra_bindings.append(('binary', output)) pdbname = self.msvs_settings.GetPDBName( config_name, self.ExpandSpecial, output + '.pdb') if pdbname: output = [output, pdbname] if len(solibs): extra_bindings.append(('solibs', gyp.common.EncodePOSIXShellList(solibs))) ninja_file.build(output, command + command_suffix, link_deps, implicit=list(implicit_deps), order_only=list(order_deps), variables=extra_bindings) return linked_binary def WriteTarget(self, spec, config_name, config, link_deps, compile_deps): extra_link_deps = any(self.target_outputs.get(dep).Linkable() for dep in spec.get('dependencies', []) if dep in self.target_outputs) if spec['type'] == 'none' or (not link_deps and not extra_link_deps): # TODO(evan): don't call this function for 'none' target types, as # it doesn't do anything, and we fake out a 'binary' with a stamp file. 
self.target.binary = compile_deps self.target.type = 'none' elif spec['type'] == 'static_library': self.target.binary = self.ComputeOutput(spec) if (self.flavor not in ('mac', 'openbsd', 'netbsd', 'win') and not self.is_standalone_static_library): self.ninja.build(self.target.binary, 'alink_thin', link_deps, order_only=compile_deps) else: variables = [] if self.xcode_settings: libtool_flags = self.xcode_settings.GetLibtoolflags(config_name) if libtool_flags: variables.append(('libtool_flags', libtool_flags)) if self.msvs_settings: libflags = self.msvs_settings.GetLibFlags(config_name, self.GypPathToNinja) variables.append(('libflags', libflags)) if self.flavor != 'mac' or len(self.archs) == 1: self.AppendPostbuildVariable(variables, spec, self.target.binary, self.target.binary) self.ninja.build(self.target.binary, 'alink', link_deps, order_only=compile_deps, variables=variables) else: inputs = [] for arch in self.archs: output = self.ComputeOutput(spec, arch) self.arch_subninjas[arch].build(output, 'alink', link_deps[arch], order_only=compile_deps, variables=variables) inputs.append(output) # TODO: It's not clear if libtool_flags should be passed to the alink # call that combines single-arch .a files into a fat .a file. self.AppendPostbuildVariable(variables, spec, self.target.binary, self.target.binary) self.ninja.build(self.target.binary, 'alink', inputs, # FIXME: test proving order_only=compile_deps isn't # needed. variables=variables) else: self.target.binary = self.WriteLink(spec, config_name, config, link_deps) return self.target.binary def WriteMacBundle(self, spec, mac_bundle_depends, is_empty): assert self.is_mac_bundle package_framework = spec['type'] in ('shared_library', 'loadable_module') output = self.ComputeMacBundleOutput() if is_empty: output += '.stamp' variables = [] self.AppendPostbuildVariable(variables, spec, output, self.target.binary, is_command_start=not package_framework) if package_framework and not is_empty: variables.append(('version', self.xcode_settings.GetFrameworkVersion())) self.ninja.build(output, 'package_framework', mac_bundle_depends, variables=variables) else: self.ninja.build(output, 'stamp', mac_bundle_depends, variables=variables) self.target.bundle = output return output def GetToolchainEnv(self, additional_settings=None): """Returns the variables toolchain would set for build steps.""" env = self.GetSortedXcodeEnv(additional_settings=additional_settings) if self.flavor == 'win': env = self.GetMsvsToolchainEnv( additional_settings=additional_settings) return env def GetMsvsToolchainEnv(self, additional_settings=None): """Returns the variables Visual Studio would set for build steps.""" return self.msvs_settings.GetVSMacroEnv('$!PRODUCT_DIR', config=self.config_name) def GetSortedXcodeEnv(self, additional_settings=None): """Returns the variables Xcode would set for build steps.""" assert self.abs_build_dir abs_build_dir = self.abs_build_dir return gyp.xcode_emulation.GetSortedXcodeEnv( self.xcode_settings, abs_build_dir, os.path.join(abs_build_dir, self.build_to_base), self.config_name, additional_settings) def GetSortedXcodePostbuildEnv(self): """Returns the variables Xcode would set for postbuild steps.""" postbuild_settings = {} # CHROMIUM_STRIP_SAVE_FILE is a chromium-specific hack. # TODO(thakis): It would be nice to have some general mechanism instead. 
strip_save_file = self.xcode_settings.GetPerTargetSetting( 'CHROMIUM_STRIP_SAVE_FILE') if strip_save_file: postbuild_settings['CHROMIUM_STRIP_SAVE_FILE'] = strip_save_file return self.GetSortedXcodeEnv(additional_settings=postbuild_settings) def AppendPostbuildVariable(self, variables, spec, output, binary, is_command_start=False): """Adds a 'postbuild' variable if there is a postbuild for |output|.""" postbuild = self.GetPostbuildCommand(spec, output, binary, is_command_start) if postbuild: variables.append(('postbuilds', postbuild)) def GetPostbuildCommand(self, spec, output, output_binary, is_command_start): """Returns a shell command that runs all the postbuilds, and removes |output| if any of them fails. If |is_command_start| is False, then the returned string will start with ' && '.""" if not self.xcode_settings or spec['type'] == 'none' or not output: return '' output = QuoteShellArgument(output, self.flavor) postbuilds = gyp.xcode_emulation.GetSpecPostbuildCommands(spec, quiet=True) if output_binary is not None: postbuilds = self.xcode_settings.AddImplicitPostbuilds( self.config_name, os.path.normpath(os.path.join(self.base_to_build, output)), QuoteShellArgument( os.path.normpath(os.path.join(self.base_to_build, output_binary)), self.flavor), postbuilds, quiet=True) if not postbuilds: return '' # Postbuilds expect to be run in the gyp file's directory, so insert an # implicit postbuild to cd there. postbuilds.insert(0, gyp.common.EncodePOSIXShellList( ['cd', self.build_to_base])) env = self.ComputeExportEnvString(self.GetSortedXcodePostbuildEnv()) # G will be non-zero if any postbuild fails. Run all postbuilds in a # subshell. commands = env + ' (' + \ ' && '.join([ninja_syntax.escape(command) for command in postbuilds]) command_string = (commands + '); G=$$?; ' # Remove the final output if any postbuild failed. '((exit $$G) || rm -rf %s) ' % output + '&& exit $$G)') if is_command_start: return '(' + command_string + ' && ' else: return '$ && (' + command_string def ComputeExportEnvString(self, env): """Given an environment, returns a string looking like 'export FOO=foo; export BAR="${FOO} bar";' that exports |env| to the shell.""" export_str = [] for k, v in env: export_str.append('export %s=%s;' % (k, ninja_syntax.escape(gyp.common.EncodePOSIXShellArgument(v)))) return ' '.join(export_str) def ComputeMacBundleOutput(self): """Return the 'output' (full output path) to a bundle output directory.""" assert self.is_mac_bundle path = generator_default_variables['PRODUCT_DIR'] return self.ExpandSpecial( os.path.join(path, self.xcode_settings.GetWrapperName())) def ComputeOutputFileName(self, spec, type=None): """Compute the filename of the final output for the current target.""" if not type: type = spec['type'] default_variables = copy.copy(generator_default_variables) CalculateVariables(default_variables, {'flavor': self.flavor}) # Compute filename prefix: the product prefix, or a default for # the product type. DEFAULT_PREFIX = { 'loadable_module': default_variables['SHARED_LIB_PREFIX'], 'shared_library': default_variables['SHARED_LIB_PREFIX'], 'static_library': default_variables['STATIC_LIB_PREFIX'], 'executable': default_variables['EXECUTABLE_PREFIX'], } prefix = spec.get('product_prefix', DEFAULT_PREFIX.get(type, '')) # Compute filename extension: the product extension, or a default # for the product type.
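# For example, on Linux a shared_library named 'foo' gets prefix 'lib' and suffix '.so', producing 'libfoo.so'; specifying 'product_extension': 'plugin' would produce 'libfoo.plugin' instead.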
DEFAULT_EXTENSION = { 'loadable_module': default_variables['SHARED_LIB_SUFFIX'], 'shared_library': default_variables['SHARED_LIB_SUFFIX'], 'static_library': default_variables['STATIC_LIB_SUFFIX'], 'executable': default_variables['EXECUTABLE_SUFFIX'], } extension = spec.get('product_extension') if extension: extension = '.' + extension else: extension = DEFAULT_EXTENSION.get(type, '') if 'product_name' in spec: # If we were given an explicit name, use that. target = spec['product_name'] else: # Otherwise, derive a name from the target name. target = spec['target_name'] if prefix == 'lib': # Snip out an extra 'lib' from libs if appropriate. target = StripPrefix(target, 'lib') if type in ('static_library', 'loadable_module', 'shared_library', 'executable'): return '%s%s%s' % (prefix, target, extension) elif type == 'none': return '%s.stamp' % target else: raise Exception('Unhandled output type %s' % type) def ComputeOutput(self, spec, arch=None): """Compute the path for the final output of the spec.""" type = spec['type'] if self.flavor == 'win': override = self.msvs_settings.GetOutputName(self.config_name, self.ExpandSpecial) if override: return override if arch is None and self.flavor == 'mac' and type in ( 'static_library', 'executable', 'shared_library', 'loadable_module'): filename = self.xcode_settings.GetExecutablePath() else: filename = self.ComputeOutputFileName(spec, type) if arch is None and 'product_dir' in spec: path = os.path.join(spec['product_dir'], filename) return self.ExpandSpecial(path) # Some products go into the output root, libraries go into shared library # dir, and everything else goes into the normal place. type_in_output_root = ['executable', 'loadable_module'] if self.flavor == 'mac' and self.toolset == 'target': type_in_output_root += ['shared_library', 'static_library'] elif self.flavor == 'win' and self.toolset == 'target': type_in_output_root += ['shared_library'] if arch is not None: # Make sure partial executables don't end up in a bundle or the regular # output directory. archdir = 'arch' if self.toolset != 'target': archdir = os.path.join('arch', '%s' % self.toolset) return os.path.join(archdir, AddArch(filename, arch)) elif type in type_in_output_root or self.is_standalone_static_library: return filename elif type == 'shared_library': libdir = 'lib' if self.toolset != 'target': libdir = os.path.join('lib', '%s' % self.toolset) return os.path.join(libdir, filename) else: return self.GypPathToUniqueOutput(filename, qualified=False) def WriteVariableList(self, ninja_file, var, values): assert not isinstance(values, str) if values is None: values = [] ninja_file.variable(var, ' '.join(values)) def WriteNewNinjaRule(self, name, args, description, is_cygwin, env, pool, depfile=None): """Write out a new ninja "rule" statement for a given command. Returns the name of the new rule, and a copy of |args| with variables expanded.""" if self.flavor == 'win': args = [self.msvs_settings.ConvertVSMacros( arg, self.base_to_build, config=self.config_name) for arg in args] description = self.msvs_settings.ConvertVSMacros( description, config=self.config_name) elif self.flavor == 'mac': # |env| is an empty list on non-mac. args = [gyp.xcode_emulation.ExpandEnvVars(arg, env) for arg in args] description = gyp.xcode_emulation.ExpandEnvVars(description, env) # TODO: we shouldn't need to qualify names; we do it because # currently the ninja rule namespace is global, but it really # should be scoped to the subninja. 
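# For example, an action 'bar' on target 'foo' built with the 'target' toolset is qualified to 'foo.target.bar'; the re.sub below then rewrites every character outside [a-zA-Z0-9_] (including those dots) to '_', yielding 'foo_target_bar'.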
rule_name = self.name if self.toolset == 'target': rule_name += '.' + self.toolset rule_name += '.' + name rule_name = re.sub('[^a-zA-Z0-9_]', '_', rule_name) # Remove variable references, but not if they refer to the magic rule # variables. This is not quite right, as it also protects these for # actions, not just for rules where they are valid. Good enough. protect = [ '${root}', '${dirname}', '${source}', '${ext}', '${name}' ] protect = '(?!' + '|'.join(map(re.escape, protect)) + ')' description = re.sub(protect + r'\$', '_', description) # gyp dictates that commands are run from the base directory. # cd into the directory before running, and adjust paths in # the arguments to point to the proper locations. rspfile = None rspfile_content = None args = [self.ExpandSpecial(arg, self.base_to_build) for arg in args] if self.flavor == 'win': rspfile = rule_name + '.$unique_name.rsp' # The cygwin case handles this inside the bash sub-shell. run_in = '' if is_cygwin else ' ' + self.build_to_base if is_cygwin: rspfile_content = self.msvs_settings.BuildCygwinBashCommandLine( args, self.build_to_base) else: rspfile_content = gyp.msvs_emulation.EncodeRspFileList(args) command = ('%s gyp-win-tool action-wrapper $arch ' % sys.executable + rspfile + run_in) else: env = self.ComputeExportEnvString(env) command = gyp.common.EncodePOSIXShellList(args) command = 'cd %s; ' % self.build_to_base + env + command # GYP rules/actions express being no-ops by not touching their outputs. # Avoid executing downstream dependencies in this case by specifying # restat=1 to ninja. self.ninja.rule(rule_name, command, description, depfile=depfile, restat=True, pool=pool, rspfile=rspfile, rspfile_content=rspfile_content) self.ninja.newline() return rule_name, args def CalculateVariables(default_variables, params): """Calculate additional variables for use in the build (called by gyp).""" global generator_additional_non_configuration_keys global generator_additional_path_sections flavor = gyp.common.GetFlavor(params) if flavor == 'mac': default_variables.setdefault('OS', 'mac') default_variables.setdefault('SHARED_LIB_SUFFIX', '.dylib') default_variables.setdefault('SHARED_LIB_DIR', generator_default_variables['PRODUCT_DIR']) default_variables.setdefault('LIB_DIR', generator_default_variables['PRODUCT_DIR']) # Copy additional generator configuration data from Xcode, which is shared # by the Mac Ninja generator. import gyp.generator.xcode as xcode_generator generator_additional_non_configuration_keys = getattr(xcode_generator, 'generator_additional_non_configuration_keys', []) generator_additional_path_sections = getattr(xcode_generator, 'generator_additional_path_sections', []) global generator_extra_sources_for_rules generator_extra_sources_for_rules = getattr(xcode_generator, 'generator_extra_sources_for_rules', []) elif flavor == 'win': exts = gyp.MSVSUtil.TARGET_TYPE_EXT default_variables.setdefault('OS', 'win') default_variables['EXECUTABLE_SUFFIX'] = '.' + exts['executable'] default_variables['STATIC_LIB_PREFIX'] = '' default_variables['STATIC_LIB_SUFFIX'] = '.' + exts['static_library'] default_variables['SHARED_LIB_PREFIX'] = '' default_variables['SHARED_LIB_SUFFIX'] = '.' + exts['shared_library'] # Copy additional generator configuration data from VS, which is shared # by the Windows Ninja generator. 
import gyp.generator.msvs as msvs_generator generator_additional_non_configuration_keys = getattr(msvs_generator, 'generator_additional_non_configuration_keys', []) generator_additional_path_sections = getattr(msvs_generator, 'generator_additional_path_sections', []) gyp.msvs_emulation.CalculateCommonVariables(default_variables, params) else: operating_system = flavor if flavor == 'android': operating_system = 'linux' # Keep this legacy behavior for now. default_variables.setdefault('OS', operating_system) default_variables.setdefault('SHARED_LIB_SUFFIX', '.so') default_variables.setdefault('SHARED_LIB_DIR', os.path.join('$!PRODUCT_DIR', 'lib')) default_variables.setdefault('LIB_DIR', os.path.join('$!PRODUCT_DIR', 'obj')) def ComputeOutputDir(params): """Returns the path from the toplevel_dir to the build output directory.""" # generator_dir: relative path from pwd to where make puts build files. # Makes migrating from make to ninja easier, ninja doesn't put anything here. generator_dir = os.path.relpath(params['options'].generator_output or '.') # output_dir: relative path from generator_dir to the build directory. output_dir = params.get('generator_flags', {}).get('output_dir', 'out') # Relative path from source root to our output files. e.g. "out" return os.path.normpath(os.path.join(generator_dir, output_dir)) def CalculateGeneratorInputInfo(params): """Called by __init__ to initialize generator values based on params.""" # E.g. "out/gypfiles" toplevel = params['options'].toplevel_dir qualified_out_dir = os.path.normpath(os.path.join( toplevel, ComputeOutputDir(params), 'gypfiles')) global generator_filelist_paths generator_filelist_paths = { 'toplevel': toplevel, 'qualified_out_dir': qualified_out_dir, } def OpenOutput(path, mode='w'): """Open |path| for writing, creating directories if necessary.""" gyp.common.EnsureDirExists(path) return open(path, mode) def CommandWithWrapper(cmd, wrappers, prog): wrapper = wrappers.get(cmd, '') if wrapper: return wrapper + ' ' + prog return prog def GetDefaultConcurrentLinks(): """Returns a best-guess for a number of concurrent links.""" pool_size = int(os.environ.get('GYP_LINK_CONCURRENCY', 0)) if pool_size: return pool_size if sys.platform in ('win32', 'cygwin'): import ctypes class MEMORYSTATUSEX(ctypes.Structure): _fields_ = [ ("dwLength", ctypes.c_ulong), ("dwMemoryLoad", ctypes.c_ulong), ("ullTotalPhys", ctypes.c_ulonglong), ("ullAvailPhys", ctypes.c_ulonglong), ("ullTotalPageFile", ctypes.c_ulonglong), ("ullAvailPageFile", ctypes.c_ulonglong), ("ullTotalVirtual", ctypes.c_ulonglong), ("ullAvailVirtual", ctypes.c_ulonglong), ("sullAvailExtendedVirtual", ctypes.c_ulonglong), ] stat = MEMORYSTATUSEX() stat.dwLength = ctypes.sizeof(stat) ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat)) # VS 2015 uses 20% more working set than VS 2013 and can consume all RAM # on a 64 GB machine. 
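# For example, on a Windows machine with 32 GiB of physical RAM the heuristic below allows 32 / 5 = 6 concurrent links (floor division), unless GYP_LINK_CONCURRENCY or GYP_LINK_CONCURRENCY_MAX dictates otherwise.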
mem_limit = max(1, stat.ullTotalPhys / (5 * (2 ** 30))) # total / 5GB hard_cap = max(1, int(os.environ.get('GYP_LINK_CONCURRENCY_MAX', 2**32))) return min(mem_limit, hard_cap) elif sys.platform.startswith('linux'): if os.path.exists("/proc/meminfo"): with open("/proc/meminfo") as meminfo: memtotal_re = re.compile(r'^MemTotal:\s*(\d*)\s*kB') for line in meminfo: match = memtotal_re.match(line) if not match: continue # Allow 8Gb per link on Linux because Gold is quite memory hungry return max(1, int(match.group(1)) / (8 * (2 ** 20))) return 1 elif sys.platform == 'darwin': try: avail_bytes = int(subprocess.check_output(['sysctl', '-n', 'hw.memsize'])) # A static library debug build of Chromium's unit_tests takes ~2.7GB, so # 4GB per ld process allows for some more bloat. return max(1, avail_bytes / (4 * (2 ** 30))) # total / 4GB except: return 1 else: # TODO(scottmg): Implement this for other platforms. return 1 def _GetWinLinkRuleNameSuffix(embed_manifest): """Returns the suffix used to select an appropriate linking rule depending on whether the manifest embedding is enabled.""" return '_embed' if embed_manifest else '' def _AddWinLinkRules(master_ninja, embed_manifest): """Adds link rules for Windows platform to |master_ninja|.""" def FullLinkCommand(ldcmd, out, binary_type): resource_name = { 'exe': '1', 'dll': '2', }[binary_type] return '%(python)s gyp-win-tool link-with-manifests $arch %(embed)s ' \ '%(out)s "%(ldcmd)s" %(resname)s $mt $rc "$intermediatemanifest" ' \ '$manifests' % { 'python': sys.executable, 'out': out, 'ldcmd': ldcmd, 'resname': resource_name, 'embed': embed_manifest } rule_name_suffix = _GetWinLinkRuleNameSuffix(embed_manifest) use_separate_mspdbsrv = ( int(os.environ.get('GYP_USE_SEPARATE_MSPDBSRV', '0')) != 0) dlldesc = 'LINK%s(DLL) $binary' % rule_name_suffix.upper() dllcmd = ('%s gyp-win-tool link-wrapper $arch %s ' '$ld /nologo $implibflag /DLL /OUT:$binary ' '@$binary.rsp' % (sys.executable, use_separate_mspdbsrv)) dllcmd = FullLinkCommand(dllcmd, '$binary', 'dll') master_ninja.rule('solink' + rule_name_suffix, description=dlldesc, command=dllcmd, rspfile='$binary.rsp', rspfile_content='$libs $in_newline $ldflags', restat=True, pool='link_pool') master_ninja.rule('solink_module' + rule_name_suffix, description=dlldesc, command=dllcmd, rspfile='$binary.rsp', rspfile_content='$libs $in_newline $ldflags', restat=True, pool='link_pool') # Note that ldflags goes at the end so that it has the option of # overriding default settings earlier in the command line. exe_cmd = ('%s gyp-win-tool link-wrapper $arch %s ' '$ld /nologo /OUT:$binary @$binary.rsp' % (sys.executable, use_separate_mspdbsrv)) exe_cmd = FullLinkCommand(exe_cmd, '$binary', 'exe') master_ninja.rule('link' + rule_name_suffix, description='LINK%s $binary' % rule_name_suffix.upper(), command=exe_cmd, rspfile='$binary.rsp', rspfile_content='$in_newline $libs $ldflags', pool='link_pool') def GenerateOutputForConfig(target_list, target_dicts, data, params, config_name): options = params['options'] flavor = gyp.common.GetFlavor(params) generator_flags = params.get('generator_flags', {}) # build_dir: relative path from source root to our output files. # e.g. "out/Debug" build_dir = os.path.normpath( os.path.join(ComputeOutputDir(params), config_name)) toplevel_build = os.path.join(options.toplevel_dir, build_dir) master_ninja_file = OpenOutput(os.path.join(toplevel_build, 'build.ninja')) master_ninja = ninja_syntax.Writer(master_ninja_file, width=120) # Put build-time support tools in out/{config_name}. 
gyp.common.CopyTool(flavor, toplevel_build) # Grab make settings for CC/CXX. # The rules are: # - The priority from low to high is gcc/g++, the 'make_global_settings' in # gyp, the environment variable. # - If there is no 'make_global_settings' for CC.host/CXX.host or # 'CC_host'/'CXX_host' environment variable, cc_host/cxx_host should be set # to cc/cxx. if flavor == 'win': ar = 'lib.exe' # cc and cxx must be set to the correct architecture by overriding with one # of cl_x86 or cl_x64 below. cc = 'UNSET' cxx = 'UNSET' ld = 'link.exe' ld_host = '$ld' else: ar = 'ar' cc = 'cc' cxx = 'c++' ld = '$cc' ldxx = '$cxx' ld_host = '$cc_host' ldxx_host = '$cxx_host' ar_host = 'ar' cc_host = None cxx_host = None cc_host_global_setting = None cxx_host_global_setting = None clang_cl = None nm = 'nm' nm_host = 'nm' readelf = 'readelf' readelf_host = 'readelf' build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0]) make_global_settings = data[build_file].get('make_global_settings', []) build_to_root = gyp.common.InvertRelativePath(build_dir, options.toplevel_dir) wrappers = {} for key, value in make_global_settings: if key == 'AR': ar = os.path.join(build_to_root, value) if key == 'AR.host': ar_host = os.path.join(build_to_root, value) if key == 'CC': cc = os.path.join(build_to_root, value) if cc.endswith('clang-cl'): clang_cl = cc if key == 'CXX': cxx = os.path.join(build_to_root, value) if key == 'CC.host': cc_host = os.path.join(build_to_root, value) cc_host_global_setting = value if key == 'CXX.host': cxx_host = os.path.join(build_to_root, value) cxx_host_global_setting = value if key == 'LD': ld = os.path.join(build_to_root, value) if key == 'LD.host': ld_host = os.path.join(build_to_root, value) if key == 'NM': nm = os.path.join(build_to_root, value) if key == 'NM.host': nm_host = os.path.join(build_to_root, value) if key == 'READELF': readelf = os.path.join(build_to_root, value) if key == 'READELF.host': readelf_host = os.path.join(build_to_root, value) if key.endswith('_wrapper'): wrappers[key[:-len('_wrapper')]] = os.path.join(build_to_root, value) # Support wrappers from environment variables too. for key, value in os.environ.iteritems(): if key.lower().endswith('_wrapper'): key_prefix = key[:-len('_wrapper')] key_prefix = re.sub(r'\.HOST$', '.host', key_prefix) wrappers[key_prefix] = os.path.join(build_to_root, value) if flavor == 'win': configs = [target_dicts[qualified_target]['configurations'][config_name] for qualified_target in target_list] shared_system_includes = None if not generator_flags.get('ninja_use_custom_environment_files', 0): shared_system_includes = \ gyp.msvs_emulation.ExtractSharedMSVSSystemIncludes( configs, generator_flags) cl_paths = gyp.msvs_emulation.GenerateEnvironmentFiles( toplevel_build, generator_flags, shared_system_includes, OpenOutput) for arch, path in cl_paths.iteritems(): if clang_cl: # If we have selected clang-cl, use that instead. path = clang_cl command = CommandWithWrapper('CC', wrappers, QuoteShellArgument(path, 'win')) if clang_cl: # Use clang-cl to cross-compile for x86 or x86_64.
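# (clang-cl accepts the GCC-style -m32/-m64 flags, so the one binary can stand in for both cl_x86 and cl_x64.)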
command += (' -m32' if arch == 'x86' else ' -m64') master_ninja.variable('cl_' + arch, command) cc = GetEnvironFallback(['CC_target', 'CC'], cc) master_ninja.variable('cc', CommandWithWrapper('CC', wrappers, cc)) cxx = GetEnvironFallback(['CXX_target', 'CXX'], cxx) master_ninja.variable('cxx', CommandWithWrapper('CXX', wrappers, cxx)) if flavor == 'win': master_ninja.variable('ld', ld) master_ninja.variable('idl', 'midl.exe') master_ninja.variable('ar', ar) master_ninja.variable('rc', 'rc.exe') master_ninja.variable('ml_x86', 'ml.exe') master_ninja.variable('ml_x64', 'ml64.exe') master_ninja.variable('mt', 'mt.exe') else: master_ninja.variable('ld', CommandWithWrapper('LINK', wrappers, ld)) master_ninja.variable('ldxx', CommandWithWrapper('LINK', wrappers, ldxx)) master_ninja.variable('ar', GetEnvironFallback(['AR_target', 'AR'], ar)) if flavor != 'mac': # Mac does not use readelf/nm for .TOC generation, so avoid polluting # the master ninja with extra unused variables. master_ninja.variable( 'nm', GetEnvironFallback(['NM_target', 'NM'], nm)) master_ninja.variable( 'readelf', GetEnvironFallback(['READELF_target', 'READELF'], readelf)) if generator_supports_multiple_toolsets: if not cc_host: cc_host = cc if not cxx_host: cxx_host = cxx master_ninja.variable('ar_host', GetEnvironFallback(['AR_host'], ar_host)) master_ninja.variable('nm_host', GetEnvironFallback(['NM_host'], nm_host)) master_ninja.variable('readelf_host', GetEnvironFallback(['READELF_host'], readelf_host)) cc_host = GetEnvironFallback(['CC_host'], cc_host) cxx_host = GetEnvironFallback(['CXX_host'], cxx_host) # The environment variable could be used in 'make_global_settings', like # ['CC.host', '$(CC)'] or ['CXX.host', '$(CXX)']; transform them here. if '$(CC)' in cc_host and cc_host_global_setting: cc_host = cc_host_global_setting.replace('$(CC)', cc) if '$(CXX)' in cxx_host and cxx_host_global_setting: cxx_host = cxx_host_global_setting.replace('$(CXX)', cxx) master_ninja.variable('cc_host', CommandWithWrapper('CC.host', wrappers, cc_host)) master_ninja.variable('cxx_host', CommandWithWrapper('CXX.host', wrappers, cxx_host)) if flavor == 'win': master_ninja.variable('ld_host', ld_host) else: master_ninja.variable('ld_host', CommandWithWrapper( 'LINK', wrappers, ld_host)) master_ninja.variable('ldxx_host', CommandWithWrapper( 'LINK', wrappers, ldxx_host)) master_ninja.newline() master_ninja.pool('link_pool', depth=GetDefaultConcurrentLinks()) master_ninja.newline() deps = 'msvc' if flavor == 'win' else 'gcc' if flavor != 'win': master_ninja.rule( 'cc', description='CC $out', command=('$cc -MMD -MF $out.d $defines $includes $cflags $cflags_c ' '$cflags_pch_c -c $in -o $out'), depfile='$out.d', deps=deps) master_ninja.rule( 'cc_s', description='CC $out', command=('$cc $defines $includes $cflags $cflags_c ' '$cflags_pch_c -c $in -o $out')) master_ninja.rule( 'cxx', description='CXX $out', command=('$cxx -MMD -MF $out.d $defines $includes $cflags $cflags_cc ' '$cflags_pch_cc -c $in -o $out'), depfile='$out.d', deps=deps) else: # TODO(scottmg) Separate pdb names is a test to see if it works around # http://crbug.com/142362. It seems there's a race between the creation of # the .pdb by the precompiled header step for .cc and the compilation of # .c files. This should be handled by mspdbsrv, but rarely errors out with # c1xx : fatal error C1033: cannot open program database # By making the rules target separate pdb files this might be avoided.
cc_command = ('ninja -t msvc -e $arch ' + '-- ' '$cc /nologo /showIncludes /FC ' '@$out.rsp /c $in /Fo$out /Fd$pdbname_c ') cxx_command = ('ninja -t msvc -e $arch ' + '-- ' '$cxx /nologo /showIncludes /FC ' '@$out.rsp /c $in /Fo$out /Fd$pdbname_cc ') master_ninja.rule( 'cc', description='CC $out', command=cc_command, rspfile='$out.rsp', rspfile_content='$defines $includes $cflags $cflags_c', deps=deps) master_ninja.rule( 'cxx', description='CXX $out', command=cxx_command, rspfile='$out.rsp', rspfile_content='$defines $includes $cflags $cflags_cc', deps=deps) master_ninja.rule( 'idl', description='IDL $in', command=('%s gyp-win-tool midl-wrapper $arch $outdir ' '$tlb $h $dlldata $iid $proxy $in ' '$midl_includes $idlflags' % sys.executable)) master_ninja.rule( 'rc', description='RC $in', # Note: $in must be last otherwise rc.exe complains. command=('%s gyp-win-tool rc-wrapper ' '$arch $rc $defines $resource_includes $rcflags /fo$out $in' % sys.executable)) master_ninja.rule( 'asm', description='ASM $out', command=('%s gyp-win-tool asm-wrapper ' '$arch $asm $defines $includes $asmflags /c /Fo $out $in' % sys.executable)) if flavor != 'mac' and flavor != 'win': master_ninja.rule( 'alink', description='AR $out', command='rm -f $out && $ar rcs $arflags $out $in') master_ninja.rule( 'alink_thin', description='AR $out', command='rm -f $out && $ar rcsT $arflags $out $in') # This allows targets that only need to depend on $lib's API to declare an # order-only dependency on $lib.TOC and avoid relinking such downstream # dependencies when $lib changes only in non-public ways. # The resulting string leaves an uninterpolated %{suffix} which # is used in the final substitution below. mtime_preserving_solink_base = ( 'if [ ! -e $lib -o ! -e $lib.TOC ]; then ' '%(solink)s && %(extract_toc)s > $lib.TOC; else ' '%(solink)s && %(extract_toc)s > $lib.tmp && ' 'if ! 
cmp -s $lib.tmp $lib.TOC; then mv $lib.tmp $lib.TOC ; ' 'fi; fi' % { 'solink': '$ld -shared $ldflags -o $lib -Wl,-soname=$soname %(suffix)s', 'extract_toc': ('{ $readelf -d $lib | grep SONAME ; ' '$nm -gD -f p $lib | cut -f1-2 -d\' \'; }')}) master_ninja.rule( 'solink', description='SOLINK $lib', restat=True, command=mtime_preserving_solink_base % {'suffix': '@$link_file_list'}, rspfile='$link_file_list', rspfile_content= '-Wl,--whole-archive $in $solibs -Wl,--no-whole-archive $libs', pool='link_pool') master_ninja.rule( 'solink_module', description='SOLINK(module) $lib', restat=True, command=mtime_preserving_solink_base % {'suffix': '@$link_file_list'}, rspfile='$link_file_list', rspfile_content='-Wl,--start-group $in -Wl,--end-group $solibs $libs', pool='link_pool') master_ninja.rule( 'link', description='LINK $out', command=('$ld $ldflags -o $out ' '-Wl,--start-group $in -Wl,--end-group $solibs $libs'), pool='link_pool') elif flavor == 'win': master_ninja.rule( 'alink', description='LIB $out', command=('%s gyp-win-tool link-wrapper $arch False ' '$ar /nologo /ignore:4221 /OUT:$out @$out.rsp' % sys.executable), rspfile='$out.rsp', rspfile_content='$in_newline $libflags') _AddWinLinkRules(master_ninja, embed_manifest=True) _AddWinLinkRules(master_ninja, embed_manifest=False) else: master_ninja.rule( 'objc', description='OBJC $out', command=('$cc -MMD -MF $out.d $defines $includes $cflags $cflags_objc ' '$cflags_pch_objc -c $in -o $out'), depfile='$out.d', deps=deps) master_ninja.rule( 'objcxx', description='OBJCXX $out', command=('$cxx -MMD -MF $out.d $defines $includes $cflags $cflags_objcc ' '$cflags_pch_objcc -c $in -o $out'), depfile='$out.d', deps=deps) master_ninja.rule( 'alink', description='LIBTOOL-STATIC $out, POSTBUILDS', command='rm -f $out && ' './gyp-mac-tool filter-libtool libtool $libtool_flags ' '-static -o $out $in' '$postbuilds') master_ninja.rule( 'lipo', description='LIPO $out, POSTBUILDS', command='rm -f $out && lipo -create $in -output $out$postbuilds') master_ninja.rule( 'solipo', description='SOLIPO $out, POSTBUILDS', command=( 'rm -f $lib $lib.TOC && lipo -create $in -output $lib$postbuilds &&' '%(extract_toc)s > $lib.TOC' % { 'extract_toc': '{ otool -l $lib | grep LC_ID_DYLIB -A 5; ' 'nm -gP $lib | cut -f1-2 -d\' \' | grep -v U$$; true; }'})) # Record the public interface of $lib in $lib.TOC. See the corresponding # comment in the posix section above for details. solink_base = '$ld %(type)s $ldflags -o $lib %(suffix)s' mtime_preserving_solink_base = ( 'if [ ! -e $lib -o ! -e $lib.TOC ] || ' # Always force dependent targets to relink if this library # reexports something. Handling this correctly would require # recursive TOC dumping but this is rare in practice, so punt. 'otool -l $lib | grep -q LC_REEXPORT_DYLIB ; then ' '%(solink)s && %(extract_toc)s > $lib.TOC; ' 'else ' '%(solink)s && %(extract_toc)s > $lib.tmp && ' 'if ! 
cmp -s $lib.tmp $lib.TOC; then ' 'mv $lib.tmp $lib.TOC ; ' 'fi; ' 'fi' % { 'solink': solink_base, 'extract_toc': '{ otool -l $lib | grep LC_ID_DYLIB -A 5; ' 'nm -gP $lib | cut -f1-2 -d\' \' | grep -v U$$; true; }'}) solink_suffix = '@$link_file_list$postbuilds' master_ninja.rule( 'solink', description='SOLINK $lib, POSTBUILDS', restat=True, command=mtime_preserving_solink_base % {'suffix': solink_suffix, 'type': '-shared'}, rspfile='$link_file_list', rspfile_content='$in $solibs $libs', pool='link_pool') master_ninja.rule( 'solink_notoc', description='SOLINK $lib, POSTBUILDS', restat=True, command=solink_base % {'suffix': solink_suffix, 'type': '-shared'}, rspfile='$link_file_list', rspfile_content='$in $solibs $libs', pool='link_pool') master_ninja.rule( 'solink_module', description='SOLINK(module) $lib, POSTBUILDS', restat=True, command=mtime_preserving_solink_base % {'suffix': solink_suffix, 'type': '-bundle'}, rspfile='$link_file_list', rspfile_content='$in $solibs $libs', pool='link_pool') master_ninja.rule( 'solink_module_notoc', description='SOLINK(module) $lib, POSTBUILDS', restat=True, command=solink_base % {'suffix': solink_suffix, 'type': '-bundle'}, rspfile='$link_file_list', rspfile_content='$in $solibs $libs', pool='link_pool') master_ninja.rule( 'link', description='LINK $out, POSTBUILDS', command=('$ld $ldflags -o $out ' '$in $solibs $libs$postbuilds'), pool='link_pool') master_ninja.rule( 'preprocess_infoplist', description='PREPROCESS INFOPLIST $out', command=('$cc -E -P -Wno-trigraphs -x c $defines $in -o $out && ' 'plutil -convert xml1 $out $out')) master_ninja.rule( 'copy_infoplist', description='COPY INFOPLIST $in', command='$env ./gyp-mac-tool copy-info-plist $in $out $binary $keys') master_ninja.rule( 'merge_infoplist', description='MERGE INFOPLISTS $in', command='$env ./gyp-mac-tool merge-info-plist $out $in') master_ninja.rule( 'compile_xcassets', description='COMPILE XCASSETS $in', command='$env ./gyp-mac-tool compile-xcassets $keys $in') master_ninja.rule( 'mac_tool', description='MACTOOL $mactool_cmd $in', command='$env ./gyp-mac-tool $mactool_cmd $in $out $binary') master_ninja.rule( 'package_framework', description='PACKAGE FRAMEWORK $out, POSTBUILDS', command='./gyp-mac-tool package-framework $out $version$postbuilds ' '&& touch $out') if flavor == 'win': master_ninja.rule( 'stamp', description='STAMP $out', command='%s gyp-win-tool stamp $out' % sys.executable) master_ninja.rule( 'copy', description='COPY $in $out', command='%s gyp-win-tool recursive-mirror $in $out' % sys.executable) else: master_ninja.rule( 'stamp', description='STAMP $out', command='${postbuilds}touch $out') master_ninja.rule( 'copy', description='COPY $in $out', command='rm -rf $out && cp -af $in $out') master_ninja.newline() all_targets = set() for build_file in params['build_files']: for target in gyp.common.AllTargets(target_list, target_dicts, os.path.normpath(build_file)): all_targets.add(target) all_outputs = set() # target_outputs is a map from qualified target name to a Target object. target_outputs = {} # target_short_names is a map from target short name to a list of Target # objects. target_short_names = {} # Short names of targets that were skipped because they didn't contain anything # interesting. # NOTE: there may be overlap between this and non_empty_target_names. empty_target_names = set() # Set of non-empty short target names. # NOTE: there may be overlap between this and empty_target_names.
non_empty_target_names = set() for qualified_target in target_list: # qualified_target is like: third_party/icu/icu.gyp:icui18n#target build_file, name, toolset = \ gyp.common.ParseQualifiedTarget(qualified_target) this_make_global_settings = data[build_file].get('make_global_settings', []) assert make_global_settings == this_make_global_settings, ( "make_global_settings needs to be the same for all targets. %s vs. %s" % (this_make_global_settings, make_global_settings)) spec = target_dicts[qualified_target] if flavor == 'mac': gyp.xcode_emulation.MergeGlobalXcodeSettingsToSpec(data[build_file], spec) # If build_file is a symlink, we must not follow it because there's a chance # it could point to a path above toplevel_dir, and we cannot correctly deal # with that case at the moment. build_file = gyp.common.RelativePath(build_file, options.toplevel_dir, False) qualified_target_for_hash = gyp.common.QualifiedTarget(build_file, name, toolset) hash_for_rules = hashlib.md5(qualified_target_for_hash).hexdigest() base_path = os.path.dirname(build_file) obj = 'obj' if toolset != 'target': obj += '.' + toolset output_file = os.path.join(obj, base_path, name + '.ninja') ninja_output = StringIO() writer = NinjaWriter(hash_for_rules, target_outputs, base_path, build_dir, ninja_output, toplevel_build, output_file, flavor, toplevel_dir=options.toplevel_dir) target = writer.WriteSpec(spec, config_name, generator_flags) if ninja_output.tell() > 0: # Only create files for ninja files that actually have contents. with OpenOutput(os.path.join(toplevel_build, output_file)) as ninja_file: ninja_file.write(ninja_output.getvalue()) ninja_output.close() master_ninja.subninja(output_file) if target: if name != target.FinalOutput() and spec['toolset'] == 'target': target_short_names.setdefault(name, []).append(target) target_outputs[qualified_target] = target if qualified_target in all_targets: all_outputs.add(target.FinalOutput()) non_empty_target_names.add(name) else: empty_target_names.add(name) if target_short_names: # Write a short name to build this target. This benefits both the # "build chrome" case and the gyp tests, which expect to be # able to run actions and build libraries by their short name. master_ninja.newline() master_ninja.comment('Short names for targets.') for short_name in target_short_names: master_ninja.build(short_name, 'phony', [x.FinalOutput() for x in target_short_names[short_name]]) # Write phony targets for any empty targets that weren't written yet. As # short names are not necessarily unique, only do this for short names that # haven't already been output for another target. empty_target_names = empty_target_names - non_empty_target_names if empty_target_names: master_ninja.newline() master_ninja.comment('Empty targets (output for completeness).') for name in sorted(empty_target_names): master_ninja.build(name, 'phony') if all_outputs: master_ninja.newline() master_ninja.build('all', 'phony', list(all_outputs)) master_ninja.default(generator_flags.get('default_target', 'all')) master_ninja_file.close() def PerformBuild(data, configurations, params): options = params['options'] for config in configurations: builddir = os.path.join(options.toplevel_dir, 'out', config) arguments = ['ninja', '-C', builddir] print 'Building [%s]: %s' % (config, arguments) subprocess.check_call(arguments) def CallGenerateOutputForConfig(arglist): # Ignore the interrupt signal so that the parent process catches it and # kills all multiprocessing children.
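# Each worker ignores SIGINT before doing any work; the KeyboardInterrupt handler in GenerateOutput below is then responsible for calling pool.terminate() to shut the workers down.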
signal.signal(signal.SIGINT, signal.SIG_IGN) (target_list, target_dicts, data, params, config_name) = arglist GenerateOutputForConfig(target_list, target_dicts, data, params, config_name) def GenerateOutput(target_list, target_dicts, data, params): # Update target_dicts for iOS device builds. target_dicts = gyp.xcode_emulation.CloneConfigurationForDeviceAndEmulator( target_dicts) user_config = params.get('generator_flags', {}).get('config', None) if gyp.common.GetFlavor(params) == 'win': target_list, target_dicts = MSVSUtil.ShardTargets(target_list, target_dicts) target_list, target_dicts = MSVSUtil.InsertLargePdbShims( target_list, target_dicts, generator_default_variables) if user_config: GenerateOutputForConfig(target_list, target_dicts, data, params, user_config) else: config_names = target_dicts[target_list[0]]['configurations'].keys() if params['parallel']: try: pool = multiprocessing.Pool(len(config_names)) arglists = [] for config_name in config_names: arglists.append( (target_list, target_dicts, data, params, config_name)) pool.map(CallGenerateOutputForConfig, arglists) except KeyboardInterrupt, e: pool.terminate() raise e else: for config_name in config_names: GenerateOutputForConfig(target_list, target_dicts, data, params, config_name) npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/ninja_test.py0000644000000000000000000000337212631326456025111 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ Unit tests for the ninja.py file. """ import gyp.generator.ninja as ninja import unittest import StringIO import sys import TestCommon class TestPrefixesAndSuffixes(unittest.TestCase): def test_BinaryNamesWindows(self): # These cannot run on non-Windows as they require a VS installation to # correctly handle variable expansion. if sys.platform.startswith('win'): writer = ninja.NinjaWriter('foo', 'wee', '.', '.', 'build.ninja', '.', 'build.ninja', 'win') spec = { 'target_name': 'wee' } self.assertTrue(writer.ComputeOutputFileName(spec, 'executable'). endswith('.exe')) self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library'). endswith('.dll')) self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library'). endswith('.lib')) def test_BinaryNamesLinux(self): writer = ninja.NinjaWriter('foo', 'wee', '.', '.', 'build.ninja', '.', 'build.ninja', 'linux') spec = { 'target_name': 'wee' } self.assertTrue('.' not in writer.ComputeOutputFileName(spec, 'executable')) self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library'). startswith('lib')) self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library'). startswith('lib')) self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library'). endswith('.so')) self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library'). endswith('.a')) if __name__ == '__main__': unittest.main() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/xcode.py0000644000000000000000000016175012631326456024062 0ustar 00000000000000# Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. 
import filecmp import gyp.common import gyp.xcodeproj_file import gyp.xcode_ninja import errno import os import sys import posixpath import re import shutil import subprocess import tempfile # Project files generated by this module will use _intermediate_var as a # custom Xcode setting whose value is a DerivedSources-like directory that's # project-specific and configuration-specific. The normal choice, # DERIVED_FILE_DIR, is target-specific, which is thought to be too restrictive # as it is likely that multiple targets within a single project file will want # to access the same set of generated files. The other option, # PROJECT_DERIVED_FILE_DIR, is unsuitable because while it is project-specific, # it is not configuration-specific. INTERMEDIATE_DIR is defined as # $(PROJECT_DERIVED_FILE_DIR)/$(CONFIGURATION). _intermediate_var = 'INTERMEDIATE_DIR' # SHARED_INTERMEDIATE_DIR is the same, except that it is shared among all # targets that share the same BUILT_PRODUCTS_DIR. _shared_intermediate_var = 'SHARED_INTERMEDIATE_DIR' _library_search_paths_var = 'LIBRARY_SEARCH_PATHS' generator_default_variables = { 'EXECUTABLE_PREFIX': '', 'EXECUTABLE_SUFFIX': '', 'STATIC_LIB_PREFIX': 'lib', 'SHARED_LIB_PREFIX': 'lib', 'STATIC_LIB_SUFFIX': '.a', 'SHARED_LIB_SUFFIX': '.dylib', # INTERMEDIATE_DIR is a place for targets to build up intermediate products. # It is specific to each build environment. It is only guaranteed to exist # and be constant within the context of a project, corresponding to a single # input file. Some build environments may allow their intermediate directory # to be shared on a wider scale, but this is not guaranteed. 'INTERMEDIATE_DIR': '$(%s)' % _intermediate_var, 'OS': 'mac', 'PRODUCT_DIR': '$(BUILT_PRODUCTS_DIR)', 'LIB_DIR': '$(BUILT_PRODUCTS_DIR)', 'RULE_INPUT_ROOT': '$(INPUT_FILE_BASE)', 'RULE_INPUT_EXT': '$(INPUT_FILE_SUFFIX)', 'RULE_INPUT_NAME': '$(INPUT_FILE_NAME)', 'RULE_INPUT_PATH': '$(INPUT_FILE_PATH)', 'RULE_INPUT_DIRNAME': '$(INPUT_FILE_DIRNAME)', 'SHARED_INTERMEDIATE_DIR': '$(%s)' % _shared_intermediate_var, 'CONFIGURATION_NAME': '$(CONFIGURATION)', } # The Xcode-specific sections that hold paths. generator_additional_path_sections = [ 'mac_bundle_resources', 'mac_framework_headers', 'mac_framework_private_headers', # 'mac_framework_dirs', input already handles _dirs endings. ] # The Xcode-specific keys that exist on targets and aren't moved down to # configurations. generator_additional_non_configuration_keys = [ 'ios_app_extension', 'ios_watch_app', 'ios_watchkit_extension', 'mac_bundle', 'mac_bundle_resources', 'mac_framework_headers', 'mac_framework_private_headers', 'mac_xctest_bundle', 'xcode_create_dependents_test_runner', ] # We want to let any rules apply to files that are resources also. generator_extra_sources_for_rules = [ 'mac_bundle_resources', 'mac_framework_headers', 'mac_framework_private_headers', ] generator_filelist_paths = None # Xcode's standard set of library directories, which don't need to be duplicated # in LIBRARY_SEARCH_PATHS. This list is not exhaustive, but that's okay. 
xcode_standard_library_dirs = frozenset([ '$(SDKROOT)/usr/lib', '$(SDKROOT)/usr/local/lib', ]) def CreateXCConfigurationList(configuration_names): xccl = gyp.xcodeproj_file.XCConfigurationList({'buildConfigurations': []}) if len(configuration_names) == 0: configuration_names = ['Default'] for configuration_name in configuration_names: xcbc = gyp.xcodeproj_file.XCBuildConfiguration({ 'name': configuration_name}) xccl.AppendProperty('buildConfigurations', xcbc) xccl.SetProperty('defaultConfigurationName', configuration_names[0]) return xccl class XcodeProject(object): def __init__(self, gyp_path, path, build_file_dict): self.gyp_path = gyp_path self.path = path self.project = gyp.xcodeproj_file.PBXProject(path=path) projectDirPath = gyp.common.RelativePath( os.path.dirname(os.path.abspath(self.gyp_path)), os.path.dirname(path) or '.') self.project.SetProperty('projectDirPath', projectDirPath) self.project_file = \ gyp.xcodeproj_file.XCProjectFile({'rootObject': self.project}) self.build_file_dict = build_file_dict # TODO(mark): add destructor that cleans up self.path if created_dir is # True and things didn't complete successfully. Or do something even # better with "try"? self.created_dir = False try: os.makedirs(self.path) self.created_dir = True except OSError, e: if e.errno != errno.EEXIST: raise def Finalize1(self, xcode_targets, serialize_all_tests): # Collect a list of all of the build configuration names used by the # various targets in the file. It is very heavily advised to keep each # target in an entire project (even across multiple project files) using # the same set of configuration names. configurations = [] for xct in self.project.GetProperty('targets'): xccl = xct.GetProperty('buildConfigurationList') xcbcs = xccl.GetProperty('buildConfigurations') for xcbc in xcbcs: name = xcbc.GetProperty('name') if name not in configurations: configurations.append(name) # Replace the XCConfigurationList attached to the PBXProject object with # a new one specifying all of the configuration names used by the various # targets. try: xccl = CreateXCConfigurationList(configurations) self.project.SetProperty('buildConfigurationList', xccl) except: sys.stderr.write("Problem with gyp file %s\n" % self.gyp_path) raise # The need for this setting is explained above where _intermediate_var is # defined. The comments below about wanting to avoid project-wide build # settings apply here too, but this needs to be set on a project-wide basis # so that files relative to the _intermediate_var setting can be displayed # properly in the Xcode UI. # # Note that for configuration-relative files such as anything relative to # _intermediate_var, for the purposes of UI tree view display, Xcode will # only resolve the configuration name once, when the project file is # opened. If the active build configuration is changed, the project file # must be closed and reopened if it is desired for the tree view to update. # This is filed as Apple radar 6588391. xccl.SetBuildSetting(_intermediate_var, '$(PROJECT_DERIVED_FILE_DIR)/$(CONFIGURATION)') xccl.SetBuildSetting(_shared_intermediate_var, '$(SYMROOT)/DerivedSources/$(CONFIGURATION)') # Set user-specified project-wide build settings and config files. This # is intended to be used very sparingly. Really, almost everything should # go into target-specific build settings sections. 
The project-wide # settings are only intended to be used in cases where Xcode attempts to # resolve variable references in a project context as opposed to a target # context, such as when resolving sourceTree references while building up # the tree view for UI display. # Any values set globally are applied to all configurations, then any # per-configuration values are applied. for xck, xcv in self.build_file_dict.get('xcode_settings', {}).iteritems(): xccl.SetBuildSetting(xck, xcv) if 'xcode_config_file' in self.build_file_dict: config_ref = self.project.AddOrGetFileInRootGroup( self.build_file_dict['xcode_config_file']) xccl.SetBaseConfiguration(config_ref) build_file_configurations = self.build_file_dict.get('configurations', {}) if build_file_configurations: for config_name in configurations: build_file_configuration_named = \ build_file_configurations.get(config_name, {}) if build_file_configuration_named: xcc = xccl.ConfigurationNamed(config_name) for xck, xcv in build_file_configuration_named.get('xcode_settings', {}).iteritems(): xcc.SetBuildSetting(xck, xcv) if 'xcode_config_file' in build_file_configuration_named: config_ref = self.project.AddOrGetFileInRootGroup( build_file_configurations[config_name]['xcode_config_file']) xcc.SetBaseConfiguration(config_ref) # Sort the targets based on how they appeared in the input. # TODO(mark): Like a lot of other things here, this assumes internal # knowledge of PBXProject - in this case, of its "targets" property. # ordinary_targets are ordinary targets that are already in the project # file. run_test_targets are the targets that run unittests and should be # used for the Run All Tests target. support_targets are the action/rule # targets used by GYP file targets, just kept for the assert check. ordinary_targets = [] run_test_targets = [] support_targets = [] # targets is the full list of targets in the project. targets = [] # does it define its own "all"? has_custom_all = False # targets_for_all is the list of ordinary_targets that should be listed # in this project's "All" target. It includes each non_runtest_target # that does not have suppress_wildcard set. targets_for_all = [] for target in self.build_file_dict['targets']: target_name = target['target_name'] toolset = target['toolset'] qualified_target = gyp.common.QualifiedTarget(self.gyp_path, target_name, toolset) xcode_target = xcode_targets[qualified_target] # Make sure that the target being added to the sorted list is already in # the unsorted list. assert xcode_target in self.project._properties['targets'] targets.append(xcode_target) ordinary_targets.append(xcode_target) if xcode_target.support_target: support_targets.append(xcode_target.support_target) targets.append(xcode_target.support_target) if not int(target.get('suppress_wildcard', False)): targets_for_all.append(xcode_target) if target_name.lower() == 'all': has_custom_all = True # If this target has a 'run_as' attribute, add its target to the # targets, and add it to the test targets. if target.get('run_as'): # Make a target to run something. It should have one # dependency, the parent xcode target.
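# For example, a target 'foo_tests' with 'run_as' gains a companion aggregate target named 'Run foo_tests' that depends on 'foo_tests' itself.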
xccl = CreateXCConfigurationList(configurations) run_target = gyp.xcodeproj_file.PBXAggregateTarget({ 'name': 'Run ' + target_name, 'productName': xcode_target.GetProperty('productName'), 'buildConfigurationList': xccl, }, parent=self.project) run_target.AddDependency(xcode_target) command = target['run_as'] script = '' if command.get('working_directory'): script = script + 'cd "%s"\n' % \ gyp.xcodeproj_file.ConvertVariablesToShellSyntax( command.get('working_directory')) if command.get('environment'): script = script + "\n".join( ['export %s="%s"' % (key, gyp.xcodeproj_file.ConvertVariablesToShellSyntax(val)) for (key, val) in command.get('environment').iteritems()]) + "\n" # Some tests end up using sockets, files on disk, etc. and can get # confused if more than one test runs at a time. The generator # flag 'xcode_serialize_all_test_runs' controls whether all test # runs are forced to be serialized. It defaults to True. To get serial # runs, this little bit of Python does the same as the Linux flock # utility to make sure only one runs at a time. command_prefix = '' if serialize_all_tests: command_prefix = \ """python -c "import fcntl, subprocess, sys file = open('$TMPDIR/GYP_serialize_test_runs', 'a') fcntl.flock(file.fileno(), fcntl.LOCK_EX) sys.exit(subprocess.call(sys.argv[1:]))" """ # If we were unable to exec for some reason, we want to exit # with an error, and fix up variable references to be shell # syntax instead of xcode syntax. script = script + 'exec ' + command_prefix + '%s\nexit 1\n' % \ gyp.xcodeproj_file.ConvertVariablesToShellSyntax( gyp.common.EncodePOSIXShellList(command.get('action'))) ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({ 'shellScript': script, 'showEnvVarsInLog': 0, }) run_target.AppendProperty('buildPhases', ssbp) # Add the run target to the project file. targets.append(run_target) run_test_targets.append(run_target) xcode_target.test_runner = run_target # Make sure that the list of targets being replaced is the same length as # the one replacing it, but allow for the added test runner targets. assert len(self.project._properties['targets']) == \ len(ordinary_targets) + len(support_targets) self.project._properties['targets'] = targets # Get rid of unnecessary levels of depth in groups like the Source group. self.project.RootGroupsTakeOverOnlyChildren(True) # Sort the groups nicely. Do this after sorting the targets, because the # Products group is sorted based on the order of the targets. self.project.SortGroups() # Create an "All" target if there's more than one target in this project # file and the project didn't define its own "All" target. Put a generated # "All" target first so that people opening up the project for the first # time will build everything by default. if len(targets_for_all) > 1 and not has_custom_all: xccl = CreateXCConfigurationList(configurations) all_target = gyp.xcodeproj_file.PBXAggregateTarget( { 'buildConfigurationList': xccl, 'name': 'All', }, parent=self.project) for target in targets_for_all: all_target.AddDependency(target) # TODO(mark): This is evil because it relies on internal knowledge of # PBXProject._properties. It's important to get the "All" target first, # though. self.project._properties['targets'].insert(0, all_target) # The same, but for run_test_targets.
if len(run_test_targets) > 1: xccl = CreateXCConfigurationList(configurations) run_all_tests_target = gyp.xcodeproj_file.PBXAggregateTarget( { 'buildConfigurationList': xccl, 'name': 'Run All Tests', }, parent=self.project) for run_test_target in run_test_targets: run_all_tests_target.AddDependency(run_test_target) # Insert after the "All" target, which must exist if there is more than # one run_test_target. self.project._properties['targets'].insert(1, run_all_tests_target) def Finalize2(self, xcode_targets, xcode_target_to_target_dict): # Finalize2 needs to happen in a separate step because the process of # updating references to other projects depends on the ordering of targets # within remote project files. Finalize1 is responsible for sorting duty, # and once all project files are sorted, Finalize2 can come in and update # these references. # To support making a "test runner" target that will run all the tests # that are direct dependents of any given target, we look for # xcode_create_dependents_test_runner being set on an Aggregate target, # and generate a second target that will run the tests runners found under # the marked target. for bf_tgt in self.build_file_dict['targets']: if int(bf_tgt.get('xcode_create_dependents_test_runner', 0)): tgt_name = bf_tgt['target_name'] toolset = bf_tgt['toolset'] qualified_target = gyp.common.QualifiedTarget(self.gyp_path, tgt_name, toolset) xcode_target = xcode_targets[qualified_target] if isinstance(xcode_target, gyp.xcodeproj_file.PBXAggregateTarget): # Collect all the run test targets. all_run_tests = [] pbxtds = xcode_target.GetProperty('dependencies') for pbxtd in pbxtds: pbxcip = pbxtd.GetProperty('targetProxy') dependency_xct = pbxcip.GetProperty('remoteGlobalIDString') if hasattr(dependency_xct, 'test_runner'): all_run_tests.append(dependency_xct.test_runner) # Directly depend on all the runners as they depend on the target # that builds them. if len(all_run_tests) > 0: run_all_target = gyp.xcodeproj_file.PBXAggregateTarget({ 'name': 'Run %s Tests' % tgt_name, 'productName': tgt_name, }, parent=self.project) for run_test_target in all_run_tests: run_all_target.AddDependency(run_test_target) # Insert the test runner after the related target. idx = self.project._properties['targets'].index(xcode_target) self.project._properties['targets'].insert(idx + 1, run_all_target) # Update all references to other projects, to make sure that the lists of # remote products are complete. Otherwise, Xcode will fill them in when # it opens the project file, which will result in unnecessary diffs. # TODO(mark): This is evil because it relies on internal knowledge of # PBXProject._other_pbxprojects. for other_pbxproject in self.project._other_pbxprojects.keys(): self.project.AddOrGetProjectReference(other_pbxproject) self.project.SortRemoteProductReferences() # Give everything an ID. self.project_file.ComputeIDs() # Make sure that no two objects in the project file have the same ID. If # multiple objects wind up with the same ID, upon loading the file, Xcode # will only recognize one object (the last one in the file?) and the # results are unpredictable. self.project_file.EnsureNoIDCollisions() def Write(self): # Write the project file to a temporary location first. Xcode watches for # changes to the project file and presents a UI sheet offering to reload # the project when it does change. However, in some cases, especially when # multiple projects are open or when Xcode is busy, things don't work so # seamlessly. 
Sometimes, Xcode is able to detect that a project file has # changed but can't unload it because something else is referencing it. # To mitigate this problem, and to avoid even having Xcode present the UI # sheet when an open project is rewritten for inconsequential changes, the # project file is written to a temporary file in the xcodeproj directory # first. The new temporary file is then compared to the existing project # file, if any. If they differ, the new file replaces the old; otherwise, # the new project file is simply deleted. Xcode properly detects a file # being renamed over an open project file as a change and so it remains # able to present the "project file changed" sheet under this system. # Writing to a temporary file first also avoids the possible problem of # Xcode rereading an incomplete project file. (output_fd, new_pbxproj_path) = \ tempfile.mkstemp(suffix='.tmp', prefix='project.pbxproj.gyp.', dir=self.path) try: output_file = os.fdopen(output_fd, 'wb') self.project_file.Print(output_file) output_file.close() pbxproj_path = os.path.join(self.path, 'project.pbxproj') same = False try: same = filecmp.cmp(pbxproj_path, new_pbxproj_path, False) except OSError, e: if e.errno != errno.ENOENT: raise if same: # The new file is identical to the old one, just get rid of the new # one. os.unlink(new_pbxproj_path) else: # The new file is different from the old one, or there is no old one. # Rename the new file to the permanent name. # # tempfile.mkstemp uses an overly restrictive mode, resulting in a # file that can only be read by the owner, regardless of the umask. # There's no reason to not respect the umask here, which means that # an extra hoop is required to fetch it and reset the new file's mode. # # No way to get the umask without setting a new one? Set a safe one # and then set it back to the old value. umask = os.umask(077) os.umask(umask) os.chmod(new_pbxproj_path, 0666 & ~umask) os.rename(new_pbxproj_path, pbxproj_path) except Exception: # Don't leave turds behind. In fact, if this code was responsible for # creating the xcodeproj directory, get rid of that too. os.unlink(new_pbxproj_path) if self.created_dir: shutil.rmtree(self.path, True) raise def AddSourceToTarget(source, type, pbxp, xct): # TODO(mark): Perhaps source_extensions and library_extensions can be made a # little bit fancier. source_extensions = ['c', 'cc', 'cpp', 'cxx', 'm', 'mm', 's', 'swift'] # .o is conceptually more of a "source" than a "library," but Xcode thinks # of "sources" as things to compile and "libraries" (or "frameworks") as # things to link with. Adding an object file to an Xcode target's frameworks # phase works properly. library_extensions = ['a', 'dylib', 'framework', 'o'] basename = posixpath.basename(source) (root, ext) = posixpath.splitext(basename) if ext: ext = ext[1:].lower() if ext in source_extensions and type != 'none': xct.SourcesPhase().AddFile(source) elif ext in library_extensions and type != 'none': xct.FrameworksPhase().AddFile(source) else: # Files that aren't added to a sources or frameworks build phase can still # go into the project file, just not as part of a build phase. pbxp.AddOrGetFileInRootGroup(source) def AddResourceToTarget(resource, pbxp, xct): # TODO(mark): Combine with AddSourceToTarget above? Or just inline this call # where it's used. xct.ResourcesPhase().AddFile(resource) def AddHeaderToTarget(header, pbxp, xct, is_public): # TODO(mark): Combine with AddSourceToTarget above? Or just inline this call # where it's used. 
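# (A condensed sketch of the write-if-changed strategy used by Write() above,
# with `path` and `dump` as illustrative stand-ins:
#
#   fd, tmp = tempfile.mkstemp(suffix='.tmp', dir=path)
#   with os.fdopen(fd, 'wb') as f:
#     dump(f)
#   target = os.path.join(path, 'project.pbxproj')
#   if os.path.exists(target) and filecmp.cmp(target, tmp, False):
#     os.unlink(tmp)              # identical: leave the old file untouched
#   else:
#     umask = os.umask(077)       # widen mkstemp's restrictive mode
#     os.umask(umask)
#     os.chmod(tmp, 0666 & ~umask)
#     os.rename(tmp, target)      # atomic on POSIX filesystems
# )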
settings = '{ATTRIBUTES = (%s, ); }' % ('Private', 'Public')[is_public] xct.HeadersPhase().AddFile(header, settings) _xcode_variable_re = re.compile(r'(\$\((.*?)\))') def ExpandXcodeVariables(string, expansions): """Expands Xcode-style $(VARIABLES) in string per the expansions dict. In some rare cases, it is appropriate to expand Xcode variables when a project file is generated. For any substring $(VAR) in string, if VAR is a key in the expansions dict, $(VAR) will be replaced with expansions[VAR]. Any $(VAR) substring in string for which VAR is not a key in the expansions dict will remain in the returned string. """ matches = _xcode_variable_re.findall(string) if matches == None: return string matches.reverse() for match in matches: (to_replace, variable) = match if not variable in expansions: continue replacement = expansions[variable] string = re.sub(re.escape(to_replace), replacement, string) return string _xcode_define_re = re.compile(r'([\\\"\' ])') def EscapeXcodeDefine(s): """We must escape the defines that we give to XCode so that it knows not to split on spaces and to respect backslash and quote literals. However, we must not quote the define, or Xcode will incorrectly intepret variables especially $(inherited).""" return re.sub(_xcode_define_re, r'\\\1', s) def PerformBuild(data, configurations, params): options = params['options'] for build_file, build_file_dict in data.iteritems(): (build_file_root, build_file_ext) = os.path.splitext(build_file) if build_file_ext != '.gyp': continue xcodeproj_path = build_file_root + options.suffix + '.xcodeproj' if options.generator_output: xcodeproj_path = os.path.join(options.generator_output, xcodeproj_path) for config in configurations: arguments = ['xcodebuild', '-project', xcodeproj_path] arguments += ['-configuration', config] print "Building [%s]: %s" % (config, arguments) subprocess.check_call(arguments) def CalculateGeneratorInputInfo(params): toplevel = params['options'].toplevel_dir if params.get('flavor') == 'ninja': generator_dir = os.path.relpath(params['options'].generator_output or '.') output_dir = params.get('generator_flags', {}).get('output_dir', 'out') output_dir = os.path.normpath(os.path.join(generator_dir, output_dir)) qualified_out_dir = os.path.normpath(os.path.join( toplevel, output_dir, 'gypfiles-xcode-ninja')) else: output_dir = os.path.normpath(os.path.join(toplevel, 'xcodebuild')) qualified_out_dir = os.path.normpath(os.path.join( toplevel, output_dir, 'gypfiles')) global generator_filelist_paths generator_filelist_paths = { 'toplevel': toplevel, 'qualified_out_dir': qualified_out_dir, } def GenerateOutput(target_list, target_dicts, data, params): # Optionally configure each spec to use ninja as the external builder. ninja_wrapper = params.get('flavor') == 'ninja' if ninja_wrapper: (target_list, target_dicts, data) = \ gyp.xcode_ninja.CreateWrapper(target_list, target_dicts, data, params) options = params['options'] generator_flags = params.get('generator_flags', {}) parallel_builds = generator_flags.get('xcode_parallel_builds', True) serialize_all_tests = \ generator_flags.get('xcode_serialize_all_test_runs', True) upgrade_check_project_version = \ generator_flags.get('xcode_upgrade_check_project_version', None) # Format upgrade_check_project_version with leading zeros as needed. 
if upgrade_check_project_version: upgrade_check_project_version = str(upgrade_check_project_version) while len(upgrade_check_project_version) < 4: upgrade_check_project_version = '0' + upgrade_check_project_version skip_excluded_files = \ not generator_flags.get('xcode_list_excluded_files', True) xcode_projects = {} for build_file, build_file_dict in data.iteritems(): (build_file_root, build_file_ext) = os.path.splitext(build_file) if build_file_ext != '.gyp': continue xcodeproj_path = build_file_root + options.suffix + '.xcodeproj' if options.generator_output: xcodeproj_path = os.path.join(options.generator_output, xcodeproj_path) xcp = XcodeProject(build_file, xcodeproj_path, build_file_dict) xcode_projects[build_file] = xcp pbxp = xcp.project # Set project-level attributes from multiple options project_attributes = {}; if parallel_builds: project_attributes['BuildIndependentTargetsInParallel'] = 'YES' if upgrade_check_project_version: project_attributes['LastUpgradeCheck'] = upgrade_check_project_version project_attributes['LastTestingUpgradeCheck'] = \ upgrade_check_project_version project_attributes['LastSwiftUpdateCheck'] = \ upgrade_check_project_version pbxp.SetProperty('attributes', project_attributes) # Add gyp/gypi files to project if not generator_flags.get('standalone'): main_group = pbxp.GetProperty('mainGroup') build_group = gyp.xcodeproj_file.PBXGroup({'name': 'Build'}) main_group.AppendChild(build_group) for included_file in build_file_dict['included_files']: build_group.AddOrGetFileByPath(included_file, False) xcode_targets = {} xcode_target_to_target_dict = {} for qualified_target in target_list: [build_file, target_name, toolset] = \ gyp.common.ParseQualifiedTarget(qualified_target) spec = target_dicts[qualified_target] if spec['toolset'] != 'target': raise Exception( 'Multiple toolsets not supported in xcode build (target %s)' % qualified_target) configuration_names = [spec['default_configuration']] for configuration_name in sorted(spec['configurations'].keys()): if configuration_name not in configuration_names: configuration_names.append(configuration_name) xcp = xcode_projects[build_file] pbxp = xcp.project # Set up the configurations for the target according to the list of names # supplied. xccl = CreateXCConfigurationList(configuration_names) # Create an XCTarget subclass object for the target. The type with # "+bundle" appended will be used if the target has "mac_bundle" set. # loadable_modules not in a mac_bundle are mapped to # com.googlecode.gyp.xcode.bundle, a pseudo-type that xcode.py interprets # to create a single-file mh_bundle. 
_types = { 'executable': 'com.apple.product-type.tool', 'loadable_module': 'com.googlecode.gyp.xcode.bundle', 'shared_library': 'com.apple.product-type.library.dynamic', 'static_library': 'com.apple.product-type.library.static', 'mac_kernel_extension': 'com.apple.product-type.kernel-extension', 'executable+bundle': 'com.apple.product-type.application', 'loadable_module+bundle': 'com.apple.product-type.bundle', 'loadable_module+xctest': 'com.apple.product-type.bundle.unit-test', 'shared_library+bundle': 'com.apple.product-type.framework', 'executable+extension+bundle': 'com.apple.product-type.app-extension', 'executable+watch+extension+bundle': 'com.apple.product-type.watchkit-extension', 'executable+watch+bundle': 'com.apple.product-type.application.watchapp', 'mac_kernel_extension+bundle': 'com.apple.product-type.kernel-extension', } target_properties = { 'buildConfigurationList': xccl, 'name': target_name, } type = spec['type'] is_xctest = int(spec.get('mac_xctest_bundle', 0)) is_bundle = int(spec.get('mac_bundle', 0)) or is_xctest is_app_extension = int(spec.get('ios_app_extension', 0)) is_watchkit_extension = int(spec.get('ios_watchkit_extension', 0)) is_watch_app = int(spec.get('ios_watch_app', 0)) if type != 'none': type_bundle_key = type if is_xctest: type_bundle_key += '+xctest' assert type == 'loadable_module', ( 'mac_xctest_bundle targets must have type loadable_module ' '(target %s)' % target_name) elif is_app_extension: assert is_bundle, ('ios_app_extension flag requires mac_bundle ' '(target %s)' % target_name) type_bundle_key += '+extension+bundle' elif is_watchkit_extension: assert is_bundle, ('ios_watchkit_extension flag requires mac_bundle ' '(target %s)' % target_name) type_bundle_key += '+watch+extension+bundle' elif is_watch_app: assert is_bundle, ('ios_watch_app flag requires mac_bundle ' '(target %s)' % target_name) type_bundle_key += '+watch+bundle' elif is_bundle: type_bundle_key += '+bundle' xctarget_type = gyp.xcodeproj_file.PBXNativeTarget try: target_properties['productType'] = _types[type_bundle_key] except KeyError, e: gyp.common.ExceptionAppend(e, "-- unknown product type while " "writing target %s" % target_name) raise else: xctarget_type = gyp.xcodeproj_file.PBXAggregateTarget assert not is_bundle, ( 'mac_bundle targets cannot have type none (target "%s")' % target_name) assert not is_xctest, ( 'mac_xctest_bundle targets cannot have type none (target "%s")' % target_name) target_product_name = spec.get('product_name') if target_product_name is not None: target_properties['productName'] = target_product_name xct = xctarget_type(target_properties, parent=pbxp, force_outdir=spec.get('product_dir'), force_prefix=spec.get('product_prefix'), force_extension=spec.get('product_extension')) pbxp.AppendProperty('targets', xct) xcode_targets[qualified_target] = xct xcode_target_to_target_dict[xct] = spec spec_actions = spec.get('actions', []) spec_rules = spec.get('rules', []) # Xcode has some "issues" with checking dependencies for the "Compile # sources" step with any source files/headers generated by actions/rules. # To work around this, if a target is building anything directly (not # type "none"), then a second target is used to run the GYP actions/rules # and is made a dependency of this target. This way the work is done # before the dependency checks for what should be recompiled. support_xct = None # The Xcode "issues" don't affect xcode-ninja builds, since the dependency # logic all happens in ninja. Don't bother creating the extra targets in # that case. 
if type != 'none' and (spec_actions or spec_rules) and not ninja_wrapper: support_xccl = CreateXCConfigurationList(configuration_names); support_target_suffix = generator_flags.get( 'support_target_suffix', ' Support') support_target_properties = { 'buildConfigurationList': support_xccl, 'name': target_name + support_target_suffix, } if target_product_name: support_target_properties['productName'] = \ target_product_name + ' Support' support_xct = \ gyp.xcodeproj_file.PBXAggregateTarget(support_target_properties, parent=pbxp) pbxp.AppendProperty('targets', support_xct) xct.AddDependency(support_xct) # Hang the support target off the main target so it can be tested/found # by the generator during Finalize. xct.support_target = support_xct prebuild_index = 0 # Add custom shell script phases for "actions" sections. for action in spec_actions: # There's no need to write anything into the script to ensure that the # output directories already exist, because Xcode will look at the # declared outputs and automatically ensure that they exist for us. # Do we have a message to print when this action runs? message = action.get('message') if message: message = 'echo note: ' + gyp.common.EncodePOSIXShellArgument(message) else: message = '' # Turn the list into a string that can be passed to a shell. action_string = gyp.common.EncodePOSIXShellList(action['action']) # Convert Xcode-type variable references to sh-compatible environment # variable references. message_sh = gyp.xcodeproj_file.ConvertVariablesToShellSyntax(message) action_string_sh = gyp.xcodeproj_file.ConvertVariablesToShellSyntax( action_string) script = '' # Include the optional message if message_sh: script += message_sh + '\n' # Be sure the script runs in exec, and that if exec fails, the script # exits signalling an error. script += 'exec ' + action_string_sh + '\nexit 1\n' ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({ 'inputPaths': action['inputs'], 'name': 'Action "' + action['action_name'] + '"', 'outputPaths': action['outputs'], 'shellScript': script, 'showEnvVarsInLog': 0, }) if support_xct: support_xct.AppendProperty('buildPhases', ssbp) else: # TODO(mark): this assumes too much knowledge of the internals of # xcodeproj_file; some of these smarts should move into xcodeproj_file # itself. xct._properties['buildPhases'].insert(prebuild_index, ssbp) prebuild_index = prebuild_index + 1 # TODO(mark): Should verify that at most one of these is specified. if int(action.get('process_outputs_as_sources', False)): for output in action['outputs']: AddSourceToTarget(output, type, pbxp, xct) if int(action.get('process_outputs_as_mac_bundle_resources', False)): for output in action['outputs']: AddResourceToTarget(output, pbxp, xct) # tgt_mac_bundle_resources holds the list of bundle resources so # the rule processing can check against it. if is_bundle: tgt_mac_bundle_resources = spec.get('mac_bundle_resources', []) else: tgt_mac_bundle_resources = [] # Add custom shell script phases driving "make" for "rules" sections. # # Xcode's built-in rule support is almost powerful enough to use directly, # but there are a few significant deficiencies that render them unusable. # There are workarounds for some of its inadequacies, but in aggregate, # the workarounds added complexity to the generator, and some workarounds # actually require input files to be crafted more carefully than I'd like. 
# Consequently, until Xcode rules are made more capable, "rules" input # sections will be handled in Xcode output by shell script build phases # performed prior to the compilation phase. # # The following problems with Xcode rules were found. The numbers are # Apple radar IDs. I hope that these shortcomings are addressed, I really # liked having the rules handled directly in Xcode during the period that # I was prototyping this. # # 6588600 Xcode compiles custom script rule outputs too soon, compilation # fails. This occurs when rule outputs from distinct inputs are # interdependent. The only workaround is to put rules and their # inputs in a separate target from the one that compiles the rule # outputs. This requires input file cooperation and it means that # process_outputs_as_sources is unusable. # 6584932 Need to declare that custom rule outputs should be excluded from # compilation. A possible workaround is to lie to Xcode about a # rule's output, giving it a dummy file it doesn't know how to # compile. The rule action script would need to touch the dummy. # 6584839 I need a way to declare additional inputs to a custom rule. # A possible workaround is a shell script phase prior to # compilation that touches a rule's primary input files if any # would-be additional inputs are newer than the output. Modifying # the source tree - even just modification times - feels dirty. # 6564240 Xcode "custom script" build rules always dump all environment # variables. This is a low-prioroty problem and is not a # show-stopper. rules_by_ext = {} for rule in spec_rules: rules_by_ext[rule['extension']] = rule # First, some definitions: # # A "rule source" is a file that was listed in a target's "sources" # list and will have a rule applied to it on the basis of matching the # rule's "extensions" attribute. Rule sources are direct inputs to # rules. # # Rule definitions may specify additional inputs in their "inputs" # attribute. These additional inputs are used for dependency tracking # purposes. # # A "concrete output" is a rule output with input-dependent variables # resolved. For example, given a rule with: # 'extension': 'ext', 'outputs': ['$(INPUT_FILE_BASE).cc'], # if the target's "sources" list contained "one.ext" and "two.ext", # the "concrete output" for rule input "two.ext" would be "two.cc". If # a rule specifies multiple outputs, each input file that the rule is # applied to will have the same number of concrete outputs. # # If any concrete outputs are outdated or missing relative to their # corresponding rule_source or to any specified additional input, the # rule action must be performed to generate the concrete outputs. # concrete_outputs_by_rule_source will have an item at the same index # as the rule['rule_sources'] that it corresponds to. Each item is a # list of all of the concrete outputs for the rule_source. concrete_outputs_by_rule_source = [] # concrete_outputs_all is a flat list of all concrete outputs that this # rule is able to produce, given the known set of input files # (rule_sources) that apply to it. concrete_outputs_all = [] # messages & actions are keyed by the same indices as rule['rule_sources'] # and concrete_outputs_by_rule_source. They contain the message and # action to perform after resolving input-dependent variables. The # message is optional, in which case None is stored for each rule source. 
messages = [] actions = [] for rule_source in rule.get('rule_sources', []): rule_source_dirname, rule_source_basename = \ posixpath.split(rule_source) (rule_source_root, rule_source_ext) = \ posixpath.splitext(rule_source_basename) # These are the same variable names that Xcode uses for its own native # rule support. Because Xcode's rule engine is not being used, they # need to be expanded as they are written to the makefile. rule_input_dict = { 'INPUT_FILE_BASE': rule_source_root, 'INPUT_FILE_SUFFIX': rule_source_ext, 'INPUT_FILE_NAME': rule_source_basename, 'INPUT_FILE_PATH': rule_source, 'INPUT_FILE_DIRNAME': rule_source_dirname, } concrete_outputs_for_this_rule_source = [] for output in rule.get('outputs', []): # Fortunately, Xcode and make both use $(VAR) format for their # variables, so the expansion is the only transformation necessary. # Any remaning $(VAR)-type variables in the string can be given # directly to make, which will pick up the correct settings from # what Xcode puts into the environment. concrete_output = ExpandXcodeVariables(output, rule_input_dict) concrete_outputs_for_this_rule_source.append(concrete_output) # Add all concrete outputs to the project. pbxp.AddOrGetFileInRootGroup(concrete_output) concrete_outputs_by_rule_source.append( \ concrete_outputs_for_this_rule_source) concrete_outputs_all.extend(concrete_outputs_for_this_rule_source) # TODO(mark): Should verify that at most one of these is specified. if int(rule.get('process_outputs_as_sources', False)): for output in concrete_outputs_for_this_rule_source: AddSourceToTarget(output, type, pbxp, xct) # If the file came from the mac_bundle_resources list or if the rule # is marked to process outputs as bundle resource, do so. was_mac_bundle_resource = rule_source in tgt_mac_bundle_resources if was_mac_bundle_resource or \ int(rule.get('process_outputs_as_mac_bundle_resources', False)): for output in concrete_outputs_for_this_rule_source: AddResourceToTarget(output, pbxp, xct) # Do we have a message to print when this rule runs? message = rule.get('message') if message: message = gyp.common.EncodePOSIXShellArgument(message) message = ExpandXcodeVariables(message, rule_input_dict) messages.append(message) # Turn the list into a string that can be passed to a shell. action_string = gyp.common.EncodePOSIXShellList(rule['action']) action = ExpandXcodeVariables(action_string, rule_input_dict) actions.append(action) if len(concrete_outputs_all) > 0: # TODO(mark): There's a possibilty for collision here. Consider # target "t" rule "A_r" and target "t_A" rule "r". makefile_name = '%s.make' % re.sub( '[^a-zA-Z0-9_]', '_' , '%s_%s' % (target_name, rule['rule_name'])) makefile_path = os.path.join(xcode_projects[build_file].path, makefile_name) # TODO(mark): try/close? Write to a temporary file and swap it only # if it's got changes? makefile = open(makefile_path, 'wb') # make will build the first target in the makefile by default. By # convention, it's called "all". List all (or at least one) # concrete output for each rule source as a prerequisite of the "all" # target. makefile.write('all: \\\n') for concrete_output_index in \ xrange(0, len(concrete_outputs_by_rule_source)): # Only list the first (index [0]) concrete output of each input # in the "all" target. Otherwise, a parallel make (-j > 1) would # attempt to process each input multiple times simultaneously. # Otherwise, "all" could just contain the entire list of # concrete_outputs_all. 
concrete_output = \ concrete_outputs_by_rule_source[concrete_output_index][0] if concrete_output_index == len(concrete_outputs_by_rule_source) - 1: eol = '' else: eol = ' \\' makefile.write(' %s%s\n' % (concrete_output, eol)) for (rule_source, concrete_outputs, message, action) in \ zip(rule['rule_sources'], concrete_outputs_by_rule_source, messages, actions): makefile.write('\n') # Add a rule that declares it can build each concrete output of a # rule source. Collect the names of the directories that are # required. concrete_output_dirs = [] for concrete_output_index in xrange(0, len(concrete_outputs)): concrete_output = concrete_outputs[concrete_output_index] if concrete_output_index == 0: bol = '' else: bol = ' ' makefile.write('%s%s \\\n' % (bol, concrete_output)) concrete_output_dir = posixpath.dirname(concrete_output) if (concrete_output_dir and concrete_output_dir not in concrete_output_dirs): concrete_output_dirs.append(concrete_output_dir) makefile.write(' : \\\n') # The prerequisites for this rule are the rule source itself and # the set of additional rule inputs, if any. prerequisites = [rule_source] prerequisites.extend(rule.get('inputs', [])) for prerequisite_index in xrange(0, len(prerequisites)): prerequisite = prerequisites[prerequisite_index] if prerequisite_index == len(prerequisites) - 1: eol = '' else: eol = ' \\' makefile.write(' %s%s\n' % (prerequisite, eol)) # Make sure that output directories exist before executing the rule # action. if len(concrete_output_dirs) > 0: makefile.write('\t@mkdir -p "%s"\n' % '" "'.join(concrete_output_dirs)) # The rule message and action have already had the necessary variable # substitutions performed. if message: # Mark it with note: so Xcode picks it up in build output. makefile.write('\t@echo note: %s\n' % message) makefile.write('\t%s\n' % action) makefile.close() # It might be nice to ensure that needed output directories exist # here rather than in each target in the Makefile, but that wouldn't # work if there ever was a concrete output that had an input-dependent # variable anywhere other than in the leaf position. # Don't declare any inputPaths or outputPaths. If they're present, # Xcode will provide a slight optimization by only running the script # phase if any output is missing or outdated relative to any input. # Unfortunately, it will also assume that all outputs are touched by # the script, and if the outputs serve as files in a compilation # phase, they will be unconditionally rebuilt. Since make might not # rebuild everything that could be declared here as an output, this # extra compilation activity is unnecessary. With inputPaths and # outputPaths not supplied, make will always be called, but it knows # enough to not do anything when everything is up-to-date. # To help speed things up, pass -j COUNT to make so it does some work # in parallel. Don't use ncpus because Xcode will build ncpus targets # in parallel and if each target happens to have a rules step, there # would be ncpus^2 things going. With a machine that has 2 quad-core # Xeons, a build can quickly run out of processes based on # scheduling/other tasks, and randomly failing builds are no good. 
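# For reference, for a hypothetical rule with sources one.ext and two.ext,
# each producing $(INPUT_FILE_BASE).cc, the makefile written above comes out
# roughly as (recipe lines are tab-indented in the real file):
#
#   all: \
#       one.cc \
#       two.cc
#
#   one.cc \
#     : \
#     one.ext
#       @echo note: processing one.ext
#       do_rule one.ext one.cc
#
# with an analogous stanza for two.cc; 'do_rule' stands in for the rule's
# action, and a '@mkdir -p' line is emitted first when outputs live in
# subdirectories.  The shell script phase below then drives it with make -j.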
script = \ """JOB_COUNT="$(/usr/sbin/sysctl -n hw.ncpu)" if [ "${JOB_COUNT}" -gt 4 ]; then JOB_COUNT=4 fi exec xcrun make -f "${PROJECT_FILE_PATH}/%s" -j "${JOB_COUNT}" exit 1 """ % makefile_name ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({ 'name': 'Rule "' + rule['rule_name'] + '"', 'shellScript': script, 'showEnvVarsInLog': 0, }) if support_xct: support_xct.AppendProperty('buildPhases', ssbp) else: # TODO(mark): this assumes too much knowledge of the internals of # xcodeproj_file; some of these smarts should move into xcodeproj_file # itself. xct._properties['buildPhases'].insert(prebuild_index, ssbp) prebuild_index = prebuild_index + 1 # Extra rule inputs also go into the project file. Concrete outputs were # already added when they were computed. groups = ['inputs', 'inputs_excluded'] if skip_excluded_files: groups = [x for x in groups if not x.endswith('_excluded')] for group in groups: for item in rule.get(group, []): pbxp.AddOrGetFileInRootGroup(item) # Add "sources". for source in spec.get('sources', []): (source_root, source_extension) = posixpath.splitext(source) if source_extension[1:] not in rules_by_ext: # AddSourceToTarget will add the file to a root group if it's not # already there. AddSourceToTarget(source, type, pbxp, xct) else: pbxp.AddOrGetFileInRootGroup(source) # Add "mac_bundle_resources" and "mac_framework_private_headers" if # it's a bundle of any type. if is_bundle: for resource in tgt_mac_bundle_resources: (resource_root, resource_extension) = posixpath.splitext(resource) if resource_extension[1:] not in rules_by_ext: AddResourceToTarget(resource, pbxp, xct) else: pbxp.AddOrGetFileInRootGroup(resource) for header in spec.get('mac_framework_private_headers', []): AddHeaderToTarget(header, pbxp, xct, False) # Add "mac_framework_headers". These can be valid for both frameworks # and static libraries. if is_bundle or type == 'static_library': for header in spec.get('mac_framework_headers', []): AddHeaderToTarget(header, pbxp, xct, True) # Add "copies". pbxcp_dict = {} for copy_group in spec.get('copies', []): dest = copy_group['destination'] if dest[0] not in ('/', '$'): # Relative paths are relative to $(SRCROOT). dest = '$(SRCROOT)/' + dest code_sign = int(copy_group.get('xcode_code_sign', 0)) settings = (None, '{ATTRIBUTES = (CodeSignOnCopy, ); }')[code_sign]; # Coalesce multiple "copies" sections in the same target with the same # "destination" property into the same PBXCopyFilesBuildPhase, otherwise # they'll wind up with ID collisions. pbxcp = pbxcp_dict.get(dest, None) if pbxcp is None: pbxcp = gyp.xcodeproj_file.PBXCopyFilesBuildPhase({ 'name': 'Copy to ' + copy_group['destination'] }, parent=xct) pbxcp.SetDestination(dest) # TODO(mark): The usual comment about this knowing too much about # gyp.xcodeproj_file internals applies. xct._properties['buildPhases'].insert(prebuild_index, pbxcp) pbxcp_dict[dest] = pbxcp for file in copy_group['files']: pbxcp.AddFile(file, settings) # Excluded files can also go into the project file. if not skip_excluded_files: for key in ['sources', 'mac_bundle_resources', 'mac_framework_headers', 'mac_framework_private_headers']: excluded_key = key + '_excluded' for item in spec.get(excluded_key, []): pbxp.AddOrGetFileInRootGroup(item) # So can "inputs" and "outputs" sections of "actions" groups. 
groups = ['inputs', 'inputs_excluded', 'outputs', 'outputs_excluded'] if skip_excluded_files: groups = [x for x in groups if not x.endswith('_excluded')] for action in spec.get('actions', []): for group in groups: for item in action.get(group, []): # Exclude anything in BUILT_PRODUCTS_DIR. They're products, not # sources. if not item.startswith('$(BUILT_PRODUCTS_DIR)/'): pbxp.AddOrGetFileInRootGroup(item) for postbuild in spec.get('postbuilds', []): action_string_sh = gyp.common.EncodePOSIXShellList(postbuild['action']) script = 'exec ' + action_string_sh + '\nexit 1\n' # Make the postbuild step depend on the output of ld or ar from this # target. Apparently putting the script step after the link step isn't # sufficient to ensure proper ordering in all cases. With an input # declared but no outputs, the script step should run every time, as # desired. ssbp = gyp.xcodeproj_file.PBXShellScriptBuildPhase({ 'inputPaths': ['$(BUILT_PRODUCTS_DIR)/$(EXECUTABLE_PATH)'], 'name': 'Postbuild "' + postbuild['postbuild_name'] + '"', 'shellScript': script, 'showEnvVarsInLog': 0, }) xct.AppendProperty('buildPhases', ssbp) # Add dependencies before libraries, because adding a dependency may imply # adding a library. It's preferable to keep dependencies listed first # during a link phase so that they can override symbols that would # otherwise be provided by libraries, which will usually include system # libraries. On some systems, ld is finicky and even requires the # libraries to be ordered in such a way that unresolved symbols in # earlier-listed libraries may only be resolved by later-listed libraries. # The Mac linker doesn't work that way, but other platforms do, and so # their linker invocations need to be constructed in this way. There's # no compelling reason for Xcode's linker invocations to differ. if 'dependencies' in spec: for dependency in spec['dependencies']: xct.AddDependency(xcode_targets[dependency]) # The support project also gets the dependencies (in case they are # needed for the actions/rules to work). if support_xct: support_xct.AddDependency(xcode_targets[dependency]) if 'libraries' in spec: for library in spec['libraries']: xct.FrameworksPhase().AddFile(library) # Add the library's directory to LIBRARY_SEARCH_PATHS if necessary. # I wish Xcode handled this automatically. 
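# E.g. (illustrative): a 'libraries' entry of '../build/libfoo.a' appends
# '../build' to LIBRARY_SEARCH_PATHS below, unless that directory is already
# present or is a standard linker location such as '$(SDKROOT)/usr/lib'.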
library_dir = posixpath.dirname(library) if library_dir not in xcode_standard_library_dirs and ( not xct.HasBuildSetting(_library_search_paths_var) or library_dir not in xct.GetBuildSetting(_library_search_paths_var)): xct.AppendBuildSetting(_library_search_paths_var, library_dir) for configuration_name in configuration_names: configuration = spec['configurations'][configuration_name] xcbc = xct.ConfigurationNamed(configuration_name) for include_dir in configuration.get('mac_framework_dirs', []): xcbc.AppendBuildSetting('FRAMEWORK_SEARCH_PATHS', include_dir) for include_dir in configuration.get('include_dirs', []): xcbc.AppendBuildSetting('HEADER_SEARCH_PATHS', include_dir) for library_dir in configuration.get('library_dirs', []): if library_dir not in xcode_standard_library_dirs and ( not xcbc.HasBuildSetting(_library_search_paths_var) or library_dir not in xcbc.GetBuildSetting(_library_search_paths_var)): xcbc.AppendBuildSetting(_library_search_paths_var, library_dir) if 'defines' in configuration: for define in configuration['defines']: set_define = EscapeXcodeDefine(define) xcbc.AppendBuildSetting('GCC_PREPROCESSOR_DEFINITIONS', set_define) if 'xcode_settings' in configuration: for xck, xcv in configuration['xcode_settings'].iteritems(): xcbc.SetBuildSetting(xck, xcv) if 'xcode_config_file' in configuration: config_ref = pbxp.AddOrGetFileInRootGroup( configuration['xcode_config_file']) xcbc.SetBaseConfiguration(config_ref) build_files = [] for build_file, build_file_dict in data.iteritems(): if build_file.endswith('.gyp'): build_files.append(build_file) for build_file in build_files: xcode_projects[build_file].Finalize1(xcode_targets, serialize_all_tests) for build_file in build_files: xcode_projects[build_file].Finalize2(xcode_targets, xcode_target_to_target_dict) for build_file in build_files: xcode_projects[build_file].Write() npm_3.5.2.orig/node_modules/node-gyp/gyp/pylib/gyp/generator/xcode_test.py0000644000000000000000000000120512631326456025105 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2013 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """ Unit tests for the xcode.py file. """ import gyp.generator.xcode as xcode import unittest import sys class TestEscapeXcodeDefine(unittest.TestCase): if sys.platform == 'darwin': def test_InheritedRemainsUnescaped(self): self.assertEqual(xcode.EscapeXcodeDefine('$(inherited)'), '$(inherited)') def test_Escaping(self): self.assertEqual(xcode.EscapeXcodeDefine('a b"c\\'), 'a\\ b\\"c\\\\') if __name__ == '__main__': unittest.main() npm_3.5.2.orig/node_modules/node-gyp/gyp/samples/samples0000755000000000000000000000450012631326456021525 0ustar 00000000000000#!/usr/bin/python # Copyright (c) 2009 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. 
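# Pushes or pulls the fixed list of Chromium gyp files below between a Chrome
# checkout and this script's directory.  Usage (path illustrative):
#
#   ./samples pull /path/to/chrome/src   # copy the listed files here
#   ./samples push /path/to/chrome/src   # copy local edits back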
import os.path
import shutil
import sys

gyps = [
    'app/app.gyp',
    'base/base.gyp',
    'build/temp_gyp/googleurl.gyp',
    'build/all.gyp',
    'build/common.gypi',
    'build/external_code.gypi',
    'chrome/test/security_tests/security_tests.gyp',
    'chrome/third_party/hunspell/hunspell.gyp',
    'chrome/chrome.gyp',
    'media/media.gyp',
    'net/net.gyp',
    'printing/printing.gyp',
    'sdch/sdch.gyp',
    'skia/skia.gyp',
    'testing/gmock.gyp',
    'testing/gtest.gyp',
    'third_party/bzip2/bzip2.gyp',
    'third_party/icu38/icu38.gyp',
    'third_party/libevent/libevent.gyp',
    'third_party/libjpeg/libjpeg.gyp',
    'third_party/libpng/libpng.gyp',
    'third_party/libxml/libxml.gyp',
    'third_party/libxslt/libxslt.gyp',
    'third_party/lzma_sdk/lzma_sdk.gyp',
    'third_party/modp_b64/modp_b64.gyp',
    'third_party/npapi/npapi.gyp',
    'third_party/sqlite/sqlite.gyp',
    'third_party/zlib/zlib.gyp',
    'v8/tools/gyp/v8.gyp',
    'webkit/activex_shim/activex_shim.gyp',
    'webkit/activex_shim_dll/activex_shim_dll.gyp',
    'webkit/build/action_csspropertynames.py',
    'webkit/build/action_cssvaluekeywords.py',
    'webkit/build/action_jsconfig.py',
    'webkit/build/action_makenames.py',
    'webkit/build/action_maketokenizer.py',
    'webkit/build/action_useragentstylesheets.py',
    'webkit/build/rule_binding.py',
    'webkit/build/rule_bison.py',
    'webkit/build/rule_gperf.py',
    'webkit/tools/test_shell/test_shell.gyp',
    'webkit/webkit.gyp',
]

def Main(argv):
  if len(argv) != 3 or argv[1] not in ['push', 'pull']:
    print 'Usage: %s push/pull PATH_TO_CHROME' % argv[0]
    return 1

  path_to_chrome = argv[2]

  for g in gyps:
    chrome_file = os.path.join(path_to_chrome, g)
    local_file = os.path.join(os.path.dirname(argv[0]), os.path.split(g)[1])
    if argv[1] == 'push':
      print 'Copying %s to %s' % (local_file, chrome_file)
      shutil.copyfile(local_file, chrome_file)
    elif argv[1] == 'pull':
      print 'Copying %s to %s' % (chrome_file, local_file)
      shutil.copyfile(chrome_file, local_file)
    else:
      assert False

  return 0

if __name__ == '__main__':
  sys.exit(Main(sys.argv))
npm_3.5.2.orig/node_modules/node-gyp/gyp/samples/samples.bat0000644000000000000000000000030412631326456022265 0ustar 00000000000000
@rem Copyright (c) 2009 Google Inc. All rights reserved.
@rem Use of this source code is governed by a BSD-style license that can be
@rem found in the LICENSE file.

@python %~dp0/samples %*
npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/README0000644000000000000000000000150512631326456020511 0ustar 00000000000000
pretty_vcproj:
  Usage: pretty_vcproj.py "c:\path\to\vcproj.vcproj" [key1=value1] [key2=value2]

  The key/value pairs are used to resolve vsprops names.

  For example, if I want to diff the base.vcproj project:

  pretty_vcproj.py z:\dev\src-chrome\src\base\build\base.vcproj "$(SolutionDir)=z:\dev\src-chrome\src\chrome\\" "$(CHROMIUM_BUILD)=" "$(CHROME_BUILD_TYPE)=" > original.txt
  pretty_vcproj.py z:\dev\src-chrome\src\base\base_gyp.vcproj "$(SolutionDir)=z:\dev\src-chrome\src\chrome\\" "$(CHROMIUM_BUILD)=" "$(CHROME_BUILD_TYPE)=" > gyp.txt

  And you can use your favorite diff tool to see the changes.

  Note: In the case of base.vcproj, the original vcproj is one level above
  the generated one.
  I suggest you do a search and replace for '"..\' and replace it with '"'
  in original.txt before you perform the diff.
npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/Xcode/0000755000000000000000000000000012631326456020672 5ustar 00000000000000
npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/emacs/0000755000000000000000000000000012631326456020720 5ustar 00000000000000
npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/graphviz.py0000755000000000000000000000547612631326456022043 0ustar 00000000000000
#!/usr/bin/env python
# Copyright (c) 2011 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

"""Using the JSON dumped by the dump-dependency-json generator,
generate input suitable for graphviz to render a dependency graph of
targets."""

import collections
import json
import sys


def ParseTarget(target):
  target, _, suffix = target.partition('#')
  filename, _, target = target.partition(':')
  return filename, target, suffix


def LoadEdges(filename, targets):
  """Load the edges map from the dump file, and filter it to only
  show targets in |targets| and their dependents."""

  file = open(filename)
  edges = json.load(file)
  file.close()

  # Copy out only the edges we're interested in from the full edge list.
  target_edges = {}
  to_visit = targets[:]
  while to_visit:
    src = to_visit.pop()
    if src in target_edges:
      continue
    target_edges[src] = edges[src]
    to_visit.extend(edges[src])

  return target_edges


def WriteGraph(edges):
  """Print a graphviz graph to stdout.
  |edges| is a map of target to a list of other targets it depends on."""

  # Bucket targets by file.
  files = collections.defaultdict(list)
  for src, dst in edges.items():
    build_file, target_name, toolset = ParseTarget(src)
    files[build_file].append(src)

  print 'digraph D {'
  print '  fontsize=8'  # Used by subgraphs.
  print '  node [fontsize=8]'

  # Output nodes by file.  We must first write out each node within
  # its file grouping before writing out any edges that may refer
  # to those nodes.
  for filename, targets in files.items():
    if len(targets) == 1:
      # If there's only one node for this file, simplify
      # the display by making it a box without an internal node.
      target = targets[0]
      build_file, target_name, toolset = ParseTarget(target)
      print '  "%s" [shape=box, label="%s\\n%s"]' % (target, filename,
                                                     target_name)
    else:
      # Group multiple nodes together in a subgraph.
      print '  subgraph "cluster_%s" {' % filename
      print '    label = "%s"' % filename
      for target in targets:
        build_file, target_name, toolset = ParseTarget(target)
        print '    "%s" [label="%s"]' % (target, target_name)
      print '  }'

  # Now that we've placed all the nodes within subgraphs, output all
  # the edges between nodes.
  for src, dsts in edges.items():
    for dst in dsts:
      print '  "%s" -> "%s"' % (src, dst)

  print '}'


def main():
  if len(sys.argv) < 2:
    print >>sys.stderr, __doc__
    print >>sys.stderr
    print >>sys.stderr, 'usage: %s target1 target2...' % (sys.argv[0])
    return 1

  edges = LoadEdges('dump.json', sys.argv[1:])

  WriteGraph(edges)
  return 0


if __name__ == '__main__':
  sys.exit(main())
npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/pretty_gyp.py0000755000000000000000000001122412631326456022413 0ustar 00000000000000
#!/usr/bin/env python
# Copyright (c) 2012 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.

"""Pretty-prints the contents of a GYP file."""

import sys
import re


# Regex to remove comments when we're counting braces.
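# (Masking matters because a brace inside a comment or a quoted string, e.g.
#
#   # { not a real brace
#   'key': 'value containing }',
#
# must not change the computed indentation level, so such spans are blanked
# out before braces are counted.)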
COMMENT_RE = re.compile(r'\s*#.*') # Regex to remove quoted strings when we're counting braces. # It takes into account quoted quotes, and makes sure that the quotes match. # NOTE: It does not handle quotes that span more than one line, or # cases where an escaped quote is preceeded by an escaped backslash. QUOTE_RE_STR = r'(?P[\'"])(.*?)(? 0: after = True # This catches the special case of a closing brace having something # other than just whitespace ahead of it -- we don't want to # unindent that until after this line is printed so it stays with # the previous indentation level. if cnt < 0 and closing_prefix_re.match(stripline): after = True return (cnt, after) def prettyprint_input(lines): """Does the main work of indenting the input based on the brace counts.""" indent = 0 basic_offset = 2 last_line = "" for line in lines: if COMMENT_RE.match(line): print line else: line = line.strip('\r\n\t ') # Otherwise doesn't strip \r on Unix. if len(line) > 0: (brace_diff, after) = count_braces(line) if brace_diff != 0: if after: print " " * (basic_offset * indent) + line indent += brace_diff else: indent += brace_diff print " " * (basic_offset * indent) + line else: print " " * (basic_offset * indent) + line else: print "" last_line = line def main(): if len(sys.argv) > 1: data = open(sys.argv[1]).read().splitlines() else: data = sys.stdin.read().splitlines() # Split up the double braces. lines = split_double_braces(data) # Indent and print the output. prettyprint_input(lines) return 0 if __name__ == '__main__': sys.exit(main()) npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/pretty_sln.py0000755000000000000000000001175312631326456022417 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Prints the information in a sln file in a diffable way. It first outputs each projects in alphabetical order with their dependencies. Then it outputs a possible build order. """ __author__ = 'nsylvain (Nicolas Sylvain)' import os import re import sys import pretty_vcproj def BuildProject(project, built, projects, deps): # if all dependencies are done, we can build it, otherwise we try to build the # dependency. # This is not infinite-recursion proof. for dep in deps[project]: if dep not in built: BuildProject(dep, built, projects, deps) print project built.append(project) def ParseSolution(solution_file): # All projects, their clsid and paths. projects = dict() # A list of dependencies associated with a project. dependencies = dict() # Regular expressions that matches the SLN format. # The first line of a project definition. begin_project = re.compile(r'^Project\("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942' r'}"\) = "(.*)", "(.*)", "(.*)"$') # The last line of a project definition. end_project = re.compile('^EndProject$') # The first line of a dependency list. begin_dep = re.compile( r'ProjectSection\(ProjectDependencies\) = postProject$') # The last line of a dependency list. end_dep = re.compile('EndProjectSection$') # A line describing a dependency. dep_line = re.compile(' *({.*}) = ({.*})$') in_deps = False solution = open(solution_file) for line in solution: results = begin_project.search(line) if results: # Hack to remove icu because the diff is too different. if results.group(1).find('icu') != -1: continue # We remove "_gyp" from the names because it helps to diff them. 
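      # E.g. a .sln project definition line (second GUID hypothetical):
      #   Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "base_gyp",
      #           "base\base.vcproj", "{01234567-89AB-CDEF-0123-456789ABCDEF}"
      # yields name 'base' (with '_gyp' stripped), the .vcproj path, and the
      # project's clsid.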
current_project = results.group(1).replace('_gyp', '') projects[current_project] = [results.group(2).replace('_gyp', ''), results.group(3), results.group(2)] dependencies[current_project] = [] continue results = end_project.search(line) if results: current_project = None continue results = begin_dep.search(line) if results: in_deps = True continue results = end_dep.search(line) if results: in_deps = False continue results = dep_line.search(line) if results and in_deps and current_project: dependencies[current_project].append(results.group(1)) continue # Change all dependencies clsid to name instead. for project in dependencies: # For each dependencies in this project new_dep_array = [] for dep in dependencies[project]: # Look for the project name matching this cldis for project_info in projects: if projects[project_info][1] == dep: new_dep_array.append(project_info) dependencies[project] = sorted(new_dep_array) return (projects, dependencies) def PrintDependencies(projects, deps): print "---------------------------------------" print "Dependencies for all projects" print "---------------------------------------" print "-- --" for (project, dep_list) in sorted(deps.items()): print "Project : %s" % project print "Path : %s" % projects[project][0] if dep_list: for dep in dep_list: print " - %s" % dep print "" print "-- --" def PrintBuildOrder(projects, deps): print "---------------------------------------" print "Build order " print "---------------------------------------" print "-- --" built = [] for (project, _) in sorted(deps.items()): if project not in built: BuildProject(project, built, projects, deps) print "-- --" def PrintVCProj(projects): for project in projects: print "-------------------------------------" print "-------------------------------------" print project print project print project print "-------------------------------------" print "-------------------------------------" project_path = os.path.abspath(os.path.join(os.path.dirname(sys.argv[1]), projects[project][2])) pretty = pretty_vcproj argv = [ '', project_path, '$(SolutionDir)=%s\\' % os.path.dirname(sys.argv[1]), ] argv.extend(sys.argv[3:]) pretty.main(argv) def main(): # check if we have exactly 1 parameter. if len(sys.argv) < 2: print 'Usage: %s "c:\\path\\to\\project.sln"' % sys.argv[0] return 1 (projects, deps) = ParseSolution(sys.argv[1]) PrintDependencies(projects, deps) PrintBuildOrder(projects, deps) if '--recursive' in sys.argv: PrintVCProj(projects) return 0 if __name__ == '__main__': sys.exit(main()) npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/pretty_vcproj.py0000755000000000000000000002256212631326456023126 0ustar 00000000000000#!/usr/bin/env python # Copyright (c) 2012 Google Inc. All rights reserved. # Use of this source code is governed by a BSD-style license that can be # found in the LICENSE file. """Make the format of a vcproj really pretty. This script normalize and sort an xml. It also fetches all the properties inside linked vsprops and include them explicitly in the vcproj. It outputs the resulting xml to stdout. 
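
Typical use, mirroring main() below (paths illustrative):
  pretty_vcproj.py "c:\path\to\vcproj.vcproj" "$(SolutionDir)=c:\path\to\sln\\"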
""" __author__ = 'nsylvain (Nicolas Sylvain)' import os import sys from xml.dom.minidom import parse from xml.dom.minidom import Node REPLACEMENTS = dict() ARGUMENTS = None class CmpTuple(object): """Compare function between 2 tuple.""" def __call__(self, x, y): return cmp(x[0], y[0]) class CmpNode(object): """Compare function between 2 xml nodes.""" def __call__(self, x, y): def get_string(node): node_string = "node" node_string += node.nodeName if node.nodeValue: node_string += node.nodeValue if node.attributes: # We first sort by name, if present. node_string += node.getAttribute("Name") all_nodes = [] for (name, value) in node.attributes.items(): all_nodes.append((name, value)) all_nodes.sort(CmpTuple()) for (name, value) in all_nodes: node_string += name node_string += value return node_string return cmp(get_string(x), get_string(y)) def PrettyPrintNode(node, indent=0): if node.nodeType == Node.TEXT_NODE: if node.data.strip(): print '%s%s' % (' '*indent, node.data.strip()) return if node.childNodes: node.normalize() # Get the number of attributes attr_count = 0 if node.attributes: attr_count = node.attributes.length # Print the main tag if attr_count == 0: print '%s<%s>' % (' '*indent, node.nodeName) else: print '%s<%s' % (' '*indent, node.nodeName) all_attributes = [] for (name, value) in node.attributes.items(): all_attributes.append((name, value)) all_attributes.sort(CmpTuple()) for (name, value) in all_attributes: print '%s %s="%s"' % (' '*indent, name, value) print '%s>' % (' '*indent) if node.nodeValue: print '%s %s' % (' '*indent, node.nodeValue) for sub_node in node.childNodes: PrettyPrintNode(sub_node, indent=indent+2) print '%s' % (' '*indent, node.nodeName) def FlattenFilter(node): """Returns a list of all the node and sub nodes.""" node_list = [] if (node.attributes and node.getAttribute('Name') == '_excluded_files'): # We don't add the "_excluded_files" filter. return [] for current in node.childNodes: if current.nodeName == 'Filter': node_list.extend(FlattenFilter(current)) else: node_list.append(current) return node_list def FixFilenames(filenames, current_directory): new_list = [] for filename in filenames: if filename: for key in REPLACEMENTS: filename = filename.replace(key, REPLACEMENTS[key]) os.chdir(current_directory) filename = filename.strip('"\' ') if filename.startswith('$'): new_list.append(filename) else: new_list.append(os.path.abspath(filename)) return new_list def AbsoluteNode(node): """Makes all the properties we know about in this node absolute.""" if node.attributes: for (name, value) in node.attributes.items(): if name in ['InheritedPropertySheets', 'RelativePath', 'AdditionalIncludeDirectories', 'IntermediateDirectory', 'OutputDirectory', 'AdditionalLibraryDirectories']: # We want to fix up these paths path_list = value.split(';') new_list = FixFilenames(path_list, os.path.dirname(ARGUMENTS[1])) node.setAttribute(name, ';'.join(new_list)) if not value: node.removeAttribute(name) def CleanupVcproj(node): """For each sub node, we call recursively this function.""" for sub_node in node.childNodes: AbsoluteNode(sub_node) CleanupVcproj(sub_node) # Normalize the node, and remove all extranous whitespaces. for sub_node in node.childNodes: if sub_node.nodeType == Node.TEXT_NODE: sub_node.data = sub_node.data.replace("\r", "") sub_node.data = sub_node.data.replace("\n", "") sub_node.data = sub_node.data.rstrip() # Fix all the semicolon separated attributes to be sorted, and we also # remove the dups. 
if node.attributes: for (name, value) in node.attributes.items(): sorted_list = sorted(value.split(';')) unique_list = [] for i in sorted_list: if not unique_list.count(i): unique_list.append(i) node.setAttribute(name, ';'.join(unique_list)) if not value: node.removeAttribute(name) if node.childNodes: node.normalize() # For each node, take a copy, and remove it from the list. node_array = [] while node.childNodes and node.childNodes[0]: # Take a copy of the node and remove it from the list. current = node.childNodes[0] node.removeChild(current) # If the child is a filter, we want to append all its children # to this same list. if current.nodeName == 'Filter': node_array.extend(FlattenFilter(current)) else: node_array.append(current) # Sort the list. node_array.sort(CmpNode()) # Insert the nodes in the correct order. for new_node in node_array: # But don't append empty tool node. if new_node.nodeName == 'Tool': if new_node.attributes and new_node.attributes.length == 1: # This one was empty. continue if new_node.nodeName == 'UserMacro': continue node.appendChild(new_node) def GetConfiguationNodes(vcproj): #TODO(nsylvain): Find a better way to navigate the xml. nodes = [] for node in vcproj.childNodes: if node.nodeName == "Configurations": for sub_node in node.childNodes: if sub_node.nodeName == "Configuration": nodes.append(sub_node) return nodes def GetChildrenVsprops(filename): dom = parse(filename) if dom.documentElement.attributes: vsprops = dom.documentElement.getAttribute('InheritedPropertySheets') return FixFilenames(vsprops.split(';'), os.path.dirname(filename)) return [] def SeekToNode(node1, child2): # A text node does not have properties. if child2.nodeType == Node.TEXT_NODE: return None # Get the name of the current node. current_name = child2.getAttribute("Name") if not current_name: # There is no name. We don't know how to merge. return None # Look through all the nodes to find a match. for sub_node in node1.childNodes: if sub_node.nodeName == child2.nodeName: name = sub_node.getAttribute("Name") if name == current_name: return sub_node # No match. We give up. return None def MergeAttributes(node1, node2): # No attributes to merge? if not node2.attributes: return for (name, value2) in node2.attributes.items(): # Don't merge the 'Name' attribute. if name == 'Name': continue value1 = node1.getAttribute(name) if value1: # The attribute exist in the main node. If it's equal, we leave it # untouched, otherwise we concatenate it. if value1 != value2: node1.setAttribute(name, ';'.join([value1, value2])) else: # The attribute does nto exist in the main node. We append this one. node1.setAttribute(name, value2) # If the attribute was a property sheet attributes, we remove it, since # they are useless. if name == 'InheritedPropertySheets': node1.removeAttribute(name) def MergeProperties(node1, node2): MergeAttributes(node1, node2) for child2 in node2.childNodes: child1 = SeekToNode(node1, child2) if child1: MergeProperties(child1, child2) else: node1.appendChild(child2.cloneNode(True)) def main(argv): """Main function of this vcproj prettifier.""" global ARGUMENTS ARGUMENTS = argv # check if we have exactly 1 parameter. if len(argv) < 2: print ('Usage: %s "c:\\path\\to\\vcproj.vcproj" [key1=value1] ' '[key2=value2]' % argv[0]) return 1 # Parse the keys for i in range(2, len(argv)): (key, value) = argv[i].split('=') REPLACEMENTS[key] = value # Open the vcproj and parse the xml. 
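  # (Each pair gathered above, e.g. '$(SolutionDir)=z:\dev\src\chrome\' as in
  # tools/README, is later substituted into path attributes by FixFilenames().)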
dom = parse(argv[1]) # First thing we need to do is find the Configuration Node and merge them # with the vsprops they include. for configuration_node in GetConfiguationNodes(dom.documentElement): # Get the property sheets associated with this configuration. vsprops = configuration_node.getAttribute('InheritedPropertySheets') # Fix the filenames to be absolute. vsprops_list = FixFilenames(vsprops.strip().split(';'), os.path.dirname(argv[1])) # Extend the list of vsprops with all vsprops contained in the current # vsprops. for current_vsprops in vsprops_list: vsprops_list.extend(GetChildrenVsprops(current_vsprops)) # Now that we have all the vsprops, we need to merge them. for current_vsprops in vsprops_list: MergeProperties(configuration_node, parse(current_vsprops).documentElement) # Now that everything is merged, we need to cleanup the xml. CleanupVcproj(dom.documentElement) # Finally, we use the prett xml function to print the vcproj back to the # user. #print dom.toprettyxml(newl="\n") PrettyPrintNode(dom.documentElement) return 0 if __name__ == '__main__': sys.exit(main(sys.argv)) npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/Xcode/README0000644000000000000000000000044112631326456021551 0ustar 00000000000000Specifications contains syntax formatters for Xcode 3. These do not appear to be supported yet on Xcode 4. To use these with Xcode 3 please install both the gyp.pbfilespec and gyp.xclangspec files in ~/Library/Application Support/Developer/Shared/Xcode/Specifications/ and restart Xcode.npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/Xcode/Specifications/0000755000000000000000000000000012631326456023635 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/Xcode/Specifications/gyp.pbfilespec0000644000000000000000000000127512631326456026477 0ustar 00000000000000/* gyp.pbfilespec GYP source file spec for Xcode 3 There is not much documentation available regarding the format of .pbfilespec files. As a starting point, see for instance the outdated documentation at: http://maxao.free.fr/xcode-plugin-interface/specifications.html and the files in: /Developer/Library/PrivateFrameworks/XcodeEdit.framework/Versions/A/Resources/ Place this file in directory: ~/Library/Application Support/Developer/Shared/Xcode/Specifications/ */ ( { Identifier = sourcecode.gyp; BasedOn = sourcecode; Name = "GYP Files"; Extensions = ("gyp", "gypi"); MIMETypes = ("text/gyp"); Language = "xcode.lang.gyp"; IsTextFile = YES; IsSourceFile = YES; } ) npm_3.5.2.orig/node_modules/node-gyp/gyp/tools/Xcode/Specifications/gyp.xclangspec0000644000000000000000000001174012631326456026510 0ustar 00000000000000/* Copyright (c) 2011 Google Inc. All rights reserved. Use of this source code is governed by a BSD-style license that can be found in the LICENSE file. gyp.xclangspec GYP language specification for Xcode 3 There is not much documentation available regarding the format of .xclangspec files. As a starting point, see for instance the outdated documentation at: http://maxao.free.fr/xcode-plugin-interface/specifications.html and the files in: /Developer/Library/PrivateFrameworks/XcodeEdit.framework/Versions/A/Resources/ Place this file in directory: ~/Library/Application Support/Developer/Shared/Xcode/Specifications/ */ ( { Identifier = "xcode.lang.gyp.keyword"; Syntax = { Words = ( "and", "or", " (caar gyp-parse-history) target-point) (setq gyp-parse-history (cdr gyp-parse-history)))) (defun gyp-parse-point () "The point of the last parse state added by gyp-parse-to." 
(caar gyp-parse-history)) (defun gyp-parse-sections () "A list of section symbols holding at the last parse state point." (cdar gyp-parse-history)) (defun gyp-inside-dictionary-p () "Predicate returning true if the parser is inside a dictionary." (not (eq (cadar gyp-parse-history) 'list))) (defun gyp-add-parse-history (point sections) "Add parse state SECTIONS to the parse history at POINT so that parsing can be resumed instantly." (while (>= (caar gyp-parse-history) point) (setq gyp-parse-history (cdr gyp-parse-history))) (setq gyp-parse-history (cons (cons point sections) gyp-parse-history))) (defun gyp-parse-to (target-point) "Parses from (point) to TARGET-POINT adding the parse state information to gyp-parse-state-history. Parsing stops if TARGET-POINT is reached or if a string literal has been parsed. Returns nil if no further parsing can be done, otherwise returns the position of the start of a parsed string, leaving the point at the end of the string." (let ((parsing t) string-start) (while parsing (setq string-start nil) ;; Parse up to a character that starts a sexp, or if the nesting ;; level decreases. (let ((state (parse-partial-sexp (gyp-parse-point) target-point -1 t)) (sections (gyp-parse-sections))) (if (= (nth 0 state) -1) (setq sections (cdr sections)) ; pop out a level (cond ((looking-at-p "['\"]") ; a string (setq string-start (point)) (goto-char (scan-sexps (point) 1)) (if (gyp-inside-dictionary-p) ;; Look for sections inside a dictionary (let ((section (gyp-section-name (buffer-substring-no-properties (+ 1 string-start) (- (point) 1))))) (setq sections (cons section (cdr sections))))) ;; Stop after the string so it can be fontified. (setq target-point (point))) ((looking-at-p "{") ;; Inside a dictionary. Increase nesting. (forward-char 1) (setq sections (cons 'unknown sections))) ((looking-at-p "\\[") ;; Inside a list. Increase nesting (forward-char 1) (setq sections (cons 'list sections))) ((not (eobp)) ;; other (forward-char 1)))) (gyp-add-parse-history (point) sections) (setq parsing (< (point) target-point)))) string-start)) (defun gyp-section-at-point () "Transform the last parse state, which is a list of nested sections and return the section symbol that should be used to determine font-lock information for the string. Can return nil indicating the string should not have any attached section." (let ((sections (gyp-parse-sections))) (cond ((eq (car sections) 'conditions) ;; conditions can occur in a variables section, but we still want to ;; highlight it as a keyword. nil) ((and (eq (car sections) 'list) (eq (cadr sections) 'list)) ;; conditions and sources can have items in [[ ]] (caddr sections)) (t (cadr sections))))) (defun gyp-section-match (limit) "Parse from (point) to LIMIT returning by means of match data what was matched. The group of the match indicates what style font-lock should apply. See also `gyp-add-font-lock-keywords'." (gyp-invalidate-parse-states-after (point)) (let ((group nil) (string-start t)) (while (and (< (point) limit) (not group) string-start) (setq string-start (gyp-parse-to limit)) (if string-start (setq group (case (gyp-section-at-point) ('dependencies 1) ('variables 2) ('conditions 2) ('sources 3) ('defines 4) (nil nil))))) (if group (progn ;; Set the match data to indicate to the font-lock mechanism the ;; highlighting to be performed. 
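          ;; (Match 0 is set to span the whole string literal including its
          ;; quotes; the run of nils leaves subgroups 1..GROUP-1 unset, and
          ;; the one live subgroup -- 1 dependencies, 2 variables/conditions,
          ;; 3 sources, 4 defines, per gyp-section-at-point above -- covers
          ;; just the text between the quotes.)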
(set-match-data (append (list string-start (point)) (make-list (* (1- group) 2) nil) (list (1+ string-start) (1- (point))))) t)))) ;;; Please see http://code.google.com/p/gyp/wiki/GypLanguageSpecification for ;;; canonical list of keywords. (defun gyp-add-font-lock-keywords () "Add gyp-mode keywords to font-lock mechanism." ;; TODO(jknotten): Move all the keyword highlighting into gyp-section-match ;; so that we can do the font-locking in a single font-lock pass. (font-lock-add-keywords nil (list ;; Top-level keywords (list (concat "['\"]\\(" (regexp-opt (list "action" "action_name" "actions" "cflags" "cflags_cc" "conditions" "configurations" "copies" "defines" "dependencies" "destination" "direct_dependent_settings" "export_dependent_settings" "extension" "files" "include_dirs" "includes" "inputs" "ldflags" "libraries" "link_settings" "mac_bundle" "message" "msvs_external_rule" "outputs" "product_name" "process_outputs_as_sources" "rules" "rule_name" "sources" "suppress_wildcard" "target_conditions" "target_defaults" "target_defines" "target_name" "toolsets" "targets" "type" "variables" "xcode_settings")) "[!/+=]?\\)") 1 'font-lock-keyword-face t) ;; Type of target (list (concat "['\"]\\(" (regexp-opt (list "loadable_module" "static_library" "shared_library" "executable" "none")) "\\)") 1 'font-lock-type-face t) (list "\\(?:target\\|action\\)_name['\"]\\s-*:\\s-*['\"]\\([^ '\"]*\\)" 1 'font-lock-function-name-face t) (list 'gyp-section-match (list 1 'font-lock-function-name-face t t) ; dependencies (list 2 'font-lock-variable-name-face t t) ; variables, conditions (list 3 'font-lock-constant-face t t) ; sources (list 4 'font-lock-preprocessor-face t t)) ; preprocessor ;; Variable expansion (list "<@?(\\([^\n )]+\\))" 1 'font-lock-variable-name-face t) ;; Command expansion (list "= 3.5) { if (r = rePath.exec(l)) { msbuilds.push({ version: ver, path: r[1] }) } } } }) msbuilds.sort(function (x, y) { return (x.version < y.version ? -1 : 1) }) ;(function verifyMsbuild () { if (!msbuilds.length) return callback(notfoundErr) msbuildPath = path.resolve(msbuilds.pop().path, 'msbuild.exe') fs.stat(msbuildPath, function (err, stat) { if (err) { if (err.code == 'ENOENT') { if (msbuilds.length) { return verifyMsbuild() } else { callback(notfoundErr) } } else { callback(err) } return } command = msbuildPath copyNodeLib() }) })() }) } /** * Copies the node.lib file for the current target architecture into the * current proper dev dir location. */ function copyNodeLib () { if (!win || !copyDevLib) return doBuild() var buildDir = path.resolve(nodeDir, buildType) , archNodeLibPath = path.resolve(nodeDir, arch, release.name + '.lib') , buildNodeLibPath = path.resolve(buildDir, release.name + '.lib') mkdirp(buildDir, function (err, isNew) { if (err) return callback(err) log.verbose('"' + buildType + '" dir needed to be created?', isNew) var rs = fs.createReadStream(archNodeLibPath) , ws = fs.createWriteStream(buildNodeLibPath) log.verbose('copying "' + release.name + '.lib" for ' + arch, buildNodeLibPath) rs.pipe(ws) rs.on('error', callback) ws.on('error', callback) rs.on('end', doBuild) }) } /** * Actually spawn the process and compile the module. 
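 *
 * For illustration (values assumed, not from a real run): on a Unix host
 * with buildType 'Release' and --jobs=4, the flags assembled below amount
 * to roughly `make BUILDTYPE=Release -C build --jobs 4`, while on Windows
 * msbuild is invoked with flags along the lines of
 * `/p:Configuration=Release;Platform=x64 /m:4 /nologo`.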
*/ function doBuild () { // Enable Verbose build var verbose = log.levels[log.level] <= log.levels.verbose if (!win && verbose) { argv.push('V=1') } if (win && !verbose) { argv.push('/clp:Verbosity=minimal') } if (win) { // Turn off the Microsoft logo on Windows argv.push('/nologo') } // Specify the build type, Release by default if (win) { var p = arch === 'x64' ? 'x64' : 'Win32' argv.push('/p:Configuration=' + buildType + ';Platform=' + p) if (jobs) { var j = parseInt(jobs, 10) if (!isNaN(j) && j > 0) { argv.push('/m:' + j) } else if (jobs.toUpperCase() === 'MAX') { argv.push('/m:' + require('os').cpus().length) } } } else { argv.push('BUILDTYPE=' + buildType) // Invoke the Makefile in the 'build' dir. argv.push('-C') argv.push('build') if (jobs) { var j = parseInt(jobs, 10) if (!isNaN(j) && j > 0) { argv.push('--jobs') argv.push(j) } else if (jobs.toUpperCase() === 'MAX') { argv.push('--jobs') argv.push(require('os').cpus().length) } } } if (win) { // did the user specify their own .sln file? var hasSln = argv.some(function (arg) { return path.extname(arg) == '.sln' }) if (!hasSln) { argv.unshift(gyp.opts.solution || guessedSolution) } } var proc = gyp.spawn(command, argv) proc.on('exit', onExit) } /** * Invoked after the make/msbuild command exits. */ function onExit (code, signal) { if (code !== 0) { return callback(new Error('`' + command + '` failed with exit code: ' + code)) } if (signal) { return callback(new Error('`' + command + '` got signal: ' + signal)) } callback() } } npm_3.5.2.orig/node_modules/node-gyp/lib/clean.js0000644000000000000000000000057212631326456020063 0ustar 00000000000000 module.exports = exports = clean exports.usage = 'Removes any generated build files and the "out" dir' /** * Module dependencies. */ var rm = require('rimraf') var log = require('npmlog') function clean (gyp, argv, callback) { // Remove the 'build' dir var buildDir = 'build' log.verbose('clean', 'removing "%s" directory', buildDir) rm(buildDir, callback) } npm_3.5.2.orig/node_modules/node-gyp/lib/configure.js0000644000000000000000000003204612631326456020763 0ustar 00000000000000module.exports = exports = configure module.exports.test = { findPython: findPython } /** * Module dependencies. */ var fs = require('graceful-fs') , path = require('path') , glob = require('glob') , log = require('npmlog') , osenv = require('osenv') , which = require('which') , semver = require('semver') , mkdirp = require('mkdirp') , cp = require('child_process') , PathArray = require('path-array') , extend = require('util')._extend , processRelease = require('./process-release') , spawn = cp.spawn , execFile = cp.execFile , win = process.platform == 'win32' , findNodeDirectory = require('./find-node-directory') exports.usage = 'Generates ' + (win ? 'MSVC project files' : 'a Makefile') + ' for the current module' function configure (gyp, argv, callback) { var python = gyp.opts.python || process.env.PYTHON || 'python2' , buildDir = path.resolve('build') , configNames = [ 'config.gypi', 'common.gypi' ] , configs = [] , nodeDir , release = processRelease(argv, gyp, process.version, process.release) findPython(python, function (err, found) { if (err) { callback(err) } else { python = found getNodeDir() } }) function getNodeDir () { // 'python' should be set by now process.env.PYTHON = python if (gyp.opts.nodedir) { // --nodedir was specified. 
use that for the dev files nodeDir = gyp.opts.nodedir.replace(/^~/, osenv.home()) log.verbose('get node dir', 'compiling against specified --nodedir dev files: %s', nodeDir) createBuildDir() } else { // if no --nodedir specified, ensure node dependencies are installed if ('v' + release.version !== process.version) { // if --target was given, then determine a target version to compile for log.verbose('get node dir', 'compiling against --target node version: %s', release.version) } else { // if no --target was specified then use the current host node version log.verbose('get node dir', 'no --target version specified, falling back to host node version: %s', release.version) } if (!release.semver) { // could not parse the version string with semver return callback(new Error('Invalid version number: ' + release.version)) } // ensure that the target node version's dev files are installed gyp.opts.ensure = true gyp.commands.install([ release.version ], function (err, version) { if (err) return callback(err) log.verbose('get node dir', 'target node version installed:', release.versionDir) nodeDir = path.resolve(gyp.devDir, release.versionDir) createBuildDir() }) } } function createBuildDir () { log.verbose('build dir', 'attempting to create "build" dir: %s', buildDir) mkdirp(buildDir, function (err, isNew) { if (err) return callback(err) log.verbose('build dir', '"build" dir needed to be created?', isNew) createConfigFile() }) } function createConfigFile (err) { if (err) return callback(err) var configFilename = 'config.gypi' var configPath = path.resolve(buildDir, configFilename) log.verbose('build/' + configFilename, 'creating config file') var config = process.config || {} , defaults = config.target_defaults , variables = config.variables // default "config.variables" if (!variables) variables = config.variables = {} // default "config.defaults" if (!defaults) defaults = config.target_defaults = {} // don't inherit the "defaults" from node's `process.config` object. // doing so could cause problems in cases where the `node` executable was // compiled on a different machine (with different lib/include paths) than // the machine where the addon is being built to defaults.cflags = [] defaults.defines = [] defaults.include_dirs = [] defaults.libraries = [] // set the default_configuration prop if ('debug' in gyp.opts) { defaults.default_configuration = gyp.opts.debug ? 'Debug' : 'Release' } if (!defaults.default_configuration) { defaults.default_configuration = 'Release' } // set the target_arch variable variables.target_arch = gyp.opts.arch || process.arch || 'ia32' // set the node development directory variables.nodedir = nodeDir // don't copy dev libraries with nodedir option variables.copy_dev_lib = !gyp.opts.nodedir // disable -T "thin" static archives by default variables.standalone_static_library = gyp.opts.thin ? 0 : 1 // loop through the rest of the opts and add the unknown ones as variables. // this allows for module-specific configure flags like: // // $ node-gyp configure --shared-libxml2 Object.keys(gyp.opts).forEach(function (opt) { if (opt === 'argv') return if (opt in gyp.configDefs) return variables[opt.replace(/-/g, '_')] = gyp.opts[opt] }) // ensures that any boolean values from `process.config` get stringified function boolsToString (k, v) { if (typeof v === 'boolean') return String(v) return v } log.silly('build/' + configFilename, config) // now write out the config.gypi file to the build/ dir var prefix = '# Do not edit. 
File was generated by node-gyp\'s "configure" step' , json = JSON.stringify(config, boolsToString, 2) log.verbose('build/' + configFilename, 'writing out config file: %s', configPath) configs.push(configPath) fs.writeFile(configPath, [prefix, json, ''].join('\n'), findConfigs) } function findConfigs (err) { if (err) return callback(err) var name = configNames.shift() if (!name) return runGyp() var fullPath = path.resolve(name) log.verbose(name, 'checking for gypi file: %s', fullPath) fs.stat(fullPath, function (err, stat) { if (err) { if (err.code == 'ENOENT') { findConfigs() // check next gypi filename } else { callback(err) } } else { log.verbose(name, 'found gypi file') configs.push(fullPath) findConfigs() } }) } function runGyp (err) { if (err) return callback(err) if (!~argv.indexOf('-f') && !~argv.indexOf('--format')) { if (win) { log.verbose('gyp', 'gyp format was not specified; forcing "msvs"') // force the 'make' target for non-Windows argv.push('-f', 'msvs') } else { log.verbose('gyp', 'gyp format was not specified; forcing "make"') // force the 'make' target for non-Windows argv.push('-f', 'make') } } function hasMsvsVersion () { return argv.some(function (arg) { return arg.indexOf('msvs_version') === 0 }) } if (win && !hasMsvsVersion()) { if ('msvs_version' in gyp.opts) { argv.push('-G', 'msvs_version=' + gyp.opts.msvs_version) } else { argv.push('-G', 'msvs_version=auto') } } // include all the ".gypi" files that were found configs.forEach(function (config) { argv.push('-I', config) }) // for AIX we need to set up the path to the exp file // which contains the symbols needed for linking. // The file will either be in one of the following // depending on whether it is an installed or // development environment: // - the include/node directory // - the out/Release directory // - the out/Debug directory // - the root directory var node_exp_file = '' if (process.platform === 'aix') { var node_root_dir = findNodeDirectory() var candidates = ['include/node/node.exp', 'out/Release/node.exp', 'out/Debug/node.exp', 'node.exp'] for (var next = 0; next < candidates.length; next++) { node_exp_file = path.resolve(node_root_dir, candidates[next]) try { fs.accessSync(node_exp_file, fs.R_OK) // exp file found, stop looking break } catch (exception) { // this candidate was not found or not readable, do nothing } } } // this logic ported from the old `gyp_addon` python file var gyp_script = path.resolve(__dirname, '..', 'gyp', 'gyp_main.py') var addon_gypi = path.resolve(__dirname, '..', 'addon.gypi') var common_gypi = path.resolve(nodeDir, 'include/node/common.gypi') fs.stat(common_gypi, function (err, stat) { if (err) common_gypi = path.resolve(nodeDir, 'common.gypi') var output_dir = 'build' if (win) { // Windows expects an absolute path output_dir = buildDir } var nodeGypDir = path.resolve(__dirname, '..') argv.push('-I', addon_gypi) argv.push('-I', common_gypi) argv.push('-Dlibrary=shared_library') argv.push('-Dvisibility=default') argv.push('-Dnode_root_dir=' + nodeDir) if (process.platform === 'aix') { argv.push('-Dnode_exp_file=' + node_exp_file) } argv.push('-Dnode_gyp_dir=' + nodeGypDir) argv.push('-Dnode_lib_file=' + release.name + '.lib') argv.push('-Dmodule_root_dir=' + process.cwd()) argv.push('--depth=.') argv.push('--no-parallel') // tell gyp to write the Makefile/Solution files into output_dir argv.push('--generator-output', output_dir) // tell make to write its output into the same dir argv.push('-Goutput_dir=.') // enforce use of the "binding.gyp" file 
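      // (after the two unshifts below, the spawn is conceptually:
      //    python gyp_main.py binding.gyp -f make -I build/config.gypi
      //           -I addon.gypi -I common.gypi -Dlibrary=shared_library ...
      //  an illustrative shape only -- the real arguments are the absolute
      //  paths and -D defines assembled above)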
argv.unshift('binding.gyp') // execute `gyp` from the current target nodedir argv.unshift(gyp_script) // make sure python uses files that came with this particular node package var pypath = new PathArray(process.env, 'PYTHONPATH') pypath.unshift(path.join(__dirname, '..', 'gyp', 'pylib')) var cp = gyp.spawn(python, argv) cp.on('exit', onCpExit) }) } /** * Called when the `gyp` child process exits. */ function onCpExit (code, signal) { if (code !== 0) { callback(new Error('`gyp` failed with exit code: ' + code)) } else { // we're done callback() } } } function findPython (python, callback) { checkPython() // Check if Python is in the $PATH function checkPython () { log.verbose('check python', 'checking for Python executable "%s" in the PATH', python) which(python, function (err, execPath) { if (err) { log.verbose('`which` failed', python, err) if (python === 'python2') { python = 'python' return checkPython() } if (win) { guessPython() } else { failNoPython() } } else { log.verbose('`which` succeeded', python, execPath) // Found the `python` exceutable, and from now on we use it explicitly. // This solves #667 and #750 (`execFile` won't run batch files // (*.cmd, and *.bat)) python = execPath checkPythonVersion() } }) } // Called on Windows when "python" isn't available in the current $PATH. // We're gonna check if "%SystemDrive%\python27\python.exe" exists. function guessPython () { log.verbose('could not find "' + python + '". guessing location') var rootDir = process.env.SystemDrive || 'C:\\' if (rootDir[rootDir.length - 1] !== '\\') { rootDir += '\\' } var pythonPath = path.resolve(rootDir, 'Python27', 'python.exe') log.verbose('ensuring that file exists:', pythonPath) fs.stat(pythonPath, function (err, stat) { if (err) { if (err.code == 'ENOENT') { failNoPython() } else { callback(err) } return } python = pythonPath checkPythonVersion() }) } function checkPythonVersion () { var env = extend({}, process.env) env.TERM = 'dumb' execFile(python, ['-c', 'import platform; print(platform.python_version());'], { env: env }, function (err, stdout) { if (err) { return callback(err) } log.verbose('check python version', '`%s -c "import platform; print(platform.python_version());"` returned: %j', python, stdout) var version = stdout.trim() if (~version.indexOf('+')) { log.silly('stripping "+" sign(s) from version') version = version.replace(/\+/g, '') } if (~version.indexOf('rc')) { log.silly('stripping "rc" identifier from version') version = version.replace(/rc(.*)$/ig, '') } var range = semver.Range('>=2.5.0 <3.0.0') var valid = false try { valid = range.test(version) } catch (e) { log.silly('range.test() error', e) } if (valid) { callback(null, python) } else { failPythonVersion(version) } }) } function failNoPython () { callback(new Error('Can\'t find Python executable "' + python + '", you can set the PYTHON env variable.')) } function failPythonVersion (badVersion) { callback(new Error('Python executable "' + python + '" is v' + badVersion + ', which is not supported by gyp.\n' + 'You can pass the --python switch to point to Python >= v2.5.0 & < 3.0.0.')) } } npm_3.5.2.orig/node_modules/node-gyp/lib/find-node-directory.js0000644000000000000000000000452512631326456022650 0ustar 00000000000000var path = require('path') , log = require('npmlog') function findNodeDirectory(scriptLocation, processObj) { // set dirname and process if not passed in // this facilitates regression tests if (scriptLocation === undefined) { scriptLocation = __dirname } if (processObj === undefined) { processObj = 
process } // Have a look to see what is above us, to try and work out where we are npm_parent_directory = path.join(scriptLocation, '../../../..') log.verbose('node-gyp root', 'npm_parent_directory is ' + path.basename(npm_parent_directory)) node_root_dir = "" log.verbose('node-gyp root', 'Finding node root directory') if (path.basename(npm_parent_directory) === 'deps') { // We are in a build directory where this script lives in // deps/npm/node_modules/node-gyp/lib node_root_dir = path.join(npm_parent_directory, '..') log.verbose('node-gyp root', 'in build directory, root = ' + node_root_dir) } else if (path.basename(npm_parent_directory) === 'node_modules') { // We are in a node install directory where this script lives in // lib/node_modules/npm/node_modules/node-gyp/lib or // node_modules/npm/node_modules/node-gyp/lib depending on the // platform if (processObj.platform === 'win32') { node_root_dir = path.join(npm_parent_directory, '..') } else { node_root_dir = path.join(npm_parent_directory, '../..') } log.verbose('node-gyp root', 'in install directory, root = ' + node_root_dir) } else { // We don't know where we are, try working it out from the location // of the node binary var node_dir = path.dirname(processObj.execPath) var directory_up = path.basename(node_dir) if (directory_up === 'bin') { node_root_dir = path.join(node_dir, '..') } else if (directory_up === 'Release' || directory_up === 'Debug') { // If we are a recently built node, and the directory structure // is that of a repository. If we are on Windows then we only need // to go one level up, everything else, two if (processObj.platform === 'win32') { node_root_dir = path.join(node_dir, '..') } else { node_root_dir = path.join(node_dir, '../..') } } // Else return the default blank, "". } return node_root_dir } module.exports = findNodeDirectory npm_3.5.2.orig/node_modules/node-gyp/lib/install.js0000644000000000000000000003423112631326456020446 0ustar 00000000000000 module.exports = exports = install exports.usage = 'Install node development files for the specified node version.' /** * Module dependencies. */ var fs = require('graceful-fs') , osenv = require('osenv') , tar = require('tar') , rm = require('rimraf') , path = require('path') , crypto = require('crypto') , zlib = require('zlib') , log = require('npmlog') , semver = require('semver') , fstream = require('fstream') , request = require('request') , minimatch = require('minimatch') , mkdir = require('mkdirp') , processRelease = require('./process-release') , win = process.platform == 'win32' function install (gyp, argv, callback) { var release = processRelease(argv, gyp, process.version, process.release) // ensure no double-callbacks happen function cb (err) { if (cb.done) return cb.done = true if (err) { log.warn('install', 'got an error, rolling back install') // roll-back the install if anything went wrong gyp.commands.remove([ release.versionDir ], function (err2) { callback(err) }) } else { callback(null, release.version) } } // Determine which node dev files version we are installing log.verbose('install', 'input version string %j', release.version) if (!release.semver) { // could not parse the version string with semver return callback(new Error('Invalid version number: ' + release.version)) } if (semver.lt(release.version, '0.8.0')) { return callback(new Error('Minimum target version is `0.8.0` or greater. Got: ' + release.version)) } // 0.x.y-pre versions are not published yet and cannot be installed. Bail. 
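  // (for example, semver.parse('0.11.4-pre').prerelease is ['pre'], while
  // a released '0.11.4' parses with an empty prerelease array)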
if (release.semver.prerelease[0] === 'pre') { log.verbose('detected "pre" node version', release.version) if (gyp.opts.nodedir) { log.verbose('--nodedir flag was passed; skipping install', gyp.opts.nodedir) callback() } else { callback(new Error('"pre" versions of node cannot be installed, use the --nodedir flag instead')) } return } // flatten version into String log.verbose('install', 'installing version: %s', release.versionDir) // the directory where the dev files will be installed var devDir = path.resolve(gyp.devDir, release.versionDir) // If '--ensure' was passed, then don't *always* install the version; // check if it is already installed, and only install when needed if (gyp.opts.ensure) { log.verbose('install', '--ensure was passed, so won\'t reinstall if already installed') fs.stat(devDir, function (err, stat) { if (err) { if (err.code == 'ENOENT') { log.verbose('install', 'version not already installed, continuing with install', release.version) go() } else if (err.code == 'EACCES') { eaccesFallback() } else { cb(err) } return } log.verbose('install', 'version is already installed, need to check "installVersion"') var installVersionFile = path.resolve(devDir, 'installVersion') fs.readFile(installVersionFile, 'ascii', function (err, ver) { if (err && err.code != 'ENOENT') { return cb(err) } var installVersion = parseInt(ver, 10) || 0 log.verbose('got "installVersion"', installVersion) log.verbose('needs "installVersion"', gyp.package.installVersion) if (installVersion < gyp.package.installVersion) { log.verbose('install', 'version is no good; reinstalling') go() } else { log.verbose('install', 'version is good') cb() } }) }) } else { go() } function download (url) { log.http('GET', url) var req = null var requestOpts = { uri: url , headers: { 'User-Agent': 'node-gyp v' + gyp.version + ' (node ' + process.version + ')' } } // basic support for a proxy server var proxyUrl = gyp.opts.proxy || process.env.http_proxy || process.env.HTTP_PROXY || process.env.npm_config_proxy if (proxyUrl) { if (/^https?:\/\//i.test(proxyUrl)) { log.verbose('download', 'using proxy url: "%s"', proxyUrl) requestOpts.proxy = proxyUrl } else { log.warn('download', 'ignoring invalid "proxy" config setting: "%s"', proxyUrl) } } try { // The "request" constructor can throw sometimes apparently :( // See: https://github.com/nodejs/node-gyp/issues/114 req = request(requestOpts) } catch (e) { cb(e) } if (req) { req.on('response', function (res) { log.http(res.statusCode, url) }) } return req } function getContentSha(res, callback) { var shasum = crypto.createHash('sha256') res.on('data', function (chunk) { shasum.update(chunk) }).on('end', function () { callback(null, shasum.digest('hex')) }) } function go () { log.verbose('ensuring nodedir is created', devDir) // first create the dir for the node dev files mkdir(devDir, function (err, created) { if (err) { if (err.code == 'EACCES') { eaccesFallback() } else { cb(err) } return } if (created) { log.verbose('created nodedir', created) } // now download the node tarball var tarPath = gyp.opts.tarball var badDownload = false , extractCount = 0 , gunzip = zlib.createGunzip() , extracter = tar.Extract({ path: devDir, strip: 1, filter: isValid }) var contentShasums = {} var expectShasums = {} // checks if a file to be extracted from the tarball is valid. 
// only .h header files and the gyp files get extracted function isValid () { var name = this.path.substring(devDir.length + 1) var isValid = valid(name) if (name === '' && this.type === 'Directory') { // the first directory entry is ok return true } if (isValid) { log.verbose('extracted file from tarball', name) extractCount++ } else { // invalid log.silly('ignoring from tarball', name) } return isValid } gunzip.on('error', cb) extracter.on('error', cb) extracter.on('end', afterTarball) // download the tarball, gunzip and extract! if (tarPath) { var input = fs.createReadStream(tarPath) input.pipe(gunzip).pipe(extracter) return } var req = download(release.tarballUrl) if (!req) return // something went wrong downloading the tarball? req.on('error', function (err) { if (err.code === 'ENOTFOUND') { return cb(new Error('This is most likely not a problem with node-gyp or the package itself and\n' + 'is related to network connectivity. In most cases you are behind a proxy or have bad \n' + 'network settings.')) } badDownload = true cb(err) }) req.on('close', function () { if (extractCount === 0) { cb(new Error('Connection closed while downloading tarball file')) } }) req.on('response', function (res) { if (res.statusCode !== 200) { badDownload = true cb(new Error(res.statusCode + ' response downloading ' + release.tarballUrl)) return } // content checksum getContentSha(res, function (_, checksum) { var filename = path.basename(release.tarballUrl).trim() contentShasums[filename] = checksum log.verbose('content checksum', filename, checksum) }) // start unzipping and untaring req.pipe(gunzip).pipe(extracter) }) // invoked after the tarball has finished being extracted function afterTarball () { if (badDownload) return if (extractCount === 0) { return cb(new Error('There was a fatal problem while downloading/extracting the tarball')) } log.verbose('tarball', 'done parsing tarball') var async = 0 if (win) { // need to download node.lib async++ downloadNodeLib(deref) } // write the "installVersion" file async++ var installVersionPath = path.resolve(devDir, 'installVersion') fs.writeFile(installVersionPath, gyp.package.installVersion + '\n', deref) // Only download SHASUMS.txt if not using tarPath override if (!tarPath) { // download SHASUMS.txt async++ downloadShasums(deref) } if (async === 0) { // no async tasks required cb() } function deref (err) { if (err) return cb(err) async-- if (!async) { log.verbose('download contents checksum', JSON.stringify(contentShasums)) // check content shasums for (var k in contentShasums) { log.verbose('validating download checksum for ' + k, '(%s == %s)', contentShasums[k], expectShasums[k]) if (contentShasums[k] !== expectShasums[k]) { cb(new Error(k + ' local checksum ' + contentShasums[k] + ' not match remote ' + expectShasums[k])) return } } cb() } } } function downloadShasums(done) { log.verbose('check download content checksum, need to download `SHASUMS256.txt`...') var shasumsPath = path.resolve(devDir, 'SHASUMS256.txt') log.verbose('checksum url', release.shasumsUrl) var req = download(release.shasumsUrl) if (!req) return req.on('error', done) req.on('response', function (res) { if (res.statusCode !== 200) { done(new Error(res.statusCode + ' status code downloading checksum')) return } var chunks = [] res.on('data', function (chunk) { chunks.push(chunk) }) res.on('end', function () { var lines = Buffer.concat(chunks).toString().trim().split('\n') lines.forEach(function (line) { var items = line.trim().split(/\s+/) if (items.length !== 2) return // 
0035d18e2dcf9aad669b1c7c07319e17abfe3762 ./node-v0.11.4.tar.gz var name = items[1].replace(/^\.\//, '') expectShasums[name] = items[0] }) log.verbose('checksum data', JSON.stringify(expectShasums)) done() }) }) } function downloadNodeLib (done) { log.verbose('on Windows; need to download `' + release.name + '.lib`...') var dir32 = path.resolve(devDir, 'ia32') , dir64 = path.resolve(devDir, 'x64') , libPath32 = path.resolve(dir32, release.name + '.lib') , libPath64 = path.resolve(dir64, release.name + '.lib') log.verbose('32-bit ' + release.name + '.lib dir', dir32) log.verbose('64-bit ' + release.name + '.lib dir', dir64) log.verbose('`' + release.name + '.lib` 32-bit url', release.libUrl32) log.verbose('`' + release.name + '.lib` 64-bit url', release.libUrl64) var async = 2 mkdir(dir32, function (err) { if (err) return done(err) log.verbose('streaming 32-bit ' + release.name + '.lib to:', libPath32) var req = download(release.libUrl32) if (!req) return req.on('error', done) req.on('response', function (res) { if (res.statusCode !== 200) { done(new Error(res.statusCode + ' status code downloading 32-bit ' + release.name + '.lib')) return } getContentSha(res, function (_, checksum) { contentShasums[release.libPath32] = checksum log.verbose('content checksum', release.libPath32, checksum) }) var ws = fs.createWriteStream(libPath32) ws.on('error', cb) req.pipe(ws) }) req.on('end', function () { --async || done() }) }) mkdir(dir64, function (err) { if (err) return done(err) log.verbose('streaming 64-bit ' + release.name + '.lib to:', libPath64) var req = download(release.libUrl64) if (!req) return req.on('error', done) req.on('response', function (res) { if (res.statusCode !== 200) { done(new Error(res.statusCode + ' status code downloading 64-bit ' + release.name + '.lib')) return } getContentSha(res, function (_, checksum) { contentShasums[release.libPath64] = checksum log.verbose('content checksum', release.libPath64, checksum) }) var ws = fs.createWriteStream(libPath64) ws.on('error', cb) req.pipe(ws) }) req.on('end', function () { --async || done() }) }) } // downloadNodeLib() }) // mkdir() } // go() /** * Checks if a given filename is "valid" for this installation. */ function valid (file) { // header files return minimatch(file, '*.h', { matchBase: true }) || minimatch(file, '*.gypi', { matchBase: true }) } /** * The EACCES fallback is a workaround for npm's `sudo` behavior, where * it drops the permissions before invoking any child processes (like * node-gyp). So what happens is the "nobody" user doesn't have * permission to create the dev dir. As a fallback, make the tmpdir() be * the dev dir for this installation. This is not ideal, but at least * the compilation will succeed... */ function eaccesFallback () { var tmpdir = osenv.tmpdir() gyp.devDir = path.resolve(tmpdir, '.node-gyp') log.warn('EACCES', 'user "%s" does not have permission to access the dev dir "%s"', osenv.user(), devDir) log.warn('EACCES', 'attempting to reinstall using temporary dev dir "%s"', gyp.devDir) if (process.cwd() == tmpdir) { log.verbose('tmpdir == cwd', 'automatically will remove dev files after to save disk space') gyp.todo.push({ name: 'remove', args: argv }) } gyp.commands.install(argv, cb) } } npm_3.5.2.orig/node_modules/node-gyp/lib/list.js0000644000000000000000000000131612631326456017751 0ustar 00000000000000 module.exports = exports = list exports.usage = 'Prints a listing of the currently installed node development files' /** * Module dependencies. 
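 *
 * Illustrative behavior (directory names assumed): a dev dir holding
 * header sets for two versions yields callback(null, ['0.10.40', '4.2.2']),
 * with any 'current' entry filtered out.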
*/ var fs = require('graceful-fs') , path = require('path') , log = require('npmlog') function list (gyp, args, callback) { var devDir = gyp.devDir log.verbose('list', 'using node-gyp dir:', devDir) // readdir() the node-gyp dir fs.readdir(devDir, onreaddir) function onreaddir (err, versions) { if (err && err.code != 'ENOENT') { return callback(err) } if (Array.isArray(versions)) { versions = versions.filter(function (v) { return v != 'current' }) } else { versions = [] } callback(null, versions) } } npm_3.5.2.orig/node_modules/node-gyp/lib/node-gyp.js0000644000000000000000000001214012631326456020515 0ustar 00000000000000 /** * Module exports. */ module.exports = exports = gyp /** * Module dependencies. */ var fs = require('graceful-fs') , path = require('path') , nopt = require('nopt') , log = require('npmlog') , child_process = require('child_process') , EE = require('events').EventEmitter , inherits = require('util').inherits , commands = [ // Module build commands 'build' , 'clean' , 'configure' , 'rebuild' // Development Header File management commands , 'install' , 'list' , 'remove' ] , aliases = { 'ls': 'list' , 'rm': 'remove' } // differentiate node-gyp's logs from npm's log.heading = 'gyp' /** * The `gyp` function. */ function gyp () { return new Gyp() } function Gyp () { var self = this // set the dir where node-gyp dev files get installed // TODO: make this *more* configurable? // see: https://github.com/nodejs/node-gyp/issues/21 var homeDir = process.env.HOME || process.env.USERPROFILE if (!homeDir) { throw new Error( "node-gyp requires that the user's home directory is specified " + "in either of the environmental variables HOME or USERPROFILE" ); } this.devDir = path.resolve(homeDir, '.node-gyp') this.commands = {} commands.forEach(function (command) { self.commands[command] = function (argv, callback) { log.verbose('command', command, argv) return require('./' + command)(self, argv, callback) } }) } inherits(Gyp, EE) exports.Gyp = Gyp var proto = Gyp.prototype /** * Export the contents of the package.json. */ proto.package = require('../package') /** * nopt configuration definitions */ proto.configDefs = { help: Boolean // everywhere , arch: String // 'configure' , debug: Boolean // 'build' , directory: String // bin , make: String // 'build' , msvs_version: String // 'configure' , ensure: Boolean // 'install' , solution: String // 'build' (windows only) , proxy: String // 'install' , nodedir: String // 'configure' , loglevel: String // everywhere , python: String // 'configure' , 'dist-url': String // 'install' , 'tarball': String // 'install' , jobs: String // 'build' , thin: String // 'configure' } /** * nopt shorthands */ proto.shorthands = { release: '--no-debug' , C: '--directory' , debug: '--debug' , j: '--jobs' , silly: '--loglevel=silly' , verbose: '--loglevel=verbose' } /** * expose the command aliases for the bin file to use. */ proto.aliases = aliases /** * Parses the given argv array and sets the 'opts', * 'argv' and 'command' properties. */ proto.parseArgv = function parseOpts (argv) { this.opts = nopt(this.configDefs, this.shorthands, argv) this.argv = this.opts.argv.remain.slice() var commands = this.todo = [] // create a copy of the argv array with aliases mapped argv = this.argv.map(function (arg) { // is this an alias? 
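    // (per the aliases map above: 'ls' is rewritten to 'list' and 'rm'
    // to 'remove')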
if (arg in this.aliases) { arg = this.aliases[arg] } return arg }, this) // process the mapped args into "command" objects ("name" and "args" props) argv.slice().forEach(function (arg) { if (arg in this.commands) { var args = argv.splice(0, argv.indexOf(arg)) argv.shift() if (commands.length > 0) { commands[commands.length - 1].args = args } commands.push({ name: arg, args: [] }) } }, this) if (commands.length > 0) { commands[commands.length - 1].args = argv.splice(0) } // support for inheriting config env variables from npm var npm_config_prefix = 'npm_config_' Object.keys(process.env).forEach(function (name) { if (name.indexOf(npm_config_prefix) !== 0) return var val = process.env[name] if (name === npm_config_prefix + 'loglevel') { log.level = val } else { // add the user-defined options to the config name = name.substring(npm_config_prefix.length) // gyp@741b7f1 enters an infinite loop when it encounters // zero-length options so ensure those don't get through. if (name) this.opts[name] = val } }, this) if (this.opts.loglevel) { log.level = this.opts.loglevel } log.resume() } /** * Spawns a child process and emits a 'spawn' event. */ proto.spawn = function spawn (command, args, opts) { if (!opts) opts = {} if (!opts.silent && !opts.stdio) { opts.stdio = [ 0, 1, 2 ] } var cp = child_process.spawn(command, args, opts) log.info('spawn', command) log.info('spawn args', args) return cp } /** * Returns the usage instructions for node-gyp. */ proto.usage = function usage () { var str = [ '' , ' Usage: node-gyp [options]' , '' , ' where is one of:' , commands.map(function (c) { return ' - ' + c + ' - ' + require('./' + c).usage }).join('\n') , '' , 'node-gyp@' + this.version + ' ' + path.resolve(__dirname, '..') , 'node@' + process.versions.node ].join('\n') return str } /** * Version number getter. */ Object.defineProperty(proto, 'version', { get: function () { return this.package.version } , enumerable: true }) npm_3.5.2.orig/node_modules/node-gyp/lib/process-release.js0000644000000000000000000001210512631326456022070 0ustar 00000000000000var semver = require('semver') , url = require('url') , path = require('path') , bitsre = /\/win-(x86|x64)\// , bitsreV3 = /\/win-(x86|ia32|x64)\// // io.js v3.x.x shipped with "ia32" but should // have been "x86" // Captures all the logic required to determine download URLs, local directory and // file names. Inputs come from command-line switches (--target, --dist-url), // `process.version` and `process.release` where it exists. function processRelease (argv, gyp, defaultVersion, defaultRelease) { var version = (semver.valid(argv[0]) && argv[0]) || gyp.opts.target || defaultVersion , versionSemver = semver.parse(version) , overrideDistUrl = gyp.opts['dist-url'] || gyp.opts.disturl , isDefaultVersion , isIojs , name , distBaseUrl , baseUrl , libUrl32 , libUrl64 , tarballUrl if (!versionSemver) { // not a valid semver string, nothing we can do return { version: version } } // flatten version into String version = versionSemver.version // defaultVersion should come from process.version so ought to be valid semver isDefaultVersion = version === semver.parse(defaultVersion).version // can't use process.release if we're using --target=x.y.z if (!isDefaultVersion) defaultRelease = null if (defaultRelease) { // v3 onward, has process.release name = defaultRelease.name.replace(/io\.js/, 'iojs') // remove the '.' 
for directory naming purposes isIojs = name === 'iojs' } else { // old node or alternative --target= // semver.satisfies() doesn't like prerelease tags so test major directly isIojs = versionSemver.major >= 1 && versionSemver.major < 4 name = isIojs ? 'iojs' : 'node' } // check for the nvm.sh standard mirror env variables if (!overrideDistUrl) { if (isIojs && process.env.NVM_IOJS_ORG_MIRROR) overrideDistUrl = process.env.NVM_IOJS_ORG_MIRROR else if (process.env.NVM_NODEJS_ORG_MIRROR) overrideDistUrl = process.env.NVM_NODEJS_ORG_MIRROR } if (overrideDistUrl) distBaseUrl = overrideDistUrl.replace(/\/+$/, '') else distBaseUrl = isIojs ? 'https://iojs.org/download/release' : 'https://nodejs.org/dist' distBaseUrl += '/v' + version + '/' // new style, based on process.release so we have a lot of the data we need if (defaultRelease && defaultRelease.headersUrl && !overrideDistUrl) { baseUrl = url.resolve(defaultRelease.headersUrl, './') libUrl32 = resolveLibUrl(name, defaultRelease.libUrl || baseUrl || distBaseUrl, 'x86', versionSemver.major) libUrl64 = resolveLibUrl(name, defaultRelease.libUrl || baseUrl || distBaseUrl, 'x64', versionSemver.major) return { version: version, semver: versionSemver, name: name, baseUrl: baseUrl, tarballUrl: defaultRelease.headersUrl, shasumsUrl: url.resolve(baseUrl, 'SHASUMS256.txt'), versionDir: (name !== 'node' ? name + '-' : '') + version, libUrl32: libUrl32, libUrl64: libUrl64, libPath32: normalizePath(path.relative(url.parse(baseUrl).path, url.parse(libUrl32).path)), libPath64: normalizePath(path.relative(url.parse(baseUrl).path, url.parse(libUrl64).path)) } } // older versions without process.release are captured here and we have to make // a lot of assumptions, additionally if you --target=x.y.z then we can't use the // current process.release baseUrl = distBaseUrl libUrl32 = resolveLibUrl(name, baseUrl, 'x86', versionSemver.major) libUrl64 = resolveLibUrl(name, baseUrl, 'x64', versionSemver.major) // making the bold assumption that anything with a version number >3.0.0 will // have a *-headers.tar.gz file in its dist location, even some frankenstein // custom version tarballUrl = url.resolve(baseUrl, name + '-v' + version + (versionSemver.major >= 3 ? '-headers' : '') + '.tar.gz') return { version: version, semver: versionSemver, name: name, baseUrl: baseUrl, tarballUrl: tarballUrl, shasumsUrl: url.resolve(baseUrl, 'SHASUMS256.txt'), versionDir: (name !== 'node' ? name + '-' : '') + version, libUrl32: libUrl32, libUrl64: libUrl64, libPath32: normalizePath(path.relative(url.parse(baseUrl).path, url.parse(libUrl32).path)), libPath64: normalizePath(path.relative(url.parse(baseUrl).path, url.parse(libUrl64).path)) } } function normalizePath (p) { return path.normalize(p).replace(/\\/g, '/') } function resolveLibUrl (name, defaultUrl, arch, versionMajor) { var base = url.resolve(defaultUrl, './') , hasLibUrl = bitsre.test(defaultUrl) || (versionMajor === 3 && bitsreV3.test(defaultUrl)) if (!hasLibUrl) { // let's assume it's a baseUrl then if (versionMajor >= 1) return url.resolve(base, 'win-' + arch +'/' + name + '.lib') // prior to io.js@1.0.0 32-bit node.lib lives in /, 64-bit lives in /x64/ return url.resolve(base, (arch === 'x64' ? 'x64/' : '') + name + '.lib') } // else we have a proper url to a .lib, just make sure it's the right arch return defaultUrl.replace(versionMajor === 3 ? 
bitsreV3 : bitsre, '/win-' + arch + '/') } module.exports = processRelease npm_3.5.2.orig/node_modules/node-gyp/lib/rebuild.js0000644000000000000000000000046412631326456020427 0ustar 00000000000000 module.exports = exports = rebuild exports.usage = 'Runs "clean", "configure" and "build" all at once' function rebuild (gyp, argv, callback) { gyp.todo.push( { name: 'clean', args: [] } , { name: 'configure', args: argv } , { name: 'build', args: [] } ) process.nextTick(callback) } npm_3.5.2.orig/node_modules/node-gyp/lib/remove.js0000644000000000000000000000244712631326456020301 0ustar 00000000000000 module.exports = exports = remove exports.usage = 'Removes the node development files for the specified version' /** * Module dependencies. */ var fs = require('fs') , rm = require('rimraf') , path = require('path') , log = require('npmlog') , semver = require('semver') function remove (gyp, argv, callback) { var devDir = gyp.devDir log.verbose('remove', 'using node-gyp dir:', devDir) // get the user-specified version to remove var version = argv[0] || gyp.opts.target log.verbose('remove', 'removing target version:', version) if (!version) { return callback(new Error('You must specify a version number to remove. Ex: "' + process.version + '"')) } var versionSemver = semver.parse(version) if (versionSemver) { // flatten the version Array into a String version = versionSemver.version } var versionPath = path.resolve(gyp.devDir, version) log.verbose('remove', 'removing development files for version:', version) // first check if its even installed fs.stat(versionPath, function (err, stat) { if (err) { if (err.code == 'ENOENT') { callback(null, 'version was already uninstalled: ' + version) } else { callback(err) } return } // Go ahead and delete the dir rm(versionPath, callback) }) } npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/0000755000000000000000000000000012631326456021271 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/0000755000000000000000000000000012631326456022317 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/0000755000000000000000000000000012631326456021642 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/0000755000000000000000000000000012631326456022416 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/LICENSE0000644000000000000000000000137512631326456022304 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/README.md0000644000000000000000000003544012631326456022556 0ustar 00000000000000[![Build Status](https://travis-ci.org/isaacs/node-glob.svg?branch=master)](https://travis-ci.org/isaacs/node-glob/) [![Dependency Status](https://david-dm.org/isaacs/node-glob.svg)](https://david-dm.org/isaacs/node-glob) [![devDependency Status](https://david-dm.org/isaacs/node-glob/dev-status.svg)](https://david-dm.org/isaacs/node-glob#info=devDependencies) [![optionalDependency Status](https://david-dm.org/isaacs/node-glob/optional-status.svg)](https://david-dm.org/isaacs/node-glob#info=optionalDependencies) # Glob Match files using the patterns the shell uses, like stars and stuff. This is a glob implementation in JavaScript. It uses the `minimatch` library to do its matching. ![](oh-my-glob.gif) ## Usage ```javascript var glob = require("glob") // options is optional glob("**/*.js", options, function (er, files) { // files is an array of filenames. // If the `nonull` option is set, and nothing // was found, then files is ["**/*.js"] // er is an error object or null. }) ``` ## Glob Primer "Globs" are the patterns you type when you do stuff like `ls *.js` on the command line, or put `build/*` in a `.gitignore` file. Before parsing the path part patterns, braced sections are expanded into a set. Braced sections start with `{` and end with `}`, with any number of comma-delimited sections within. Braced sections may contain slash characters, so `a{/b/c,bcd}` would expand into `a/b/c` and `abcd`. The following characters have special magic meaning when used in a path portion: * `*` Matches 0 or more characters in a single path portion * `?` Matches 1 character * `[...]` Matches a range of characters, similar to a RegExp range. If the first character of the range is `!` or `^` then it matches any character not in the range. * `!(pattern|pattern|pattern)` Matches anything that does not match any of the patterns provided. * `?(pattern|pattern|pattern)` Matches zero or one occurrence of the patterns provided. * `+(pattern|pattern|pattern)` Matches one or more occurrences of the patterns provided. * `*(a|b|c)` Matches zero or more occurrences of the patterns provided * `@(pattern|pat*|pat?erN)` Matches exactly one of the patterns provided * `**` If a "globstar" is alone in a path portion, then it matches zero or more directories and subdirectories searching for matches. It does not crawl symlinked directories. ### Dots If a file or directory path portion has a `.` as the first character, then it will not match any glob pattern unless that pattern's corresponding path part also has a `.` as its first character. For example, the pattern `a/.*/c` would match the file at `a/.b/c`. However the pattern `a/*/c` would not, because `*` does not start with a dot character. You can make glob treat dots as normal characters by setting `dot:true` in the options. ### Basename Matching If you set `matchBase:true` in the options, and the pattern has no slashes in it, then it will seek for any file anywhere in the tree with a matching basename. For example, `*.js` would match `test/simple/basic.js`. ### Negation The intent for negation would be for a pattern starting with `!` to match everything that *doesn't* match the supplied pattern. However, the implementation is weird, and for the time being, this should be avoided. The behavior will change or be deprecated in version 5. ### Empty Sets If no matching files are found, then an empty array is returned. 
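For instance (a minimal sketch; the pattern is illustrative and assumes a
tree containing no `.coffee` files):

```javascript
glob("no/such/*.coffee", function (er, files) {
  console.log(files) // => [] when nothing matches
})
```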
This differs from the shell, where the pattern itself is returned. For example: $ echo a*s*d*f a*s*d*f To get the bash-style behavior, set the `nonull:true` in the options. ### See Also: * `man sh` * `man bash` (Search for "Pattern Matching") * `man 3 fnmatch` * `man 5 gitignore` * [minimatch documentation](https://github.com/isaacs/minimatch) ## glob.hasMagic(pattern, [options]) Returns `true` if there are any special characters in the pattern, and `false` otherwise. Note that the options affect the results. If `noext:true` is set in the options object, then `+(a|b)` will not be considered a magic pattern. If the pattern has a brace expansion, like `a/{b/c,x/y}` then that is considered magical, unless `nobrace:true` is set in the options. ## glob(pattern, [options], cb) * `pattern` {String} Pattern to be matched * `options` {Object} * `cb` {Function} * `err` {Error | null} * `matches` {Array} filenames found matching the pattern Perform an asynchronous glob search. ## glob.sync(pattern, [options]) * `pattern` {String} Pattern to be matched * `options` {Object} * return: {Array} filenames found matching the pattern Perform a synchronous glob search. ## Class: glob.Glob Create a Glob object by instantiating the `glob.Glob` class. ```javascript var Glob = require("glob").Glob var mg = new Glob(pattern, options, cb) ``` It's an EventEmitter, and starts walking the filesystem to find matches immediately. ### new glob.Glob(pattern, [options], [cb]) * `pattern` {String} pattern to search for * `options` {Object} * `cb` {Function} Called when an error occurs, or matches are found * `err` {Error | null} * `matches` {Array} filenames found matching the pattern Note that if the `sync` flag is set in the options, then matches will be immediately available on the `g.found` member. ### Properties * `minimatch` The minimatch object that the glob uses. * `options` The options object passed in. * `aborted` Boolean which is set to true when calling `abort()`. There is no way at this time to continue a glob search after aborting, but you can re-use the statCache to avoid having to duplicate syscalls. * `statCache` Collection of all the stat results the glob search performed. * `cache` Convenience object. Each field has the following possible values: * `false` - Path does not exist * `true` - Path exists * `'DIR'` - Path exists, and is not a directory * `'FILE'` - Path exists, and is a directory * `[file, entries, ...]` - Path exists, is a directory, and the array value is the results of `fs.readdir` * `statCache` Cache of `fs.stat` results, to prevent statting the same path multiple times. * `symlinks` A record of which paths are symbolic links, which is relevant in resolving `**` patterns. * `realpathCache` An optional object which is passed to `fs.realpath` to minimize unnecessary syscalls. It is stored on the instantiated Glob object, and may be re-used. ### Events * `end` When the matching is finished, this is emitted with all the matches found. If the `nonull` option is set, and no match was found, then the `matches` list contains the original pattern. The matches are sorted, unless the `nosort` flag is set. * `match` Every time a match is found, this is emitted with the matched. * `error` Emitted when an unexpected error is encountered, or whenever any fs error occurs if `options.strict` is set. * `abort` When `abort()` is called, this event is raised. 
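For example, a consumer of these events can act on matches as they stream in
and then use the final sorted set (a minimal sketch; the pattern and log
messages are illustrative only):

```javascript
var Glob = require("glob").Glob

var g = new Glob("**/*.js")
g.on("match", function (file) {
  // emitted once per match, as soon as it is found
  console.log("found:", file)
})
g.on("end", function (files) {
  // emitted once with every match (sorted unless `nosort` is set)
  console.log("done,", files.length, "matches")
})
```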
### Methods * `pause` Temporarily stop the search * `resume` Resume the search * `abort` Stop the search forever ### Options All the options that can be passed to Minimatch can also be passed to Glob to change pattern matching behavior. Also, some have been added, or have glob-specific ramifications. All options are false by default, unless otherwise noted. All options are added to the Glob object, as well. If you are running many `glob` operations, you can pass a Glob object as the `options` argument to a subsequent operation to shortcut some `stat` and `readdir` calls. At the very least, you may pass in shared `symlinks`, `statCache`, `realpathCache`, and `cache` options, so that parallel glob operations will be sped up by sharing information about the filesystem. * `cwd` The current working directory in which to search. Defaults to `process.cwd()`. * `root` The place where patterns starting with `/` will be mounted onto. Defaults to `path.resolve(options.cwd, "/")` (`/` on Unix systems, and `C:\` or some such on Windows.) * `dot` Include `.dot` files in normal matches and `globstar` matches. Note that an explicit dot in a portion of the pattern will always match dot files. * `nomount` By default, a pattern starting with a forward-slash will be "mounted" onto the root setting, so that a valid filesystem path is returned. Set this flag to disable that behavior. * `mark` Add a `/` character to directory matches. Note that this requires additional stat calls. * `nosort` Don't sort the results. * `stat` Set to true to stat *all* results. This reduces performance somewhat, and is completely unnecessary, unless `readdir` is presumed to be an untrustworthy indicator of file existence. * `silent` When an unusual error is encountered when attempting to read a directory, a warning will be printed to stderr. Set the `silent` option to true to suppress these warnings. * `strict` When an unusual error is encountered when attempting to read a directory, the process will just continue on in search of other matches. Set the `strict` option to raise an error in these cases. * `cache` See `cache` property above. Pass in a previously generated cache object to save some fs calls. * `statCache` A cache of results of filesystem information, to prevent unnecessary stat calls. While it should not normally be necessary to set this, you may pass the statCache from one glob() call to the options object of another, if you know that the filesystem will not change between calls. (See "Race Conditions" below.) * `symlinks` A cache of known symbolic links. You may pass in a previously generated `symlinks` object to save `lstat` calls when resolving `**` matches. * `sync` DEPRECATED: use `glob.sync(pattern, opts)` instead. * `nounique` In some cases, brace-expanded patterns can result in the same file showing up multiple times in the result set. By default, this implementation prevents duplicates in the result set. Set this flag to disable that behavior. * `nonull` Set to never return an empty set, instead returning a set containing the pattern itself. This is the default in glob(3). * `debug` Set to enable debug logging in minimatch and glob. * `nobrace` Do not expand `{a,b}` and `{1..3}` brace sets. * `noglobstar` Do not match `**` against multiple filenames. (Ie, treat it as a normal `*` instead.) * `noext` Do not match `+(a|b)` "extglob" patterns. * `nocase` Perform a case-insensitive match. 
Note: on case-insensitive filesystems, non-magic patterns will match by default, since `stat` and `readdir` will not raise errors. * `matchBase` Perform a basename-only match if the pattern does not contain any slash characters. That is, `*.js` would be treated as equivalent to `**/*.js`, matching all js files in all directories. * `nonegate` Suppress `negate` behavior. (See below.) * `nocomment` Suppress `comment` behavior. (See below.) * `nonull` Return the pattern when no matches are found. * `nodir` Do not match directories, only files. (Note: to match *only* directories, simply put a `/` at the end of the pattern.) * `ignore` Add a pattern or an array of patterns to exclude matches. * `follow` Follow symlinked directories when expanding `**` patterns. Note that this can result in a lot of duplicate references in the presence of cyclic links. * `realpath` Set to true to call `fs.realpath` on all of the results. In the case of a symlink that cannot be resolved, the full absolute path to the matched entry is returned (though it will usually be a broken symlink) ## Comparisons to other fnmatch/glob implementations While strict compliance with the existing standards is a worthwhile goal, some discrepancies exist between node-glob and other implementations, and are intentional. If the pattern starts with a `!` character, then it is negated. Set the `nonegate` flag to suppress this behavior, and treat leading `!` characters normally. This is perhaps relevant if you wish to start the pattern with a negative extglob pattern like `!(a|B)`. Multiple `!` characters at the start of a pattern will negate the pattern multiple times. If a pattern starts with `#`, then it is treated as a comment, and will not match anything. Use `\#` to match a literal `#` at the start of a line, or set the `nocomment` flag to suppress this behavior. The double-star character `**` is supported by default, unless the `noglobstar` flag is set. This is supported in the manner of bsdglob and bash 4.3, where `**` only has special significance if it is the only thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but `a/**b` will not. Note that symlinked directories are not crawled as part of a `**`, though their contents may match against subsequent portions of the pattern. This prevents infinite loops and duplicates and the like. If an escaped pattern has no matches, and the `nonull` flag is set, then glob returns the pattern as-provided, rather than interpreting the character escapes. For example, `glob.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than `"*a?"`. This is akin to setting the `nullglob` option in bash, except that it does not resolve escaped pattern characters. If brace expansion is not disabled, then it is performed before any other interpretation of the glob pattern. Thus, a pattern like `+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded **first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are checked for validity. Since those two are valid, matching proceeds. ## Windows **Please only use forward-slashes in glob expressions.** Though windows uses either `/` or `\` as its path separator, only `/` characters are used by this glob implementation. You must use forward-slashes **only** in glob expressions. Back-slashes will always be interpreted as escape characters, not path separators. Results from absolute patterns such as `/foo/*` are mounted onto the root setting using `path.join`. 
On windows, this will by default result in `/foo/*` matching `C:\foo\bar.txt`. ## Race Conditions Glob searching, by its very nature, is susceptible to race conditions, since it relies on directory walking and such. As a result, it is possible that a file that exists when glob looks for it may have been deleted or modified by the time it returns the result. As part of its internal implementation, this program caches all stat and readdir calls that it makes, in order to cut down on system overhead. However, this also makes it even more susceptible to races, especially if the cache or statCache objects are reused between glob calls. Users are thus advised not to use a glob result as a guarantee of filesystem state in the face of rapid changes. For the vast majority of operations, this is never a problem. ## Contributing Any change to behavior (including bugfixes) must come with a test. Patches that fail tests or reduce performance will be rejected. ``` # to run tests npm test # to re-generate test fixtures npm run test-regen # to benchmark against bash/zsh npm run bench # to profile javascript npm run prof ``` npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/common.js0000644000000000000000000001341512631326456023123 0ustar 00000000000000exports.alphasort = alphasort exports.alphasorti = alphasorti exports.isAbsolute = process.platform === "win32" ? absWin : absUnix exports.setopts = setopts exports.ownProp = ownProp exports.makeAbs = makeAbs exports.finish = finish exports.mark = mark exports.isIgnored = isIgnored exports.childrenIgnored = childrenIgnored function ownProp (obj, field) { return Object.prototype.hasOwnProperty.call(obj, field) } var path = require("path") var minimatch = require("minimatch") var Minimatch = minimatch.Minimatch function absWin (p) { if (absUnix(p)) return true // pull off the device/UNC bit from a windows path. // from node's lib/path.js var splitDeviceRe = /^([a-zA-Z]:|[\\\/]{2}[^\\\/]+[\\\/]+[^\\\/]+)?([\\\/])?([\s\S]*?)$/ var result = splitDeviceRe.exec(p) var device = result[1] || '' var isUnc = device && device.charAt(1) !== ':' var isAbsolute = !!result[2] || isUnc // UNC paths are always absolute return isAbsolute } function absUnix (p) { return p.charAt(0) === "/" || p === "" } function alphasorti (a, b) { return a.toLowerCase().localeCompare(b.toLowerCase()) } function alphasort (a, b) { return a.localeCompare(b) } function setupIgnores (self, options) { self.ignore = options.ignore || [] if (!Array.isArray(self.ignore)) self.ignore = [self.ignore] if (self.ignore.length) { self.ignore = self.ignore.map(ignoreMap) } } function ignoreMap (pattern) { var gmatcher = null if (pattern.slice(-3) === '/**') { var gpattern = pattern.replace(/(\/\*\*)+$/, '') gmatcher = new Minimatch(gpattern, { nonegate: true }) } return { matcher: new Minimatch(pattern, { nonegate: true }), gmatcher: gmatcher } } function setopts (self, pattern, options) { if (!options) options = {} // base-matching: just use globstar for that. 
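// (Editor's illustrative note, not upstream code: with { matchBase: true }
// and a slash-free pattern, the rewrite below promotes the pattern to a
// globstar match, e.g. setopts(self, "*.js", { matchBase: true }) leaves
// self.pattern === "**/*.js".)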
if (options.matchBase && -1 === pattern.indexOf("/")) { if (options.noglobstar) { throw new Error("base matching requires globstar") } pattern = "**/" + pattern } self.pattern = pattern self.strict = options.strict !== false self.realpath = !!options.realpath self.realpathCache = options.realpathCache || Object.create(null) self.follow = !!options.follow self.dot = !!options.dot self.mark = !!options.mark self.nodir = !!options.nodir if (self.nodir) self.mark = true self.sync = !!options.sync self.nounique = !!options.nounique self.nonull = !!options.nonull self.nosort = !!options.nosort self.nocase = !!options.nocase self.stat = !!options.stat self.noprocess = !!options.noprocess self.maxLength = options.maxLength || Infinity self.cache = options.cache || Object.create(null) self.statCache = options.statCache || Object.create(null) self.symlinks = options.symlinks || Object.create(null) setupIgnores(self, options) self.changedCwd = false var cwd = process.cwd() if (!ownProp(options, "cwd")) self.cwd = cwd else { self.cwd = options.cwd self.changedCwd = path.resolve(options.cwd) !== cwd } self.root = options.root || path.resolve(self.cwd, "/") self.root = path.resolve(self.root) if (process.platform === "win32") self.root = self.root.replace(/\\/g, "/") self.nomount = !!options.nomount self.minimatch = new Minimatch(pattern, options) self.options = self.minimatch.options } function finish (self) { var nou = self.nounique var all = nou ? [] : Object.create(null) for (var i = 0, l = self.matches.length; i < l; i ++) { var matches = self.matches[i] if (!matches || Object.keys(matches).length === 0) { if (self.nonull) { // do like the shell, and spit out the literal glob var literal = self.minimatch.globSet[i] if (nou) all.push(literal) else all[literal] = true } } else { // had matches var m = Object.keys(matches) if (nou) all.push.apply(all, m) else m.forEach(function (m) { all[m] = true }) } } if (!nou) all = Object.keys(all) if (!self.nosort) all = all.sort(self.nocase ? alphasorti : alphasort) // at *some* point we statted all of these if (self.mark) { for (var i = 0; i < all.length; i++) { all[i] = self._mark(all[i]) } if (self.nodir) { all = all.filter(function (e) { return !(/\/$/.test(e)) }) } } if (self.ignore.length) all = all.filter(function(m) { return !isIgnored(self, m) }) self.found = all } function mark (self, p) { var abs = makeAbs(self, p) var c = self.cache[abs] var m = p if (c) { var isDir = c === 'DIR' || Array.isArray(c) var slash = p.slice(-1) === '/' if (isDir && !slash) m += '/' else if (!isDir && slash) m = m.slice(0, -1) if (m !== p) { var mabs = makeAbs(self, m) self.statCache[mabs] = self.statCache[abs] self.cache[mabs] = self.cache[abs] } } return m } // lotta situps... function makeAbs (self, f) { var abs = f if (f.charAt(0) === '/') { abs = path.join(self.root, f) } else if (exports.isAbsolute(f)) { abs = f } else if (self.changedCwd) { abs = path.resolve(self.cwd, f) } else if (self.realpath) { abs = path.resolve(f) } return abs } // Return true, if pattern ends with globstar '**', for the accompanying parent directory. 
// Ex:- If node_modules/** is the pattern, add 'node_modules' to ignore list along with its contents function isIgnored (self, path) { if (!self.ignore.length) return false return self.ignore.some(function(item) { return item.matcher.match(path) || !!(item.gmatcher && item.gmatcher.match(path)) }) } function childrenIgnored (self, path) { if (!self.ignore.length) return false return self.ignore.some(function(item) { return !!(item.gmatcher && item.gmatcher.match(path)) }) } npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/glob.js0000644000000000000000000004364712631326456022560 0ustar 00000000000000// Approach: // // 1. Get the minimatch set // 2. For each pattern in the set, PROCESS(pattern, false) // 3. Store matches per-set, then uniq them // // PROCESS(pattern, inGlobStar) // Get the first [n] items from pattern that are all strings // Join these together. This is PREFIX. // If there is no more remaining, then stat(PREFIX) and // add to matches if it succeeds. END. // // If inGlobStar and PREFIX is symlink and points to dir // set ENTRIES = [] // else readdir(PREFIX) as ENTRIES // If fail, END // // with ENTRIES // If pattern[n] is GLOBSTAR // // handle the case where the globstar match is empty // // by pruning it out, and testing the resulting pattern // PROCESS(pattern[0..n] + pattern[n+1 .. $], false) // // handle other cases. // for ENTRY in ENTRIES (not dotfiles) // // attach globstar + tail onto the entry // // Mark that this entry is a globstar match // PROCESS(pattern[0..n] + ENTRY + pattern[n .. $], true) // // else // not globstar // for ENTRY in ENTRIES (not dotfiles, unless pattern[n] is dot) // Test ENTRY against pattern[n] // If fails, continue // If passes, PROCESS(pattern[0..n] + item + pattern[n+1 .. $]) // // Caveat: // Cache all stats and readdirs results to minimize syscalls. Since all // we ever care about is existence and directory-ness, we can just keep // `true` for files, and [children,...] for directories, or `false` for // things that don't exist.
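// (Editor's illustrative note, not upstream code: a rough trace of the
// approach above, for the pattern "a/*/c" against a tree containing a/b/c:
//   PROCESS(["a", <regexp for *>, "c"], false)
//     PREFIX = "a"; readdir("a") -> ["b"]
//     "b" matches the * part, so PROCESS(["a/b", "c"], false)
//       every part is now a string, so stat("a/b/c"); it exists -> match.)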
module.exports = glob var fs = require('fs') var minimatch = require('minimatch') var Minimatch = minimatch.Minimatch var inherits = require('inherits') var EE = require('events').EventEmitter var path = require('path') var assert = require('assert') var globSync = require('./sync.js') var common = require('./common.js') var alphasort = common.alphasort var alphasorti = common.alphasorti var isAbsolute = common.isAbsolute var setopts = common.setopts var ownProp = common.ownProp var inflight = require('inflight') var util = require('util') var childrenIgnored = common.childrenIgnored var once = require('once') function glob (pattern, options, cb) { if (typeof options === 'function') cb = options, options = {} if (!options) options = {} if (options.sync) { if (cb) throw new TypeError('callback provided to sync glob') return globSync(pattern, options) } return new Glob(pattern, options, cb) } glob.sync = globSync var GlobSync = glob.GlobSync = globSync.GlobSync // old api surface glob.glob = glob glob.hasMagic = function (pattern, options_) { var options = util._extend({}, options_) options.noprocess = true var g = new Glob(pattern, options) var set = g.minimatch.set if (set.length > 1) return true for (var j = 0; j < set[0].length; j++) { if (typeof set[0][j] !== 'string') return true } return false } glob.Glob = Glob inherits(Glob, EE) function Glob (pattern, options, cb) { if (typeof options === 'function') { cb = options options = null } if (options && options.sync) { if (cb) throw new TypeError('callback provided to sync glob') return new GlobSync(pattern, options) } if (!(this instanceof Glob)) return new Glob(pattern, options, cb) setopts(this, pattern, options) this._didRealPath = false // process each pattern in the minimatch set var n = this.minimatch.set.length // The matches are stored as {: true,...} so that // duplicates are automagically pruned. // Later, we do an Object.keys() on these. // Keep them as a list so we can fill in when nonull is set. this.matches = new Array(n) if (typeof cb === 'function') { cb = once(cb) this.on('error', cb) this.on('end', function (matches) { cb(null, matches) }) } var self = this var n = this.minimatch.set.length this._processing = 0 this.matches = new Array(n) this._emitQueue = [] this._processQueue = [] this.paused = false if (this.noprocess) return this if (n === 0) return done() for (var i = 0; i < n; i ++) { this._process(this.minimatch.set[i], i, false, done) } function done () { --self._processing if (self._processing <= 0) self._finish() } } Glob.prototype._finish = function () { assert(this instanceof Glob) if (this.aborted) return if (this.realpath && !this._didRealpath) return this._realpath() common.finish(this) this.emit('end', this.found) } Glob.prototype._realpath = function () { if (this._didRealpath) return this._didRealpath = true var n = this.matches.length if (n === 0) return this._finish() var self = this for (var i = 0; i < this.matches.length; i++) this._realpathSet(i, next) function next () { if (--n === 0) self._finish() } } Glob.prototype._realpathSet = function (index, cb) { var matchset = this.matches[index] if (!matchset) return cb() var found = Object.keys(matchset) var self = this var n = found.length if (n === 0) return cb() var set = this.matches[index] = Object.create(null) found.forEach(function (p, i) { // If there's a problem with the stat, then it means that // one or more of the links in the realpath couldn't be // resolved. just return the abs value in that case. 
p = self._makeAbs(p) fs.realpath(p, self.realpathCache, function (er, real) { if (!er) set[real] = true else if (er.syscall === 'stat') set[p] = true else self.emit('error', er) // srsly wtf right here if (--n === 0) { self.matches[index] = set cb() } }) }) } Glob.prototype._mark = function (p) { return common.mark(this, p) } Glob.prototype._makeAbs = function (f) { return common.makeAbs(this, f) } Glob.prototype.abort = function () { this.aborted = true this.emit('abort') } Glob.prototype.pause = function () { if (!this.paused) { this.paused = true this.emit('pause') } } Glob.prototype.resume = function () { if (this.paused) { this.emit('resume') this.paused = false if (this._emitQueue.length) { var eq = this._emitQueue.slice(0) this._emitQueue.length = 0 for (var i = 0; i < eq.length; i ++) { var e = eq[i] this._emitMatch(e[0], e[1]) } } if (this._processQueue.length) { var pq = this._processQueue.slice(0) this._processQueue.length = 0 for (var i = 0; i < pq.length; i ++) { var p = pq[i] this._processing-- this._process(p[0], p[1], p[2], p[3]) } } } } Glob.prototype._process = function (pattern, index, inGlobStar, cb) { assert(this instanceof Glob) assert(typeof cb === 'function') if (this.aborted) return this._processing++ if (this.paused) { this._processQueue.push([pattern, index, inGlobStar, cb]) return } //console.error('PROCESS %d', this._processing, pattern) // Get the first [n] parts of pattern that are all strings. var n = 0 while (typeof pattern[n] === 'string') { n ++ } // now n is the index of the first one that is *not* a string. // see if there's anything else var prefix switch (n) { // if not, then this is rather simple case pattern.length: this._processSimple(pattern.join('/'), index, cb) return case 0: // pattern *starts* with some non-trivial item. // going to readdir(cwd), but not include the prefix in matches. prefix = null break default: // pattern has some string bits in the front. // whatever it starts with, whether that's 'absolute' like /foo/bar, // or 'relative' like '../baz' prefix = pattern.slice(0, n).join('/') break } var remain = pattern.slice(n) // get the list of entries. var read if (prefix === null) read = '.' else if (isAbsolute(prefix) || isAbsolute(pattern.join('/'))) { if (!prefix || !isAbsolute(prefix)) prefix = '/' + prefix read = prefix } else read = prefix var abs = this._makeAbs(read) //if ignored, skip _processing if (childrenIgnored(this, read)) return cb() var isGlobStar = remain[0] === minimatch.GLOBSTAR if (isGlobStar) this._processGlobStar(prefix, read, abs, remain, index, inGlobStar, cb) else this._processReaddir(prefix, read, abs, remain, index, inGlobStar, cb) } Glob.prototype._processReaddir = function (prefix, read, abs, remain, index, inGlobStar, cb) { var self = this this._readdir(abs, inGlobStar, function (er, entries) { return self._processReaddir2(prefix, read, abs, remain, index, inGlobStar, entries, cb) }) } Glob.prototype._processReaddir2 = function (prefix, read, abs, remain, index, inGlobStar, entries, cb) { // if the abs isn't a dir, then nothing can match! if (!entries) return cb() // It will only match dot entries if it starts with a dot, or if // dot is set. Stuff like @(.foo|.bar) isn't allowed. var pn = remain[0] var negate = !!this.minimatch.negate var rawGlob = pn._glob var dotOk = this.dot || rawGlob.charAt(0) === '.' var matchedEntries = [] for (var i = 0; i < entries.length; i++) { var e = entries[i] if (e.charAt(0) !== '.' 
|| dotOk) { var m if (negate && !prefix) { m = !e.match(pn) } else { m = e.match(pn) } if (m) matchedEntries.push(e) } } //console.error('prd2', prefix, entries, remain[0]._glob, matchedEntries) var len = matchedEntries.length // If there are no matched entries, then nothing matches. if (len === 0) return cb() // if this is the last remaining pattern bit, then no need for // an additional stat *unless* the user has specified mark or // stat explicitly. We know they exist, since readdir returned // them. if (remain.length === 1 && !this.mark && !this.stat) { if (!this.matches[index]) this.matches[index] = Object.create(null) for (var i = 0; i < len; i ++) { var e = matchedEntries[i] if (prefix) { if (prefix !== '/') e = prefix + '/' + e else e = prefix + e } if (e.charAt(0) === '/' && !this.nomount) { e = path.join(this.root, e) } this._emitMatch(index, e) } // This was the last one, and no stats were needed return cb() } // now test all matched entries as stand-ins for that part // of the pattern. remain.shift() for (var i = 0; i < len; i ++) { var e = matchedEntries[i] var newPattern if (prefix) { if (prefix !== '/') e = prefix + '/' + e else e = prefix + e } this._process([e].concat(remain), index, inGlobStar, cb) } cb() } Glob.prototype._emitMatch = function (index, e) { if (this.aborted) return if (this.matches[index][e]) return if (this.paused) { this._emitQueue.push([index, e]) return } var abs = this._makeAbs(e) if (this.nodir) { var c = this.cache[abs] if (c === 'DIR' || Array.isArray(c)) return } if (this.mark) e = this._mark(e) this.matches[index][e] = true var st = this.statCache[abs] if (st) this.emit('stat', e, st) this.emit('match', e) } Glob.prototype._readdirInGlobStar = function (abs, cb) { if (this.aborted) return // follow all symlinked directories forever // just proceed as if this is a non-globstar situation if (this.follow) return this._readdir(abs, false, cb) var lstatkey = 'lstat\0' + abs var self = this var lstatcb = inflight(lstatkey, lstatcb_) if (lstatcb) fs.lstat(abs, lstatcb) function lstatcb_ (er, lstat) { if (er) return cb() var isSym = lstat.isSymbolicLink() self.symlinks[abs] = isSym // If it's not a symlink or a dir, then it's definitely a regular file. // don't bother doing a readdir in that case. if (!isSym && !lstat.isDirectory()) { self.cache[abs] = 'FILE' cb() } else self._readdir(abs, false, cb) } } Glob.prototype._readdir = function (abs, inGlobStar, cb) { if (this.aborted) return cb = inflight('readdir\0'+abs+'\0'+inGlobStar, cb) if (!cb) return //console.error('RD %j %j', +inGlobStar, abs) if (inGlobStar && !ownProp(this.symlinks, abs)) return this._readdirInGlobStar(abs, cb) if (ownProp(this.cache, abs)) { var c = this.cache[abs] if (!c || c === 'FILE') return cb() if (Array.isArray(c)) return cb(null, c) } var self = this fs.readdir(abs, readdirCb(this, abs, cb)) } function readdirCb (self, abs, cb) { return function (er, entries) { if (er) self._readdirError(abs, er, cb) else self._readdirEntries(abs, entries, cb) } } Glob.prototype._readdirEntries = function (abs, entries, cb) { if (this.aborted) return // if we haven't asked to stat everything, then just // assume that everything in there exists, so we can avoid // having to stat it a second time. 
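// (Editor's illustrative note, not upstream code: after a readdir of
// "/src" containing only "a.js", the cache ends up shaped like
//   { "/src": ["a.js"], "/src/a.js": true }
// -- an array marks a directory listing, `true` marks bare existence.)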
if (!this.mark && !this.stat) { for (var i = 0; i < entries.length; i ++) { var e = entries[i] if (abs === '/') e = abs + e else e = abs + '/' + e this.cache[e] = true } } this.cache[abs] = entries return cb(null, entries) } Glob.prototype._readdirError = function (f, er, cb) { if (this.aborted) return // handle errors, and cache the information switch (er.code) { case 'ENOTDIR': // totally normal. means it *does* exist. this.cache[this._makeAbs(f)] = 'FILE' break case 'ENOENT': // not terribly unusual case 'ELOOP': case 'ENAMETOOLONG': case 'UNKNOWN': this.cache[this._makeAbs(f)] = false break default: // some unusual error. Treat as failure. this.cache[this._makeAbs(f)] = false if (this.strict) return this.emit('error', er) if (!this.silent) console.error('glob error', er) break } return cb() } Glob.prototype._processGlobStar = function (prefix, read, abs, remain, index, inGlobStar, cb) { var self = this this._readdir(abs, inGlobStar, function (er, entries) { self._processGlobStar2(prefix, read, abs, remain, index, inGlobStar, entries, cb) }) } Glob.prototype._processGlobStar2 = function (prefix, read, abs, remain, index, inGlobStar, entries, cb) { //console.error('pgs2', prefix, remain[0], entries) // no entries means not a dir, so it can never have matches // foo.txt/** doesn't match foo.txt if (!entries) return cb() // test without the globstar, and with every child both below // and replacing the globstar. var remainWithoutGlobStar = remain.slice(1) var gspref = prefix ? [ prefix ] : [] var noGlobStar = gspref.concat(remainWithoutGlobStar) // the noGlobStar pattern exits the inGlobStar state this._process(noGlobStar, index, false, cb) var isSym = this.symlinks[abs] var len = entries.length // If it's a symlink, and we're in a globstar, then stop if (isSym && inGlobStar) return cb() for (var i = 0; i < len; i++) { var e = entries[i] if (e.charAt(0) === '.' && !this.dot) continue // these two cases enter the inGlobStar state var instead = gspref.concat(entries[i], remainWithoutGlobStar) this._process(instead, index, true, cb) var below = gspref.concat(entries[i], remain) this._process(below, index, true, cb) } cb() } Glob.prototype._processSimple = function (prefix, index, cb) { // XXX review this. Shouldn't it be doing the mounting etc // before doing stat? kinda weird? 
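// (Editor's illustrative note, not upstream code: _processSimple only runs
// when the whole pattern is literal strings, e.g. "a/b/c" -- no readdir is
// needed, a single stat decides whether the path matches.)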
var self = this this._stat(prefix, function (er, exists) { self._processSimple2(prefix, index, er, exists, cb) }) } Glob.prototype._processSimple2 = function (prefix, index, er, exists, cb) { //console.error('ps2', prefix, exists) if (!this.matches[index]) this.matches[index] = Object.create(null) // If it doesn't exist, then just mark the lack of results if (!exists) return cb() if (prefix && isAbsolute(prefix) && !this.nomount) { var trail = /[\/\\]$/.test(prefix) if (prefix.charAt(0) === '/') { prefix = path.join(this.root, prefix) } else { prefix = path.resolve(this.root, prefix) if (trail) prefix += '/' } } if (process.platform === 'win32') prefix = prefix.replace(/\\/g, '/') // Mark this as a match this._emitMatch(index, prefix) cb() } // Returns either 'DIR', 'FILE', or false Glob.prototype._stat = function (f, cb) { var abs = this._makeAbs(f) var needDir = f.slice(-1) === '/' if (f.length > this.maxLength) return cb() if (!this.stat && ownProp(this.cache, abs)) { var c = this.cache[abs] if (Array.isArray(c)) c = 'DIR' // It exists, but maybe not how we need it if (!needDir || c === 'DIR') return cb(null, c) if (needDir && c === 'FILE') return cb() // otherwise we have to stat, because maybe c=true // if we know it exists, but not what it is. } var exists var stat = this.statCache[abs] if (stat !== undefined) { if (stat === false) return cb(null, stat) else { var type = stat.isDirectory() ? 'DIR' : 'FILE' if (needDir && type === 'FILE') return cb() else return cb(null, type, stat) } } var self = this var statcb = inflight('stat\0' + abs, lstatcb_) if (statcb) fs.lstat(abs, statcb) function lstatcb_ (er, lstat) { if (lstat && lstat.isSymbolicLink()) { // If it's a symlink, then treat it as the target, unless // the target does not exist, then treat it as a file. return fs.stat(abs, function (er, stat) { if (er) self._stat2(f, abs, null, lstat, cb) else self._stat2(f, abs, er, stat, cb) }) } else { self._stat2(f, abs, er, lstat, cb) } } } Glob.prototype._stat2 = function (f, abs, er, stat, cb) { if (er) { this.statCache[abs] = false return cb() } var needDir = f.slice(-1) === '/' this.statCache[abs] = stat if (abs.slice(-1) === '/' && !stat.isDirectory()) return cb(null, false, stat) var c = stat.isDirectory() ? 'DIR' : 'FILE' this.cache[abs] = this.cache[abs] || c if (needDir && c !== 'DIR') return cb() return cb(null, c, stat) } npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/0000755000000000000000000000000012631326456023746 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/package.json0000644000000000000000000000341012631326456023555 0ustar 00000000000000{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "glob", "description": "a little globber", "version": "4.5.3", "repository": { "type": "git", "url": "git://github.com/isaacs/node-glob.git" }, "main": "glob.js", "files": [ "glob.js", "sync.js", "common.js" ], "engines": { "node": "*" }, "dependencies": { "inflight": "^1.0.4", "inherits": "2", "minimatch": "^2.0.1", "once": "^1.3.0" }, "devDependencies": { "mkdirp": "0", "rimraf": "^2.2.8", "tap": "^0.5.0", "tick": "0.0.6" }, "scripts": { "prepublish": "npm run benchclean", "profclean": "rm -f v8.log profile.txt", "test": "npm run profclean && tap test/*.js", "test-regen": "npm run profclean && TEST_REGEN=1 node test/00-setup.js", "bench": "bash benchmark.sh", "prof": "bash prof.sh && cat profile.txt", "benchclean": "bash benchclean.sh" }, "license": "ISC", "gitHead": "a4e461ab59a837eee80a4d8dbdbf5ae1054a646f", "bugs": { "url": "https://github.com/isaacs/node-glob/issues" }, "homepage": "https://github.com/isaacs/node-glob", "_id": "glob@4.5.3", "_shasum": "c6cb73d3226c1efef04de3c56d012f03377ee15f", "_from": "glob@>=3.0.0 <4.0.0||>=4.0.0 <5.0.0", "_npmVersion": "2.7.1", "_nodeVersion": "1.4.2", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "c6cb73d3226c1efef04de3c56d012f03377ee15f", "tarball": "http://registry.npmjs.org/glob/-/glob-4.5.3.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/glob/-/glob-4.5.3.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/sync.js0000644000000000000000000002614512631326456022613 0ustar 00000000000000module.exports = globSync globSync.GlobSync = GlobSync var fs = require('fs') var minimatch = require('minimatch') var Minimatch = minimatch.Minimatch var Glob = require('./glob.js').Glob var util = require('util') var path = require('path') var assert = require('assert') var common = require('./common.js') var alphasort = common.alphasort var alphasorti = common.alphasorti var isAbsolute = common.isAbsolute var setopts = common.setopts var ownProp = common.ownProp var childrenIgnored = common.childrenIgnored function globSync (pattern, options) { if (typeof options === 'function' || arguments.length === 3) throw new TypeError('callback provided to sync glob\n'+ 'See: https://github.com/isaacs/node-glob/issues/167') return new GlobSync(pattern, options).found } function GlobSync (pattern, options) { if (!pattern) throw new Error('must provide pattern') if (typeof options === 'function' || arguments.length === 3) throw new TypeError('callback provided to sync glob\n'+ 'See: https://github.com/isaacs/node-glob/issues/167') if (!(this instanceof GlobSync)) return new GlobSync(pattern, options) setopts(this, pattern, options) if (this.noprocess) return this var n = this.minimatch.set.length this.matches = new Array(n) for (var i = 0; i < n; i ++) { this._process(this.minimatch.set[i], i, false) } this._finish() } GlobSync.prototype._finish = function () { assert(this instanceof GlobSync) if (this.realpath) { var self = this this.matches.forEach(function (matchset, index) { var set = self.matches[index] = Object.create(null) for (var p in matchset) { try { p = self._makeAbs(p) var real = fs.realpathSync(p, this.realpathCache) set[real] = true } catch (er) { if (er.syscall === 'stat') set[self._makeAbs(p)] = true else throw er } } }) } common.finish(this) } GlobSync.prototype._process = function (pattern, index, inGlobStar) { assert(this 
instanceof GlobSync) // Get the first [n] parts of pattern that are all strings. var n = 0 while (typeof pattern[n] === 'string') { n ++ } // now n is the index of the first one that is *not* a string. // See if there's anything else var prefix switch (n) { // if not, then this is rather simple case pattern.length: this._processSimple(pattern.join('/'), index) return case 0: // pattern *starts* with some non-trivial item. // going to readdir(cwd), but not include the prefix in matches. prefix = null break default: // pattern has some string bits in the front. // whatever it starts with, whether that's 'absolute' like /foo/bar, // or 'relative' like '../baz' prefix = pattern.slice(0, n).join('/') break } var remain = pattern.slice(n) // get the list of entries. var read if (prefix === null) read = '.' else if (isAbsolute(prefix) || isAbsolute(pattern.join('/'))) { if (!prefix || !isAbsolute(prefix)) prefix = '/' + prefix read = prefix } else read = prefix var abs = this._makeAbs(read) //if ignored, skip processing if (childrenIgnored(this, read)) return var isGlobStar = remain[0] === minimatch.GLOBSTAR if (isGlobStar) this._processGlobStar(prefix, read, abs, remain, index, inGlobStar) else this._processReaddir(prefix, read, abs, remain, index, inGlobStar) } GlobSync.prototype._processReaddir = function (prefix, read, abs, remain, index, inGlobStar) { var entries = this._readdir(abs, inGlobStar) // if the abs isn't a dir, then nothing can match! if (!entries) return // It will only match dot entries if it starts with a dot, or if // dot is set. Stuff like @(.foo|.bar) isn't allowed. var pn = remain[0] var negate = !!this.minimatch.negate var rawGlob = pn._glob var dotOk = this.dot || rawGlob.charAt(0) === '.' var matchedEntries = [] for (var i = 0; i < entries.length; i++) { var e = entries[i] if (e.charAt(0) !== '.' || dotOk) { var m if (negate && !prefix) { m = !e.match(pn) } else { m = e.match(pn) } if (m) matchedEntries.push(e) } } var len = matchedEntries.length // If there are no matched entries, then nothing matches. if (len === 0) return // if this is the last remaining pattern bit, then no need for // an additional stat *unless* the user has specified mark or // stat explicitly. We know they exist, since readdir returned // them. if (remain.length === 1 && !this.mark && !this.stat) { if (!this.matches[index]) this.matches[index] = Object.create(null) for (var i = 0; i < len; i ++) { var e = matchedEntries[i] if (prefix) { if (prefix.slice(-1) !== '/') e = prefix + '/' + e else e = prefix + e } if (e.charAt(0) === '/' && !this.nomount) { e = path.join(this.root, e) } this.matches[index][e] = true } // This was the last one, and no stats were needed return } // now test all matched entries as stand-ins for that part // of the pattern. 
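// (Editor's illustrative note, not upstream code: for the pattern
// src/*/index.js with matched entries "a" and "b", the recursion continues
// with the literal patterns [src, a, index.js] and [src, b, index.js].)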
remain.shift() for (var i = 0; i < len; i ++) { var e = matchedEntries[i] var newPattern if (prefix) newPattern = [prefix, e] else newPattern = [e] this._process(newPattern.concat(remain), index, inGlobStar) } } GlobSync.prototype._emitMatch = function (index, e) { var abs = this._makeAbs(e) if (this.mark) e = this._mark(e) if (this.matches[index][e]) return if (this.nodir) { var c = this.cache[this._makeAbs(e)] if (c === 'DIR' || Array.isArray(c)) return } this.matches[index][e] = true if (this.stat) this._stat(e) } GlobSync.prototype._readdirInGlobStar = function (abs) { // follow all symlinked directories forever // just proceed as if this is a non-globstar situation if (this.follow) return this._readdir(abs, false) var entries var lstat var stat try { lstat = fs.lstatSync(abs) } catch (er) { // lstat failed, doesn't exist return null } var isSym = lstat.isSymbolicLink() this.symlinks[abs] = isSym // If it's not a symlink or a dir, then it's definitely a regular file. // don't bother doing a readdir in that case. if (!isSym && !lstat.isDirectory()) this.cache[abs] = 'FILE' else entries = this._readdir(abs, false) return entries } GlobSync.prototype._readdir = function (abs, inGlobStar) { var entries if (inGlobStar && !ownProp(this.symlinks, abs)) return this._readdirInGlobStar(abs) if (ownProp(this.cache, abs)) { var c = this.cache[abs] if (!c || c === 'FILE') return null if (Array.isArray(c)) return c } try { return this._readdirEntries(abs, fs.readdirSync(abs)) } catch (er) { this._readdirError(abs, er) return null } } GlobSync.prototype._readdirEntries = function (abs, entries) { // if we haven't asked to stat everything, then just // assume that everything in there exists, so we can avoid // having to stat it a second time. if (!this.mark && !this.stat) { for (var i = 0; i < entries.length; i ++) { var e = entries[i] if (abs === '/') e = abs + e else e = abs + '/' + e this.cache[e] = true } } this.cache[abs] = entries // mark and cache dir-ness return entries } GlobSync.prototype._readdirError = function (f, er) { // handle errors, and cache the information switch (er.code) { case 'ENOTDIR': // totally normal. means it *does* exist. this.cache[this._makeAbs(f)] = 'FILE' break case 'ENOENT': // not terribly unusual case 'ELOOP': case 'ENAMETOOLONG': case 'UNKNOWN': this.cache[this._makeAbs(f)] = false break default: // some unusual error. Treat as failure. this.cache[this._makeAbs(f)] = false if (this.strict) throw er if (!this.silent) console.error('glob error', er) break } } GlobSync.prototype._processGlobStar = function (prefix, read, abs, remain, index, inGlobStar) { var entries = this._readdir(abs, inGlobStar) // no entries means not a dir, so it can never have matches // foo.txt/** doesn't match foo.txt if (!entries) return // test without the globstar, and with every child both below // and replacing the globstar. var remainWithoutGlobStar = remain.slice(1) var gspref = prefix ? [ prefix ] : [] var noGlobStar = gspref.concat(remainWithoutGlobStar) // the noGlobStar pattern exits the inGlobStar state this._process(noGlobStar, index, false) var len = entries.length var isSym = this.symlinks[abs] // If it's a symlink, and we're in a globstar, then stop if (isSym && inGlobStar) return for (var i = 0; i < len; i++) { var e = entries[i] if (e.charAt(0) === '.' 
&& !this.dot) continue // these two cases enter the inGlobStar state var instead = gspref.concat(entries[i], remainWithoutGlobStar) this._process(instead, index, true) var below = gspref.concat(entries[i], remain) this._process(below, index, true) } } GlobSync.prototype._processSimple = function (prefix, index) { // XXX review this. Shouldn't it be doing the mounting etc // before doing stat? kinda weird? var exists = this._stat(prefix) if (!this.matches[index]) this.matches[index] = Object.create(null) // If it doesn't exist, then just mark the lack of results if (!exists) return if (prefix && isAbsolute(prefix) && !this.nomount) { var trail = /[\/\\]$/.test(prefix) if (prefix.charAt(0) === '/') { prefix = path.join(this.root, prefix) } else { prefix = path.resolve(this.root, prefix) if (trail) prefix += '/' } } if (process.platform === 'win32') prefix = prefix.replace(/\\/g, '/') // Mark this as a match this.matches[index][prefix] = true } // Returns either 'DIR', 'FILE', or false GlobSync.prototype._stat = function (f) { var abs = this._makeAbs(f) var needDir = f.slice(-1) === '/' if (f.length > this.maxLength) return false if (!this.stat && ownProp(this.cache, abs)) { var c = this.cache[abs] if (Array.isArray(c)) c = 'DIR' // It exists, but maybe not how we need it if (!needDir || c === 'DIR') return c if (needDir && c === 'FILE') return false // otherwise we have to stat, because maybe c=true // if we know it exists, but not what it is. } var exists var stat = this.statCache[abs] if (!stat) { var lstat try { lstat = fs.lstatSync(abs) } catch (er) { return false } if (lstat.isSymbolicLink()) { try { stat = fs.statSync(abs) } catch (er) { stat = lstat } } else { stat = lstat } } this.statCache[abs] = stat var c = stat.isDirectory() ? 'DIR' : 'FILE' this.cache[abs] = this.cache[abs] || c if (needDir && c !== 'DIR') return false return c } GlobSync.prototype._mark = function (p) { return common.mark(this, p) } GlobSync.prototype._makeAbs = function (f) { return common.makeAbs(this, f) } npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/0000755000000000000000000000000012631326456025717 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/LICENSE0000644000000000000000000000137512631326456026732 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/README.md0000644000000000000000000001471412631326456027205 0ustar 00000000000000# minimatch A minimal matching utility. [![Build Status](https://secure.travis-ci.org/isaacs/minimatch.png)](http://travis-ci.org/isaacs/minimatch) This is the matching library used internally by npm. It works by converting glob expressions into JavaScript `RegExp` objects. 
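For a quick, hedged sketch of that conversion (the exact regular expression produced may vary between versions):

```javascript
var minimatch = require("minimatch")

// compile a glob into a reusable RegExp (makeRe returns false
// if the pattern is invalid)
var re = minimatch.makeRe("*.js")

console.log(re instanceof RegExp) // true
console.log(re.test("index.js"))  // true
console.log(re.test("index.css")) // false
```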
## Usage ```javascript var minimatch = require("minimatch") minimatch("bar.foo", "*.foo") // true! minimatch("bar.foo", "*.bar") // false! minimatch("bar.foo", "*.+(bar|foo)", { debug: true }) // true, and noisy! ``` ## Features Supports these glob features: * Brace Expansion * Extended glob matching * "Globstar" `**` matching See: * `man sh` * `man bash` * `man 3 fnmatch` * `man 5 gitignore` ## Minimatch Class Create a minimatch object by instantiating the `minimatch.Minimatch` class. ```javascript var Minimatch = require("minimatch").Minimatch var mm = new Minimatch(pattern, options) ``` ### Properties * `pattern` The original pattern the minimatch object represents. * `options` The options supplied to the constructor. * `set` A 2-dimensional array of regexp or string expressions. Each row in the array corresponds to a brace-expanded pattern. Each item in the row corresponds to a single path-part. For example, the pattern `{a,b/c}/d` would expand to a set of patterns like: [ [ a, d ] , [ b, c, d ] ] If a portion of the pattern doesn't have any "magic" in it (that is, it's something like `"foo"` rather than `fo*o?`), then it will be left as a string rather than converted to a regular expression. * `regexp` Created by the `makeRe` method. A single regular expression expressing the entire pattern. This is useful in cases where you wish to use the pattern somewhat like `fnmatch(3)` with `FNM_PATHNAME` enabled. * `negate` True if the pattern is negated. * `comment` True if the pattern is a comment. * `empty` True if the pattern is `""`. ### Methods * `makeRe` Generate the `regexp` member if necessary, and return it. Will return `false` if the pattern is invalid. * `match(fname)` Return true if the filename matches the pattern, or false otherwise. * `matchOne(fileArray, patternArray, partial)` Take a `/`-split filename, and match it against a single row in the `regExpSet`. This method is mainly for internal use, but is exposed so that it can be used by a glob-walker that needs to avoid excessive filesystem calls. All other methods are internal, and will be called as necessary. ## Functions The top-level exported function has a `cache` property, which is an LRU cache set to store 100 items. So, calling these methods repeatedly with the same pattern and options will use the same Minimatch object, saving the cost of parsing it multiple times. ### minimatch(path, pattern, options) Main export. Tests a path against the pattern using the options. ```javascript var isJS = minimatch(file, "*.js", { matchBase: true }) ``` ### minimatch.filter(pattern, options) Returns a function that tests its supplied argument, suitable for use with `Array.filter`. Example: ```javascript var javascripts = fileList.filter(minimatch.filter("*.js", {matchBase: true})) ``` ### minimatch.match(list, pattern, options) Match against the list of files, in the style of fnmatch or glob. If nothing is matched, and options.nonull is set, then return a list containing the pattern itself. ```javascript var javascripts = minimatch.match(fileList, "*.js", {matchBase: true}) ``` ### minimatch.makeRe(pattern, options) Make a regular expression object from the pattern. ## Options All options are `false` by default. ### debug Dump a ton of stuff to stderr. ### nobrace Do not expand `{a,b}` and `{1..3}` brace sets. ### noglobstar Disable `**` matching against multiple folder names. ### dot Allow patterns to match filenames starting with a period, even if the pattern does not explicitly have a period in that spot.
Note that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot` is set. ### noext Disable "extglob" style patterns like `+(a|b)`. ### nocase Perform a case-insensitive match. ### nonull When a match is not found by `minimatch.match`, return a list containing the pattern itself if this option is set. When not set, an empty list is returned if there are no matches. ### matchBase If set, then patterns without slashes will be matched against the basename of the path if it contains slashes. For example, `a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`. ### nocomment Suppress the behavior of treating `#` at the start of a pattern as a comment. ### nonegate Suppress the behavior of treating a leading `!` character as negation. ### flipNegate Returns from negate expressions the same as if they were not negated. (Ie, true on a hit, false on a miss.) ## Comparisons to other fnmatch/glob implementations While strict compliance with the existing standards is a worthwhile goal, some discrepancies exist between minimatch and other implementations, and are intentional. If the pattern starts with a `!` character, then it is negated. Set the `nonegate` flag to suppress this behavior, and treat leading `!` characters normally. This is perhaps relevant if you wish to start the pattern with a negative extglob pattern like `!(a|B)`. Multiple `!` characters at the start of a pattern will negate the pattern multiple times. If a pattern starts with `#`, then it is treated as a comment, and will not match anything. Use `\#` to match a literal `#` at the start of a line, or set the `nocomment` flag to suppress this behavior. The double-star character `**` is supported by default, unless the `noglobstar` flag is set. This is supported in the manner of bsdglob and bash 4.1, where `**` only has special significance if it is the only thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but `a/**b` will not. If an escaped pattern has no matches, and the `nonull` flag is set, then minimatch.match returns the pattern as-provided, rather than interpreting the character escapes. For example, `minimatch.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than `"*a?"`. This is akin to setting the `nullglob` option in bash, except that it does not resolve escaped pattern characters. If brace expansion is not disabled, then it is performed before any other interpretation of the glob pattern. Thus, a pattern like `+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded **first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are checked for validity. Since those two are valid, matching proceeds. 
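To make that expansion order concrete, here is a small sketch using the `braceExpand` helper exported by this version of minimatch (illustrative values; see `minimatch.braceExpand` in the bundled source below):

```javascript
var minimatch = require("minimatch")

// braces are expanded before any other glob interpretation
console.log(minimatch.braceExpand("a/{b,c}/d"))
// -> [ 'a/b/d', 'a/c/d' ]

// each expanded pattern is then matched independently
console.log(minimatch("a/c/d", "a/{b,c}/d")) // true
```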
npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/browser.js0000644000000000000000000007515512631326456027755 0ustar 00000000000000(function(f){if(typeof exports==="object"&&typeof module!=="undefined"){module.exports=f()}else if(typeof define==="function"&&define.amd){define([],f)}else{var g;if(typeof window!=="undefined"){g=window}else if(typeof global!=="undefined"){g=global}else if(typeof self!=="undefined"){g=self}else{g=this}g.minimatch = f()}})(function(){var define,module,exports;return (function e(t,n,r){function s(o,u){if(!n[o]){if(!t[o]){var a=typeof require=="function"&&require;if(!u&&a)return a(o,!0);if(i)return i(o,!0);var f=new Error("Cannot find module '"+o+"'");throw f.code="MODULE_NOT_FOUND",f}var l=n[o]={exports:{}};t[o][0].call(l.exports,function(e){var n=t[o][1][e];return s(n?n:e)},l,l.exports,e,t,n,r)}return n[o].exports}var i=typeof require=="function"&&require;for(var o=0;o any number of characters var star = qmark + '*?' // ** when dots are allowed. Anything goes, except .. and . // not (^ or / followed by one or two dots followed by $ or /), // followed by anything, any number of times. var twoStarDot = '(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?' // not a ^ or / followed by a dot, // followed by anything, any number of times. var twoStarNoDot = '(?:(?!(?:\\\/|^)\\.).)*?' // characters that need to be escaped in RegExp. var reSpecials = charSet('().*{}+?[]^$\\!') // "abc" -> { a:true, b:true, c:true } function charSet (s) { return s.split('').reduce(function (set, c) { set[c] = true return set }, {}) } // normalizes slashes. var slashSplit = /\/+/ minimatch.filter = filter function filter (pattern, options) { options = options || {} return function (p, i, list) { return minimatch(p, pattern, options) } } function ext (a, b) { a = a || {} b = b || {} var t = {} Object.keys(b).forEach(function (k) { t[k] = b[k] }) Object.keys(a).forEach(function (k) { t[k] = a[k] }) return t } minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return minimatch var orig = minimatch var m = function minimatch (p, pattern, options) { return orig.minimatch(p, pattern, ext(def, options)) } m.Minimatch = function Minimatch (pattern, options) { return new orig.Minimatch(pattern, ext(def, options)) } return m } Minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return Minimatch return minimatch.defaults(def).Minimatch } function minimatch (p, pattern, options) { if (typeof pattern !== 'string') { throw new TypeError('glob pattern string required') } if (!options) options = {} // shortcut: comments match nothing. if (!options.nocomment && pattern.charAt(0) === '#') { return false } // "" only matches "" if (pattern.trim() === '') return p === '' return new Minimatch(pattern, options).match(p) } function Minimatch (pattern, options) { if (!(this instanceof Minimatch)) { return new Minimatch(pattern, options) } if (typeof pattern !== 'string') { throw new TypeError('glob pattern string required') } if (!options) options = {} pattern = pattern.trim() // windows support: need to use /, not \ if (path.sep !== '/') { pattern = pattern.split(path.sep).join('/') } this.options = options this.set = [] this.pattern = pattern this.regexp = null this.negate = false this.comment = false this.empty = false // make the set of regexps etc. this.make() } Minimatch.prototype.debug = function () {} Minimatch.prototype.make = make function make () { // don't do it more than once. 
if (this._made) return var pattern = this.pattern var options = this.options // empty patterns and comments match nothing. if (!options.nocomment && pattern.charAt(0) === '#') { this.comment = true return } if (!pattern) { this.empty = true return } // step 1: figure out negation, etc. this.parseNegate() // step 2: expand braces var set = this.globSet = this.braceExpand() if (options.debug) this.debug = console.error this.debug(this.pattern, set) // step 3: now we have a set, so turn each one into a series of path-portion // matching patterns. // These will be regexps, except in the case of "**", which is // set to the GLOBSTAR object for globstar behavior, // and will not contain any / characters set = this.globParts = set.map(function (s) { return s.split(slashSplit) }) this.debug(this.pattern, set) // glob --> regexps set = set.map(function (s, si, set) { return s.map(this.parse, this) }, this) this.debug(this.pattern, set) // filter out everything that didn't compile properly. set = set.filter(function (s) { return s.indexOf(false) === -1 }) this.debug(this.pattern, set) this.set = set } Minimatch.prototype.parseNegate = parseNegate function parseNegate () { var pattern = this.pattern var negate = false var options = this.options var negateOffset = 0 if (options.nonegate) return for (var i = 0, l = pattern.length ; i < l && pattern.charAt(i) === '!' ; i++) { negate = !negate negateOffset++ } if (negateOffset) this.pattern = pattern.substr(negateOffset) this.negate = negate } // Brace expansion: // a{b,c}d -> abd acd // a{b,}c -> abc ac // a{0..3}d -> a0d a1d a2d a3d // a{b,c{d,e}f}g -> abg acdfg acefg // a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg // // Invalid sets are not expanded. // a{2..}b -> a{2..}b // a{b}c -> a{b}c minimatch.braceExpand = function (pattern, options) { return braceExpand(pattern, options) } Minimatch.prototype.braceExpand = braceExpand function braceExpand (pattern, options) { if (!options) { if (this instanceof Minimatch) { options = this.options } else { options = {} } } pattern = typeof pattern === 'undefined' ? this.pattern : pattern if (typeof pattern === 'undefined') { throw new Error('undefined pattern') } if (options.nobrace || !pattern.match(/\{.*\}/)) { // shortcut. no need to expand. return [pattern] } return expand(pattern) } // parse a component of the expanded set. // At this point, no pattern may contain "/" in it // so we're going to return a 2d array, where each entry is the full // pattern, split on '/', and then turned into a regular expression. // A regexp is made at the end which joins each array with an // escaped /, and another full one which joins each regexp with |. // // Following the lead of Bash 4.1, note that "**" only has special meaning // when it is the *only* thing in a path portion. Otherwise, any series // of * is equivalent to a single *. Globstar behavior is enabled by // default, and can be disabled by setting options.noglobstar. Minimatch.prototype.parse = parse var SUBPARSE = {} function parse (pattern, isSub) { var options = this.options // shortcuts if (!options.noglobstar && pattern === '**') return GLOBSTAR if (pattern === '') return '' var re = '' var hasMagic = !!options.nocase var escaping = false // ? => one single character var patternListStack = [] var negativeLists = [] var plType var stateChar var inClass = false var reClassStart = -1 var classStart = -1 // . and .. never match anything that doesn't start with ., // even when options.dot is set. var patternStart = pattern.charAt(0) === '.' ? 
'' // anything // not (start or / followed by . or .. followed by / or end) : options.dot ? '(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))' : '(?!\\.)' var self = this function clearStateChar () { if (stateChar) { // we had some state-tracking character // that wasn't consumed by this pass. switch (stateChar) { case '*': re += star hasMagic = true break case '?': re += qmark hasMagic = true break default: re += '\\' + stateChar break } self.debug('clearStateChar %j %j', stateChar, re) stateChar = false } } for (var i = 0, len = pattern.length, c ; (i < len) && (c = pattern.charAt(i)) ; i++) { this.debug('%s\t%s %s %j', pattern, i, re, c) // skip over any that are escaped. if (escaping && reSpecials[c]) { re += '\\' + c escaping = false continue } switch (c) { case '/': // completely not allowed, even escaped. // Should already be path-split by now. return false case '\\': clearStateChar() escaping = true continue // the various stateChar values // for the "extglob" stuff. case '?': case '*': case '+': case '@': case '!': this.debug('%s\t%s %s %j <-- stateChar', pattern, i, re, c) // all of those are literals inside a class, except that // the glob [!a] means [^a] in regexp if (inClass) { this.debug(' in class') if (c === '!' && i === classStart + 1) c = '^' re += c continue } // if we already have a stateChar, then it means // that there was something like ** or +? in there. // Handle the stateChar, then proceed with this one. self.debug('call clearStateChar %j', stateChar) clearStateChar() stateChar = c // if extglob is disabled, then +(asdf|foo) isn't a thing. // just clear the statechar *now*, rather than even diving into // the patternList stuff. if (options.noext) clearStateChar() continue case '(': if (inClass) { re += '(' continue } if (!stateChar) { re += '\\(' continue } plType = stateChar patternListStack.push({ type: plType, start: i - 1, reStart: re.length }) // negation is (?:(?!js)[^/]*) re += stateChar === '!' ? '(?:(?!(?:' : '(?:' this.debug('plType %j %j', stateChar, re) stateChar = false continue case ')': if (inClass || !patternListStack.length) { re += '\\)' continue } clearStateChar() hasMagic = true re += ')' var pl = patternListStack.pop() plType = pl.type // negation is (?:(?!js)[^/]*) // The others are (?:) switch (plType) { case '!': negativeLists.push(pl) re += ')[^/]*?)' pl.reEnd = re.length break case '?': case '+': case '*': re += plType break case '@': break // the default anyway } continue case '|': if (inClass || !patternListStack.length || escaping) { re += '\\|' escaping = false continue } clearStateChar() re += '|' continue // these are mostly the same in regexp and glob case '[': // swallow any state-tracking char before the [ clearStateChar() if (inClass) { re += '\\' + c continue } inClass = true classStart = i reClassStart = re.length re += c continue case ']': // a right bracket shall lose its special // meaning and represent itself in // a bracket expression if it occurs // first in the list. -- POSIX.2 2.8.3.2 if (i === classStart + 1 || !inClass) { re += '\\' + c escaping = false continue } // handle the case where we left a class open. // "[z-a]" is valid, equivalent to "\[z-a\]" if (inClass) { // split where the last [ was, make sure we don't have // an invalid re. if so, re-walk the contents of the // would-be class to re-translate any characters that // were passed through as-is // TODO: It would probably be faster to determine this // without a try/catch and a new RegExp, but it's tricky // to do safely. For now, this is safe and works. 
var cs = pattern.substring(classStart + 1, i) try { RegExp('[' + cs + ']') } catch (er) { // not a valid class! var sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + '\\[' + sp[0] + '\\]' hasMagic = hasMagic || sp[1] inClass = false continue } } // finish up the class. hasMagic = true inClass = false re += c continue default: // swallow any state char that wasn't consumed clearStateChar() if (escaping) { // no need escaping = false } else if (reSpecials[c] && !(c === '^' && inClass)) { re += '\\' } re += c } // switch } // for // handle the case where we left a class open. // "[abc" is valid, equivalent to "\[abc" if (inClass) { // split where the last [ was, and escape it // this is a huge pita. We now have to re-walk // the contents of the would-be class to re-translate // any characters that were passed through as-is cs = pattern.substr(classStart + 1) sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + '\\[' + sp[0] hasMagic = hasMagic || sp[1] } // handle the case where we had a +( thing at the *end* // of the pattern. // each pattern list stack adds 3 chars, and we need to go through // and escape any | chars that were passed through as-is for the regexp. // Go through and escape them, taking care not to double-escape any // | chars that were already escaped. for (pl = patternListStack.pop(); pl; pl = patternListStack.pop()) { var tail = re.slice(pl.reStart + 3) // maybe some even number of \, then maybe 1 \, followed by a | tail = tail.replace(/((?:\\{2})*)(\\?)\|/g, function (_, $1, $2) { if (!$2) { // the | isn't already escaped, so escape it. $2 = '\\' } // need to escape all those slashes *again*, without escaping the // one that we need for escaping the | character. As it works out, // escaping an even number of slashes can be done by simply repeating // it exactly after itself. That's why this trick works. // // I am sorry that you have to see this. return $1 + $1 + $2 + '|' }) this.debug('tail=%j\n %s', tail, tail) var t = pl.type === '*' ? star : pl.type === '?' ? qmark : '\\' + pl.type hasMagic = true re = re.slice(0, pl.reStart) + t + '\\(' + tail } // handle trailing things that only matter at the very end. clearStateChar() if (escaping) { // trailing \\ re += '\\\\' } // only need to apply the nodot start if the re starts with // something that could conceivably capture a dot var addPatternStart = false switch (re.charAt(0)) { case '.': case '[': case '(': addPatternStart = true } // Hack to work around lack of negative lookbehind in JS // A pattern like: *.!(x).!(y|z) needs to ensure that a name // like 'a.xyz.yz' doesn't match. So, the first negative // lookahead, has to look ALL the way ahead, to the end of // the pattern. for (var n = negativeLists.length - 1; n > -1; n--) { var nl = negativeLists[n] var nlBefore = re.slice(0, nl.reStart) var nlFirst = re.slice(nl.reStart, nl.reEnd - 8) var nlLast = re.slice(nl.reEnd - 8, nl.reEnd) var nlAfter = re.slice(nl.reEnd) nlLast += nlAfter // Handle nested stuff like *(*.js|!(*.json)), where open parens // mean that we should *not* include the ) in the bit that is considered // "after" the negated section. 
var openParensBefore = nlBefore.split('(').length - 1 var cleanAfter = nlAfter for (i = 0; i < openParensBefore; i++) { cleanAfter = cleanAfter.replace(/\)[+*?]?/, '') } nlAfter = cleanAfter var dollar = '' if (nlAfter === '' && isSub !== SUBPARSE) { dollar = '$' } var newRe = nlBefore + nlFirst + nlAfter + dollar + nlLast re = newRe } // if the re is not "" at this point, then we need to make sure // it doesn't match against an empty path part. // Otherwise a/* will match a/, which it should not. if (re !== '' && hasMagic) { re = '(?=.)' + re } if (addPatternStart) { re = patternStart + re } // parsing just a piece of a larger pattern. if (isSub === SUBPARSE) { return [re, hasMagic] } // skip the regexp for non-magical patterns // unescape anything in it, though, so that it'll be // an exact match against a file etc. if (!hasMagic) { return globUnescape(pattern) } var flags = options.nocase ? 'i' : '' var regExp = new RegExp('^' + re + '$', flags) regExp._glob = pattern regExp._src = re return regExp } minimatch.makeRe = function (pattern, options) { return new Minimatch(pattern, options || {}).makeRe() } Minimatch.prototype.makeRe = makeRe function makeRe () { if (this.regexp || this.regexp === false) return this.regexp // at this point, this.set is a 2d array of partial // pattern strings, or "**". // // It's better to use .match(). This function shouldn't // be used, really, but it's pretty convenient sometimes, // when you just want to work with a regex. var set = this.set if (!set.length) { this.regexp = false return this.regexp } var options = this.options var twoStar = options.noglobstar ? star : options.dot ? twoStarDot : twoStarNoDot var flags = options.nocase ? 'i' : '' var re = set.map(function (pattern) { return pattern.map(function (p) { return (p === GLOBSTAR) ? twoStar : (typeof p === 'string') ? regExpEscape(p) : p._src }).join('\\\/') }).join('|') // must match entire pattern // ending in a * or ** will make it less strict. re = '^(?:' + re + ')$' // can match anything, as long as it's not this. if (this.negate) re = '^(?!' + re + ').*$' try { this.regexp = new RegExp(re, flags) } catch (ex) { this.regexp = false } return this.regexp } minimatch.match = function (list, pattern, options) { options = options || {} var mm = new Minimatch(pattern, options) list = list.filter(function (f) { return mm.match(f) }) if (mm.options.nonull && !list.length) { list.push(pattern) } return list } Minimatch.prototype.match = match function match (f, partial) { this.debug('match', f, this.pattern) // short-circuit in the case of busted things. // comments, etc. if (this.comment) return false if (this.empty) return f === '' if (f === '/' && partial) return true var options = this.options // windows: need to use /, not \ if (path.sep !== '/') { f = f.split(path.sep).join('/') } // treat the test path as a set of pathparts. f = f.split(slashSplit) this.debug(this.pattern, 'split', f) // just ONE of the pattern sets in this.set needs to match // in order for it to be valid. If negating, then just one // match means that we have failed. // Either way, return on the first hit. 
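// A sketch of what "just ONE of the pattern sets needs to match" means in
// practice: a braced pattern expands to multiple rows in this.set, and any
// single row matching is enough (values are illustrative):
//
//   new Minimatch('{a,b}/*').set.length  // => 2  -- rows for 'a/*' and 'b/*'
//   minimatch('b/c', '{a,b}/*')          // => true -- the second row matches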
var set = this.set this.debug(this.pattern, 'set', set) // Find the basename of the path by looking for the last non-empty segment var filename var i for (i = f.length - 1; i >= 0; i--) { filename = f[i] if (filename) break } for (i = 0; i < set.length; i++) { var pattern = set[i] var file = f if (options.matchBase && pattern.length === 1) { file = [filename] } var hit = this.matchOne(file, pattern, partial) if (hit) { if (options.flipNegate) return true return !this.negate } } // didn't get any hits. this is success if it's a negative // pattern, failure otherwise. if (options.flipNegate) return false return this.negate } // set partial to true to test if, for example, // "/a/b" matches the start of "/*/b/*/d" // Partial means, if you run out of file before you run // out of pattern, then that's fine, as long as all // the parts match. Minimatch.prototype.matchOne = function (file, pattern, partial) { var options = this.options this.debug('matchOne', { 'this': this, file: file, pattern: pattern }) this.debug('matchOne', file.length, pattern.length) for (var fi = 0, pi = 0, fl = file.length, pl = pattern.length ; (fi < fl) && (pi < pl) ; fi++, pi++) { this.debug('matchOne loop') var p = pattern[pi] var f = file[fi] this.debug(pattern, p, f) // should be impossible. // some invalid regexp stuff in the set. if (p === false) return false if (p === GLOBSTAR) { this.debug('GLOBSTAR', [pattern, p, f]) // "**" // a/**/b/**/c would match the following: // a/b/x/y/z/c // a/x/y/z/b/c // a/b/x/b/x/c // a/b/c // To do this, take the rest of the pattern after // the **, and see if it would match the file remainder. // If so, return success. // If not, the ** "swallows" a segment, and try again. // This is recursively awful. // // a/**/b/**/c matching a/b/x/y/z/c // - a matches a // - doublestar // - matchOne(b/x/y/z/c, b/**/c) // - b matches b // - doublestar // - matchOne(x/y/z/c, c) -> no // - matchOne(y/z/c, c) -> no // - matchOne(z/c, c) -> no // - matchOne(c, c) yes, hit var fr = fi var pr = pi + 1 if (pr === pl) { this.debug('** at the end') // a ** at the end will just swallow the rest. // We have found a match. // however, it will not swallow /.x, unless // options.dot is set. // . and .. are *never* matched by **, for explosively // exponential reasons. for (; fi < fl; fi++) { if (file[fi] === '.' || file[fi] === '..' || (!options.dot && file[fi].charAt(0) === '.')) return false } return true } // ok, let's see if we can swallow whatever we can. while (fr < fl) { var swallowee = file[fr] this.debug('\nglobstar while', file, fr, pattern, pr, swallowee) // XXX remove this slice. Just pass the start index. if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) { this.debug('globstar found match!', fr, fl, swallowee) // found a match. return true } else { // can't swallow "." or ".." ever. // can only swallow ".foo" when explicitly asked. if (swallowee === '.' || swallowee === '..' || (!options.dot && swallowee.charAt(0) === '.')) { this.debug('dot detected!', file, fr, pattern, pr) break } // ** swallows a segment, and continue. this.debug('globstar swallow a segment, and continue') fr++ } } // no match was found. // However, in partial mode, we can't say this is necessarily over. // If there's more *pattern* left, then if (partial) { // ran out of file this.debug('\n>>> no match, partial?', file, fr, pattern, pr) if (fr === fl) return true } return false } // something other than ** // non-magic patterns just have to match exactly // patterns with magic have been turned into regexps. 
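// A sketch of the globstar walk described above, seen from the public API
// (the first case is the walkthrough from the comments; the dotfile cases
// follow the "can only swallow '.foo' when explicitly asked" rule):
//
//   minimatch('a/b/x/y/z/c', 'a/**/b/**/c')       // => true
//   minimatch('a/.x/c', 'a/**/c')                 // => false -- ** skips dotfiles
//   minimatch('a/.x/c', 'a/**/c', { dot: true })  // => true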
var hit if (typeof p === 'string') { if (options.nocase) { hit = f.toLowerCase() === p.toLowerCase() } else { hit = f === p } this.debug('string match', p, f, hit) } else { hit = f.match(p) this.debug('pattern match', p, f, hit) } if (!hit) return false } // Note: ending in / means that we'll get a final "" // at the end of the pattern. This can only match a // corresponding "" at the end of the file. // If the file ends in /, then it can only match a // a pattern that ends in /, unless the pattern just // doesn't have any more for it. But, a/b/ should *not* // match "a/b/*", even though "" matches against the // [^/]*? pattern, except in partial mode, where it might // simply not be reached yet. // However, a/b/ should still satisfy a/* // now either we fell off the end of the pattern, or we're done. if (fi === fl && pi === pl) { // ran out of pattern and filename at the same time. // an exact hit! return true } else if (fi === fl) { // ran out of file, but still had pattern left. // this is ok if we're doing the match as part of // a glob fs traversal. return partial } else if (pi === pl) { // ran out of pattern, still have file left. // this is only acceptable if we're on the very last // empty segment of a file with a trailing slash. // a/* should match a/b/ var emptyFileEnd = (fi === fl - 1) && (file[fi] === '') return emptyFileEnd } // should be unreachable. throw new Error('wtf?') } // replace stuff like \* with * function globUnescape (s) { return s.replace(/\\(.)/g, '$1') } function regExpEscape (s) { return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, '\\$&') } },{"brace-expansion":2,"path":undefined}],2:[function(require,module,exports){ var concatMap = require('concat-map'); var balanced = require('balanced-match'); module.exports = expandTop; var escSlash = '\0SLASH'+Math.random()+'\0'; var escOpen = '\0OPEN'+Math.random()+'\0'; var escClose = '\0CLOSE'+Math.random()+'\0'; var escComma = '\0COMMA'+Math.random()+'\0'; var escPeriod = '\0PERIOD'+Math.random()+'\0'; function numeric(str) { return parseInt(str, 10) == str ? 
parseInt(str, 10) : str.charCodeAt(0); } function escapeBraces(str) { return str.split('\\\\').join(escSlash) .split('\\{').join(escOpen) .split('\\}').join(escClose) .split('\\,').join(escComma) .split('\\.').join(escPeriod); } function unescapeBraces(str) { return str.split(escSlash).join('\\') .split(escOpen).join('{') .split(escClose).join('}') .split(escComma).join(',') .split(escPeriod).join('.'); } // Basically just str.split(","), but handling cases // where we have nested braced sections, which should be // treated as individual members, like {a,{b,c},d} function parseCommaParts(str) { if (!str) return ['']; var parts = []; var m = balanced('{', '}', str); if (!m) return str.split(','); var pre = m.pre; var body = m.body; var post = m.post; var p = pre.split(','); p[p.length-1] += '{' + body + '}'; var postParts = parseCommaParts(post); if (post.length) { p[p.length-1] += postParts.shift(); p.push.apply(p, postParts); } parts.push.apply(parts, p); return parts; } function expandTop(str) { if (!str) return []; var expansions = expand(escapeBraces(str)); return expansions.filter(identity).map(unescapeBraces); } function identity(e) { return e; } function embrace(str) { return '{' + str + '}'; } function isPadded(el) { return /^-?0\d/.test(el); } function lte(i, y) { return i <= y; } function gte(i, y) { return i >= y; } function expand(str) { var expansions = []; var m = balanced('{', '}', str); if (!m || /\$$/.test(m.pre)) return [str]; var isNumericSequence = /^-?\d+\.\.-?\d+(?:\.\.-?\d+)?$/.test(m.body); var isAlphaSequence = /^[a-zA-Z]\.\.[a-zA-Z](?:\.\.-?\d+)?$/.test(m.body); var isSequence = isNumericSequence || isAlphaSequence; var isOptions = /^(.*,)+(.+)?$/.test(m.body); if (!isSequence && !isOptions) { // {a},b} if (m.post.match(/,.*}/)) { str = m.pre + '{' + m.body + escClose + m.post; return expand(str); } return [str]; } var n; if (isSequence) { n = m.body.split(/\.\./); } else { n = parseCommaParts(m.body); if (n.length === 1) { // x{{a,b}}y ==> x{a}y x{b}y n = expand(n[0]).map(embrace); if (n.length === 1) { var post = m.post.length ? expand(m.post) : ['']; return post.map(function(p) { return m.pre + n[0] + p; }); } } } // at this point, n is the parts, and we know it's not a comma set // with a single entry. // no need to expand pre, since it is guaranteed to be free of brace-sets var pre = m.pre; var post = m.post.length ? expand(m.post) : ['']; var N; if (isSequence) { var x = numeric(n[0]); var y = numeric(n[1]); var width = Math.max(n[0].length, n[1].length) var incr = n.length == 3 ? 
Math.abs(numeric(n[2])) : 1; var test = lte; var reverse = y < x; if (reverse) { incr *= -1; test = gte; } var pad = n.some(isPadded); N = []; for (var i = x; test(i, y); i += incr) { var c; if (isAlphaSequence) { c = String.fromCharCode(i); if (c === '\\') c = ''; } else { c = String(i); if (pad) { var need = width - c.length; if (need > 0) { var z = new Array(need + 1).join('0'); if (i < 0) c = '-' + z + c.slice(1); else c = z + c; } } } N.push(c); } } else { N = concatMap(n, function(el) { return expand(el) }); } for (var j = 0; j < N.length; j++) { for (var k = 0; k < post.length; k++) { expansions.push([pre, N[j], post[k]].join('')) } } return expansions; } },{"balanced-match":3,"concat-map":4}],3:[function(require,module,exports){ module.exports = balanced; function balanced(a, b, str) { var bal = 0; var m = {}; var ended = false; for (var i = 0; i < str.length; i++) { if (a == str.substr(i, a.length)) { if (!('start' in m)) m.start = i; bal++; } else if (b == str.substr(i, b.length) && 'start' in m) { ended = true; bal--; if (!bal) { m.end = i; m.pre = str.substr(0, m.start); m.body = (m.end - m.start > 1) ? str.substring(m.start + a.length, m.end) : ''; m.post = str.slice(m.end + b.length); return m; } } } // if we opened more than we closed, find the one we closed if (bal && ended) { var start = m.start + a.length; m = balanced(a, b, str.substr(start)); if (m) { m.start += start; m.end += start; m.pre = str.slice(0, start) + m.pre; } return m; } } },{}],4:[function(require,module,exports){ module.exports = function (xs, fn) { var res = []; for (var i = 0; i < xs.length; i++) { var x = fn(xs[i], i); if (Array.isArray(x)) res.push.apply(res, x); else res.push(x); } return res; }; },{}]},{},[1])(1) });npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/minimatch.js0000644000000000000000000006044012631326456030232 0ustar 00000000000000module.exports = minimatch minimatch.Minimatch = Minimatch var path = { sep: '/' } try { path = require('path') } catch (er) {} var GLOBSTAR = minimatch.GLOBSTAR = Minimatch.GLOBSTAR = {} var expand = require('brace-expansion') // any single thing other than / // don't need to escape / when using new RegExp() var qmark = '[^/]' // * => any number of characters var star = qmark + '*?' // ** when dots are allowed. Anything goes, except .. and . // not (^ or / followed by one or two dots followed by $ or /), // followed by anything, any number of times. var twoStarDot = '(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?' // not a ^ or / followed by a dot, // followed by anything, any number of times. var twoStarNoDot = '(?:(?!(?:\\\/|^)\\.).)*?' // characters that need to be escaped in RegExp. var reSpecials = charSet('().*{}+?[]^$\\!') // "abc" -> { a:true, b:true, c:true } function charSet (s) { return s.split('').reduce(function (set, c) { set[c] = true return set }, {}) } // normalizes slashes. 
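// A sketch of how the character classes above behave once compiled, through
// the minimatch() entry point exported below (calls are illustrative):
//
//   minimatch('foo.js', '*.js')   // => true  -- '*' compiles to [^/]*?
//   minimatch('.foo.js', '*.js')  // => false -- a leading dot needs { dot: true }
//   minimatch('a/b.js', '*.js')   // => false -- '*' never crosses '/'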
var slashSplit = /\/+/ minimatch.filter = filter function filter (pattern, options) { options = options || {} return function (p, i, list) { return minimatch(p, pattern, options) } } function ext (a, b) { a = a || {} b = b || {} var t = {} Object.keys(b).forEach(function (k) { t[k] = b[k] }) Object.keys(a).forEach(function (k) { t[k] = a[k] }) return t } minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return minimatch var orig = minimatch var m = function minimatch (p, pattern, options) { return orig.minimatch(p, pattern, ext(def, options)) } m.Minimatch = function Minimatch (pattern, options) { return new orig.Minimatch(pattern, ext(def, options)) } return m } Minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return Minimatch return minimatch.defaults(def).Minimatch } function minimatch (p, pattern, options) { if (typeof pattern !== 'string') { throw new TypeError('glob pattern string required') } if (!options) options = {} // shortcut: comments match nothing. if (!options.nocomment && pattern.charAt(0) === '#') { return false } // "" only matches "" if (pattern.trim() === '') return p === '' return new Minimatch(pattern, options).match(p) } function Minimatch (pattern, options) { if (!(this instanceof Minimatch)) { return new Minimatch(pattern, options) } if (typeof pattern !== 'string') { throw new TypeError('glob pattern string required') } if (!options) options = {} pattern = pattern.trim() // windows support: need to use /, not \ if (path.sep !== '/') { pattern = pattern.split(path.sep).join('/') } this.options = options this.set = [] this.pattern = pattern this.regexp = null this.negate = false this.comment = false this.empty = false // make the set of regexps etc. this.make() } Minimatch.prototype.debug = function () {} Minimatch.prototype.make = make function make () { // don't do it more than once. if (this._made) return var pattern = this.pattern var options = this.options // empty patterns and comments match nothing. if (!options.nocomment && pattern.charAt(0) === '#') { this.comment = true return } if (!pattern) { this.empty = true return } // step 1: figure out negation, etc. this.parseNegate() // step 2: expand braces var set = this.globSet = this.braceExpand() if (options.debug) this.debug = console.error this.debug(this.pattern, set) // step 3: now we have a set, so turn each one into a series of path-portion // matching patterns. // These will be regexps, except in the case of "**", which is // set to the GLOBSTAR object for globstar behavior, // and will not contain any / characters set = this.globParts = set.map(function (s) { return s.split(slashSplit) }) this.debug(this.pattern, set) // glob --> regexps set = set.map(function (s, si, set) { return s.map(this.parse, this) }, this) this.debug(this.pattern, set) // filter out everything that didn't compile properly. set = set.filter(function (s) { return s.indexOf(false) === -1 }) this.debug(this.pattern, set) this.set = set } Minimatch.prototype.parseNegate = parseNegate function parseNegate () { var pattern = this.pattern var negate = false var options = this.options var negateOffset = 0 if (options.nonegate) return for (var i = 0, l = pattern.length ; i < l && pattern.charAt(i) === '!' 
; i++) { negate = !negate negateOffset++ } if (negateOffset) this.pattern = pattern.substr(negateOffset) this.negate = negate } // Brace expansion: // a{b,c}d -> abd acd // a{b,}c -> abc ac // a{0..3}d -> a0d a1d a2d a3d // a{b,c{d,e}f}g -> abg acdfg acefg // a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg // // Invalid sets are not expanded. // a{2..}b -> a{2..}b // a{b}c -> a{b}c minimatch.braceExpand = function (pattern, options) { return braceExpand(pattern, options) } Minimatch.prototype.braceExpand = braceExpand function braceExpand (pattern, options) { if (!options) { if (this instanceof Minimatch) { options = this.options } else { options = {} } } pattern = typeof pattern === 'undefined' ? this.pattern : pattern if (typeof pattern === 'undefined') { throw new Error('undefined pattern') } if (options.nobrace || !pattern.match(/\{.*\}/)) { // shortcut. no need to expand. return [pattern] } return expand(pattern) } // parse a component of the expanded set. // At this point, no pattern may contain "/" in it // so we're going to return a 2d array, where each entry is the full // pattern, split on '/', and then turned into a regular expression. // A regexp is made at the end which joins each array with an // escaped /, and another full one which joins each regexp with |. // // Following the lead of Bash 4.1, note that "**" only has special meaning // when it is the *only* thing in a path portion. Otherwise, any series // of * is equivalent to a single *. Globstar behavior is enabled by // default, and can be disabled by setting options.noglobstar. Minimatch.prototype.parse = parse var SUBPARSE = {} function parse (pattern, isSub) { var options = this.options // shortcuts if (!options.noglobstar && pattern === '**') return GLOBSTAR if (pattern === '') return '' var re = '' var hasMagic = !!options.nocase var escaping = false // ? => one single character var patternListStack = [] var negativeLists = [] var plType var stateChar var inClass = false var reClassStart = -1 var classStart = -1 // . and .. never match anything that doesn't start with ., // even when options.dot is set. var patternStart = pattern.charAt(0) === '.' ? '' // anything // not (start or / followed by . or .. followed by / or end) : options.dot ? '(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))' : '(?!\\.)' var self = this function clearStateChar () { if (stateChar) { // we had some state-tracking character // that wasn't consumed by this pass. switch (stateChar) { case '*': re += star hasMagic = true break case '?': re += qmark hasMagic = true break default: re += '\\' + stateChar break } self.debug('clearStateChar %j %j', stateChar, re) stateChar = false } } for (var i = 0, len = pattern.length, c ; (i < len) && (c = pattern.charAt(i)) ; i++) { this.debug('%s\t%s %s %j', pattern, i, re, c) // skip over any that are escaped. if (escaping && reSpecials[c]) { re += '\\' + c escaping = false continue } switch (c) { case '/': // completely not allowed, even escaped. // Should already be path-split by now. return false case '\\': clearStateChar() escaping = true continue // the various stateChar values // for the "extglob" stuff. case '?': case '*': case '+': case '@': case '!': this.debug('%s\t%s %s %j <-- stateChar', pattern, i, re, c) // all of those are literals inside a class, except that // the glob [!a] means [^a] in regexp if (inClass) { this.debug(' in class') if (c === '!' && i === classStart + 1) c = '^' re += c continue } // if we already have a stateChar, then it means // that there was something like ** or +? in there. 
// Handle the stateChar, then proceed with this one. self.debug('call clearStateChar %j', stateChar) clearStateChar() stateChar = c // if extglob is disabled, then +(asdf|foo) isn't a thing. // just clear the statechar *now*, rather than even diving into // the patternList stuff. if (options.noext) clearStateChar() continue case '(': if (inClass) { re += '(' continue } if (!stateChar) { re += '\\(' continue } plType = stateChar patternListStack.push({ type: plType, start: i - 1, reStart: re.length }) // negation is (?:(?!js)[^/]*) re += stateChar === '!' ? '(?:(?!(?:' : '(?:' this.debug('plType %j %j', stateChar, re) stateChar = false continue case ')': if (inClass || !patternListStack.length) { re += '\\)' continue } clearStateChar() hasMagic = true re += ')' var pl = patternListStack.pop() plType = pl.type // negation is (?:(?!js)[^/]*) // The others are (?:) switch (plType) { case '!': negativeLists.push(pl) re += ')[^/]*?)' pl.reEnd = re.length break case '?': case '+': case '*': re += plType break case '@': break // the default anyway } continue case '|': if (inClass || !patternListStack.length || escaping) { re += '\\|' escaping = false continue } clearStateChar() re += '|' continue // these are mostly the same in regexp and glob case '[': // swallow any state-tracking char before the [ clearStateChar() if (inClass) { re += '\\' + c continue } inClass = true classStart = i reClassStart = re.length re += c continue case ']': // a right bracket shall lose its special // meaning and represent itself in // a bracket expression if it occurs // first in the list. -- POSIX.2 2.8.3.2 if (i === classStart + 1 || !inClass) { re += '\\' + c escaping = false continue } // handle the case where we left a class open. // "[z-a]" is valid, equivalent to "\[z-a\]" if (inClass) { // split where the last [ was, make sure we don't have // an invalid re. if so, re-walk the contents of the // would-be class to re-translate any characters that // were passed through as-is // TODO: It would probably be faster to determine this // without a try/catch and a new RegExp, but it's tricky // to do safely. For now, this is safe and works. var cs = pattern.substring(classStart + 1, i) try { RegExp('[' + cs + ']') } catch (er) { // not a valid class! var sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + '\\[' + sp[0] + '\\]' hasMagic = hasMagic || sp[1] inClass = false continue } } // finish up the class. hasMagic = true inClass = false re += c continue default: // swallow any state char that wasn't consumed clearStateChar() if (escaping) { // no need escaping = false } else if (reSpecials[c] && !(c === '^' && inClass)) { re += '\\' } re += c } // switch } // for // handle the case where we left a class open. // "[abc" is valid, equivalent to "\[abc" if (inClass) { // split where the last [ was, and escape it // this is a huge pita. We now have to re-walk // the contents of the would-be class to re-translate // any characters that were passed through as-is cs = pattern.substr(classStart + 1) sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + '\\[' + sp[0] hasMagic = hasMagic || sp[1] } // handle the case where we had a +( thing at the *end* // of the pattern. // each pattern list stack adds 3 chars, and we need to go through // and escape any | chars that were passed through as-is for the regexp. // Go through and escape them, taking care not to double-escape any // | chars that were already escaped. 
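// A sketch of what one pass of the replace below does to a tail that was
// sliced out with a raw '|' in it (the tails are hypothetical values):
//
//   'a|b)'   -->  'a\|b)'   -- an unescaped | gets escaped
//   'a\|b)'  -->  'a\|b)'   -- an already-escaped | is left alone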
for (pl = patternListStack.pop(); pl; pl = patternListStack.pop()) { var tail = re.slice(pl.reStart + 3) // maybe some even number of \, then maybe 1 \, followed by a | tail = tail.replace(/((?:\\{2})*)(\\?)\|/g, function (_, $1, $2) { if (!$2) { // the | isn't already escaped, so escape it. $2 = '\\' } // need to escape all those slashes *again*, without escaping the // one that we need for escaping the | character. As it works out, // escaping an even number of slashes can be done by simply repeating // it exactly after itself. That's why this trick works. // // I am sorry that you have to see this. return $1 + $1 + $2 + '|' }) this.debug('tail=%j\n %s', tail, tail) var t = pl.type === '*' ? star : pl.type === '?' ? qmark : '\\' + pl.type hasMagic = true re = re.slice(0, pl.reStart) + t + '\\(' + tail } // handle trailing things that only matter at the very end. clearStateChar() if (escaping) { // trailing \\ re += '\\\\' } // only need to apply the nodot start if the re starts with // something that could conceivably capture a dot var addPatternStart = false switch (re.charAt(0)) { case '.': case '[': case '(': addPatternStart = true } // Hack to work around lack of negative lookbehind in JS // A pattern like: *.!(x).!(y|z) needs to ensure that a name // like 'a.xyz.yz' doesn't match. So, the first negative // lookahead, has to look ALL the way ahead, to the end of // the pattern. for (var n = negativeLists.length - 1; n > -1; n--) { var nl = negativeLists[n] var nlBefore = re.slice(0, nl.reStart) var nlFirst = re.slice(nl.reStart, nl.reEnd - 8) var nlLast = re.slice(nl.reEnd - 8, nl.reEnd) var nlAfter = re.slice(nl.reEnd) nlLast += nlAfter // Handle nested stuff like *(*.js|!(*.json)), where open parens // mean that we should *not* include the ) in the bit that is considered // "after" the negated section. var openParensBefore = nlBefore.split('(').length - 1 var cleanAfter = nlAfter for (i = 0; i < openParensBefore; i++) { cleanAfter = cleanAfter.replace(/\)[+*?]?/, '') } nlAfter = cleanAfter var dollar = '' if (nlAfter === '' && isSub !== SUBPARSE) { dollar = '$' } var newRe = nlBefore + nlFirst + nlAfter + dollar + nlLast re = newRe } // if the re is not "" at this point, then we need to make sure // it doesn't match against an empty path part. // Otherwise a/* will match a/, which it should not. if (re !== '' && hasMagic) { re = '(?=.)' + re } if (addPatternStart) { re = patternStart + re } // parsing just a piece of a larger pattern. if (isSub === SUBPARSE) { return [re, hasMagic] } // skip the regexp for non-magical patterns // unescape anything in it, though, so that it'll be // an exact match against a file etc. if (!hasMagic) { return globUnescape(pattern) } var flags = options.nocase ? 'i' : '' var regExp = new RegExp('^' + re + '$', flags) regExp._glob = pattern regExp._src = re return regExp } minimatch.makeRe = function (pattern, options) { return new Minimatch(pattern, options || {}).makeRe() } Minimatch.prototype.makeRe = makeRe function makeRe () { if (this.regexp || this.regexp === false) return this.regexp // at this point, this.set is a 2d array of partial // pattern strings, or "**". // // It's better to use .match(). This function shouldn't // be used, really, but it's pretty convenient sometimes, // when you just want to work with a regex. var set = this.set if (!set.length) { this.regexp = false return this.regexp } var options = this.options var twoStar = options.noglobstar ? star : options.dot ? twoStarDot : twoStarNoDot var flags = options.nocase ? 
'i' : '' var re = set.map(function (pattern) { return pattern.map(function (p) { return (p === GLOBSTAR) ? twoStar : (typeof p === 'string') ? regExpEscape(p) : p._src }).join('\\\/') }).join('|') // must match entire pattern // ending in a * or ** will make it less strict. re = '^(?:' + re + ')$' // can match anything, as long as it's not this. if (this.negate) re = '^(?!' + re + ').*$' try { this.regexp = new RegExp(re, flags) } catch (ex) { this.regexp = false } return this.regexp } minimatch.match = function (list, pattern, options) { options = options || {} var mm = new Minimatch(pattern, options) list = list.filter(function (f) { return mm.match(f) }) if (mm.options.nonull && !list.length) { list.push(pattern) } return list } Minimatch.prototype.match = match function match (f, partial) { this.debug('match', f, this.pattern) // short-circuit in the case of busted things. // comments, etc. if (this.comment) return false if (this.empty) return f === '' if (f === '/' && partial) return true var options = this.options // windows: need to use /, not \ if (path.sep !== '/') { f = f.split(path.sep).join('/') } // treat the test path as a set of pathparts. f = f.split(slashSplit) this.debug(this.pattern, 'split', f) // just ONE of the pattern sets in this.set needs to match // in order for it to be valid. If negating, then just one // match means that we have failed. // Either way, return on the first hit. var set = this.set this.debug(this.pattern, 'set', set) // Find the basename of the path by looking for the last non-empty segment var filename var i for (i = f.length - 1; i >= 0; i--) { filename = f[i] if (filename) break } for (i = 0; i < set.length; i++) { var pattern = set[i] var file = f if (options.matchBase && pattern.length === 1) { file = [filename] } var hit = this.matchOne(file, pattern, partial) if (hit) { if (options.flipNegate) return true return !this.negate } } // didn't get any hits. this is success if it's a negative // pattern, failure otherwise. if (options.flipNegate) return false return this.negate } // set partial to true to test if, for example, // "/a/b" matches the start of "/*/b/*/d" // Partial means, if you run out of file before you run // out of pattern, then that's fine, as long as all // the parts match. Minimatch.prototype.matchOne = function (file, pattern, partial) { var options = this.options this.debug('matchOne', { 'this': this, file: file, pattern: pattern }) this.debug('matchOne', file.length, pattern.length) for (var fi = 0, pi = 0, fl = file.length, pl = pattern.length ; (fi < fl) && (pi < pl) ; fi++, pi++) { this.debug('matchOne loop') var p = pattern[pi] var f = file[fi] this.debug(pattern, p, f) // should be impossible. // some invalid regexp stuff in the set. if (p === false) return false if (p === GLOBSTAR) { this.debug('GLOBSTAR', [pattern, p, f]) // "**" // a/**/b/**/c would match the following: // a/b/x/y/z/c // a/x/y/z/b/c // a/b/x/b/x/c // a/b/c // To do this, take the rest of the pattern after // the **, and see if it would match the file remainder. // If so, return success. // If not, the ** "swallows" a segment, and try again. // This is recursively awful. // // a/**/b/**/c matching a/b/x/y/z/c // - a matches a // - doublestar // - matchOne(b/x/y/z/c, b/**/c) // - b matches b // - doublestar // - matchOne(x/y/z/c, c) -> no // - matchOne(y/z/c, c) -> no // - matchOne(z/c, c) -> no // - matchOne(c, c) yes, hit var fr = fi var pr = pi + 1 if (pr === pl) { this.debug('** at the end') // a ** at the end will just swallow the rest. 
// We have found a match. // however, it will not swallow /.x, unless // options.dot is set. // . and .. are *never* matched by **, for explosively // exponential reasons. for (; fi < fl; fi++) { if (file[fi] === '.' || file[fi] === '..' || (!options.dot && file[fi].charAt(0) === '.')) return false } return true } // ok, let's see if we can swallow whatever we can. while (fr < fl) { var swallowee = file[fr] this.debug('\nglobstar while', file, fr, pattern, pr, swallowee) // XXX remove this slice. Just pass the start index. if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) { this.debug('globstar found match!', fr, fl, swallowee) // found a match. return true } else { // can't swallow "." or ".." ever. // can only swallow ".foo" when explicitly asked. if (swallowee === '.' || swallowee === '..' || (!options.dot && swallowee.charAt(0) === '.')) { this.debug('dot detected!', file, fr, pattern, pr) break } // ** swallows a segment, and continue. this.debug('globstar swallow a segment, and continue') fr++ } } // no match was found. // However, in partial mode, we can't say this is necessarily over. // If there's more *pattern* left, then if (partial) { // ran out of file this.debug('\n>>> no match, partial?', file, fr, pattern, pr) if (fr === fl) return true } return false } // something other than ** // non-magic patterns just have to match exactly // patterns with magic have been turned into regexps. var hit if (typeof p === 'string') { if (options.nocase) { hit = f.toLowerCase() === p.toLowerCase() } else { hit = f === p } this.debug('string match', p, f, hit) } else { hit = f.match(p) this.debug('pattern match', p, f, hit) } if (!hit) return false } // Note: ending in / means that we'll get a final "" // at the end of the pattern. This can only match a // corresponding "" at the end of the file. // If the file ends in /, then it can only match a // a pattern that ends in /, unless the pattern just // doesn't have any more for it. But, a/b/ should *not* // match "a/b/*", even though "" matches against the // [^/]*? pattern, except in partial mode, where it might // simply not be reached yet. // However, a/b/ should still satisfy a/* // now either we fell off the end of the pattern, or we're done. if (fi === fl && pi === pl) { // ran out of pattern and filename at the same time. // an exact hit! return true } else if (fi === fl) { // ran out of file, but still had pattern left. // this is ok if we're doing the match as part of // a glob fs traversal. return partial } else if (pi === pl) { // ran out of pattern, still have file left. // this is only acceptable if we're on the very last // empty segment of a file with a trailing slash. // a/* should match a/b/ var emptyFileEnd = (fi === fl - 1) && (file[fi] === '') return emptyFileEnd } // should be unreachable. throw new Error('wtf?') } // replace stuff like \* with * function globUnescape (s) { return s.replace(/\\(.)/g, '$1') } function regExpEscape (s) { return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, '\\$&') } npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/0000755000000000000000000000000012631326456030374 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/package.json0000644000000000000000000000310112631326456030200 0ustar 00000000000000{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, "name": "minimatch", "description": "a glob matcher in javascript", "version": "2.0.10", "repository": { "type": "git", "url": "git://github.com/isaacs/minimatch.git" }, "main": "minimatch.js", "scripts": { "posttest": "standard minimatch.js test/*.js", "test": "tap test/*.js", "prepublish": "browserify -o browser.js -e minimatch.js -s minimatch --bare" }, "engines": { "node": "*" }, "dependencies": { "brace-expansion": "^1.0.0" }, "devDependencies": { "browserify": "^9.0.3", "standard": "^3.7.2", "tap": "^1.2.0" }, "license": "ISC", "files": [ "minimatch.js", "browser.js" ], "gitHead": "6afb85f0c324b321f76a38df81891e562693e257", "bugs": { "url": "https://github.com/isaacs/minimatch/issues" }, "homepage": "https://github.com/isaacs/minimatch#readme", "_id": "minimatch@2.0.10", "_shasum": "8d087c39c6b38c001b97fca7ce6d0e1e80afbac7", "_from": "minimatch@>=2.0.1 <3.0.0", "_npmVersion": "3.1.0", "_nodeVersion": "2.2.1", "_npmUser": { "name": "isaacs", "email": "isaacs@npmjs.com" }, "dist": { "shasum": "8d087c39c6b38c001b97fca7ce6d0e1e80afbac7", "tarball": "http://registry.npmjs.org/minimatch/-/minimatch-2.0.10.tgz" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/minimatch/-/minimatch-2.0.10.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000755000000000000000000000000012631326456032163 5ustar 00000000000000././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/.npmignorenpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000003412631326456032162 0ustar 00000000000000test .gitignore .travis.yml ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000673012631326456032173 0ustar 00000000000000# brace-expansion [Brace expansion](https://www.gnu.org/software/bash/manual/html_node/Brace-Expansion.html), as known from sh/bash, in JavaScript. 
[![build status](https://secure.travis-ci.org/juliangruber/brace-expansion.svg)](http://travis-ci.org/juliangruber/brace-expansion) [![downloads](https://img.shields.io/npm/dm/brace-expansion.svg)](https://www.npmjs.org/package/brace-expansion) [![testling badge](https://ci.testling.com/juliangruber/brace-expansion.png)](https://ci.testling.com/juliangruber/brace-expansion) ## Example ```js var expand = require('brace-expansion'); expand('file-{a,b,c}.jpg') // => ['file-a.jpg', 'file-b.jpg', 'file-c.jpg'] expand('-v{,,}') // => ['-v', '-v', '-v'] expand('file{0..2}.jpg') // => ['file0.jpg', 'file1.jpg', 'file2.jpg'] expand('file-{a..c}.jpg') // => ['file-a.jpg', 'file-b.jpg', 'file-c.jpg'] expand('file{2..0}.jpg') // => ['file2.jpg', 'file1.jpg', 'file0.jpg'] expand('file{0..4..2}.jpg') // => ['file0.jpg', 'file2.jpg', 'file4.jpg'] expand('file-{a..e..2}.jpg') // => ['file-a.jpg', 'file-c.jpg', 'file-e.jpg'] expand('file{00..10..5}.jpg') // => ['file00.jpg', 'file05.jpg', 'file10.jpg'] expand('{{A..C},{a..c}}') // => ['A', 'B', 'C', 'a', 'b', 'c'] expand('ppp{,config,oe{,conf}}') // => ['ppp', 'pppconfig', 'pppoe', 'pppoeconf'] ``` ## API ```js var expand = require('brace-expansion'); ``` ### var expanded = expand(str) Return an array of all possible and valid expansions of `str`. If none are found, `[str]` is returned. Valid expansions are: ```js /^(.*,)+(.+)?$/ // {a,b,...} ``` A comma separated list of options, like `{a,b}` or `{a,{b,c}}` or `{,a,}`. ```js /^-?\d+\.\.-?\d+(\.\.-?\d+)?$/ // {x..y[..incr]} ``` A numeric sequence from `x` to `y` inclusive, with optional increment. If `x` or `y` start with a leading `0`, all the numbers will be padded to have equal length. Negative numbers and backwards iteration work too. ```js /^[a-zA-Z]\.\.[a-zA-Z](\.\.-?\d+)?$/ // {x..y[..incr]} ``` An alphabetic sequence from `x` to `y` inclusive, with optional increment. `x` and `y` must be exactly one character, and if given, `incr` must be a number. For compatibility reasons, the string `${` is not eligible for brace expansion. ## Installation With [npm](https://npmjs.org) do: ```bash npm install brace-expansion ``` ## Contributors - [Julian Gruber](https://github.com/juliangruber) - [Isaac Z. Schlueter](https://github.com/isaacs) ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
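A few extra cases for the API above, showing the escaping rules (a sketch traced from `index.js` in this package; the inputs are illustrative):

```js
var expand = require('brace-expansion');

expand('a\\{b,c\\}d')  // => ['a{b,c}d']  -- escaped braces stay literal
expand('${a,b}')       // => ['${a,b}']   -- `${` is never expanded
expand('{a,b}x{1,2}')  // => ['ax1', 'ax2', 'bx1', 'bx2']
```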
././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/example.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000061512631326456032167 0ustar 00000000000000var expand = require('./'); console.log(expand('http://any.org/archive{1996..1999}/vol{1..4}/part{a,b,c}.html')); console.log(expand('http://www.numericals.com/file{1..100..10}.txt')); console.log(expand('http://www.letters.com/file{a..z..2}.txt')); console.log(expand('mkdir /usr/local/src/bash/{old,new,dist,bugs}')); console.log(expand('chown root /usr/{ucb/{ex,edit},lib/{ex?.?*,how_ex}}')); ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000001035612631326456032172 0ustar 00000000000000var concatMap = require('concat-map'); var balanced = require('balanced-match'); module.exports = expandTop; var escSlash = '\0SLASH'+Math.random()+'\0'; var escOpen = '\0OPEN'+Math.random()+'\0'; var escClose = '\0CLOSE'+Math.random()+'\0'; var escComma = '\0COMMA'+Math.random()+'\0'; var escPeriod = '\0PERIOD'+Math.random()+'\0'; function numeric(str) { return parseInt(str, 10) == str ? parseInt(str, 10) : str.charCodeAt(0); } function escapeBraces(str) { return str.split('\\\\').join(escSlash) .split('\\{').join(escOpen) .split('\\}').join(escClose) .split('\\,').join(escComma) .split('\\.').join(escPeriod); } function unescapeBraces(str) { return str.split(escSlash).join('\\') .split(escOpen).join('{') .split(escClose).join('}') .split(escComma).join(',') .split(escPeriod).join('.'); } // Basically just str.split(","), but handling cases // where we have nested braced sections, which should be // treated as individual members, like {a,{b,c},d} function parseCommaParts(str) { if (!str) return ['']; var parts = []; var m = balanced('{', '}', str); if (!m) return str.split(','); var pre = m.pre; var body = m.body; var post = m.post; var p = pre.split(','); p[p.length-1] += '{' + body + '}'; var postParts = parseCommaParts(post); if (post.length) { p[p.length-1] += postParts.shift(); p.push.apply(p, postParts); } parts.push.apply(parts, p); return parts; } function expandTop(str) { if (!str) return []; return expand(escapeBraces(str), true).map(unescapeBraces); } function identity(e) { return e; } function embrace(str) { return '{' + str + '}'; } function isPadded(el) { return /^-?0\d/.test(el); } function lte(i, y) { return i <= y; } function gte(i, y) { return i >= y; } function expand(str, isTop) { var expansions = []; var m = balanced('{', '}', str); if (!m || /\$$/.test(m.pre)) return [str]; var isNumericSequence = /^-?\d+\.\.-?\d+(?:\.\.-?\d+)?$/.test(m.body); var isAlphaSequence = /^[a-zA-Z]\.\.[a-zA-Z](?:\.\.-?\d+)?$/.test(m.body); var isSequence = isNumericSequence || isAlphaSequence; var isOptions = /^(.*,)+(.+)?$/.test(m.body); if (!isSequence && !isOptions) { // {a},b} if (m.post.match(/,.*}/)) { str = m.pre + '{' + m.body + escClose + m.post; return expand(str); } return [str]; } var n; if (isSequence) { n = m.body.split(/\.\./); } else { n = parseCommaParts(m.body); if (n.length === 1) { // x{{a,b}}y ==> x{a}y x{b}y n = expand(n[0], false).map(embrace); if 
(n.length === 1) { var post = m.post.length ? expand(m.post, false) : ['']; return post.map(function(p) { return m.pre + n[0] + p; }); } } } // at this point, n is the parts, and we know it's not a comma set // with a single entry. // no need to expand pre, since it is guaranteed to be free of brace-sets var pre = m.pre; var post = m.post.length ? expand(m.post, false) : ['']; var N; if (isSequence) { var x = numeric(n[0]); var y = numeric(n[1]); var width = Math.max(n[0].length, n[1].length) var incr = n.length == 3 ? Math.abs(numeric(n[2])) : 1; var test = lte; var reverse = y < x; if (reverse) { incr *= -1; test = gte; } var pad = n.some(isPadded); N = []; for (var i = x; test(i, y); i += incr) { var c; if (isAlphaSequence) { c = String.fromCharCode(i); if (c === '\\') c = ''; } else { c = String(i); if (pad) { var need = width - c.length; if (need > 0) { var z = new Array(need + 1).join('0'); if (i < 0) c = '-' + z + c.slice(1); else c = z + c; } } } N.push(c); } } else { N = concatMap(n, function(el) { return expand(el, false) }); } for (var j = 0; j < N.length; j++) { for (var k = 0; k < post.length; k++) { var expansion = pre + N[j] + post[k]; if (!isTop || isSequence || expansion) expansions.push(expansion); } } return expansions; } ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000755000000000000000000000000012631326456032163 5ustar 00000000000000././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000521112631326456032164 0ustar 00000000000000{ "_args": [ [ "brace-expansion@^1.0.0", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/glob/node_modules/minimatch" ] ], "_from": "brace-expansion@>=1.0.0 <2.0.0", "_id": "brace-expansion@1.1.2", "_inCache": true, "_installable": true, "_location": "/node-gyp/glob/minimatch/brace-expansion", "_nodeVersion": "4.2.1", "_npmUser": { "email": "julian@juliangruber.com", "name": "juliangruber" }, "_npmVersion": "2.14.7", "_phantomChildren": {}, "_requested": { "name": "brace-expansion", "raw": "brace-expansion@^1.0.0", "rawSpec": "^1.0.0", "scope": null, "spec": ">=1.0.0 <2.0.0", "type": "range" }, "_requiredBy": [ "/node-gyp/glob/minimatch" ], "_resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.2.tgz", "_shasum": "f21445d0488b658e2771efd870eff51df29f04ef", "_shrinkwrap": null, "_spec": "brace-expansion@^1.0.0", "_where": "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/glob/node_modules/minimatch", "author": { "email": "mail@juliangruber.com", "name": "Julian Gruber", "url": "http://juliangruber.com" }, "bugs": { "url": "https://github.com/juliangruber/brace-expansion/issues" }, "dependencies": { "balanced-match": "^0.3.0", "concat-map": "0.0.1" }, "description": "Brace expansion as known from sh/bash", "devDependencies": { "tape": "4.2.2" }, "directories": {}, "dist": { "shasum": "f21445d0488b658e2771efd870eff51df29f04ef", "tarball": "http://registry.npmjs.org/brace-expansion/-/brace-expansion-1.1.2.tgz" }, "gitHead": 
"b03773a30fa516b1374945b68e9acb6253d595fa", "homepage": "https://github.com/juliangruber/brace-expansion", "keywords": [], "license": "MIT", "main": "index.js", "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" }, { "name": "isaacs", "email": "isaacs@npmjs.com" } ], "name": "brace-expansion", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/juliangruber/brace-expansion.git" }, "scripts": { "gentest": "bash test/generate.sh", "test": "tape test/*.js" }, "testling": { "browsers": [ "android-browser/4.2..latest", "chrome/25..latest", "chrome/canary", "firefox/20..latest", "firefox/nightly", "ie/8..latest", "ipad/6.0..latest", "iphone/6.0..latest", "opera/12..latest", "opera/next", "safari/5.1..latest" ], "files": "test/*.js" }, "version": "1.1.2" } ././@LongLink0000000000000000000000000000021000000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000755000000000000000000000000012631326456032163 5ustar 00000000000000././@LongLink0000000000000000000000000000020400000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000755000000000000000000000000012631326456032163 5ustar 00000000000000././@LongLink0000000000000000000000000000022200000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/.npmignorenpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000002712631326456032164 0ustar 00000000000000node_modules .DS_Store ././@LongLink0000000000000000000000000000022300000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/.travis.ymlnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000004612631326456032165 0ustar 00000000000000language: node_js node_js: - "0.10" ././@LongLink0000000000000000000000000000022200000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/LICENSE.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000211012631326456032157 0ustar 00000000000000(MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022000000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/Makefilenpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000007012631326456032162 0ustar 00000000000000 test: @node_modules/.bin/tape test/*.js .PHONY: test ././@LongLink0000000000000000000000000000022100000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000576012631326456032175 0ustar 00000000000000# balanced-match Match balanced string pairs, like `{` and `}` or `<b>` and `</b>`. [![build status](https://secure.travis-ci.org/juliangruber/balanced-match.svg)](http://travis-ci.org/juliangruber/balanced-match) [![downloads](https://img.shields.io/npm/dm/balanced-match.svg)](https://www.npmjs.org/package/balanced-match) [![testling badge](https://ci.testling.com/juliangruber/balanced-match.png)](https://ci.testling.com/juliangruber/balanced-match) ## Example Get the first matching pair of braces: ```js var balanced = require('balanced-match'); console.log(balanced('{', '}', 'pre{in{nested}}post')); console.log(balanced('{', '}', 'pre{first}between{second}post')); ``` The matches are: ```bash $ node example.js { start: 3, end: 14, pre: 'pre', body: 'in{nested}', post: 'post' } { start: 3, end: 9, pre: 'pre', body: 'first', post: 'between{second}post' } ``` ## API ### var m = balanced(a, b, str) For the first non-nested matching pair of `a` and `b` in `str`, return an object with those keys: * **start** the index of the first match of `a` * **end** the index of the matching `b` * **pre** the preamble, `a` and `b` not included * **body** the match, `a` and `b` not included * **post** the postscript, `a` and `b` not included If there's no match, `undefined` will be returned. If the `str` contains more `a` than `b` / there are unmatched pairs, the first match that was closed will be used. For example, `{{a}` will match `['{', 'a', '']`. ### var r = balanced.range(a, b, str) For the first non-nested matching pair of `a` and `b` in `str`, return an array with indexes: `[ <begin>, <end> ]`. If there's no match, `undefined` will be returned. If the `str` contains more `a` than `b` / there are unmatched pairs, the first match that was closed will be used. For example, `{{a}` will match `[ 1, 3 ]`.
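A short sketch of `balanced.range` on the same example string used above (the second call just shows the no-match case):

```js
var balanced = require('balanced-match');

balanced.range('{', '}', 'pre{in{nested}}post')
// => [ 3, 14 ]
balanced.range('{', '}', 'nope')
// => undefined
```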
## Installation With [npm](https://npmjs.org) do: ```bash npm install balanced-match ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022200000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/example.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000023112631326456032161 0ustar 00000000000000var balanced = require('./'); console.log(balanced('{', '}', 'pre{in{nested}}post')); console.log(balanced('{', '}', 'pre{first}between{second}post')); ././@LongLink0000000000000000000000000000022000000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000173512631326456032173 0ustar 00000000000000module.exports = balanced; function balanced(a, b, str) { var r = range(a, b, str); return r && { start: r[0], end: r[1], pre: str.slice(0, r[0]), body: str.slice(r[0] + a.length, r[1]), post: str.slice(r[1] + b.length) }; } balanced.range = range; function range(a, b, str) { var begs, beg, left, right, result; var ai = str.indexOf(a); var bi = str.indexOf(b, ai + 1); var i = ai; if (ai >= 0 && bi > 0) { begs = []; left = str.length; while (i < str.length && i >= 0 && ! result) { if (i == ai) { begs.push(i); ai = str.indexOf(a, i + 1); } else if (begs.length == 1) { result = [ begs.pop(), bi ]; } else { beg = begs.pop(); if (beg < left) { left = beg; right = bi; } bi = str.indexOf(b, i + 1); } i = ai < bi && ai >= 0 ? 
ai : bi; } if (begs.length) { result = [ left, right ]; } } return result; } ././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000517112631326456032171 0ustar 00000000000000{ "_args": [ [ "balanced-match@^0.3.0", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion" ] ], "_from": "balanced-match@>=0.3.0 <0.4.0", "_id": "balanced-match@0.3.0", "_inCache": true, "_installable": true, "_location": "/node-gyp/glob/minimatch/brace-expansion/balanced-match", "_nodeVersion": "4.2.1", "_npmUser": { "email": "julian@juliangruber.com", "name": "juliangruber" }, "_npmVersion": "2.14.7", "_phantomChildren": {}, "_requested": { "name": "balanced-match", "raw": "balanced-match@^0.3.0", "rawSpec": "^0.3.0", "scope": null, "spec": ">=0.3.0 <0.4.0", "type": "range" }, "_requiredBy": [ "/node-gyp/glob/minimatch/brace-expansion" ], "_resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-0.3.0.tgz", "_shasum": "a91cdd1ebef1a86659e70ff4def01625fc2d6756", "_shrinkwrap": null, "_spec": "balanced-match@^0.3.0", "_where": "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion", "author": { "email": "mail@juliangruber.com", "name": "Julian Gruber", "url": "http://juliangruber.com" }, "bugs": { "url": "https://github.com/juliangruber/balanced-match/issues" }, "dependencies": {}, "description": "Match balanced character pairs, like \"{\" and \"}\"", "devDependencies": { "tape": "~4.2.2" }, "directories": {}, "dist": { "shasum": "a91cdd1ebef1a86659e70ff4def01625fc2d6756", "tarball": "http://registry.npmjs.org/balanced-match/-/balanced-match-0.3.0.tgz" }, "gitHead": "a7114b0986554787e90b7ac595a043ca75ea77e5", "homepage": "https://github.com/juliangruber/balanced-match", "keywords": [ "balanced", "match", "parse", "regexp", "test" ], "license": "MIT", "main": "index.js", "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" } ], "name": "balanced-match", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/juliangruber/balanced-match.git" }, "scripts": { "test": "make test" }, "testling": { "browsers": [ "android-browser/4.2..latest", "chrome/25..latest", "chrome/canary", "firefox/20..latest", "firefox/nightly", "ie/8..latest", "ipad/6.0..latest", "iphone/6.0..latest", "opera/12..latest", "opera/next", "safari/5.1..latest" ], "files": "test/*.js" }, "version": "0.3.0" } ././@LongLink0000000000000000000000000000021500000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/test/npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000755000000000000000000000000012631326456032163 5ustar 00000000000000././@LongLink0000000000000000000000000000023000000000000011210 Lustar 
00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/balanced-match/test/balanced.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000342712631326456032173 0ustar 00000000000000var test = require('tape'); var balanced = require('..'); test('balanced', function(t) { t.deepEqual(balanced('{', '}', 'pre{in{nest}}post'), { start: 3, end: 12, pre: 'pre', body: 'in{nest}', post: 'post' }); t.deepEqual(balanced('{', '}', '{{{{{{{{{in}post'), { start: 8, end: 11, pre: '{{{{{{{{', body: 'in', post: 'post' }); t.deepEqual(balanced('{', '}', 'pre{body{in}post'), { start: 8, end: 11, pre: 'pre{body', body: 'in', post: 'post' }); t.deepEqual(balanced('{', '}', 'pre}{in{nest}}post'), { start: 4, end: 13, pre: 'pre}', body: 'in{nest}', post: 'post' }); t.deepEqual(balanced('{', '}', 'pre{body}between{body2}post'), { start: 3, end: 8, pre: 'pre', body: 'body', post: 'between{body2}post' }); t.notOk(balanced('{', '}', 'nope'), 'should be notOk'); t.deepEqual(balanced('<b>', '</b>', 'pre<b>in<b>nest</b></b>post'), { start: 3, end: 19, pre: 'pre', body: 'in<b>nest</b>', post: 'post' }); t.deepEqual(balanced('<b>', '</b>', 'pre</b><b>in<b>nest</b></b>post'), { start: 7, end: 23, pre: 'pre</b>', body: 'in<b>nest</b>', post: 'post' }); t.deepEqual(balanced('{{', '}}', 'pre{{{in}}}post'), { start: 3, end: 9, pre: 'pre', body: '{in}', post: 'post' }); t.deepEqual(balanced('{{{', '}}', 'pre{{{in}}}post'), { start: 3, end: 8, pre: 'pre', body: 'in', post: '}post' }); t.deepEqual(balanced('{', '}', 'pre{{first}in{second}post'), { start: 4, end: 10, pre: 'pre{', body: 'first', post: 'in{second}post' }); t.deepEqual(balanced('<', '>', 'pre<>post'), { start: 3, end: 4, pre: 'pre', body: '', post: 'post' }); t.end(); }); ././@LongLink0000000000000000000000000000021700000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/.travis.ymlnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000005312631326456032163 0ustar 00000000000000language: node_js node_js: - 0.4 - 0.6 ././@LongLink0000000000000000000000000000021300000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000206112631326456032164 0ustar 00000000000000This software is released under the MIT license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022300000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/README.markdownnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000221512631326456032165 0ustar 00000000000000concat-map ========== Concatenative mapdashery. [![browser support](http://ci.testling.com/substack/node-concat-map.png)](http://ci.testling.com/substack/node-concat-map) [![build status](https://secure.travis-ci.org/substack/node-concat-map.png)](http://travis-ci.org/substack/node-concat-map) example ======= ``` js var concatMap = require('concat-map'); var xs = [ 1, 2, 3, 4, 5, 6 ]; var ys = concatMap(xs, function (x) { return x % 2 ? [ x - 0.1, x, x + 0.1 ] : []; }); console.dir(ys); ``` *** ``` [ 0.9, 1, 1.1, 2.9, 3, 3.1, 4.9, 5, 5.1 ] ``` methods ======= ``` js var concatMap = require('concat-map') ``` concatMap(xs, fn) ----------------- Return an array of concatenated elements by calling `fn(x, i)` for each element `x` and each index `i` in the array `xs`. When `fn(x, i)` returns an array, its result will be concatenated with the result array. If `fn(x, i)` returns anything else, that value will be pushed onto the end of the result array. install ======= With [npm](http://npmjs.org) do: ``` npm install concat-map ``` license ======= MIT notes ===== This module was written while sitting high above the ground in a tree. 
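As a quick check of the array-vs-scalar rule described under `concatMap(xs, fn)` above, here is a small sketch (expected output assumed from that description, matching the "scalars" test in this package):

``` js
var concatMap = require('concat-map');

// arrays returned by fn are concatenated into the result;
// any other return value is pushed on as a single element
var ys = concatMap([ 'a', 'b', 'c' ], function (x) {
    return x === 'b' ? [ 'B', 'B' ] : x;
});
console.dir(ys); // [ 'a', 'B', 'B', 'c' ]
```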
././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/example/npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000755000000000000000000000000012631326456032163 5ustar 00000000000000././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000053112631326456032164 0ustar 00000000000000module.exports = function (xs, fn) { var res = []; for (var i = 0; i < xs.length; i++) { var x = fn(xs[i], i); if (isArray(x)) res.push.apply(res, x); else res.push(x); } return res; }; var isArray = Array.isArray || function (xs) { return Object.prototype.toString.call(xs) === '[object Array]'; }; ././@LongLink0000000000000000000000000000022000000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000467312631326456032177 0ustar 00000000000000{ "_args": [ [ "concat-map@0.0.1", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion" ] ], "_from": "concat-map@0.0.1", "_id": "concat-map@0.0.1", "_inCache": true, "_installable": true, "_location": "/node-gyp/glob/minimatch/brace-expansion/concat-map", "_npmUser": { "email": "mail@substack.net", "name": "substack" }, "_npmVersion": "1.3.21", "_phantomChildren": {}, "_requested": { "name": "concat-map", "raw": "concat-map@0.0.1", "rawSpec": "0.0.1", "scope": null, "spec": "0.0.1", "type": "version" }, "_requiredBy": [ "/node-gyp/glob/minimatch/brace-expansion" ], "_resolved": "https://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz", "_shasum": "d8a96bd77fd68df7793a73036a3ba0d5405d477b", "_shrinkwrap": null, "_spec": "concat-map@0.0.1", "_where": "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion", "author": { "email": "mail@substack.net", "name": "James Halliday", "url": "http://substack.net" }, "bugs": { "url": "https://github.com/substack/node-concat-map/issues" }, "dependencies": {}, "description": "concatenative mapdashery", "devDependencies": { "tape": "~2.4.0" }, "directories": { "example": "example", "test": "test" }, "dist": { "shasum": "d8a96bd77fd68df7793a73036a3ba0d5405d477b", "tarball": "http://registry.npmjs.org/concat-map/-/concat-map-0.0.1.tgz" }, "homepage": "https://github.com/substack/node-concat-map", "keywords": [ "concat", "concatMap", "functional", "higher-order", "map" ], "license": "MIT", "main": "index.js", "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "name": "concat-map", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/substack/node-concat-map.git" }, "scripts": { "test": "tape test/*.js" }, "testling": { "browsers": { "chrome": [ 10, 22 ], "ff": [ 10, 15, 3.5 ], "ie": [ 6, 7, 8, 9 ], "opera": [ 12 ], "safari": [ 5.1 ] }, "files": "test/*.js" }, 
"version": "0.0.1" } ././@LongLink0000000000000000000000000000021100000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/test/npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000755000000000000000000000000012631326456032163 5ustar 00000000000000././@LongLink0000000000000000000000000000022200000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/example/map.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000025312631326456032165 0ustar 00000000000000var concatMap = require('../'); var xs = [ 1, 2, 3, 4, 5, 6 ]; var ys = concatMap(xs, function (x) { return x % 2 ? [ x - 0.1, x, x + 0.1 ] : []; }); console.dir(ys); ././@LongLink0000000000000000000000000000021700000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-expansion/node_modules/concat-map/test/map.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/glob/node_modules/minimatch/node_modules/brace-exp0000644000000000000000000000206312631326456032166 0ustar 00000000000000var concatMap = require('../'); var test = require('tape'); test('empty or not', function (t) { var xs = [ 1, 2, 3, 4, 5, 6 ]; var ixes = []; var ys = concatMap(xs, function (x, ix) { ixes.push(ix); return x % 2 ? [ x - 0.1, x, x + 0.1 ] : []; }); t.same(ys, [ 0.9, 1, 1.1, 2.9, 3, 3.1, 4.9, 5, 5.1 ]); t.same(ixes, [ 0, 1, 2, 3, 4, 5 ]); t.end(); }); test('always something', function (t) { var xs = [ 'a', 'b', 'c', 'd' ]; var ys = concatMap(xs, function (x) { return x === 'b' ? [ 'B', 'B', 'B' ] : [ x ]; }); t.same(ys, [ 'a', 'B', 'B', 'B', 'c', 'd' ]); t.end(); }); test('scalars', function (t) { var xs = [ 'a', 'b', 'c', 'd' ]; var ys = concatMap(xs, function (x) { return x === 'b' ? [ 'B', 'B', 'B' ] : x; }); t.same(ys, [ 'a', 'B', 'B', 'B', 'c', 'd' ]); t.end(); }); test('undefs', function (t) { var xs = [ 'a', 'b', 'c', 'd' ]; var ys = concatMap(xs, function () {}); t.same(ys, [ undefined, undefined, undefined, undefined ]); t.end(); }); npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/.npmignore0000644000000000000000000000001512631326456024312 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/.travis.yml0000644000000000000000000000005512631326456024430 0ustar 00000000000000language: node_js node_js: - 0.10 - 0.11 npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/LICENSE0000644000000000000000000000210412631326456023321 0ustar 00000000000000Copyright 2009, 2010, 2011 Isaac Z. Schlueter. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/README.md0000644000000000000000000001500512631326456023577 0ustar 00000000000000# minimatch A minimal matching utility. [![Build Status](https://secure.travis-ci.org/isaacs/minimatch.png)](http://travis-ci.org/isaacs/minimatch) This is the matching library used internally by npm. Eventually, it will replace the C binding in node-glob. It works by converting glob expressions into JavaScript `RegExp` objects. ## Usage ```javascript var minimatch = require("minimatch") minimatch("bar.foo", "*.foo") // true! minimatch("bar.foo", "*.bar") // false! minimatch("bar.foo", "*.+(bar|foo)", { debug: true }) // true, and noisy! ``` ## Features Supports these glob features: * Brace Expansion * Extended glob matching * "Globstar" `**` matching See: * `man sh` * `man bash` * `man 3 fnmatch` * `man 5 gitignore` ## Minimatch Class Create a minimatch object by instantiating the `minimatch.Minimatch` class. ```javascript var Minimatch = require("minimatch").Minimatch var mm = new Minimatch(pattern, options) ``` ### Properties * `pattern` The original pattern the minimatch object represents. * `options` The options supplied to the constructor. * `set` A 2-dimensional array of regexp or string expressions. Each row in the array corresponds to a brace-expanded pattern. Each item in the row corresponds to a single path-part. For example, the pattern `{a,b/c}/d` would expand to a set of patterns like: [ [ a, d ] , [ b, c, d ] ] If a portion of the pattern doesn't have any "magic" in it (that is, it's something like `"foo"` rather than `fo*o?`), then it will be left as a string rather than converted to a regular expression. * `regexp` Created by the `makeRe` method. A single regular expression expressing the entire pattern. This is useful in cases where you wish to use the pattern somewhat like `fnmatch(3)` with `FNM_PATH` enabled. * `negate` True if the pattern is negated. * `comment` True if the pattern is a comment. * `empty` True if the pattern is `""`. ### Methods * `makeRe` Generate the `regexp` member if necessary, and return it. Will return `false` if the pattern is invalid. * `match(fname)` Return true if the filename matches the pattern, or false otherwise. * `matchOne(fileArray, patternArray, partial)` Take a `/`-split filename, and match it against a single row in the `regExpSet`. This method is mainly for internal use, but is exposed so that it can be used by a glob-walker that needs to avoid excessive filesystem calls. All other methods are internal, and will be called as necessary. ## Functions The top-level exported function has a `cache` property, which is an LRU cache set to store 100 items. So, calling these methods repeatedly with the same pattern and options will use the same Minimatch object, saving the cost of parsing it multiple times. ### minimatch(path, pattern, options) Main export. Tests a path against the pattern using the options.
```javascript var isJS = minimatch(file, "*.js", { matchBase: true }) ``` ### minimatch.filter(pattern, options) Returns a function that tests its supplied argument, suitable for use with `Array.filter`. Example: ```javascript var javascripts = fileList.filter(minimatch.filter("*.js", {matchBase: true})) ``` ### minimatch.match(list, pattern, options) Match against the list of files, in the style of fnmatch or glob. If nothing is matched, and options.nonull is set, then return a list containing the pattern itself. ```javascript var javascripts = minimatch.match(fileList, "*.js", {matchBase: true}) ``` ### minimatch.makeRe(pattern, options) Make a regular expression object from the pattern. ## Options All options are `false` by default. ### debug Dump a ton of stuff to stderr. ### nobrace Do not expand `{a,b}` and `{1..3}` brace sets. ### noglobstar Disable `**` matching against multiple folder names. ### dot Allow patterns to match filenames starting with a period, even if the pattern does not explicitly have a period in that spot. Note that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot` is set. ### noext Disable "extglob" style patterns like `+(a|b)`. ### nocase Perform a case-insensitive match. ### nonull When a match is not found by `minimatch.match`, return a list containing the pattern itself if this option is set. When not set, an empty list is returned if there are no matches. ### matchBase If set, then patterns without slashes will be matched against the basename of the path if it contains slashes. For example, `a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`. ### nocomment Suppress the behavior of treating `#` at the start of a pattern as a comment. ### nonegate Suppress the behavior of treating a leading `!` character as negation. ### flipNegate Returns from negate expressions the same as if they were not negated. (Ie, true on a hit, false on a miss.) ## Comparisons to other fnmatch/glob implementations While strict compliance with the existing standards is a worthwhile goal, some discrepancies exist between minimatch and other implementations, and are intentional. If the pattern starts with a `!` character, then it is negated. Set the `nonegate` flag to suppress this behavior, and treat leading `!` characters normally. This is perhaps relevant if you wish to start the pattern with a negative extglob pattern like `!(a|B)`. Multiple `!` characters at the start of a pattern will negate the pattern multiple times. If a pattern starts with `#`, then it is treated as a comment, and will not match anything. Use `\#` to match a literal `#` at the start of a line, or set the `nocomment` flag to suppress this behavior. The double-star character `**` is supported by default, unless the `noglobstar` flag is set. This is supported in the manner of bsdglob and bash 4.1, where `**` only has special significance if it is the only thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but `a/**b` will not. If an escaped pattern has no matches, and the `nonull` flag is set, then minimatch.match returns the pattern as-provided, rather than interpreting the character escapes. For example, `minimatch.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than `"*a?"`. This is akin to setting the `nullglob` option in bash, except that it does not resolve escaped pattern characters. If brace expansion is not disabled, then it is performed before any other interpretation of the glob pattern.
Thus, a pattern like `+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded **first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are checked for validity. Since those two are valid, matching proceeds. npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/minimatch.js0000644000000000000000000007021212631326456024630 0ustar 00000000000000;(function (require, exports, module, platform) { if (module) module.exports = minimatch else exports.minimatch = minimatch if (!require) { require = function (id) { switch (id) { case "sigmund": return function sigmund (obj) { return JSON.stringify(obj) } case "path": return { basename: function (f) { f = f.split(/[\/\\]/) var e = f.pop() if (!e) e = f.pop() return e }} case "lru-cache": return function LRUCache () { // not quite an LRU, but still space-limited. var cache = {} var cnt = 0 this.set = function (k, v) { cnt ++ if (cnt >= 100) cache = {} cache[k] = v } this.get = function (k) { return cache[k] } } } } } minimatch.Minimatch = Minimatch var LRU = require("lru-cache") , cache = minimatch.cache = new LRU({max: 100}) , GLOBSTAR = minimatch.GLOBSTAR = Minimatch.GLOBSTAR = {} , sigmund = require("sigmund") var path = require("path") // any single thing other than / // don't need to escape / when using new RegExp() , qmark = "[^/]" // * => any number of characters , star = qmark + "*?" // ** when dots are allowed. Anything goes, except .. and . // not (^ or / followed by one or two dots followed by $ or /), // followed by anything, any number of times. , twoStarDot = "(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?" // not a ^ or / followed by a dot, // followed by anything, any number of times. , twoStarNoDot = "(?:(?!(?:\\\/|^)\\.).)*?" // characters that need to be escaped in RegExp. , reSpecials = charSet("().*{}+?[]^$\\!") // "abc" -> { a:true, b:true, c:true } function charSet (s) { return s.split("").reduce(function (set, c) { set[c] = true return set }, {}) } // normalizes slashes. var slashSplit = /\/+/ minimatch.filter = filter function filter (pattern, options) { options = options || {} return function (p, i, list) { return minimatch(p, pattern, options) } } function ext (a, b) { a = a || {} b = b || {} var t = {} Object.keys(b).forEach(function (k) { t[k] = b[k] }) Object.keys(a).forEach(function (k) { t[k] = a[k] }) return t } minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return minimatch var orig = minimatch var m = function minimatch (p, pattern, options) { return orig.minimatch(p, pattern, ext(def, options)) } m.Minimatch = function Minimatch (pattern, options) { return new orig.Minimatch(pattern, ext(def, options)) } return m } Minimatch.defaults = function (def) { if (!def || !Object.keys(def).length) return Minimatch return minimatch.defaults(def).Minimatch } function minimatch (p, pattern, options) { if (typeof pattern !== "string") { throw new TypeError("glob pattern string required") } if (!options) options = {} // shortcut: comments match nothing. 
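// The IIFE parameters above form a hand-rolled module shim: when require()
// is absent (seemingly to allow running without a module loader), "sigmund",
// "path", and "lru-cache" are swapped for minimal stand-ins. Note the
// stand-in cache is not a true LRU; as its own comment says ("not quite an
// LRU, but still space-limited"), it simply empties itself after ~100 entries.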
if (!options.nocomment && pattern.charAt(0) === "#") { return false } // "" only matches "" if (pattern.trim() === "") return p === "" return new Minimatch(pattern, options).match(p) } function Minimatch (pattern, options) { if (!(this instanceof Minimatch)) { return new Minimatch(pattern, options, cache) } if (typeof pattern !== "string") { throw new TypeError("glob pattern string required") } if (!options) options = {} pattern = pattern.trim() // windows: need to use /, not \ // On other platforms, \ is a valid (albeit bad) filename char. if (platform === "win32") { pattern = pattern.split("\\").join("/") } // lru storage. // these things aren't particularly big, but walking down the string // and turning it into a regexp can get pretty costly. var cacheKey = pattern + "\n" + sigmund(options) var cached = minimatch.cache.get(cacheKey) if (cached) return cached minimatch.cache.set(cacheKey, this) this.options = options this.set = [] this.pattern = pattern this.regexp = null this.negate = false this.comment = false this.empty = false // make the set of regexps etc. this.make() } Minimatch.prototype.debug = function() {} Minimatch.prototype.make = make function make () { // don't do it more than once. if (this._made) return var pattern = this.pattern var options = this.options // empty patterns and comments match nothing. if (!options.nocomment && pattern.charAt(0) === "#") { this.comment = true return } if (!pattern) { this.empty = true return } // step 1: figure out negation, etc. this.parseNegate() // step 2: expand braces var set = this.globSet = this.braceExpand() if (options.debug) this.debug = console.error this.debug(this.pattern, set) // step 3: now we have a set, so turn each one into a series of path-portion // matching patterns. // These will be regexps, except in the case of "**", which is // set to the GLOBSTAR object for globstar behavior, // and will not contain any / characters set = this.globParts = set.map(function (s) { return s.split(slashSplit) }) this.debug(this.pattern, set) // glob --> regexps set = set.map(function (s, si, set) { return s.map(this.parse, this) }, this) this.debug(this.pattern, set) // filter out everything that didn't compile properly. set = set.filter(function (s) { return -1 === s.indexOf(false) }) this.debug(this.pattern, set) this.set = set } Minimatch.prototype.parseNegate = parseNegate function parseNegate () { var pattern = this.pattern , negate = false , options = this.options , negateOffset = 0 if (options.nonegate) return for ( var i = 0, l = pattern.length ; i < l && pattern.charAt(i) === "!" ; i ++) { negate = !negate negateOffset ++ } if (negateOffset) this.pattern = pattern.substr(negateOffset) this.negate = negate } // Brace expansion: // a{b,c}d -> abd acd // a{b,}c -> abc ac // a{0..3}d -> a0d a1d a2d a3d // a{b,c{d,e}f}g -> abg acdfg acefg // a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg // // Invalid sets are not expanded. // a{2..}b -> a{2..}b // a{b}c -> a{b}c minimatch.braceExpand = function (pattern, options) { return new Minimatch(pattern, options).braceExpand() } Minimatch.prototype.braceExpand = braceExpand function pad(n, width, z) { z = z || '0'; n = n + ''; return n.length >= width ? n : new Array(width - n.length + 1).join(z) + n; } function braceExpand (pattern, options) { options = options || this.options pattern = typeof pattern === "undefined" ? this.pattern : pattern if (typeof pattern === "undefined") { throw new Error("undefined pattern") } if (options.nobrace || !pattern.match(/\{.*\}/)) { // shortcut. 
no need to expand. return [pattern] } var escaping = false // examples and comments refer to this crazy pattern: // a{b,c{d,e},{f,g}h}x{y,z} // expected: // abxy // abxz // acdxy // acdxz // acexy // acexz // afhxy // afhxz // aghxy // aghxz // everything before the first \{ is just a prefix. // So, we pluck that off, and work with the rest, // and then prepend it to everything we find. if (pattern.charAt(0) !== "{") { this.debug(pattern) var prefix = null for (var i = 0, l = pattern.length; i < l; i ++) { var c = pattern.charAt(i) this.debug(i, c) if (c === "\\") { escaping = !escaping } else if (c === "{" && !escaping) { prefix = pattern.substr(0, i) break } } // actually no sets, all { were escaped. if (prefix === null) { this.debug("no sets") return [pattern] } var tail = braceExpand.call(this, pattern.substr(i), options) return tail.map(function (t) { return prefix + t }) } // now we have something like: // {b,c{d,e},{f,g}h}x{y,z} // walk through the set, expanding each part, until // the set ends. then, we'll expand the suffix. // If the set only has a single member, then'll put the {} back // first, handle numeric sets, since they're easier var numset = pattern.match(/^\{(-?[0-9]+)\.\.(-?[0-9]+)\}/) if (numset) { this.debug("numset", numset[1], numset[2]) var suf = braceExpand.call(this, pattern.substr(numset[0].length), options) , start = +numset[1] , needPadding = numset[1][0] === '0' , startWidth = numset[1].length , padded , end = +numset[2] , inc = start > end ? -1 : 1 , set = [] for (var i = start; i != (end + inc); i += inc) { padded = needPadding ? pad(i, startWidth) : i + '' // append all the suffixes for (var ii = 0, ll = suf.length; ii < ll; ii ++) { set.push(padded + suf[ii]) } } return set } // ok, walk through the set // We hope, somewhat optimistically, that there // will be a } at the end. // If the closing brace isn't found, then the pattern is // interpreted as braceExpand("\\" + pattern) so that // the leading \{ will be interpreted literally. 
var i = 1 // skip the \{ , depth = 1 , set = [] , member = "" , sawEnd = false , escaping = false function addMember () { set.push(member) member = "" } this.debug("Entering for") FOR: for (i = 1, l = pattern.length; i < l; i ++) { var c = pattern.charAt(i) this.debug("", i, c) if (escaping) { escaping = false member += "\\" + c } else { switch (c) { case "\\": escaping = true continue case "{": depth ++ member += "{" continue case "}": depth -- // if this closes the actual set, then we're done if (depth === 0) { addMember() // pluck off the close-brace i ++ break FOR } else { member += c continue } case ",": if (depth === 1) { addMember() } else { member += c } continue default: member += c continue } // switch } // else } // for // now we've either finished the set, and the suffix is // pattern.substr(i), or we have *not* closed the set, // and need to escape the leading brace if (depth !== 0) { this.debug("didn't close", pattern) return braceExpand.call(this, "\\" + pattern, options) } // x{y,z} -> ["xy", "xz"] this.debug("set", set) this.debug("suffix", pattern.substr(i)) var suf = braceExpand.call(this, pattern.substr(i), options) // ["b", "c{d,e}","{f,g}h"] -> // [["b"], ["cd", "ce"], ["fh", "gh"]] var addBraces = set.length === 1 this.debug("set pre-expanded", set) set = set.map(function (p) { return braceExpand.call(this, p, options) }, this) this.debug("set expanded", set) // [["b"], ["cd", "ce"], ["fh", "gh"]] -> // ["b", "cd", "ce", "fh", "gh"] set = set.reduce(function (l, r) { return l.concat(r) }) if (addBraces) { set = set.map(function (s) { return "{" + s + "}" }) } // now attach the suffixes. var ret = [] for (var i = 0, l = set.length; i < l; i ++) { for (var ii = 0, ll = suf.length; ii < ll; ii ++) { ret.push(set[i] + suf[ii]) } } return ret } // parse a component of the expanded set. // At this point, no pattern may contain "/" in it // so we're going to return a 2d array, where each entry is the full // pattern, split on '/', and then turned into a regular expression. // A regexp is made at the end which joins each array with an // escaped /, and another full one which joins each regexp with |. // // Following the lead of Bash 4.1, note that "**" only has special meaning // when it is the *only* thing in a path portion. Otherwise, any series // of * is equivalent to a single *. Globstar behavior is enabled by // default, and can be disabled by setting options.noglobstar. Minimatch.prototype.parse = parse var SUBPARSE = {} function parse (pattern, isSub) { var options = this.options // shortcuts if (!options.noglobstar && pattern === "**") return GLOBSTAR if (pattern === "") return "" var re = "" , hasMagic = !!options.nocase , escaping = false // ? => one single character , patternListStack = [] , plType , stateChar , inClass = false , reClassStart = -1 , classStart = -1 // . and .. never match anything that doesn't start with ., // even when options.dot is set. , patternStart = pattern.charAt(0) === "." ? "" // anything // not (start or / followed by . or .. followed by / or end) : options.dot ? "(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))" : "(?!\\.)" , self = this function clearStateChar () { if (stateChar) { // we had some state-tracking character // that wasn't consumed by this pass. 
switch (stateChar) { case "*": re += star hasMagic = true break case "?": re += qmark hasMagic = true break default: re += "\\"+stateChar break } self.debug('clearStateChar %j %j', stateChar, re) stateChar = false } } for ( var i = 0, len = pattern.length, c ; (i < len) && (c = pattern.charAt(i)) ; i ++ ) { this.debug("%s\t%s %s %j", pattern, i, re, c) // skip over any that are escaped. if (escaping && reSpecials[c]) { re += "\\" + c escaping = false continue } SWITCH: switch (c) { case "/": // completely not allowed, even escaped. // Should already be path-split by now. return false case "\\": clearStateChar() escaping = true continue // the various stateChar values // for the "extglob" stuff. case "?": case "*": case "+": case "@": case "!": this.debug("%s\t%s %s %j <-- stateChar", pattern, i, re, c) // all of those are literals inside a class, except that // the glob [!a] means [^a] in regexp if (inClass) { this.debug(' in class') if (c === "!" && i === classStart + 1) c = "^" re += c continue } // if we already have a stateChar, then it means // that there was something like ** or +? in there. // Handle the stateChar, then proceed with this one. self.debug('call clearStateChar %j', stateChar) clearStateChar() stateChar = c // if extglob is disabled, then +(asdf|foo) isn't a thing. // just clear the statechar *now*, rather than even diving into // the patternList stuff. if (options.noext) clearStateChar() continue case "(": if (inClass) { re += "(" continue } if (!stateChar) { re += "\\(" continue } plType = stateChar patternListStack.push({ type: plType , start: i - 1 , reStart: re.length }) // negation is (?:(?!js)[^/]*) re += stateChar === "!" ? "(?:(?!" : "(?:" this.debug('plType %j %j', stateChar, re) stateChar = false continue case ")": if (inClass || !patternListStack.length) { re += "\\)" continue } clearStateChar() hasMagic = true re += ")" plType = patternListStack.pop().type // negation is (?:(?!js)[^/]*) // The others are (?:) switch (plType) { case "!": re += "[^/]*?)" break case "?": case "+": case "*": re += plType case "@": break // the default anyway } continue case "|": if (inClass || !patternListStack.length || escaping) { re += "\\|" escaping = false continue } clearStateChar() re += "|" continue // these are mostly the same in regexp and glob case "[": // swallow any state-tracking char before the [ clearStateChar() if (inClass) { re += "\\" + c continue } inClass = true classStart = i reClassStart = re.length re += c continue case "]": // a right bracket shall lose its special // meaning and represent itself in // a bracket expression if it occurs // first in the list. -- POSIX.2 2.8.3.2 if (i === classStart + 1 || !inClass) { re += "\\" + c escaping = false continue } // finish up the class. hasMagic = true inClass = false re += c continue default: // swallow any state char that wasn't consumed clearStateChar() if (escaping) { // no need escaping = false } else if (reSpecials[c] && !(c === "^" && inClass)) { re += "\\" } re += c } // switch } // for // handle the case where we left a class open. // "[abc" is valid, equivalent to "\[abc" if (inClass) { // split where the last [ was, and escape it // this is a huge pita. We now have to re-walk // the contents of the would-be class to re-translate // any characters that were passed through as-is var cs = pattern.substr(classStart + 1) , sp = this.parse(cs, SUBPARSE) re = re.substr(0, reClassStart) + "\\[" + sp[0] hasMagic = hasMagic || sp[1] } // handle the case where we had a +( thing at the *end* // of the pattern. 
// each pattern list stack adds 3 chars, and we need to go through // and escape any | chars that were passed through as-is for the regexp. // Go through and escape them, taking care not to double-escape any // | chars that were already escaped. var pl while (pl = patternListStack.pop()) { var tail = re.slice(pl.reStart + 3) // maybe some even number of \, then maybe 1 \, followed by a | tail = tail.replace(/((?:\\{2})*)(\\?)\|/g, function (_, $1, $2) { if (!$2) { // the | isn't already escaped, so escape it. $2 = "\\" } // need to escape all those slashes *again*, without escaping the // one that we need for escaping the | character. As it works out, // escaping an even number of slashes can be done by simply repeating // it exactly after itself. That's why this trick works. // // I am sorry that you have to see this. return $1 + $1 + $2 + "|" }) this.debug("tail=%j\n %s", tail, tail) var t = pl.type === "*" ? star : pl.type === "?" ? qmark : "\\" + pl.type hasMagic = true re = re.slice(0, pl.reStart) + t + "\\(" + tail } // handle trailing things that only matter at the very end. clearStateChar() if (escaping) { // trailing \\ re += "\\\\" } // only need to apply the nodot start if the re starts with // something that could conceivably capture a dot var addPatternStart = false switch (re.charAt(0)) { case ".": case "[": case "(": addPatternStart = true } // if the re is not "" at this point, then we need to make sure // it doesn't match against an empty path part. // Otherwise a/* will match a/, which it should not. if (re !== "" && hasMagic) re = "(?=.)" + re if (addPatternStart) re = patternStart + re // parsing just a piece of a larger pattern. if (isSub === SUBPARSE) { return [ re, hasMagic ] } // skip the regexp for non-magical patterns // unescape anything in it, though, so that it'll be // an exact match against a file etc. if (!hasMagic) { return globUnescape(pattern) } var flags = options.nocase ? "i" : "" , regExp = new RegExp("^" + re + "$", flags) regExp._glob = pattern regExp._src = re return regExp } minimatch.makeRe = function (pattern, options) { return new Minimatch(pattern, options || {}).makeRe() } Minimatch.prototype.makeRe = makeRe function makeRe () { if (this.regexp || this.regexp === false) return this.regexp // at this point, this.set is a 2d array of partial // pattern strings, or "**". // // It's better to use .match(). This function shouldn't // be used, really, but it's pretty convenient sometimes, // when you just want to work with a regex. var set = this.set if (!set.length) return this.regexp = false var options = this.options var twoStar = options.noglobstar ? star : options.dot ? twoStarDot : twoStarNoDot , flags = options.nocase ? "i" : "" var re = set.map(function (pattern) { return pattern.map(function (p) { return (p === GLOBSTAR) ? twoStar : (typeof p === "string") ? regExpEscape(p) : p._src }).join("\\\/") }).join("|") // must match entire pattern // ending in a * or ** will make it less strict. re = "^(?:" + re + ")$" // can match anything, as long as it's not this. if (this.negate) re = "^(?!" 
+ re + ").*$" try { return this.regexp = new RegExp(re, flags) } catch (ex) { return this.regexp = false } } minimatch.match = function (list, pattern, options) { options = options || {} var mm = new Minimatch(pattern, options) list = list.filter(function (f) { return mm.match(f) }) if (mm.options.nonull && !list.length) { list.push(pattern) } return list } Minimatch.prototype.match = match function match (f, partial) { this.debug("match", f, this.pattern) // short-circuit in the case of busted things. // comments, etc. if (this.comment) return false if (this.empty) return f === "" if (f === "/" && partial) return true var options = this.options // windows: need to use /, not \ // On other platforms, \ is a valid (albeit bad) filename char. if (platform === "win32") { f = f.split("\\").join("/") } // treat the test path as a set of pathparts. f = f.split(slashSplit) this.debug(this.pattern, "split", f) // just ONE of the pattern sets in this.set needs to match // in order for it to be valid. If negating, then just one // match means that we have failed. // Either way, return on the first hit. var set = this.set this.debug(this.pattern, "set", set) // Find the basename of the path by looking for the last non-empty segment var filename; for (var i = f.length - 1; i >= 0; i--) { filename = f[i] if (filename) break } for (var i = 0, l = set.length; i < l; i ++) { var pattern = set[i], file = f if (options.matchBase && pattern.length === 1) { file = [filename] } var hit = this.matchOne(file, pattern, partial) if (hit) { if (options.flipNegate) return true return !this.negate } } // didn't get any hits. this is success if it's a negative // pattern, failure otherwise. if (options.flipNegate) return false return this.negate } // set partial to true to test if, for example, // "/a/b" matches the start of "/*/b/*/d" // Partial means, if you run out of file before you run // out of pattern, then that's fine, as long as all // the parts match. Minimatch.prototype.matchOne = function (file, pattern, partial) { var options = this.options this.debug("matchOne", { "this": this , file: file , pattern: pattern }) this.debug("matchOne", file.length, pattern.length) for ( var fi = 0 , pi = 0 , fl = file.length , pl = pattern.length ; (fi < fl) && (pi < pl) ; fi ++, pi ++ ) { this.debug("matchOne loop") var p = pattern[pi] , f = file[fi] this.debug(pattern, p, f) // should be impossible. // some invalid regexp stuff in the set. if (p === false) return false if (p === GLOBSTAR) { this.debug('GLOBSTAR', [pattern, p, f]) // "**" // a/**/b/**/c would match the following: // a/b/x/y/z/c // a/x/y/z/b/c // a/b/x/b/x/c // a/b/c // To do this, take the rest of the pattern after // the **, and see if it would match the file remainder. // If so, return success. // If not, the ** "swallows" a segment, and try again. // This is recursively awful. // // a/**/b/**/c matching a/b/x/y/z/c // - a matches a // - doublestar // - matchOne(b/x/y/z/c, b/**/c) // - b matches b // - doublestar // - matchOne(x/y/z/c, c) -> no // - matchOne(y/z/c, c) -> no // - matchOne(z/c, c) -> no // - matchOne(c, c) yes, hit var fr = fi , pr = pi + 1 if (pr === pl) { this.debug('** at the end') // a ** at the end will just swallow the rest. // We have found a match. // however, it will not swallow /.x, unless // options.dot is set. // . and .. are *never* matched by **, for explosively // exponential reasons. for ( ; fi < fl; fi ++) { if (file[fi] === "." || file[fi] === ".." 
|| (!options.dot && file[fi].charAt(0) === ".")) return false } return true } // ok, let's see if we can swallow whatever we can. WHILE: while (fr < fl) { var swallowee = file[fr] this.debug('\nglobstar while', file, fr, pattern, pr, swallowee) // XXX remove this slice. Just pass the start index. if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) { this.debug('globstar found match!', fr, fl, swallowee) // found a match. return true } else { // can't swallow "." or ".." ever. // can only swallow ".foo" when explicitly asked. if (swallowee === "." || swallowee === ".." || (!options.dot && swallowee.charAt(0) === ".")) { this.debug("dot detected!", file, fr, pattern, pr) break WHILE } // ** swallows a segment, and continue. this.debug('globstar swallow a segment, and continue') fr ++ } } // no match was found. // However, in partial mode, we can't say this is necessarily over. // If there's more *pattern* left, then if (partial) { // ran out of file this.debug("\n>>> no match, partial?", file, fr, pattern, pr) if (fr === fl) return true } return false } // something other than ** // non-magic patterns just have to match exactly // patterns with magic have been turned into regexps. var hit if (typeof p === "string") { if (options.nocase) { hit = f.toLowerCase() === p.toLowerCase() } else { hit = f === p } this.debug("string match", p, f, hit) } else { hit = f.match(p) this.debug("pattern match", p, f, hit) } if (!hit) return false } // Note: ending in / means that we'll get a final "" // at the end of the pattern. This can only match a // corresponding "" at the end of the file. // If the file ends in /, then it can only match a // a pattern that ends in /, unless the pattern just // doesn't have any more for it. But, a/b/ should *not* // match "a/b/*", even though "" matches against the // [^/]*? pattern, except in partial mode, where it might // simply not be reached yet. // However, a/b/ should still satisfy a/* // now either we fell off the end of the pattern, or we're done. if (fi === fl && pi === pl) { // ran out of pattern and filename at the same time. // an exact hit! return true } else if (fi === fl) { // ran out of file, but still had pattern left. // this is ok if we're doing the match as part of // a glob fs traversal. return partial } else if (pi === pl) { // ran out of pattern, still have file left. // this is only acceptable if we're on the very last // empty segment of a file with a trailing slash. // a/* should match a/b/ var emptyFileEnd = (fi === fl - 1) && (file[fi] === "") return emptyFileEnd } // should be unreachable. throw new Error("wtf?") } // replace stuff like \* with * function globUnescape (s) { return s.replace(/\\(.)/g, "$1") } function regExpEscape (s) { return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, "\\$&") } })( typeof require === "function" ? require : null, this, typeof module === "object" ? module : null, typeof process === "object" ? 
process.platform : "win32" ) npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/0000755000000000000000000000000012631326456024774 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/package.json0000644000000000000000000000370012631326456024605 0ustar 00000000000000{ "_args": [ [ "minimatch@1", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp" ] ], "_from": "minimatch@>=1.0.0 <2.0.0", "_id": "minimatch@1.0.0", "_inCache": true, "_installable": true, "_location": "/node-gyp/minimatch", "_npmUser": { "email": "i@izs.me", "name": "isaacs" }, "_npmVersion": "1.4.21", "_phantomChildren": {}, "_requested": { "name": "minimatch", "raw": "minimatch@1", "rawSpec": "1", "scope": null, "spec": ">=1.0.0 <2.0.0", "type": "range" }, "_requiredBy": [ "/node-gyp" ], "_resolved": "https://registry.npmjs.org/minimatch/-/minimatch-1.0.0.tgz", "_shasum": "e0dd2120b49e1b724ce8d714c520822a9438576d", "_shrinkwrap": null, "_spec": "minimatch@1", "_where": "/Users/rebecca/code/release/npm-3/node_modules/node-gyp", "author": { "email": "i@izs.me", "name": "Isaac Z. Schlueter", "url": "http://blog.izs.me" }, "bugs": { "url": "https://github.com/isaacs/minimatch/issues" }, "dependencies": { "lru-cache": "2", "sigmund": "~1.0.0" }, "description": "a glob matcher in javascript", "devDependencies": { "tap": "" }, "directories": {}, "dist": { "shasum": "e0dd2120b49e1b724ce8d714c520822a9438576d", "tarball": "http://registry.npmjs.org/minimatch/-/minimatch-1.0.0.tgz" }, "engines": { "node": "*" }, "gitHead": "b374a643976eb55cdc19c60b6dd51ebe9bcc607a", "homepage": "https://github.com/isaacs/minimatch", "license": { "type": "MIT", "url": "http://github.com/isaacs/minimatch/raw/master/LICENSE" }, "main": "minimatch.js", "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "name": "minimatch", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/isaacs/minimatch.git" }, "scripts": { "test": "tap test/*.js" }, "version": "1.0.0" } npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/test/0000755000000000000000000000000012631326456023276 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/0000755000000000000000000000000012631326456026637 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/0000755000000000000000000000000012631326456026442 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/.npmignore0000644000000000000000000000001612631326456030633 0ustar 00000000000000/node_modules npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/.travis.yml0000644000000000000000000000016412631326456030751 0ustar 00000000000000language: node_js node_js: - '0.8' - '0.10' - '0.12' - 'iojs' before_install: - npm install -g npm@latest npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/CONTRIBUTORS0000644000000000000000000000101412631326456030513 0ustar 00000000000000# Authors, sorted by whether or not they are me Isaac Z. Schlueter Brian Cottingham Carlos Brito Lage Jesse Dailey Kevin O'Hara Marco Rogers Mark Cavage Marko Mikulicic Nathan Rajlich Satheesh Natesan Trent Mick ashleybrener n4kz npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/LICENSE0000644000000000000000000000137512631326456027652 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. 
Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/README.md0000644000000000000000000001047512631326456030125 0ustar 00000000000000# lru cache A cache object that deletes the least-recently-used items. ## Usage: ```javascript var LRU = require("lru-cache") , options = { max: 500 , length: function (n) { return n * 2 } , dispose: function (key, n) { n.close() } , maxAge: 1000 * 60 * 60 } , cache = LRU(options) , otherCache = LRU(50) // sets just the max size cache.set("key", "value") cache.get("key") // "value" cache.reset() // empty the cache ``` If you put more stuff in it, then items will fall out. If you try to put an oversized thing in it, then it'll fall out right away. ## Keys should always be Strings or Numbers Note: this module will print warnings to `console.error` if you use a key that is not a String or Number. Because items are stored in an object, which coerces keys to a string, it won't go well for you if you try to use a key that is not a unique string; it'll cause surprise collisions. For example: ```JavaScript // Bad Example! Don't do this! var cache = LRU() var a = {} var b = {} cache.set(a, 'this is a') cache.set(b, 'this is b') console.log(cache.get(a)) // prints: 'this is b' ``` ## Options * `max` The maximum size of the cache, checked by applying the length function to all values in the cache. Not setting this is kind of silly, since that's the whole purpose of this lib, but it defaults to `Infinity`. * `maxAge` Maximum age in ms. Items are not pro-actively pruned out as they age, but if you try to get an item that is too old, it'll drop it and return undefined instead of giving it to you. * `length` Function that is used to calculate the length of stored items. If you're storing strings or buffers, then you probably want to do something like `function(n){return n.length}`. The default is `function(n){return 1}`, which is fine if you want to store `max` like-sized things. * `dispose` Function that is called on items when they are dropped from the cache. This can be handy if you want to close file descriptors or do other cleanup tasks when items are no longer accessible. Called with `key, value`. It's called *before* actually removing the item from the internal cache, so if you want to immediately put it back in, you'll have to do that in a `nextTick` or `setTimeout` callback or it won't do anything. * `stale` By default, if you set a `maxAge`, it'll only actually pull stale items out of the cache when you `get(key)`. (That is, it's not pre-emptively doing a `setTimeout` or anything.) If you set `stale:true`, it'll return the stale value before deleting it. If you don't set this, then it'll return `undefined` when you try to get a stale entry, as if it had already been deleted.
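To make the `max` and `maxAge` behavior above concrete, here is a brief sketch (values chosen for illustration; eviction and staleness follow the descriptions above):

```javascript
var LRU = require("lru-cache")
var cache = LRU({ max: 2, maxAge: 1000 })

cache.set("a", 1)
cache.set("b", 2)
cache.set("c", 3)   // over max: "a", the least-recently-used key, falls out

cache.get("a")      // undefined
cache.get("c")      // 3

// once maxAge has elapsed, a get() drops the stale entry and
// returns undefined (unless the stale option is set)
setTimeout(function () {
  console.log(cache.get("c")) // undefined
}, 1100)
```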
## API * `set(key, value, maxAge)` * `get(key) => value` Both of these will update the "recently used"-ness of the key. They do what you think. `maxAge` is optional and overrides the cache `maxAge` option if provided. * `peek(key)` Returns the key value (or `undefined` if not found) without updating the "recently used"-ness of the key. (If you find yourself using this a lot, you *might* be using the wrong sort of data structure, but there are some use cases where it's handy.) * `del(key)` Deletes a key out of the cache. * `reset()` Clear the cache entirely, throwing away all values. * `has(key)` Check if a key is in the cache, without updating the recent-ness or deleting it for being stale. * `forEach(function(value,key,cache), [thisp])` Just like `Array.prototype.forEach`. Iterates over all the keys in the cache, in order of recent-ness. (Ie, more recently used items are iterated over first.) * `keys()` Return an array of the keys in the cache. * `values()` Return an array of the values in the cache. * `length()` Return total length of objects in cache taking into account `length` options function. * `itemCount` Return total quantity of objects currently in cache. Note that `stale` (see options) items are returned as part of this item count. * `dump()` Return an array of the cache entries ready for serialization and usage with `destinationCache.load(arr)`. * `load(cacheEntriesArray)` Loads another cache entries array, obtained with `sourceCache.dump()`, into the cache. The destination cache is reset before loading new entries. npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/lib/0000755000000000000000000000000012631326456027405 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/package.json0000644000000000000000000000405012631326456031124 0ustar 00000000000000{ "_args": [ [ "lru-cache@2", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/minimatch" ] ], "_from": "lru-cache@>=2.0.0 <3.0.0", "_id": "lru-cache@2.7.3", "_inCache": true, "_installable": true, "_location": "/node-gyp/minimatch/lru-cache", "_nodeVersion": "4.0.0", "_npmUser": { "email": "i@izs.me", "name": "isaacs" }, "_npmVersion": "3.3.2", "_phantomChildren": {}, "_requested": { "name": "lru-cache", "raw": "lru-cache@2", "rawSpec": "2", "scope": null, "spec": ">=2.0.0 <3.0.0", "type": "range" }, "_requiredBy": [ "/node-gyp/minimatch" ], "_resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-2.7.3.tgz", "_shasum": "6d4524e8b955f95d4f5b58851ce21dd72fb4e952", "_shrinkwrap": null, "_spec": "lru-cache@2", "_where": "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/minimatch", "author": { "email": "i@izs.me", "name": "Isaac Z.
Schlueter" }, "bugs": { "url": "https://github.com/isaacs/node-lru-cache/issues" }, "dependencies": {}, "description": "A cache object that deletes the least-recently-used items.", "devDependencies": { "tap": "^1.2.0", "weak": "" }, "directories": {}, "dist": { "shasum": "6d4524e8b955f95d4f5b58851ce21dd72fb4e952", "tarball": "http://registry.npmjs.org/lru-cache/-/lru-cache-2.7.3.tgz" }, "gitHead": "292048199f6d28b77fbe584279a1898e25e4c714", "homepage": "https://github.com/isaacs/node-lru-cache#readme", "keywords": [ "cache", "lru", "mru" ], "license": "ISC", "main": "lib/lru-cache.js", "maintainers": [ { "name": "isaacs", "email": "isaacs@npmjs.com" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "name": "lru-cache", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/isaacs/node-lru-cache.git" }, "scripts": { "test": "tap test --gc" }, "version": "2.7.3" } npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/test/0000755000000000000000000000000012631326456027616 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/lib/lru-cache.js0000644000000000000000000002003612631326456031607 0ustar 00000000000000;(function () { // closure for web browsers if (typeof module === 'object' && module.exports) { module.exports = LRUCache } else { // just set the global for non-node platforms. this.LRUCache = LRUCache } function hOP (obj, key) { return Object.prototype.hasOwnProperty.call(obj, key) } function naiveLength () { return 1 } var didTypeWarning = false function typeCheckKey(key) { if (!didTypeWarning && typeof key !== 'string' && typeof key !== 'number') { didTypeWarning = true console.error(new TypeError("LRU: key must be a string or number. Almost certainly a bug! " + typeof key).stack) } } function LRUCache (options) { if (!(this instanceof LRUCache)) return new LRUCache(options) if (typeof options === 'number') options = { max: options } if (!options) options = {} this._max = options.max // Kind of weird to have a default max of Infinity, but oh well. if (!this._max || !(typeof this._max === "number") || this._max <= 0 ) this._max = Infinity this._lengthCalculator = options.length || naiveLength if (typeof this._lengthCalculator !== "function") this._lengthCalculator = naiveLength this._allowStale = options.stale || false this._maxAge = options.maxAge || null this._dispose = options.dispose this.reset() } // resize the cache when the max changes. Object.defineProperty(LRUCache.prototype, "max", { set : function (mL) { if (!mL || !(typeof mL === "number") || mL <= 0 ) mL = Infinity this._max = mL if (this._length > this._max) trim(this) } , get : function () { return this._max } , enumerable : true }) // resize the cache when the lengthCalculator changes. 
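// Assigning a new length function below recomputes the stored length of
// every entry and re-trims the cache; assigning a non-function falls back
// to the naive calculator, where every entry counts as 1.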
Object.defineProperty(LRUCache.prototype, "lengthCalculator", { set : function (lC) { if (typeof lC !== "function") { this._lengthCalculator = naiveLength this._length = this._itemCount for (var key in this._cache) { this._cache[key].length = 1 } } else { this._lengthCalculator = lC this._length = 0 for (var key in this._cache) { this._cache[key].length = this._lengthCalculator(this._cache[key].value) this._length += this._cache[key].length } } if (this._length > this._max) trim(this) } , get : function () { return this._lengthCalculator } , enumerable : true }) Object.defineProperty(LRUCache.prototype, "length", { get : function () { return this._length } , enumerable : true }) Object.defineProperty(LRUCache.prototype, "itemCount", { get : function () { return this._itemCount } , enumerable : true }) LRUCache.prototype.forEach = function (fn, thisp) { thisp = thisp || this var i = 0 var itemCount = this._itemCount for (var k = this._mru - 1; k >= 0 && i < itemCount; k--) if (this._lruList[k]) { i++ var hit = this._lruList[k] if (isStale(this, hit)) { del(this, hit) if (!this._allowStale) hit = undefined } if (hit) { fn.call(thisp, hit.value, hit.key, this) } } } LRUCache.prototype.keys = function () { var keys = new Array(this._itemCount) var i = 0 for (var k = this._mru - 1; k >= 0 && i < this._itemCount; k--) if (this._lruList[k]) { var hit = this._lruList[k] keys[i++] = hit.key } return keys } LRUCache.prototype.values = function () { var values = new Array(this._itemCount) var i = 0 for (var k = this._mru - 1; k >= 0 && i < this._itemCount; k--) if (this._lruList[k]) { var hit = this._lruList[k] values[i++] = hit.value } return values } LRUCache.prototype.reset = function () { if (this._dispose && this._cache) { for (var k in this._cache) { this._dispose(k, this._cache[k].value) } } this._cache = Object.create(null) // hash of items by key this._lruList = Object.create(null) // list of items in order of use recency this._mru = 0 // most recently used this._lru = 0 // least recently used this._length = 0 // number of items in the list this._itemCount = 0 } LRUCache.prototype.dump = function () { var arr = [] var i = 0 for (var k = this._mru - 1; k >= 0 && i < this._itemCount; k--) if (this._lruList[k]) { var hit = this._lruList[k] if (!isStale(this, hit)) { //Do not store staled hits ++i arr.push({ k: hit.key, v: hit.value, e: hit.now + (hit.maxAge || 0) }); } } //arr has the most read first return arr } LRUCache.prototype.dumpLru = function () { return this._lruList } LRUCache.prototype.set = function (key, value, maxAge) { maxAge = maxAge || this._maxAge typeCheckKey(key) var now = maxAge ? Date.now() : 0 var len = this._lengthCalculator(value) if (hOP(this._cache, key)) { if (len > this._max) { del(this, this._cache[key]) return false } // dispose of the old one before overwriting if (this._dispose) this._dispose(key, this._cache[key].value) this._cache[key].now = now this._cache[key].maxAge = maxAge this._cache[key].value = value this._length += (len - this._cache[key].length) this._cache[key].length = len this.get(key) if (this._length > this._max) trim(this) return true } var hit = new Entry(key, value, this._mru++, len, now, maxAge) // oversized objects fall out of cache automatically. 
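// (the dispose callback, if configured, still fires for the rejected
// entry, and set() returns false)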
if (hit.length > this._max) { if (this._dispose) this._dispose(key, value) return false } this._length += hit.length this._lruList[hit.lu] = this._cache[key] = hit this._itemCount ++ if (this._length > this._max) trim(this) return true } LRUCache.prototype.has = function (key) { typeCheckKey(key) if (!hOP(this._cache, key)) return false var hit = this._cache[key] if (isStale(this, hit)) { return false } return true } LRUCache.prototype.get = function (key) { typeCheckKey(key) return get(this, key, true) } LRUCache.prototype.peek = function (key) { typeCheckKey(key) return get(this, key, false) } LRUCache.prototype.pop = function () { var hit = this._lruList[this._lru] del(this, hit) return hit || null } LRUCache.prototype.del = function (key) { typeCheckKey(key) del(this, this._cache[key]) } LRUCache.prototype.load = function (arr) { //reset the cache this.reset(); var now = Date.now() //A previous serialized cache has the most recent items first for (var l = arr.length - 1; l >= 0; l-- ) { var hit = arr[l] typeCheckKey(hit.k) var expiresAt = hit.e || 0 if (expiresAt === 0) { //the item was created without expiration in a non aged cache this.set(hit.k, hit.v) } else { var maxAge = expiresAt - now //dont add already expired items if (maxAge > 0) this.set(hit.k, hit.v, maxAge) } } } function get (self, key, doUse) { typeCheckKey(key) var hit = self._cache[key] if (hit) { if (isStale(self, hit)) { del(self, hit) if (!self._allowStale) hit = undefined } else { if (doUse) use(self, hit) } if (hit) hit = hit.value } return hit } function isStale(self, hit) { if (!hit || (!hit.maxAge && !self._maxAge)) return false var stale = false; var diff = Date.now() - hit.now if (hit.maxAge) { stale = diff > hit.maxAge } else { stale = self._maxAge && (diff > self._maxAge) } return stale; } function use (self, hit) { shiftLU(self, hit) hit.lu = self._mru ++ self._lruList[hit.lu] = hit } function trim (self) { while (self._lru < self._mru && self._length > self._max) del(self, self._lruList[self._lru]) } function shiftLU (self, hit) { delete self._lruList[ hit.lu ] while (self._lru < self._mru && !self._lruList[self._lru]) self._lru ++ } function del (self, hit) { if (hit) { if (self._dispose) self._dispose(hit.key, hit.value) self._length -= hit.length self._itemCount -- delete self._cache[ hit.key ] shiftLU(self, hit) } } // classy, since V8 prefers predictable objects. 
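// Giving Entry a fixed constructor shape lets V8 share a hidden class
// across cache entries, keeping property access fast.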
function Entry (key, value, lu, length, now, maxAge) { this.key = key this.value = value this.lu = lu this.length = length this.now = now if (maxAge) this.maxAge = maxAge } })() npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/test/basic.js0000644000000000000000000002031212631326456031233 0ustar 00000000000000var test = require("tap").test , LRU = require("../") test("basic", function (t) { var cache = new LRU({max: 10}) cache.set("key", "value") t.equal(cache.get("key"), "value") t.equal(cache.get("nada"), undefined) t.equal(cache.length, 1) t.equal(cache.max, 10) t.end() }) test("least recently set", function (t) { var cache = new LRU(2) cache.set("a", "A") cache.set("b", "B") cache.set("c", "C") t.equal(cache.get("c"), "C") t.equal(cache.get("b"), "B") t.equal(cache.get("a"), undefined) t.end() }) test("lru recently gotten", function (t) { var cache = new LRU(2) cache.set("a", "A") cache.set("b", "B") cache.get("a") cache.set("c", "C") t.equal(cache.get("c"), "C") t.equal(cache.get("b"), undefined) t.equal(cache.get("a"), "A") t.end() }) test("del", function (t) { var cache = new LRU(2) cache.set("a", "A") cache.del("a") t.equal(cache.get("a"), undefined) t.end() }) test("max", function (t) { var cache = new LRU(3) // test changing the max, verify that the LRU items get dropped. cache.max = 100 for (var i = 0; i < 100; i ++) cache.set(i, i) t.equal(cache.length, 100) for (var i = 0; i < 100; i ++) { t.equal(cache.get(i), i) } cache.max = 3 t.equal(cache.length, 3) for (var i = 0; i < 97; i ++) { t.equal(cache.get(i), undefined) } for (var i = 98; i < 100; i ++) { t.equal(cache.get(i), i) } // now remove the max restriction, and try again. cache.max = "hello" for (var i = 0; i < 100; i ++) cache.set(i, i) t.equal(cache.length, 100) for (var i = 0; i < 100; i ++) { t.equal(cache.get(i), i) } // should trigger an immediate resize cache.max = 3 t.equal(cache.length, 3) for (var i = 0; i < 97; i ++) { t.equal(cache.get(i), undefined) } for (var i = 98; i < 100; i ++) { t.equal(cache.get(i), i) } t.end() }) test("reset", function (t) { var cache = new LRU(10) cache.set("a", "A") cache.set("b", "B") cache.reset() t.equal(cache.length, 0) t.equal(cache.max, 10) t.equal(cache.get("a"), undefined) t.equal(cache.get("b"), undefined) t.end() }) test("basic with weighed length", function (t) { var cache = new LRU({ max: 100, length: function (item) { return item.size } }) cache.set("key", {val: "value", size: 50}) t.equal(cache.get("key").val, "value") t.equal(cache.get("nada"), undefined) t.equal(cache.lengthCalculator(cache.get("key")), 50) t.equal(cache.length, 50) t.equal(cache.max, 100) t.end() }) test("weighed length item too large", function (t) { var cache = new LRU({ max: 10, length: function (item) { return item.size } }) t.equal(cache.max, 10) // should fall out immediately cache.set("key", {val: "value", size: 50}) t.equal(cache.length, 0) t.equal(cache.get("key"), undefined) t.end() }) test("least recently set with weighed length", function (t) { var cache = new LRU({ max:8, length: function (item) { return item.length } }) cache.set("a", "A") cache.set("b", "BB") cache.set("c", "CCC") cache.set("d", "DDDD") t.equal(cache.get("d"), "DDDD") t.equal(cache.get("c"), "CCC") t.equal(cache.get("b"), undefined) t.equal(cache.get("a"), undefined) t.end() }) test("lru recently gotten with weighed length", function (t) { var cache = new LRU({ max: 8, length: function (item) { return item.length } }) cache.set("a", "A") cache.set("b", "BB") cache.set("c", "CCC") 
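// reading "a" and "b" refreshes their recency, so "c" becomes the LRU
// entry and is evicted when "d" arrives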
cache.get("a") cache.get("b") cache.set("d", "DDDD") t.equal(cache.get("c"), undefined) t.equal(cache.get("d"), "DDDD") t.equal(cache.get("b"), "BB") t.equal(cache.get("a"), "A") t.end() }) test("lru recently updated with weighed length", function (t) { var cache = new LRU({ max: 8, length: function (item) { return item.length } }) cache.set("a", "A") cache.set("b", "BB") cache.set("c", "CCC") t.equal(cache.length, 6) //CCC BB A cache.set("a", "+A") t.equal(cache.length, 7) //+A CCC BB cache.set("b", "++BB") t.equal(cache.length, 6) //++BB +A t.equal(cache.get("c"), undefined) cache.set("c", "oversized") t.equal(cache.length, 6) //++BB +A t.equal(cache.get("c"), undefined) cache.set("a", "oversized") t.equal(cache.length, 4) //++BB t.equal(cache.get("a"), undefined) t.equal(cache.get("b"), "++BB") t.end() }) test("set returns proper booleans", function(t) { var cache = new LRU({ max: 5, length: function (item) { return item.length } }) t.equal(cache.set("a", "A"), true) // should return false for max exceeded t.equal(cache.set("b", "donuts"), false) t.equal(cache.set("b", "B"), true) t.equal(cache.set("c", "CCCC"), true) t.end() }) test("drop the old items", function(t) { var cache = new LRU({ max: 5, maxAge: 50 }) cache.set("a", "A") setTimeout(function () { cache.set("b", "b") t.equal(cache.get("a"), "A") }, 25) setTimeout(function () { cache.set("c", "C") // timed out t.notOk(cache.get("a")) }, 60 + 25) setTimeout(function () { t.notOk(cache.get("b")) t.equal(cache.get("c"), "C") }, 90) setTimeout(function () { t.notOk(cache.get("c")) t.end() }, 155) }) test("individual item can have its own maxAge", function(t) { var cache = new LRU({ max: 5, maxAge: 50 }) cache.set("a", "A", 20) setTimeout(function () { t.notOk(cache.get("a")) t.end() }, 25) }) test("individual item can have its own maxAge > cache's", function(t) { var cache = new LRU({ max: 5, maxAge: 20 }) cache.set("a", "A", 50) setTimeout(function () { t.equal(cache.get("a"), "A") t.end() }, 25) }) test("disposal function", function(t) { var disposed = false var cache = new LRU({ max: 1, dispose: function (k, n) { disposed = n } }) cache.set(1, 1) cache.set(2, 2) t.equal(disposed, 1) cache.set(3, 3) t.equal(disposed, 2) cache.reset() t.equal(disposed, 3) t.end() }) test("disposal function on too big of an item", function(t) { var disposed = false var cache = new LRU({ max: 1, length: function (k) { return k.length }, dispose: function (k, n) { disposed = n } }) var obj = [ 1, 2 ] t.equal(disposed, false) cache.set("obj", obj) t.equal(disposed, obj) t.end() }) test("has()", function(t) { var cache = new LRU({ max: 1, maxAge: 10 }) cache.set('foo', 'bar') t.equal(cache.has('foo'), true) cache.set('blu', 'baz') t.equal(cache.has('foo'), false) t.equal(cache.has('blu'), true) setTimeout(function() { t.equal(cache.has('blu'), false) t.end() }, 15) }) test("stale", function(t) { var cache = new LRU({ maxAge: 10, stale: true }) cache.set('foo', 'bar') t.equal(cache.get('foo'), 'bar') t.equal(cache.has('foo'), true) setTimeout(function() { t.equal(cache.has('foo'), false) t.equal(cache.get('foo'), 'bar') t.equal(cache.get('foo'), undefined) t.end() }, 15) }) test("lru update via set", function(t) { var cache = LRU({ max: 2 }); cache.set('foo', 1); cache.set('bar', 2); cache.del('bar'); cache.set('baz', 3); cache.set('qux', 4); t.equal(cache.get('foo'), undefined) t.equal(cache.get('bar'), undefined) t.equal(cache.get('baz'), 3) t.equal(cache.get('qux'), 4) t.end() }) test("least recently set w/ peek", function (t) { var cache = new LRU(2)
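// peek("a") must report the value without refreshing recency, so "a" is
// still the first entry evicted when "c" arrives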
cache.set("a", "A") cache.set("b", "B") t.equal(cache.peek("a"), "A") cache.set("c", "C") t.equal(cache.get("c"), "C") t.equal(cache.get("b"), "B") t.equal(cache.get("a"), undefined) t.end() }) test("pop the least used item", function (t) { var cache = new LRU(3) , last cache.set("a", "A") cache.set("b", "B") cache.set("c", "C") t.equal(cache.length, 3) t.equal(cache.max, 3) // Ensure we pop a, c, b cache.get("b", "B") last = cache.pop() t.equal(last.key, "a") t.equal(last.value, "A") t.equal(cache.length, 2) t.equal(cache.max, 3) last = cache.pop() t.equal(last.key, "c") t.equal(last.value, "C") t.equal(cache.length, 1) t.equal(cache.max, 3) last = cache.pop() t.equal(last.key, "b") t.equal(last.value, "B") t.equal(cache.length, 0) t.equal(cache.max, 3) last = cache.pop() t.equal(last, null) t.equal(cache.length, 0) t.equal(cache.max, 3) t.end() }) npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/test/foreach.js0000644000000000000000000000445312631326456031571 0ustar 00000000000000var test = require('tap').test var LRU = require('../') test('forEach', function (t) { var l = new LRU(5) for (var i = 0; i < 10; i ++) { l.set(i.toString(), i.toString(2)) } var i = 9 l.forEach(function (val, key, cache) { t.equal(cache, l) t.equal(key, i.toString()) t.equal(val, i.toString(2)) i -= 1 }) // get in order of most recently used l.get(6) l.get(8) var order = [ 8, 6, 9, 7, 5 ] var i = 0 l.forEach(function (val, key, cache) { var j = order[i ++] t.equal(cache, l) t.equal(key, j.toString()) t.equal(val, j.toString(2)) }) t.equal(i, order.length); t.end() }) test('keys() and values()', function (t) { var l = new LRU(5) for (var i = 0; i < 10; i ++) { l.set(i.toString(), i.toString(2)) } t.similar(l.keys(), ['9', '8', '7', '6', '5']) t.similar(l.values(), ['1001', '1000', '111', '110', '101']) // get in order of most recently used l.get(6) l.get(8) t.similar(l.keys(), ['8', '6', '9', '7', '5']) t.similar(l.values(), ['1000', '110', '1001', '111', '101']) t.end() }) test('all entries are iterated over', function(t) { var l = new LRU(5) for (var i = 0; i < 10; i ++) { l.set(i.toString(), i.toString(2)) } var i = 0 l.forEach(function (val, key, cache) { if (i > 0) { cache.del(key) } i += 1 }) t.equal(i, 5) t.equal(l.keys().length, 1) t.end() }) test('all stale entries are removed', function(t) { var l = new LRU({ max: 5, maxAge: -5, stale: true }) for (var i = 0; i < 10; i ++) { l.set(i.toString(), i.toString(2)) } var i = 0 l.forEach(function () { i += 1 }) t.equal(i, 5) t.equal(l.keys().length, 0) t.end() }) test('expires', function (t) { var l = new LRU({ max: 10, maxAge: 50 }) for (var i = 0; i < 10; i++) { l.set(i.toString(), i.toString(2), ((i % 2) ? 
25 : undefined)) } var i = 0 var order = [ 8, 6, 4, 2, 0 ] setTimeout(function () { l.forEach(function (val, key, cache) { var j = order[i++] t.equal(cache, l) t.equal(key, j.toString()) t.equal(val, j.toString(2)) }) t.equal(i, order.length); setTimeout(function () { var count = 0; l.forEach(function (val, key, cache) { count++; }) t.equal(0, count); t.end() }, 25) }, 26) }) ././@LongLink0000000000000000000000000000014700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/test/memory-leak.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/test/memory-leak.0000644000000000000000000000156012631326456032043 0ustar 00000000000000#!/usr/bin/env node --expose_gc var weak = require('weak'); var test = require('tap').test var LRU = require('../') var l = new LRU({ max: 10 }) var refs = 0 function X() { refs ++ weak(this, deref) } function deref() { refs -- } test('no leaks', function (t) { // fill up the cache for (var i = 0; i < 100; i++) { l.set(i, new X); // throw some gets in there, too. if (i % 2 === 0) l.get(i / 2) } gc() var start = process.memoryUsage() // capture the memory var startRefs = refs // do it again, but more for (var i = 0; i < 10000; i++) { l.set(i, new X); // throw some gets in there, too. if (i % 2 === 0) l.get(i / 2) } gc() var end = process.memoryUsage() t.equal(refs, startRefs, 'no leaky refs') console.error('start: %j\n' + 'end: %j', start, end); t.pass(); t.end(); }) npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/lru-cache/test/serialize.js0000644000000000000000000001051612631326456032146 0ustar 00000000000000var test = require('tap').test var LRU = require('../') test('dump', function (t) { var cache = new LRU() t.equal(cache.dump().length, 0, "nothing in dump for empty cache") cache.set("a", "A") cache.set("b", "B") t.deepEqual(cache.dump(), [ { k: "b", v: "B", e: 0 }, { k: "a", v: "A", e: 0 } ]) cache.set("a", "A"); t.deepEqual(cache.dump(), [ { k: "a", v: "A", e: 0 }, { k: "b", v: "B", e: 0 } ]) cache.get("b"); t.deepEqual(cache.dump(), [ { k: "b", v: "B", e: 0 }, { k: "a", v: "A", e: 0 } ]) cache.del("a"); t.deepEqual(cache.dump(), [ { k: "b", v: "B", e: 0 } ]) t.end() }) test("do not dump stale items", function(t) { var cache = new LRU({ max: 5, maxAge: 50 }) //expires at 50 cache.set("a", "A") setTimeout(function () { //expires at 75 cache.set("b", "B") var s = cache.dump() t.equal(s.length, 2) t.equal(s[0].k, "b") t.equal(s[1].k, "a") }, 25) setTimeout(function () { //expires at 110 cache.set("c", "C") var s = cache.dump() t.equal(s.length, 2) t.equal(s[0].k, "c") t.equal(s[1].k, "b") }, 60) setTimeout(function () { //expires at 130 cache.set("d", "D", 40) var s = cache.dump() t.equal(s.length, 2) t.equal(s[0].k, "d") t.equal(s[1].k, "c") }, 90) setTimeout(function () { var s = cache.dump() t.equal(s.length, 1) t.equal(s[0].k, "d") }, 120) setTimeout(function () { var s = cache.dump() t.deepEqual(s, []) t.end() }, 155) }) test("load basic cache", function(t) { var cache = new LRU(), copy = new LRU() cache.set("a", "A") cache.set("b", "B") copy.load(cache.dump()) t.deepEquals(cache.dump(), copy.dump()) t.end() }) test("load staled cache", function(t) { var cache = new LRU({maxAge: 50}), copy = new LRU({maxAge: 50}), arr //expires at 50 cache.set("a", "A") setTimeout(function () { //expires at 80 cache.set("b", "B") arr = cache.dump() t.equal(arr.length, 2) }, 30) setTimeout(function () { copy.load(arr) t.equal(copy.get("a"), undefined) 
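// at t=60, "a" (set at t=0 with the cache's 50ms maxAge) expired before
// the load, while "b" (set at t=30) is still fresh until t=80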
t.equal(copy.get("b"), "B") }, 60) setTimeout(function () { t.equal(copy.get("b"), undefined) t.end() }, 90) }) test("load to other size cache", function(t) { var cache = new LRU({max: 2}), copy = new LRU({max: 1}) cache.set("a", "A") cache.set("b", "B") copy.load(cache.dump()) t.equal(copy.get("a"), undefined) t.equal(copy.get("b"), "B") //update the last read from original cache cache.get("a") copy.load(cache.dump()) t.equal(copy.get("a"), "A") t.equal(copy.get("b"), undefined) t.end() }) test("load to other age cache", function(t) { var cache = new LRU({maxAge: 50}), aged = new LRU({maxAge: 100}), simple = new LRU(), arr, expired //created at 0 //a would be valid till 0 + 50 cache.set("a", "A") setTimeout(function () { //created at 20 //b would be valid till 20 + 50 cache.set("b", "B") //b would be valid till 20 + 70 cache.set("c", "C", 70) arr = cache.dump() t.equal(arr.length, 3) }, 20) setTimeout(function () { t.equal(cache.get("a"), undefined) t.equal(cache.get("b"), "B") t.equal(cache.get("c"), "C") aged.load(arr) t.equal(aged.get("a"), undefined) t.equal(aged.get("b"), "B") t.equal(aged.get("c"), "C") simple.load(arr) t.equal(simple.get("a"), undefined) t.equal(simple.get("b"), "B") t.equal(simple.get("c"), "C") }, 60) setTimeout(function () { t.equal(cache.get("a"), undefined) t.equal(cache.get("b"), undefined) t.equal(cache.get("c"), "C") aged.load(arr) t.equal(aged.get("a"), undefined) t.equal(aged.get("b"), undefined) t.equal(aged.get("c"), "C") simple.load(arr) t.equal(simple.get("a"), undefined) t.equal(simple.get("b"), undefined) t.equal(simple.get("c"), "C") }, 80) setTimeout(function () { t.equal(cache.get("a"), undefined) t.equal(cache.get("b"), undefined) t.equal(cache.get("c"), undefined) aged.load(arr) t.equal(aged.get("a"), undefined) t.equal(aged.get("b"), undefined) t.equal(aged.get("c"), undefined) simple.load(arr) t.equal(simple.get("a"), undefined) t.equal(simple.get("b"), undefined) t.equal(simple.get("c"), undefined) t.end() }, 100) }) npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/LICENSE0000644000000000000000000000137512631326456027455 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/README.md0000644000000000000000000000347612631326456027733 0ustar 00000000000000# sigmund Quick and dirty signatures for Objects. This is like a much faster `deepEquals` comparison, which returns a string key suitable for caches and the like. 
## Usage ```javascript function doSomething (someObj) { var key = sigmund(someObj, maxDepth) // max depth defaults to 10 var cached = cache.get(key) if (cached) return cached var result = expensiveCalculation(someObj) cache.set(key, result) return result } ``` The resulting key will be as unique and reproducible as calling `JSON.stringify` or `util.inspect` on the object, but is much faster. In order to achieve this speed, some differences are glossed over. For example, the object `{0:'foo'}` will be treated identically to the array `['foo']`. Also, just as there is no way to summon the soul from the scribblings of a cocaine-addled psychoanalyst, there is no way to revive the object from the signature string that sigmund gives you. In fact, it's barely even readable. As with `util.inspect` and `JSON.stringify`, larger objects will produce larger signature strings. Because sigmund is a bit less strict than the more thorough alternatives, the strings will be shorter, and also there is a slightly higher chance for collisions. For example, these objects have the same signature: var obj1 = {a:'b',c:/def/,g:['h','i',{j:'',k:'l'}]} var obj2 = {a:'b',c:'/def/',g:['h','i','{jkl']} Like a good Freudian, sigmund is most effective when you already have some understanding of what you're looking for. It can help you help yourself, but you must be willing to do some work as well. Cycles are handled, and cyclical objects are silently omitted (though the key is included in the signature output.) The second argument is the maximum depth, which defaults to 10, because that is the maximum object traversal depth covered by most insurance carriers. npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/bench.js0000644000000000000000000001540612631326456030065 0ustar 00000000000000// different ways to id objects // use a req/res pair, since it's crazy deep and cyclical // sparseFE10 and sigmund are usually pretty close, which is to be expected, // since they are essentially the same algorithm, except that sigmund handles // regular expression objects properly. 
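// The harness below captures a live HTTP request/response pair as the
// deeply nested, cyclical test object, prints each strategy's signature
// once, and then hands exports.compare to the `bench` module for timing.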
var http = require('http') var util = require('util') var sigmund = require('./sigmund.js') var sreq, sres, creq, cres, test http.createServer(function (q, s) { sreq = q sres = s sres.end('ok') this.close(function () { setTimeout(function () { start() }, 200) }) }).listen(1337, function () { creq = http.get({ port: 1337 }) creq.on('response', function (s) { cres = s }) }) function start () { test = [sreq, sres, creq, cres] // test = sreq // sreq.sres = sres // sreq.creq = creq // sreq.cres = cres for (var i in exports.compare) { console.log(i) var hash = exports.compare[i]() console.log(hash) console.log(hash.length) console.log('') } require('bench').runMain() } function customWs (obj, md, d) { d = d || 0 var to = typeof obj if (to === 'undefined' || to === 'function' || to === null) return '' if (d > md || !obj || to !== 'object') return ('' + obj).replace(/[\n ]+/g, '') if (Array.isArray(obj)) { return obj.map(function (i, _, __) { return customWs(i, md, d + 1) }).reduce(function (a, b) { return a + b }, '') } var keys = Object.keys(obj) return keys.map(function (k, _, __) { return k + ':' + customWs(obj[k], md, d + 1) }).reduce(function (a, b) { return a + b }, '') } function custom (obj, md, d) { d = d || 0 var to = typeof obj if (to === 'undefined' || to === 'function' || to === null) return '' if (d > md || !obj || to !== 'object') return '' + obj if (Array.isArray(obj)) { return obj.map(function (i, _, __) { return custom(i, md, d + 1) }).reduce(function (a, b) { return a + b }, '') } var keys = Object.keys(obj) return keys.map(function (k, _, __) { return k + ':' + custom(obj[k], md, d + 1) }).reduce(function (a, b) { return a + b }, '') } function sparseFE2 (obj, maxDepth) { var seen = [] var soFar = '' function ch (v, depth) { if (depth > maxDepth) return if (typeof v === 'function' || typeof v === 'undefined') return if (typeof v !== 'object' || !v) { soFar += v return } if (seen.indexOf(v) !== -1 || depth === maxDepth) return seen.push(v) soFar += '{' Object.keys(v).forEach(function (k, _, __) { // pseudo-private values. skip those. if (k.charAt(0) === '_') return var to = typeof v[k] if (to === 'function' || to === 'undefined') return soFar += k + ':' ch(v[k], depth + 1) }) soFar += '}' } ch(obj, 0) return soFar } function sparseFE (obj, maxDepth) { var seen = [] var soFar = '' function ch (v, depth) { if (depth > maxDepth) return if (typeof v === 'function' || typeof v === 'undefined') return if (typeof v !== 'object' || !v) { soFar += v return } if (seen.indexOf(v) !== -1 || depth === maxDepth) return seen.push(v) soFar += '{' Object.keys(v).forEach(function (k, _, __) { // pseudo-private values. skip those. if (k.charAt(0) === '_') return var to = typeof v[k] if (to === 'function' || to === 'undefined') return soFar += k ch(v[k], depth + 1) }) } ch(obj, 0) return soFar } function sparse (obj, maxDepth) { var seen = [] var soFar = '' function ch (v, depth) { if (depth > maxDepth) return if (typeof v === 'function' || typeof v === 'undefined') return if (typeof v !== 'object' || !v) { soFar += v return } if (seen.indexOf(v) !== -1 || depth === maxDepth) return seen.push(v) soFar += '{' for (var k in v) { // pseudo-private values. skip those. 
if (k.charAt(0) === '_') continue var to = typeof v[k] if (to === 'function' || to === 'undefined') continue soFar += k ch(v[k], depth + 1) } } ch(obj, 0) return soFar } function noCommas (obj, maxDepth) { var seen = [] var soFar = '' function ch (v, depth) { if (depth > maxDepth) return if (typeof v === 'function' || typeof v === 'undefined') return if (typeof v !== 'object' || !v) { soFar += v return } if (seen.indexOf(v) !== -1 || depth === maxDepth) return seen.push(v) soFar += '{' for (var k in v) { // pseudo-private values. skip those. if (k.charAt(0) === '_') continue var to = typeof v[k] if (to === 'function' || to === 'undefined') continue soFar += k + ':' ch(v[k], depth + 1) } soFar += '}' } ch(obj, 0) return soFar } function flatten (obj, maxDepth) { var seen = [] var soFar = '' function ch (v, depth) { if (depth > maxDepth) return if (typeof v === 'function' || typeof v === 'undefined') return if (typeof v !== 'object' || !v) { soFar += v return } if (seen.indexOf(v) !== -1 || depth === maxDepth) return seen.push(v) soFar += '{' for (var k in v) { // pseudo-private values. skip those. if (k.charAt(0) === '_') continue var to = typeof v[k] if (to === 'function' || to === 'undefined') continue soFar += k + ':' ch(v[k], depth + 1) soFar += ',' } soFar += '}' } ch(obj, 0) return soFar } exports.compare = { // 'custom 2': function () { // return custom(test, 2, 0) // }, // 'customWs 2': function () { // return customWs(test, 2, 0) // }, 'JSON.stringify (guarded)': function () { var seen = [] return JSON.stringify(test, function (k, v) { if (typeof v !== 'object' || !v) return v if (seen.indexOf(v) !== -1) return undefined seen.push(v) return v }) }, 'flatten 10': function () { return flatten(test, 10) }, // 'flattenFE 10': function () { // return flattenFE(test, 10) // }, 'noCommas 10': function () { return noCommas(test, 10) }, 'sparse 10': function () { return sparse(test, 10) }, 'sparseFE 10': function () { return sparseFE(test, 10) }, 'sparseFE2 10': function () { return sparseFE2(test, 10) }, sigmund: function() { return sigmund(test, 10) }, // 'util.inspect 1': function () { // return util.inspect(test, false, 1, false) // }, // 'util.inspect undefined': function () { // util.inspect(test) // }, // 'util.inspect 2': function () { // util.inspect(test, false, 2, false) // }, // 'util.inspect 3': function () { // util.inspect(test, false, 3, false) // }, // 'util.inspect 4': function () { // util.inspect(test, false, 4, false) // }, // 'util.inspect Infinity': function () { // util.inspect(test, false, Infinity, false) // } } /** results **/ npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/package.json0000644000000000000000000000403312631326456030730 0ustar 00000000000000{ "_args": [ [ "sigmund@~1.0.0", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/minimatch" ] ], "_from": "sigmund@>=1.0.0 <1.1.0", "_id": "sigmund@1.0.1", "_inCache": true, "_installable": true, "_location": "/node-gyp/minimatch/sigmund", "_nodeVersion": "2.0.1", "_npmUser": { "email": "isaacs@npmjs.com", "name": "isaacs" }, "_npmVersion": "2.10.0", "_phantomChildren": {}, "_requested": { "name": "sigmund", "raw": "sigmund@~1.0.0", "rawSpec": "~1.0.0", "scope": null, "spec": ">=1.0.0 <1.1.0", "type": "range" }, "_requiredBy": [ "/node-gyp/minimatch" ], "_resolved": "https://registry.npmjs.org/sigmund/-/sigmund-1.0.1.tgz", "_shasum": "3ff21f198cad2175f9f3b781853fd94d0d19b590", "_shrinkwrap": null, "_spec": "sigmund@~1.0.0", "_where": 
"/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/minimatch", "author": { "email": "i@izs.me", "name": "Isaac Z. Schlueter", "url": "http://blog.izs.me/" }, "bugs": { "url": "https://github.com/isaacs/sigmund/issues" }, "dependencies": {}, "description": "Quick and dirty signatures for Objects.", "devDependencies": { "tap": "~0.3.0" }, "directories": { "test": "test" }, "dist": { "shasum": "3ff21f198cad2175f9f3b781853fd94d0d19b590", "tarball": "http://registry.npmjs.org/sigmund/-/sigmund-1.0.1.tgz" }, "gitHead": "527f97aa5bb253d927348698c0cd3bb267d098c6", "homepage": "https://github.com/isaacs/sigmund#readme", "keywords": [ "data", "key", "object", "psychoanalysis", "signature" ], "license": "ISC", "main": "sigmund.js", "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "name": "sigmund", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/isaacs/sigmund.git" }, "scripts": { "bench": "node bench.js", "test": "tap test/*.js" }, "version": "1.0.1" } npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/sigmund.js0000644000000000000000000000217212631326456030450 0ustar 00000000000000module.exports = sigmund function sigmund (subject, maxSessions) { maxSessions = maxSessions || 10; var notes = []; var analysis = ''; var RE = RegExp; function psychoAnalyze (subject, session) { if (session > maxSessions) return; if (typeof subject === 'function' || typeof subject === 'undefined') { return; } if (typeof subject !== 'object' || !subject || (subject instanceof RE)) { analysis += subject; return; } if (notes.indexOf(subject) !== -1 || session === maxSessions) return; notes.push(subject); analysis += '{'; Object.keys(subject).forEach(function (issue, _, __) { // pseudo-private values. skip those. if (issue.charAt(0) === '_') return; var to = typeof subject[issue]; if (to === 'function' || to === 'undefined') return; analysis += issue; psychoAnalyze(subject[issue], session + 1); }); } psychoAnalyze(subject, 0); return analysis; } // vim: set softtabstop=4 shiftwidth=4: npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/test/0000755000000000000000000000000012631326456027421 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/node_modules/sigmund/test/basic.js0000644000000000000000000000125312631326456031041 0ustar 00000000000000var test = require('tap').test var sigmund = require('../sigmund.js') // occasionally there are duplicates // that's an acceptable edge-case. JSON.stringify and util.inspect // have some collision potential as well, though less, and collision // detection is expensive. var hash = '{abc/def/g{0h1i2{jkl' var obj1 = {a:'b',c:/def/,g:['h','i',{j:'',k:'l'}]} var obj2 = {a:'b',c:'/def/',g:['h','i','{jkl']} var obj3 = JSON.parse(JSON.stringify(obj1)) obj3.c = /def/ obj3.g[2].cycle = obj3 var cycleHash = '{abc/def/g{0h1i2{jklcycle' test('basic', function (t) { t.equal(sigmund(obj1), hash) t.equal(sigmund(obj2), hash) t.equal(sigmund(obj3), cycleHash) t.end() }) npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/test/basic.js0000644000000000000000000003505112631326456024721 0ustar 00000000000000// http://www.bashcookbook.com/bashinfo/source/bash-1.14.7/tests/glob-test // // TODO: Some of these tests do very bad things with backslashes, and will // most likely fail badly on windows. They should probably be skipped. 
var tap = require("tap") , globalBefore = Object.keys(global) , mm = require("../") , files = [ "a", "b", "c", "d", "abc" , "abd", "abe", "bb", "bcd" , "ca", "cb", "dd", "de" , "bdir/", "bdir/cfile"] , next = files.concat([ "a-b", "aXb" , ".x", ".y" ]) var patterns = [ "http://www.bashcookbook.com/bashinfo/source/bash-1.14.7/tests/glob-test" , ["a*", ["a", "abc", "abd", "abe"]] , ["X*", ["X*"], {nonull: true}] // allow null glob expansion , ["X*", []] // isaacs: Slightly different than bash/sh/ksh // \\* is not un-escaped to literal "*" in a failed match, // but it does make it get treated as a literal star , ["\\*", ["\\*"], {nonull: true}] , ["\\**", ["\\**"], {nonull: true}] , ["\\*\\*", ["\\*\\*"], {nonull: true}] , ["b*/", ["bdir/"]] , ["c*", ["c", "ca", "cb"]] , ["**", files] , ["\\.\\./*/", ["\\.\\./*/"], {nonull: true}] , ["s/\\..*//", ["s/\\..*//"], {nonull: true}] , "legendary larry crashes bashes" , ["/^root:/{s/^[^:]*:[^:]*:\([^:]*\).*$/\\1/" , ["/^root:/{s/^[^:]*:[^:]*:\([^:]*\).*$/\\1/"], {nonull: true}] , ["/^root:/{s/^[^:]*:[^:]*:\([^:]*\).*$/\1/" , ["/^root:/{s/^[^:]*:[^:]*:\([^:]*\).*$/\1/"], {nonull: true}] , "character classes" , ["[a-c]b*", ["abc", "abd", "abe", "bb", "cb"]] , ["[a-y]*[^c]", ["abd", "abe", "bb", "bcd", "bdir/", "ca", "cb", "dd", "de"]] , ["a*[^c]", ["abd", "abe"]] , function () { files.push("a-b", "aXb") } , ["a[X-]b", ["a-b", "aXb"]] , function () { files.push(".x", ".y") } , ["[^a-c]*", ["d", "dd", "de"]] , function () { files.push("a*b/", "a*b/ooo") } , ["a\\*b/*", ["a*b/ooo"]] , ["a\\*?/*", ["a*b/ooo"]] , ["*\\\\!*", [], {null: true}, ["echo !7"]] , ["*\\!*", ["echo !7"], null, ["echo !7"]] , ["*.\\*", ["r.*"], null, ["r.*"]] , ["a[b]c", ["abc"]] , ["a[\\b]c", ["abc"]] , ["a?c", ["abc"]] , ["a\\*c", [], {null: true}, ["abc"]] , ["", [""], { null: true }, [""]] , "http://www.opensource.apple.com/source/bash/bash-23/" + "bash/tests/glob-test" , function () { files.push("man/", "man/man1/", "man/man1/bash.1") } , ["*/man*/bash.*", ["man/man1/bash.1"]] , ["man/man1/bash.1", ["man/man1/bash.1"]] , ["a***c", ["abc"], null, ["abc"]] , ["a*****?c", ["abc"], null, ["abc"]] , ["?*****??", ["abc"], null, ["abc"]] , ["*****??", ["abc"], null, ["abc"]] , ["?*****?c", ["abc"], null, ["abc"]] , ["?***?****c", ["abc"], null, ["abc"]] , ["?***?****?", ["abc"], null, ["abc"]] , ["?***?****", ["abc"], null, ["abc"]] , ["*******c", ["abc"], null, ["abc"]] , ["*******?", ["abc"], null, ["abc"]] , ["a*cd**?**??k", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["a**?**cd**?**??k", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["a**?**cd**?**??k***", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["a**?**cd**?**??***k", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["a**?**cd**?**??***k**", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["a****c**?**??*****", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["[-abc]", ["-"], null, ["-"]] , ["[abc-]", ["-"], null, ["-"]] , ["\\", ["\\"], null, ["\\"]] , ["[\\\\]", ["\\"], null, ["\\"]] , ["[[]", ["["], null, ["["]] , ["[", ["["], null, ["["]] , ["[*", ["[abc"], null, ["[abc"]] , "a right bracket shall lose its special meaning and\n" + "represent itself in a bracket expression if it occurs\n" + "first in the list. 
-- POSIX.2 2.8.3.2" , ["[]]", ["]"], null, ["]"]] , ["[]-]", ["]"], null, ["]"]] , ["[a-\z]", ["p"], null, ["p"]] , ["??**********?****?", [], { null: true }, ["abc"]] , ["??**********?****c", [], { null: true }, ["abc"]] , ["?************c****?****", [], { null: true }, ["abc"]] , ["*c*?**", [], { null: true }, ["abc"]] , ["a*****c*?**", [], { null: true }, ["abc"]] , ["a********???*******", [], { null: true }, ["abc"]] , ["[]", [], { null: true }, ["a"]] , ["[abc", [], { null: true }, ["["]] , "nocase tests" , ["XYZ", ["xYz"], { nocase: true, null: true } , ["xYz", "ABC", "IjK"]] , ["ab*", ["ABC"], { nocase: true, null: true } , ["xYz", "ABC", "IjK"]] , ["[ia]?[ck]", ["ABC", "IjK"], { nocase: true, null: true } , ["xYz", "ABC", "IjK"]] // [ pattern, [matches], MM opts, files, TAP opts] , "onestar/twostar" , ["{/*,*}", [], {null: true}, ["/asdf/asdf/asdf"]] , ["{/?,*}", ["/a", "bb"], {null: true} , ["/a", "/b/b", "/a/b/c", "bb"]] , "dots should not match unless requested" , ["**", ["a/b"], {}, ["a/b", "a/.d", ".a/.d"]] // .. and . can only match patterns starting with ., // even when options.dot is set. , function () { files = ["a/./b", "a/../b", "a/c/b", "a/.d/b"] } , ["a/*/b", ["a/c/b", "a/.d/b"], {dot: true}] , ["a/.*/b", ["a/./b", "a/../b", "a/.d/b"], {dot: true}] , ["a/*/b", ["a/c/b"], {dot:false}] , ["a/.*/b", ["a/./b", "a/../b", "a/.d/b"], {dot: false}] // this also tests that changing the options needs // to change the cache key, even if the pattern is // the same! , ["**", ["a/b","a/.d",".a/.d"], { dot: true } , [ ".a/.d", "a/.d", "a/b"]] , "paren sets cannot contain slashes" , ["*(a/b)", ["*(a/b)"], {nonull: true}, ["a/b"]] // brace sets trump all else. // // invalid glob pattern. fails on bash4 and bsdglob. // however, in this implementation, it's easier just // to do the intuitive thing, and let brace-expansion // actually come before parsing any extglob patterns, // like the documentation seems to say. // // XXX: if anyone complains about this, either fix it // or tell them to grow up and stop complaining. // // bash/bsdglob says this: // , ["*(a|{b),c)}", ["*(a|{b),c)}"], {}, ["a", "ab", "ac", "ad"]] // but we do this instead: , ["*(a|{b),c)}", ["a", "ab", "ac"], {}, ["a", "ab", "ac", "ad"]] // test partial parsing in the presence of comment/negation chars , ["[!a*", ["[!ab"], {}, ["[!ab", "[ab"]] , ["[#a*", ["[#ab"], {}, ["[#ab", "[ab"]] // like: {a,b|c\\,d\\\|e} except it's unclosed, so it has to be escaped. , ["+(a|*\\|c\\\\|d\\\\\\|e\\\\\\\\|f\\\\\\\\\\|g" , ["+(a|b\\|c\\\\|d\\\\|e\\\\\\\\|f\\\\\\\\|g"] , {} , ["+(a|b\\|c\\\\|d\\\\|e\\\\\\\\|f\\\\\\\\|g", "a", "b\\c"]] // crazy nested {,,} and *(||) tests. , function () { files = [ "a", "b", "c", "d" , "ab", "ac", "ad" , "bc", "cb" , "bc,d", "c,db", "c,d" , "d)", "(b|c", "*(b|c" , "b|c", "b|cc", "cb|c" , "x(a|b|c)", "x(a|c)" , "(a|b|c)", "(a|c)"] } , ["*(a|{b,c})", ["a", "b", "c", "ab", "ac"]] , ["{a,*(b|c,d)}", ["a","(b|c", "*(b|c", "d)"]] // a // *(b|c) // *(b|d) , ["{a,*(b|{c,d})}", ["a","b", "bc", "cb", "c", "d"]] , ["*(a|{b|c,c})", ["a", "b", "c", "ab", "ac", "bc", "cb"]] // test various flag settings. , [ "*(a|{b|c,c})", ["x(a|b|c)", "x(a|c)", "(a|b|c)", "(a|c)"] , { noext: true } ] , ["a?b", ["x/y/acb", "acb/"], {matchBase: true} , ["x/y/acb", "acb/", "acb/d/e", "x/y/acb/d"] ] , ["#*", ["#a", "#b"], {nocomment: true}, ["#a", "#b", "c#d"]] // begin channelling Boole and deMorgan... , "negation tests" , function () { files = ["d", "e", "!ab", "!abc", "a!b", "\\!a"] } // anything that is NOT a* matches. 
, ["!a*", ["\\!a", "d", "e", "!ab", "!abc"]] // anything that IS !a* matches. , ["!a*", ["!ab", "!abc"], {nonegate: true}] // anything that IS a* matches , ["!!a*", ["a!b"]] // anything that is NOT !a* matches , ["!\\!a*", ["a!b", "d", "e", "\\!a"]] // negation nestled within a pattern , function () { files = [ "foo.js" , "foo.bar" // can't match this one without negative lookbehind. , "foo.js.js" , "blar.js" , "foo." , "boo.js.boo" ] } , ["*.!(js)", ["foo.bar", "foo.", "boo.js.boo"] ] // https://github.com/isaacs/minimatch/issues/5 , function () { files = [ 'a/b/.x/c' , 'a/b/.x/c/d' , 'a/b/.x/c/d/e' , 'a/b/.x' , 'a/b/.x/' , 'a/.x/b' , '.x' , '.x/' , '.x/a' , '.x/a/b' , 'a/.x/b/.x/c' , '.x/.x' ] } , ["**/.x/**", [ '.x/' , '.x/a' , '.x/a/b' , 'a/.x/b' , 'a/b/.x/' , 'a/b/.x/c' , 'a/b/.x/c/d' , 'a/b/.x/c/d/e' ] ] ] var regexps = [ '/^(?:(?=.)a[^/]*?)$/', '/^(?:(?=.)X[^/]*?)$/', '/^(?:(?=.)X[^/]*?)$/', '/^(?:\\*)$/', '/^(?:(?=.)\\*[^/]*?)$/', '/^(?:\\*\\*)$/', '/^(?:(?=.)b[^/]*?\\/)$/', '/^(?:(?=.)c[^/]*?)$/', '/^(?:(?:(?!(?:\\/|^)\\.).)*?)$/', '/^(?:\\.\\.\\/(?!\\.)(?=.)[^/]*?\\/)$/', '/^(?:s\\/(?=.)\\.\\.[^/]*?\\/)$/', '/^(?:\\/\\^root:\\/\\{s\\/(?=.)\\^[^:][^/]*?:[^:][^/]*?:\\([^:]\\)[^/]*?\\.[^/]*?\\$\\/1\\/)$/', '/^(?:\\/\\^root:\\/\\{s\\/(?=.)\\^[^:][^/]*?:[^:][^/]*?:\\([^:]\\)[^/]*?\\.[^/]*?\\$\\/\u0001\\/)$/', '/^(?:(?!\\.)(?=.)[a-c]b[^/]*?)$/', '/^(?:(?!\\.)(?=.)[a-y][^/]*?[^c])$/', '/^(?:(?=.)a[^/]*?[^c])$/', '/^(?:(?=.)a[X-]b)$/', '/^(?:(?!\\.)(?=.)[^a-c][^/]*?)$/', '/^(?:a\\*b\\/(?!\\.)(?=.)[^/]*?)$/', '/^(?:(?=.)a\\*[^/]\\/(?!\\.)(?=.)[^/]*?)$/', '/^(?:(?!\\.)(?=.)[^/]*?\\\\\\![^/]*?)$/', '/^(?:(?!\\.)(?=.)[^/]*?\\![^/]*?)$/', '/^(?:(?!\\.)(?=.)[^/]*?\\.\\*)$/', '/^(?:(?=.)a[b]c)$/', '/^(?:(?=.)a[b]c)$/', '/^(?:(?=.)a[^/]c)$/', '/^(?:a\\*c)$/', 'false', '/^(?:(?!\\.)(?=.)[^/]*?\\/(?=.)man[^/]*?\\/(?=.)bash\\.[^/]*?)$/', '/^(?:man\\/man1\\/bash\\.1)$/', '/^(?:(?=.)a[^/]*?[^/]*?[^/]*?c)$/', '/^(?:(?=.)a[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]c)$/', '/^(?:(?!\\.)(?=.)[^/][^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/][^/])$/', '/^(?:(?!\\.)(?=.)[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/][^/])$/', '/^(?:(?!\\.)(?=.)[^/][^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]c)$/', '/^(?:(?!\\.)(?=.)[^/][^/]*?[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/]*?[^/]*?c)$/', '/^(?:(?!\\.)(?=.)[^/][^/]*?[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/]*?[^/]*?[^/])$/', '/^(?:(?!\\.)(?=.)[^/][^/]*?[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/]*?[^/]*?)$/', '/^(?:(?!\\.)(?=.)[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?c)$/', '/^(?:(?!\\.)(?=.)[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/])$/', '/^(?:(?=.)a[^/]*?cd[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/][^/]k)$/', '/^(?:(?=.)a[^/]*?[^/]*?[^/][^/]*?[^/]*?cd[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/][^/]k)$/', '/^(?:(?=.)a[^/]*?[^/]*?[^/][^/]*?[^/]*?cd[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/][^/]k[^/]*?[^/]*?[^/]*?)$/', '/^(?:(?=.)a[^/]*?[^/]*?[^/][^/]*?[^/]*?cd[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/][^/][^/]*?[^/]*?[^/]*?k)$/', '/^(?:(?=.)a[^/]*?[^/]*?[^/][^/]*?[^/]*?cd[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/][^/][^/]*?[^/]*?[^/]*?k[^/]*?[^/]*?)$/', '/^(?:(?=.)a[^/]*?[^/]*?[^/]*?[^/]*?c[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/][^/][^/]*?[^/]*?[^/]*?[^/]*?[^/]*?)$/', '/^(?:(?!\\.)(?=.)[-abc])$/', '/^(?:(?!\\.)(?=.)[abc-])$/', '/^(?:\\\\)$/', '/^(?:(?!\\.)(?=.)[\\\\])$/', '/^(?:(?!\\.)(?=.)[\\[])$/', '/^(?:\\[)$/', '/^(?:(?=.)\\[(?!\\.)(?=.)[^/]*?)$/', '/^(?:(?!\\.)(?=.)[\\]])$/', '/^(?:(?!\\.)(?=.)[\\]-])$/', '/^(?:(?!\\.)(?=.)[a-z])$/', '/^(?:(?!\\.)(?=.)[^/][^/][^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/]*?[^/]*?[^/])$/', 
'/^(?:(?!\\.)(?=.)[^/][^/][^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/]*?[^/]*?c)$/', '/^(?:(?!\\.)(?=.)[^/][^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?c[^/]*?[^/]*?[^/]*?[^/]*?[^/][^/]*?[^/]*?[^/]*?[^/]*?)$/', '/^(?:(?!\\.)(?=.)[^/]*?c[^/]*?[^/][^/]*?[^/]*?)$/', '/^(?:(?=.)a[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?c[^/]*?[^/][^/]*?[^/]*?)$/', '/^(?:(?=.)a[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/][^/][^/][^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?[^/]*?)$/', '/^(?:\\[\\])$/', '/^(?:\\[abc)$/', '/^(?:(?=.)XYZ)$/i', '/^(?:(?=.)ab[^/]*?)$/i', '/^(?:(?!\\.)(?=.)[ia][^/][ck])$/i', '/^(?:\\/(?!\\.)(?=.)[^/]*?|(?!\\.)(?=.)[^/]*?)$/', '/^(?:\\/(?!\\.)(?=.)[^/]|(?!\\.)(?=.)[^/]*?)$/', '/^(?:(?:(?!(?:\\/|^)\\.).)*?)$/', '/^(?:a\\/(?!(?:^|\\/)\\.{1,2}(?:$|\\/))(?=.)[^/]*?\\/b)$/', '/^(?:a\\/(?=.)\\.[^/]*?\\/b)$/', '/^(?:a\\/(?!\\.)(?=.)[^/]*?\\/b)$/', '/^(?:a\\/(?=.)\\.[^/]*?\\/b)$/', '/^(?:(?:(?!(?:\\/|^)(?:\\.{1,2})($|\\/)).)*?)$/', '/^(?:(?!\\.)(?=.)[^/]*?\\(a\\/b\\))$/', '/^(?:(?!\\.)(?=.)(?:a|b)*|(?!\\.)(?=.)(?:a|c)*)$/', '/^(?:(?=.)\\[(?=.)\\!a[^/]*?)$/', '/^(?:(?=.)\\[(?=.)#a[^/]*?)$/', '/^(?:(?=.)\\+\\(a\\|[^/]*?\\|c\\\\\\\\\\|d\\\\\\\\\\|e\\\\\\\\\\\\\\\\\\|f\\\\\\\\\\\\\\\\\\|g)$/', '/^(?:(?!\\.)(?=.)(?:a|b)*|(?!\\.)(?=.)(?:a|c)*)$/', '/^(?:a|(?!\\.)(?=.)[^/]*?\\(b\\|c|d\\))$/', '/^(?:a|(?!\\.)(?=.)(?:b|c)*|(?!\\.)(?=.)(?:b|d)*)$/', '/^(?:(?!\\.)(?=.)(?:a|b|c)*|(?!\\.)(?=.)(?:a|c)*)$/', '/^(?:(?!\\.)(?=.)[^/]*?\\(a\\|b\\|c\\)|(?!\\.)(?=.)[^/]*?\\(a\\|c\\))$/', '/^(?:(?=.)a[^/]b)$/', '/^(?:(?=.)#[^/]*?)$/', '/^(?!^(?:(?=.)a[^/]*?)$).*$/', '/^(?:(?=.)\\!a[^/]*?)$/', '/^(?:(?=.)a[^/]*?)$/', '/^(?!^(?:(?=.)\\!a[^/]*?)$).*$/', '/^(?:(?!\\.)(?=.)[^/]*?\\.(?:(?!js)[^/]*?))$/', '/^(?:(?:(?!(?:\\/|^)\\.).)*?\\/\\.x\\/(?:(?!(?:\\/|^)\\.).)*?)$/' ] var re = 0; tap.test("basic tests", function (t) { var start = Date.now() // [ pattern, [matches], MM opts, files, TAP opts] patterns.forEach(function (c) { if (typeof c === "function") return c() if (typeof c === "string") return t.comment(c) var pattern = c[0] , expect = c[1].sort(alpha) , options = c[2] || {} , f = c[3] || files , tapOpts = c[4] || {} // options.debug = true var m = new mm.Minimatch(pattern, options) var r = m.makeRe() var expectRe = regexps[re++] tapOpts.re = String(r) || JSON.stringify(r) tapOpts.files = JSON.stringify(f) tapOpts.pattern = pattern tapOpts.set = m.set tapOpts.negated = m.negate var actual = mm.match(f, pattern, options) actual.sort(alpha) t.equivalent( actual, expect , JSON.stringify(pattern) + " " + JSON.stringify(expect) , tapOpts ) t.equal(tapOpts.re, expectRe, tapOpts) }) t.comment("time=" + (Date.now() - start) + "ms") t.end() }) tap.test("global leak test", function (t) { var globalAfter = Object.keys(global) t.equivalent(globalAfter, globalBefore, "no new globals, please") t.end() }) function alpha (a, b) { return a > b ? 
1 : -1 } npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/test/brace-expand.js0000644000000000000000000000137612631326456026174 0ustar 00000000000000var tap = require("tap") , minimatch = require("../") tap.test("brace expansion", function (t) { // [ pattern, [expanded] ] ; [ [ "a{b,c{d,e},{f,g}h}x{y,z}" , [ "abxy" , "abxz" , "acdxy" , "acdxz" , "acexy" , "acexz" , "afhxy" , "afhxz" , "aghxy" , "aghxz" ] ] , [ "a{1..5}b" , [ "a1b" , "a2b" , "a3b" , "a4b" , "a5b" ] ] , [ "a{b}c", ["a{b}c"] ] , [ "a{00..05}b" , ["a00b" ,"a01b" ,"a02b" ,"a03b" ,"a04b" ,"a05b" ] ] ].forEach(function (tc) { var p = tc[0] , expect = tc[1] t.equivalent(minimatch.braceExpand(p), expect, p) }) console.error("ending") t.end() }) npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/test/caching.js0000644000000000000000000000066412631326456025236 0ustar 00000000000000var Minimatch = require("../minimatch.js").Minimatch var tap = require("tap") tap.test("cache test", function (t) { var mm1 = new Minimatch("a?b") var mm2 = new Minimatch("a?b") t.equal(mm1, mm2, "should get the same object") // the lru should drop it after 100 entries for (var i = 0; i < 100; i ++) { new Minimatch("a"+i) } mm2 = new Minimatch("a?b") t.notEqual(mm1, mm2, "cache should have dropped") t.end() }) npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/test/defaults.js0000644000000000000000000002277112631326456025454 0ustar 00000000000000// http://www.bashcookbook.com/bashinfo/source/bash-1.14.7/tests/glob-test // // TODO: Some of these tests do very bad things with backslashes, and will // most likely fail badly on windows. They should probably be skipped. var tap = require("tap") , globalBefore = Object.keys(global) , mm = require("../") , files = [ "a", "b", "c", "d", "abc" , "abd", "abe", "bb", "bcd" , "ca", "cb", "dd", "de" , "bdir/", "bdir/cfile"] , next = files.concat([ "a-b", "aXb" , ".x", ".y" ]) tap.test("basic tests", function (t) { var start = Date.now() // [ pattern, [matches], MM opts, files, TAP opts] ; [ "http://www.bashcookbook.com/bashinfo" + "/source/bash-1.14.7/tests/glob-test" , ["a*", ["a", "abc", "abd", "abe"]] , ["X*", ["X*"], {nonull: true}] // allow null glob expansion , ["X*", []] // isaacs: Slightly different than bash/sh/ksh // \\* is not un-escaped to literal "*" in a failed match, // but it does make it get treated as a literal star , ["\\*", ["\\*"], {nonull: true}] , ["\\**", ["\\**"], {nonull: true}] , ["\\*\\*", ["\\*\\*"], {nonull: true}] , ["b*/", ["bdir/"]] , ["c*", ["c", "ca", "cb"]] , ["**", files] , ["\\.\\./*/", ["\\.\\./*/"], {nonull: true}] , ["s/\\..*//", ["s/\\..*//"], {nonull: true}] , "legendary larry crashes bashes" , ["/^root:/{s/^[^:]*:[^:]*:\([^:]*\).*$/\\1/" , ["/^root:/{s/^[^:]*:[^:]*:\([^:]*\).*$/\\1/"], {nonull: true}] , ["/^root:/{s/^[^:]*:[^:]*:\([^:]*\).*$/\1/" , ["/^root:/{s/^[^:]*:[^:]*:\([^:]*\).*$/\1/"], {nonull: true}] , "character classes" , ["[a-c]b*", ["abc", "abd", "abe", "bb", "cb"]] , ["[a-y]*[^c]", ["abd", "abe", "bb", "bcd", "bdir/", "ca", "cb", "dd", "de"]] , ["a*[^c]", ["abd", "abe"]] , function () { files.push("a-b", "aXb") } , ["a[X-]b", ["a-b", "aXb"]] , function () { files.push(".x", ".y") } , ["[^a-c]*", ["d", "dd", "de"]] , function () { files.push("a*b/", "a*b/ooo") } , ["a\\*b/*", ["a*b/ooo"]] , ["a\\*?/*", ["a*b/ooo"]] , ["*\\\\!*", [], {null: true}, ["echo !7"]] , ["*\\!*", ["echo !7"], null, ["echo !7"]] , ["*.\\*", ["r.*"], null, ["r.*"]] , ["a[b]c", ["abc"]] , ["a[\\b]c", ["abc"]] , ["a?c", ["abc"]] , ["a\\*c", [], {null: true}, 
["abc"]] , ["", [""], { null: true }, [""]] , "http://www.opensource.apple.com/source/bash/bash-23/" + "bash/tests/glob-test" , function () { files.push("man/", "man/man1/", "man/man1/bash.1") } , ["*/man*/bash.*", ["man/man1/bash.1"]] , ["man/man1/bash.1", ["man/man1/bash.1"]] , ["a***c", ["abc"], null, ["abc"]] , ["a*****?c", ["abc"], null, ["abc"]] , ["?*****??", ["abc"], null, ["abc"]] , ["*****??", ["abc"], null, ["abc"]] , ["?*****?c", ["abc"], null, ["abc"]] , ["?***?****c", ["abc"], null, ["abc"]] , ["?***?****?", ["abc"], null, ["abc"]] , ["?***?****", ["abc"], null, ["abc"]] , ["*******c", ["abc"], null, ["abc"]] , ["*******?", ["abc"], null, ["abc"]] , ["a*cd**?**??k", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["a**?**cd**?**??k", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["a**?**cd**?**??k***", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["a**?**cd**?**??***k", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["a**?**cd**?**??***k**", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["a****c**?**??*****", ["abcdecdhjk"], null, ["abcdecdhjk"]] , ["[-abc]", ["-"], null, ["-"]] , ["[abc-]", ["-"], null, ["-"]] , ["\\", ["\\"], null, ["\\"]] , ["[\\\\]", ["\\"], null, ["\\"]] , ["[[]", ["["], null, ["["]] , ["[", ["["], null, ["["]] , ["[*", ["[abc"], null, ["[abc"]] , "a right bracket shall lose its special meaning and\n" + "represent itself in a bracket expression if it occurs\n" + "first in the list. -- POSIX.2 2.8.3.2" , ["[]]", ["]"], null, ["]"]] , ["[]-]", ["]"], null, ["]"]] , ["[a-\z]", ["p"], null, ["p"]] , ["??**********?****?", [], { null: true }, ["abc"]] , ["??**********?****c", [], { null: true }, ["abc"]] , ["?************c****?****", [], { null: true }, ["abc"]] , ["*c*?**", [], { null: true }, ["abc"]] , ["a*****c*?**", [], { null: true }, ["abc"]] , ["a********???*******", [], { null: true }, ["abc"]] , ["[]", [], { null: true }, ["a"]] , ["[abc", [], { null: true }, ["["]] , "nocase tests" , ["XYZ", ["xYz"], { nocase: true, null: true } , ["xYz", "ABC", "IjK"]] , ["ab*", ["ABC"], { nocase: true, null: true } , ["xYz", "ABC", "IjK"]] , ["[ia]?[ck]", ["ABC", "IjK"], { nocase: true, null: true } , ["xYz", "ABC", "IjK"]] // [ pattern, [matches], MM opts, files, TAP opts] , "onestar/twostar" , ["{/*,*}", [], {null: true}, ["/asdf/asdf/asdf"]] , ["{/?,*}", ["/a", "bb"], {null: true} , ["/a", "/b/b", "/a/b/c", "bb"]] , "dots should not match unless requested" , ["**", ["a/b"], {}, ["a/b", "a/.d", ".a/.d"]] // .. and . can only match patterns starting with ., // even when options.dot is set. , function () { files = ["a/./b", "a/../b", "a/c/b", "a/.d/b"] } , ["a/*/b", ["a/c/b", "a/.d/b"], {dot: true}] , ["a/.*/b", ["a/./b", "a/../b", "a/.d/b"], {dot: true}] , ["a/*/b", ["a/c/b"], {dot:false}] , ["a/.*/b", ["a/./b", "a/../b", "a/.d/b"], {dot: false}] // this also tests that changing the options needs // to change the cache key, even if the pattern is // the same! , ["**", ["a/b","a/.d",".a/.d"], { dot: true } , [ ".a/.d", "a/.d", "a/b"]] , "paren sets cannot contain slashes" , ["*(a/b)", ["*(a/b)"], {nonull: true}, ["a/b"]] // brace sets trump all else. // // invalid glob pattern. fails on bash4 and bsdglob. // however, in this implementation, it's easier just // to do the intuitive thing, and let brace-expansion // actually come before parsing any extglob patterns, // like the documentation seems to say. // // XXX: if anyone complains about this, either fix it // or tell them to grow up and stop complaining. 
// // bash/bsdglob says this: // , ["*(a|{b),c)}", ["*(a|{b),c)}"], {}, ["a", "ab", "ac", "ad"]] // but we do this instead: , ["*(a|{b),c)}", ["a", "ab", "ac"], {}, ["a", "ab", "ac", "ad"]] // test partial parsing in the presence of comment/negation chars , ["[!a*", ["[!ab"], {}, ["[!ab", "[ab"]] , ["[#a*", ["[#ab"], {}, ["[#ab", "[ab"]] // like: {a,b|c\\,d\\\|e} except it's unclosed, so it has to be escaped. , ["+(a|*\\|c\\\\|d\\\\\\|e\\\\\\\\|f\\\\\\\\\\|g" , ["+(a|b\\|c\\\\|d\\\\|e\\\\\\\\|f\\\\\\\\|g"] , {} , ["+(a|b\\|c\\\\|d\\\\|e\\\\\\\\|f\\\\\\\\|g", "a", "b\\c"]] // crazy nested {,,} and *(||) tests. , function () { files = [ "a", "b", "c", "d" , "ab", "ac", "ad" , "bc", "cb" , "bc,d", "c,db", "c,d" , "d)", "(b|c", "*(b|c" , "b|c", "b|cc", "cb|c" , "x(a|b|c)", "x(a|c)" , "(a|b|c)", "(a|c)"] } , ["*(a|{b,c})", ["a", "b", "c", "ab", "ac"]] , ["{a,*(b|c,d)}", ["a","(b|c", "*(b|c", "d)"]] // a // *(b|c) // *(b|d) , ["{a,*(b|{c,d})}", ["a","b", "bc", "cb", "c", "d"]] , ["*(a|{b|c,c})", ["a", "b", "c", "ab", "ac", "bc", "cb"]] // test various flag settings. , [ "*(a|{b|c,c})", ["x(a|b|c)", "x(a|c)", "(a|b|c)", "(a|c)"] , { noext: true } ] , ["a?b", ["x/y/acb", "acb/"], {matchBase: true} , ["x/y/acb", "acb/", "acb/d/e", "x/y/acb/d"] ] , ["#*", ["#a", "#b"], {nocomment: true}, ["#a", "#b", "c#d"]] // begin channelling Boole and deMorgan... , "negation tests" , function () { files = ["d", "e", "!ab", "!abc", "a!b", "\\!a"] } // anything that is NOT a* matches. , ["!a*", ["\\!a", "d", "e", "!ab", "!abc"]] // anything that IS !a* matches. , ["!a*", ["!ab", "!abc"], {nonegate: true}] // anything that IS a* matches , ["!!a*", ["a!b"]] // anything that is NOT !a* matches , ["!\\!a*", ["a!b", "d", "e", "\\!a"]] // negation nestled within a pattern , function () { files = [ "foo.js" , "foo.bar" // can't match this one without negative lookbehind. , "foo.js.js" , "blar.js" , "foo." , "boo.js.boo" ] } , ["*.!(js)", ["foo.bar", "foo.", "boo.js.boo"] ] ].forEach(function (c) { if (typeof c === "function") return c() if (typeof c === "string") return t.comment(c) var pattern = c[0] , expect = c[1].sort(alpha) , options = c[2] , f = c[3] || files , tapOpts = c[4] || {} // options.debug = true var Class = mm.defaults(options).Minimatch var m = new Class(pattern, {}) var r = m.makeRe() tapOpts.re = String(r) || JSON.stringify(r) tapOpts.files = JSON.stringify(f) tapOpts.pattern = pattern tapOpts.set = m.set tapOpts.negated = m.negate var actual = mm.match(f, pattern, options) actual.sort(alpha) t.equivalent( actual, expect , JSON.stringify(pattern) + " " + JSON.stringify(expect) , tapOpts ) }) t.comment("time=" + (Date.now() - start) + "ms") t.end() }) tap.test("global leak test", function (t) { var globalAfter = Object.keys(global) t.equivalent(globalAfter, globalBefore, "no new globals, please") t.end() }) function alpha (a, b) { return a > b ? 
1 : -1 } npm_3.5.2.orig/node_modules/node-gyp/node_modules/minimatch/test/extglob-ending-with-state-char.js0000644000000000000000000000031012631326456031536 0ustar 00000000000000var test = require('tap').test var minimatch = require('../') test('extglob ending with statechar', function(t) { t.notOk(minimatch('ax', 'a?(b*)')) t.ok(minimatch('ax', '?(a*|b)')) t.end() }) npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/.npmrc0000644000000000000000000000005412631326456022761 0ustar 00000000000000save-prefix = ~ proprietary-attribs = false npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/LICENSE0000644000000000000000000000137512631326456022655 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/README.md0000644000000000000000000001213612631326456023124 0ustar 00000000000000# npmlog The logger util that npm uses. This logger is very basic. It does the logging for npm. It supports custom levels and colored output. By default, logs are written to stderr. If you want to send log messages to outputs other than streams, then you can change the `log.stream` member, or you can just listen to the events that it emits, and do whatever you want with them. # Basic Usage ``` var log = require('npmlog') // additional stuff ---------------------------+ // message ----------+ | // prefix ----+ | | // level -+ | | | // v v v v log.info('fyi', 'I have a kitty cat: %j', myKittyCat) ``` ## log.level * {String} The level to display logs at. Any logs at or above this level will be displayed. The special level `silent` will prevent anything from being displayed ever. ## log.record * {Array} An array of all the log messages that have been entered. ## log.maxRecordSize * {Number} The maximum number of records to keep. If log.record gets bigger than 10% over this value, then it is sliced down to 90% of this value. The reason for the 10% window is so that it doesn't have to resize a large array on every log entry. ## log.prefixStyle * {Object} A style object that specifies how prefixes are styled. (See below) ## log.headingStyle * {Object} A style object that specifies how the heading is styled. (See below) ## log.heading * {String} Default: "" If set, a heading that is printed at the start of every line. ## log.stream * {Stream} Default: `process.stderr` The stream where output is written. ## log.enableColor() Force colors to be used on all messages, regardless of the output stream. ## log.disableColor() Disable colors on all messages. ## log.enableProgress() Enable the display of log activity spinner and progress bar ## log.disableProgress() Disable the display of a progress bar ## log.enableUnicode() Force the unicode theme to be used for the progress bar. ## log.disableUnicode() Disable the use of unicode in the progress bar. 
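The color, unicode, and progress toggles above combine with the tracker methods documented below (see `log.newItem`). A minimal, illustrative sketch (not part of the original README, and assuming stderr is a TTY):

```
var log = require('npmlog')

log.enableProgress()                  // show the activity spinner / progress bar
var tracker = log.newItem('fetch', 4) // 4 units of work; see log.newItem below

tracker.info('fetch', 'got first chunk') // trackers expose the log[level] methods
tracker.completeWork(1)                  // the progress bar advances to 25%
tracker.finish()                         // this item now reads as 100% complete
log.disableProgress()
```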
## log.setGaugeTemplate(template)

Overrides the default gauge template.

## log.pause()

Stop emitting messages to the stream, but do not drop them.

## log.resume()

Emit all buffered messages that were written while paused.

## log.log(level, prefix, message, ...)

* `level` {String} The level to emit the message at
* `prefix` {String} A string prefix. Set to "" to skip.
* `message...` Arguments to `util.format`

Emit a log message at the specified level.

## log\[level](prefix, message, ...)

For example,

* log.silly(prefix, message, ...)
* log.verbose(prefix, message, ...)
* log.info(prefix, message, ...)
* log.http(prefix, message, ...)
* log.warn(prefix, message, ...)
* log.error(prefix, message, ...)

Like `log.log(level, prefix, message, ...)`. In this way, each level is
given a shorthand, so you can do `log.info(prefix, message)`.

## log.addLevel(level, n, style, disp)

* `level` {String} Level indicator
* `n` {Number} The numeric level
* `style` {Object} Object with fg, bg, inverse, etc.
* `disp` {String} Optional replacement for `level` in the output.

Sets up a new level with a shorthand function and so forth.

Note that if the number is `Infinity`, then setting the level to that will
cause all log messages to be suppressed. If the number is `-Infinity`, then
the only way to show it is to enable all log messages.

## log.newItem(name, todo, weight)

* `name` {String} Optional; progress item name.
* `todo` {Number} Optional; total amount of work to be done. Default 0.
* `weight` {Number} Optional; the weight of this item relative to others. Default 1.

This adds a new `are-we-there-yet` item tracker to the progress tracker. The
object returned has the `log[level]` methods but is otherwise an
`are-we-there-yet` `Tracker` object.

## log.newStream(name, todo, weight)

This adds a new `are-we-there-yet` stream tracker to the progress tracker.
The object returned has the `log[level]` methods but is otherwise an
`are-we-there-yet` `TrackerStream` object.

## log.newGroup(name, weight)

This adds a new `are-we-there-yet` tracker group to the progress tracker. The
object returned has the `log[level]` methods but is otherwise an
`are-we-there-yet` `TrackerGroup` object.

# Events

Events are all emitted with the message object.

* `log` Emitted for all messages
* `log.<level>` Emitted for all messages with the `<level>` level.
* `<prefix>` Messages with prefixes also emit their prefix as an event.

# Style Objects

Style objects can have the following fields:

* `fg` {String} Color for the foreground text
* `bg` {String} Color for the background
* `bold`, `inverse`, `underline` {Boolean} Set the associated property
* `bell` {Boolean} Make a noise (This is pretty annoying, probably.)

# Message Objects

Every log event is emitted with a message object, and the `log.record` list
contains all of them that have been created.
They have the following fields: * `id` {Number} * `level` {String} * `prefix` {String} * `message` {String} Result of `util.format()` * `messageRaw` {Array} Arguments to `util.format()` npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/example.js0000644000000000000000000000312512631326456023634 0ustar 00000000000000var log = require('./log.js') log.heading = 'npm' console.error('log.level=silly') log.level = 'silly' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=silent') log.level = 'silent' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=info') log.level = 'info' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('404', 'This is a longer\n'+ 'message, with some details\n'+ 'and maybe a stack.\n'+ new Error('a 404 error').stack) log.addLevel('noise', 10000, {beep: true}) log.noise(false, 'LOUD NOISES') npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/log.js0000644000000000000000000001477112631326456022773 0ustar 00000000000000'use strict' var Progress = require('are-we-there-yet') var Gauge = require('gauge') var EE = require('events').EventEmitter var log = exports = module.exports = new EE var util = require('util') var ansi = require('ansi') log.cursor = ansi(process.stderr) log.stream = process.stderr // by default, let ansi decide based on tty-ness. 
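// (tri-state: undefined means auto-detect from the stream's tty-ness;
// true/false is forced via enableColor()/disableColor() below)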
var colorEnabled = undefined log.enableColor = function () { colorEnabled = true this.cursor.enabled = true } log.disableColor = function () { colorEnabled = false this.cursor.enabled = false } // default level log.level = 'info' log.gauge = new Gauge(log.cursor) log.tracker = new Progress.TrackerGroup() // no progress bars unless asked log.progressEnabled = false var gaugeTheme = undefined log.enableUnicode = function () { gaugeTheme = Gauge.unicode log.gauge.setTheme(gaugeTheme) } log.disableUnicode = function () { gaugeTheme = Gauge.ascii log.gauge.setTheme(gaugeTheme) } var gaugeTemplate = undefined log.setGaugeTemplate = function (template) { gaugeTemplate = template log.gauge.setTemplate(gaugeTemplate) } log.enableProgress = function () { if (this.progressEnabled) return this.progressEnabled = true if (this._pause) return this.tracker.on('change', this.showProgress) this.gauge.enable() this.showProgress() } log.disableProgress = function () { if (!this.progressEnabled) return this.clearProgress() this.progressEnabled = false this.tracker.removeListener('change', this.showProgress) this.gauge.disable() } var trackerConstructors = ['newGroup', 'newItem', 'newStream'] var mixinLog = function (tracker) { // mixin the public methods from log into the tracker // (except: conflicts and one's we handle specially) Object.keys(log).forEach(function (P) { if (P[0] === '_') return if (trackerConstructors.filter(function (C) { return C === P }).length) return if (tracker[P]) return if (typeof log[P] !== 'function') return var func = log[P] tracker[P] = function () { return func.apply(log, arguments) } }) // if the new tracker is a group, make sure any subtrackers get // mixed in too if (tracker instanceof Progress.TrackerGroup) { trackerConstructors.forEach(function (C) { var func = tracker[C] tracker[C] = function () { return mixinLog(func.apply(tracker, arguments)) } }) } return tracker } // Add tracker constructors to the top level log object trackerConstructors.forEach(function (C) { log[C] = function () { return mixinLog(this.tracker[C].apply(this.tracker, arguments)) } }) log.clearProgress = function () { if (!this.progressEnabled) return this.gauge.hide() } log.showProgress = function (name) { if (!this.progressEnabled) return this.gauge.show(name, this.tracker.completed()) }.bind(log) // bind for use in tracker's on-change listener // temporarily stop emitting, but don't drop log.pause = function () { this._paused = true } log.resume = function () { if (!this._paused) return this._paused = false var b = this._buffer this._buffer = [] b.forEach(function (m) { this.emitLog(m) }, this) if (this.progressEnabled) this.enableProgress() } log._buffer = [] var id = 0 log.record = [] log.maxRecordSize = 10000 log.log = function (lvl, prefix, message) { var l = this.levels[lvl] if (l === undefined) { return this.emit('error', new Error(util.format( 'Undefined log level: %j', lvl))) } var a = new Array(arguments.length - 2) var stack = null for (var i = 2; i < arguments.length; i ++) { var arg = a[i-2] = arguments[i] // resolve stack traces to a plain string. if (typeof arg === 'object' && arg && (arg instanceof Error) && arg.stack) { arg.stack = stack = arg.stack + '' } } if (stack) a.unshift(stack + '\n') message = util.format.apply(util, a) var m = { id: id++, level: lvl, prefix: String(prefix || ''), message: message, messageRaw: a } this.emit('log', m) this.emit('log.' 
+ lvl, m) if (m.prefix) this.emit(m.prefix, m) this.record.push(m) var mrs = this.maxRecordSize var n = this.record.length - mrs if (n > mrs / 10) { var newSize = Math.floor(mrs * 0.9) this.record = this.record.slice(-1 * newSize) } this.emitLog(m) }.bind(log) log.emitLog = function (m) { if (this._paused) { this._buffer.push(m) return } if (this.progressEnabled) this.gauge.pulse(m.prefix) var l = this.levels[m.level] if (l === undefined) return if (l < this.levels[this.level]) return if (l > 0 && !isFinite(l)) return var style = log.style[m.level] var disp = log.disp[m.level] || m.level this.clearProgress() m.message.split(/\r?\n/).forEach(function (line) { if (this.heading) { this.write(this.heading, this.headingStyle) this.write(' ') } this.write(disp, log.style[m.level]) var p = m.prefix || '' if (p) this.write(' ') this.write(p, this.prefixStyle) this.write(' ' + line + '\n') }, this) this.showProgress() } log.write = function (msg, style) { if (!this.cursor) return if (this.stream !== this.cursor.stream) { this.cursor = ansi(this.stream, { enabled: colorEnabled }) var options = {} if (gaugeTheme != null) options.theme = gaugeTheme if (gaugeTemplate != null) options.template = gaugeTemplate this.gauge = new Gauge(options, this.cursor) } style = style || {} if (style.fg) this.cursor.fg[style.fg]() if (style.bg) this.cursor.bg[style.bg]() if (style.bold) this.cursor.bold() if (style.underline) this.cursor.underline() if (style.inverse) this.cursor.inverse() if (style.beep) this.cursor.beep() this.cursor.write(msg).reset() } log.addLevel = function (lvl, n, style, disp) { if (!disp) disp = lvl this.levels[lvl] = n this.style[lvl] = style if (!this[lvl]) this[lvl] = function () { var a = new Array(arguments.length + 1) a[0] = lvl for (var i = 0; i < arguments.length; i ++) { a[i + 1] = arguments[i] } return this.log.apply(this, a) }.bind(this) this.disp[lvl] = disp } log.prefixStyle = { fg: 'magenta' } log.headingStyle = { fg: 'white', bg: 'black' } log.style = {} log.levels = {} log.disp = {} log.addLevel('silly', -Infinity, { inverse: true }, 'sill') log.addLevel('verbose', 1000, { fg: 'blue', bg: 'black' }, 'verb') log.addLevel('info', 2000, { fg: 'green' }) log.addLevel('http', 3000, { fg: 'green', bg: 'black' }) log.addLevel('warn', 4000, { fg: 'black', bg: 'yellow' }, 'WARN') log.addLevel('error', 5000, { fg: 'red', bg: 'black' }, 'ERR!') log.addLevel('silent', Infinity) npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/0000755000000000000000000000000012631326456024317 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/package.json0000644000000000000000000000256412631326456024137 0ustar 00000000000000{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "npmlog", "description": "logger for npm", "version": "1.2.1", "repository": { "type": "git", "url": "git://github.com/isaacs/npmlog.git" }, "main": "log.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "ansi": "~0.3.0", "are-we-there-yet": "~1.0.0", "gauge": "~1.2.0" }, "devDependencies": { "tap": "" }, "license": "ISC", "gitHead": "4e1a73a567036064ded425a7d48c863d53550b4f", "bugs": { "url": "https://github.com/isaacs/npmlog/issues" }, "homepage": "https://github.com/isaacs/npmlog#readme", "_id": "npmlog@1.2.1", "_shasum": "28e7be619609b53f7ad1dd300a10d64d716268b6", "_from": "npmlog@>=0.0.0 <1.0.0||>=1.0.0 <2.0.0", "_npmVersion": "2.10.0", "_nodeVersion": "2.0.1", "_npmUser": { "name": "isaacs", "email": "isaacs@npmjs.com" }, "dist": { "shasum": "28e7be619609b53f7ad1dd300a10d64d716268b6", "tarball": "http://registry.npmjs.org/npmlog/-/npmlog-1.2.1.tgz" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "iarna", "email": "me@re-becca.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/npmlog/-/npmlog-1.2.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/test/0000755000000000000000000000000012631326456022621 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/0000755000000000000000000000000012631326456025251 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/0000755000000000000000000000000012631326456027403 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/0000755000000000000000000000000012631326456025407 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/.npmignore0000644000000000000000000000001512631326456027244 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/History.md0000644000000000000000000000060612631326456027236 0ustar 00000000000000 0.3.0 / 2014-05-09 ================== * package: remove "test" script and "devDependencies" * package: remove "engines" section * pacakge: remove "bin" section * package: beautify * examples: remove `starwars` example (#15) * Documented goto, horizontalAbsolute, and eraseLine methods in README.md (#12, @Jammerwoch) * add `.jshintrc` file < 0.3.0 ======= * Prehistoric npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/README.md0000644000000000000000000000621312631326456026532 0ustar 00000000000000ansi.js ========= ### Advanced ANSI formatting tool for Node.js `ansi.js` is a module for Node.js that provides an easy-to-use API for writing ANSI escape codes to `Stream` instances. ANSI escape codes are used to do fancy things in a terminal window, like render text in colors, delete characters, lines, the entire window, or hide and show the cursor, and lots more! #### Features: * 256 color support for the terminal! * Make a beep sound from your terminal! * Works with *any* writable `Stream` instance. * Allows you to move the cursor anywhere on the terminal window. * Allows you to delete existing contents from the terminal window. * Allows you to hide and show the cursor. * Converts CSS color codes and RGB values into ANSI escape codes. * Low-level; you are in control of when escape codes are used, it's not abstracted. 
Installation ------------ Install with `npm`: ``` bash $ npm install ansi ``` Example ------- ``` js var ansi = require('ansi') , cursor = ansi(process.stdout) // You can chain your calls forever: cursor .red() // Set font color to red .bg.grey() // Set background color to grey .write('Hello World!') // Write 'Hello World!' to stdout .bg.reset() // Reset the bgcolor before writing the trailing \n, // to avoid Terminal glitches .write('\n') // And a final \n to wrap things up // Rendering modes are persistent: cursor.hex('#660000').bold().underline() // You can use the regular logging functions, text will be green: console.log('This is blood red, bold text') // To reset just the foreground color: cursor.fg.reset() console.log('This will still be bold') // to go to a location (x,y) on the console // note: 1-indexed, not 0-indexed: cursor.goto(10, 5).write('Five down, ten over') // to clear the current line: cursor.horizontalAbsolute(0).eraseLine().write('Starting again') // to go to a different column on the current line: cursor.horizontalAbsolute(5).write('column five') // Clean up after yourself! cursor.reset() ``` License ------- (The MIT License) Copyright (c) 2012 Nathan Rajlich <nathan@tootallnate.net> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
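Beyond the chained example above, the 256-color conversions from the feature list can be called directly. A short hypothetical sketch (not from the original README):

``` js
var ansi = require('ansi')
  , cursor = ansi(process.stdout)

// rgb() maps 0-255 channel values onto the closest of the 216 ANSI cube colors
cursor.rgb(255, 128, 0).write('orange-ish text\n')

// hex() accepts CSS-style color codes, for foreground or background
cursor.bg.hex('#003366').write('on a dark blue background\n')

cursor.reset().write('done\n')
```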
npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/examples/0000755000000000000000000000000012631326456027067 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/lib/0000755000000000000000000000000012631326456026017 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/package.json0000644000000000000000000000242212631326456027537 0ustar 00000000000000{ "name": "ansi", "description": "Advanced ANSI formatting tool for Node.js", "keywords": [ "ansi", "formatting", "cursor", "color", "terminal", "rgb", "256", "stream" ], "version": "0.3.0", "author": { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://tootallnate.net" }, "repository": { "type": "git", "url": "git://github.com/TooTallNate/ansi.js.git" }, "main": "./lib/ansi.js", "bugs": { "url": "https://github.com/TooTallNate/ansi.js/issues" }, "homepage": "https://github.com/TooTallNate/ansi.js", "_id": "ansi@0.3.0", "_shasum": "74b2f1f187c8553c7f95015bcb76009fb43d38e0", "_from": "ansi@>=0.3.0 <0.4.0", "_npmVersion": "1.4.9", "_npmUser": { "name": "tootallnate", "email": "nathan@tootallnate.net" }, "maintainers": [ { "name": "TooTallNate", "email": "nathan@tootallnate.net" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "dist": { "shasum": "74b2f1f187c8553c7f95015bcb76009fb43d38e0", "tarball": "http://registry.npmjs.org/ansi/-/ansi-0.3.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/ansi/-/ansi-0.3.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/examples/beep/0000755000000000000000000000000012631326456030002 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/examples/clear/0000755000000000000000000000000012631326456030155 5ustar 00000000000000././@LongLink0000000000000000000000000000014600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/examples/cursorPosition.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/examples/cursorPosition.j0000755000000000000000000000120012631326456032300 0ustar 00000000000000#!/usr/bin/env node var tty = require('tty') var cursor = require('../')(process.stdout) // listen for the queryPosition report on stdin process.stdin.resume() raw(true) process.stdin.once('data', function (b) { var match = /\[(\d+)\;(\d+)R$/.exec(b.toString()) if (match) { var xy = match.slice(1, 3).reverse().map(Number) console.error(xy) } // cleanup and close stdin raw(false) process.stdin.pause() }) // send the query position request code to stdout cursor.queryPosition() function raw (mode) { if (process.stdin.setRawMode) { process.stdin.setRawMode(mode) } else { tty.setRawMode(mode) } } npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/examples/progress/0000755000000000000000000000000012631326456030733 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/examples/beep/index.js0000755000000000000000000000051212631326456031450 0ustar 00000000000000#!/usr/bin/env node /** * Invokes the terminal "beep" sound once per second on every exact second. 
*/ process.title = 'beep' var cursor = require('../../')(process.stdout) function beep () { cursor.beep() setTimeout(beep, 1000 - (new Date()).getMilliseconds()) } setTimeout(beep, 1000 - (new Date()).getMilliseconds()) npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/examples/clear/index.js0000755000000000000000000000054412631326456031630 0ustar 00000000000000#!/usr/bin/env node /** * Like GNU ncurses "clear" command. * https://github.com/mscdex/node-ncurses/blob/master/deps/ncurses/progs/clear.c */ process.title = 'clear' function lf () { return '\n' } require('../../')(process.stdout) .write(Array.apply(null, Array(process.stdout.getWindowSize()[1])).map(lf).join('')) .eraseData(2) .goto(1, 1) ././@LongLink0000000000000000000000000000014600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/examples/progress/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/examples/progress/index.j0000644000000000000000000000327412631326456032223 0ustar 00000000000000#!/usr/bin/env node var assert = require('assert') , ansi = require('../../') function Progress (stream, width) { this.cursor = ansi(stream) this.delta = this.cursor.newlines this.width = width | 0 || 10 this.open = '[' this.close = ']' this.complete = '█' this.incomplete = '_' // initial render this.progress = 0 } Object.defineProperty(Progress.prototype, 'progress', { get: get , set: set , configurable: true , enumerable: true }) function get () { return this._progress } function set (v) { this._progress = Math.max(0, Math.min(v, 100)) var w = this.width - this.complete.length - this.incomplete.length , n = w * (this._progress / 100) | 0 , i = w - n , com = c(this.complete, n) , inc = c(this.incomplete, i) , delta = this.cursor.newlines - this.delta assert.equal(com.length + inc.length, w) if (delta > 0) { this.cursor.up(delta) this.delta = this.cursor.newlines } this.cursor .horizontalAbsolute(0) .eraseLine(2) .fg.white() .write(this.open) .fg.grey() .bold() .write(com) .resetBold() .write(inc) .fg.white() .write(this.close) .fg.reset() .write('\n') } function c (char, length) { return Array.apply(null, Array(length)).map(function () { return char }).join('') } // Usage var width = parseInt(process.argv[2], 10) || process.stdout.getWindowSize()[0] / 2 , p = new Progress(process.stdout, width) ;(function tick () { p.progress += Math.random() * 5 p.cursor .eraseLine(2) .write('Progress: ') .bold().write(p.progress.toFixed(2)) .write('%') .resetBold() .write('\n') if (p.progress < 100) setTimeout(tick, 100) })() npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/ansi/lib/ansi.js0000644000000000000000000001743412631326456027320 0ustar 00000000000000 /** * References: * * - http://en.wikipedia.org/wiki/ANSI_escape_code * - http://www.termsys.demon.co.uk/vtansi.htm * */ /** * Module dependencies. */ var emitNewlineEvents = require('./newlines') , prefix = '\x1b[' // For all escape codes , suffix = 'm' // Only for color codes /** * The ANSI escape sequences. */ var codes = { up: 'A' , down: 'B' , forward: 'C' , back: 'D' , nextLine: 'E' , previousLine: 'F' , horizontalAbsolute: 'G' , eraseData: 'J' , eraseLine: 'K' , scrollUp: 'S' , scrollDown: 'T' , savePosition: 's' , restorePosition: 'u' , queryPosition: '6n' , hide: '?25l' , show: '?25h' } /** * Rendering ANSI codes. */ var styles = { bold: 1 , italic: 3 , underline: 4 , inverse: 7 } /** * The negating ANSI code for the rendering modes. 
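 * (writing one of these, e.g. 22, turns the mode off again without
 * resetting the colors)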
*/ var reset = { bold: 22 , italic: 23 , underline: 24 , inverse: 27 } /** * The standard, styleable ANSI colors. */ var colors = { white: 37 , black: 30 , blue: 34 , cyan: 36 , green: 32 , magenta: 35 , red: 31 , yellow: 33 , grey: 90 , brightBlack: 90 , brightRed: 91 , brightGreen: 92 , brightYellow: 93 , brightBlue: 94 , brightMagenta: 95 , brightCyan: 96 , brightWhite: 97 } /** * Creates a Cursor instance based off the given `writable stream` instance. */ function ansi (stream, options) { if (stream._ansicursor) { return stream._ansicursor } else { return stream._ansicursor = new Cursor(stream, options) } } module.exports = exports = ansi /** * The `Cursor` class. */ function Cursor (stream, options) { if (!(this instanceof Cursor)) { return new Cursor(stream, options) } if (typeof stream != 'object' || typeof stream.write != 'function') { throw new Error('a valid Stream instance must be passed in') } // the stream to use this.stream = stream // when 'enabled' is false then all the functions are no-ops except for write() this.enabled = options && options.enabled if (typeof this.enabled === 'undefined') { this.enabled = stream.isTTY } this.enabled = !!this.enabled // then `buffering` is true, then `write()` calls are buffered in // memory until `flush()` is invoked this.buffering = !!(options && options.buffering) this._buffer = [] // controls the foreground and background colors this.fg = this.foreground = new Colorer(this, 0) this.bg = this.background = new Colorer(this, 10) // defaults this.Bold = false this.Italic = false this.Underline = false this.Inverse = false // keep track of the number of "newlines" that get encountered this.newlines = 0 emitNewlineEvents(stream) stream.on('newline', function () { this.newlines++ }.bind(this)) } exports.Cursor = Cursor /** * Helper function that calls `write()` on the underlying Stream. * Returns `this` instead of the write() return value to keep * the chaining going. */ Cursor.prototype.write = function (data) { if (this.buffering) { this._buffer.push(arguments) } else { this.stream.write.apply(this.stream, arguments) } return this } /** * Buffer `write()` calls into memory. * * @api public */ Cursor.prototype.buffer = function () { this.buffering = true return this } /** * Write out the in-memory buffer. * * @api public */ Cursor.prototype.flush = function () { this.buffering = false var str = this._buffer.map(function (args) { if (args.length != 1) throw new Error('unexpected args length! ' + args.length); return args[0]; }).join(''); this._buffer.splice(0); // empty this.write(str); return this } /** * The `Colorer` class manages both the background and foreground colors. */ function Colorer (cursor, base) { this.current = null this.cursor = cursor this.base = base } exports.Colorer = Colorer /** * Write an ANSI color code, ensuring that the same code doesn't get rewritten. */ Colorer.prototype._setColorCode = function setColorCode (code) { var c = String(code) if (this.current === c) return this.cursor.enabled && this.cursor.write(prefix + c + suffix) this.current = c return this } /** * Set up the positional ANSI codes. */ Object.keys(codes).forEach(function (name) { var code = String(codes[name]) Cursor.prototype[name] = function () { var c = code if (arguments.length > 0) { c = toArray(arguments).map(Math.round).join(';') + code } this.enabled && this.write(prefix + c) return this } }) /** * Set up the functions for the rendering ANSI codes. 
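 * (this defines cursor.bold(), cursor.resetBold(), and so on for each
 * style in the tables above)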
*/ Object.keys(styles).forEach(function (style) { var name = style[0].toUpperCase() + style.substring(1) , c = styles[style] , r = reset[style] Cursor.prototype[style] = function () { if (this[name]) return this.enabled && this.write(prefix + c + suffix) this[name] = true return this } Cursor.prototype['reset' + name] = function () { if (!this[name]) return this.enabled && this.write(prefix + r + suffix) this[name] = false return this } }) /** * Setup the functions for the standard colors. */ Object.keys(colors).forEach(function (color) { var code = colors[color] Colorer.prototype[color] = function () { this._setColorCode(this.base + code) return this.cursor } Cursor.prototype[color] = function () { return this.foreground[color]() } }) /** * Makes a beep sound! */ Cursor.prototype.beep = function () { this.enabled && this.write('\x07') return this } /** * Moves cursor to specific position */ Cursor.prototype.goto = function (x, y) { x = x | 0 y = y | 0 this.enabled && this.write(prefix + y + ';' + x + 'H') return this } /** * Resets the color. */ Colorer.prototype.reset = function () { this._setColorCode(this.base + 39) return this.cursor } /** * Resets all ANSI formatting on the stream. */ Cursor.prototype.reset = function () { this.enabled && this.write(prefix + '0' + suffix) this.Bold = false this.Italic = false this.Underline = false this.Inverse = false this.foreground.current = null this.background.current = null return this } /** * Sets the foreground color with the given RGB values. * The closest match out of the 216 colors is picked. */ Colorer.prototype.rgb = function (r, g, b) { var base = this.base + 38 , code = rgb(r, g, b) this._setColorCode(base + ';5;' + code) return this.cursor } /** * Same as `cursor.fg.rgb(r, g, b)`. */ Cursor.prototype.rgb = function (r, g, b) { return this.foreground.rgb(r, g, b) } /** * Accepts CSS color codes for use with ANSI escape codes. * For example: `#FF000` would be bright red. */ Colorer.prototype.hex = function (color) { return this.rgb.apply(this, hex(color)) } /** * Same as `cursor.fg.hex(color)`. */ Cursor.prototype.hex = function (color) { return this.foreground.hex(color) } // UTIL FUNCTIONS // /** * Translates a 255 RGB value to a 0-5 ANSI RGV value, * then returns the single ANSI color code to use. */ function rgb (r, g, b) { var red = r / 255 * 5 , green = g / 255 * 5 , blue = b / 255 * 5 return rgb5(red, green, blue) } /** * Turns rgb 0-5 values into a single ANSI color code to use. */ function rgb5 (r, g, b) { var red = Math.round(r) , green = Math.round(g) , blue = Math.round(b) return 16 + (red*36) + (green*6) + blue } /** * Accepts a hex CSS color code string (# is optional) and * translates it into an Array of 3 RGB 0-255 values, which * can then be used with rgb(). */ function hex (color) { var c = color[0] === '#' ? color.substring(1) : color , r = c.substring(0, 2) , g = c.substring(2, 4) , b = c.substring(4, 6) return [parseInt(r, 16), parseInt(g, 16), parseInt(b, 16)] } /** * Turns an array-like object into a real array. */ function toArray (a) { var i = 0 , l = a.length , rtn = [] for (; i 0) { var len = data.length , i = 0 // now try to calculate any deltas if (typeof data == 'string') { for (; i 100% are not allowed. Triggers a `change` event. * tracker.finish() Marks this tracker as finished, tracker.completed() will now be 1. Triggers a `change` event. 
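A minimal sketch of the tracker API described above (usage inferred from the method descriptions; the names `top` and `copy` are just examples):

```js
var TrackerGroup = require('are-we-there-yet').TrackerGroup

var top = new TrackerGroup('top')
top.on('change', function (name) {
  // fires on addUnit/completeWork/finish; completed() is a 0..1 fraction
  console.log(name, top.completed())
})

var copy = top.newItem('copy', 10) // a Tracker expecting 10 units of work
copy.completeWork(5)               // change event; top.completed() === 0.5
copy.finish()                      // change event; top.completed() === 1
```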
TrackerStream ============= * var tracker = new TrackerStream(**name**, **size**, **options**) * **name** *(optional)* The name of this counter to report in change events. Defaults to undefined. * **size** *(optional)* The number of bytes being sent through this stream. * **options** *(optional)* A hash of stream options The tracker stream object is a pass through stream that updates an internal tracker object each time a block passes through. It's intended to track downloads, file extraction and other related activities. You use it by piping your data source into it and then using it as your data source. If your data has a length attribute then that's used as the amount of work completed when the chunk is passed through. If it does not (eg, object streams) then each chunk counts as completing 1 unit of work, so your size should be the total number of objects being streamed. * tracker.addWork(**todo**) * **todo** Increase the expected overall size by **todo** bytes. Increases the amount of work to be done, thus decreasing the completion percentage. Triggers a `change` event. npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/index.js0000644000000000000000000000700412631326456031051 0ustar 00000000000000"use strict" var stream = require("readable-stream"); var EventEmitter = require("events").EventEmitter var util = require("util") var delegate = require("delegates") var TrackerGroup = exports.TrackerGroup = function (name) { EventEmitter.call(this) this.name = name this.trackGroup = [] var self = this this.totalWeight = 0 var noteChange = this.noteChange = function (name) { self.emit("change", name || this.name) }.bind(this) this.trackGroup.forEach(function(unit) { unit.on("change", noteChange) }) } util.inherits(TrackerGroup, EventEmitter) TrackerGroup.prototype.completed = function () { if (this.trackGroup.length==0) return 0 var valPerWeight = 1 / this.totalWeight var completed = 0 this.trackGroup.forEach(function(T) { completed += valPerWeight * T.weight * T.completed() }) return completed } TrackerGroup.prototype.addUnit = function (unit, weight, noChange) { unit.weight = weight || 1 this.totalWeight += unit.weight this.trackGroup.push(unit) unit.on("change", this.noteChange) if (! noChange) this.emit("change", this.name) return unit } TrackerGroup.prototype.newGroup = function (name, weight) { return this.addUnit(new TrackerGroup(name), weight) } TrackerGroup.prototype.newItem = function (name, todo, weight) { return this.addUnit(new Tracker(name, todo), weight) } TrackerGroup.prototype.newStream = function (name, todo, weight) { return this.addUnit(new TrackerStream(name, todo), weight) } TrackerGroup.prototype.finish = function () { if (! this.trackGroup.length) { this.addUnit(new Tracker(), 1, true) } var self = this this.trackGroup.forEach(function(T) { T.removeListener("change", self.noteChange) T.finish() }) this.emit("change", this.name) } var buffer = " " TrackerGroup.prototype.debug = function (depth) { depth = depth || 0 var indent = depth ? 
buffer.substr(0,depth) : "" var output = indent + (this.name||"top") + ": " + this.completed() + "\n" this.trackGroup.forEach(function(T) { if (T instanceof TrackerGroup) { output += T.debug(depth + 1) } else { output += indent + " " + T.name + ": " + T.completed() + "\n" } }) return output } var Tracker = exports.Tracker = function (name,todo) { EventEmitter.call(this) this.name = name this.workDone = 0 this.workTodo = todo || 0 } util.inherits(Tracker, EventEmitter) Tracker.prototype.completed = function () { return this.workTodo==0 ? 0 : this.workDone / this.workTodo } Tracker.prototype.addWork = function (work) { this.workTodo += work this.emit("change", this.name) } Tracker.prototype.completeWork = function (work) { this.workDone += work if (this.workDone > this.workTodo) this.workDone = this.workTodo this.emit("change", this.name) } Tracker.prototype.finish = function () { this.workTodo = this.workDone = 1 this.emit("change", this.name) } var TrackerStream = exports.TrackerStream = function (name, size, options) { stream.Transform.call(this, options) this.tracker = new Tracker(name, size) this.name = name var self = this this.tracker.on("change", function (name) { self.emit("change", name) }) } util.inherits(TrackerStream, stream.Transform) TrackerStream.prototype._transform = function (data, encoding, cb) { this.tracker.completeWork(data.length ? data.length : 1) this.push(data) cb() } TrackerStream.prototype._flush = function (cb) { this.tracker.finish() cb() } delegate(TrackerStream.prototype, "tracker") .method("completed") .method("addWork") npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/package.json0000644000000000000000000000256312631326456031677 0ustar 00000000000000{ "name": "are-we-there-yet", "version": "1.0.4", "description": "Keep track of the overall completion of many dispirate processes", "main": "index.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/iarna/are-we-there-yet.git" }, "author": { "name": "Rebecca Turner", "url": "http://re-becca.org" }, "license": "ISC", "bugs": { "url": "https://github.com/iarna/are-we-there-yet/issues" }, "homepage": "https://github.com/iarna/are-we-there-yet", "devDependencies": { "tap": "^0.4.13" }, "dependencies": { "delegates": "^0.1.0", "readable-stream": "^1.1.13" }, "gitHead": "7ce414849b81ab83935a935275def01914821bde", "_id": "are-we-there-yet@1.0.4", "_shasum": "527fe389f7bcba90806106b99244eaa07e886f85", "_from": "are-we-there-yet@>=1.0.0 <1.1.0", "_npmVersion": "2.0.0", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" } ], "dist": { "shasum": "527fe389f7bcba90806106b99244eaa07e886f85", "tarball": "http://registry.npmjs.org/are-we-there-yet/-/are-we-there-yet-1.0.4.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/are-we-there-yet/-/are-we-there-yet-1.0.4.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/test/0000755000000000000000000000000012631326456030362 5ustar 00000000000000././@LongLink0000000000000000000000000000015700000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/.npmignorenpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000001612631326456032057 0ustar 00000000000000node_modules/ ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/History.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000034412631326456032063 0ustar 00000000000000 0.1.0 / 2014-10-17 ================== * adds `.fluent()` to api 0.0.3 / 2014-01-13 ================== * fix receiver for .method() 0.0.2 / 2014-01-13 ================== * Object.defineProperty() sucks * Initial commit ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/Makefilenpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000014412631326456032061 0ustar 00000000000000 test: @./node_modules/.bin/mocha \ --require should \ --reporter spec \ --bail .PHONY: test././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/Readme.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000335012631326456032063 0ustar 00000000000000 # delegates Node method and accessor delegation utilty. ## Installation ``` $ npm install delegates ``` ## Example ```js var delegate = require('delegates'); ... delegate(proto, 'request') .method('acceptsLanguages') .method('acceptsEncodings') .method('acceptsCharsets') .method('accepts') .method('is') .access('querystring') .access('idempotent') .access('socket') .access('length') .access('query') .access('search') .access('status') .access('method') .access('path') .access('body') .access('host') .access('url') .getter('subdomains') .getter('protocol') .getter('header') .getter('stale') .getter('fresh') .getter('secure') .getter('ips') .getter('ip') ``` # API ## Delegate(proto, prop) Creates a delegator instance used to configure using the `prop` on the given `proto` object. 
(which is usually a prototype) ## Delegate#method(name) Allows the given method `name` to be accessed on the host. ## Delegate#getter(name) Creates a "getter" for the property with the given `name` on the delegated object. ## Delegate#setter(name) Creates a "setter" for the property with the given `name` on the delegated object. ## Delegate#access(name) Creates an "accessor" (ie: both getter *and* setter) for the property with the given `name` on the delegated object. ## Delegate#fluent(name) A unique type of "accessor" that works for a "fluent" API. When called as a getter, the method returns the expected value. However, if the method is called with a value, it will return itself so it can be chained. For example: ```js delegate(proto, 'request') .fluent('query') // getter var q = request.query(); // setter (chainable) request .query({ a: 1 }) .query({ b: 2 }); ``` # License MIT ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000402112631326456032057 0ustar 00000000000000 /** * Expose `Delegator`. */ module.exports = Delegator; /** * Initialize a delegator. * * @param {Object} proto * @param {String} target * @api public */ function Delegator(proto, target) { if (!(this instanceof Delegator)) return new Delegator(proto, target); this.proto = proto; this.target = target; this.methods = []; this.getters = []; this.setters = []; this.fluents = []; } /** * Delegate method `name`. * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.method = function(name){ var proto = this.proto; var target = this.target; this.methods.push(name); proto[name] = function(){ return this[target][name].apply(this[target], arguments); }; return this; }; /** * Delegator accessor `name`. * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.access = function(name){ return this.getter(name).setter(name); }; /** * Delegator getter `name`. * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.getter = function(name){ var proto = this.proto; var target = this.target; this.getters.push(name); proto.__defineGetter__(name, function(){ return this[target][name]; }); return this; }; /** * Delegator setter `name`. 
* * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.setter = function(name){ var proto = this.proto; var target = this.target; this.setters.push(name); proto.__defineSetter__(name, function(val){ return this[target][name] = val; }); return this; }; /** * Delegator fluent accessor * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.fluent = function (name) { var proto = this.proto; var target = this.target; this.fluents.push(name); proto[name] = function(val){ if ('undefined' != typeof val) { this[target][name] = val; return this; } else { return this[target][name]; } }; return this; }; ././@LongLink0000000000000000000000000000017300000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000233112631326456032061 0ustar 00000000000000{ "name": "delegates", "version": "0.1.0", "repository": { "type": "git", "url": "git://github.com/visionmedia/node-delegates.git" }, "description": "delegate methods and accessors to another property", "keywords": [ "delegate", "delegation" ], "dependencies": {}, "devDependencies": { "mocha": "*", "should": "*" }, "license": "MIT", "bugs": { "url": "https://github.com/visionmedia/node-delegates/issues" }, "homepage": "https://github.com/visionmedia/node-delegates", "_id": "delegates@0.1.0", "_shasum": "b4b57be11a1653517a04b27f0949bdc327dfe390", "_from": "delegates@>=0.1.0 <0.2.0", "_npmVersion": "1.4.9", "_npmUser": { "name": "dominicbarnes", "email": "dominic@dbarnes.info" }, "maintainers": [ { "name": "tjholowaychuk", "email": "tj@vision-media.ca" }, { "name": "dominicbarnes", "email": "dominic@dbarnes.info" } ], "dist": { "shasum": "b4b57be11a1653517a04b27f0949bdc327dfe390", "tarball": "http://registry.npmjs.org/delegates/-/delegates-0.1.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/delegates/-/delegates-0.1.0.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/test/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/test/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000337012631326456032065 0ustar 00000000000000 var assert = require('assert'); var delegate = require('..'); describe('.method(name)', function(){ it('should delegate methods', function(){ var obj = {}; obj.request = { foo: function(bar){ assert(this == obj.request); return bar; } }; delegate(obj, 'request').method('foo'); obj.foo('something').should.equal('something'); }) }) describe('.getter(name)', function(){ it('should delegate getters', function(){ var obj = {}; obj.request = { get type() { return 'text/html'; } } delegate(obj, 'request').getter('type'); obj.type.should.equal('text/html'); }) }) describe('.setter(name)', function(){ it('should delegate setters', function(){ var obj = {}; obj.request = { get type() { return this._type.toUpperCase(); }, set type(val) { this._type = val; } } delegate(obj, 'request').setter('type'); obj.type = 'hey'; obj.request.type.should.equal('HEY'); }) }) describe('.access(name)', function(){ it('should delegate getters and setters', function(){ var obj = {}; obj.request = { get type() { return this._type.toUpperCase(); }, set type(val) { this._type = val; } } delegate(obj, 'request').access('type'); obj.type = 'hey'; obj.type.should.equal('HEY'); }) }) describe('.fluent(name)', function () { it('should delegate in a fluent fashion', function () { var obj = { settings: { env: 'development' } }; delegate(obj, 'settings').fluent('env'); obj.env().should.equal('development'); obj.env('production').should.equal(obj); obj.settings.env.should.equal('production'); }) }) ././@LongLink0000000000000000000000000000017700000000000011222 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/.npmignorenpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000004412631326456032060 0ustar 00000000000000build/ test/ examples/ fs.js zlib.js././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000211012631326456032054 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. All rights reserved. 
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000243012631326456032061 0ustar 00000000000000# readable-stream ***Node-core streams for userland*** [![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/) [![NPM](https://nodei.co/npm-dl/readable-stream.png&months=6&height=3)](https://nodei.co/npm/readable-stream/) This package is a mirror of the Streams2 and Streams3 implementations in Node-core. If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core. **readable-stream** comes in two major versions, v1.0.x and v1.1.x. The former tracks the Streams2 implementation in Node 0.10, including bug-fixes and minor improvements as they are added. The latter tracks Streams3 as it develops in Node 0.11; we will likely see a v1.2.x branch for Node 0.12. **readable-stream** uses proper patch-level versioning so if you pin to `"~1.0.0"` you’ll get the latest Node 0.10 Streams2 implementation, including any fixes and minor non-breaking improvements. The patch-level versions of 1.0.x and 1.1.x should mirror the patch-level versions of Node-core releases. 
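For example (a hypothetical `package.json` fragment, not part of this README), pinning to the Streams2 line looks like:

```
{
  "dependencies": {
    "readable-stream": "~1.0.0"
  }
}
```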
You should prefer the **1.0.x** releases for now and when you’re ready to start using Streams3, pin to `"~1.1.0"` ././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/duplex.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000006412631326456032062 0ustar 00000000000000module.exports = require("./lib/_stream_duplex.js") ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/float.patchnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000007374312631326456032100 0ustar 00000000000000diff --git a/lib/_stream_duplex.js b/lib/_stream_duplex.js index c5a741c..a2e0d8e 100644 --- a/lib/_stream_duplex.js +++ b/lib/_stream_duplex.js @@ -26,8 +26,8 @@ module.exports = Duplex; var util = require('util'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('./_stream_readable'); +var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); diff --git a/lib/_stream_passthrough.js b/lib/_stream_passthrough.js index a5e9864..330c247 100644 --- a/lib/_stream_passthrough.js +++ b/lib/_stream_passthrough.js @@ -25,7 +25,7 @@ module.exports = PassThrough; -var Transform = require('_stream_transform'); +var Transform = require('./_stream_transform'); var util = require('util'); util.inherits(PassThrough, Transform); diff --git a/lib/_stream_readable.js b/lib/_stream_readable.js index 0c3fe3e..90a8298 100644 --- a/lib/_stream_readable.js +++ b/lib/_stream_readable.js @@ -23,10 +23,34 @@ module.exports = Readable; Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; +if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { + return emitter.listeners(type).length; +}; + +if (!global.setImmediate) global.setImmediate = function setImmediate(fn) { + return setTimeout(fn, 0); +}; +if (!global.clearImmediate) global.clearImmediate = function clearImmediate(i) { + return clearTimeout(i); +}; + var Stream = require('stream'); var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} var StringDecoder; -var debug = util.debuglog('stream'); +var debug; +if (util.debuglog) + debug = util.debuglog('stream'); +else try { + debug = require('debuglog')('stream'); +} catch (er) { + debug = function() {}; +} util.inherits(Readable, Stream); @@ -380,7 +404,7 @@ function chunkInvalid(state, chunk) { function onEofChunk(stream, state) { - if (state.decoder && !state.ended) { + if (state.decoder && !state.ended && state.decoder.end) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); diff --git a/lib/_stream_transform.js b/lib/_stream_transform.js index b1f9fcc..b0caf57 100644 --- a/lib/_stream_transform.js +++ b/lib/_stream_transform.js @@ -64,8 +64,14 @@ module.exports = Transform; -var Duplex = require('_stream_duplex'); +var Duplex = require('./_stream_duplex'); var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} util.inherits(Transform, Duplex); diff 
--git a/lib/_stream_writable.js b/lib/_stream_writable.js index ba2e920..f49288b 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -27,6 +27,12 @@ module.exports = Writable; Writable.WritableState = WritableState; var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} var Stream = require('stream'); util.inherits(Writable, Stream); @@ -119,7 +125,7 @@ function WritableState(options, stream) { function Writable(options) { // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. - if (!(this instanceof Writable) && !(this instanceof Stream.Duplex)) + if (!(this instanceof Writable) && !(this instanceof require('./_stream_duplex'))) return new Writable(options); this._writableState = new WritableState(options, this); diff --git a/test/simple/test-stream-big-push.js b/test/simple/test-stream-big-push.js index e3787e4..8cd2127 100644 --- a/test/simple/test-stream-big-push.js +++ b/test/simple/test-stream-big-push.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var str = 'asdfasdfasdfasdfasdf'; var r = new stream.Readable({ diff --git a/test/simple/test-stream-end-paused.js b/test/simple/test-stream-end-paused.js index bb73777..d40efc7 100644 --- a/test/simple/test-stream-end-paused.js +++ b/test/simple/test-stream-end-paused.js @@ -25,7 +25,7 @@ var gotEnd = false; // Make sure we don't miss the end event for paused 0-length streams -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var stream = new Readable(); var calledRead = false; stream._read = function() { diff --git a/test/simple/test-stream-pipe-after-end.js b/test/simple/test-stream-pipe-after-end.js index b46ee90..0be8366 100644 --- a/test/simple/test-stream-pipe-after-end.js +++ b/test/simple/test-stream-pipe-after-end.js @@ -22,8 +22,8 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var util = require('util'); util.inherits(TestReadable, Readable); diff --git a/test/simple/test-stream-pipe-cleanup.js b/test/simple/test-stream-pipe-cleanup.js deleted file mode 100644 index f689358..0000000 --- a/test/simple/test-stream-pipe-cleanup.js +++ /dev/null @@ -1,122 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. - -// This test asserts that Stream.prototype.pipe does not leave listeners -// hanging on the source or dest. - -var common = require('../common'); -var stream = require('stream'); -var assert = require('assert'); -var util = require('util'); - -function Writable() { - this.writable = true; - this.endCalls = 0; - stream.Stream.call(this); -} -util.inherits(Writable, stream.Stream); -Writable.prototype.end = function() { - this.endCalls++; -}; - -Writable.prototype.destroy = function() { - this.endCalls++; -}; - -function Readable() { - this.readable = true; - stream.Stream.call(this); -} -util.inherits(Readable, stream.Stream); - -function Duplex() { - this.readable = true; - Writable.call(this); -} -util.inherits(Duplex, Writable); - -var i = 0; -var limit = 100; - -var w = new Writable(); - -var r; - -for (i = 0; i < limit; i++) { - r = new Readable(); - r.pipe(w); - r.emit('end'); -} -assert.equal(0, r.listeners('end').length); -assert.equal(limit, w.endCalls); - -w.endCalls = 0; - -for (i = 0; i < limit; i++) { - r = new Readable(); - r.pipe(w); - r.emit('close'); -} -assert.equal(0, r.listeners('close').length); -assert.equal(limit, w.endCalls); - -w.endCalls = 0; - -r = new Readable(); - -for (i = 0; i < limit; i++) { - w = new Writable(); - r.pipe(w); - w.emit('close'); -} -assert.equal(0, w.listeners('close').length); - -r = new Readable(); -w = new Writable(); -var d = new Duplex(); -r.pipe(d); // pipeline A -d.pipe(w); // pipeline B -assert.equal(r.listeners('end').length, 2); // A.onend, A.cleanup -assert.equal(r.listeners('close').length, 2); // A.onclose, A.cleanup -assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup -assert.equal(d.listeners('close').length, 3); // A.cleanup, B.onclose, B.cleanup -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 1); // B.cleanup - -r.emit('end'); -assert.equal(d.endCalls, 1); -assert.equal(w.endCalls, 0); -assert.equal(r.listeners('end').length, 0); -assert.equal(r.listeners('close').length, 0); -assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup -assert.equal(d.listeners('close').length, 2); // B.onclose, B.cleanup -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 1); // B.cleanup - -d.emit('end'); -assert.equal(d.endCalls, 1); -assert.equal(w.endCalls, 1); -assert.equal(r.listeners('end').length, 0); -assert.equal(r.listeners('close').length, 0); -assert.equal(d.listeners('end').length, 0); -assert.equal(d.listeners('close').length, 0); -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 0); diff --git a/test/simple/test-stream-pipe-error-handling.js b/test/simple/test-stream-pipe-error-handling.js index c5d724b..c7d6b7d 100644 --- a/test/simple/test-stream-pipe-error-handling.js +++ b/test/simple/test-stream-pipe-error-handling.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var Stream = require('stream').Stream; +var Stream = require('../../').Stream; (function testErrorListenerCatches() { var source = new Stream(); diff --git a/test/simple/test-stream-pipe-event.js b/test/simple/test-stream-pipe-event.js index cb9d5fe..56f8d61 100644 --- a/test/simple/test-stream-pipe-event.js +++ 
b/test/simple/test-stream-pipe-event.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common'); -var stream = require('stream'); +var stream = require('../../'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream-push-order.js b/test/simple/test-stream-push-order.js index f2e6ec2..a5c9bf9 100644 --- a/test/simple/test-stream-push-order.js +++ b/test/simple/test-stream-push-order.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var assert = require('assert'); var s = new Readable({ diff --git a/test/simple/test-stream-push-strings.js b/test/simple/test-stream-push-strings.js index 06f43dc..1701a9a 100644 --- a/test/simple/test-stream-push-strings.js +++ b/test/simple/test-stream-push-strings.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var util = require('util'); util.inherits(MyStream, Readable); diff --git a/test/simple/test-stream-readable-event.js b/test/simple/test-stream-readable-event.js index ba6a577..a8e6f7b 100644 --- a/test/simple/test-stream-readable-event.js +++ b/test/simple/test-stream-readable-event.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; (function first() { // First test, not reading when the readable is added. diff --git a/test/simple/test-stream-readable-flow-recursion.js b/test/simple/test-stream-readable-flow-recursion.js index 2891ad6..11689ba 100644 --- a/test/simple/test-stream-readable-flow-recursion.js +++ b/test/simple/test-stream-readable-flow-recursion.js @@ -27,7 +27,7 @@ var assert = require('assert'); // more data continuously, but without triggering a nextTick // warning or RangeError. -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; // throw an error if we trigger a nextTick warning. process.throwDeprecation = true; diff --git a/test/simple/test-stream-unshift-empty-chunk.js b/test/simple/test-stream-unshift-empty-chunk.js index 0c96476..7827538 100644 --- a/test/simple/test-stream-unshift-empty-chunk.js +++ b/test/simple/test-stream-unshift-empty-chunk.js @@ -24,7 +24,7 @@ var assert = require('assert'); // This test verifies that stream.unshift(Buffer(0)) or // stream.unshift('') does not set state.reading=false. -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable(); var nChunks = 10; diff --git a/test/simple/test-stream-unshift-read-race.js b/test/simple/test-stream-unshift-read-race.js index 83fd9fa..17c18aa 100644 --- a/test/simple/test-stream-unshift-read-race.js +++ b/test/simple/test-stream-unshift-read-race.js @@ -29,7 +29,7 @@ var assert = require('assert'); // 3. push() after the EOF signaling null is an error. // 4. _read() is not called after pushing the EOF null chunk. -var stream = require('stream'); +var stream = require('../../'); var hwm = 10; var r = stream.Readable({ highWaterMark: hwm }); var chunks = 10; @@ -51,7 +51,14 @@ r._read = function(n) { function push(fast) { assert(!pushedNull, 'push() after null push'); - var c = pos >= data.length ? 
null : data.slice(pos, pos + n); + var c; + if (pos >= data.length) + c = null; + else { + if (n + pos > data.length) + n = data.length - pos; + c = data.slice(pos, pos + n); + } pushedNull = c === null; if (fast) { pos += n; diff --git a/test/simple/test-stream-writev.js b/test/simple/test-stream-writev.js index 5b49e6e..b5321f3 100644 --- a/test/simple/test-stream-writev.js +++ b/test/simple/test-stream-writev.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var queue = []; for (var decode = 0; decode < 2; decode++) { diff --git a/test/simple/test-stream2-basic.js b/test/simple/test-stream2-basic.js index 3814bf0..248c1be 100644 --- a/test/simple/test-stream2-basic.js +++ b/test/simple/test-stream2-basic.js @@ -21,7 +21,7 @@ var common = require('../common.js'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream2-compatibility.js b/test/simple/test-stream2-compatibility.js index 6cdd4e9..f0fa84b 100644 --- a/test/simple/test-stream2-compatibility.js +++ b/test/simple/test-stream2-compatibility.js @@ -21,7 +21,7 @@ var common = require('../common.js'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream2-finish-pipe.js b/test/simple/test-stream2-finish-pipe.js index 39b274f..006a19b 100644 --- a/test/simple/test-stream2-finish-pipe.js +++ b/test/simple/test-stream2-finish-pipe.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var stream = require('stream'); +var stream = require('../../'); var Buffer = require('buffer').Buffer; var r = new stream.Readable(); diff --git a/test/simple/test-stream2-fs.js b/test/simple/test-stream2-fs.js deleted file mode 100644 index e162406..0000000 --- a/test/simple/test-stream2-fs.js +++ /dev/null @@ -1,72 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. 
- - -var common = require('../common.js'); -var R = require('_stream_readable'); -var assert = require('assert'); - -var fs = require('fs'); -var FSReadable = fs.ReadStream; - -var path = require('path'); -var file = path.resolve(common.fixturesDir, 'x1024.txt'); - -var size = fs.statSync(file).size; - -var expectLengths = [1024]; - -var util = require('util'); -var Stream = require('stream'); - -util.inherits(TestWriter, Stream); - -function TestWriter() { - Stream.apply(this); - this.buffer = []; - this.length = 0; -} - -TestWriter.prototype.write = function(c) { - this.buffer.push(c.toString()); - this.length += c.length; - return true; -}; - -TestWriter.prototype.end = function(c) { - if (c) this.buffer.push(c.toString()); - this.emit('results', this.buffer); -} - -var r = new FSReadable(file); -var w = new TestWriter(); - -w.on('results', function(res) { - console.error(res, w.length); - assert.equal(w.length, size); - var l = 0; - assert.deepEqual(res.map(function (c) { - return c.length; - }), expectLengths); - console.log('ok'); -}); - -r.pipe(w); diff --git a/test/simple/test-stream2-httpclient-response-end.js b/test/simple/test-stream2-httpclient-response-end.js deleted file mode 100644 index 15cffc2..0000000 --- a/test/simple/test-stream2-httpclient-response-end.js +++ /dev/null @@ -1,52 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. 
- -var common = require('../common.js'); -var assert = require('assert'); -var http = require('http'); -var msg = 'Hello'; -var readable_event = false; -var end_event = false; -var server = http.createServer(function(req, res) { - res.writeHead(200, {'Content-Type': 'text/plain'}); - res.end(msg); -}).listen(common.PORT, function() { - http.get({port: common.PORT}, function(res) { - var data = ''; - res.on('readable', function() { - console.log('readable event'); - readable_event = true; - data += res.read(); - }); - res.on('end', function() { - console.log('end event'); - end_event = true; - assert.strictEqual(msg, data); - server.close(); - }); - }); -}); - -process.on('exit', function() { - assert(readable_event); - assert(end_event); -}); - diff --git a/test/simple/test-stream2-large-read-stall.js b/test/simple/test-stream2-large-read-stall.js index 2fbfbca..667985b 100644 --- a/test/simple/test-stream2-large-read-stall.js +++ b/test/simple/test-stream2-large-read-stall.js @@ -30,7 +30,7 @@ var PUSHSIZE = 20; var PUSHCOUNT = 1000; var HWM = 50; -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable({ highWaterMark: HWM }); @@ -39,23 +39,23 @@ var rs = r._readableState; r._read = push; r.on('readable', function() { - console.error('>> readable'); + //console.error('>> readable'); do { - console.error(' > read(%d)', READSIZE); + //console.error(' > read(%d)', READSIZE); var ret = r.read(READSIZE); - console.error(' < %j (%d remain)', ret && ret.length, rs.length); + //console.error(' < %j (%d remain)', ret && ret.length, rs.length); } while (ret && ret.length === READSIZE); - console.error('<< after read()', - ret && ret.length, - rs.needReadable, - rs.length); + //console.error('<< after read()', + // ret && ret.length, + // rs.needReadable, + // rs.length); }); var endEmitted = false; r.on('end', function() { endEmitted = true; - console.error('end'); + //console.error('end'); }); var pushes = 0; @@ -64,11 +64,11 @@ function push() { return; if (pushes++ === PUSHCOUNT) { - console.error(' push(EOF)'); + //console.error(' push(EOF)'); return r.push(null); } - console.error(' push #%d', pushes); + //console.error(' push #%d', pushes); if (r.push(new Buffer(PUSHSIZE))) setTimeout(push); } diff --git a/test/simple/test-stream2-objects.js b/test/simple/test-stream2-objects.js index 3e6931d..ff47d89 100644 --- a/test/simple/test-stream2-objects.js +++ b/test/simple/test-stream2-objects.js @@ -21,8 +21,8 @@ var common = require('../common.js'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var assert = require('assert'); // tiny node-tap lookalike. 
diff --git a/test/simple/test-stream2-pipe-error-handling.js b/test/simple/test-stream2-pipe-error-handling.js index cf7531c..e3f3e4e 100644 --- a/test/simple/test-stream2-pipe-error-handling.js +++ b/test/simple/test-stream2-pipe-error-handling.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); (function testErrorListenerCatches() { var count = 1000; diff --git a/test/simple/test-stream2-pipe-error-once-listener.js b/test/simple/test-stream2-pipe-error-once-listener.js index 5e8e3cb..53b2616 100755 --- a/test/simple/test-stream2-pipe-error-once-listener.js +++ b/test/simple/test-stream2-pipe-error-once-listener.js @@ -24,7 +24,7 @@ var common = require('../common.js'); var assert = require('assert'); var util = require('util'); -var stream = require('stream'); +var stream = require('../../'); var Read = function() { diff --git a/test/simple/test-stream2-push.js b/test/simple/test-stream2-push.js index b63edc3..eb2b0e9 100644 --- a/test/simple/test-stream2-push.js +++ b/test/simple/test-stream2-push.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var stream = require('stream'); +var stream = require('../../'); var Readable = stream.Readable; var Writable = stream.Writable; var assert = require('assert'); diff --git a/test/simple/test-stream2-read-sync-stack.js b/test/simple/test-stream2-read-sync-stack.js index e8a7305..9740a47 100644 --- a/test/simple/test-stream2-read-sync-stack.js +++ b/test/simple/test-stream2-read-sync-stack.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable(); var N = 256 * 1024; diff --git a/test/simple/test-stream2-readable-empty-buffer-no-eof.js b/test/simple/test-stream2-readable-empty-buffer-no-eof.js index cd30178..4b1659d 100644 --- a/test/simple/test-stream2-readable-empty-buffer-no-eof.js +++ b/test/simple/test-stream2-readable-empty-buffer-no-eof.js @@ -22,10 +22,9 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; test1(); -test2(); function test1() { var r = new Readable(); @@ -88,31 +87,3 @@ function test1() { console.log('ok'); }); } - -function test2() { - var r = new Readable({ encoding: 'base64' }); - var reads = 5; - r._read = function(n) { - if (!reads--) - return r.push(null); // EOF - else - return r.push(new Buffer('x')); - }; - - var results = []; - function flow() { - var chunk; - while (null !== (chunk = r.read())) - results.push(chunk + ''); - } - r.on('readable', flow); - r.on('end', function() { - results.push('EOF'); - }); - flow(); - - process.on('exit', function() { - assert.deepEqual(results, [ 'eHh4', 'eHg=', 'EOF' ]); - console.log('ok'); - }); -} diff --git a/test/simple/test-stream2-readable-from-list.js b/test/simple/test-stream2-readable-from-list.js index 7c96ffe..04a96f5 100644 --- a/test/simple/test-stream2-readable-from-list.js +++ b/test/simple/test-stream2-readable-from-list.js @@ -21,7 +21,7 @@ var assert = require('assert'); var common = require('../common.js'); -var fromList = require('_stream_readable')._fromList; +var fromList = require('../../lib/_stream_readable')._fromList; // tiny node-tap lookalike. 
var tests = []; diff --git a/test/simple/test-stream2-readable-legacy-drain.js b/test/simple/test-stream2-readable-legacy-drain.js index 675da8e..51fd3d5 100644 --- a/test/simple/test-stream2-readable-legacy-drain.js +++ b/test/simple/test-stream2-readable-legacy-drain.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Stream = require('stream'); +var Stream = require('../../'); var Readable = Stream.Readable; var r = new Readable(); diff --git a/test/simple/test-stream2-readable-non-empty-end.js b/test/simple/test-stream2-readable-non-empty-end.js index 7314ae7..c971898 100644 --- a/test/simple/test-stream2-readable-non-empty-end.js +++ b/test/simple/test-stream2-readable-non-empty-end.js @@ -21,7 +21,7 @@ var assert = require('assert'); var common = require('../common.js'); -var Readable = require('_stream_readable'); +var Readable = require('../../lib/_stream_readable'); var len = 0; var chunks = new Array(10); diff --git a/test/simple/test-stream2-readable-wrap-empty.js b/test/simple/test-stream2-readable-wrap-empty.js index 2e5cf25..fd8a3dc 100644 --- a/test/simple/test-stream2-readable-wrap-empty.js +++ b/test/simple/test-stream2-readable-wrap-empty.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); +var Readable = require('../../lib/_stream_readable'); var EE = require('events').EventEmitter; var oldStream = new EE(); diff --git a/test/simple/test-stream2-readable-wrap.js b/test/simple/test-stream2-readable-wrap.js index 90eea01..6b177f7 100644 --- a/test/simple/test-stream2-readable-wrap.js +++ b/test/simple/test-stream2-readable-wrap.js @@ -22,8 +22,8 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var EE = require('events').EventEmitter; var testRuns = 0, completedRuns = 0; diff --git a/test/simple/test-stream2-set-encoding.js b/test/simple/test-stream2-set-encoding.js index 5d2c32a..685531b 100644 --- a/test/simple/test-stream2-set-encoding.js +++ b/test/simple/test-stream2-set-encoding.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var util = require('util'); // tiny node-tap lookalike. diff --git a/test/simple/test-stream2-transform.js b/test/simple/test-stream2-transform.js index 9c9ddd8..a0cacc6 100644 --- a/test/simple/test-stream2-transform.js +++ b/test/simple/test-stream2-transform.js @@ -21,8 +21,8 @@ var assert = require('assert'); var common = require('../common.js'); -var PassThrough = require('_stream_passthrough'); -var Transform = require('_stream_transform'); +var PassThrough = require('../../').PassThrough; +var Transform = require('../../').Transform; // tiny node-tap lookalike. 
var tests = []; diff --git a/test/simple/test-stream2-unpipe-drain.js b/test/simple/test-stream2-unpipe-drain.js index d66dc3c..365b327 100644 --- a/test/simple/test-stream2-unpipe-drain.js +++ b/test/simple/test-stream2-unpipe-drain.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var crypto = require('crypto'); var util = require('util'); diff --git a/test/simple/test-stream2-unpipe-leak.js b/test/simple/test-stream2-unpipe-leak.js index 99f8746..17c92ae 100644 --- a/test/simple/test-stream2-unpipe-leak.js +++ b/test/simple/test-stream2-unpipe-leak.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var chunk = new Buffer('hallo'); diff --git a/test/simple/test-stream2-writable.js b/test/simple/test-stream2-writable.js index 704100c..209c3a6 100644 --- a/test/simple/test-stream2-writable.js +++ b/test/simple/test-stream2-writable.js @@ -20,8 +20,8 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var W = require('_stream_writable'); -var D = require('_stream_duplex'); +var W = require('../../').Writable; +var D = require('../../').Duplex; var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream3-pause-then-read.js b/test/simple/test-stream3-pause-then-read.js index b91bde3..2f72c15 100644 --- a/test/simple/test-stream3-pause-then-read.js +++ b/test/simple/test-stream3-pause-then-read.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var Readable = stream.Readable; var Writable = stream.Writable; ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000././@LongLink0000000000000000000000000000020100000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000327212631326456032066 0ustar 00000000000000{ "name": "readable-stream", "version": "1.1.13", "description": "Streams3, a user-land copy of the stream library from Node.js v0.11.x", "main": "readable.js", "dependencies": { "core-util-is": "~1.0.0", "isarray": "0.0.1", "string_decoder": "~0.10.x", "inherits": "~2.0.1" }, "devDependencies": { "tap": "~0.2.6" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/readable-stream.git" }, "keywords": [ "readable", "stream", "pipe" ], "browser": { "util": false }, "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "gitHead": "3b672fd7ae92acf5b4ffdbabf74b372a0a56b051", "bugs": { "url": "https://github.com/isaacs/readable-stream/issues" }, "homepage": "https://github.com/isaacs/readable-stream", "_id": "readable-stream@1.1.13", "_shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e", "_from": "readable-stream@>=1.1.13 <2.0.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e", "tarball": "http://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/passthrough.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000007112631326456032060 0ustar 00000000000000module.exports = require("./lib/_stream_passthrough.js") ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/readable.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000055112631326456032063 0ustar 00000000000000exports = module.exports = require('./lib/_stream_readable.js'); exports.Stream = require('stream'); exports.Readable = exports; exports.Writable = require('./lib/_stream_writable.js'); exports.Duplex = require('./lib/_stream_duplex.js'); exports.Transform = require('./lib/_stream_transform.js'); exports.PassThrough = require('./lib/_stream_passthrough.js'); ././@LongLink0000000000000000000000000000020100000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/transform.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000006712631326456032065 0ustar 00000000000000module.exports = require("./lib/_stream_transform.js") ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/writable.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000006612631326456032064 0ustar 00000000000000module.exports = require("./lib/_stream_writable.js") ././@LongLink0000000000000000000000000000021200000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_duplex.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000537312631326456032072 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. 
// // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // a duplex stream is just a stream that is both readable and writable. // Since JS doesn't have multiple prototypal inheritance, this class // prototypally inherits from Readable, and then parasitically from // Writable. module.exports = Duplex; /**/ var objectKeys = Object.keys || function (obj) { var keys = []; for (var key in obj) keys.push(key); return keys; } /**/ /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var Readable = require('./_stream_readable'); var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); forEach(objectKeys(Writable.prototype), function(method) { if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method]; }); function Duplex(options) { if (!(this instanceof Duplex)) return new Duplex(options); Readable.call(this, options); Writable.call(this, options); if (options && options.readable === false) this.readable = false; if (options && options.writable === false) this.writable = false; this.allowHalfOpen = true; if (options && options.allowHalfOpen === false) this.allowHalfOpen = false; this.once('end', onend); } // the no-half-open enforcer function onend() { // if we allow half-open state, or if the writable side ended, // then we're ok. if (this.allowHalfOpen || this._writableState.ended) return; // no more data can be written. // But allow more writes to happen in this tick. process.nextTick(this.end.bind(this)); } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } ././@LongLink0000000000000000000000000000021700000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_passthrough.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000327712631326456032073 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. 
// // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // a passthrough stream. // basically just the most minimal sort of Transform stream. // Every written chunk gets output as-is. module.exports = PassThrough; var Transform = require('./_stream_transform'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(PassThrough, Transform); function PassThrough(options) { if (!(this instanceof PassThrough)) return new PassThrough(options); Transform.call(this, options); } PassThrough.prototype._transform = function(chunk, encoding, cb) { cb(null, chunk); }; ././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_readable.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000006254712631326456032100 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. 
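// Usage sketch (illustrative; not part of this module): consumers
// subclass Readable, implement _read(n), and hand data to the internal
// buffer with push(chunk), signalling EOF with push(null). The
// CounterStream name and the require path below are hypothetical.
//
//   var Readable = require('./_stream_readable');
//   var util = require('util');
//
//   util.inherits(CounterStream, Readable);
//   function CounterStream(opts) {
//     Readable.call(this, opts);
//     this._n = 0;
//   }
//   CounterStream.prototype._read = function () {
//     this.push(this._n < 3 ? String(this._n++) : null);
//   };
//
//   new CounterStream().on('data', function (c) {
//     console.log(c.toString()); // logs '0', '1', '2'
//   });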
module.exports = Readable; /**/ var isArray = require('isarray'); /**/ /**/ var Buffer = require('buffer').Buffer; /**/ Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; /**/ if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { return emitter.listeners(type).length; }; /**/ var Stream = require('stream'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var StringDecoder; /**/ var debug = require('util'); if (debug && debug.debuglog) { debug = debug.debuglog('stream'); } else { debug = function () {}; } /**/ util.inherits(Readable, Stream); function ReadableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // the point at which it stops calling _read() to fill the buffer // Note: 0 is a valid value, means "don't call _read preemptively ever" var hwm = options.highWaterMark; var defaultHwm = options.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.buffer = []; this.length = 0; this.pipes = null; this.pipesCount = 0; this.flowing = null; this.ended = false; this.endEmitted = false; this.reading = false; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // whenever we return null, then we set a flag to say // that we're awaiting a 'readable' event emission. this.needReadable = false; this.emittedReadable = false; this.readableListening = false; // object stream flag. Used to make read(n) ignore n and to // make all the buffer merging and length checks go away this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.readableObjectMode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // when piping, we only care about 'readable' events that happen // after read()ing all the bytes and not getting any pushback. this.ranOut = false; // the number of writers that are awaiting a drain event in .pipe()s this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled this.readingMore = false; this.decoder = null; this.encoding = null; if (options.encoding) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this.decoder = new StringDecoder(options.encoding); this.encoding = options.encoding; } } function Readable(options) { var Duplex = require('./_stream_duplex'); if (!(this instanceof Readable)) return new Readable(options); this._readableState = new ReadableState(options, this); // legacy this.readable = true; Stream.call(this); } // Manually shove something into the read() buffer. // This returns true if the highWaterMark has not been hit yet, // similar to how Writable.write() returns true if you should // write() some more. 
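// For example (an illustrative pattern, not code from this module): a
// producer wrapping some hypothetical this._source can treat the return
// value as backpressure, stopping on false and waiting for the next
// _read() call before producing again:
//
//   if (!this.push(chunk)) this._source.pause(); // _source is hypothetical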
Readable.prototype.push = function(chunk, encoding) { var state = this._readableState; if (util.isString(chunk) && !state.objectMode) { encoding = encoding || state.defaultEncoding; if (encoding !== state.encoding) { chunk = new Buffer(chunk, encoding); encoding = ''; } } return readableAddChunk(this, state, chunk, encoding, false); }; // Unshift should *always* be something directly out of read() Readable.prototype.unshift = function(chunk) { var state = this._readableState; return readableAddChunk(this, state, chunk, '', true); }; function readableAddChunk(stream, state, chunk, encoding, addToFront) { var er = chunkInvalid(state, chunk); if (er) { stream.emit('error', er); } else if (util.isNullOrUndefined(chunk)) { state.reading = false; if (!state.ended) onEofChunk(stream, state); } else if (state.objectMode || chunk && chunk.length > 0) { if (state.ended && !addToFront) { var e = new Error('stream.push() after EOF'); stream.emit('error', e); } else if (state.endEmitted && addToFront) { var e = new Error('stream.unshift() after end event'); stream.emit('error', e); } else { if (state.decoder && !addToFront && !encoding) chunk = state.decoder.write(chunk); if (!addToFront) state.reading = false; // if we want the data now, just emit it. if (state.flowing && state.length === 0 && !state.sync) { stream.emit('data', chunk); stream.read(0); } else { // update the buffer info. state.length += state.objectMode ? 1 : chunk.length; if (addToFront) state.buffer.unshift(chunk); else state.buffer.push(chunk); if (state.needReadable) emitReadable(stream); } maybeReadMore(stream, state); } } else if (!addToFront) { state.reading = false; } return needMoreData(state); } // if it's past the high water mark, we can push in some more. // Also, if we have no data yet, we can stand some // more bytes. This is to work around cases where hwm=0, // such as the repl. Also, if the push() triggered a // readable event, and the user called read(largeNumber) such that // needReadable was set, then we ought to push more, so that another // 'readable' event will be triggered. function needMoreData(state) { return !state.ended && (state.needReadable || state.length < state.highWaterMark || state.length === 0); } // backwards compatibility. Readable.prototype.setEncoding = function(enc) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this._readableState.decoder = new StringDecoder(enc); this._readableState.encoding = enc; return this; }; // Don't raise the hwm > 128MB var MAX_HWM = 0x800000; function roundUpToNextPowerOf2(n) { if (n >= MAX_HWM) { n = MAX_HWM; } else { // Get the next highest power of 2 n--; for (var p = 1; p < 32; p <<= 1) n |= n >> p; n++; } return n; } function howMuchToRead(n, state) { if (state.length === 0 && state.ended) return 0; if (state.objectMode) return n === 0 ? 0 : 1; if (isNaN(n) || util.isNull(n)) { // only flow one buffer at a time if (state.flowing && state.buffer.length) return state.buffer[0].length; else return state.length; } if (n <= 0) return 0; // If we're asking for more than the target buffer level, // then raise the water mark. Bump up to the next highest // power of 2, to prevent increasing it excessively in tiny // amounts. if (n > state.highWaterMark) state.highWaterMark = roundUpToNextPowerOf2(n); // don't have that much. return null, unless we've ended. 
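// (For example: read(50) against a 20-byte buffer lands here, returns 0,
// and read() hands back null after flagging needReadable; once the stream
// has ended, the same call returns the remaining 20 bytes instead.)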
if (n > state.length) { if (!state.ended) { state.needReadable = true; return 0; } else return state.length; } return n; } // you can override either this method, or the async _read(n) below. Readable.prototype.read = function(n) { debug('read', n); var state = this._readableState; var nOrig = n; if (!util.isNumber(n) || n > 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we // already have a bunch of data in the buffer, then just trigger // the 'readable' event and move on. if (n === 0 && state.needReadable && (state.length >= state.highWaterMark || state.ended)) { debug('read: emitReadable', state.length, state.ended); if (state.length === 0 && state.ended) endReadable(this); else emitReadable(this); return null; } n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up. if (n === 0 && state.ended) { if (state.length === 0) endReadable(this); return null; } // All the actual chunk generation logic needs to be // *below* the call to _read. The reason is that in certain // synthetic stream cases, such as passthrough streams, _read // may be a completely synchronous operation which may change // the state of the read buffer, providing enough data when // before there was *not* enough. // // So, the steps are: // 1. Figure out what the state of things will be after we do // a read from the buffer. // // 2. If that resulting state will trigger a _read, then call _read. // Note that this may be asynchronous, or synchronous. Yes, it is // deeply ugly to write APIs this way, but that still doesn't mean // that the Readable class should behave improperly, as streams are // designed to be sync/async agnostic. // Take note if the _read call is sync or async (ie, if the read call // has returned yet), so that we know whether or not it's safe to emit // 'readable' etc. // // 3. Actually pull the requested chunks out of the buffer and return. // if we need a readable event, then we need to do some reading. var doRead = state.needReadable; debug('need readable', doRead); // if we currently have less than the highWaterMark, then also read some if (state.length === 0 || state.length - n < state.highWaterMark) { doRead = true; debug('length less than watermark', doRead); } // however, if we've ended, then there's no point, and if we're already // reading, then it's unnecessary. if (state.ended || state.reading) { doRead = false; debug('reading or ended', doRead); } if (doRead) { debug('do read'); state.reading = true; state.sync = true; // if the length is currently zero, then we *need* a readable event. if (state.length === 0) state.needReadable = true; // call internal read method this._read(state.highWaterMark); state.sync = false; } // If _read pushed data synchronously, then `reading` will be false, // and we need to re-evaluate how much data we can return to the user. if (doRead && !state.reading) n = howMuchToRead(nOrig, state); var ret; if (n > 0) ret = fromList(n, state); else ret = null; if (util.isNull(ret)) { state.needReadable = true; n = 0; } state.length -= n; // If we have nothing in the buffer, then we want to know // as soon as we *do* get something into the buffer. if (state.length === 0 && !state.ended) state.needReadable = true; // If we tried to read() past the EOF, then emit end on the next tick. 
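// (nOrig !== n means howMuchToRead() truncated the request, i.e. the
// caller asked for more than remained; with the stream ended and the
// buffer now empty, endReadable() schedules the 'end' event.)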
if (nOrig !== n && state.ended && state.length === 0) endReadable(this); if (!util.isNull(ret)) this.emit('data', ret); return ret; }; function chunkInvalid(state, chunk) { var er = null; if (!util.isBuffer(chunk) && !util.isString(chunk) && !util.isNullOrUndefined(chunk) && !state.objectMode) { er = new TypeError('Invalid non-string/buffer chunk'); } return er; } function onEofChunk(stream, state) { if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); state.length += state.objectMode ? 1 : chunk.length; } } state.ended = true; // emit 'readable' now to make sure it gets picked up. emitReadable(stream); } // Don't emit readable right away in sync mode, because this can trigger // another read() call => stack overflow. This way, it might trigger // a nextTick recursion warning, but that's not so bad. function emitReadable(stream) { var state = stream._readableState; state.needReadable = false; if (!state.emittedReadable) { debug('emitReadable', state.flowing); state.emittedReadable = true; if (state.sync) process.nextTick(function() { emitReadable_(stream); }); else emitReadable_(stream); } } function emitReadable_(stream) { debug('emit readable'); stream.emit('readable'); flow(stream); } // at this point, the user has presumably seen the 'readable' event, // and called read() to consume some data. that may have triggered // in turn another _read(n) call, in which case reading = true if // it's in progress. // However, if we're not ended, or reading, and the length < hwm, // then go ahead and try to read some more preemptively. function maybeReadMore(stream, state) { if (!state.readingMore) { state.readingMore = true; process.nextTick(function() { maybeReadMore_(stream, state); }); } } function maybeReadMore_(stream, state) { var len = state.length; while (!state.reading && !state.flowing && !state.ended && state.length < state.highWaterMark) { debug('maybeReadMore read 0'); stream.read(0); if (len === state.length) // didn't get any data, stop spinning. break; else len = state.length; } state.readingMore = false; } // abstract method. to be overridden in specific implementation classes. // call cb(er, data) where data is <= n in length. // for virtual (non-string, non-buffer) streams, "length" is somewhat // arbitrary, and perhaps not very meaningful. Readable.prototype._read = function(n) { this.emit('error', new Error('not implemented')); }; Readable.prototype.pipe = function(dest, pipeOpts) { var src = this; var state = this._readableState; switch (state.pipesCount) { case 0: state.pipes = dest; break; case 1: state.pipes = [state.pipes, dest]; break; default: state.pipes.push(dest); break; } state.pipesCount += 1; debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts); var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr; var endFn = doEnd ? onend : cleanup; if (state.endEmitted) process.nextTick(endFn); else src.once('end', endFn); dest.on('unpipe', onunpipe); function onunpipe(readable) { debug('onunpipe'); if (readable === src) { cleanup(); } } function onend() { debug('onend'); dest.end(); } // when the dest drains, it reduces the awaitDrain counter // on the source. This would be more elegant with a .once() // handler in flow(), but adding and removing repeatedly is // too slow. 
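// (Concretely: every dest.write() that returns false in ondata below
// bumps awaitDrain and pauses the source; each 'drain' event decrements
// it, and pipeOnDrain() restarts the flow only when the count hits zero.)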
var ondrain = pipeOnDrain(src); dest.on('drain', ondrain); function cleanup() { debug('cleanup'); // cleanup event handlers once the pipe is broken dest.removeListener('close', onclose); dest.removeListener('finish', onfinish); dest.removeListener('drain', ondrain); dest.removeListener('error', onerror); dest.removeListener('unpipe', onunpipe); src.removeListener('end', onend); src.removeListener('end', cleanup); src.removeListener('data', ondata); // if the reader is waiting for a drain event from this // specific writer, then it would cause it to never start // flowing again. // So, if this is awaiting a drain, then we just call it now. // If we don't know, then assume that we are waiting for one. if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain(); } src.on('data', ondata); function ondata(chunk) { debug('ondata'); var ret = dest.write(chunk); if (false === ret) { debug('false write response, pause', src._readableState.awaitDrain); src._readableState.awaitDrain++; src.pause(); } } // if the dest has an error, then stop piping into it. // however, don't suppress the throwing behavior for this. function onerror(er) { debug('onerror', er); unpipe(); dest.removeListener('error', onerror); if (EE.listenerCount(dest, 'error') === 0) dest.emit('error', er); } // This is a brutally ugly hack to make sure that our error handler // is attached before any userland ones. NEVER DO THIS. if (!dest._events || !dest._events.error) dest.on('error', onerror); else if (isArray(dest._events.error)) dest._events.error.unshift(onerror); else dest._events.error = [onerror, dest._events.error]; // Both close and finish should trigger unpipe, but only once. function onclose() { dest.removeListener('finish', onfinish); unpipe(); } dest.once('close', onclose); function onfinish() { debug('onfinish'); dest.removeListener('close', onclose); unpipe(); } dest.once('finish', onfinish); function unpipe() { debug('unpipe'); src.unpipe(dest); } // tell the dest that it's being piped to dest.emit('pipe', src); // start the flow if it hasn't been started already. if (!state.flowing) { debug('pipe resume'); src.resume(); } return dest; }; function pipeOnDrain(src) { return function() { var state = src._readableState; debug('pipeOnDrain', state.awaitDrain); if (state.awaitDrain) state.awaitDrain--; if (state.awaitDrain === 0 && EE.listenerCount(src, 'data')) { state.flowing = true; flow(src); } }; } Readable.prototype.unpipe = function(dest) { var state = this._readableState; // if we're not piping anywhere, then do nothing. if (state.pipesCount === 0) return this; // just one destination. most common case. if (state.pipesCount === 1) { // passed in one, but it's not the right one. if (dest && dest !== state.pipes) return this; if (!dest) dest = state.pipes; // got a match. state.pipes = null; state.pipesCount = 0; state.flowing = false; if (dest) dest.emit('unpipe', this); return this; } // slow case. multiple pipe destinations. if (!dest) { // remove all. var dests = state.pipes; var len = state.pipesCount; state.pipes = null; state.pipesCount = 0; state.flowing = false; for (var i = 0; i < len; i++) dests[i].emit('unpipe', this); return this; } // try to find the right one. 
var i = indexOf(state.pipes, dest); if (i === -1) return this; state.pipes.splice(i, 1); state.pipesCount -= 1; if (state.pipesCount === 1) state.pipes = state.pipes[0]; dest.emit('unpipe', this); return this; }; // set up data events if they are asked for // Ensure readable listeners eventually get something Readable.prototype.on = function(ev, fn) { var res = Stream.prototype.on.call(this, ev, fn); // If listening to data, and it has not explicitly been paused, // then call resume to start the flow of data on the next tick. if (ev === 'data' && false !== this._readableState.flowing) { this.resume(); } if (ev === 'readable' && this.readable) { var state = this._readableState; if (!state.readableListening) { state.readableListening = true; state.emittedReadable = false; state.needReadable = true; if (!state.reading) { var self = this; process.nextTick(function() { debug('readable nexttick read 0'); self.read(0); }); } else if (state.length) { emitReadable(this, state); } } } return res; }; Readable.prototype.addListener = Readable.prototype.on; // pause() and resume() are remnants of the legacy readable stream API // If the user uses them, then switch into old mode. Readable.prototype.resume = function() { var state = this._readableState; if (!state.flowing) { debug('resume'); state.flowing = true; if (!state.reading) { debug('resume read 0'); this.read(0); } resume(this, state); } return this; }; function resume(stream, state) { if (!state.resumeScheduled) { state.resumeScheduled = true; process.nextTick(function() { resume_(stream, state); }); } } function resume_(stream, state) { state.resumeScheduled = false; stream.emit('resume'); flow(stream); if (state.flowing && !state.reading) stream.read(0); } Readable.prototype.pause = function() { debug('call pause flowing=%j', this._readableState.flowing); if (false !== this._readableState.flowing) { debug('pause'); this._readableState.flowing = false; this.emit('pause'); } return this; }; function flow(stream) { var state = stream._readableState; debug('flow', state.flowing); if (state.flowing) { do { var chunk = stream.read(); } while (null !== chunk && state.flowing); } } // wrap an old-style stream as the async data source. // This is *not* part of the readable stream interface. // It is an ugly unfortunate mess of history. Readable.prototype.wrap = function(stream) { var state = this._readableState; var paused = false; var self = this; stream.on('end', function() { debug('wrapped end'); if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) self.push(chunk); } self.push(null); }); stream.on('data', function(chunk) { debug('wrapped data'); if (state.decoder) chunk = state.decoder.write(chunk); if (!chunk || !state.objectMode && !chunk.length) return; var ret = self.push(chunk); if (!ret) { paused = true; stream.pause(); } }); // proxy all the other methods. // important when wrapping filters and duplexes. for (var i in stream) { if (util.isFunction(stream[i]) && util.isUndefined(this[i])) { this[i] = function(method) { return function() { return stream[method].apply(stream, arguments); }}(i); } } // proxy certain important events. var events = ['error', 'close', 'destroy', 'pause', 'resume']; forEach(events, function(ev) { stream.on(ev, self.emit.bind(self, ev)); }); // when we try to consume some more bytes, simply unpause the // underlying stream. 
self._read = function(n) { debug('wrapped _read', n); if (paused) { paused = false; stream.resume(); } }; return self; }; // exposed for testing purposes only. Readable._fromList = fromList; // Pluck off n bytes from an array of buffers. // Length is the combined lengths of all the buffers in the list. function fromList(n, state) { var list = state.buffer; var length = state.length; var stringMode = !!state.decoder; var objectMode = !!state.objectMode; var ret; // nothing in the list, definitely empty. if (list.length === 0) return null; if (length === 0) ret = null; else if (objectMode) ret = list.shift(); else if (!n || n >= length) { // read it all, truncate the array. if (stringMode) ret = list.join(''); else ret = Buffer.concat(list, length); list.length = 0; } else { // read just some of it. if (n < list[0].length) { // just take a part of the first list item. // slice is the same for buffers and strings. var buf = list[0]; ret = buf.slice(0, n); list[0] = buf.slice(n); } else if (n === list[0].length) { // first list is a perfect match ret = list.shift(); } else { // complex case. // we have enough to cover it, but it spans past the first buffer. if (stringMode) ret = ''; else ret = new Buffer(n); var c = 0; for (var i = 0, l = list.length; i < l && c < n; i++) { var buf = list[0]; var cpy = Math.min(n - c, buf.length); if (stringMode) ret += buf.slice(0, cpy); else buf.copy(ret, c, 0, cpy); if (cpy < buf.length) list[0] = buf.slice(cpy); else list.shift(); c += cpy; } } } return ret; } function endReadable(stream) { var state = stream._readableState; // If we get here before consuming all the bytes, then that is a // bug in node. Should never happen. if (state.length > 0) throw new Error('endReadable called on non-empty stream'); if (!state.endEmitted) { state.ended = true; process.nextTick(function() { // Check that we didn't get one last unshift. if (!state.endEmitted && state.length === 0) { state.endEmitted = true; stream.readable = false; stream.emit('end'); } }); } } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } function indexOf (xs, x) { for (var i = 0, l = xs.length; i < l; i++) { if (xs[i] === x) return i; } return -1; } ././@LongLink0000000000000000000000000000021500000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_transform.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000001626612631326456032075 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // a transform stream is a readable/writable stream where you do // something with the data. Sometimes it's called a "filter", // but that's not a great name for it, since that implies a thing where // some bits pass through, and others are simply ignored. (That would // be a valid example of a transform, of course.) // // While the output is causally related to the input, it's not a // necessarily symmetric or synchronous transformation. For example, // a zlib stream might take multiple plain-text writes(), and then // emit a single compressed chunk some time in the future. // // Here's how this works: // // The Transform stream has all the aspects of the readable and writable // stream classes. When you write(chunk), that calls _write(chunk,cb) // internally, and returns false if there's a lot of pending writes // buffered up. When you call read(), that calls _read(n) until // there's enough pending readable data buffered up. // // In a transform stream, the written data is placed in a buffer. When // _read(n) is called, it transforms the queued up data, calling the // buffered _write cb's as it consumes chunks. If consuming a single // written chunk would result in multiple output chunks, then the first // outputted bit calls the readcb, and subsequent chunks just go into // the read buffer, and will cause it to emit 'readable' if necessary. // // This way, back-pressure is actually determined by the reading side, // since _read has to be called to start processing a new chunk. However, // a pathological inflate type of transform can cause excessive buffering // here. For example, imagine a stream where every byte of input is // interpreted as an integer from 0-255, and then results in that many // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in // 1kb of data being output. In this case, you could write a very small // amount of input, and end up with a very large amount of output. In // such a pathological inflating mechanism, there'd be no way to tell // the system to stop doing the transform. A single 4MB write could // cause the system to run out of memory. // // However, even in such a pathological case, only a single written chunk // would be consumed, and then the rest would wait (un-transformed) until // the results of the previous transformed chunk were consumed. 
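// (Illustrative sketch, not part of the original source: a minimal
// consumer of this class, using only the public API described above.
// _transform() receives each written chunk, push() feeds the readable
// side, and cb() tells the machinery the written chunk is consumed.
//
//   var Transform = require('stream').Transform;
//   var util = require('util');
//
//   function Upper(options) {
//     if (!(this instanceof Upper)) return new Upper(options);
//     Transform.call(this, options);
//   }
//   util.inherits(Upper, Transform);
//
//   Upper.prototype._transform = function(chunk, encoding, cb) {
//     this.push(chunk.toString().toUpperCase()); // one chunk in, one out
//     cb();                                      // ready for the next write
//   };
//
//   process.stdin.pipe(new Upper()).pipe(process.stdout);
// )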
module.exports = Transform; var Duplex = require('./_stream_duplex'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(Transform, Duplex); function TransformState(options, stream) { this.afterTransform = function(er, data) { return afterTransform(stream, er, data); }; this.needTransform = false; this.transforming = false; this.writecb = null; this.writechunk = null; } function afterTransform(stream, er, data) { var ts = stream._transformState; ts.transforming = false; var cb = ts.writecb; if (!cb) return stream.emit('error', new Error('no writecb in Transform class')); ts.writechunk = null; ts.writecb = null; if (!util.isNullOrUndefined(data)) stream.push(data); if (cb) cb(er); var rs = stream._readableState; rs.reading = false; if (rs.needReadable || rs.length < rs.highWaterMark) { stream._read(rs.highWaterMark); } } function Transform(options) { if (!(this instanceof Transform)) return new Transform(options); Duplex.call(this, options); this._transformState = new TransformState(options, this); // when the writable side finishes, then flush out anything remaining. var stream = this; // start out asking for a readable event once data is transformed. this._readableState.needReadable = true; // we have implemented the _read method, and done the other things // that Readable wants before the first _read call, so unset the // sync guard flag. this._readableState.sync = false; this.once('prefinish', function() { if (util.isFunction(this._flush)) this._flush(function(er) { done(stream, er); }); else done(stream); }); } Transform.prototype.push = function(chunk, encoding) { this._transformState.needTransform = false; return Duplex.prototype.push.call(this, chunk, encoding); }; // This is the part where you do stuff! // override this function in implementation classes. // 'chunk' is an input chunk. // // Call `push(newChunk)` to pass along transformed output // to the readable side. You may call 'push' zero or more times. // // Call `cb(err)` when you are done with this chunk. If you pass // an error, then that'll put the hurt on the whole operation. If you // never call cb(), then you'll never get another chunk. Transform.prototype._transform = function(chunk, encoding, cb) { throw new Error('not implemented'); }; Transform.prototype._write = function(chunk, encoding, cb) { var ts = this._transformState; ts.writecb = cb; ts.writechunk = chunk; ts.writeencoding = encoding; if (!ts.transforming) { var rs = this._readableState; if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark); } }; // Doesn't matter what the args are here. // _transform does all the work. // That we got here means that the readable side wants more data. Transform.prototype._read = function(n) { var ts = this._transformState; if (!util.isNull(ts.writechunk) && ts.writecb && !ts.transforming) { ts.transforming = true; this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform); } else { // mark that we need a transform, so that any data that comes in // will get processed, now that we've asked for it. 
ts.needTransform = true; } }; function done(stream, er) { if (er) return stream.emit('error', er); // if there's nothing in the write buffer, then that means // that nothing more will ever be provided var ws = stream._writableState; var ts = stream._transformState; if (ws.length) throw new Error('calling transform done when ws.length != 0'); if (ts.transforming) throw new Error('calling transform done when still transforming'); return stream.push(null); } ././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_writable.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000003141512631326456032066 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // A bit simpler than readable streams. // Implement an async ._write(chunk, cb), and it'll handle all // the drain event emission and buffering. module.exports = Writable; /**/ var Buffer = require('buffer').Buffer; /**/ Writable.WritableState = WritableState; /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var Stream = require('stream'); util.inherits(Writable, Stream); function WriteReq(chunk, encoding, cb) { this.chunk = chunk; this.encoding = encoding; this.callback = cb; } function WritableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // the point at which write() starts returning false // Note: 0 is a valid value, means that we always return false if // the entire buffer is not flushed immediately on write() var hwm = options.highWaterMark; var defaultHwm = options.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // object stream flag to indicate whether or not this stream // contains buffers or objects. this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.needDrain = false; // at the start of calling end() this.ending = false; // when end() has been called, and returned this.ended = false; // when 'finish' is emitted this.finished = false; // should we decode strings into buffers before passing to _write? 
// this is here so that some node-core streams can optimize string // handling at a lower level. var noDecode = options.decodeStrings === false; this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement // of how much we're waiting to get pushed to some underlying // socket or file. this.length = 0; // a flag to see when we're in the middle of a write. this.writing = false; // when true all writes will be buffered until .uncork() call this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // a flag to know if we're processing previously buffered items, which // may call the _write() callback in the same tick, so that we don't // end up in an overlapped onwrite situation. this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb) this.onwrite = function(er) { onwrite(stream, er); }; // the callback that the user supplies to write(chunk,encoding,cb) this.writecb = null; // the amount that is being written when _write is called. this.writelen = 0; this.buffer = []; // number of pending user-supplied write callbacks // this must be 0 before 'finish' can be emitted this.pendingcb = 0; // emit prefinish if the only thing we're waiting for is _write cbs // This is relevant for synchronous Transform streams this.prefinished = false; // True if the error was already emitted and should not be thrown again this.errorEmitted = false; } function Writable(options) { var Duplex = require('./_stream_duplex'); // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. if (!(this instanceof Writable) && !(this instanceof Duplex)) return new Writable(options); this._writableState = new WritableState(options, this); // legacy. this.writable = true; Stream.call(this); } // Otherwise people can pipe Writable streams, which is just wrong. Writable.prototype.pipe = function() { this.emit('error', new Error('Cannot pipe. Not readable.')); }; function writeAfterEnd(stream, state, cb) { var er = new Error('write after end'); // TODO: defer error events consistently everywhere, not just the cb stream.emit('error', er); process.nextTick(function() { cb(er); }); } // If we get something that is not a buffer, string, null, or undefined, // and we're not in objectMode, then that's an error. // Otherwise stream chunks are all considered to be of length=1, and the // watermarks determine how many objects to keep in the buffer, rather than // how many bytes or characters. 
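// (Illustrative sketch, not part of the original source: with
// objectMode, highWaterMark counts objects rather than bytes. Here the
// second write() already returns false, because the first chunk still
// counts against the mark until its _write callback fires.
//
//   var Writable = require('stream').Writable;
//   var util = require('util');
//
//   function Sink() {
//     Writable.call(this, { objectMode: true, highWaterMark: 2 });
//   }
//   util.inherits(Sink, Writable);
//   Sink.prototype._write = function(obj, encoding, cb) {
//     setImmediate(cb);            // pretend to flush asynchronously
//   };
//
//   var ws = new Sink();
//   ws.write({ n: 1 });            // true:  1 object buffered, 1 < 2
//   ws.write({ n: 2 });            // false: mark reached, wait for 'drain'
// )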
function validChunk(stream, state, chunk, cb) { var valid = true; if (!util.isBuffer(chunk) && !util.isString(chunk) && !util.isNullOrUndefined(chunk) && !state.objectMode) { var er = new TypeError('Invalid non-string/buffer chunk'); stream.emit('error', er); process.nextTick(function() { cb(er); }); valid = false; } return valid; } Writable.prototype.write = function(chunk, encoding, cb) { var state = this._writableState; var ret = false; if (util.isFunction(encoding)) { cb = encoding; encoding = null; } if (util.isBuffer(chunk)) encoding = 'buffer'; else if (!encoding) encoding = state.defaultEncoding; if (!util.isFunction(cb)) cb = function() {}; if (state.ended) writeAfterEnd(this, state, cb); else if (validChunk(this, state, chunk, cb)) { state.pendingcb++; ret = writeOrBuffer(this, state, chunk, encoding, cb); } return ret; }; Writable.prototype.cork = function() { var state = this._writableState; state.corked++; }; Writable.prototype.uncork = function() { var state = this._writableState; if (state.corked) { state.corked--; if (!state.writing && !state.corked && !state.finished && !state.bufferProcessing && state.buffer.length) clearBuffer(this, state); } }; function decodeChunk(state, chunk, encoding) { if (!state.objectMode && state.decodeStrings !== false && util.isString(chunk)) { chunk = new Buffer(chunk, encoding); } return chunk; } // if we're already writing something, then just put this // in the queue, and wait our turn. Otherwise, call _write // If we return false, then we need a drain event, so set that flag. function writeOrBuffer(stream, state, chunk, encoding, cb) { chunk = decodeChunk(state, chunk, encoding); if (util.isBuffer(chunk)) encoding = 'buffer'; var len = state.objectMode ? 1 : chunk.length; state.length += len; var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false. 
if (!ret) state.needDrain = true; if (state.writing || state.corked) state.buffer.push(new WriteReq(chunk, encoding, cb)); else doWrite(stream, state, false, len, chunk, encoding, cb); return ret; } function doWrite(stream, state, writev, len, chunk, encoding, cb) { state.writelen = len; state.writecb = cb; state.writing = true; state.sync = true; if (writev) stream._writev(chunk, state.onwrite); else stream._write(chunk, encoding, state.onwrite); state.sync = false; } function onwriteError(stream, state, sync, er, cb) { if (sync) process.nextTick(function() { state.pendingcb--; cb(er); }); else { state.pendingcb--; cb(er); } stream._writableState.errorEmitted = true; stream.emit('error', er); } function onwriteStateUpdate(state) { state.writing = false; state.writecb = null; state.length -= state.writelen; state.writelen = 0; } function onwrite(stream, er) { var state = stream._writableState; var sync = state.sync; var cb = state.writecb; onwriteStateUpdate(state); if (er) onwriteError(stream, state, sync, er, cb); else { // Check if we're actually ready to finish, but don't emit yet var finished = needFinish(stream, state); if (!finished && !state.corked && !state.bufferProcessing && state.buffer.length) { clearBuffer(stream, state); } if (sync) { process.nextTick(function() { afterWrite(stream, state, finished, cb); }); } else { afterWrite(stream, state, finished, cb); } } } function afterWrite(stream, state, finished, cb) { if (!finished) onwriteDrain(stream, state); state.pendingcb--; cb(); finishMaybe(stream, state); } // Must force callback to be called on nextTick, so that we don't // emit 'drain' before the write() consumer gets the 'false' return // value, and has a chance to attach a 'drain' listener. function onwriteDrain(stream, state) { if (state.length === 0 && state.needDrain) { state.needDrain = false; stream.emit('drain'); } } // if there's something in the buffer waiting, then process it function clearBuffer(stream, state) { state.bufferProcessing = true; if (stream._writev && state.buffer.length > 1) { // Fast case, write everything using _writev() var cbs = []; for (var c = 0; c < state.buffer.length; c++) cbs.push(state.buffer[c].callback); // count the one we are adding, as well. // TODO(isaacs) clean this up state.pendingcb++; doWrite(stream, state, true, state.length, state.buffer, '', function(err) { for (var i = 0; i < cbs.length; i++) { state.pendingcb--; cbs[i](err); } }); // Clear buffer state.buffer = []; } else { // Slow case, write chunks one-by-one for (var c = 0; c < state.buffer.length; c++) { var entry = state.buffer[c]; var chunk = entry.chunk; var encoding = entry.encoding; var cb = entry.callback; var len = state.objectMode ? 1 : chunk.length; doWrite(stream, state, false, len, chunk, encoding, cb); // if we didn't call the onwrite immediately, then // it means that we need to wait until it does. // also, that means that the chunk and cb are currently // being processed, so move the buffer counter past them. 
if (state.writing) { c++; break; } } if (c < state.buffer.length) state.buffer = state.buffer.slice(c); else state.buffer.length = 0; } state.bufferProcessing = false; } Writable.prototype._write = function(chunk, encoding, cb) { cb(new Error('not implemented')); }; Writable.prototype._writev = null; Writable.prototype.end = function(chunk, encoding, cb) { var state = this._writableState; if (util.isFunction(chunk)) { cb = chunk; chunk = null; encoding = null; } else if (util.isFunction(encoding)) { cb = encoding; encoding = null; } if (!util.isNullOrUndefined(chunk)) this.write(chunk, encoding); // .end() fully uncorks if (state.corked) { state.corked = 1; this.uncork(); } // ignore unnecessary end() calls. if (!state.ending && !state.finished) endWritable(this, state, cb); }; function needFinish(stream, state) { return (state.ending && state.length === 0 && !state.finished && !state.writing); } function prefinish(stream, state) { if (!state.prefinished) { state.prefinished = true; stream.emit('prefinish'); } } function finishMaybe(stream, state) { var need = needFinish(stream, state); if (need) { if (state.pendingcb === 0) { prefinish(stream, state); state.finished = true; stream.emit('finish'); } else prefinish(stream, state); } return need; } function endWritable(stream, state, cb) { state.ending = true; finishMaybe(stream, state); if (cb) { if (state.finished) process.nextTick(cb); else stream.once('finish', cb); } state.ended = true; } ././@LongLink0000000000000000000000000000021700000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000././@LongLink0000000000000000000000000000021200000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000././@LongLink0000000000000000000000000000022100000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000206512631326456032065 0ustar 00000000000000Copyright Node.js contributors. All rights reserved. 
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000023000000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000010312631326456032054 0ustar 00000000000000# core-util-is The `util.is*` functions introduced in Node v0.12. ././@LongLink0000000000000000000000000000023200000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/float.patchnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000003762612631326456032100 0ustar 00000000000000diff --git a/lib/util.js b/lib/util.js index a03e874..9074e8e 100644 --- a/lib/util.js +++ b/lib/util.js @@ -19,430 +19,6 @@ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. -var formatRegExp = /%[sdj%]/g; -exports.format = function(f) { - if (!isString(f)) { - var objects = []; - for (var i = 0; i < arguments.length; i++) { - objects.push(inspect(arguments[i])); - } - return objects.join(' '); - } - - var i = 1; - var args = arguments; - var len = args.length; - var str = String(f).replace(formatRegExp, function(x) { - if (x === '%%') return '%'; - if (i >= len) return x; - switch (x) { - case '%s': return String(args[i++]); - case '%d': return Number(args[i++]); - case '%j': - try { - return JSON.stringify(args[i++]); - } catch (_) { - return '[Circular]'; - } - default: - return x; - } - }); - for (var x = args[i]; i < len; x = args[++i]) { - if (isNull(x) || !isObject(x)) { - str += ' ' + x; - } else { - str += ' ' + inspect(x); - } - } - return str; -}; - - -// Mark that a method should not be used. -// Returns a modified function which warns once by default. -// If --no-deprecation is set, then it is a no-op. -exports.deprecate = function(fn, msg) { - // Allow for deprecating things in the process of starting up. 
- if (isUndefined(global.process)) { - return function() { - return exports.deprecate(fn, msg).apply(this, arguments); - }; - } - - if (process.noDeprecation === true) { - return fn; - } - - var warned = false; - function deprecated() { - if (!warned) { - if (process.throwDeprecation) { - throw new Error(msg); - } else if (process.traceDeprecation) { - console.trace(msg); - } else { - console.error(msg); - } - warned = true; - } - return fn.apply(this, arguments); - } - - return deprecated; -}; - - -var debugs = {}; -var debugEnviron; -exports.debuglog = function(set) { - if (isUndefined(debugEnviron)) - debugEnviron = process.env.NODE_DEBUG || ''; - set = set.toUpperCase(); - if (!debugs[set]) { - if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { - var pid = process.pid; - debugs[set] = function() { - var msg = exports.format.apply(exports, arguments); - console.error('%s %d: %s', set, pid, msg); - }; - } else { - debugs[set] = function() {}; - } - } - return debugs[set]; -}; - - -/** - * Echos the value of a value. Trys to print the value out - * in the best way possible given the different types. - * - * @param {Object} obj The object to print out. - * @param {Object} opts Optional options object that alters the output. - */ -/* legacy: obj, showHidden, depth, colors*/ -function inspect(obj, opts) { - // default options - var ctx = { - seen: [], - stylize: stylizeNoColor - }; - // legacy... - if (arguments.length >= 3) ctx.depth = arguments[2]; - if (arguments.length >= 4) ctx.colors = arguments[3]; - if (isBoolean(opts)) { - // legacy... - ctx.showHidden = opts; - } else if (opts) { - // got an "options" object - exports._extend(ctx, opts); - } - // set default options - if (isUndefined(ctx.showHidden)) ctx.showHidden = false; - if (isUndefined(ctx.depth)) ctx.depth = 2; - if (isUndefined(ctx.colors)) ctx.colors = false; - if (isUndefined(ctx.customInspect)) ctx.customInspect = true; - if (ctx.colors) ctx.stylize = stylizeWithColor; - return formatValue(ctx, obj, ctx.depth); -} -exports.inspect = inspect; - - -// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics -inspect.colors = { - 'bold' : [1, 22], - 'italic' : [3, 23], - 'underline' : [4, 24], - 'inverse' : [7, 27], - 'white' : [37, 39], - 'grey' : [90, 39], - 'black' : [30, 39], - 'blue' : [34, 39], - 'cyan' : [36, 39], - 'green' : [32, 39], - 'magenta' : [35, 39], - 'red' : [31, 39], - 'yellow' : [33, 39] -}; - -// Don't use 'blue' not visible on cmd.exe -inspect.styles = { - 'special': 'cyan', - 'number': 'yellow', - 'boolean': 'yellow', - 'undefined': 'grey', - 'null': 'bold', - 'string': 'green', - 'date': 'magenta', - // "name": intentionally not styling - 'regexp': 'red' -}; - - -function stylizeWithColor(str, styleType) { - var style = inspect.styles[styleType]; - - if (style) { - return '\u001b[' + inspect.colors[style][0] + 'm' + str + - '\u001b[' + inspect.colors[style][1] + 'm'; - } else { - return str; - } -} - - -function stylizeNoColor(str, styleType) { - return str; -} - - -function arrayToHash(array) { - var hash = {}; - - array.forEach(function(val, idx) { - hash[val] = true; - }); - - return hash; -} - - -function formatValue(ctx, value, recurseTimes) { - // Provide a hook for user-specified inspect functions. 
- // Check that value is an object with an inspect function on it - if (ctx.customInspect && - value && - isFunction(value.inspect) && - // Filter out the util module, it's inspect function is special - value.inspect !== exports.inspect && - // Also filter out any prototype objects using the circular check. - !(value.constructor && value.constructor.prototype === value)) { - var ret = value.inspect(recurseTimes, ctx); - if (!isString(ret)) { - ret = formatValue(ctx, ret, recurseTimes); - } - return ret; - } - - // Primitive types cannot have properties - var primitive = formatPrimitive(ctx, value); - if (primitive) { - return primitive; - } - - // Look up the keys of the object. - var keys = Object.keys(value); - var visibleKeys = arrayToHash(keys); - - if (ctx.showHidden) { - keys = Object.getOwnPropertyNames(value); - } - - // Some type of object without properties can be shortcutted. - if (keys.length === 0) { - if (isFunction(value)) { - var name = value.name ? ': ' + value.name : ''; - return ctx.stylize('[Function' + name + ']', 'special'); - } - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } - if (isDate(value)) { - return ctx.stylize(Date.prototype.toString.call(value), 'date'); - } - if (isError(value)) { - return formatError(value); - } - } - - var base = '', array = false, braces = ['{', '}']; - - // Make Array say that they are Array - if (isArray(value)) { - array = true; - braces = ['[', ']']; - } - - // Make functions say that they are functions - if (isFunction(value)) { - var n = value.name ? ': ' + value.name : ''; - base = ' [Function' + n + ']'; - } - - // Make RegExps say that they are RegExps - if (isRegExp(value)) { - base = ' ' + RegExp.prototype.toString.call(value); - } - - // Make dates with properties first say the date - if (isDate(value)) { - base = ' ' + Date.prototype.toUTCString.call(value); - } - - // Make error with message first say the error - if (isError(value)) { - base = ' ' + formatError(value); - } - - if (keys.length === 0 && (!array || value.length == 0)) { - return braces[0] + base + braces[1]; - } - - if (recurseTimes < 0) { - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } else { - return ctx.stylize('[Object]', 'special'); - } - } - - ctx.seen.push(value); - - var output; - if (array) { - output = formatArray(ctx, value, recurseTimes, visibleKeys, keys); - } else { - output = keys.map(function(key) { - return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array); - }); - } - - ctx.seen.pop(); - - return reduceToSingleString(output, base, braces); -} - - -function formatPrimitive(ctx, value) { - if (isUndefined(value)) - return ctx.stylize('undefined', 'undefined'); - if (isString(value)) { - var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '') - .replace(/'/g, "\\'") - .replace(/\\"/g, '"') + '\''; - return ctx.stylize(simple, 'string'); - } - if (isNumber(value)) { - // Format -0 as '-0'. Strict equality won't distinguish 0 from -0, - // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 . - if (value === 0 && 1 / value < 0) - return ctx.stylize('-0', 'number'); - return ctx.stylize('' + value, 'number'); - } - if (isBoolean(value)) - return ctx.stylize('' + value, 'boolean'); - // For some reason typeof null is "object", so special case here. 
- if (isNull(value)) - return ctx.stylize('null', 'null'); -} - - -function formatError(value) { - return '[' + Error.prototype.toString.call(value) + ']'; -} - - -function formatArray(ctx, value, recurseTimes, visibleKeys, keys) { - var output = []; - for (var i = 0, l = value.length; i < l; ++i) { - if (hasOwnProperty(value, String(i))) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - String(i), true)); - } else { - output.push(''); - } - } - keys.forEach(function(key) { - if (!key.match(/^\d+$/)) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - key, true)); - } - }); - return output; -} - - -function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) { - var name, str, desc; - desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] }; - if (desc.get) { - if (desc.set) { - str = ctx.stylize('[Getter/Setter]', 'special'); - } else { - str = ctx.stylize('[Getter]', 'special'); - } - } else { - if (desc.set) { - str = ctx.stylize('[Setter]', 'special'); - } - } - if (!hasOwnProperty(visibleKeys, key)) { - name = '[' + key + ']'; - } - if (!str) { - if (ctx.seen.indexOf(desc.value) < 0) { - if (isNull(recurseTimes)) { - str = formatValue(ctx, desc.value, null); - } else { - str = formatValue(ctx, desc.value, recurseTimes - 1); - } - if (str.indexOf('\n') > -1) { - if (array) { - str = str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n').substr(2); - } else { - str = '\n' + str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n'); - } - } - } else { - str = ctx.stylize('[Circular]', 'special'); - } - } - if (isUndefined(name)) { - if (array && key.match(/^\d+$/)) { - return str; - } - name = JSON.stringify('' + key); - if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) { - name = name.substr(1, name.length - 2); - name = ctx.stylize(name, 'name'); - } else { - name = name.replace(/'/g, "\\'") - .replace(/\\"/g, '"') - .replace(/(^"|"$)/g, "'"); - name = ctx.stylize(name, 'string'); - } - } - - return name + ': ' + str; -} - - -function reduceToSingleString(output, base, braces) { - var numLinesEst = 0; - var length = output.reduce(function(prev, cur) { - numLinesEst++; - if (cur.indexOf('\n') >= 0) numLinesEst++; - return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1; - }, 0); - - if (length > 60) { - return braces[0] + - (base === '' ? '' : base + '\n ') + - ' ' + - output.join(',\n ') + - ' ' + - braces[1]; - } - - return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1]; -} - - // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. function isArray(ar) { @@ -522,166 +98,10 @@ function isPrimitive(arg) { exports.isPrimitive = isPrimitive; function isBuffer(arg) { - return arg instanceof Buffer; + return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); -} - - -function pad(n) { - return n < 10 ? 
'0' + n.toString(10) : n.toString(10); -} - - -var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', - 'Oct', 'Nov', 'Dec']; - -// 26 Feb 16:19:34 -function timestamp() { - var d = new Date(); - var time = [pad(d.getHours()), - pad(d.getMinutes()), - pad(d.getSeconds())].join(':'); - return [d.getDate(), months[d.getMonth()], time].join(' '); -} - - -// log is just a thin wrapper to console.log that prepends a timestamp -exports.log = function() { - console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments)); -}; - - -/** - * Inherit the prototype methods from one constructor into another. - * - * The Function.prototype.inherits from lang.js rewritten as a standalone - * function (not on Function.prototype). NOTE: If this file is to be loaded - * during bootstrapping this function needs to be rewritten using some native - * functions as prototype setup using normal JavaScript does not work as - * expected during bootstrapping (see mirror.js in r114903). - * - * @param {function} ctor Constructor function which needs to inherit the - * prototype. - * @param {function} superCtor Constructor function to inherit prototype from. - */ -exports.inherits = function(ctor, superCtor) { - ctor.super_ = superCtor; - ctor.prototype = Object.create(superCtor.prototype, { - constructor: { - value: ctor, - enumerable: false, - writable: true, - configurable: true - } - }); -}; - -exports._extend = function(origin, add) { - // Don't do anything if add isn't an object - if (!add || !isObject(add)) return origin; - - var keys = Object.keys(add); - var i = keys.length; - while (i--) { - origin[keys[i]] = add[keys[i]]; - } - return origin; -}; - -function hasOwnProperty(obj, prop) { - return Object.prototype.hasOwnProperty.call(obj, prop); -} - - -// Deprecated old stuff. 
- -exports.p = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - console.error(exports.inspect(arguments[i])); - } -}, 'util.p: Use console.error() instead'); - - -exports.exec = exports.deprecate(function() { - return require('child_process').exec.apply(this, arguments); -}, 'util.exec is now called `child_process.exec`.'); - - -exports.print = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(String(arguments[i])); - } -}, 'util.print: Use console.log instead'); - - -exports.puts = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(arguments[i] + '\n'); - } -}, 'util.puts: Use console.log instead'); - - -exports.debug = exports.deprecate(function(x) { - process.stderr.write('DEBUG: ' + x + '\n'); -}, 'util.debug: Use console.error instead'); - - -exports.error = exports.deprecate(function(x) { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stderr.write(arguments[i] + '\n'); - } -}, 'util.error: Use console.error instead'); - - -exports.pump = exports.deprecate(function(readStream, writeStream, callback) { - var callbackCalled = false; - - function call(a, b, c) { - if (callback && !callbackCalled) { - callback(a, b, c); - callbackCalled = true; - } - } - - readStream.addListener('data', function(chunk) { - if (writeStream.write(chunk) === false) readStream.pause(); - }); - - writeStream.addListener('drain', function() { - readStream.resume(); - }); - - readStream.addListener('end', function() { - writeStream.end(); - }); - - readStream.addListener('close', function() { - call(); - }); - - readStream.addListener('error', function(err) { - writeStream.end(); - call(err); - }); - - writeStream.addListener('error', function(err) { - readStream.destroy(); - call(err); - }); -}, 'util.pump(): Use readableStream.pipe() instead'); - - -var uv; -exports._errnoException = function(err, syscall) { - if (isUndefined(uv)) uv = process.binding('uv'); - var errname = uv.errname(err); - var e = new Error(syscall + ' ' + errname); - e.code = errname; - e.errno = errname; - e.syscall = syscall; - return e; -}; +}././@LongLink0000000000000000000000000000022300000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/lib/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000././@LongLink0000000000000000000000000000023300000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000443212631326456032065 0ustar 00000000000000{ "_args": [ [ "core-util-is@~1.0.0", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream" ] ], "_from": "core-util-is@>=1.0.0 <1.1.0", "_id": "core-util-is@1.0.2", "_inCache": true, "_installable": true, "_location": "/node-gyp/npmlog/are-we-there-yet/readable-stream/core-util-is", "_nodeVersion": "4.0.0", "_npmUser": { "email": "i@izs.me", "name": "isaacs" }, "_npmVersion": "3.3.2", "_phantomChildren": {}, "_requested": { "name": 
"core-util-is", "raw": "core-util-is@~1.0.0", "rawSpec": "~1.0.0", "scope": null, "spec": ">=1.0.0 <1.1.0", "type": "range" }, "_requiredBy": [ "/node-gyp/npmlog/are-we-there-yet/readable-stream" ], "_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.2.tgz", "_shasum": "b5fd54220aa2bc5ab57aab7140c940754503c1a7", "_shrinkwrap": null, "_spec": "core-util-is@~1.0.0", "_where": "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream", "author": { "email": "i@izs.me", "name": "Isaac Z. Schlueter", "url": "http://blog.izs.me/" }, "bugs": { "url": "https://github.com/isaacs/core-util-is/issues" }, "dependencies": {}, "description": "The `util.is*` functions introduced in Node v0.12.", "devDependencies": { "tap": "^2.3.0" }, "directories": {}, "dist": { "shasum": "b5fd54220aa2bc5ab57aab7140c940754503c1a7", "tarball": "http://registry.npmjs.org/core-util-is/-/core-util-is-1.0.2.tgz" }, "gitHead": "a177da234df5638b363ddc15fa324619a38577c8", "homepage": "https://github.com/isaacs/core-util-is#readme", "keywords": [ "isArray", "isBuffer", "isNumber", "isRegExp", "isString", "isThat", "isThis", "polyfill", "util" ], "license": "MIT", "main": "lib/util.js", "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "name": "core-util-is", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/isaacs/core-util-is.git" }, "scripts": { "test": "tap test.js" }, "version": "1.0.2" } ././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/test.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000406512631326456032067 0ustar 00000000000000var assert = require('tap'); var t = require('./lib/util'); assert.equal(t.isArray([]), true); assert.equal(t.isArray({}), false); assert.equal(t.isBoolean(null), false); assert.equal(t.isBoolean(true), true); assert.equal(t.isBoolean(false), true); assert.equal(t.isNull(null), true); assert.equal(t.isNull(undefined), false); assert.equal(t.isNull(false), false); assert.equal(t.isNull(), false); assert.equal(t.isNullOrUndefined(null), true); assert.equal(t.isNullOrUndefined(undefined), true); assert.equal(t.isNullOrUndefined(false), false); assert.equal(t.isNullOrUndefined(), true); assert.equal(t.isNumber(null), false); assert.equal(t.isNumber('1'), false); assert.equal(t.isNumber(1), true); assert.equal(t.isString(null), false); assert.equal(t.isString('1'), true); assert.equal(t.isString(1), false); assert.equal(t.isSymbol(null), false); assert.equal(t.isSymbol('1'), false); assert.equal(t.isSymbol(1), false); assert.equal(t.isSymbol(Symbol()), true); assert.equal(t.isUndefined(null), false); assert.equal(t.isUndefined(undefined), true); assert.equal(t.isUndefined(false), false); assert.equal(t.isUndefined(), true); assert.equal(t.isRegExp(null), false); assert.equal(t.isRegExp('1'), false); assert.equal(t.isRegExp(new RegExp()), true); assert.equal(t.isObject({}), true); assert.equal(t.isObject([]), true); assert.equal(t.isObject(new RegExp()), true); assert.equal(t.isObject(new Date()), true); assert.equal(t.isDate(null), false); assert.equal(t.isDate('1'), false); assert.equal(t.isDate(new Date()), true); assert.equal(t.isError(null), false); 
assert.equal(t.isError({ err: true }), false); assert.equal(t.isError(new Error()), true); assert.equal(t.isFunction(null), false); assert.equal(t.isFunction({ }), false); assert.equal(t.isFunction(function() {}), true); assert.equal(t.isPrimitive(null), true); assert.equal(t.isPrimitive(''), true); assert.equal(t.isPrimitive(0), true); assert.equal(t.isPrimitive(new Date()), false); assert.equal(t.isBuffer(null), false); assert.equal(t.isBuffer({}), false); assert.equal(t.isBuffer(new Buffer(0)), true); ././@LongLink0000000000000000000000000000023200000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/lib/util.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000571512631326456032072 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
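// (Illustrative aside, not part of the original source: the faking the
// note above refers to --
//
//   var fake = Object.create(Array.prototype);
//   fake instanceof Array;                  // true  -- fooled
//   Object.prototype.toString.call(fake);   // '[object Object]' -- not fooled
//   Array.isArray(fake);                    // false -- not fooled
//
// which is why the checks below rely on objectToString() and typeof.)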
function isArray(arg) { if (Array.isArray) { return Array.isArray(arg); } return objectToString(arg) === '[object Array]'; } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return (objectToString(e) === '[object Error]' || e instanceof Error); } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; exports.isBuffer = Buffer.isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); } ././@LongLink0000000000000000000000000000022300000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000302512631326456032062 0ustar 00000000000000 # isarray `Array#isArray` for older browsers. ## Usage ```js var isArray = require('isarray'); console.log(isArray([])); // => true console.log(isArray({})); // => false ``` ## Installation With [npm](http://npmjs.org) do ```bash $ npm install isarray ``` Then bundle for the browser with [browserify](https://github.com/substack/browserify). With [component](http://component.io) do ```bash $ component install juliangruber/isarray ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022000000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/build/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456032060 5ustar 00000000000000././@LongLink0000000000000000000000000000023000000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/component.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000072612631326456032067 0ustar 00000000000000{ "name" : "isarray", "description" : "Array#isArray for older browsers", "version" : "0.0.1", "repository" : "juliangruber/isarray", "homepage": "https://github.com/juliangruber/isarray", "main" : "index.js", "scripts" : [ "index.js" ], "dependencies" : {}, "keywords": ["browser","isarray","array"], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT" } ././@LongLink0000000000000000000000000000022200000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000017012631326456032060 0ustar 00000000000000module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; ././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000406512631326456032067 0ustar 00000000000000{ "_args": [ [ "isarray@0.0.1", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream" ] ], "_from": "isarray@0.0.1", "_id": "isarray@0.0.1", "_inCache": true, "_installable": true, "_location": "/node-gyp/npmlog/are-we-there-yet/readable-stream/isarray", "_npmUser": { "email": "julian@juliangruber.com", "name": "juliangruber" }, "_npmVersion": "1.2.18", "_phantomChildren": {}, "_requested": { "name": "isarray", "raw": "isarray@0.0.1", "rawSpec": "0.0.1", "scope": null, "spec": "0.0.1", "type": "version" }, "_requiredBy": [ "/node-gyp/npmlog/are-we-there-yet/readable-stream" ], "_resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz", "_shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "_shrinkwrap": null, "_spec": "isarray@0.0.1", "_where": "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream", "author": { "email": "mail@juliangruber.com", "name": "Julian Gruber", "url": "http://juliangruber.com" }, "bugs": { "url": 
"https://github.com/juliangruber/isarray/issues" }, "dependencies": {}, "description": "Array#isArray for older browsers", "devDependencies": { "tap": "*" }, "directories": {}, "dist": { "shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "tarball": "http://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz" }, "homepage": "https://github.com/juliangruber/isarray", "keywords": [ "array", "browser", "isarray" ], "license": "MIT", "main": "index.js", "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" } ], "name": "isarray", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/juliangruber/isarray.git" }, "scripts": { "test": "tap test/*.js" }, "version": "0.0.1" } ././@LongLink0000000000000000000000000000023000000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/build/build.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000777012631326456032075 0ustar 00000000000000 /** * Require the given path. * * @param {String} path * @return {Object} exports * @api public */ function require(path, parent, orig) { var resolved = require.resolve(path); // lookup failed if (null == resolved) { orig = orig || path; parent = parent || 'root'; var err = new Error('Failed to require "' + orig + '" from "' + parent + '"'); err.path = orig; err.parent = parent; err.require = true; throw err; } var module = require.modules[resolved]; // perform real require() // by invoking the module's // registered function if (!module.exports) { module.exports = {}; module.client = module.component = true; module.call(this, module.exports, require.relative(resolved), module); } return module.exports; } /** * Registered modules. */ require.modules = {}; /** * Registered aliases. */ require.aliases = {}; /** * Resolve `path`. * * Lookup: * * - PATH/index.js * - PATH.js * - PATH * * @param {String} path * @return {String} path or null * @api private */ require.resolve = function(path) { if (path.charAt(0) === '/') path = path.slice(1); var index = path + '/index.js'; var paths = [ path, path + '.js', path + '.json', path + '/index.js', path + '/index.json' ]; for (var i = 0; i < paths.length; i++) { var path = paths[i]; if (require.modules.hasOwnProperty(path)) return path; } if (require.aliases.hasOwnProperty(index)) { return require.aliases[index]; } }; /** * Normalize `path` relative to the current path. * * @param {String} curr * @param {String} path * @return {String} * @api private */ require.normalize = function(curr, path) { var segs = []; if ('.' != path.charAt(0)) return path; curr = curr.split('/'); path = path.split('/'); for (var i = 0; i < path.length; ++i) { if ('..' == path[i]) { curr.pop(); } else if ('.' != path[i] && '' != path[i]) { segs.push(path[i]); } } return curr.concat(segs).join('/'); }; /** * Register module at `path` with callback `definition`. * * @param {String} path * @param {Function} definition * @api private */ require.register = function(path, definition) { require.modules[path] = definition; }; /** * Alias a module definition. 
* * @param {String} from * @param {String} to * @api private */ require.alias = function(from, to) { if (!require.modules.hasOwnProperty(from)) { throw new Error('Failed to alias "' + from + '", it does not exist'); } require.aliases[to] = from; }; /** * Return a require function relative to the `parent` path. * * @param {String} parent * @return {Function} * @api private */ require.relative = function(parent) { var p = require.normalize(parent, '..'); /** * lastIndexOf helper. */ function lastIndexOf(arr, obj) { var i = arr.length; while (i--) { if (arr[i] === obj) return i; } return -1; } /** * The relative require() itself. */ function localRequire(path) { var resolved = localRequire.resolve(path); return require(resolved, parent, path); } /** * Resolve relative to the parent. */ localRequire.resolve = function(path) { var c = path.charAt(0); if ('/' == c) return path.slice(1); if ('.' == c) return require.normalize(p, path); // resolve deps by returning // the dep in the nearest "deps" // directory var segs = parent.split('/'); var i = lastIndexOf(segs, 'deps') + 1; if (!i) i = 0; path = segs.slice(0, i + 1).join('/') + '/deps/' + path; return path; }; /** * Check if module is defined at `path`. */ localRequire.exists = function(path) { return require.modules.hasOwnProperty(localRequire.resolve(path)); }; return localRequire; }; require.register("isarray/index.js", function(exports, require, module){ module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; }); require.alias("isarray/index.js", "isarray/index.js"); ././@LongLink0000000000000000000000000000023300000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/.npmignorenpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000001312631326456032054 0ustar 00000000000000build test ././@LongLink0000000000000000000000000000023000000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000206412631326456032064 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
././@LongLink0000000000000000000000000000023200000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000076212631326456032067 0ustar 00000000000000**string_decoder.js** (`require('string_decoder')`) from Node.js core Copyright Joyent, Inc. and other Node contributors. See LICENCE file for details. Version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. **Prefer the stable version over the unstable.** The *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000001716412631326456032073 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. var Buffer = require('buffer').Buffer; var isBufferEncoding = Buffer.isEncoding || function(encoding) { switch (encoding && encoding.toLowerCase()) { case 'hex': case 'utf8': case 'utf-8': case 'ascii': case 'binary': case 'base64': case 'ucs2': case 'ucs-2': case 'utf16le': case 'utf-16le': case 'raw': return true; default: return false; } } function assertEncoding(encoding) { if (encoding && !isBufferEncoding(encoding)) { throw new Error('Unknown encoding: ' + encoding); } } // StringDecoder provides an interface for efficiently splitting a series of // buffers into a series of JS strings without breaking apart multi-byte // characters. CESU-8 is handled as part of the UTF-8 encoding. // // @TODO Handling all encodings inside a single object makes it very difficult // to reason about this code, so it should be split up in the future. // @TODO There should be a utf8-strict encoding that rejects invalid UTF-8 code // points as used by CESU-8. 
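// Usage sketch (illustrative, based on the behavior implemented below):
// write() buffers the bytes of an incomplete multi-byte character and
// returns '' until the character's remaining bytes arrive, at which point
// the whole character is returned.
//
//   var StringDecoder = require('string_decoder').StringDecoder;
//   var decoder = new StringDecoder('utf8');
//   // '€' is the three UTF-8 bytes 0xE2 0x82 0xAC, fed one at a time:
//   decoder.write(new Buffer([0xE2])); // => ''  (buffered, incomplete)
//   decoder.write(new Buffer([0x82])); // => ''  (still incomplete)
//   decoder.write(new Buffer([0xAC])); // => '€' (character completed)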
var StringDecoder = exports.StringDecoder = function(encoding) { this.encoding = (encoding || 'utf8').toLowerCase().replace(/[-_]/, ''); assertEncoding(encoding); switch (this.encoding) { case 'utf8': // CESU-8 represents each of Surrogate Pair by 3-bytes this.surrogateSize = 3; break; case 'ucs2': case 'utf16le': // UTF-16 represents each of Surrogate Pair by 2-bytes this.surrogateSize = 2; this.detectIncompleteChar = utf16DetectIncompleteChar; break; case 'base64': // Base-64 stores 3 bytes in 4 chars, and pads the remainder. this.surrogateSize = 3; this.detectIncompleteChar = base64DetectIncompleteChar; break; default: this.write = passThroughWrite; return; } // Enough space to store all bytes of a single character. UTF-8 needs 4 // bytes, but CESU-8 may require up to 6 (3 bytes per surrogate). this.charBuffer = new Buffer(6); // Number of bytes received for the current incomplete multi-byte character. this.charReceived = 0; // Number of bytes expected for the current incomplete multi-byte character. this.charLength = 0; }; // write decodes the given buffer and returns it as JS string that is // guaranteed to not contain any partial multi-byte characters. Any partial // character found at the end of the buffer is buffered up, and will be // returned when calling write again with the remaining bytes. // // Note: Converting a Buffer containing an orphan surrogate to a String // currently works, but converting a String to a Buffer (via `new Buffer`, or // Buffer#write) will replace incomplete surrogates with the unicode // replacement character. See https://codereview.chromium.org/121173009/ . StringDecoder.prototype.write = function(buffer) { var charStr = ''; // if our last write ended with an incomplete multibyte character while (this.charLength) { // determine how many remaining bytes this buffer has to offer for this char var available = (buffer.length >= this.charLength - this.charReceived) ? this.charLength - this.charReceived : buffer.length; // add the new bytes to the char buffer buffer.copy(this.charBuffer, this.charReceived, 0, available); this.charReceived += available; if (this.charReceived < this.charLength) { // still not enough chars in this buffer? wait for more ... 
return ''; } // remove bytes belonging to the current character from the buffer buffer = buffer.slice(available, buffer.length); // get the character that was split charStr = this.charBuffer.slice(0, this.charLength).toString(this.encoding); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character var charCode = charStr.charCodeAt(charStr.length - 1); if (charCode >= 0xD800 && charCode <= 0xDBFF) { this.charLength += this.surrogateSize; charStr = ''; continue; } this.charReceived = this.charLength = 0; // if there are no more bytes in this buffer, just emit our char if (buffer.length === 0) { return charStr; } break; } // determine and set charLength / charReceived this.detectIncompleteChar(buffer); var end = buffer.length; if (this.charLength) { // buffer the incomplete character bytes we got buffer.copy(this.charBuffer, 0, buffer.length - this.charReceived, end); end -= this.charReceived; } charStr += buffer.toString(this.encoding, 0, end); var end = charStr.length - 1; var charCode = charStr.charCodeAt(end); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character if (charCode >= 0xD800 && charCode <= 0xDBFF) { var size = this.surrogateSize; this.charLength += size; this.charReceived += size; this.charBuffer.copy(this.charBuffer, size, 0, size); buffer.copy(this.charBuffer, 0, 0, size); return charStr.substring(0, end); } // or just emit the charStr return charStr; }; // detectIncompleteChar determines if there is an incomplete UTF-8 character at // the end of the given buffer. If so, it sets this.charLength to the byte // length that character, and sets this.charReceived to the number of bytes // that are available for this character. StringDecoder.prototype.detectIncompleteChar = function(buffer) { // determine how many bytes we have to check at the end of this buffer var i = (buffer.length >= 3) ? 3 : buffer.length; // Figure out if one of the last i bytes of our buffer announces an // incomplete char. for (; i > 0; i--) { var c = buffer[buffer.length - i]; // See http://en.wikipedia.org/wiki/UTF-8#Description // 110XXXXX if (i == 1 && c >> 5 == 0x06) { this.charLength = 2; break; } // 1110XXXX if (i <= 2 && c >> 4 == 0x0E) { this.charLength = 3; break; } // 11110XXX if (i <= 3 && c >> 3 == 0x1E) { this.charLength = 4; break; } } this.charReceived = i; }; StringDecoder.prototype.end = function(buffer) { var res = ''; if (buffer && buffer.length) res = this.write(buffer); if (this.charReceived) { var cr = this.charReceived; var buf = this.charBuffer; var enc = this.encoding; res += buf.slice(0, cr).toString(enc); } return res; }; function passThroughWrite(buffer) { return buffer.toString(this.encoding); } function utf16DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 2; this.charLength = this.charReceived ? 2 : 0; } function base64DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 3; this.charLength = this.charReceived ? 
3 : 0; } ././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000644000000000000000000000427012631326456032065 0ustar 00000000000000{ "_args": [ [ "string_decoder@~0.10.x", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream" ] ], "_from": "string_decoder@>=0.10.0 <0.11.0", "_id": "string_decoder@0.10.31", "_inCache": true, "_installable": true, "_location": "/node-gyp/npmlog/are-we-there-yet/readable-stream/string_decoder", "_npmUser": { "email": "rod@vagg.org", "name": "rvagg" }, "_npmVersion": "1.4.23", "_phantomChildren": {}, "_requested": { "name": "string_decoder", "raw": "string_decoder@~0.10.x", "rawSpec": "~0.10.x", "scope": null, "spec": ">=0.10.0 <0.11.0", "type": "range" }, "_requiredBy": [ "/node-gyp/npmlog/are-we-there-yet/readable-stream" ], "_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz", "_shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "_shrinkwrap": null, "_spec": "string_decoder@~0.10.x", "_where": "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream", "bugs": { "url": "https://github.com/rvagg/string_decoder/issues" }, "dependencies": {}, "description": "The string_decoder module from Node core", "devDependencies": { "tap": "~0.4.8" }, "directories": {}, "dist": { "shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "tarball": "http://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz" }, "gitHead": "d46d4fd87cf1d06e031c23f1ba170ca7d4ade9a0", "homepage": "https://github.com/rvagg/string_decoder", "keywords": [ "browser", "browserify", "decoder", "string" ], "license": "MIT", "main": "index.js", "maintainers": [ { "name": "substack", "email": "mail@substack.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "name": "string_decoder", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/rvagg/string_decoder.git" }, "scripts": { "test": "tap test/simple/*.js" }, "version": "0.10.31" } ././@LongLink0000000000000000000000000000014700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/test/tracker.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/test/tracker.0000644000000000000000000000317412631326456032023 0ustar 00000000000000"use strict" var test = require("tap").test var Tracker = require("../index.js").Tracker var timeoutError = new Error("timeout") var testEvent = function (obj,event,next) { var timeout = setTimeout(function(){ obj.removeListener(event, eventHandler) next(timeoutError) }, 10) var eventHandler = function () { var args = Array.prototype.slice.call(arguments) args.unshift(null) clearTimeout(timeout) next.apply(null, args) } obj.once(event, eventHandler) } test("Tracker", function (t) { t.plan(10) var name = "test" var track = new Tracker(name) t.is(track.completed(), 0, "Nothing todo is 0 completion") var todo = 100 track = new Tracker(name, todo) t.is(track.completed(), 0, "Nothing done is 0 completion") testEvent(track, "change", afterCompleteWork) 
track.completeWork(100) function afterCompleteWork(er, onChangeName) { t.is(er, null, "completeWork: on change event fired") t.is(onChangeName, name, "completeWork: on change emits the correct name") } t.is(track.completed(), 1, "completeWork: 100% completed") testEvent(track, "change", afterAddWork) track.addWork(100) function afterAddWork(er, onChangeName) { t.is(er, null, "addWork: on change event fired") t.is(onChangeName, name, "addWork: on change emits the correct name") } t.is(track.completed(), 0.5, "addWork: 50% completed") track.completeWork(200) t.is(track.completed(), 1, "completeWork: Over completion is still only 100% complete") track = new Tracker(name, todo) track.completeWork(50) track.finish() t.is(track.completed(), 1, "finish: Explicitly finishing moves to 100%") }) ././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/test/trackergroup.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/test/trackerg0000644000000000000000000000633412631326456032115 0ustar 00000000000000"use strict" var test = require("tap").test var Tracker = require("../index.js").Tracker var TrackerGroup = require("../index.js").TrackerGroup var timeoutError = new Error("timeout") var testEvent = function (obj,event,next) { var timeout = setTimeout(function(){ obj.removeListener(event, eventHandler) next(timeoutError) }, 10) var eventHandler = function () { var args = Array.prototype.slice.call(arguments) args.unshift(null) clearTimeout(timeout) next.apply(null, args) } obj.once(event, eventHandler) } test("TrackerGroup", function (t) { var name = "test" var track = new TrackerGroup(name) t.is(track.completed(), 0, "Nothing todo is 0 completion") testEvent(track, "change", afterFinishEmpty) track.finish() var a, b function afterFinishEmpty(er, onChangeName) { t.is(er, null, "finishEmpty: on change event fired") t.is(onChangeName, name, "finishEmpty: on change emits the correct name") t.is(track.completed(), 1, "finishEmpty: Finishing an empty group actually finishes it") track = new TrackerGroup(name) a = track.newItem("a", 10, 1) b = track.newItem("b", 10, 1) t.is(track.completed(), 0, "Initially empty") testEvent(track, "change", afterCompleteWork) a.completeWork(5) } function afterCompleteWork(er, onChangeName) { t.is(er, null, "on change event fired") t.is(onChangeName, "a", "on change emits the correct name") t.is(track.completed(), 0.25, "Complete half of one is a quarter overall") testEvent(track, "change", afterFinishAll) track.finish() } function afterFinishAll(er, onChangeName) { t.is(er, null, "finishAll: on change event fired") t.is(onChangeName, name, "finishAll: on change emits the correct name") t.is(track.completed(), 1, "Finishing everything ") track = new TrackerGroup(name) a = track.newItem("a", 10, 2) b = track.newItem("b", 10, 1) t.is(track.completed(), 0, "weighted: Initially empty") testEvent(track, "change", afterWeightedCompleteWork) a.completeWork(5) } function afterWeightedCompleteWork(er, onChangeName) { t.is(er, null, "weighted: on change event fired") t.is(onChangeName, "a", "weighted: on change emits the correct name") t.is(Math.round(track.completed()*100), 33, "weighted: Complete half of double weighted") testEvent(track, "change", afterWeightedFinishAll) track.finish() } function afterWeightedFinishAll(er, onChangeName) { t.is(er, null, "weightedFinishAll: on change event fired") t.is(onChangeName, name, 
"weightedFinishAll: on change emits the correct name") t.is(track.completed(), 1, "weightedFinishaAll: Finishing everything ") track = new TrackerGroup(name) a = track.newGroup("a", 10) b = track.newGroup("b", 10) var a1 = a.newItem("a.1",10) a1.completeWork(5) t.is(track.completed(), 0.25, "nested: Initially quarter done") testEvent(track, "change", afterNestedComplete) b.finish() } function afterNestedComplete(er, onChangeName) { t.is(er, null, "nestedComplete: on change event fired") t.is(onChangeName, "b", "nestedComplete: on change emits the correct name") t.is(track.completed(), 0.75, "nestedComplete: Finishing everything ") t.end() } }) ././@LongLink0000000000000000000000000000015500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/test/trackerstream.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/are-we-there-yet/test/trackers0000644000000000000000000000344212631326456032126 0ustar 00000000000000"use strict" var test = require("tap").test var util = require("util") var stream = require("readable-stream") var TrackerStream = require("../index.js").TrackerStream var timeoutError = new Error("timeout") var testEvent = function (obj,event,next) { var timeout = setTimeout(function(){ obj.removeListener(event, eventHandler) next(timeoutError) }, 10) var eventHandler = function () { var args = Array.prototype.slice.call(arguments) args.unshift(null) clearTimeout(timeout) next.apply(null, args) } obj.once(event, eventHandler) } var Sink = function () { stream.Writable.apply(this,arguments) } util.inherits(Sink, stream.Writable) Sink.prototype._write = function (data, encoding, cb) { cb() } test("TrackerStream", function (t) { t.plan(9) var name = "test" var track = new TrackerStream(name) t.is(track.completed(), 0, "Nothing todo is 0 completion") var todo = 10 track = new TrackerStream(name, todo) t.is(track.completed(), 0, "Nothing done is 0 completion") track.pipe(new Sink()) testEvent(track, "change", afterCompleteWork) track.write("0123456789") function afterCompleteWork(er, onChangeName) { t.is(er, null, "write: on change event fired") t.is(onChangeName, name, "write: on change emits the correct name") t.is(track.completed(), 1, "write: 100% completed") testEvent(track, "change", afterAddWork) track.addWork(10) } function afterAddWork(er, onChangeName) { t.is(er, null, "addWork: on change event fired") t.is(track.completed(), 0.5, "addWork: 50% completed") testEvent(track, "change", afterAllWork) track.write("ABCDEFGHIJKLMNOPQRST") } function afterAllWork(er) { t.is(er, null, "allWork: on change event fired") t.is(track.completed(), 1, "allWork: 100% completed") } }) npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/.npmignore0000644000000000000000000000114212631326456027404 0ustar 00000000000000# Logs logs *.log # Runtime data pids *.pid *.seed # Directory for instrumented libs generated by jscoverage/JSCover lib-cov # Coverage directory used by tools like istanbul coverage # Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files) .grunt # Compiled binary addons (http://nodejs.org/api/addons.html) build/Release # Dependency directory # Commenting this out is preferred by some people, see # https://www.npmjs.org/doc/misc/npm-faq.html#should-i-check-my-node_modules-folder-into-git- node_modules # Users Environment Variables .lock-wscript # Editor cruft *~ .#* 
npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/LICENSE0000644000000000000000000000135712631326456026422 0ustar 00000000000000Copyright (c) 2014, Rebecca Turner

Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/README.md0000644000000000000000000001403612631326456026672 0ustar 00000000000000gauge
=====

A nearly stateless terminal-based horizontal gauge / progress bar.

```javascript
var Gauge = require("gauge")

var gauge = new Gauge()
gauge.show("test", 0.20)
gauge.pulse("this")
gauge.hide()
```

![](example.png)

### `var gauge = new Gauge([options], [ansiStream])`

* **options** – *(optional)* An options object. (See [below] for details.)
* **ansiStream** – *(optional)* A stream that's been blessed by the [ansi] module to include various commands for controlling the cursor in a terminal.

[ansi]: https://www.npmjs.com/package/ansi
[below]: #theme-objects

Constructs a new gauge. Gauges are drawn on a single line, and are not drawn if the current terminal isn't a tty. If you resize your terminal in a way that can be detected, the gauge will be redrawn at the new size. As a general rule, growing your terminal will be clean, but shrinking it will leave cruft behind, because we don't have enough information to know where what we wrote previously is now located.

The **options** object can have the following properties, all of which are optional:

* maxUpdateFrequency: defaults to 50 msec; the gauge will not be drawn more than once in this period of time. This applies to `show` and `pulse` calls, but if you `hide` and then `show` the gauge it will be drawn regardless of the time since the last draw.
* theme: defaults to `Gauge.unicode` if the terminal supports unicode according to [has-unicode], otherwise it defaults to `Gauge.ascii`. Details on [theme objects](#theme-objects) are documented below.
* template: see the [template documentation](#template-objects) for defaults and details.

[has-unicode]: https://www.npmjs.com/package/has-unicode

If **ansiStream** isn't passed in, then one will be constructed from stderr with `ansi(process.stderr)`.

### `gauge.show([name, [completed]])`

* **name** – *(optional)* The name of the current thing contributing to progress. Defaults to the last value used, or "".
* **completed** – *(optional)* The portion completed as a value between 0 and 1. Defaults to the last value used, or 0.

If `process.stderr.isTTY` is false then this does nothing. If completed is 0 and `gauge.pulse` has never been called, then similarly nothing will be printed. If `maxUpdateFrequency` msec haven't passed since the last call to `show` or `pulse` then similarly, nothing will be printed. (Actually, the update is deferred until `maxUpdateFrequency` msec have passed; if nothing else has happened by then, the gauge update will happen.)
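Taken together, a minimal sketch of the constructor options and the `show`/`hide` cycle (the interval-driven loop is purely illustrative; the option names and defaults come from this README and from progress-bar.js below):

```javascript
var Gauge = require("gauge")

// All options are optional; these override the documented defaults.
var gauge = new Gauge({
  maxUpdateFrequency: 100, // redraw at most once every 100 msec
  theme: Gauge.ascii,      // force the ASCII theme
  template: [
    {type: "name", separated: true, length: 25},
    {type: "spinner", separated: true},
    {type: "startgroup"},
    {type: "completionbar"},
    {type: "endgroup"}
  ]
})

var completed = 0
var timer = setInterval(function () {
  completed += 0.05
  gauge.show("installing", completed) // a no-op when stderr isn't a tty
  if (completed >= 1) {
    clearInterval(timer)
    gauge.hide()
  }
}, 100)
```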
### `gauge.hide()`

Removes the gauge from the terminal.

### `gauge.pulse([name])`

* **name** – *(optional)* The specific thing that triggered this pulse

Spins the spinner in the gauge to show output. If **name** is included then it will be combined with the last name passed to `gauge.show`, using the subsection property of the theme (typically a right-facing arrow).

### `gauge.disable()`

Hides the gauge and ignores further calls to `show` or `pulse`.

### `gauge.enable()`

Shows the gauge and resumes updating when `show` or `pulse` is called.

### `gauge.setTheme(theme)`

Changes the active theme; it will be used the next time `show` or `pulse` is called.

### `gauge.setTemplate(template)`

Changes the active template; it will be used the next time `show` or `pulse` is called.

### Theme Objects

There are two theme objects available as a part of the module, `Gauge.unicode` and `Gauge.ascii`. Theme objects have the following properties:

| Property   | Unicode | ASCII  |
| ---------- | ------- | ------ |
| startgroup | ╢       | \|     |
| endgroup   | ╟       | \|     |
| complete   | █       | #      |
| incomplete | ░       | -      |
| spinner    | ▀▐▄▌    | -\\\|/ |
| subsection | →       | ->     |

*startgroup*, *endgroup* and *subsection* can be as many characters as you want. *complete* and *incomplete* should be a single character width each. *spinner* is a list of characters to use in turn when displaying an activity spinner. The gauge will spin through as many characters as you give here.

### Template Objects

A template is an array of objects and strings that, after being evaluated, will be turned into the gauge line. The default template is:

```javascript
[
    {type: "name", separated: true, maxLength: 25, minLength: 25, align: "left"},
    {type: "spinner", separated: true},
    {type: "startgroup"},
    {type: "completionbar"},
    {type: "endgroup"}
]
```

Template elements can be **plain strings**, in which case they will be included verbatim in the output. If the template element is an object, it can have the following keys:

* *type* can be:
  * `name` – The most recent name passed to `show`; if this is in response to a `pulse` then the name passed to `pulse` will be appended along with the subsection property from the theme.
  * `spinner` – If you've ever called `pulse` this will be one of the characters from the spinner property of the theme.
  * `startgroup` – The `startgroup` property from the theme.
  * `completionbar` – The progress bar itself.
  * `endgroup` – The `endgroup` property from the theme.
* *separated* – If true, the element will be separated with spaces from things on either side (and margins count as space, so it won't be indented), but only if it's included.
* *maxLength* – The maximum length for this element. If its value is longer it will be truncated.
* *minLength* – The minimum length for this element. If its value is shorter it will be padded according to the *align* value.
* *align* – (Default: left) Possible values are "left", "right" and "center". Works as you'd expect from word processors.
* *length* – Provides a single value for both *minLength* and *maxLength*. If both *length* and *minLength* or *maxLength* are specified then the latter take precedence.

### Tracking Completion

If you have more than one thing going on that you want to track completion of, you may find the related [are-we-there-yet] helpful. Its `change` event can be wired up to the `show` method to get a more traditional progress bar interface, as sketched below.
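A sketch of that wiring, using the `TrackerGroup` API exercised by the are-we-there-yet tests earlier in this tree (`newItem`, `completeWork`, `completed`, and a `change` event that reports the name of whatever just changed; the trackers are assumed to be ordinary EventEmitters):

```javascript
var Gauge = require("gauge")
var TrackerGroup = require("are-we-there-yet").TrackerGroup

var gauge = new Gauge()
var top = new TrackerGroup("install")

// Redraw the gauge whenever anything in the group makes progress.
top.on("change", function (name) {
  gauge.show(name, top.completed())
})

var fetch = top.newItem("fetch", 10) // 10 units of work to do
fetch.completeWork(5)                // gauge shows "fetch" at 50%
fetch.finish()                       // gauge shows 100%
gauge.hide()
```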
[are-we-there-yet]: https://www.npmjs.com/package/are-we-there-yet

npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/example.png0000644000000000000000000004635112631326456027551 0ustar 00000000000000[binary PNG image data omitted: the example.png screenshot of the gauge referenced in the README above]
npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/0000755000000000000000000000000012631326456030064 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/package.json0000644000000000000000000000263712631326456027705 0ustar 00000000000000{ "name": "gauge", "version": "1.2.2", "description": "A terminal based horizontal gauge", "main": "progress-bar.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/iarna/gauge.git" }, "keywords": [ "progressbar", "progress", "gauge" ], "author": { "name": "Rebecca Turner", "email": "me@re-becca.org" }, "license": "ISC", "bugs": { "url": "https://github.com/iarna/gauge/issues" }, "homepage": "https://github.com/iarna/gauge", "dependencies": { "ansi": "^0.3.0", "has-unicode": "^1.0.0", "lodash.pad": "^3.0.0", "lodash.padleft": "^3.0.0", "lodash.padright": "^3.0.0" }, "devDependencies": { "tap": "^0.4.13" }, "gitHead": "9f7eeeeed3b74a70f30b721d570435f6ffbc0168", "_id": "gauge@1.2.2", "_shasum": "05b6730a19a8fcad3c340a142f0945222a3f815b", "_from": "gauge@>=1.2.0 <1.3.0", "_npmVersion": "3.1.0", "_nodeVersion": "0.10.38", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "dist": { "shasum": "05b6730a19a8fcad3c340a142f0945222a3f815b", "tarball": "http://registry.npmjs.org/gauge/-/gauge-1.2.2.tgz" }, "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/gauge/-/gauge-1.2.2.tgz", "readme": "ERROR: No README data found!"
} npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/progress-bar.js0000644000000000000000000001336512631326456030363 0ustar 00000000000000"use strict" var hasUnicode = require("has-unicode") var ansi = require("ansi") var align = { center: require("lodash.pad"), left: require("lodash.padright"), right: require("lodash.padleft") } var defaultStream = process.stderr function isTTY() { return process.stderr.isTTY } function getWritableTTYColumns() { // Writing to the final column wraps the line // We have to use stdout here, because Node's magic SIGWINCH handler only // updates process.stdout, not process.stderr return process.stdout.columns - 1 } var ProgressBar = module.exports = function (options, cursor) { if (! options) options = {} if (! cursor && options.write) { cursor = options options = {} } if (! cursor) { cursor = ansi(defaultStream) } this.cursor = cursor this.showing = false this.theme = options.theme || (hasUnicode() ? ProgressBar.unicode : ProgressBar.ascii) this.template = options.template || [ {type: "name", separated: true, length: 25}, {type: "spinner", separated: true}, {type: "startgroup"}, {type: "completionbar"}, {type: "endgroup"} ] this.updatefreq = options.maxUpdateFrequency || 50 this.lastName = "" this.lastCompleted = 0 this.spun = 0 this.last = new Date(0) var self = this this._handleSizeChange = function () { if (!self.showing) return self.hide() self.show() } } ProgressBar.prototype = {} ProgressBar.unicode = { startgroup: "╢", endgroup: "╟", complete: "█", incomplete: "░", spinner: "▀▐▄▌", subsection: "→" } ProgressBar.ascii = { startgroup: "|", endgroup: "|", complete: "#", incomplete: "-", spinner: "-\\|/", subsection: "->" } ProgressBar.prototype.setTheme = function(theme) { this.theme = theme } ProgressBar.prototype.setTemplate = function(template) { this.template = template } ProgressBar.prototype._enableResizeEvents = function() { process.stdout.on('resize', this._handleSizeChange) } ProgressBar.prototype._disableResizeEvents = function() { process.stdout.removeListener('resize', this._handleSizeChange) } ProgressBar.prototype.disable = function() { this.hide() this.disabled = true } ProgressBar.prototype.enable = function() { this.disabled = false this.show() } ProgressBar.prototype.hide = function() { if (!isTTY()) return if (this.disabled) return this.cursor.show() if (this.showing) this.cursor.up(1) this.cursor.horizontalAbsolute(0).eraseLine() this.showing = false } var repeat = function (str, count) { var out = "" for (var ii=0; ii Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000015600000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000103212631326456032077 0ustar 00000000000000# lodash.pad v3.1.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.pad` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.pad ``` In Node.js/io.js: ```js var pad = require('lodash.pad'); ``` See the [documentation](https://lodash.com/docs#pad) or [package source](https://github.com/lodash/lodash/blob/3.1.1-npm-packages/lodash.pad) for more details. ././@LongLink0000000000000000000000000000015500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000325312631326456032106 0ustar 00000000000000/** * lodash 3.1.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'), createPadding = require('lodash._createpadding'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Pads `string` on the left and right sides if it's shorter than `length`. * Padding characters are truncated if they can't be evenly divided by `length`. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to pad. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the padded string. 
* @example * * _.pad('abc', 8); * // => ' abc ' * * _.pad('abc', 8, '_-'); * // => '_-abc_-_' * * _.pad('abc', 3); * // => 'abc' */ function pad(string, length, chars) { string = baseToString(string); length = +length; var strLength = string.length; if (strLength >= length || !nativeIsFinite(length)) { return string; } var mid = (length - strLength) / 2, leftLength = nativeFloor(mid), rightLength = nativeCeil(mid); chars = createPadding('', rightLength, chars); return chars.slice(0, leftLength) + string + chars; } module.exports = pad; ././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000755000000000000000000000000012631326456032101 5ustar 00000000000000././@LongLink0000000000000000000000000000016100000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000457212631326456032113 0ustar 00000000000000{ "name": "lodash.pad", "version": "3.1.1", "description": "The modern build of lodash’s `_.pad` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0", "lodash._createpadding": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.pad@3.1.1", "_shasum": "2e078ebc33b331d2ba34bf8732af129fd5c04624", "_from": "lodash.pad@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "2e078ebc33b331d2ba34bf8732af129fd5c04624", "tarball": "http://registry.npmjs.org/lodash.pad/-/lodash.pad-3.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.pad/-/lodash.pad-3.1.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000020700000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000755000000000000000000000000012631326456032101 5ustar 00000000000000././@LongLink0000000000000000000000000000021000000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000755000000000000000000000000012631326456032101 5ustar 00000000000000././@LongLink0000000000000000000000000000021600000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000232112631326456032101 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022000000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000105312631326456032102 0ustar 00000000000000# lodash._basetostring v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseToString` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basetostring ``` In Node.js/io.js: ```js var baseToString = require('lodash._basetostring'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basetostring) for more details. 
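The behavior is small enough to show in full; a usage sketch matching the implementation that follows:

```js
var baseToString = require('lodash._basetostring');

baseToString('abc');     // => 'abc'
baseToString(42);        // => '42'
baseToString(null);      // => ''
baseToString(undefined); // => ''
```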
././@LongLink0000000000000000000000000000021700000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000134212631326456032103 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Converts `value` to a string if it's not one. An empty string is returned * for `null` or `undefined` values. * * @private * @param {*} value The value to process. * @returns {string} Returns the string. */ function baseToString(value) { return value == null ? '' : (value + ''); } module.exports = baseToString; ././@LongLink0000000000000000000000000000022300000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000442512631326456032110 0ustar 00000000000000{ "name": "lodash._basetostring", "version": "3.0.1", "description": "The modern build of lodash’s internal `baseToString` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basetostring@3.0.1", "_shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "_from": "lodash._basetostring@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "tarball": "http://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000021700000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000232112631326456032101 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022100000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000106112631326456032101 0ustar 00000000000000# lodash._createpadding v3.6.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createPadding` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._createpadding ``` In Node.js/io.js: ```js var createPadding = require('lodash._createpadding'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash._createpadding) for more details. ././@LongLink0000000000000000000000000000022000000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000254212631326456032106 0ustar 00000000000000/** * lodash 3.6.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var repeat = require('lodash.repeat'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeIsFinite = global.isFinite; /** * Creates the padding required for `string` based on the given `length`. * The `chars` string is truncated if the number of characters exceeds `length`. 
* * @private * @param {string} string The string to create padding for. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the pad for `string`. */ function createPadding(string, length, chars) { var strLength = string.length; length = +length; if (strLength >= length || !nativeIsFinite(length)) { return ''; } var padLength = length - strLength; chars = chars == null ? ' ' : (chars + ''); return repeat(chars, nativeCeil(padLength / chars.length)).slice(0, padLength); } module.exports = createPadding; ././@LongLink0000000000000000000000000000022500000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000755000000000000000000000000012631326456032101 5ustar 00000000000000././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000452412631326456032110 0ustar 00000000000000{ "name": "lodash._createpadding", "version": "3.6.1", "description": "The modern build of lodash’s internal `createPadding` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.repeat": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._createpadding@3.6.1", "_shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "_from": "lodash._createpadding@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "tarball": "http://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000024300000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000755000000000000000000000000012631326456032101 5ustar 00000000000000././@LongLink0000000000000000000000000000025200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000232112631326456032101 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000025400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000105712631326456032106 0ustar 00000000000000# lodash.repeat v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.repeat` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.repeat ``` In Node.js/io.js: ```js var repeat = require('lodash.repeat'); ``` See the [documentation](https://lodash.com/docs#repeat) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.repeat) for more details. 
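A hedged usage sketch for `lodash.repeat` (illustrative, not shipped in the archive). The index.js that follows builds the result by exponentiation by squaring, so it performs O(log n) concatenations rather than n:

```js
// Hypothetical trace of repeat('ab', 5) through the doubling loop in index.js:
// n=5 (odd)  -> result='ab'          string doubles to 'abab'
// n=2 (even) -> result unchanged     string doubles to 'abababab'
// n=1 (odd)  -> result='ababababab'  n reaches 0 and the loop exits
var repeat = require('lodash.repeat');

repeat('ab', 5);       // => 'ababababab'
repeat('*', 0);        // => '' (n < 1 bails out early)
repeat('a', Infinity); // => '' (non-finite n is rejected)
```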
././@LongLink0000000000000000000000000000025300000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000273712631326456032114 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Repeats the given string `n` times. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to repeat. * @param {number} [n=0] The number of times to repeat the string. * @returns {string} Returns the repeated string. * @example * * _.repeat('*', 3); * // => '***' * * _.repeat('abc', 2); * // => 'abcabc' * * _.repeat('abc', 0); * // => '' */ function repeat(string, n) { var result = ''; string = baseToString(string); n = +n; if (n < 1 || !string || !nativeIsFinite(n)) { return result; } // Leverage the exponentiation by squaring algorithm for a faster repeat. // See https://en.wikipedia.org/wiki/Exponentiation_by_squaring for more details. do { if (n % 2) { result += string; } n = nativeFloor(n / 2); string += string; } while (n); return result; } module.exports = repeat; ././@LongLink0000000000000000000000000000025700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/0000644000000000000000000000455312631326456032112 0ustar 00000000000000{ "name": "lodash.repeat", "version": "3.0.1", "description": "The modern build of lodash’s `_.repeat` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.repeat@3.0.1", "_shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "_from": "lodash.repeat@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", 
"_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "tarball": "http://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/LICENSE.txtnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000232112631326456032176 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000106612631326456032203 0ustar 00000000000000# lodash.padleft v3.1.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.padLeft` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.padleft ``` In Node.js/io.js: ```js var padLeft = require('lodash.padleft'); ``` See the [documentation](https://lodash.com/docs#padLeft) or [package source](https://github.com/lodash/lodash/blob/3.1.1-npm-packages/lodash.padleft) for more details. 
././@LongLink0000000000000000000000000000016100000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000277612631326456032214 0ustar 00000000000000/** * lodash 3.1.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'), createPadding = require('lodash._createpadding'); /** * Creates a function for `_.padLeft` or `_.padRight`. * * @private * @param {boolean} [fromRight] Specify padding from the right. * @returns {Function} Returns the new pad function. */ function createPadDir(fromRight) { return function(string, length, chars) { string = baseToString(string); return (fromRight ? string : '') + createPadding(string, length, chars) + (fromRight ? '' : string); }; } /** * Pads `string` on the left side if it is shorter than `length`. Padding * characters are truncated if they exceed `length`. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to pad. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the padded string. * @example * * _.padLeft('abc', 6); * // => ' abc' * * _.padLeft('abc', 6, '_-'); * // => '_-_abc' * * _.padLeft('abc', 3); * // => 'abc' */ var padLeft = createPadDir(); module.exports = padLeft; ././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000755000000000000000000000000012631326456032176 5ustar 00000000000000././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000463112631326456032204 0ustar 00000000000000{ "name": "lodash.padleft", "version": "3.1.1", "description": "The modern build of lodash’s `_.padLeft` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing 
details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0", "lodash._createpadding": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.padleft@3.1.1", "_shasum": "150151f1e0245edba15d50af2d71f1d5cff46530", "_from": "lodash.padleft@>=3.0.0 <4.0.0", "_npmVersion": "2.9.0", "_nodeVersion": "0.12.2", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "150151f1e0245edba15d50af2d71f1d5cff46530", "tarball": "http://registry.npmjs.org/lodash.padleft/-/lodash.padleft-3.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.padleft/-/lodash.padleft-3.1.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000021300000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000755000000000000000000000000012631326456032176 5ustar 00000000000000././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000755000000000000000000000000012631326456032176 5ustar 00000000000000././@LongLink0000000000000000000000000000022200000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000232112631326456032176 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000105312631326456032177 0ustar 00000000000000# lodash._basetostring v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseToString` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basetostring ``` In Node.js/io.js: ```js var baseToString = require('lodash._basetostring'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basetostring) for more details. ././@LongLink0000000000000000000000000000022300000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000134212631326456032200 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Converts `value` to a string if it's not one. An empty string is returned * for `null` or `undefined` values. * * @private * @param {*} value The value to process. * @returns {string} Returns the string. */ function baseToString(value) { return value == null ? 
'' : (value + ''); } module.exports = baseToString; ././@LongLink0000000000000000000000000000022700000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000442512631326456032205 0ustar 00000000000000{ "name": "lodash._basetostring", "version": "3.0.1", "description": "The modern build of lodash’s internal `baseToString` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basetostring@3.0.1", "_shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "_from": "lodash._basetostring@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "tarball": "http://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000022300000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000232112631326456032176 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022500000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000106112631326456032176 0ustar 00000000000000# lodash._createpadding v3.6.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createPadding` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._createpadding ``` In Node.js/io.js: ```js var createPadding = require('lodash._createpadding'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash._createpadding) for more details. ././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000254212631326456032203 0ustar 00000000000000/** * lodash 3.6.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var repeat = require('lodash.repeat'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeIsFinite = global.isFinite; /** * Creates the padding required for `string` based on the given `length`. 
* The `chars` string is truncated if the number of characters exceeds `length`. * * @private * @param {string} string The string to create padding for. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the pad for `string`. */ function createPadding(string, length, chars) { var strLength = string.length; length = +length; if (strLength >= length || !nativeIsFinite(length)) { return ''; } var padLength = length - strLength; chars = chars == null ? ' ' : (chars + ''); return repeat(chars, nativeCeil(padLength / chars.length)).slice(0, padLength); } module.exports = createPadding; ././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000755000000000000000000000000012631326456032176 5ustar 00000000000000././@LongLink0000000000000000000000000000023000000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000452412631326456032205 0ustar 00000000000000{ "name": "lodash._createpadding", "version": "3.6.1", "description": "The modern build of lodash’s internal `createPadding` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.repeat": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._createpadding@3.6.1", "_shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "_from": "lodash._createpadding@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "tarball": "http://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000024700000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000755000000000000000000000000012631326456032176 5ustar 00000000000000././@LongLink0000000000000000000000000000025600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000232112631326456032176 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000026000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000105712631326456032203 0ustar 00000000000000# lodash.repeat v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.repeat` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.repeat ``` In Node.js/io.js: ```js var repeat = require('lodash.repeat'); ``` See the [documentation](https://lodash.com/docs#repeat) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.repeat) for more details. 
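The `lodash._createpadding` index.js reproduced above builds padding by over-repeating `chars` and slicing back down to the exact width. A hedged walk-through with assumed inputs:

```js
// Hypothetical arithmetic for createPadding('abc', 8, '_-') per index.js above:
// padLength = 8 - 3 = 5
// repeat('_-', Math.ceil(5 / 2)) => '_-_-_-' (3 repeats, 6 chars)
// '_-_-_-'.slice(0, 5)           => '_-_-_'  (truncated to padLength)
```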
././@LongLink0000000000000000000000000000025700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000273712631326456032211 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Repeats the given string `n` times. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to repeat. * @param {number} [n=0] The number of times to repeat the string. * @returns {string} Returns the repeated string. * @example * * _.repeat('*', 3); * // => '***' * * _.repeat('abc', 2); * // => 'abcabc' * * _.repeat('abc', 0); * // => '' */ function repeat(string, n) { var result = ''; string = baseToString(string); n = +n; if (n < 1 || !string || !nativeIsFinite(n)) { return result; } // Leverage the exponentiation by squaring algorithm for a faster repeat. // See https://en.wikipedia.org/wiki/Exponentiation_by_squaring for more details. do { if (n % 2) { result += string; } n = nativeFloor(n / 2); string += string; } while (n); return result; } module.exports = repeat; ././@LongLink0000000000000000000000000000026300000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padl0000644000000000000000000000455312631326456032207 0ustar 00000000000000{ "name": "lodash.repeat", "version": "3.0.1", "description": "The modern build of lodash’s `_.repeat` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.repeat@3.0.1", "_shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "_from": "lodash.repeat@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", 
"_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "tarball": "http://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/LICENSE.txtnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000232112631326456032204 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016300000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000107512631326456032211 0ustar 00000000000000# lodash.padright v3.1.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.padRight` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.padright ``` In Node.js/io.js: ```js var padRight = require('lodash.padright'); ``` See the [documentation](https://lodash.com/docs#padRight) or [package source](https://github.com/lodash/lodash/blob/3.1.1-npm-packages/lodash.padright) for more details. 
././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000301012631326456032200 0ustar 00000000000000/** * lodash 3.1.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'), createPadding = require('lodash._createpadding'); /** * Creates a function for `_.padLeft` or `_.padRight`. * * @private * @param {boolean} [fromRight] Specify padding from the right. * @returns {Function} Returns the new pad function. */ function createPadDir(fromRight) { return function(string, length, chars) { string = baseToString(string); return (fromRight ? string : '') + createPadding(string, length, chars) + (fromRight ? '' : string); }; } /** * Pads `string` on the right side if it is shorter than `length`. Padding * characters are truncated if they exceed `length`. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to pad. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the padded string. * @example * * _.padRight('abc', 6); * // => 'abc ' * * _.padRight('abc', 6, '_-'); * // => 'abc_-_' * * _.padRight('abc', 3); * // => 'abc' */ var padRight = createPadDir(true); module.exports = padRight; ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000755000000000000000000000000012631326456032204 5ustar 00000000000000././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000464112631326456032213 0ustar 00000000000000{ "name": "lodash.padright", "version": "3.1.1", "description": "The modern build of lodash’s `_.padRight` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli 
for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0", "lodash._createpadding": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.padright@3.1.1", "_shasum": "79f7770baaa39738c040aeb5465e8d88f2aacec0", "_from": "lodash.padright@>=3.0.0 <4.0.0", "_npmVersion": "2.9.0", "_nodeVersion": "0.12.2", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "79f7770baaa39738c040aeb5465e8d88f2aacec0", "tarball": "http://registry.npmjs.org/lodash.padright/-/lodash.padright-3.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.padright/-/lodash.padright-3.1.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000755000000000000000000000000012631326456032204 5ustar 00000000000000././@LongLink0000000000000000000000000000021500000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000755000000000000000000000000012631326456032204 5ustar 00000000000000././@LongLink0000000000000000000000000000022300000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000232112631326456032204 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
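Both helper modules guard their numeric argument with `global.isFinite` before looping; a hedged sketch of the edge cases that guard covers (assumed behavior, derived from the sources reproduced in this archive):

```js
// Hypothetical edge cases for the nativeIsFinite guards:
var repeat = require('lodash.repeat');

repeat('a', Infinity); // => '' — without the guard the doubling loop would never terminate
repeat('a', NaN);      // => '' — NaN < 1 is false, so the isFinite check does the rejecting
```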
././@LongLink0000000000000000000000000000022500000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000105312631326456032205 0ustar 00000000000000# lodash._basetostring v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseToString` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basetostring ``` In Node.js/io.js: ```js var baseToString = require('lodash._basetostring'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basetostring) for more details. ././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000134212631326456032206 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Converts `value` to a string if it's not one. An empty string is returned * for `null` or `undefined` values. * * @private * @param {*} value The value to process. * @returns {string} Returns the string. */ function baseToString(value) { return value == null ? 
'' : (value + ''); } module.exports = baseToString; ././@LongLink0000000000000000000000000000023000000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000442512631326456032213 0ustar 00000000000000{ "name": "lodash._basetostring", "version": "3.0.1", "description": "The modern build of lodash’s internal `baseToString` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basetostring@3.0.1", "_shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "_from": "lodash._basetostring@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "tarball": "http://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000232112631326456032204 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000106112631326456032204 0ustar 00000000000000# lodash._createpadding v3.6.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createPadding` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._createpadding ``` In Node.js/io.js: ```js var createPadding = require('lodash._createpadding'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash._createpadding) for more details. ././@LongLink0000000000000000000000000000022500000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000254212631326456032211 0ustar 00000000000000/** * lodash 3.6.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var repeat = require('lodash.repeat'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeIsFinite = global.isFinite; /** * Creates the padding required for `string` based on the given `length`. 
* The `chars` string is truncated if the number of characters exceeds `length`. * * @private * @param {string} string The string to create padding for. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the pad for `string`. */ function createPadding(string, length, chars) { var strLength = string.length; length = +length; if (strLength >= length || !nativeIsFinite(length)) { return ''; } var padLength = length - strLength; chars = chars == null ? ' ' : (chars + ''); return repeat(chars, nativeCeil(padLength / chars.length)).slice(0, padLength); } module.exports = createPadding; ././@LongLink0000000000000000000000000000023200000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000755000000000000000000000000012631326456032204 5ustar 00000000000000././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000452412631326456032213 0ustar 00000000000000{ "name": "lodash._createpadding", "version": "3.6.1", "description": "The modern build of lodash’s internal `createPadding` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.repeat": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._createpadding@3.6.1", "_shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "_from": "lodash._createpadding@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "tarball": "http://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000025000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000755000000000000000000000000012631326456032204 5ustar 00000000000000././@LongLink0000000000000000000000000000025700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000232112631326456032204 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000026100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000105712631326456032211 0ustar 00000000000000# lodash.repeat v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.repeat` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.repeat ``` In Node.js/io.js: ```js var repeat = require('lodash.repeat'); ``` See the [documentation](https://lodash.com/docs#repeat) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.repeat) for more details. 
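The `index.js` that follows implements `_.repeat` with exponentiation by squaring rather than naive concatenation. A minimal standalone sketch of that doubling strategy (illustrative only; this is not the package's code):

```js
// Sketch of repeat-by-doubling (exponentiation by squaring): walk the
// binary digits of `n`, appending the current block for each set bit and
// doubling the block at each step, giving O(log n) concatenations.
function repeatBySquaring (string, n) {
  var result = ''
  while (n > 0) {
    if (n % 2) result += string // low bit set: take the current block
    n = Math.floor(n / 2)       // move to the next binary digit
    if (n) string += string     // double the block for that digit
  }
  return result
}

console.log(repeatBySquaring('abc', 2)) // => 'abcabc'
console.log(repeatBySquaring('*', 3))   // => '***'
```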
././@LongLink0000000000000000000000000000026000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000273712631326456032217 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Repeats the given string `n` times. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to repeat. * @param {number} [n=0] The number of times to repeat the string. * @returns {string} Returns the repeated string. * @example * * _.repeat('*', 3); * // => '***' * * _.repeat('abc', 2); * // => 'abcabc' * * _.repeat('abc', 0); * // => '' */ function repeat(string, n) { var result = ''; string = baseToString(string); n = +n; if (n < 1 || !string || !nativeIsFinite(n)) { return result; } // Leverage the exponentiation by squaring algorithm for a faster repeat. // See https://en.wikipedia.org/wiki/Exponentiation_by_squaring for more details. do { if (n % 2) { result += string; } n = nativeFloor(n / 2); string += string; } while (n); return result; } module.exports = repeat; ././@LongLink0000000000000000000000000000026400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padr0000644000000000000000000000455312631326456032215 0ustar 00000000000000{ "name": "lodash.repeat", "version": "3.0.1", "description": "The modern build of lodash’s `_.repeat` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.repeat@3.0.1", "_shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "_from": "lodash.repeat@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", 
"_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "tarball": "http://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/node_modules/gauge/test/progress-bar.js0000644000000000000000000001257212631326456031341 0ustar 00000000000000"use strict" var test = require("tap").test var ProgressBar = require("../progress-bar.js") var cursor = [] var C var bar = new ProgressBar({theme: ProgressBar.ascii}, C = { show: function () { cursor.push(["show"]) return C }, hide: function () { cursor.push(["hide"]) return C }, up: function (lines) { cursor.push(["up",lines]) return C }, horizontalAbsolute: function (col) { cursor.push(["horizontalAbsolute", col]) return C }, eraseLine: function () { cursor.push(["eraseLine"]) return C }, write: function (line) { cursor.push(["write", line]) return C } }) function isOutput(t, msg, output) { var tests = [] for (var ii = 0; ii P | |----|\n' ], [ 'show' ] ]) }) test("window resizing", function (t) { t.plan(16) process.stderr.isTTY = true process.stdout.columns = 32 bar.show("NAME", 0.1) cursor = [] bar.last = new Date(0) bar.pulse() isOutput(t, "32 columns", [ [ 'up', 1 ], [ 'hide' ], [ 'horizontalAbsolute', 0 ], [ 'write', 'NAME / |##------------------|\n' ], [ 'show' ] ]) process.stdout.columns = 16 bar.show("NAME", 0.5) cursor = [] bar.last = new Date(0) bar.pulse() isOutput(t, "16 columns", [ [ 'up', 1 ], [ 'hide' ], [ 'horizontalAbsolute', 0 ], [ 'write', 'NAME - |##--|\n' ], [ 'show' ] ]); }); npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/test/basic.js0000644000000000000000000002147012631326456024244 0ustar 00000000000000var tap = require('tap') var log = require('../') var result = [] var logEvents = [] var logInfoEvents = [] var logPrefixEvents = [] var util = require('util') var resultExpect = [ '\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[7msill\u001b[0m \u001b[0m\u001b[35msilly prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[34m\u001b[40mverb\u001b[0m \u001b[0m\u001b[35mverbose prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32minfo\u001b[0m \u001b[0m\u001b[35minfo prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32m\u001b[40mhttp\u001b[0m \u001b[0m\u001b[35mhttp prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[30m\u001b[43mWARN\u001b[0m \u001b[0m\u001b[35mwarn prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35merror prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32minfo\u001b[0m \u001b[0m\u001b[35minfo prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32m\u001b[40mhttp\u001b[0m \u001b[0m\u001b[35mhttp 
prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[30m\u001b[43mWARN\u001b[0m \u001b[0m\u001b[35mwarn prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35merror prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m This is a longer\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m message, with some details\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m and maybe a stack.\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m \n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u0007noise\u001b[0m\u001b[35m\u001b[0m LOUD NOISES\n', '\u001b[0m' ] var logPrefixEventsExpect = [ { id: 2, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 9, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 16, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] } ] // should be the same. var logInfoEventsExpect = logPrefixEventsExpect var logEventsExpect = [ { id: 0, level: 'silly', prefix: 'silly prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 1, level: 'verbose', prefix: 'verbose prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 2, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 3, level: 'http', prefix: 'http prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 4, level: 'warn', prefix: 'warn prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 5, level: 'error', prefix: 'error prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 6, level: 'silent', prefix: 'silent prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 7, level: 'silly', prefix: 'silly prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 8, level: 'verbose', prefix: 'verbose prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 9, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 10, level: 'http', prefix: 'http prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 11, level: 'warn', prefix: 'warn prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 12, level: 'error', prefix: 'error prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 13, level: 'silent', prefix: 'silent prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 14, level: 
'silly', prefix: 'silly prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 15, level: 'verbose', prefix: 'verbose prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 16, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 17, level: 'http', prefix: 'http prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 18, level: 'warn', prefix: 'warn prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 19, level: 'error', prefix: 'error prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 20, level: 'silent', prefix: 'silent prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 21, level: 'error', prefix: '404', message: 'This is a longer\nmessage, with some details\nand maybe a stack.\n', messageRaw: [ 'This is a longer\nmessage, with some details\nand maybe a stack.\n' ] }, { id: 22, level: 'noise', prefix: false, message: 'LOUD NOISES', messageRaw: [ 'LOUD NOISES' ] } ] var Stream = require('stream').Stream var s = new Stream() s.write = function (m) { result.push(m) } s.writable = true s.isTTY = true s.end = function () {} log.stream = s log.heading = 'npm' tap.test('basic', function (t) { log.on('log', logEvents.push.bind(logEvents)) log.on('log.info', logInfoEvents.push.bind(logInfoEvents)) log.on('info prefix', logPrefixEvents.push.bind(logPrefixEvents)) console.error('log.level=silly') log.level = 'silly' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=silent') log.level = 'silent' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=info') log.level = 'info' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('404', 'This is a longer\n'+ 'message, with some details\n'+ 'and maybe a stack.\n') log.addLevel('noise', 10000, {beep: true}) log.noise(false, 'LOUD NOISES') t.deepEqual(result.join('').trim(), resultExpect.join('').trim(), 'result') t.deepEqual(log.record, logEventsExpect, 'record') t.deepEqual(logEvents, logEventsExpect, 'logEvents') t.deepEqual(logInfoEvents, logInfoEventsExpect, 'logInfoEvents') t.deepEqual(logPrefixEvents, logPrefixEventsExpect, 'logPrefixEvents') t.end() }) 
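The basic.js test above exercises npmlog's public surface: a swappable output `stream`, a `heading` printed before every record, leveled log methods that take a prefix plus a `util.format`-style message, and `addLevel()` for defining custom levels. A condensed usage sketch of that same API (illustrative; not part of the test suite):

```js
var log = require('npmlog')

log.heading = 'demo' // printed at the start of every line, like 'npm' above
log.level = 'info'   // records below 'info' (silly, verbose) are not written

log.info('startup', 'x = %j', { foo: 'bar' }) // demo info startup x = {"foo":"bar"}
log.warn('config', 'falling back to defaults')

// define a custom level, mirroring the addLevel() call the test makes
log.addLevel('noise', 10000, { beep: true })
log.noise(false, 'LOUD NOISES') // a false prefix suppresses the prefix field
```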
npm_3.5.2.orig/node_modules/node-gyp/node_modules/npmlog/test/progress.js0000644000000000000000000000621612631326456025030 0ustar 00000000000000'use strict' var test = require('tap').test var log = require('../log.js') var actions = [] log.gauge = { enable: function () { actions.push(['enable']) }, disable: function () { actions.push(['disable']) }, hide: function () { actions.push(['hide']) }, show: function (name, completed) { actions.push(['show', name, completed]) }, pulse: function (name) { actions.push(['pulse', name]) } } function didActions(t, msg, output) { var tests = [] for (var ii = 0; ii < output.length; ++ ii) { for (var jj = 0; jj < output[ii].length; ++ jj) { tests.push({cmd: ii, arg: jj}) } } t.is(actions.length, output.length, msg) tests.forEach(function (test) { t.is(actions[test.cmd] ? actions[test.cmd][test.arg] : null, output[test.cmd][test.arg], msg + ': ' + output[test.cmd] + (test.arg ? ' arg #'+test.arg : '')) }) actions = [] } test('enableProgress', function (t) { t.plan(6) log.enableProgress() didActions(t, 'enableProgress', [ [ 'enable' ], [ 'show', undefined, 0 ] ]) log.enableProgress() didActions(t, 'enableProgress again', []) }) test('disableProgress', function (t) { t.plan(4) log.disableProgress() didActions(t, 'disableProgress', [ [ 'hide' ], [ 'disable' ] ]) log.disableProgress() didActions(t, 'disableProgress again', []) }) test('showProgress', function (t) { t.plan(5) log.showProgress('foo') didActions(t, 'showProgress disabled', []) log.enableProgress() actions = [] log.showProgress('foo') didActions(t, 'showProgress', [ [ 'show', 'foo', 0 ] ]) }) test('clearProgress', function (t) { t.plan(3) log.clearProgress() didActions(t, 'clearProgress', [ [ 'hide' ] ]) log.disableProgress() actions = [] log.clearProgress() didActions(t, 'clearProgress disabled', [ ]) }) test("newItem", function (t) { t.plan(12) log.enableProgress() actions = [] var a = log.newItem("test", 10) didActions(t, "newItem", [ [ 'show', undefined, 0 ] ]) a.completeWork(5) didActions(t, "newItem:completeWork", [ [ 'show', 'test', 0.5 ] ]) a.finish() didActions(t, "newItem:finish", [ [ 'show', 'test', 1 ] ]) }) // test that log objects proxy through. 
And test that completion status filters up test("newGroup", function (t) { t.plan(23) var a = log.newGroup("newGroup") didActions(t, "newGroup", [ [ 'show', undefined, 0.5 ] ]) a.warn("test", "this is a test") didActions(t, "newGroup:warn", [ [ 'pulse', 'test' ], [ 'hide' ], [ 'show', undefined, 0.5 ] ]) var b = a.newItem("newGroup2", 10) didActions(t, "newGroup:newItem", [ [ 'show', 'newGroup', 0.5 ] ]) b.completeWork(5) didActions(t, "newGroup:completeWork", [ [ 'show', 'newGroup2', 0.75 ] ]) a.finish() didActions(t, "newGroup:finish", [ [ 'show', 'newGroup', 1 ] ]) }) test("newStream", function (t) { t.plan(13) var a = log.newStream("newStream", 10) didActions(t, "newStream", [ [ 'show', undefined, 0.6666666666666666 ] ]) a.write("abcde") didActions(t, "newStream", [ [ 'show', 'newStream', 0.8333333333333333 ] ]) a.write("fghij") didActions(t, "newStream", [ [ 'show', 'newStream', 1 ] ]) t.is(log.tracker.completed(), 1, "Overall completion") }) npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/.npmignore0000644000000000000000000000001612631326456024412 0ustar 00000000000000/node_modules npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/.travis.yml0000644000000000000000000000007512631326456024531 0ustar 00000000000000language: node_js node_js: - 0.8 - 0.9 - 0.10 - 0.11 npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/History.md0000644000000000000000000000067512631326456024405 0ustar 00000000000000 1.0.0 / 2014-11-11 ================== * index: add support for a configurable `property` name to use * README: fix Travis badge 0.0.2 / 2013-12-22 ================== * README++ * test: add unshift() test * test: add more tests * index: ensure that the indexed getters/setters are set up in the constructor * add .travis.yml file * add initial tests 0.0.1 / 2013-12-21 ================== * add README.md * initial commit npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/README.md0000644000000000000000000000543712631326456023702 0ustar 00000000000000path-array ========== ### Treat your `$PATH` like a JavaScript Array [![Build Status](https://travis-ci.org/TooTallNate/path-array.svg?branch=master)](https://travis-ci.org/TooTallNate/path-array) This module provides a JavaScript `Array` implementation that is backed by your `$PATH` env variable. That is, you can use regular Array functions like `shift()`, `pop()`, `push()`, `unshift()`, etc. to mutate your `$PATH`. Also works for preparing an `env` object for passing to [`child_process.spawn()`][cp.spawn]. Installation ------------ Install with `npm`: ``` bash $ npm install path-array ``` Example ------- Interacting with your own `$PATH` env variable: ``` js var PathArray = require('path-array'); // no args uses `process.env` by default var p = new PathArray(); console.log(p); // [ './node_modules/.bin', // '/opt/local/bin', // '/opt/local/sbin', // '/usr/local/bin', // '/usr/local/sbin', // '/usr/bin', // '/bin', // '/usr/sbin', // '/sbin', // '/usr/local/bin', // '/opt/X11/bin' ] // prepend another path entry. this function mutates the `process.env.PATH` p.unshift('/foo'); console.log(process.env.PATH); // '/foo:./node_modules/.bin:/opt/local/bin:/opt/local/sbin:/usr/local/bin:/usr/local/sbin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/opt/X11/bin' ``` API --- ### new PathArray([env[, property]]) → PathArray Creates and returns a new `PathArray` instance with the given `env` object. If no `env` is specified, then [`process.env`][process.env] is used by default. An optional `property` name may be passed as the second argument to back the array with an env variable other than `PATH`.
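Another sketch, assuming the optional `env` argument documented above: backing the array with a throwaway object keeps experiments from mutating the real `process.env.PATH` (the paths below are made up, and a POSIX `':'` delimiter is assumed):

``` js
var PathArray = require('path-array');

// back the array with a plain object instead of `process.env`
var env = { PATH: '/usr/bin:/bin' };
var p = new PathArray(env);

p.push('/usr/local/bin'); // mutates env.PATH only
console.log(env.PATH);
// '/usr/bin:/bin:/usr/local/bin'
console.log(p.length);
// 3
```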
License ------- (The MIT License) Copyright (c) 2013 Nathan Rajlich <nathan@tootallnate.net> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. [process.env]: http://nodejs.org/docs/latest/api/process.html#process_process_env [cp.spawn]: http://nodejs.org/docs/latest/api/child_process.html#child_process_child_process_spawn_command_args_options npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/index.js0000644000000000000000000000573512631326456024075 0ustar 00000000000000 /** * Module dependencies. */ var inherits = require('util').inherits; var delimiter = require('path').delimiter || ':'; var ArrayIndex = require('array-index'); /** * Module exports. */ module.exports = PathArray; /** * `PathArray` constructor. Treat your `$PATH` like a mutable JavaScript Array! * * @param {Env} env - `process.env` object to use. * @param {String} [property] - optional property name to use (`PATH` by default). * @public */ function PathArray (env, property) { if (!(this instanceof PathArray)) return new PathArray(env, property); ArrayIndex.call(this); this.property = property || 'PATH'; // overwrite only the `get` operator of the ".length" property Object.defineProperty(this, 'length', { get: this._getLength }); // store the `process.env` object as a non-enumerable `_env` Object.defineProperty(this, '_env', { value: env || process.env, writable: true, enumerable: false, configurable: true }); // need to invoke the `length` getter to ensure that the // indexed getters/setters are set up at this point void(this.length); } // inherit from ArrayIndex inherits(PathArray, ArrayIndex); /** * Returns the current $PATH representation as an Array. * * @api private */ PathArray.prototype._array = function () { var path = this._env[this.property]; if (!path) return []; return path.split(delimiter); }; /** * Sets the `env` object's `PATH` string to the values in the passed in Array * instance. * * @api private */ PathArray.prototype._setArray = function (arr) { // mutate the $PATH this._env[this.property] = arr.join(delimiter); }; /** * `.length` getter operation implementation. * * @api private */ PathArray.prototype._getLength = function () { var length = this._array().length; // invoke the ArrayIndex internal `set` operator to ensure that // there's getters/setters defined for the determined length so far... this.length = length; return length; }; /** * ArrayIndex [0] getter operator implementation. * * @api private */ PathArray.prototype.__get__ = function get (index) { return this._array()[index]; }; /** * ArrayIndex [0]= setter operator implementation.
* * @api private */ PathArray.prototype.__set__ = function set (index, value) { var arr = this._array(); arr[index] = value; this._setArray(arr); return value; }; /** * `toString()` returns the current $PATH string. * * @api public */ PathArray.prototype.toString = function toString () { return this._env[this.property] || ''; }; // proxy the JavaScript Array functions, and mutate the $PATH Object.getOwnPropertyNames(Array.prototype).forEach(function (name) { if ('constructor' == name) return; if ('function' != typeof Array.prototype[name]) return; if (/to(Locale)?String/.test(name)) return; //console.log('proxy %s', name); PathArray.prototype[name] = function () { var arr = this._array(); var rtn = arr[name].apply(arr, arguments); this._setArray(arr); return rtn; }; }); npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/0000755000000000000000000000000012631326456025073 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/package.json0000644000000000000000000000263012631326456024705 0ustar 00000000000000{ "name": "path-array", "version": "1.0.0", "description": "Treat your $PATH like a JavaScript Array", "main": "index.js", "scripts": { "test": "mocha --reporter spec" }, "repository": { "type": "git", "url": "git://github.com/TooTallNate/node-path-array.git" }, "keywords": [ "PATH", "env", "array" ], "author": { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://n8.io/" }, "license": "MIT", "bugs": { "url": "https://github.com/TooTallNate/node-path-array/issues" }, "homepage": "https://github.com/TooTallNate/node-path-array", "dependencies": { "array-index": "~0.1.0" }, "devDependencies": { "mocha": "~1.16.1" }, "gitHead": "5d1fedd54e4413459f67e4a4babb024144cd00d0", "_id": "path-array@1.0.0", "_shasum": "6c14130c33084f0150553c657b38397ab67aaa4e", "_from": "path-array@>=1.0.0 <2.0.0", "_npmVersion": "1.4.28", "_npmUser": { "name": "tootallnate", "email": "nathan@tootallnate.net" }, "maintainers": [ { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "dist": { "shasum": "6c14130c33084f0150553c657b38397ab67aaa4e", "tarball": "http://registry.npmjs.org/path-array/-/path-array-1.0.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/path-array/-/path-array-1.0.0.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/test/0000755000000000000000000000000012631326456023375 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/0000755000000000000000000000000012631326456027316 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/.npmignore0000644000000000000000000000001512631326456031311 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/.travis.yml0000644000000000000000000000007312631326456031427 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10" - "0.11" npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/History.md0000644000000000000000000000136312631326456031304 0ustar 00000000000000 0.1.1 / 2014-11-03 ================== * index: use `%o` debug formatters * .travis: don't test node v0.9.x * README: use svg for Travis badge * add .jshintrc file 0.1.0 / 2013-12-01 ================== * add `History.md` file * .travis.yml: test node v0.8-v0.11 * add component.json * package: update "main" field * package: beautify 0.0.4 / 2013-09-27 ================== * ensure that the `length` property has the same maximum as regular Arrays 0.0.3 / 2013-09-15 ================== * add `toArray()`, `toJSON()`, and `toString()` functions * add an `inspect()` function 0.0.2 / 2013-09-15 ================== * use "configurable: true" * add `travis.yml` file 0.0.1 / 2013-06-14 ================== * Initial release npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/Makefile0000644000000000000000000000024612631326456030760 0ustar 00000000000000 build: components index.js @component build --dev components: component.json @component install --dev clean: rm -fr build components template.js .PHONY: clean npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/README.md0000644000000000000000000001103612631326456030576 0ustar 00000000000000array-index =========== ### Invoke getter/setter functions on array-like objects [![Build Status](https://secure.travis-ci.org/TooTallNate/array-index.svg)](http://travis-ci.org/TooTallNate/array-index) This little module provides an `ArrayIndex` constructor function that you can inherit from with your own objects. When a numbered property gets read, then the `__get__` function on the object will be invoked. When a numbered property gets set, then the `__set__` function on the object will be invoked. Installation ------------ Install with `npm`: ``` bash $ npm install array-index ``` Examples -------- A quick silly example, using `Math.sqrt()` for the "getter": ``` js var ArrayIndex = require('array-index') // let's just create a singleton instance. var a = new ArrayIndex() // the "__get__" function is invoked for each "a[n]" access. // it is given a single argument, the "index" currently being accessed. // so here, we're passing in the `Math.sqrt()` function, so accessing // "a[9]" will return `Math.sqrt(9)`. a.__get__ = Math.sqrt // the "__get__" and "__set__" functions are only invoked up // to "a.length", so we must set that manually. 
a.length = 10 console.log(a) // [ 0, // 1, // 1.4142135623730951, // 1.7320508075688772, // 2, // 2.23606797749979, // 2.449489742783178, // 2.6457513110645907, // 2.8284271247461903, // 3, // __get__: [Function: sqrt] ] ``` Here's an example of creating a subclass of `ArrayIndex` using `util.inherits()`: ``` js var ArrayIndex = require('array-index') var inherits = require('util').inherits function MyArray (length) { // be sure to call the ArrayIndex constructor in your own constructor ArrayIndex.call(this, length) // the "set" object will contain values at indexes previously set, // so that they can be returned in the "getter" function. This is just a // silly example, your subclass will have more meaningful logic. Object.defineProperty(this, 'set', { value: Object.create(null), enumerable: false }) } // inherit from the ArrayIndex's prototype inherits(MyArray, ArrayIndex) MyArray.prototype.__get__ = function (index) { if (index in this.set) return this.set[index] return index * 2 } MyArray.prototype.__set__ = function (index, v) { this.set[index] = v } // and now you can create some instances var a = new MyArray(15) a[9] = a[10] = a[14] = '_' a[0] = 'nate' console.log(a) // [ 'nate', 2, 4, 6, 8, 10, 12, 14, 16, '_', '_', 22, 24, 26, '_' ] ``` API --- The `ArrayIndex` base class is meant to be subclassed, but it also has a few convenient functions built-in. ### "length" -> Number The length of the ArrayIndex instance. The `__get__` and `__set__` functions will only be invoked on the object up to this "length". You may set this length at any time to adjust the range over which the getters/setters will be invoked. ### "toArray()" -> Array Returns a new regular Array instance with the same values that this ArrayIndex class would have. This function calls the `__get__` function repeatedly from `0...length-1` and returns the "flattened" array instance. ### "toJSON()" -> Array All `ArrayIndex` instances get basic support for `JSON.stringify()`, which is the same as a "flattened" Array being stringified. ### "toString()" -> String The `toString()` override is basically just `array.toArray().toString()`. ### "inspect()" -> String The `inspect()` implementation for the REPL attempts to mimic what a regular Array looks like in the REPL. License ------- (The MIT License) Copyright (c) 2012 Nathan Rajlich <nathan@tootallnate.net> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
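One more short sketch tying together the convenience functions documented in the API section above (`toArray()`, `toJSON()`, `toString()`); the squaring getter here is hypothetical, chosen only for illustration:

``` js
var ArrayIndex = require('array-index')

var a = new ArrayIndex(4)                 // length given up front
a.__get__ = function (i) { return i * i } // hypothetical getter: squares

console.log(a.toArray())       // [ 0, 1, 4, 9 ]: flattened copy; getters no longer proxied
console.log(JSON.stringify(a)) // '[0,1,4,9]' via toJSON()
console.log(String(a))         // '0,1,4,9' via the toString() override
```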
npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/component.json0000644000000000000000000000061712631326456032217 0ustar 00000000000000{ "name": "array-index", "repo": "TooTallNate/array-index", "description": "Invoke getter/setter functions on array-like objects", "keywords": [ "index", "array", "getter", "setter", "proxy" ], "version": "0.1.1", "dependencies": { "visionmedia/debug": "*" }, "development": {}, "license": "MIT", "main": "index.js", "scripts": [ "index.js" ] } npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/index.js0000644000000000000000000000716212631326456030771 0ustar 00000000000000 /** * Module dependencies. */ var util = require('util') var debug = require('debug')('array-index') /** * JavaScript Array "length" is bound to an unsigned 32-bit int. * See: http://stackoverflow.com/a/6155063/376773 */ var MAX_LENGTH = Math.pow(2, 32) /** * Module exports. */ module.exports = ArrayIndex /** * Subclass this. */ function ArrayIndex (length) { Object.defineProperty(this, 'length', { get: getLength, set: setLength, enumerable: false, configurable: true }) Object.defineProperty(this, '__length', { value: 0, writable: true, enumerable: false, configurable: true }) if (arguments.length > 0) { this.length = length } } /** * You overwrite the "__get__" function in your subclass. */ ArrayIndex.prototype.__get__ = function () { throw new Error('you must implement the __get__ function') } /** * You overwrite the "__set__" function in your subclass. */ ArrayIndex.prototype.__set__ = function () { throw new Error('you must implement the __set__ function') } /** * Converts this array class into a real JavaScript Array. Note that this * is a "flattened" array and your defined getters and setters won't be invoked * when you interact with the returned Array. This function will call the * getter on every array index of the object. * * @return {Array} The flattened array * @api public */ ArrayIndex.prototype.toArray = function toArray () { var i = 0, l = this.length, array = new Array(l) for (; i < l; i++) { array[i] = this[i] } return array } /** * Basic support for `JSON.stringify()`. */ ArrayIndex.prototype.toJSON = function toJSON () { return this.toArray() } /** * toString() override. Use Array.prototype.toString(). */ ArrayIndex.prototype.toString = function toString () { var a = this.toArray() return a.toString.apply(a, arguments) } /** * inspect() override. For the REPL. */ ArrayIndex.prototype.inspect = function inspect () { var a = this.toArray() Object.keys(this).forEach(function (k) { a[k] = this[k] }, this) return util.inspect(a) } /** * Getter for the "length" property. * Returns the value of the "__length" property. */ function getLength () { debug('getting "length": %o', this.__length) return this.__length } /** * Setter for the "length" property. * Calls "ensureLength()", then sets the "__length" property. */ function setLength (v) { debug('setting "length": %o', v) return this.__length = ensureLength(v) } /** * Ensures that getters/setters from 0 up to "_length" have been defined * on `ArrayIndex.prototype`. 
* * @api private */ function ensureLength (_length) { var length if (_length > MAX_LENGTH) { length = MAX_LENGTH } else { length = _length | 0 } var cur = ArrayIndex.prototype.__length__ | 0 var num = length - cur if (num > 0) { var desc = {} debug('creating a descriptor object with %o entries', num) for (var i = cur; i < length; i++) { desc[i] = setup(i) } debug('done creating descriptor object') debug('calling `Object.defineProperties()` with %o entries', num) Object.defineProperties(ArrayIndex.prototype, desc) debug('finished `Object.defineProperties()`') ArrayIndex.prototype.__length__ = length } return length } /** * Returns a property descriptor for the given "index", with "get" and "set" * functions created within the closure. * * @api private */ function setup (index) { function get () { return this.__get__(index) } function set (v) { return this.__set__(index, v) } return { enumerable: true , configurable: true , get: get , set: set } } npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/0000755000000000000000000000000012631326456031773 5ustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/package.json0000644000000000000000000000265112631326456031610 0ustar 00000000000000{ "name": "array-index", "description": "Invoke getter/setter functions on array-like objects", "keywords": [ "index", "array", "getter", "setter", "proxy" ], "version": "0.1.1", "author": { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://tootallnate.net" }, "repository": { "type": "git", "url": "git://github.com/TooTallNate/array-index.git" }, "main": "index.js", "scripts": { "test": "node test" }, "dependencies": { "debug": "*" }, "engines": { "node": "*" }, "gitHead": "65a5d884f25b4b7a1608e367d715d713dbd3b3d6", "bugs": { "url": "https://github.com/TooTallNate/array-index/issues" }, "homepage": "https://github.com/TooTallNate/array-index", "_id": "array-index@0.1.1", "_shasum": "4d5eaf06cc3d925847cd73d1535c217ba306d3e1", "_from": "array-index@>=0.1.0 <0.2.0", "_npmVersion": "2.1.3", "_nodeVersion": "0.10.32", "_npmUser": { "name": "tootallnate", "email": "nathan@tootallnate.net" }, "maintainers": [ { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "dist": { "shasum": "4d5eaf06cc3d925847cd73d1535c217ba306d3e1", "tarball": "http://registry.npmjs.org/array-index/-/array-index-0.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/array-index/-/array-index-0.1.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/test.js0000644000000000000000000000262712631326456030642 0ustar 00000000000000 var ArrayIndex = require('./') var inherits = require('util').inherits var assert = require('assert') /** * Create a "subclass". */ function Arrayish (length) { ArrayIndex.call(this, length) this.sets = Object.create(null) } // inherit from `ArrayIndex` inherits(Arrayish, ArrayIndex) // create an instance and run some tests var a = new Arrayish(11) assert.throws(function () { a[0] }, /__get__/) assert.throws(function () { a[0] = 0 }, /__set__/) /** * This "getter" function checks if the index has previously been "set", and if so * returns the index * the value previously set. If the index hasn't been set, * return the index as-is.
*/ Arrayish.prototype.__get__ = function get (index) { if (index in this.sets) { return +this.sets[index] * index } else { return index } } /** * Store the last value set for this index. */ Arrayish.prototype.__set__ = function set (index, value) { this.sets[index] = value } // test getters without being "set" assert.equal(0, a[0]) assert.equal(1, a[1]) assert.equal(2, a[2]) assert.equal(3, a[3]) assert.equal(4, a[4]) // test setters, followed by getters a[10] = 1 assert.equal(10, a[10]) a[10] = 2 assert.equal(20, a[10]) a[10] = 3 assert.equal(30, a[10]) // test "length" assert.equal(11, a.length) a[4] = 20 a[6] = 5.55432 var b = [0, 1, 2, 3, 80, 5, 33.325919999999996, 7, 8, 9, 30] assert.equal(JSON.stringify(b), JSON.stringify(a)) ././@LongLink0000000000000000000000000000015200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000755000000000000000000000000012631326456032137 5ustar 00000000000000././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/.npmignorenpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000005212631326456032136 0ustar 00000000000000support test examples example *.sock dist ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/History.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000001275312631326456032151 0ustar 00000000000000 2.2.0 / 2015-05-09 ================== * package: update "ms" to v0.7.1 (#202, @dougwilson) * README: add logging to file example (#193, @DanielOchoa) * README: fixed a typo (#191, @amir-s) * browser: expose `storage` (#190, @stephenmathieson) * Makefile: add a `distclean` target (#189, @stephenmathieson) 2.1.3 / 2015-03-13 ================== * Updated stdout/stderr example (#186) * Updated example/stdout.js to match debug current behaviour * Renamed example/stderr.js to stdout.js * Update Readme.md (#184) * replace high intensity foreground color for bold (#182, #183) 2.1.2 / 2015-03-01 ================== * dist: recompile * update "ms" to v0.7.0 * package: update "browserify" to v9.0.3 * component: fix "ms.js" repo location * changed bower package name * updated documentation about using debug in a browser * fix: security error on safari (#167, #168, @yields) 2.1.1 / 2014-12-29 ================== * browser: use `typeof` to check for `console` existence * browser: check for `console.log` truthiness (fix IE 8/9) * browser: add support for Chrome apps * Readme: added Windows usage remarks * Add `bower.json` to properly support bower install 2.1.0 / 2014-10-15 ================== * node: implement `DEBUG_FD` env variable support * package: update "browserify" to v6.1.0 * package: add "license" field to package.json (#135, @panuhorsmalahti) 2.0.0 / 2014-09-01 ================== * package: update "browserify" to v5.11.0 * node: use stderr rather than stdout for logging (#29, @stephenmathieson) 1.0.4 / 2014-07-15 ================== * dist: recompile * example: remove `console.info()` log usage * example: add "Content-Type" UTF-8 header to browser 
example * browser: place %c marker after the space character * browser: reset the "content" color via `color: inherit` * browser: add colors support for Firefox >= v31 * debug: prefer an instance `log()` function over the global one (#119) * Readme: update documentation about styled console logs for FF v31 (#116, @wryk) 1.0.3 / 2014-07-09 ================== * Add support for multiple wildcards in namespaces (#122, @seegno) * browser: fix lint 1.0.2 / 2014-06-10 ================== * browser: update color palette (#113, @gscottolson) * common: make console logging function configurable (#108, @timoxley) * node: fix %o colors on old node <= 0.8.x * Makefile: find node path using shell/which (#109, @timoxley) 1.0.1 / 2014-06-06 ================== * browser: use `removeItem()` to clear localStorage * browser, node: don't set DEBUG if namespaces is undefined (#107, @leedm777) * package: add "contributors" section * node: fix comment typo * README: list authors 1.0.0 / 2014-06-04 ================== * make ms diff be global, not be scope * debug: ignore empty strings in enable() * node: make DEBUG_COLORS able to disable coloring * *: export the `colors` array * npmignore: don't publish the `dist` dir * Makefile: refactor to use browserify * package: add "browserify" as a dev dependency * Readme: add Web Inspector Colors section * node: reset terminal color for the debug content * node: map "%o" to `util.inspect()` * browser: map "%j" to `JSON.stringify()` * debug: add custom "formatters" * debug: use "ms" module for humanizing the diff * Readme: add "bash" syntax highlighting * browser: add Firebug color support * browser: add colors for WebKit browsers * node: apply log to `console` * rewrite: abstract common logic for Node & browsers * add .jshintrc file 0.8.1 / 2014-04-14 ================== * package: re-add the "component" section 0.8.0 / 2014-03-30 ================== * add `enable()` method for nodejs. Closes #27 * change from stderr to stdout * remove unnecessary index.js file 0.7.4 / 2013-11-13 ================== * remove "browserify" key from package.json (fixes something in browserify) 0.7.3 / 2013-10-30 ================== * fix: catch localStorage security error when cookies are blocked (Chrome) * add debug(err) support. Closes #46 * add .browser prop to package.json. Closes #42 0.7.2 / 2013-02-06 ================== * fix package.json * fix: Mobile Safari (private mode) is broken with debug * fix: Use unicode to send escape character to shell instead of octal to work with strict mode javascript 0.7.1 / 2013-02-05 ================== * add repository URL to package.json * add DEBUG_COLORED to force colored output * add browserify support * fix component. Closes #24 0.7.0 / 2012-05-04 ================== * Added .component to package.json * Added debug.component.js build 0.6.0 / 2012-03-16 ================== * Added support for "-" prefix in DEBUG [Vinay Pulim] * Added `.enabled` flag to the node version [TooTallNate] 0.5.0 / 2012-02-02 ================== * Added: humanize diffs. Closes #8 * Added `debug.disable()` to the CS variant * Removed padding. Closes #10 * Fixed: persist client-side variant again. 
Closes #9 0.4.0 / 2012-02-01 ================== * Added browser variant support for older browsers [TooTallNate] * Added `debug.enable('project:*')` to browser variant [TooTallNate] * Added padding to diff (moved it to the right) 0.3.0 / 2012-01-26 ================== * Added millisecond diff when isatty, otherwise UTC string 0.2.0 / 2012-01-22 ================== * Added wildcard support 0.1.0 / 2011-12-02 ================== * Added: remove colors unless stderr isatty [TooTallNate] 0.0.1 / 2010-01-03 ================== * Initial release ././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/Makefilenpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000131312631326456032139 0ustar 00000000000000 # get Makefile directory name: http://stackoverflow.com/a/5982798/376773 THIS_MAKEFILE_PATH:=$(word $(words $(MAKEFILE_LIST)),$(MAKEFILE_LIST)) THIS_DIR:=$(shell cd $(dir $(THIS_MAKEFILE_PATH));pwd) # BIN directory BIN := $(THIS_DIR)/node_modules/.bin # applications NODE ?= $(shell which node) NPM ?= $(NODE) $(shell which npm) BROWSERIFY ?= $(NODE) $(BIN)/browserify all: dist/debug.js install: node_modules clean: @rm -rf dist dist: @mkdir -p $@ dist/debug.js: node_modules browser.js debug.js dist @$(BROWSERIFY) \ --standalone debug \ . > $@ distclean: clean @rm -rf node_modules node_modules: package.json @NODE_ENV= $(NPM) install @touch node_modules .PHONY: all install clean distclean ././@LongLink0000000000000000000000000000016300000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/Readme.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000001424612631326456032148 0ustar 00000000000000# debug tiny node.js debugging utility modelled after node core's debugging technique. ## Installation ```bash $ npm install debug ``` ## Usage With `debug` you simply invoke the exported function to generate your debug function, passing it a name which will determine if a noop function is returned, or a decorated `console.error`, so all of the `console` format string goodies you're used to work fine. A unique color is selected per-function for visibility. Example _app.js_: ```js var debug = require('debug')('http') , http = require('http') , name = 'My App'; // fake app debug('booting %s', name); http.createServer(function(req, res){ debug(req.method + ' ' + req.url); res.end('hello\n'); }).listen(3000, function(){ debug('listening'); }); // fake worker of some kind require('./worker'); ``` Example _worker.js_: ```js var debug = require('debug')('worker'); setInterval(function(){ debug('doing some work'); }, 1000); ``` The __DEBUG__ environment variable is then used to enable these based on space or comma-delimited names. Here are some examples: ![debug http and worker](http://f.cl.ly/items/18471z1H402O24072r1J/Screenshot.png) ![debug worker](http://f.cl.ly/items/1X413v1a3M0d3C2c1E0i/Screenshot.png) #### Windows note On Windows the environment variable is set using the `set` command. ```cmd set DEBUG=*,-not_this ``` Then, run the program to be debugged as usual. ## Millisecond diff When actively developing an application it can be useful to see how much time is spent between one `debug()` call and the next.
Suppose for example you invoke `debug()` before requesting a resource, and after as well; the "+NNNms" output will show you how much time was spent between calls. ![](http://f.cl.ly/items/2i3h1d3t121M2Z1A3Q0N/Screenshot.png) When stdout is not a TTY, `Date#toUTCString()` is used, making it more useful for logging the debug information as shown below: ![](http://f.cl.ly/items/112H3i0e0o0P0a2Q2r11/Screenshot.png) ## Conventions If you're using this in one or more of your libraries, you _should_ use the name of your library so that developers may toggle debugging as desired without guessing names. If you have more than one debugger you _should_ prefix them with your library name and use ":" to separate features. For example "bodyParser" from Connect would then be "connect:bodyParser". ## Wildcards The `*` character may be used as a wildcard. Suppose for example your library has debuggers named "connect:bodyParser", "connect:compress", "connect:session"; instead of listing all three with `DEBUG=connect:bodyParser,connect:compress,connect:session`, you may simply do `DEBUG=connect:*`, or to run everything using this module simply use `DEBUG=*`. You can also exclude specific debuggers by prefixing them with a "-" character. For example, `DEBUG=*,-connect:*` would include all debuggers except those starting with "connect:". ## Browser support Debug works in the browser as well, currently persisted by `localStorage`. Consider the situation shown below where you have `worker:a` and `worker:b`, and wish to debug both. Somewhere in the code on your page, include: ```js window.myDebug = require("debug"); ``` ("debug" is a global object in the browser so we give this object a different name.) When your page is open in the browser, type the following in the console: ```js myDebug.enable("worker:*") ``` Refresh the page. Debug output will continue to be sent to the console until it is disabled by typing `myDebug.disable()` in the console. ```js a = debug('worker:a'); b = debug('worker:b'); setInterval(function(){ a('doing some work'); }, 1000); setInterval(function(){ b('doing some work'); }, 1200); ``` #### Web Inspector Colors Colors are also enabled on "Web Inspectors" that understand the `%c` formatting option. These are WebKit web inspectors, Firefox ([since version 31](https://hacks.mozilla.org/2014/05/editable-box-model-multiple-selection-sublime-text-keys-much-more-firefox-developer-tools-episode-31/)) and the Firebug plugin for Firefox (any version). Colored output looks something like: ![](https://cloud.githubusercontent.com/assets/71256/3139768/b98c5fd8-e8ef-11e3-862a-f7253b6f47c6.png) ### stderr vs stdout You can set an alternative logging method by overriding the `log` method on a per-namespace basis, or globally: Example _stdout.js_: ```js var debug = require('debug'); var error = debug('app:error'); // by default stderr is used error('goes to stderr!'); var log = debug('app:log'); // set this namespace to log via console.log log.log = console.log.bind(console); // don't forget to bind to console! log('goes to stdout'); error('still goes to stderr!'); // set all output to go via console.info // overrides all per-namespace log settings debug.log = console.info.bind(console); error('now goes to stdout via console.info'); log('still goes to stdout, but via console.info now'); ``` ### Save debug output to a file You can save all debug statements to a file by piping them.
Example: ```bash $ DEBUG_FD=3 node your-app.js 3> whatever.log ``` ## Authors - TJ Holowaychuk - Nathan Rajlich ## License (The MIT License) Copyright (c) 2014 TJ Holowaychuk <tj@vision-media.ca> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/bower.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000074012631326456032142 0ustar 00000000000000{ "name": "visionmedia-debug", "main": "dist/debug.js", "version": "2.2.0", "homepage": "https://github.com/visionmedia/debug", "authors": [ "TJ Holowaychuk " ], "description": "visionmedia-debug", "moduleType": [ "amd", "es6", "globals", "node" ], "keywords": [ "visionmedia", "debug" ], "license": "MIT", "ignore": [ "**/.*", "node_modules", "bower_components", "test", "tests" ] } ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/browser.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000726312631326456032151 0ustar 00000000000000 /** * This is the web browser implementation of `debug()`. * * Expose `debug()` as the module. */ exports = module.exports = require('./debug'); exports.log = log; exports.formatArgs = formatArgs; exports.save = save; exports.load = load; exports.useColors = useColors; exports.storage = 'undefined' != typeof chrome && 'undefined' != typeof chrome.storage ? chrome.storage.local : localstorage(); /** * Colors. */ exports.colors = [ 'lightseagreen', 'forestgreen', 'goldenrod', 'dodgerblue', 'darkorchid', 'crimson' ]; /** * Currently only WebKit-based Web Inspectors, Firefox >= v31, * and the Firebug extension (any Firefox version) are known * to support "%c" CSS customizations. * * TODO: add a `localStorage` variable to explicitly enable/disable colors */ function useColors() { // is webkit? http://stackoverflow.com/a/16459606/376773 return ('WebkitAppearance' in document.documentElement.style) || // is firebug? http://stackoverflow.com/a/398120/376773 (window.console && (console.firebug || (console.exception && console.table))) || // is firefox >= v31? 
// https://developer.mozilla.org/en-US/docs/Tools/Web_Console#Styling_messages (navigator.userAgent.toLowerCase().match(/firefox\/(\d+)/) && parseInt(RegExp.$1, 10) >= 31); } /** * Map %j to `JSON.stringify()`, since no Web Inspectors do that by default. */ exports.formatters.j = function(v) { return JSON.stringify(v); }; /** * Colorize log arguments if enabled. * * @api public */ function formatArgs() { var args = arguments; var useColors = this.useColors; args[0] = (useColors ? '%c' : '') + this.namespace + (useColors ? ' %c' : ' ') + args[0] + (useColors ? '%c ' : ' ') + '+' + exports.humanize(this.diff); if (!useColors) return args; var c = 'color: ' + this.color; args = [args[0], c, 'color: inherit'].concat(Array.prototype.slice.call(args, 1)); // the final "%c" is somewhat tricky, because there could be other // arguments passed either before or after the %c, so we need to // figure out the correct index to insert the CSS into var index = 0; var lastC = 0; args[0].replace(/%[a-z%]/g, function(match) { if ('%%' === match) return; index++; if ('%c' === match) { // we only are interested in the *last* %c // (the user may have provided their own) lastC = index; } }); args.splice(lastC, 0, c); return args; } /** * Invokes `console.log()` when available. * No-op when `console.log` is not a "function". * * @api public */ function log() { // this hackery is required for IE8/9, where // the `console.log` function doesn't have 'apply' return 'object' === typeof console && console.log && Function.prototype.apply.call(console.log, console, arguments); } /** * Save `namespaces`. * * @param {String} namespaces * @api private */ function save(namespaces) { try { if (null == namespaces) { exports.storage.removeItem('debug'); } else { exports.storage.debug = namespaces; } } catch(e) {} } /** * Load `namespaces`. * * @return {String} returns the previously persisted debug modes * @api private */ function load() { var r; try { r = exports.storage.debug; } catch(e) {} return r; } /** * Enable namespaces listed in `localStorage.debug` initially. */ exports.enable(load()); /** * Localstorage attempts to return the localstorage. * * This is necessary because safari throws * when a user disables cookies/localstorage * and you attempt to access it. * * @return {LocalStorage} * @api private */ function localstorage(){ try { return window.localStorage; } catch (e) {} } ././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/component.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000046512631326456032146 0ustar 00000000000000{ "name": "debug", "repo": "visionmedia/debug", "description": "small debugging utility", "version": "2.2.0", "keywords": [ "debug", "log", "debugger" ], "main": "browser.js", "scripts": [ "browser.js", "debug.js" ], "dependencies": { "rauchg/ms.js": "0.7.1" } } ././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/debug.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000001000012631326456032130 0ustar 00000000000000 /** * This is the common logic for both the Node.js and web browser * implementations of `debug()`. * * Expose `debug()` as the module. 
*/ exports = module.exports = debug; exports.coerce = coerce; exports.disable = disable; exports.enable = enable; exports.enabled = enabled; exports.humanize = require('ms'); /** * The currently active debug mode names, and names to skip. */ exports.names = []; exports.skips = []; /** * Map of special "%n" handling functions, for the debug "format" argument. * * Valid key names are a single, lowercased letter, i.e. "n". */ exports.formatters = {}; /** * Previously assigned color. */ var prevColor = 0; /** * Previous log timestamp. */ var prevTime; /** * Select a color. * * @return {Number} * @api private */ function selectColor() { return exports.colors[prevColor++ % exports.colors.length]; } /** * Create a debugger with the given `namespace`. * * @param {String} namespace * @return {Function} * @api public */ function debug(namespace) { // define the `disabled` version function disabled() { } disabled.enabled = false; // define the `enabled` version function enabled() { var self = enabled; // set `diff` timestamp var curr = +new Date(); var ms = curr - (prevTime || curr); self.diff = ms; self.prev = prevTime; self.curr = curr; prevTime = curr; // add the `color` if not set if (null == self.useColors) self.useColors = exports.useColors(); if (null == self.color && self.useColors) self.color = selectColor(); var args = Array.prototype.slice.call(arguments); args[0] = exports.coerce(args[0]); if ('string' !== typeof args[0]) { // anything else let's inspect with %o args = ['%o'].concat(args); } // apply any `formatters` transformations var index = 0; args[0] = args[0].replace(/%([a-z%])/g, function(match, format) { // if we encounter an escaped % then don't increase the array index if (match === '%%') return match; index++; var formatter = exports.formatters[format]; if ('function' === typeof formatter) { var val = args[index]; match = formatter.call(self, val); // now we need to remove `args[index]` since it's inlined in the `format` args.splice(index, 1); index--; } return match; }); if ('function' === typeof exports.formatArgs) { args = exports.formatArgs.apply(self, args); } var logFn = enabled.log || exports.log || console.log.bind(console); logFn.apply(self, args); } enabled.enabled = true; var fn = exports.enabled(namespace) ? enabled : disabled; fn.namespace = namespace; return fn; } /** * Enables a debug mode by namespaces. This can include modes * separated by a colon and wildcards. * * @param {String} namespaces * @api public */ function enable(namespaces) { exports.save(namespaces); var split = (namespaces || '').split(/[\s,]+/); var len = split.length; for (var i = 0; i < len; i++) { if (!split[i]) continue; // ignore empty strings namespaces = split[i].replace(/\*/g, '.*?'); if (namespaces[0] === '-') { exports.skips.push(new RegExp('^' + namespaces.substr(1) + '$')); } else { exports.names.push(new RegExp('^' + namespaces + '$')); } } } /** * Disable debug output. * * @api public */ function disable() { exports.enable(''); } /** * Returns true if the given mode name is enabled, false otherwise. * * @param {String} name * @return {Boolean} * @api public */ function enabled(name) { var i, len; for (i = 0, len = exports.skips.length; i < len; i++) { if (exports.skips[i].test(name)) { return false; } } for (i = 0, len = exports.names.length; i < len; i++) { if (exports.names[i].test(name)) { return true; } } return false; } /** * Coerce `val`. 
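 * (Error instances are flattened to their stack or message below so they format usefully; all other values pass through unchanged.)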
* * @param {Mixed} val * @return {Mixed} * @api private */ function coerce(val) { if (val instanceof Error) return val.stack || val.message; return val; } ././@LongLink0000000000000000000000000000016100000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/node.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000001122612631326456032143 0ustar 00000000000000 /** * Module dependencies. */ var tty = require('tty'); var util = require('util'); /** * This is the Node.js implementation of `debug()`. * * Expose `debug()` as the module. */ exports = module.exports = require('./debug'); exports.log = log; exports.formatArgs = formatArgs; exports.save = save; exports.load = load; exports.useColors = useColors; /** * Colors. */ exports.colors = [6, 2, 3, 4, 5, 1]; /** * The file descriptor to write the `debug()` calls to. * Set the `DEBUG_FD` env variable to override with another value. i.e.: * * $ DEBUG_FD=3 node script.js 3>debug.log */ var fd = parseInt(process.env.DEBUG_FD, 10) || 2; var stream = 1 === fd ? process.stdout : 2 === fd ? process.stderr : createWritableStdioStream(fd); /** * Is stdout a TTY? Colored output is enabled when `true`. */ function useColors() { var debugColors = (process.env.DEBUG_COLORS || '').trim().toLowerCase(); if (0 === debugColors.length) { return tty.isatty(fd); } else { return '0' !== debugColors && 'no' !== debugColors && 'false' !== debugColors && 'disabled' !== debugColors; } } /** * Map %o to `util.inspect()`, since Node doesn't do that out of the box. */ var inspect = (4 === util.inspect.length ? // node <= 0.8.x function (v, colors) { return util.inspect(v, void 0, void 0, colors); } : // node > 0.8.x function (v, colors) { return util.inspect(v, { colors: colors }); } ); exports.formatters.o = function(v) { return inspect(v, this.useColors) .replace(/\s*\n\s*/g, ' '); }; /** * Adds ANSI color escape codes if enabled. * * @api public */ function formatArgs() { var args = arguments; var useColors = this.useColors; var name = this.namespace; if (useColors) { var c = this.color; args[0] = ' \u001b[3' + c + ';1m' + name + ' ' + '\u001b[0m' + args[0] + '\u001b[3' + c + 'm' + ' +' + exports.humanize(this.diff) + '\u001b[0m'; } else { args[0] = new Date().toUTCString() + ' ' + name + ' ' + args[0]; } return args; } /** * Invokes `console.error()` with the specified arguments. */ function log() { return stream.write(util.format.apply(this, arguments) + '\n'); } /** * Save `namespaces`. * * @param {String} namespaces * @api private */ function save(namespaces) { if (null == namespaces) { // If you set a process.env field to null or undefined, it gets cast to the // string 'null' or 'undefined'. Just delete instead. delete process.env.DEBUG; } else { process.env.DEBUG = namespaces; } } /** * Load `namespaces`. * * @return {String} returns the previously persisted debug modes * @api private */ function load() { return process.env.DEBUG; } /** * Copied from `node/src/node.js`. * * XXX: It's lame that node doesn't expose this API out-of-the-box. It also * relies on the undocumented `tty_wrap.guessHandleType()` which is also lame. 
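 * (guessHandleType() classifies the fd as 'TTY', 'FILE', 'PIPE' or 'TCP', which is what the switch below uses to pick a stream class.)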
*/ function createWritableStdioStream (fd) { var stream; var tty_wrap = process.binding('tty_wrap'); // Note stream._type is used for test-module-load-list.js switch (tty_wrap.guessHandleType(fd)) { case 'TTY': stream = new tty.WriteStream(fd); stream._type = 'tty'; // Hack to have stream not keep the event loop alive. // See https://github.com/joyent/node/issues/1726 if (stream._handle && stream._handle.unref) { stream._handle.unref(); } break; case 'FILE': var fs = require('fs'); stream = new fs.SyncWriteStream(fd, { autoClose: false }); stream._type = 'fs'; break; case 'PIPE': case 'TCP': var net = require('net'); stream = new net.Socket({ fd: fd, readable: false, writable: true }); // FIXME Should probably have an option in net.Socket to create a // stream from an existing fd which is writable only. But for now // we'll just add this hack and set the `readable` member to false. // Test: ./node test/fixtures/echo.js < /etc/passwd stream.readable = false; stream.read = null; stream._type = 'pipe'; // FIXME Hack to have stream not keep the event loop alive. // See https://github.com/joyent/node/issues/1726 if (stream._handle && stream._handle.unref) { stream._handle.unref(); } break; default: // Probably an error in uv_guess_handle() throw new Error('Implement me. Unknown stream file type!'); } // For supporting legacy API we put the FD here. stream.fd = fd; stream._isStdio = true; return stream; } /** * Enable namespaces listed in `process.env.DEBUG` initially. */ exports.enable(load()) ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/node_modules/npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000755000000000000000000000000012631326456032137 5ustar 00000000000000././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000447112631326456032147 0ustar 00000000000000{ "_args": [ [ "debug@*", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/path-array/node_modules/array-index" ] ], "_from": "debug@*", "_id": "debug@2.2.0", "_inCache": true, "_installable": true, "_location": "/node-gyp/path-array/array-index/debug", "_nodeVersion": "0.12.2", "_npmUser": { "email": "nathan@tootallnate.net", "name": "tootallnate" }, "_npmVersion": "2.7.4", "_phantomChildren": {}, "_requested": { "name": "debug", "raw": "debug@*", "rawSpec": "*", "scope": null, "spec": "*", "type": "range" }, "_requiredBy": [ "/node-gyp/path-array/array-index" ], "_resolved": "https://registry.npmjs.org/debug/-/debug-2.2.0.tgz", "_shasum": "f87057e995b1a1f6ae6a4960664137bc56f039da", "_shrinkwrap": null, "_spec": "debug@*", "_where": "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/path-array/node_modules/array-index", "author": { "email": "tj@vision-media.ca", "name": "TJ Holowaychuk" }, "browser": "./browser.js", "bugs": { "url": "https://github.com/visionmedia/debug/issues" }, "component": { "scripts": { "debug/debug.js": "debug.js", "debug/index.js": "browser.js" } }, "contributors": [ { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://n8.io" } ], "dependencies": { "ms": "0.7.1" }, "description":
"small debugging utility", "devDependencies": { "browserify": "9.0.3", "mocha": "*" }, "directories": {}, "dist": { "shasum": "f87057e995b1a1f6ae6a4960664137bc56f039da", "tarball": "http://registry.npmjs.org/debug/-/debug-2.2.0.tgz" }, "gitHead": "b38458422b5aa8aa6d286b10dfe427e8a67e2b35", "homepage": "https://github.com/visionmedia/debug", "keywords": [ "debug", "debugger", "log" ], "license": "MIT", "main": "./node.js", "maintainers": [ { "name": "tjholowaychuk", "email": "tj@vision-media.ca" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "name": "debug", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/visionmedia/debug.git" }, "scripts": {}, "version": "2.2.0" } ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/node_modules/ms/npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000755000000000000000000000000012631326456032137 5ustar 00000000000000././@LongLink0000000000000000000000000000020400000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/node_modules/ms/.npmignorenpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000006512631326456032142 0ustar 00000000000000node_modules test History.md Makefile component.json ././@LongLink0000000000000000000000000000020400000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/node_modules/ms/History.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000243312631326456032143 0ustar 00000000000000 0.7.1 / 2015-04-20 ================== * prevent extraordinary long inputs (@evilpacket) * Fixed broken readme link 0.7.0 / 2014-11-24 ================== * add time abbreviations, updated tests and readme for the new units * fix example in the readme. * add LICENSE file 0.6.2 / 2013-12-05 ================== * Adding repository section to package.json to suppress warning from NPM. 
0.6.1 / 2013-05-10 ================== * fix singularization [visionmedia] 0.6.0 / 2013-03-15 ================== * fix minutes 0.5.1 / 2013-02-24 ================== * add component namespace 0.5.0 / 2012-11-09 ================== * add short formatting as default and .long option * add .license property to component.json * add version to component.json 0.4.0 / 2012-10-22 ================== * add rounding to fix crazy decimals 0.3.0 / 2012-09-07 ================== * fix `ms()` [visionmedia] 0.2.0 / 2012-09-03 ================== * add component.json [visionmedia] * add days support [visionmedia] * add hours support [visionmedia] * add minutes support [visionmedia] * add seconds support [visionmedia] * add ms string support [visionmedia] * refactor tests to facilitate ms(number) [visionmedia] 0.1.0 / 2012-03-07 ================== * Initial release ././@LongLink0000000000000000000000000000020100000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/node_modules/ms/LICENSEnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000211112631326456032134 0ustar 00000000000000(The MIT License) Copyright (c) 2014 Guillermo Rauch Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/node_modules/ms/README.mdnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000164512631326456032147 0ustar 00000000000000# ms.js: milliseconds conversion utility ```js ms('2 days') // 172800000 ms('1d') // 86400000 ms('10h') // 36000000 ms('2.5 hrs') // 9000000 ms('2h') // 7200000 ms('1m') // 60000 ms('5s') // 5000 ms('100') // 100 ``` ```js ms(60000) // "1m" ms(2 * 60000) // "2m" ms(ms('10 hours')) // "10h" ``` ```js ms(60000, { long: true }) // "1 minute" ms(2 * 60000, { long: true }) // "2 minutes" ms(ms('10 hours'), { long: true }) // "10 hours" ``` - Node/Browser compatible. Published as [`ms`](https://www.npmjs.org/package/ms) in [NPM](http://nodejs.org/download). - If a number is supplied to `ms`, a string with a unit is returned. - If a string that contains the number is supplied, it returns it as a number (e.g. it returns `100` for `'100'`). - If you pass a string with a number and a valid unit, the number of equivalent ms is returned (see the consolidated sketch below).
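A consolidated sketch of those dispatch rules, using only the behavior documented in the examples above:

```js
var ms = require('ms');

// string with a unit -> number of milliseconds
ms('2 days');              // 172800000

// bare numeric string -> parsed straight to a number
ms('100');                 // 100

// number -> short format by default, long format on request
ms(60000);                 // "1m"
ms(60000, { long: true }); // "1 minute"
```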
## License MIT ././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/node_modules/ms/index.jsnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000443412631326456032146 0ustar 00000000000000/** * Helpers. */ var s = 1000; var m = s * 60; var h = m * 60; var d = h * 24; var y = d * 365.25; /** * Parse or format the given `val`. * * Options: * * - `long` verbose formatting [false] * * @param {String|Number} val * @param {Object} options * @return {String|Number} * @api public */ module.exports = function(val, options){ options = options || {}; if ('string' == typeof val) return parse(val); return options.long ? long(val) : short(val); }; /** * Parse the given `str` and return milliseconds. * * @param {String} str * @return {Number} * @api private */ function parse(str) { str = '' + str; if (str.length > 10000) return; var match = /^((?:\d+)?\.?\d+) *(milliseconds?|msecs?|ms|seconds?|secs?|s|minutes?|mins?|m|hours?|hrs?|h|days?|d|years?|yrs?|y)?$/i.exec(str); if (!match) return; var n = parseFloat(match[1]); var type = (match[2] || 'ms').toLowerCase(); switch (type) { case 'years': case 'year': case 'yrs': case 'yr': case 'y': return n * y; case 'days': case 'day': case 'd': return n * d; case 'hours': case 'hour': case 'hrs': case 'hr': case 'h': return n * h; case 'minutes': case 'minute': case 'mins': case 'min': case 'm': return n * m; case 'seconds': case 'second': case 'secs': case 'sec': case 's': return n * s; case 'milliseconds': case 'millisecond': case 'msecs': case 'msec': case 'ms': return n; } } /** * Short format for `ms`. * * @param {Number} ms * @return {String} * @api private */ function short(ms) { if (ms >= d) return Math.round(ms / d) + 'd'; if (ms >= h) return Math.round(ms / h) + 'h'; if (ms >= m) return Math.round(ms / m) + 'm'; if (ms >= s) return Math.round(ms / s) + 's'; return ms + 'ms'; } /** * Long format for `ms`. * * @param {Number} ms * @return {String} * @api private */ function long(ms) { return plural(ms, d, 'day') || plural(ms, h, 'hour') || plural(ms, m, 'minute') || plural(ms, s, 'second') || ms + ' ms'; } /** * Pluralization helper. 
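 * (Amounts below 1.5 units round down and anything at or above rounds up, so 90000 ms formats as '2 minutes'.)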
*/ function plural(ms, n, name) { if (ms < n) return; if (ms < n * 1.5) return Math.floor(ms / n) + ' ' + name; return Math.ceil(ms / n) + ' ' + name + 's'; } ././@LongLink0000000000000000000000000000020600000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug/node_modules/ms/package.jsonnpm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/d0000644000000000000000000000356212631326456032147 0ustar 00000000000000{ "_args": [ [ "ms@0.7.1", "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug" ] ], "_from": "ms@0.7.1", "_id": "ms@0.7.1", "_inCache": true, "_installable": true, "_location": "/node-gyp/path-array/array-index/debug/ms", "_nodeVersion": "0.12.2", "_npmUser": { "email": "rauchg@gmail.com", "name": "rauchg" }, "_npmVersion": "2.7.5", "_phantomChildren": {}, "_requested": { "name": "ms", "raw": "ms@0.7.1", "rawSpec": "0.7.1", "scope": null, "spec": "0.7.1", "type": "version" }, "_requiredBy": [ "/node-gyp/path-array/array-index/debug" ], "_resolved": "https://registry.npmjs.org/ms/-/ms-0.7.1.tgz", "_shasum": "9cd13c03adbff25b65effde7ce864ee952017098", "_shrinkwrap": null, "_spec": "ms@0.7.1", "_where": "/Users/rebecca/code/release/npm-3/node_modules/node-gyp/node_modules/path-array/node_modules/array-index/node_modules/debug", "bugs": { "url": "https://github.com/guille/ms.js/issues" }, "component": { "scripts": { "ms/index.js": "index.js" } }, "dependencies": {}, "description": "Tiny ms conversion utility", "devDependencies": { "expect.js": "*", "mocha": "*", "serve": "*" }, "directories": {}, "dist": { "shasum": "9cd13c03adbff25b65effde7ce864ee952017098", "tarball": "http://registry.npmjs.org/ms/-/ms-0.7.1.tgz" }, "gitHead": "713dcf26d9e6fd9dbc95affe7eff9783b7f1b909", "homepage": "https://github.com/guille/ms.js", "main": "./index", "maintainers": [ { "name": "rauchg", "email": "rauchg@gmail.com" } ], "name": "ms", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/guille/ms.js.git" }, "scripts": {}, "version": "0.7.1" } npm_3.5.2.orig/node_modules/node-gyp/node_modules/path-array/test/test.js0000644000000000000000000000443012631326456024713 0ustar 00000000000000 /** * Module dependencies. 
*/ var assert = require('assert'); var PathArray = require('../'); var delimiter = require('path').delimiter || ':'; describe('PathArray', function () { it('should use `process.env` by default', function () { var p = new PathArray(); assert.equal(p._env, process.env); }); it('should return the $PATH string for .toString()', function () { var p = new PathArray(); assert.equal(p.toString(), process.env.PATH); }); it('should accept an arbitrary `env` object', function () { var env = { PATH: '/foo' + delimiter + '/bar' }; var p = new PathArray(env); assert.equal(p.toString(), env.PATH); }); it('should work for [n] getter syntax', function () { var env = { PATH: '/foo' + delimiter + '/bar' }; var p = new PathArray(env); assert.equal('/foo', p[0]); assert.equal('/bar', p[1]); }); it('should work for [n]= setter syntax', function () { var env = { PATH: '/foo' + delimiter + '/bar' }; var p = new PathArray(env); p[0] = '/baz'; assert.equal('/baz' + delimiter + '/bar', env.PATH); }); it('should work with .push()', function () { var env = { PATH: '/foo' + delimiter + '/bar' }; var p = new PathArray(env); p.push('/baz'); assert.equal('/foo' + delimiter + '/bar' + delimiter + '/baz', env.PATH); }); it('should work with .shift()', function () { var env = { PATH: '/foo' + delimiter + '/bar' }; var p = new PathArray(env); assert.equal('/foo', p.shift()); assert.equal('/bar', env.PATH); }); it('should work with .pop()', function () { var env = { PATH: '/foo' + delimiter + '/bar' }; var p = new PathArray(env); assert.equal('/bar', p.pop()); assert.equal('/foo', env.PATH); }); it('should work with .unshift()', function () { var env = { PATH: '/foo' + delimiter + '/bar' }; var p = new PathArray(env); p.unshift('/baz'); assert.equal('/baz' + delimiter + '/foo' + delimiter + '/bar', env.PATH); }); it('should be able to specify property name to use with second argument', function () { var env = { PYTHONPATH: '/foo' }; var p = new PathArray(env, 'PYTHONPATH'); assert.equal(1, p.length); p.push('/baz'); assert.equal(2, p.length); assert.equal('/foo' + delimiter + '/baz', env.PYTHONPATH); }); }); npm_3.5.2.orig/node_modules/node-gyp/src/win_delay_load_hook.c0000644000000000000000000000153012631326456022615 0ustar 00000000000000/* * When this file is linked to a DLL, it sets up a delay-load hook that * intervenes when the DLL is trying to load 'node.exe' or 'iojs.exe' * dynamically. Instead of trying to locate the .exe file it'll just return * a handle to the process image. * * This allows compiled addons to work when node.exe or iojs.exe is renamed. 
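 * (MSVC's delay-load helper calls this notification hook with dliNotePreLoadLibrary before each deferred LoadLibrary; returning a non-NULL FARPROC makes it use that module handle instead of searching for the DLL on disk.)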
*/ #ifdef _MSC_VER #ifndef WIN32_LEAN_AND_MEAN #define WIN32_LEAN_AND_MEAN #endif #include <windows.h> #include <delayimp.h> #include <string.h> static FARPROC WINAPI load_exe_hook(unsigned int event, DelayLoadInfo* info) { HMODULE m; if (event != dliNotePreLoadLibrary) return NULL; if (_stricmp(info->szDll, "iojs.exe") != 0 && _stricmp(info->szDll, "node.exe") != 0) return NULL; m = GetModuleHandle(NULL); return (FARPROC) m; } PfnDliHook __pfnDliNotifyHook2 = load_exe_hook; #endif npm_3.5.2.orig/node_modules/node-gyp/test/docker.sh0000755000000000000000000001307212631326456020461 0ustar 00000000000000#!/bin/bash #set -e test_node_versions="0.8.28 0.10.40 0.12.7" test_iojs_versions="1.8.4 2.4.0 3.3.0" __dirname="$(CDPATH= cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)" dot_node_gyp=${__dirname}/.node-gyp/ # borrows from https://github.com/rvagg/dnt/ # Simple setup function for a container: # setup_container(image id, base image, commands to run to set up) setup_container() { local container_id="$1" local base_container="$2" local run_cmd="$3" # Does this image exist? If yes, ignore docker inspect "$container_id" &> /dev/null if [[ $? -eq 0 ]]; then echo "Found existing container [$container_id]" else # No such image, so make it echo "Did not find container [$container_id], creating..." docker run -i $base_container /bin/bash -c "$run_cmd" sleep 2 docker commit $(docker ps -l -q) $container_id fi } # Run tests inside each of the versioned containers, copy cwd into npm's copy of node-gyp # so it'll be invoked by npm when a compile is needed # run_tests(version, test-commands) run_tests() { local version="$1" local run_cmd="$2" run_cmd="rsync -aAXx --delete --exclude .git --exclude build /node-gyp-src/ /usr/lib/node_modules/npm/node_modules/node-gyp/; /bin/su -s /bin/bash node-gyp -c 'cd && ${run_cmd}'" rm -rf $dot_node_gyp docker run \ --rm -i \ -v ~/.npm/:/node-gyp/.npm/ \ -v ${dot_node_gyp}:/node-gyp/.node-gyp/ \ -v $(pwd):/node-gyp-src/:ro \ node-gyp-test/${version} /bin/bash -c "${run_cmd}" } # A base image with build tools and a user account setup_container "node-gyp-test/base" "ubuntu:14.04" " apt-get update && apt-get install -y build-essential python git rsync curl && adduser --gecos node-gyp --home /node-gyp/ --disabled-login node-gyp && echo "node-gyp:node-gyp" | chpasswd " # An image on top of the base containing clones of repos we want to use for testing setup_container "node-gyp-test/clones" "node-gyp-test/base" " cd /node-gyp/ && git clone https://github.com/justmoon/node-bignum.git && cd /node-gyp/ && git clone https://github.com/bnoordhuis/node-buffertools.git && chown -R node-gyp.node-gyp /node-gyp/ " # An image for each of the node versions we want to test with that version installed and the latest npm for v in $test_node_versions; do setup_container "node-gyp-test/${v}" "node-gyp-test/clones" " curl -sL https://nodejs.org/dist/v${v}/node-v${v}-linux-x64.tar.gz | tar -zxv --strip-components=1 -C /usr/ && npm install npm@latest -g && node -v && npm -v " done # An image for each of the io.js versions we want to test with that version installed and the latest npm for v in $test_iojs_versions; do setup_container "node-gyp-test/${v}" "node-gyp-test/clones" " curl -sL https://iojs.org/dist/v${v}/iojs-v${v}-linux-x64.tar.gz | tar -zxv --strip-components=1 -C /usr/ && npm install npm@latest -g && node -v && npm -v " done # Run the tests for all of the test images we've created, # we should see node-gyp doing its download, configure and run thing # _NOTE: bignum doesn't compile on 0.8 currently so it'll fail for that
version only_ for v in $test_node_versions $test_iojs_versions; do run_tests $v " cd node-buffertools && npm install --loglevel=info && npm test && cd " # removed for now, too noisy: cd node-bignum && npm install --loglevel=info && npm test done # Test use of --target=x.y.z to compile against alternate versions test_download_node_version() { local run_with_ver="$1" local expected_dir="$2" local expected_ver="$3" run_tests $run_with_ver "cd node-buffertools && npm install --loglevel=info --target=${expected_ver}" local node_ver=$(cat "${dot_node_gyp}${expected_dir}/node_version.h" | grep '#define NODE_\w*_VERSION [0-9]*$') node_ver=$(echo $node_ver | sed 's/#define NODE_[A-Z]*_VERSION //g' | sed 's/ /./g') if [ "X$(echo $node_ver)" != "X${expected_ver}" ]; then echo "Did not download v${expected_ver} using --target, instead got: $(echo $node_ver)" exit 1 fi echo "Verified correct download of [v${node_ver}]" } test_download_node_version "0.12.7" "0.10.30/src" "0.10.30" test_download_node_version "3.3.0" "iojs-1.8.4/src" "1.8.4" # should download the headers file test_download_node_version "3.3.0" "iojs-3.2.0/include/node" "3.2.0" # TODO: test --dist-url by starting up a localhost server and serving up tarballs # testing --dist-url, using simple-proxy.js to make localhost work as a distribution # point for tarballs # we can test whether it uses the proxy because after 2 connections the proxy will # die and therefore should not be running at the end of the test, `nc` can tell us this run_tests "3.3.0" " (node /node-gyp-src/test/simple-proxy.js 8080 /foobar/ https://iojs.org/dist/ &) && cd node-buffertools && /node-gyp-src/bin/node-gyp.js --loglevel=info --dist-url=http://localhost:8080/foobar/ rebuild && nc -z localhost 8080 && echo -e \"\\n\\n\\033[31mFAILED TO USE LOCAL PROXY\\033[39m\\n\\n\" " run_tests "3.3.0" " (node /node-gyp-src/test/simple-proxy.js 8080 /doobar/ https://iojs.org/dist/ &) && cd node-buffertools && NVM_IOJS_ORG_MIRROR=http://localhost:8080/doobar/ /node-gyp-src/bin/node-gyp.js --loglevel=info rebuild && nc -z localhost 8080 && echo -e \"\\n\\n\\033[31mFAILED TO USE LOCAL PROXY\\033[39m\\n\\n\" " run_tests "0.12.7" " (node /node-gyp-src/test/simple-proxy.js 8080 /boombar/ https://nodejs.org/dist/ &) && cd node-buffertools && NVM_NODEJS_ORG_MIRROR=http://localhost:8080/boombar/ /node-gyp-src/bin/node-gyp.js --loglevel=info rebuild && nc -z localhost 8080 && echo -e \"\\n\\n\\033[31mFAILED TO USE LOCAL PROXY\\033[39m\\n\\n\" " rm -rf $dot_node_gyp npm_3.5.2.orig/node_modules/node-gyp/test/simple-proxy.js0000644000000000000000000000120312631326456021652 0ustar 00000000000000var http = require('http') , https = require('https') , server = http.createServer(handler) , port = +process.argv[2] , prefix = process.argv[3] , upstream = process.argv[4] , calls = 0 server.listen(port) function handler (req, res) { if (req.url.indexOf(prefix) != 0) throw new Error('request url [' + req.url + '] does not start with [' + prefix + ']') var upstreamUrl = upstream + req.url.substring(prefix.length) console.log(req.url + ' -> ' + upstreamUrl) https.get(upstreamUrl, function (ures) { ures.on('end', function () { if (++calls == 2) server.close() }) ures.pipe(res) }) } npm_3.5.2.orig/node_modules/node-gyp/test/test-find-node-directory.js0000644000000000000000000001044712631326456024036 0ustar 00000000000000var test = require('tape') var path = require('path') var findNodeDirectory = require('../lib/find-node-directory') var platforms = ['darwin', 'freebsd', 'linux', 'sunos', 'win32', 'aix'] // 
we should find the directory based on the directory // the script is running in and it should match the layout // in a build tree where npm is installed in // .... /deps/npm test('test find-node-directory - node install', function (t) { t.plan(platforms.length) for (var next = 0; next < platforms.length; next++) { var processObj = {execPath: '/x/y/bin/node', platform: platforms[next]} t.equal( findNodeDirectory('/x/deps/npm/node_modules/node-gyp/lib', processObj), path.join('/x')) } }) // we should find the directory based on the directory // the script is running in and it should match the layout // in an installed tree where npm is installed in // .... /lib/node_modules/npm or .../node_modules/npm // depending on the platform test('test find-node-directory - node build', function (t) { t.plan(platforms.length) for (var next = 0; next < platforms.length; next++) { var processObj = {execPath: '/x/y/bin/node', platform: platforms[next]} if (platforms[next] === 'win32') { t.equal( findNodeDirectory('/y/node_modules/npm/node_modules/node-gyp/lib', processObj), path.join('/y')) } else { t.equal( findNodeDirectory('/y/lib/node_modules/npm/node_modules/node-gyp/lib', processObj), path.join('/y')) } } }) // we should find the directory based on the execPath // for node and match because it was in the bin directory test('test find-node-directory - node in bin directory', function (t) { t.plan(platforms.length) for (var next = 0; next < platforms.length; next++) { var processObj = {execPath: '/x/y/bin/node', platform: platforms[next]} t.equal( findNodeDirectory('/nothere/npm/node_modules/node-gyp/lib', processObj), path.join('/x/y')) } }) // we should find the directory based on the execPath // for node and match because it was in the Release directory test('test find-node-directory - node in build release dir', function (t) { t.plan(platforms.length) for (var next = 0; next < platforms.length; next++) { var processObj if (platforms[next] === 'win32') { processObj = {execPath: '/x/y/Release/node', platform: platforms[next]} } else { processObj = {execPath: '/x/y/out/Release/node', platform: platforms[next]} } t.equal( findNodeDirectory('/nothere/npm/node_modules/node-gyp/lib', processObj), path.join('/x/y')) } }) // we should find the directory based on the execPath // for node and match because it was in the Debug directory test('test find-node-directory - node in Debug release dir', function (t) { t.plan(platforms.length) for (var next = 0; next < platforms.length; next++) { var processObj if (platforms[next] === 'win32') { processObj = {execPath: '/a/b/Debug/node', platform: platforms[next]} } else { processObj = {execPath: '/a/b/out/Debug/node', platform: platforms[next]} } t.equal( findNodeDirectory('/nothere/npm/node_modules/node-gyp/lib', processObj), path.join('/a/b')) } }) // we should not find it as it will not match based on the execPath nor // the directory from which the script is running test('test find-node-directory - not found', function (t) { t.plan(platforms.length) for (var next = 0; next < platforms.length; next++) { var processObj = {execPath: '/x/y/z/y', platform: platforms[next]} t.equal(findNodeDirectory('/a/b/c/d', processObj), '') } }) // we should find the directory based on the directory // the script is running in and it should match the layout // in a build tree where npm is installed in // ....
/deps/npm // same test as above but make sure additional directory entries // don't cause an issue test('test find-node-directory - node install', function (t) { t.plan(platforms.length) for (var next = 0; next < platforms.length; next++) { var processObj = {execPath: '/x/y/bin/node', platform: platforms[next]} t.equal( findNodeDirectory('/x/y/z/a/b/c/deps/npm/node_modules/node-gyp/lib', processObj), path.join('/x/y/z/a/b/c')) } }) npm_3.5.2.orig/node_modules/node-gyp/test/test-find-python.js0000644000000000000000000000104112631326456022416 0ustar 00000000000000'use strict' var test = require('tape') var configure = require('../lib/configure') var execFile = require('child_process').execFile test('find python executable', function (t) { t.plan(4) configure.test.findPython('python', function (err, found) { t.strictEqual(err, null) var proc = execFile(found, ['-V'], function (err, stdout, stderr) { t.strictEqual(err, null) t.strictEqual(stdout, '') t.ok(/Python 2/.test(stderr)) }) proc.stdout.setEncoding('utf-8') proc.stderr.setEncoding('utf-8') }) }) npm_3.5.2.orig/node_modules/node-gyp/test/test-options.js0000644000000000000000000000131112631326456021652 0ustar 00000000000000'use strict'; var test = require('tape') var gyp = require('../lib/node-gyp') test('options in environment', function (t) { t.plan(1) // `npm test` dumps a ton of npm_config_* variables in the environment. Object.keys(process.env) .filter(function(key) { return /^npm_config_/.test(key) }) .forEach(function(key) { delete process.env[key] }) // Zero-length keys should get filtered out. process.env.npm_config_ = '42' // Other keys should get added. process.env.npm_config_x = '42' // Except loglevel. process.env.npm_config_loglevel = 'debug' var g = gyp(); g.parseArgv(['rebuild']) // Also sets opts.argv. 
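// Only the explicitly added 'x' (plus the synthesized 'argv') should survive the filtering above.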
t.deepEqual(Object.keys(g.opts).sort(), ['argv', 'x']) }) npm_3.5.2.orig/node_modules/node-gyp/test/test-process-release.js0000644000000000000000000003712512631326456023267 0ustar 00000000000000var test = require('tape') var processRelease = require('../lib/process-release') test('test process release - process.version = 0.8.20', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v0.8.20', null) t.equal(release.semver.version, '0.8.20') delete release.semver t.deepEqual(release, { version: '0.8.20', name: 'node', baseUrl: 'https://nodejs.org/dist/v0.8.20/', tarballUrl: 'https://nodejs.org/dist/v0.8.20/node-v0.8.20.tar.gz', shasumsUrl: 'https://nodejs.org/dist/v0.8.20/SHASUMS256.txt', versionDir: '0.8.20', libUrl32: 'https://nodejs.org/dist/v0.8.20/node.lib', libUrl64: 'https://nodejs.org/dist/v0.8.20/x64/node.lib', libPath32: 'node.lib', libPath64: 'x64/node.lib' }) }) test('test process release - process.version = 0.10.21', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v0.10.21', null) t.equal(release.semver.version, '0.10.21') delete release.semver t.deepEqual(release, { version: '0.10.21', name: 'node', baseUrl: 'https://nodejs.org/dist/v0.10.21/', tarballUrl: 'https://nodejs.org/dist/v0.10.21/node-v0.10.21.tar.gz', shasumsUrl: 'https://nodejs.org/dist/v0.10.21/SHASUMS256.txt', versionDir: '0.10.21', libUrl32: 'https://nodejs.org/dist/v0.10.21/node.lib', libUrl64: 'https://nodejs.org/dist/v0.10.21/x64/node.lib', libPath32: 'node.lib', libPath64: 'x64/node.lib' }) }) test('test process release - process.version = 0.12.22', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v0.12.22', null) t.equal(release.semver.version, '0.12.22') delete release.semver t.deepEqual(release, { version: '0.12.22', name: 'node', baseUrl: 'https://nodejs.org/dist/v0.12.22/', tarballUrl: 'https://nodejs.org/dist/v0.12.22/node-v0.12.22.tar.gz', shasumsUrl: 'https://nodejs.org/dist/v0.12.22/SHASUMS256.txt', versionDir: '0.12.22', libUrl32: 'https://nodejs.org/dist/v0.12.22/node.lib', libUrl64: 'https://nodejs.org/dist/v0.12.22/x64/node.lib', libPath32: 'node.lib', libPath64: 'x64/node.lib' }) }) test('test process release - process.release ~ node@4.1.23', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v4.1.23', { name: 'node', headersUrl: 'https://nodejs.org/dist/v4.1.23/node-v4.1.23-headers.tar.gz' }) t.equal(release.semver.version, '4.1.23') delete release.semver t.deepEqual(release, { version: '4.1.23', name: 'node', baseUrl: 'https://nodejs.org/dist/v4.1.23/', tarballUrl: 'https://nodejs.org/dist/v4.1.23/node-v4.1.23-headers.tar.gz', shasumsUrl: 'https://nodejs.org/dist/v4.1.23/SHASUMS256.txt', versionDir: '4.1.23', libUrl32: 'https://nodejs.org/dist/v4.1.23/win-x86/node.lib', libUrl64: 'https://nodejs.org/dist/v4.1.23/win-x64/node.lib', libPath32: 'win-x86/node.lib', libPath64: 'win-x64/node.lib' }) }) test('test process release - process.release ~ node@4.1.23 / corp build', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v4.1.23', { name: 'node', headersUrl: 'https://some.custom.location/node-v4.1.23-headers.tar.gz' }) t.equal(release.semver.version, '4.1.23') delete release.semver t.deepEqual(release, { version: '4.1.23', name: 'node', baseUrl: 'https://some.custom.location/', tarballUrl: 'https://some.custom.location/node-v4.1.23-headers.tar.gz', shasumsUrl: 'https://some.custom.location/SHASUMS256.txt', versionDir: '4.1.23', libUrl32: 'https://some.custom.location/win-x86/node.lib', 
libUrl64: 'https://some.custom.location/win-x64/node.lib', libPath32: 'win-x86/node.lib', libPath64: 'win-x64/node.lib' }) }) test('test process release - process.version = 1.8.4', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v1.8.4', null) t.equal(release.semver.version, '1.8.4') delete release.semver t.deepEqual(release, { version: '1.8.4', name: 'iojs', baseUrl: 'https://iojs.org/download/release/v1.8.4/', tarballUrl: 'https://iojs.org/download/release/v1.8.4/iojs-v1.8.4.tar.gz', shasumsUrl: 'https://iojs.org/download/release/v1.8.4/SHASUMS256.txt', versionDir: 'iojs-1.8.4', libUrl32: 'https://iojs.org/download/release/v1.8.4/win-x86/iojs.lib', libUrl64: 'https://iojs.org/download/release/v1.8.4/win-x64/iojs.lib', libPath32: 'win-x86/iojs.lib', libPath64: 'win-x64/iojs.lib' }) }) test('test process release - process.release ~ iojs@3.2.24', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v3.2.24', { name: 'io.js', headersUrl: 'https://iojs.org/download/release/v3.2.24/iojs-v3.2.24-headers.tar.gz' }) t.equal(release.semver.version, '3.2.24') delete release.semver t.deepEqual(release, { version: '3.2.24', name: 'iojs', baseUrl: 'https://iojs.org/download/release/v3.2.24/', tarballUrl: 'https://iojs.org/download/release/v3.2.24/iojs-v3.2.24-headers.tar.gz', shasumsUrl: 'https://iojs.org/download/release/v3.2.24/SHASUMS256.txt', versionDir: 'iojs-3.2.24', libUrl32: 'https://iojs.org/download/release/v3.2.24/win-x86/iojs.lib', libUrl64: 'https://iojs.org/download/release/v3.2.24/win-x64/iojs.lib', libPath32: 'win-x86/iojs.lib', libPath64: 'win-x64/iojs.lib' }) }) test('test process release - process.release ~ iojs@3.2.11 +libUrl32', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v3.2.11', { name: 'io.js', headersUrl: 'https://iojs.org/download/release/v3.2.11/iojs-v3.2.11-headers.tar.gz', libUrl: 'https://iojs.org/download/release/v3.2.11/win-x86/iojs.lib' // custom }) t.equal(release.semver.version, '3.2.11') delete release.semver t.deepEqual(release, { version: '3.2.11', name: 'iojs', baseUrl: 'https://iojs.org/download/release/v3.2.11/', tarballUrl: 'https://iojs.org/download/release/v3.2.11/iojs-v3.2.11-headers.tar.gz', shasumsUrl: 'https://iojs.org/download/release/v3.2.11/SHASUMS256.txt', versionDir: 'iojs-3.2.11', libUrl32: 'https://iojs.org/download/release/v3.2.11/win-x86/iojs.lib', libUrl64: 'https://iojs.org/download/release/v3.2.11/win-x64/iojs.lib', libPath32: 'win-x86/iojs.lib', libPath64: 'win-x64/iojs.lib' }) }) test('test process release - process.release ~ iojs@3.2.101 +libUrl64', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v3.2.101', { name: 'io.js', headersUrl: 'https://iojs.org/download/release/v3.2.101/iojs-v3.2.101-headers.tar.gz', libUrl: 'https://iojs.org/download/release/v3.2.101/win-x64/iojs.lib' // custom }) t.equal(release.semver.version, '3.2.101') delete release.semver t.deepEqual(release, { version: '3.2.101', name: 'iojs', baseUrl: 'https://iojs.org/download/release/v3.2.101/', tarballUrl: 'https://iojs.org/download/release/v3.2.101/iojs-v3.2.101-headers.tar.gz', shasumsUrl: 'https://iojs.org/download/release/v3.2.101/SHASUMS256.txt', versionDir: 'iojs-3.2.101', libUrl32: 'https://iojs.org/download/release/v3.2.101/win-x86/iojs.lib', libUrl64: 'https://iojs.org/download/release/v3.2.101/win-x64/iojs.lib', libPath32: 'win-x86/iojs.lib', libPath64: 'win-x64/iojs.lib' }) }) test('test process release - process.release ~ iojs@3.3.0 - borked win-ia32', 
function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v3.2.101', { name: 'io.js', headersUrl: 'https://iojs.org/download/release/v3.2.101/iojs-v3.2.101-headers.tar.gz', libUrl: 'https://iojs.org/download/release/v3.2.101/win-ia32/iojs.lib' // custom }) t.equal(release.semver.version, '3.2.101') delete release.semver t.deepEqual(release, { version: '3.2.101', name: 'iojs', baseUrl: 'https://iojs.org/download/release/v3.2.101/', tarballUrl: 'https://iojs.org/download/release/v3.2.101/iojs-v3.2.101-headers.tar.gz', shasumsUrl: 'https://iojs.org/download/release/v3.2.101/SHASUMS256.txt', versionDir: 'iojs-3.2.101', libUrl32: 'https://iojs.org/download/release/v3.2.101/win-x86/iojs.lib', libUrl64: 'https://iojs.org/download/release/v3.2.101/win-x64/iojs.lib', libPath32: 'win-x86/iojs.lib', libPath64: 'win-x64/iojs.lib' }) }) test('test process release - process.release ~ node@4.1.23 --target=0.10.40', function (t) { t.plan(2) var release = processRelease([], { opts: { target: '0.10.40' } }, 'v4.1.23', { name: 'node', headersUrl: 'https://nodejs.org/dist/v4.1.23/node-v4.1.23-headers.tar.gz' }) t.equal(release.semver.version, '0.10.40') delete release.semver t.deepEqual(release, { version: '0.10.40', name: 'node', baseUrl: 'https://nodejs.org/dist/v0.10.40/', tarballUrl: 'https://nodejs.org/dist/v0.10.40/node-v0.10.40.tar.gz', shasumsUrl: 'https://nodejs.org/dist/v0.10.40/SHASUMS256.txt', versionDir: '0.10.40', libUrl32: 'https://nodejs.org/dist/v0.10.40/node.lib', libUrl64: 'https://nodejs.org/dist/v0.10.40/x64/node.lib', libPath32: 'node.lib', libPath64: 'x64/node.lib' }) }) test('test process release - process.release ~ node@4.1.23 --target=1.8.4', function (t) { t.plan(2) var release = processRelease([], { opts: { target: '1.8.4' } }, 'v4.1.23', { name: 'node', headersUrl: 'https://nodejs.org/dist/v4.1.23/node-v4.1.23-headers.tar.gz' }) t.equal(release.semver.version, '1.8.4') delete release.semver t.deepEqual(release, { version: '1.8.4', name: 'iojs', baseUrl: 'https://iojs.org/download/release/v1.8.4/', tarballUrl: 'https://iojs.org/download/release/v1.8.4/iojs-v1.8.4.tar.gz', shasumsUrl: 'https://iojs.org/download/release/v1.8.4/SHASUMS256.txt', versionDir: 'iojs-1.8.4', libUrl32: 'https://iojs.org/download/release/v1.8.4/win-x86/iojs.lib', libUrl64: 'https://iojs.org/download/release/v1.8.4/win-x64/iojs.lib', libPath32: 'win-x86/iojs.lib', libPath64: 'win-x64/iojs.lib' }) }) test('test process release - process.release ~ node@4.1.23 --dist-url=https://foo.bar/baz', function (t) { t.plan(2) var release = processRelease([], { opts: { 'dist-url': 'https://foo.bar/baz' } }, 'v4.1.23', { name: 'node', headersUrl: 'https://nodejs.org/dist/v4.1.23/node-v4.1.23-headers.tar.gz' }) t.equal(release.semver.version, '4.1.23') delete release.semver t.deepEqual(release, { version: '4.1.23', name: 'node', baseUrl: 'https://foo.bar/baz/v4.1.23/', tarballUrl: 'https://foo.bar/baz/v4.1.23/node-v4.1.23-headers.tar.gz', shasumsUrl: 'https://foo.bar/baz/v4.1.23/SHASUMS256.txt', versionDir: '4.1.23', libUrl32: 'https://foo.bar/baz/v4.1.23/win-x86/node.lib', libUrl64: 'https://foo.bar/baz/v4.1.23/win-x64/node.lib', libPath32: 'win-x86/node.lib', libPath64: 'win-x64/node.lib' }) }) test('test process release - process.release ~ frankenstein@4.1.23', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v4.1.23', { name: 'frankenstein', headersUrl: 'https://frankensteinjs.org/dist/v4.1.23/frankenstein-v4.1.23-headers.tar.gz' }) t.equal(release.semver.version, '4.1.23') delete 
release.semver t.deepEqual(release, { version: '4.1.23', name: 'frankenstein', baseUrl: 'https://frankensteinjs.org/dist/v4.1.23/', tarballUrl: 'https://frankensteinjs.org/dist/v4.1.23/frankenstein-v4.1.23-headers.tar.gz', shasumsUrl: 'https://frankensteinjs.org/dist/v4.1.23/SHASUMS256.txt', versionDir: 'frankenstein-4.1.23', libUrl32: 'https://frankensteinjs.org/dist/v4.1.23/win-x86/frankenstein.lib', libUrl64: 'https://frankensteinjs.org/dist/v4.1.23/win-x64/frankenstein.lib', libPath32: 'win-x86/frankenstein.lib', libPath64: 'win-x64/frankenstein.lib' }) }) test('test process release - process.release ~ frankenstein@4.1.23 --dist-url=http://foo.bar/baz/', function (t) { t.plan(2) var release = processRelease([], { opts: { 'dist-url': 'http://foo.bar/baz/' } }, 'v4.1.23', { name: 'frankenstein', headersUrl: 'https://frankensteinjs.org/dist/v4.1.23/frankenstein-v4.1.23.tar.gz' }) t.equal(release.semver.version, '4.1.23') delete release.semver t.deepEqual(release, { version: '4.1.23', name: 'frankenstein', baseUrl: 'http://foo.bar/baz/v4.1.23/', tarballUrl: 'http://foo.bar/baz/v4.1.23/frankenstein-v4.1.23-headers.tar.gz', shasumsUrl: 'http://foo.bar/baz/v4.1.23/SHASUMS256.txt', versionDir: 'frankenstein-4.1.23', libUrl32: 'http://foo.bar/baz/v4.1.23/win-x86/frankenstein.lib', libUrl64: 'http://foo.bar/baz/v4.1.23/win-x64/frankenstein.lib', libPath32: 'win-x86/frankenstein.lib', libPath64: 'win-x64/frankenstein.lib' }) }) test('test process release - process.release ~ node@4.0.0-rc.4', function (t) { t.plan(2) var release = processRelease([], { opts: {} }, 'v4.0.0-rc.4', { name: 'node', headersUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/node-v4.0.0-rc.4-headers.tar.gz' }) t.equal(release.semver.version, '4.0.0-rc.4') delete release.semver t.deepEqual(release, { version: '4.0.0-rc.4', name: 'node', baseUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/', tarballUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/node-v4.0.0-rc.4-headers.tar.gz', shasumsUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/SHASUMS256.txt', versionDir: '4.0.0-rc.4', libUrl32: 'https://nodejs.org/download/rc/v4.0.0-rc.4/win-x86/node.lib', libUrl64: 'https://nodejs.org/download/rc/v4.0.0-rc.4/win-x64/node.lib', libPath32: 'win-x86/node.lib', libPath64: 'win-x64/node.lib' }) }) test('test process release - process.release ~ node@4.0.0-rc.4 passed as argv[0]', function (t) { t.plan(2) // note the missing 'v' on the arg, it should normalise when checking // whether we're on the default or not var release = processRelease([ '4.0.0-rc.4' ], { opts: {} }, 'v4.0.0-rc.4', { name: 'node', headersUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/node-v4.0.0-rc.4-headers.tar.gz' }) t.equal(release.semver.version, '4.0.0-rc.4') delete release.semver t.deepEqual(release, { version: '4.0.0-rc.4', name: 'node', baseUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/', tarballUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/node-v4.0.0-rc.4-headers.tar.gz', shasumsUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/SHASUMS256.txt', versionDir: '4.0.0-rc.4', libUrl32: 'https://nodejs.org/download/rc/v4.0.0-rc.4/win-x86/node.lib', libUrl64: 'https://nodejs.org/download/rc/v4.0.0-rc.4/win-x64/node.lib', libPath32: 'win-x86/node.lib', libPath64: 'win-x64/node.lib' }) }) test('test process release - process.release ~ node@4.0.0-rc.4 - bogus string passed as argv[0]', function (t) { t.plan(2) // additional arguments can be passed in on the commandline that should be ignored if they // are not specifying a valid version @ position 0 var release 
= processRelease([ 'this is no version!' ], { opts: {} }, 'v4.0.0-rc.4', { name: 'node', headersUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/node-v4.0.0-rc.4-headers.tar.gz' }) t.equal(release.semver.version, '4.0.0-rc.4') delete release.semver t.deepEqual(release, { version: '4.0.0-rc.4', name: 'node', baseUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/', tarballUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/node-v4.0.0-rc.4-headers.tar.gz', shasumsUrl: 'https://nodejs.org/download/rc/v4.0.0-rc.4/SHASUMS256.txt', versionDir: '4.0.0-rc.4', libUrl32: 'https://nodejs.org/download/rc/v4.0.0-rc.4/win-x86/node.lib', libUrl64: 'https://nodejs.org/download/rc/v4.0.0-rc.4/win-x64/node.lib', libPath32: 'win-x86/node.lib', libPath64: 'win-x64/node.lib' }) }) npm_3.5.2.orig/node_modules/nopt/.npmignore0000644000000000000000000000001512631326456017122 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/nopt/.travis.yml0000644000000000000000000000020612631326456017236 0ustar 00000000000000language: node_js language: node_js node_js: - '0.8' - '0.10' - '0.12' - 'iojs' before_install: - npm install -g npm@latest npm_3.5.2.orig/node_modules/nopt/LICENSE0000644000000000000000000000137512631326456016142 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/nopt/README.md0000644000000000000000000001701212631326456016407 0ustar 00000000000000If you want to write an option parser, and have it be good, there are two ways to do it. The Right Way, and the Wrong Way. The Wrong Way is to sit down and write an option parser. We've all done that. The Right Way is to write some complex configurable program with so many options that you hit the limit of your frustration just trying to manage them all, and defer it with duct-tape solutions until you see exactly to the core of the problem, and finally snap and write an awesome option parser. If you want to write an option parser, don't write an option parser. Write a package manager, or a source control system, or a service restarter, or an operating system. You probably won't end up with a good one of those, but if you don't give up, and you are relentless and diligent enough in your procrastination, you may just end up with a very nice option parser. ## USAGE // my-program.js var nopt = require("nopt") , Stream = require("stream").Stream , path = require("path") , knownOpts = { "foo" : [String, null] , "bar" : [Stream, Number] , "baz" : path , "bloo" : [ "big", "medium", "small" ] , "flag" : Boolean , "pick" : Boolean , "many1" : [String, Array] , "many2" : [path] } , shortHands = { "foofoo" : ["--foo", "Mr. Foo"] , "b7" : ["--bar", "7"] , "m" : ["--bloo", "medium"] , "p" : ["--pick"] , "f" : ["--flag"] } // everything is optional. 
// knownOpts and shorthands default to {} // arg list defaults to process.argv // slice defaults to 2 , parsed = nopt(knownOpts, shortHands, process.argv, 2) console.log(parsed) This would give you support for any of the following: ```bash $ node my-program.js --foo "blerp" --no-flag { "foo" : "blerp", "flag" : false } $ node my-program.js ---bar 7 --foo "Mr. Hand" --flag { bar: 7, foo: "Mr. Hand", flag: true } $ node my-program.js --foo "blerp" -f -----p { foo: "blerp", flag: true, pick: true } $ node my-program.js -fp --foofoo { foo: "Mr. Foo", flag: true, pick: true } $ node my-program.js --foofoo -- -fp # -- stops the flag parsing. { foo: "Mr. Foo", argv: { remain: ["-fp"] } } $ node my-program.js --blatzk -fp # unknown opts are ok. { blatzk: true, flag: true, pick: true } $ node my-program.js --blatzk=1000 -fp # but you need to use = if they have a value { blatzk: 1000, flag: true, pick: true } $ node my-program.js --no-blatzk -fp # unless they start with "no-" { blatzk: false, flag: true, pick: true } $ node my-program.js --baz b/a/z # known paths are resolved. { baz: "/Users/isaacs/b/a/z" } # if Array is one of the types, then it can take many # values, and will always be an array. The other types provided # specify what types are allowed in the list. $ node my-program.js --many1 5 --many1 null --many1 foo { many1: ["5", "null", "foo"] } $ node my-program.js --many2 foo --many2 bar { many2: ["/path/to/foo", "path/to/bar"] } ``` Read the tests at the bottom of `lib/nopt.js` for more examples of what this puppy can do. ## Types The following types are supported, and defined on `nopt.typeDefs` * String: A normal string. No parsing is done. * path: A file system path. Gets resolved against cwd if not absolute. * url: A url. If it doesn't parse, it isn't accepted. * Number: Must be numeric. * Date: Must parse as a date. If it does, and `Date` is one of the options, then it will return a Date object, not a string. * Boolean: Must be either `true` or `false`. If an option is a boolean, then it does not need a value, and its presence will imply `true` as the value. To negate boolean flags, do `--no-whatever` or `--whatever false` * NaN: Means that the option is strictly not allowed. Any value will fail. * Stream: An object matching the "Stream" class in node. Valuable for use when validating programmatically. (npm uses this to let you supply any WriteStream on the `outfd` and `logfd` config options.) * Array: If `Array` is specified as one of the types, then the value will be parsed as a list of options. This means that multiple values can be specified, and that the value will always be an array. If a type is an array of values not on this list, then those are considered valid values. For instance, in the example above, the `--bloo` option can only be one of `"big"`, `"medium"`, or `"small"`, and any other value will be rejected. When parsing unknown fields, `"true"`, `"false"`, and `"null"` will be interpreted as their JavaScript equivalents. You can also mix types and values, or multiple types, in a list. For instance `{ blah: [Number, null] }` would allow a value to be set to either a Number or null. When types are ordered, this implies a preference, and the first type that can be used to properly interpret the value will be used. To define a new type, add it to `nopt.typeDefs`. Each item in that hash is an object with a `type` member and a `validate` method. The `type` member is an object that matches what goes in the type list. 
The `validate` method is a function that gets called with `validate(data, key, val)`. Validate methods should assign `data[key]` to the valid value of `val` if it can be handled properly, or return boolean `false` if it cannot. You can also call `nopt.clean(data, types, typeDefs)` to clean up a config object and remove its invalid properties. ## Error Handling By default, nopt outputs a warning to standard error when invalid values for known options are found. You can change this behavior by assigning a method to `nopt.invalidHandler`. This method will be called with the offending `nopt.invalidHandler(key, val, types)`. If no `nopt.invalidHandler` is assigned, then it will console.error its whining. If it is assigned to boolean `false` then the warning is suppressed. ## Abbreviations Yes, they are supported. If you define options like this: ```javascript { "foolhardyelephants" : Boolean , "pileofmonkeys" : Boolean } ``` Then this will work: ```bash node program.js --foolhar --pil node program.js --no-f --pileofmon # etc. ``` ## Shorthands Shorthands are a hash of shorter option names to a snippet of args that they expand to. If multiple one-character shorthands are all combined, and the combination does not unambiguously match any other option or shorthand, then they will be broken up into their constituent parts. For example: ```json { "s" : ["--loglevel", "silent"] , "g" : "--global" , "f" : "--force" , "p" : "--parseable" , "l" : "--long" } ``` ```bash npm ls -sgflp # just like doing this: npm ls --loglevel silent --global --force --long --parseable ``` ## The Rest of the args The config object returned by nopt is given a special member called `argv`, which is an object with the following fields: * `remain`: The remaining args after all the parsing has occurred. * `original`: The args as they originally appeared. * `cooked`: The args after flags and shorthands are expanded. ## Slicing Node programs are called with more or less the exact argv as it appears in C land, after the v8 and node-specific options have been plucked off. As such, `argv[0]` is always `node` and `argv[1]` is always the JavaScript program being run. That's usually not very useful to you. So they're sliced off by default. If you want them, then you can pass in `0` as the last argument, or any other number that you'd like to slice off the start of the list. npm_3.5.2.orig/node_modules/nopt/bin/0000755000000000000000000000000012631326456015677 5ustar 00000000000000npm_3.5.2.orig/node_modules/nopt/examples/0000755000000000000000000000000012631326456016745 5ustar 00000000000000npm_3.5.2.orig/node_modules/nopt/lib/0000755000000000000000000000000012631326456015675 5ustar 00000000000000npm_3.5.2.orig/node_modules/nopt/package.json0000644000000000000000000002137312631326456017423 0ustar 00000000000000{ "name": "nopt", "version": "3.0.4", "description": "Option parsing for Node, supporting types, shorthands, etc. Used by npm.", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "main": "lib/nopt.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+ssh://git@github.com/isaacs/nopt.git" }, "bin": { "nopt": "./bin/nopt.js" }, "license": "ISC", "dependencies": { "abbrev": "1" }, "devDependencies": { "tap": "^1.2.0" }, "readme": "If you want to write an option parser, and have it be good, there are\ntwo ways to do it. The Right Way, and the Wrong Way.\n\nThe Wrong Way is to sit down and write an option parser. 
We've all done\nthat.\n\nThe Right Way is to write some complex configurable program with so many\noptions that you hit the limit of your frustration just trying to\nmanage them all, and defer it with duct-tape solutions until you see\nexactly to the core of the problem, and finally snap and write an\nawesome option parser.\n\nIf you want to write an option parser, don't write an option parser.\nWrite a package manager, or a source control system, or a service\nrestarter, or an operating system. You probably won't end up with a\ngood one of those, but if you don't give up, and you are relentless and\ndiligent enough in your procrastination, you may just end up with a very\nnice option parser.\n\n## USAGE\n\n // my-program.js\n var nopt = require(\"nopt\")\n , Stream = require(\"stream\").Stream\n , path = require(\"path\")\n , knownOpts = { \"foo\" : [String, null]\n , \"bar\" : [Stream, Number]\n , \"baz\" : path\n , \"bloo\" : [ \"big\", \"medium\", \"small\" ]\n , \"flag\" : Boolean\n , \"pick\" : Boolean\n , \"many1\" : [String, Array]\n , \"many2\" : [path]\n }\n , shortHands = { \"foofoo\" : [\"--foo\", \"Mr. Foo\"]\n , \"b7\" : [\"--bar\", \"7\"]\n , \"m\" : [\"--bloo\", \"medium\"]\n , \"p\" : [\"--pick\"]\n , \"f\" : [\"--flag\"]\n }\n // everything is optional.\n // knownOpts and shorthands default to {}\n // arg list defaults to process.argv\n // slice defaults to 2\n , parsed = nopt(knownOpts, shortHands, process.argv, 2)\n console.log(parsed)\n\nThis would give you support for any of the following:\n\n```bash\n$ node my-program.js --foo \"blerp\" --no-flag\n{ \"foo\" : \"blerp\", \"flag\" : false }\n\n$ node my-program.js ---bar 7 --foo \"Mr. Hand\" --flag\n{ bar: 7, foo: \"Mr. Hand\", flag: true }\n\n$ node my-program.js --foo \"blerp\" -f -----p\n{ foo: \"blerp\", flag: true, pick: true }\n\n$ node my-program.js -fp --foofoo\n{ foo: \"Mr. Foo\", flag: true, pick: true }\n\n$ node my-program.js --foofoo -- -fp # -- stops the flag parsing.\n{ foo: \"Mr. Foo\", argv: { remain: [\"-fp\"] } }\n\n$ node my-program.js --blatzk -fp # unknown opts are ok.\n{ blatzk: true, flag: true, pick: true }\n\n$ node my-program.js --blatzk=1000 -fp # but you need to use = if they have a value\n{ blatzk: 1000, flag: true, pick: true }\n\n$ node my-program.js --no-blatzk -fp # unless they start with \"no-\"\n{ blatzk: false, flag: true, pick: true }\n\n$ node my-program.js --baz b/a/z # known paths are resolved.\n{ baz: \"/Users/isaacs/b/a/z\" }\n\n# if Array is one of the types, then it can take many\n# values, and will always be an array. The other types provided\n# specify what types are allowed in the list.\n\n$ node my-program.js --many1 5 --many1 null --many1 foo\n{ many1: [\"5\", \"null\", \"foo\"] }\n\n$ node my-program.js --many2 foo --many2 bar\n{ many2: [\"/path/to/foo\", \"path/to/bar\"] }\n```\n\nRead the tests at the bottom of `lib/nopt.js` for more examples of\nwhat this puppy can do.\n\n## Types\n\nThe following types are supported, and defined on `nopt.typeDefs`\n\n* String: A normal string. No parsing is done.\n* path: A file system path. Gets resolved against cwd if not absolute.\n* url: A url. If it doesn't parse, it isn't accepted.\n* Number: Must be numeric.\n* Date: Must parse as a date. If it does, and `Date` is one of the options,\n then it will return a Date object, not a string.\n* Boolean: Must be either `true` or `false`. If an option is a boolean,\n then it does not need a value, and its presence will imply `true` as\n the value. 
To negate boolean flags, do `--no-whatever` or `--whatever\n false`\n* NaN: Means that the option is strictly not allowed. Any value will\n fail.\n* Stream: An object matching the \"Stream\" class in node. Valuable\n for use when validating programmatically. (npm uses this to let you\n supply any WriteStream on the `outfd` and `logfd` config options.)\n* Array: If `Array` is specified as one of the types, then the value\n will be parsed as a list of options. This means that multiple values\n can be specified, and that the value will always be an array.\n\nIf a type is an array of values not on this list, then those are\nconsidered valid values. For instance, in the example above, the\n`--bloo` option can only be one of `\"big\"`, `\"medium\"`, or `\"small\"`,\nand any other value will be rejected.\n\nWhen parsing unknown fields, `\"true\"`, `\"false\"`, and `\"null\"` will be\ninterpreted as their JavaScript equivalents.\n\nYou can also mix types and values, or multiple types, in a list. For\ninstance `{ blah: [Number, null] }` would allow a value to be set to\neither a Number or null. When types are ordered, this implies a\npreference, and the first type that can be used to properly interpret\nthe value will be used.\n\nTo define a new type, add it to `nopt.typeDefs`. Each item in that\nhash is an object with a `type` member and a `validate` method. The\n`type` member is an object that matches what goes in the type list. The\n`validate` method is a function that gets called with `validate(data,\nkey, val)`. Validate methods should assign `data[key]` to the valid\nvalue of `val` if it can be handled properly, or return boolean\n`false` if it cannot.\n\nYou can also call `nopt.clean(data, types, typeDefs)` to clean up a\nconfig object and remove its invalid properties.\n\n## Error Handling\n\nBy default, nopt outputs a warning to standard error when invalid values for\nknown options are found. You can change this behavior by assigning a method\nto `nopt.invalidHandler`. This method will be called with\nthe offending `nopt.invalidHandler(key, val, types)`.\n\nIf no `nopt.invalidHandler` is assigned, then it will console.error\nits whining. If it is assigned to boolean `false` then the warning is\nsuppressed.\n\n## Abbreviations\n\nYes, they are supported. If you define options like this:\n\n```javascript\n{ \"foolhardyelephants\" : Boolean\n, \"pileofmonkeys\" : Boolean }\n```\n\nThen this will work:\n\n```bash\nnode program.js --foolhar --pil\nnode program.js --no-f --pileofmon\n# etc.\n```\n\n## Shorthands\n\nShorthands are a hash of shorter option names to a snippet of args that\nthey expand to.\n\nIf multiple one-character shorthands are all combined, and the\ncombination does not unambiguously match any other option or shorthand,\nthen they will be broken up into their constituent parts. 
For example:\n\n```json\n{ \"s\" : [\"--loglevel\", \"silent\"]\n, \"g\" : \"--global\"\n, \"f\" : \"--force\"\n, \"p\" : \"--parseable\"\n, \"l\" : \"--long\"\n}\n```\n\n```bash\nnpm ls -sgflp\n# just like doing this:\nnpm ls --loglevel silent --global --force --long --parseable\n```\n\n## The Rest of the args\n\nThe config object returned by nopt is given a special member called\n`argv`, which is an object with the following fields:\n\n* `remain`: The remaining args after all the parsing has occurred.\n* `original`: The args as they originally appeared.\n* `cooked`: The args after flags and shorthands are expanded.\n\n## Slicing\n\nNode programs are called with more or less the exact argv as it appears\nin C land, after the v8 and node-specific options have been plucked off.\nAs such, `argv[0]` is always `node` and `argv[1]` is always the\nJavaScript program being run.\n\nThat's usually not very useful to you. So they're sliced off by\ndefault. If you want them, then you can pass in `0` as the last\nargument, or any other number that you'd like to slice off the start of\nthe list.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/nopt/issues" }, "homepage": "https://github.com/isaacs/nopt#readme", "_id": "nopt@3.0.4", "_shasum": "dd63bc9c72a6e4e85b85cdc0ca314598facede5e", "_resolved": "https://registry.npmjs.org/nopt/-/nopt-3.0.4.tgz", "_from": "nopt@>=3.0.4 <3.1.0" } npm_3.5.2.orig/node_modules/nopt/test/0000755000000000000000000000000012631326456016106 5ustar 00000000000000npm_3.5.2.orig/node_modules/nopt/bin/nopt.js0000755000000000000000000000301512631326456017217 0ustar 00000000000000#!/usr/bin/env node var nopt = require("../lib/nopt") , path = require("path") , types = { num: Number , bool: Boolean , help: Boolean , list: Array , "num-list": [Number, Array] , "str-list": [String, Array] , "bool-list": [Boolean, Array] , str: String , clear: Boolean , config: Boolean , length: Number , file: path } , shorthands = { s: [ "--str", "astring" ] , b: [ "--bool" ] , nb: [ "--no-bool" ] , tft: [ "--bool-list", "--no-bool-list", "--bool-list", "true" ] , "?": ["--help"] , h: ["--help"] , H: ["--help"] , n: [ "--num", "125" ] , c: ["--config"] , l: ["--length"] , f: ["--file"] } , parsed = nopt( types , shorthands , process.argv , 2 ) console.log("parsed", parsed) if (parsed.help) { console.log("") console.log("nopt cli tester") console.log("") console.log("types") console.log(Object.keys(types).map(function M (t) { var type = types[t] if (Array.isArray(type)) { return [t, type.map(function (type) { return type.name })] } return [t, type && type.name] }).reduce(function (s, i) { s[i[0]] = i[1] return s }, {})) console.log("") console.log("shorthands") console.log(shorthands) } npm_3.5.2.orig/node_modules/nopt/examples/my-program.js0000755000000000000000000000201012631326456021371 0ustar 00000000000000#!/usr/bin/env node //process.env.DEBUG_NOPT = 1 // my-program.js var nopt = require("../lib/nopt") , Stream = require("stream").Stream , path = require("path") , knownOpts = { "foo" : [String, null] , "bar" : [Stream, Number] , "baz" : path , "bloo" : [ "big", "medium", "small" ] , "flag" : Boolean , "pick" : Boolean } , shortHands = { "foofoo" : ["--foo", "Mr. Foo"] , "b7" : ["--bar", "7"] , "m" : ["--bloo", "medium"] , "p" : ["--pick"] , "f" : ["--flag", "true"] , "g" : ["--flag"] , "s" : "--flag" } // everything is optional. 
// knownOpts and shorthands default to {} // arg list defaults to process.argv // slice defaults to 2 , parsed = nopt(knownOpts, shortHands, process.argv, 2) console.log("parsed =\n"+ require("util").inspect(parsed)) npm_3.5.2.orig/node_modules/nopt/lib/nopt.js0000644000000000000000000002660512631326456017224 0ustar 00000000000000// info about each config option. var debug = process.env.DEBUG_NOPT || process.env.NOPT_DEBUG ? function () { console.error.apply(console, arguments) } : function () {} var url = require("url") , path = require("path") , Stream = require("stream").Stream , abbrev = require("abbrev") module.exports = exports = nopt exports.clean = clean exports.typeDefs = { String : { type: String, validate: validateString } , Boolean : { type: Boolean, validate: validateBoolean } , url : { type: url, validate: validateUrl } , Number : { type: Number, validate: validateNumber } , path : { type: path, validate: validatePath } , Stream : { type: Stream, validate: validateStream } , Date : { type: Date, validate: validateDate } } function nopt (types, shorthands, args, slice) { args = args || process.argv types = types || {} shorthands = shorthands || {} if (typeof slice !== "number") slice = 2 debug(types, shorthands, args, slice) args = args.slice(slice) var data = {} , key , remain = [] , cooked = args , original = args.slice(0) parse(args, data, remain, types, shorthands) // now data is full clean(data, types, exports.typeDefs) data.argv = {remain:remain,cooked:cooked,original:original} Object.defineProperty(data.argv, 'toString', { value: function () { return this.original.map(JSON.stringify).join(" ") }, enumerable: false }) return data } function clean (data, types, typeDefs) { typeDefs = typeDefs || exports.typeDefs var remove = {} , typeDefault = [false, true, null, String, Array] Object.keys(data).forEach(function (k) { if (k === "argv") return var val = data[k] , isArray = Array.isArray(val) , type = types[k] if (!isArray) val = [val] if (!type) type = typeDefault if (type === Array) type = typeDefault.concat(Array) if (!Array.isArray(type)) type = [type] debug("val=%j", val) debug("types=", type) val = val.map(function (val) { // if it's an unknown value, then parse false/true/null/numbers/dates if (typeof val === "string") { debug("string %j", val) val = val.trim() if ((val === "null" && ~type.indexOf(null)) || (val === "true" && (~type.indexOf(true) || ~type.indexOf(Boolean))) || (val === "false" && (~type.indexOf(false) || ~type.indexOf(Boolean)))) { val = JSON.parse(val) debug("jsonable %j", val) } else if (~type.indexOf(Number) && !isNaN(val)) { debug("convert to number", val) val = +val } else if (~type.indexOf(Date) && !isNaN(Date.parse(val))) { debug("convert to date", val) val = new Date(val) } } if (!types.hasOwnProperty(k)) { return val } // allow `--no-blah` to set 'blah' to null if null is allowed if (val === false && ~type.indexOf(null) && !(~type.indexOf(false) || ~type.indexOf(Boolean))) { val = null } var d = {} d[k] = val debug("prevalidated val", d, val, types[k]) if (!validate(d, k, val, types[k], typeDefs)) { if (exports.invalidHandler) { exports.invalidHandler(k, val, types[k], data) } else if (exports.invalidHandler !== false) { debug("invalid: "+k+"="+val, types[k]) } return remove } debug("validated val", d, val, types[k]) return d[k] }).filter(function (val) { return val !== remove }) if (!val.length) delete data[k] else if (isArray) { debug(isArray, data[k], val) data[k] = val } else data[k] = val[0] debug("k=%s val=%j", k, val, data[k]) }) } 
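// The validators below back the entries in exports.typeDefs above. Per the
// contract described in the README, each one assigns data[k] to the
// cleaned-up value when val can be handled, and returns false when it cannot.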
function validateString (data, k, val) { data[k] = String(val) } function validatePath (data, k, val) { if (val === true) return false if (val === null) return true val = String(val) var homePattern = process.platform === 'win32' ? /^~(\/|\\)/ : /^~\// if (val.match(homePattern) && process.env.HOME) { val = path.resolve(process.env.HOME, val.substr(2)) } data[k] = path.resolve(String(val)) return true } function validateNumber (data, k, val) { debug("validate Number %j %j %j", k, val, isNaN(val)) if (isNaN(val)) return false data[k] = +val } function validateDate (data, k, val) { debug("validate Date %j %j %j", k, val, Date.parse(val)) var s = Date.parse(val) if (isNaN(s)) return false data[k] = new Date(val) } function validateBoolean (data, k, val) { if (val instanceof Boolean) val = val.valueOf() else if (typeof val === "string") { if (!isNaN(val)) val = !!(+val) else if (val === "null" || val === "false") val = false else val = true } else val = !!val data[k] = val } function validateUrl (data, k, val) { val = url.parse(String(val)) if (!val.host) return false data[k] = val.href } function validateStream (data, k, val) { if (!(val instanceof Stream)) return false data[k] = val } function validate (data, k, val, type, typeDefs) { // arrays are lists of types. if (Array.isArray(type)) { for (var i = 0, l = type.length; i < l; i ++) { if (type[i] === Array) continue if (validate(data, k, val, type[i], typeDefs)) return true } delete data[k] return false } // an array of anything? if (type === Array) return true // NaN is poisonous. Means that something is not allowed. if (type !== type) { debug("Poison NaN", k, val, type) delete data[k] return false } // explicit list of values if (val === type) { debug("Explicitly allowed %j", val) // if (isArray) (data[k] = data[k] || []).push(val) // else data[k] = val data[k] = val return true } // now go through the list of typeDefs, validate against each one. var ok = false , types = Object.keys(typeDefs) for (var i = 0, l = types.length; i < l; i ++) { debug("test type %j %j %j", k, val, types[i]) var t = typeDefs[types[i]] if (t && type === t.type) { var d = {} ok = false !== t.validate(d, k, val) val = d[k] if (ok) { // if (isArray) (data[k] = data[k] || []).push(val) // else data[k] = val data[k] = val break } } } debug("OK? %j (%j %j %j)", ok, k, val, types[i]) if (!ok) delete data[k] return ok } function parse (args, data, remain, types, shorthands) { debug("parse", args, data, remain) var key = null , abbrevs = abbrev(Object.keys(types)) , shortAbbr = abbrev(Object.keys(shorthands)) for (var i = 0; i < args.length; i ++) { var arg = args[i] debug("arg", arg) if (arg.match(/^-{2,}$/)) { // done with keys. // the rest are args. remain.push.apply(remain, args.slice(i + 1)) args[i] = "--" break } var hadEq = false if (arg.charAt(0) === "-" && arg.length > 1) { if (arg.indexOf("=") !== -1) { hadEq = true var v = arg.split("=") arg = v.shift() v = v.join("=") args.splice.apply(args, [i, 1].concat([arg, v])) } // see if it's a shorthand // if so, splice and back up to re-parse it. 
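// e.g. with the README's shorthands { "f": ["--flag"], "p": ["--pick"] },
// "-fp" is spliced out into "--flag" and "--pick", and i is backed up so
// the expanded flags get parsed on the following iterations.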
var shRes = resolveShort(arg, shorthands, shortAbbr, abbrevs) debug("arg=%j shRes=%j", arg, shRes) if (shRes) { debug(arg, shRes) args.splice.apply(args, [i, 1].concat(shRes)) if (arg !== shRes[0]) { i -- continue } } arg = arg.replace(/^-+/, "") var no = null while (arg.toLowerCase().indexOf("no-") === 0) { no = !no arg = arg.substr(3) } if (abbrevs[arg]) arg = abbrevs[arg] var isArray = types[arg] === Array || Array.isArray(types[arg]) && types[arg].indexOf(Array) !== -1 // allow unknown things to be arrays if specified multiple times. if (!types.hasOwnProperty(arg) && data.hasOwnProperty(arg)) { if (!Array.isArray(data[arg])) data[arg] = [data[arg]] isArray = true } var val , la = args[i + 1] var isBool = typeof no === 'boolean' || types[arg] === Boolean || Array.isArray(types[arg]) && types[arg].indexOf(Boolean) !== -1 || (typeof types[arg] === 'undefined' && !hadEq) || (la === "false" && (types[arg] === null || Array.isArray(types[arg]) && ~types[arg].indexOf(null))) if (isBool) { // just set and move along val = !no // however, also support --bool true or --bool false if (la === "true" || la === "false") { val = JSON.parse(la) la = null if (no) val = !val i ++ } // also support "foo":[Boolean, "bar"] and "--foo bar" if (Array.isArray(types[arg]) && la) { if (~types[arg].indexOf(la)) { // an explicit type val = la i ++ } else if ( la === "null" && ~types[arg].indexOf(null) ) { // null allowed val = null i ++ } else if ( !la.match(/^-{2,}[^-]/) && !isNaN(la) && ~types[arg].indexOf(Number) ) { // number val = +la i ++ } else if ( !la.match(/^-[^-]/) && ~types[arg].indexOf(String) ) { // string val = la i ++ } } if (isArray) (data[arg] = data[arg] || []).push(val) else data[arg] = val continue } if (types[arg] === String && la === undefined) la = "" if (la && la.match(/^-{2,}$/)) { la = undefined i -- } val = la === undefined ? true : la if (isArray) (data[arg] = data[arg] || []).push(val) else data[arg] = val i ++ continue } remain.push(arg) } } function resolveShort (arg, shorthands, shortAbbr, abbrevs) { // handle single-char shorthands glommed together, like // npm ls -glp, but only if there is one dash, and only if // all of the chars are single-char shorthands, and it's // not a match to some other abbrev. 
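// e.g. the README's "npm ls -sgflp" resolves to
// ["--loglevel", "silent", "--global", "--force", "--long", "--parseable"]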
arg = arg.replace(/^-+/, '') // if it's an exact known option, then don't go any further if (abbrevs[arg] === arg) return null // if it's an exact known shortopt, same deal if (shorthands[arg]) { // make it an array, if it's a list of words if (shorthands[arg] && !Array.isArray(shorthands[arg])) shorthands[arg] = shorthands[arg].split(/\s+/) return shorthands[arg] } // first check to see if this arg is a set of single-char shorthands var singles = shorthands.___singles if (!singles) { singles = Object.keys(shorthands).filter(function (s) { return s.length === 1 }).reduce(function (l,r) { l[r] = true return l }, {}) shorthands.___singles = singles debug('shorthand singles', singles) } var chrs = arg.split("").filter(function (c) { return singles[c] }) if (chrs.join("") === arg) return chrs.map(function (c) { return shorthands[c] }).reduce(function (l, r) { return l.concat(r) }, []) // if it's an arg abbrev, and not a literal shorthand, then prefer the arg if (abbrevs[arg] && !shorthands[arg]) return null // if it's an abbr for a shorthand, then use that if (shortAbbr[arg]) arg = shortAbbr[arg] // make it an array, if it's a list of words if (shorthands[arg] && !Array.isArray(shorthands[arg])) shorthands[arg] = shorthands[arg].split(/\s+/) return shorthands[arg] } npm_3.5.2.orig/node_modules/nopt/test/basic.js0000644000000000000000000001612012631326456017525 0ustar 00000000000000var nopt = require("../") , test = require('tap').test test("passing a string results in a string", function (t) { var parsed = nopt({ key: String }, {}, ["--key", "myvalue"], 0) t.same(parsed.key, "myvalue") t.end() }) // https://github.com/npm/nopt/issues/31 test("Empty String results in empty string, not true", function (t) { var parsed = nopt({ empty: String }, {}, ["--empty"], 0) t.same(parsed.empty, "") t.end() }) test("~ path is resolved to $HOME", function (t) { var path = require("path") if (!process.env.HOME) process.env.HOME = "/tmp" var parsed = nopt({key: path}, {}, ["--key=~/val"], 0) t.same(parsed.key, path.resolve(process.env.HOME, "val")) t.end() }) // https://github.com/npm/nopt/issues/24 test("Unknown options are not parsed as numbers", function (t) { var parsed = nopt({"parse-me": Number}, null, ['--leave-as-is=1.20', '--parse-me=1.20'], 0) t.equal(parsed['leave-as-is'], '1.20') t.equal(parsed['parse-me'], 1.2) t.end() }); test("other tests", function (t) { var util = require("util") , Stream = require("stream") , path = require("path") , url = require("url") , shorthands = { s : ["--loglevel", "silent"] , d : ["--loglevel", "info"] , dd : ["--loglevel", "verbose"] , ddd : ["--loglevel", "silly"] , noreg : ["--no-registry"] , reg : ["--registry"] , "no-reg" : ["--no-registry"] , silent : ["--loglevel", "silent"] , verbose : ["--loglevel", "verbose"] , h : ["--usage"] , H : ["--usage"] , "?" 
: ["--usage"] , help : ["--usage"] , v : ["--version"] , f : ["--force"] , desc : ["--description"] , "no-desc" : ["--no-description"] , "local" : ["--no-global"] , l : ["--long"] , p : ["--parseable"] , porcelain : ["--parseable"] , g : ["--global"] } , types = { aoa: Array , nullstream: [null, Stream] , date: Date , str: String , browser : String , cache : path , color : ["always", Boolean] , depth : Number , description : Boolean , dev : Boolean , editor : path , force : Boolean , global : Boolean , globalconfig : path , group : [String, Number] , gzipbin : String , logfd : [Number, Stream] , loglevel : ["silent","win","error","warn","info","verbose","silly"] , long : Boolean , "node-version" : [false, String] , npaturl : url , npat : Boolean , "onload-script" : [false, String] , outfd : [Number, Stream] , parseable : Boolean , pre: Boolean , prefix: path , proxy : url , "rebuild-bundle" : Boolean , registry : url , searchopts : String , searchexclude: [null, String] , shell : path , t: [Array, String] , tag : String , tar : String , tmp : path , "unsafe-perm" : Boolean , usage : Boolean , user : String , username : String , userconfig : path , version : Boolean , viewer: path , _exit : Boolean , path: path } ; [["-v", {version:true}, []] ,["---v", {version:true}, []] ,["ls -s --no-reg connect -d", {loglevel:"info",registry:null},["ls","connect"]] ,["ls ---s foo",{loglevel:"silent"},["ls","foo"]] ,["ls --registry blargle", {}, ["ls"]] ,["--no-registry", {registry:null}, []] ,["--no-color true", {color:false}, []] ,["--no-color false", {color:true}, []] ,["--no-color", {color:false}, []] ,["--color false", {color:false}, []] ,["--color --logfd 7", {logfd:7,color:true}, []] ,["--color=true", {color:true}, []] ,["--logfd=10", {logfd:10}, []] ,["--tmp=/tmp -tar=gtar",{tmp:"/tmp",tar:"gtar"},[]] ,["--tmp=tmp -tar=gtar", {tmp:path.resolve(process.cwd(), "tmp"),tar:"gtar"},[]] ,["--logfd x", {}, []] ,["a -true -- -no-false", {true:true},["a","-no-false"]] ,["a -no-false", {false:false},["a"]] ,["a -no-no-true", {true:true}, ["a"]] ,["a -no-no-no-false", {false:false}, ["a"]] ,["---NO-no-No-no-no-no-nO-no-no"+ "-No-no-no-no-no-no-no-no-no"+ "-no-no-no-no-NO-NO-no-no-no-no-no-no"+ "-no-body-can-do-the-boogaloo-like-I-do" ,{"body-can-do-the-boogaloo-like-I-do":false}, []] ,["we are -no-strangers-to-love "+ "--you-know=the-rules --and=so-do-i "+ "---im-thinking-of=a-full-commitment "+ "--no-you-would-get-this-from-any-other-guy "+ "--no-gonna-give-you-up "+ "-no-gonna-let-you-down=true "+ "--no-no-gonna-run-around false "+ "--desert-you=false "+ "--make-you-cry false "+ "--no-tell-a-lie "+ "--no-no-and-hurt-you false" ,{"strangers-to-love":false ,"you-know":"the-rules" ,"and":"so-do-i" ,"you-would-get-this-from-any-other-guy":false ,"gonna-give-you-up":false ,"gonna-let-you-down":false ,"gonna-run-around":false ,"desert-you":false ,"make-you-cry":false ,"tell-a-lie":false ,"and-hurt-you":false },["we", "are"]] ,["-t one -t two -t three" ,{t: ["one", "two", "three"]} ,[]] ,["-t one -t null -t three four five null" ,{t: ["one", "null", "three"]} ,["four", "five", "null"]] ,["-t foo" ,{t:["foo"]} ,[]] ,["--no-t" ,{t:["false"]} ,[]] ,["-no-no-t" ,{t:["true"]} ,[]] ,["-aoa one -aoa null -aoa 100" ,{aoa:["one", null, '100']} ,[]] ,["-str 100" ,{str:"100"} ,[]] ,["--color always" ,{color:"always"} ,[]] ,["--no-nullstream" ,{nullstream:null} ,[]] ,["--nullstream false" ,{nullstream:null} ,[]] ,["--notadate=2011-01-25" ,{notadate: "2011-01-25"} ,[]] ,["--date 2011-01-25" ,{date: new Date("2011-01-25")} 
,[]] ,["-cl 1" ,{config: true, length: 1} ,[] ,{config: Boolean, length: Number, clear: Boolean} ,{c: "--config", l: "--length"}] ,["--acount bla" ,{"acount":true} ,["bla"] ,{account: Boolean, credentials: Boolean, options: String} ,{a:"--account", c:"--credentials",o:"--options"}] ,["--clear" ,{clear:true} ,[] ,{clear:Boolean,con:Boolean,len:Boolean,exp:Boolean,add:Boolean,rep:Boolean} ,{c:"--con",l:"--len",e:"--exp",a:"--add",r:"--rep"}] ,["--file -" ,{"file":"-"} ,[] ,{file:String} ,{}] ,["--file -" ,{"file":true} ,["-"] ,{file:Boolean} ,{}] ,["--path" ,{"path":null} ,[]] ,["--path ." ,{"path":process.cwd()} ,[]] ].forEach(function (test) { var argv = test[0].split(/\s+/) , opts = test[1] , rem = test[2] , actual = nopt(test[3] || types, test[4] || shorthands, argv, 0) , parsed = actual.argv delete actual.argv for (var i in opts) { var e = JSON.stringify(opts[i]) , a = JSON.stringify(actual[i] === undefined ? null : actual[i]) if (e && typeof e === "object") { t.deepEqual(e, a) } else { t.equal(e, a) } } t.deepEqual(rem, parsed.remain) }) t.end() }) npm_3.5.2.orig/node_modules/normalize-git-url/.npmignore0000644000000000000000000000001612631326456021524 0ustar 00000000000000node_modules/ npm_3.5.2.orig/node_modules/normalize-git-url/CHANGELOG.md0000644000000000000000000000032412631326456021340 0ustar 00000000000000### 1.0.0 (2014-12-25): * [`8b3d874`](https://github.com/npm/normalize-git-url/commit/8b3d874afd14f4cdde65d418e0a35a615c746bba) Initial version, with simple tests. ([@othiym23](https://github.com/othiym23)) npm_3.5.2.orig/node_modules/normalize-git-url/LICENSE0000644000000000000000000000134512631326456020540 0ustar 00000000000000Copyright (c) 2014-2015, Forrest L Norvell Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/normalize-git-url/README.md0000644000000000000000000000253312631326456021012 0ustar 00000000000000# normalize-git-url You have a bunch of Git URLs. You want to convert them to a canonical representation, probably for use inside npm so that it doesn't end up creating a bunch of superfluous cached origins. You use this package. ## Usage ```javascript var ngu = require('normalize-git-url'); var normalized = ngu("git+ssh://git@github.com:organization/repo.git#hashbrowns") // get back: // { // url : "ssh://git@github.com/organization/repo.git", // branch : "hashbrowns" // did u know hashbrowns are delicious? // } ``` ## API There's just the one function, and all it takes is a single parameter, a non-normalized Git URL. ### normalizeGitUrl(url) * `url` {String} The Git URL (very loosely speaking) to be normalized. Returns an object with the following format: * `url` {String} The normalized URL. * `branch` {String} The treeish to be checked out once the repo at `url` is cloned. It doesn't have to be a branch, but it's a lot easier to intuit what the output is for with that name. 
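A couple of concrete calls, taken straight from this package's own `test/basic.js`, illustrate the branch defaulting and prefix stripping described above:

```javascript
var normalize = require('normalize-git-url')

// no #treeish on the URL, so branch falls back to 'master'
normalize('git+https://github.com/KenanY/node-uuid')
// => { url: 'https://github.com/KenanY/node-uuid', branch: 'master' }

// scp-style GitHub URL: the git+ssh:// prefix is dropped, the treeish split off
normalize('git+ssh://git@github.com:npm/npm.git#v1.0.27')
// => { url: 'git@github.com:npm/npm.git', branch: 'v1.0.27' }
```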
## Limitations Right now this doesn't try to special-case GitHub too much -- it doesn't ensure that `.git` is added to the end of URLs, it doesn't prefer `https:` over `http:` or `ssh:`, it doesn't deal with redirects, and it doesn't try to resolve symbolic names to treeish hashcodes. For now, it just tries to account for minor differences in representation. npm_3.5.2.orig/node_modules/normalize-git-url/normalize-git-url.js0000644000000000000000000000210312631326456023443 0ustar 00000000000000var url = require('url') module.exports = function normalize (u) { var parsed = url.parse(u) // If parsing actually alters the URL, it is almost certainly an // scp-style URL, or an invalid one. var altered = u !== url.format(parsed) // git is so tricky! // if the path is like ssh://foo:22/some/path then it works, but // it needs the ssh:// // If the path is like ssh://foo:some/path then it works, but // only if you remove the ssh:// if (parsed.protocol) { parsed.protocol = parsed.protocol.replace(/^git\+/, '') } // figure out what we should check out. var checkout = parsed.hash && parsed.hash.substr(1) || 'master' parsed.hash = '' var returnedUrl if (altered) { if (u.match(/^git\+https?/) && parsed.pathname.match(/\/?:[^0-9]/)) { returnedUrl = u.replace(/^git\+(.*:[^:]+):(.*)/, '$1/$2') } else { returnedUrl = u.replace(/^(?:git\+)?ssh:\/\//, '') } returnedUrl = returnedUrl.replace(/#[^#]*$/, '') } else { returnedUrl = url.format(parsed) } return { url: returnedUrl, branch: checkout } } npm_3.5.2.orig/node_modules/normalize-git-url/package.json0000644000000000000000000000305312631326456022017 0ustar 00000000000000{ "name": "normalize-git-url", "version": "3.0.1", "description": "Normalizes Git URLs. For npm, but you can use it too.", "main": "normalize-git-url.js", "directories": { "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "^1.1.0" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/npm/normalize-git-url.git" }, "keywords": [ "git", "github", "url", "normalize", "npm" ], "author": { "name": "Forrest L Norvell", "email": "ogd@aoaioxxysz.net" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/normalize-git-url/issues" }, "homepage": "https://github.com/npm/normalize-git-url", "gitHead": "8393cd4345e404eb6ad2ff6853dcc8287807ca22", "_id": "normalize-git-url@3.0.1", "_shasum": "d40d419d05a15870271e50534dbb7b8ccd9b0a5c", "_from": "normalize-git-url@>=3.0.1 <3.1.0", "_npmVersion": "3.1.2", "_nodeVersion": "2.2.2", "_npmUser": { "name": "zkat", "email": "kat@sykosomatic.org" }, "dist": { "shasum": "d40d419d05a15870271e50534dbb7b8ccd9b0a5c", "tarball": "http://registry.npmjs.org/normalize-git-url/-/normalize-git-url-3.0.1.tgz" }, "maintainers": [ { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, { "name": "iarna", "email": "me@re-becca.org" }, { "name": "zkat", "email": "kat@sykosomatic.org" } ], "_resolved": "https://registry.npmjs.org/normalize-git-url/-/normalize-git-url-3.0.1.tgz" } npm_3.5.2.orig/node_modules/normalize-git-url/test/0000755000000000000000000000000012631326456020507 5ustar 00000000000000npm_3.5.2.orig/node_modules/normalize-git-url/test/basic.js0000644000000000000000000000423112631326456022126 0ustar 00000000000000var test = require('tap').test var normalize = require('../normalize-git-url.js') test('basic normalization tests', function (t) { t.same( normalize('git+ssh://user@hostname:project.git#commit-ish'), { url: 'user@hostname:project.git', branch: 'commit-ish' } ) t.same( 
normalize('git+http://user@hostname/project/blah.git#commit-ish'), { url: 'http://user@hostname/project/blah.git', branch: 'commit-ish' } ) t.same( normalize('git+https://user@hostname/project/blah.git#commit-ish'), { url: 'https://user@hostname/project/blah.git', branch: 'commit-ish' } ) t.same( normalize('git+https://user@hostname:project/blah.git#commit-ish'), { url: 'https://user@hostname/project/blah.git', branch: 'commit-ish' } ) t.same( normalize('git+ssh://git@github.com:npm/npm.git#v1.0.27'), { url: 'git@github.com:npm/npm.git', branch: 'v1.0.27' } ) t.same( normalize('git+ssh://git@github.com:/npm/npm.git#v1.0.28'), { url: 'git@github.com:/npm/npm.git', branch: 'v1.0.28' } ) t.same( normalize('git+ssh://git@github.com:org/repo#dev'), { url: 'git@github.com:org/repo', branch: 'dev' } ) t.same( normalize('git+ssh://git@github.com/org/repo#dev'), { url: 'ssh://git@github.com/org/repo', branch: 'dev' } ) t.same( normalize('git+ssh://foo:22/some/path'), { url: 'ssh://foo:22/some/path', branch: 'master' } ) t.same( normalize('git@github.com:org/repo#dev'), { url: 'git@github.com:org/repo', branch: 'dev' } ) t.same( normalize('git+https://github.com/KenanY/node-uuid'), { url: 'https://github.com/KenanY/node-uuid', branch: 'master' } ) t.same( normalize('git+https://github.com/KenanY/node-uuid#7a018f2d075b03a73409e8356f9b29c9ad4ea2c5'), { url: 'https://github.com/KenanY/node-uuid', branch: '7a018f2d075b03a73409e8356f9b29c9ad4ea2c5' } ) t.same( normalize('git+ssh://git@git.example.com:b/b.git#v1.0.0'), { url: 'git@git.example.com:b/b.git', branch: 'v1.0.0' } ) t.same( normalize('git+ssh://git@github.com:npm/npm-proto.git#othiym23/organized'), { url: 'git@github.com:npm/npm-proto.git', branch: 'othiym23/organized' } ) t.end() }) npm_3.5.2.orig/node_modules/normalize-package-data/.npmignore0000644000000000000000000000001612631326456022443 0ustar 00000000000000/node_modules/npm_3.5.2.orig/node_modules/normalize-package-data/.travis.yml0000644000000000000000000000004612631326456022560 0ustar 00000000000000language: node_js node_js: - "0.10" npm_3.5.2.orig/node_modules/normalize-package-data/AUTHORS0000644000000000000000000000022712631326456021520 0ustar 00000000000000# Names sorted by how much code was originally theirs. Isaac Z. Schlueter Meryn Stol Robert Kowalski npm_3.5.2.orig/node_modules/normalize-package-data/LICENSE0000644000000000000000000000256312631326456021462 0ustar 00000000000000This package contains code originally written by Isaac Z. Schlueter. Used with permission. Copyright (c) Meryn Stol ("Author") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. npm_3.5.2.orig/node_modules/normalize-package-data/README.md0000644000000000000000000001616112631326456021733 0ustar 00000000000000# normalize-package-data [![Build Status](https://travis-ci.org/npm/normalize-package-data.png?branch=master)](https://travis-ci.org/npm/normalize-package-data) normalize-package-data exports a function that normalizes package metadata. This data is typically found in a package.json file, but in principle could come from any source - for example the npm registry. normalize-package-data is used by [read-package-json](https://npmjs.org/package/read-package-json) to normalize the data it reads from a package.json file. In turn, read-package-json is used by [npm](https://npmjs.org/package/npm) and various npm-related tools. ## Installation ``` npm install normalize-package-data ``` ## Usage Basic usage is really simple. You call the function that normalize-package-data exports. Let's call it `normalizeData`. ```javascript fs = require("fs") normalizeData = require('normalize-package-data') packageData = JSON.parse(fs.readFileSync("package.json")) normalizeData(packageData) // packageData is now normalized ``` #### Strict mode You may activate strict validation by passing true as the second argument. ```javascript fs = require("fs") normalizeData = require('normalize-package-data') packageData = JSON.parse(fs.readFileSync("package.json")) normalizeData(packageData, true) // packageData is now normalized ``` If strict mode is activated, only Semver 2.0 version strings are accepted. Otherwise, Semver 1.0 strings are accepted as well. Packages must have a name, and the name field must not contain leading or trailing whitespace. #### Warnings Optionally, you may pass a "warning" function. It gets called whenever the `normalizeData` function encounters something that doesn't look right. It indicates less than perfect input data. ```javascript fs = require("fs") normalizeData = require('normalize-package-data') packageData = JSON.parse(fs.readFileSync("package.json")) warnFn = function(msg) { console.error(msg) } normalizeData(packageData, warnFn) // packageData is now normalized. Any number of warnings may have been logged. ``` You may combine strict validation with warnings by passing `true` as the second argument, and `warnFn` as third. When `private` field is set to `true`, warnings will be suppressed. ### Potential exceptions If the supplied data has an invalid name or version field, `normalizeData` will throw an error. Depending on where you call `normalizeData`, you may want to catch these errors so you can pass them to a callback. ## What normalization (currently) entails * The value of `name` field gets trimmed (unless in strict mode). * The value of the `version` field gets cleaned by `semver.clean`. See [documentation for the semver module](https://github.com/isaacs/node-semver). * If `name` and/or `version` fields are missing, they are set to empty strings. * If `files` field is not an array, it will be removed.
* If `bin` field is a string, then `bin` field will become an object with `name` set to the value of the `name` field, and `bin` set to the original string value. * If `man` field is a string, it will become an array with the original string as its sole member. * If `keywords` field is a string, it is considered to be a list of keywords separated by one or more white-space characters. It gets converted to an array by splitting on `\s+`. * All people fields (`author`, `maintainers`, `contributors`) get converted into objects with name, email and url properties. * If `bundledDependencies` field (a typo) exists and `bundleDependencies` field does not, `bundledDependencies` will get renamed to `bundleDependencies`. * If the value of any of the dependencies fields (`dependencies`, `devDependencies`, `optionalDependencies`) is a string, it gets converted into an object with familiar `name=>value` pairs. * The values in `optionalDependencies` get added to `dependencies`. The `optionalDependencies` array is left untouched. * As of v2: Dependencies that point at known hosted git providers (currently: github, bitbucket, gitlab) will have their URLs canonicalized, but protocols will be preserved. * As of v2: Dependencies that use shortcuts for hosted git providers (`org/proj`, `github:org/proj`, `bitbucket:org/proj`, `gitlab:org/proj`, `gist:docid`) will have the shortcut left in place. (In the case of github, the `org/proj` form will be expanded to `github:org/proj`.) THIS MARKS A BREAKING CHANGE FROM V1, where the shortcut was previously expanded to a URL. * If `description` field does not exist, but `readme` field does, then (more or less) the first paragraph of text that's found in the readme is taken as value for `description`. * If `repository` field is a string, it will become an object with `url` set to the original string value, and `type` set to `"git"`. * If `repository.url` is not a valid url, but in the style of "[owner-name]/[repo-name]", `repository.url` will be set to git+https://github.com/[owner-name]/[repo-name].git * If `bugs` field is a string, the value of `bugs` field is changed into an object with `url` set to the original string value. * If `bugs` field does not exist, but `repository` field points to a repository hosted on GitHub, the value of the `bugs` field gets set to an url in the form of https://github.com/[owner-name]/[repo-name]/issues . If the repository field points to a GitHub Gist repo url, the associated http url is chosen. * If `bugs` field is an object, the resulting value only has email and url properties. If email and url properties are not strings, they are ignored. If no valid values for either email or url are found, bugs field will be removed. * If `homepage` field is not a string, it will be removed. * If the url in the `homepage` field does not specify a protocol, then http is assumed. For example, `myproject.org` will be changed to `http://myproject.org`. * If `homepage` field does not exist, but `repository` field points to a repository hosted on GitHub, the value of the `homepage` field gets set to an url in the form of https://github.com/[owner-name]/[repo-name]/ . If the repository field points to a GitHub Gist repo url, the associated http url is chosen. ### Rules for name field If `name` field is given, the value of the name field must be a string. The string may not: * start with a period. * contain the following characters: `/@\s+%` * contain any characters that would need to be encoded for use in urls.
* resemble the word `node_modules` or `favicon.ico` (case doesn't matter). ### Rules for version field If `version` field is given, the value of the version field must be a valid *semver* string, as determined by the `semver.valid` method. See [documentation for the semver module](https://github.com/isaacs/node-semver). ### Rules for license field The `license` field should be a valid *SPDX license expression* or one of the special values allowed by [validate-npm-package-license](https://npmjs.com/packages/validate-npm-package-license). See [documentation for the license field in package.json](https://docs.npmjs.com/files/package.json#license). ## Credits This package contains code based on read-package-json written by Isaac Z. Schlueter. Used with permission. ## License normalize-package-data is released under the [BSD 2-Clause License](http://opensource.org/licenses/BSD-2-Clause). Copyright (c) 2013 Meryn Stol npm_3.5.2.orig/node_modules/normalize-package-data/lib/0000755000000000000000000000000012631326456021215 5ustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/0000755000000000000000000000000012631326456023124 5ustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/package.json0000644000000000000000000000417012631326456022737 0ustar 00000000000000{ "name": "normalize-package-data", "version": "2.3.5", "author": { "name": "Meryn Stol", "email": "merynstol@gmail.com" }, "description": "Normalizes data that can be found in package.json files.", "license": "BSD-2-Clause", "repository": { "type": "git", "url": "git://github.com/npm/normalize-package-data.git" }, "main": "lib/normalize.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "hosted-git-info": "^2.1.4", "is-builtin-module": "^1.0.0", "semver": "2 || 3 || 4 || 5", "validate-npm-package-license": "^3.0.1" }, "devDependencies": { "async": "^1.5.0", "tap": "^2.2.0", "underscore": "^1.8.3" }, "contributors": [ { "name": "Isaac Z. Schlueter", "email": "i@izs.me" }, { "name": "Meryn Stol", "email": "merynstol@gmail.com" }, { "name": "Robert Kowalski", "email": "rok@kowalski.gd" } ], "gitHead": "3dc7756af20b3b1b24c6d75302448ca3659e0a65", "bugs": { "url": "https://github.com/npm/normalize-package-data/issues" }, "homepage": "https://github.com/npm/normalize-package-data#readme", "_id": "normalize-package-data@2.3.5", "_shasum": "8d924f142960e1777e7ffe170543631cc7cb02df", "_from": "normalize-package-data@>=2.3.5 <2.4.0", "_npmVersion": "3.3.6", "_nodeVersion": "5.0.0", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "dist": { "shasum": "8d924f142960e1777e7ffe170543631cc7cb02df", "tarball": "http://registry.npmjs.org/normalize-package-data/-/normalize-package-data-2.3.5.tgz" }, "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" }, { "name": "isaacs", "email": "isaacs@npmjs.com" }, { "name": "meryn", "email": "merynstol@gmail.com" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, { "name": "zkat", "email": "kat@sykosomatic.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/normalize-package-data/-/normalize-package-data-2.3.5.tgz", "readme": "ERROR: No README data found!"
} npm_3.5.2.orig/node_modules/normalize-package-data/test/0000755000000000000000000000000012631326456021426 5ustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/lib/extract_description.js0000644000000000000000000000077512631326456025641 0ustar 00000000000000module.exports = extractDescription // Extracts description from contents of a readme file in markdown format function extractDescription (d) { if (!d) return; if (d === "ERROR: No README data found!") return; // the first block of text before the first heading // that isn't the first line heading d = d.trim().split('\n') for (var s = 0; d[s] && d[s].trim().match(/^(#|$)/); s ++); var l = d.length for (var e = s + 1; e < l && d[e].trim(); e ++); return d.slice(s, e).join(' ').trim() } npm_3.5.2.orig/node_modules/normalize-package-data/lib/fixer.js0000644000000000000000000002703512631326456022677 0ustar 00000000000000var semver = require("semver") var validateLicense = require('validate-npm-package-license'); var hostedGitInfo = require("hosted-git-info") var isBuiltinModule = require("is-builtin-module") var depTypes = ["dependencies","devDependencies","optionalDependencies"] var extractDescription = require("./extract_description") var url = require("url") var typos = require("./typos") var fixer = module.exports = { // default warning function warn: function() {}, fixRepositoryField: function(data) { if (data.repositories) { this.warn("repositories"); data.repository = data.repositories[0] } if (!data.repository) return this.warn("missingRepository") if (typeof data.repository === "string") { data.repository = { type: "git", url: data.repository } } var r = data.repository.url || "" if (r) { var hosted = hostedGitInfo.fromUrl(r) if (hosted) { r = data.repository.url = hosted.getDefaultRepresentation() == "shortcut" ? 
hosted.https() : hosted.toString() } } if (r.match(/github.com\/[^\/]+\/[^\/]+\.git\.git$/)) { this.warn("brokenGitUrl", r) } } , fixTypos: function(data) { Object.keys(typos.topLevel).forEach(function (d) { if (data.hasOwnProperty(d)) { this.warn("typo", d, typos.topLevel[d]) } }, this) } , fixScriptsField: function(data) { if (!data.scripts) return if (typeof data.scripts !== "object") { this.warn("nonObjectScripts") delete data.scripts return } Object.keys(data.scripts).forEach(function (k) { if (typeof data.scripts[k] !== "string") { this.warn("nonStringScript") delete data.scripts[k] } else if (typos.script[k] && !data.scripts[typos.script[k]]) { this.warn("typo", k, typos.script[k], "scripts") } }, this) } , fixFilesField: function(data) { var files = data.files if (files && !Array.isArray(files)) { this.warn("nonArrayFiles") delete data.files } else if (data.files) { data.files = data.files.filter(function(file) { if (!file || typeof file !== "string") { this.warn("invalidFilename", file) return false } else { return true } }, this) } } , fixBinField: function(data) { if (!data.bin) return; if (typeof data.bin === "string") { var b = {} var match if (match = data.name.match(/^@[^/]+[/](.*)$/)) { b[match[1]] = data.bin } else { b[data.name] = data.bin } data.bin = b } } , fixManField: function(data) { if (!data.man) return; if (typeof data.man === "string") { data.man = [ data.man ] } } , fixBundleDependenciesField: function(data) { var bdd = "bundledDependencies" var bd = "bundleDependencies" if (data[bdd] && !data[bd]) { data[bd] = data[bdd] delete data[bdd] } if (data[bd] && !Array.isArray(data[bd])) { this.warn("nonArrayBundleDependencies") delete data[bd] } else if (data[bd]) { data[bd] = data[bd].filter(function(bd) { if (!bd || typeof bd !== 'string') { this.warn("nonStringBundleDependency", bd) return false } else { if (!data.dependencies) { data.dependencies = {} } if (!data.dependencies.hasOwnProperty(bd)) { this.warn("nonDependencyBundleDependency", bd) data.dependencies[bd] = "*" } return true } }, this) } } , fixDependencies: function(data, strict) { var loose = !strict objectifyDeps(data, this.warn) addOptionalDepsToDeps(data, this.warn) this.fixBundleDependenciesField(data) ;['dependencies','devDependencies'].forEach(function(deps) { if (!(deps in data)) return if (!data[deps] || typeof data[deps] !== "object") { this.warn("nonObjectDependencies", deps) delete data[deps] return } Object.keys(data[deps]).forEach(function (d) { var r = data[deps][d] if (typeof r !== 'string') { this.warn("nonStringDependency", d, JSON.stringify(r)) delete data[deps][d] } var hosted = hostedGitInfo.fromUrl(data[deps][d]) if (hosted) data[deps][d] = hosted.toString() }, this) }, this) } , fixModulesField: function (data) { if (data.modules) { this.warn("deprecatedModules") delete data.modules } } , fixKeywordsField: function (data) { if (typeof data.keywords === "string") { data.keywords = data.keywords.split(/,\s+/) } if (data.keywords && !Array.isArray(data.keywords)) { delete data.keywords this.warn("nonArrayKeywords") } else if (data.keywords) { data.keywords = data.keywords.filter(function(kw) { if (typeof kw !== "string" || !kw) { this.warn("nonStringKeyword"); return false } else { return true } }, this) } } , fixVersionField: function(data, strict) { // allow "loose" semver 1.0 versions in non-strict mode // enforce strict semver 2.0 compliance in strict mode var loose = !strict if (!data.version) { data.version = "" return true } if (!semver.valid(data.version, loose)) { throw new 
Error('Invalid version: "'+ data.version + '"') } data.version = semver.clean(data.version, loose) return true } , fixPeople: function(data) { modifyPeople(data, unParsePerson) modifyPeople(data, parsePerson) } , fixNameField: function(data, options) { if (typeof options === "boolean") options = {strict: options} else if (typeof options === "undefined") options = {} var strict = options.strict if (!data.name && !strict) { data.name = "" return } if (typeof data.name !== "string") { throw new Error("name field must be a string.") } if (!strict) data.name = data.name.trim() ensureValidName(data.name, strict, options.allowLegacyCase) if (isBuiltinModule(data.name)) this.warn("conflictingName", data.name) } , fixDescriptionField: function (data) { if (data.description && typeof data.description !== 'string') { this.warn("nonStringDescription") delete data.description } if (data.readme && !data.description) data.description = extractDescription(data.readme) if(data.description === undefined) delete data.description; if (!data.description) this.warn("missingDescription") } , fixReadmeField: function (data) { if (!data.readme) { this.warn("missingReadme") data.readme = "ERROR: No README data found!" } } , fixBugsField: function(data) { if (!data.bugs && data.repository && data.repository.url) { var hosted = hostedGitInfo.fromUrl(data.repository.url) if(hosted && hosted.bugs()) { data.bugs = {url: hosted.bugs()} } } else if(data.bugs) { var emailRe = /^.+@.*\..+$/ if(typeof data.bugs == "string") { if(emailRe.test(data.bugs)) data.bugs = {email:data.bugs} else if(url.parse(data.bugs).protocol) data.bugs = {url: data.bugs} else this.warn("nonEmailUrlBugsString") } else { bugsTypos(data.bugs, this.warn) var oldBugs = data.bugs data.bugs = {} if(oldBugs.url) { if(typeof(oldBugs.url) == "string" && url.parse(oldBugs.url).protocol) data.bugs.url = oldBugs.url else this.warn("nonUrlBugsUrlField") } if(oldBugs.email) { if(typeof(oldBugs.email) == "string" && emailRe.test(oldBugs.email)) data.bugs.email = oldBugs.email else this.warn("nonEmailBugsEmailField") } } if(!data.bugs.email && !data.bugs.url) { delete data.bugs this.warn("emptyNormalizedBugs") } } } , fixHomepageField: function(data) { if (!data.homepage && data.repository && data.repository.url) { var hosted = hostedGitInfo.fromUrl(data.repository.url) if (hosted && hosted.docs()) data.homepage = hosted.docs() } if (!data.homepage) return if(typeof data.homepage !== "string") { this.warn("nonUrlHomepage") return delete data.homepage } if(!url.parse(data.homepage).protocol) { this.warn("missingProtocolHomepage") data.homepage = "http://" + data.homepage } } , fixLicenseField: function(data) { if (!data.license) { return this.warn("missingLicense") } else{ if ( typeof(data.license) !== 'string' || data.license.length < 1 ) { this.warn("invalidLicense") } else { if (!validateLicense(data.license).validForNewPackages) this.warn("invalidLicense") } } } } function isValidScopedPackageName(spec) { if (spec.charAt(0) !== '@') return false var rest = spec.slice(1).split('/') if (rest.length !== 2) return false return rest[0] && rest[1] && rest[0] === encodeURIComponent(rest[0]) && rest[1] === encodeURIComponent(rest[1]) } function isCorrectlyEncodedName(spec) { return !spec.match(/[\/@\s\+%:]/) && spec === encodeURIComponent(spec) } function ensureValidName (name, strict, allowLegacyCase) { if (name.charAt(0) === "." 
|| !(isValidScopedPackageName(name) || isCorrectlyEncodedName(name)) || (strict && (!allowLegacyCase) && name !== name.toLowerCase()) || name.toLowerCase() === "node_modules" || name.toLowerCase() === "favicon.ico") { throw new Error("Invalid name: " + JSON.stringify(name)) } } function modifyPeople (data, fn) { if (data.author) data.author = fn(data.author) ;["maintainers", "contributors"].forEach(function (set) { if (!Array.isArray(data[set])) return; data[set] = data[set].map(fn) }) return data } function unParsePerson (person) { if (typeof person === "string") return person var name = person.name || "" var u = person.url || person.web var url = u ? (" ("+u+")") : "" var e = person.email || person.mail var email = e ? (" <"+e+">") : "" return name+email+url } function parsePerson (person) { if (typeof person !== "string") return person var name = person.match(/^([^\(<]+)/) var url = person.match(/\(([^\)]+)\)/) var email = person.match(/<([^>]+)>/) var obj = {} if (name && name[0].trim()) obj.name = name[0].trim() if (email) obj.email = email[1]; if (url) obj.url = url[1]; return obj } function addOptionalDepsToDeps (data, warn) { var o = data.optionalDependencies if (!o) return; var d = data.dependencies || {} Object.keys(o).forEach(function (k) { d[k] = o[k] }) data.dependencies = d } function depObjectify (deps, type, warn) { if (!deps) return {} if (typeof deps === "string") { deps = deps.trim().split(/[\n\r\s\t ,]+/) } if (!Array.isArray(deps)) return deps warn("deprecatedArrayDependencies", type) var o = {} deps.filter(function (d) { return typeof d === "string" }).forEach(function(d) { d = d.trim().split(/(:?[@\s><=])/) var dn = d.shift() var dv = d.join("") dv = dv.trim() dv = dv.replace(/^@/, "") o[dn] = dv }) return o } function objectifyDeps (data, warn) { depTypes.forEach(function (type) { if (!data[type]) return; data[type] = depObjectify(data[type], type, warn) }) } function bugsTypos(bugs, warn) { if (!bugs) return Object.keys(bugs).forEach(function (k) { if (typos.bugs[k]) { warn("typo", k, typos.bugs[k], "bugs") bugs[typos.bugs[k]] = bugs[k] delete bugs[k] } }) } npm_3.5.2.orig/node_modules/normalize-package-data/lib/make_warning.js0000644000000000000000000000130512631326456024214 0ustar 00000000000000var util = require("util") var messages = require("./warning_messages.json") module.exports = function() { var args = Array.prototype.slice.call(arguments, 0) var warningName = args.shift() if (warningName == "typo") { return makeTypoWarning.apply(null,args) } else { var msgTemplate = messages[warningName] ? 
messages[warningName] : warningName + ": '%s'" args.unshift(msgTemplate) return util.format.apply(null, args) } } function makeTypoWarning (providedName, probableName, field) { if (field) { providedName = field + "['" + providedName + "']" probableName = field + "['" + probableName + "']" } return util.format(messages.typo, providedName, probableName) }npm_3.5.2.orig/node_modules/normalize-package-data/lib/normalize.js0000644000000000000000000000250012631326456023550 0ustar 00000000000000module.exports = normalize var fixer = require("./fixer") normalize.fixer = fixer var makeWarning = require("./make_warning") var fieldsToFix = ['name','version','description','repository','modules','scripts' ,'files','bin','man','bugs','keywords','readme','homepage','license'] var otherThingsToFix = ['dependencies','people', 'typos'] var thingsToFix = fieldsToFix.map(function(fieldName) { return ucFirst(fieldName) + "Field" }) // two ways to do this in CoffeeScript on only one line, sub-70 chars: // thingsToFix = fieldsToFix.map (name) -> ucFirst(name) + "Field" // thingsToFix = (ucFirst(name) + "Field" for name in fieldsToFix) thingsToFix = thingsToFix.concat(otherThingsToFix) function normalize (data, warn, strict) { if(warn === true) warn = null, strict = true if(!strict) strict = false if(!warn || data.private) warn = function(msg) { /* noop */ } if (data.scripts && data.scripts.install === "node-gyp rebuild" && !data.scripts.preinstall) { data.gypfile = true } fixer.warn = function() { warn(makeWarning.apply(null, arguments)) } thingsToFix.forEach(function(thingName) { fixer["fix" + ucFirst(thingName)](data, strict) }) data._id = data.name + "@" + data.version } function ucFirst (string) { return string.charAt(0).toUpperCase() + string.slice(1); } npm_3.5.2.orig/node_modules/normalize-package-data/lib/safe_format.js0000644000000000000000000000036512631326456024045 0ustar 00000000000000var util = require('util') module.exports = function() { var args = Array.prototype.slice.call(arguments, 0) args.forEach(function(arg) { if (!arg) throw new TypeError('Bad arguments.') }) return util.format.apply(null, arguments) }npm_3.5.2.orig/node_modules/normalize-package-data/lib/typos.json0000644000000000000000000000135412631326456023271 0ustar 00000000000000{ "topLevel": { "dependancies": "dependencies" ,"dependecies": "dependencies" ,"depdenencies": "dependencies" ,"devEependencies": "devDependencies" ,"depends": "dependencies" ,"dev-dependencies": "devDependencies" ,"devDependences": "devDependencies" ,"devDepenencies": "devDependencies" ,"devdependencies": "devDependencies" ,"repostitory": "repository" ,"repo": "repository" ,"prefereGlobal": "preferGlobal" ,"hompage": "homepage" ,"hampage": "homepage" ,"autohr": "author" ,"autor": "author" ,"contributers": "contributors" ,"publicationConfig": "publishConfig" ,"script": "scripts" }, "bugs": { "web": "url", "name": "url" }, "script": { "server": "start", "tests": "test" } } npm_3.5.2.orig/node_modules/normalize-package-data/lib/warning_messages.json0000644000000000000000000000352112631326456025445 0ustar 00000000000000{ "repositories": "'repositories' (plural) Not supported. Please pick one as the 'repository' field" ,"missingRepository": "No repository field." 
,"brokenGitUrl": "Probably broken git url: %s" ,"nonObjectScripts": "scripts must be an object" ,"nonStringScript": "script values must be string commands" ,"nonArrayFiles": "Invalid 'files' member" ,"invalidFilename": "Invalid filename in 'files' list: %s" ,"nonArrayBundleDependencies": "Invalid 'bundleDependencies' list. Must be array of package names" ,"nonStringBundleDependency": "Invalid bundleDependencies member: %s" ,"nonDependencyBundleDependency": "Non-dependency in bundleDependencies: %s" ,"nonObjectDependencies": "%s field must be an object" ,"nonStringDependency": "Invalid dependency: %s %s" ,"deprecatedArrayDependencies": "specifying %s as array is deprecated" ,"deprecatedModules": "modules field is deprecated" ,"nonArrayKeywords": "keywords should be an array of strings" ,"nonStringKeyword": "keywords should be an array of strings" ,"conflictingName": "%s is also the name of a node core module." ,"nonStringDescription": "'description' field should be a string" ,"missingDescription": "No description" ,"missingReadme": "No README data" ,"missingLicense": "No license field." ,"nonEmailUrlBugsString": "Bug string field must be url, email, or {email,url}" ,"nonUrlBugsUrlField": "bugs.url field must be a string url. Deleted." ,"nonEmailBugsEmailField": "bugs.email field must be a string email. Deleted." ,"emptyNormalizedBugs": "Normalized value of bugs field is an empty object. Deleted." ,"nonUrlHomepage": "homepage field must be a string url. Deleted." ,"invalidLicense": "license should be a valid SPDX license expression" ,"missingProtocolHomepage": "homepage field must start with a protocol." ,"typo": "%s should probably be %s." } npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/0000755000000000000000000000000012631326456026466 5ustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/index.js0000644000000000000000000000034112631326456030131 0ustar 00000000000000'use strict'; var builtinModules = require('builtin-modules'); module.exports = function (str) { if (typeof str !== 'string') { throw new TypeError('Expected a string'); } return builtinModules.indexOf(str) !== -1; }; npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/license0000644000000000000000000000213712631326456030036 0ustar 00000000000000The MIT License (MIT) Copyright (c) Sindre Sorhus (sindresorhus.com) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/0000755000000000000000000000000012631326456031143 5ustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/package.json0000644000000000000000000000332312631326456030755 0ustar 00000000000000{ "name": "is-builtin-module", "version": "1.0.0", "description": "Check if a string matches the name of a Node.js builtin module", "license": "MIT", "repository": { "type": "git", "url": "git+https://github.com/sindresorhus/is-builtin-module.git" }, "author": { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "sindresorhus.com" }, "engines": { "node": ">=0.10.0" }, "scripts": { "test": "node test.js" }, "files": [ "index.js" ], "keywords": [ "builtin", "built-in", "builtins", "node", "modules", "core", "bundled", "list", "array", "names", "is", "detect", "check", "match" ], "dependencies": { "builtin-modules": "^1.0.0" }, "devDependencies": { "ava": "0.0.4" }, "gitHead": "da55ebf031f3864c5d309e25e49ed816957d70a2", "bugs": { "url": "https://github.com/sindresorhus/is-builtin-module/issues" }, "homepage": "https://github.com/sindresorhus/is-builtin-module", "_id": "is-builtin-module@1.0.0", "_shasum": "540572d34f7ac3119f8f76c30cbc1b1e037affbe", "_from": "is-builtin-module@>=1.0.0 <2.0.0", "_npmVersion": "2.7.4", "_nodeVersion": "0.12.2", "_npmUser": { "name": "sindresorhus", "email": "sindresorhus@gmail.com" }, "dist": { "shasum": "540572d34f7ac3119f8f76c30cbc1b1e037affbe", "tarball": "http://registry.npmjs.org/is-builtin-module/-/is-builtin-module-1.0.0.tgz" }, "maintainers": [ { "name": "sindresorhus", "email": "sindresorhus@gmail.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/is-builtin-module/-/is-builtin-module-1.0.0.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/readme.md0000644000000000000000000000116112631326456030244 0ustar 00000000000000# is-builtin-module [![Build Status](https://travis-ci.org/sindresorhus/is-builtin-module.svg?branch=master)](https://travis-ci.org/sindresorhus/is-builtin-module) > Check if a string matches the name of a Node.js builtin module ## Install ``` $ npm install --save is-builtin-module ``` ## Usage ```js var isBuiltinModule = require('is-builtin-module'); isBuiltinModule('fs'); //=> true isBuiltinModule('unicorn'); //=> false :( ``` ## Related - [builtin-modules](https://github.com/sindresorhus/builtin-modules) - List of the Node.js builtin modules ## License MIT © [Sindre Sorhus](http://sindresorhus.com) ././@LongLink0000000000000000000000000000016000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/builtin-modules/npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/built0000755000000000000000000000000012631326456032203 5ustar 00000000000000././@LongLink0000000000000000000000000000020400000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/builtin-modules/builtin-modules.jsonnpm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/built0000644000000000000000000000054112631326456032205 0ustar 00000000000000[ "assert", "buffer", "child_process", "cluster", "console", "constants", "crypto", "dgram", "dns", "domain", "events", "fs", "http", "https", "module", "net", "os", "path", "process", "punycode", "querystring", "readline", "repl", "stream", "string_decoder", "timers", "tls", "tty", "url", "util", "v8", "vm", "zlib" ] ././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/builtin-modules/index.jsnpm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/built0000644000000000000000000000032612631326456032206 0ustar 00000000000000'use strict'; var blacklist = [ 'freelist', 'sys' ]; module.exports = Object.keys(process.binding('natives')).filter(function (el) { return !/^_|^internal/.test(el) && blacklist.indexOf(el) === -1; }).sort(); ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/builtin-modules/licensenpm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/built0000644000000000000000000000213712631326456032210 0ustar 00000000000000The MIT License (MIT) Copyright (c) Sindre Sorhus (sindresorhus.com) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/builtin-modules/package.jsonnpm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/built0000644000000000000000000000322112631326456032203 0ustar 00000000000000{ "name": "builtin-modules", "version": "1.1.0", "description": "List of the Node.js builtin modules", "license": "MIT", "repository": { "type": "git", "url": "git+https://github.com/sindresorhus/builtin-modules.git" }, "author": { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "sindresorhus.com" }, "engines": { "node": ">=0.10.0" }, "scripts": { "test": "xo && ava", "make": "node make.js" }, "files": [ "index.js", "static.js", "builtin-modules.json" ], "keywords": [ "builtin", "built-in", "builtins", "node", "modules", "core", "bundled", "list", "array", "names" ], "devDependencies": { "ava": "*", "xo": "*" }, "gitHead": "d317be16fab701f2ac73bc9aa771f60ec052ed66", "bugs": { "url": "https://github.com/sindresorhus/builtin-modules/issues" }, "homepage": "https://github.com/sindresorhus/builtin-modules#readme", "_id": "builtin-modules@1.1.0", "_shasum": "1053955fd994a5746e525e4ac717b81caf07491c", "_from": "builtin-modules@>=1.0.0 <2.0.0", "_npmVersion": "2.13.3", "_nodeVersion": "3.0.0", "_npmUser": { "name": "sindresorhus", "email": "sindresorhus@gmail.com" }, "dist": { "shasum": "1053955fd994a5746e525e4ac717b81caf07491c", "tarball": "http://registry.npmjs.org/builtin-modules/-/builtin-modules-1.1.0.tgz" }, "maintainers": [ { "name": "sindresorhus", "email": "sindresorhus@gmail.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/builtin-modules/-/builtin-modules-1.1.0.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/builtin-modules/readme.mdnpm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/built0000644000000000000000000000167512631326456032212 0ustar 00000000000000# builtin-modules [![Build Status](https://travis-ci.org/sindresorhus/builtin-modules.svg?branch=master)](https://travis-ci.org/sindresorhus/builtin-modules) > List of the Node.js builtin modules The list is just a [JSON file](builtin-modules.json) and can be used wherever. ## Install ``` $ npm install --save builtin-modules ``` ## Usage ```js var builtinModules = require('builtin-modules'); console.log(builtinModules); //=> ['assert', 'buffer', ...] ``` ## API Returns an array of builtin modules fetched from the running Node.js version. ### Static list This module also comes bundled with a static array of builtin modules generated from the latest Node.js version.
You can get it with `require('builtin-modules/static');` ## Related - [is-builtin-module](https://github.com/sindresorhus/is-builtin-module) - Check if a string matches the name of a Node.js builtin module ## License MIT © [Sindre Sorhus](http://sindresorhus.com) ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/builtin-modules/static.jsnpm_3.5.2.orig/node_modules/normalize-package-data/node_modules/is-builtin-module/node_modules/built0000644000000000000000000000010212631326456032176 0ustar 00000000000000'use strict'; module.exports = require('./builtin-modules.json'); npm_3.5.2.orig/node_modules/normalize-package-data/test/basic.js0000644000000000000000000000277112631326456023054 0ustar 00000000000000var tap = require("tap") var normalize = require("../lib/normalize") var path = require("path") var fs = require("fs") tap.test("basic test", function (t) { var p = path.resolve(__dirname, "./fixtures/read-package-json.json") fs.readFile (p, function (err, contents) { if (err) throw err; var originalData = JSON.parse(contents.toString()) var data = JSON.parse(contents.toString()) normalize(data) t.ok(data) verifyFields(t, data, originalData) t.end() }) }) function verifyFields (t, normalized, original) { t.equal(normalized.version, original.version, "Version field stays same") t.equal(normalized._id, normalized.name + "@" + normalized.version, "It gets good id.") t.equal(normalized.name, original.name, "Name stays the same.") t.type(normalized.author, "object", "author field becomes object") t.deepEqual(normalized.scripts, original.scripts, "scripts field (object) stays same") t.equal(normalized.main, original.main) // optional deps are folded in. 
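// (addOptionalDepsToDeps in lib/fixer.js copies each optionalDependencies
// entry into dependencies, which the t.has assertions below verify)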
t.deepEqual(normalized.optionalDependencies, original.optionalDependencies) t.has(normalized.dependencies, original.optionalDependencies, "opt dependencies are copied into dependencies") t.has(normalized.dependencies, original.dependencies, "regular dependencies stay in place") t.deepEqual(normalized.devDependencies, original.devDependencies) t.type(normalized.bugs, "object", "bugs should become object") t.equal(normalized.bugs.url, "https://github.com/isaacs/read-package-json/issues") } npm_3.5.2.orig/node_modules/normalize-package-data/test/consistency.js0000644000000000000000000000232512631326456024327 0ustar 00000000000000var tap = require("tap") var normalize = require("../lib/normalize") var path = require("path") var fs = require("fs") var _ = require("underscore") var async = require("async") var data, clonedData var warn tap.test("consistent normalization", function(t) { path.resolve(__dirname, "./fixtures/read-package-json.json") fs.readdir (__dirname + "/fixtures", function (err, entries) { // entries = ['coffee-script.json'] // uncomment to limit to a specific file var verifyConsistency = function(entryName, next) { warn = function(msg) { // t.equal("",msg) // uncomment to have some kind of logging of warnings } var filename = __dirname + "/fixtures/" + entryName fs.readFile(filename, function(err, contents) { if (err) return next(err) data = JSON.parse(contents.toString()) normalize(data, warn) clonedData = _.clone(data) normalize(data, warn) t.deepEqual(clonedData, data, "Normalization of " + entryName + " is consistent.") next(null) }) // fs.readFile } // verifyConsistency async.forEach(entries, verifyConsistency, function(err) { if (err) throw err t.end() }) }) // fs.readdir }) // tap.testnpm_3.5.2.orig/node_modules/normalize-package-data/test/dependencies.js0000644000000000000000000000262012631326456024412 0ustar 00000000000000var tap = require("tap") var normalize = require("../lib/normalize") var warningMessages = require("../lib/warning_messages.json") var safeFormat = require("../lib/safe_format") tap.test("warn if dependency contains anything else but a string", function(t) { var a var warnings = [] function warn(w) { warnings.push(w) } normalize(a={ dependencies: { "a": 123}, devDependencies: { "b": 456}, optionalDependencies: { "c": 789} }, warn) var wanted1 = safeFormat(warningMessages.nonStringDependency, "a", 123) var wanted2 = safeFormat(warningMessages.nonStringDependency, "b", 456) var wanted3 = safeFormat(warningMessages.nonStringDependency, "c", 789) t.ok(~warnings.indexOf(wanted1), wanted1) t.ok(~warnings.indexOf(wanted2), wanted2) t.ok(~warnings.indexOf(wanted3), wanted3) t.end() }) tap.test("warn if bundleDependencies array contains anything else but strings", function(t) { var a var warnings = [] function warn(w) { warnings.push(w) } normalize(a={ bundleDependencies: ["abc", 123, {foo:"bar"}] }, warn) var wanted1 = safeFormat(warningMessages.nonStringBundleDependency, 123) var wanted2 = safeFormat(warningMessages.nonStringBundleDependency, {foo:"bar"}) var wanted3 = safeFormat(warningMessages.nonDependencyBundleDependency, "abc") t.ok(~warnings.indexOf(wanted1), wanted1) t.ok(~warnings.indexOf(wanted2), wanted2) t.ok(~warnings.indexOf(wanted3), wanted3) t.end() }) npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/0000755000000000000000000000000012631326456023277 5ustar 00000000000000npm_3.5.2.orig/node_modules/normalize-package-data/test/github-urls.js0000644000000000000000000000254412631326456024236 0ustar 00000000000000var tap = require("tap") var normalize = require("../lib/normalize")
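// Each fixture below declares a GitHub or gist "repository" but no "bugs"
// field; normalize() is expected to derive bugs.url from the repository URL
// via hosted-git-info (see fixBugsField in lib/fixer.js), e.g. the gist
// remote "git@gist.github.com:3135914.git" maps to
// "https://gist.github.com/3135914".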
var fs = require("fs") var async = require("async") var data var warn tap.test("consistent normalization", function(t) { var entries = [ 'read-package-json.json', 'http-server.json', "movefile.json", "node-module_exist.json" ] var verifyConsistency = function(entryName, next) { warn = function(msg) { // t.equal("",msg) // uncomment to have some kind of logging of warnings } var filename = __dirname + "/fixtures/" + entryName fs.readFile(filename, function(err, contents) { if (err) return next(err) data = JSON.parse(contents.toString()) normalize(data, warn) if(data.name == "node-module_exist") { t.same(data.bugs.url, "https://gist.github.com/3135914") } if(data.name == "read-package-json") { t.same(data.bugs.url, "https://github.com/isaacs/read-package-json/issues") } if(data.name == "http-server") { t.same(data.bugs.url, "https://github.com/nodejitsu/http-server/issues") } if(data.name == "movefile") { t.same(data.bugs.url, "https://github.com/yazgazan/movefile/issues") } next(null) }) // fs.readFile } // verifyConsistency async.forEach(entries, verifyConsistency, function(err) { if (err) throw err t.end() }) }) // tap.test npm_3.5.2.orig/node_modules/normalize-package-data/test/mixedcase-names.js0000644000000000000000000000140012631326456025022 0ustar 00000000000000var test = require('tap').test var normalize = require('../') var fixer = normalize.fixer test('mixedcase', function (t) { t.doesNotThrow(function () { fixer.fixNameField({name: 'foo'}, true) }) t.doesNotThrow(function () { fixer.fixNameField({name: 'foo'}, false) }) t.doesNotThrow(function () { fixer.fixNameField({name: 'foo'}) }) t.throws(function () { fixer.fixNameField({name: 'Foo'}, true) }, new Error('Invalid name: "Foo"'), 'should throw an error') t.throws(function () { fixer.fixNameField({name: 'Foo'}, {strict: true}) }, new Error('Invalid name: "Foo"'), 'should throw an error') t.doesNotThrow(function () { fixer.fixNameField({name: 'Foo'}, {strict: true, allowLegacyCase: true}) }) t.end() }) npm_3.5.2.orig/node_modules/normalize-package-data/test/normalize.js0000644000000000000000000001521412631326456023767 0ustar 00000000000000var tap = require("tap") var fs = require("fs") var path = require("path") var normalize = require("../lib/normalize") var warningMessages = require("../lib/warning_messages.json") var safeFormat = require("../lib/safe_format") var rpjPath = path.resolve(__dirname,"./fixtures/read-package-json.json") tap.test("normalize some package data", function(t) { var packageData = require(rpjPath) var warnings = [] normalize(packageData, function(warning) { warnings.push(warning) }) // there's no readme data in this particular object t.equal( warnings.length, 1, "There's exactly one warning.") fs.readFile(rpjPath, function(err, data) { if(err) throw err // Various changes have been made t.notEqual(packageData, JSON.parse(data), "Output is different from input.") t.end() }) }) tap.test("runs without passing warning function", function(t) { var packageData = require(rpjPath) fs.readFile(rpjPath, function(err, data) { if(err) throw err normalize(JSON.parse(data)) t.ok(true, "If you read this, this means I'm still alive.") t.end() }) }) tap.test("empty object", function(t) { var packageData = {} var expect = { name: '', version: '', readme: 'ERROR: No README data found!', _id: '@' } var warnings = [] function warn(m) { warnings.push(m) } normalize(packageData, warn) t.same(packageData, expect) t.same(warnings, [ warningMessages.missingDescription, warningMessages.missingRepository, 
warningMessages.missingReadme, warningMessages.missingLicense ]) t.end() }) tap.test("core module name", function(t) { var warnings = [] function warn(m) { warnings.push(m) } var a normalize(a={ name: "http", readme: "read yourself how about", homepage: 123, bugs: "what is this i don't even", repository: "Hello." }, warn) var expect = [ safeFormat(warningMessages.conflictingName, 'http'), warningMessages.nonEmailUrlBugsString, warningMessages.emptyNormalizedBugs, warningMessages.nonUrlHomepage, warningMessages.missingLicense ] t.same(warnings, expect) t.end() }) tap.test("urls required", function(t) { var warnings = [] function warn(w) { warnings.push(w) } normalize({ bugs: { url: "/1", email: "not an email address" } }, warn) var a normalize(a={ readme: "read yourself how about", homepage: 123, bugs: "what is this i don't even", repository: "Hello." }, warn) console.error(a) var expect = [ warningMessages.missingDescription, warningMessages.missingRepository, warningMessages.nonUrlBugsUrlField, warningMessages.nonEmailBugsEmailField, warningMessages.emptyNormalizedBugs, warningMessages.missingReadme, warningMessages.missingLicense, warningMessages.nonEmailUrlBugsString, warningMessages.emptyNormalizedBugs, warningMessages.nonUrlHomepage, warningMessages.missingLicense] t.same(warnings, expect) t.end() }) tap.test("homepage field must start with a protocol.", function(t) { var warnings = [] function warn(w) { warnings.push(w) } var a normalize(a={ homepage: 'example.org' }, warn) console.error(a) var expect = [ warningMessages.missingDescription, warningMessages.missingRepository, warningMessages.missingReadme, warningMessages.missingProtocolHomepage, warningMessages.missingLicense] t.same(warnings, expect) t.same(a.homepage, 'http://example.org') t.end() }) tap.test("license field should be a valid SPDX expression", function(t) { var warnings = [] function warn(w) { warnings.push(w) } var a normalize(a={ license: 'Apache 2' }, warn) console.error(a) var expect = [ warningMessages.missingDescription, warningMessages.missingRepository, warningMessages.missingReadme, warningMessages.invalidLicense] t.same(warnings, expect) t.end() }) tap.test("gist bugs url", function(t) { var d = { repository: "git@gist.github.com:123456.git" } normalize(d) t.same(d.repository, { type: 'git', url: 'git+ssh://git@gist.github.com/123456.git' }) t.same(d.bugs, { url: 'https://gist.github.com/123456' }) t.end(); }); tap.test("singularize repositories", function(t) { var d = {repositories:["git@gist.github.com:123456.git"]} normalize(d) t.same(d.repository, { type: 'git', url: 'git+ssh://git@gist.github.com/123456.git' }) t.end() }); tap.test("treat visionmedia/express as github repo", function(t) { var d = {repository: {type: "git", url: "visionmedia/express"}} normalize(d) t.same(d.repository, { type: "git", url: "git+https://github.com/visionmedia/express.git" }) t.end() }); tap.test("treat isaacs/node-graceful-fs as github repo", function(t) { var d = {repository: {type: "git", url: "isaacs/node-graceful-fs"}} normalize(d) t.same(d.repository, { type: "git", url: "git+https://github.com/isaacs/node-graceful-fs.git" }) t.end() }); tap.test("homepage field will set to github url if repository is a github repo", function(t) { var a normalize(a={ repository: { type: "git", url: "https://github.com/isaacs/node-graceful-fs" } }) t.same(a.homepage, 'https://github.com/isaacs/node-graceful-fs#readme') t.end() }) tap.test("homepage field will set to github gist url if repository is a gist", function(t) { var a 
normalize(a={ repository: { type: "git", url: "git@gist.github.com:123456.git" } }) t.same(a.homepage, 'https://gist.github.com/123456') t.end() }) tap.test("homepage field will set to github gist url if repository is a shorthand reference", function(t) { var a normalize(a={ repository: { type: "git", url: "sindresorhus/chalk" } }) t.same(a.homepage, 'https://github.com/sindresorhus/chalk#readme') t.end() }) tap.test("don't mangle github shortcuts in dependencies", function(t) { var d = {dependencies: {"node-graceful-fs": "isaacs/node-graceful-fs"}} normalize(d) t.same(d.dependencies, {"node-graceful-fs": "github:isaacs/node-graceful-fs" }) t.end() }); tap.test("deprecation warning for array in dependencies fields", function(t) { var a var warnings = [] function warn(w) { warnings.push(w) } normalize(a={ dependencies: [], devDependencies: [], optionalDependencies: [] }, warn) t.ok(~warnings.indexOf(safeFormat(warningMessages.deprecatedArrayDependencies, 'dependencies')), "deprecation warning") t.ok(~warnings.indexOf(safeFormat(warningMessages.deprecatedArrayDependencies, 'devDependencies')), "deprecation warning") t.ok(~warnings.indexOf(safeFormat(warningMessages.deprecatedArrayDependencies, 'optionalDependencies')), "deprecation warning") t.end() }) npm_3.5.2.orig/node_modules/normalize-package-data/test/scoped.js0000644000000000000000000000262012631326456023241 0ustar 00000000000000var test = require("tap").test var fixNameField = require("../lib/fixer.js").fixNameField var fixBinField = require("../lib/fixer.js").fixBinField test("a simple scoped module has a valid name", function (t) { var data = {name : "@org/package"} fixNameField(data, false) t.equal(data.name, "@org/package", "name was unchanged") t.end() }) test("'org@package' is not a valid name", function (t) { t.throws(function () { fixNameField({name : "org@package"}, false) }, "blows up as expected") t.end() }) test("'org=package' is not a valid name", function (t) { t.throws(function () { fixNameField({name : "org=package"}, false) }, "blows up as expected") t.end() }) test("'@org=sub/package' is not a valid name", function (t) { t.throws(function () { fixNameField({name : "@org=sub/package"}, false) }, "blows up as expected") t.end() }) test("'@org/' is not a valid name", function (t) { t.throws(function () { fixNameField({name : "@org/"}, false) }, "blows up as expected") t.end() }) test("'@/package' is not a valid name", function (t) { t.throws(function () { fixNameField({name : "@/package"}, false) }, "blows up as expected") t.end() }) test("name='@org/package', bin='bin.js' is bin={package:'bin.js'}", function (t) { var obj = {name : "@org/package", bin: "bin.js"} fixBinField(obj) t.isDeeply(obj.bin, {package: 'bin.js'}) t.end() }) npm_3.5.2.orig/node_modules/normalize-package-data/test/scripts.js0000644000000000000000000000140012631326456023446 0ustar 00000000000000var tap = require("tap") var normalize = require("../lib/normalize") var path = require("path") var fs = require("fs") tap.test("bad scripts", function (t) { var p = path.resolve(__dirname, "./fixtures/badscripts.json") fs.readFile (p, function (err, contents) { if (err) throw err var originalData = JSON.parse(contents.toString()) var data = JSON.parse(contents.toString()) normalize(data) t.ok(data) verifyFields(t, data, originalData) t.end() }) }) function verifyFields (t, normalized, original) { t.equal(normalized.version, original.version, "Version field stays same") t.equal(normalized.name, original.name, "Name stays the same.") // scripts is not an 
object, so it should be deleted t.notOk(normalized.scripts) } npm_3.5.2.orig/node_modules/normalize-package-data/test/strict.js0000644000000000000000000000223212631326456023273 0ustar 00000000000000var test = require("tap").test var normalize = require("../") test("strict", function(t) { var threw try { threw = false normalize({name: "X"}, true) } catch (er) { threw = true t.equal(er.message, 'Invalid name: "X"') } finally { t.equal(threw, true) } try { threw = false normalize({name:" x "}, true) } catch (er) { threw = true t.equal(er.message, 'Invalid name: " x "') } finally { t.equal(threw, true) } try { threw = false normalize({name:"x",version:"01.02.03"}, true) } catch (er) { threw = true t.equal(er.message, 'Invalid version: "01.02.03"') } finally { t.equal(threw, true) } // these should not throw var slob = {name:" X ",version:"01.02.03",dependencies:{ y:">01.02.03", z:"! 99 $$ASFJ(Aawenf90awenf as;naw.3j3qnraw || an elephant" }} normalize(slob, false) t.same(slob, { name: 'X', version: '1.2.3', dependencies: { y: '>01.02.03', z: '! 99 $$ASFJ(Aawenf90awenf as;naw.3j3qnraw || an elephant' }, readme: 'ERROR: No README data found!', _id: 'X@1.2.3' }) t.end() }) npm_3.5.2.orig/node_modules/normalize-package-data/test/typo.js0000644000000000000000000001014512631326456022760 0ustar 00000000000000var test = require('tap').test var normalize = require('../') var typos = require('../lib/typos.json') var warningMessages = require("../lib/warning_messages.json") var safeFormat = require("../lib/safe_format") test('typos', function(t) { var warnings = [] function warn(m) { warnings.push(m) } var typoMessage = safeFormat.bind(undefined, warningMessages.typo) var expect = [ warningMessages.missingRepository, warningMessages.missingLicense, typoMessage('dependancies', 'dependencies'), typoMessage('dependecies', 'dependencies'), typoMessage('depdenencies', 'dependencies'), typoMessage('devEependencies', 'devDependencies'), typoMessage('depends', 'dependencies'), typoMessage('dev-dependencies', 'devDependencies'), typoMessage('devDependences', 'devDependencies'), typoMessage('devDepenencies', 'devDependencies'), typoMessage('devdependencies', 'devDependencies'), typoMessage('repostitory', 'repository'), typoMessage('repo', 'repository'), typoMessage('prefereGlobal', 'preferGlobal'), typoMessage('hompage', 'homepage'), typoMessage('hampage', 'homepage'), typoMessage('autohr', 'author'), typoMessage('autor', 'author'), typoMessage('contributers', 'contributors'), typoMessage('publicationConfig', 'publishConfig') ] normalize({"dependancies": "dependencies" ,"dependecies": "dependencies" ,"depdenencies": "dependencies" ,"devEependencies": "devDependencies" ,"depends": "dependencies" ,"dev-dependencies": "devDependencies" ,"devDependences": "devDependencies" ,"devDepenencies": "devDependencies" ,"devdependencies": "devDependencies" ,"repostitory": "repository" ,"repo": "repository" ,"prefereGlobal": "preferGlobal" ,"hompage": "homepage" ,"hampage": "homepage" ,"autohr": "author" ,"autor": "author" ,"contributers": "contributors" ,"publicationConfig": "publishConfig" ,readme:"asdf" ,name:"name" ,version:"1.2.5"}, warn) t.same(warnings, expect) warnings.length = 0 var expect = [ warningMessages.missingDescription, warningMessages.missingRepository, typoMessage("bugs['web']", "bugs['url']"), typoMessage("bugs['name']", "bugs['url']"), warningMessages.nonUrlBugsUrlField, warningMessages.emptyNormalizedBugs, warningMessages.missingReadme, warningMessages.missingLicense] normalize({name:"name" ,version:"1.2.5" 
,bugs:{web:"url",name:"url"}}, warn) t.same(warnings, expect) warnings.length = 0 var expect = [ warningMessages.missingDescription, warningMessages.missingRepository, warningMessages.missingReadme, warningMessages.missingLicense, typoMessage('script', 'scripts') ] normalize({name:"name" ,version:"1.2.5" ,script:{server:"start",tests:"test"}}, warn) t.same(warnings, expect) warnings.length = 0 expect = [ warningMessages.missingDescription, warningMessages.missingRepository, typoMessage("scripts['server']", "scripts['start']"), typoMessage("scripts['tests']", "scripts['test']"), warningMessages.missingReadme, warningMessages.missingLicense] normalize({name:"name" ,version:"1.2.5" ,scripts:{server:"start",tests:"test"}}, warn) t.same(warnings, expect) warnings.length = 0 expect = [ warningMessages.missingDescription, warningMessages.missingRepository, warningMessages.missingReadme, warningMessages.missingLicense] normalize({name:"name" ,version:"1.2.5" ,scripts:{server:"start",tests:"test" ,start:"start",test:"test"}}, warn) t.same(warnings, expect) warnings.length = 0 expect = [] normalize({private: true ,name:"name" ,version:"1.2.5" ,scripts:{server:"start",tests:"test"}}, warn) t.same(warnings, expect) t.end(); }) npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/async.json0000644000000000000000000000154712631326456025316 0ustar 00000000000000{ "name": "async", "description": "Higher-order functions and common patterns for asynchronous code", "main": "./lib/async", "author": "Caolan McMahon", "version": "0.2.6", "repository" : { "type" : "git", "url" : "http://github.com/caolan/async.git" }, "bugs" : { "url" : "http://github.com/caolan/async/issues" }, "licenses" : [ { "type" : "MIT", "url" : "http://github.com/caolan/async/raw/master/LICENSE" } ], "devDependencies": { "nodeunit": ">0.0.0", "uglify-js": "1.2.x", "nodelint": ">0.0.0" }, "jam": { "main": "lib/async.js", "include": [ "lib/async.js", "README.md", "LICENSE" ] }, "scripts": { "test": "nodeunit test/test-async.js" } }npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/badscripts.json0000644000000000000000000000011612631326456026326 0ustar 00000000000000{ "name": "bad-scripts-package", "version": "0.0.1", "scripts": "foo" } npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/bcrypt.json0000644000000000000000000000343512631326456025502 0ustar 00000000000000{ "name": "bcrypt", "description": "A bcrypt library for NodeJS.", "keywords": [ "bcrypt", "password", "auth", "authentication", "encryption", "crypt", "crypto" ], "main": "./bcrypt", "version": "0.7.5", "author": "Nick Campbell (http://github.com/ncb000gt)", "engines": { "node": ">= 0.6.0" }, "repository": { "type": "git", "url": "http://github.com/ncb000gt/node.bcrypt.js.git" }, "licenses": [ { "type": "MIT" } ], "bugs": { "url": "http://github.com/ncb000gt/node.bcrypt.js/issues" }, "scripts": { "test": "node-gyp configure build && nodeunit test" }, "dependencies": { "bindings": "1.0.0" }, "devDependencies": { "nodeunit": ">=0.6.4" }, "contributors": [ "Antonio Salazar Cardozo (https://github.com/Shadowfiend)", "Van Nguyen (https://github.com/thegoleffect)", "David Trejo (https://github.com/dtrejo)", "Ben Glow (https://github.com/pixelglow)", "NewITFarmer.com <> (https://github.com/newitfarmer)", "Alfred Westerveld (https://github.com/alfredwesterveld)", "Vincent Côté-Roy (https://github.com/vincentcr)", "Lloyd Hilaiel (https://github.com/lloyd)", "Roman Shtylman (https://github.com/shtylman)", "Vadim Graboys 
(https://github.com/vadimg)", "Ben Noorduis <> (https://github.com/bnoordhuis)", "Nate Rajlich (https://github.com/tootallnate)", "Sean McArthur (https://github.com/seanmonstar)", "Fanie Oosthuysen (https://github.com/weareu)" ] }npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/coffee-script.json0000644000000000000000000000167212631326456026731 0ustar 00000000000000{ "name": "coffee-script", "description": "Unfancy JavaScript", "keywords": ["javascript", "language", "coffeescript", "compiler"], "author": "Jeremy Ashkenas", "version": "1.6.2", "licenses": [{ "type": "MIT", "url": "https://raw.github.com/jashkenas/coffee-script/master/LICENSE" }], "engines": { "node": ">=0.8.0" }, "directories" : { "lib" : "./lib/coffee-script" }, "main" : "./lib/coffee-script/coffee-script", "bin": { "coffee": "./bin/coffee", "cake": "./bin/cake" }, "scripts": { "test": "node ./bin/cake test" }, "homepage": "http://coffeescript.org", "bugs": "https://github.com/jashkenas/coffee-script/issues", "repository": { "type": "git", "url": "git://github.com/jashkenas/coffee-script.git" }, "devDependencies": { "uglify-js": "~2.2", "jison": ">=0.2.0" } }npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/http-server.json0000644000000000000000000000230612631326456026456 0ustar 00000000000000{ "name": "http-server", "preferGlobal": true, "version": "0.3.0", "author": "Nodejitsu ", "description": "a simple zero-configuration command-line http server", "contributors": [ { "name": "Marak Squires", "email": "marak@nodejitsu.com" } ], "bin": { "http-server": "./bin/http-server" }, "scripts": { "start": "node ./bin/http-server", "test": "vows --spec --isolate", "predeploy": "echo This will be run before deploying the app", "postdeploy": "echo This will be run after deploying the app" }, "main": "./lib/http-server", "repository": { "type": "git", "url": "https://github.com/nodejitsu/http-server.git" }, "keywords": [ "cli", "http", "server" ], "dependencies" : { "colors" : "*", "flatiron" : "0.1.x", "optimist" : "0.2.x", "union" : "0.1.x", "ecstatic" : "0.1.x", "plates" : "https://github.com/flatiron/plates/tarball/master" }, "analyze": false, "devDependencies": { "vows" : "0.5.x", "request" : "2.1.x" }, "bundledDependencies": [ "union", "ecstatic" ], "license": "MIT", "engines": { "node": ">=0.6" } }npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/movefile.json0000644000000000000000000000107012631326456025776 0ustar 00000000000000{ "name": "movefile", "description": "rename implementation working over devices", "version": "0.2.0", "author": "yazgazan ", "main": "./build/Release/movefile", "keywords": ["move", "file", "rename"], "repository": "git://github.com/yazgazan/movefile.git", "directories": { "lib": "./build/Release/" }, "scripts": { "install": "./node_modules/node-gyp/bin/node-gyp.js configure && ./node_modules/node-gyp/bin/node-gyp.js build" }, "engines": { "node": "*" }, "dependencies": { "node-gyp": "~0.9.1" } }npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/no-description.json0000644000000000000000000000006512631326456027130 0ustar 00000000000000{ "name": "foo-bar-package", "version": "0.0.1" }npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/node-module_exist.json0000644000000000000000000000111612631326456027615 0ustar 00000000000000{ "name": "node-module_exist", "description": "Find if a NodeJS module is available to require or not", "version": "0.0.1", "main": "module_exist.js", "scripts": { "test": "echo \"Error: no test specified\" && 
exit 1" }, "repository": { "type": "git", "url": "git@gist.github.com:3135914.git" }, "homepage": "https://github.com/FGRibreau", "author": { "name": "Francois-Guillaume Ribreau", "url": "http://fgribreau.com.com/" }, "devDependencies": { "nodeunit": "~0.7.4" }, "keywords": [ "core", "modules" ], "license": "MIT" }npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/npm.json0000644000000000000000000000541212631326456024766 0ustar 00000000000000{ "version": "1.2.17", "name": "npm", "publishConfig": { "proprietary-attribs": false }, "description": "A package manager for node", "keywords": [ "package manager", "modules", "install", "package.json" ], "preferGlobal": true, "config": { "publishtest": false }, "homepage": "https://npmjs.org/doc/", "author": "Isaac Z. Schlueter (http://blog.izs.me)", "repository": { "type": "git", "url": "https://github.com/isaacs/npm" }, "bugs": { "email": "npm-@googlegroups.com", "url": "http://github.com/isaacs/npm/issues" }, "directories": { "doc": "./doc", "man": "./man", "lib": "./lib", "bin": "./bin" }, "main": "./lib/npm.js", "bin": "./bin/npm-cli.js", "dependencies": { "semver": "~1.1.2", "ini": "~1.1.0", "slide": "1", "abbrev": "~1.0.4", "graceful-fs": "~1.2.0", "minimatch": "~0.2.11", "nopt": "~2.1.1", "rimraf": "2", "request": "~2.9", "which": "1", "tar": "~0.1.17", "fstream": "~0.1.22", "block-stream": "*", "inherits": "1", "mkdirp": "~0.3.3", "read": "~1.0.4", "lru-cache": "~2.3.0", "node-gyp": "~0.9.3", "fstream-npm": "~0.1.3", "uid-number": "0", "archy": "0", "chownr": "0", "npmlog": "0", "ansi": "~0.1.2", "npm-registry-client": "~0.2.18", "read-package-json": "~0.3.0", "read-installed": "0", "glob": "~3.1.21", "init-package-json": "0.0.6", "osenv": "0", "lockfile": "~0.3.0", "retry": "~0.6.0", "once": "~1.1.1", "npmconf": "0", "opener": "~1.3.0", "chmodr": "~0.1.0", "cmd-shim": "~1.1.0" }, "bundleDependencies": [ "semver", "ini", "slide", "abbrev", "graceful-fs", "minimatch", "nopt", "rimraf", "request", "which", "tar", "fstream", "block-stream", "inherits", "mkdirp", "read", "lru-cache", "node-gyp", "fstream-npm", "uid-number", "archy", "chownr", "npmlog", "ansi", "npm-registry-client", "read-package-json", "read-installed", "glob", "init-package-json", "osenv", "lockfile", "retry", "once", "npmconf", "opener", "chmodr", "cmd-shim" ], "devDependencies": { "ronn": "~0.3.6", "tap": "~0.4.0" }, "engines": { "node": ">=0.6", "npm": "1" }, "scripts": { "test": "node ./test/run.js && tap test/tap/*.js", "tap": "tap test/tap/*.js", "prepublish": "node bin/npm-cli.js prune ; rm -rf test/*/*/node_modules ; make -j4 doc", "dumpconf": "env | grep npm | sort | uniq", "echo": "node bin/npm-cli.js" }, "licenses": [ { "type": "MIT +no-false-attribs", "url": "https://github.com/isaacs/npm/raw/master/LICENSE" } ] }npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/read-package-json.json0000644000000000000000000000124412631326456027446 0ustar 00000000000000{ "name": "read-package-json", "version": "0.1.1", "author": "Isaac Z. 
Schlueter (http://blog.izs.me/)", "description": "The thing npm uses to read package.json files with semantics and defaults and validation", "repository": { "type": "git", "url": "git://github.com/isaacs/read-package-json.git" }, "license": "MIT", "main": "read-json.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "glob": "~3.1.9", "lru-cache": "~1.1.0", "semver": "~1.0.14", "slide": "~1.1.3" }, "devDependencies": { "tap": "~0.2.5" }, "optionalDependencies": { "npmlog": "0", "graceful-fs": "~1.1.8" } } npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/request.json0000644000000000000000000000146512631326456025670 0ustar 00000000000000{ "name": "request", "description": "Simplified HTTP request client.", "tags": [ "http", "simple", "util", "utility" ], "version": "2.16.7", "author": "Mikeal Rogers ", "repository": { "type": "git", "url": "http://github.com/mikeal/request.git" }, "bugs": { "url": "http://github.com/mikeal/request/issues" }, "engines": [ "node >= 0.8.0" ], "main": "index.js", "dependencies": { "form-data": "~0.0.3", "mime": "~1.2.7", "hawk": "~0.10.2", "node-uuid": "~1.4.0", "cookie-jar": "~0.2.0", "aws-sign": "~0.2.0", "oauth-sign": "~0.2.0", "forever-agent": "~0.2.0", "tunnel-agent": "~0.2.0", "json-stringify-safe": "~3.0.0", "qs": "~0.5.4" }, "scripts": { "test": "node tests/run.js" } }npm_3.5.2.orig/node_modules/normalize-package-data/test/fixtures/underscore.json0000644000000000000000000000117512631326456026347 0ustar 00000000000000{ "name" : "underscore", "description" : "JavaScript's functional programming helper library.", "homepage" : "http://underscorejs.org", "keywords" : ["util", "functional", "server", "client", "browser"], "author" : "Jeremy Ashkenas ", "repository" : {"type": "git", "url": "git://github.com/documentcloud/underscore.git"}, "main" : "underscore.js", "version" : "1.4.4", "devDependencies": { "phantomjs": "1.9.0-1" }, "scripts": { "test": "phantomjs test/vendor/runner.js test/index.html?noglobals=true" }, "license" : "MIT" }npm_3.5.2.orig/node_modules/npm-cache-filename/LICENSE0000644000000000000000000000136412631326456020571 0ustar 00000000000000The ISC License Copyright (c) npm, Inc. and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/npm-cache-filename/README.md0000644000000000000000000000105612631326456021041 0ustar 00000000000000# npm-cache-filename Given a cache folder and url, return the appropriate cache folder. 
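Registry revision suffixes (`/-rev/...` or `?rev=...`) are stripped before the URL is mapped, and a git URL's `#fragment` is appended as a trailing folder, so different git references land in different cache folders. For example (behaviour as exercised by `test.js` below):

```javascript
var cf = require('npm-cache-filename');
console.log(cf('/tmp', 'https://foo:134/xyz/-rev/baz'));
// outputs: /tmp/foo_134/xyz
console.log(cf('/tmp', 'git://foo:134/xyz-rev/baz.git#master'));
// outputs: /tmp/foo_134/xyz-rev/baz.git/master
```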
## USAGE ```javascript var cf = require('npm-cache-filename'); console.log(cf('/tmp/cache', 'https://registry.npmjs.org:1234/foo/bar')); // outputs: /tmp/cache/registry.npmjs.org_1234/foo/bar ``` As a bonus, you can also bind it to a specific root path: ```javascript var cf = require('npm-cache-filename'); var getFile = cf('/tmp/cache'); console.log(getFile('https://registry.npmjs.org:1234/foo/bar')); // outputs: /tmp/cache/registry.npmjs.org_1234/foo/bar ``` npm_3.5.2.orig/node_modules/npm-cache-filename/index.js0000644000000000000000000000122512631326456021225 0ustar 00000000000000var url = require('url');; var path = require('path');; module.exports = cf;; function cf(root, u) { if (!u) return cf.bind(null, root);; u = url.parse(u);; var h = u.host.replace(/:/g, '_');; // Strip off any /-rev/... or ?rev=... bits var revre = /(\?rev=|\?.*?&rev=|\/-rev\/).*$/;; var parts = u.path.replace(revre, '').split('/').slice(1);; // Make sure different git references get different folders if (u.hash && u.hash.length > 1) { parts.push(u.hash.slice(1));; };; var p = [root, h].concat(parts.map(function(part) { return encodeURIComponent(part).replace(/%/g, '_');; }));; return path.join.apply(path, p);; } npm_3.5.2.orig/node_modules/npm-cache-filename/package.json0000644000000000000000000000272312631326456022052 0ustar 00000000000000{ "name": "npm-cache-filename", "version": "1.0.2", "description": "Given a cache folder and url, return the appropriate cache folder.", "main": "index.js", "dependencies": {}, "devDependencies": { "tap": "^1.2.0" }, "scripts": { "test": "tap test.js" }, "repository": { "type": "git", "url": "git://github.com/npm/npm-cache-filename.git" }, "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/npm-cache-filename/issues" }, "homepage": "https://github.com/npm/npm-cache-filename", "gitHead": "b7eef12919fdf544a3b83bba73093f7268c40c1e", "_id": "npm-cache-filename@1.0.2", "_shasum": "ded306c5b0bfc870a9e9faf823bc5f283e05ae11", "_from": "npm-cache-filename@>=1.0.2 <1.1.0", "_npmVersion": "2.12.1", "_nodeVersion": "2.2.2", "_npmUser": { "name": "zkat", "email": "kat@sykosomatic.org" }, "dist": { "shasum": "ded306c5b0bfc870a9e9faf823bc5f283e05ae11", "tarball": "http://registry.npmjs.org/npm-cache-filename/-/npm-cache-filename-1.0.2.tgz" }, "maintainers": [ { "name": "isaacs", "email": "isaacs@npmjs.com" }, { "name": "kat", "email": "kat@lua.cz" }, { "name": "zkat", "email": "kat@sykosomatic.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/npm-cache-filename/-/npm-cache-filename-1.0.2.tgz" } npm_3.5.2.orig/node_modules/npm-cache-filename/test.js0000644000000000000000000000155112631326456021077 0ustar 00000000000000var test = require('tap').test;; test('it does the thing it says it does', function(t) { var cf = require('./');; t.equal(cf('/tmp/cache', 'https://foo:134/xyz?adf=foo:bar/baz'), '/tmp/cache/foo_134/xyz_3Fadf_3Dfoo_3Abar/baz');; var getFile = cf('/tmp/cache');; t.equal(getFile('https://foo:134/xyz?adf=foo:bar/baz'), '/tmp/cache/foo_134/xyz_3Fadf_3Dfoo_3Abar/baz');; t.equal(cf("/tmp", "https://foo:134/xyz/-rev/baz"), '/tmp/foo_134/xyz') t.equal(cf("/tmp", "https://foo:134/xyz/?rev=baz"), '/tmp/foo_134/xyz') t.equal(cf("/tmp", "https://foo:134/xyz/?foo&rev=baz"), '/tmp/foo_134/xyz') t.equal(cf("/tmp", "https://foo:134/xyz-rev/baz"), '/tmp/foo_134/xyz-rev/baz') t.equal(cf("/tmp", "git://foo:134/xyz-rev/baz.git#master"), '/tmp/foo_134/xyz-rev/baz.git/master') 
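// Note: each path segment is run through encodeURIComponent() with "%"
// remapped to "_", which is why "xyz?adf=foo:bar" above becomes
// "xyz_3Fadf_3Dfoo_3Abar": the result is always a plain, filesystem-safe path.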
t.end(); }); npm_3.5.2.orig/node_modules/npm-install-checks/LICENSE0000644000000000000000000000246512631326456020657 0ustar 00000000000000Copyright (c) Robert Kowalski and Isaac Z. Schlueter ("Authors") All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHORS AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHORS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. npm_3.5.2.orig/node_modules/npm-install-checks/README.md0000644000000000000000000000102312631326456021116 0ustar 00000000000000# npm-install-checks The checks that npm runs during package installation. ## API ### .checkEngine(target, npmVer, nodeVer, force, strict, cb) Check whether the current node/npm versions are supported by the package. Error type: `ENOTSUP` ### .checkPlatform(target, force, cb) Check whether the current OS/arch is supported by the package. Error type: `EBADPLATFORM` ### .checkCycle(target, ancestors, cb) Check for cyclic dependencies. Error type: `ECYCLE` ### .checkGit(folder, cb) Check whether a folder is a git checkout (i.e. contains a `.git` directory). Error type: `EISGIT` npm_3.5.2.orig/node_modules/npm-install-checks/index.js0000644000000000000000000000774512631326456021315 0ustar 00000000000000var fs = require("fs") var path = require("path") var log = require("npmlog") var semver = require("semver") exports.checkEngine = checkEngine function checkEngine (target, npmVer, nodeVer, force, strict, cb) { var nodev = force ?
null : nodeVer , eng = target.engines if (!eng) return cb() if (nodev && eng.node && !semver.satisfies(nodev, eng.node) || eng.npm && !semver.satisfies(npmVer, eng.npm)) { if (strict) { var er = new Error("Unsupported") er.code = "ENOTSUP" er.required = eng er.pkgid = target._id return cb(er) } else { log.warn( "engine", "%s: wanted: %j (current: %j)" , target._id, eng, {node: nodev, npm: npmVer} ) } } return cb() } exports.checkPlatform = checkPlatform function checkPlatform (target, force, cb) { var platform = process.platform , arch = process.arch , osOk = true , cpuOk = true if (force) { return cb() } if (target.os) { osOk = checkList(platform, target.os) } if (target.cpu) { cpuOk = checkList(arch, target.cpu) } if (!osOk || !cpuOk) { var er = new Error("Unsupported") er.code = "EBADPLATFORM" er.os = target.os || ['any'] er.cpu = target.cpu || ['any'] er.pkgid = target._id return cb(er) } return cb() } function checkList (value, list) { var tmp , match = false , blc = 0 if (typeof list === "string") { list = [list] } if (list.length === 1 && list[0] === "any") { return true } for (var i = 0; i < list.length; ++i) { tmp = list[i] if (tmp[0] === '!') { tmp = tmp.slice(1) if (tmp === value) { return false } ++blc } else { match = match || tmp === value } } return match || blc === list.length } exports.checkCycle = checkCycle function checkCycle (target, ancestors, cb) { // there are some very rare and pathological edge-cases where // a cycle can cause npm to try to install a never-ending tree // of stuff. // Simplest: // // A -> B -> A' -> B' -> A -> B -> A' -> B' -> A -> ... // // Solution: Simply flat-out refuse to install any name@version // that is already in the prototype tree of the ancestors object. // A more correct, but more complex, solution would be to symlink // the deeper thing into the new location. // Will do that if anyone whines about this irl. // // Note: `npm install foo` inside of the `foo` package will abort // earlier if `--force` is not set. However, if it IS set, then // we need to still fail here, but just skip the first level. Of // course, it'll still fail eventually if it's a true cycle, and // leave things in an undefined state, but that's what is to be // expected when `--force` is used. That is why getPrototypeOf // is used *twice* here: to skip the first level of repetition. var p = Object.getPrototypeOf(Object.getPrototypeOf(ancestors)) , name = target.name , version = target.version while (p && p !== Object.prototype && p[name] !== version) { p = Object.getPrototypeOf(p) } if (p[name] !== version) return cb() var er = new Error("Unresolvable cycle detected") var tree = [target._id, JSON.parse(JSON.stringify(ancestors))] , t = Object.getPrototypeOf(ancestors) while (t && t !== Object.prototype) { if (t === p) t.THIS_IS_P = true tree.push(JSON.parse(JSON.stringify(t))) t = Object.getPrototypeOf(t) } log.verbose("unresolvable dependency tree", tree) er.pkgid = target._id er.code = "ECYCLE" return cb(er) } exports.checkGit = checkGit function checkGit (folder, cb) { // if it's a git repo then don't touch it! 
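// (lstat() runs first so that plain files and symlinks bail out cheaply;
// checkGit_() below then looks for a .git subdirectory and yields an
// EISGIT error when it finds one)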
fs.lstat(folder, function (er, s) { if (er || !s.isDirectory()) return cb() else checkGit_(folder, cb) }) } function checkGit_ (folder, cb) { fs.stat(path.resolve(folder, ".git"), function (er, s) { if (!er && s.isDirectory()) { var e = new Error("Appears to be a git repo or submodule.") e.path = folder e.code = "EISGIT" return cb(e) } cb() }) } npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/0000755000000000000000000000000012631326456022320 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/package.json0000644000000000000000000000324512631326456022135 0ustar 00000000000000{ "name": "npm-install-checks", "version": "2.0.1", "description": "checks that npm runs during the installation of a module", "main": "index.js", "dependencies": { "npmlog": "0.1 || 1", "semver": "^2.3.0 || 3.x || 4 || 5" }, "devDependencies": { "mkdirp": "~0.3.5", "rimraf": "~2.2.5", "tap": "^1.2.0" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/npm/npm-install-checks.git" }, "homepage": "https://github.com/npm/npm-install-checks", "keywords": [ "npm,", "install" ], "author": { "name": "Robert Kowalski", "email": "rok@kowalski.gd" }, "license": "BSD-2-Clause", "bugs": { "url": "https://github.com/npm/npm-install-checks/issues" }, "gitHead": "1e9474f30490cd7621e976e91fa611d35e644f64", "_id": "npm-install-checks@2.0.1", "_shasum": "a93540b53f04fa9d916d2733d6541f6db7d88e46", "_from": "npm-install-checks@2.0.1", "_npmVersion": "3.3.4", "_nodeVersion": "4.0.0", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "dist": { "shasum": "a93540b53f04fa9d916d2733d6541f6db7d88e46", "tarball": "http://registry.npmjs.org/npm-install-checks/-/npm-install-checks-2.0.1.tgz" }, "maintainers": [ { "name": "robertkowalski", "email": "rok@kowalski.gd" }, { "name": "isaacs", "email": "isaacs@npmjs.com" }, { "name": "iarna", "email": "me@re-becca.org" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/npm-install-checks/-/npm-install-checks-2.0.1.tgz" } npm_3.5.2.orig/node_modules/npm-install-checks/test/0000755000000000000000000000000012631326456020622 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/0000755000000000000000000000000012631326456023614 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/.npmrc0000644000000000000000000000005412631326456024733 0ustar 00000000000000save-prefix = ~ proprietary-attribs = false npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/LICENSE0000644000000000000000000000137512631326456024627 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/README.md0000644000000000000000000001213612631326456025076 0ustar 00000000000000# npmlog The logger util that npm uses. This logger is very basic. It does the logging for npm. It supports custom levels and colored output. By default, logs are written to stderr. If you want to send log messages to outputs other than streams, then you can change the `log.stream` member, or you can just listen to the events that it emits, and do whatever you want with them. # Basic Usage ``` var log = require('npmlog') // additional stuff ---------------------------+ // message ----------+ | // prefix ----+ | | // level -+ | | | // v v v v log.info('fyi', 'I have a kitty cat: %j', myKittyCat) ``` ## log.level * {String} The level to display logs at. Any logs at or above this level will be displayed. The special level `silent` will prevent anything from being displayed ever. ## log.record * {Array} An array of all the log messages that have been entered. ## log.maxRecordSize * {Number} The maximum number of records to keep. If log.record gets bigger than 10% over this value, then it is sliced down to 90% of this value. The reason for the 10% window is so that it doesn't have to resize a large array on every log entry. ## log.prefixStyle * {Object} A style object that specifies how prefixes are styled. (See below) ## log.headingStyle * {Object} A style object that specifies how the heading is styled. (See below) ## log.heading * {String} Default: "" If set, a heading that is printed at the start of every line. ## log.stream * {Stream} Default: `process.stderr` The stream where output is written. ## log.enableColor() Force colors to be used on all messages, regardless of the output stream. ## log.disableColor() Disable colors on all messages. ## log.enableProgress() Enable the display of log activity spinner and progress bar ## log.disableProgress() Disable the display of a progress bar ## log.enableUnicode() Force the unicode theme to be used for the progress bar. ## log.disableUnicode() Disable the use of unicode in the progress bar. ## log.setGaugeTemplate(template) Overrides the default gauge template. ## log.pause() Stop emitting messages to the stream, but do not drop them. ## log.resume() Emit all buffered messages that were written while paused. ## log.log(level, prefix, message, ...) * `level` {String} The level to emit the message at * `prefix` {String} A string prefix. Set to "" to skip. * `message...` Arguments to `util.format` Emit a log message at the specified level. ## log\[level](prefix, message, ...) For example, * log.silly(prefix, message, ...) * log.verbose(prefix, message, ...) * log.info(prefix, message, ...) * log.http(prefix, message, ...) * log.warn(prefix, message, ...) * log.error(prefix, message, ...) Like `log.log(level, prefix, message, ...)`. In this way, each level is given a shorthand, so you can do `log.info(prefix, message)`. ## log.addLevel(level, n, style, disp) * `level` {String} Level indicator * `n` {Number} The numeric level * `style` {Object} Object with fg, bg, inverse, etc. * `disp` {String} Optional replacement for `level` in the output. Sets up a new level with a shorthand function and so forth. Note that if the number is `Infinity`, then setting the level to that will cause all log messages to be suppressed. If the number is `-Infinity`, then the only way to show it is to enable all log messages. ## log.newItem(name, todo, weight) * `name` {String} Optional; progress item name. 
* `todo` {Number} Optional; total amount of work to be done. Default 0. * `weight` {Number} Optional; the weight of this item relative to others. Default 1. This adds a new `are-we-there-yet` item tracker to the progress tracker. The object returned has the `log[level]` methods but is otherwise an `are-we-there-yet` `Tracker` object. ## log.newStream(name, todo, weight) This adds a new `are-we-there-yet` stream tracker to the progress tracker. The object returned has the `log[level]` methods but is otherwise an `are-we-there-yet` `TrackerStream` object. ## log.newGroup(name, weight) This adds a new `are-we-there-yet` tracker group to the progress tracker. The object returned has the `log[level]` methods but is otherwise an `are-we-there-yet` `TrackerGroup` object. # Events Events are all emitted with the message object. * `log` Emitted for all messages * `log.<level>` Emitted for all messages with the `<level>` level. * `<prefix>` Messages with prefixes also emit their prefix as an event. # Style Objects Style objects can have the following fields: * `fg` {String} Color for the foreground text * `bg` {String} Color for the background * `bold`, `inverse`, `underline` {Boolean} Set the associated property * `bell` {Boolean} Make a noise (This is pretty annoying, probably.) # Message Objects Every log event is emitted with a message object, and the `log.record` list contains all of them that have been created. They have the following fields: * `id` {Number} * `level` {String} * `prefix` {String} * `message` {String} Result of `util.format()` * `messageRaw` {Array} Arguments to `util.format()` npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/example.js0000644000000000000000000000312512631326456025606 0ustar 00000000000000var log = require('./log.js') log.heading = 'npm' console.error('log.level=silly') log.level = 'silly' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=silent') log.level = 'silent' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=info') log.level = 'info' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('404', 'This is a longer\n'+ 'message, with some details\n'+ 'and maybe a stack.\n'+ new Error('a 404 error').stack) log.addLevel('noise', 10000, {beep: true}) log.noise(false, 'LOUD NOISES') npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/log.js0000644000000000000000000001477112631326456024745 0ustar 00000000000000'use strict' var Progress = require('are-we-there-yet') var Gauge = require('gauge') var EE =
require('events').EventEmitter var log = exports = module.exports = new EE var util = require('util') var ansi = require('ansi') log.cursor = ansi(process.stderr) log.stream = process.stderr // by default, let ansi decide based on tty-ness. var colorEnabled = undefined log.enableColor = function () { colorEnabled = true this.cursor.enabled = true } log.disableColor = function () { colorEnabled = false this.cursor.enabled = false } // default level log.level = 'info' log.gauge = new Gauge(log.cursor) log.tracker = new Progress.TrackerGroup() // no progress bars unless asked log.progressEnabled = false var gaugeTheme = undefined log.enableUnicode = function () { gaugeTheme = Gauge.unicode log.gauge.setTheme(gaugeTheme) } log.disableUnicode = function () { gaugeTheme = Gauge.ascii log.gauge.setTheme(gaugeTheme) } var gaugeTemplate = undefined log.setGaugeTemplate = function (template) { gaugeTemplate = template log.gauge.setTemplate(gaugeTemplate) } log.enableProgress = function () { if (this.progressEnabled) return this.progressEnabled = true if (this._pause) return this.tracker.on('change', this.showProgress) this.gauge.enable() this.showProgress() } log.disableProgress = function () { if (!this.progressEnabled) return this.clearProgress() this.progressEnabled = false this.tracker.removeListener('change', this.showProgress) this.gauge.disable() } var trackerConstructors = ['newGroup', 'newItem', 'newStream'] var mixinLog = function (tracker) { // mixin the public methods from log into the tracker // (except: conflicts and one's we handle specially) Object.keys(log).forEach(function (P) { if (P[0] === '_') return if (trackerConstructors.filter(function (C) { return C === P }).length) return if (tracker[P]) return if (typeof log[P] !== 'function') return var func = log[P] tracker[P] = function () { return func.apply(log, arguments) } }) // if the new tracker is a group, make sure any subtrackers get // mixed in too if (tracker instanceof Progress.TrackerGroup) { trackerConstructors.forEach(function (C) { var func = tracker[C] tracker[C] = function () { return mixinLog(func.apply(tracker, arguments)) } }) } return tracker } // Add tracker constructors to the top level log object trackerConstructors.forEach(function (C) { log[C] = function () { return mixinLog(this.tracker[C].apply(this.tracker, arguments)) } }) log.clearProgress = function () { if (!this.progressEnabled) return this.gauge.hide() } log.showProgress = function (name) { if (!this.progressEnabled) return this.gauge.show(name, this.tracker.completed()) }.bind(log) // bind for use in tracker's on-change listener // temporarily stop emitting, but don't drop log.pause = function () { this._paused = true } log.resume = function () { if (!this._paused) return this._paused = false var b = this._buffer this._buffer = [] b.forEach(function (m) { this.emitLog(m) }, this) if (this.progressEnabled) this.enableProgress() } log._buffer = [] var id = 0 log.record = [] log.maxRecordSize = 10000 log.log = function (lvl, prefix, message) { var l = this.levels[lvl] if (l === undefined) { return this.emit('error', new Error(util.format( 'Undefined log level: %j', lvl))) } var a = new Array(arguments.length - 2) var stack = null for (var i = 2; i < arguments.length; i ++) { var arg = a[i-2] = arguments[i] // resolve stack traces to a plain string. 
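// (the flattened stack is also unshifted onto the argument list just below,
// so util.format() places it at the front of the final message)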
if (typeof arg === 'object' && arg && (arg instanceof Error) && arg.stack) { arg.stack = stack = arg.stack + '' } } if (stack) a.unshift(stack + '\n') message = util.format.apply(util, a) var m = { id: id++, level: lvl, prefix: String(prefix || ''), message: message, messageRaw: a } this.emit('log', m) this.emit('log.' + lvl, m) if (m.prefix) this.emit(m.prefix, m) this.record.push(m) var mrs = this.maxRecordSize var n = this.record.length - mrs if (n > mrs / 10) { var newSize = Math.floor(mrs * 0.9) this.record = this.record.slice(-1 * newSize) } this.emitLog(m) }.bind(log) log.emitLog = function (m) { if (this._paused) { this._buffer.push(m) return } if (this.progressEnabled) this.gauge.pulse(m.prefix) var l = this.levels[m.level] if (l === undefined) return if (l < this.levels[this.level]) return if (l > 0 && !isFinite(l)) return var style = log.style[m.level] var disp = log.disp[m.level] || m.level this.clearProgress() m.message.split(/\r?\n/).forEach(function (line) { if (this.heading) { this.write(this.heading, this.headingStyle) this.write(' ') } this.write(disp, log.style[m.level]) var p = m.prefix || '' if (p) this.write(' ') this.write(p, this.prefixStyle) this.write(' ' + line + '\n') }, this) this.showProgress() } log.write = function (msg, style) { if (!this.cursor) return if (this.stream !== this.cursor.stream) { this.cursor = ansi(this.stream, { enabled: colorEnabled }) var options = {} if (gaugeTheme != null) options.theme = gaugeTheme if (gaugeTemplate != null) options.template = gaugeTemplate this.gauge = new Gauge(options, this.cursor) } style = style || {} if (style.fg) this.cursor.fg[style.fg]() if (style.bg) this.cursor.bg[style.bg]() if (style.bold) this.cursor.bold() if (style.underline) this.cursor.underline() if (style.inverse) this.cursor.inverse() if (style.beep) this.cursor.beep() this.cursor.write(msg).reset() } log.addLevel = function (lvl, n, style, disp) { if (!disp) disp = lvl this.levels[lvl] = n this.style[lvl] = style if (!this[lvl]) this[lvl] = function () { var a = new Array(arguments.length + 1) a[0] = lvl for (var i = 0; i < arguments.length; i ++) { a[i + 1] = arguments[i] } return this.log.apply(this, a) }.bind(this) this.disp[lvl] = disp } log.prefixStyle = { fg: 'magenta' } log.headingStyle = { fg: 'white', bg: 'black' } log.style = {} log.levels = {} log.disp = {} log.addLevel('silly', -Infinity, { inverse: true }, 'sill') log.addLevel('verbose', 1000, { fg: 'blue', bg: 'black' }, 'verb') log.addLevel('info', 2000, { fg: 'green' }) log.addLevel('http', 3000, { fg: 'green', bg: 'black' }) log.addLevel('warn', 4000, { fg: 'black', bg: 'yellow' }, 'WARN') log.addLevel('error', 5000, { fg: 'red', bg: 'black' }, 'ERR!') log.addLevel('silent', Infinity) npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/0000755000000000000000000000000012631326456026271 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/package.json0000644000000000000000000000256412631326456026111 0ustar 00000000000000{ "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "npmlog", "description": "logger for npm", "version": "1.2.1", "repository": { "type": "git", "url": "git://github.com/isaacs/npmlog.git" }, "main": "log.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "ansi": "~0.3.0", "are-we-there-yet": "~1.0.0", "gauge": "~1.2.0" }, "devDependencies": { "tap": "" }, "license": "ISC", "gitHead": "4e1a73a567036064ded425a7d48c863d53550b4f", "bugs": { "url": "https://github.com/isaacs/npmlog/issues" }, "homepage": "https://github.com/isaacs/npmlog#readme", "_id": "npmlog@1.2.1", "_shasum": "28e7be619609b53f7ad1dd300a10d64d716268b6", "_from": "npmlog@>=0.1.0 <0.2.0||>=1.0.0 <2.0.0", "_npmVersion": "2.10.0", "_nodeVersion": "2.0.1", "_npmUser": { "name": "isaacs", "email": "isaacs@npmjs.com" }, "dist": { "shasum": "28e7be619609b53f7ad1dd300a10d64d716268b6", "tarball": "http://registry.npmjs.org/npmlog/-/npmlog-1.2.1.tgz" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "iarna", "email": "me@re-becca.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/npmlog/-/npmlog-1.2.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/test/0000755000000000000000000000000012631326456024573 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/0000755000000000000000000000000012631326456027223 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/0000755000000000000000000000000012631326456031355 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/0000755000000000000000000000000012631326456027361 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/.npmignore0000644000000000000000000000001512631326456031216 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/History.md0000644000000000000000000000060612631326456031210 0ustar 00000000000000 0.3.0 / 2014-05-09 ================== * package: remove "test" script and "devDependencies" * package: remove "engines" section * pacakge: remove "bin" section * package: beautify * examples: remove `starwars` example (#15) * Documented goto, horizontalAbsolute, and eraseLine methods in README.md (#12, @Jammerwoch) * add `.jshintrc` file < 0.3.0 ======= * Prehistoric npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/README.md0000644000000000000000000000621312631326456030504 0ustar 00000000000000ansi.js ========= ### Advanced ANSI formatting tool for Node.js `ansi.js` is a module for Node.js that provides an easy-to-use API for writing ANSI escape codes to `Stream` instances. ANSI escape codes are used to do fancy things in a terminal window, like render text in colors, delete characters, lines, the entire window, or hide and show the cursor, and lots more! #### Features: * 256 color support for the terminal! * Make a beep sound from your terminal! * Works with *any* writable `Stream` instance. * Allows you to move the cursor anywhere on the terminal window. * Allows you to delete existing contents from the terminal window. * Allows you to hide and show the cursor. * Converts CSS color codes and RGB values into ANSI escape codes. * Low-level; you are in control of when escape codes are used, it's not abstracted. 
Installation ------------ Install with `npm`: ``` bash $ npm install ansi ``` Example ------- ``` js var ansi = require('ansi') , cursor = ansi(process.stdout) // You can chain your calls forever: cursor .red() // Set font color to red .bg.grey() // Set background color to grey .write('Hello World!') // Write 'Hello World!' to stdout .bg.reset() // Reset the bgcolor before writing the trailing \n, // to avoid Terminal glitches .write('\n') // And a final \n to wrap things up // Rendering modes are persistent: cursor.hex('#660000').bold().underline() // You can use the regular logging functions, text will be green: console.log('This is blood red, bold text') // To reset just the foreground color: cursor.fg.reset() console.log('This will still be bold') // to go to a location (x,y) on the console // note: 1-indexed, not 0-indexed: cursor.goto(10, 5).write('Five down, ten over') // to clear the current line: cursor.horizontalAbsolute(0).eraseLine().write('Starting again') // to go to a different column on the current line: cursor.horizontalAbsolute(5).write('column five') // Clean up after yourself! cursor.reset() ``` License ------- (The MIT License) Copyright (c) 2012 Nathan Rajlich <nathan@tootallnate.net> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/0000755000000000000000000000000012631326456031041 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/lib/0000755000000000000000000000000012631326456027771 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/package.json0000644000000000000000000000242212631326456031511 0ustar 00000000000000{ "name": "ansi", "description": "Advanced ANSI formatting tool for Node.js", "keywords": [ "ansi", "formatting", "cursor", "color", "terminal", "rgb", "256", "stream" ], "version": "0.3.0", "author": { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://tootallnate.net" }, "repository": { "type": "git", "url": "git://github.com/TooTallNate/ansi.js.git" }, "main": "./lib/ansi.js", "bugs": { "url": "https://github.com/TooTallNate/ansi.js/issues" }, "homepage": "https://github.com/TooTallNate/ansi.js", "_id": "ansi@0.3.0", "_shasum": "74b2f1f187c8553c7f95015bcb76009fb43d38e0", "_from": "ansi@>=0.3.0 <0.4.0", "_npmVersion": "1.4.9", "_npmUser": { "name": "tootallnate", "email": "nathan@tootallnate.net" }, "maintainers": [ { "name": "TooTallNate", "email": "nathan@tootallnate.net" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "dist": { "shasum": "74b2f1f187c8553c7f95015bcb76009fb43d38e0", "tarball": "http://registry.npmjs.org/ansi/-/ansi-0.3.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/ansi/-/ansi-0.3.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/beep/0000755000000000000000000000000012631326456031754 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/clear/0000755000000000000000000000000012631326456032127 5ustar 00000000000000././@LongLink0000000000000000000000000000016000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/cursorPosition.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/cursor0000755000000000000000000000120012631326456032275 0ustar 00000000000000#!/usr/bin/env node var tty = require('tty') var cursor = require('../')(process.stdout) // listen for the queryPosition report on stdin process.stdin.resume() raw(true) process.stdin.once('data', function (b) { var match = /\[(\d+)\;(\d+)R$/.exec(b.toString()) if (match) { var xy = match.slice(1, 3).reverse().map(Number) console.error(xy) } // cleanup and close stdin raw(false) process.stdin.pause() }) // send the query position request code to stdout cursor.queryPosition() function raw (mode) { if (process.stdin.setRawMode) { process.stdin.setRawMode(mode) } else { tty.setRawMode(mode) } } ././@LongLink0000000000000000000000000000015000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/progress/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/progre0000755000000000000000000000000012631326456032260 5ustar 00000000000000././@LongLink0000000000000000000000000000015400000000000011215 Lustar 
00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/beep/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/beep/i0000755000000000000000000000051212631326456032130 0ustar 00000000000000#!/usr/bin/env node /** * Invokes the terminal "beep" sound once per second on every exact second. */ process.title = 'beep' var cursor = require('../../')(process.stdout) function beep () { cursor.beep() setTimeout(beep, 1000 - (new Date()).getMilliseconds()) } setTimeout(beep, 1000 - (new Date()).getMilliseconds()) ././@LongLink0000000000000000000000000000015500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/clear/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/clear/0000755000000000000000000000054412631326456032137 0ustar 00000000000000#!/usr/bin/env node /** * Like GNU ncurses "clear" command. * https://github.com/mscdex/node-ncurses/blob/master/deps/ncurses/progs/clear.c */ process.title = 'clear' function lf () { return '\n' } require('../../')(process.stdout) .write(Array.apply(null, Array(process.stdout.getWindowSize()[1])).map(lf).join('')) .eraseData(2) .goto(1, 1) ././@LongLink0000000000000000000000000000016000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/progress/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/examples/progre0000644000000000000000000000327412631326456032270 0ustar 00000000000000#!/usr/bin/env node var assert = require('assert') , ansi = require('../../') function Progress (stream, width) { this.cursor = ansi(stream) this.delta = this.cursor.newlines this.width = width | 0 || 10 this.open = '[' this.close = ']' this.complete = '█' this.incomplete = '_' // initial render this.progress = 0 } Object.defineProperty(Progress.prototype, 'progress', { get: get , set: set , configurable: true , enumerable: true }) function get () { return this._progress } function set (v) { this._progress = Math.max(0, Math.min(v, 100)) var w = this.width - this.complete.length - this.incomplete.length , n = w * (this._progress / 100) | 0 , i = w - n , com = c(this.complete, n) , inc = c(this.incomplete, i) , delta = this.cursor.newlines - this.delta assert.equal(com.length + inc.length, w) if (delta > 0) { this.cursor.up(delta) this.delta = this.cursor.newlines } this.cursor .horizontalAbsolute(0) .eraseLine(2) .fg.white() .write(this.open) .fg.grey() .bold() .write(com) .resetBold() .write(inc) .fg.white() .write(this.close) .fg.reset() .write('\n') } function c (char, length) { return Array.apply(null, Array(length)).map(function () { return char }).join('') } // Usage var width = parseInt(process.argv[2], 10) || process.stdout.getWindowSize()[0] / 2 , p = new Progress(process.stdout, width) ;(function tick () { p.progress += Math.random() * 5 p.cursor .eraseLine(2) .write('Progress: ') .bold().write(p.progress.toFixed(2)) .write('%') .resetBold() .write('\n') if (p.progress < 100) setTimeout(tick, 100) })() npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/ansi/lib/ansi.js0000644000000000000000000001743412631326456031272 0ustar 00000000000000 /** * References: * * - http://en.wikipedia.org/wiki/ANSI_escape_code * - http://www.termsys.demon.co.uk/vtansi.htm * */ /** * Module dependencies. 
*/ var emitNewlineEvents = require('./newlines') , prefix = '\x1b[' // For all escape codes , suffix = 'm' // Only for color codes /** * The ANSI escape sequences. */ var codes = { up: 'A' , down: 'B' , forward: 'C' , back: 'D' , nextLine: 'E' , previousLine: 'F' , horizontalAbsolute: 'G' , eraseData: 'J' , eraseLine: 'K' , scrollUp: 'S' , scrollDown: 'T' , savePosition: 's' , restorePosition: 'u' , queryPosition: '6n' , hide: '?25l' , show: '?25h' } /** * Rendering ANSI codes. */ var styles = { bold: 1 , italic: 3 , underline: 4 , inverse: 7 } /** * The negating ANSI code for the rendering modes. */ var reset = { bold: 22 , italic: 23 , underline: 24 , inverse: 27 } /** * The standard, styleable ANSI colors. */ var colors = { white: 37 , black: 30 , blue: 34 , cyan: 36 , green: 32 , magenta: 35 , red: 31 , yellow: 33 , grey: 90 , brightBlack: 90 , brightRed: 91 , brightGreen: 92 , brightYellow: 93 , brightBlue: 94 , brightMagenta: 95 , brightCyan: 96 , brightWhite: 97 } /** * Creates a Cursor instance based off the given `writable stream` instance. */ function ansi (stream, options) { if (stream._ansicursor) { return stream._ansicursor } else { return stream._ansicursor = new Cursor(stream, options) } } module.exports = exports = ansi /** * The `Cursor` class. */ function Cursor (stream, options) { if (!(this instanceof Cursor)) { return new Cursor(stream, options) } if (typeof stream != 'object' || typeof stream.write != 'function') { throw new Error('a valid Stream instance must be passed in') } // the stream to use this.stream = stream // when 'enabled' is false then all the functions are no-ops except for write() this.enabled = options && options.enabled if (typeof this.enabled === 'undefined') { this.enabled = stream.isTTY } this.enabled = !!this.enabled // then `buffering` is true, then `write()` calls are buffered in // memory until `flush()` is invoked this.buffering = !!(options && options.buffering) this._buffer = [] // controls the foreground and background colors this.fg = this.foreground = new Colorer(this, 0) this.bg = this.background = new Colorer(this, 10) // defaults this.Bold = false this.Italic = false this.Underline = false this.Inverse = false // keep track of the number of "newlines" that get encountered this.newlines = 0 emitNewlineEvents(stream) stream.on('newline', function () { this.newlines++ }.bind(this)) } exports.Cursor = Cursor /** * Helper function that calls `write()` on the underlying Stream. * Returns `this` instead of the write() return value to keep * the chaining going. */ Cursor.prototype.write = function (data) { if (this.buffering) { this._buffer.push(arguments) } else { this.stream.write.apply(this.stream, arguments) } return this } /** * Buffer `write()` calls into memory. * * @api public */ Cursor.prototype.buffer = function () { this.buffering = true return this } /** * Write out the in-memory buffer. * * @api public */ Cursor.prototype.flush = function () { this.buffering = false var str = this._buffer.map(function (args) { if (args.length != 1) throw new Error('unexpected args length! ' + args.length); return args[0]; }).join(''); this._buffer.splice(0); // empty this.write(str); return this } /** * The `Colorer` class manages both the background and foreground colors. */ function Colorer (cursor, base) { this.current = null this.cursor = cursor this.base = base } exports.Colorer = Colorer /** * Write an ANSI color code, ensuring that the same code doesn't get rewritten. 
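* (The Colorer's `current` field caches the last code written, so a chain
 * like `cursor.red().red()` emits the escape sequence only once.)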
*/ Colorer.prototype._setColorCode = function setColorCode (code) { var c = String(code) if (this.current === c) return this.cursor.enabled && this.cursor.write(prefix + c + suffix) this.current = c return this } /** * Set up the positional ANSI codes. */ Object.keys(codes).forEach(function (name) { var code = String(codes[name]) Cursor.prototype[name] = function () { var c = code if (arguments.length > 0) { c = toArray(arguments).map(Math.round).join(';') + code } this.enabled && this.write(prefix + c) return this } }) /** * Set up the functions for the rendering ANSI codes. */ Object.keys(styles).forEach(function (style) { var name = style[0].toUpperCase() + style.substring(1) , c = styles[style] , r = reset[style] Cursor.prototype[style] = function () { if (this[name]) return this.enabled && this.write(prefix + c + suffix) this[name] = true return this } Cursor.prototype['reset' + name] = function () { if (!this[name]) return this.enabled && this.write(prefix + r + suffix) this[name] = false return this } }) /** * Setup the functions for the standard colors. */ Object.keys(colors).forEach(function (color) { var code = colors[color] Colorer.prototype[color] = function () { this._setColorCode(this.base + code) return this.cursor } Cursor.prototype[color] = function () { return this.foreground[color]() } }) /** * Makes a beep sound! */ Cursor.prototype.beep = function () { this.enabled && this.write('\x07') return this } /** * Moves cursor to specific position */ Cursor.prototype.goto = function (x, y) { x = x | 0 y = y | 0 this.enabled && this.write(prefix + y + ';' + x + 'H') return this } /** * Resets the color. */ Colorer.prototype.reset = function () { this._setColorCode(this.base + 39) return this.cursor } /** * Resets all ANSI formatting on the stream. */ Cursor.prototype.reset = function () { this.enabled && this.write(prefix + '0' + suffix) this.Bold = false this.Italic = false this.Underline = false this.Inverse = false this.foreground.current = null this.background.current = null return this } /** * Sets the foreground color with the given RGB values. * The closest match out of the 216 colors is picked. */ Colorer.prototype.rgb = function (r, g, b) { var base = this.base + 38 , code = rgb(r, g, b) this._setColorCode(base + ';5;' + code) return this.cursor } /** * Same as `cursor.fg.rgb(r, g, b)`. */ Cursor.prototype.rgb = function (r, g, b) { return this.foreground.rgb(r, g, b) } /** * Accepts CSS color codes for use with ANSI escape codes. * For example: `#FF000` would be bright red. */ Colorer.prototype.hex = function (color) { return this.rgb.apply(this, hex(color)) } /** * Same as `cursor.fg.hex(color)`. */ Cursor.prototype.hex = function (color) { return this.foreground.hex(color) } // UTIL FUNCTIONS // /** * Translates a 255 RGB value to a 0-5 ANSI RGV value, * then returns the single ANSI color code to use. */ function rgb (r, g, b) { var red = r / 255 * 5 , green = g / 255 * 5 , blue = b / 255 * 5 return rgb5(red, green, blue) } /** * Turns rgb 0-5 values into a single ANSI color code to use. */ function rgb5 (r, g, b) { var red = Math.round(r) , green = Math.round(g) , blue = Math.round(b) return 16 + (red*36) + (green*6) + blue } /** * Accepts a hex CSS color code string (# is optional) and * translates it into an Array of 3 RGB 0-255 values, which * can then be used with rgb(). */ function hex (color) { var c = color[0] === '#' ? 
color.substring(1) : color , r = c.substring(0, 2) , g = c.substring(2, 4) , b = c.substring(4, 6) return [parseInt(r, 16), parseInt(g, 16), parseInt(b, 16)] } /** * Turns an array-like object into a real array. */ function toArray (a) { var i = 0 , l = a.length , rtn = [] for (; i 0) { var len = data.length , i = 0 // now try to calculate any deltas if (typeof data == 'string') { for (; i 100% are not allowed. Triggers a `change` event. * tracker.finish() Marks this tracker as finished, tracker.completed() will now be 1. Triggers a `change` event. TrackerStream ============= * var tracker = new TrackerStream(**name**, **size**, **options**) * **name** *(optional)* The name of this counter to report in change events. Defaults to undefined. * **size** *(optional)* The number of bytes being sent through this stream. * **options** *(optional)* A hash of stream options The tracker stream object is a pass through stream that updates an internal tracker object each time a block passes through. It's intended to track downloads, file extraction and other related activities. You use it by piping your data source into it and then using it as your data source. If your data has a length attribute then that's used as the amount of work completed when the chunk is passed through. If it does not (eg, object streams) then each chunk counts as completing 1 unit of work, so your size should be the total number of objects being streamed. * tracker.addWork(**todo**) * **todo** Increase the expected overall size by **todo** bytes. Increases the amount of work to be done, thus decreasing the completion percentage. Triggers a `change` event. ././@LongLink0000000000000000000000000000015200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/ind0000644000000000000000000000700412631326456032053 0ustar 00000000000000"use strict" var stream = require("readable-stream"); var EventEmitter = require("events").EventEmitter var util = require("util") var delegate = require("delegates") var TrackerGroup = exports.TrackerGroup = function (name) { EventEmitter.call(this) this.name = name this.trackGroup = [] var self = this this.totalWeight = 0 var noteChange = this.noteChange = function (name) { self.emit("change", name || this.name) }.bind(this) this.trackGroup.forEach(function(unit) { unit.on("change", noteChange) }) } util.inherits(TrackerGroup, EventEmitter) TrackerGroup.prototype.completed = function () { if (this.trackGroup.length==0) return 0 var valPerWeight = 1 / this.totalWeight var completed = 0 this.trackGroup.forEach(function(T) { completed += valPerWeight * T.weight * T.completed() }) return completed } TrackerGroup.prototype.addUnit = function (unit, weight, noChange) { unit.weight = weight || 1 this.totalWeight += unit.weight this.trackGroup.push(unit) unit.on("change", this.noteChange) if (! noChange) this.emit("change", this.name) return unit } TrackerGroup.prototype.newGroup = function (name, weight) { return this.addUnit(new TrackerGroup(name), weight) } TrackerGroup.prototype.newItem = function (name, todo, weight) { return this.addUnit(new Tracker(name, todo), weight) } TrackerGroup.prototype.newStream = function (name, todo, weight) { return this.addUnit(new TrackerStream(name, todo), weight) } TrackerGroup.prototype.finish = function () { if (! 
this.trackGroup.length) { this.addUnit(new Tracker(), 1, true) } var self = this this.trackGroup.forEach(function(T) { T.removeListener("change", self.noteChange) T.finish() }) this.emit("change", this.name) } var buffer = " " TrackerGroup.prototype.debug = function (depth) { depth = depth || 0 var indent = depth ? buffer.substr(0,depth) : "" var output = indent + (this.name||"top") + ": " + this.completed() + "\n" this.trackGroup.forEach(function(T) { if (T instanceof TrackerGroup) { output += T.debug(depth + 1) } else { output += indent + " " + T.name + ": " + T.completed() + "\n" } }) return output } var Tracker = exports.Tracker = function (name,todo) { EventEmitter.call(this) this.name = name this.workDone = 0 this.workTodo = todo || 0 } util.inherits(Tracker, EventEmitter) Tracker.prototype.completed = function () { return this.workTodo==0 ? 0 : this.workDone / this.workTodo } Tracker.prototype.addWork = function (work) { this.workTodo += work this.emit("change", this.name) } Tracker.prototype.completeWork = function (work) { this.workDone += work if (this.workDone > this.workTodo) this.workDone = this.workTodo this.emit("change", this.name) } Tracker.prototype.finish = function () { this.workTodo = this.workDone = 1 this.emit("change", this.name) } var TrackerStream = exports.TrackerStream = function (name, size, options) { stream.Transform.call(this, options) this.tracker = new Tracker(name, size) this.name = name var self = this this.tracker.on("change", function (name) { self.emit("change", name) }) } util.inherits(TrackerStream, stream.Transform) TrackerStream.prototype._transform = function (data, encoding, cb) { this.tracker.completeWork(data.length ? data.length : 1) this.push(data) cb() } TrackerStream.prototype._flush = function (cb) { this.tracker.finish() cb() } delegate(TrackerStream.prototype, "tracker") .method("completed") .method("addWork") ././@LongLink0000000000000000000000000000015700000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000015600000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/pac0000644000000000000000000000256312631326456032051 0ustar 00000000000000{ "name": "are-we-there-yet", "version": "1.0.4", "description": "Keep track of the overall completion of many dispirate processes", "main": "index.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/iarna/are-we-there-yet.git" }, "author": { "name": "Rebecca Turner", "url": "http://re-becca.org" }, "license": "ISC", "bugs": { "url": "https://github.com/iarna/are-we-there-yet/issues" }, "homepage": "https://github.com/iarna/are-we-there-yet", "devDependencies": { "tap": "^0.4.13" }, "dependencies": { "delegates": "^0.1.0", "readable-stream": "^1.1.13" }, "gitHead": "7ce414849b81ab83935a935275def01914821bde", "_id": "are-we-there-yet@1.0.4", "_shasum": "527fe389f7bcba90806106b99244eaa07e886f85", "_from": "are-we-there-yet@>=1.0.0 <1.1.0", "_npmVersion": "2.0.0", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "maintainers": [ { "name": "iarna", 
"email": "me@re-becca.org" } ], "dist": { "shasum": "527fe389f7bcba90806106b99244eaa07e886f85", "tarball": "http://registry.npmjs.org/are-we-there-yet/-/are-we-there-yet-1.0.4.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/are-we-there-yet/-/are-we-there-yet-1.0.4.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000014700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/test/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/tes0000755000000000000000000000000012631326456032071 5ustar 00000000000000././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000017700000000000011222 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/.npmignorenpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000001612631326456032055 0ustar 00000000000000node_modules/ ././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/History.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000034412631326456032061 0ustar 00000000000000 0.1.0 / 2014-10-17 ================== * adds `.fluent()` to api 0.0.3 / 2014-01-13 ================== * fix receiver for .method() 0.0.2 / 2014-01-13 ================== * Object.defineProperty() sucks * Initial commit ././@LongLink0000000000000000000000000000020100000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/Makefilenpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000014412631326456032057 0ustar 00000000000000 test: @./node_modules/.bin/mocha \ --require should \ --reporter spec \ --bail .PHONY: test././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/Readme.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000335012631326456032061 0ustar 00000000000000 # delegates Node method and accessor delegation utilty. ## Installation ``` $ npm install delegates ``` ## Example ```js var delegate = require('delegates'); ... 
delegate(proto, 'request') .method('acceptsLanguages') .method('acceptsEncodings') .method('acceptsCharsets') .method('accepts') .method('is') .access('querystring') .access('idempotent') .access('socket') .access('length') .access('query') .access('search') .access('status') .access('method') .access('path') .access('body') .access('host') .access('url') .getter('subdomains') .getter('protocol') .getter('header') .getter('stale') .getter('fresh') .getter('secure') .getter('ips') .getter('ip') ``` # API ## Delegate(proto, prop) Creates a delegator instance used to configure using the `prop` on the given `proto` object. (which is usually a prototype) ## Delegate#method(name) Allows the given method `name` to be accessed on the host. ## Delegate#getter(name) Creates a "getter" for the property with the given `name` on the delegated object. ## Delegate#setter(name) Creates a "setter" for the property with the given `name` on the delegated object. ## Delegate#access(name) Creates an "accessor" (ie: both getter *and* setter) for the property with the given `name` on the delegated object. ## Delegate#fluent(name) A unique type of "accessor" that works for a "fluent" API. When called as a getter, the method returns the expected value. However, if the method is called with a value, it will return itself so it can be chained. For example: ```js delegate(proto, 'request') .fluent('query') // getter var q = request.query(); // setter (chainable) request .query({ a: 1 }) .query({ b: 2 }); ``` # License MIT ././@LongLink0000000000000000000000000000020100000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000402112631326456032055 0ustar 00000000000000 /** * Expose `Delegator`. */ module.exports = Delegator; /** * Initialize a delegator. * * @param {Object} proto * @param {String} target * @api public */ function Delegator(proto, target) { if (!(this instanceof Delegator)) return new Delegator(proto, target); this.proto = proto; this.target = target; this.methods = []; this.getters = []; this.setters = []; this.fluents = []; } /** * Delegate method `name`. * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.method = function(name){ var proto = this.proto; var target = this.target; this.methods.push(name); proto[name] = function(){ return this[target][name].apply(this[target], arguments); }; return this; }; /** * Delegator accessor `name`. * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.access = function(name){ return this.getter(name).setter(name); }; /** * Delegator getter `name`. * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.getter = function(name){ var proto = this.proto; var target = this.target; this.getters.push(name); proto.__defineGetter__(name, function(){ return this[target][name]; }); return this; }; /** * Delegator setter `name`. 
* * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.setter = function(name){ var proto = this.proto; var target = this.target; this.setters.push(name); proto.__defineSetter__(name, function(val){ return this[target][name] = val; }); return this; }; /** * Delegator fluent accessor * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.fluent = function (name) { var proto = this.proto; var target = this.target; this.fluents.push(name); proto[name] = function(val){ if ('undefined' != typeof val) { this[target][name] = val; return this; } else { return this[target][name]; } }; return this; }; ././@LongLink0000000000000000000000000000020500000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000233112631326456032057 0ustar 00000000000000{ "name": "delegates", "version": "0.1.0", "repository": { "type": "git", "url": "git://github.com/visionmedia/node-delegates.git" }, "description": "delegate methods and accessors to another property", "keywords": [ "delegate", "delegation" ], "dependencies": {}, "devDependencies": { "mocha": "*", "should": "*" }, "license": "MIT", "bugs": { "url": "https://github.com/visionmedia/node-delegates/issues" }, "homepage": "https://github.com/visionmedia/node-delegates", "_id": "delegates@0.1.0", "_shasum": "b4b57be11a1653517a04b27f0949bdc327dfe390", "_from": "delegates@>=0.1.0 <0.2.0", "_npmVersion": "1.4.9", "_npmUser": { "name": "dominicbarnes", "email": "dominic@dbarnes.info" }, "maintainers": [ { "name": "tjholowaychuk", "email": "tj@vision-media.ca" }, { "name": "dominicbarnes", "email": "dominic@dbarnes.info" } ], "dist": { "shasum": "b4b57be11a1653517a04b27f0949bdc327dfe390", "tarball": "http://registry.npmjs.org/delegates/-/delegates-0.1.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/delegates/-/delegates-0.1.0.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/test/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000020600000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/test/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000337012631326456032063 0ustar 00000000000000 var assert = require('assert'); var delegate = require('..'); describe('.method(name)', function(){ it('should delegate methods', function(){ var obj = {}; obj.request = { foo: function(bar){ assert(this == obj.request); return bar; } }; delegate(obj, 'request').method('foo'); obj.foo('something').should.equal('something'); }) }) describe('.getter(name)', function(){ it('should delegate getters', function(){ var obj = {}; obj.request = { get type() { return 'text/html'; } } delegate(obj, 'request').getter('type'); obj.type.should.equal('text/html'); }) }) describe('.setter(name)', function(){ it('should delegate setters', function(){ var obj = {}; obj.request = { get type() { return this._type.toUpperCase(); }, set type(val) { this._type = val; } } delegate(obj, 'request').setter('type'); obj.type = 'hey'; obj.request.type.should.equal('HEY'); }) }) describe('.access(name)', function(){ it('should delegate getters and setters', function(){ var obj = {}; obj.request = { get type() { return this._type.toUpperCase(); }, set type(val) { this._type = val; } } delegate(obj, 'request').access('type'); obj.type = 'hey'; obj.type.should.equal('HEY'); }) }) describe('.fluent(name)', function () { it('should delegate in a fluent fashion', function () { var obj = { settings: { env: 'development' } }; delegate(obj, 'settings').fluent('env'); obj.env().should.equal('development'); obj.env('production').should.equal(obj); obj.settings.env.should.equal('production'); }) }) ././@LongLink0000000000000000000000000000021100000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/.npmignorenpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000004412631326456032056 0ustar 00000000000000build/ test/ examples/ fs.js zlib.js././@LongLink0000000000000000000000000000020600000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000211012631326456032052 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. All rights reserved. 
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
././@LongLink0000000000000000000000000000021000000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000243012631326456032057 0ustar 00000000000000# readable-stream

***Node-core streams for userland***

[![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/)
[![NPM](https://nodei.co/npm-dl/readable-stream.png?months=6&height=3)](https://nodei.co/npm/readable-stream/)

This package is a mirror of the Streams2 and Streams3 implementations in Node-core. If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries, are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core.

**readable-stream** comes in two major versions, v1.0.x and v1.1.x. The former tracks the Streams2 implementation in Node 0.10, including bug-fixes and minor improvements as they are added. The latter tracks Streams3 as it develops in Node 0.11; we will likely see a v1.2.x branch for Node 0.12.

**readable-stream** uses proper patch-level versioning, so if you pin to `"~1.0.0"` you’ll get the latest Node 0.10 Streams2 implementation, including any fixes and minor non-breaking improvements. The patch-level versions of 1.0.x and 1.1.x should mirror the patch-level versions of Node-core releases.
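As a minimal sketch of what that looks like in practice (the `UpperCase` transform below is our own illustration, not part of this package), a library depends on **readable-stream** and never touches core `stream`:

```js
// package.json: "dependencies": { "readable-stream": "~1.0.0" }
var Transform = require('readable-stream').Transform;
var util = require('util');

util.inherits(UpperCase, Transform);

function UpperCase(options) {
  if (!(this instanceof UpperCase)) return new UpperCase(options);
  Transform.call(this, options);
}

// Upper-case each chunk as it passes through.
UpperCase.prototype._transform = function (chunk, encoding, cb) {
  cb(null, chunk.toString().toUpperCase());
};

process.stdin.pipe(new UpperCase()).pipe(process.stdout);
```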
You should prefer the **1.0.x** releases for now and when you’re ready to start using Streams3, pin to `"~1.1.0"` ././@LongLink0000000000000000000000000000021000000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/duplex.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000006412631326456032060 0ustar 00000000000000module.exports = require("./lib/_stream_duplex.js") ././@LongLink0000000000000000000000000000021200000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/float.patchnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000007374312631326456032076 0ustar 00000000000000diff --git a/lib/_stream_duplex.js b/lib/_stream_duplex.js index c5a741c..a2e0d8e 100644 --- a/lib/_stream_duplex.js +++ b/lib/_stream_duplex.js @@ -26,8 +26,8 @@ module.exports = Duplex; var util = require('util'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('./_stream_readable'); +var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); diff --git a/lib/_stream_passthrough.js b/lib/_stream_passthrough.js index a5e9864..330c247 100644 --- a/lib/_stream_passthrough.js +++ b/lib/_stream_passthrough.js @@ -25,7 +25,7 @@ module.exports = PassThrough; -var Transform = require('_stream_transform'); +var Transform = require('./_stream_transform'); var util = require('util'); util.inherits(PassThrough, Transform); diff --git a/lib/_stream_readable.js b/lib/_stream_readable.js index 0c3fe3e..90a8298 100644 --- a/lib/_stream_readable.js +++ b/lib/_stream_readable.js @@ -23,10 +23,34 @@ module.exports = Readable; Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; +if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { + return emitter.listeners(type).length; +}; + +if (!global.setImmediate) global.setImmediate = function setImmediate(fn) { + return setTimeout(fn, 0); +}; +if (!global.clearImmediate) global.clearImmediate = function clearImmediate(i) { + return clearTimeout(i); +}; + var Stream = require('stream'); var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} var StringDecoder; -var debug = util.debuglog('stream'); +var debug; +if (util.debuglog) + debug = util.debuglog('stream'); +else try { + debug = require('debuglog')('stream'); +} catch (er) { + debug = function() {}; +} util.inherits(Readable, Stream); @@ -380,7 +404,7 @@ function chunkInvalid(state, chunk) { function onEofChunk(stream, state) { - if (state.decoder && !state.ended) { + if (state.decoder && !state.ended && state.decoder.end) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); diff --git a/lib/_stream_transform.js b/lib/_stream_transform.js index b1f9fcc..b0caf57 100644 --- a/lib/_stream_transform.js +++ b/lib/_stream_transform.js @@ -64,8 +64,14 @@ module.exports = Transform; -var Duplex = require('_stream_duplex'); +var Duplex = require('./_stream_duplex'); var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} 
util.inherits(Transform, Duplex); diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index ba2e920..f49288b 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -27,6 +27,12 @@ module.exports = Writable; Writable.WritableState = WritableState; var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} var Stream = require('stream'); util.inherits(Writable, Stream); @@ -119,7 +125,7 @@ function WritableState(options, stream) { function Writable(options) { // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. - if (!(this instanceof Writable) && !(this instanceof Stream.Duplex)) + if (!(this instanceof Writable) && !(this instanceof require('./_stream_duplex'))) return new Writable(options); this._writableState = new WritableState(options, this); diff --git a/test/simple/test-stream-big-push.js b/test/simple/test-stream-big-push.js index e3787e4..8cd2127 100644 --- a/test/simple/test-stream-big-push.js +++ b/test/simple/test-stream-big-push.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var str = 'asdfasdfasdfasdfasdf'; var r = new stream.Readable({ diff --git a/test/simple/test-stream-end-paused.js b/test/simple/test-stream-end-paused.js index bb73777..d40efc7 100644 --- a/test/simple/test-stream-end-paused.js +++ b/test/simple/test-stream-end-paused.js @@ -25,7 +25,7 @@ var gotEnd = false; // Make sure we don't miss the end event for paused 0-length streams -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var stream = new Readable(); var calledRead = false; stream._read = function() { diff --git a/test/simple/test-stream-pipe-after-end.js b/test/simple/test-stream-pipe-after-end.js index b46ee90..0be8366 100644 --- a/test/simple/test-stream-pipe-after-end.js +++ b/test/simple/test-stream-pipe-after-end.js @@ -22,8 +22,8 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var util = require('util'); util.inherits(TestReadable, Readable); diff --git a/test/simple/test-stream-pipe-cleanup.js b/test/simple/test-stream-pipe-cleanup.js deleted file mode 100644 index f689358..0000000 --- a/test/simple/test-stream-pipe-cleanup.js +++ /dev/null @@ -1,122 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. - -// This test asserts that Stream.prototype.pipe does not leave listeners -// hanging on the source or dest. - -var common = require('../common'); -var stream = require('stream'); -var assert = require('assert'); -var util = require('util'); - -function Writable() { - this.writable = true; - this.endCalls = 0; - stream.Stream.call(this); -} -util.inherits(Writable, stream.Stream); -Writable.prototype.end = function() { - this.endCalls++; -}; - -Writable.prototype.destroy = function() { - this.endCalls++; -}; - -function Readable() { - this.readable = true; - stream.Stream.call(this); -} -util.inherits(Readable, stream.Stream); - -function Duplex() { - this.readable = true; - Writable.call(this); -} -util.inherits(Duplex, Writable); - -var i = 0; -var limit = 100; - -var w = new Writable(); - -var r; - -for (i = 0; i < limit; i++) { - r = new Readable(); - r.pipe(w); - r.emit('end'); -} -assert.equal(0, r.listeners('end').length); -assert.equal(limit, w.endCalls); - -w.endCalls = 0; - -for (i = 0; i < limit; i++) { - r = new Readable(); - r.pipe(w); - r.emit('close'); -} -assert.equal(0, r.listeners('close').length); -assert.equal(limit, w.endCalls); - -w.endCalls = 0; - -r = new Readable(); - -for (i = 0; i < limit; i++) { - w = new Writable(); - r.pipe(w); - w.emit('close'); -} -assert.equal(0, w.listeners('close').length); - -r = new Readable(); -w = new Writable(); -var d = new Duplex(); -r.pipe(d); // pipeline A -d.pipe(w); // pipeline B -assert.equal(r.listeners('end').length, 2); // A.onend, A.cleanup -assert.equal(r.listeners('close').length, 2); // A.onclose, A.cleanup -assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup -assert.equal(d.listeners('close').length, 3); // A.cleanup, B.onclose, B.cleanup -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 1); // B.cleanup - -r.emit('end'); -assert.equal(d.endCalls, 1); -assert.equal(w.endCalls, 0); -assert.equal(r.listeners('end').length, 0); -assert.equal(r.listeners('close').length, 0); -assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup -assert.equal(d.listeners('close').length, 2); // B.onclose, B.cleanup -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 1); // B.cleanup - -d.emit('end'); -assert.equal(d.endCalls, 1); -assert.equal(w.endCalls, 1); -assert.equal(r.listeners('end').length, 0); -assert.equal(r.listeners('close').length, 0); -assert.equal(d.listeners('end').length, 0); -assert.equal(d.listeners('close').length, 0); -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 0); diff --git a/test/simple/test-stream-pipe-error-handling.js b/test/simple/test-stream-pipe-error-handling.js index c5d724b..c7d6b7d 100644 --- a/test/simple/test-stream-pipe-error-handling.js +++ b/test/simple/test-stream-pipe-error-handling.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var Stream = require('stream').Stream; +var Stream = require('../../').Stream; (function testErrorListenerCatches() { var source = new Stream(); diff --git a/test/simple/test-stream-pipe-event.js b/test/simple/test-stream-pipe-event.js index cb9d5fe..56f8d61 100644 --- a/test/simple/test-stream-pipe-event.js +++ 
b/test/simple/test-stream-pipe-event.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common'); -var stream = require('stream'); +var stream = require('../../'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream-push-order.js b/test/simple/test-stream-push-order.js index f2e6ec2..a5c9bf9 100644 --- a/test/simple/test-stream-push-order.js +++ b/test/simple/test-stream-push-order.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var assert = require('assert'); var s = new Readable({ diff --git a/test/simple/test-stream-push-strings.js b/test/simple/test-stream-push-strings.js index 06f43dc..1701a9a 100644 --- a/test/simple/test-stream-push-strings.js +++ b/test/simple/test-stream-push-strings.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var util = require('util'); util.inherits(MyStream, Readable); diff --git a/test/simple/test-stream-readable-event.js b/test/simple/test-stream-readable-event.js index ba6a577..a8e6f7b 100644 --- a/test/simple/test-stream-readable-event.js +++ b/test/simple/test-stream-readable-event.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; (function first() { // First test, not reading when the readable is added. diff --git a/test/simple/test-stream-readable-flow-recursion.js b/test/simple/test-stream-readable-flow-recursion.js index 2891ad6..11689ba 100644 --- a/test/simple/test-stream-readable-flow-recursion.js +++ b/test/simple/test-stream-readable-flow-recursion.js @@ -27,7 +27,7 @@ var assert = require('assert'); // more data continuously, but without triggering a nextTick // warning or RangeError. -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; // throw an error if we trigger a nextTick warning. process.throwDeprecation = true; diff --git a/test/simple/test-stream-unshift-empty-chunk.js b/test/simple/test-stream-unshift-empty-chunk.js index 0c96476..7827538 100644 --- a/test/simple/test-stream-unshift-empty-chunk.js +++ b/test/simple/test-stream-unshift-empty-chunk.js @@ -24,7 +24,7 @@ var assert = require('assert'); // This test verifies that stream.unshift(Buffer(0)) or // stream.unshift('') does not set state.reading=false. -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable(); var nChunks = 10; diff --git a/test/simple/test-stream-unshift-read-race.js b/test/simple/test-stream-unshift-read-race.js index 83fd9fa..17c18aa 100644 --- a/test/simple/test-stream-unshift-read-race.js +++ b/test/simple/test-stream-unshift-read-race.js @@ -29,7 +29,7 @@ var assert = require('assert'); // 3. push() after the EOF signaling null is an error. // 4. _read() is not called after pushing the EOF null chunk. -var stream = require('stream'); +var stream = require('../../'); var hwm = 10; var r = stream.Readable({ highWaterMark: hwm }); var chunks = 10; @@ -51,7 +51,14 @@ r._read = function(n) { function push(fast) { assert(!pushedNull, 'push() after null push'); - var c = pos >= data.length ? 
null : data.slice(pos, pos + n); + var c; + if (pos >= data.length) + c = null; + else { + if (n + pos > data.length) + n = data.length - pos; + c = data.slice(pos, pos + n); + } pushedNull = c === null; if (fast) { pos += n; diff --git a/test/simple/test-stream-writev.js b/test/simple/test-stream-writev.js index 5b49e6e..b5321f3 100644 --- a/test/simple/test-stream-writev.js +++ b/test/simple/test-stream-writev.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var queue = []; for (var decode = 0; decode < 2; decode++) { diff --git a/test/simple/test-stream2-basic.js b/test/simple/test-stream2-basic.js index 3814bf0..248c1be 100644 --- a/test/simple/test-stream2-basic.js +++ b/test/simple/test-stream2-basic.js @@ -21,7 +21,7 @@ var common = require('../common.js'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream2-compatibility.js b/test/simple/test-stream2-compatibility.js index 6cdd4e9..f0fa84b 100644 --- a/test/simple/test-stream2-compatibility.js +++ b/test/simple/test-stream2-compatibility.js @@ -21,7 +21,7 @@ var common = require('../common.js'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream2-finish-pipe.js b/test/simple/test-stream2-finish-pipe.js index 39b274f..006a19b 100644 --- a/test/simple/test-stream2-finish-pipe.js +++ b/test/simple/test-stream2-finish-pipe.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var stream = require('stream'); +var stream = require('../../'); var Buffer = require('buffer').Buffer; var r = new stream.Readable(); diff --git a/test/simple/test-stream2-fs.js b/test/simple/test-stream2-fs.js deleted file mode 100644 index e162406..0000000 --- a/test/simple/test-stream2-fs.js +++ /dev/null @@ -1,72 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. 
- - -var common = require('../common.js'); -var R = require('_stream_readable'); -var assert = require('assert'); - -var fs = require('fs'); -var FSReadable = fs.ReadStream; - -var path = require('path'); -var file = path.resolve(common.fixturesDir, 'x1024.txt'); - -var size = fs.statSync(file).size; - -var expectLengths = [1024]; - -var util = require('util'); -var Stream = require('stream'); - -util.inherits(TestWriter, Stream); - -function TestWriter() { - Stream.apply(this); - this.buffer = []; - this.length = 0; -} - -TestWriter.prototype.write = function(c) { - this.buffer.push(c.toString()); - this.length += c.length; - return true; -}; - -TestWriter.prototype.end = function(c) { - if (c) this.buffer.push(c.toString()); - this.emit('results', this.buffer); -} - -var r = new FSReadable(file); -var w = new TestWriter(); - -w.on('results', function(res) { - console.error(res, w.length); - assert.equal(w.length, size); - var l = 0; - assert.deepEqual(res.map(function (c) { - return c.length; - }), expectLengths); - console.log('ok'); -}); - -r.pipe(w); diff --git a/test/simple/test-stream2-httpclient-response-end.js b/test/simple/test-stream2-httpclient-response-end.js deleted file mode 100644 index 15cffc2..0000000 --- a/test/simple/test-stream2-httpclient-response-end.js +++ /dev/null @@ -1,52 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. 
- -var common = require('../common.js'); -var assert = require('assert'); -var http = require('http'); -var msg = 'Hello'; -var readable_event = false; -var end_event = false; -var server = http.createServer(function(req, res) { - res.writeHead(200, {'Content-Type': 'text/plain'}); - res.end(msg); -}).listen(common.PORT, function() { - http.get({port: common.PORT}, function(res) { - var data = ''; - res.on('readable', function() { - console.log('readable event'); - readable_event = true; - data += res.read(); - }); - res.on('end', function() { - console.log('end event'); - end_event = true; - assert.strictEqual(msg, data); - server.close(); - }); - }); -}); - -process.on('exit', function() { - assert(readable_event); - assert(end_event); -}); - diff --git a/test/simple/test-stream2-large-read-stall.js b/test/simple/test-stream2-large-read-stall.js index 2fbfbca..667985b 100644 --- a/test/simple/test-stream2-large-read-stall.js +++ b/test/simple/test-stream2-large-read-stall.js @@ -30,7 +30,7 @@ var PUSHSIZE = 20; var PUSHCOUNT = 1000; var HWM = 50; -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable({ highWaterMark: HWM }); @@ -39,23 +39,23 @@ var rs = r._readableState; r._read = push; r.on('readable', function() { - console.error('>> readable'); + //console.error('>> readable'); do { - console.error(' > read(%d)', READSIZE); + //console.error(' > read(%d)', READSIZE); var ret = r.read(READSIZE); - console.error(' < %j (%d remain)', ret && ret.length, rs.length); + //console.error(' < %j (%d remain)', ret && ret.length, rs.length); } while (ret && ret.length === READSIZE); - console.error('<< after read()', - ret && ret.length, - rs.needReadable, - rs.length); + //console.error('<< after read()', + // ret && ret.length, + // rs.needReadable, + // rs.length); }); var endEmitted = false; r.on('end', function() { endEmitted = true; - console.error('end'); + //console.error('end'); }); var pushes = 0; @@ -64,11 +64,11 @@ function push() { return; if (pushes++ === PUSHCOUNT) { - console.error(' push(EOF)'); + //console.error(' push(EOF)'); return r.push(null); } - console.error(' push #%d', pushes); + //console.error(' push #%d', pushes); if (r.push(new Buffer(PUSHSIZE))) setTimeout(push); } diff --git a/test/simple/test-stream2-objects.js b/test/simple/test-stream2-objects.js index 3e6931d..ff47d89 100644 --- a/test/simple/test-stream2-objects.js +++ b/test/simple/test-stream2-objects.js @@ -21,8 +21,8 @@ var common = require('../common.js'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var assert = require('assert'); // tiny node-tap lookalike. 
diff --git a/test/simple/test-stream2-pipe-error-handling.js b/test/simple/test-stream2-pipe-error-handling.js index cf7531c..e3f3e4e 100644 --- a/test/simple/test-stream2-pipe-error-handling.js +++ b/test/simple/test-stream2-pipe-error-handling.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); (function testErrorListenerCatches() { var count = 1000; diff --git a/test/simple/test-stream2-pipe-error-once-listener.js b/test/simple/test-stream2-pipe-error-once-listener.js index 5e8e3cb..53b2616 100755 --- a/test/simple/test-stream2-pipe-error-once-listener.js +++ b/test/simple/test-stream2-pipe-error-once-listener.js @@ -24,7 +24,7 @@ var common = require('../common.js'); var assert = require('assert'); var util = require('util'); -var stream = require('stream'); +var stream = require('../../'); var Read = function() { diff --git a/test/simple/test-stream2-push.js b/test/simple/test-stream2-push.js index b63edc3..eb2b0e9 100644 --- a/test/simple/test-stream2-push.js +++ b/test/simple/test-stream2-push.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var stream = require('stream'); +var stream = require('../../'); var Readable = stream.Readable; var Writable = stream.Writable; var assert = require('assert'); diff --git a/test/simple/test-stream2-read-sync-stack.js b/test/simple/test-stream2-read-sync-stack.js index e8a7305..9740a47 100644 --- a/test/simple/test-stream2-read-sync-stack.js +++ b/test/simple/test-stream2-read-sync-stack.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable(); var N = 256 * 1024; diff --git a/test/simple/test-stream2-readable-empty-buffer-no-eof.js b/test/simple/test-stream2-readable-empty-buffer-no-eof.js index cd30178..4b1659d 100644 --- a/test/simple/test-stream2-readable-empty-buffer-no-eof.js +++ b/test/simple/test-stream2-readable-empty-buffer-no-eof.js @@ -22,10 +22,9 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; test1(); -test2(); function test1() { var r = new Readable(); @@ -88,31 +87,3 @@ function test1() { console.log('ok'); }); } - -function test2() { - var r = new Readable({ encoding: 'base64' }); - var reads = 5; - r._read = function(n) { - if (!reads--) - return r.push(null); // EOF - else - return r.push(new Buffer('x')); - }; - - var results = []; - function flow() { - var chunk; - while (null !== (chunk = r.read())) - results.push(chunk + ''); - } - r.on('readable', flow); - r.on('end', function() { - results.push('EOF'); - }); - flow(); - - process.on('exit', function() { - assert.deepEqual(results, [ 'eHh4', 'eHg=', 'EOF' ]); - console.log('ok'); - }); -} diff --git a/test/simple/test-stream2-readable-from-list.js b/test/simple/test-stream2-readable-from-list.js index 7c96ffe..04a96f5 100644 --- a/test/simple/test-stream2-readable-from-list.js +++ b/test/simple/test-stream2-readable-from-list.js @@ -21,7 +21,7 @@ var assert = require('assert'); var common = require('../common.js'); -var fromList = require('_stream_readable')._fromList; +var fromList = require('../../lib/_stream_readable')._fromList; // tiny node-tap lookalike. 
var tests = []; diff --git a/test/simple/test-stream2-readable-legacy-drain.js b/test/simple/test-stream2-readable-legacy-drain.js index 675da8e..51fd3d5 100644 --- a/test/simple/test-stream2-readable-legacy-drain.js +++ b/test/simple/test-stream2-readable-legacy-drain.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Stream = require('stream'); +var Stream = require('../../'); var Readable = Stream.Readable; var r = new Readable(); diff --git a/test/simple/test-stream2-readable-non-empty-end.js b/test/simple/test-stream2-readable-non-empty-end.js index 7314ae7..c971898 100644 --- a/test/simple/test-stream2-readable-non-empty-end.js +++ b/test/simple/test-stream2-readable-non-empty-end.js @@ -21,7 +21,7 @@ var assert = require('assert'); var common = require('../common.js'); -var Readable = require('_stream_readable'); +var Readable = require('../../lib/_stream_readable'); var len = 0; var chunks = new Array(10); diff --git a/test/simple/test-stream2-readable-wrap-empty.js b/test/simple/test-stream2-readable-wrap-empty.js index 2e5cf25..fd8a3dc 100644 --- a/test/simple/test-stream2-readable-wrap-empty.js +++ b/test/simple/test-stream2-readable-wrap-empty.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); +var Readable = require('../../lib/_stream_readable'); var EE = require('events').EventEmitter; var oldStream = new EE(); diff --git a/test/simple/test-stream2-readable-wrap.js b/test/simple/test-stream2-readable-wrap.js index 90eea01..6b177f7 100644 --- a/test/simple/test-stream2-readable-wrap.js +++ b/test/simple/test-stream2-readable-wrap.js @@ -22,8 +22,8 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var EE = require('events').EventEmitter; var testRuns = 0, completedRuns = 0; diff --git a/test/simple/test-stream2-set-encoding.js b/test/simple/test-stream2-set-encoding.js index 5d2c32a..685531b 100644 --- a/test/simple/test-stream2-set-encoding.js +++ b/test/simple/test-stream2-set-encoding.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var util = require('util'); // tiny node-tap lookalike. diff --git a/test/simple/test-stream2-transform.js b/test/simple/test-stream2-transform.js index 9c9ddd8..a0cacc6 100644 --- a/test/simple/test-stream2-transform.js +++ b/test/simple/test-stream2-transform.js @@ -21,8 +21,8 @@ var assert = require('assert'); var common = require('../common.js'); -var PassThrough = require('_stream_passthrough'); -var Transform = require('_stream_transform'); +var PassThrough = require('../../').PassThrough; +var Transform = require('../../').Transform; // tiny node-tap lookalike. 
var tests = []; diff --git a/test/simple/test-stream2-unpipe-drain.js b/test/simple/test-stream2-unpipe-drain.js index d66dc3c..365b327 100644 --- a/test/simple/test-stream2-unpipe-drain.js +++ b/test/simple/test-stream2-unpipe-drain.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var crypto = require('crypto'); var util = require('util'); diff --git a/test/simple/test-stream2-unpipe-leak.js b/test/simple/test-stream2-unpipe-leak.js index 99f8746..17c92ae 100644 --- a/test/simple/test-stream2-unpipe-leak.js +++ b/test/simple/test-stream2-unpipe-leak.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var chunk = new Buffer('hallo'); diff --git a/test/simple/test-stream2-writable.js b/test/simple/test-stream2-writable.js index 704100c..209c3a6 100644 --- a/test/simple/test-stream2-writable.js +++ b/test/simple/test-stream2-writable.js @@ -20,8 +20,8 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var W = require('_stream_writable'); -var D = require('_stream_duplex'); +var W = require('../../').Writable; +var D = require('../../').Duplex; var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream3-pause-then-read.js b/test/simple/test-stream3-pause-then-read.js index b91bde3..2f72c15 100644 --- a/test/simple/test-stream3-pause-then-read.js +++ b/test/simple/test-stream3-pause-then-read.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var Readable = stream.Readable; var Writable = stream.Writable; ././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000021300000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000327212631326456032064 0ustar 00000000000000{ "name": "readable-stream", "version": "1.1.13", "description": "Streams3, a user-land copy of the stream library from Node.js v0.11.x", "main": "readable.js", "dependencies": { "core-util-is": "~1.0.0", "isarray": "0.0.1", "string_decoder": "~0.10.x", "inherits": "~2.0.1" }, "devDependencies": { "tap": "~0.2.6" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/readable-stream.git" }, "keywords": [ "readable", "stream", "pipe" ], "browser": { "util": false }, "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "gitHead": "3b672fd7ae92acf5b4ffdbabf74b372a0a56b051", "bugs": { "url": "https://github.com/isaacs/readable-stream/issues" }, "homepage": "https://github.com/isaacs/readable-stream", "_id": "readable-stream@1.1.13", "_shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e", "_from": "readable-stream@>=1.1.13 <2.0.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e", "tarball": "http://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000021500000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/passthrough.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000007112631326456032056 0ustar 00000000000000module.exports = require("./lib/_stream_passthrough.js") ././@LongLink0000000000000000000000000000021200000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/readable.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000055112631326456032061 0ustar 00000000000000exports = module.exports = require('./lib/_stream_readable.js'); exports.Stream = require('stream'); exports.Readable = exports; exports.Writable = require('./lib/_stream_writable.js'); exports.Duplex = require('./lib/_stream_duplex.js'); exports.Transform = require('./lib/_stream_transform.js'); exports.PassThrough = require('./lib/_stream_passthrough.js'); ././@LongLink0000000000000000000000000000021300000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/transform.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000006712631326456032063 0ustar 00000000000000module.exports = require("./lib/_stream_transform.js") ././@LongLink0000000000000000000000000000021200000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/writable.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000006612631326456032062 0ustar 00000000000000module.exports = require("./lib/_stream_writable.js") ././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_duplex.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000537312631326456032070 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. 
// // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // a duplex stream is just a stream that is both readable and writable. // Since JS doesn't have multiple prototypal inheritance, this class // prototypally inherits from Readable, and then parasitically from // Writable. module.exports = Duplex; /**/ var objectKeys = Object.keys || function (obj) { var keys = []; for (var key in obj) keys.push(key); return keys; } /**/ /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var Readable = require('./_stream_readable'); var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); forEach(objectKeys(Writable.prototype), function(method) { if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method]; }); function Duplex(options) { if (!(this instanceof Duplex)) return new Duplex(options); Readable.call(this, options); Writable.call(this, options); if (options && options.readable === false) this.readable = false; if (options && options.writable === false) this.writable = false; this.allowHalfOpen = true; if (options && options.allowHalfOpen === false) this.allowHalfOpen = false; this.once('end', onend); } // the no-half-open enforcer function onend() { // if we allow half-open state, or if the writable side ended, // then we're ok. if (this.allowHalfOpen || this._writableState.ended) return; // no more data can be written. // But allow more writes to happen in this tick. process.nextTick(this.end.bind(this)); } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } ././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_passthrough.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000327712631326456032071 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. 
// // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // a passthrough stream. // basically just the most minimal sort of Transform stream. // Every written chunk gets output as-is. module.exports = PassThrough; var Transform = require('./_stream_transform'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(PassThrough, Transform); function PassThrough(options) { if (!(this instanceof PassThrough)) return new PassThrough(options); Transform.call(this, options); } PassThrough.prototype._transform = function(chunk, encoding, cb) { cb(null, chunk); }; ././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_readable.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000006254712631326456032076 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. 
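// Usage sketch (illustrative comment only; the example below is not part of
// the original file): a minimal Readable supplies _read(n) and hands data to
// the internal buffer via push(), pushing null to signal EOF.
//
//   var Readable = require('./_stream_readable');
//   var r = new Readable();
//   var sent = false;
//   r._read = function (n) {
//     this.push(sent ? null : new Buffer('hello'));
//     sent = true;
//   };
//   r.on('data', function (chunk) { console.log(chunk.toString()); });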
module.exports = Readable; /**/ var isArray = require('isarray'); /**/ /**/ var Buffer = require('buffer').Buffer; /**/ Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; /**/ if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { return emitter.listeners(type).length; }; /**/ var Stream = require('stream'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var StringDecoder; /**/ var debug = require('util'); if (debug && debug.debuglog) { debug = debug.debuglog('stream'); } else { debug = function () {}; } /**/ util.inherits(Readable, Stream); function ReadableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // the point at which it stops calling _read() to fill the buffer // Note: 0 is a valid value, means "don't call _read preemptively ever" var hwm = options.highWaterMark; var defaultHwm = options.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.buffer = []; this.length = 0; this.pipes = null; this.pipesCount = 0; this.flowing = null; this.ended = false; this.endEmitted = false; this.reading = false; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // whenever we return null, then we set a flag to say // that we're awaiting a 'readable' event emission. this.needReadable = false; this.emittedReadable = false; this.readableListening = false; // object stream flag. Used to make read(n) ignore n and to // make all the buffer merging and length checks go away this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.readableObjectMode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // when piping, we only care about 'readable' events that happen // after read()ing all the bytes and not getting any pushback. this.ranOut = false; // the number of writers that are awaiting a drain event in .pipe()s this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled this.readingMore = false; this.decoder = null; this.encoding = null; if (options.encoding) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this.decoder = new StringDecoder(options.encoding); this.encoding = options.encoding; } } function Readable(options) { var Duplex = require('./_stream_duplex'); if (!(this instanceof Readable)) return new Readable(options); this._readableState = new ReadableState(options, this); // legacy this.readable = true; Stream.call(this); } // Manually shove something into the read() buffer. // This returns true if the highWaterMark has not been hit yet, // similar to how Writable.write() returns true if you should // write() some more. 
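// For example (illustrative; `source` stands for some upstream producer and
// is not defined in this file):
//
//   if (!readable.push(chunk)) {
//     // Buffer is at or above highWaterMark: stop producing until the
//     // next _read() call asks for more.
//     source.pause();
//   }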
Readable.prototype.push = function(chunk, encoding) { var state = this._readableState; if (util.isString(chunk) && !state.objectMode) { encoding = encoding || state.defaultEncoding; if (encoding !== state.encoding) { chunk = new Buffer(chunk, encoding); encoding = ''; } } return readableAddChunk(this, state, chunk, encoding, false); }; // Unshift should *always* be something directly out of read() Readable.prototype.unshift = function(chunk) { var state = this._readableState; return readableAddChunk(this, state, chunk, '', true); }; function readableAddChunk(stream, state, chunk, encoding, addToFront) { var er = chunkInvalid(state, chunk); if (er) { stream.emit('error', er); } else if (util.isNullOrUndefined(chunk)) { state.reading = false; if (!state.ended) onEofChunk(stream, state); } else if (state.objectMode || chunk && chunk.length > 0) { if (state.ended && !addToFront) { var e = new Error('stream.push() after EOF'); stream.emit('error', e); } else if (state.endEmitted && addToFront) { var e = new Error('stream.unshift() after end event'); stream.emit('error', e); } else { if (state.decoder && !addToFront && !encoding) chunk = state.decoder.write(chunk); if (!addToFront) state.reading = false; // if we want the data now, just emit it. if (state.flowing && state.length === 0 && !state.sync) { stream.emit('data', chunk); stream.read(0); } else { // update the buffer info. state.length += state.objectMode ? 1 : chunk.length; if (addToFront) state.buffer.unshift(chunk); else state.buffer.push(chunk); if (state.needReadable) emitReadable(stream); } maybeReadMore(stream, state); } } else if (!addToFront) { state.reading = false; } return needMoreData(state); } // if it's past the high water mark, we can push in some more. // Also, if we have no data yet, we can stand some // more bytes. This is to work around cases where hwm=0, // such as the repl. Also, if the push() triggered a // readable event, and the user called read(largeNumber) such that // needReadable was set, then we ought to push more, so that another // 'readable' event will be triggered. function needMoreData(state) { return !state.ended && (state.needReadable || state.length < state.highWaterMark || state.length === 0); } // backwards compatibility. Readable.prototype.setEncoding = function(enc) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this._readableState.decoder = new StringDecoder(enc); this._readableState.encoding = enc; return this; }; // Don't raise the hwm > 128MB var MAX_HWM = 0x800000; function roundUpToNextPowerOf2(n) { if (n >= MAX_HWM) { n = MAX_HWM; } else { // Get the next highest power of 2 n--; for (var p = 1; p < 32; p <<= 1) n |= n >> p; n++; } return n; } function howMuchToRead(n, state) { if (state.length === 0 && state.ended) return 0; if (state.objectMode) return n === 0 ? 0 : 1; if (isNaN(n) || util.isNull(n)) { // only flow one buffer at a time if (state.flowing && state.buffer.length) return state.buffer[0].length; else return state.length; } if (n <= 0) return 0; // If we're asking for more than the target buffer level, // then raise the water mark. Bump up to the next highest // power of 2, to prevent increasing it excessively in tiny // amounts. if (n > state.highWaterMark) state.highWaterMark = roundUpToNextPowerOf2(n); // don't have that much. return null, unless we've ended. 
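// (Worked example, for illustration: read(100) with only 40 bytes buffered
// on a stream that hasn't ended sets needReadable and returns 0 here, so
// read() hands back null unless _read() pushes more data synchronously.)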
if (n > state.length) { if (!state.ended) { state.needReadable = true; return 0; } else return state.length; } return n; } // you can override either this method, or the async _read(n) below. Readable.prototype.read = function(n) { debug('read', n); var state = this._readableState; var nOrig = n; if (!util.isNumber(n) || n > 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we // already have a bunch of data in the buffer, then just trigger // the 'readable' event and move on. if (n === 0 && state.needReadable && (state.length >= state.highWaterMark || state.ended)) { debug('read: emitReadable', state.length, state.ended); if (state.length === 0 && state.ended) endReadable(this); else emitReadable(this); return null; } n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up. if (n === 0 && state.ended) { if (state.length === 0) endReadable(this); return null; } // All the actual chunk generation logic needs to be // *below* the call to _read. The reason is that in certain // synthetic stream cases, such as passthrough streams, _read // may be a completely synchronous operation which may change // the state of the read buffer, providing enough data when // before there was *not* enough. // // So, the steps are: // 1. Figure out what the state of things will be after we do // a read from the buffer. // // 2. If that resulting state will trigger a _read, then call _read. // Note that this may be asynchronous, or synchronous. Yes, it is // deeply ugly to write APIs this way, but that still doesn't mean // that the Readable class should behave improperly, as streams are // designed to be sync/async agnostic. // Take note if the _read call is sync or async (ie, if the read call // has returned yet), so that we know whether or not it's safe to emit // 'readable' etc. // // 3. Actually pull the requested chunks out of the buffer and return. // if we need a readable event, then we need to do some reading. var doRead = state.needReadable; debug('need readable', doRead); // if we currently have less than the highWaterMark, then also read some if (state.length === 0 || state.length - n < state.highWaterMark) { doRead = true; debug('length less than watermark', doRead); } // however, if we've ended, then there's no point, and if we're already // reading, then it's unnecessary. if (state.ended || state.reading) { doRead = false; debug('reading or ended', doRead); } if (doRead) { debug('do read'); state.reading = true; state.sync = true; // if the length is currently zero, then we *need* a readable event. if (state.length === 0) state.needReadable = true; // call internal read method this._read(state.highWaterMark); state.sync = false; } // If _read pushed data synchronously, then `reading` will be false, // and we need to re-evaluate how much data we can return to the user. if (doRead && !state.reading) n = howMuchToRead(nOrig, state); var ret; if (n > 0) ret = fromList(n, state); else ret = null; if (util.isNull(ret)) { state.needReadable = true; n = 0; } state.length -= n; // If we have nothing in the buffer, then we want to know // as soon as we *do* get something into the buffer. if (state.length === 0 && !state.ended) state.needReadable = true; // If we tried to read() past the EOF, then emit end on the next tick. 
if (nOrig !== n && state.ended && state.length === 0) endReadable(this); if (!util.isNull(ret)) this.emit('data', ret); return ret; }; function chunkInvalid(state, chunk) { var er = null; if (!util.isBuffer(chunk) && !util.isString(chunk) && !util.isNullOrUndefined(chunk) && !state.objectMode) { er = new TypeError('Invalid non-string/buffer chunk'); } return er; } function onEofChunk(stream, state) { if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); state.length += state.objectMode ? 1 : chunk.length; } } state.ended = true; // emit 'readable' now to make sure it gets picked up. emitReadable(stream); } // Don't emit readable right away in sync mode, because this can trigger // another read() call => stack overflow. This way, it might trigger // a nextTick recursion warning, but that's not so bad. function emitReadable(stream) { var state = stream._readableState; state.needReadable = false; if (!state.emittedReadable) { debug('emitReadable', state.flowing); state.emittedReadable = true; if (state.sync) process.nextTick(function() { emitReadable_(stream); }); else emitReadable_(stream); } } function emitReadable_(stream) { debug('emit readable'); stream.emit('readable'); flow(stream); } // at this point, the user has presumably seen the 'readable' event, // and called read() to consume some data. that may have triggered // in turn another _read(n) call, in which case reading = true if // it's in progress. // However, if we're not ended, or reading, and the length < hwm, // then go ahead and try to read some more preemptively. function maybeReadMore(stream, state) { if (!state.readingMore) { state.readingMore = true; process.nextTick(function() { maybeReadMore_(stream, state); }); } } function maybeReadMore_(stream, state) { var len = state.length; while (!state.reading && !state.flowing && !state.ended && state.length < state.highWaterMark) { debug('maybeReadMore read 0'); stream.read(0); if (len === state.length) // didn't get any data, stop spinning. break; else len = state.length; } state.readingMore = false; } // abstract method. to be overridden in specific implementation classes. // call cb(er, data) where data is <= n in length. // for virtual (non-string, non-buffer) streams, "length" is somewhat // arbitrary, and perhaps not very meaningful. Readable.prototype._read = function(n) { this.emit('error', new Error('not implemented')); }; Readable.prototype.pipe = function(dest, pipeOpts) { var src = this; var state = this._readableState; switch (state.pipesCount) { case 0: state.pipes = dest; break; case 1: state.pipes = [state.pipes, dest]; break; default: state.pipes.push(dest); break; } state.pipesCount += 1; debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts); var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr; var endFn = doEnd ? onend : cleanup; if (state.endEmitted) process.nextTick(endFn); else src.once('end', endFn); dest.on('unpipe', onunpipe); function onunpipe(readable) { debug('onunpipe'); if (readable === src) { cleanup(); } } function onend() { debug('onend'); dest.end(); } // when the dest drains, it reduces the awaitDrain counter // on the source. This would be more elegant with a .once() // handler in flow(), but adding and removing repeatedly is // too slow. 
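// (Hedged summary of the wiring below, descriptive only: every
// dest.write() that returns false bumps src._readableState.awaitDrain
// and pauses the source; each 'drain' event from a destination
// decrements the counter via pipeOnDrain, and flow only restarts once
// the counter reaches zero, so all pending destinations must drain
// before the source resumes.)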
var ondrain = pipeOnDrain(src); dest.on('drain', ondrain); function cleanup() { debug('cleanup'); // cleanup event handlers once the pipe is broken dest.removeListener('close', onclose); dest.removeListener('finish', onfinish); dest.removeListener('drain', ondrain); dest.removeListener('error', onerror); dest.removeListener('unpipe', onunpipe); src.removeListener('end', onend); src.removeListener('end', cleanup); src.removeListener('data', ondata); // if the reader is waiting for a drain event from this // specific writer, then it would cause it to never start // flowing again. // So, if this is awaiting a drain, then we just call it now. // If we don't know, then assume that we are waiting for one. if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain(); } src.on('data', ondata); function ondata(chunk) { debug('ondata'); var ret = dest.write(chunk); if (false === ret) { debug('false write response, pause', src._readableState.awaitDrain); src._readableState.awaitDrain++; src.pause(); } } // if the dest has an error, then stop piping into it. // however, don't suppress the throwing behavior for this. function onerror(er) { debug('onerror', er); unpipe(); dest.removeListener('error', onerror); if (EE.listenerCount(dest, 'error') === 0) dest.emit('error', er); } // This is a brutally ugly hack to make sure that our error handler // is attached before any userland ones. NEVER DO THIS. if (!dest._events || !dest._events.error) dest.on('error', onerror); else if (isArray(dest._events.error)) dest._events.error.unshift(onerror); else dest._events.error = [onerror, dest._events.error]; // Both close and finish should trigger unpipe, but only once. function onclose() { dest.removeListener('finish', onfinish); unpipe(); } dest.once('close', onclose); function onfinish() { debug('onfinish'); dest.removeListener('close', onclose); unpipe(); } dest.once('finish', onfinish); function unpipe() { debug('unpipe'); src.unpipe(dest); } // tell the dest that it's being piped to dest.emit('pipe', src); // start the flow if it hasn't been started already. if (!state.flowing) { debug('pipe resume'); src.resume(); } return dest; }; function pipeOnDrain(src) { return function() { var state = src._readableState; debug('pipeOnDrain', state.awaitDrain); if (state.awaitDrain) state.awaitDrain--; if (state.awaitDrain === 0 && EE.listenerCount(src, 'data')) { state.flowing = true; flow(src); } }; } Readable.prototype.unpipe = function(dest) { var state = this._readableState; // if we're not piping anywhere, then do nothing. if (state.pipesCount === 0) return this; // just one destination. most common case. if (state.pipesCount === 1) { // passed in one, but it's not the right one. if (dest && dest !== state.pipes) return this; if (!dest) dest = state.pipes; // got a match. state.pipes = null; state.pipesCount = 0; state.flowing = false; if (dest) dest.emit('unpipe', this); return this; } // slow case. multiple pipe destinations. if (!dest) { // remove all. var dests = state.pipes; var len = state.pipesCount; state.pipes = null; state.pipesCount = 0; state.flowing = false; for (var i = 0; i < len; i++) dests[i].emit('unpipe', this); return this; } // try to find the right one. 
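// (Illustration of the slow path reached here, descriptive only: with
// src piped to [a, b, c], src.unpipe(b) falls through to the
// indexOf/splice below, and once only one destination remains the
// pipes array collapses back to a single reference.)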
var i = indexOf(state.pipes, dest); if (i === -1) return this; state.pipes.splice(i, 1); state.pipesCount -= 1; if (state.pipesCount === 1) state.pipes = state.pipes[0]; dest.emit('unpipe', this); return this; }; // set up data events if they are asked for // Ensure readable listeners eventually get something Readable.prototype.on = function(ev, fn) { var res = Stream.prototype.on.call(this, ev, fn); // If listening to data, and it has not explicitly been paused, // then call resume to start the flow of data on the next tick. if (ev === 'data' && false !== this._readableState.flowing) { this.resume(); } if (ev === 'readable' && this.readable) { var state = this._readableState; if (!state.readableListening) { state.readableListening = true; state.emittedReadable = false; state.needReadable = true; if (!state.reading) { var self = this; process.nextTick(function() { debug('readable nexttick read 0'); self.read(0); }); } else if (state.length) { emitReadable(this, state); } } } return res; }; Readable.prototype.addListener = Readable.prototype.on; // pause() and resume() are remnants of the legacy readable stream API // If the user uses them, then switch into old mode. Readable.prototype.resume = function() { var state = this._readableState; if (!state.flowing) { debug('resume'); state.flowing = true; if (!state.reading) { debug('resume read 0'); this.read(0); } resume(this, state); } return this; }; function resume(stream, state) { if (!state.resumeScheduled) { state.resumeScheduled = true; process.nextTick(function() { resume_(stream, state); }); } } function resume_(stream, state) { state.resumeScheduled = false; stream.emit('resume'); flow(stream); if (state.flowing && !state.reading) stream.read(0); } Readable.prototype.pause = function() { debug('call pause flowing=%j', this._readableState.flowing); if (false !== this._readableState.flowing) { debug('pause'); this._readableState.flowing = false; this.emit('pause'); } return this; }; function flow(stream) { var state = stream._readableState; debug('flow', state.flowing); if (state.flowing) { do { var chunk = stream.read(); } while (null !== chunk && state.flowing); } } // wrap an old-style stream as the async data source. // This is *not* part of the readable stream interface. // It is an ugly unfortunate mess of history. Readable.prototype.wrap = function(stream) { var state = this._readableState; var paused = false; var self = this; stream.on('end', function() { debug('wrapped end'); if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) self.push(chunk); } self.push(null); }); stream.on('data', function(chunk) { debug('wrapped data'); if (state.decoder) chunk = state.decoder.write(chunk); if (!chunk || !state.objectMode && !chunk.length) return; var ret = self.push(chunk); if (!ret) { paused = true; stream.pause(); } }); // proxy all the other methods. // important when wrapping filters and duplexes. for (var i in stream) { if (util.isFunction(stream[i]) && util.isUndefined(this[i])) { this[i] = function(method) { return function() { return stream[method].apply(stream, arguments); }}(i); } } // proxy certain important events. var events = ['error', 'close', 'destroy', 'pause', 'resume']; forEach(events, function(ev) { stream.on(ev, self.emit.bind(self, ev)); }); // when we try to consume some more bytes, simply unpause the // underlying stream. 
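// (Usage sketch for wrap(): `oldStream` is an assumed old-style
// emitter, illustration only, not part of this module.
//   var r = new Readable();
//   r.wrap(oldStream);       // proxies data/end/pause/resume
//   r.pipe(process.stdout);
// )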
self._read = function(n) { debug('wrapped _read', n); if (paused) { paused = false; stream.resume(); } }; return self; }; // exposed for testing purposes only. Readable._fromList = fromList; // Pluck off n bytes from an array of buffers. // Length is the combined lengths of all the buffers in the list. function fromList(n, state) { var list = state.buffer; var length = state.length; var stringMode = !!state.decoder; var objectMode = !!state.objectMode; var ret; // nothing in the list, definitely empty. if (list.length === 0) return null; if (length === 0) ret = null; else if (objectMode) ret = list.shift(); else if (!n || n >= length) { // read it all, truncate the array. if (stringMode) ret = list.join(''); else ret = Buffer.concat(list, length); list.length = 0; } else { // read just some of it. if (n < list[0].length) { // just take a part of the first list item. // slice is the same for buffers and strings. var buf = list[0]; ret = buf.slice(0, n); list[0] = buf.slice(n); } else if (n === list[0].length) { // first list is a perfect match ret = list.shift(); } else { // complex case. // we have enough to cover it, but it spans past the first buffer. if (stringMode) ret = ''; else ret = new Buffer(n); var c = 0; for (var i = 0, l = list.length; i < l && c < n; i++) { var buf = list[0]; var cpy = Math.min(n - c, buf.length); if (stringMode) ret += buf.slice(0, cpy); else buf.copy(ret, c, 0, cpy); if (cpy < buf.length) list[0] = buf.slice(cpy); else list.shift(); c += cpy; } } } return ret; } function endReadable(stream) { var state = stream._readableState; // If we get here before consuming all the bytes, then that is a // bug in node. Should never happen. if (state.length > 0) throw new Error('endReadable called on non-empty stream'); if (!state.endEmitted) { state.ended = true; process.nextTick(function() { // Check that we didn't get one last unshift. if (!state.endEmitted && state.length === 0) { state.endEmitted = true; stream.readable = false; stream.emit('end'); } }); } } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } function indexOf (xs, x) { for (var i = 0, l = xs.length; i < l; i++) { if (xs[i] === x) return i; } return -1; } ././@LongLink0000000000000000000000000000022700000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_transform.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000001626612631326456032073 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // a transform stream is a readable/writable stream where you do // something with the data. Sometimes it's called a "filter", // but that's not a great name for it, since that implies a thing where // some bits pass through, and others are simply ignored. (That would // be a valid example of a transform, of course.) // // While the output is causally related to the input, it's not a // necessarily symmetric or synchronous transformation. For example, // a zlib stream might take multiple plain-text writes(), and then // emit a single compressed chunk some time in the future. // // Here's how this works: // // The Transform stream has all the aspects of the readable and writable // stream classes. When you write(chunk), that calls _write(chunk,cb) // internally, and returns false if there's a lot of pending writes // buffered up. When you call read(), that calls _read(n) until // there's enough pending readable data buffered up. // // In a transform stream, the written data is placed in a buffer. When // _read(n) is called, it transforms the queued up data, calling the // buffered _write cb's as it consumes chunks. If consuming a single // written chunk would result in multiple output chunks, then the first // outputted bit calls the readcb, and subsequent chunks just go into // the read buffer, and will cause it to emit 'readable' if necessary. // // This way, back-pressure is actually determined by the reading side, // since _read has to be called to start processing a new chunk. However, // a pathological inflate type of transform can cause excessive buffering // here. For example, imagine a stream where every byte of input is // interpreted as an integer from 0-255, and then results in that many // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in // 1kb of data being output. In this case, you could write a very small // amount of input, and end up with a very large amount of output. In // such a pathological inflating mechanism, there'd be no way to tell // the system to stop doing the transform. A single 4MB write could // cause the system to run out of memory. // // However, even in such a pathological case, only a single written chunk // would be consumed, and then the rest would wait (un-transformed) until // the results of the previous transformed chunk were consumed. 
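// A minimal sketch of the machinery described above, assuming this
// module is loaded the usual way; `Upper` and the stdin/stdout wiring
// are illustrative assumptions, not part of this file:
//
//   var Transform = require('./_stream_transform');
//   var util = require('util');
//
//   function Upper(options) {
//     if (!(this instanceof Upper)) return new Upper(options);
//     Transform.call(this, options);
//   }
//   util.inherits(Upper, Transform);
//
//   Upper.prototype._transform = function(chunk, encoding, cb) {
//     this.push(chunk.toString().toUpperCase()); // zero or more pushes
//     cb();                                      // done with this chunk
//   };
//
//   process.stdin.pipe(new Upper()).pipe(process.stdout);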
module.exports = Transform; var Duplex = require('./_stream_duplex'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(Transform, Duplex); function TransformState(options, stream) { this.afterTransform = function(er, data) { return afterTransform(stream, er, data); }; this.needTransform = false; this.transforming = false; this.writecb = null; this.writechunk = null; } function afterTransform(stream, er, data) { var ts = stream._transformState; ts.transforming = false; var cb = ts.writecb; if (!cb) return stream.emit('error', new Error('no writecb in Transform class')); ts.writechunk = null; ts.writecb = null; if (!util.isNullOrUndefined(data)) stream.push(data); if (cb) cb(er); var rs = stream._readableState; rs.reading = false; if (rs.needReadable || rs.length < rs.highWaterMark) { stream._read(rs.highWaterMark); } } function Transform(options) { if (!(this instanceof Transform)) return new Transform(options); Duplex.call(this, options); this._transformState = new TransformState(options, this); // when the writable side finishes, then flush out anything remaining. var stream = this; // start out asking for a readable event once data is transformed. this._readableState.needReadable = true; // we have implemented the _read method, and done the other things // that Readable wants before the first _read call, so unset the // sync guard flag. this._readableState.sync = false; this.once('prefinish', function() { if (util.isFunction(this._flush)) this._flush(function(er) { done(stream, er); }); else done(stream); }); } Transform.prototype.push = function(chunk, encoding) { this._transformState.needTransform = false; return Duplex.prototype.push.call(this, chunk, encoding); }; // This is the part where you do stuff! // override this function in implementation classes. // 'chunk' is an input chunk. // // Call `push(newChunk)` to pass along transformed output // to the readable side. You may call 'push' zero or more times. // // Call `cb(err)` when you are done with this chunk. If you pass // an error, then that'll put the hurt on the whole operation. If you // never call cb(), then you'll never get another chunk. Transform.prototype._transform = function(chunk, encoding, cb) { throw new Error('not implemented'); }; Transform.prototype._write = function(chunk, encoding, cb) { var ts = this._transformState; ts.writecb = cb; ts.writechunk = chunk; ts.writeencoding = encoding; if (!ts.transforming) { var rs = this._readableState; if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark); } }; // Doesn't matter what the args are here. // _transform does all the work. // That we got here means that the readable side wants more data. Transform.prototype._read = function(n) { var ts = this._transformState; if (!util.isNull(ts.writechunk) && ts.writecb && !ts.transforming) { ts.transforming = true; this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform); } else { // mark that we need a transform, so that any data that comes in // will get processed, now that we've asked for it. 
ts.needTransform = true; } }; function done(stream, er) { if (er) return stream.emit('error', er); // if there's nothing in the write buffer, then that means // that nothing more will ever be provided var ws = stream._writableState; var ts = stream._transformState; if (ws.length) throw new Error('calling transform done when ws.length != 0'); if (ts.transforming) throw new Error('calling transform done when still transforming'); return stream.push(null); } ././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_writable.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000003141512631326456032064 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // A bit simpler than readable streams. // Implement an async ._write(chunk, cb), and it'll handle all // the drain event emission and buffering. module.exports = Writable; /**/ var Buffer = require('buffer').Buffer; /**/ Writable.WritableState = WritableState; /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var Stream = require('stream'); util.inherits(Writable, Stream); function WriteReq(chunk, encoding, cb) { this.chunk = chunk; this.encoding = encoding; this.callback = cb; } function WritableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // the point at which write() starts returning false // Note: 0 is a valid value, means that we always return false if // the entire buffer is not flushed immediately on write() var hwm = options.highWaterMark; var defaultHwm = options.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // object stream flag to indicate whether or not this stream // contains buffers or objects. this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.needDrain = false; // at the start of calling end() this.ending = false; // when end() has been called, and returned this.ended = false; // when 'finish' is emitted this.finished = false; // should we decode strings into buffers before passing to _write? 
// this is here so that some node-core streams can optimize string // handling at a lower level. var noDecode = options.decodeStrings === false; this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement // of how much we're waiting to get pushed to some underlying // socket or file. this.length = 0; // a flag to see when we're in the middle of a write. this.writing = false; // when true all writes will be buffered until .uncork() call this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // a flag to know if we're processing previously buffered items, which // may call the _write() callback in the same tick, so that we don't // end up in an overlapped onwrite situation. this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb) this.onwrite = function(er) { onwrite(stream, er); }; // the callback that the user supplies to write(chunk,encoding,cb) this.writecb = null; // the amount that is being written when _write is called. this.writelen = 0; this.buffer = []; // number of pending user-supplied write callbacks // this must be 0 before 'finish' can be emitted this.pendingcb = 0; // emit prefinish if the only thing we're waiting for is _write cbs // This is relevant for synchronous Transform streams this.prefinished = false; // True if the error was already emitted and should not be thrown again this.errorEmitted = false; } function Writable(options) { var Duplex = require('./_stream_duplex'); // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. if (!(this instanceof Writable) && !(this instanceof Duplex)) return new Writable(options); this._writableState = new WritableState(options, this); // legacy. this.writable = true; Stream.call(this); } // Otherwise people can pipe Writable streams, which is just wrong. Writable.prototype.pipe = function() { this.emit('error', new Error('Cannot pipe. Not readable.')); }; function writeAfterEnd(stream, state, cb) { var er = new Error('write after end'); // TODO: defer error events consistently everywhere, not just the cb stream.emit('error', er); process.nextTick(function() { cb(er); }); } // If we get something that is not a buffer, string, null, or undefined, // and we're not in objectMode, then that's an error. // Otherwise stream chunks are all considered to be of length=1, and the // watermarks determine how many objects to keep in the buffer, rather than // how many bytes or characters. 
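// (Worked example of the objectMode accounting just described,
// descriptive only: with { objectMode: true } the default
// highWaterMark is 16, so sixteen buffered objects, whatever their
// byte size, make write() return false, whereas a byte-mode stream
// compares state.length in bytes against a 16kb default.)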
function validChunk(stream, state, chunk, cb) { var valid = true; if (!util.isBuffer(chunk) && !util.isString(chunk) && !util.isNullOrUndefined(chunk) && !state.objectMode) { var er = new TypeError('Invalid non-string/buffer chunk'); stream.emit('error', er); process.nextTick(function() { cb(er); }); valid = false; } return valid; } Writable.prototype.write = function(chunk, encoding, cb) { var state = this._writableState; var ret = false; if (util.isFunction(encoding)) { cb = encoding; encoding = null; } if (util.isBuffer(chunk)) encoding = 'buffer'; else if (!encoding) encoding = state.defaultEncoding; if (!util.isFunction(cb)) cb = function() {}; if (state.ended) writeAfterEnd(this, state, cb); else if (validChunk(this, state, chunk, cb)) { state.pendingcb++; ret = writeOrBuffer(this, state, chunk, encoding, cb); } return ret; }; Writable.prototype.cork = function() { var state = this._writableState; state.corked++; }; Writable.prototype.uncork = function() { var state = this._writableState; if (state.corked) { state.corked--; if (!state.writing && !state.corked && !state.finished && !state.bufferProcessing && state.buffer.length) clearBuffer(this, state); } }; function decodeChunk(state, chunk, encoding) { if (!state.objectMode && state.decodeStrings !== false && util.isString(chunk)) { chunk = new Buffer(chunk, encoding); } return chunk; } // if we're already writing something, then just put this // in the queue, and wait our turn. Otherwise, call _write // If we return false, then we need a drain event, so set that flag. function writeOrBuffer(stream, state, chunk, encoding, cb) { chunk = decodeChunk(state, chunk, encoding); if (util.isBuffer(chunk)) encoding = 'buffer'; var len = state.objectMode ? 1 : chunk.length; state.length += len; var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false. 
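// (Hedged aside on the contract enforced below: a well-behaved
// producer stops writing on a false return and resumes on 'drain',
// e.g. the assumed helper
//   function writeAll(ws, chunks) {
//     while (chunks.length && ws.write(chunks.shift())) ;
//     if (chunks.length) ws.once('drain', function() { writeAll(ws, chunks); });
//   }
// which is illustration only, not part of this module.)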
if (!ret) state.needDrain = true; if (state.writing || state.corked) state.buffer.push(new WriteReq(chunk, encoding, cb)); else doWrite(stream, state, false, len, chunk, encoding, cb); return ret; } function doWrite(stream, state, writev, len, chunk, encoding, cb) { state.writelen = len; state.writecb = cb; state.writing = true; state.sync = true; if (writev) stream._writev(chunk, state.onwrite); else stream._write(chunk, encoding, state.onwrite); state.sync = false; } function onwriteError(stream, state, sync, er, cb) { if (sync) process.nextTick(function() { state.pendingcb--; cb(er); }); else { state.pendingcb--; cb(er); } stream._writableState.errorEmitted = true; stream.emit('error', er); } function onwriteStateUpdate(state) { state.writing = false; state.writecb = null; state.length -= state.writelen; state.writelen = 0; } function onwrite(stream, er) { var state = stream._writableState; var sync = state.sync; var cb = state.writecb; onwriteStateUpdate(state); if (er) onwriteError(stream, state, sync, er, cb); else { // Check if we're actually ready to finish, but don't emit yet var finished = needFinish(stream, state); if (!finished && !state.corked && !state.bufferProcessing && state.buffer.length) { clearBuffer(stream, state); } if (sync) { process.nextTick(function() { afterWrite(stream, state, finished, cb); }); } else { afterWrite(stream, state, finished, cb); } } } function afterWrite(stream, state, finished, cb) { if (!finished) onwriteDrain(stream, state); state.pendingcb--; cb(); finishMaybe(stream, state); } // Must force callback to be called on nextTick, so that we don't // emit 'drain' before the write() consumer gets the 'false' return // value, and has a chance to attach a 'drain' listener. function onwriteDrain(stream, state) { if (state.length === 0 && state.needDrain) { state.needDrain = false; stream.emit('drain'); } } // if there's something in the buffer waiting, then process it function clearBuffer(stream, state) { state.bufferProcessing = true; if (stream._writev && state.buffer.length > 1) { // Fast case, write everything using _writev() var cbs = []; for (var c = 0; c < state.buffer.length; c++) cbs.push(state.buffer[c].callback); // count the one we are adding, as well. // TODO(isaacs) clean this up state.pendingcb++; doWrite(stream, state, true, state.length, state.buffer, '', function(err) { for (var i = 0; i < cbs.length; i++) { state.pendingcb--; cbs[i](err); } }); // Clear buffer state.buffer = []; } else { // Slow case, write chunks one-by-one for (var c = 0; c < state.buffer.length; c++) { var entry = state.buffer[c]; var chunk = entry.chunk; var encoding = entry.encoding; var cb = entry.callback; var len = state.objectMode ? 1 : chunk.length; doWrite(stream, state, false, len, chunk, encoding, cb); // if we didn't call the onwrite immediately, then // it means that we need to wait until it does. // also, that means that the chunk and cb are currently // being processed, so move the buffer counter past them. 
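// (Aside on the _writev fast path above, assumed usage only: cork()
// is what typically lets writes pile up so _writev can flush them in
// a single call,
//   ws.cork(); ws.write(a); ws.write(b); ws.uncork(); // one _writev
// where ws, a and b are hypothetical.)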
if (state.writing) { c++; break; } } if (c < state.buffer.length) state.buffer = state.buffer.slice(c); else state.buffer.length = 0; } state.bufferProcessing = false; } Writable.prototype._write = function(chunk, encoding, cb) { cb(new Error('not implemented')); }; Writable.prototype._writev = null; Writable.prototype.end = function(chunk, encoding, cb) { var state = this._writableState; if (util.isFunction(chunk)) { cb = chunk; chunk = null; encoding = null; } else if (util.isFunction(encoding)) { cb = encoding; encoding = null; } if (!util.isNullOrUndefined(chunk)) this.write(chunk, encoding); // .end() fully uncorks if (state.corked) { state.corked = 1; this.uncork(); } // ignore unnecessary end() calls. if (!state.ending && !state.finished) endWritable(this, state, cb); }; function needFinish(stream, state) { return (state.ending && state.length === 0 && !state.finished && !state.writing); } function prefinish(stream, state) { if (!state.prefinished) { state.prefinished = true; stream.emit('prefinish'); } } function finishMaybe(stream, state) { var need = needFinish(stream, state); if (need) { if (state.pendingcb === 0) { prefinish(stream, state); state.finished = true; stream.emit('finish'); } else prefinish(stream, state); } return need; } function endWritable(stream, state, cb) { state.ending = true; finishMaybe(stream, state); if (cb) { if (state.finished) process.nextTick(cb); else stream.once('finish', cb); } state.ended = true; } ././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000023300000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000024200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000010312631326456032052 0ustar 00000000000000# core-util-is The `util.is*` functions introduced in Node v0.12. 
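A hedged usage sketch (return values assume Node's standard semantics):

```js
var util = require('core-util-is');

util.isString('npm');          // => true
util.isNullOrUndefined(null);  // => true
util.isBuffer(new Buffer(0));  // => true
```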
././@LongLink0000000000000000000000000000024400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/float.patchnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000003762612631326456032076 0ustar 00000000000000diff --git a/lib/util.js b/lib/util.js index a03e874..9074e8e 100644 --- a/lib/util.js +++ b/lib/util.js @@ -19,430 +19,6 @@ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. -var formatRegExp = /%[sdj%]/g; -exports.format = function(f) { - if (!isString(f)) { - var objects = []; - for (var i = 0; i < arguments.length; i++) { - objects.push(inspect(arguments[i])); - } - return objects.join(' '); - } - - var i = 1; - var args = arguments; - var len = args.length; - var str = String(f).replace(formatRegExp, function(x) { - if (x === '%%') return '%'; - if (i >= len) return x; - switch (x) { - case '%s': return String(args[i++]); - case '%d': return Number(args[i++]); - case '%j': - try { - return JSON.stringify(args[i++]); - } catch (_) { - return '[Circular]'; - } - default: - return x; - } - }); - for (var x = args[i]; i < len; x = args[++i]) { - if (isNull(x) || !isObject(x)) { - str += ' ' + x; - } else { - str += ' ' + inspect(x); - } - } - return str; -}; - - -// Mark that a method should not be used. -// Returns a modified function which warns once by default. -// If --no-deprecation is set, then it is a no-op. -exports.deprecate = function(fn, msg) { - // Allow for deprecating things in the process of starting up. - if (isUndefined(global.process)) { - return function() { - return exports.deprecate(fn, msg).apply(this, arguments); - }; - } - - if (process.noDeprecation === true) { - return fn; - } - - var warned = false; - function deprecated() { - if (!warned) { - if (process.throwDeprecation) { - throw new Error(msg); - } else if (process.traceDeprecation) { - console.trace(msg); - } else { - console.error(msg); - } - warned = true; - } - return fn.apply(this, arguments); - } - - return deprecated; -}; - - -var debugs = {}; -var debugEnviron; -exports.debuglog = function(set) { - if (isUndefined(debugEnviron)) - debugEnviron = process.env.NODE_DEBUG || ''; - set = set.toUpperCase(); - if (!debugs[set]) { - if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { - var pid = process.pid; - debugs[set] = function() { - var msg = exports.format.apply(exports, arguments); - console.error('%s %d: %s', set, pid, msg); - }; - } else { - debugs[set] = function() {}; - } - } - return debugs[set]; -}; - - -/** - * Echos the value of a value. Trys to print the value out - * in the best way possible given the different types. - * - * @param {Object} obj The object to print out. - * @param {Object} opts Optional options object that alters the output. - */ -/* legacy: obj, showHidden, depth, colors*/ -function inspect(obj, opts) { - // default options - var ctx = { - seen: [], - stylize: stylizeNoColor - }; - // legacy... - if (arguments.length >= 3) ctx.depth = arguments[2]; - if (arguments.length >= 4) ctx.colors = arguments[3]; - if (isBoolean(opts)) { - // legacy... 
- ctx.showHidden = opts; - } else if (opts) { - // got an "options" object - exports._extend(ctx, opts); - } - // set default options - if (isUndefined(ctx.showHidden)) ctx.showHidden = false; - if (isUndefined(ctx.depth)) ctx.depth = 2; - if (isUndefined(ctx.colors)) ctx.colors = false; - if (isUndefined(ctx.customInspect)) ctx.customInspect = true; - if (ctx.colors) ctx.stylize = stylizeWithColor; - return formatValue(ctx, obj, ctx.depth); -} -exports.inspect = inspect; - - -// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics -inspect.colors = { - 'bold' : [1, 22], - 'italic' : [3, 23], - 'underline' : [4, 24], - 'inverse' : [7, 27], - 'white' : [37, 39], - 'grey' : [90, 39], - 'black' : [30, 39], - 'blue' : [34, 39], - 'cyan' : [36, 39], - 'green' : [32, 39], - 'magenta' : [35, 39], - 'red' : [31, 39], - 'yellow' : [33, 39] -}; - -// Don't use 'blue' not visible on cmd.exe -inspect.styles = { - 'special': 'cyan', - 'number': 'yellow', - 'boolean': 'yellow', - 'undefined': 'grey', - 'null': 'bold', - 'string': 'green', - 'date': 'magenta', - // "name": intentionally not styling - 'regexp': 'red' -}; - - -function stylizeWithColor(str, styleType) { - var style = inspect.styles[styleType]; - - if (style) { - return '\u001b[' + inspect.colors[style][0] + 'm' + str + - '\u001b[' + inspect.colors[style][1] + 'm'; - } else { - return str; - } -} - - -function stylizeNoColor(str, styleType) { - return str; -} - - -function arrayToHash(array) { - var hash = {}; - - array.forEach(function(val, idx) { - hash[val] = true; - }); - - return hash; -} - - -function formatValue(ctx, value, recurseTimes) { - // Provide a hook for user-specified inspect functions. - // Check that value is an object with an inspect function on it - if (ctx.customInspect && - value && - isFunction(value.inspect) && - // Filter out the util module, it's inspect function is special - value.inspect !== exports.inspect && - // Also filter out any prototype objects using the circular check. - !(value.constructor && value.constructor.prototype === value)) { - var ret = value.inspect(recurseTimes, ctx); - if (!isString(ret)) { - ret = formatValue(ctx, ret, recurseTimes); - } - return ret; - } - - // Primitive types cannot have properties - var primitive = formatPrimitive(ctx, value); - if (primitive) { - return primitive; - } - - // Look up the keys of the object. - var keys = Object.keys(value); - var visibleKeys = arrayToHash(keys); - - if (ctx.showHidden) { - keys = Object.getOwnPropertyNames(value); - } - - // Some type of object without properties can be shortcutted. - if (keys.length === 0) { - if (isFunction(value)) { - var name = value.name ? ': ' + value.name : ''; - return ctx.stylize('[Function' + name + ']', 'special'); - } - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } - if (isDate(value)) { - return ctx.stylize(Date.prototype.toString.call(value), 'date'); - } - if (isError(value)) { - return formatError(value); - } - } - - var base = '', array = false, braces = ['{', '}']; - - // Make Array say that they are Array - if (isArray(value)) { - array = true; - braces = ['[', ']']; - } - - // Make functions say that they are functions - if (isFunction(value)) { - var n = value.name ? 
': ' + value.name : ''; - base = ' [Function' + n + ']'; - } - - // Make RegExps say that they are RegExps - if (isRegExp(value)) { - base = ' ' + RegExp.prototype.toString.call(value); - } - - // Make dates with properties first say the date - if (isDate(value)) { - base = ' ' + Date.prototype.toUTCString.call(value); - } - - // Make error with message first say the error - if (isError(value)) { - base = ' ' + formatError(value); - } - - if (keys.length === 0 && (!array || value.length == 0)) { - return braces[0] + base + braces[1]; - } - - if (recurseTimes < 0) { - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } else { - return ctx.stylize('[Object]', 'special'); - } - } - - ctx.seen.push(value); - - var output; - if (array) { - output = formatArray(ctx, value, recurseTimes, visibleKeys, keys); - } else { - output = keys.map(function(key) { - return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array); - }); - } - - ctx.seen.pop(); - - return reduceToSingleString(output, base, braces); -} - - -function formatPrimitive(ctx, value) { - if (isUndefined(value)) - return ctx.stylize('undefined', 'undefined'); - if (isString(value)) { - var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '') - .replace(/'/g, "\\'") - .replace(/\\"/g, '"') + '\''; - return ctx.stylize(simple, 'string'); - } - if (isNumber(value)) { - // Format -0 as '-0'. Strict equality won't distinguish 0 from -0, - // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 . - if (value === 0 && 1 / value < 0) - return ctx.stylize('-0', 'number'); - return ctx.stylize('' + value, 'number'); - } - if (isBoolean(value)) - return ctx.stylize('' + value, 'boolean'); - // For some reason typeof null is "object", so special case here. 
- if (isNull(value)) - return ctx.stylize('null', 'null'); -} - - -function formatError(value) { - return '[' + Error.prototype.toString.call(value) + ']'; -} - - -function formatArray(ctx, value, recurseTimes, visibleKeys, keys) { - var output = []; - for (var i = 0, l = value.length; i < l; ++i) { - if (hasOwnProperty(value, String(i))) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - String(i), true)); - } else { - output.push(''); - } - } - keys.forEach(function(key) { - if (!key.match(/^\d+$/)) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - key, true)); - } - }); - return output; -} - - -function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) { - var name, str, desc; - desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] }; - if (desc.get) { - if (desc.set) { - str = ctx.stylize('[Getter/Setter]', 'special'); - } else { - str = ctx.stylize('[Getter]', 'special'); - } - } else { - if (desc.set) { - str = ctx.stylize('[Setter]', 'special'); - } - } - if (!hasOwnProperty(visibleKeys, key)) { - name = '[' + key + ']'; - } - if (!str) { - if (ctx.seen.indexOf(desc.value) < 0) { - if (isNull(recurseTimes)) { - str = formatValue(ctx, desc.value, null); - } else { - str = formatValue(ctx, desc.value, recurseTimes - 1); - } - if (str.indexOf('\n') > -1) { - if (array) { - str = str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n').substr(2); - } else { - str = '\n' + str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n'); - } - } - } else { - str = ctx.stylize('[Circular]', 'special'); - } - } - if (isUndefined(name)) { - if (array && key.match(/^\d+$/)) { - return str; - } - name = JSON.stringify('' + key); - if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) { - name = name.substr(1, name.length - 2); - name = ctx.stylize(name, 'name'); - } else { - name = name.replace(/'/g, "\\'") - .replace(/\\"/g, '"') - .replace(/(^"|"$)/g, "'"); - name = ctx.stylize(name, 'string'); - } - } - - return name + ': ' + str; -} - - -function reduceToSingleString(output, base, braces) { - var numLinesEst = 0; - var length = output.reduce(function(prev, cur) { - numLinesEst++; - if (cur.indexOf('\n') >= 0) numLinesEst++; - return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1; - }, 0); - - if (length > 60) { - return braces[0] + - (base === '' ? '' : base + '\n ') + - ' ' + - output.join(',\n ') + - ' ' + - braces[1]; - } - - return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1]; -} - - // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. function isArray(ar) { @@ -522,166 +98,10 @@ function isPrimitive(arg) { exports.isPrimitive = isPrimitive; function isBuffer(arg) { - return arg instanceof Buffer; + return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); -} - - -function pad(n) { - return n < 10 ? 
'0' + n.toString(10) : n.toString(10); -} - - -var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', - 'Oct', 'Nov', 'Dec']; - -// 26 Feb 16:19:34 -function timestamp() { - var d = new Date(); - var time = [pad(d.getHours()), - pad(d.getMinutes()), - pad(d.getSeconds())].join(':'); - return [d.getDate(), months[d.getMonth()], time].join(' '); -} - - -// log is just a thin wrapper to console.log that prepends a timestamp -exports.log = function() { - console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments)); -}; - - -/** - * Inherit the prototype methods from one constructor into another. - * - * The Function.prototype.inherits from lang.js rewritten as a standalone - * function (not on Function.prototype). NOTE: If this file is to be loaded - * during bootstrapping this function needs to be rewritten using some native - * functions as prototype setup using normal JavaScript does not work as - * expected during bootstrapping (see mirror.js in r114903). - * - * @param {function} ctor Constructor function which needs to inherit the - * prototype. - * @param {function} superCtor Constructor function to inherit prototype from. - */ -exports.inherits = function(ctor, superCtor) { - ctor.super_ = superCtor; - ctor.prototype = Object.create(superCtor.prototype, { - constructor: { - value: ctor, - enumerable: false, - writable: true, - configurable: true - } - }); -}; - -exports._extend = function(origin, add) { - // Don't do anything if add isn't an object - if (!add || !isObject(add)) return origin; - - var keys = Object.keys(add); - var i = keys.length; - while (i--) { - origin[keys[i]] = add[keys[i]]; - } - return origin; -}; - -function hasOwnProperty(obj, prop) { - return Object.prototype.hasOwnProperty.call(obj, prop); -} - - -// Deprecated old stuff. 
- -exports.p = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - console.error(exports.inspect(arguments[i])); - } -}, 'util.p: Use console.error() instead'); - - -exports.exec = exports.deprecate(function() { - return require('child_process').exec.apply(this, arguments); -}, 'util.exec is now called `child_process.exec`.'); - - -exports.print = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(String(arguments[i])); - } -}, 'util.print: Use console.log instead'); - - -exports.puts = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(arguments[i] + '\n'); - } -}, 'util.puts: Use console.log instead'); - - -exports.debug = exports.deprecate(function(x) { - process.stderr.write('DEBUG: ' + x + '\n'); -}, 'util.debug: Use console.error instead'); - - -exports.error = exports.deprecate(function(x) { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stderr.write(arguments[i] + '\n'); - } -}, 'util.error: Use console.error instead'); - - -exports.pump = exports.deprecate(function(readStream, writeStream, callback) { - var callbackCalled = false; - - function call(a, b, c) { - if (callback && !callbackCalled) { - callback(a, b, c); - callbackCalled = true; - } - } - - readStream.addListener('data', function(chunk) { - if (writeStream.write(chunk) === false) readStream.pause(); - }); - - writeStream.addListener('drain', function() { - readStream.resume(); - }); - - readStream.addListener('end', function() { - writeStream.end(); - }); - - readStream.addListener('close', function() { - call(); - }); - - readStream.addListener('error', function(err) { - writeStream.end(); - call(err); - }); - - writeStream.addListener('error', function(err) { - readStream.destroy(); - call(err); - }); -}, 'util.pump(): Use readableStream.pipe() instead'); - - -var uv; -exports._errnoException = function(err, syscall) { - if (isUndefined(uv)) uv = process.binding('uv'); - var errname = uv.errname(err); - var e = new Error(syscall + ' ' + errname); - e.code = errname; - e.errno = errname; - e.syscall = syscall; - return e; -}; +}././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/lib/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000024500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000250612631326456032063 0ustar 00000000000000{ "name": "core-util-is", "version": "1.0.1", "description": "The `util.is*` functions introduced in Node v0.12.", "main": "lib/util.js", "repository": { "type": "git", "url": "git://github.com/isaacs/core-util-is.git" }, "keywords": [ "util", "isBuffer", "isArray", "isNumber", "isString", "isRegExp", "isThis", "isThat", "polyfill" ], "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "bugs": { "url": "https://github.com/isaacs/core-util-is/issues" }, "readme": "# core-util-is\n\nThe `util.is*` functions introduced in Node v0.12.\n", "readmeFilename": "README.md", "homepage": "https://github.com/isaacs/core-util-is", "_id": "core-util-is@1.0.1", "dist": { "shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "tarball": "http://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz" }, "_from": "core-util-is@>=1.0.0 <1.1.0", "_npmVersion": "1.3.23", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz" } ././@LongLink0000000000000000000000000000024000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/util.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000570412631326456032066 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && objectToString(e) === '[object Error]'; } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return arg instanceof Buffer; } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); } ././@LongLink0000000000000000000000000000024400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/lib/util.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000574012631326456032066 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
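// Illustration of that fragility, a hedged aside rather than part of
// the module's API:
//   var fake = Object.create(Array.prototype);
//   fake instanceof Array;  // true: instanceof is fooled
//   Array.isArray(fake);    // false: the check used below is not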
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && (objectToString(e) === '[object Error]' || e instanceof Error); } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); }././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000302512631326456032060 0ustar 00000000000000 # isarray `Array#isArray` for older browsers. ## Usage ```js var isArray = require('isarray'); console.log(isArray([])); // => true console.log(isArray({})); // => false ``` ## Installation With [npm](http://npmjs.org) do ```bash $ npm install isarray ``` Then bundle for the browser with [browserify](https://github.com/substack/browserify). With [component](http://component.io) do ```bash $ component install juliangruber/isarray ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000023200000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/build/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000755000000000000000000000000012631326456032056 5ustar 00000000000000././@LongLink0000000000000000000000000000024200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/component.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000072612631326456032065 0ustar 00000000000000{ "name" : "isarray", "description" : "Array#isArray for older browsers", "version" : "0.0.1", "repository" : "juliangruber/isarray", "homepage": "https://github.com/juliangruber/isarray", "main" : "index.js", "scripts" : [ "index.js" ], "dependencies" : {}, "keywords": ["browser","isarray","array"], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT" } ././@LongLink0000000000000000000000000000023400000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000017012631326456032056 0ustar 00000000000000module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; ././@LongLink0000000000000000000000000000024000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000241012631326456032055 0ustar 00000000000000{ "name": "isarray", "description": "Array#isArray for older browsers", "version": "0.0.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/isarray.git" }, "homepage": "https://github.com/juliangruber/isarray", "main": "index.js", "scripts": { "test": "tap test/*.js" }, "dependencies": {}, "devDependencies": { "tap": "*" }, "keywords": [ "browser", "isarray", "array" ], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "_id": "isarray@0.0.1", "dist": { "shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "tarball": "http://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz" }, "_from": "isarray@0.0.1", "_npmVersion": "1.2.18", "_npmUser": { "name": "juliangruber", "email": "julian@juliangruber.com" }, "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" } ], "directories": {}, "_shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "_resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz", "bugs": { "url": "https://github.com/juliangruber/isarray/issues" }, 
"readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000024200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/build/build.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000777112631326456032074 0ustar 00000000000000 /** * Require the given path. * * @param {String} path * @return {Object} exports * @api public */ function require(path, parent, orig) { var resolved = require.resolve(path); // lookup failed if (null == resolved) { orig = orig || path; parent = parent || 'root'; var err = new Error('Failed to require "' + orig + '" from "' + parent + '"'); err.path = orig; err.parent = parent; err.require = true; throw err; } var module = require.modules[resolved]; // perform real require() // by invoking the module's // registered function if (!module.exports) { module.exports = {}; module.client = module.component = true; module.call(this, module.exports, require.relative(resolved), module); } return module.exports; } /** * Registered modules. */ require.modules = {}; /** * Registered aliases. */ require.aliases = {}; /** * Resolve `path`. * * Lookup: * * - PATH/index.js * - PATH.js * - PATH * * @param {String} path * @return {String} path or null * @api private */ require.resolve = function(path) { if (path.charAt(0) === '/') path = path.slice(1); var index = path + '/index.js'; var paths = [ path, path + '.js', path + '.json', path + '/index.js', path + '/index.json' ]; for (var i = 0; i < paths.length; i++) { var path = paths[i]; if (require.modules.hasOwnProperty(path)) return path; } if (require.aliases.hasOwnProperty(index)) { return require.aliases[index]; } }; /** * Normalize `path` relative to the current path. * * @param {String} curr * @param {String} path * @return {String} * @api private */ require.normalize = function(curr, path) { var segs = []; if ('.' != path.charAt(0)) return path; curr = curr.split('/'); path = path.split('/'); for (var i = 0; i < path.length; ++i) { if ('..' == path[i]) { curr.pop(); } else if ('.' != path[i] && '' != path[i]) { segs.push(path[i]); } } return curr.concat(segs).join('/'); }; /** * Register module at `path` with callback `definition`. * * @param {String} path * @param {Function} definition * @api private */ require.register = function(path, definition) { require.modules[path] = definition; }; /** * Alias a module definition. * * @param {String} from * @param {String} to * @api private */ require.alias = function(from, to) { if (!require.modules.hasOwnProperty(from)) { throw new Error('Failed to alias "' + from + '", it does not exist'); } require.aliases[to] = from; }; /** * Return a require function relative to the `parent` path. * * @param {String} parent * @return {Function} * @api private */ require.relative = function(parent) { var p = require.normalize(parent, '..'); /** * lastIndexOf helper. */ function lastIndexOf(arr, obj) { var i = arr.length; while (i--) { if (arr[i] === obj) return i; } return -1; } /** * The relative require() itself. */ function localRequire(path) { var resolved = localRequire.resolve(path); return require(resolved, parent, path); } /** * Resolve relative to the parent. */ localRequire.resolve = function(path) { var c = path.charAt(0); if ('/' == c) return path.slice(1); if ('.' 
== c) return require.normalize(p, path); // resolve deps by returning // the dep in the nearest "deps" // directory var segs = parent.split('/'); var i = lastIndexOf(segs, 'deps') + 1; if (!i) i = 0; path = segs.slice(0, i + 1).join('/') + '/deps/' + path; return path; }; /** * Check if module is defined at `path`. */ localRequire.exists = function(path) { return require.modules.hasOwnProperty(localRequire.resolve(path)); }; return localRequire; }; require.register("isarray/index.js", function(exports, require, module){ module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; }); require.alias("isarray/index.js", "isarray/index.js"); ././@LongLink0000000000000000000000000000024500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/.npmignorenpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000001312631326456032052 0ustar 00000000000000build test ././@LongLink0000000000000000000000000000024200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000206412631326456032062 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000024400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000076212631326456032065 0ustar 00000000000000**string_decoder.js** (`require('string_decoder')`) from Node.js core Copyright Joyent, Inc. and other Node contributors. See LICENCE file for details. Version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. 
**Prefer the stable version over the unstable.** The *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.././@LongLink0000000000000000000000000000024300000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000001716412631326456032071 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. var Buffer = require('buffer').Buffer; var isBufferEncoding = Buffer.isEncoding || function(encoding) { switch (encoding && encoding.toLowerCase()) { case 'hex': case 'utf8': case 'utf-8': case 'ascii': case 'binary': case 'base64': case 'ucs2': case 'ucs-2': case 'utf16le': case 'utf-16le': case 'raw': return true; default: return false; } } function assertEncoding(encoding) { if (encoding && !isBufferEncoding(encoding)) { throw new Error('Unknown encoding: ' + encoding); } } // StringDecoder provides an interface for efficiently splitting a series of // buffers into a series of JS strings without breaking apart multi-byte // characters. CESU-8 is handled as part of the UTF-8 encoding. // // @TODO Handling all encodings inside a single object makes it very difficult // to reason about this code, so it should be split up in the future. // @TODO There should be a utf8-strict encoding that rejects invalid UTF-8 code // points as used by CESU-8. var StringDecoder = exports.StringDecoder = function(encoding) { this.encoding = (encoding || 'utf8').toLowerCase().replace(/[-_]/, ''); assertEncoding(encoding); switch (this.encoding) { case 'utf8': // CESU-8 represents each of Surrogate Pair by 3-bytes this.surrogateSize = 3; break; case 'ucs2': case 'utf16le': // UTF-16 represents each of Surrogate Pair by 2-bytes this.surrogateSize = 2; this.detectIncompleteChar = utf16DetectIncompleteChar; break; case 'base64': // Base-64 stores 3 bytes in 4 chars, and pads the remainder. this.surrogateSize = 3; this.detectIncompleteChar = base64DetectIncompleteChar; break; default: this.write = passThroughWrite; return; } // Enough space to store all bytes of a single character. UTF-8 needs 4 // bytes, but CESU-8 may require up to 6 (3 bytes per surrogate). 
this.charBuffer = new Buffer(6); // Number of bytes received for the current incomplete multi-byte character. this.charReceived = 0; // Number of bytes expected for the current incomplete multi-byte character. this.charLength = 0; }; // write decodes the given buffer and returns it as JS string that is // guaranteed to not contain any partial multi-byte characters. Any partial // character found at the end of the buffer is buffered up, and will be // returned when calling write again with the remaining bytes. // // Note: Converting a Buffer containing an orphan surrogate to a String // currently works, but converting a String to a Buffer (via `new Buffer`, or // Buffer#write) will replace incomplete surrogates with the unicode // replacement character. See https://codereview.chromium.org/121173009/ . StringDecoder.prototype.write = function(buffer) { var charStr = ''; // if our last write ended with an incomplete multibyte character while (this.charLength) { // determine how many remaining bytes this buffer has to offer for this char var available = (buffer.length >= this.charLength - this.charReceived) ? this.charLength - this.charReceived : buffer.length; // add the new bytes to the char buffer buffer.copy(this.charBuffer, this.charReceived, 0, available); this.charReceived += available; if (this.charReceived < this.charLength) { // still not enough chars in this buffer? wait for more ... return ''; } // remove bytes belonging to the current character from the buffer buffer = buffer.slice(available, buffer.length); // get the character that was split charStr = this.charBuffer.slice(0, this.charLength).toString(this.encoding); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character var charCode = charStr.charCodeAt(charStr.length - 1); if (charCode >= 0xD800 && charCode <= 0xDBFF) { this.charLength += this.surrogateSize; charStr = ''; continue; } this.charReceived = this.charLength = 0; // if there are no more bytes in this buffer, just emit our char if (buffer.length === 0) { return charStr; } break; } // determine and set charLength / charReceived this.detectIncompleteChar(buffer); var end = buffer.length; if (this.charLength) { // buffer the incomplete character bytes we got buffer.copy(this.charBuffer, 0, buffer.length - this.charReceived, end); end -= this.charReceived; } charStr += buffer.toString(this.encoding, 0, end); var end = charStr.length - 1; var charCode = charStr.charCodeAt(end); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character if (charCode >= 0xD800 && charCode <= 0xDBFF) { var size = this.surrogateSize; this.charLength += size; this.charReceived += size; this.charBuffer.copy(this.charBuffer, size, 0, size); buffer.copy(this.charBuffer, 0, 0, size); return charStr.substring(0, end); } // or just emit the charStr return charStr; }; // detectIncompleteChar determines if there is an incomplete UTF-8 character at // the end of the given buffer. If so, it sets this.charLength to the byte // length that character, and sets this.charReceived to the number of bytes // that are available for this character. StringDecoder.prototype.detectIncompleteChar = function(buffer) { // determine how many bytes we have to check at the end of this buffer var i = (buffer.length >= 3) ? 3 : buffer.length; // Figure out if one of the last i bytes of our buffer announces an // incomplete char. 
for (; i > 0; i--) { var c = buffer[buffer.length - i]; // See http://en.wikipedia.org/wiki/UTF-8#Description // 110XXXXX if (i == 1 && c >> 5 == 0x06) { this.charLength = 2; break; } // 1110XXXX if (i <= 2 && c >> 4 == 0x0E) { this.charLength = 3; break; } // 11110XXX if (i <= 3 && c >> 3 == 0x1E) { this.charLength = 4; break; } } this.charReceived = i; }; StringDecoder.prototype.end = function(buffer) { var res = ''; if (buffer && buffer.length) res = this.write(buffer); if (this.charReceived) { var cr = this.charReceived; var buf = this.charBuffer; var enc = this.encoding; res += buf.slice(0, cr).toString(enc); } return res; }; function passThroughWrite(buffer) { return buffer.toString(this.encoding); } function utf16DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 2; this.charLength = this.charReceived ? 2 : 0; } function base64DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 3; this.charLength = this.charReceived ? 3 : 0; } ././@LongLink0000000000000000000000000000024700000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/nod0000644000000000000000000000252712631326456032066 0ustar 00000000000000{ "name": "string_decoder", "version": "0.10.31", "description": "The string_decoder module from Node core", "main": "index.js", "dependencies": {}, "devDependencies": { "tap": "~0.4.8" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/rvagg/string_decoder.git" }, "homepage": "https://github.com/rvagg/string_decoder", "keywords": [ "string", "decoder", "browser", "browserify" ], "license": "MIT", "gitHead": "d46d4fd87cf1d06e031c23f1ba170ca7d4ade9a0", "bugs": { "url": "https://github.com/rvagg/string_decoder/issues" }, "_id": "string_decoder@0.10.31", "_shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "_from": "string_decoder@>=0.10.0 <0.11.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "tarball": "http://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000016100000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/test/tracker.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/tes0000644000000000000000000000317412631326456032100 0ustar 00000000000000"use strict" var test = require("tap").test var Tracker = require("../index.js").Tracker var timeoutError = new Error("timeout") var testEvent = function (obj,event,next) { var timeout = setTimeout(function(){ obj.removeListener(event, eventHandler) next(timeoutError) }, 10) var eventHandler = function () { var args = Array.prototype.slice.call(arguments) args.unshift(null) clearTimeout(timeout) next.apply(null, args) } obj.once(event, eventHandler) } test("Tracker", function (t) { t.plan(10) var name = "test" var track = new Tracker(name) t.is(track.completed(), 0, "Nothing todo is 0 completion") var todo = 100 track = new Tracker(name, todo) t.is(track.completed(), 0, "Nothing done is 0 completion") testEvent(track, "change", afterCompleteWork) track.completeWork(100) function afterCompleteWork(er, onChangeName) { t.is(er, null, "completeWork: on change event fired") t.is(onChangeName, name, "completeWork: on change emits the correct name") } t.is(track.completed(), 1, "completeWork: 100% completed") testEvent(track, "change", afterAddWork) track.addWork(100) function afterAddWork(er, onChangeName) { t.is(er, null, "addWork: on change event fired") t.is(onChangeName, name, "addWork: on change emits the correct name") } t.is(track.completed(), 0.5, "addWork: 50% completed") track.completeWork(200) t.is(track.completed(), 1, "completeWork: Over completion is still only 100% complete") track = new Tracker(name, todo) track.completeWork(50) track.finish() t.is(track.completed(), 1, "finish: Explicitly finishing moves to 100%") }) ././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/test/trackergroup.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/tes0000644000000000000000000000633412631326456032101 0ustar 00000000000000"use strict" var test = require("tap").test var Tracker = require("../index.js").Tracker var TrackerGroup = require("../index.js").TrackerGroup var timeoutError = new Error("timeout") var testEvent = function (obj,event,next) { var timeout = setTimeout(function(){ obj.removeListener(event, eventHandler) next(timeoutError) }, 10) var eventHandler = function () { var args = Array.prototype.slice.call(arguments) args.unshift(null) clearTimeout(timeout) next.apply(null, args) } obj.once(event, eventHandler) } test("TrackerGroup", function (t) { var name = "test" var track = new TrackerGroup(name) t.is(track.completed(), 0, "Nothing todo is 0 completion") testEvent(track, "change", afterFinishEmpty) track.finish() var a, b function afterFinishEmpty(er, onChangeName) { t.is(er, null, "finishEmpty: on change event fired") t.is(onChangeName, name, "finishEmpty: on change emits the correct name") t.is(track.completed(), 1, "finishEmpty: Finishing an empty group actually finishes it") track = new TrackerGroup(name) a = track.newItem("a", 10, 1) b = track.newItem("b", 10, 1) t.is(track.completed(), 0, "Initially empty") testEvent(track, "change", afterCompleteWork) a.completeWork(5) } function afterCompleteWork(er, onChangeName) { t.is(er, null, "on 
change event fired") t.is(onChangeName, "a", "on change emits the correct name") t.is(track.completed(), 0.25, "Complete half of one is a quarter overall") testEvent(track, "change", afterFinishAll) track.finish() } function afterFinishAll(er, onChangeName) { t.is(er, null, "finishAll: on change event fired") t.is(onChangeName, name, "finishAll: on change emits the correct name") t.is(track.completed(), 1, "Finishing everything ") track = new TrackerGroup(name) a = track.newItem("a", 10, 2) b = track.newItem("b", 10, 1) t.is(track.completed(), 0, "weighted: Initially empty") testEvent(track, "change", afterWeightedCompleteWork) a.completeWork(5) } function afterWeightedCompleteWork(er, onChangeName) { t.is(er, null, "weighted: on change event fired") t.is(onChangeName, "a", "weighted: on change emits the correct name") t.is(Math.round(track.completed()*100), 33, "weighted: Complete half of double weighted") testEvent(track, "change", afterWeightedFinishAll) track.finish() } function afterWeightedFinishAll(er, onChangeName) { t.is(er, null, "weightedFinishAll: on change event fired") t.is(onChangeName, name, "weightedFinishAll: on change emits the correct name") t.is(track.completed(), 1, "weightedFinishaAll: Finishing everything ") track = new TrackerGroup(name) a = track.newGroup("a", 10) b = track.newGroup("b", 10) var a1 = a.newItem("a.1",10) a1.completeWork(5) t.is(track.completed(), 0.25, "nested: Initially quarter done") testEvent(track, "change", afterNestedComplete) b.finish() } function afterNestedComplete(er, onChangeName) { t.is(er, null, "nestedComplete: on change event fired") t.is(onChangeName, "b", "nestedComplete: on change emits the correct name") t.is(track.completed(), 0.75, "nestedComplete: Finishing everything ") t.end() } }) ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/test/trackerstream.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/are-we-there-yet/tes0000644000000000000000000000344212631326456032076 0ustar 00000000000000"use strict" var test = require("tap").test var util = require("util") var stream = require("readable-stream") var TrackerStream = require("../index.js").TrackerStream var timeoutError = new Error("timeout") var testEvent = function (obj,event,next) { var timeout = setTimeout(function(){ obj.removeListener(event, eventHandler) next(timeoutError) }, 10) var eventHandler = function () { var args = Array.prototype.slice.call(arguments) args.unshift(null) clearTimeout(timeout) next.apply(null, args) } obj.once(event, eventHandler) } var Sink = function () { stream.Writable.apply(this,arguments) } util.inherits(Sink, stream.Writable) Sink.prototype._write = function (data, encoding, cb) { cb() } test("TrackerStream", function (t) { t.plan(9) var name = "test" var track = new TrackerStream(name) t.is(track.completed(), 0, "Nothing todo is 0 completion") var todo = 10 track = new TrackerStream(name, todo) t.is(track.completed(), 0, "Nothing done is 0 completion") track.pipe(new Sink()) testEvent(track, "change", afterCompleteWork) track.write("0123456789") function afterCompleteWork(er, onChangeName) { t.is(er, null, "write: on change event fired") t.is(onChangeName, name, "write: on change emits the correct name") t.is(track.completed(), 1, "write: 100% completed") testEvent(track, "change", afterAddWork) track.addWork(10) } function afterAddWork(er, onChangeName) { t.is(er, null, 
"addWork: on change event fired") t.is(track.completed(), 0.5, "addWork: 50% completed") testEvent(track, "change", afterAllWork) track.write("ABCDEFGHIJKLMNOPQRST") } function afterAllWork(er) { t.is(er, null, "allWork: on change event fired") t.is(track.completed(), 1, "allWork: 100% completed") } }) npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/.npmignore0000644000000000000000000000114212631326456031356 0ustar 00000000000000# Logs logs *.log # Runtime data pids *.pid *.seed # Directory for instrumented libs generated by jscoverage/JSCover lib-cov # Coverage directory used by tools like istanbul coverage # Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files) .grunt # Compiled binary addons (http://nodejs.org/api/addons.html) build/Release # Dependency directory # Commenting this out is preferred by some people, see # https://www.npmjs.org/doc/misc/npm-faq.html#should-i-check-my-node_modules-folder-into-git- node_modules # Users Environment Variables .lock-wscript # Editor cruft *~ .#* npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/LICENSE0000644000000000000000000000135712631326456030374 0ustar 00000000000000Copyright (c) 2014, Rebecca Turner Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/README.md0000644000000000000000000001403612631326456030644 0ustar 00000000000000gauge ===== A nearly stateless terminal based horizontal guage / progress bar. ```javascript var Gauge = require("gauge") var gauge = new Gauge() gauge.show("test", 0.20) gauge.pulse("this") gauge.hide() ``` ![](example.png) ### `var gauge = new Gauge([options], [ansiStream])` * **options** – *(optional)* An option object. (See [below] for details.) * **ansiStream** – *(optional)* A stream that's been blessed by the [ansi] module to include various commands for controlling the cursor in a terminal. [ansi]: https://www.npmjs.com/package/ansi [below]: #theme-objects Constructs a new gauge. Gauges are drawn on a single line, and are not drawn if the current terminal isn't a tty. If you resize your terminal in a way that can be detected then the gauge will be drawn at the new size. As a general rule, growing your terminal will be clean, but shrinking your terminal will result in cruft as we don't have enough information to know where what we wrote previously is now located. The **options** object can have the following properties, all of which are optional: * maxUpdateFrequency: defaults to 50 msec, the gauge will not be drawn more than once in this period of time. This applies to `show` and `pulse` calls, but if you `hide` and then `show` the gauge it will draw it regardless of time since last draw. 
* theme: defaults to `Gauge.unicode` if the terminal supports unicode according to [has-unicode], otherwise it defaults to `Gauge.ascii`. Details on the [theme object](#theme-objects) are documented elsewhere. * template: see [documentation elsewhere](#template-objects) for defaults and details. [has-unicode]: https://www.npmjs.com/package/has-unicode If **ansiStream** isn't passed in, then one will be constructed from stderr with `ansi(process.stderr)`. ### `gauge.show([name, [completed]])` * **name** – *(optional)* The name of the current thing contributing to progress. Defaults to the last value used, or "". * **completed** – *(optional)* The portion completed as a value between 0 and 1. Defaults to the last value used, or 0. If `process.stdout.isTTY` is false then this does nothing. If completed is 0 and `gauge.pulse` has never been called, then similarly nothing will be printed. If `maxUpdateFrequency` msec haven't passed since the last call to `show` or `pulse` then similarly, nothing will be printed. (Actually, the update is deferred until `maxUpdateFrequency` msec have passed and if nothing else has happened, the gauge update will happen.) ### `gauge.hide()` Removes the gauge from the terminal. ### `gauge.pulse([name])` * **name** – *(optional)* The specific thing that triggered this pulse. Spins the spinner in the gauge to show output. If **name** is included then it will be combined with the last name passed to `gauge.show` using the subsection property of the theme (typically a right facing arrow). ### `gauge.disable()` Hides the gauge and ignores further calls to `show` or `pulse`. ### `gauge.enable()` Shows the gauge and resumes updating when `show` or `pulse` is called. ### `gauge.setTheme(theme)` Changes the active theme; it will be displayed with the next show or pulse. ### `gauge.setTemplate(template)` Changes the active template; it will be displayed with the next show or pulse. ### Theme Objects There are two theme objects available as a part of the module, `Gauge.unicode` and `Gauge.ascii`. Theme objects have the following properties: | Property | Unicode | ASCII | | ---------- | ------- | ----- | | startgroup | ╢ | \| | | endgroup | ╟ | \| | | complete | █ | # | | incomplete | ░ | - | | spinner | ▀▐▄▌ | -\\\|/ | | subsection | → | -> | *startgroup*, *endgroup* and *subsection* can be as many characters as you want. *complete* and *incomplete* should be a single character width each. *spinner* is a list of characters to use in turn when displaying an activity spinner. The Gauge will spin as many characters as you give here. ### Template Objects A template is an array of objects and strings that, after being evaluated, will be turned into the gauge line. The default template is: ```javascript [ {type: "name", separated: true, maxLength: 25, minLength: 25, align: "left"}, {type: "spinner", separated: true}, {type: "startgroup"}, {type: "completionbar"}, {type: "endgroup"} ] ``` The various template elements can either be **plain strings**, in which case they will be included verbatim in the output. If the template element is an object, it can have the following keys: * *type* can be: * `name` – The most recent name passed to `show`; if this is in response to a `pulse` then the name passed to `pulse` will be appended along with the subsection property from the theme. * `spinner` – If you've ever called `pulse` this will be one of the characters from the spinner property of the theme. * `startgroup` – The `startgroup` property from the theme.
* `completionbar` – The progress bar itself. * `endgroup` – The `endgroup` property from the theme. * *separated* – If true, the element will be separated with spaces from things on either side (and margins count as space, so it won't be indented), but only if it's included. * *maxLength* – The maximum length for this element. If its value is longer it will be truncated. * *minLength* – The minimum length for this element. If its value is shorter it will be padded according to the *align* value. * *align* – (Default: left) Possible values "left", "right" and "center". Works as you'd expect from word processors. * *length* – Provides a single value for both *minLength* and *maxLength*. If both *length* and *minLength* or *maxLength* are specified then the latter take precedence. ### Tracking Completion If you have more than one thing going on that you want to track completion of, you may find the related [are-we-there-yet] helpful. Its `change` event can be wired up to the `show` method to get a more traditional progress bar interface (see the sketch below). [are-we-there-yet]: https://www.npmjs.com/package/are-we-there-yet npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/example.png0000644000000000000000000004635112631326456031527 0ustar 00000000000000[binary PNG image data omitted]
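A minimal sketch of the wiring described in the Tracking Completion section above, assuming both `gauge` and `are-we-there-yet` are installed; the tracker name `"install"` and the work totals are illustrative, not taken from either package's docs:

```javascript
// Sketch only: drive a gauge from an are-we-there-yet Tracker.
// The tracker name and work amounts below are made up for illustration.
var Gauge = require("gauge")
var Tracker = require("are-we-there-yet").Tracker

var gauge = new Gauge()
var track = new Tracker("install", 100) // expect 100 units of work

// Redraw the gauge each time the tracker reports a change.
track.on("change", function (name) {
  gauge.show(name, track.completed())
})

track.completeWork(50) // gauge redraws at 50%
track.completeWork(50) // gauge redraws at 100%
track.finish()         // explicitly marks the tracker complete
gauge.hide()
```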
npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/0000755000000000000000000000000012631326456032036 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/package.json0000644000000000000000000000263712631326456031657 0ustar 00000000000000{ "name": "gauge", "version": "1.2.2", "description": "A terminal based horizontal gauge", "main": "progress-bar.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/iarna/gauge.git" }, "keywords": [ "progressbar", "progress", "gauge" ], "author": { "name": "Rebecca Turner", "email": "me@re-becca.org" }, "license": "ISC", "bugs": { "url": "https://github.com/iarna/gauge/issues" }, "homepage": "https://github.com/iarna/gauge", "dependencies": { "ansi": "^0.3.0", "has-unicode": "^1.0.0", "lodash.pad": "^3.0.0", "lodash.padleft": "^3.0.0", "lodash.padright": "^3.0.0" }, "devDependencies": { "tap": "^0.4.13" }, "gitHead": "9f7eeeeed3b74a70f30b721d570435f6ffbc0168", "_id": "gauge@1.2.2", "_shasum": "05b6730a19a8fcad3c340a142f0945222a3f815b", "_from": "gauge@>=1.2.0 <1.3.0", "_npmVersion": "3.1.0", "_nodeVersion": "0.10.38", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "dist": { "shasum": "05b6730a19a8fcad3c340a142f0945222a3f815b", "tarball": "http://registry.npmjs.org/gauge/-/gauge-1.2.2.tgz" }, "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/gauge/-/gauge-1.2.2.tgz", "readme": "ERROR: No README data found!"
} ././@LongLink0000000000000000000000000000014600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/progress-bar.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/progress-bar.j0000644000000000000000000001336512631326456032150 0ustar 00000000000000"use strict" var hasUnicode = require("has-unicode") var ansi = require("ansi") var align = { center: require("lodash.pad"), left: require("lodash.padright"), right: require("lodash.padleft") } var defaultStream = process.stderr function isTTY() { return process.stderr.isTTY } function getWritableTTYColumns() { // Writing to the final column wraps the line // We have to use stdout here, because Node's magic SIGWINCH handler only // updates process.stdout, not process.stderr return process.stdout.columns - 1 } var ProgressBar = module.exports = function (options, cursor) { if (! options) options = {} if (! cursor && options.write) { cursor = options options = {} } if (! cursor) { cursor = ansi(defaultStream) } this.cursor = cursor this.showing = false this.theme = options.theme || (hasUnicode() ? ProgressBar.unicode : ProgressBar.ascii) this.template = options.template || [ {type: "name", separated: true, length: 25}, {type: "spinner", separated: true}, {type: "startgroup"}, {type: "completionbar"}, {type: "endgroup"} ] this.updatefreq = options.maxUpdateFrequency || 50 this.lastName = "" this.lastCompleted = 0 this.spun = 0 this.last = new Date(0) var self = this this._handleSizeChange = function () { if (!self.showing) return self.hide() self.show() } } ProgressBar.prototype = {} ProgressBar.unicode = { startgroup: "╢", endgroup: "╟", complete: "█", incomplete: "░", spinner: "▀▐▄▌", subsection: "→" } ProgressBar.ascii = { startgroup: "|", endgroup: "|", complete: "#", incomplete: "-", spinner: "-\\|/", subsection: "->" } ProgressBar.prototype.setTheme = function(theme) { this.theme = theme } ProgressBar.prototype.setTemplate = function(template) { this.template = template } ProgressBar.prototype._enableResizeEvents = function() { process.stdout.on('resize', this._handleSizeChange) } ProgressBar.prototype._disableResizeEvents = function() { process.stdout.removeListener('resize', this._handleSizeChange) } ProgressBar.prototype.disable = function() { this.hide() this.disabled = true } ProgressBar.prototype.enable = function() { this.disabled = false this.show() } ProgressBar.prototype.hide = function() { if (!isTTY()) return if (this.disabled) return this.cursor.show() if (this.showing) this.cursor.up(1) this.cursor.horizontalAbsolute(0).eraseLine() this.showing = false } var repeat = function (str, count) { var out = "" for (var ii=0; ii<count; ++ii) out += str return out } […] Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000103212631326456032210 0ustar 00000000000000# lodash.pad v3.1.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.pad` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.pad ``` In Node.js/io.js: ```js var pad = require('lodash.pad'); ``` See the [documentation](https://lodash.com/docs#pad) or [package source](https://github.com/lodash/lodash/blob/3.1.1-npm-packages/lodash.pad) for more details. ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000325312631326456032217 0ustar 00000000000000/** * lodash 3.1.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'), createPadding = require('lodash._createpadding'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Pads `string` on the left and right sides if it's shorter than `length`. * Padding characters are truncated if they can't be evenly divided by `length`. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to pad. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the padded string. 
* @example * * _.pad('abc', 8); * // => ' abc ' * * _.pad('abc', 8, '_-'); * // => '_-abc_-_' * * _.pad('abc', 3); * // => 'abc' */ function pad(string, length, chars) { string = baseToString(string); length = +length; var strLength = string.length; if (strLength >= length || !nativeIsFinite(length)) { return string; } var mid = (length - strLength) / 2, leftLength = nativeFloor(mid), rightLength = nativeCeil(mid); chars = createPadding('', rightLength, chars); return chars.slice(0, leftLength) + string + chars; } module.exports = pad; ././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000017300000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000457212631326456032224 0ustar 00000000000000{ "name": "lodash.pad", "version": "3.1.1", "description": "The modern build of lodash’s `_.pad` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0", "lodash._createpadding": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.pad@3.1.1", "_shasum": "2e078ebc33b331d2ba34bf8732af129fd5c04624", "_from": "lodash.pad@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "2e078ebc33b331d2ba34bf8732af129fd5c04624", "tarball": "http://registry.npmjs.org/lodash.pad/-/lodash.pad-3.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.pad/-/lodash.pad-3.1.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000022100000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000022200000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000023000000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000023200000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000105312631326456032213 0ustar 00000000000000# lodash._basetostring v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseToString` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basetostring ``` In Node.js/io.js: ```js var baseToString = require('lodash._basetostring'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basetostring) for more details. 
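For quick reference, a minimal sketch of how this helper behaves (the sample values are illustrative, not from the package docs):

```js
var baseToString = require('lodash._basetostring');

// Strings pass through unchanged; other values are coerced with (value + '').
console.log(baseToString('abc'));     // => 'abc'
console.log(baseToString(42));        // => '42'
// null and undefined yield '' rather than the strings "null"/"undefined".
console.log(baseToString(null));      // => ''
console.log(baseToString(undefined)); // => ''
```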
././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000134212631326456032214 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Converts `value` to a string if it's not one. An empty string is returned * for `null` or `undefined` values. * * @private * @param {*} value The value to process. * @returns {string} Returns the string. */ function baseToString(value) { return value == null ? '' : (value + ''); } module.exports = baseToString; ././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000442512631326456032221 0ustar 00000000000000{ "name": "lodash._basetostring", "version": "3.0.1", "description": "The modern build of lodash’s internal `baseToString` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basetostring@3.0.1", "_shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "_from": "lodash._basetostring@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "tarball": "http://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000023300000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000106112631326456032212 0ustar 00000000000000# lodash._createpadding v3.6.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createPadding` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._createpadding ``` In Node.js/io.js: ```js var createPadding = require('lodash._createpadding'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash._createpadding) for more details. ././@LongLink0000000000000000000000000000023200000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000254212631326456032217 0ustar 00000000000000/** * lodash 3.6.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var repeat = require('lodash.repeat'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeIsFinite = global.isFinite; /** * Creates the padding required for `string` based on the given `length`. 
* The `chars` string is truncated if the number of characters exceeds `length`. * * @private * @param {string} string The string to create padding for. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the pad for `string`. */ function createPadding(string, length, chars) { var strLength = string.length; length = +length; if (strLength >= length || !nativeIsFinite(length)) { return ''; } var padLength = length - strLength; chars = chars == null ? ' ' : (chars + ''); return repeat(chars, nativeCeil(padLength / chars.length)).slice(0, padLength); } module.exports = createPadding; ././@LongLink0000000000000000000000000000023700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000023600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000452412631326456032221 0ustar 00000000000000{ "name": "lodash._createpadding", "version": "3.6.1", "description": "The modern build of lodash’s internal `createPadding` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.repeat": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._createpadding@3.6.1", "_shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "_from": "lodash._createpadding@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "tarball": "http://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000025500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000026400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000026600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000105712631326456032217 0ustar 00000000000000# lodash.repeat v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.repeat` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.repeat ``` In Node.js/io.js: ```js var repeat = require('lodash.repeat'); ``` See the [documentation](https://lodash.com/docs#repeat) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.repeat) for more details. 
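The JSDoc examples in this package's `index.js` translate directly into a runnable check (a sketch assuming the install step above; `assert` is Node's built-in module):

```js
var assert = require('assert');
var repeat = require('lodash.repeat');

assert.strictEqual(repeat('*', 3), '***');
assert.strictEqual(repeat('abc', 2), 'abcabc');

// Counts below 1, NaN, and non-finite counts all yield ''.
assert.strictEqual(repeat('abc', 0), '');
assert.strictEqual(repeat('abc', -1), '');
assert.strictEqual(repeat('abc', Infinity), '');
```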
././@LongLink0000000000000000000000000000026500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000273712631326456032225 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Repeats the given string `n` times. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to repeat. * @param {number} [n=0] The number of times to repeat the string. * @returns {string} Returns the repeated string. * @example * * _.repeat('*', 3); * // => '***' * * _.repeat('abc', 2); * // => 'abcabc' * * _.repeat('abc', 0); * // => '' */ function repeat(string, n) { var result = ''; string = baseToString(string); n = +n; if (n < 1 || !string || !nativeIsFinite(n)) { return result; } // Leverage the exponentiation by squaring algorithm for a faster repeat. // See https://en.wikipedia.org/wiki/Exponentiation_by_squaring for more details. do { if (n % 2) { result += string; } n = nativeFloor(n / 2); string += string; } while (n); return result; } module.exports = repeat; ././@LongLink0000000000000000000000000000027100000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000455312631326456032223 0ustar 00000000000000{ "name": "lodash.repeat", "version": "3.0.1", "description": "The modern build of lodash’s `_.repeat` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.repeat@3.0.1", "_shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "_from": "lodash.repeat@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": 
"0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "tarball": "http://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/LICENSE.txtnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000106612631326456032217 0ustar 00000000000000# lodash.padleft v3.1.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.padLeft` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.padleft ``` In Node.js/io.js: ```js var padLeft = require('lodash.padleft'); ``` See the [documentation](https://lodash.com/docs#padLeft) or [package source](https://github.com/lodash/lodash/blob/3.1.1-npm-packages/lodash.padleft) for more details. 
././@LongLink0000000000000000000000000000017300000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000277612631326456032230 0ustar 00000000000000/** * lodash 3.1.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'), createPadding = require('lodash._createpadding'); /** * Creates a function for `_.padLeft` or `_.padRight`. * * @private * @param {boolean} [fromRight] Specify padding from the right. * @returns {Function} Returns the new pad function. */ function createPadDir(fromRight) { return function(string, length, chars) { string = baseToString(string); return (fromRight ? string : '') + createPadding(string, length, chars) + (fromRight ? '' : string); }; } /** * Pads `string` on the left side if it is shorter than `length`. Padding * characters are truncated if they exceed `length`. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to pad. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the padded string. * @example * * _.padLeft('abc', 6); * // => ' abc' * * _.padLeft('abc', 6, '_-'); * // => '_-_abc' * * _.padLeft('abc', 3); * // => 'abc' */ var padLeft = createPadDir(); module.exports = padLeft; ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000017700000000000011222 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000463112631326456032220 0ustar 00000000000000{ "name": "lodash.padleft", "version": "3.1.1", "description": "The modern build of lodash’s `_.padLeft` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See 
https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0", "lodash._createpadding": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.padleft@3.1.1", "_shasum": "150151f1e0245edba15d50af2d71f1d5cff46530", "_from": "lodash.padleft@>=3.0.0 <4.0.0", "_npmVersion": "2.9.0", "_nodeVersion": "0.12.2", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "150151f1e0245edba15d50af2d71f1d5cff46530", "tarball": "http://registry.npmjs.org/lodash.padleft/-/lodash.padleft-3.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.padleft/-/lodash.padleft-3.1.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000022500000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000023400000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
././@LongLink0000000000000000000000000000023600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000105312631326456032213 0ustar 00000000000000# lodash._basetostring v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseToString` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basetostring ``` In Node.js/io.js: ```js var baseToString = require('lodash._basetostring'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basetostring) for more details. ././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000134212631326456032214 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Converts `value` to a string if it's not one. An empty string is returned * for `null` or `undefined` values. * * @private * @param {*} value The value to process. * @returns {string} Returns the string. */ function baseToString(value) { return value == null ? 
'' : (value + ''); } module.exports = baseToString; ././@LongLink0000000000000000000000000000024100000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000442512631326456032221 0ustar 00000000000000{ "name": "lodash._basetostring", "version": "3.0.1", "description": "The modern build of lodash’s internal `baseToString` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basetostring@3.0.1", "_shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "_from": "lodash._basetostring@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "tarball": "http://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000023700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000106112631326456032212 0ustar 00000000000000# lodash._createpadding v3.6.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createPadding` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._createpadding ``` In Node.js/io.js: ```js var createPadding = require('lodash._createpadding'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash._createpadding) for more details. ././@LongLink0000000000000000000000000000023600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000254212631326456032217 0ustar 00000000000000/** * lodash 3.6.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var repeat = require('lodash.repeat'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeIsFinite = global.isFinite; /** * Creates the padding required for `string` based on the given `length`. 
* The `chars` string is truncated if the number of characters exceeds `length`. * * @private * @param {string} string The string to create padding for. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the pad for `string`. */ function createPadding(string, length, chars) { var strLength = string.length; length = +length; if (strLength >= length || !nativeIsFinite(length)) { return ''; } var padLength = length - strLength; chars = chars == null ? ' ' : (chars + ''); return repeat(chars, nativeCeil(padLength / chars.length)).slice(0, padLength); } module.exports = createPadding; ././@LongLink0000000000000000000000000000024300000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000024200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000452412631326456032221 0ustar 00000000000000{ "name": "lodash._createpadding", "version": "3.6.1", "description": "The modern build of lodash’s internal `createPadding` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.repeat": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._createpadding@3.6.1", "_shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "_from": "lodash._createpadding@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "tarball": "http://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000026100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000027000000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000027200000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000105712631326456032217 0ustar 00000000000000# lodash.repeat v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.repeat` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.repeat ``` In Node.js/io.js: ```js var repeat = require('lodash.repeat'); ``` See the [documentation](https://lodash.com/docs#repeat) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.repeat) for more details. 
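This copy of `lodash.repeat` is vendored here purely to serve `lodash._createpadding`; a sketch of that relationship, assuming both modules are installed, using only the behavior shown in their sources:

```js
var repeat = require('lodash.repeat');
var createPadding = require('lodash._createpadding');

// createPadding covers the 5-character shortfall of 'abc' against length 8
// by repeating '_-' enough times, then slicing to the exact pad length...
console.log(createPadding('abc', 8, '_-')); // => '_-_-_'

// ...which matches doing the same by hand with repeat():
console.log(repeat('_-', Math.ceil(5 / 2)).slice(0, 5)); // => '_-_-_'
```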
././@LongLink0000000000000000000000000000027100000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000273712631326456032225 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Repeats the given string `n` times. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to repeat. * @param {number} [n=0] The number of times to repeat the string. * @returns {string} Returns the repeated string. * @example * * _.repeat('*', 3); * // => '***' * * _.repeat('abc', 2); * // => 'abcabc' * * _.repeat('abc', 0); * // => '' */ function repeat(string, n) { var result = ''; string = baseToString(string); n = +n; if (n < 1 || !string || !nativeIsFinite(n)) { return result; } // Leverage the exponentiation by squaring algorithm for a faster repeat. // See https://en.wikipedia.org/wiki/Exponentiation_by_squaring for more details. do { if (n % 2) { result += string; } n = nativeFloor(n / 2); string += string; } while (n); return result; } module.exports = repeat; ././@LongLink0000000000000000000000000000027500000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000455312631326456032223 0ustar 00000000000000{ "name": "lodash.repeat", "version": "3.0.1", "description": "The modern build of lodash’s `_.repeat` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.repeat@3.0.1", "_shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "_from": "lodash.repeat@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", 
"_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "tarball": "http://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000017700000000000011222 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/LICENSE.txtnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000107512631326456032217 0ustar 00000000000000# lodash.padright v3.1.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.padRight` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.padright ``` In Node.js/io.js: ```js var padRight = require('lodash.padright'); ``` See the [documentation](https://lodash.com/docs#padRight) or [package source](https://github.com/lodash/lodash/blob/3.1.1-npm-packages/lodash.padright) for more details. 
././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000301012631326456032206 0ustar 00000000000000/** * lodash 3.1.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'), createPadding = require('lodash._createpadding'); /** * Creates a function for `_.padLeft` or `_.padRight`. * * @private * @param {boolean} [fromRight] Specify padding from the right. * @returns {Function} Returns the new pad function. */ function createPadDir(fromRight) { return function(string, length, chars) { string = baseToString(string); return (fromRight ? string : '') + createPadding(string, length, chars) + (fromRight ? '' : string); }; } /** * Pads `string` on the right side if it is shorter than `length`. Padding * characters are truncated if they exceed `length`. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to pad. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the padded string. * @example * * _.padRight('abc', 6); * // => 'abc ' * * _.padRight('abc', 6, '_-'); * // => 'abc_-_' * * _.padRight('abc', 3); * // => 'abc' */ var padRight = createPadDir(true); module.exports = padRight; ././@LongLink0000000000000000000000000000020100000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000464112631326456032221 0ustar 00000000000000{ "name": "lodash.padright", "version": "3.1.1", "description": "The modern build of lodash’s `_.padRight` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See 
https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0", "lodash._createpadding": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.padright@3.1.1", "_shasum": "79f7770baaa39738c040aeb5465e8d88f2aacec0", "_from": "lodash.padright@>=3.0.0 <4.0.0", "_npmVersion": "2.9.0", "_nodeVersion": "0.12.2", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "79f7770baaa39738c040aeb5465e8d88f2aacec0", "tarball": "http://registry.npmjs.org/lodash.padright/-/lodash.padright-3.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.padright/-/lodash.padright-3.1.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000022700000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
././@LongLink0000000000000000000000000000023700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000105312631326456032213 0ustar 00000000000000# lodash._basetostring v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseToString` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basetostring ``` In Node.js/io.js: ```js var baseToString = require('lodash._basetostring'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basetostring) for more details. ././@LongLink0000000000000000000000000000023600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000134212631326456032214 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Converts `value` to a string if it's not one. An empty string is returned * for `null` or `undefined` values. * * @private * @param {*} value The value to process. * @returns {string} Returns the string. */ function baseToString(value) { return value == null ? 
'' : (value + ''); } module.exports = baseToString; ././@LongLink0000000000000000000000000000024200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000442512631326456032221 0ustar 00000000000000{ "name": "lodash._basetostring", "version": "3.0.1", "description": "The modern build of lodash’s internal `baseToString` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basetostring@3.0.1", "_shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "_from": "lodash._basetostring@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "tarball": "http://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000023600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000024000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000106112631326456032212 0ustar 00000000000000# lodash._createpadding v3.6.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createPadding` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._createpadding ``` In Node.js/io.js: ```js var createPadding = require('lodash._createpadding'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash._createpadding) for more details. ././@LongLink0000000000000000000000000000023700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000254212631326456032217 0ustar 00000000000000/** * lodash 3.6.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var repeat = require('lodash.repeat'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeIsFinite = global.isFinite; /** * Creates the padding required for `string` based on the given `length`. 
* The `chars` string is truncated if the number of characters exceeds `length`. * * @private * @param {string} string The string to create padding for. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the pad for `string`. */ function createPadding(string, length, chars) { var strLength = string.length; length = +length; if (strLength >= length || !nativeIsFinite(length)) { return ''; } var padLength = length - strLength; chars = chars == null ? ' ' : (chars + ''); return repeat(chars, nativeCeil(padLength / chars.length)).slice(0, padLength); } module.exports = createPadding; ././@LongLink0000000000000000000000000000024400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000024300000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000452412631326456032221 0ustar 00000000000000{ "name": "lodash._createpadding", "version": "3.6.1", "description": "The modern build of lodash’s internal `createPadding` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.repeat": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._createpadding@3.6.1", "_shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "_from": "lodash._createpadding@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "tarball": "http://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000026200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000027100000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/LICENSEnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000232112631326456032212 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000027300000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/README.mdnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000105712631326456032217 0ustar 00000000000000# lodash.repeat v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.repeat` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.repeat ``` In Node.js/io.js: ```js var repeat = require('lodash.repeat'); ``` See the [documentation](https://lodash.com/docs#repeat) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.repeat) for more details. 
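For a quick sanity check, a sketch of calling the module directly (the expected values mirror the `_.repeat` examples from the lodash documentation):

```js
var repeat = require('lodash.repeat');

repeat('*', 3);   // => '***'
repeat('abc', 2); // => 'abcabc'
repeat('abc', 0); // => ''
```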
././@LongLink0000000000000000000000000000027200000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/index.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000273712631326456032225 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Repeats the given string `n` times. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to repeat. * @param {number} [n=0] The number of times to repeat the string. * @returns {string} Returns the repeated string. * @example * * _.repeat('*', 3); * // => '***' * * _.repeat('abc', 2); * // => 'abcabc' * * _.repeat('abc', 0); * // => '' */ function repeat(string, n) { var result = ''; string = baseToString(string); n = +n; if (n < 1 || !string || !nativeIsFinite(n)) { return result; } // Leverage the exponentiation by squaring algorithm for a faster repeat. // See https://en.wikipedia.org/wiki/Exponentiation_by_squaring for more details. do { if (n % 2) { result += string; } n = nativeFloor(n / 2); string += string; } while (n); return result; } module.exports = repeat; ././@LongLink0000000000000000000000000000027600000000000011222 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/package.jsonnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/node_modules/l0000644000000000000000000000455312631326456032223 0ustar 00000000000000{ "name": "lodash.repeat", "version": "3.0.1", "description": "The modern build of lodash’s `_.repeat` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.repeat@3.0.1", "_shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "_from": "lodash.repeat@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", 
"_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "tarball": "http://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000015300000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/test/progress-bar.jsnpm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/node_modules/gauge/test/progress-0000644000000000000000000001257212631326456032213 0ustar 00000000000000"use strict" var test = require("tap").test var ProgressBar = require("../progress-bar.js") var cursor = [] var C var bar = new ProgressBar({theme: ProgressBar.ascii}, C = { show: function () { cursor.push(["show"]) return C }, hide: function () { cursor.push(["hide"]) return C }, up: function (lines) { cursor.push(["up",lines]) return C }, horizontalAbsolute: function (col) { cursor.push(["horizontalAbsolute", col]) return C }, eraseLine: function () { cursor.push(["eraseLine"]) return C }, write: function (line) { cursor.push(["write", line]) return C } }) function isOutput(t, msg, output) { var tests = [] for (var ii = 0; ii P | |----|\n' ], [ 'show' ] ]) }) test("window resizing", function (t) { t.plan(16) process.stderr.isTTY = true process.stdout.columns = 32 bar.show("NAME", 0.1) cursor = [] bar.last = new Date(0) bar.pulse() isOutput(t, "32 columns", [ [ 'up', 1 ], [ 'hide' ], [ 'horizontalAbsolute', 0 ], [ 'write', 'NAME / |##------------------|\n' ], [ 'show' ] ]) process.stdout.columns = 16 bar.show("NAME", 0.5) cursor = [] bar.last = new Date(0) bar.pulse() isOutput(t, "16 columns", [ [ 'up', 1 ], [ 'hide' ], [ 'horizontalAbsolute', 0 ], [ 'write', 'NAME - |##--|\n' ], [ 'show' ] ]); }); npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/test/basic.js0000644000000000000000000002147012631326456026216 0ustar 00000000000000var tap = require('tap') var log = require('../') var result = [] var logEvents = [] var logInfoEvents = [] var logPrefixEvents = [] var util = require('util') var resultExpect = [ '\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[7msill\u001b[0m \u001b[0m\u001b[35msilly prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[34m\u001b[40mverb\u001b[0m \u001b[0m\u001b[35mverbose prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32minfo\u001b[0m \u001b[0m\u001b[35minfo prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32m\u001b[40mhttp\u001b[0m \u001b[0m\u001b[35mhttp prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[30m\u001b[43mWARN\u001b[0m \u001b[0m\u001b[35mwarn prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35merror prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', 
'\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32minfo\u001b[0m \u001b[0m\u001b[35minfo prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32m\u001b[40mhttp\u001b[0m \u001b[0m\u001b[35mhttp prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[30m\u001b[43mWARN\u001b[0m \u001b[0m\u001b[35mwarn prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35merror prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m This is a longer\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m message, with some details\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m and maybe a stack.\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m \n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u0007noise\u001b[0m\u001b[35m\u001b[0m LOUD NOISES\n', '\u001b[0m' ] var logPrefixEventsExpect = [ { id: 2, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 9, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 16, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] } ] // should be the same. var logInfoEventsExpect = logPrefixEventsExpect var logEventsExpect = [ { id: 0, level: 'silly', prefix: 'silly prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 1, level: 'verbose', prefix: 'verbose prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 2, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 3, level: 'http', prefix: 'http prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 4, level: 'warn', prefix: 'warn prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 5, level: 'error', prefix: 'error prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 6, level: 'silent', prefix: 'silent prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 7, level: 'silly', prefix: 'silly prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 8, level: 'verbose', prefix: 'verbose prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 9, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 10, level: 'http', prefix: 'http prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 11, level: 'warn', prefix: 'warn prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 12, level: 'error', prefix: 'error 
prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 13, level: 'silent', prefix: 'silent prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 14, level: 'silly', prefix: 'silly prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 15, level: 'verbose', prefix: 'verbose prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 16, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 17, level: 'http', prefix: 'http prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 18, level: 'warn', prefix: 'warn prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 19, level: 'error', prefix: 'error prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 20, level: 'silent', prefix: 'silent prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 21, level: 'error', prefix: '404', message: 'This is a longer\nmessage, with some details\nand maybe a stack.\n', messageRaw: [ 'This is a longer\nmessage, with some details\nand maybe a stack.\n' ] }, { id: 22, level: 'noise', prefix: false, message: 'LOUD NOISES', messageRaw: [ 'LOUD NOISES' ] } ] var Stream = require('stream').Stream var s = new Stream() s.write = function (m) { result.push(m) } s.writable = true s.isTTY = true s.end = function () {} log.stream = s log.heading = 'npm' tap.test('basic', function (t) { log.on('log', logEvents.push.bind(logEvents)) log.on('log.info', logInfoEvents.push.bind(logInfoEvents)) log.on('info prefix', logPrefixEvents.push.bind(logPrefixEvents)) console.error('log.level=silly') log.level = 'silly' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=silent') log.level = 'silent' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=info') log.level = 'info' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('404', 'This is a longer\n'+ 'message, with some details\n'+ 'and maybe a stack.\n') log.addLevel('noise', 10000, {beep: true}) log.noise(false, 'LOUD NOISES') t.deepEqual(result.join('').trim(), resultExpect.join('').trim(), 'result') t.deepEqual(log.record, logEventsExpect, 'record') 
t.deepEqual(logEvents, logEventsExpect, 'logEvents') t.deepEqual(logInfoEvents, logInfoEventsExpect, 'logInfoEvents') t.deepEqual(logPrefixEvents, logPrefixEventsExpect, 'logPrefixEvents') t.end() }) npm_3.5.2.orig/node_modules/npm-install-checks/node_modules/npmlog/test/progress.js0000644000000000000000000000621612631326456027002 0ustar 00000000000000'use strict' var test = require('tap').test var log = require('../log.js') var actions = [] log.gauge = { enable: function () { actions.push(['enable']) }, disable: function () { actions.push(['disable']) }, hide: function () { actions.push(['hide']) }, show: function (name, completed) { actions.push(['show', name, completed]) }, pulse: function (name) { actions.push(['pulse', name]) } } function didActions(t, msg, output) { var tests = [] for (var ii = 0; ii < output.length; ++ ii) { for (var jj = 0; jj < output[ii].length; ++ jj) { tests.push({cmd: ii, arg: jj}) } } t.is(actions.length, output.length, msg) tests.forEach(function (test) { t.is(actions[test.cmd] ? actions[test.cmd][test.arg] : null, output[test.cmd][test.arg], msg + ': ' + output[test.cmd] + (test.arg ? ' arg #'+test.arg : '')) }) actions = [] } test('enableProgress', function (t) { t.plan(6) log.enableProgress() didActions(t, 'enableProgress', [ [ 'enable' ], [ 'show', undefined, 0 ] ]) log.enableProgress() didActions(t, 'enableProgress again', []) }) test('disableProgress', function (t) { t.plan(4) log.disableProgress() didActions(t, 'disableProgress', [ [ 'hide' ], [ 'disable' ] ]) log.disableProgress() didActions(t, 'disableProgress again', []) }) test('showProgress', function (t) { t.plan(5) log.showProgress('foo') didActions(t, 'showProgress disabled', []) log.enableProgress() actions = [] log.showProgress('foo') didActions(t, 'showProgress', [ [ 'show', 'foo', 0 ] ]) }) test('clearProgress', function (t) { t.plan(3) log.clearProgress() didActions(t, 'clearProgress', [ [ 'hide' ] ]) log.disableProgress() actions = [] log.clearProgress() didActions(t, 'clearProgress disabled', [ ]) }) test("newItem", function (t) { t.plan(12) log.enableProgress() actions = [] var a = log.newItem("test", 10) didActions(t, "newItem", [ [ 'show', undefined, 0 ] ]) a.completeWork(5) didActions(t, "newItem:completeWork", [ [ 'show', 'test', 0.5 ] ]) a.finish() didActions(t, "newItem:finish", [ [ 'show', 'test', 1 ] ]) }) // test that log objects proxy through. 
And test that completion status filters up test("newGroup", function (t) { t.plan(23) var a = log.newGroup("newGroup") didActions(t, "newGroup", [ [ 'show', undefined, 0.5 ] ]) a.warn("test", "this is a test") didActions(t, "newGroup:warn", [ [ 'pulse', 'test' ], [ 'hide' ], [ 'show', undefined, 0.5 ] ]) var b = a.newItem("newGroup2", 10) didActions(t, "newGroup:newItem", [ [ 'show', 'newGroup', 0.5 ] ]) b.completeWork(5) didActions(t, "newGroup:completeWork", [ [ 'show', 'newGroup2', 0.75 ] ]) a.finish() didActions(t, "newGroup:finish", [ [ 'show', 'newGroup', 1 ] ]) }) test("newStream", function (t) { t.plan(13) var a = log.newStream("newStream", 10) didActions(t, "newStream", [ [ 'show', undefined, 0.6666666666666666 ] ]) a.write("abcde") didActions(t, "newStream", [ [ 'show', 'newStream', 0.8333333333333333 ] ]) a.write("fghij") didActions(t, "newStream", [ [ 'show', 'newStream', 1 ] ]) t.is(log.tracker.completed(), 1, "Overall completion") }) npm_3.5.2.orig/node_modules/npm-install-checks/test/check-engine.js0000644000000000000000000000176312631326456023507 0ustar 00000000000000var test = require("tap").test var c = require("../index.js").checkEngine test("no engine defined", function (t) { c({ engines: {}}, "1.1.2", "0.2.1", false, true, function (err) { t.notOk(err, "no error present") t.end() }) }) test("node version too old", function (t) { var target = { engines: { node: "0.10.24" }} c(target, "1.1.2", "0.10.18", false, true, function (err) { t.ok(err, "returns an error") t.equals(err.required.node, "0.10.24") t.end() }) }) test("npm version too old", function (t) { var target = { engines: { npm: "^1.4.6" }} c(target, "1.3.2", "0.2.1", false, true, function (err) { t.ok(err, "returns an error") t.equals(err.required.npm, "^1.4.6") t.end() }) }) test("strict=false w/engineStrict json does not return an error", function (t) { var target = { engines: { npm: "1.3.6" }, engineStrict: true } c(target, "1.4.2", "0.2.1", false, false, function (err) { t.notOk(err, "returns no error") t.end() }) }) npm_3.5.2.orig/node_modules/npm-install-checks/test/check-git.js0000644000000000000000000000127612631326456023024 0ustar 00000000000000var test = require("tap").test var c = require("../index.js").checkGit var fs = require("fs") var rimraf = require("rimraf") var mkdirp = require("mkdirp") var path = require("path") var gitFixturePath = path.resolve(__dirname, "out") test("is .git repo", function (t) { mkdirp(gitFixturePath + "/.git", function () { c(gitFixturePath, function (err) { t.ok(err, "error present") t.equal(err.code, "EISGIT") t.end() }) }) }) test("is not a .git repo", function (t) { c(__dirname, function (err) { t.notOk(err, "error not present") t.end() }) }) test("cleanup", function (t) { rimraf(gitFixturePath, function () { t.pass("cleanup") t.end() }) }) npm_3.5.2.orig/node_modules/npm-install-checks/test/check-platform.js0000644000000000000000000000161212631326456024057 0ustar 00000000000000var test = require("tap").test var c = require("../index.js").checkPlatform test("target cpu wrong", function (t) { var target = {} target.cpu = "enten-cpu" target.os = "any" c(target, false, function (err) { t.ok(err, "error present") t.equal(err.code, "EBADPLATFORM") t.end() }) }) test("os wrong", function (t) { var target = {} target.cpu = "any" target.os = "enten-os" c(target, false, function (err) { t.ok(err, "error present") t.equal(err.code, "EBADPLATFORM") t.end() }) }) test("nothing wrong", function (t) { var target = {} target.cpu = "any" target.os = "any" c(target, false, function 
(err) { t.notOk(err, "no error present") t.end() }) }) test("force", function (t) { var target = {} target.cpu = "enten-cpu" target.os = "any" c(target, true, function (err) { t.notOk(err, "no error present") t.end() }) }) npm_3.5.2.orig/node_modules/npm-package-arg/LICENSE0000644000000000000000000000135412631326456020111 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/npm-package-arg/README.md0000644000000000000000000000734712631326456020373 0ustar 00000000000000# npm-package-arg Parse package name and specifier passed to commands like `npm install` or `npm cache add`. This just parses the text given-- it's worth noting that `npm` has further logic it applies by looking at your disk to figure out what ambiguous specifiers are. If you want that logic, please see [realize-package-specifier]. [realize-package-specifier]: https://www.npmjs.org/package/realize-package-specifier Arguments look like: `foo@1.2`, `@bar/foo@1.2`, `foo@user/foo`, `http://x.com/foo.tgz`, `git+https://github.com/user/foo`, `bitbucket:user/foo`, `foo.tar.gz` or `bar` ## EXAMPLES ```javascript var assert = require("assert") var npa = require("npm-package-arg") // Pass in the descriptor, and it'll return an object var parsed = npa("@bar/foo@1.2") // Returns an object like: { raw: '@bar/foo@1.2', // what was passed in name: "@bar/foo", // the name of the package scope: "@bar", // the private scope of the package, or null type: "range", // the type of specifier this is spec: ">=1.2.0 <1.3.0" // the expanded specifier rawSpec: "1.2" // the specifier as passed in } // Parsing urls pointing at hosted git services produces a variation: var parsed = npa("git+https://github.com/user/foo") // Returns an object like: { raw: 'git+https://github.com/user/foo', scope: null, name: null, rawSpec: 'git+https://github.com/user/foo', spec: 'user/foo', type: 'hosted', hosted: { type: 'github', ssh: 'git@github.com:user/foo.git', sshurl: 'git+ssh://git@github.com/user/foo.git', https: 'https://github.com/user/foo.git', directUrl: 'https://raw.githubusercontent.com/user/foo/master/package.json' } } // Completely unreasonable invalid garbage throws an error // Make sure you wrap this in a try/catch if you have not // already sanitized the inputs! assert.throws(function() { npa("this is not \0 a valid package name or url") }) ``` ## USING `var npa = require('npm-package-arg')` * var result = npa(*arg*) Parses *arg* and returns a result object detailing what *arg* is. *arg* -- a package descriptor, like: `foo@1.2`, or `foo@user/foo`, or `http://x.com/foo.tgz`, or `git+https://github.com/user/foo` ## RESULT OBJECT The objects that are returned by npm-package-arg contain the following keys: * `name` - If known, the `name` field expected in the resulting pkg. 
* `type` - One of the following strings: * `git` - A git repo * `hosted` - A hosted project, from github, bitbucket or gitlab. Originally either a full url pointing at one of these services or a shorthand like `user/project` or `github:user/project` for github or `bitbucket:user/project` for bitbucket. * `tag` - A tagged version, like `"foo@latest"` * `version` - A specific version number, like `"foo@1.2.3"` * `range` - A version range, like `"foo@2.x"` * `local` - A local file or folder path * `remote` - An http url (presumably to a tgz) * `spec` - The "thing". URL, the range, git repo, etc. * `hosted` - If type=hosted this will be an object with the following keys: * `type` - github, bitbucket or gitlab * `ssh` - The ssh path for this git repo * `sshUrl` - The ssh URL for this git repo * `httpsUrl` - The HTTPS URL for this git repo * `directUrl` - The URL for the package.json in this git repo * `raw` - The original un-modified string that was provided. * `rawSpec` - The part after the `name@...`, as it was originally provided. * `scope` - If a name is something like `@org/module` then the `scope` field will be set to `org`. If it doesn't have a scoped name, then scope is `null`. If you only include a name and no specifier part, e.g. `foo` or `foo@`, then a default of `latest` will be used (as of 4.1.0). This is in contrast with previous behavior, where `*` was used. npm_3.5.2.orig/node_modules/npm-package-arg/npa.js0000644000000000000000000001062612631326456020222 0ustar 00000000000000var url = require("url") var assert = require("assert") var util = require("util") var semver = require("semver") var path = require("path") var HostedGit = require("hosted-git-info") module.exports = npa var isWindows = process.platform === "win32" || global.FAKE_WINDOWS var slashRe = isWindows ? /\\|[/]/ : /[/]/ var parseName = /^(?:@([^/]+?)[/])?([^/]+?)$/ var nameAt = /^(@([^/]+?)[/])?([^/]+?)@/ var debug = util.debuglog ? util.debuglog("npa") : /\bnpa\b/i.test(process.env.NODE_DEBUG || "") ? function () { console.error("NPA: " + util.format.apply(util, arguments).split("\n").join("\nNPA: ")) } : function () {} function validName (name) { if (!name) { debug("not a name %j", name) return false } var n = name.trim() if (!n || n.charAt(0) === "." || !n.match(/^[a-zA-Z0-9]/) || n.match(/[/()&?#|<>@:%\s\\*'"!~`]/) || n.toLowerCase() === "node_modules" || n !== encodeURIComponent(n) || n.toLowerCase() === "favicon.ico") { debug("not a valid name %j", name) return false } return n } function npa (arg) { assert.equal(typeof arg, "string") arg = arg.trim() var res = new Result res.raw = arg res.scope = null // See if it's something like foo@... var nameparse = arg.match(nameAt) debug("nameparse", nameparse) if (nameparse && validName(nameparse[3]) && (!nameparse[2] || validName(nameparse[2]))) { res.name = (nameparse[1] || "") + nameparse[3] if (nameparse[2]) res.scope = "@" + nameparse[2] arg = arg.substr(nameparse[0].length) } else { res.name = null } res.rawSpec = arg res.spec = arg var urlparse = url.parse(arg) debug("urlparse", urlparse) // windows paths look like urls // don't be fooled!
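// (illustration: on the legacy url parser, url.parse("C:\\x\\y\\z") reports
// protocol "c:", so a bare drive letter would be mistaken for a URL scheme
// without the drive-letter check below)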
if (isWindows && urlparse && urlparse.protocol && urlparse.protocol.match(/^[a-zA-Z]:$/)) { debug("windows url-ish local path", urlparse) urlparse = {} } if (urlparse.protocol || HostedGit.fromUrl(arg)) { return parseUrl(res, arg, urlparse) } // at this point, it's not a url, and not hosted // If it's a valid name, and doesn't already have a name, then assume // $name@"" range // // if it's got / chars in it, then assume that it's local. if (res.name) { if (arg == '') arg = 'latest' var version = semver.valid(arg, true) var range = semver.validRange(arg, true) // foo@... if (version) { res.spec = version res.type = "version" } else if (range) { res.spec = range res.type = "range" } else if (slashRe.test(arg)) { parseLocal(res, arg) } else { res.type = "tag" res.spec = arg } } else { var p = arg.match(parseName) if (p && validName(p[2]) && (!p[1] || validName(p[1]))) { res.type = "tag" res.spec = "latest" res.rawSpec = "" res.name = arg if (p[1]) res.scope = "@" + p[1] } else { parseLocal(res, arg) } } return res } function parseLocal (res, arg) { // turns out nearly every character is allowed in fs paths if (/\0/.test(arg)) { throw new Error("Invalid Path: " + JSON.stringify(arg)) } res.type = "local" res.spec = path.resolve(arg) } function parseUrl (res, arg, urlparse) { var gitHost = HostedGit.fromUrl(arg) if (gitHost) { res.type = "hosted" res.spec = gitHost.toString() res.hosted = { type: gitHost.type, ssh: gitHost.ssh(), sshUrl: gitHost.sshurl(), httpsUrl: gitHost.https(), gitUrl: gitHost.git(), shortcut: gitHost.shortcut(), directUrl: gitHost.file("package.json") } return res } // check the protocol, and then see if it's git or not switch (urlparse.protocol) { case "git:": case "git+http:": case "git+https:": case "git+rsync:": case "git+ftp:": case "git+ssh:": case "git+file:": res.type = "git" res.spec = arg.replace(/^git[+]/, "") break case "http:": case "https:": res.type = "remote" res.spec = arg break case "file:": res.type = "local" res.spec = urlparse.pathname break default: throw new Error("Unsupported URL Type: " + arg) } return res } function Result () { if (!(this instanceof Result)) return new Result } Result.prototype.name = null Result.prototype.type = null Result.prototype.spec = null Result.prototype.raw = null Result.prototype.hosted = null npm_3.5.2.orig/node_modules/npm-package-arg/package.json0000644000000000000000000000411312631326456021366 0ustar 00000000000000{ "_args": [ [ "npm-package-arg@~4.1.0", "/Users/rebecca/code/npm" ] ], "_from": "npm-package-arg@>=4.1.0 <4.2.0", "_id": "npm-package-arg@4.1.0", "_inCache": true, "_installable": true, "_location": "/npm-package-arg", "_nodeVersion": "4.2.1", "_npmUser": { "email": "me@re-becca.org", "name": "iarna" }, "_npmVersion": "3.4.0", "_phantomChildren": {}, "_requested": { "name": "npm-package-arg", "raw": "npm-package-arg@~4.1.0", "rawSpec": "~4.1.0", "scope": null, "spec": ">=4.1.0 <4.2.0", "type": "range" }, "_requiredBy": [ "/", "/init-package-json", "/npm-registry-client", "/realize-package-specifier" ], "_shasum": "2e015f8ac00737cb97f997c9cbf059f42a74527d", "_shrinkwrap": null, "_spec": "npm-package-arg@~4.1.0", "_where": "/Users/rebecca/code/npm", "author": { "email": "i@izs.me", "name": "Isaac Z.
Schlueter", "url": "http://blog.izs.me/" }, "bugs": { "url": "https://github.com/npm/npm-package-arg/issues" }, "dependencies": { "hosted-git-info": "^2.1.4", "semver": "4 || 5" }, "description": "Parse the things that can be arguments to `npm install`", "devDependencies": { "tap": "^1.2.0" }, "directories": { "test": "test" }, "dist": { "shasum": "2e015f8ac00737cb97f997c9cbf059f42a74527d", "tarball": "http://registry.npmjs.org/npm-package-arg/-/npm-package-arg-4.1.0.tgz" }, "gitHead": "383b4783a076b825815be51eb1ab2e4bb8a1e1fc", "homepage": "https://github.com/npm/npm-package-arg", "license": "ISC", "main": "npa.js", "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, { "name": "iarna", "email": "me@re-becca.org" } ], "name": "npm-package-arg", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git+https://github.com/npm/npm-package-arg.git" }, "scripts": { "test": "tap test/*.js" }, "version": "4.1.0" } npm_3.5.2.orig/node_modules/npm-package-arg/test/0000755000000000000000000000000012631326456020060 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-package-arg/test/basic.js0000644000000000000000000000667512631326456021515 0ustar 00000000000000var npa = require("../npa.js") var path = require("path") require("tap").test("basic", function (t) { t.setMaxListeners(999) var tests = { "foo@1.2": { name: "foo", type: "range", spec: ">=1.2.0 <1.3.0", raw: "foo@1.2", rawSpec: "1.2" }, "@foo/bar": { raw: "@foo/bar", name: "@foo/bar", scope: "@foo", rawSpec: "", spec: "latest", type: "tag" }, "@foo/bar@": { raw: "@foo/bar@", name: "@foo/bar", scope: "@foo", rawSpec: "", spec: "latest", type: "tag" }, "@foo/bar@baz": { raw: "@foo/bar@baz", name: "@foo/bar", scope: "@foo", rawSpec: "baz", spec: "baz", type: "tag" }, "@f fo o al/ a d s ;f ": { raw: "@f fo o al/ a d s ;f", name: null, rawSpec: "@f fo o al/ a d s ;f", spec: path.resolve("@f fo o al/ a d s ;f"), type: "local" }, "foo@1.2.3": { name: "foo", type: "version", spec: "1.2.3", raw: "foo@1.2.3" }, "foo@=v1.2.3": { name: "foo", type: "version", spec: "1.2.3", raw: "foo@=v1.2.3", rawSpec: "=v1.2.3" }, "git+ssh://git@notgithub.com/user/foo#1.2.3": { name: null, type: "git", spec: "ssh://git@notgithub.com/user/foo#1.2.3", raw: "git+ssh://git@notgithub.com/user/foo#1.2.3" }, "git+file://path/to/repo#1.2.3": { name: null, type: "git", spec: "file://path/to/repo#1.2.3", raw: "git+file://path/to/repo#1.2.3" }, "git://notgithub.com/user/foo": { name: null, type: "git", spec: "git://notgithub.com/user/foo", raw: "git://notgithub.com/user/foo" }, "@foo/bar@git+ssh://notgithub.com/user/foo": { name: "@foo/bar", scope: "@foo", spec: "ssh://notgithub.com/user/foo", rawSpec: "git+ssh://notgithub.com/user/foo", raw: "@foo/bar@git+ssh://notgithub.com/user/foo" }, "/path/to/foo": { name: null, type: "local", spec: path.resolve(__dirname, "/path/to/foo"), raw: "/path/to/foo" }, "file:path/to/foo": { name: null, type: "local", spec: "path/to/foo", raw: "file:path/to/foo" }, "file:~/path/to/foo": { name: null, type: "local", spec: "~/path/to/foo", raw: "file:~/path/to/foo" }, "file:../path/to/foo": { name: null, type: "local", spec: "../path/to/foo", raw: "file:../path/to/foo" }, "file:///path/to/foo": { name: null, type: "local", spec: "/path/to/foo", raw: "file:///path/to/foo" }, "https://server.com/foo.tgz": { name: null, type: "remote", spec: "https://server.com/foo.tgz", raw: "https://server.com/foo.tgz" }, "foo@latest": { name: "foo", 
type: "tag", spec: "latest", raw: "foo@latest" }, "foo": { name: "foo", type: "tag", spec: "latest", raw: "foo" } } Object.keys(tests).forEach(function (arg) { var res = npa(arg) t.type(res, "Result", arg + " is result") t.has(res, tests[arg], arg + " matches expectations") }) // Completely unreasonable invalid garbage throws an error t.throws(function() { npa("this is not a \0 valid package name or url") }) t.throws(function() { npa("gopher://yea right") }, "Unsupported URL Type: gopher://yea right") t.end() }) npm_3.5.2.orig/node_modules/npm-package-arg/test/bitbucket.js0000644000000000000000000000435212631326456022376 0ustar 00000000000000var npa = require("../npa.js") var path = require("path") require("tap").test("basic", function (t) { t.setMaxListeners(999) var tests = { "bitbucket:user/foo-js": { name: null, type: "hosted", hosted: { type: "bitbucket" }, spec: "bitbucket:user/foo-js", raw: "bitbucket:user/foo-js" }, "bitbucket:user/foo-js#bar/baz": { name: null, type: "hosted", hosted: { type: "bitbucket" }, spec: "bitbucket:user/foo-js#bar/baz", raw: "bitbucket:user/foo-js#bar/baz" }, "bitbucket:user..blerg--/..foo-js# . . . . . some . tags / / /": { name: null, type: "hosted", hosted: { type: "bitbucket" }, spec: "bitbucket:user..blerg--/..foo-js# . . . . . some . tags / / /", raw: "bitbucket:user..blerg--/..foo-js# . . . . . some . tags / / /" }, "bitbucket:user/foo-js#bar/baz/bin": { name: null, type: "hosted", hosted: { type: "bitbucket" }, spec: "bitbucket:user/foo-js#bar/baz/bin", raw: "bitbucket:user/foo-js#bar/baz/bin" }, "foo@bitbucket:user/foo-js": { name: "foo", type: "hosted", hosted: { type: "bitbucket" }, spec: "bitbucket:user/foo-js", raw: "foo@bitbucket:user/foo-js" }, "git+ssh://git@bitbucket.org/user/foo#1.2.3": { name: null, type: "hosted", hosted: { type: "bitbucket" }, spec: "git+ssh://git@bitbucket.org/user/foo.git#1.2.3", raw: "git+ssh://git@bitbucket.org/user/foo#1.2.3" }, "https://bitbucket.org/user/foo.git": { name: null, type: "hosted", hosted: { type: "bitbucket" }, spec: "git+https://bitbucket.org/user/foo.git", raw: "https://bitbucket.org/user/foo.git" }, "@foo/bar@git+ssh://bitbucket.org/user/foo": { name: "@foo/bar", scope: "@foo", type: "hosted", hosted: { type: "bitbucket" }, spec: "git+ssh://git@bitbucket.org/user/foo.git", rawSpec: "git+ssh://bitbucket.org/user/foo", raw: "@foo/bar@git+ssh://bitbucket.org/user/foo" } } Object.keys(tests).forEach(function (arg) { var res = npa(arg) t.type(res, "Result", arg + " is a result") t.has(res, tests[arg], arg + " matches expectations") }) t.end() }) npm_3.5.2.orig/node_modules/npm-package-arg/test/github.js0000644000000000000000000000512112631326456021677 0ustar 00000000000000var npa = require("../npa.js") var path = require("path") require("tap").test("basic", function (t) { t.setMaxListeners(999) var tests = { "user/foo-js": { name: null, type: "hosted", hosted: { type: "github" }, spec: "github:user/foo-js", raw: "user/foo-js" }, "user/foo-js#bar/baz": { name: null, type: "hosted", hosted: { type: "github" }, spec: "github:user/foo-js#bar/baz", raw: "user/foo-js#bar/baz" }, "user..blerg--/..foo-js# . . . . . some . tags / / /": { name: null, type: "hosted", hosted: { type: "github" }, spec: "github:user..blerg--/..foo-js# . . . . . some . tags / / /", raw: "user..blerg--/..foo-js# . . . . . some . 
tags / / /" }, "user/foo-js#bar/baz/bin": { name: null, type: "hosted", hosted: { type: "github" }, spec: "github:user/foo-js#bar/baz/bin", raw: "user/foo-js#bar/baz/bin" }, "foo@user/foo-js": { name: "foo", type: "hosted", hosted: { type: "github" }, spec: "github:user/foo-js", raw: "foo@user/foo-js" }, "github:user/foo-js": { name: null, type: "hosted", hosted: { type: "github" }, spec: "github:user/foo-js", raw: "github:user/foo-js" }, "git+ssh://git@github.com/user/foo#1.2.3": { name: null, type: "hosted", hosted: { type: "github" }, spec: "git+ssh://git@github.com/user/foo.git#1.2.3", raw: "git+ssh://git@github.com/user/foo#1.2.3" }, "git://github.com/user/foo": { name: null, type: "hosted", hosted: { type: "github" }, spec: "git://github.com/user/foo.git", raw: "git://github.com/user/foo" }, "https://github.com/user/foo.git": { name: null, type: "hosted", hosted: { type: "github" }, spec: "git+https://github.com/user/foo.git", raw: "https://github.com/user/foo.git" }, "@foo/bar@git+ssh://github.com/user/foo": { name: "@foo/bar", scope: "@foo", type: "hosted", hosted: { type: "github" }, spec: "git+ssh://git@github.com/user/foo.git", rawSpec: "git+ssh://github.com/user/foo", raw: "@foo/bar@git+ssh://github.com/user/foo" }, "foo@bar/foo": { name: "foo", type: "hosted", hosted: { type: "github" }, spec: "github:bar/foo", raw: "foo@bar/foo" } } Object.keys(tests).forEach(function (arg) { var res = npa(arg) t.type(res, "Result", arg + " is a result") t.has(res, tests[arg], arg + " matches expectations") }) t.end() }) npm_3.5.2.orig/node_modules/npm-package-arg/test/gitlab.js0000644000000000000000000000420512631326456021661 0ustar 00000000000000var npa = require("../npa.js") var path = require("path") require("tap").test("basic", function (t) { t.setMaxListeners(999) var tests = { "gitlab:user/foo-js": { name: null, type: "hosted", hosted: { type: "gitlab" }, spec: "gitlab:user/foo-js", raw: "gitlab:user/foo-js" }, "gitlab:user/foo-js#bar/baz": { name: null, type: "hosted", hosted: { type: "gitlab" }, spec: "gitlab:user/foo-js#bar/baz", raw: "gitlab:user/foo-js#bar/baz" }, "gitlab:user..blerg--/..foo-js# . . . . . some . tags / / /": { name: null, type: "hosted", hosted: { type: "gitlab" }, spec: "gitlab:user..blerg--/..foo-js# . . . . . some . tags / / /", raw: "gitlab:user..blerg--/..foo-js# . . . . . some .
tags / / /" }, "gitlab:user/foo-js#bar/baz/bin": { name: null, type: "hosted", hosted: { type: "gitlab" }, spec: "gitlab:user/foo-js#bar/baz/bin", raw: "gitlab:user/foo-js#bar/baz/bin" }, "foo@gitlab:user/foo-js": { name: "foo", type: "hosted", hosted: { type: "gitlab" }, spec: "gitlab:user/foo-js", raw: "foo@gitlab:user/foo-js" }, "git+ssh://git@gitlab.com/user/foo#1.2.3": { name: null, type: "hosted", hosted: { type: "gitlab" }, spec: "git+ssh://git@gitlab.com/user/foo.git#1.2.3", raw: "git+ssh://git@gitlab.com/user/foo#1.2.3" }, "https://gitlab.com/user/foo.git": { name: null, type: "hosted", hosted: { type: "gitlab" }, spec: "git+https://gitlab.com/user/foo.git", raw: "https://gitlab.com/user/foo.git" }, "@foo/bar@git+ssh://gitlab.com/user/foo": { name: "@foo/bar", scope: "@foo", type: "hosted", hosted: { type: "gitlab" }, spec: "git+ssh://git@gitlab.com/user/foo.git", rawSpec: "git+ssh://gitlab.com/user/foo", raw: "@foo/bar@git+ssh://gitlab.com/user/foo" } } Object.keys(tests).forEach(function (arg) { var res = npa(arg) t.type(res, "Result", arg + " is a result") t.has(res, tests[arg], arg + " matches expectations") }) t.end() }) npm_3.5.2.orig/node_modules/npm-package-arg/test/windows.js0000644000000000000000000000151012631326456022105 0ustar 00000000000000global.FAKE_WINDOWS = true var npa = require("../npa.js") var test = require("tap").test var path = require("path") var cases = { "C:\\x\\y\\z": { raw: "C:\\x\\y\\z", scope: null, name: null, rawSpec: "C:\\x\\y\\z", spec: path.resolve("C:\\x\\y\\z"), type: "local" }, "foo@C:\\x\\y\\z": { raw: "foo@C:\\x\\y\\z", scope: null, name: "foo", rawSpec: "C:\\x\\y\\z", spec: path.resolve("C:\\x\\y\\z"), type: "local" }, "foo@/foo/bar/baz": { raw: "foo@/foo/bar/baz", scope: null, name: "foo", rawSpec: "/foo/bar/baz", spec: path.resolve("/foo/bar/baz"), type: "local" } } test("parse a windows path", function (t) { Object.keys(cases).forEach(function (c) { var expect = cases[c] var actual = npa(c) t.same(actual, expect, c) }) t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/.npmignore0000644000000000000000000000010312631326456022054 0ustar 00000000000000test/fixtures/cache node_modules npm-debug.log .eslintrc .jshintrc npm_3.5.2.orig/node_modules/npm-registry-client/.travis.yml0000644000000000000000000000031612631326456022174 0ustar 00000000000000language: node_js node_js: - "0.12" - "0.10" - "0.8" - iojs script: "npm test" sudo: false before_install: - "npm install -g npm@latest" notifications: slack: npm-inc:kRqQjto7YbINqHPb1X6nS3g8 npm_3.5.2.orig/node_modules/npm-registry-client/LICENSE0000644000000000000000000000137512631326456021076 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
npm_3.5.2.orig/node_modules/npm-registry-client/README.md0000644000000000000000000002560112631326456021346 0ustar 00000000000000# npm-registry-client The code that npm uses to talk to the registry. It handles all the caching and HTTP calls. ## Usage ```javascript var RegClient = require('npm-registry-client') var client = new RegClient(config) var uri = "https://registry.npmjs.org/npm" var params = {timeout: 1000} client.get(uri, params, function (error, data, raw, res) { // error is an error if there was a problem. // data is the parsed data object // raw is the json string // res is the response from couch }) ``` # Registry URLs The registry calls take either a full URL pointing to a resource in the registry, or a base URL for the registry as a whole (including the registry path – but be sure to terminate the path with `/`). `http` and `https` URLs are the only ones supported. ## Using the client Every call to the client follows the same pattern: * `uri` {String} The *fully-qualified* URI of the registry API method being invoked. * `params` {Object} Per-request parameters. * `callback` {Function} Callback to be invoked when the call is complete. ### Credentials Many requests to the registry can be authenticated, and require credentials for authorization. These credentials always look the same: * `username` {String} * `password` {String} * `email` {String} * `alwaysAuth` {Boolean} Whether calls to the target registry are always authed. **or** * `token` {String} * `alwaysAuth` {Boolean} Whether calls to the target registry are always authed. ## API ### client.access(uri, params, cb) * `uri` {String} Registry URL for the package's access API endpoint. Looks like `/-/package/<package name>/access`. * `params` {Object} Object containing per-request properties. * `access` {String} New access level for the package. Can be either `public` or `restricted`. Registry will raise an error if trying to change the access level of an unscoped package. * `auth` {Credentials} Set the access level for scoped packages. For now, there are only two access levels: "public" and "restricted". ### client.adduser(uri, params, cb) * `uri` {String} Base registry URL. * `params` {Object} Object containing per-request properties. * `auth` {Credentials} * `cb` {Function} * `error` {Error | null} * `data` {Object} the parsed data object * `raw` {String} the json * `res` {Response Object} response from couch Add a user account to the registry, or verify the credentials. ### client.deprecate(uri, params, cb) * `uri` {String} Full registry URI for the deprecated package. * `params` {Object} Object containing per-request properties. * `version` {String} Semver version range. * `message` {String} The message to use as a deprecation warning. * `auth` {Credentials} * `cb` {Function} Deprecate a version of a package in the registry. ### client.distTags.fetch(uri, params, cb) * `uri` {String} Base URL for the registry. * `params` {Object} Object containing per-request properties. * `package` {String} Name of the package. * `auth` {Credentials} * `cb` {Function} Fetch all of the `dist-tags` for the named package. ### client.distTags.add(uri, params, cb) * `uri` {String} Base URL for the registry. * `params` {Object} Object containing per-request properties. * `package` {String} Name of the package. * `distTag` {String} Name of the new `dist-tag`. * `version` {String} Exact version to be mapped to the `dist-tag`. * `auth` {Credentials} * `cb` {Function} Add (or replace) a single dist-tag onto the named package.
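As with `client.get` in the usage example above, calls pass a registry URL, a params object, and a callback. A minimal sketch of adding a dist-tag (the package name, tag, version, and token here are hypothetical):

```javascript
var RegClient = require('npm-registry-client')
var client = new RegClient({})

var params = {
  package: 'foo',   // hypothetical package name
  distTag: 'beta',  // tag to add or replace
  version: '1.2.3', // exact version the tag should point at
  auth: { token: 'xyzzy', alwaysAuth: true } // hypothetical credentials
}

// note the trailing slash on the base registry URL
client.distTags.add('https://registry.npmjs.org/', params, function (error, data) {
  if (error) return console.error(error)
  console.log(data) // parsed response from the registry
})
```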
### client.distTags.set(uri, params, cb) * `uri` {String} Base URL for the registry. * `params` {Object} Object containing per-request properties. * `package` {String} Name of the package. * `distTags` {Object} Object containing a map from tag names to package versions. * `auth` {Credentials} * `cb` {Function} Set all of the `dist-tags` for the named package at once, creating any `dist-tags` that do not already exist. Any `dist-tags` not included in the `distTags` map will be removed. ### client.distTags.update(uri, params, cb) * `uri` {String} Base URL for the registry. * `params` {Object} Object containing per-request properties. * `package` {String} Name of the package. * `distTags` {Object} Object containing a map from tag names to package versions. * `auth` {Credentials} * `cb` {Function} Update the values of multiple `dist-tags`, creating any `dist-tags` that do not already exist. Any pre-existing `dist-tags` not included in the `distTags` map will be left alone. ### client.distTags.rm(uri, params, cb) * `uri` {String} Base URL for the registry. * `params` {Object} Object containing per-request properties. * `package` {String} Name of the package. * `distTag` {String} Name of the `dist-tag` to remove. * `auth` {Credentials} * `cb` {Function} Remove a single `dist-tag` from the named package. ### client.get(uri, params, cb) * `uri` {String} The complete registry URI to fetch * `params` {Object} Object containing per-request properties. * `timeout` {Number} Duration before the request times out. Optional (default: never). * `follow` {Boolean} Follow 302/301 responses. Optional (default: true). * `staleOk` {Boolean} If there's cached data available, then return that to the callback quickly, and update the cache in the background. Optional (default: false). * `auth` {Credentials} Optional. * `cb` {Function} Fetches data from the registry via a GET request, saving it in the cache folder with the ETag or the "Last Modified" timestamp. ### client.publish(uri, params, cb) * `uri` {String} The registry URI for the package to publish. * `params` {Object} Object containing per-request properties. * `metadata` {Object} Package metadata. * `access` {String} Access for the package. Can be `public` or `restricted` (no default). * `body` {Stream} Stream of the package body / tarball. * `auth` {Credentials} * `cb` {Function} Publish a package to the registry. Note that this does not create the tarball from a folder. ### client.star(uri, params, cb) * `uri` {String} The complete registry URI for the package to star. * `params` {Object} Object containing per-request properties. * `starred` {Boolean} True to star the package, false to unstar it. Optional (default: false). * `auth` {Credentials} * `cb` {Function} Star or unstar a package. Note that the user does not have to be the package owner to star or unstar a package, though other writes do require that the user be the package owner. ### client.stars(uri, params, cb) * `uri` {String} The base URL for the registry. * `params` {Object} Object containing per-request properties. * `username` {String} Name of user to fetch starred packages for. Optional (default: user in `auth`). * `auth` {Credentials} Optional (required if `username` is omitted). * `cb` {Function} View your own or another user's starred packages. ### client.tag(uri, params, cb) * `uri` {String} The complete registry URI to tag * `params` {Object} Object containing per-request properties. * `version` {String} Version to tag. * `tag` {String} Tag name to apply.
* `auth` {Credentials} * `cb` {Function} Mark a version in the `dist-tags` hash, so that `pkg@tag` will fetch the specified version. ### client.unpublish(uri, params, cb) * `uri` {String} The complete registry URI of the package to unpublish. * `params` {Object} Object containing per-request properties. * `version` {String} Version to unpublish. Optional – omit to unpublish all versions. * `auth` {Credentials} * `cb` {Function} Remove a version of a package (or all versions) from the registry. When the last version is unpublished, the entire document is removed from the database. ### client.whoami(uri, params, cb) * `uri` {String} The base URL for the registry. * `params` {Object} Object containing per-request properties. * `auth` {Credentials} * `cb` {Function} Simple call to see who the registry thinks you are. Especially useful with token-based auth. ## PLUMBING The below are primarily intended for use by the rest of the API, or by the npm caching logic directly. ### client.request(uri, params, cb) * `uri` {String} URI pointing to the resource to request. * `params` {Object} Object containing per-request properties. * `method` {String} HTTP method. Optional (default: "GET"). * `body` {Stream | Buffer | String | Object} The request body. Objects that are not Buffers or Streams are encoded as JSON. Optional – body only used for write operations. * `etag` {String} The cached ETag. Optional. * `lastModified` {String} The cached Last-Modified timestamp. Optional. * `follow` {Boolean} Follow 302/301 responses. Optional (default: true). * `auth` {Credentials} Optional. * `cb` {Function} * `error` {Error | null} * `data` {Object} the parsed data object * `raw` {String} the json * `res` {Response Object} response from couch Make a generic request to the registry. All the other methods are wrappers around `client.request`. ### client.fetch(uri, params, cb) * `uri` {String} The complete registry URI to fetch from * `params` {Object} Object containing per-request properties. * `headers` {Object} HTTP headers to be included with the request. Optional. * `auth` {Credentials} Optional. * `cb` {Function} Fetch a package from a URL, with auth set appropriately if included. Used to cache remote tarballs as well as request package tarballs from the registry. # Configuration The client uses its own configuration, which is just passed in as a simple nested object. The following are the supported values (with their defaults, if any): * `proxy.http` {URL} The URL to proxy HTTP requests through. * `proxy.https` {URL} The URL to proxy HTTPS requests through. Defaults to be the same as `proxy.http` if unset. * `proxy.localAddress` {IP} The local address to use on multi-homed systems. * `ssl.ca` {String} Certificate signing authority certificates to trust. * `ssl.certificate` {String} Client certificate (PEM encoded). Enable access to servers that require client certificates. * `ssl.key` {String} Private key (PEM encoded) for client certificate. * `ssl.strict` {Boolean} Whether or not to be strict with SSL certificates. Default = `true` * `retry.count` {Number} Number of times to retry on GET failures. Default = 2. * `retry.factor` {Number} `factor` setting for `node-retry`. Default = 10. * `retry.minTimeout` {Number} `minTimeout` setting for `node-retry`. Default = 10000 (10 seconds) * `retry.maxTimeout` {Number} `maxTimeout` setting for `node-retry`. Default = 60000 (60 seconds) * `userAgent` {String} User agent header to send. Default = `"node/{process.version}"` * `log` {Object} The logger to use.
npm_3.5.2.orig/node_modules/npm-registry-client/index.js0000644000000000000000000000435312631326456021535 0ustar 00000000000000// utilities for working with the js-registry site.
module.exports = RegClient

var join = require('path').join
var fs = require('graceful-fs')

var npmlog
try {
  npmlog = require('npmlog')
} catch (er) {
  npmlog = {
    error: noop,
    warn: noop,
    info: noop,
    verbose: noop,
    silly: noop,
    http: noop,
    pause: noop,
    resume: noop
  }
}

function noop () {}

function RegClient (config) {
  this.config = Object.create(config || {})

  this.config.proxy = this.config.proxy || {}
  if (!this.config.proxy.https && this.config.proxy.http) {
    this.config.proxy.https = this.config.proxy.http
  }

  this.config.ssl = this.config.ssl || {}
  if (this.config.ssl.strict === undefined) this.config.ssl.strict = true

  this.config.retry = this.config.retry || {}
  if (typeof this.config.retry.retries !== 'number') this.config.retry.retries = 2
  if (typeof this.config.retry.factor !== 'number') this.config.retry.factor = 10
  if (typeof this.config.retry.minTimeout !== 'number') this.config.retry.minTimeout = 10000
  if (typeof this.config.retry.maxTimeout !== 'number') this.config.retry.maxTimeout = 60000

  this.config.userAgent = this.config.userAgent || 'node/' + process.version
  this.config.defaultTag = this.config.defaultTag || 'latest'

  this.log = this.config.log || npmlog
  delete this.config.log

  var client = this
  fs.readdirSync(join(__dirname, 'lib')).forEach(function (f) {
    var entry = join(__dirname, 'lib', f)
    // lib/group-name/operation.js -> client.groupName.operation
    var stat = fs.statSync(entry)
    if (stat.isDirectory()) {
      var groupName = f.replace(/-([a-z])/, dashToCamel)
      fs.readdirSync(entry).forEach(function (f) {
        if (!f.match(/\.js$/)) return
        if (!client[groupName]) {
          // keep client.groupName.operation from stomping client.operation
          client[groupName] = Object.create(client)
        }
        var name = f.replace(/\.js$/, '').replace(/-([a-z])/, dashToCamel)
        client[groupName][name] = require(join(entry, f))
      })
      return
    }
    if (!f.match(/\.js$/)) return
    var name = f.replace(/\.js$/, '').replace(/-([a-z])/, dashToCamel)
    client[name] = require(entry)
  })
}

function dashToCamel (_, l) {
  return l.toUpperCase()
}
npm_3.5.2.orig/node_modules/npm-registry-client/lib/0000755000000000000000000000000012631326456020631 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/0000755000000000000000000000000012631326456022540 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/package.json0000644000000000000000000003113212631326456022351 0ustar 00000000000000{ "author": { "name": "Isaac Z.
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "npm-registry-client", "description": "Client for the npm registry", "version": "7.0.9", "repository": { "url": "git+https://github.com/npm/npm-registry-client.git" }, "main": "index.js", "scripts": { "test": "standard && tap test/*.js" }, "dependencies": { "chownr": "^1.0.1", "concat-stream": "^1.4.6", "graceful-fs": "^4.1.2", "mkdirp": "^0.5.0", "normalize-package-data": "~1.0.1 || ^2.0.0", "npm-package-arg": "^3.0.0 || ^4.0.0", "once": "^1.3.0", "request": "^2.47.0", "retry": "^0.8.0", "rimraf": "2", "semver": "2 >=2.2.1 || 3.x || 4 || 5", "slide": "^1.1.3", "npmlog": "~2.0.0" }, "devDependencies": { "negotiator": "^0.4.9", "nock": "^0.56.0", "readable-stream": "^2.0.2", "standard": "^4.0.0", "tap": "^1.2.0" }, "optionalDependencies": { "npmlog": "~2.0.0" }, "license": "ISC", "readme": "# npm-registry-client\n\nThe code that npm uses to talk to the registry.\n\nIt handles all the caching and HTTP calls.\n\n## Usage\n\n```javascript\nvar RegClient = require('npm-registry-client')\nvar client = new RegClient(config)\nvar uri = \"https://registry.npmjs.org/npm\"\nvar params = {timeout: 1000}\n\nclient.get(uri, params, function (error, data, raw, res) {\n // error is an error if there was a problem.\n // data is the parsed data object\n // raw is the json string\n // res is the response from couch\n})\n```\n\n# Registry URLs\n\nThe registry calls take either a full URL pointing to a resource in the\nregistry, or a base URL for the registry as a whole (including the registry\npath – but be sure to terminate the path with `/`). `http` and `https` URLs are\nthe only ones supported.\n\n## Using the client\n\nEvery call to the client follows the same pattern:\n\n* `uri` {String} The *fully-qualified* URI of the registry API method being\n invoked.\n* `params` {Object} Per-request parameters.\n* `callback` {Function} Callback to be invoked when the call is complete.\n\n### Credentials\n\nMany requests to the registry can by authenticated, and require credentials\nfor authorization. These credentials always look the same:\n\n* `username` {String}\n* `password` {String}\n* `email` {String}\n* `alwaysAuth` {Boolean} Whether calls to the target registry are always\n authed.\n\n**or**\n\n* `token` {String}\n* `alwaysAuth` {Boolean} Whether calls to the target registry are always\n authed.\n\n## API\n\n### client.access(uri, params, cb)\n\n* `uri` {String} Registry URL for the package's access API endpoint.\n Looks like `/-/package//access`.\n* `params` {Object} Object containing per-request properties.\n * `access` {String} New access level for the package. Can be either\n `public` or `restricted`. Registry will raise an error if trying\n to change the access level of an unscoped package.\n * `auth` {Credentials}\n\nSet the access level for scoped packages. 
For now, there are only two\naccess levels: \"public\" and \"restricted\".\n\n### client.adduser(uri, params, cb)\n\n* `uri` {String} Base registry URL.\n* `params` {Object} Object containing per-request properties.\n * `auth` {Credentials}\n* `cb` {Function}\n * `error` {Error | null}\n * `data` {Object} the parsed data object\n * `raw` {String} the json\n * `res` {Response Object} response from couch\n\nAdd a user account to the registry, or verify the credentials.\n\n### client.deprecate(uri, params, cb)\n\n* `uri` {String} Full registry URI for the deprecated package.\n* `params` {Object} Object containing per-request properties.\n * `version` {String} Semver version range.\n * `message` {String} The message to use as a deprecation warning.\n * `auth` {Credentials}\n* `cb` {Function}\n\nDeprecate a version of a package in the registry.\n\n### client.distTags.fetch(uri, params, cb)\n\n* `uri` {String} Base URL for the registry.\n* `params` {Object} Object containing per-request properties.\n * `package` {String} Name of the package.\n * `auth` {Credentials}\n* `cb` {Function}\n\nFetch all of the `dist-tags` for the named package.\n\n### client.distTags.add(uri, params, cb)\n\n* `uri` {String} Base URL for the registry.\n* `params` {Object} Object containing per-request properties.\n * `package` {String} Name of the package.\n * `distTag` {String} Name of the new `dist-tag`.\n * `version` {String} Exact version to be mapped to the `dist-tag`.\n * `auth` {Credentials}\n* `cb` {Function}\n\nAdd (or replace) a single dist-tag onto the named package.\n\n### client.distTags.set(uri, params, cb)\n\n* `uri` {String} Base URL for the registry.\n* `params` {Object} Object containing per-request properties.\n * `package` {String} Name of the package.\n * `distTags` {Object} Object containing a map from tag names to package\n versions.\n * `auth` {Credentials}\n* `cb` {Function}\n\nSet all of the `dist-tags` for the named package at once, creating any\n`dist-tags` that do not already exit. Any `dist-tags` not included in the\n`distTags` map will be removed.\n\n### client.distTags.update(uri, params, cb)\n\n* `uri` {String} Base URL for the registry.\n* `params` {Object} Object containing per-request properties.\n * `package` {String} Name of the package.\n * `distTags` {Object} Object containing a map from tag names to package\n versions.\n * `auth` {Credentials}\n* `cb` {Function}\n\nUpdate the values of multiple `dist-tags`, creating any `dist-tags` that do\nnot already exist. Any pre-existing `dist-tags` not included in the `distTags`\nmap will be left alone.\n\n### client.distTags.rm(uri, params, cb)\n\n* `uri` {String} Base URL for the registry.\n* `params` {Object} Object containing per-request properties.\n * `package` {String} Name of the package.\n * `distTag` {String} Name of the new `dist-tag`.\n * `auth` {Credentials}\n* `cb` {Function}\n\nRemove a single `dist-tag` from the named package.\n\n### client.get(uri, params, cb)\n\n* `uri` {String} The complete registry URI to fetch\n* `params` {Object} Object containing per-request properties.\n * `timeout` {Number} Duration before the request times out. Optional\n (default: never).\n * `follow` {Boolean} Follow 302/301 responses. Optional (default: true).\n * `staleOk` {Boolean} If there's cached data available, then return that to\n the callback quickly, and update the cache the background. 
Optional\n (default: false).\n * `auth` {Credentials} Optional.\n* `cb` {Function}\n\nFetches data from the registry via a GET request, saving it in the cache folder\nwith the ETag or the \"Last Modified\" timestamp.\n\n### client.publish(uri, params, cb)\n\n* `uri` {String} The registry URI for the package to publish.\n* `params` {Object} Object containing per-request properties.\n * `metadata` {Object} Package metadata.\n * `access` {String} Access for the package. Can be `public` or `restricted` (no default).\n * `body` {Stream} Stream of the package body / tarball.\n * `auth` {Credentials}\n* `cb` {Function}\n\nPublish a package to the registry.\n\nNote that this does not create the tarball from a folder.\n\n### client.star(uri, params, cb)\n\n* `uri` {String} The complete registry URI for the package to star.\n* `params` {Object} Object containing per-request properties.\n * `starred` {Boolean} True to star the package, false to unstar it. Optional\n (default: false).\n * `auth` {Credentials}\n* `cb` {Function}\n\nStar or unstar a package.\n\nNote that the user does not have to be the package owner to star or unstar a\npackage, though other writes do require that the user be the package owner.\n\n### client.stars(uri, params, cb)\n\n* `uri` {String} The base URL for the registry.\n* `params` {Object} Object containing per-request properties.\n * `username` {String} Name of user to fetch starred packages for. Optional\n (default: user in `auth`).\n * `auth` {Credentials} Optional (required if `username` is omitted).\n* `cb` {Function}\n\nView your own or another user's starred packages.\n\n### client.tag(uri, params, cb)\n\n* `uri` {String} The complete registry URI to tag\n* `params` {Object} Object containing per-request properties.\n * `version` {String} Version to tag.\n * `tag` {String} Tag name to apply.\n * `auth` {Credentials}\n* `cb` {Function}\n\nMark a version in the `dist-tags` hash, so that `pkg@tag` will fetch the\nspecified version.\n\n### client.unpublish(uri, params, cb)\n\n* `uri` {String} The complete registry URI of the package to unpublish.\n* `params` {Object} Object containing per-request properties.\n * `version` {String} version to unpublish. Optional – omit to unpublish all\n versions.\n * `auth` {Credentials}\n* `cb` {Function}\n\nRemove a version of a package (or all versions) from the registry. When the\nlast version us unpublished, the entire document is removed from the database.\n\n### client.whoami(uri, params, cb)\n\n* `uri` {String} The base registry for the URI.\n* `params` {Object} Object containing per-request properties.\n * `auth` {Credentials}\n* `cb` {Function}\n\nSimple call to see who the registry thinks you are. Especially useful with\ntoken-based auth.\n\n\n## PLUMBING\n\nThe below are primarily intended for use by the rest of the API, or by the npm\ncaching logic directly.\n\n### client.request(uri, params, cb)\n\n* `uri` {String} URI pointing to the resource to request.\n* `params` {Object} Object containing per-request properties.\n * `method` {String} HTTP method. Optional (default: \"GET\").\n * `body` {Stream | Buffer | String | Object} The request body. Objects\n that are not Buffers or Streams are encoded as JSON. Optional – body\n only used for write operations.\n * `etag` {String} The cached ETag. Optional.\n * `lastModified` {String} The cached Last-Modified timestamp. Optional.\n * `follow` {Boolean} Follow 302/301 responses. 
Optional (default: true).\n * `auth` {Credentials} Optional.\n* `cb` {Function}\n * `error` {Error | null}\n * `data` {Object} the parsed data object\n * `raw` {String} the json\n * `res` {Response Object} response from couch\n\nMake a generic request to the registry. All the other methods are wrappers\naround `client.request`.\n\n### client.fetch(uri, params, cb)\n\n* `uri` {String} The complete registry URI to upload to\n* `params` {Object} Object containing per-request properties.\n * `headers` {Stream} HTTP headers to be included with the request. Optional.\n * `auth` {Credentials} Optional.\n* `cb` {Function}\n\nFetch a package from a URL, with auth set appropriately if included. Used to\ncache remote tarballs as well as request package tarballs from the registry.\n\n# Configuration\n\nThe client uses its own configuration, which is just passed in as a simple\nnested object. The following are the supported values (with their defaults, if\nany):\n\n* `proxy.http` {URL} The URL to proxy HTTP requests through.\n* `proxy.https` {URL} The URL to proxy HTTPS requests through. Defaults to be\n the same as `proxy.http` if unset.\n* `proxy.localAddress` {IP} The local address to use on multi-homed systems.\n* `ssl.ca` {String} Certificate signing authority certificates to trust.\n* `ssl.certificate` {String} Client certificate (PEM encoded). Enable access\n to servers that require client certificates.\n* `ssl.key` {String} Private key (PEM encoded) for client certificate.\n* `ssl.strict` {Boolean} Whether or not to be strict with SSL certificates.\n Default = `true`\n* `retry.count` {Number} Number of times to retry on GET failures. Default = 2.\n* `retry.factor` {Number} `factor` setting for `node-retry`. Default = 10.\n* `retry.minTimeout` {Number} `minTimeout` setting for `node-retry`.\n Default = 10000 (10 seconds)\n* `retry.maxTimeout` {Number} `maxTimeout` setting for `node-retry`.\n Default = 60000 (60 seconds)\n* `userAgent` {String} User agent header to send. Default =\n `\"node/{process.version}\"`\n* `log` {Object} The logger to use. 
Defaults to `require(\"npmlog\")` if\n that works, otherwise logs are disabled.\n* `defaultTag` {String} The default tag to use when publishing new packages.\n Default = `\"latest\"`\n* `couchToken` {Object} A token for use with\n [couch-login](https://npmjs.org/package/couch-login).\n* `sessionToken` {string} A random identifier for this set of client requests.\n Default = 8 random hexadecimal bytes.\n", "readmeFilename": "README.md", "gitHead": "2c0c83149edb270829582a234703404b2ba1c410", "bugs": { "url": "https://github.com/npm/npm-registry-client/issues" }, "homepage": "https://github.com/npm/npm-registry-client#readme", "_id": "npm-registry-client@7.0.9", "_shasum": "1baf86ee5285c4e6d38d4556208ded56049231bb", "_from": "npm-registry-client@>=7.0.9 <7.1.0" } npm_3.5.2.orig/node_modules/npm-registry-client/test/0000755000000000000000000000000012631326456021042 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/lib/access.js0000644000000000000000000001056712631326456022441 0ustar 00000000000000module.exports = access var assert = require('assert') var url = require('url') var npa = require('npm-package-arg') var subcommands = {} function access (sub, uri, params, cb) { accessAssertions(sub, uri, params, cb) return subcommands[sub].call(this, uri, params, cb) } subcommands.public = function (uri, params, cb) { return setAccess.call(this, 'public', uri, params, cb) } subcommands.restricted = function (uri, params, cb) { return setAccess.call(this, 'restricted', uri, params, cb) } function setAccess (access, uri, params, cb) { return this.request(apiUri(uri, 'package', params.package, 'access'), { method: 'POST', auth: params.auth, body: JSON.stringify({ access: access }) }, cb) } subcommands.grant = function (uri, params, cb) { var reqUri = apiUri(uri, 'team', params.scope, params.team, 'package') return this.request(reqUri, { method: 'PUT', auth: params.auth, body: JSON.stringify({ permissions: params.permissions, package: params.package }) }, cb) } subcommands.revoke = function (uri, params, cb) { var reqUri = apiUri(uri, 'team', params.scope, params.team, 'package') return this.request(reqUri, { method: 'DELETE', auth: params.auth, body: JSON.stringify({ package: params.package }) }, cb) } subcommands['ls-packages'] = function (uri, params, cb, type) { type = type || (params.team ? 
'team' : 'org') var client = this var uriParams = '?format=cli' var reqUri = apiUri(uri, type, params.scope, params.team, 'package') return client.request(reqUri + uriParams, { method: 'GET', auth: params.auth }, function (err, perms) { if (err && err.statusCode === 404 && type === 'org') { subcommands['ls-packages'].call(client, uri, params, cb, 'user') } else { cb(err, perms && translatePermissions(perms)) } }) } subcommands['ls-collaborators'] = function (uri, params, cb) { var uriParams = '?format=cli' if (params.user) { uriParams += ('&user=' + encodeURIComponent(params.user)) } var reqUri = apiUri(uri, 'package', params.package, 'collaborators') return this.request(reqUri + uriParams, { method: 'GET', auth: params.auth }, function (err, perms) { cb(err, perms && translatePermissions(perms)) }) } subcommands.edit = function () { throw new Error('edit subcommand is not implemented yet') } function apiUri (registryUri) { var path = Array.prototype.slice.call(arguments, 1) .filter(function (x) { return x }) .map(encodeURIComponent) .join('/') return url.resolve(registryUri, '-/' + path) } function accessAssertions (subcommand, uri, params, cb) { assert(subcommands.hasOwnProperty(subcommand), 'access subcommand must be one of ' + Object.keys(subcommands).join(', ')) typeChecks({ 'uri': [uri, 'string'], 'params': [params, 'object'], 'auth': [params.auth, 'object'], 'callback': [cb, 'function'] }) if (contains([ 'public', 'restricted' ], subcommand)) { typeChecks({ 'package': [params.package, 'string'] }) assert(!!npa(params.package).scope, 'access commands are only accessible for scoped packages') } if (contains(['grant', 'revoke', 'ls-packages'], subcommand)) { typeChecks({ 'scope': [params.scope, 'string'] }) } if (contains(['grant', 'revoke'], subcommand)) { typeChecks({ 'team': [params.team, 'string'] }) } if (subcommand === 'grant') { typeChecks({ 'permissions': [params.permissions, 'string'] }) assert(params.permissions === 'read-only' || params.permissions === 'read-write', 'permissions must be either read-only or read-write') } } function typeChecks (specs) { Object.keys(specs).forEach(function (key) { var checks = specs[key] assert(typeof checks[0] === checks[1], key + ' is required and must be of type ' + checks[1]) }) } function contains (arr, item) { return arr.indexOf(item) !== -1 } function translatePermissions (perms) { var newPerms = {} for (var key in perms) { if (perms.hasOwnProperty(key)) { if (perms[key] === 'read') { newPerms[key] = 'read-only' } else if (perms[key] === 'write') { newPerms[key] = 'read-write' } else { // This shouldn't happen, but let's not break things // if the API starts returning different things. 
newPerms[key] = perms[key] } } } return newPerms }
npm_3.5.2.orig/node_modules/npm-registry-client/lib/adduser.js0000644000000000000000000000737012631326456022625 0ustar 00000000000000
module.exports = adduser var url = require('url') var assert = require('assert') function adduser (uri, params, cb) { assert(typeof uri === 'string', 'must pass registry URI to adduser') assert( params && typeof params === 'object', 'must pass params to adduser' ) assert(typeof cb === 'function', 'must pass callback to adduser') assert(params.auth && typeof params.auth === 'object', 'must pass auth to adduser') var auth = params.auth assert(typeof auth.username === 'string', 'must include username in auth') assert(typeof auth.password === 'string', 'must include password in auth') assert(typeof auth.email === 'string', 'must include email in auth') // normalize registry URL if (uri.slice(-1) !== '/') uri += '/' var username = auth.username.trim() var password = auth.password.trim() var email = auth.email.trim() // validation if (!username) return cb(new Error('No username supplied.')) if (!password) return cb(new Error('No password supplied.')) if (!email) return cb(new Error('No email address supplied.')) if (!email.match(/^[^@]+@[^\.]+\.[^\.]+/)) { return cb(new Error('Please use a real email address.')) } var userobj = { _id: 'org.couchdb.user:' + username, name: username, password: password, email: email, type: 'user', roles: [], date: new Date().toISOString() } var token = this.config.couchToken if (this.couchLogin) this.couchLogin.token = null cb = done.call(this, token, cb) var logObj = Object.keys(userobj).map(function (k) { if (k === 'password') return [k, 'XXXXX'] return [k, userobj[k]] }).reduce(function (s, kv) { s[kv[0]] = kv[1] return s }, {}) this.log.verbose('adduser', 'before first PUT', logObj) var client = this uri = url.resolve(uri, '-/user/org.couchdb.user:' + encodeURIComponent(username)) var options = { method: 'PUT', body: userobj, auth: auth } this.request( uri, options, function (error, data, json, response) { if (!error || !response || response.statusCode !== 409) { return cb(error, data, json, response) } client.log.verbose('adduser', 'update existing user') return client.request( uri + '?write=true', { auth: auth }, function (er, data, json, response) { if (er || data.error) { return cb(er, data, json, response) } Object.keys(data).forEach(function (k) { if (!userobj[k] || k === 'roles') { userobj[k] = data[k] } }) client.log.verbose('adduser', 'userobj', logObj) client.request(uri + '/-rev/' + userobj._rev, options, cb) } ) } ) function done (token, cb) { return function (error, data, json, response) { if (!error && (!response || response.statusCode === 201)) { return cb(error, data, json, response) } // there was some kind of error, reinstate previous auth/token/etc.
if (client.couchLogin) { client.couchLogin.token = token if (client.couchLogin.tokenSet) { client.couchLogin.tokenSet(token) } } client.log.verbose('adduser', 'back', [error, data, json]) if (!error) { error = new Error( (response && response.statusCode || '') + ' ' + 'Could not create user\n' + JSON.stringify(data) ) } if (response && (response.statusCode === 401 || response.statusCode === 403)) { client.log.warn('adduser', 'Incorrect username or password\n' + 'You can reset your account by visiting:\n' + '\n' + ' https://npmjs.org/forgot\n') } return cb(error) } } } npm_3.5.2.orig/node_modules/npm-registry-client/lib/attempt.js0000644000000000000000000000074712631326456022655 0ustar 00000000000000var retry = require('retry') module.exports = attempt function attempt (cb) { // Tuned to spread 3 attempts over about a minute. // See formula at . var operation = retry.operation(this.config.retry) var client = this operation.attempt(function (currentAttempt) { client.log.info( 'attempt', 'registry request try #' + currentAttempt + ' at ' + (new Date()).toLocaleTimeString() ) cb(operation) }) } npm_3.5.2.orig/node_modules/npm-registry-client/lib/authify.js0000644000000000000000000000124112631326456022636 0ustar 00000000000000module.exports = authify function authify (authed, parsed, headers, credentials) { if (credentials && credentials.token) { this.log.verbose('request', 'using bearer token for auth') headers.authorization = 'Bearer ' + credentials.token return null } if (authed) { if (credentials && credentials.username && credentials.password) { var username = encodeURIComponent(credentials.username) var password = encodeURIComponent(credentials.password) parsed.auth = username + ':' + password } else { return new Error( 'This request requires auth credentials. Run `npm login` and repeat the request.' 
) } } } npm_3.5.2.orig/node_modules/npm-registry-client/lib/deprecate.js0000644000000000000000000000243012631326456023122 0ustar 00000000000000module.exports = deprecate var assert = require('assert') var semver = require('semver') function deprecate (uri, params, cb) { assert(typeof uri === 'string', 'must pass registry URI to deprecate') assert(params && typeof params === 'object', 'must pass params to deprecate') assert(typeof cb === 'function', 'must pass callback to deprecate') assert(typeof params.version === 'string', 'must pass version to deprecate') assert(typeof params.message === 'string', 'must pass message to deprecate') assert( params.auth && typeof params.auth === 'object', 'must pass auth to deprecate' ) var version = params.version var message = params.message var auth = params.auth if (semver.validRange(version) === null) { return cb(new Error('invalid version range: ' + version)) } this.get(uri + '?write=true', { auth: auth }, function (er, data) { if (er) return cb(er) // filter all the versions that match Object.keys(data.versions).filter(function (v) { return semver.satisfies(v, version) }).forEach(function (v) { data.versions[v].deprecated = message }) // now update the doc on the registry var options = { method: 'PUT', body: data, auth: auth } this.request(uri, options, cb) }.bind(this)) } npm_3.5.2.orig/node_modules/npm-registry-client/lib/dist-tags/0000755000000000000000000000000012631326456022530 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/lib/fetch.js0000644000000000000000000000465512631326456022272 0ustar 00000000000000var assert = require('assert') var url = require('url') var request = require('request') var once = require('once') module.exports = fetch function fetch (uri, params, cb) { assert(typeof uri === 'string', 'must pass uri to request') assert(params && typeof params === 'object', 'must pass params to request') assert(typeof cb === 'function', 'must pass callback to request') cb = once(cb) var client = this this.attempt(function (operation) { makeRequest.call(client, uri, params, function (er, req) { if (er) return cb(er) req.on('error', function (er) { if (operation.retry(er)) { client.log.info('retry', 'will retry, error on last attempt: ' + er) } else { cb(er) } }) req.on('response', function (res) { client.log.http('fetch', '' + res.statusCode, uri) var er var statusCode = res && res.statusCode if (statusCode === 200) { // Work around bug in node v0.10.0 where the CryptoStream // gets stuck and never starts reading again. res.resume() if (process.version === 'v0.10.0') unstick(res) return cb(null, res) // Only retry on 408, 5xx or no `response`. 
} else if (statusCode === 408) { er = new Error('request timed out') } else if (statusCode >= 500) { er = new Error('server error ' + statusCode) } if (er && operation.retry(er)) { client.log.info('retry', 'will retry, error on last attempt: ' + er) } else { cb(new Error('fetch failed with status code ' + statusCode)) } }) }) }) } function unstick (response) { response.resume = (function (orig) { return function () { var ret = orig.apply(response, arguments) if (response.socket.encrypted) response.socket.encrypted.read(0) return ret } })(response.resume) } function makeRequest (remote, params, cb) { var parsed = url.parse(remote) this.log.http('fetch', 'GET', parsed.href) var headers = params.headers || {} var er = this.authify( params.auth && params.auth.alwaysAuth, parsed, headers, params.auth ) if (er) return cb(er) var opts = this.initialize( parsed, 'GET', 'application/x-tar, application/vnd.github+json; q=0.1', headers ) // always want to follow redirects for fetch opts.followRedirect = true cb(null, request(opts)) } npm_3.5.2.orig/node_modules/npm-registry-client/lib/get.js0000644000000000000000000000121312631326456021743 0ustar 00000000000000module.exports = get var assert = require('assert') var url = require('url') /* * This is meant to be overridden in specific implementations if you * want specialized behavior for metadata (i.e. caching). */ function get (uri, params, cb) { assert(typeof uri === 'string', 'must pass registry URI to get') assert(params && typeof params === 'object', 'must pass params to get') assert(typeof cb === 'function', 'must pass callback to get') var parsed = url.parse(uri) assert( parsed.protocol === 'http:' || parsed.protocol === 'https:', 'must have a URL that starts with http: or https:' ) this.request(uri, params, cb) } npm_3.5.2.orig/node_modules/npm-registry-client/lib/initialize.js0000644000000000000000000000406412631326456023334 0ustar 00000000000000var crypto = require('crypto') var HttpAgent = require('http').Agent var HttpsAgent = require('https').Agent var pkg = require('../package.json') var httpAgent var httpsAgent module.exports = initialize function initialize (uri, method, accept, headers) { if (!this.config.sessionToken) { this.config.sessionToken = crypto.randomBytes(8).toString('hex') this.log.verbose('request id', this.config.sessionToken) } var opts = { url: uri, method: method, headers: headers, localAddress: this.config.proxy.localAddress, strictSSL: this.config.ssl.strict, cert: this.config.ssl.certificate, key: this.config.ssl.key, ca: this.config.ssl.ca, agent: getAgent(uri.protocol, this.config) } // allow explicit disabling of proxy in environment via CLI // // how false gets here is the CLI's problem (it's gross) if (this.config.proxy.http === false) { opts.proxy = null } else { // request will not pay attention to the NOPROXY environment variable if a // config value named proxy is passed in, even if it's set to null. 
var proxy if (uri.protocol === 'https:') { proxy = this.config.proxy.https } else { proxy = this.config.proxy.http } if (typeof proxy === 'string') opts.proxy = proxy } headers.version = this.version || pkg.version headers.accept = accept if (this.refer) headers.referer = this.refer headers['npm-session'] = this.config.sessionToken headers['user-agent'] = this.config.userAgent return opts } function getAgent (protocol, config) { if (protocol === 'https:') { if (!httpsAgent) { httpsAgent = new HttpsAgent({ keepAlive: true, localAddress: config.proxy.localAddress, rejectUnauthorized: config.ssl.strict, ca: config.ssl.ca, cert: config.ssl.certificate, key: config.ssl.key }) } return httpsAgent } else { if (!httpAgent) { httpAgent = new HttpAgent({ keepAlive: true, localAddress: config.proxy.localAddress }) } return httpAgent } }
npm_3.5.2.orig/node_modules/npm-registry-client/lib/logout.js0000644000000000000000000000132212631326456022476 0ustar 00000000000000
module.exports = logout

var assert = require('assert')
var url = require('url')

function logout (uri, params, cb) {
  assert(typeof uri === 'string', 'must pass registry URI to logout')
  assert(params && typeof params === 'object', 'must pass params to logout')
  assert(typeof cb === 'function', 'must pass callback to logout')

  var auth = params.auth
  assert(auth && typeof auth === 'object', 'must pass auth to logout')
  assert(typeof auth.token === 'string', 'can only log out for token auth')

  uri = url.resolve(uri, '-/user/token/' + auth.token)
  var options = {
    method: 'DELETE',
    auth: auth
  }

  this.log.verbose('logout', 'invalidating session token for user')
  this.request(uri, options, cb)
}
npm_3.5.2.orig/node_modules/npm-registry-client/lib/ping.js0000644000000000000000000000122312631326456022122 0ustar 00000000000000
module.exports = ping

var url = require('url')
var assert = require('assert')

function ping (uri, params, cb) {
  assert(typeof uri === 'string', 'must pass registry URI to ping')
  assert(params && typeof params === 'object', 'must pass params to ping')
  assert(typeof cb === 'function', 'must pass callback to ping')

  var auth = params.auth
  assert(auth && typeof auth === 'object', 'must pass auth to ping')

  this.request(url.resolve(uri, '-/ping?write=true'), { auth: auth }, function (er, fullData) {
    if (er) {
      cb(er)
    } else if (fullData) {
      cb(null, fullData)
    } else {
      cb(new Error('No data received'))
    }
  })
}
npm_3.5.2.orig/node_modules/npm-registry-client/lib/publish.js0000644000000000000000000001257212631326456022644 0ustar 00000000000000
module.exports = publish var url = require('url') var semver = require('semver') var crypto = require('crypto') var Stream = require('stream').Stream var assert = require('assert') var fixer = require('normalize-package-data').fixer var concat = require('concat-stream') function escaped (name) { return name.replace('/', '%2f') } function publish (uri, params, cb) { assert(typeof uri === 'string', 'must pass registry URI to publish') assert(params && typeof params === 'object', 'must pass params to publish') assert(typeof cb === 'function', 'must pass callback to publish') var access = params.access assert( (!access) || ['public', 'restricted'].indexOf(access) !== -1, "if present, access level must be either 'public' or 'restricted'" ) var auth = params.auth assert(auth && typeof auth === 'object', 'must pass auth to publish') if (!(auth.token || (auth.password && auth.username && auth.email))) { var er = new Error('auth required for publishing') er.code = 'ENEEDAUTH' return cb(er) } var metadata =
params.metadata assert( metadata && typeof metadata === 'object', 'must pass package metadata to publish' ) try { fixer.fixNameField(metadata, {strict: true, allowLegacyCase: true}) } catch (er) { return cb(er) } var version = semver.clean(metadata.version) if (!version) return cb(new Error('invalid semver: ' + metadata.version)) metadata.version = version var body = params.body assert(body, 'must pass package body to publish') assert(body instanceof Stream, 'package body passed to publish must be a stream') var client = this var sink = concat(function (tarbuffer) { putFirst.call(client, uri, metadata, tarbuffer, access, auth, cb) }) sink.on('error', cb) body.pipe(sink) } function putFirst (registry, data, tarbuffer, access, auth, cb) { // optimistically try to PUT all in one single atomic thing. // If 409, then GET and merge, try again. // If other error, then fail. var root = { _id: data.name, name: data.name, description: data.description, 'dist-tags': {}, versions: {}, readme: data.readme || '' } if (access) root.access = access if (!auth.token) { root.maintainers = [{ name: auth.username, email: auth.email }] data.maintainers = JSON.parse(JSON.stringify(root.maintainers)) } root.versions[ data.version ] = data var tag = data.tag || this.config.defaultTag root['dist-tags'][tag] = data.version var tbName = data.name + '-' + data.version + '.tgz' var tbURI = data.name + '/-/' + tbName data._id = data.name + '@' + data.version data.dist = data.dist || {} data.dist.shasum = crypto.createHash('sha1').update(tarbuffer).digest('hex') data.dist.tarball = url.resolve(registry, tbURI) .replace(/^https:\/\//, 'http://') root._attachments = {} root._attachments[ tbName ] = { 'content_type': 'application/octet-stream', 'data': tarbuffer.toString('base64'), 'length': tarbuffer.length } var fixed = url.resolve(registry, escaped(data.name)) var client = this var options = { method: 'PUT', body: root, auth: auth } this.request(fixed, options, function (er, parsed, json, res) { var r409 = 'must supply latest _rev to update existing package' var r409b = 'Document update conflict.' var conflict = res && res.statusCode === 409 if (parsed && (parsed.reason === r409 || parsed.reason === r409b)) { conflict = true } // a 409 is typical here. GET the data and merge in. if (er && !conflict) { client.log.error('publish', 'Failed PUT ' + (res && res.statusCode)) return cb(er) } if (!er && !conflict) return cb(er, parsed, json, res) // let's see what versions are already published. 
client.request(fixed + '?write=true', { auth: auth }, function (er, current) { if (er) return cb(er) putNext.call(client, registry, data.version, root, current, auth, cb) }) }) } function putNext (registry, newVersion, root, current, auth, cb) { // already have the tardata on the root object // just merge in existing stuff var curVers = Object.keys(current.versions || {}).map(function (v) { return semver.clean(v, true) }).concat(Object.keys(current.time || {}).map(function (v) { if (semver.valid(v, true)) return semver.clean(v, true) }).filter(function (v) { return v })) if (curVers.indexOf(newVersion) !== -1) { return cb(conflictError(root.name, newVersion)) } current.versions[newVersion] = root.versions[newVersion] current._attachments = current._attachments || {} for (var i in root) { switch (i) { // objects that copy over the new stuffs case 'dist-tags': case 'versions': case '_attachments': for (var j in root[i]) current[i][j] = root[i][j] break // ignore these case 'maintainers': break // copy default: current[i] = root[i] } } var maint = JSON.parse(JSON.stringify(root.maintainers)) root.versions[newVersion].maintainers = maint var uri = url.resolve(registry, escaped(root.name)) var options = { method: 'PUT', body: current, auth: auth } this.request(uri, options, cb) } function conflictError (pkgid, version) { var e = new Error('cannot modify pre-existing version') e.code = 'EPUBLISHCONFLICT' e.pkgid = pkgid e.version = version return e } npm_3.5.2.orig/node_modules/npm-registry-client/lib/request.js0000644000000000000000000001747512631326456022675 0ustar 00000000000000module.exports = regRequest // npm: means // 1. https // 2. send authorization // 3. content-type is 'application/json' -- metadata // var assert = require('assert') var url = require('url') var zlib = require('zlib') var Stream = require('stream').Stream var STATUS_CODES = require('http').STATUS_CODES var request = require('request') var once = require('once') function regRequest (uri, params, cb_) { assert(typeof uri === 'string', 'must pass uri to request') assert(params && typeof params === 'object', 'must pass params to request') assert(typeof cb_ === 'function', 'must pass callback to request') params.method = params.method || 'GET' this.log.verbose('request', 'uri', uri) // Since there are multiple places where an error could occur, // don't let the cb be called more than once. 
var cb = once(cb_) if (uri.match(/^\/?favicon.ico/)) { return cb(new Error("favicon.ico isn't a package, it's a picture.")) } var adduserChange = /\/?-\/user\/org\.couchdb\.user:([^/]+)\/-rev/ var isUserChange = uri.match(adduserChange) var adduserNew = /\/?-\/user\/org\.couchdb\.user:([^/?]+)$/ var isNewUser = uri.match(adduserNew) var alwaysAuth = params.auth && params.auth.alwaysAuth var isDelete = params.method === 'DELETE' var isWrite = params.body || isDelete if (isUserChange && !isWrite) { return cb(new Error('trying to change user document without writing(?!)')) } // new users can *not* use auth, because they don't *have* auth yet if (isUserChange) { this.log.verbose('request', 'updating existing user; sending authorization') params.authed = true } else if (isNewUser) { this.log.verbose('request', "new user, so can't send auth") params.authed = false } else if (alwaysAuth) { this.log.verbose('request', 'always-auth set; sending authorization') params.authed = true } else if (isWrite) { this.log.verbose('request', 'sending authorization for write operation') params.authed = true } else { // most of the time we don't want to auth this.log.verbose('request', 'no auth needed') params.authed = false } var self = this this.attempt(function (operation) { makeRequest.call(self, uri, params, function (er, parsed, raw, response) { if (response) { self.log.verbose('headers', response.headers) if (response.headers['npm-notice']) { self.log.warn('notice', response.headers['npm-notice']) } } if (!er || (er.message && er.message.match(/^SSL Error/))) { if (er) er.code = 'ESSL' return cb(er, parsed, raw, response) } // Only retry on 408, 5xx or no `response`. var statusCode = response && response.statusCode var timeout = statusCode === 408 var serverError = statusCode >= 500 var statusRetry = !statusCode || timeout || serverError if (er && statusRetry && operation.retry(er)) { self.log.info('retry', 'will retry, error on last attempt: ' + er) return undefined } cb.apply(null, arguments) }) }) } function makeRequest (uri, params, cb_) { var cb = once(cb_) var parsed = url.parse(uri) var headers = {} // metadata should be compressed headers['accept-encoding'] = 'gzip' var er = this.authify(params.authed, parsed, headers, params.auth) if (er) return cb_(er) var opts = this.initialize( parsed, params.method, 'application/json', headers ) opts.followRedirect = (typeof params.follow === 'boolean' ? params.follow : true) opts.encoding = null // tell request let body be Buffer instance if (params.etag) { this.log.verbose('etag', params.etag) headers[params.method === 'GET' ? 
'if-none-match' : 'if-match'] = params.etag } if (params.lastModified && params.method === 'GET') { this.log.verbose('lastModified', params.lastModified) headers['if-modified-since'] = params.lastModified } // figure out wth body is if (params.body) { if (Buffer.isBuffer(params.body)) { opts.body = params.body headers['content-type'] = 'application/json' headers['content-length'] = params.body.length } else if (typeof params.body === 'string') { opts.body = params.body headers['content-type'] = 'application/json' headers['content-length'] = Buffer.byteLength(params.body) } else if (params.body instanceof Stream) { headers['content-type'] = 'application/octet-stream' if (params.body.size) headers['content-length'] = params.body.size } else { delete params.body._etag delete params.body._lastModified opts.json = params.body } } this.log.http('request', params.method, parsed.href || '/') var done = requestDone.call(this, params.method, uri, cb) var req = request(opts, decodeResponseBody(done)) req.on('error', cb) req.on('socket', function (s) { s.on('error', cb) }) if (params.body && (params.body instanceof Stream)) { params.body.pipe(req) } } function decodeResponseBody (cb) { return function (er, response, data) { if (er) return cb(er, response, data) // don't ever re-use connections that had server errors. // those sockets connect to the Bad Place! if (response.socket && response.statusCode > 500) { response.socket.destroy() } if (response.headers['content-encoding'] !== 'gzip') { return cb(er, response, data) } zlib.gunzip(data, function (er, buf) { if (er) return cb(er, response, data) cb(null, response, buf) }) } } // cb(er, parsed, raw, response) function requestDone (method, where, cb) { return function (er, response, data) { if (er) return cb(er) var urlObj = url.parse(where) if (urlObj.auth) urlObj.auth = '***' this.log.http(response.statusCode, url.format(urlObj)) if (Buffer.isBuffer(data)) { data = data.toString() } var parsed if (data && typeof data === 'string' && response.statusCode !== 304) { try { parsed = JSON.parse(data) } catch (ex) { ex.message += '\n' + data this.log.verbose('bad json', data) this.log.error('registry', 'error parsing json') return cb(ex, null, data, response) } } else if (data) { parsed = data data = JSON.stringify(parsed) } // expect data with any error codes if (!data && response.statusCode >= 400) { var code = response.statusCode return cb( makeError(code + ' ' + STATUS_CODES[code], null, code), null, data, response ) } er = null if (parsed && response.headers.etag) { parsed._etag = response.headers.etag } if (parsed && response.headers['last-modified']) { parsed._lastModified = response.headers['last-modified'] } // for the search endpoint, the 'error' property can be an object if (parsed && parsed.error && typeof parsed.error !== 'object' || response.statusCode >= 400) { var w = url.parse(where).pathname.substr(1) var name if (!w.match(/^-/)) { w = w.split('/') name = decodeURIComponent(w[w.indexOf('_rewrite') + 1]) } if (!parsed.error) { er = makeError( 'Registry returned ' + response.statusCode + ' for ' + method + ' on ' + where, name, response.statusCode ) } else if (name && parsed.error === 'not_found') { er = makeError('404 Not Found: ' + name, name, response.statusCode) } else { er = makeError( parsed.error + ' ' + (parsed.reason || '') + ': ' + (name || w), name, response.statusCode ) } } return cb(er, parsed, data, response) }.bind(this) } function makeError (message, name, code) { var er = new Error(message) if (name) er.pkgid = name if 
(code) { er.statusCode = code er.code = 'E' + code } return er } npm_3.5.2.orig/node_modules/npm-registry-client/lib/star.js0000644000000000000000000000257212631326456022146 0ustar 00000000000000module.exports = star var assert = require('assert') function star (uri, params, cb) { assert(typeof uri === 'string', 'must pass registry URI to star') assert(params && typeof params === 'object', 'must pass params to star') assert(typeof cb === 'function', 'must pass callback to star') var starred = !!params.starred var auth = params.auth assert(auth && typeof auth === 'object', 'must pass auth to star') if (!(auth.token || (auth.password && auth.username && auth.email))) { var er = new Error('Must be logged in to star/unstar packages') er.code = 'ENEEDAUTH' return cb(er) } var client = this this.request(uri + '?write=true', { auth: auth }, function (er, fullData) { if (er) return cb(er) client.whoami(uri, params, function (er, username) { if (er) return cb(er) var data = { _id: fullData._id, _rev: fullData._rev, users: fullData.users || {} } if (starred) { client.log.info('starring', data._id) data.users[username] = true client.log.verbose('starring', data) } else { delete data.users[username] client.log.info('unstarring', data._id) client.log.verbose('unstarring', data) } var options = { method: 'PUT', body: data, auth: auth } return client.request(uri, options, cb) }) }) } npm_3.5.2.orig/node_modules/npm-registry-client/lib/stars.js0000644000000000000000000000121312631326456022320 0ustar 00000000000000module.exports = stars var assert = require('assert') var url = require('url') function stars (uri, params, cb) { assert(typeof uri === 'string', 'must pass registry URI to stars') assert(params && typeof params === 'object', 'must pass params to stars') assert(typeof cb === 'function', 'must pass callback to stars') var auth = params.auth var name = params.username || (auth && auth.username) if (!name) return cb(new Error('must pass either username or auth to stars')) var encoded = encodeURIComponent(name) var path = '-/_view/starredByUser?key="' + encoded + '"' this.request(url.resolve(uri, path), { auth: auth }, cb) } npm_3.5.2.orig/node_modules/npm-registry-client/lib/tag.js0000644000000000000000000000126212631326456021743 0ustar 00000000000000module.exports = tag var assert = require('assert') function tag (uri, params, cb) { assert(typeof uri === 'string', 'must pass registry URI to tag') assert(params && typeof params === 'object', 'must pass params to tag') assert(typeof cb === 'function', 'must pass callback to tag') assert(typeof params.version === 'string', 'must pass version to tag') assert(typeof params.tag === 'string', 'must pass tag name to tag') assert( params.auth && typeof params.auth === 'object', 'must pass auth to tag' ) var options = { method: 'PUT', body: JSON.stringify(params.version), auth: params.auth } this.request(uri + '/' + params.tag, options, cb) } npm_3.5.2.orig/node_modules/npm-registry-client/lib/team.js0000644000000000000000000000562212631326456022122 0ustar 00000000000000module.exports = team var assert = require('assert') var url = require('url') var subcommands = {} function team (sub, uri, params, cb) { teamAssertions(sub, uri, params, cb) return subcommands[sub].call(this, uri, params, cb) } subcommands.create = function (uri, params, cb) { return this.request(apiUri(uri, 'org', params.scope, 'team'), { method: 'PUT', auth: params.auth, body: JSON.stringify({ name: params.team }) }, cb) } subcommands.destroy = function (uri, params, cb) { return 
this.request(apiUri(uri, 'team', params.scope, params.team), { method: 'DELETE', auth: params.auth }, cb) } subcommands.add = function (uri, params, cb) { return this.request(apiUri(uri, 'team', params.scope, params.team, 'user'), { method: 'PUT', auth: params.auth, body: JSON.stringify({ user: params.user }) }, cb) } subcommands.rm = function (uri, params, cb) { return this.request(apiUri(uri, 'team', params.scope, params.team, 'user'), { method: 'DELETE', auth: params.auth, body: JSON.stringify({ user: params.user }) }, cb) } subcommands.ls = function (uri, params, cb) { var uriParams = '?format=cli' if (params.team) { var reqUri = apiUri( uri, 'team', params.scope, params.team, 'user') + uriParams return this.request(reqUri, { method: 'GET', auth: params.auth }, cb) } else { return this.request(apiUri(uri, 'org', params.scope, 'team') + uriParams, { method: 'GET', auth: params.auth }, cb) } } // TODO - we punted this to v2 // subcommands.edit = function (uri, params, cb) { // return this.request(apiUri(uri, 'team', params.scope, params.team, 'user'), { // method: 'POST', // auth: params.auth, // body: JSON.stringify({ // users: params.users // }) // }, cb) // } function apiUri (registryUri) { var path = Array.prototype.slice.call(arguments, 1) .map(encodeURIComponent) .join('/') return url.resolve(registryUri, '-/' + path) } function teamAssertions (subcommand, uri, params, cb) { assert(subcommand, 'subcommand is required') assert(subcommands.hasOwnProperty(subcommand), 'team subcommand must be one of ' + Object.keys(subcommands)) assert(typeof uri === 'string', 'registry URI is required') assert(typeof params === 'object', 'params are required') assert(typeof params.auth === 'object', 'auth is required') assert(typeof params.scope === 'string', 'scope is required') assert(!cb || typeof cb === 'function', 'callback must be a function') if (subcommand !== 'ls') { assert(typeof params.team === 'string', 'team name is required') } if (subcommand === 'rm' || subcommand === 'add') { assert(typeof params.user === 'string', 'user is required') } if (subcommand === 'edit') { assert(typeof params.users === 'object' && params.users.length != null, 'users is required') } } npm_3.5.2.orig/node_modules/npm-registry-client/lib/unpublish.js0000644000000000000000000000731212631326456023203 0ustar 00000000000000module.exports = unpublish // fetch the data // modify to remove the version in question // If no versions remaining, then DELETE // else, PUT the modified data // delete the tarball var semver = require('semver') var url = require('url') var chain = require('slide').chain var assert = require('assert') function unpublish (uri, params, cb) { assert(typeof uri === 'string', 'must pass registry URI to unpublish') assert(params && typeof params === 'object', 'must pass params to unpublish') assert(typeof cb === 'function', 'must pass callback to unpublish') var ver = params.version var auth = params.auth assert(auth && typeof auth === 'object', 'must pass auth to unpublish') var options = { timeout: -1, follow: false, auth: auth } this.get(uri + '?write=true', options, function (er, data) { if (er) { this.log.info('unpublish', uri + ' not published') return cb() } // remove all if no version specified if (!ver) { this.log.info('unpublish', 'No version specified, removing all') return this.request(uri + '/-rev/' + data._rev, { method: 'DELETE', auth: auth }, cb) } var versions = data.versions || {} var versionPublic = versions.hasOwnProperty(ver) var dist if (!versionPublic) { 
this.log.info('unpublish', uri + '@' + ver + ' not published') } else { dist = versions[ver].dist this.log.verbose('unpublish', 'removing attachments', dist) } delete versions[ver] // if it was the only version, then delete the whole package. if (!Object.keys(versions).length) { this.log.info('unpublish', 'No versions remain, removing entire package') return this.request(uri + '/-rev/' + data._rev, { method: 'DELETE', auth: auth }, cb) } if (!versionPublic) return cb() var latestVer = data['dist-tags'].latest for (var tag in data['dist-tags']) { if (data['dist-tags'][tag] === ver) delete data['dist-tags'][tag] } if (latestVer === ver) { data['dist-tags'].latest = Object.getOwnPropertyNames(versions).sort(semver.compareLoose).pop() } var rev = data._rev delete data._revisions delete data._attachments var cb_ = detacher.call(this, uri, data, dist, auth, cb) this.request(uri + '/-rev/' + rev, { method: 'PUT', body: data, auth: auth }, function (er) { if (er) { this.log.error('unpublish', 'Failed to update data') } cb_(er) }.bind(this)) }.bind(this)) } function detacher (uri, data, dist, credentials, cb) { return function (er) { if (er) return cb(er) this.get(escape(uri, data.name), { auth: credentials }, function (er, data) { if (er) return cb(er) var tb = url.parse(dist.tarball) detach.call(this, uri, data, tb.pathname, data._rev, credentials, function (er) { if (er || !dist.bin) return cb(er) chain(Object.keys(dist.bin).map(function (bt) { return function (cb) { var d = dist.bin[bt] detach.call(this, uri, data, url.parse(d.tarball).pathname, null, credentials, cb) }.bind(this) }, this), cb) }.bind(this)) }.bind(this)) }.bind(this) } function detach (uri, data, path, rev, credentials, cb) { if (rev) { path += '/-rev/' + rev this.log.info('detach', path) return this.request(url.resolve(uri, path), { method: 'DELETE', auth: credentials }, cb) } this.get(escape(uri, data.name), { auth: credentials }, function (er, data) { rev = data._rev if (!rev) return cb(new Error('No _rev found in ' + data._id)) detach.call(this, uri, data, path, rev, credentials, cb) }.bind(this)) } function escape (base, name) { var escaped = name.replace(/\//, '%2f') return url.resolve(base, escaped) }
npm_3.5.2.orig/node_modules/npm-registry-client/lib/whoami.js0000644000000000000000000000123112631326456022450 0ustar 00000000000000
module.exports = whoami

var url = require('url')
var assert = require('assert')

function whoami (uri, params, cb) {
  assert(typeof uri === 'string', 'must pass registry URI to whoami')
  assert(params && typeof params === 'object', 'must pass params to whoami')
  assert(typeof cb === 'function', 'must pass callback to whoami')

  var auth = params.auth
  assert(auth && typeof auth === 'object', 'must pass auth to whoami')

  if (auth.username) return process.nextTick(cb.bind(this, null, auth.username))

  this.request(url.resolve(uri, '-/whoami'), { auth: auth }, function (er, userdata) {
    if (er) return cb(er)

    cb(null, userdata.username)
  })
}
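// Illustrative usage (the registry URL and token below are placeholders),
// assuming `client` is a configured RegClient instance:
//
//   client.whoami('https://registry.npmjs.org/', { auth: { token: 'xxxx' } },
//     function (er, username) {
//       if (!er) console.log('logged in as', username)
//     })
//
// When `auth.username` is present, the callback is answered from the supplied
// credentials on the next tick, without a network round trip.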
npm_3.5.2.orig/node_modules/npm-registry-client/lib/dist-tags/add.js0000644000000000000000000000220012631326456023610 0ustar 00000000000000
module.exports = add

var assert = require('assert')
var url = require('url')

var npa = require('npm-package-arg')

function add (uri, params, cb) {
  assert(typeof uri === 'string', 'must pass registry URI to distTags.add')
  assert(
    params && typeof params === 'object',
    'must pass params to distTags.add'
  )
  assert(typeof cb === 'function', 'must pass callback to distTags.add')

  assert(
    typeof params.package === 'string',
    'must pass package name to distTags.add'
  )
  assert(
    typeof params.distTag === 'string',
    'must pass package distTag name to distTags.add'
  )
  assert(
    typeof params.version === 'string',
    'must pass version to be mapped to distTag to distTags.add'
  )
  assert(
    params.auth && typeof params.auth === 'object',
    'must pass auth to distTags.add'
  )

  var p = npa(params.package)
  var pkg = p.scope ? params.package.replace('/', '%2f') : params.package
  var rest = '-/package/' + pkg + '/dist-tags/' + params.distTag

  var options = {
    method: 'PUT',
    body: JSON.stringify(params.version),
    auth: params.auth
  }
  this.request(url.resolve(uri, rest), options, cb)
}
npm_3.5.2.orig/node_modules/npm-registry-client/lib/dist-tags/fetch.js0000644000000000000000000000172512631326456024164 0ustar 00000000000000
module.exports = fetch

var assert = require('assert')
var url = require('url')

var npa = require('npm-package-arg')

function fetch (uri, params, cb) {
  assert(typeof uri === 'string', 'must pass registry URI to distTags.fetch')
  assert(
    params && typeof params === 'object',
    'must pass params to distTags.fetch'
  )
  assert(typeof cb === 'function', 'must pass callback to distTags.fetch')

  assert(
    typeof params.package === 'string',
    'must pass package name to distTags.fetch'
  )
  assert(
    params.auth && typeof params.auth === 'object',
    'must pass auth to distTags.fetch'
  )

  var p = npa(params.package)
  var pkg = p.scope ? params.package.replace('/', '%2f') : params.package
  var rest = '-/package/' + pkg + '/dist-tags'

  var options = {
    method: 'GET',
    auth: params.auth
  }
  this.request(url.resolve(uri, rest), options, function (er, data) {
    if (data && typeof data === 'object') delete data._etag
    cb(er, data)
  })
}
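// Illustrative usage (package name and token are placeholders), assuming
// `client` is a configured RegClient instance:
//
//   client.distTags.fetch('https://registry.npmjs.org/', {
//     package: 'example-package',
//     auth: { token: 'xxxx' }
//   }, function (er, tags) {
//     if (!er) console.log(tags) // e.g. { latest: '1.2.3' }
//   })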
params.package.replace('/', '%2f') : params.package var rest = '-/package/' + pkg + '/dist-tags' var options = { method: 'PUT', body: JSON.stringify(params.distTags), auth: params.auth } this.request(url.resolve(uri, rest), options, cb) } npm_3.5.2.orig/node_modules/npm-registry-client/lib/dist-tags/update.js0000644000000000000000000000203612631326456024351 0ustar 00000000000000module.exports = update var assert = require('assert') var url = require('url') var npa = require('npm-package-arg') function update (uri, params, cb) { assert(typeof uri === 'string', 'must pass registry URI to distTags.update') assert( params && typeof params === 'object', 'must pass params to distTags.update' ) assert(typeof cb === 'function', 'must pass callback to distTags.update') assert( typeof params.package === 'string', 'must pass package name to distTags.update' ) assert( params.distTags && typeof params.distTags === 'object', 'must pass distTags map to distTags.update' ) assert( params.auth && typeof params.auth === 'object', 'must pass auth to distTags.update' ) var p = npa(params.package) var pkg = p.scope ? params.package.replace('/', '%2f') : params.package var rest = '-/package/' + pkg + '/dist-tags' var options = { method: 'POST', body: JSON.stringify(params.distTags), auth: params.auth } this.request(url.resolve(uri, rest), options, cb) } npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/0000755000000000000000000000000012631326456025300 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/LICENSE0000644000000000000000000000207412631326456026310 0ustar 00000000000000The MIT License Copyright (c) 2013 Max Ogden Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/index.js0000644000000000000000000000677412631326456026763 0ustar 00000000000000var Writable = require('readable-stream').Writable var inherits = require('inherits') if (typeof Uint8Array === 'undefined') { var U8 = require('typedarray').Uint8Array } else { var U8 = Uint8Array } function ConcatStream(opts, cb) { if (!(this instanceof ConcatStream)) return new ConcatStream(opts, cb) if (typeof opts === 'function') { cb = opts opts = {} } if (!opts) opts = {} var encoding = opts.encoding var shouldInferEncoding = false if (!encoding) { shouldInferEncoding = true } else { encoding = String(encoding).toLowerCase() if (encoding === 'u8' || encoding === 'uint8') { encoding = 'uint8array' } } Writable.call(this, { objectMode: true }) this.encoding = encoding this.shouldInferEncoding = shouldInferEncoding if (cb) this.on('finish', function () { cb(this.getBody()) }) this.body = [] } module.exports = ConcatStream inherits(ConcatStream, Writable) ConcatStream.prototype._write = function(chunk, enc, next) { this.body.push(chunk) next() } ConcatStream.prototype.inferEncoding = function (buff) { var firstBuffer = buff === undefined ? this.body[0] : buff; if (Buffer.isBuffer(firstBuffer)) return 'buffer' if (typeof Uint8Array !== 'undefined' && firstBuffer instanceof Uint8Array) return 'uint8array' if (Array.isArray(firstBuffer)) return 'array' if (typeof firstBuffer === 'string') return 'string' if (Object.prototype.toString.call(firstBuffer) === "[object Object]") return 'object' return 'buffer' } ConcatStream.prototype.getBody = function () { if (!this.encoding && this.body.length === 0) return [] if (this.shouldInferEncoding) this.encoding = this.inferEncoding() if (this.encoding === 'array') return arrayConcat(this.body) if (this.encoding === 'string') return stringConcat(this.body) if (this.encoding === 'buffer') return bufferConcat(this.body) if (this.encoding === 'uint8array') return u8Concat(this.body) return this.body } var isArray = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]' } function isArrayish (arr) { return /Array\]$/.test(Object.prototype.toString.call(arr)) } function stringConcat (parts) { var strings = [] var needsToString = false for (var i = 0; i < parts.length; i++) { var p = parts[i] if (typeof p === 'string') { strings.push(p) } else if (Buffer.isBuffer(p)) { strings.push(p) } else { strings.push(Buffer(p)) } } if (Buffer.isBuffer(parts[0])) { strings = Buffer.concat(strings) strings = strings.toString('utf8') } else { strings = strings.join('') } return strings } function bufferConcat (parts) { var bufs = [] for (var i = 0; i < parts.length; i++) { var p = parts[i] if (Buffer.isBuffer(p)) { bufs.push(p) } else if (typeof p === 'string' || isArrayish(p) || (p && typeof p.subarray === 'function')) { bufs.push(Buffer(p)) } else bufs.push(Buffer(String(p))) } return Buffer.concat(bufs) } function arrayConcat (parts) { var res = [] for (var i = 0; i < parts.length; i++) { res.push.apply(res, parts[i]) } return res } function u8Concat (parts) { var len = 0 for (var i = 0; i < parts.length; i++) { if (typeof parts[i] === 'string') { parts[i] = Buffer(parts[i]) } len += parts[i].length } var u8 = new U8(len) 
for (var i = 0, offset = 0; i < parts.length; i++) { var part = parts[i] for (var j = 0; j < part.length; j++) { u8[offset++] = part[j] } } return u8 } npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/0000755000000000000000000000000012631326456027755 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/package.json0000644000000000000000000000370312631326456027571 0ustar 00000000000000{ "name": "concat-stream", "version": "1.5.1", "description": "writable stream that concatenates strings or binary data and calls a callback with the result", "tags": [ "stream", "simple", "util", "utility" ], "author": { "name": "Max Ogden", "email": "max@maxogden.com" }, "repository": { "type": "git", "url": "git+ssh://git@github.com/maxogden/concat-stream.git" }, "bugs": { "url": "http://github.com/maxogden/concat-stream/issues" }, "engines": [ "node >= 0.8" ], "main": "index.js", "files": [ "index.js" ], "scripts": { "test": "tape test/*.js test/server/*.js" }, "license": "MIT", "dependencies": { "inherits": "~2.0.1", "typedarray": "~0.0.5", "readable-stream": "~2.0.0" }, "devDependencies": { "tape": "~2.3.2" }, "testling": { "files": "test/*.js", "browsers": [ "ie/8..latest", "firefox/17..latest", "firefox/nightly", "chrome/22..latest", "chrome/canary", "opera/12..latest", "opera/next", "safari/5.1..latest", "ipad/6.0..latest", "iphone/6.0..latest", "android-browser/4.2..latest" ] }, "gitHead": "522adc12d82f57c691a5f946fbc8ba08718dcdcb", "homepage": "https://github.com/maxogden/concat-stream#readme", "_id": "concat-stream@1.5.1", "_shasum": "f3b80acf9e1f48e3875c0688b41b6c31602eea1c", "_from": "concat-stream@>=1.4.6 <2.0.0", "_npmVersion": "2.14.2", "_nodeVersion": "4.0.0", "_npmUser": { "name": "maxogden", "email": "max@maxogden.com" }, "dist": { "shasum": "f3b80acf9e1f48e3875c0688b41b6c31602eea1c", "tarball": "http://registry.npmjs.org/concat-stream/-/concat-stream-1.5.1.tgz" }, "maintainers": [ { "name": "maxogden", "email": "max@maxogden.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/concat-stream/-/concat-stream-1.5.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/readme.md0000644000000000000000000000654112631326456027065 0ustar 00000000000000# concat-stream Writable stream that concatenates all the data from a stream and calls a callback with the result. Use this when you want to collect all the data from a stream into a single buffer. [![Build Status](https://travis-ci.org/maxogden/concat-stream.svg?branch=master)](https://travis-ci.org/maxogden/concat-stream) [![NPM](https://nodei.co/npm/concat-stream.png)](https://nodei.co/npm/concat-stream/) ### description Streams emit many buffers. If you want to collect all of the buffers, and when the stream ends concatenate all of the buffers together and receive a single buffer then this is the module for you. Only use this if you know you can fit all of the output of your stream into a single Buffer (e.g. in RAM). There are also `objectMode` streams that emit things other than Buffers, and you can concatenate these too. See below for details. ## Related `stream-each` is part of the [mississippi stream utility collection](https://github.com/maxogden/mississippi) which includes more useful stream modules similar to this one. 
### examples #### Buffers ```js var fs = require('fs') var concat = require('concat-stream') var readStream = fs.createReadStream('cat.png') var concatStream = concat(gotPicture) readStream.on('error', handleError) readStream.pipe(concatStream) function gotPicture(imageBuffer) { // imageBuffer is all of `cat.png` as a node.js Buffer } function handleError(err) { // handle your error appropriately here, e.g.: console.error(err) // print the error to STDERR process.exit(1) // exit program with non-zero exit code } ``` #### Arrays ```js var write = concat(function(data) {}) write.write([1,2,3]) write.write([4,5,6]) write.end() // data will be [1,2,3,4,5,6] in the above callback ``` #### Uint8Arrays ```js var write = concat(function(data) {}) var a = new Uint8Array(3) a[0] = 97; a[1] = 98; a[2] = 99 write.write(a) write.write('!') write.end(Buffer('!!1')) ``` See `test/` for more examples # methods ```js var concat = require('concat-stream') ``` ## var writable = concat(opts={}, cb) Return a `writable` stream that will fire `cb(data)` with all of the data that was written to the stream. Data can be written to `writable` as strings, Buffers, arrays of byte integers, and Uint8Arrays. By default `concat-stream` will give you back the same data type as the type of the first buffer written to the stream. Use `opts.encoding` to set what format `data` should be returned as, e.g. if you don't want to rely on the built-in type checking or for some other reason. * `string` - get a string * `buffer` - get back a Buffer * `array` - get an array of byte integers * `uint8array`, `u8`, `uint8` - get back a Uint8Array * `object` - get back an array of Objects If you don't specify an encoding, and the types can't be inferred (e.g. you write things that aren't in the list above), it will try to concat them into a `Buffer`. # error handling `concat-stream` does not handle errors for you, so you must handle errors on whatever streams you pipe into `concat-stream`. This is a general rule when programming with node.js streams: always handle errors on each and every stream. Since `concat-stream` is not itself a stream it does not emit errors. We recommend using [`end-of-stream`](https://npmjs.org/end-of-stream) or [`pump`](https://npmjs.org/pump) for writing error tolerant stream code.
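As a minimal sketch of that recommendation (assuming the standalone `pump` module is installed; `cat.png` just reuses the file from the Buffers example above):

```js
var fs = require('fs')
var pump = require('pump')
var concat = require('concat-stream')

// pump forwards an error from any stream in the pipeline into a single
// callback, which concat-stream by itself will not do for you.
pump(fs.createReadStream('cat.png'), concat(function (buf) {
  console.log('collected %d bytes', buf.length)
}), function (err) {
  if (err) console.error('stream failed:', err)
})
```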
# license MIT LICENSE ././@LongLink0000000000000000000000000000015100000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/0000755000000000000000000000000012631326456032141 5ustar 00000000000000././@LongLink0000000000000000000000000000016300000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/.npmignorenpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000004412631326456032243 0ustar 00000000000000build/ test/ examples/ fs.js zlib.js././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/.travis.ymlnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000315012631326456032244 0ustar 00000000000000sudo: false language: node_js before_install: - npm install -g npm@2 - npm install -g npm notifications: email: false matrix: include: - node_js: '0.8' env: TASK=test - node_js: '0.10' env: TASK=test - node_js: '0.11' env: TASK=test - node_js: '0.12' env: TASK=test - node_js: 1 env: TASK=test - node_js: 2 env: TASK=test - node_js: 3 env: TASK=test - node_js: 4 env: TASK=test - node_js: 5 env: TASK=test - node_js: node env: TASK=test - node_js: node env: TASK=browser BROWSER_NAME=opera BROWSER_VERSION="11..latest" - node_js: node env: TASK=browser BROWSER_NAME=ie BROWSER_VERSION="9..latest" - node_js: node env: TASK=browser BROWSER_NAME=chrome BROWSER_VERSION="41..beta" - node_js: node env: TASK=browser BROWSER_NAME=firefox BROWSER_VERSION="36..latest" - node_js: node env: TASK=browser BROWSER_NAME=ipad BROWSER_VERSION="['6.1', '7.1', '8.2']" - node_js: node env: TASK=browser BROWSER_NAME=iphone BROWSER_VERSION="['6.1', '7.1', '8.2']" - node_js: node env: TASK=browser BROWSER_NAME=safari BROWSER_VERSION="5..latest" - node_js: node env: TASK=browser BROWSER_NAME=android BROWSER_VERSION="4.0..latest" script: "npm run $TASK" env: global: - secure: rE2Vvo7vnjabYNULNyLFxOyt98BoJexDqsiOnfiD6kLYYsiQGfr/sbZkPMOFm9qfQG7pjqx+zZWZjGSswhTt+626C0t/njXqug7Yps4c3dFblzGfreQHp7wNX5TFsvrxd6dAowVasMp61sJcRnB2w8cUzoe3RAYUDHyiHktwqMc= - secure: g9YINaKAdMatsJ28G9jCGbSaguXCyxSTy+pBO6Ch0Cf57ZLOTka3HqDj8p3nV28LUIHZ3ut5WO43CeYKwt4AUtLpBS3a0dndHdY6D83uY6b2qh5hXlrcbeQTq2cvw2y95F7hm4D1kwrgZ7ViqaKggRcEupAL69YbJnxeUDKWEdI= ././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/.zuul.ymlnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000001112631326456032235 0ustar 00000000000000ui: tape ././@LongLink0000000000000000000000000000016000000000000011212 Lustar 
00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/LICENSEnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000211012631326456032237 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/README.mdnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000361012631326456032245 0ustar 00000000000000# readable-stream ***Node-core streams for userland*** [![Build Status](https://travis-ci.org/nodejs/readable-stream.svg?branch=master)](https://travis-ci.org/nodejs/readable-stream) [![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/) [![NPM](https://nodei.co/npm-dl/readable-stream.png?&months=6&height=3)](https://nodei.co/npm/readable-stream/) [![Sauce Test Status](https://saucelabs.com/browser-matrix/readable-stream.svg)](https://saucelabs.com/u/readable-stream) ```bash npm install --save readable-stream ``` ***Node-core streams for userland*** This package is a mirror of the Streams2 and Streams3 implementations in Node-core, including [documentation](doc/stream.markdown). If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core, for background see [this blogpost](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html). As of version 2.0.0 **readable-stream** uses semantic versioning. 
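As a minimal sketch of the substitution recommended above (the pushed string is purely illustrative):

```js
// instead of: var Readable = require('stream').Readable
var Readable = require('readable-stream').Readable

var rs = new Readable()
rs._read = function () {} // no-op source; data is pushed manually below
rs.push('the same Streams3 behavior on every supported Node version\n')
rs.push(null) // signal end-of-stream
rs.pipe(process.stdout)
```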
# Streams WG Team Members * **Chris Dickinson** ([@chrisdickinson](https://github.com/chrisdickinson)) <christopher.s.dickinson@gmail.com> - Release GPG key: 9554F04D7259F04124DE6B476D5A82AC7E37093B * **Calvin Metcalf** ([@calvinmetcalf](https://github.com/calvinmetcalf)) <calvin.metcalf@gmail.com> - Release GPG key: F3EF5F62A87FC27A22E643F714CE4FF5015AA242 * **Rod Vagg** ([@rvagg](https://github.com/rvagg)) <rod@vagg.org> - Release GPG key: DD8F2338BAE7501E3DD5AC78C273792F7D83545D * **Sam Newman** ([@sonewman](https://github.com/sonewman)) <newmansam@outlook.com> * **Mathias Buus** ([@mafintosh](https://github.com/mafintosh)) <mathiasbuus@gmail.com> * **Domenic Denicola** ([@domenic](https://github.com/domenic)) <d@domenic.me> ././@LongLink0000000000000000000000000000015500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/doc/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/duplex.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000006412631326456032245 0ustar 00000000000000module.exports = require("./lib/_stream_duplex.js") ././@LongLink0000000000000000000000000000015500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/package.jsonnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000376012631326456032253 0ustar 00000000000000{ "name": "readable-stream", "version": "2.0.4", "description": "Streams3, a user-land copy of the stream library from iojs v2.x", "main": "readable.js", "dependencies": { "core-util-is": "~1.0.0", "inherits": "~2.0.1", "isarray": "0.0.1", "process-nextick-args": "~1.0.0", "string_decoder": "~0.10.x", "util-deprecate": "~1.0.1" }, "devDependencies": { "tap": "~0.2.6", "tape": "~4.0.0", "zuul": "~3.0.0" }, "scripts": { "test": "tap test/parallel/*.js", "browser": "npm run write-zuul && zuul -- test/browser.js", "write-zuul": "printf \"ui: tape\nbrowsers:\n - name: $BROWSER_NAME\n version: $BROWSER_VERSION\n\">.zuul.yml" }, "repository": { "type": "git", "url": "git://github.com/nodejs/readable-stream.git" }, "keywords": [ "readable", "stream", "pipe" ], "browser": { "util": false }, "license": "MIT", "gitHead": "f2a4f4a659bacbe742a494b7d2aede64fab0d4f9", "bugs": { "url": 
"https://github.com/nodejs/readable-stream/issues" }, "homepage": "https://github.com/nodejs/readable-stream#readme", "_id": "readable-stream@2.0.4", "_shasum": "2523ef27ffa339d7ba9da8603f2d0599d06edbd8", "_from": "readable-stream@>=2.0.0 <2.1.0", "_npmVersion": "2.14.4", "_nodeVersion": "4.1.1", "_npmUser": { "name": "cwmma", "email": "calvin.metcalf@gmail.com" }, "dist": { "shasum": "2523ef27ffa339d7ba9da8603f2d0599d06edbd8", "tarball": "http://registry.npmjs.org/readable-stream/-/readable-stream-2.0.4.tgz" }, "maintainers": [ { "name": "isaacs", "email": "isaacs@npmjs.com" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" }, { "name": "rvagg", "email": "rod@vagg.org" }, { "name": "cwmma", "email": "calvin.metcalf@gmail.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.0.4.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/passthrough.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000007112631326456032243 0ustar 00000000000000module.exports = require("./lib/_stream_passthrough.js") ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/readable.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000101112631326456032236 0ustar 00000000000000var Stream = (function (){ try { return require('st' + 'ream'); // hack to fix a circular dependency issue when used with browserify } catch(_){} }()); exports = module.exports = require('./lib/_stream_readable.js'); exports.Stream = Stream || exports; exports.Readable = exports; exports.Writable = require('./lib/_stream_writable.js'); exports.Duplex = require('./lib/_stream_duplex.js'); exports.Transform = require('./lib/_stream_transform.js'); exports.PassThrough = require('./lib/_stream_passthrough.js'); ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/transform.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000006712631326456032250 0ustar 00000000000000module.exports = require("./lib/_stream_transform.js") ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/writable.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000006612631326456032247 0ustar 00000000000000module.exports = require("./lib/_stream_writable.js") ././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/doc/stream.markdownnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000015371712631326456032263 0ustar 00000000000000# Stream Stability: 2 - Stable A stream is an abstract interface implemented by various objects in Node.js. 
For example, a [request to an HTTP server](https://iojs.org/dist/v5.0.0/doc/api/http.html#http_http_incomingmessage) is a stream, as is [stdout][]. Streams are readable, writable, or both. All streams are instances of [EventEmitter][]. You can load the Stream base classes by doing `require('stream')`. There are base classes provided for [Readable][] streams, [Writable][] streams, [Duplex][] streams, and [Transform][] streams. This document is split up into 3 sections. The first explains the parts of the API that you need to be aware of to use streams in your programs. If you never implement a streaming API yourself, you can stop there. The second section explains the parts of the API that you need to use if you implement your own custom streams yourself. The API is designed to make this easy for you to do. The third section goes into more depth about how streams work, including some of the internal mechanisms and functions that you should probably not modify unless you definitely know what you are doing. ## API for Stream Consumers Streams can be either [Readable][], [Writable][], or both ([Duplex][]). All streams are EventEmitters, but they also have other custom methods and properties depending on whether they are Readable, Writable, or Duplex. If a stream is both Readable and Writable, then it implements all of the methods and events below. So, a [Duplex][] or [Transform][] stream is fully described by this API, though their implementation may be somewhat different. It is not necessary to implement Stream interfaces in order to consume streams in your programs. If you **are** implementing streaming interfaces in your own program, please also refer to [API for Stream Implementors][] below. Almost all Node.js programs, no matter how simple, use Streams in some way. Here is an example of using Streams in a Node.js program: ```javascript var http = require('http'); var server = http.createServer(function (req, res) { // req is an http.IncomingMessage, which is a Readable Stream // res is an http.ServerResponse, which is a Writable Stream var body = ''; // we want to get the data as utf8 strings // If you don't set an encoding, then you'll get Buffer objects req.setEncoding('utf8'); // Readable streams emit 'data' events once a listener is added req.on('data', function (chunk) { body += chunk; }); // the end event tells you that you have the entire body req.on('end', function () { try { var data = JSON.parse(body); } catch (er) { // uh oh! bad json! res.statusCode = 400; return res.end('error: ' + er.message); } // write back something interesting to the user: res.write(typeof data); res.end(); }); }); server.listen(1337); // $ curl localhost:1337 -d '{}' // object // $ curl localhost:1337 -d '"foo"' // string // $ curl localhost:1337 -d 'not json' // error: Unexpected token o ``` ### Class: stream.Readable The Readable stream interface is the abstraction for a *source* of data that you are reading from. In other words, data comes *out* of a Readable stream. A Readable stream will not start emitting data until you indicate that you are ready to receive it. Readable streams have two "modes": a **flowing mode** and a **paused mode**. When in flowing mode, data is read from the underlying system and provided to your program as fast as possible. In paused mode, you must explicitly call `stream.read()` to get chunks of data out. Streams start out in paused mode.
**Note**: If no data event handlers are attached, and there are no [`pipe()`][] destinations, and the stream is switched into flowing mode, then data will be lost. You can switch to flowing mode by doing any of the following: * Adding a [`'data'` event][] handler to listen for data. * Calling the [`resume()`][] method to explicitly open the flow. * Calling the [`pipe()`][] method to send the data to a [Writable][]. You can switch back to paused mode by doing either of the following: * If there are no pipe destinations, by calling the [`pause()`][] method. * If there are pipe destinations, by removing any [`'data'` event][] handlers, and removing all pipe destinations by calling the [`unpipe()`][] method. Note that, for backwards compatibility reasons, removing `'data'` event handlers will **not** automatically pause the stream. Also, if there are piped destinations, then calling `pause()` will not guarantee that the stream will *remain* paused once those destinations drain and ask for more data. Examples of readable streams include: * [http responses, on the client](https://iojs.org/dist/v5.0.0/doc/api/http.html#http_http_incomingmessage) * [http requests, on the server](https://iojs.org/dist/v5.0.0/doc/api/http.html#http_http_incomingmessage) * [fs read streams](https://iojs.org/dist/v5.0.0/doc/api/fs.html#fs_class_fs_readstream) * [zlib streams][] * [crypto streams][] * [tcp sockets][] * [child process stdout and stderr][] * [process.stdin][] #### Event: 'readable' When a chunk of data can be read from the stream, it will emit a `'readable'` event. In some cases, listening for a `'readable'` event will cause some data to be read into the internal buffer from the underlying system, if it hadn't already. ```javascript var readable = getReadableStreamSomehow(); readable.on('readable', function() { // there is some data to read now }); ``` Once the internal buffer is drained, a `readable` event will fire again when more data is available. The `readable` event is not emitted in the "flowing" mode with the sole exception of the last one, on end-of-stream. The 'readable' event indicates that the stream has new information: either new data is available or the end of the stream has been reached. In the former case, `.read()` will return that data. In the latter case, `.read()` will return null. For instance, in the following example, `foo.txt` is an empty file: ```javascript var fs = require('fs'); var rr = fs.createReadStream('foo.txt'); rr.on('readable', function() { console.log('readable:', rr.read()); }); rr.on('end', function() { console.log('end'); }); ``` The output of running this script is: ``` bash-3.2$ node test.js readable: null end ``` #### Event: 'data' * `chunk` {Buffer | String} The chunk of data. Attaching a `data` event listener to a stream that has not been explicitly paused will switch the stream into flowing mode. Data will then be passed as soon as it is available. If you just want to get all the data out of the stream as fast as possible, this is the best way to do so. ```javascript var readable = getReadableStreamSomehow(); readable.on('data', function(chunk) { console.log('got %d bytes of data', chunk.length); }); ``` #### Event: 'end' This event fires when there will be no more data to read. Note that the `end` event **will not fire** unless the data is completely consumed. This can be done by switching into flowing mode, or by calling `read()` repeatedly until you get to the end. 
```javascript var readable = getReadableStreamSomehow(); readable.on('data', function(chunk) { console.log('got %d bytes of data', chunk.length); }); readable.on('end', function() { console.log('there will be no more data.'); }); ``` #### Event: 'close' Emitted when the stream and any of its underlying resources (a file descriptor, for example) have been closed. The event indicates that no more events will be emitted, and no further computation will occur. Not all streams will emit the 'close' event. #### Event: 'error' * {Error Object} Emitted if there was an error receiving data. #### readable.read([size]) * `size` {Number} Optional argument to specify how much data to read. * Return {String | Buffer | null} The `read()` method pulls some data out of the internal buffer and returns it. If there is no data available, then it will return `null`. If you pass in a `size` argument, then it will return that many bytes. If `size` bytes are not available, then it will return `null`, unless we've ended, in which case it will return the data remaining in the buffer. If you do not specify a `size` argument, then it will return all the data in the internal buffer. This method should only be called in paused mode. In flowing mode, this method is called automatically until the internal buffer is drained. ```javascript var readable = getReadableStreamSomehow(); readable.on('readable', function() { var chunk; while (null !== (chunk = readable.read())) { console.log('got %d bytes of data', chunk.length); } }); ``` If this method returns a data chunk, then it will also trigger the emission of a [`'data'` event][]. Note that calling `readable.read([size])` after the `end` event has been triggered will return `null`. No runtime error will be raised. #### readable.setEncoding(encoding) * `encoding` {String} The encoding to use. * Return: `this` Call this function to cause the stream to return strings of the specified encoding instead of Buffer objects. For example, if you do `readable.setEncoding('utf8')`, then the output data will be interpreted as UTF-8 data, and returned as strings. If you do `readable.setEncoding('hex')`, then the data will be encoded in hexadecimal string format. This properly handles multi-byte characters that would otherwise be potentially mangled if you simply pulled the Buffers directly and called `buf.toString(encoding)` on them. If you want to read the data as strings, always use this method. ```javascript var readable = getReadableStreamSomehow(); readable.setEncoding('utf8'); readable.on('data', function(chunk) { assert.equal(typeof chunk, 'string'); console.log('got %d characters of string data', chunk.length); }); ``` #### readable.resume() * Return: `this` This method will cause the readable stream to resume emitting `data` events. This method will switch the stream into flowing mode. If you do *not* want to consume the data from a stream, but you *do* want to get to its `end` event, you can call [`readable.resume()`][] to open the flow of data. ```javascript var readable = getReadableStreamSomehow(); readable.resume(); readable.on('end', function() { console.log('got to the end, but did not read anything'); }); ``` #### readable.pause() * Return: `this` This method will cause a stream in flowing mode to stop emitting `data` events, switching out of flowing mode. Any data that becomes available will remain in the internal buffer. 
```javascript var readable = getReadableStreamSomehow(); readable.on('data', function(chunk) { console.log('got %d bytes of data', chunk.length); readable.pause(); console.log('there will be no more data for 1 second'); setTimeout(function() { console.log('now data will start flowing again'); readable.resume(); }, 1000); }); ``` #### readable.isPaused() * Return: `Boolean` This method returns whether or not the `readable` has been **explicitly** paused by client code (using `readable.pause()` without a corresponding `readable.resume()`). ```javascript var readable = new stream.Readable readable.isPaused() // === false readable.pause() readable.isPaused() // === true readable.resume() readable.isPaused() // === false ``` #### readable.pipe(destination[, options]) * `destination` {[Writable][] Stream} The destination for writing data * `options` {Object} Pipe options * `end` {Boolean} End the writer when the reader ends. Default = `true` This method pulls all the data out of a readable stream, and writes it to the supplied destination, automatically managing the flow so that the destination is not overwhelmed by a fast readable stream. Multiple destinations can be piped to safely. ```javascript var readable = getReadableStreamSomehow(); var writable = fs.createWriteStream('file.txt'); // All the data from readable goes into 'file.txt' readable.pipe(writable); ``` This function returns the destination stream, so you can set up pipe chains like so: ```javascript var r = fs.createReadStream('file.txt'); var z = zlib.createGzip(); var w = fs.createWriteStream('file.txt.gz'); r.pipe(z).pipe(w); ``` For example, emulating the Unix `cat` command: ```javascript process.stdin.pipe(process.stdout); ``` By default [`end()`][] is called on the destination when the source stream emits `end`, so that `destination` is no longer writable. Pass `{ end: false }` as `options` to keep the destination stream open. This keeps `writer` open so that "Goodbye" can be written at the end. ```javascript reader.pipe(writer, { end: false }); reader.on('end', function() { writer.end('Goodbye\n'); }); ``` Note that `process.stderr` and `process.stdout` are never closed until the process exits, regardless of the specified options. #### readable.unpipe([destination]) * `destination` {[Writable][] Stream} Optional specific stream to unpipe This method will remove the hooks set up for a previous `pipe()` call. If the destination is not specified, then all pipes are removed. If the destination is specified, but no pipe is set up for it, then this is a no-op. ```javascript var readable = getReadableStreamSomehow(); var writable = fs.createWriteStream('file.txt'); // All the data from readable goes into 'file.txt', // but only for the first second readable.pipe(writable); setTimeout(function() { console.log('stop writing to file.txt'); readable.unpipe(writable); console.log('manually close the file stream'); writable.end(); }, 1000); ``` #### readable.unshift(chunk) * `chunk` {Buffer | String} Chunk of data to unshift onto the read queue This is useful in certain cases where a stream is being consumed by a parser, which needs to "un-consume" some data that it has optimistically pulled out of the source, so that the stream can be passed on to some other party. Note that `stream.unshift(chunk)` cannot be called after the `end` event has been triggered; a runtime error will be raised. If you find that you must often call `stream.unshift(chunk)` in your programs, consider implementing a [Transform][] stream instead. 
(See API for Stream Implementors, below.) ```javascript // Pull off a header delimited by \n\n // use unshift() if we get too much // Call the callback with (error, header, stream) var StringDecoder = require('string_decoder').StringDecoder; function parseHeader(stream, callback) { stream.on('error', callback); stream.on('readable', onReadable); var decoder = new StringDecoder('utf8'); var header = ''; function onReadable() { var chunk; while (null !== (chunk = stream.read())) { var str = decoder.write(chunk); if (str.match(/\n\n/)) { // found the header boundary var split = str.split(/\n\n/); header += split.shift(); var remaining = split.join('\n\n'); var buf = new Buffer(remaining, 'utf8'); if (buf.length) stream.unshift(buf); stream.removeListener('error', callback); stream.removeListener('readable', onReadable); // now the body of the message can be read from the stream. callback(null, header, stream); } else { // still reading the header. header += str; } } } } ``` Note that, unlike `stream.push(chunk)`, `stream.unshift(chunk)` will not end the reading process by resetting the internal reading state of the stream. This can cause unexpected results if `unshift` is called during a read (i.e. from within a `_read` implementation on a custom stream). Following the call to `unshift` with an immediate `stream.push('')` will reset the reading state appropriately, however it is best to simply avoid calling `unshift` while in the process of performing a read. #### readable.wrap(stream) * `stream` {Stream} An "old style" readable stream Versions of Node.js prior to v0.10 had streams that did not implement the entire Streams API as it is today. (See "Compatibility" below for more information.) If you are using an older Node.js library that emits `'data'` events and has a [`pause()`][] method that is advisory only, then you can use the `wrap()` method to create a [Readable][] stream that uses the old stream as its data source. You will very rarely ever need to call this function, but it exists as a convenience for interacting with old Node.js programs and libraries. For example: ```javascript var OldReader = require('./old-api-module.js').OldReader; var oreader = new OldReader; var Readable = require('stream').Readable; var myReader = new Readable().wrap(oreader); myReader.on('readable', function() { myReader.read(); // etc. }); ``` ### Class: stream.Writable The Writable stream interface is an abstraction for a *destination* that you are writing data *to*. Examples of writable streams include: * [http requests, on the client](https://iojs.org/dist/v5.0.0/doc/api/http.html#http_class_http_clientrequest) * [http responses, on the server](https://iojs.org/dist/v5.0.0/doc/api/http.html#http_class_http_serverresponse) * [fs write streams](https://iojs.org/dist/v5.0.0/doc/api/fs.html#fs_class_fs_writestream) * [zlib streams][] * [crypto streams][] * [tcp sockets][] * [child process stdin](https://iojs.org/dist/v5.0.0/doc/api/child_process.html#child_process_child_stdin) * [process.stdout][], [process.stderr][] #### writable.write(chunk[, encoding][, callback]) * `chunk` {String | Buffer} The data to write * `encoding` {String} The encoding, if `chunk` is a String * `callback` {Function} Callback for when this chunk of data is flushed * Returns: {Boolean} True if the data was handled completely. This method writes some data to the underlying system, and calls the supplied callback once the data has been fully handled. The return value indicates if you should continue writing right now. 
If the data had to be buffered internally, then it will return `false`. Otherwise, it will return `true`. This return value is strictly advisory. You MAY continue to write, even if it returns `false`. However, writes will be buffered in memory, so it is best not to do this excessively. Instead, wait for the `drain` event before writing more data. #### Event: 'drain' If a [`writable.write(chunk)`][] call returns false, then the `drain` event will indicate when it is appropriate to begin writing more data to the stream. ```javascript // Write the data to the supplied writable stream one million times. // Be attentive to back-pressure. function writeOneMillionTimes(writer, data, encoding, callback) { var i = 1000000; write(); function write() { var ok = true; do { i -= 1; if (i === 0) { // last time! writer.write(data, encoding, callback); } else { // see if we should continue, or wait // don't pass the callback, because we're not done yet. ok = writer.write(data, encoding); } } while (i > 0 && ok); if (i > 0) { // had to stop early! // write some more once it drains writer.once('drain', write); } } } ``` #### writable.cork() Forces buffering of all writes. Buffered data will be flushed either at `.uncork()` or at `.end()` call. #### writable.uncork() Flush all data, buffered since `.cork()` call. #### writable.setDefaultEncoding(encoding) * `encoding` {String} The new default encoding Sets the default encoding for a writable stream. #### writable.end([chunk][, encoding][, callback]) * `chunk` {String | Buffer} Optional data to write * `encoding` {String} The encoding, if `chunk` is a String * `callback` {Function} Optional callback for when the stream is finished Call this method when no more data will be written to the stream. If supplied, the callback is attached as a listener on the `finish` event. Calling [`write()`][] after calling [`end()`][] will raise an error. ```javascript // write 'hello, ' and then end with 'world!' var file = fs.createWriteStream('example.txt'); file.write('hello, '); file.end('world!'); // writing more now is not allowed! ``` #### Event: 'finish' When the [`end()`][] method has been called, and all data has been flushed to the underlying system, this event is emitted. ```javascript var writer = getWritableStreamSomehow(); for (var i = 0; i < 100; i ++) { writer.write('hello, #' + i + '!\n'); } writer.end('this is the end\n'); writer.on('finish', function() { console.error('all writes are now complete.'); }); ``` #### Event: 'pipe' * `src` {[Readable][] Stream} source stream that is piping to this writable This is emitted whenever the `pipe()` method is called on a readable stream, adding this writable to its set of destinations. ```javascript var writer = getWritableStreamSomehow(); var reader = getReadableStreamSomehow(); writer.on('pipe', function(src) { console.error('something is piping into the writer'); assert.equal(src, reader); }); reader.pipe(writer); ``` #### Event: 'unpipe' * `src` {[Readable][] Stream} The source stream that [unpiped][] this writable This is emitted whenever the [`unpipe()`][] method is called on a readable stream, removing this writable from its set of destinations. ```javascript var writer = getWritableStreamSomehow(); var reader = getReadableStreamSomehow(); writer.on('unpipe', function(src) { console.error('something has stopped piping into the writer'); assert.equal(src, reader); }); reader.pipe(writer); reader.unpipe(writer); ``` #### Event: 'error' * {Error object} Emitted if there was an error when writing or piping data. 
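As a quick illustration of the `cork()`/`uncork()` batching described earlier in this section (a minimal sketch; `getWritableStreamSomehow()` is the same placeholder used by the other examples):

```javascript
var writer = getWritableStreamSomehow();

// Buffer several small writes and flush them as a single batch.
writer.cork();
writer.write('header: ');
writer.write('value\n');
process.nextTick(function() {
  writer.uncork(); // flushes everything buffered since cork()
});
```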
### Class: stream.Duplex Duplex streams are streams that implement both the [Readable][] and [Writable][] interfaces. See above for usage. Examples of Duplex streams include: * [tcp sockets][] * [zlib streams][] * [crypto streams][] ### Class: stream.Transform Transform streams are [Duplex][] streams where the output is in some way computed from the input. They implement both the [Readable][] and [Writable][] interfaces. See above for usage. Examples of Transform streams include: * [zlib streams][] * [crypto streams][] ## API for Stream Implementors To implement any sort of stream, the pattern is the same: 1. Extend the appropriate parent class in your own subclass. (The [`util.inherits`][] method is particularly helpful for this.) 2. Call the appropriate parent class constructor in your constructor, to be sure that the internal mechanisms are set up properly. 3. Implement one or more specific methods, as detailed below. The class to extend and the method(s) to implement depend on the sort of stream class you are writing:

| Use-case | Class | Method(s) to implement |
| --- | --- | --- |
| Reading only | [Readable](#stream_class_stream_readable_1) | [_read][] |
| Writing only | [Writable](#stream_class_stream_writable_1) | [_write][], _writev |
| Reading and writing | [Duplex](#stream_class_stream_duplex_1) | [_read][], [_write][], _writev |
| Operate on written data, then read the result | [Transform](#stream_class_stream_transform_1) | _transform, _flush |

      In your implementation code, it is very important to never call the methods described in [API for Stream Consumers][] above. Otherwise, you can potentially cause adverse side effects in programs that consume your streaming interfaces. ### Class: stream.Readable `stream.Readable` is an abstract class designed to be extended with an underlying implementation of the [`_read(size)`][] method. Please see above under [API for Stream Consumers][] for how to consume streams in your programs. What follows is an explanation of how to implement Readable streams in your programs. #### Example: A Counting Stream This is a basic example of a Readable stream. It emits the numerals from 1 to 1,000,000 in ascending order, and then ends. ```javascript var Readable = require('stream').Readable; var util = require('util'); util.inherits(Counter, Readable); function Counter(opt) { Readable.call(this, opt); this._max = 1000000; this._index = 1; } Counter.prototype._read = function() { var i = this._index++; if (i > this._max) this.push(null); else { var str = '' + i; var buf = new Buffer(str, 'ascii'); this.push(buf); } }; ``` #### Example: SimpleProtocol v1 (Sub-optimal) This is similar to the `parseHeader` function described above, but implemented as a custom stream. Also, note that this implementation does not convert the incoming data to a string. However, this would be better implemented as a [Transform][] stream. See below for a better implementation. ```javascript // A parser for a simple data protocol. // The "header" is a JSON object, followed by 2 \n characters, and // then a message body. // // NOTE: This can be done more simply as a Transform stream! // Using Readable directly for this is sub-optimal. See the // alternative example below under the Transform section. var Readable = require('stream').Readable; var util = require('util'); util.inherits(SimpleProtocol, Readable); function SimpleProtocol(source, options) { if (!(this instanceof SimpleProtocol)) return new SimpleProtocol(source, options); Readable.call(this, options); this._inBody = false; this._sawFirstCr = false; // source is a readable stream, such as a socket or file this._source = source; var self = this; source.on('end', function() { self.push(null); }); // give it a kick whenever the source is readable // read(0) will not consume any bytes source.on('readable', function() { self.read(0); }); this._rawHeader = []; this.header = null; } SimpleProtocol.prototype._read = function(n) { if (!this._inBody) { var chunk = this._source.read(); // if the source doesn't have data, we don't have data yet. if (chunk === null) return this.push(''); // check if the chunk has a \n\n var split = -1; for (var i = 0; i < chunk.length; i++) { if (chunk[i] === 10) { // '\n' if (this._sawFirstCr) { split = i; break; } else { this._sawFirstCr = true; } } else { this._sawFirstCr = false; } } if (split === -1) { // still waiting for the \n\n // stash the chunk, and try again. this._rawHeader.push(chunk); this.push(''); } else { this._inBody = true; var h = chunk.slice(0, split); this._rawHeader.push(h); var header = Buffer.concat(this._rawHeader).toString(); try { this.header = JSON.parse(header); } catch (er) { this.emit('error', new Error('invalid simple protocol data')); return; } // now, because we got some extra data, unshift the rest // back into the read queue so that our consumer will see it. 
var b = chunk.slice(split); this.unshift(b); // calling unshift by itself does not reset the reading state // of the stream; since we're inside _read, doing an additional // push('') will reset the state appropriately. this.push(''); // and let them know that we are done parsing the header. this.emit('header', this.header); } } else { // from there on, just provide the data to our consumer. // careful not to push(null), since that would indicate EOF. var chunk = this._source.read(); if (chunk) this.push(chunk); } }; // Usage: // var parser = new SimpleProtocol(source); // Now parser is a readable stream that will emit 'header' // with the parsed header data. ``` #### new stream.Readable([options]) * `options` {Object} * `highWaterMark` {Number} The maximum number of bytes to store in the internal buffer before ceasing to read from the underlying resource. Default=16kb, or 16 for `objectMode` streams * `encoding` {String} If specified, then buffers will be decoded to strings using the specified encoding. Default=null * `objectMode` {Boolean} Whether this stream should behave as a stream of objects. Meaning that stream.read(n) returns a single value instead of a Buffer of size n. Default=false In classes that extend the Readable class, make sure to call the Readable constructor so that the buffering settings can be properly initialized. #### readable.\_read(size) * `size` {Number} Number of bytes to read asynchronously Note: **Implement this method, but do NOT call it directly.** This method is prefixed with an underscore because it is internal to the class that defines it and should only be called by the internal Readable class methods. All Readable stream implementations must provide a _read method to fetch data from the underlying resource. When _read is called, if data is available from the resource, `_read` should start pushing that data into the read queue by calling `this.push(dataChunk)`. `_read` should continue reading from the resource and pushing data until push returns false, at which point it should stop reading from the resource. Only when _read is called again after it has stopped should it start reading more data from the resource and pushing that data onto the queue. Note: once the `_read()` method is called, it will not be called again until the `push` method is called. The `size` argument is advisory. Implementations where a "read" is a single call that returns data can use this to know how much data to fetch. Implementations where that is not relevant, such as TCP or TLS, may ignore this argument, and simply provide data whenever it becomes available. There is no need, for example, to "wait" until `size` bytes are available before calling [`stream.push(chunk)`][]. #### readable.push(chunk[, encoding]) * `chunk` {Buffer | null | String} Chunk of data to push into the read queue * `encoding` {String} Encoding of String chunks. Must be a valid Buffer encoding, such as `'utf8'` or `'ascii'` * return {Boolean} Whether or not more pushes should be performed Note: **This method should be called by Readable implementors, NOT by consumers of Readable streams.** If a value other than null is passed, the `push()` method adds a chunk of data into the queue for subsequent stream processors to consume. If `null` is passed, it signals the end of the stream (EOF), after which no more data can be written. The data added with `push` can be pulled out by calling the `read()` method when the `'readable'` event fires. This API is designed to be as flexible as possible.
For example, you may be wrapping a lower-level source which has some sort of pause/resume mechanism, and a data callback. In those cases, you could wrap the low-level source object by doing something like this: ```javascript // source is an object with readStop() and readStart() methods, // and an `ondata` member that gets called when it has data, and // an `onend` member that gets called when the data is over. util.inherits(SourceWrapper, Readable); function SourceWrapper(options) { Readable.call(this, options); this._source = getLowlevelSourceObject(); var self = this; // Every time there's data, we push it into the internal buffer. this._source.ondata = function(chunk) { // if push() returns false, then we need to stop reading from source if (!self.push(chunk)) self._source.readStop(); }; // When the source ends, we push the EOF-signaling `null` chunk this._source.onend = function() { self.push(null); }; } // _read will be called when the stream wants to pull more data in // the advisory size argument is ignored in this case. SourceWrapper.prototype._read = function(size) { this._source.readStart(); }; ``` ### Class: stream.Writable `stream.Writable` is an abstract class designed to be extended with an underlying implementation of the [`_write(chunk, encoding, callback)`][] method. Please see above under [API for Stream Consumers][] for how to consume writable streams in your programs. What follows is an explanation of how to implement Writable streams in your programs. #### new stream.Writable([options]) * `options` {Object} * `highWaterMark` {Number} Buffer level when [`write()`][] starts returning false. Default=16kb, or 16 for `objectMode` streams * `decodeStrings` {Boolean} Whether or not to decode strings into Buffers before passing them to [`_write()`][]. Default=true * `objectMode` {Boolean} Whether or not the `write(anyObj)` is a valid operation. If set you can write arbitrary data instead of only `Buffer` / `String` data. Default=false In classes that extend the Writable class, make sure to call the constructor so that the buffering settings can be properly initialized. #### writable.\_write(chunk, encoding, callback) * `chunk` {Buffer | String} The chunk to be written. Will **always** be a buffer unless the `decodeStrings` option was set to `false`. * `encoding` {String} If the chunk is a string, then this is the encoding type. If chunk is a buffer, then this is the special value - 'buffer', ignore it in this case. * `callback` {Function} Call this function (optionally with an error argument) when you are done processing the supplied chunk. All Writable stream implementations must provide a [`_write()`][] method to send data to the underlying resource. Note: **This function MUST NOT be called directly.** It should be implemented by child classes, and called by the internal Writable class methods only. Call the callback using the standard `callback(error)` pattern to signal that the write completed successfully or with an error. If the `decodeStrings` flag is set in the constructor options, then `chunk` may be a string rather than a Buffer, and `encoding` will indicate the sort of string that it is. This is to support implementations that have an optimized handling for certain string data encodings. If you do not explicitly set the `decodeStrings` option to `false`, then you can safely ignore the `encoding` argument, and assume that `chunk` will always be a Buffer. 
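To make the contract above concrete, here is a minimal sketch of a conforming `_write()` implementation (the `UpperCaseSink` class is hypothetical, not part of this document):

```javascript
var Writable = require('stream').Writable;
var util = require('util');

util.inherits(UpperCaseSink, Writable);

function UpperCaseSink(options) {
  Writable.call(this, options);
}

UpperCaseSink.prototype._write = function(chunk, encoding, callback) {
  // chunk is a Buffer unless decodeStrings: false was passed
  process.stdout.write(chunk.toString().toUpperCase());
  callback(); // signal that this chunk has been fully handled
};

// usage: process.stdin.pipe(new UpperCaseSink());
```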
This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes. #### writable.\_writev(chunks, callback) * `chunks` {Array} The chunks to be written. Each chunk has the following format: `{ chunk: ..., encoding: ... }`. * `callback` {Function} Call this function (optionally with an error argument) when you are done processing the supplied chunks. Note: **This function MUST NOT be called directly.** It may be implemented by child classes, and called by the internal Writable class methods only. This function is completely optional to implement. In most cases it is unnecessary. If implemented, it will be called with all the chunks that are buffered in the write queue. ### Class: stream.Duplex A "duplex" stream is one that is both Readable and Writable, such as a TCP socket connection. Note that `stream.Duplex` is an abstract class designed to be extended with an underlying implementation of the `_read(size)` and [`_write(chunk, encoding, callback)`][] methods as you would with a Readable or Writable stream class. Since JavaScript doesn't have multiple prototypal inheritance, this class prototypally inherits from Readable, and then parasitically from Writable. It is thus up to the user to implement both the low-level `_read(n)` method as well as the low-level [`_write(chunk, encoding, callback)`][] method on extension duplex classes. #### new stream.Duplex(options) * `options` {Object} Passed to both Writable and Readable constructors. Also has the following fields: * `allowHalfOpen` {Boolean} Default=true. If set to `false`, then the stream will automatically end the readable side when the writable side ends and vice versa. * `readableObjectMode` {Boolean} Default=false. Sets `objectMode` for the readable side of the stream. Has no effect if `objectMode` is `true`. * `writableObjectMode` {Boolean} Default=false. Sets `objectMode` for the writable side of the stream. Has no effect if `objectMode` is `true`. In classes that extend the Duplex class, make sure to call the constructor so that the buffering settings can be properly initialized. ### Class: stream.Transform A "transform" stream is a duplex stream where the output is causally connected in some way to the input, such as a [zlib][] stream or a [crypto][] stream. There is no requirement that the output be the same size as the input, consist of the same number of chunks, or arrive at the same time. For example, a Hash stream will only ever have a single chunk of output, which is provided when the input is ended. A zlib stream will produce output that is either much smaller or much larger than its input. Rather than implement the [`_read()`][] and [`_write()`][] methods, Transform classes must implement the `_transform()` method, and may optionally also implement the `_flush()` method. (See below.) #### new stream.Transform([options]) * `options` {Object} Passed to both Writable and Readable constructors. In classes that extend the Transform class, make sure to call the constructor so that the buffering settings can be properly initialized. #### transform.\_transform(chunk, encoding, callback) * `chunk` {Buffer | String} The chunk to be transformed. Will **always** be a buffer unless the `decodeStrings` option was set to `false`. * `encoding` {String} If the chunk is a string, then this is the encoding type. If chunk is a buffer, then this is the special value 'buffer'; ignore it in this case.
* `callback` {Function} Call this function (optionally with an error argument and data) when you are done processing the supplied chunk. Note: **This function MUST NOT be called directly.** It should be implemented by child classes, and called by the internal Transform class methods only. All Transform stream implementations must provide a `_transform` method to accept input and produce output. `_transform` should do whatever has to be done in this specific Transform class to handle the bytes being written, and pass them off to the readable portion of the interface. Do asynchronous I/O, process things, and so on. Call `transform.push(outputChunk)` zero or more times to generate output from this input chunk, depending on how much data you want to output as a result of this chunk. Call the callback function only when the current chunk is completely consumed. Note that there may or may not be output as a result of any particular input chunk. If you supply a second argument to the callback it will be passed to the push method. In other words, the following are equivalent: ```javascript transform.prototype._transform = function (data, encoding, callback) { this.push(data); callback(); }; transform.prototype._transform = function (data, encoding, callback) { callback(null, data); }; ``` This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes. #### transform.\_flush(callback) * `callback` {Function} Call this function (optionally with an error argument) when you are done flushing any remaining data. Note: **This function MUST NOT be called directly.** It MAY be implemented by child classes, and if so, will be called by the internal Transform class methods only. In some cases, your transform operation may need to emit a bit more data at the end of the stream. For example, a `Zlib` compression stream will store up some internal state so that it can optimally compress the output. At the end, however, it needs to do the best it can with what is left, so that the data will be complete. In those cases, you can implement a `_flush` method, which will be called at the very end, after all the written data is consumed, but before emitting `end` to signal the end of the readable side. Just like with `_transform`, call `transform.push(chunk)` zero or more times, as appropriate, and call `callback` when the flush operation is complete. This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes. #### Events: 'finish' and 'end' The [`finish`][] and [`end`][] events are from the parent Writable and Readable classes respectively. The `finish` event is fired after `.end()` is called and all chunks have been processed by `_transform`; `end` is fired after all data has been output, which happens after the callback in `_flush` has been called. #### Example: `SimpleProtocol` parser v2 The example above of a simple protocol parser can be implemented simply by using the higher level [Transform][] stream class, similar to the `parseHeader` and `SimpleProtocol v1` examples above. In this example, rather than providing the input as an argument, it would be piped into the parser, which is a more idiomatic Node.js stream approach.
```javascript var util = require('util'); var Transform = require('stream').Transform; util.inherits(SimpleProtocol, Transform); function SimpleProtocol(options) { if (!(this instanceof SimpleProtocol)) return new SimpleProtocol(options); Transform.call(this, options); this._inBody = false; this._sawFirstCr = false; this._rawHeader = []; this.header = null; } SimpleProtocol.prototype._transform = function(chunk, encoding, done) { if (!this._inBody) { // check if the chunk has a \n\n var split = -1; for (var i = 0; i < chunk.length; i++) { if (chunk[i] === 10) { // '\n' if (this._sawFirstCr) { split = i; break; } else { this._sawFirstCr = true; } } else { this._sawFirstCr = false; } } if (split === -1) { // still waiting for the \n\n // stash the chunk, and try again. this._rawHeader.push(chunk); } else { this._inBody = true; var h = chunk.slice(0, split); this._rawHeader.push(h); var header = Buffer.concat(this._rawHeader).toString(); try { this.header = JSON.parse(header); } catch (er) { this.emit('error', new Error('invalid simple protocol data')); return; } // and let them know that we are done parsing the header. this.emit('header', this.header); // now, because we got some extra data, emit this first. this.push(chunk.slice(split)); } } else { // from there on, just provide the data to our consumer as-is. this.push(chunk); } done(); }; // Usage: // var parser = new SimpleProtocol(); // source.pipe(parser) // Now parser is a readable stream that will emit 'header' // with the parsed header data. ``` ### Class: stream.PassThrough This is a trivial implementation of a [Transform][] stream that simply passes the input bytes across to the output. Its purpose is mainly for examples and testing, but there are occasionally use cases where it can come in handy as a building block for novel sorts of streams. ## Simplified Constructor API In simple cases, a stream can now be constructed without inheritance, by passing the appropriate methods as constructor options. Examples: ### Readable ```javascript var readable = new stream.Readable({ read: function(n) { // sets this._read under the hood } }); ``` ### Writable ```javascript var writable = new stream.Writable({ write: function(chunk, encoding, next) { // sets this._write under the hood } }); // or var writable = new stream.Writable({ writev: function(chunks, next) { // sets this._writev under the hood } }); ``` ### Duplex ```javascript var duplex = new stream.Duplex({ read: function(n) { // sets this._read under the hood }, write: function(chunk, encoding, next) { // sets this._write under the hood } }); // or var duplex = new stream.Duplex({ read: function(n) { // sets this._read under the hood }, writev: function(chunks, next) { // sets this._writev under the hood } }); ``` ### Transform ```javascript var transform = new stream.Transform({ transform: function(chunk, encoding, next) { // sets this._transform under the hood }, flush: function(done) { // sets this._flush under the hood } }); ``` ## Streams: Under the Hood ### Buffering Both Writable and Readable streams will buffer data on an internal object which can be retrieved from `_writableState.getBuffer()` or `_readableState.buffer`, respectively. The amount of data that will potentially be buffered depends on the `highWaterMark` option which is passed into the constructor. Buffering in Readable streams happens when the implementation calls [`stream.push(chunk)`][].
If the consumer of the Stream does not call `stream.read()`, then the data will sit in the internal queue until it is consumed. Buffering in Writable streams happens when the user calls [`stream.write(chunk)`][] repeatedly, even when `write()` returns `false`. The purpose of streams, especially with the `pipe()` method, is to limit the buffering of data to acceptable levels, so that sources and destinations of varying speed will not overwhelm the available memory. ### `stream.read(0)` There are some cases where you want to trigger a refresh of the underlying readable stream mechanisms, without actually consuming any data. In that case, you can call `stream.read(0)`, which will always return null. If the internal read buffer is below the `highWaterMark`, and the stream is not currently reading, then calling `read(0)` will trigger a low-level `_read` call. There is almost never a need to do this. However, you will see some cases in Node.js's internals where this is done, particularly in the Readable stream class internals. ### `stream.push('')` Pushing a zero-byte string or Buffer (when not in [Object mode][]) has an interesting side effect. Because it *is* a call to [`stream.push()`][], it will end the `reading` process. However, it does *not* add any data to the readable buffer, so there's nothing for a user to consume. Very rarely, there are cases where you have no data to provide now, but the consumer of your stream (or, perhaps, another bit of your own code) will know when to check again, by calling `stream.read(0)`. In those cases, you *may* call `stream.push('')`. So far, the only use case for this functionality is in the [tls.CryptoStream][] class, which is deprecated in Node.js/io.js v1.0. If you find that you have to use `stream.push('')`, please consider another approach, because it almost certainly indicates that something is horribly wrong. ### Compatibility with Older Node.js Versions In versions of Node.js prior to v0.10, the Readable stream interface was simpler, but also less powerful and less useful. * Rather than waiting for you to call the `read()` method, `'data'` events would start emitting immediately. If you needed to do some I/O to decide how to handle data, then you had to store the chunks in some kind of buffer so that they would not be lost. * The [`pause()`][] method was advisory, rather than guaranteed. This meant that you still had to be prepared to receive `'data'` events even when the stream was in a paused state. In Node.js v0.10, the Readable class described above was added. For backwards compatibility with older Node.js programs, Readable streams switch into "flowing mode" when a `'data'` event handler is added, or when the [`resume()`][] method is called. The effect is that, even if you are not using the new `read()` method and `'readable'` event, you no longer have to worry about losing `'data'` chunks. Most programs will continue to function normally. However, this introduces an edge case in the following conditions: * No [`'data'` event][] handler is added. * The [`resume()`][] method is never called. * The stream is not piped to any writable destination. For example, consider the following code: ```javascript // WARNING! BROKEN! net.createServer(function(socket) { // we add an 'end' listener, but never consume the data socket.on('end', function() { // It will never get here. socket.end('I got your message (but didnt read it)\n'); }); }).listen(1337); ``` In versions of Node.js prior to v0.10, the incoming message data would be simply discarded.
However, in Node.js v0.10 and beyond, the socket will remain paused forever. The workaround in this situation is to call the `resume()` method to start the flow of data: ```javascript // Workaround net.createServer(function(socket) { socket.on('end', function() { socket.end('I got your message (but didnt read it)\n'); }); // start the flow of data, discarding it. socket.resume(); }).listen(1337); ``` In addition to new Readable streams switching into flowing mode, pre-v0.10 style streams can be wrapped in a Readable class using the `wrap()` method. ### Object Mode Normally, Streams operate on Strings and Buffers exclusively. Streams that are in **object mode** can emit generic JavaScript values other than Buffers and Strings. A Readable stream in object mode will always return a single item from a call to `stream.read(size)`, regardless of what the size argument is. A Writable stream in object mode will always ignore the `encoding` argument to `stream.write(data, encoding)`. The special value `null` still retains its special meaning for object mode streams. That is, for object mode readable streams, `null` as a return value from `stream.read()` indicates that there is no more data, and [`stream.push(null)`][] will signal the end of stream data (`EOF`). No streams in Node.js core are object mode streams. This pattern is only used by userland streaming libraries. You should set `objectMode` on the options object passed to your stream child class constructor. Setting `objectMode` mid-stream is not safe. For Duplex streams, `objectMode` can be set exclusively for the readable or writable side with `readableObjectMode` and `writableObjectMode` respectively. These options can be used to implement parsers and serializers with Transform streams. ```javascript var util = require('util'); var StringDecoder = require('string_decoder').StringDecoder; var Transform = require('stream').Transform; util.inherits(JSONParseStream, Transform); // Gets \n-delimited JSON string data, and emits the parsed objects function JSONParseStream() { if (!(this instanceof JSONParseStream)) return new JSONParseStream(); Transform.call(this, { readableObjectMode : true }); this._buffer = ''; this._decoder = new StringDecoder('utf8'); } JSONParseStream.prototype._transform = function(chunk, encoding, cb) { this._buffer += this._decoder.write(chunk); // split on newlines var lines = this._buffer.split(/\r?\n/); // keep the last partial line buffered this._buffer = lines.pop(); for (var l = 0; l < lines.length; l++) { var line = lines[l]; try { var obj = JSON.parse(line); } catch (er) { this.emit('error', er); return; } // push the parsed object out to the readable consumer this.push(obj); } cb(); }; JSONParseStream.prototype._flush = function(cb) { // Just handle any leftover var rem = this._buffer.trim(); if (rem) { try { var obj = JSON.parse(rem); } catch (er) { this.emit('error', er); return; } // push the parsed object out to the readable consumer this.push(obj); } cb(); }; ``` [EventEmitter]: https://iojs.org/dist/v5.0.0/doc/api/events.html#events_class_events_eventemitter [Object mode]: #stream_object_mode [`stream.push(chunk)`]: #stream_readable_push_chunk_encoding [`stream.push(null)`]: #stream_readable_push_chunk_encoding [`stream.push()`]: #stream_readable_push_chunk_encoding [`unpipe()`]: #stream_readable_unpipe_destination [unpiped]: #stream_readable_unpipe_destination [tcp sockets]: https://iojs.org/dist/v5.0.0/doc/api/net.html#net_class_net_socket [zlib streams]: zlib.html [zlib]: zlib.html [crypto streams]: crypto.html
[crypto]: crypto.html [tls.CryptoStream]: https://iojs.org/dist/v5.0.0/doc/api/tls.html#tls_class_cryptostream [process.stdin]: https://iojs.org/dist/v5.0.0/doc/api/process.html#process_process_stdin [stdout]: https://iojs.org/dist/v5.0.0/doc/api/process.html#process_process_stdout [process.stdout]: https://iojs.org/dist/v5.0.0/doc/api/process.html#process_process_stdout [process.stderr]: https://iojs.org/dist/v5.0.0/doc/api/process.html#process_process_stderr [child process stdout and stderr]: https://iojs.org/dist/v5.0.0/doc/api/child_process.html#child_process_child_stdout [API for Stream Consumers]: #stream_api_for_stream_consumers [API for Stream Implementors]: #stream_api_for_stream_implementors [Readable]: #stream_class_stream_readable [Writable]: #stream_class_stream_writable [Duplex]: #stream_class_stream_duplex [Transform]: #stream_class_stream_transform [`end`]: #stream_event_end [`finish`]: #stream_event_finish [`_read(size)`]: #stream_readable_read_size_1 [`_read()`]: #stream_readable_read_size_1 [_read]: #stream_readable_read_size_1 [`writable.write(chunk)`]: #stream_writable_write_chunk_encoding_callback [`write(chunk, encoding, callback)`]: #stream_writable_write_chunk_encoding_callback [`write()`]: #stream_writable_write_chunk_encoding_callback [`stream.write(chunk)`]: #stream_writable_write_chunk_encoding_callback [`_write(chunk, encoding, callback)`]: #stream_writable_write_chunk_encoding_callback_1 [`_write()`]: #stream_writable_write_chunk_encoding_callback_1 [_write]: #stream_writable_write_chunk_encoding_callback_1 [`util.inherits`]: https://iojs.org/dist/v5.0.0/doc/api/util.html#util_util_inherits_constructor_superconstructor [`end()`]: #stream_writable_end_chunk_encoding_callback [`'data'` event]: #stream_event_data [`resume()`]: #stream_readable_resume [`readable.resume()`]: #stream_readable_resume [`pause()`]: #stream_readable_pause [`unpipe()`]: #stream_readable_unpipe_destination [`pipe()`]: #stream_readable_pipe_destination_options ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/doc/wg-meetings/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000././@LongLink0000000000000000000000000000020600000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/doc/wg-meetings/2015-01-30.mdnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000435012631326456032247 0ustar 00000000000000# streams WG Meeting 2015-01-30 ## Links * **Google Hangouts Video**: http://www.youtube.com/watch?v=I9nDOSGfwZg * **GitHub Issue**: https://github.com/iojs/readable-stream/issues/106 * **Original Minutes Google Doc**: https://docs.google.com/document/d/17aTgLnjMXIrfjgNaTUnHQO7m3xgzHR2VXBTmi03Qii4/ ## Agenda Extracted from https://github.com/iojs/readable-stream/labels/wg-agenda prior to meeting. 
* adopt a charter [#105](https://github.com/iojs/readable-stream/issues/105) * release and versioning strategy [#101](https://github.com/iojs/readable-stream/issues/101) * simpler stream creation [#102](https://github.com/iojs/readable-stream/issues/102) * proposal: deprecate implicit flowing of streams [#99](https://github.com/iojs/readable-stream/issues/99) ## Minutes ### adopt a charter * group: +1's all around ### What versioning scheme should be adopted? * group: +1’s 3.0.0 * domenic+group: pulling in patches from other sources where appropriate * mikeal: version independently, suggesting versions for io.js * mikeal+domenic: work with TC to notify in advance of changes ### streamline creation of streams * sam: streamline creation of streams * domenic: nice simple solution posted, but we lose the opportunity to change the model; may not be backwards incompatible (double check keys) **action item:** domenic will check ### remove implicit flowing of streams on(‘data’) * add isFlowing / isPaused * mikeal: worrying that we’re documenting polyfill methods – confuses users * domenic: more reflective API is probably good, with warning labels for users * new section for mad scientists (reflective stream access) * calvin: name the “third state” * mikeal: maybe borrow the name from whatwg? * domenic: we’re missing the “third state” * consensus: kind of difficult to name the third state * mikeal: figure out differences in states / compat * mathias: always flow on data – eliminates third state * explore what it breaks **action items:** * ask isaac for ability to list packages by what public io.js APIs they use (esp. Stream) * ask rod/build for infrastructure * **chris**: explore the “flow on data” approach * add isPaused/isFlowing * add new docs section * move isPaused to that section ././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/_stream_duplex.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000351412631326456032250 0ustar 00000000000000// a duplex stream is just a stream that is both readable and writable. // Since JS doesn't have multiple prototypal inheritance, this class // prototypally inherits from Readable, and then parasitically from // Writable.
'use strict'; /**/ var objectKeys = Object.keys || function (obj) { var keys = []; for (var key in obj) keys.push(key); return keys; } /**/ module.exports = Duplex; /**/ var processNextTick = require('process-nextick-args'); /**/ /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var Readable = require('./_stream_readable'); var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); var keys = objectKeys(Writable.prototype); for (var v = 0; v < keys.length; v++) { var method = keys[v]; if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method]; } function Duplex(options) { if (!(this instanceof Duplex)) return new Duplex(options); Readable.call(this, options); Writable.call(this, options); if (options && options.readable === false) this.readable = false; if (options && options.writable === false) this.writable = false; this.allowHalfOpen = true; if (options && options.allowHalfOpen === false) this.allowHalfOpen = false; this.once('end', onend); } // the no-half-open enforcer function onend() { // if we allow half-open state, or if the writable side ended, // then we're ok. if (this.allowHalfOpen || this._writableState.ended) return; // no more data can be written. // But allow more writes to happen in this tick. processNextTick(onEndNT, this); } function onEndNT(self) { self.end(); } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } ././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/_stream_passthrough.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000114012631326456032241 0ustar 00000000000000// a passthrough stream. // basically just the most minimal sort of Transform stream. // Every written chunk gets output as-is. 
'use strict'; module.exports = PassThrough; var Transform = require('./_stream_transform'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(PassThrough, Transform); function PassThrough(options) { if (!(this instanceof PassThrough)) return new PassThrough(options); Transform.call(this, options); } PassThrough.prototype._transform = function(chunk, encoding, cb) { cb(null, chunk); }; ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/_stream_readable.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000006233312631326456032254 0ustar 00000000000000'use strict'; module.exports = Readable; /**/ var processNextTick = require('process-nextick-args'); /**/ /**/ var isArray = require('isarray'); /**/ /**/ var Buffer = require('buffer').Buffer; /**/ Readable.ReadableState = ReadableState; var EE = require('events'); /**/ var EElistenerCount = function(emitter, type) { return emitter.listeners(type).length; }; /**/ /**/ var Stream; (function (){try{ Stream = require('st' + 'ream'); }catch(_){}finally{ if (!Stream) Stream = require('events').EventEmitter; }}()) /**/ var Buffer = require('buffer').Buffer; /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ /**/ var debugUtil = require('util'); var debug; if (debugUtil && debugUtil.debuglog) { debug = debugUtil.debuglog('stream'); } else { debug = function () {}; } /**/ var StringDecoder; util.inherits(Readable, Stream); function ReadableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // object stream flag. Used to make read(n) ignore n and to // make all the buffer merging and length checks go away this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.readableObjectMode; // the point at which it stops calling _read() to fill the buffer // Note: 0 is a valid value, means "don't call _read preemptively ever" var hwm = options.highWaterMark; var defaultHwm = this.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.buffer = []; this.length = 0; this.pipes = null; this.pipesCount = 0; this.flowing = null; this.ended = false; this.endEmitted = false; this.reading = false; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // whenever we return null, then we set a flag to say // that we're awaiting a 'readable' event emission. this.needReadable = false; this.emittedReadable = false; this.readableListening = false; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // when piping, we only care about 'readable' events that happen // after read()ing all the bytes and not getting any pushback. 
this.ranOut = false; // the number of writers that are awaiting a drain event in .pipe()s this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled this.readingMore = false; this.decoder = null; this.encoding = null; if (options.encoding) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this.decoder = new StringDecoder(options.encoding); this.encoding = options.encoding; } } function Readable(options) { var Duplex = require('./_stream_duplex'); if (!(this instanceof Readable)) return new Readable(options); this._readableState = new ReadableState(options, this); // legacy this.readable = true; if (options && typeof options.read === 'function') this._read = options.read; Stream.call(this); } // Manually shove something into the read() buffer. // This returns true if the highWaterMark has not been hit yet, // similar to how Writable.write() returns true if you should // write() some more. Readable.prototype.push = function(chunk, encoding) { var state = this._readableState; if (!state.objectMode && typeof chunk === 'string') { encoding = encoding || state.defaultEncoding; if (encoding !== state.encoding) { chunk = new Buffer(chunk, encoding); encoding = ''; } } return readableAddChunk(this, state, chunk, encoding, false); }; // Unshift should *always* be something directly out of read() Readable.prototype.unshift = function(chunk) { var state = this._readableState; return readableAddChunk(this, state, chunk, '', true); }; Readable.prototype.isPaused = function() { return this._readableState.flowing === false; }; function readableAddChunk(stream, state, chunk, encoding, addToFront) { var er = chunkInvalid(state, chunk); if (er) { stream.emit('error', er); } else if (chunk === null) { state.reading = false; onEofChunk(stream, state); } else if (state.objectMode || chunk && chunk.length > 0) { if (state.ended && !addToFront) { var e = new Error('stream.push() after EOF'); stream.emit('error', e); } else if (state.endEmitted && addToFront) { var e = new Error('stream.unshift() after end event'); stream.emit('error', e); } else { if (state.decoder && !addToFront && !encoding) chunk = state.decoder.write(chunk); if (!addToFront) state.reading = false; // if we want the data now, just emit it. if (state.flowing && state.length === 0 && !state.sync) { stream.emit('data', chunk); stream.read(0); } else { // update the buffer info. state.length += state.objectMode ? 1 : chunk.length; if (addToFront) state.buffer.unshift(chunk); else state.buffer.push(chunk); if (state.needReadable) emitReadable(stream); } maybeReadMore(stream, state); } } else if (!addToFront) { state.reading = false; } return needMoreData(state); } // if it's past the high water mark, we can push in some more. // Also, if we have no data yet, we can stand some // more bytes. This is to work around cases where hwm=0, // such as the repl. Also, if the push() triggered a // readable event, and the user called read(largeNumber) such that // needReadable was set, then we ought to push more, so that another // 'readable' event will be triggered. function needMoreData(state) { return !state.ended && (state.needReadable || state.length < state.highWaterMark || state.length === 0); } // backwards compatibility. 
Readable.prototype.setEncoding = function(enc) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this._readableState.decoder = new StringDecoder(enc); this._readableState.encoding = enc; return this; }; // Don't raise the hwm > 8MB var MAX_HWM = 0x800000; function computeNewHighWaterMark(n) { if (n >= MAX_HWM) { n = MAX_HWM; } else { // Get the next highest power of 2 n--; n |= n >>> 1; n |= n >>> 2; n |= n >>> 4; n |= n >>> 8; n |= n >>> 16; n++; } return n; } function howMuchToRead(n, state) { if (state.length === 0 && state.ended) return 0; if (state.objectMode) return n === 0 ? 0 : 1; if (n === null || isNaN(n)) { // only flow one buffer at a time if (state.flowing && state.buffer.length) return state.buffer[0].length; else return state.length; } if (n <= 0) return 0; // If we're asking for more than the target buffer level, // then raise the water mark. Bump up to the next highest // power of 2, to prevent increasing it excessively in tiny // amounts. if (n > state.highWaterMark) state.highWaterMark = computeNewHighWaterMark(n); // don't have that much. return null, unless we've ended. if (n > state.length) { if (!state.ended) { state.needReadable = true; return 0; } else { return state.length; } } return n; } // you can override either this method, or the async _read(n) below. Readable.prototype.read = function(n) { debug('read', n); var state = this._readableState; var nOrig = n; if (typeof n !== 'number' || n > 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we // already have a bunch of data in the buffer, then just trigger // the 'readable' event and move on. if (n === 0 && state.needReadable && (state.length >= state.highWaterMark || state.ended)) { debug('read: emitReadable', state.length, state.ended); if (state.length === 0 && state.ended) endReadable(this); else emitReadable(this); return null; } n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up. if (n === 0 && state.ended) { if (state.length === 0) endReadable(this); return null; } // All the actual chunk generation logic needs to be // *below* the call to _read. The reason is that in certain // synthetic stream cases, such as passthrough streams, _read // may be a completely synchronous operation which may change // the state of the read buffer, providing enough data when // before there was *not* enough. // // So, the steps are: // 1. Figure out what the state of things will be after we do // a read from the buffer. // // 2. If that resulting state will trigger a _read, then call _read. // Note that this may be asynchronous, or synchronous. Yes, it is // deeply ugly to write APIs this way, but that still doesn't mean // that the Readable class should behave improperly, as streams are // designed to be sync/async agnostic. // Take note if the _read call is sync or async (ie, if the read call // has returned yet), so that we know whether or not it's safe to emit // 'readable' etc. // // 3. Actually pull the requested chunks out of the buffer and return. // if we need a readable event, then we need to do some reading. var doRead = state.needReadable; debug('need readable', doRead); // if we currently have less than the highWaterMark, then also read some if (state.length === 0 || state.length - n < state.highWaterMark) { doRead = true; debug('length less than watermark', doRead); } // however, if we've ended, then there's no point, and if we're already // reading, then it's unnecessary. 
if (state.ended || state.reading) { doRead = false; debug('reading or ended', doRead); } if (doRead) { debug('do read'); state.reading = true; state.sync = true; // if the length is currently zero, then we *need* a readable event. if (state.length === 0) state.needReadable = true; // call internal read method this._read(state.highWaterMark); state.sync = false; } // If _read pushed data synchronously, then `reading` will be false, // and we need to re-evaluate how much data we can return to the user. if (doRead && !state.reading) n = howMuchToRead(nOrig, state); var ret; if (n > 0) ret = fromList(n, state); else ret = null; if (ret === null) { state.needReadable = true; n = 0; } state.length -= n; // If we have nothing in the buffer, then we want to know // as soon as we *do* get something into the buffer. if (state.length === 0 && !state.ended) state.needReadable = true; // If we tried to read() past the EOF, then emit end on the next tick. if (nOrig !== n && state.ended && state.length === 0) endReadable(this); if (ret !== null) this.emit('data', ret); return ret; }; function chunkInvalid(state, chunk) { var er = null; if (!(Buffer.isBuffer(chunk)) && typeof chunk !== 'string' && chunk !== null && chunk !== undefined && !state.objectMode) { er = new TypeError('Invalid non-string/buffer chunk'); } return er; } function onEofChunk(stream, state) { if (state.ended) return; if (state.decoder) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); state.length += state.objectMode ? 1 : chunk.length; } } state.ended = true; // emit 'readable' now to make sure it gets picked up. emitReadable(stream); } // Don't emit readable right away in sync mode, because this can trigger // another read() call => stack overflow. This way, it might trigger // a nextTick recursion warning, but that's not so bad. function emitReadable(stream) { var state = stream._readableState; state.needReadable = false; if (!state.emittedReadable) { debug('emitReadable', state.flowing); state.emittedReadable = true; if (state.sync) processNextTick(emitReadable_, stream); else emitReadable_(stream); } } function emitReadable_(stream) { debug('emit readable'); stream.emit('readable'); flow(stream); } // at this point, the user has presumably seen the 'readable' event, // and called read() to consume some data. that may have triggered // in turn another _read(n) call, in which case reading = true if // it's in progress. // However, if we're not ended, or reading, and the length < hwm, // then go ahead and try to read some more preemptively. function maybeReadMore(stream, state) { if (!state.readingMore) { state.readingMore = true; processNextTick(maybeReadMore_, stream, state); } } function maybeReadMore_(stream, state) { var len = state.length; while (!state.reading && !state.flowing && !state.ended && state.length < state.highWaterMark) { debug('maybeReadMore read 0'); stream.read(0); if (len === state.length) // didn't get any data, stop spinning. break; else len = state.length; } state.readingMore = false; } // abstract method. to be overridden in specific implementation classes. // call cb(er, data) where data is <= n in length. // for virtual (non-string, non-buffer) streams, "length" is somewhat // arbitrary, and perhaps not very meaningful. 
Readable.prototype._read = function(n) { this.emit('error', new Error('not implemented')); }; Readable.prototype.pipe = function(dest, pipeOpts) { var src = this; var state = this._readableState; switch (state.pipesCount) { case 0: state.pipes = dest; break; case 1: state.pipes = [state.pipes, dest]; break; default: state.pipes.push(dest); break; } state.pipesCount += 1; debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts); var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr; var endFn = doEnd ? onend : cleanup; if (state.endEmitted) processNextTick(endFn); else src.once('end', endFn); dest.on('unpipe', onunpipe); function onunpipe(readable) { debug('onunpipe'); if (readable === src) { cleanup(); } } function onend() { debug('onend'); dest.end(); } // when the dest drains, it reduces the awaitDrain counter // on the source. This would be more elegant with a .once() // handler in flow(), but adding and removing repeatedly is // too slow. var ondrain = pipeOnDrain(src); dest.on('drain', ondrain); var cleanedUp = false; function cleanup() { debug('cleanup'); // cleanup event handlers once the pipe is broken dest.removeListener('close', onclose); dest.removeListener('finish', onfinish); dest.removeListener('drain', ondrain); dest.removeListener('error', onerror); dest.removeListener('unpipe', onunpipe); src.removeListener('end', onend); src.removeListener('end', cleanup); src.removeListener('data', ondata); cleanedUp = true; // if the reader is waiting for a drain event from this // specific writer, then it would cause it to never start // flowing again. // So, if this is awaiting a drain, then we just call it now. // If we don't know, then assume that we are waiting for one. if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain(); } src.on('data', ondata); function ondata(chunk) { debug('ondata'); var ret = dest.write(chunk); if (false === ret) { // If the user unpiped during `dest.write()`, it is possible // to get stuck in a permanently paused state if that write // also returned false. if (state.pipesCount === 1 && state.pipes[0] === dest && src.listenerCount('data') === 1 && !cleanedUp) { debug('false write response, pause', src._readableState.awaitDrain); src._readableState.awaitDrain++; } src.pause(); } } // if the dest has an error, then stop piping into it. // however, don't suppress the throwing behavior for this. function onerror(er) { debug('onerror', er); unpipe(); dest.removeListener('error', onerror); if (EElistenerCount(dest, 'error') === 0) dest.emit('error', er); } // This is a brutally ugly hack to make sure that our error handler // is attached before any userland ones. NEVER DO THIS. if (!dest._events || !dest._events.error) dest.on('error', onerror); else if (isArray(dest._events.error)) dest._events.error.unshift(onerror); else dest._events.error = [onerror, dest._events.error]; // Both close and finish should trigger unpipe, but only once. function onclose() { dest.removeListener('finish', onfinish); unpipe(); } dest.once('close', onclose); function onfinish() { debug('onfinish'); dest.removeListener('close', onclose); unpipe(); } dest.once('finish', onfinish); function unpipe() { debug('unpipe'); src.unpipe(dest); } // tell the dest that it's being piped to dest.emit('pipe', src); // start the flow if it hasn't been started already. 
if (!state.flowing) { debug('pipe resume'); src.resume(); } return dest; }; function pipeOnDrain(src) { return function() { var state = src._readableState; debug('pipeOnDrain', state.awaitDrain); if (state.awaitDrain) state.awaitDrain--; if (state.awaitDrain === 0 && EElistenerCount(src, 'data')) { state.flowing = true; flow(src); } }; } Readable.prototype.unpipe = function(dest) { var state = this._readableState; // if we're not piping anywhere, then do nothing. if (state.pipesCount === 0) return this; // just one destination. most common case. if (state.pipesCount === 1) { // passed in one, but it's not the right one. if (dest && dest !== state.pipes) return this; if (!dest) dest = state.pipes; // got a match. state.pipes = null; state.pipesCount = 0; state.flowing = false; if (dest) dest.emit('unpipe', this); return this; } // slow case. multiple pipe destinations. if (!dest) { // remove all. var dests = state.pipes; var len = state.pipesCount; state.pipes = null; state.pipesCount = 0; state.flowing = false; for (var i = 0; i < len; i++) dests[i].emit('unpipe', this); return this; } // try to find the right one. var i = indexOf(state.pipes, dest); if (i === -1) return this; state.pipes.splice(i, 1); state.pipesCount -= 1; if (state.pipesCount === 1) state.pipes = state.pipes[0]; dest.emit('unpipe', this); return this; }; // set up data events if they are asked for // Ensure readable listeners eventually get something Readable.prototype.on = function(ev, fn) { var res = Stream.prototype.on.call(this, ev, fn); // If listening to data, and it has not explicitly been paused, // then call resume to start the flow of data on the next tick. if (ev === 'data' && false !== this._readableState.flowing) { this.resume(); } if (ev === 'readable' && this.readable) { var state = this._readableState; if (!state.readableListening) { state.readableListening = true; state.emittedReadable = false; state.needReadable = true; if (!state.reading) { processNextTick(nReadingNextTick, this); } else if (state.length) { emitReadable(this, state); } } } return res; }; Readable.prototype.addListener = Readable.prototype.on; function nReadingNextTick(self) { debug('readable nexttick read 0'); self.read(0); } // pause() and resume() are remnants of the legacy readable stream API // If the user uses them, then switch into old mode. Readable.prototype.resume = function() { var state = this._readableState; if (!state.flowing) { debug('resume'); state.flowing = true; resume(this, state); } return this; }; function resume(stream, state) { if (!state.resumeScheduled) { state.resumeScheduled = true; processNextTick(resume_, stream, state); } } function resume_(stream, state) { if (!state.reading) { debug('resume read 0'); stream.read(0); } state.resumeScheduled = false; stream.emit('resume'); flow(stream); if (state.flowing && !state.reading) stream.read(0); } Readable.prototype.pause = function() { debug('call pause flowing=%j', this._readableState.flowing); if (false !== this._readableState.flowing) { debug('pause'); this._readableState.flowing = false; this.emit('pause'); } return this; }; function flow(stream) { var state = stream._readableState; debug('flow', state.flowing); if (state.flowing) { do { var chunk = stream.read(); } while (null !== chunk && state.flowing); } } // wrap an old-style stream as the async data source. // This is *not* part of the readable stream interface. // It is an ugly unfortunate mess of history. 
Readable.prototype.wrap = function(stream) { var state = this._readableState; var paused = false; var self = this; stream.on('end', function() { debug('wrapped end'); if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) self.push(chunk); } self.push(null); }); stream.on('data', function(chunk) { debug('wrapped data'); if (state.decoder) chunk = state.decoder.write(chunk); // don't skip over falsy values in objectMode if (state.objectMode && (chunk === null || chunk === undefined)) return; else if (!state.objectMode && (!chunk || !chunk.length)) return; var ret = self.push(chunk); if (!ret) { paused = true; stream.pause(); } }); // proxy all the other methods. // important when wrapping filters and duplexes. for (var i in stream) { if (this[i] === undefined && typeof stream[i] === 'function') { this[i] = function(method) { return function() { return stream[method].apply(stream, arguments); }; }(i); } } // proxy certain important events. var events = ['error', 'close', 'destroy', 'pause', 'resume']; forEach(events, function(ev) { stream.on(ev, self.emit.bind(self, ev)); }); // when we try to consume some more bytes, simply unpause the // underlying stream. self._read = function(n) { debug('wrapped _read', n); if (paused) { paused = false; stream.resume(); } }; return self; }; // exposed for testing purposes only. Readable._fromList = fromList; // Pluck off n bytes from an array of buffers. // Length is the combined lengths of all the buffers in the list. function fromList(n, state) { var list = state.buffer; var length = state.length; var stringMode = !!state.decoder; var objectMode = !!state.objectMode; var ret; // nothing in the list, definitely empty. if (list.length === 0) return null; if (length === 0) ret = null; else if (objectMode) ret = list.shift(); else if (!n || n >= length) { // read it all, truncate the array. if (stringMode) ret = list.join(''); else if (list.length === 1) ret = list[0]; else ret = Buffer.concat(list, length); list.length = 0; } else { // read just some of it. if (n < list[0].length) { // just take a part of the first list item. // slice is the same for buffers and strings. var buf = list[0]; ret = buf.slice(0, n); list[0] = buf.slice(n); } else if (n === list[0].length) { // first list is a perfect match ret = list.shift(); } else { // complex case. // we have enough to cover it, but it spans past the first buffer. if (stringMode) ret = ''; else ret = new Buffer(n); var c = 0; for (var i = 0, l = list.length; i < l && c < n; i++) { var buf = list[0]; var cpy = Math.min(n - c, buf.length); if (stringMode) ret += buf.slice(0, cpy); else buf.copy(ret, c, 0, cpy); if (cpy < buf.length) list[0] = buf.slice(cpy); else list.shift(); c += cpy; } } } return ret; } function endReadable(stream) { var state = stream._readableState; // If we get here before consuming all the bytes, then that is a // bug in node. Should never happen. if (state.length > 0) throw new Error('endReadable called on non-empty stream'); if (!state.endEmitted) { state.ended = true; processNextTick(endReadableNT, state, stream); } } function endReadableNT(state, stream) { // Check that we didn't get one last unshift. 
if (!state.endEmitted && state.length === 0) { state.endEmitted = true; stream.readable = false; stream.emit('end'); } } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } function indexOf (xs, x) { for (var i = 0, l = xs.length; i < l; i++) { if (xs[i] === x) return i; } return -1; } ././@LongLink0000000000000000000000000000020100000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/_stream_transform.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000001441512631326456032252 0ustar 00000000000000// a transform stream is a readable/writable stream where you do // something with the data. Sometimes it's called a "filter", // but that's not a great name for it, since that implies a thing where // some bits pass through, and others are simply ignored. (That would // be a valid example of a transform, of course.) // // While the output is causally related to the input, it's not a // necessarily symmetric or synchronous transformation. For example, // a zlib stream might take multiple plain-text writes(), and then // emit a single compressed chunk some time in the future. // // Here's how this works: // // The Transform stream has all the aspects of the readable and writable // stream classes. When you write(chunk), that calls _write(chunk,cb) // internally, and returns false if there's a lot of pending writes // buffered up. When you call read(), that calls _read(n) until // there's enough pending readable data buffered up. // // In a transform stream, the written data is placed in a buffer. When // _read(n) is called, it transforms the queued up data, calling the // buffered _write cb's as it consumes chunks. If consuming a single // written chunk would result in multiple output chunks, then the first // outputted bit calls the readcb, and subsequent chunks just go into // the read buffer, and will cause it to emit 'readable' if necessary. // // This way, back-pressure is actually determined by the reading side, // since _read has to be called to start processing a new chunk. However, // a pathological inflate type of transform can cause excessive buffering // here. For example, imagine a stream where every byte of input is // interpreted as an integer from 0-255, and then results in that many // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in // 1kb of data being output. In this case, you could write a very small // amount of input, and end up with a very large amount of output. In // such a pathological inflating mechanism, there'd be no way to tell // the system to stop doing the transform. A single 4MB write could // cause the system to run out of memory. // // However, even in such a pathological case, only a single written chunk // would be consumed, and then the rest would wait (un-transformed) until // the results of the previous transformed chunk were consumed. 
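// For illustration, a minimal subclass built on the contract described
// above might look like the following sketch (the `Upper` name is
// hypothetical, not part of this library):
//
//   var Transform = require('./_stream_transform');
//   var util = require('util');
//
//   util.inherits(Upper, Transform);
//   function Upper(options) {
//     if (!(this instanceof Upper)) return new Upper(options);
//     Transform.call(this, options);
//   }
//
//   // Each written chunk is transformed and handed to the readable side;
//   // cb(err, data) both signals completion and pushes `data`.
//   Upper.prototype._transform = function(chunk, encoding, cb) {
//     cb(null, chunk.toString().toUpperCase());
//   };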
'use strict'; module.exports = Transform; var Duplex = require('./_stream_duplex'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(Transform, Duplex); function TransformState(stream) { this.afterTransform = function(er, data) { return afterTransform(stream, er, data); }; this.needTransform = false; this.transforming = false; this.writecb = null; this.writechunk = null; } function afterTransform(stream, er, data) { var ts = stream._transformState; ts.transforming = false; var cb = ts.writecb; if (!cb) return stream.emit('error', new Error('no writecb in Transform class')); ts.writechunk = null; ts.writecb = null; if (data !== null && data !== undefined) stream.push(data); if (cb) cb(er); var rs = stream._readableState; rs.reading = false; if (rs.needReadable || rs.length < rs.highWaterMark) { stream._read(rs.highWaterMark); } } function Transform(options) { if (!(this instanceof Transform)) return new Transform(options); Duplex.call(this, options); this._transformState = new TransformState(this); // when the writable side finishes, then flush out anything remaining. var stream = this; // start out asking for a readable event once data is transformed. this._readableState.needReadable = true; // we have implemented the _read method, and done the other things // that Readable wants before the first _read call, so unset the // sync guard flag. this._readableState.sync = false; if (options) { if (typeof options.transform === 'function') this._transform = options.transform; if (typeof options.flush === 'function') this._flush = options.flush; } this.once('prefinish', function() { if (typeof this._flush === 'function') this._flush(function(er) { done(stream, er); }); else done(stream); }); } Transform.prototype.push = function(chunk, encoding) { this._transformState.needTransform = false; return Duplex.prototype.push.call(this, chunk, encoding); }; // This is the part where you do stuff! // override this function in implementation classes. // 'chunk' is an input chunk. // // Call `push(newChunk)` to pass along transformed output // to the readable side. You may call 'push' zero or more times. // // Call `cb(err)` when you are done with this chunk. If you pass // an error, then that'll put the hurt on the whole operation. If you // never call cb(), then you'll never get another chunk. Transform.prototype._transform = function(chunk, encoding, cb) { throw new Error('not implemented'); }; Transform.prototype._write = function(chunk, encoding, cb) { var ts = this._transformState; ts.writecb = cb; ts.writechunk = chunk; ts.writeencoding = encoding; if (!ts.transforming) { var rs = this._readableState; if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark); } }; // Doesn't matter what the args are here. // _transform does all the work. // That we got here means that the readable side wants more data. Transform.prototype._read = function(n) { var ts = this._transformState; if (ts.writechunk !== null && ts.writecb && !ts.transforming) { ts.transforming = true; this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform); } else { // mark that we need a transform, so that any data that comes in // will get processed, now that we've asked for it. 
ts.needTransform = true; } }; function done(stream, er) { if (er) return stream.emit('error', er); // if there's nothing in the write buffer, then that means // that nothing more will ever be provided var ws = stream._writableState; var ts = stream._transformState; if (ws.length) throw new Error('calling transform done when ws.length != 0'); if (ts.transforming) throw new Error('calling transform done when still transforming'); return stream.push(null); } ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/_stream_writable.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000003260712631326456032255 0ustar 00000000000000// A bit simpler than readable streams. // Implement an async ._write(chunk, encoding, cb), and it'll handle all // the drain event emission and buffering. 'use strict'; module.exports = Writable; /**/ var processNextTick = require('process-nextick-args'); /**/ /**/ var Buffer = require('buffer').Buffer; /**/ Writable.WritableState = WritableState; /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ /**/ var internalUtil = { deprecate: require('util-deprecate') }; /**/ /**/ var Stream; (function (){try{ Stream = require('st' + 'ream'); }catch(_){}finally{ if (!Stream) Stream = require('events').EventEmitter; }}()) /**/ var Buffer = require('buffer').Buffer; util.inherits(Writable, Stream); function nop() {} function WriteReq(chunk, encoding, cb) { this.chunk = chunk; this.encoding = encoding; this.callback = cb; this.next = null; } function WritableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // object stream flag to indicate whether or not this stream // contains buffers or objects. this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // the point at which write() starts returning false // Note: 0 is a valid value, means that we always return false if // the entire buffer is not flushed immediately on write() var hwm = options.highWaterMark; var defaultHwm = this.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.needDrain = false; // at the start of calling end() this.ending = false; // when end() has been called, and returned this.ended = false; // when 'finish' is emitted this.finished = false; // should we decode strings into buffers before passing to _write? // this is here so that some node-core streams can optimize string // handling at a lower level. var noDecode = options.decodeStrings === false; this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement // of how much we're waiting to get pushed to some underlying // socket or file. this.length = 0; // a flag to see when we're in the middle of a write. this.writing = false; // when true all writes will be buffered until .uncork() call this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. 
././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/lib/_stream_writable.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000003260712631326456032255 0ustar 00000000000000
// A bit simpler than readable streams.
// Implement an async ._write(chunk, encoding, cb), and it'll handle all
// the drain event emission and buffering.

'use strict';

module.exports = Writable;

/**/
var processNextTick = require('process-nextick-args');
/**/

/**/
var Buffer = require('buffer').Buffer;
/**/

Writable.WritableState = WritableState;

/**/
var util = require('core-util-is');
util.inherits = require('inherits');
/**/

/**/
var internalUtil = {
  deprecate: require('util-deprecate')
};
/**/

/**/
var Stream;
(function (){try{
  Stream = require('st' + 'ream');
}catch(_){}finally{
  if (!Stream)
    Stream = require('events').EventEmitter;
}}())
/**/

var Buffer = require('buffer').Buffer;

util.inherits(Writable, Stream);

function nop() {}

function WriteReq(chunk, encoding, cb) {
  this.chunk = chunk;
  this.encoding = encoding;
  this.callback = cb;
  this.next = null;
}

function WritableState(options, stream) {
  var Duplex = require('./_stream_duplex');

  options = options || {};

  // object stream flag to indicate whether or not this stream
  // contains buffers or objects.
  this.objectMode = !!options.objectMode;

  if (stream instanceof Duplex)
    this.objectMode = this.objectMode || !!options.writableObjectMode;

  // the point at which write() starts returning false
  // Note: 0 is a valid value, means that we always return false if
  // the entire buffer is not flushed immediately on write()
  var hwm = options.highWaterMark;
  var defaultHwm = this.objectMode ? 16 : 16 * 1024;
  this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm;

  // cast to ints.
  this.highWaterMark = ~~this.highWaterMark;

  this.needDrain = false;
  // at the start of calling end()
  this.ending = false;
  // when end() has been called, and returned
  this.ended = false;
  // when 'finish' is emitted
  this.finished = false;

  // should we decode strings into buffers before passing to _write?
  // this is here so that some node-core streams can optimize string
  // handling at a lower level.
  var noDecode = options.decodeStrings === false;
  this.decodeStrings = !noDecode;

  // Crypto is kind of old and crusty.  Historically, its default string
  // encoding is 'binary' so we have to make this configurable.
  // Everything else in the universe uses 'utf8', though.
  this.defaultEncoding = options.defaultEncoding || 'utf8';

  // not an actual buffer we keep track of, but a measurement
  // of how much we're waiting to get pushed to some underlying
  // socket or file.
  this.length = 0;

  // a flag to see when we're in the middle of a write.
  this.writing = false;

  // when true all writes will be buffered until .uncork() call
  this.corked = 0;

  // a flag to be able to tell if the onwrite cb is called immediately,
  // or on a later tick.  We set this to true at first, because any
  // actions that shouldn't happen until "later" should generally also
  // not happen before the first write call.
  this.sync = true;

  // a flag to know if we're processing previously buffered items, which
  // may call the _write() callback in the same tick, so that we don't
  // end up in an overlapped onwrite situation.
  this.bufferProcessing = false;

  // the callback that's passed to _write(chunk,cb)
  this.onwrite = function(er) {
    onwrite(stream, er);
  };

  // the callback that the user supplies to write(chunk,encoding,cb)
  this.writecb = null;

  // the amount that is being written when _write is called.
  this.writelen = 0;

  this.bufferedRequest = null;
  this.lastBufferedRequest = null;

  // number of pending user-supplied write callbacks
  // this must be 0 before 'finish' can be emitted
  this.pendingcb = 0;

  // emit prefinish if the only thing we're waiting for is _write cbs
  // This is relevant for synchronous Transform streams
  this.prefinished = false;

  // True if the error was already emitted and should not be thrown again
  this.errorEmitted = false;
}

WritableState.prototype.getBuffer = function writableStateGetBuffer() {
  var current = this.bufferedRequest;
  var out = [];
  while (current) {
    out.push(current);
    current = current.next;
  }
  return out;
};

(function (){try {
  Object.defineProperty(WritableState.prototype, 'buffer', {
    get: internalUtil.deprecate(function() {
      return this.getBuffer();
    }, '_writableState.buffer is deprecated. Use _writableState.getBuffer ' +
       'instead.')
  });
}catch(_){}}());

function Writable(options) {
  var Duplex = require('./_stream_duplex');

  // Writable ctor is applied to Duplexes, though they're not
  // instanceof Writable, they're instanceof Readable.
  if (!(this instanceof Writable) && !(this instanceof Duplex))
    return new Writable(options);

  this._writableState = new WritableState(options, this);

  // legacy.
  this.writable = true;

  if (options) {
    if (typeof options.write === 'function')
      this._write = options.write;

    if (typeof options.writev === 'function')
      this._writev = options.writev;
  }

  Stream.call(this);
}

// Otherwise people can pipe Writable streams, which is just wrong.
Writable.prototype.pipe = function() {
  this.emit('error', new Error('Cannot pipe. Not readable.'));
};


function writeAfterEnd(stream, cb) {
  var er = new Error('write after end');
  // TODO: defer error events consistently everywhere, not just the cb
  stream.emit('error', er);
  processNextTick(cb, er);
}

// If we get something that is not a buffer, string, null, or undefined,
// and we're not in objectMode, then that's an error.
// Otherwise stream chunks are all considered to be of length=1, and the
// watermarks determine how many objects to keep in the buffer, rather than
// how many bytes or characters.
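// (For example, a stream created with { objectMode: true, highWaterMark: 2 }
// starts returning false from write() once two objects are queued,
// regardless of how many bytes each object contains.)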
function validChunk(stream, state, chunk, cb) {
  var valid = true;

  if (!(Buffer.isBuffer(chunk)) &&
      typeof chunk !== 'string' &&
      chunk !== null &&
      chunk !== undefined &&
      !state.objectMode) {
    var er = new TypeError('Invalid non-string/buffer chunk');
    stream.emit('error', er);
    processNextTick(cb, er);
    valid = false;
  }
  return valid;
}

Writable.prototype.write = function(chunk, encoding, cb) {
  var state = this._writableState;
  var ret = false;

  if (typeof encoding === 'function') {
    cb = encoding;
    encoding = null;
  }

  if (Buffer.isBuffer(chunk))
    encoding = 'buffer';
  else if (!encoding)
    encoding = state.defaultEncoding;

  if (typeof cb !== 'function')
    cb = nop;

  if (state.ended)
    writeAfterEnd(this, cb);
  else if (validChunk(this, state, chunk, cb)) {
    state.pendingcb++;
    ret = writeOrBuffer(this, state, chunk, encoding, cb);
  }

  return ret;
};

Writable.prototype.cork = function() {
  var state = this._writableState;

  state.corked++;
};

Writable.prototype.uncork = function() {
  var state = this._writableState;

  if (state.corked) {
    state.corked--;

    if (!state.writing &&
        !state.corked &&
        !state.finished &&
        !state.bufferProcessing &&
        state.bufferedRequest)
      clearBuffer(this, state);
  }
};

Writable.prototype.setDefaultEncoding = function setDefaultEncoding(encoding) {
  // node::ParseEncoding() requires lower case.
  if (typeof encoding === 'string')
    encoding = encoding.toLowerCase();
  if (!(['hex', 'utf8', 'utf-8', 'ascii', 'binary', 'base64',
         'ucs2', 'ucs-2', 'utf16le', 'utf-16le', 'raw']
        .indexOf((encoding + '').toLowerCase()) > -1))
    throw new TypeError('Unknown encoding: ' + encoding);
  this._writableState.defaultEncoding = encoding;
};

function decodeChunk(state, chunk, encoding) {
  if (!state.objectMode &&
      state.decodeStrings !== false &&
      typeof chunk === 'string') {
    chunk = new Buffer(chunk, encoding);
  }
  return chunk;
}

// if we're already writing something, then just put this
// in the queue, and wait our turn.  Otherwise, call _write
// If we return false, then we need a drain event, so set that flag.
function writeOrBuffer(stream, state, chunk, encoding, cb) {
  chunk = decodeChunk(state, chunk, encoding);

  if (Buffer.isBuffer(chunk))
    encoding = 'buffer';
  var len = state.objectMode ? 1 : chunk.length;

  state.length += len;

  var ret = state.length < state.highWaterMark;
  // we must ensure that previous needDrain will not be reset to false.
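  // (A false return tells the caller to stop writing until 'drain' is
  // emitted; needDrain stays latched even if a later, smaller write
  // would fit under the high-water mark.)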
  if (!ret)
    state.needDrain = true;

  if (state.writing || state.corked) {
    var last = state.lastBufferedRequest;
    state.lastBufferedRequest = new WriteReq(chunk, encoding, cb);
    if (last) {
      last.next = state.lastBufferedRequest;
    } else {
      state.bufferedRequest = state.lastBufferedRequest;
    }
  } else {
    doWrite(stream, state, false, len, chunk, encoding, cb);
  }

  return ret;
}

function doWrite(stream, state, writev, len, chunk, encoding, cb) {
  state.writelen = len;
  state.writecb = cb;
  state.writing = true;
  state.sync = true;
  if (writev)
    stream._writev(chunk, state.onwrite);
  else
    stream._write(chunk, encoding, state.onwrite);
  state.sync = false;
}

function onwriteError(stream, state, sync, er, cb) {
  --state.pendingcb;
  if (sync)
    processNextTick(cb, er);
  else
    cb(er);

  stream._writableState.errorEmitted = true;
  stream.emit('error', er);
}

function onwriteStateUpdate(state) {
  state.writing = false;
  state.writecb = null;
  state.length -= state.writelen;
  state.writelen = 0;
}

function onwrite(stream, er) {
  var state = stream._writableState;
  var sync = state.sync;
  var cb = state.writecb;

  onwriteStateUpdate(state);

  if (er)
    onwriteError(stream, state, sync, er, cb);
  else {
    // Check if we're actually ready to finish, but don't emit yet
    var finished = needFinish(state);

    if (!finished &&
        !state.corked &&
        !state.bufferProcessing &&
        state.bufferedRequest) {
      clearBuffer(stream, state);
    }

    if (sync) {
      processNextTick(afterWrite, stream, state, finished, cb);
    } else {
      afterWrite(stream, state, finished, cb);
    }
  }
}

function afterWrite(stream, state, finished, cb) {
  if (!finished)
    onwriteDrain(stream, state);
  state.pendingcb--;
  cb();
  finishMaybe(stream, state);
}

// Must force callback to be called on nextTick, so that we don't
// emit 'drain' before the write() consumer gets the 'false' return
// value, and has a chance to attach a 'drain' listener.
function onwriteDrain(stream, state) {
  if (state.length === 0 && state.needDrain) {
    state.needDrain = false;
    stream.emit('drain');
  }
}


// if there's something in the buffer waiting, then process it
function clearBuffer(stream, state) {
  state.bufferProcessing = true;
  var entry = state.bufferedRequest;

  if (stream._writev && entry && entry.next) {
    // Fast case, write everything using _writev()
    var buffer = [];
    var cbs = [];
    while (entry) {
      cbs.push(entry.callback);
      buffer.push(entry);
      entry = entry.next;
    }

    // count the one we are adding, as well.
    // TODO(isaacs) clean this up
    state.pendingcb++;
    state.lastBufferedRequest = null;
    doWrite(stream, state, true, state.length, buffer, '', function(err) {
      for (var i = 0; i < cbs.length; i++) {
        state.pendingcb--;
        cbs[i](err);
      }
    });

    // Clear buffer
  } else {
    // Slow case, write chunks one-by-one
    while (entry) {
      var chunk = entry.chunk;
      var encoding = entry.encoding;
      var cb = entry.callback;
      var len = state.objectMode ? 1 : chunk.length;

      doWrite(stream, state, false, len, chunk, encoding, cb);
      entry = entry.next;
      // if we didn't call the onwrite immediately, then
      // it means that we need to wait until it does.
      // also, that means that the chunk and cb are currently
      // being processed, so move the buffer counter past them.
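      // (state.writing is still true here only if _write() completed
      // asynchronously; a synchronous _write() has already run onwrite()
      // and cleared the flag, so the loop can keep draining.)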
      if (state.writing) {
        break;
      }
    }

    if (entry === null)
      state.lastBufferedRequest = null;
  }
  state.bufferedRequest = entry;
  state.bufferProcessing = false;
}

Writable.prototype._write = function(chunk, encoding, cb) {
  cb(new Error('not implemented'));
};

Writable.prototype._writev = null;

Writable.prototype.end = function(chunk, encoding, cb) {
  var state = this._writableState;

  if (typeof chunk === 'function') {
    cb = chunk;
    chunk = null;
    encoding = null;
  } else if (typeof encoding === 'function') {
    cb = encoding;
    encoding = null;
  }

  if (chunk !== null && chunk !== undefined)
    this.write(chunk, encoding);

  // .end() fully uncorks
  if (state.corked) {
    state.corked = 1;
    this.uncork();
  }

  // ignore unnecessary end() calls.
  if (!state.ending && !state.finished)
    endWritable(this, state, cb);
};

function needFinish(state) {
  return (state.ending &&
          state.length === 0 &&
          state.bufferedRequest === null &&
          !state.finished &&
          !state.writing);
}

function prefinish(stream, state) {
  if (!state.prefinished) {
    state.prefinished = true;
    stream.emit('prefinish');
  }
}

function finishMaybe(stream, state) {
  var need = needFinish(state);
  if (need) {
    if (state.pendingcb === 0) {
      prefinish(stream, state);
      state.finished = true;
      stream.emit('finish');
    } else {
      prefinish(stream, state);
    }
  }
  return need;
}

function endWritable(stream, state, cb) {
  state.ending = true;
  finishMaybe(stream, state);
  if (cb) {
    if (state.finished)
      processNextTick(cb);
    else
      stream.once('finish', cb);
  }
  state.ended = true;
}
././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000
././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000
././@LongLink0000000000000000000000000000021300000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/process-nextick-args/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000
././@LongLink0000000000000000000000000000020500000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000
././@LongLink0000000000000000000000000000020500000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/util-deprecate/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000
././@LongLink0000000000000000000000000000021200000000000011210 Lustar
00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/LICENSEnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000206512631326456032250 0ustar 00000000000000Copyright Node.js contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/README.mdnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000010312631326456032237 0ustar 00000000000000# core-util-is The `util.is*` functions introduced in Node v0.12. ././@LongLink0000000000000000000000000000021600000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/float.patchnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000003762612631326456032263 0ustar 00000000000000diff --git a/lib/util.js b/lib/util.js index a03e874..9074e8e 100644 --- a/lib/util.js +++ b/lib/util.js @@ -19,430 +19,6 @@ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. -var formatRegExp = /%[sdj%]/g; -exports.format = function(f) { - if (!isString(f)) { - var objects = []; - for (var i = 0; i < arguments.length; i++) { - objects.push(inspect(arguments[i])); - } - return objects.join(' '); - } - - var i = 1; - var args = arguments; - var len = args.length; - var str = String(f).replace(formatRegExp, function(x) { - if (x === '%%') return '%'; - if (i >= len) return x; - switch (x) { - case '%s': return String(args[i++]); - case '%d': return Number(args[i++]); - case '%j': - try { - return JSON.stringify(args[i++]); - } catch (_) { - return '[Circular]'; - } - default: - return x; - } - }); - for (var x = args[i]; i < len; x = args[++i]) { - if (isNull(x) || !isObject(x)) { - str += ' ' + x; - } else { - str += ' ' + inspect(x); - } - } - return str; -}; - - -// Mark that a method should not be used. -// Returns a modified function which warns once by default. -// If --no-deprecation is set, then it is a no-op. 
-exports.deprecate = function(fn, msg) { - // Allow for deprecating things in the process of starting up. - if (isUndefined(global.process)) { - return function() { - return exports.deprecate(fn, msg).apply(this, arguments); - }; - } - - if (process.noDeprecation === true) { - return fn; - } - - var warned = false; - function deprecated() { - if (!warned) { - if (process.throwDeprecation) { - throw new Error(msg); - } else if (process.traceDeprecation) { - console.trace(msg); - } else { - console.error(msg); - } - warned = true; - } - return fn.apply(this, arguments); - } - - return deprecated; -}; - - -var debugs = {}; -var debugEnviron; -exports.debuglog = function(set) { - if (isUndefined(debugEnviron)) - debugEnviron = process.env.NODE_DEBUG || ''; - set = set.toUpperCase(); - if (!debugs[set]) { - if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { - var pid = process.pid; - debugs[set] = function() { - var msg = exports.format.apply(exports, arguments); - console.error('%s %d: %s', set, pid, msg); - }; - } else { - debugs[set] = function() {}; - } - } - return debugs[set]; -}; - - -/** - * Echos the value of a value. Trys to print the value out - * in the best way possible given the different types. - * - * @param {Object} obj The object to print out. - * @param {Object} opts Optional options object that alters the output. - */ -/* legacy: obj, showHidden, depth, colors*/ -function inspect(obj, opts) { - // default options - var ctx = { - seen: [], - stylize: stylizeNoColor - }; - // legacy... - if (arguments.length >= 3) ctx.depth = arguments[2]; - if (arguments.length >= 4) ctx.colors = arguments[3]; - if (isBoolean(opts)) { - // legacy... - ctx.showHidden = opts; - } else if (opts) { - // got an "options" object - exports._extend(ctx, opts); - } - // set default options - if (isUndefined(ctx.showHidden)) ctx.showHidden = false; - if (isUndefined(ctx.depth)) ctx.depth = 2; - if (isUndefined(ctx.colors)) ctx.colors = false; - if (isUndefined(ctx.customInspect)) ctx.customInspect = true; - if (ctx.colors) ctx.stylize = stylizeWithColor; - return formatValue(ctx, obj, ctx.depth); -} -exports.inspect = inspect; - - -// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics -inspect.colors = { - 'bold' : [1, 22], - 'italic' : [3, 23], - 'underline' : [4, 24], - 'inverse' : [7, 27], - 'white' : [37, 39], - 'grey' : [90, 39], - 'black' : [30, 39], - 'blue' : [34, 39], - 'cyan' : [36, 39], - 'green' : [32, 39], - 'magenta' : [35, 39], - 'red' : [31, 39], - 'yellow' : [33, 39] -}; - -// Don't use 'blue' not visible on cmd.exe -inspect.styles = { - 'special': 'cyan', - 'number': 'yellow', - 'boolean': 'yellow', - 'undefined': 'grey', - 'null': 'bold', - 'string': 'green', - 'date': 'magenta', - // "name": intentionally not styling - 'regexp': 'red' -}; - - -function stylizeWithColor(str, styleType) { - var style = inspect.styles[styleType]; - - if (style) { - return '\u001b[' + inspect.colors[style][0] + 'm' + str + - '\u001b[' + inspect.colors[style][1] + 'm'; - } else { - return str; - } -} - - -function stylizeNoColor(str, styleType) { - return str; -} - - -function arrayToHash(array) { - var hash = {}; - - array.forEach(function(val, idx) { - hash[val] = true; - }); - - return hash; -} - - -function formatValue(ctx, value, recurseTimes) { - // Provide a hook for user-specified inspect functions. 
- // Check that value is an object with an inspect function on it - if (ctx.customInspect && - value && - isFunction(value.inspect) && - // Filter out the util module, it's inspect function is special - value.inspect !== exports.inspect && - // Also filter out any prototype objects using the circular check. - !(value.constructor && value.constructor.prototype === value)) { - var ret = value.inspect(recurseTimes, ctx); - if (!isString(ret)) { - ret = formatValue(ctx, ret, recurseTimes); - } - return ret; - } - - // Primitive types cannot have properties - var primitive = formatPrimitive(ctx, value); - if (primitive) { - return primitive; - } - - // Look up the keys of the object. - var keys = Object.keys(value); - var visibleKeys = arrayToHash(keys); - - if (ctx.showHidden) { - keys = Object.getOwnPropertyNames(value); - } - - // Some type of object without properties can be shortcutted. - if (keys.length === 0) { - if (isFunction(value)) { - var name = value.name ? ': ' + value.name : ''; - return ctx.stylize('[Function' + name + ']', 'special'); - } - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } - if (isDate(value)) { - return ctx.stylize(Date.prototype.toString.call(value), 'date'); - } - if (isError(value)) { - return formatError(value); - } - } - - var base = '', array = false, braces = ['{', '}']; - - // Make Array say that they are Array - if (isArray(value)) { - array = true; - braces = ['[', ']']; - } - - // Make functions say that they are functions - if (isFunction(value)) { - var n = value.name ? ': ' + value.name : ''; - base = ' [Function' + n + ']'; - } - - // Make RegExps say that they are RegExps - if (isRegExp(value)) { - base = ' ' + RegExp.prototype.toString.call(value); - } - - // Make dates with properties first say the date - if (isDate(value)) { - base = ' ' + Date.prototype.toUTCString.call(value); - } - - // Make error with message first say the error - if (isError(value)) { - base = ' ' + formatError(value); - } - - if (keys.length === 0 && (!array || value.length == 0)) { - return braces[0] + base + braces[1]; - } - - if (recurseTimes < 0) { - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } else { - return ctx.stylize('[Object]', 'special'); - } - } - - ctx.seen.push(value); - - var output; - if (array) { - output = formatArray(ctx, value, recurseTimes, visibleKeys, keys); - } else { - output = keys.map(function(key) { - return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array); - }); - } - - ctx.seen.pop(); - - return reduceToSingleString(output, base, braces); -} - - -function formatPrimitive(ctx, value) { - if (isUndefined(value)) - return ctx.stylize('undefined', 'undefined'); - if (isString(value)) { - var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '') - .replace(/'/g, "\\'") - .replace(/\\"/g, '"') + '\''; - return ctx.stylize(simple, 'string'); - } - if (isNumber(value)) { - // Format -0 as '-0'. Strict equality won't distinguish 0 from -0, - // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 . - if (value === 0 && 1 / value < 0) - return ctx.stylize('-0', 'number'); - return ctx.stylize('' + value, 'number'); - } - if (isBoolean(value)) - return ctx.stylize('' + value, 'boolean'); - // For some reason typeof null is "object", so special case here. 
- if (isNull(value)) - return ctx.stylize('null', 'null'); -} - - -function formatError(value) { - return '[' + Error.prototype.toString.call(value) + ']'; -} - - -function formatArray(ctx, value, recurseTimes, visibleKeys, keys) { - var output = []; - for (var i = 0, l = value.length; i < l; ++i) { - if (hasOwnProperty(value, String(i))) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - String(i), true)); - } else { - output.push(''); - } - } - keys.forEach(function(key) { - if (!key.match(/^\d+$/)) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - key, true)); - } - }); - return output; -} - - -function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) { - var name, str, desc; - desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] }; - if (desc.get) { - if (desc.set) { - str = ctx.stylize('[Getter/Setter]', 'special'); - } else { - str = ctx.stylize('[Getter]', 'special'); - } - } else { - if (desc.set) { - str = ctx.stylize('[Setter]', 'special'); - } - } - if (!hasOwnProperty(visibleKeys, key)) { - name = '[' + key + ']'; - } - if (!str) { - if (ctx.seen.indexOf(desc.value) < 0) { - if (isNull(recurseTimes)) { - str = formatValue(ctx, desc.value, null); - } else { - str = formatValue(ctx, desc.value, recurseTimes - 1); - } - if (str.indexOf('\n') > -1) { - if (array) { - str = str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n').substr(2); - } else { - str = '\n' + str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n'); - } - } - } else { - str = ctx.stylize('[Circular]', 'special'); - } - } - if (isUndefined(name)) { - if (array && key.match(/^\d+$/)) { - return str; - } - name = JSON.stringify('' + key); - if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) { - name = name.substr(1, name.length - 2); - name = ctx.stylize(name, 'name'); - } else { - name = name.replace(/'/g, "\\'") - .replace(/\\"/g, '"') - .replace(/(^"|"$)/g, "'"); - name = ctx.stylize(name, 'string'); - } - } - - return name + ': ' + str; -} - - -function reduceToSingleString(output, base, braces) { - var numLinesEst = 0; - var length = output.reduce(function(prev, cur) { - numLinesEst++; - if (cur.indexOf('\n') >= 0) numLinesEst++; - return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1; - }, 0); - - if (length > 60) { - return braces[0] + - (base === '' ? '' : base + '\n ') + - ' ' + - output.join(',\n ') + - ' ' + - braces[1]; - } - - return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1]; -} - - // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. function isArray(ar) { @@ -522,166 +98,10 @@ function isPrimitive(arg) { exports.isPrimitive = isPrimitive; function isBuffer(arg) { - return arg instanceof Buffer; + return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); -} - - -function pad(n) { - return n < 10 ? 
'0' + n.toString(10) : n.toString(10); -} - - -var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', - 'Oct', 'Nov', 'Dec']; - -// 26 Feb 16:19:34 -function timestamp() { - var d = new Date(); - var time = [pad(d.getHours()), - pad(d.getMinutes()), - pad(d.getSeconds())].join(':'); - return [d.getDate(), months[d.getMonth()], time].join(' '); -} - - -// log is just a thin wrapper to console.log that prepends a timestamp -exports.log = function() { - console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments)); -}; - - -/** - * Inherit the prototype methods from one constructor into another. - * - * The Function.prototype.inherits from lang.js rewritten as a standalone - * function (not on Function.prototype). NOTE: If this file is to be loaded - * during bootstrapping this function needs to be rewritten using some native - * functions as prototype setup using normal JavaScript does not work as - * expected during bootstrapping (see mirror.js in r114903). - * - * @param {function} ctor Constructor function which needs to inherit the - * prototype. - * @param {function} superCtor Constructor function to inherit prototype from. - */ -exports.inherits = function(ctor, superCtor) { - ctor.super_ = superCtor; - ctor.prototype = Object.create(superCtor.prototype, { - constructor: { - value: ctor, - enumerable: false, - writable: true, - configurable: true - } - }); -}; - -exports._extend = function(origin, add) { - // Don't do anything if add isn't an object - if (!add || !isObject(add)) return origin; - - var keys = Object.keys(add); - var i = keys.length; - while (i--) { - origin[keys[i]] = add[keys[i]]; - } - return origin; -}; - -function hasOwnProperty(obj, prop) { - return Object.prototype.hasOwnProperty.call(obj, prop); -} - - -// Deprecated old stuff. 
- -exports.p = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - console.error(exports.inspect(arguments[i])); - } -}, 'util.p: Use console.error() instead'); - - -exports.exec = exports.deprecate(function() { - return require('child_process').exec.apply(this, arguments); -}, 'util.exec is now called `child_process.exec`.'); - - -exports.print = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(String(arguments[i])); - } -}, 'util.print: Use console.log instead'); - - -exports.puts = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(arguments[i] + '\n'); - } -}, 'util.puts: Use console.log instead'); - - -exports.debug = exports.deprecate(function(x) { - process.stderr.write('DEBUG: ' + x + '\n'); -}, 'util.debug: Use console.error instead'); - - -exports.error = exports.deprecate(function(x) { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stderr.write(arguments[i] + '\n'); - } -}, 'util.error: Use console.error instead'); - - -exports.pump = exports.deprecate(function(readStream, writeStream, callback) { - var callbackCalled = false; - - function call(a, b, c) { - if (callback && !callbackCalled) { - callback(a, b, c); - callbackCalled = true; - } - } - - readStream.addListener('data', function(chunk) { - if (writeStream.write(chunk) === false) readStream.pause(); - }); - - writeStream.addListener('drain', function() { - readStream.resume(); - }); - - readStream.addListener('end', function() { - writeStream.end(); - }); - - readStream.addListener('close', function() { - call(); - }); - - readStream.addListener('error', function(err) { - writeStream.end(); - call(err); - }); - - writeStream.addListener('error', function(err) { - readStream.destroy(); - call(err); - }); -}, 'util.pump(): Use readableStream.pipe() instead'); - - -var uv; -exports._errnoException = function(err, syscall) { - if (isUndefined(uv)) uv = process.binding('uv'); - var errname = uv.errname(err); - var e = new Error(syscall + ' ' + errname); - e.code = errname; - e.errno = errname; - e.syscall = syscall; - return e; -}; +}././@LongLink0000000000000000000000000000020700000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/lib/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000././@LongLink0000000000000000000000000000021700000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/package.jsonnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000436212631326456032252 0ustar 00000000000000{ "_args": [ [ "core-util-is@~1.0.0", "/Users/rebecca/code/npm2/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream" ] ], "_from": "core-util-is@>=1.0.0 <1.1.0", "_id": "core-util-is@1.0.2", "_inCache": true, "_installable": true, "_location": "/npm-registry-client/concat-stream/readable-stream/core-util-is", "_nodeVersion": "4.0.0", "_npmUser": { "email": "i@izs.me", "name": "isaacs" }, "_npmVersion": "3.3.2", "_phantomChildren": {}, "_requested": { "name": "core-util-is", "raw": "core-util-is@~1.0.0", 
"rawSpec": "~1.0.0", "scope": null, "spec": ">=1.0.0 <1.1.0", "type": "range" }, "_requiredBy": [ "/npm-registry-client/concat-stream/readable-stream" ], "_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.2.tgz", "_shasum": "b5fd54220aa2bc5ab57aab7140c940754503c1a7", "_shrinkwrap": null, "_spec": "core-util-is@~1.0.0", "_where": "/Users/rebecca/code/npm2/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream", "author": { "email": "i@izs.me", "name": "Isaac Z. Schlueter", "url": "http://blog.izs.me/" }, "bugs": { "url": "https://github.com/isaacs/core-util-is/issues" }, "dependencies": {}, "description": "The `util.is*` functions introduced in Node v0.12.", "devDependencies": { "tap": "^2.3.0" }, "directories": {}, "dist": { "shasum": "b5fd54220aa2bc5ab57aab7140c940754503c1a7", "tarball": "http://registry.npmjs.org/core-util-is/-/core-util-is-1.0.2.tgz" }, "gitHead": "a177da234df5638b363ddc15fa324619a38577c8", "homepage": "https://github.com/isaacs/core-util-is#readme", "keywords": [ "isArray", "isBuffer", "isNumber", "isRegExp", "isString", "isThat", "isThis", "polyfill", "util" ], "license": "MIT", "main": "lib/util.js", "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "name": "core-util-is", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/isaacs/core-util-is.git" }, "scripts": { "test": "tap test.js" }, "version": "1.0.2" } ././@LongLink0000000000000000000000000000021200000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/test.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000406512631326456032252 0ustar 00000000000000var assert = require('tap'); var t = require('./lib/util'); assert.equal(t.isArray([]), true); assert.equal(t.isArray({}), false); assert.equal(t.isBoolean(null), false); assert.equal(t.isBoolean(true), true); assert.equal(t.isBoolean(false), true); assert.equal(t.isNull(null), true); assert.equal(t.isNull(undefined), false); assert.equal(t.isNull(false), false); assert.equal(t.isNull(), false); assert.equal(t.isNullOrUndefined(null), true); assert.equal(t.isNullOrUndefined(undefined), true); assert.equal(t.isNullOrUndefined(false), false); assert.equal(t.isNullOrUndefined(), true); assert.equal(t.isNumber(null), false); assert.equal(t.isNumber('1'), false); assert.equal(t.isNumber(1), true); assert.equal(t.isString(null), false); assert.equal(t.isString('1'), true); assert.equal(t.isString(1), false); assert.equal(t.isSymbol(null), false); assert.equal(t.isSymbol('1'), false); assert.equal(t.isSymbol(1), false); assert.equal(t.isSymbol(Symbol()), true); assert.equal(t.isUndefined(null), false); assert.equal(t.isUndefined(undefined), true); assert.equal(t.isUndefined(false), false); assert.equal(t.isUndefined(), true); assert.equal(t.isRegExp(null), false); assert.equal(t.isRegExp('1'), false); assert.equal(t.isRegExp(new RegExp()), true); assert.equal(t.isObject({}), true); assert.equal(t.isObject([]), true); assert.equal(t.isObject(new RegExp()), true); assert.equal(t.isObject(new Date()), true); assert.equal(t.isDate(null), false); assert.equal(t.isDate('1'), false); assert.equal(t.isDate(new Date()), true); assert.equal(t.isError(null), false); assert.equal(t.isError({ err: true }), false); assert.equal(t.isError(new Error()), true); 
assert.equal(t.isFunction(null), false); assert.equal(t.isFunction({ }), false); assert.equal(t.isFunction(function() {}), true); assert.equal(t.isPrimitive(null), true); assert.equal(t.isPrimitive(''), true); assert.equal(t.isPrimitive(0), true); assert.equal(t.isPrimitive(new Date()), false); assert.equal(t.isBuffer(null), false); assert.equal(t.isBuffer({}), false); assert.equal(t.isBuffer(new Buffer(0)), true); ././@LongLink0000000000000000000000000000021600000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/core-util-is/lib/util.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000571512631326456032255 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
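// (For example, Object.create(Date.prototype) passes `instanceof Date`
// without being a real Date, while Object.prototype.toString.call() still
// reports '[object Object]' for it; hence the [[Class]]-string checks
// used by the functions below.)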
function isArray(arg) { if (Array.isArray) { return Array.isArray(arg); } return objectToString(arg) === '[object Array]'; } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return (objectToString(e) === '[object Error]' || e instanceof Error); } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; exports.isBuffer = Buffer.isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); } ././@LongLink0000000000000000000000000000020700000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/README.mdnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000302512631326456032245 0ustar 00000000000000 # isarray `Array#isArray` for older browsers. ## Usage ```js var isArray = require('isarray'); console.log(isArray([])); // => true console.log(isArray({})); // => false ``` ## Installation With [npm](http://npmjs.org) do ```bash $ npm install isarray ``` Then bundle for the browser with [browserify](https://github.com/substack/browserify). With [component](http://component.io) do ```bash $ component install juliangruber/isarray ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000020400000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/build/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000755000000000000000000000000012631326456032243 5ustar 00000000000000././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/component.jsonnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000072612631326456032252 0ustar 00000000000000{ "name" : "isarray", "description" : "Array#isArray for older browsers", "version" : "0.0.1", "repository" : "juliangruber/isarray", "homepage": "https://github.com/juliangruber/isarray", "main" : "index.js", "scripts" : [ "index.js" ], "dependencies" : {}, "keywords": ["browser","isarray","array"], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT" } ././@LongLink0000000000000000000000000000020600000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/index.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000017012631326456032243 0ustar 00000000000000module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; ././@LongLink0000000000000000000000000000021200000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/package.jsonnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000401512631326456032245 0ustar 00000000000000{ "_args": [ [ "isarray@0.0.1", "/Users/rebecca/code/npm2/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream" ] ], "_from": "isarray@0.0.1", "_id": "isarray@0.0.1", "_inCache": true, "_installable": true, "_location": "/npm-registry-client/concat-stream/readable-stream/isarray", "_npmUser": { "email": "julian@juliangruber.com", "name": "juliangruber" }, "_npmVersion": "1.2.18", "_phantomChildren": {}, "_requested": { "name": "isarray", "raw": "isarray@0.0.1", "rawSpec": "0.0.1", "scope": null, "spec": "0.0.1", "type": "version" }, "_requiredBy": [ "/npm-registry-client/concat-stream/readable-stream" ], "_resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz", "_shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "_shrinkwrap": null, "_spec": "isarray@0.0.1", "_where": "/Users/rebecca/code/npm2/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream", "author": { "email": "mail@juliangruber.com", "name": "Julian Gruber", "url": "http://juliangruber.com" }, "bugs": { "url": "https://github.com/juliangruber/isarray/issues" }, "dependencies": {}, "description": "Array#isArray for older 
browsers", "devDependencies": { "tap": "*" }, "directories": {}, "dist": { "shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "tarball": "http://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz" }, "homepage": "https://github.com/juliangruber/isarray", "keywords": [ "array", "browser", "isarray" ], "license": "MIT", "main": "index.js", "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" } ], "name": "isarray", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/juliangruber/isarray.git" }, "scripts": { "test": "tap test/*.js" }, "version": "0.0.1" } ././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/isarray/build/build.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000777112631326456032261 0ustar 00000000000000 /** * Require the given path. * * @param {String} path * @return {Object} exports * @api public */ function require(path, parent, orig) { var resolved = require.resolve(path); // lookup failed if (null == resolved) { orig = orig || path; parent = parent || 'root'; var err = new Error('Failed to require "' + orig + '" from "' + parent + '"'); err.path = orig; err.parent = parent; err.require = true; throw err; } var module = require.modules[resolved]; // perform real require() // by invoking the module's // registered function if (!module.exports) { module.exports = {}; module.client = module.component = true; module.call(this, module.exports, require.relative(resolved), module); } return module.exports; } /** * Registered modules. */ require.modules = {}; /** * Registered aliases. */ require.aliases = {}; /** * Resolve `path`. * * Lookup: * * - PATH/index.js * - PATH.js * - PATH * * @param {String} path * @return {String} path or null * @api private */ require.resolve = function(path) { if (path.charAt(0) === '/') path = path.slice(1); var index = path + '/index.js'; var paths = [ path, path + '.js', path + '.json', path + '/index.js', path + '/index.json' ]; for (var i = 0; i < paths.length; i++) { var path = paths[i]; if (require.modules.hasOwnProperty(path)) return path; } if (require.aliases.hasOwnProperty(index)) { return require.aliases[index]; } }; /** * Normalize `path` relative to the current path. * * @param {String} curr * @param {String} path * @return {String} * @api private */ require.normalize = function(curr, path) { var segs = []; if ('.' != path.charAt(0)) return path; curr = curr.split('/'); path = path.split('/'); for (var i = 0; i < path.length; ++i) { if ('..' == path[i]) { curr.pop(); } else if ('.' != path[i] && '' != path[i]) { segs.push(path[i]); } } return curr.concat(segs).join('/'); }; /** * Register module at `path` with callback `definition`. * * @param {String} path * @param {Function} definition * @api private */ require.register = function(path, definition) { require.modules[path] = definition; }; /** * Alias a module definition. * * @param {String} from * @param {String} to * @api private */ require.alias = function(from, to) { if (!require.modules.hasOwnProperty(from)) { throw new Error('Failed to alias "' + from + '", it does not exist'); } require.aliases[to] = from; }; /** * Return a require function relative to the `parent` path. 
* * @param {String} parent * @return {Function} * @api private */ require.relative = function(parent) { var p = require.normalize(parent, '..'); /** * lastIndexOf helper. */ function lastIndexOf(arr, obj) { var i = arr.length; while (i--) { if (arr[i] === obj) return i; } return -1; } /** * The relative require() itself. */ function localRequire(path) { var resolved = localRequire.resolve(path); return require(resolved, parent, path); } /** * Resolve relative to the parent. */ localRequire.resolve = function(path) { var c = path.charAt(0); if ('/' == c) return path.slice(1); if ('.' == c) return require.normalize(p, path); // resolve deps by returning // the dep in the nearest "deps" // directory var segs = parent.split('/'); var i = lastIndexOf(segs, 'deps') + 1; if (!i) i = 0; path = segs.slice(0, i + 1).join('/') + '/deps/' + path; return path; }; /** * Check if module is defined at `path`. */ localRequire.exists = function(path) { return require.modules.hasOwnProperty(localRequire.resolve(path)); }; return localRequire; }; require.register("isarray/index.js", function(exports, require, module){ module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; }); require.alias("isarray/index.js", "isarray/index.js"); ././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/process-nextick-args/.travis.ymlnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000012112631326456032237 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10" - "0.11" - "0.12" - "iojs" ././@LongLink0000000000000000000000000000022300000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/process-nextick-args/index.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000040712631326456032246 0ustar 00000000000000'use strict'; module.exports = nextTick; function nextTick(fn) { var args = new Array(arguments.length - 1); var i = 0; while (i < args.length) { args[i++] = arguments[i]; } process.nextTick(function afterTick() { fn.apply(null, args); }); } ././@LongLink0000000000000000000000000000022500000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/process-nextick-args/license.mdnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000205012631326456032242 0ustar 00000000000000# Copyright (c) 2015 Calvin Metcalf Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
**THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.** ././@LongLink0000000000000000000000000000022700000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/process-nextick-args/package.jsonnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000245412631326456032252 0ustar 00000000000000{ "name": "process-nextick-args", "version": "1.0.3", "description": "process.nextTick but always with args", "main": "index.js", "scripts": { "test": "node test.js" }, "repository": { "type": "git", "url": "git+https://github.com/calvinmetcalf/process-nextick-args.git" }, "author": "", "license": "MIT", "bugs": { "url": "https://github.com/calvinmetcalf/process-nextick-args/issues" }, "homepage": "https://github.com/calvinmetcalf/process-nextick-args", "devDependencies": { "tap": "~0.2.6" }, "gitHead": "e855846a69662b9489f1ad3dde1ebf2ccc4370b8", "_id": "process-nextick-args@1.0.3", "_shasum": "e272eed825d5e9f4ea74d8d73b1fe311c3beb630", "_from": "process-nextick-args@>=1.0.0 <1.1.0", "_npmVersion": "2.9.0", "_nodeVersion": "2.5.0", "_npmUser": { "name": "cwmma", "email": "calvin.metcalf@gmail.com" }, "dist": { "shasum": "e272eed825d5e9f4ea74d8d73b1fe311c3beb630", "tarball": "http://registry.npmjs.org/process-nextick-args/-/process-nextick-args-1.0.3.tgz" }, "maintainers": [ { "name": "cwmma", "email": "calvin.metcalf@gmail.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-1.0.3.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/process-nextick-args/readme.mdnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000070312631326456032245 0ustar 00000000000000process-nextick-args ===== [![Build Status](https://travis-ci.org/calvinmetcalf/process-nextick-args.svg?branch=master)](https://travis-ci.org/calvinmetcalf/process-nextick-args) ```bash npm install --save process-nextick-args ``` Always be able to pass arguments to process.nextTick, no matter the platform ```js var nextTick = require('process-nextick-args'); nextTick(function (a, b, c) { console.log(a, b, c); }, 'step', 3, 'profit'); ``` ././@LongLink0000000000000000000000000000022200000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/process-nextick-args/test.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000101612631326456032243 0ustar 00000000000000var test = require("tap").test; var nextTick = require('./'); test('should work', function (t) { t.plan(5); nextTick(function (a) { t.ok(a); nextTick(function (thing) { t.equals(thing, 7); }, 7); }, true); nextTick(function (a, b, c) { t.equals(a, 'step'); t.equals(b, 3); t.equals(c, 'profit'); }, 'step', 3, 'profit'); }); test('correct number of arguments', function (t) { t.plan(1); nextTick(function () { t.equals(2, arguments.length, 'correct number'); }, 1, 2); }); ././@LongLink0000000000000000000000000000021700000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/.npmignorenpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000001312631326456032237 0ustar 00000000000000build test ././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/LICENSEnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000206412631326456032247 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000021600000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/README.mdnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000076212631326456032252 0ustar 00000000000000**string_decoder.js** (`require('string_decoder')`) from Node.js core Copyright Joyent, Inc. and other Node contributors. See LICENCE file for details. Version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. **Prefer the stable version over the unstable.** The *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.././@LongLink0000000000000000000000000000021500000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/index.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000001716412631326456032256 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. var Buffer = require('buffer').Buffer; var isBufferEncoding = Buffer.isEncoding || function(encoding) { switch (encoding && encoding.toLowerCase()) { case 'hex': case 'utf8': case 'utf-8': case 'ascii': case 'binary': case 'base64': case 'ucs2': case 'ucs-2': case 'utf16le': case 'utf-16le': case 'raw': return true; default: return false; } } function assertEncoding(encoding) { if (encoding && !isBufferEncoding(encoding)) { throw new Error('Unknown encoding: ' + encoding); } } // StringDecoder provides an interface for efficiently splitting a series of // buffers into a series of JS strings without breaking apart multi-byte // characters. CESU-8 is handled as part of the UTF-8 encoding. // // @TODO Handling all encodings inside a single object makes it very difficult // to reason about this code, so it should be split up in the future. 
// @TODO There should be a utf8-strict encoding that rejects invalid UTF-8 code // points as used by CESU-8. var StringDecoder = exports.StringDecoder = function(encoding) { this.encoding = (encoding || 'utf8').toLowerCase().replace(/[-_]/, ''); assertEncoding(encoding); switch (this.encoding) { case 'utf8': // CESU-8 represents each of Surrogate Pair by 3-bytes this.surrogateSize = 3; break; case 'ucs2': case 'utf16le': // UTF-16 represents each of Surrogate Pair by 2-bytes this.surrogateSize = 2; this.detectIncompleteChar = utf16DetectIncompleteChar; break; case 'base64': // Base-64 stores 3 bytes in 4 chars, and pads the remainder. this.surrogateSize = 3; this.detectIncompleteChar = base64DetectIncompleteChar; break; default: this.write = passThroughWrite; return; } // Enough space to store all bytes of a single character. UTF-8 needs 4 // bytes, but CESU-8 may require up to 6 (3 bytes per surrogate). this.charBuffer = new Buffer(6); // Number of bytes received for the current incomplete multi-byte character. this.charReceived = 0; // Number of bytes expected for the current incomplete multi-byte character. this.charLength = 0; }; // write decodes the given buffer and returns it as JS string that is // guaranteed to not contain any partial multi-byte characters. Any partial // character found at the end of the buffer is buffered up, and will be // returned when calling write again with the remaining bytes. // // Note: Converting a Buffer containing an orphan surrogate to a String // currently works, but converting a String to a Buffer (via `new Buffer`, or // Buffer#write) will replace incomplete surrogates with the unicode // replacement character. See https://codereview.chromium.org/121173009/ . StringDecoder.prototype.write = function(buffer) { var charStr = ''; // if our last write ended with an incomplete multibyte character while (this.charLength) { // determine how many remaining bytes this buffer has to offer for this char var available = (buffer.length >= this.charLength - this.charReceived) ? this.charLength - this.charReceived : buffer.length; // add the new bytes to the char buffer buffer.copy(this.charBuffer, this.charReceived, 0, available); this.charReceived += available; if (this.charReceived < this.charLength) { // still not enough chars in this buffer? wait for more ... 
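// (an empty string tells the caller that nothing printable has arrived yet;
// the partial bytes stay parked in this.charBuffer until the next write
// supplies the rest of the character)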
return ''; } // remove bytes belonging to the current character from the buffer buffer = buffer.slice(available, buffer.length); // get the character that was split charStr = this.charBuffer.slice(0, this.charLength).toString(this.encoding); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character var charCode = charStr.charCodeAt(charStr.length - 1); if (charCode >= 0xD800 && charCode <= 0xDBFF) { this.charLength += this.surrogateSize; charStr = ''; continue; } this.charReceived = this.charLength = 0; // if there are no more bytes in this buffer, just emit our char if (buffer.length === 0) { return charStr; } break; } // determine and set charLength / charReceived this.detectIncompleteChar(buffer); var end = buffer.length; if (this.charLength) { // buffer the incomplete character bytes we got buffer.copy(this.charBuffer, 0, buffer.length - this.charReceived, end); end -= this.charReceived; } charStr += buffer.toString(this.encoding, 0, end); var end = charStr.length - 1; var charCode = charStr.charCodeAt(end); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character if (charCode >= 0xD800 && charCode <= 0xDBFF) { var size = this.surrogateSize; this.charLength += size; this.charReceived += size; this.charBuffer.copy(this.charBuffer, size, 0, size); buffer.copy(this.charBuffer, 0, 0, size); return charStr.substring(0, end); } // or just emit the charStr return charStr; }; // detectIncompleteChar determines if there is an incomplete UTF-8 character at // the end of the given buffer. If so, it sets this.charLength to the byte // length that character, and sets this.charReceived to the number of bytes // that are available for this character. StringDecoder.prototype.detectIncompleteChar = function(buffer) { // determine how many bytes we have to check at the end of this buffer var i = (buffer.length >= 3) ? 3 : buffer.length; // Figure out if one of the last i bytes of our buffer announces an // incomplete char. for (; i > 0; i--) { var c = buffer[buffer.length - i]; // See http://en.wikipedia.org/wiki/UTF-8#Description // 110XXXXX if (i == 1 && c >> 5 == 0x06) { this.charLength = 2; break; } // 1110XXXX if (i <= 2 && c >> 4 == 0x0E) { this.charLength = 3; break; } // 11110XXX if (i <= 3 && c >> 3 == 0x1E) { this.charLength = 4; break; } } this.charReceived = i; }; StringDecoder.prototype.end = function(buffer) { var res = ''; if (buffer && buffer.length) res = this.write(buffer); if (this.charReceived) { var cr = this.charReceived; var buf = this.charBuffer; var enc = this.encoding; res += buf.slice(0, cr).toString(enc); } return res; }; function passThroughWrite(buffer) { return buffer.toString(this.encoding); } function utf16DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 2; this.charLength = this.charReceived ? 2 : 0; } function base64DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 3; this.charLength = this.charReceived ? 
3 : 0; } ././@LongLink0000000000000000000000000000022100000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/string_decoder/package.jsonnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000422012631326456032243 0ustar 00000000000000{ "_args": [ [ "string_decoder@~0.10.x", "/Users/rebecca/code/npm2/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream" ] ], "_from": "string_decoder@>=0.10.0 <0.11.0", "_id": "string_decoder@0.10.31", "_inCache": true, "_installable": true, "_location": "/npm-registry-client/concat-stream/readable-stream/string_decoder", "_npmUser": { "email": "rod@vagg.org", "name": "rvagg" }, "_npmVersion": "1.4.23", "_phantomChildren": {}, "_requested": { "name": "string_decoder", "raw": "string_decoder@~0.10.x", "rawSpec": "~0.10.x", "scope": null, "spec": ">=0.10.0 <0.11.0", "type": "range" }, "_requiredBy": [ "/npm-registry-client/concat-stream/readable-stream" ], "_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz", "_shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "_shrinkwrap": null, "_spec": "string_decoder@~0.10.x", "_where": "/Users/rebecca/code/npm2/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream", "bugs": { "url": "https://github.com/rvagg/string_decoder/issues" }, "dependencies": {}, "description": "The string_decoder module from Node core", "devDependencies": { "tap": "~0.4.8" }, "directories": {}, "dist": { "shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "tarball": "http://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz" }, "gitHead": "d46d4fd87cf1d06e031c23f1ba170ca7d4ade9a0", "homepage": "https://github.com/rvagg/string_decoder", "keywords": [ "browser", "browserify", "decoder", "string" ], "license": "MIT", "main": "index.js", "maintainers": [ { "name": "substack", "email": "mail@substack.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "name": "string_decoder", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/rvagg/string_decoder.git" }, "scripts": { "test": "tap test/simple/*.js" }, "version": "0.10.31" } ././@LongLink0000000000000000000000000000021700000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/util-deprecate/History.mdnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000043212631326456032244 0ustar 00000000000000 1.0.2 / 2015-10-07 ================== * use try/catch when checking `localStorage` (#3, @kumavis) 1.0.1 / 2014-11-25 ================== * browser: use `console.warn()` for deprecation calls * browser: more jsdocs 1.0.0 / 2014-04-30 ================== * initial commit ././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/util-deprecate/LICENSEnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000211612631326456032245 0ustar 00000000000000(The MIT License) Copyright (c) 2014 Nathan Rajlich Permission is hereby granted, free of charge, to any person obtaining a 
copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000021600000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/util-deprecate/README.mdnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000320212631326456032242 0ustar 00000000000000util-deprecate ============== ### The Node.js `util.deprecate()` function with browser support In Node.js, this module simply re-exports the `util.deprecate()` function. In the web browser (i.e. via browserify), a browser-specific implementation of the `util.deprecate()` function is used. ## API A `deprecate()` function is the only thing exposed by this module. ``` javascript // setup: exports.foo = deprecate(foo, 'foo() is deprecated, use bar() instead'); // users see: foo(); // foo() is deprecated, use bar() instead foo(); foo(); ``` ## License (The MIT License) Copyright (c) 2014 Nathan Rajlich Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000021700000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/util-deprecate/browser.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000311612631326456032246 0ustar 00000000000000 /** * Module exports. */ module.exports = deprecate; /** * Mark that a method should not be used. * Returns a modified function which warns once by default. 
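 *
 * Illustrative usage (not part of the original source):
 *
 *   exports.oldApi = deprecate(oldApi, 'oldApi() is deprecated, use newApi()');
 *   // the first call warns once via console.warn(); later calls stay silent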
* * If `localStorage.noDeprecation = true` is set, then it is a no-op. * * If `localStorage.throwDeprecation = true` is set, then deprecated functions * will throw an Error when invoked. * * If `localStorage.traceDeprecation = true` is set, then deprecated functions * will invoke `console.trace()` instead of `console.error()`. * * @param {Function} fn - the function to deprecate * @param {String} msg - the string to print to the console when `fn` is invoked * @returns {Function} a new "deprecated" version of `fn` * @api public */ function deprecate (fn, msg) { if (config('noDeprecation')) { return fn; } var warned = false; function deprecated() { if (!warned) { if (config('throwDeprecation')) { throw new Error(msg); } else if (config('traceDeprecation')) { console.trace(msg); } else { console.warn(msg); } warned = true; } return fn.apply(this, arguments); } return deprecated; } /** * Checks `localStorage` for boolean values for the given `name`. * * @param {String} name * @returns {Boolean} * @api private */ function config (name) { // accessing global.localStorage can trigger a DOMException in sandboxed iframes try { if (!global.localStorage) return false; } catch (_) { return false; } var val = global.localStorage[name]; if (null == val) return false; return String(val).toLowerCase() === 'true'; } ././@LongLink0000000000000000000000000000021400000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/util-deprecate/node.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000017312631326456032246 0ustar 00000000000000 /** * For Node.js, simply re-export the core `util.deprecate` function. 
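 *
 * The node/browser split is wired up in package.json via the "main" and
 * "browser" fields, so bundlers such as browserify pick browser.js while
 * Node.js resolves to this file.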
*/ module.exports = require('util').deprecate; ././@LongLink0000000000000000000000000000022100000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-stream/node_modules/util-deprecate/package.jsonnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/readable-str0000644000000000000000000000271612631326456032253 0ustar 00000000000000{ "name": "util-deprecate", "version": "1.0.2", "description": "The Node.js `util.deprecate()` function with browser support", "main": "node.js", "browser": "browser.js", "scripts": { "test": "echo \"Error: no test specified\" && exit 1" }, "repository": { "type": "git", "url": "git://github.com/TooTallNate/util-deprecate.git" }, "keywords": [ "util", "deprecate", "browserify", "browser", "node" ], "author": { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://n8.io/" }, "license": "MIT", "bugs": { "url": "https://github.com/TooTallNate/util-deprecate/issues" }, "homepage": "https://github.com/TooTallNate/util-deprecate", "gitHead": "475fb6857cd23fafff20c1be846c1350abf8e6d4", "_id": "util-deprecate@1.0.2", "_shasum": "450d4dc9fa70de732762fbd2d4a28981419a0ccf", "_from": "util-deprecate@>=1.0.1 <1.1.0", "_npmVersion": "2.14.4", "_nodeVersion": "4.1.2", "_npmUser": { "name": "tootallnate", "email": "nathan@tootallnate.net" }, "maintainers": [ { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "dist": { "shasum": "450d4dc9fa70de732762fbd2d4a28981419a0ccf", "tarball": "http://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000015700000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/.travis.ymlnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/.0000644000000000000000000000006012631326456032215 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10" ././@LongLink0000000000000000000000000000015300000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/LICENSEnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/L0000644000000000000000000000303512631326456032260 0ustar 00000000000000/* Copyright (c) 2010, Linden Research, Inc. Copyright (c) 2012, Joshua Bell Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. $/LicenseInfo$ */ // Original can be found at: // https://bitbucket.org/lindenlab/llsd // Modifications by Joshua Bell inexorabletash@gmail.com // https://github.com/inexorabletash/polyfill // ES3/ES5 implementation of the Krhonos Typed Array Specification // Ref: http://www.khronos.org/registry/typedarray/specs/latest/ // Date: 2011-02-01 // // Variations: // * Allows typed_array.get/set() as alias for subscripts (typed_array[]) ././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/example/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/e0000755000000000000000000000000012631326456032306 5ustar 00000000000000././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/index.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/i0000644000000000000000000005151512631326456032323 0ustar 00000000000000var undefined = (void 0); // Paranoia // Beyond this value, index getters/setters (i.e. array[0], array[1]) are so slow to // create, and consume so much memory, that the browser appears frozen. var MAX_ARRAY_LENGTH = 1e5; // Approximations of internal ECMAScript conversion functions var ECMAScript = (function() { // Stash a copy in case other scripts modify these var opts = Object.prototype.toString, ophop = Object.prototype.hasOwnProperty; return { // Class returns internal [[Class]] property, used to avoid cross-frame instanceof issues: Class: function(v) { return opts.call(v).replace(/^\[object *|\]$/g, ''); }, HasProperty: function(o, p) { return p in o; }, HasOwnProperty: function(o, p) { return ophop.call(o, p); }, IsCallable: function(o) { return typeof o === 'function'; }, ToInt32: function(v) { return v >> 0; }, ToUint32: function(v) { return v >>> 0; } }; }()); // Snapshot intrinsics var LN2 = Math.LN2, abs = Math.abs, floor = Math.floor, log = Math.log, min = Math.min, pow = Math.pow, round = Math.round; // ES5: lock down object properties function configureProperties(obj) { if (getOwnPropNames && defineProp) { var props = getOwnPropNames(obj), i; for (i = 0; i < props.length; i += 1) { defineProp(obj, props[i], { value: obj[props[i]], writable: false, enumerable: false, configurable: false }); } } } // emulate ES5 getter/setter API using legacy APIs // http://blogs.msdn.com/b/ie/archive/2010/09/07/transitioning-existing-code-to-the-es5-getter-setter-apis.aspx // (second clause tests for Object.defineProperty() in IE<9 that only supports extending DOM prototypes, but // note that IE<9 does not support __defineGetter__ or __defineSetter__ so it just renders the method harmless) var defineProp if (Object.defineProperty && (function() { try { Object.defineProperty({}, 'x', {}); return true; } catch (e) { return false; } })()) { defineProp = Object.defineProperty; } else { defineProp = function(o, p, desc) { if (!o === Object(o)) throw new TypeError("Object.defineProperty called on non-object"); if (ECMAScript.HasProperty(desc, 'get') && Object.prototype.__defineGetter__) { 
Object.prototype.__defineGetter__.call(o, p, desc.get); } if (ECMAScript.HasProperty(desc, 'set') && Object.prototype.__defineSetter__) { Object.prototype.__defineSetter__.call(o, p, desc.set); } if (ECMAScript.HasProperty(desc, 'value')) { o[p] = desc.value; } return o; }; } var getOwnPropNames = Object.getOwnPropertyNames || function (o) { if (o !== Object(o)) throw new TypeError("Object.getOwnPropertyNames called on non-object"); var props = [], p; for (p in o) { if (ECMAScript.HasOwnProperty(o, p)) { props.push(p); } } return props; }; // ES5: Make obj[index] an alias for obj._getter(index)/obj._setter(index, value) // for index in 0 ... obj.length function makeArrayAccessors(obj) { if (!defineProp) { return; } if (obj.length > MAX_ARRAY_LENGTH) throw new RangeError("Array too large for polyfill"); function makeArrayAccessor(index) { defineProp(obj, index, { 'get': function() { return obj._getter(index); }, 'set': function(v) { obj._setter(index, v); }, enumerable: true, configurable: false }); } var i; for (i = 0; i < obj.length; i += 1) { makeArrayAccessor(i); } } // Internal conversion functions: // pack() - take a number (interpreted as Type), output a byte array // unpack() - take a byte array, output a Type-like number function as_signed(value, bits) { var s = 32 - bits; return (value << s) >> s; } function as_unsigned(value, bits) { var s = 32 - bits; return (value << s) >>> s; } function packI8(n) { return [n & 0xff]; } function unpackI8(bytes) { return as_signed(bytes[0], 8); } function packU8(n) { return [n & 0xff]; } function unpackU8(bytes) { return as_unsigned(bytes[0], 8); } function packU8Clamped(n) { n = round(Number(n)); return [n < 0 ? 0 : n > 0xff ? 0xff : n & 0xff]; } function packI16(n) { return [(n >> 8) & 0xff, n & 0xff]; } function unpackI16(bytes) { return as_signed(bytes[0] << 8 | bytes[1], 16); } function packU16(n) { return [(n >> 8) & 0xff, n & 0xff]; } function unpackU16(bytes) { return as_unsigned(bytes[0] << 8 | bytes[1], 16); } function packI32(n) { return [(n >> 24) & 0xff, (n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff]; } function unpackI32(bytes) { return as_signed(bytes[0] << 24 | bytes[1] << 16 | bytes[2] << 8 | bytes[3], 32); } function packU32(n) { return [(n >> 24) & 0xff, (n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff]; } function unpackU32(bytes) { return as_unsigned(bytes[0] << 24 | bytes[1] << 16 | bytes[2] << 8 | bytes[3], 32); } function packIEEE754(v, ebits, fbits) { var bias = (1 << (ebits - 1)) - 1, s, e, f, ln, i, bits, str, bytes; function roundToEven(n) { var w = floor(n), f = n - w; if (f < 0.5) return w; if (f > 0.5) return w + 1; return w % 2 ? w + 1 : w; } // Compute sign, exponent, fraction if (v !== v) { // NaN // http://dev.w3.org/2006/webapi/WebIDL/#es-type-mapping e = (1 << ebits) - 1; f = pow(2, fbits - 1); s = 0; } else if (v === Infinity || v === -Infinity) { e = (1 << ebits) - 1; f = 0; s = (v < 0) ? 1 : 0; } else if (v === 0) { e = 0; f = 0; s = (1 / v === -Infinity) ? 1 : 0; } else { s = v < 0; v = abs(v); if (v >= pow(2, 1 - bias)) { e = min(floor(log(v) / LN2), 1023); f = roundToEven(v / pow(2, e) * pow(2, fbits)); if (f / pow(2, fbits) >= 2) { e = e + 1; f = 1; } if (e > bias) { // Overflow e = (1 << ebits) - 1; f = 0; } else { // Normalized e = e + bias; f = f - pow(2, fbits); } } else { // Denormalized e = 0; f = roundToEven(v / pow(2, 1 - bias - fbits)); } } // Pack sign, exponent, fraction bits = []; for (i = fbits; i; i -= 1) { bits.push(f % 2 ? 
1 : 0); f = floor(f / 2); } for (i = ebits; i; i -= 1) { bits.push(e % 2 ? 1 : 0); e = floor(e / 2); } bits.push(s ? 1 : 0); bits.reverse(); str = bits.join(''); // Bits to bytes bytes = []; while (str.length) { bytes.push(parseInt(str.substring(0, 8), 2)); str = str.substring(8); } return bytes; } function unpackIEEE754(bytes, ebits, fbits) { // Bytes to bits var bits = [], i, j, b, str, bias, s, e, f; for (i = bytes.length; i; i -= 1) { b = bytes[i - 1]; for (j = 8; j; j -= 1) { bits.push(b % 2 ? 1 : 0); b = b >> 1; } } bits.reverse(); str = bits.join(''); // Unpack sign, exponent, fraction bias = (1 << (ebits - 1)) - 1; s = parseInt(str.substring(0, 1), 2) ? -1 : 1; e = parseInt(str.substring(1, 1 + ebits), 2); f = parseInt(str.substring(1 + ebits), 2); // Produce number if (e === (1 << ebits) - 1) { return f !== 0 ? NaN : s * Infinity; } else if (e > 0) { // Normalized return s * pow(2, e - bias) * (1 + f / pow(2, fbits)); } else if (f !== 0) { // Denormalized return s * pow(2, -(bias - 1)) * (f / pow(2, fbits)); } else { return s < 0 ? -0 : 0; } } function unpackF64(b) { return unpackIEEE754(b, 11, 52); } function packF64(v) { return packIEEE754(v, 11, 52); } function unpackF32(b) { return unpackIEEE754(b, 8, 23); } function packF32(v) { return packIEEE754(v, 8, 23); } // // 3 The ArrayBuffer Type // (function() { /** @constructor */ var ArrayBuffer = function ArrayBuffer(length) { length = ECMAScript.ToInt32(length); if (length < 0) throw new RangeError('ArrayBuffer size is not a small enough positive integer'); this.byteLength = length; this._bytes = []; this._bytes.length = length; var i; for (i = 0; i < this.byteLength; i += 1) { this._bytes[i] = 0; } configureProperties(this); }; exports.ArrayBuffer = exports.ArrayBuffer || ArrayBuffer; // // 4 The ArrayBufferView Type // // NOTE: this constructor is not exported /** @constructor */ var ArrayBufferView = function ArrayBufferView() { //this.buffer = null; //this.byteOffset = 0; //this.byteLength = 0; }; // // 5 The Typed Array View Types // function makeConstructor(bytesPerElement, pack, unpack) { // Each TypedArray type requires a distinct constructor instance with // identical logic, which this produces. 
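// (illustrative note, not in the original source: makeConstructor(2, packU16,
// unpackU16) yields the Uint16Array polyfill below; every element access
// round-trips through pack/unpack against the plain array in buffer._bytes)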
var ctor; ctor = function(buffer, byteOffset, length) { var array, sequence, i, s; if (!arguments.length || typeof arguments[0] === 'number') { // Constructor(unsigned long length) this.length = ECMAScript.ToInt32(arguments[0]); if (length < 0) throw new RangeError('ArrayBufferView size is not a small enough positive integer'); this.byteLength = this.length * this.BYTES_PER_ELEMENT; this.buffer = new ArrayBuffer(this.byteLength); this.byteOffset = 0; } else if (typeof arguments[0] === 'object' && arguments[0].constructor === ctor) { // Constructor(TypedArray array) array = arguments[0]; this.length = array.length; this.byteLength = this.length * this.BYTES_PER_ELEMENT; this.buffer = new ArrayBuffer(this.byteLength); this.byteOffset = 0; for (i = 0; i < this.length; i += 1) { this._setter(i, array._getter(i)); } } else if (typeof arguments[0] === 'object' && !(arguments[0] instanceof ArrayBuffer || ECMAScript.Class(arguments[0]) === 'ArrayBuffer')) { // Constructor(sequence array) sequence = arguments[0]; this.length = ECMAScript.ToUint32(sequence.length); this.byteLength = this.length * this.BYTES_PER_ELEMENT; this.buffer = new ArrayBuffer(this.byteLength); this.byteOffset = 0; for (i = 0; i < this.length; i += 1) { s = sequence[i]; this._setter(i, Number(s)); } } else if (typeof arguments[0] === 'object' && (arguments[0] instanceof ArrayBuffer || ECMAScript.Class(arguments[0]) === 'ArrayBuffer')) { // Constructor(ArrayBuffer buffer, // optional unsigned long byteOffset, optional unsigned long length) this.buffer = buffer; this.byteOffset = ECMAScript.ToUint32(byteOffset); if (this.byteOffset > this.buffer.byteLength) { throw new RangeError("byteOffset out of range"); } if (this.byteOffset % this.BYTES_PER_ELEMENT) { // The given byteOffset must be a multiple of the element // size of the specific type, otherwise an exception is raised. 
throw new RangeError("ArrayBuffer length minus the byteOffset is not a multiple of the element size."); } if (arguments.length < 3) { this.byteLength = this.buffer.byteLength - this.byteOffset; if (this.byteLength % this.BYTES_PER_ELEMENT) { throw new RangeError("length of buffer minus byteOffset not a multiple of the element size"); } this.length = this.byteLength / this.BYTES_PER_ELEMENT; } else { this.length = ECMAScript.ToUint32(length); this.byteLength = this.length * this.BYTES_PER_ELEMENT; } if ((this.byteOffset + this.byteLength) > this.buffer.byteLength) { throw new RangeError("byteOffset and length reference an area beyond the end of the buffer"); } } else { throw new TypeError("Unexpected argument type(s)"); } this.constructor = ctor; configureProperties(this); makeArrayAccessors(this); }; ctor.prototype = new ArrayBufferView(); ctor.prototype.BYTES_PER_ELEMENT = bytesPerElement; ctor.prototype._pack = pack; ctor.prototype._unpack = unpack; ctor.BYTES_PER_ELEMENT = bytesPerElement; // getter type (unsigned long index); ctor.prototype._getter = function(index) { if (arguments.length < 1) throw new SyntaxError("Not enough arguments"); index = ECMAScript.ToUint32(index); if (index >= this.length) { return undefined; } var bytes = [], i, o; for (i = 0, o = this.byteOffset + index * this.BYTES_PER_ELEMENT; i < this.BYTES_PER_ELEMENT; i += 1, o += 1) { bytes.push(this.buffer._bytes[o]); } return this._unpack(bytes); }; // NONSTANDARD: convenience alias for getter: type get(unsigned long index); ctor.prototype.get = ctor.prototype._getter; // setter void (unsigned long index, type value); ctor.prototype._setter = function(index, value) { if (arguments.length < 2) throw new SyntaxError("Not enough arguments"); index = ECMAScript.ToUint32(index); if (index >= this.length) { return undefined; } var bytes = this._pack(value), i, o; for (i = 0, o = this.byteOffset + index * this.BYTES_PER_ELEMENT; i < this.BYTES_PER_ELEMENT; i += 1, o += 1) { this.buffer._bytes[o] = bytes[i]; } }; // void set(TypedArray array, optional unsigned long offset); // void set(sequence array, optional unsigned long offset); ctor.prototype.set = function(index, value) { if (arguments.length < 1) throw new SyntaxError("Not enough arguments"); var array, sequence, offset, len, i, s, d, byteOffset, byteLength, tmp; if (typeof arguments[0] === 'object' && arguments[0].constructor === this.constructor) { // void set(TypedArray array, optional unsigned long offset); array = arguments[0]; offset = ECMAScript.ToUint32(arguments[1]); if (offset + array.length > this.length) { throw new RangeError("Offset plus length of array is out of range"); } byteOffset = this.byteOffset + offset * this.BYTES_PER_ELEMENT; byteLength = array.length * this.BYTES_PER_ELEMENT; if (array.buffer === this.buffer) { tmp = []; for (i = 0, s = array.byteOffset; i < byteLength; i += 1, s += 1) { tmp[i] = array.buffer._bytes[s]; } for (i = 0, d = byteOffset; i < byteLength; i += 1, d += 1) { this.buffer._bytes[d] = tmp[i]; } } else { for (i = 0, s = array.byteOffset, d = byteOffset; i < byteLength; i += 1, s += 1, d += 1) { this.buffer._bytes[d] = array.buffer._bytes[s]; } } } else if (typeof arguments[0] === 'object' && typeof arguments[0].length !== 'undefined') { // void set(sequence array, optional unsigned long offset); sequence = arguments[0]; len = ECMAScript.ToUint32(sequence.length); offset = ECMAScript.ToUint32(arguments[1]); if (offset + len > this.length) { throw new RangeError("Offset plus length of array is out of range"); } for (i = 
0; i < len; i += 1) { s = sequence[i]; this._setter(offset + i, Number(s)); } } else { throw new TypeError("Unexpected argument type(s)"); } }; // TypedArray subarray(long begin, optional long end); ctor.prototype.subarray = function(start, end) { function clamp(v, min, max) { return v < min ? min : v > max ? max : v; } start = ECMAScript.ToInt32(start); end = ECMAScript.ToInt32(end); if (arguments.length < 1) { start = 0; } if (arguments.length < 2) { end = this.length; } if (start < 0) { start = this.length + start; } if (end < 0) { end = this.length + end; } start = clamp(start, 0, this.length); end = clamp(end, 0, this.length); var len = end - start; if (len < 0) { len = 0; } return new this.constructor( this.buffer, this.byteOffset + start * this.BYTES_PER_ELEMENT, len); }; return ctor; } var Int8Array = makeConstructor(1, packI8, unpackI8); var Uint8Array = makeConstructor(1, packU8, unpackU8); var Uint8ClampedArray = makeConstructor(1, packU8Clamped, unpackU8); var Int16Array = makeConstructor(2, packI16, unpackI16); var Uint16Array = makeConstructor(2, packU16, unpackU16); var Int32Array = makeConstructor(4, packI32, unpackI32); var Uint32Array = makeConstructor(4, packU32, unpackU32); var Float32Array = makeConstructor(4, packF32, unpackF32); var Float64Array = makeConstructor(8, packF64, unpackF64); exports.Int8Array = exports.Int8Array || Int8Array; exports.Uint8Array = exports.Uint8Array || Uint8Array; exports.Uint8ClampedArray = exports.Uint8ClampedArray || Uint8ClampedArray; exports.Int16Array = exports.Int16Array || Int16Array; exports.Uint16Array = exports.Uint16Array || Uint16Array; exports.Int32Array = exports.Int32Array || Int32Array; exports.Uint32Array = exports.Uint32Array || Uint32Array; exports.Float32Array = exports.Float32Array || Float32Array; exports.Float64Array = exports.Float64Array || Float64Array; }()); // // 6 The DataView View Type // (function() { function r(array, index) { return ECMAScript.IsCallable(array.get) ? 
array.get(index) : array[index]; } var IS_BIG_ENDIAN = (function() { var u16array = new(exports.Uint16Array)([0x1234]), u8array = new(exports.Uint8Array)(u16array.buffer); return r(u8array, 0) === 0x12; }()); // Constructor(ArrayBuffer buffer, // optional unsigned long byteOffset, // optional unsigned long byteLength) /** @constructor */ var DataView = function DataView(buffer, byteOffset, byteLength) { if (arguments.length === 0) { buffer = new exports.ArrayBuffer(0); } else if (!(buffer instanceof exports.ArrayBuffer || ECMAScript.Class(buffer) === 'ArrayBuffer')) { throw new TypeError("TypeError"); } this.buffer = buffer || new exports.ArrayBuffer(0); this.byteOffset = ECMAScript.ToUint32(byteOffset); if (this.byteOffset > this.buffer.byteLength) { throw new RangeError("byteOffset out of range"); } if (arguments.length < 3) { this.byteLength = this.buffer.byteLength - this.byteOffset; } else { this.byteLength = ECMAScript.ToUint32(byteLength); } if ((this.byteOffset + this.byteLength) > this.buffer.byteLength) { throw new RangeError("byteOffset and length reference an area beyond the end of the buffer"); } configureProperties(this); }; function makeGetter(arrayType) { return function(byteOffset, littleEndian) { byteOffset = ECMAScript.ToUint32(byteOffset); if (byteOffset + arrayType.BYTES_PER_ELEMENT > this.byteLength) { throw new RangeError("Array index out of range"); } byteOffset += this.byteOffset; var uint8Array = new exports.Uint8Array(this.buffer, byteOffset, arrayType.BYTES_PER_ELEMENT), bytes = [], i; for (i = 0; i < arrayType.BYTES_PER_ELEMENT; i += 1) { bytes.push(r(uint8Array, i)); } if (Boolean(littleEndian) === Boolean(IS_BIG_ENDIAN)) { bytes.reverse(); } return r(new arrayType(new exports.Uint8Array(bytes).buffer), 0); }; } DataView.prototype.getUint8 = makeGetter(exports.Uint8Array); DataView.prototype.getInt8 = makeGetter(exports.Int8Array); DataView.prototype.getUint16 = makeGetter(exports.Uint16Array); DataView.prototype.getInt16 = makeGetter(exports.Int16Array); DataView.prototype.getUint32 = makeGetter(exports.Uint32Array); DataView.prototype.getInt32 = makeGetter(exports.Int32Array); DataView.prototype.getFloat32 = makeGetter(exports.Float32Array); DataView.prototype.getFloat64 = makeGetter(exports.Float64Array); function makeSetter(arrayType) { return function(byteOffset, value, littleEndian) { byteOffset = ECMAScript.ToUint32(byteOffset); if (byteOffset + arrayType.BYTES_PER_ELEMENT > this.byteLength) { throw new RangeError("Array index out of range"); } // Get bytes var typeArray = new arrayType([value]), byteArray = new exports.Uint8Array(typeArray.buffer), bytes = [], i, byteView; for (i = 0; i < arrayType.BYTES_PER_ELEMENT; i += 1) { bytes.push(r(byteArray, i)); } // Flip if necessary if (Boolean(littleEndian) === Boolean(IS_BIG_ENDIAN)) { bytes.reverse(); } // Write them byteView = new exports.Uint8Array(this.buffer, byteOffset, arrayType.BYTES_PER_ELEMENT); byteView.set(bytes); }; } DataView.prototype.setUint8 = makeSetter(exports.Uint8Array); DataView.prototype.setInt8 = makeSetter(exports.Int8Array); DataView.prototype.setUint16 = makeSetter(exports.Uint16Array); DataView.prototype.setInt16 = makeSetter(exports.Int16Array); DataView.prototype.setUint32 = makeSetter(exports.Uint32Array); DataView.prototype.setInt32 = makeSetter(exports.Int32Array); DataView.prototype.setFloat32 = makeSetter(exports.Float32Array); DataView.prototype.setFloat64 = makeSetter(exports.Float64Array); exports.DataView = exports.DataView || DataView; }()); 
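A minimal sketch of the DataView polyfill above in action (illustrative only;
it assumes the module is loaded as the published `typedarray` package):

``` js
var TA = require('typedarray');
var view = new TA.DataView(new TA.ArrayBuffer(4));
view.setUint16(0, 0x1234);            // big-endian by default, per the spec
console.log(view.getUint8(0));        // 18    (0x12, the high byte first)
console.log(view.getUint16(0, true)); // 13330 (0x3412 when re-read little-endian)
```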
././@LongLink0000000000000000000000000000016000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/package.jsonnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/p0000644000000000000000000000350512631326456032326 0ustar 00000000000000{ "name": "typedarray", "version": "0.0.6", "description": "TypedArray polyfill for old browsers", "main": "index.js", "devDependencies": { "tape": "~2.3.2" }, "scripts": { "test": "tape test/*.js test/server/*.js" }, "repository": { "type": "git", "url": "git://github.com/substack/typedarray.git" }, "homepage": "https://github.com/substack/typedarray", "keywords": [ "ArrayBuffer", "DataView", "Float32Array", "Float64Array", "Int8Array", "Int16Array", "Int32Array", "Uint8Array", "Uint8ClampedArray", "Uint16Array", "Uint32Array", "typed", "array", "polyfill" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "testling": { "files": "test/*.js", "browsers": [ "ie/6..latest", "firefox/16..latest", "firefox/nightly", "chrome/22..latest", "chrome/canary", "opera/12..latest", "opera/next", "safari/5.1..latest", "ipad/6.0..latest", "iphone/6.0..latest", "android-browser/4.2..latest" ] }, "bugs": { "url": "https://github.com/substack/typedarray/issues" }, "_id": "typedarray@0.0.6", "dist": { "shasum": "867ac74e3864187b1d3d47d996a78ec5c8830777", "tarball": "http://registry.npmjs.org/typedarray/-/typedarray-0.0.6.tgz" }, "_from": "typedarray@>=0.0.5 <0.1.0", "_npmVersion": "1.4.3", "_npmUser": { "name": "substack", "email": "mail@substack.net" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" } ], "directories": {}, "_shasum": "867ac74e3864187b1d3d47d996a78ec5c8830777", "_resolved": "https://registry.npmjs.org/typedarray/-/typedarray-0.0.6.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000016300000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/readme.markdownnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/r0000644000000000000000000000204412631326456032325 0ustar 00000000000000# typedarray TypedArray polyfill ripped from [this module](https://raw.github.com/inexorabletash/polyfill). 
[![build status](https://secure.travis-ci.org/substack/typedarray.png)](http://travis-ci.org/substack/typedarray) [![testling badge](https://ci.testling.com/substack/typedarray.png)](https://ci.testling.com/substack/typedarray) # example ``` js var Uint8Array = require('typedarray').Uint8Array; var ua = new Uint8Array(5); ua[1] = 256 + 55; console.log(ua[1]); ``` output: ``` 55 ``` # methods ``` js var TA = require('typedarray') ``` The `TA` object has the following constructors: * TA.ArrayBuffer * TA.DataView * TA.Float32Array * TA.Float64Array * TA.Int8Array * TA.Int16Array * TA.Int32Array * TA.Uint8Array * TA.Uint8ClampedArray * TA.Uint16Array * TA.Uint32Array # install With [npm](https://npmjs.org) do: ``` npm install typedarray ``` To use this module in the browser, compile with [browserify](http://browserify.org) or download a UMD build from browserify CDN: http://wzrd.in/standalone/typedarray@latest # license MIT ././@LongLink0000000000000000000000000000015100000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/test/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/t0000755000000000000000000000000012631326456032325 5ustar 00000000000000././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/example/tarray.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/e0000644000000000000000000000015612631326456032312 0ustar 00000000000000var Uint8Array = require('../').Uint8Array; var ua = new Uint8Array(5); ua[1] = 256 + 55; console.log(ua[1]); ././@LongLink0000000000000000000000000000016000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/test/server/npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/t0000755000000000000000000000000012631326456032325 5ustar 00000000000000././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/test/tarray.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/t0000644000000000000000000000033112631326456032324 0ustar 00000000000000var TA = require('../'); var test = require('tape'); test('tiny u8a test', function (t) { var ua = new(TA.Uint8Array)(5); t.equal(ua.length, 5); ua[1] = 256 + 55; t.equal(ua[1], 55); t.end(); }); ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/test/server/undef_globals.jsnpm_3.5.2.orig/node_modules/npm-registry-client/node_modules/concat-stream/node_modules/typedarray/t0000644000000000000000000000072412631326456032332 0ustar 00000000000000var test = require('tape'); var vm = require('vm'); var fs = require('fs'); var src = fs.readFileSync(__dirname + '/../../index.js', 'utf8'); test('u8a without globals', function (t) { var c = { module: { exports: {} }, }; c.exports = c.module.exports; vm.runInNewContext(src, c); var TA = c.module.exports; var ua = new(TA.Uint8Array)(5); t.equal(ua.length, 5); ua[1] = 256 + 55; t.equal(ua[1], 55); t.end(); }); 
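The npm-registry-client tests that follow all share one shape: prime a mock
registry on `http://localhost:1337` with `server.expect`, then drive a client
from `common.freshClient()` against it. A hypothetical minimal case in that
style (the endpoint and payload here are invented for illustration):

``` js
var test = require('tap').test
var server = require('./lib/server.js')
var common = require('./lib/common.js')
var client = common.freshClient()

test('example round trip', function (t) {
  // the mock server replies once to the next matching request
  server.expect('GET', '/example', function (req, res) {
    res.statusCode = 200
    res.json({ ok: true })
  })
  client.get('http://localhost:1337/example', { auth: { token: 'foo' } }, function (er, data) {
    t.ifError(er, 'no errors')
    t.ok(data.ok, 'got the primed response')
    t.end()
  })
})

test('cleanup', function (t) {
  server.close()
  t.end()
})
```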
npm_3.5.2.orig/node_modules/npm-registry-client/test/00-setup.js0000644000000000000000000000032712631326456022757 0ustar 00000000000000var tap = require('tap') var rimraf = require('rimraf') tap.test('setup', function (t) { rimraf(__dirname + '/fixtures/cache', function (er) { if (er) throw er t.pass('cache cleaned') t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/access.js0000644000000000000000000002440212631326456022643 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var URI = 'http://localhost:1337' var PARAMS = { auth: { token: 'foo' }, scope: 'myorg', team: 'myteam', package: '@foo/bar', permissions: 'read-write' } var UNSCOPED = { auth: { token: 'foo' }, scope: 'myorg', team: 'myteam', package: 'bar', permissions: 'read-write' } var commands = [ 'public', 'restricted', 'grant', 'revoke', 'ls-packages', 'ls-collaborators' ] test('access public', function (t) { server.expect('POST', '/-/package/%40foo%2Fbar/access', function (req, res) { t.equal(req.method, 'POST') onJsonReq(req, function (json) { t.deepEqual(json, { access: 'public' }) res.statusCode = 200 res.json({ accessChanged: true }) }) }) var params = Object.create(PARAMS) params.package = '@foo/bar' client.access('public', URI, params, function (error, data) { t.ifError(error, 'no errors') t.ok(data.accessChanged, 'access level set') t.end() }) }) test('access restricted', function (t) { server.expect('POST', '/-/package/%40foo%2Fbar/access', function (req, res) { t.equal(req.method, 'POST') onJsonReq(req, function (json) { t.deepEqual(json, { access: 'restricted' }) res.statusCode = 200 res.json({ accessChanged: true }) }) }) client.access('restricted', URI, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.ok(data.accessChanged, 'access level set') t.end() }) }) test('access grant basic', function (t) { server.expect('PUT', '/-/team/myorg/myteam/package', function (req, res) { t.equal(req.method, 'PUT') onJsonReq(req, function (json) { t.deepEqual(json, { permissions: PARAMS.permissions, package: PARAMS.package }) res.statusCode = 201 res.json({ accessChanged: true }) }) }) client.access('grant', URI, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.ok(data.accessChanged, 'access level set') t.end() }) }) test('access grant basic unscoped', function (t) { server.expect('PUT', '/-/team/myorg/myteam/package', function (req, res) { t.equal(req.method, 'PUT') onJsonReq(req, function (json) { t.deepEqual(json, { permissions: UNSCOPED.permissions, package: UNSCOPED.package }) res.statusCode = 201 res.json({ accessChanged: true }) }) }) client.access('grant', URI, UNSCOPED, function (error, data) { t.ifError(error, 'no errors') t.ok(data.accessChanged, 'access level set') t.end() }) }) test('access revoke basic', function (t) { server.expect('DELETE', '/-/team/myorg/myteam/package', function (req, res) { t.equal(req.method, 'DELETE') onJsonReq(req, function (json) { t.deepEqual(json, { package: PARAMS.package }) res.statusCode = 200 res.json({ accessChanged: true }) }) }) client.access('revoke', URI, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.ok(data.accessChanged, 'access level set') t.end() }) }) test('access revoke basic unscoped', function (t) { server.expect('DELETE', '/-/team/myorg/myteam/package', function (req, res) { t.equal(req.method, 'DELETE') onJsonReq(req, function (json) { t.deepEqual(json, { package: 
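// Note: individual cases derive their parameters with Object.create(PARAMS),
// so a test can shadow one field (e.g. params.team = null) without mutating
// the shared defaults that later tests rely on.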
UNSCOPED.package }) res.statusCode = 200 res.json({ accessChanged: true }) }) }) client.access('revoke', URI, UNSCOPED, function (error, data) { t.ifError(error, 'no errors') t.ok(data.accessChanged, 'access level set') t.end() }) }) test('ls-packages on team', function (t) { var serverPackages = { '@foo/bar': 'write', '@foo/util': 'read' } var clientPackages = { '@foo/bar': 'read-write', '@foo/util': 'read-only' } var uri = '/-/team/myorg/myteam/package?format=cli' server.expect('GET', uri, function (req, res) { t.equal(req.method, 'GET') res.statusCode = 200 res.json(serverPackages) }) client.access('ls-packages', URI, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.same(data, clientPackages) t.end() }) }) test('ls-packages on org', function (t) { var serverPackages = { '@foo/bar': 'write', '@foo/util': 'read' } var clientPackages = { '@foo/bar': 'read-write', '@foo/util': 'read-only' } var uri = '/-/org/myorg/package?format=cli' server.expect('GET', uri, function (req, res) { t.equal(req.method, 'GET') res.statusCode = 200 res.json(serverPackages) }) var params = Object.create(PARAMS) params.team = null client.access('ls-packages', URI, params, function (error, data) { t.ifError(error, 'no errors') t.same(data, clientPackages) t.end() }) }) test('ls-packages on user', function (t) { var serverPackages = { '@foo/bar': 'write', '@foo/util': 'read' } var clientPackages = { '@foo/bar': 'read-write', '@foo/util': 'read-only' } var firstUri = '/-/org/myorg/package?format=cli' server.expect('GET', firstUri, function (req, res) { t.equal(req.method, 'GET') res.statusCode = 404 res.json({error: 'not found'}) }) var secondUri = '/-/user/myorg/package?format=cli' server.expect('GET', secondUri, function (req, res) { t.equal(req.method, 'GET') res.statusCode = 200 res.json(serverPackages) }) var params = Object.create(PARAMS) params.team = null client.access('ls-packages', URI, params, function (error, data) { t.ifError(error, 'no errors') t.same(data, clientPackages) t.end() }) }) test('ls-collaborators', function (t) { var serverCollaborators = { 'myorg:myteam': 'write', 'myorg:anotherteam': 'read' } var clientCollaborators = { 'myorg:myteam': 'read-write', 'myorg:anotherteam': 'read-only' } var uri = '/-/package/%40foo%2Fbar/collaborators?format=cli' server.expect('GET', uri, function (req, res) { t.equal(req.method, 'GET') res.statusCode = 200 res.json(serverCollaborators) }) client.access('ls-collaborators', URI, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.same(data, clientCollaborators) t.end() }) }) test('ls-collaborators w/scope', function (t) { var serverCollaborators = { 'myorg:myteam': 'write', 'myorg:anotherteam': 'read' } var clientCollaborators = { 'myorg:myteam': 'read-write', 'myorg:anotherteam': 'read-only' } var uri = '/-/package/%40foo%2Fbar/collaborators?format=cli&user=zkat' server.expect('GET', uri, function (req, res) { t.equal(req.method, 'GET') res.statusCode = 200 res.json(serverCollaborators) }) var params = Object.create(PARAMS) params.user = 'zkat' client.access('ls-collaborators', URI, params, function (error, data) { t.ifError(error, 'no errors') t.same(data, clientCollaborators) t.end() }) }) test('ls-collaborators w/o scope', function (t) { var serverCollaborators = { 'myorg:myteam': 'write', 'myorg:anotherteam': 'read' } var clientCollaborators = { 'myorg:myteam': 'read-write', 'myorg:anotherteam': 'read-only' } var uri = '/-/package/bar/collaborators?format=cli&user=zkat' server.expect('GET', uri, function (req, res) { 
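// unscoped packages hit the same collaborators endpoint; the only
// difference is that there is no URL-encoded @scope/ prefix in the path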
t.equal(req.method, 'GET') res.statusCode = 200 res.json(serverCollaborators) }) var params = Object.create(UNSCOPED) params.user = 'zkat' client.access('ls-collaborators', URI, params, function (error, data) { t.ifError(error, 'no errors') t.same(data, clientCollaborators) t.end() }) }) test('access command base validation', function (t) { t.throws(function () { client.access(undefined, URI, PARAMS, nop) }, 'command is required') t.throws(function () { client.access('whoops', URI, PARAMS, nop) }, 'command must be a valid subcommand') commands.forEach(function (cmd) { t.throws(function () { client.access(cmd, undefined, PARAMS, nop) }, 'registry URI is required') t.throws(function () { client.access(cmd, URI, undefined, nop) }, 'params is required') t.throws(function () { client.access(cmd, URI, '', nop) }, 'params must be an object') t.throws(function () { client.access(cmd, URI, {scope: 'o', team: 't'}, nop) }, 'auth is required') t.throws(function () { client.access(cmd, URI, {auth: 5, scope: 'o', team: 't'}, nop) }, 'auth must be an object') t.throws(function () { client.access(cmd, URI, PARAMS, {}) }, 'callback must be a function') t.throws(function () { client.access(cmd, URI, PARAMS, undefined) }, 'callback is required') if (contains([ 'public', 'restricted' ], cmd)) { t.throws(function () { var params = Object.create(PARAMS) params.package = null client.access(cmd, URI, params, nop) }, 'package is required') t.throws(function () { var params = Object.create(PARAMS) params.package = 'underscore' client.access(cmd, URI, params, nop) }, 'only scoped packages are allowed') } if (contains(['grant', 'revoke', 'ls-packages'], cmd)) { t.throws(function () { var params = Object.create(PARAMS) params.scope = null client.access(cmd, URI, params, nop) }, 'scope is required') } if (contains(['grant', 'revoke'], cmd)) { t.throws(function () { var params = Object.create(PARAMS) params.team = null client.access(cmd, URI, params, nop) }, 'team is required') } if (cmd === 'grant') { t.throws(function () { var params = Object.create(PARAMS) params.permissions = null client.access(cmd, URI, params, nop) }, 'permissions are required') t.throws(function () { var params = Object.create(PARAMS) params.permissions = 'idkwhat' client.access(cmd, URI, params, nop) }, 'permissions must be either read-only or read-write') } }) t.end() }) test('cleanup', function (t) { server.close() t.end() }) function onJsonReq (req, cb) { var buffer = '' req.setEncoding('utf8') req.on('data', function (data) { buffer += data }) req.on('end', function () { cb(buffer ? 
JSON.parse(buffer) : undefined) }) } function contains (arr, item) { return arr.indexOf(item) !== -1 } npm_3.5.2.orig/node_modules/npm-registry-client/test/adduser-new.js0000644000000000000000000000242112631326456023615 0ustar 00000000000000var tap = require('tap') var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() var password = '%1234@asdf%' var username = 'username' var email = 'i@izs.me' var userdata = { name: username, email: email, _id: 'org.couchdb.user:username', type: 'user', roles: [], date: '2012-06-07T04:11:21.591Z' } var SD = require('string_decoder').StringDecoder var decoder = new SD() tap.test('create new user account', function (t) { var auth = { username: username, password: password, email: email } var params = { auth: auth } server.expect('/registry/_design/app/_rewrite/-/user/org.couchdb.user:username', function (req, res) { t.equal(req.method, 'PUT') var b = '' req.on('data', function (d) { b += decoder.write(d) }) req.on('end', function () { var o = JSON.parse(b) userdata.password = password userdata.date = o.date t.deepEqual(o, userdata) res.statusCode = 201 res.json(auth) }) }) client.adduser( 'http://localhost:1337/registry/_design/app/_rewrite', params, function (er, data) { if (er) throw er t.deepEqual(data, auth, 'received expected auth data') server.close() t.end() } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/adduser-update.js0000644000000000000000000000306412631326456024312 0ustar 00000000000000var tap = require('tap') var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() var password = '%1234@asdf%' var username = 'username' var email = 'i@izs.me' var userdata = { name: username, email: email, _id: 'org.couchdb.user:username', type: 'user', roles: [], date: '2012-06-07T04:11:21.591Z' } var SD = require('string_decoder').StringDecoder var decoder = new SD() tap.test('update a user acct', function (t) { var auth = { username: username, password: password, email: email } var params = { auth: auth } server.expect('PUT', '/-/user/org.couchdb.user:username', function (req, res) { t.equal(req.method, 'PUT') res.statusCode = 409 res.json({error: 'conflict'}) }) server.expect('GET', '/-/user/org.couchdb.user:username?write=true', function (req, res) { t.equal(req.method, 'GET') res.json(userdata) }) server.expect('PUT', '/-/user/org.couchdb.user:username/-rev/' + userdata._rev, function (req, res) { t.equal(req.method, 'PUT') var b = '' req.on('data', function (d) { b += decoder.write(d) }) req.on('end', function () { var o = JSON.parse(b) userdata.password = password userdata.date = o.date t.deepEqual(o, userdata) res.statusCode = 201 res.json(auth) }) }) client.adduser( 'http://localhost:1337/', params, function (er, data) { if (er) throw er t.deepEqual(data, auth, 'got expected auth data') server.close() t.end() } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/adduser.js0000644000000000000000000000646612631326456023043 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var URI = 'https://npm.registry:8043/rewrite' var USERNAME = 'username' var PASSWORD = 'password' var EMAIL = 'n@p.m' var AUTH = { auth: { username: USERNAME, password: PASSWORD, email: EMAIL } } test('adduser call contract', function (t) { t.throws(function () { client.adduser(undefined, AUTH, nop) }, 'requires a URI') 
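// the t.throws checks below are synchronous: malformed arguments trip an
// assertion before any request reaches the registry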
t.throws(function () { client.adduser([], AUTH, nop) }, 'requires URI to be a string') t.throws(function () { client.adduser(URI, undefined, nop) }, 'requires params object') t.throws(function () { client.adduser(URI, '', nop) }, 'params must be object') t.throws(function () { client.adduser(URI, AUTH, undefined) }, 'requires callback') t.throws(function () { client.adduser(URI, AUTH, 'callback') }, 'callback must be function') t.throws( function () { var params = { auth: { password: PASSWORD, email: EMAIL } } client.adduser(URI, params, nop) }, { name: 'AssertionError', message: 'must include username in auth' }, 'auth must include username' ) t.throws( function () { var params = { auth: { username: USERNAME, email: EMAIL } } client.adduser(URI, params, nop) }, { name: 'AssertionError', message: 'must include password in auth' }, 'auth must include password' ) t.throws( function () { var params = { auth: { username: USERNAME, password: PASSWORD } } client.adduser(URI, params, nop) }, { name: 'AssertionError', message: 'must include email in auth' }, 'auth must include email' ) t.test('username missing', function (t) { var params = { auth: { username: '', password: PASSWORD, email: EMAIL } } client.adduser(URI, params, function (err) { t.equal(err && err.message, 'No username supplied.', 'username must not be empty') t.end() }) }) t.test('password missing', function (t) { var params = { auth: { username: USERNAME, password: '', email: EMAIL } } client.adduser(URI, params, function (err) { t.equal( err && err.message, 'No password supplied.', 'password must not be empty' ) t.end() }) }) t.test('email missing', function (t) { var params = { auth: { username: USERNAME, password: PASSWORD, email: '' } } client.adduser(URI, params, function (err) { t.equal( err && err.message, 'No email address supplied.', 'email must not be empty' ) t.end() }) }) t.test('email malformed', function (t) { var params = { auth: { username: USERNAME, password: PASSWORD, email: 'lolbutts' } } client.adduser(URI, params, function (err) { t.equal( err && err.message, 'Please use a real email address.', 'email must look like email' ) t.end() }) }) t.end() }) test('cleanup', function (t) { server.close() t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/test/config-defaults.js0000644000000000000000000000324312631326456024454 0ustar 00000000000000var test = require('tap').test require('./lib/server.js').close() var common = require('./lib/common.js') test('config defaults', function (t) { var client = common.freshClient() var proxy = client.config.proxy t.notOk(proxy.http, 'no default value for HTTP proxy') t.notOk(proxy.https, 'no default value for HTTPS proxy') t.notOk(proxy.localAddress, 'no default value for local address') var ssl = client.config.ssl t.notOk(ssl.ca, 'no default value for SSL certificate authority bundle') t.notOk(ssl.certificate, 'no default value for SSL client certificate') t.notOk(ssl.key, 'no default value for SSL client certificate key') t.equal(ssl.strict, true, 'SSL is strict by default') var retry = client.config.retry t.equal(retry.retries, 2, 'default retry count is 2') t.equal(retry.factor, 10, 'default retry factor is 10') t.equal(retry.minTimeout, 10000, 'retry minimum timeout is 10000 (10 seconds)') t.equal(retry.maxTimeout, 60000, 'retry maximum timeout is 60000 (60 seconds)') t.equal(client.config.userAgent, 'node/' + process.version, 'default userAgent') t.ok(client.log.info, "there's a default logger") t.equal(client.config.defaultTag, 'latest', 'default tag is "latest"') 
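// freshClient() above received no overrides, so each assertion here pins
// down a default filled in by the client's config initializer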
t.notOk(client.config.couchToken, 'no couchToken by default') t.notOk(client.config.sessionToken, 'no sessionToken by default') t.end() }) test('missing HTTPS proxy defaults to HTTP proxy', function (t) { var client = common.freshClient({ proxy: { http: 'http://proxy.npm:8088/' }}) t.equal(client.config.proxy.http, 'http://proxy.npm:8088/', 'HTTP proxy set') t.equal(client.config.proxy.http, client.config.proxy.https, 'HTTP === HTTPS') t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/test/config-override.js0000644000000000000000000000271712631326456024471 0ustar 00000000000000var test = require('tap').test require('./lib/server.js').close() var common = require('./lib/common.js') var config = { proxy: { http: 'http://proxy.npm:8088/', https: 'https://proxy.npm:8043/', localAddress: 'localhost.localdomain' }, ssl: { ca: 'not including a PEM', certificate: 'still not including a PEM', key: 'nope', strict: false }, retry: { count: 1, factor: 9001, minTimeout: -1, maxTimeout: Infinity }, userAgent: 'npm-awesome/4 (Mozilla 5.0)', log: { fake: function () {} }, defaultTag: 'next', couchToken: { object: true }, sessionToken: 'hamchunx' } test('config defaults', function (t) { var client = common.freshClient(config) var proxy = client.config.proxy t.equal(proxy.http, 'http://proxy.npm:8088/') t.equal(proxy.https, 'https://proxy.npm:8043/') t.equal(proxy.localAddress, 'localhost.localdomain') var ssl = client.config.ssl t.equal(ssl.ca, 'not including a PEM') t.equal(ssl.certificate, 'still not including a PEM') t.equal(ssl.key, 'nope') t.equal(ssl.strict, false) var retry = client.config.retry t.equal(retry.count, 1) t.equal(retry.factor, 9001) t.equal(retry.minTimeout, -1) t.equal(retry.maxTimeout, Infinity) t.equal(client.config.userAgent, 'npm-awesome/4 (Mozilla 5.0)') t.ok(client.log.fake) t.equal(client.config.defaultTag, 'next') t.ok(client.config.couchToken.object) t.equal(client.config.sessionToken, 'hamchunx') t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/test/deprecate.js0000644000000000000000000001145012631326456023335 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var cache = require('./fixtures/underscore/cache.json') var client = common.freshClient() function nop () {} var URI = 'https://npm.registry:8043/rewrite' var VERSION = '1.3.2' var MESSAGE = 'uhhh' var TOKEN = 'lolbutts' var AUTH = { token: TOKEN } var PARAMS = { version: VERSION, message: MESSAGE, auth: AUTH } test('deprecate call contract', function (t) { t.throws(function () { client.deprecate(undefined, PARAMS, nop) }, 'requires a URI') t.throws(function () { client.deprecate([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.deprecate(URI, undefined, nop) }, 'requires params object') t.throws(function () { client.deprecate(URI, '', nop) }, 'params must be object') t.throws(function () { client.deprecate(URI, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.deprecate(URI, PARAMS, 'callback') }, 'callback must be function') t.throws( function () { var params = { message: MESSAGE, auth: AUTH } client.deprecate(URI, params, nop) }, { name: 'AssertionError', message: 'must pass version to deprecate' }, 'params must include version to deprecate' ) t.throws( function () { var params = { version: VERSION, auth: AUTH } client.deprecate(URI, params, nop) }, { name: 'AssertionError', message: 'must pass message to deprecate' }, 'params must include deprecation message' ) 
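// deprecation writes to the package document, so auth is mandatory as well; that contract is asserted next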
t.throws( function () { var params = { version: VERSION, message: MESSAGE } client.deprecate(URI, params, nop) }, { name: 'AssertionError', message: 'must pass auth to deprecate' }, 'params must include auth' ) t.test('malformed semver in deprecation', function (t) { var params = { version: '-9001', message: MESSAGE, auth: AUTH } client.deprecate(URI, params, function (err) { t.equal( err && err.message, 'invalid version range: -9001', 'got expected semver validation failure' ) t.end() }) }) t.end() }) test('deprecate a package', function (t) { server.expect('GET', '/underscore?write=true', function (req, res) { t.equal(req.method, 'GET') res.json(cache) }) server.expect('PUT', '/underscore', function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var updated = JSON.parse(b) var undeprecated = [ '1.0.3', '1.0.4', '1.1.0', '1.1.1', '1.1.2', '1.1.3', '1.1.4', '1.1.5', '1.1.6', '1.1.7', '1.2.0', '1.2.1', '1.2.2', '1.2.3', '1.2.4', '1.3.0', '1.3.1', '1.3.3' ] for (var i = 0; i < undeprecated.length; i++) { var current = undeprecated[i] t.notEqual( updated.versions[current].deprecated, MESSAGE, current + ' not deprecated' ) } t.equal( updated.versions[VERSION].deprecated, MESSAGE, VERSION + ' deprecated' ) res.statusCode = 201 res.json({ deprecated: true }) }) }) client.deprecate( common.registry + '/underscore', PARAMS, function (er, data) { t.ifError(er) t.ok(data.deprecated, 'was deprecated') t.end() } ) }) test('deprecate a scoped package', function (t) { server.expect('GET', '/@test%2funderscore?write=true', function (req, res) { t.equal(req.method, 'GET') res.json(cache) }) server.expect('PUT', '/@test%2funderscore', function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var updated = JSON.parse(b) var undeprecated = [ '1.0.3', '1.0.4', '1.1.0', '1.1.1', '1.1.2', '1.1.3', '1.1.4', '1.1.5', '1.1.6', '1.1.7', '1.2.0', '1.2.1', '1.2.2', '1.2.3', '1.2.4', '1.3.0', '1.3.1', '1.3.3' ] for (var i = 0; i < undeprecated.length; i++) { var current = undeprecated[i] t.notEqual( updated.versions[current].deprecated, MESSAGE, current + ' not deprecated' ) } t.equal( updated.versions[VERSION].deprecated, MESSAGE, VERSION + ' deprecated' ) res.statusCode = 201 res.json({ deprecated: true }) }) }) client.deprecate( common.registry + '/@test%2funderscore', PARAMS, function (er, data) { t.ifError(er) t.ok(data.deprecated, 'was deprecated') t.end() } ) }) test('cleanup', function (t) { server.close() t.ok(true) t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/test/dist-tags-add.js0000644000000000000000000000613612631326456024033 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var BASE_URL = 'http://localhost:1337/' var URI = '/-/package/underscore/dist-tags/test' var TOKEN = 'foo' var AUTH = { token: TOKEN } var PACKAGE = 'underscore' var DIST_TAG = 'test' var VERSION = '3.1.3' var PARAMS = { 'package': PACKAGE, distTag: DIST_TAG, version: VERSION, auth: AUTH } test('distTags.add call contract', function (t) { t.throws(function () { client.distTags.add(undefined, AUTH, nop) }, 'requires a URI') t.throws(function () { client.distTags.add([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.distTags.add(BASE_URL, undefined, nop) }, 'requires params 
object') t.throws(function () { client.distTags.add(BASE_URL, '', nop) }, 'params must be object') t.throws(function () { client.distTags.add(BASE_URL, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.distTags.add(BASE_URL, PARAMS, 'callback') }, 'callback must be function') t.throws( function () { var params = { distTag: DIST_TAG, version: VERSION, auth: AUTH } client.distTags.add(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass package name to distTags.add' }, 'distTags.add must include package name' ) t.throws( function () { var params = { 'package': PACKAGE, version: VERSION, auth: AUTH } client.distTags.add(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass package distTag name to distTags.add' }, 'distTags.add must include dist-tag' ) t.throws( function () { var params = { 'package': PACKAGE, distTag: DIST_TAG, auth: AUTH } client.distTags.add(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass version to be mapped to distTag to distTags.add' }, 'distTags.add must include version' ) t.throws( function () { var params = { 'package': PACKAGE, distTag: DIST_TAG, version: VERSION } client.distTags.add(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass auth to distTags.add' }, 'distTags.add must include auth' ) t.end() }) test('add a new dist-tag to a package', function (t) { server.expect('PUT', URI, function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { t.doesNotThrow(function () { var parsed = JSON.parse(b) t.deepEqual(parsed, VERSION) res.statusCode = 200 res.json({ test: VERSION }) }, 'got valid JSON from client') }) }) client.distTags.add(BASE_URL, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.ok(data.test, 'dist-tag added') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/dist-tags-fetch.js0000644000000000000000000000431212631326456024366 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var BASE_URL = 'http://localhost:1337/' var URI = '/-/package/underscore/dist-tags' var TOKEN = 'foo' var AUTH = { token: TOKEN } var PACKAGE = 'underscore' var PARAMS = { 'package': PACKAGE, auth: AUTH } test('distTags.fetch call contract', function (t) { t.throws(function () { client.distTags.fetch(undefined, AUTH, nop) }, 'requires a URI') t.throws(function () { client.distTags.fetch([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.distTags.fetch(BASE_URL, undefined, nop) }, 'requires params object') t.throws(function () { client.distTags.fetch(BASE_URL, '', nop) }, 'params must be object') t.throws(function () { client.distTags.fetch(BASE_URL, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.distTags.fetch(BASE_URL, PARAMS, 'callback') }, 'callback must be function') t.throws( function () { var params = { auth: AUTH } client.distTags.fetch(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass package name to distTags.fetch' }, 'distTags.fetch must include package name' ) t.throws( function () { var params = { 'package': PACKAGE } client.distTags.fetch(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass auth to distTags.fetch' }, 'distTags.fetch must include auth' ) t.end() }) test('fetch dist-tags for a package', 
function (t) { server.expect('GET', URI, function (req, res) { t.equal(req.method, 'GET') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { t.notOk(b, 'no request body') res.statusCode = 200 res.json({ a: '1.0.0', b: '2.0.0', _etag: 'xxx' }) }) }) client.distTags.fetch(BASE_URL, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.same(data, { a: '1.0.0', b: '2.0.0' }, 'etag filtered from response') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/dist-tags-rm.js0000644000000000000000000000500412631326456023712 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var BASE_URL = 'http://localhost:1337/' var URI = '/-/package/underscore/dist-tags/test' var TOKEN = 'foo' var AUTH = { token: TOKEN } var PACKAGE = 'underscore' var DIST_TAG = 'test' var PARAMS = { 'package': PACKAGE, distTag: DIST_TAG, auth: AUTH } test('distTags.rm call contract', function (t) { t.throws(function () { client.distTags.rm(undefined, AUTH, nop) }, 'requires a URI') t.throws(function () { client.distTags.rm([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.distTags.rm(BASE_URL, undefined, nop) }, 'requires params object') t.throws(function () { client.distTags.rm(BASE_URL, '', nop) }, 'params must be object') t.throws(function () { client.distTags.rm(BASE_URL, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.distTags.rm(BASE_URL, PARAMS, 'callback') }, 'callback must be function') t.throws( function () { var params = { distTag: DIST_TAG, auth: AUTH } client.distTags.rm(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass package name to distTags.rm' }, 'distTags.rm must include package name' ) t.throws( function () { var params = { 'package': PACKAGE, auth: AUTH } client.distTags.rm(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass package distTag name to distTags.rm' }, 'distTags.rm must include dist-tag' ) t.throws( function () { var params = { 'package': PACKAGE, distTag: DIST_TAG } client.distTags.rm(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass auth to distTags.rm' }, 'distTags.rm must include auth' ) t.end() }) test('remove a dist-tag from a package', function (t) { server.expect('DELETE', URI, function (req, res) { t.equal(req.method, 'DELETE') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { t.notOk(b, 'got no message body') res.statusCode = 200 res.json({}) }) }) client.distTags.rm(BASE_URL, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.notOk(data.test, 'dist-tag removed') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/dist-tags-set.js0000644000000000000000000000511612631326456024073 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var BASE_URL = 'http://localhost:1337/' var URI = '/-/package/underscore/dist-tags' var TOKEN = 'foo' var AUTH = { token: TOKEN } var PACKAGE = 'underscore' var DIST_TAGS = { 'a': '8.0.8', 'b': '3.0.3' } var PARAMS = { 'package': PACKAGE, distTags: DIST_TAGS, auth: AUTH } test('distTags.set call contract', function (t) { t.throws(function () { client.distTags.set(undefined, AUTH, nop) }, 
'requires a URI') t.throws(function () { client.distTags.set([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.distTags.set(BASE_URL, undefined, nop) }, 'requires params object') t.throws(function () { client.distTags.set(BASE_URL, '', nop) }, 'params must be object') t.throws(function () { client.distTags.set(BASE_URL, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.distTags.set(BASE_URL, PARAMS, 'callback') }, 'callback must be function') t.throws( function () { var params = { distTags: DIST_TAGS, auth: AUTH } client.distTags.set(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass package name to distTags.set' }, 'distTags.set must include package name' ) t.throws( function () { var params = { 'package': PACKAGE, auth: AUTH } client.distTags.set(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass distTags map to distTags.set' }, 'distTags.set must include dist-tags' ) t.throws( function () { var params = { 'package': PACKAGE, distTags: DIST_TAGS } client.distTags.set(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass auth to distTags.set' }, 'distTags.set must include auth' ) t.end() }) test('set dist-tags for a package', function (t) { server.expect('PUT', URI, function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var d = JSON.parse(b) t.deepEqual(d, DIST_TAGS, 'got back tags') res.statusCode = 200 res.json(DIST_TAGS) }) }) client.distTags.set(BASE_URL, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.ok(data.a && data.b, 'dist-tags set') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/dist-tags-update.js0000644000000000000000000000510412631326456024557 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var BASE_URL = 'http://localhost:1337/' var URI = '/-/package/underscore/dist-tags' var TOKEN = 'foo' var AUTH = { token: TOKEN } var PACKAGE = 'underscore' var DIST_TAGS = { 'a': '8.0.8', 'b': '3.0.3' } var PARAMS = { 'package': PACKAGE, distTags: DIST_TAGS, auth: AUTH } test('distTags.update call contract', function (t) { t.throws(function () { client.distTags.update(undefined, AUTH, nop) }, 'requires a URI') t.throws(function () { client.distTags.update([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.distTags.update(BASE_URL, undefined, nop) }, 'requires params object') t.throws(function () { client.distTags.update(BASE_URL, '', nop) }, 'params must be object') t.throws(function () { client.distTags.update(BASE_URL, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.distTags.update(BASE_URL, PARAMS, 'callback') }, 'callback must be function') t.throws( function () { var params = { distTags: DIST_TAGS, auth: AUTH } client.distTags.update(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass package name to distTags.update' }, 'distTags.update must include package name' ) t.throws( function () { var params = { 'package': PACKAGE, auth: AUTH } client.distTags.update(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass distTags map to distTags.update' }, 'distTags.update must include dist-tags' ) t.throws( function () { var params = { 'package': PACKAGE, distTags: DIST_TAGS } 
client.distTags.update(BASE_URL, params, nop) }, { name: 'AssertionError', message: 'must pass auth to distTags.update' }, 'distTags.update must include auth' ) t.end() }) test('update dist-tags for a package', function (t) { server.expect('POST', URI, function (req, res) { t.equal(req.method, 'POST') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var d = JSON.parse(b) t.deepEqual(d, DIST_TAGS, 'got back tags') res.statusCode = 200 res.json(DIST_TAGS) }) }) client.distTags.update(BASE_URL, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.ok(data.a && data.b, 'dist-tags set') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/fetch-404.js0000644000000000000000000000156312631326456023003 0ustar 00000000000000var resolve = require('path').resolve var createReadStream = require('graceful-fs').createReadStream var tap = require('tap') var server = require('./lib/server.js') var common = require('./lib/common.js') var tgz = resolve(__dirname, './fixtures/underscore/1.3.3/package.tgz') tap.test('fetch with a 404 response', function (t) { server.expect('/underscore/-/underscore-1.3.3.tgz', function (req, res) { t.equal(req.method, 'GET', 'got expected method') res.writeHead(404) createReadStream(tgz).pipe(res) }) var client = common.freshClient() var defaulted = {} client.fetch( 'http://localhost:1337/underscore/-/underscore-1.3.3.tgz', defaulted, function (err, res) { t.equal( err.message, 'fetch failed with status code 404', 'got expected error message' ) server.close() t.end() } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/fetch-408.js0000644000000000000000000000260012631326456023000 0ustar 00000000000000var resolve = require('path').resolve var createReadStream = require('graceful-fs').createReadStream var readFileSync = require('graceful-fs').readFileSync var tap = require('tap') var cat = require('concat-stream') var server = require('./lib/server.js') var common = require('./lib/common.js') var tgz = resolve(__dirname, './fixtures/underscore/1.3.3/package.tgz') tap.test('fetch with retry on timeout', function (t) { server.expect('/underscore/-/underscore-1.3.3.tgz', function (req, res) { t.equal(req.method, 'GET', 'got expected method') res.writeHead(408) res.end() }) server.expect('/underscore/-/underscore-1.3.3.tgz', function (req, res) { t.equal(req.method, 'GET', 'got expected method') res.writeHead(200, { 'content-type': 'application/x-tar', 'content-encoding': 'gzip' }) createReadStream(tgz).pipe(res) }) var client = common.freshClient() var defaulted = {} client.config.retry.minTimeout = 100 client.fetch( 'http://localhost:1337/underscore/-/underscore-1.3.3.tgz', defaulted, function (er, res) { t.ifError(er, 'loaded successfully') var sink = cat(function (data) { t.deepEqual(data, readFileSync(tgz)) server.close() t.end() }) res.on('error', function (error) { t.ifError(error, 'no errors on stream') }) res.pipe(sink) } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/fetch-503.js0000644000000000000000000000260512631326456023001 0ustar 00000000000000var resolve = require('path').resolve var createReadStream = require('graceful-fs').createReadStream var readFileSync = require('graceful-fs').readFileSync var tap = require('tap') var cat = require('concat-stream') var server = require('./lib/server.js') var common = require('./lib/common.js') var tgz = resolve(__dirname, './fixtures/underscore/1.3.3/package.tgz') tap.test('fetch with retry on server error', function 
(t) { server.expect('/underscore/-/underscore-1.3.3.tgz', function (req, res) { t.equal(req.method, 'GET', 'got expected method') res.writeHead(503) res.end() }) server.expect('/underscore/-/underscore-1.3.3.tgz', function (req, res) { t.equal(req.method, 'GET', 'got expected method') res.writeHead(200, { 'content-type': 'application/x-tar', 'content-encoding': 'gzip' }) createReadStream(tgz).pipe(res) }) var client = common.freshClient() var defaulted = {} client.config.retry.minTimeout = 100 client.fetch( 'http://localhost:1337/underscore/-/underscore-1.3.3.tgz', defaulted, function (er, res) { t.ifError(er, 'loaded successfully') var sink = cat(function (data) { t.deepEqual(data, readFileSync(tgz)) server.close() t.end() }) res.on('error', function (error) { t.ifError(error, 'no errors on stream') }) res.pipe(sink) } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/fetch-authed.js0000644000000000000000000000267112631326456023747 0ustar 00000000000000var resolve = require('path').resolve var createReadStream = require('graceful-fs').createReadStream var readFileSync = require('graceful-fs').readFileSync var tap = require('tap') var cat = require('concat-stream') var server = require('./lib/server.js') var common = require('./lib/common.js') var tgz = resolve(__dirname, './fixtures/underscore/1.3.3/package.tgz') tap.test('basic fetch with scoped always-auth enabled', function (t) { server.expect('/underscore/-/underscore-1.3.3.tgz', function (req, res) { t.equal(req.method, 'GET', 'got expected method') t.equal( req.headers.authorization, 'Basic dXNlcm5hbWU6JTEyMzRAYXNkZiU=', 'got expected auth header' ) res.writeHead(200, { 'content-type': 'application/x-tar', 'content-encoding': 'gzip' }) createReadStream(tgz).pipe(res) }) var auth = { username: 'username', password: '%1234@asdf%', email: 'i@izs.me', alwaysAuth: true } var client = common.freshClient() var authed = { auth: auth } client.fetch( 'http://localhost:1337/underscore/-/underscore-1.3.3.tgz', authed, function (er, res) { t.ifError(er, 'loaded successfully') var sink = cat(function (data) { t.deepEqual(data, readFileSync(tgz)) server.close() t.end() }) res.on('error', function (error) { t.ifError(error, 'no errors on stream') }) res.pipe(sink) } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/fetch-basic.js0000644000000000000000000000405712631326456023556 0ustar 00000000000000var resolve = require('path').resolve var createReadStream = require('graceful-fs').createReadStream var readFileSync = require('graceful-fs').readFileSync var test = require('tap').test var concat = require('concat-stream') var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() var tgz = resolve(__dirname, './fixtures/underscore/1.3.3/package.tgz') function nop () {} var URI = 'https://npm.registry:8043/rewrite' var USERNAME = 'username' var PASSWORD = 'hi' var EMAIL = 'n@p.m' var HEADERS = { 'npm-custom': 'lolbutts' } var AUTH = { username: USERNAME, password: PASSWORD, email: EMAIL } var PARAMS = { headers: HEADERS, auth: AUTH } test('fetch call contract', function (t) { t.throws(function () { client.fetch(undefined, PARAMS, nop) }, 'requires a URI') t.throws(function () { client.fetch([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.fetch(URI, undefined, nop) }, 'requires params object') t.throws(function () { client.fetch(URI, '', nop) }, 'params must be object') t.throws(function () { client.fetch(URI, PARAMS, undefined) }, 'requires callback')
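// call shape the assertions above pin down (inferred from the checks themselves, not from fetch.js): client.fetch(uri /* must be a string */, params /* must be an object */, cb /* must be a function */)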
t.throws(function () { client.fetch(URI, PARAMS, 'callback') }, 'callback must be function') t.end() }) test('basic fetch', function (t) { server.expect('/underscore/-/underscore-1.3.3.tgz', function (req, res) { t.equal(req.method, 'GET', 'got expected method') res.writeHead(200, { 'content-type': 'application/x-tar', 'content-encoding': 'gzip' }) createReadStream(tgz).pipe(res) }) var defaulted = {} client.fetch( 'http://localhost:1337/underscore/-/underscore-1.3.3.tgz', defaulted, function (er, res) { t.ifError(er, 'loaded successfully') var sink = concat(function (data) { t.deepEqual(data, readFileSync(tgz)) server.close() t.end() }) res.on('error', function (error) { t.ifError(error, 'no errors on stream') }) res.pipe(sink) } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/fetch-github-api-json.js0000644000000000000000000000345012631326456025471 0ustar 00000000000000var resolve = require('path').resolve var createReadStream = require('graceful-fs').createReadStream var readFileSync = require('graceful-fs').readFileSync var tap = require('tap') var cat = require('concat-stream') var Negotiator = require('negotiator') var server = require('./lib/server.js') var common = require('./lib/common.js') var tgz = resolve(__dirname, './fixtures/underscore/1.3.3/package.tgz') tap.test("fetch accepts github api's json", function (t) { server.expect('/underscore/-/underscore-1.3.3', function (req, res) { t.equal(req.method, 'GET', 'got expected method') var negotiator = new Negotiator(req) // fetching a tarball from `api.github.com` returns a 415 error if json is // not accepted if (negotiator.mediaTypes().indexOf('application/vnd.github+json') === -1) { res.writeHead(415, { 'Content-Type': 'application/json' }) } else { res.writeHead(302, { 'Content-Type': 'text/html', 'Location': '/underscore/-/underscore-1.3.3.tgz' }) } res.end() }) server.expect('/underscore/-/underscore-1.3.3.tgz', function (req, res) { t.equal(req.method, 'GET', 'got expected method') res.writeHead(200, { 'Content-Type': 'application/x-tar', 'Content-Encoding': 'gzip' }) createReadStream(tgz).pipe(res) }) var client = common.freshClient() var defaulted = {} client.fetch( 'http://localhost:1337/underscore/-/underscore-1.3.3', defaulted, function (er, res) { t.ifError(er, 'loaded successfully') var sink = cat(function (data) { t.deepEqual(data, readFileSync(tgz)) server.close() t.end() }) res.on('error', function (error) { t.ifError(error, 'no errors on stream') }) res.pipe(sink) } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/fetch-not-authed.js0000644000000000000000000000257412631326456024547 0ustar 00000000000000var resolve = require('path').resolve var createReadStream = require('graceful-fs').createReadStream var readFileSync = require('graceful-fs').readFileSync var tap = require('tap') var cat = require('concat-stream') var server = require('./lib/server.js') var common = require('./lib/common.js') var tgz = resolve(__dirname, './fixtures/underscore/1.3.3/package.tgz') tap.test('basic fetch with scoped always-auth disabled', function (t) { server.expect('/underscore/-/underscore-1.3.3.tgz', function (req, res) { t.equal(req.method, 'GET', 'got expected method') t.notOk(req.headers.authorization, 'received no auth header') res.writeHead(200, { 'content-type': 'application/x-tar', 'content-encoding': 'gzip' }) createReadStream(tgz).pipe(res) }) var auth = { username: 'username', password: '%1234@asdf%', email: 'i@izs.me', alwaysAuth: false } var client = common.freshClient() var authed = { auth: auth }
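// alwaysAuth is false here, so the expectation at the top of this test asserts that no authorization header reaches the registry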
client.fetch( 'http://localhost:1337/underscore/-/underscore-1.3.3.tgz', authed, function (er, res) { t.ifError(er, 'loaded successfully') var sink = cat(function (data) { t.deepEqual(data, readFileSync(tgz)) server.close() t.end() }) res.on('error', function (error) { t.ifError(error, 'no errors on stream') }) res.pipe(sink) } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/get-403.js0000644000000000000000000000135312631326456022465 0ustar 00000000000000var tap = require('tap') var server = require('./lib/server.js') var common = require('./lib/common.js') tap.test('get returns 403', function (t) { server.expect('/underscore', function (req, res) { t.equal(req.method, 'GET', 'got expected method') res.writeHead(403) res.end(JSON.stringify({ error: 'get that cat out of the toilet that\'s gross omg' })) }) var client = common.freshClient() client.get( 'http://localhost:1337/underscore', {}, function (er) { t.ok(er, 'failed as expected') t.equal(er.statusCode, 403, 'status code was attached to error as expected') t.equal(er.code, 'E403', 'error code was formatted as expected') server.close() t.end() } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/get-basic.js0000644000000000000000000000350112631326456023235 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() var us = require('./fixtures/underscore/1.3.3/cache.json') var usroot = require('./fixtures/underscore/cache.json') function nop () {} var URI = 'https://npm.registry:8043/rewrite' var TIMEOUT = 3600 var FOLLOW = false var STALE_OK = true var TOKEN = 'lolbutts' var AUTH = { token: TOKEN } var PARAMS = { timeout: TIMEOUT, follow: FOLLOW, staleOk: STALE_OK, auth: AUTH } test('get call contract', function (t) { t.throws(function () { client.get(undefined, PARAMS, nop) }, 'requires a URI') t.throws(function () { client.get([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.get(URI, undefined, nop) }, 'requires params object') t.throws(function () { client.get(URI, '', nop) }, 'params must be object') t.throws(function () { client.get(URI, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.get(URI, PARAMS, 'callback') }, 'callback must be function') t.end() }) test('basic request', function (t) { server.expect('/underscore/1.3.3', function (req, res) { res.json(us) }) server.expect('/underscore', function (req, res) { res.json(usroot) }) server.expect('/@bigco%2funderscore', function (req, res) { res.json(usroot) }) t.plan(3) client.get('http://localhost:1337/underscore/1.3.3', PARAMS, function (er, data) { t.deepEqual(data, us) }) client.get('http://localhost:1337/underscore', PARAMS, function (er, data) { t.deepEqual(data, usroot) }) client.get('http://localhost:1337/@bigco%2funderscore', PARAMS, function (er, data) { t.deepEqual(data, usroot) }) }) test('cleanup', function (t) { server.close() t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/test/get-error-403.js0000644000000000000000000000157512631326456023622 0ustar 00000000000000var tap = require('tap') var server = require('./lib/server.js') var common = require('./lib/common.js') tap.test('get fails with 403', function (t) { server.expect('/habanero', function (req, res) { t.equal(req.method, 'GET', 'got expected method') res.writeHead(403) res.end('{"error":"get that cat out of the toilet that\'s gross omg"}') }) var client = common.freshClient() client.config.retry.minTimeout = 100 
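// shrink the retry backoff so this failure-path test settles quickly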
client.get( 'http://localhost:1337/habanero', {}, function (er) { t.ok(er, 'failed as expected') t.equal(er.statusCode, 403, 'status code was attached as expected') t.equal(er.code, 'E403', 'error code was formatted as expected') t.equal( er.message, 'get that cat out of the toilet that\'s gross omg : habanero', 'got error message' ) server.close() t.end() } ) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/initialize.js0000644000000000000000000000504412631326456023544 0ustar 00000000000000var test = require('tap').test // var server = require('./lib/server.js') var Client = require('../') test('defaulted initialization', function (t) { var client = new Client() var options = client.initialize( 'http://localhost:1337/', 'GET', 'application/json', {} ) t.equal(options.url, 'http://localhost:1337/', 'URLs match') t.equal(options.method, 'GET', 'methods match') t.equal(options.proxy, undefined, "proxy won't overwrite environment") t.equal(options.localAddress, undefined, 'localAddress has no default value') t.equal(options.strictSSL, true, 'SSL is strict by default') t.equal(options.headers.accept, 'application/json', 'accept header set') t.equal( options.headers.version, require('../package.json').version, 'npm-registry-client version is present in headers' ) t.ok(options.headers['npm-session'], 'request ID generated') t.ok(options.headers['user-agent'], 'user-agent preset') var HttpAgent = require('http').Agent t.ok(options.agent instanceof HttpAgent, 'got an HTTP agent for an HTTP URL') t.end() }) test('referer set on client', function (t) { var client = new Client() client.refer = 'xtestx' var options = client.initialize( 'http://localhost:1337/', 'GET', 'application/json', {} ) t.equal(options.headers.referer, 'xtestx', 'referer header set') t.end() }) test('initializing with proxy explicitly disabled', function (t) { var client = new Client({ proxy: { http: false }}) var options = client.initialize( 'http://localhost:1337/', 'GET', 'application/json', {} ) t.ok('proxy' in options, 'proxy overridden by explicitly setting to false') t.equal(options.proxy, null, 'request will override proxy when empty proxy passed in') t.end() }) test('initializing with proxy undefined', function (t) { var client = new Client({ proxy: { http: undefined }}) var options = client.initialize( 'http://localhost:1337/', 'GET', 'application/json', {} ) t.notOk('proxy' in options, 'proxy can be read from env.PROXY by request') t.end() }) test('initializing with a certificate should map down to the https agent', function (t) { var certificate = '-----BEGIN CERTIFICATE----- TEST\nTEST -----END CERTIFICATE-----\n' var client = new Client({ ssl: { certificate: certificate } }) var options = client.initialize( { protocol: 'https:' }, 'GET', 'application/json', {} ) t.equal(options.agent.options.cert, certificate, 'certificate will be saved properly on agent') t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/test/lib/0000755000000000000000000000000012631326456021610 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-registry-client/test/logout.js0000644000000000000000000000306012631326456022710 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var URI = 'http://localhost:1337/rewrite' var TOKEN = 'b00b00feed' var PARAMS = { auth: { token: TOKEN } } test('logout call contract', function (t) { t.throws(function () { client.logout(undefined, PARAMS, nop) }, 'requires a 
URI') t.throws(function () { client.logout([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.logout(URI, undefined, nop) }, 'requires params object') t.throws(function () { client.logout(URI, '', nop) }, 'params must be object') t.throws(function () { client.logout(URI, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.logout(URI, PARAMS, 'callback') }, 'callback must be function') t.throws( function () { var params = { auth: {} } client.logout(URI, params, nop) }, { name: 'AssertionError', message: 'can only log out for token auth' }, 'auth must include token' ) t.end() }) test('log out from a token-based registry', function (t) { server.expect('DELETE', '/-/user/token/' + TOKEN, function (req, res) { t.equal(req.method, 'DELETE') t.equal(req.headers.authorization, 'Bearer ' + TOKEN, 'request is authed') res.json({message: 'ok'}) }) client.logout(URI, PARAMS, function (er) { t.ifError(er, 'no errors') t.end() }) }) test('cleanup', function (t) { server.close() t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/test/ping.js0000644000000000000000000000326112631326456022337 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var TOKEN = 'not-bad-meaning-bad-but-bad-meaning-wombat' var AUTH = { token: TOKEN } var PARAMS = { auth: AUTH } var DEP_USER = 'username' var HOST = 'localhost' test('ping call contract', function (t) { t.throws(function () { client.ping(undefined, AUTH, nop) }, 'requires a URI') t.throws(function () { client.ping([], AUTH, nop) }, 'requires URI to be a string') t.throws(function () { client.ping(common.registry, undefined, nop) }, 'requires params object') t.throws(function () { client.ping(common.registry, '', nop) }, 'params must be object') t.throws(function () { client.ping(common.registry, AUTH, undefined) }, 'requires callback') t.throws(function () { client.ping(common.registry, AUTH, 'callback') }, 'callback must be function') t.throws( function () { var params = {} client.ping(common.registry, params, nop) }, { name: 'AssertionError', message: 'must pass auth to ping' }, 'must pass auth to ping' ) t.end() }) test('ping', function (t) { server.expect('GET', '/-/ping?write=true', function (req, res) { t.equal(req.method, 'GET') res.statusCode = 200 res.json({ ok: true, host: HOST, peer: HOST, username: DEP_USER }) }) client.ping(common.registry, PARAMS, function (error, found) { t.ifError(error, 'no errors') var wanted = { ok: true, host: HOST, peer: HOST, username: DEP_USER } t.same(found, wanted) server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/publish-again-scoped.js0000644000000000000000000000456612631326456025411 0ustar 00000000000000var tap = require('tap') var fs = require('fs') var server = require('./lib/server.js') var common = require('./lib/common.js') var auth = { username: 'username', password: '%1234@asdf%', email: 'i@izs.me', alwaysAuth: true } var client = common.freshClient() tap.test('publish again', function (t) { // not really a tarball, but doesn't matter var bodyPath = require.resolve('../package.json') var tarball = fs.createReadStream(bodyPath) var pd = fs.readFileSync(bodyPath) var pkg = require('../package.json') var lastTime = null server.expect('/@npm%2fnpm-registry-client', function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', 
function () { var o = lastTime = JSON.parse(b) t.equal(o._id, '@npm/npm-registry-client') t.equal(o['dist-tags'].latest, pkg.version) t.has(o.versions[pkg.version], pkg) t.same(o.maintainers, [ { name: 'username', email: 'i@izs.me' } ]) var att = o._attachments[ pkg.name + '-' + pkg.version + '.tgz' ] t.same(att.data, pd.toString('base64')) res.statusCode = 409 res.json({reason: 'must supply latest _rev to update existing package'}) }) }) server.expect('/@npm%2fnpm-registry-client?write=true', function (req, res) { t.equal(req.method, 'GET') t.ok(lastTime) for (var i in lastTime.versions) { var v = lastTime.versions[i] delete lastTime.versions[i] lastTime.versions['0.0.2'] = v lastTime['dist-tags'] = { latest: '0.0.2' } } lastTime._rev = 'asdf' res.json(lastTime) }) server.expect('/@npm%2fnpm-registry-client', function (req, res) { t.equal(req.method, 'PUT') t.ok(lastTime) var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var o = JSON.parse(b) t.equal(o._rev, 'asdf') t.deepEqual(o.versions['0.0.2'], o.versions[pkg.version]) res.statusCode = 201 res.json({created: true}) }) }) pkg.name = '@npm/npm-registry-client' var params = { metadata: pkg, access: 'restricted', body: tarball, auth: auth } client.publish('http://localhost:1337/', params, function (er, data) { if (er) throw er t.deepEqual(data, { created: true }) server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/publish-again.js0000644000000000000000000000445712631326456024135 0ustar 00000000000000var tap = require('tap') var fs = require('fs') var server = require('./lib/server.js') var common = require('./lib/common.js') var auth = { username: 'username', password: '%1234@asdf%', email: 'i@izs.me', alwaysAuth: true } var client = common.freshClient() tap.test('publish again', function (t) { // not really a tarball, but doesn't matter var bodyPath = require.resolve('../package.json') var tarball = fs.createReadStream(bodyPath) var pd = fs.readFileSync(bodyPath) var pkg = require('../package.json') var lastTime = null server.expect('/npm-registry-client', function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var o = lastTime = JSON.parse(b) t.equal(o._id, 'npm-registry-client') t.equal(o['dist-tags'].latest, pkg.version) t.has(o.versions[pkg.version], pkg) t.same(o.maintainers, [ { name: 'username', email: 'i@izs.me' } ]) var att = o._attachments[ pkg.name + '-' + pkg.version + '.tgz' ] t.same(att.data, pd.toString('base64')) res.statusCode = 409 res.json({reason: 'must supply latest _rev to update existing package'}) }) }) server.expect('/npm-registry-client?write=true', function (req, res) { t.equal(req.method, 'GET') t.ok(lastTime) for (var i in lastTime.versions) { var v = lastTime.versions[i] delete lastTime.versions[i] lastTime.versions['0.0.2'] = v lastTime['dist-tags'] = { latest: '0.0.2' } } lastTime._rev = 'asdf' res.json(lastTime) }) server.expect('/npm-registry-client', function (req, res) { t.equal(req.method, 'PUT') t.ok(lastTime) var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var o = JSON.parse(b) t.equal(o._rev, 'asdf') t.deepEqual(o.versions['0.0.2'], o.versions[pkg.version]) res.statusCode = 201 res.json({created: true}) }) }) var params = { metadata: pkg, access: 'public', body: tarball, auth: auth } client.publish('http://localhost:1337/', params, function (er, data) { if (er) throw 
er t.deepEqual(data, { created: true }) server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/publish-failed-no-message.js0000644000000000000000000000206712631326456026331 0ustar 00000000000000var createReadStream = require('fs').createReadStream var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var config = { retry: { retries: 0 } } var client = common.freshClient(config) var URI = 'http://localhost:1337/' var USERNAME = 'username' var PASSWORD = '%1234@asdf%' var EMAIL = 'i@izs.me' var METADATA = require('../package.json') var ACCESS = 'public' // not really a tarball, but doesn't matter var BODY_PATH = require.resolve('../package.json') var BODY = createReadStream(BODY_PATH) var AUTH = { username: USERNAME, password: PASSWORD, email: EMAIL } var PARAMS = { metadata: METADATA, access: ACCESS, body: BODY, auth: AUTH } test('publish with a 500 response but no message', function (t) { server.expect('/npm-registry-client', function (req, res) { res.statusCode = 500 res.json({ success: false }) }) client.publish(URI, PARAMS, function (er, data) { t.ok(er, 'got expected error') t.notOk(data, 'no payload on failure') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/publish-mixcase-name.js0000644000000000000000000000457112631326456025422 0ustar 00000000000000var tap = require('tap') var fs = require('fs') var server = require('./lib/server.js') var common = require('./lib/common.js') var auth = { username: 'username', password: '%1234@asdf%', email: 'i@izs.me', alwaysAuth: true } var client = common.freshClient() tap.test('publish mixcase name', function (t) { // not really a tarball, but doesn't matter var bodyPath = require.resolve('../package.json') var tarball = fs.createReadStream(bodyPath) var pd = fs.readFileSync(bodyPath) var pkg = require('../package.json') var lastTime = null // change to mixed case name pkg.name = 'npm-Registry-Client' server.expect('/npm-Registry-Client', function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var o = lastTime = JSON.parse(b) t.equal(o._id, 'npm-Registry-Client') t.equal(o['dist-tags'].latest, pkg.version) t.has(o.versions[pkg.version], pkg) t.same(o.maintainers, [ { name: 'username', email: 'i@izs.me' } ]) var att = o._attachments[ pkg.name + '-' + pkg.version + '.tgz' ] t.same(att.data, pd.toString('base64')) res.statusCode = 409 res.json({reason: 'must supply latest _rev to update existing package'}) }) }) server.expect('/npm-Registry-Client?write=true', function (req, res) { t.equal(req.method, 'GET') t.ok(lastTime) for (var i in lastTime.versions) { var v = lastTime.versions[i] delete lastTime.versions[i] lastTime.versions['0.0.2'] = v lastTime['dist-tags'] = { latest: '0.0.2' } } lastTime._rev = 'asdf' res.json(lastTime) }) server.expect('/npm-Registry-Client', function (req, res) { t.equal(req.method, 'PUT') t.ok(lastTime) var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var o = JSON.parse(b) t.equal(o._rev, 'asdf') t.deepEqual(o.versions['0.0.2'], o.versions[pkg.version]) res.statusCode = 201 res.json({created: true}) }) }) var params = { metadata: pkg, access: 'public', body: tarball, auth: auth } client.publish('http://localhost:1337/', params, function (er, data) { if (er) throw er t.deepEqual(data, { created: true }) server.close() t.end() }) }) 
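// note the republish flow exercised above: the first PUT is rejected with a 409, the client re-reads the document via ?write=true to pick up _rev, then the second PUT succeeds with a 201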
npm_3.5.2.orig/node_modules/npm-registry-client/test/publish-new-mixcase-name.js0000644000000000000000000000374612631326456026214 0ustar 00000000000000var test = require('tap').test var crypto = require('crypto') var fs = require('fs') var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() var URI = 'http://localhost:1337/' var USERNAME = 'username' var PASSWORD = '%1234@asdf%' var EMAIL = 'i@izs.me' var METADATA = require('../package.json') var ACCESS = 'public' // not really a tarball, but doesn't matter var BODY_PATH = require.resolve('../package.json') var BODY = fs.createReadStream(BODY_PATH) var AUTH = { username: USERNAME, password: PASSWORD, email: EMAIL } var PARAMS = { metadata: METADATA, access: ACCESS, body: BODY, auth: AUTH } test('publish-new-mixcase-name', function (t) { var pd = fs.readFileSync(BODY_PATH) // change to mixed-case name METADATA.name = 'npm-Registry-Client' server.expect('/npm-Registry-Client', function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var o = JSON.parse(b) t.equal(o._id, 'npm-Registry-Client') t.equal(o['dist-tags'].latest, METADATA.version) t.equal(o.access, ACCESS) t.has(o.versions[METADATA.version], METADATA) t.same(o.maintainers, [{ name: 'username', email: 'i@izs.me' }]) t.same(o.maintainers, o.versions[METADATA.version].maintainers) var att = o._attachments[METADATA.name + '-' + METADATA.version + '.tgz'] t.same(att.data, pd.toString('base64')) var hash = crypto.createHash('sha1').update(pd).digest('hex') t.equal(o.versions[METADATA.version].dist.shasum, hash) res.statusCode = 403 res.json({error: 'Name must be lower-case'}) }) }) client.publish(URI, PARAMS, function (er, data, json, res) { t.assert(er instanceof Error) // expect error // TODO: need a test that ensures useful error message // t.similar(data.error, /must be lower-case/) server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/publish-scoped-auth-token.js0000644000000000000000000000315512631326456026402 0ustar 00000000000000var tap = require('tap') var crypto = require('crypto') var fs = require('fs') var server = require('./lib/server.js') var common = require('./lib/common.js') var auth = { token: 'of-glad-tidings' } var client = common.freshClient() tap.test('publish', function (t) { // not really a tarball, but doesn't matter var bodyPath = require.resolve('../package.json') var tarball = fs.createReadStream(bodyPath) var pd = fs.readFileSync(bodyPath) var pkg = require('../package.json') pkg.name = '@npm/npm-registry-client' server.expect('/@npm%2fnpm-registry-client', function (req, res) { t.equal(req.method, 'PUT') t.equal(req.headers.authorization, 'Bearer of-glad-tidings') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var o = JSON.parse(b) t.equal(o._id, '@npm/npm-registry-client') t.equal(o['dist-tags'].latest, pkg.version) t.has(o.versions[pkg.version], pkg) t.same(o.maintainers, o.versions[pkg.version].maintainers) var att = o._attachments[ pkg.name + '-' + pkg.version + '.tgz' ] t.same(att.data, pd.toString('base64')) var hash = crypto.createHash('sha1').update(pd).digest('hex') t.equal(o.versions[pkg.version].dist.shasum, hash) res.statusCode = 201 res.json({ created: true }) }) }) var params = { metadata: pkg, access: 'restricted', body: tarball, auth: auth } client.publish(common.registry, params, function (er, data) { if (er) 
throw er t.deepEqual(data, { created: true }) server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/publish-scoped.js0000644000000000000000000000346312631326456024327 0ustar 00000000000000var tap = require('tap') var crypto = require('crypto') var fs = require('fs') var server = require('./lib/server.js') var common = require('./lib/common.js') var auth = { username: 'username', password: '%1234@asdf%', email: 'ogd@aoaioxxysz.net' } var client = common.freshClient() var _auth = new Buffer('username:%1234@asdf%').toString('base64') tap.test('publish', function (t) { // not really a tarball, but doesn't matter var bodyPath = require.resolve('../package.json') var tarball = fs.createReadStream(bodyPath) var pd = fs.readFileSync(bodyPath) var pkg = require('../package.json') pkg.name = '@npm/npm-registry-client' server.expect('/@npm%2fnpm-registry-client', function (req, res) { t.equal(req.method, 'PUT') t.equal(req.headers.authorization, 'Basic ' + _auth) var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var o = JSON.parse(b) t.equal(o._id, '@npm/npm-registry-client') t.equal(o['dist-tags'].latest, pkg.version) t.has(o.versions[pkg.version], pkg) t.same(o.maintainers, [ { name: 'username', email: 'ogd@aoaioxxysz.net' } ]) t.same(o.maintainers, o.versions[pkg.version].maintainers) var att = o._attachments[ pkg.name + '-' + pkg.version + '.tgz' ] t.same(att.data, pd.toString('base64')) var hash = crypto.createHash('sha1').update(pd).digest('hex') t.equal(o.versions[pkg.version].dist.shasum, hash) res.statusCode = 201 res.json({ created: true }) }) }) var params = { metadata: pkg, access: 'restricted', body: tarball, auth: auth } client.publish(common.registry, params, function (er, data) { if (er) throw er t.deepEqual(data, { created: true }) server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/publish.js0000644000000000000000000001135212631326456023050 0ustar 00000000000000var test = require('tap').test var crypto = require('crypto') var fs = require('fs') var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var URI = 'http://localhost:1337/' var USERNAME = 'username' var PASSWORD = '%1234@asdf%' var EMAIL = 'i@izs.me' var METADATA = require('../package.json') var ACCESS = 'public' // not really a tarball, but doesn't matter var BODY_PATH = require.resolve('../package.json') var BODY = fs.createReadStream(BODY_PATH) var AUTH = { username: USERNAME, password: PASSWORD, email: EMAIL } var PARAMS = { metadata: METADATA, access: ACCESS, body: BODY, auth: AUTH } test('publish call contract', function (t) { t.throws(function () { client.publish(undefined, PARAMS, nop) }, 'requires a URI') t.throws(function () { client.publish([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.publish(URI, undefined, nop) }, 'requires params object') t.throws(function () { client.publish(URI, '', nop) }, 'params must be object') t.throws(function () { client.publish(URI, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.publish(URI, PARAMS, 'callback') }, 'callback must be function') t.throws( function () { var params = { access: ACCESS, body: BODY, auth: AUTH } client.publish(URI, params, nop) }, { name: 'AssertionError', message: 'must pass package metadata to publish' }, 'params must include metadata for package' ) t.throws( function () { var params = { metadata: METADATA, 
access: ACCESS, auth: AUTH } client.publish(URI, params, nop) }, { name: 'AssertionError', message: 'must pass package body to publish' }, 'params must include body of package to publish' ) t.throws( function () { var params = { metadata: METADATA, access: ACCESS, body: BODY } client.publish(URI, params, nop) }, { name: 'AssertionError', message: 'must pass auth to publish' }, 'params must include auth' ) t.throws( function () { var params = { metadata: -1, access: ACCESS, body: BODY, auth: AUTH } client.publish(URI, params, nop) }, { name: 'AssertionError', message: 'must pass package metadata to publish' }, 'metadata must be object' ) t.throws( function () { var params = { metadata: METADATA, access: 'hamchunx', body: BODY, auth: AUTH } client.publish(URI, params, nop) }, { name: 'AssertionError', message: "if present, access level must be either 'public' or 'restricted'" }, "access level must be 'public' or 'restricted'" ) t.throws( function () { var params = { metadata: METADATA, access: ACCESS, body: -1, auth: AUTH } client.publish(URI, params, nop) }, { name: 'AssertionError', message: 'package body passed to publish must be a stream' }, 'body must be a Stream' ) t.test('malformed semver in publish', function (t) { var metadata = JSON.parse(JSON.stringify(METADATA)) metadata.version = '%!@#$' var params = { metadata: metadata, access: ACCESS, message: BODY, auth: AUTH } client.publish(URI, params, function (err) { t.equal( err && err.message, 'invalid semver: %!@#$', 'got expected semver validation failure' ) t.end() }) }) t.end() }) test('publish', function (t) { var pd = fs.readFileSync(BODY_PATH) server.expect('/npm-registry-client', function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var o = JSON.parse(b) t.equal(o._id, 'npm-registry-client') t.equal(o['dist-tags'].latest, METADATA.version) t.equal(o.access, ACCESS) t.has(o.versions[METADATA.version], METADATA) t.same(o.maintainers, [{ name: 'username', email: 'i@izs.me' }]) t.same(o.maintainers, o.versions[METADATA.version].maintainers) var att = o._attachments[METADATA.name + '-' + METADATA.version + '.tgz'] t.same(att.data, pd.toString('base64')) var hash = crypto.createHash('sha1').update(pd).digest('hex') t.equal(o.versions[METADATA.version].dist.shasum, hash) res.statusCode = 201 res.json({ created: true }) }) }) client.publish(URI, PARAMS, function (er, data) { if (er) throw er t.deepEqual(data, { created: true }) server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/redirects.js0000644000000000000000000000221712631326456023366 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() var pkg = { _id: 'some-package@1.2.3', name: 'some-package', version: '1.2.3' } test('basic request', function (t) { // Expect one request for { follow : false } server.expect('/-/some-package/1.2.3', function (req, res) { res.writeHead(301, { 'Location': '/some-package/1.2.3' }) res.end('Redirecting') }) // Expect 2 requests for { follow : true } server.expect('/-/some-package/1.2.3', function (req, res) { res.writeHead(301, { 'Location': '/some-package/1.2.3' }) res.end('Redirecting') }) server.expect('/some-package/1.2.3', function (req, res) { res.json(pkg) }) t.plan(2) client.get( 'http://localhost:1337/-/some-package/1.2.3', { follow: false }, function (er) { t.ok(er, 'Error must be set') } ) client.get( 
'http://localhost:1337/-/some-package/1.2.3', { follow: true }, function (er, data) { t.deepEqual(data, pkg) } ) }) test('cleanup', function (t) { server.close() t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/test/request-gzip-content.js0000644000000000000000000000300612631326456025506 0ustar 00000000000000var zlib = require('zlib') var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient({ retry: { count: 1, minTimeout: 10, maxTimeout: 100 } }) var TEST_URL = common.registry + '/some-package-gzip/1.2.3' var pkg = { _id: 'some-package-gzip@1.2.3', name: 'some-package-gzip', version: '1.2.3' } zlib.gzip(JSON.stringify(pkg), function (err, pkgGzip) { test('request gzip package content', function (t) { t.ifError(err, 'example package compressed') server.expect('GET', '/some-package-gzip/1.2.3', function (req, res) { res.statusCode = 200 res.setHeader('Content-Encoding', 'gzip') res.setHeader('Content-Type', 'application/json') res.end(pkgGzip) }) client.get(TEST_URL, {}, function (er, data) { if (er) throw er t.deepEqual(data, pkg, 'some-package-gzip version 1.2.3') t.end() }) }) test('request wrong gzip package content', function (t) { // will retry 3 times for (var i = 0; i < 3; i++) { server.expect('GET', '/some-package-gzip/1.2.3', function (req, res) { res.statusCode = 200 res.setHeader('Content-Encoding', 'gzip') res.setHeader('Content-Type', 'application/json') res.end(new Buffer('wrong gzip content')) }) } client.get(TEST_URL, {}, function (er) { t.ok(er, 'ungzip error') t.end() }) }) test('cleanup', function (t) { server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/request.js0000644000000000000000000002033512631326456023073 0ustar 00000000000000var Readable = require('readable-stream').Readable var inherits = require('util').inherits var test = require('tap').test var concat = require('concat-stream') var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function OneA () { Readable.call(this) this.push('A') this.push(null) } inherits(OneA, Readable) function nop () {} var URI = 'http://localhost:1337/' var USERNAME = 'username' var PASSWORD = '%1234@asdf%' var EMAIL = 'i@izs.me' var AUTH = { username: USERNAME, password: PASSWORD, email: EMAIL } var PARAMS = { auth: AUTH } test('request call contract', function (t) { t.throws( function () { client.request(undefined, PARAMS, nop) }, { name: 'AssertionError', message: 'must pass uri to request' }, 'requires a URI' ) t.throws( function () { client.request([], PARAMS, nop) }, { name: 'AssertionError', message: 'must pass uri to request' }, 'requires URI to be a string' ) t.throws( function () { client.request(URI, undefined, nop) }, { name: 'AssertionError', message: 'must pass params to request' }, 'requires params object' ) t.throws( function () { client.request(URI, '', nop) }, { name: 'AssertionError', message: 'must pass params to request' }, 'params must be object' ) t.throws( function () { client.request(URI, PARAMS, undefined) }, { name: 'AssertionError', message: 'must pass callback to request' }, 'requires callback' ) t.throws( function () { client.request(URI, PARAMS, 'callback') }, { name: 'AssertionError', message: 'must pass callback to request' }, 'callback must be function' ) t.end() }) test('run request through its paces', function (t) { t.plan(34) server.expect('/request-defaults', function (req, res) { t.equal(req.method, 'GET', 'uses 
GET by default') req.pipe(concat(function (d) { t.notOk(d.toString('utf8'), 'no data included in request') res.statusCode = 200 res.json({ fetched: 'defaults' }) })) }) server.expect('/last-modified', function (req, res) { t.equal(req.headers['if-modified-since'], 'test-last-modified', 'got test if-modified-since') res.statusCode = 200 res.json({ fetched: 'last-modified' }) }) server.expect('/etag', function (req, res) { t.equal(req.headers['if-none-match'], 'test-etag', 'got test etag') res.statusCode = 200 res.json({ fetched: 'etag' }) }) server.expect('POST', '/etag-post', function (req, res) { t.equal(req.headers['if-match'], 'post-etag', 'got test post etag') res.statusCode = 200 res.json({ posted: 'etag' }) }) server.expect('PUT', '/body-stream', function (req, res) { req.pipe(concat(function (d) { t.equal(d.toString('utf8'), 'A', 'streamed expected data') res.statusCode = 200 res.json({ put: 'stream' }) })) }) server.expect('PUT', '/body-buffer', function (req, res) { req.pipe(concat(function (d) { t.equal(d.toString('utf8'), 'hi', 'streamed expected data') res.statusCode = 200 res.json({ put: 'buffer' }) })) }) server.expect('PUT', '/body-string', function (req, res) { req.pipe(concat(function (d) { t.equal(d.toString('utf8'), 'erp', 'streamed expected data') res.statusCode = 200 res.json({ put: 'string' }) })) }) server.expect('PUT', '/body-object', function (req, res) { req.pipe(concat(function (d) { t.equal(d.toString('utf8'), '["tricky"]', 'streamed expected data') res.statusCode = 200 res.json({ put: 'object' }) })) }) server.expect('GET', '/body-error-string', function (req, res) { req.pipe(concat(function () { res.statusCode = 200 res.json({ error: 'not really an error', reason: 'unknown' }) })) }) server.expect('GET', '/body-error-object', function (req, res) { req.pipe(concat(function () { res.statusCode = 200 res.json({ error: {} }) })) }) server.expect('GET', '/@scoped%2Fpackage-failing', function (req, res) { req.pipe(concat(function () { res.statusCode = 402 res.json({ error: 'payment required' }) })) }) server.expect('GET', '/not-found-no-body', function (req, res) { req.pipe(concat(function () { res.statusCode = 404 res.end() })) }) var defaults = {} client.request( common.registry + '/request-defaults', defaults, function (er, data, raw, response) { t.ifError(er, 'call worked') t.deepEquals(data, { fetched: 'defaults' }, 'confirmed defaults work') t.equal(response.headers.connection, 'keep-alive', 'keep-alive set') } ) var lastModified = { lastModified: 'test-last-modified' } client.request(common.registry + '/last-modified', lastModified, function (er, data) { t.ifError(er, 'call worked') t.deepEquals(data, { fetched: 'last-modified' }, 'last-modified request sent') }) var etagged = { etag: 'test-etag' } client.request(common.registry + '/etag', etagged, function (er, data) { t.ifError(er, 'call worked') t.deepEquals(data, { fetched: 'etag' }, 'etag request sent') }) var postEtagged = { method: 'post', etag: 'post-etag' } client.request(common.registry + '/etag-post', postEtagged, function (er, data) { t.ifError(er, 'call worked') t.deepEquals(data, { posted: 'etag' }, 'POST etag request sent') }) var putStream = { method: 'PUT', body: new OneA(), auth: AUTH } client.request(common.registry + '/body-stream', putStream, function (er, data) { t.ifError(er, 'call worked') t.deepEquals(data, { put: 'stream' }, 'PUT request with stream sent') }) var putBuffer = { method: 'PUT', body: new Buffer('hi'), auth: AUTH } client.request(common.registry + '/body-buffer',
putBuffer, function (er, data) { t.ifError(er, 'call worked') t.deepEquals(data, { put: 'buffer' }, 'PUT request with buffer sent') }) var putString = { method: 'PUT', body: 'erp', auth: AUTH } client.request(common.registry + '/body-string', putString, function (er, data) { t.ifError(er, 'call worked') t.deepEquals(data, { put: 'string' }, 'PUT request with string sent') }) var putObject = { method: 'PUT', body: { toJSON: function () { return ['tricky'] } }, auth: AUTH } client.request(common.registry + '/body-object', putObject, function (er, data) { t.ifError(er, 'call worked') t.deepEquals(data, { put: 'object' }, 'PUT request with object sent') }) client.request(common.registry + '/body-error-string', defaults, function (er) { t.equal( er && er.message, 'not really an error unknown: body-error-string', 'call worked' ) }) client.request(common.registry + '/body-error-object', defaults, function (er) { t.ifError(er, 'call worked') }) client.request(common.registry + '/@scoped%2Fpackage-failing', defaults, function (er) { t.equals(er.message, 'payment required : @scoped/package-failing') }) client.request(common.registry + '/not-found-no-body', defaults, function (er) { t.equals(er.message, '404 Not Found') t.equals(er.statusCode, 404, 'got back 404 as .statusCode') t.equals(er.code, 'E404', 'got back expected string code') t.notOk(er.pkgid, "no package name returned when there's no body on response") t.ok(typeof er !== 'string', "Error shouldn't be returned as string.") }) }) test('outputs notice if npm-notice header is set', function (t) { var client = common.freshClient({ log: { error: noop, warn: function (prefix, msg) { warnings.push(msg) }, info: noop, verbose: noop, silly: noop, http: noop, pause: noop, resume: noop } }) var message = 'notice me!' var warnings = [] function noop () {} server.expect('GET', '/npm-notice', function (req, res) { req.pipe(concat(function () { res.statusCode = 200 res.setHeader('npm-notice', message) res.end() })) }) client.request(common.registry + '/npm-notice', {}, function (er) { t.notEqual(warnings.indexOf(message), -1, 'notice printed') t.end() }) }) test('cleanup', function (t) { server.close() t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/test/retries.js0000644000000000000000000000245512631326456023063 0ustar 00000000000000var tap = require('tap') var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient({ retry: { retries: 6, minTimeout: 10, maxTimeout: 100 } }) var pkg = { _id: 'some-package@1.2.3', name: 'some-package', version: '1.2.3' } tap.test('get package, retrying until success', function (t) { // first time, return a 408 server.expect('GET', '/some-package/1.2.3', function (req, res) { res.statusCode = 408 res.end('Timeout') }) // then, slam the door in their face server.expect('GET', '/some-package/1.2.3', function (req, res) { res.destroy() }) // then, blame someone else server.expect('GET', '/some-package/1.2.3', function (req, res) { res.statusCode = 502 res.end('Gateway Timeout') }) // 'No one's home right now, come back later' server.expect('GET', '/some-package/1.2.3', function (req, res) { res.statusCode = 503 res.setHeader('retry-after', '10') res.end('Come back later') }) // finally, you may enter.
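// note: the client above was created with retries: 6, so the four failed
// responses queued so far all fall within a single get() call's retry budget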
server.expect('GET', '/some-package/1.2.3', function (req, res) { res.statusCode = 200 res.json(pkg) }) client.get('http://localhost:1337/some-package/1.2.3', {}, function (er, data) { if (er) throw er t.deepEqual(data, pkg) server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/star.js0000644000000000000000000000770112631326456022356 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() var cache = require('./fixtures/underscore/cache.json') var nock = require('nock') function nop () {} var URI = 'https://npm.registry:8043/rewrite' var STARRED = true var USERNAME = 'username' var PASSWORD = '%1234@asdf%' var EMAIL = 'i@izs.me' var AUTH = { username: USERNAME, password: PASSWORD, email: EMAIL } var PARAMS = { starred: STARRED, auth: AUTH } test('star call contract', function (t) { t.throws(function () { client.star(undefined, PARAMS, nop) }, 'requires a URI') t.throws(function () { client.star([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.star(URI, undefined, nop) }, 'requires params object') t.throws(function () { client.star(URI, '', nop) }, 'params must be object') t.throws(function () { client.star(URI, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.star(URI, PARAMS, 'callback') }, 'callback must be function') t.throws( function () { var params = { starred: STARRED } client.star(URI, params, nop) }, { name: 'AssertionError', message: 'must pass auth to star' }, 'params must include auth' ) t.end() }) test('star a package', function (t) { server.expect('GET', '/underscore?write=true', function (req, res) { t.equal(req.method, 'GET') res.json(cache) }) server.expect('PUT', '/underscore', function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var updated = JSON.parse(b) var already = [ 'vesln', 'mvolkmann', 'lancehunt', 'mikl', 'linus', 'vasc', 'bat', 'dmalam', 'mbrevoort', 'danielr', 'rsimoes', 'thlorenz' ] for (var i = 0; i < already.length; i++) { var current = already[i] t.ok( updated.users[current], current + ' still likes this package' ) } t.ok(updated.users[USERNAME], 'user is in the starred list') res.statusCode = 201 res.json({ starred: true }) }) }) var params = { starred: STARRED, auth: AUTH } client.star('http://localhost:1337/underscore', params, function (er, data) { t.ifError(er, 'no errors') t.ok(data.starred, 'was starred') t.end() }) }) test('if password auth, only sets authorization on put', function (t) { var starGet = nock('http://localhost:1010') .get('/underscore?write=true') .reply(200, {}) var starPut = nock('http://localhost:1010', { reqheaders: { authorization: 'Basic ' + new Buffer(AUTH.username + ':' + AUTH.password).toString('base64') } }) .put('/underscore') .reply(200) var params = { starred: STARRED, auth: AUTH } client.star('http://localhost:1010/underscore', params, function (er) { t.ifError(er, 'starred without issues') starGet.done() starPut.done() t.end() }) }) test('if token auth, sets bearer on get and put', function (t) { var starGet = nock('http://localhost:1010', { reqheaders: { authorization: 'Bearer foo' } }) .get('/underscore?write=true') .reply(200, {}) var getUser = nock('http://localhost:1010', { reqheaders: { authorization: 'Bearer foo' } }) .get('/-/whoami') .reply(200, { username: 'bcoe' }) var starPut = nock('http://localhost:1010', { 
reqheaders: { authorization: 'Bearer foo' } }) .put('/underscore') .reply(200) var params = { starred: STARRED, auth: { token: 'foo' } } client.star('http://localhost:1010/underscore', params, function (er) { t.ifError(er, 'starred without error') starGet.done() starPut.done() getUser.done() t.end() }) }) test('cleanup', function (t) { server.close() t.end() }) npm_3.5.2.orig/node_modules/npm-registry-client/test/stars.js0000644000000000000000000000334212631326456022536 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var URI = 'https://npm.registry:8043/rewrite' var USERNAME = 'sample' var PASSWORD = '%1234@asdf%' var EMAIL = 'i@izs.me' var AUTH = { username: USERNAME, password: PASSWORD, email: EMAIL } var PARAMS = { username: USERNAME, auth: AUTH } var USERS = [ 'benjamincoe', 'seldo', 'ceejbot' ] test('stars call contract', function (t) { t.throws(function () { client.stars(undefined, PARAMS, nop) }, 'requires a URI') t.throws(function () { client.stars([], PARAMS, nop) }, 'requires URI to be a string') t.throws(function () { client.stars(URI, undefined, nop) }, 'requires params object') t.throws(function () { client.stars(URI, '', nop) }, 'params must be object') t.throws(function () { client.stars(URI, PARAMS, undefined) }, 'requires callback') t.throws(function () { client.stars(URI, PARAMS, 'callback') }, 'callback must be function') t.test('no username anywhere', function (t) { var params = {} client.stars(URI, params, function (err) { t.equal( err && err.message, 'must pass either username or auth to stars', 'username must not be empty') t.end() }) }) t.end() }) test('get the stars for a package', function (t) { server.expect('GET', '/-/_view/starredByUser?key=%22sample%22', function (req, res) { t.equal(req.method, 'GET') res.json(USERS) }) client.stars('http://localhost:1337/', PARAMS, function (er, info) { t.ifError(er, 'no errors') t.deepEqual(info, USERS, 'got the list of users') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/tag.js0000644000000000000000000000441612631326456022160 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var URI = 'http://localhost:1337/underscore' var USERNAME = 'username' var PASSWORD = '%1234@asdf%' var EMAIL = 'i@izs.me' var VERSION = '1.3.2' var TAG = 'not-lodash' var AUTH = { username: USERNAME, password: PASSWORD, email: EMAIL } var PARAMS = { tag: TAG, version: VERSION, auth: AUTH } test('tag call contract', function (t) { t.throws(function () { client.tag(undefined, AUTH, nop) }, 'requires a URI') t.throws(function () { client.tag([], AUTH, nop) }, 'requires URI to be a string') t.throws(function () { client.tag(URI, undefined, nop) }, 'requires params object') t.throws(function () { client.tag(URI, '', nop) }, 'params must be object') t.throws(function () { client.tag(URI, AUTH, undefined) }, 'requires callback') t.throws(function () { client.tag(URI, AUTH, 'callback') }, 'callback must be function') t.throws( function () { var params = { tag: TAG, auth: AUTH } client.tag(URI, params, nop) }, { name: 'AssertionError', message: 'must pass version to tag' }, 'tag must include version' ) t.throws( function () { var params = { version: VERSION, auth: AUTH } client.tag(URI, params, nop) }, { name: 'AssertionError', message: 'must pass tag name 
to tag' }, 'tag must include name' ) t.throws( function () { var params = { version: VERSION, tag: TAG } client.tag(URI, params, nop) }, { name: 'AssertionError', message: 'must pass auth to tag' }, 'params must include auth' ) t.end() }) test('tag a package', function (t) { server.expect('PUT', '/underscore/not-lodash', function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf8') req.on('data', function (d) { b += d }) req.on('end', function () { var updated = JSON.parse(b) t.deepEqual(updated, '1.3.2') res.statusCode = 201 res.json({ tagged: true }) }) }) client.tag(URI, PARAMS, function (error, data) { t.ifError(error, 'no errors') t.ok(data.tagged, 'was tagged') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/team.js0000644000000000000000000001304012631326456022324 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var URI = 'http://localhost:1337' var PARAMS = { auth: { token: 'foo' }, scope: 'myorg', team: 'myteam' } var commands = ['create', 'destroy', 'add', 'rm', 'ls'] test('team create basic', function (t) { var teamData = { name: PARAMS.team, scope_id: 1234, created: '2015-07-23T18:07:49.959Z', updated: '2015-07-23T18:07:49.959Z', deleted: null } server.expect('PUT', '/-/org/myorg/team', function (req, res) { t.equal(req.method, 'PUT') onJsonReq(req, function (json) { t.same(json, { name: PARAMS.team }) res.statusCode = 200 res.json(teamData) }) }) client.team('create', URI, PARAMS, function (err, data) { t.ifError(err, 'no errors') t.same(data, teamData) t.end() }) }) test('team destroy', function (t) { var teamData = { name: 'myteam', scope_id: 1234, created: '2015-07-23T18:07:49.959Z', updated: '2015-07-23T18:07:49.959Z', deleted: '2015-07-23T18:27:27.178Z' } server.expect('DELETE', '/-/team/myorg/myteam', function (req, res) { t.equal(req.method, 'DELETE') onJsonReq(req, function (json) { t.same(json, undefined) res.statusCode = 200 res.json(teamData) }) }) client.team('destroy', URI, PARAMS, function (err, data) { t.ifError(err, 'no errors') t.same(data, teamData) t.end() }) }) test('team add basic', function (t) { var params = Object.create(PARAMS) params.user = 'zkat' server.expect('PUT', '/-/team/myorg/myteam/user', function (req, res) { t.equal(req.method, 'PUT') onJsonReq(req, function (json) { t.same(json, { user: params.user }) res.statusCode = 200 res.json(undefined) }) }) client.team('add', URI, params, function (err, data) { t.ifError(err, 'no errors') t.same(data, undefined) t.end() }) }) test('team add user not in org', function (t) { var params = Object.create(PARAMS) params.user = 'zkat' var errMsg = 'user is already in team' server.expect('PUT', '/-/team/myorg/myteam/user', function (req, res) { t.equal(req.method, 'PUT') res.statusCode = 400 res.json({ error: errMsg }) }) client.team('add', URI, params, function (err, data) { t.equal(err.message, errMsg + ' : ' + '-/team/myorg/myteam/user') t.same(data, {error: errMsg}) t.end() }) }) test('team rm basic', function (t) { var params = Object.create(PARAMS) params.user = 'bcoe' server.expect('DELETE', '/-/team/myorg/myteam/user', function (req, res) { t.equal(req.method, 'DELETE') onJsonReq(req, function (json) { t.same(json, params) res.statusCode = 200 res.json(undefined) }) }) client.team('rm', URI, params, function (err, data) { t.ifError(err, 'no errors') t.same(data, undefined) t.end() }) }) test('team ls (on org)', 
function (t) { var params = Object.create(PARAMS) params.team = null var teams = ['myorg:team1', 'myorg:team2', 'myorg:team3'] server.expect('GET', '/-/org/myorg/team?format=cli', function (req, res) { t.equal(req.method, 'GET') onJsonReq(req, function (json) { t.same(json, undefined) res.statusCode = 200 res.json(teams) }) }) client.team('ls', URI, params, function (err, data) { t.ifError(err, 'no errors') t.same(data, teams) t.end() }) }) test('team ls (on team)', function (t) { var uri = '/-/team/myorg/myteam/user?format=cli' var users = ['zkat', 'bcoe'] server.expect('GET', uri, function (req, res) { t.equal(req.method, 'GET') onJsonReq(req, function (json) { t.same(json, undefined) res.statusCode = 200 res.json(users) }) }) client.team('ls', URI, PARAMS, function (err, data) { t.ifError(err, 'no errors') t.same(data, users) t.end() }) }) // test('team edit', function (t) { // server.expect('PUT', '/-/org/myorg/team', function (req, res) { // t.equal(req.method, 'PUT') // res.statusCode = 201 // res.json({}) // }) // client.team('create', URI, PARAMS, function (err, data) { // t.ifError(err, 'no errors') // t.end() // }) // }) test('team command base validation', function (t) { t.throws(function () { client.team(undefined, URI, PARAMS, nop) }, 'command is required') commands.forEach(function (cmd) { t.throws(function () { client.team(cmd, undefined, PARAMS, nop) }, 'registry URI is required') t.throws(function () { client.team(cmd, URI, undefined, nop) }, 'params is required') t.throws(function () { client.team(cmd, URI, {scope: 'o', team: 't'}, nop) }, 'auth is required') t.throws(function () { client.team(cmd, URI, {auth: {token: 'f'}, team: 't'}, nop) }, 'scope is required') t.throws(function () { client.team(cmd, URI, PARAMS, {}) }, 'callback must be a function') if (cmd !== 'ls') { t.throws(function () { client.team( cmd, URI, {auth: {token: 'f'}, scope: 'o'}, nop) }, 'team name is required') } if (cmd === 'add' || cmd === 'rm') { t.throws(function () { client.team( cmd, URI, PARAMS, nop) }, 'user is required') } }) t.end() }) test('cleanup', function (t) { server.close() t.end() }) function onJsonReq (req, cb) { var buffer = '' req.setEncoding('utf8') req.on('data', function (data) { buffer += data }) req.on('end', function () { cb(buffer ? 
JSON.parse(buffer) : undefined) }) } npm_3.5.2.orig/node_modules/npm-registry-client/test/unpublish-scoped.js0000644000000000000000000000266712631326456024677 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() var cache = require('./fixtures/@npm/npm-registry-client/cache.json') var REV = '/-rev/213-0a1049cf56172b7d9a1184742c6477b9' var PACKAGE = '/@npm%2fnpm-registry-client' var URI = common.registry + PACKAGE var TOKEN = 'of-glad-tidings' var VERSION = '3.0.6' var AUTH = { token: TOKEN } var PARAMS = { version: VERSION, auth: AUTH } test('unpublish a package', function (t) { server.expect('GET', '/@npm%2fnpm-registry-client?write=true', function (req, res) { t.equal(req.method, 'GET') res.json(cache) }) server.expect('PUT', '/@npm%2fnpm-registry-client' + REV, function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf-8') req.on('data', function (d) { b += d }) req.on('end', function () { var updated = JSON.parse(b) t.notOk(updated.versions[VERSION]) }) res.json(cache) }) server.expect('GET', PACKAGE, function (req, res) { t.equal(req.method, 'GET') res.json(cache) }) server.expect( 'DELETE', PACKAGE + '/-' + PACKAGE + '-' + VERSION + '.tgz' + REV, function (req, res) { t.equal(req.method, 'DELETE') res.json({ unpublished: true }) } ) client.unpublish(URI, PARAMS, function (er) { t.ifError(er, 'no errors') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/unpublish.js0000644000000000000000000000432512631326456023415 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() var cache = require('./fixtures/underscore/cache.json') function nop () {} var REV = '/-rev/72-47f2986bfd8e8b55068b204588bbf484' var URI = 'http://localhost:1337/underscore' var TOKEN = 'of-glad-tidings' var VERSION = '1.3.2' var AUTH = { token: TOKEN } var PARAMS = { version: VERSION, auth: AUTH } test('unpublish call contract', function (t) { t.throws(function () { client.unpublish(undefined, AUTH, nop) }, 'requires a URI') t.throws(function () { client.unpublish([], AUTH, nop) }, 'requires URI to be a string') t.throws(function () { client.unpublish(URI, undefined, nop) }, 'requires params object') t.throws(function () { client.unpublish(URI, '', nop) }, 'params must be object') t.throws(function () { client.unpublish(URI, AUTH, undefined) }, 'requires callback') t.throws(function () { client.unpublish(URI, AUTH, 'callback') }, 'callback must be function') t.throws( function () { var params = { version: VERSION } client.unpublish(URI, params, nop) }, { name: 'AssertionError', message: 'must pass auth to unpublish' }, 'must pass auth to unpublish' ) t.end() }) test('unpublish a package', function (t) { server.expect('GET', '/underscore?write=true', function (req, res) { t.equal(req.method, 'GET') res.json(cache) }) server.expect('PUT', '/underscore' + REV, function (req, res) { t.equal(req.method, 'PUT') var b = '' req.setEncoding('utf-8') req.on('data', function (d) { b += d }) req.on('end', function () { var updated = JSON.parse(b) t.notOk(updated.versions[VERSION]) }) res.json(cache) }) server.expect('GET', '/underscore', function (req, res) { t.equal(req.method, 'GET') res.json(cache) }) server.expect('DELETE', '/underscore/-/underscore-1.3.2.tgz' + REV, function (req, res) { t.equal(req.method, 'DELETE') res.json({ unpublished: true }) }) 
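// the expectations above trace the full unpublish flow: fetch the registry
// document with ?write=true, PUT it back with the version removed, re-read
// the document, then DELETE the now-orphaned tarball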
client.unpublish(URI, PARAMS, function (error) { t.ifError(error, 'no errors') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/whoami.js0000644000000000000000000000323312631326456022665 0ustar 00000000000000var test = require('tap').test var server = require('./lib/server.js') var common = require('./lib/common.js') var client = common.freshClient() function nop () {} var WHOIAM = 'wombat' var TOKEN = 'not-bad-meaning-bad-but-bad-meaning-wombat' var AUTH = { token: TOKEN } var PARAMS = { auth: AUTH } test('whoami call contract', function (t) { t.throws(function () { client.whoami(undefined, AUTH, nop) }, 'requires a URI') t.throws(function () { client.whoami([], AUTH, nop) }, 'requires URI to be a string') t.throws(function () { client.whoami(common.registry, undefined, nop) }, 'requires params object') t.throws(function () { client.whoami(common.registry, '', nop) }, 'params must be object') t.throws(function () { client.whoami(common.registry, AUTH, undefined) }, 'requires callback') t.throws(function () { client.whoami(common.registry, AUTH, 'callback') }, 'callback must be function') t.throws( function () { var params = {} client.whoami(common.registry, params, nop) }, { name: 'AssertionError', message: 'must pass auth to whoami' }, 'must pass auth to whoami' ) t.end() }) test('whoami', function (t) { server.expect('GET', '/-/whoami', function (req, res) { t.equal(req.method, 'GET') // only available for token-based auth for now t.equal( req.headers.authorization, 'Bearer not-bad-meaning-bad-but-bad-meaning-wombat' ) res.json({ username: WHOIAM }) }) client.whoami(common.registry, PARAMS, function (error, wombat) { t.ifError(error, 'no errors') t.equal(wombat, WHOIAM, 'im a wombat') server.close() t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/zz-cleanup.js0000644000000000000000000000033212631326456023466 0ustar 00000000000000var tap = require('tap') var rimraf = require('rimraf') tap.test('teardown', function (t) { rimraf(__dirname + '/fixtures/cache', function (er) { if (er) throw er t.pass('cache cleaned') t.end() }) }) npm_3.5.2.orig/node_modules/npm-registry-client/test/lib/common.js0000644000000000000000000000122012631326456023431 0ustar 00000000000000var server = require('./server.js') var RC = require('../../') var REGISTRY = 'http://localhost:' + server.port // cheesy hackaround for test deps (read: nock) that rely on setImmediate if (!global.setImmediate || !require('timers').setImmediate) { require('timers').setImmediate = global.setImmediate = function () { var args = [arguments[0], 0].concat([].slice.call(arguments, 1)) setTimeout.apply(this, args) } } module.exports = { port: server.port, registry: REGISTRY, freshClient: function freshClient (config) { var client = new RC(config) server.log = client.log client.log.level = 'silent' return client } } npm_3.5.2.orig/node_modules/npm-registry-client/test/lib/server.js0000644000000000000000000000305612631326456023460 0ustar 00000000000000// a fake registry server. var http = require('http') var server = http.createServer(handler) var port = server.port = process.env.PORT || 1337 var assert = require('assert') server.listen(port) module.exports = server server._expect = {} function handler (req, res) { req.connection.setTimeout(1000) // If we got authorization, make sure it's the right password. 
if (req.headers.authorization && req.headers.authorization.match(/^Basic/)) { var auth = req.headers.authorization.replace(/^Basic /, '') auth = new Buffer(auth, 'base64').toString('utf8') assert.equal(auth, 'username:%1234@asdf%') } var u = '* ' + req.url var mu = req.method + ' ' + req.url var k = server._expect[mu] ? mu : server._expect[u] ? u : null if (!k) throw Error('unexpected request: ' + req.method + ' ' + req.url) var fn = server._expect[k].shift() if (!fn) throw Error('unexpected request: ' + req.method + ' ' + req.url) this.log.info('fake-registry', Object.keys(server._expect).map(function (k) { return [k, server._expect[k].length] }).reduce(function (acc, kv) { acc[kv[0]] = kv[1] return acc }, {})) res.json = json fn(req, res) } function json (o) { this.setHeader('content-type', 'application/json') this.end(JSON.stringify(o)) } // this log is meant to be overridden server.log = require('npmlog') server.expect = function (method, u, fn) { if (typeof u === 'function') { fn = u u = method method = '*' } u = method + ' ' + u server._expect[u] = server._expect[u] || [] server._expect[u].push(fn) } npm_3.5.2.orig/node_modules/npm-user-validate/.npmignore0000644000000000000000000000014512631326456021503 0ustar 00000000000000*.swp .*.swp .DS_Store *~ .project .settings npm-debug.log coverage.html .idea lib-cov node_modulesnpm_3.5.2.orig/node_modules/npm-user-validate/.travis.yml0000644000000000000000000000005712631326456021617 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10"npm_3.5.2.orig/node_modules/npm-user-validate/LICENSE0000644000000000000000000000241712631326456020515 0ustar 00000000000000Copyright (c) Robert Kowalski All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.npm_3.5.2.orig/node_modules/npm-user-validate/README.md0000644000000000000000000000056612631326456020772 0ustar 00000000000000[![Build Status](https://travis-ci.org/npm/npm-user-validate.png?branch=master)](https://travis-ci.org/npm/npm-user-validate) [![devDependency Status](https://david-dm.org/npm/npm-user-validate/dev-status.png)](https://david-dm.org/npm/npm-user-validate#info=devDependencies) # npm-user-validate Validation for the npm client and npm-www (and probably other npm projects) npm_3.5.2.orig/node_modules/npm-user-validate/npm-user-validate.js0000644000000000000000000000147612631326456023407 0ustar 00000000000000exports.email = email exports.pw = pw exports.username = username var requirements = exports.requirements = { username: { lowerCase: 'Username must be lowercase', urlSafe: 'Username may not contain non-url-safe chars', dot: 'Username may not start with "."' }, password: {}, email: { valid: 'Email must be an email address' } }; function username (un) { if (un !== un.toLowerCase()) { return new Error(requirements.username.lowerCase) } if (un !== encodeURIComponent(un)) { return new Error(requirements.username.urlSafe) } if (un.charAt(0) === '.') { return new Error(requirements.username.dot) } return null } function email (em) { if (!em.match(/^.+@.+\..+$/)) { return new Error(requirements.email.valid) } return null } function pw (pw) { return null } npm_3.5.2.orig/node_modules/npm-user-validate/package.json0000644000000000000000000000261012631326456021771 0ustar 00000000000000{ "name": "npm-user-validate", "version": "0.1.2", "description": "User validations for npm", "main": "npm-user-validate.js", "devDependencies": { "tap": "0.4.3" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/npm/npm-user-validate.git" }, "keywords": [ "npm", "validation", "registry" ], "author": { "name": "Robert Kowalski", "email": "rok@kowalski.gd" }, "license": "BSD-2-Clause", "gitHead": "e5b280babff5b73fe74b496461bcf424a51881e1", "bugs": { "url": "https://github.com/npm/npm-user-validate/issues" }, "homepage": "https://github.com/npm/npm-user-validate#readme", "_id": "npm-user-validate@0.1.2", "_shasum": "d585da0b47c9f41a9e6ca684b6fd84ba41ebe87d", "_from": "npm-user-validate@>=0.1.2 <0.2.0", "_npmVersion": "2.10.0", "_nodeVersion": "2.0.1", "_npmUser": { "name": "isaacs", "email": "isaacs@npmjs.com" }, "dist": { "shasum": "d585da0b47c9f41a9e6ca684b6fd84ba41ebe87d", "tarball": "http://registry.npmjs.org/npm-user-validate/-/npm-user-validate-0.1.2.tgz" }, "maintainers": [ { "name": "robertkowalski", "email": "rok@kowalski.gd" }, { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/npm-user-validate/-/npm-user-validate-0.1.2.tgz" } npm_3.5.2.orig/node_modules/npm-user-validate/test/0000755000000000000000000000000012631326456020463 5ustar 00000000000000npm_3.5.2.orig/node_modules/npm-user-validate/test/email.test.js0000644000000000000000000000077012631326456023072 0ustar 00000000000000var test = 
require('tap').test var v = require('../npm-user-validate.js').email test('email misses an @', function (t) { err = v('namedomain') t.type(err, 'object') t.end() }) test('email misses a dot', function (t) { err = v('name@domain') t.type(err, 'object') t.end() }) test('email misses a string before the @', function (t) { err = v('@domain') t.type(err, 'object') t.end() }) test('email is ok', function (t) { err = v('name@domain.com') t.type(err, 'null') t.end() })npm_3.5.2.orig/node_modules/npm-user-validate/test/pw.test.js0000644000000000000000000000101612631326456022423 0ustar 00000000000000var test = require('tap').test var v = require('../npm-user-validate.js').pw test('pw contains a \'', function (t) { err = v('\'') t.type(err, 'null') t.end() }) test('pw contains a :', function (t) { err = v(':') t.type(err, 'null') t.end() }) test('pw contains a @', function (t) { err = v('@') t.notOk(err, 'null') t.end() }) test('pw contains a "', function (t) { err = v('"') t.type(err, 'null') t.end() }) test('pw is ok', function (t) { err = v('duck') t.type(err, 'null') t.end() }) npm_3.5.2.orig/node_modules/npm-user-validate/test/username.test.js0000644000000000000000000000100712631326456023614 0ustar 00000000000000var test = require('tap').test var v = require('../npm-user-validate.js').username test('username must be lowercase', function (t) { err = v('ERRR') t.type(err, 'object') t.end() }) test('username may not contain non-url-safe chars', function (t) { err = v('f ') t.type(err, 'object') t.end() }) test('username may not start with "."', function (t) { err = v('.username') t.type(err, 'object') t.end() }) test('username is ok', function (t) { err = v('ente') t.type(err, 'null') t.end() }) npm_3.5.2.orig/node_modules/npmlog/.travis.yml0000644000000000000000000000032712631326456017556 0ustar 00000000000000language: node_js sudo: false node_js: - "5" - "4" - iojs - "0.12" - "0.10" - "0.8" before_install: - "npm install -g npm" script: "npm test" notifications: slack: npm-inc:kRqQjto7YbINqHPb1X6nS3g8 npm_3.5.2.orig/node_modules/npmlog/LICENSE0000644000000000000000000000137512631326456016456 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/npmlog/README.md0000644000000000000000000001213612631326456016725 0ustar 00000000000000# npmlog The logger util that npm uses. This logger is very basic. It does the logging for npm. It supports custom levels and colored output. By default, logs are written to stderr. If you want to send log messages to outputs other than streams, then you can change the `log.stream` member, or you can just listen to the events that it emits, and do whatever you want with them. 
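For example, a minimal sketch (not one of the module's own examples) of that event-based approach: `pause()` keeps messages away from the stream, while the `log` event still fires for every message (see Events and Message Objects below).

```
var log = require('npmlog')

// buffer stream writes so nothing reaches stderr
log.pause()

// every message still emits a 'log' event, carrying the message object
log.on('log', function (m) {
  // m looks like { id, level, prefix, message, messageRaw }
  console.log('captured %s %s: %s', m.level, m.prefix, m.message)
})

log.info('demo', 'captured instead of streamed')
```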
# Basic Usage ``` var log = require('npmlog') // additional stuff ---------------------------+ // message ----------+ | // prefix ----+ | | // level -+ | | | // v v v v log.info('fyi', 'I have a kitty cat: %j', myKittyCat) ``` ## log.level * {String} The level to display logs at. Any logs at or above this level will be displayed. The special level `silent` will prevent anything from being displayed ever. ## log.record * {Array} An array of all the log messages that have been entered. ## log.maxRecordSize * {Number} The maximum number of records to keep. If log.record gets bigger than 10% over this value, then it is sliced down to 90% of this value. The reason for the 10% window is so that it doesn't have to resize a large array on every log entry. ## log.prefixStyle * {Object} A style object that specifies how prefixes are styled. (See below) ## log.headingStyle * {Object} A style object that specifies how the heading is styled. (See below) ## log.heading * {String} Default: "" If set, a heading that is printed at the start of every line. ## log.stream * {Stream} Default: `process.stderr` The stream where output is written. ## log.enableColor() Force colors to be used on all messages, regardless of the output stream. ## log.disableColor() Disable colors on all messages. ## log.enableProgress() Enable the display of log activity spinner and progress bar ## log.disableProgress() Disable the display of a progress bar ## log.enableUnicode() Force the unicode theme to be used for the progress bar. ## log.disableUnicode() Disable the use of unicode in the progress bar. ## log.setGaugeTemplate(template) Overrides the default gauge template. ## log.pause() Stop emitting messages to the stream, but do not drop them. ## log.resume() Emit all buffered messages that were written while paused. ## log.log(level, prefix, message, ...) * `level` {String} The level to emit the message at * `prefix` {String} A string prefix. Set to "" to skip. * `message...` Arguments to `util.format` Emit a log message at the specified level. ## log\[level](prefix, message, ...) For example, * log.silly(prefix, message, ...) * log.verbose(prefix, message, ...) * log.info(prefix, message, ...) * log.http(prefix, message, ...) * log.warn(prefix, message, ...) * log.error(prefix, message, ...) Like `log.log(level, prefix, message, ...)`. In this way, each level is given a shorthand, so you can do `log.info(prefix, message)`. ## log.addLevel(level, n, style, disp) * `level` {String} Level indicator * `n` {Number} The numeric level * `style` {Object} Object with fg, bg, inverse, etc. * `disp` {String} Optional replacement for `level` in the output. Sets up a new level with a shorthand function and so forth. Note that if the number is `Infinity`, then setting the level to that will cause all log messages to be suppressed. If the number is `-Infinity`, then the only way to show it is to enable all log messages. ## log.newItem(name, todo, weight) * `name` {String} Optional; progress item name. * `todo` {Number} Optional; total amount of work to be done. Default 0. * `weight` {Number} Optional; the weight of this item relative to others. Default 1. This adds a new `are-we-there-yet` item tracker to the progress tracker. The object returned has the `log[level]` methods but is otherwise an `are-we-there-yet` `Tracker` object. ## log.newStream(name, todo, weight) This adds a new `are-we-there-yet` stream tracker to the progress tracker. 
The object returned has the `log[level]` methods but is otherwise an `are-we-there-yet` `TrackerStream` object. ## log.newGroup(name, weight) This adds a new `are-we-there-yet` tracker group to the progress tracker. The object returned has the `log[level]` methods but is otherwise an `are-we-there-yet` `TrackerGroup` object. # Events Events are all emitted with the message object. * `log` Emitted for all messages * `log.<level>` Emitted for all messages with the `<level>` level. * `<prefix>` Messages with prefixes also emit their prefix as an event. # Style Objects Style objects can have the following fields: * `fg` {String} Color for the foreground text * `bg` {String} Color for the background * `bold`, `inverse`, `underline` {Boolean} Set the associated property * `bell` {Boolean} Make a noise (This is pretty annoying, probably.) # Message Objects Every log event is emitted with a message object, and the `log.record` list contains all of them that have been created. They have the following fields: * `id` {Number} * `level` {String} * `prefix` {String} * `message` {String} Result of `util.format()` * `messageRaw` {Array} Arguments to `util.format()` npm_3.5.2.orig/node_modules/npmlog/example.js0000644000000000000000000000312512631326456017435 0ustar 00000000000000var log = require('./log.js') log.heading = 'npm' console.error('log.level=silly') log.level = 'silly' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=silent') log.level = 'silent' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=info') log.level = 'info' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('404', 'This is a longer\n'+ 'message, with some details\n'+ 'and maybe a stack.\n'+ new Error('a 404 error').stack) log.addLevel('noise', 10000, {beep: true}) log.noise(false, 'LOUD NOISES') npm_3.5.2.orig/node_modules/npmlog/log.js0000644000000000000000000001506012631326456016564 0ustar 00000000000000'use strict' var Progress = require('are-we-there-yet') var Gauge = require('gauge') var EE = require('events').EventEmitter var log = exports = module.exports = new EE var util = require('util') var ansi = require('ansi') log.cursor = ansi(process.stderr) log.stream = process.stderr // by default, let ansi decide based on tty-ness.
var colorEnabled = undefined log.enableColor = function () { colorEnabled = true this.cursor.enabled = true } log.disableColor = function () { colorEnabled = false this.cursor.enabled = false } // default level log.level = 'info' log.gauge = new Gauge(log.cursor) log.tracker = new Progress.TrackerGroup() // no progress bars unless asked log.progressEnabled = false var gaugeTheme = undefined log.enableUnicode = function () { gaugeTheme = Gauge.unicode log.gauge.setTheme(gaugeTheme) } log.disableUnicode = function () { gaugeTheme = Gauge.ascii log.gauge.setTheme(gaugeTheme) } var gaugeTemplate = undefined log.setGaugeTemplate = function (template) { gaugeTemplate = template log.gauge.setTemplate(gaugeTemplate) } log.enableProgress = function () { if (this.progressEnabled) return this.progressEnabled = true if (this._pause) return this.tracker.on('change', this.showProgress) this.gauge.enable() this.showProgress() } log.disableProgress = function () { if (!this.progressEnabled) return this.clearProgress() this.progressEnabled = false this.tracker.removeListener('change', this.showProgress) this.gauge.disable() } var trackerConstructors = ['newGroup', 'newItem', 'newStream'] var mixinLog = function (tracker) { // mixin the public methods from log into the tracker // (except: conflicts and one's we handle specially) Object.keys(log).forEach(function (P) { if (P[0] === '_') return if (trackerConstructors.filter(function (C) { return C === P }).length) return if (tracker[P]) return if (typeof log[P] !== 'function') return var func = log[P] tracker[P] = function () { return func.apply(log, arguments) } }) // if the new tracker is a group, make sure any subtrackers get // mixed in too if (tracker instanceof Progress.TrackerGroup) { trackerConstructors.forEach(function (C) { var func = tracker[C] tracker[C] = function () { return mixinLog(func.apply(tracker, arguments)) } }) } return tracker } // Add tracker constructors to the top level log object trackerConstructors.forEach(function (C) { log[C] = function () { return mixinLog(this.tracker[C].apply(this.tracker, arguments)) } }) log.clearProgress = function () { if (!this.progressEnabled) return this.gauge.hide() } log.showProgress = function (name) { if (!this.progressEnabled) return this.gauge.show(name, this.tracker.completed()) }.bind(log) // bind for use in tracker's on-change listener // temporarily stop emitting, but don't drop log.pause = function () { this._paused = true } log.resume = function () { if (!this._paused) return this._paused = false var b = this._buffer this._buffer = [] b.forEach(function (m) { this.emitLog(m) }, this) if (this.progressEnabled) this.enableProgress() } log._buffer = [] var id = 0 log.record = [] log.maxRecordSize = 10000 log.log = function (lvl, prefix, message) { var l = this.levels[lvl] if (l === undefined) { return this.emit('error', new Error(util.format( 'Undefined log level: %j', lvl))) } var a = new Array(arguments.length - 2) var stack = null for (var i = 2; i < arguments.length; i ++) { var arg = a[i-2] = arguments[i] // resolve stack traces to a plain string. if (typeof arg === 'object' && arg && (arg instanceof Error) && arg.stack) { arg.stack = stack = arg.stack + '' } } if (stack) a.unshift(stack + '\n') message = util.format.apply(util, a) var m = { id: id++, level: lvl, prefix: String(prefix || ''), message: message, messageRaw: a } this.emit('log', m) this.emit('log.' 
+ lvl, m) if (m.prefix) this.emit(m.prefix, m) this.record.push(m) var mrs = this.maxRecordSize var n = this.record.length - mrs if (n > mrs / 10) { var newSize = Math.floor(mrs * 0.9) this.record = this.record.slice(-1 * newSize) } this.emitLog(m) }.bind(log) log.emitLog = function (m) { if (this._paused) { this._buffer.push(m) return } if (this.progressEnabled) this.gauge.pulse(m.prefix) var l = this.levels[m.level] if (l === undefined) return if (l < this.levels[this.level]) return if (l > 0 && !isFinite(l)) return var style = log.style[m.level] var disp = log.disp[m.level] || m.level this.clearProgress() m.message.split(/\r?\n/).forEach(function (line) { if (this.heading) { this.write(this.heading, this.headingStyle) this.write(' ') } this.write(disp, log.style[m.level]) var p = m.prefix || '' if (p) this.write(' ') this.write(p, this.prefixStyle) this.write(' ' + line + '\n') }, this) this.showProgress() } log.write = function (msg, style) { if (!this.cursor) return if (this.stream !== this.cursor.stream) { this.cursor = ansi(this.stream, { enabled: colorEnabled }) var options = {} if (gaugeTheme != null) options.theme = gaugeTheme if (gaugeTemplate != null) options.template = gaugeTemplate this.gauge = new Gauge(options, this.cursor) } style = style || {} if (style.fg) this.cursor.fg[style.fg]() if (style.bg) this.cursor.bg[style.bg]() if (style.bold) this.cursor.bold() if (style.underline) this.cursor.underline() if (style.inverse) this.cursor.inverse() if (style.beep) this.cursor.beep() this.cursor.write(msg).reset() } log.addLevel = function (lvl, n, style, disp) { if (!disp) disp = lvl this.levels[lvl] = n this.style[lvl] = style if (!this[lvl]) this[lvl] = function () { var a = new Array(arguments.length + 1) a[0] = lvl for (var i = 0; i < arguments.length; i ++) { a[i + 1] = arguments[i] } return this.log.apply(this, a) }.bind(this) this.disp[lvl] = disp } log.prefixStyle = { fg: 'magenta' } log.headingStyle = { fg: 'white', bg: 'black' } log.style = {} log.levels = {} log.disp = {} log.addLevel('silly', -Infinity, { inverse: true }, 'sill') log.addLevel('verbose', 1000, { fg: 'blue', bg: 'black' }, 'verb') log.addLevel('info', 2000, { fg: 'green' }) log.addLevel('http', 3000, { fg: 'green', bg: 'black' }) log.addLevel('warn', 4000, { fg: 'black', bg: 'yellow' }, 'WARN') log.addLevel('error', 5000, { fg: 'red', bg: 'black' }, 'ERR!') log.addLevel('silent', Infinity) // allow 'error' prefix log.on('error', function(){}) npm_3.5.2.orig/node_modules/npmlog/node_modules/0000755000000000000000000000000012631326456020120 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/package.json0000644000000000000000000001421212631326456017731 0ustar 00000000000000{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "npmlog", "description": "logger for npm", "version": "2.0.0", "repository": { "type": "git", "url": "git+https://github.com/npm/npmlog.git" }, "main": "log.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "ansi": "~0.3.0", "are-we-there-yet": "~1.0.0", "gauge": "~1.2.0" }, "devDependencies": { "tap": "~2.2.0" }, "license": "ISC", "readme": "# npmlog\n\nThe logger util that npm uses.\n\nThis logger is very basic. It does the logging for npm. It supports\ncustom levels and colored output.\n\nBy default, logs are written to stderr. 
If you want to send log messages\nto outputs other than streams, then you can change the `log.stream`\nmember, or you can just listen to the events that it emits, and do\nwhatever you want with them.\n\n# Basic Usage\n\n```\nvar log = require('npmlog')\n\n// additional stuff ---------------------------+\n// message ----------+ |\n// prefix ----+ | |\n// level -+ | | |\n// v v v v\n log.info('fyi', 'I have a kitty cat: %j', myKittyCat)\n```\n\n## log.level\n\n* {String}\n\nThe level to display logs at. Any logs at or above this level will be\ndisplayed. The special level `silent` will prevent anything from being\ndisplayed ever.\n\n## log.record\n\n* {Array}\n\nAn array of all the log messages that have been entered.\n\n## log.maxRecordSize\n\n* {Number}\n\nThe maximum number of records to keep. If log.record gets bigger than\n10% over this value, then it is sliced down to 90% of this value.\n\nThe reason for the 10% window is so that it doesn't have to resize a\nlarge array on every log entry.\n\n## log.prefixStyle\n\n* {Object}\n\nA style object that specifies how prefixes are styled. (See below)\n\n## log.headingStyle\n\n* {Object}\n\nA style object that specifies how the heading is styled. (See below)\n\n## log.heading\n\n* {String} Default: \"\"\n\nIf set, a heading that is printed at the start of every line.\n\n## log.stream\n\n* {Stream} Default: `process.stderr`\n\nThe stream where output is written.\n\n## log.enableColor()\n\nForce colors to be used on all messages, regardless of the output\nstream.\n\n## log.disableColor()\n\nDisable colors on all messages.\n\n## log.enableProgress()\n\nEnable the display of log activity spinner and progress bar\n\n## log.disableProgress()\n\nDisable the display of a progress bar\n\n## log.enableUnicode()\n\nForce the unicode theme to be used for the progress bar.\n\n## log.disableUnicode()\n\nDisable the use of unicode in the progress bar.\n\n## log.setGaugeTemplate(template)\n\nOverrides the default gauge template.\n\n## log.pause()\n\nStop emitting messages to the stream, but do not drop them.\n\n## log.resume()\n\nEmit all buffered messages that were written while paused.\n\n## log.log(level, prefix, message, ...)\n\n* `level` {String} The level to emit the message at\n* `prefix` {String} A string prefix. Set to \"\" to skip.\n* `message...` Arguments to `util.format`\n\nEmit a log message at the specified level.\n\n## log\\[level](prefix, message, ...)\n\nFor example,\n\n* log.silly(prefix, message, ...)\n* log.verbose(prefix, message, ...)\n* log.info(prefix, message, ...)\n* log.http(prefix, message, ...)\n* log.warn(prefix, message, ...)\n* log.error(prefix, message, ...)\n\nLike `log.log(level, prefix, message, ...)`. In this way, each level is\ngiven a shorthand, so you can do `log.info(prefix, message)`.\n\n## log.addLevel(level, n, style, disp)\n\n* `level` {String} Level indicator\n* `n` {Number} The numeric level\n* `style` {Object} Object with fg, bg, inverse, etc.\n* `disp` {String} Optional replacement for `level` in the output.\n\nSets up a new level with a shorthand function and so forth.\n\nNote that if the number is `Infinity`, then setting the level to that\nwill cause all log messages to be suppressed. If the number is\n`-Infinity`, then the only way to show it is to enable all log messages.\n\n## log.newItem(name, todo, weight)\n\n* `name` {String} Optional; progress item name.\n* `todo` {Number} Optional; total amount of work to be done. Default 0.\n* `weight` {Number} Optional; the weight of this item relative to others. 
Default 1.\n\nThis adds a new `are-we-there-yet` item tracker to the progress tracker. The\nobject returned has the `log[level]` methods but is otherwise an\n`are-we-there-yet` `Tracker` object.\n\n## log.newStream(name, todo, weight)\n\nThis adds a new `are-we-there-yet` stream tracker to the progress tracker. The\nobject returned has the `log[level]` methods but is otherwise an\n`are-we-there-yet` `TrackerStream` object.\n\n## log.newGroup(name, weight)\n\nThis adds a new `are-we-there-yet` tracker group to the progress tracker. The\nobject returned has the `log[level]` methods but is otherwise an\n`are-we-there-yet` `TrackerGroup` object.\n\n# Events\n\nEvents are all emitted with the message object.\n\n* `log` Emitted for all messages\n* `log.` Emitted for all messages with the `` level.\n* `` Messages with prefixes also emit their prefix as an event.\n\n# Style Objects\n\nStyle objects can have the following fields:\n\n* `fg` {String} Color for the foreground text\n* `bg` {String} Color for the background\n* `bold`, `inverse`, `underline` {Boolean} Set the associated property\n* `bell` {Boolean} Make a noise (This is pretty annoying, probably.)\n\n# Message Objects\n\nEvery log event is emitted with a message object, and the `log.record`\nlist contains all of them that have been created. They have the\nfollowing fields:\n\n* `id` {Number}\n* `level` {String}\n* `prefix` {String}\n* `message` {String} Result of `util.format()`\n* `messageRaw` {Array} Arguments to `util.format()`\n", "readmeFilename": "README.md", "gitHead": "6eaa3f8eec672bb7b56a4df9b55dbfff3b9c6a71", "bugs": { "url": "https://github.com/npm/npmlog/issues" }, "homepage": "https://github.com/npm/npmlog#readme", "_id": "npmlog@2.0.0", "_shasum": "4076c200a3dda51133e6f3cf052130105f78bbdf", "_from": "npmlog@>=2.0.0 <2.1.0" } npm_3.5.2.orig/node_modules/npmlog/test/0000755000000000000000000000000012631326456016422 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/0000755000000000000000000000000012631326456021052 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/0000755000000000000000000000000012631326456023204 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/0000755000000000000000000000000012631326456021210 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/.npmignore0000644000000000000000000000001512631326456023045 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/History.md0000644000000000000000000000060612631326456023037 0ustar 00000000000000 0.3.0 / 2014-05-09 ================== * package: remove "test" script and "devDependencies" * package: remove "engines" section * pacakge: remove "bin" section * package: beautify * examples: remove `starwars` example (#15) * Documented goto, horizontalAbsolute, and eraseLine methods in README.md (#12, @Jammerwoch) * add `.jshintrc` file < 0.3.0 ======= * Prehistoric npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/README.md0000644000000000000000000000621312631326456022333 0ustar 00000000000000ansi.js ========= ### Advanced ANSI formatting tool for Node.js `ansi.js` is a module for Node.js that provides an easy-to-use API for writing ANSI escape codes to `Stream` instances. ANSI escape codes are used to do fancy things in a terminal window, like render text in colors, delete characters, lines, the entire window, or hide and show the cursor, and lots more! #### Features: * 256 color support for the terminal! 
* Make a beep sound from your terminal! * Works with *any* writable `Stream` instance. * Allows you to move the cursor anywhere on the terminal window. * Allows you to delete existing contents from the terminal window. * Allows you to hide and show the cursor. * Converts CSS color codes and RGB values into ANSI escape codes. * Low-level; you are in control of when escape codes are used, it's not abstracted. Installation ------------ Install with `npm`: ``` bash $ npm install ansi ``` Example ------- ``` js var ansi = require('ansi') , cursor = ansi(process.stdout) // You can chain your calls forever: cursor .red() // Set font color to red .bg.grey() // Set background color to grey .write('Hello World!') // Write 'Hello World!' to stdout .bg.reset() // Reset the bgcolor before writing the trailing \n, // to avoid Terminal glitches .write('\n') // And a final \n to wrap things up // Rendering modes are persistent: cursor.hex('#660000').bold().underline() // You can use the regular logging functions, text will be blood red: console.log('This is blood red, bold text') // To reset just the foreground color: cursor.fg.reset() console.log('This will still be bold') // to go to a location (x,y) on the console // note: 1-indexed, not 0-indexed: cursor.goto(10, 5).write('Five down, ten over') // to clear the current line: cursor.horizontalAbsolute(0).eraseLine().write('Starting again') // to go to a different column on the current line: cursor.horizontalAbsolute(5).write('column five') // Clean up after yourself! cursor.reset() ``` License ------- (The MIT License) Copyright (c) 2012 Nathan Rajlich &lt;nathan@tootallnate.net&gt; Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
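The feature list above also advertises CSS color code support; here is a minimal extra sketch (an addition, not from the original README) that sticks to calls already demonstrated in the example (`hex()`, `write()`, `fg.reset()`):

``` js
var ansi = require('ansi')
  , cursor = ansi(process.stdout)

// paint each swatch in its own CSS hex color, then restore the default
;['#ff0000', '#00cc00', '#3366ff'].forEach(function (color) {
  cursor.hex(color).write(color + ' ')
})
cursor.fg.reset().write('\n')
```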
npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/examples/0000755000000000000000000000000012631326456022670 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/lib/0000755000000000000000000000000012631326456021620 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/package.json0000644000000000000000000000242212631326456023340 0ustar 00000000000000{ "name": "ansi", "description": "Advanced ANSI formatting tool for Node.js", "keywords": [ "ansi", "formatting", "cursor", "color", "terminal", "rgb", "256", "stream" ], "version": "0.3.0", "author": { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://tootallnate.net" }, "repository": { "type": "git", "url": "git://github.com/TooTallNate/ansi.js.git" }, "main": "./lib/ansi.js", "bugs": { "url": "https://github.com/TooTallNate/ansi.js/issues" }, "homepage": "https://github.com/TooTallNate/ansi.js", "_id": "ansi@0.3.0", "_shasum": "74b2f1f187c8553c7f95015bcb76009fb43d38e0", "_from": "ansi@>=0.3.0 <0.4.0", "_npmVersion": "1.4.9", "_npmUser": { "name": "tootallnate", "email": "nathan@tootallnate.net" }, "maintainers": [ { "name": "TooTallNate", "email": "nathan@tootallnate.net" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "dist": { "shasum": "74b2f1f187c8553c7f95015bcb76009fb43d38e0", "tarball": "http://registry.npmjs.org/ansi/-/ansi-0.3.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/ansi/-/ansi-0.3.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/examples/beep/0000755000000000000000000000000012631326456023603 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/examples/clear/0000755000000000000000000000000012631326456023756 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/examples/cursorPosition.js0000755000000000000000000000120012631326456026264 0ustar 00000000000000#!/usr/bin/env node var tty = require('tty') var cursor = require('../')(process.stdout) // listen for the queryPosition report on stdin process.stdin.resume() raw(true) process.stdin.once('data', function (b) { var match = /\[(\d+)\;(\d+)R$/.exec(b.toString()) if (match) { var xy = match.slice(1, 3).reverse().map(Number) console.error(xy) } // cleanup and close stdin raw(false) process.stdin.pause() }) // send the query position request code to stdout cursor.queryPosition() function raw (mode) { if (process.stdin.setRawMode) { process.stdin.setRawMode(mode) } else { tty.setRawMode(mode) } } npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/examples/progress/0000755000000000000000000000000012631326456024534 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/examples/beep/index.js0000755000000000000000000000051212631326456025251 0ustar 00000000000000#!/usr/bin/env node /** * Invokes the terminal "beep" sound once per second on every exact second. */ process.title = 'beep' var cursor = require('../../')(process.stdout) function beep () { cursor.beep() setTimeout(beep, 1000 - (new Date()).getMilliseconds()) } setTimeout(beep, 1000 - (new Date()).getMilliseconds()) npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/examples/clear/index.js0000755000000000000000000000054412631326456025431 0ustar 00000000000000#!/usr/bin/env node /** * Like GNU ncurses "clear" command. 
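 * Fills the window with one newline per terminal row to push existing content out of view, then erases the entire display and homes the cursor at (1, 1).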
* https://github.com/mscdex/node-ncurses/blob/master/deps/ncurses/progs/clear.c */ process.title = 'clear' function lf () { return '\n' } require('../../')(process.stdout) .write(Array.apply(null, Array(process.stdout.getWindowSize()[1])).map(lf).join('')) .eraseData(2) .goto(1, 1) npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/examples/progress/index.js0000644000000000000000000000327412631326456026207 0ustar 00000000000000#!/usr/bin/env node var assert = require('assert') , ansi = require('../../') function Progress (stream, width) { this.cursor = ansi(stream) this.delta = this.cursor.newlines this.width = width | 0 || 10 this.open = '[' this.close = ']' this.complete = '█' this.incomplete = '_' // initial render this.progress = 0 } Object.defineProperty(Progress.prototype, 'progress', { get: get , set: set , configurable: true , enumerable: true }) function get () { return this._progress } function set (v) { this._progress = Math.max(0, Math.min(v, 100)) var w = this.width - this.complete.length - this.incomplete.length , n = w * (this._progress / 100) | 0 , i = w - n , com = c(this.complete, n) , inc = c(this.incomplete, i) , delta = this.cursor.newlines - this.delta assert.equal(com.length + inc.length, w) if (delta > 0) { this.cursor.up(delta) this.delta = this.cursor.newlines } this.cursor .horizontalAbsolute(0) .eraseLine(2) .fg.white() .write(this.open) .fg.grey() .bold() .write(com) .resetBold() .write(inc) .fg.white() .write(this.close) .fg.reset() .write('\n') } function c (char, length) { return Array.apply(null, Array(length)).map(function () { return char }).join('') } // Usage var width = parseInt(process.argv[2], 10) || process.stdout.getWindowSize()[0] / 2 , p = new Progress(process.stdout, width) ;(function tick () { p.progress += Math.random() * 5 p.cursor .eraseLine(2) .write('Progress: ') .bold().write(p.progress.toFixed(2)) .write('%') .resetBold() .write('\n') if (p.progress < 100) setTimeout(tick, 100) })() npm_3.5.2.orig/node_modules/npmlog/node_modules/ansi/lib/ansi.js0000644000000000000000000001743412631326456023121 0ustar 00000000000000 /** * References: * * - http://en.wikipedia.org/wiki/ANSI_escape_code * - http://www.termsys.demon.co.uk/vtansi.htm * */ /** * Module dependencies. */ var emitNewlineEvents = require('./newlines') , prefix = '\x1b[' // For all escape codes , suffix = 'm' // Only for color codes /** * The ANSI escape sequences. */ var codes = { up: 'A' , down: 'B' , forward: 'C' , back: 'D' , nextLine: 'E' , previousLine: 'F' , horizontalAbsolute: 'G' , eraseData: 'J' , eraseLine: 'K' , scrollUp: 'S' , scrollDown: 'T' , savePosition: 's' , restorePosition: 'u' , queryPosition: '6n' , hide: '?25l' , show: '?25h' } /** * Rendering ANSI codes. */ var styles = { bold: 1 , italic: 3 , underline: 4 , inverse: 7 } /** * The negating ANSI code for the rendering modes. */ var reset = { bold: 22 , italic: 23 , underline: 24 , inverse: 27 } /** * The standard, styleable ANSI colors. */ var colors = { white: 37 , black: 30 , blue: 34 , cyan: 36 , green: 32 , magenta: 35 , red: 31 , yellow: 33 , grey: 90 , brightBlack: 90 , brightRed: 91 , brightGreen: 92 , brightYellow: 93 , brightBlue: 94 , brightMagenta: 95 , brightCyan: 96 , brightWhite: 97 } /** * Creates a Cursor instance based off the given `writable stream` instance. */ function ansi (stream, options) { if (stream._ansicursor) { return stream._ansicursor } else { return stream._ansicursor = new Cursor(stream, options) } } module.exports = exports = ansi /** * The `Cursor` class. 
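 * Wraps a writable stream and tracks rendering state (foreground/background color, bold, italic, underline, inverse) so repeated style and color calls don't re-emit escape codes; every method returns the cursor so calls can be chained.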
*/ function Cursor (stream, options) { if (!(this instanceof Cursor)) { return new Cursor(stream, options) } if (typeof stream != 'object' || typeof stream.write != 'function') { throw new Error('a valid Stream instance must be passed in') } // the stream to use this.stream = stream // when 'enabled' is false then all the functions are no-ops except for write() this.enabled = options && options.enabled if (typeof this.enabled === 'undefined') { this.enabled = stream.isTTY } this.enabled = !!this.enabled // then `buffering` is true, then `write()` calls are buffered in // memory until `flush()` is invoked this.buffering = !!(options && options.buffering) this._buffer = [] // controls the foreground and background colors this.fg = this.foreground = new Colorer(this, 0) this.bg = this.background = new Colorer(this, 10) // defaults this.Bold = false this.Italic = false this.Underline = false this.Inverse = false // keep track of the number of "newlines" that get encountered this.newlines = 0 emitNewlineEvents(stream) stream.on('newline', function () { this.newlines++ }.bind(this)) } exports.Cursor = Cursor /** * Helper function that calls `write()` on the underlying Stream. * Returns `this` instead of the write() return value to keep * the chaining going. */ Cursor.prototype.write = function (data) { if (this.buffering) { this._buffer.push(arguments) } else { this.stream.write.apply(this.stream, arguments) } return this } /** * Buffer `write()` calls into memory. * * @api public */ Cursor.prototype.buffer = function () { this.buffering = true return this } /** * Write out the in-memory buffer. * * @api public */ Cursor.prototype.flush = function () { this.buffering = false var str = this._buffer.map(function (args) { if (args.length != 1) throw new Error('unexpected args length! ' + args.length); return args[0]; }).join(''); this._buffer.splice(0); // empty this.write(str); return this } /** * The `Colorer` class manages both the background and foreground colors. */ function Colorer (cursor, base) { this.current = null this.cursor = cursor this.base = base } exports.Colorer = Colorer /** * Write an ANSI color code, ensuring that the same code doesn't get rewritten. */ Colorer.prototype._setColorCode = function setColorCode (code) { var c = String(code) if (this.current === c) return this.cursor.enabled && this.cursor.write(prefix + c + suffix) this.current = c return this } /** * Set up the positional ANSI codes. */ Object.keys(codes).forEach(function (name) { var code = String(codes[name]) Cursor.prototype[name] = function () { var c = code if (arguments.length > 0) { c = toArray(arguments).map(Math.round).join(';') + code } this.enabled && this.write(prefix + c) return this } }) /** * Set up the functions for the rendering ANSI codes. */ Object.keys(styles).forEach(function (style) { var name = style[0].toUpperCase() + style.substring(1) , c = styles[style] , r = reset[style] Cursor.prototype[style] = function () { if (this[name]) return this.enabled && this.write(prefix + c + suffix) this[name] = true return this } Cursor.prototype['reset' + name] = function () { if (!this[name]) return this.enabled && this.write(prefix + r + suffix) this[name] = false return this } }) /** * Setup the functions for the standard colors. */ Object.keys(colors).forEach(function (color) { var code = colors[color] Colorer.prototype[color] = function () { this._setColorCode(this.base + code) return this.cursor } Cursor.prototype[color] = function () { return this.foreground[color]() } }) /** * Makes a beep sound! 
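 * Writes the BEL control character (\x07) to the stream when the cursor is enabled.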
*/ Cursor.prototype.beep = function () { this.enabled && this.write('\x07') return this } /** * Moves cursor to specific position */ Cursor.prototype.goto = function (x, y) { x = x | 0 y = y | 0 this.enabled && this.write(prefix + y + ';' + x + 'H') return this } /** * Resets the color. */ Colorer.prototype.reset = function () { this._setColorCode(this.base + 39) return this.cursor } /** * Resets all ANSI formatting on the stream. */ Cursor.prototype.reset = function () { this.enabled && this.write(prefix + '0' + suffix) this.Bold = false this.Italic = false this.Underline = false this.Inverse = false this.foreground.current = null this.background.current = null return this } /** * Sets the foreground color with the given RGB values. * The closest match out of the 216 colors is picked. */ Colorer.prototype.rgb = function (r, g, b) { var base = this.base + 38 , code = rgb(r, g, b) this._setColorCode(base + ';5;' + code) return this.cursor } /** * Same as `cursor.fg.rgb(r, g, b)`. */ Cursor.prototype.rgb = function (r, g, b) { return this.foreground.rgb(r, g, b) } /** * Accepts CSS color codes for use with ANSI escape codes. * For example: `#FF000` would be bright red. */ Colorer.prototype.hex = function (color) { return this.rgb.apply(this, hex(color)) } /** * Same as `cursor.fg.hex(color)`. */ Cursor.prototype.hex = function (color) { return this.foreground.hex(color) } // UTIL FUNCTIONS // /** * Translates a 255 RGB value to a 0-5 ANSI RGV value, * then returns the single ANSI color code to use. */ function rgb (r, g, b) { var red = r / 255 * 5 , green = g / 255 * 5 , blue = b / 255 * 5 return rgb5(red, green, blue) } /** * Turns rgb 0-5 values into a single ANSI color code to use. */ function rgb5 (r, g, b) { var red = Math.round(r) , green = Math.round(g) , blue = Math.round(b) return 16 + (red*36) + (green*6) + blue } /** * Accepts a hex CSS color code string (# is optional) and * translates it into an Array of 3 RGB 0-255 values, which * can then be used with rgb(). */ function hex (color) { var c = color[0] === '#' ? color.substring(1) : color , r = c.substring(0, 2) , g = c.substring(2, 4) , b = c.substring(4, 6) return [parseInt(r, 16), parseInt(g, 16), parseInt(b, 16)] } /** * Turns an array-like object into a real array. */ function toArray (a) { var i = 0 , l = a.length , rtn = [] for (; i 0) { var len = data.length , i = 0 // now try to calculate any deltas if (typeof data == 'string') { for (; i 100% are not allowed. Triggers a `change` event. * tracker.finish() Marks this tracker as finished, tracker.completed() will now be 1. Triggers a `change` event. TrackerStream ============= * var tracker = new TrackerStream(**name**, **size**, **options**) * **name** *(optional)* The name of this counter to report in change events. Defaults to undefined. * **size** *(optional)* The number of bytes being sent through this stream. * **options** *(optional)* A hash of stream options The tracker stream object is a pass through stream that updates an internal tracker object each time a block passes through. It's intended to track downloads, file extraction and other related activities. You use it by piping your data source into it and then using it as your data source. If your data has a length attribute then that's used as the amount of work completed when the chunk is passed through. If it does not (eg, object streams) then each chunk counts as completing 1 unit of work, so your size should be the total number of objects being streamed. 
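For example, a minimal sketch that reports progress while copying a file (the file names here are placeholders):

```js
var fs = require('fs')
var TrackerStream = require('are-we-there-yet').TrackerStream

var src = 'archive.tgz' // hypothetical input file
var tracker = new TrackerStream('copy', fs.statSync(src).size)

tracker.on('change', function (name) {
  // completed() is delegated through to the internal Tracker
  console.log(name + ': ' + (tracker.completed() * 100).toFixed(1) + '%')
})

fs.createReadStream(src).pipe(tracker).pipe(fs.createWriteStream(src + '.copy'))
```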
* tracker.addWork(**todo**) * **todo** Increase the expected overall size by **todo** bytes. Increases the amount of work to be done, thus decreasing the completion percentage. Triggers a `change` event. npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/index.js0000644000000000000000000000700412631326456024652 0ustar 00000000000000"use strict" var stream = require("readable-stream"); var EventEmitter = require("events").EventEmitter var util = require("util") var delegate = require("delegates") var TrackerGroup = exports.TrackerGroup = function (name) { EventEmitter.call(this) this.name = name this.trackGroup = [] var self = this this.totalWeight = 0 var noteChange = this.noteChange = function (name) { self.emit("change", name || this.name) }.bind(this) this.trackGroup.forEach(function(unit) { unit.on("change", noteChange) }) } util.inherits(TrackerGroup, EventEmitter) TrackerGroup.prototype.completed = function () { if (this.trackGroup.length==0) return 0 var valPerWeight = 1 / this.totalWeight var completed = 0 this.trackGroup.forEach(function(T) { completed += valPerWeight * T.weight * T.completed() }) return completed } TrackerGroup.prototype.addUnit = function (unit, weight, noChange) { unit.weight = weight || 1 this.totalWeight += unit.weight this.trackGroup.push(unit) unit.on("change", this.noteChange) if (! noChange) this.emit("change", this.name) return unit } TrackerGroup.prototype.newGroup = function (name, weight) { return this.addUnit(new TrackerGroup(name), weight) } TrackerGroup.prototype.newItem = function (name, todo, weight) { return this.addUnit(new Tracker(name, todo), weight) } TrackerGroup.prototype.newStream = function (name, todo, weight) { return this.addUnit(new TrackerStream(name, todo), weight) } TrackerGroup.prototype.finish = function () { if (! this.trackGroup.length) { this.addUnit(new Tracker(), 1, true) } var self = this this.trackGroup.forEach(function(T) { T.removeListener("change", self.noteChange) T.finish() }) this.emit("change", this.name) } var buffer = " " TrackerGroup.prototype.debug = function (depth) { depth = depth || 0 var indent = depth ? buffer.substr(0,depth) : "" var output = indent + (this.name||"top") + ": " + this.completed() + "\n" this.trackGroup.forEach(function(T) { if (T instanceof TrackerGroup) { output += T.debug(depth + 1) } else { output += indent + " " + T.name + ": " + T.completed() + "\n" } }) return output } var Tracker = exports.Tracker = function (name,todo) { EventEmitter.call(this) this.name = name this.workDone = 0 this.workTodo = todo || 0 } util.inherits(Tracker, EventEmitter) Tracker.prototype.completed = function () { return this.workTodo==0 ? 0 : this.workDone / this.workTodo } Tracker.prototype.addWork = function (work) { this.workTodo += work this.emit("change", this.name) } Tracker.prototype.completeWork = function (work) { this.workDone += work if (this.workDone > this.workTodo) this.workDone = this.workTodo this.emit("change", this.name) } Tracker.prototype.finish = function () { this.workTodo = this.workDone = 1 this.emit("change", this.name) } var TrackerStream = exports.TrackerStream = function (name, size, options) { stream.Transform.call(this, options) this.tracker = new Tracker(name, size) this.name = name var self = this this.tracker.on("change", function (name) { self.emit("change", name) }) } util.inherits(TrackerStream, stream.Transform) TrackerStream.prototype._transform = function (data, encoding, cb) { this.tracker.completeWork(data.length ? 
data.length : 1) this.push(data) cb() } TrackerStream.prototype._flush = function (cb) { this.tracker.finish() cb() } delegate(TrackerStream.prototype, "tracker") .method("completed") .method("addWork") npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/0000755000000000000000000000000012631326456025661 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/package.json0000644000000000000000000000256312631326456025500 0ustar 00000000000000{ "name": "are-we-there-yet", "version": "1.0.4", "description": "Keep track of the overall completion of many dispirate processes", "main": "index.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/iarna/are-we-there-yet.git" }, "author": { "name": "Rebecca Turner", "url": "http://re-becca.org" }, "license": "ISC", "bugs": { "url": "https://github.com/iarna/are-we-there-yet/issues" }, "homepage": "https://github.com/iarna/are-we-there-yet", "devDependencies": { "tap": "^0.4.13" }, "dependencies": { "delegates": "^0.1.0", "readable-stream": "^1.1.13" }, "gitHead": "7ce414849b81ab83935a935275def01914821bde", "_id": "are-we-there-yet@1.0.4", "_shasum": "527fe389f7bcba90806106b99244eaa07e886f85", "_from": "are-we-there-yet@>=1.0.0 <1.1.0", "_npmVersion": "2.0.0", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" } ], "dist": { "shasum": "527fe389f7bcba90806106b99244eaa07e886f85", "tarball": "http://registry.npmjs.org/are-we-there-yet/-/are-we-there-yet-1.0.4.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/are-we-there-yet/-/are-we-there-yet-1.0.4.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/test/0000755000000000000000000000000012631326456024163 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/0000755000000000000000000000000012631326456027616 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/0000755000000000000000000000000012631326456030711 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/.npmignore0000644000000000000000000000001612631326456031612 0ustar 00000000000000node_modules/ npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/History.md0000644000000000000000000000034412631326456031602 0ustar 00000000000000 0.1.0 / 2014-10-17 ================== * adds `.fluent()` to api 0.0.3 / 2014-01-13 ================== * fix receiver for .method() 0.0.2 / 2014-01-13 ================== * Object.defineProperty() sucks * Initial commit npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/Makefile0000644000000000000000000000014412631326456031255 0ustar 00000000000000 test: @./node_modules/.bin/mocha \ --require should \ --reporter spec \ --bail .PHONY: testnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/Readme.md0000644000000000000000000000335012631326456031336 0ustar 00000000000000 # delegates Node method and accessor delegation utilty. ## Installation ``` $ npm install delegates ``` ## Example ```js var delegate = require('delegates'); ... 
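// (For illustration only: assume `proto` is an object prototype whose
// instances each carry a `request` property -- the calls below forward
// request accessors onto the host object.)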
delegate(proto, 'request') .method('acceptsLanguages') .method('acceptsEncodings') .method('acceptsCharsets') .method('accepts') .method('is') .access('querystring') .access('idempotent') .access('socket') .access('length') .access('query') .access('search') .access('status') .access('method') .access('path') .access('body') .access('host') .access('url') .getter('subdomains') .getter('protocol') .getter('header') .getter('stale') .getter('fresh') .getter('secure') .getter('ips') .getter('ip') ``` # API ## Delegate(proto, prop) Creates a delegator instance used to configure using the `prop` on the given `proto` object. (which is usually a prototype) ## Delegate#method(name) Allows the given method `name` to be accessed on the host. ## Delegate#getter(name) Creates a "getter" for the property with the given `name` on the delegated object. ## Delegate#setter(name) Creates a "setter" for the property with the given `name` on the delegated object. ## Delegate#access(name) Creates an "accessor" (ie: both getter *and* setter) for the property with the given `name` on the delegated object. ## Delegate#fluent(name) A unique type of "accessor" that works for a "fluent" API. When called as a getter, the method returns the expected value. However, if the method is called with a value, it will return itself so it can be chained. For example: ```js delegate(proto, 'request') .fluent('query') // getter var q = request.query(); // setter (chainable) request .query({ a: 1 }) .query({ b: 2 }); ``` # License MIT npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/index.js0000644000000000000000000000402112631326456031260 0ustar 00000000000000 /** * Expose `Delegator`. */ module.exports = Delegator; /** * Initialize a delegator. * * @param {Object} proto * @param {String} target * @api public */ function Delegator(proto, target) { if (!(this instanceof Delegator)) return new Delegator(proto, target); this.proto = proto; this.target = target; this.methods = []; this.getters = []; this.setters = []; this.fluents = []; } /** * Delegate method `name`. * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.method = function(name){ var proto = this.proto; var target = this.target; this.methods.push(name); proto[name] = function(){ return this[target][name].apply(this[target], arguments); }; return this; }; /** * Delegator accessor `name`. * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.access = function(name){ return this.getter(name).setter(name); }; /** * Delegator getter `name`. * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.getter = function(name){ var proto = this.proto; var target = this.target; this.getters.push(name); proto.__defineGetter__(name, function(){ return this[target][name]; }); return this; }; /** * Delegator setter `name`. 
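 * Defines a setter on the host object that assigns straight through to the same-named property on the delegated target.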
* * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.setter = function(name){ var proto = this.proto; var target = this.target; this.setters.push(name); proto.__defineSetter__(name, function(val){ return this[target][name] = val; }); return this; }; /** * Delegator fluent accessor * * @param {String} name * @return {Delegator} self * @api public */ Delegator.prototype.fluent = function (name) { var proto = this.proto; var target = this.target; this.fluents.push(name); proto[name] = function(val){ if ('undefined' != typeof val) { this[target][name] = val; return this; } else { return this[target][name]; } }; return this; }; npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/package.json0000644000000000000000000000233112631326456032103 0ustar 00000000000000{ "name": "delegates", "version": "0.1.0", "repository": { "type": "git", "url": "git://github.com/visionmedia/node-delegates.git" }, "description": "delegate methods and accessors to another property", "keywords": [ "delegate", "delegation" ], "dependencies": {}, "devDependencies": { "mocha": "*", "should": "*" }, "license": "MIT", "bugs": { "url": "https://github.com/visionmedia/node-delegates/issues" }, "homepage": "https://github.com/visionmedia/node-delegates", "_id": "delegates@0.1.0", "_shasum": "b4b57be11a1653517a04b27f0949bdc327dfe390", "_from": "delegates@>=0.1.0 <0.2.0", "_npmVersion": "1.4.9", "_npmUser": { "name": "dominicbarnes", "email": "dominic@dbarnes.info" }, "maintainers": [ { "name": "tjholowaychuk", "email": "tj@vision-media.ca" }, { "name": "dominicbarnes", "email": "dominic@dbarnes.info" } ], "dist": { "shasum": "b4b57be11a1653517a04b27f0949bdc327dfe390", "tarball": "http://registry.npmjs.org/delegates/-/delegates-0.1.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/delegates/-/delegates-0.1.0.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/test/0000755000000000000000000000000012631326456030575 5ustar 00000000000000././@LongLink0000000000000000000000000000014600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/test/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/delegates/test/index.j0000644000000000000000000000337012631326456032062 0ustar 00000000000000 var assert = require('assert'); var delegate = require('..'); describe('.method(name)', function(){ it('should delegate methods', function(){ var obj = {}; obj.request = { foo: function(bar){ assert(this == obj.request); return bar; } }; delegate(obj, 'request').method('foo'); obj.foo('something').should.equal('something'); }) }) describe('.getter(name)', function(){ it('should delegate getters', function(){ var obj = {}; obj.request = { get type() { return 'text/html'; } } delegate(obj, 'request').getter('type'); obj.type.should.equal('text/html'); }) }) describe('.setter(name)', function(){ it('should delegate setters', function(){ var obj = {}; obj.request = { get type() { return this._type.toUpperCase(); }, set type(val) { this._type = val; } } delegate(obj, 'request').setter('type'); obj.type = 'hey'; obj.request.type.should.equal('HEY'); }) }) describe('.access(name)', function(){ it('should delegate getters and setters', function(){ var obj = {}; obj.request = { get type() { return this._type.toUpperCase(); }, set type(val) { this._type = val; } } delegate(obj, 'request').access('type'); obj.type = 'hey'; obj.type.should.equal('HEY'); }) }) describe('.fluent(name)', function () { it('should delegate in a fluent fashion', function () { var obj = { settings: { env: 'development' } }; delegate(obj, 'settings').fluent('env'); obj.env().should.equal('development'); obj.env('production').should.equal(obj); obj.settings.env.should.equal('production'); }) }) ././@LongLink0000000000000000000000000000015100000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/.npmignorenpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/.npmig0000644000000000000000000000004412631326456032022 0ustar 00000000000000build/ test/ examples/ fs.js zlib.js././@LongLink0000000000000000000000000000014600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/LICENS0000644000000000000000000000211012631326456031603 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000015000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/README0000644000000000000000000000243012631326456031570 0ustar 00000000000000# readable-stream ***Node-core streams for userland*** [![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/) [![NPM](https://nodei.co/npm-dl/readable-stream.png&months=6&height=3)](https://nodei.co/npm/readable-stream/) This package is a mirror of the Streams2 and Streams3 implementations in Node-core. If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core. **readable-stream** comes in two major versions, v1.0.x and v1.1.x. The former tracks the Streams2 implementation in Node 0.10, including bug-fixes and minor improvements as they are added. The latter tracks Streams3 as it develops in Node 0.11; we will likely see a v1.2.x branch for Node 0.12. **readable-stream** uses proper patch-level versioning so if you pin to `"~1.0.0"` you’ll get the latest Node 0.10 Streams2 implementation, including any fixes and minor non-breaking improvements. The patch-level versions of 1.0.x and 1.1.x should mirror the patch-level versions of Node-core releases. 
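Concretely, that pinning strategy looks like this in practice (a sketch; the pushed data is arbitrary):

```js
// package.json: "dependencies": { "readable-stream": "~1.0.0" }  // Streams2 (Node 0.10 line)
// or:           "dependencies": { "readable-stream": "~1.1.0" }  // Streams3 (Node 0.11 line)

// Then require readable-stream everywhere instead of core 'stream':
var Readable = require('readable-stream').Readable

var r = new Readable()
r._read = function () {} // no-op; data is pushed manually below
r.push('hello\n')
r.push(null) // signal EOF
r.pipe(process.stdout)
```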
You should prefer the **1.0.x** releases for now and when you’re ready to start using Streams3, pin to `"~1.1.0"` ././@LongLink0000000000000000000000000000015000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/duplex.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/duplex0000644000000000000000000000006412631326456032135 0ustar 00000000000000module.exports = require("./lib/_stream_duplex.js") ././@LongLink0000000000000000000000000000015200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/float.patchnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/float.0000644000000000000000000007374312631326456032035 0ustar 00000000000000diff --git a/lib/_stream_duplex.js b/lib/_stream_duplex.js index c5a741c..a2e0d8e 100644 --- a/lib/_stream_duplex.js +++ b/lib/_stream_duplex.js @@ -26,8 +26,8 @@ module.exports = Duplex; var util = require('util'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('./_stream_readable'); +var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); diff --git a/lib/_stream_passthrough.js b/lib/_stream_passthrough.js index a5e9864..330c247 100644 --- a/lib/_stream_passthrough.js +++ b/lib/_stream_passthrough.js @@ -25,7 +25,7 @@ module.exports = PassThrough; -var Transform = require('_stream_transform'); +var Transform = require('./_stream_transform'); var util = require('util'); util.inherits(PassThrough, Transform); diff --git a/lib/_stream_readable.js b/lib/_stream_readable.js index 0c3fe3e..90a8298 100644 --- a/lib/_stream_readable.js +++ b/lib/_stream_readable.js @@ -23,10 +23,34 @@ module.exports = Readable; Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; +if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { + return emitter.listeners(type).length; +}; + +if (!global.setImmediate) global.setImmediate = function setImmediate(fn) { + return setTimeout(fn, 0); +}; +if (!global.clearImmediate) global.clearImmediate = function clearImmediate(i) { + return clearTimeout(i); +}; + var Stream = require('stream'); var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} var StringDecoder; -var debug = util.debuglog('stream'); +var debug; +if (util.debuglog) + debug = util.debuglog('stream'); +else try { + debug = require('debuglog')('stream'); +} catch (er) { + debug = function() {}; +} util.inherits(Readable, Stream); @@ -380,7 +404,7 @@ function chunkInvalid(state, chunk) { function onEofChunk(stream, state) { - if (state.decoder && !state.ended) { + if (state.decoder && !state.ended && state.decoder.end) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); diff --git a/lib/_stream_transform.js b/lib/_stream_transform.js index b1f9fcc..b0caf57 100644 --- a/lib/_stream_transform.js +++ b/lib/_stream_transform.js @@ -64,8 +64,14 @@ module.exports = Transform; -var Duplex = require('_stream_duplex'); +var Duplex = require('./_stream_duplex'); var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} util.inherits(Transform, Duplex); diff --git a/lib/_stream_writable.js 
b/lib/_stream_writable.js index ba2e920..f49288b 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -27,6 +27,12 @@ module.exports = Writable; Writable.WritableState = WritableState; var util = require('util'); +if (!util.isUndefined) { + var utilIs = require('core-util-is'); + for (var f in utilIs) { + util[f] = utilIs[f]; + } +} var Stream = require('stream'); util.inherits(Writable, Stream); @@ -119,7 +125,7 @@ function WritableState(options, stream) { function Writable(options) { // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. - if (!(this instanceof Writable) && !(this instanceof Stream.Duplex)) + if (!(this instanceof Writable) && !(this instanceof require('./_stream_duplex'))) return new Writable(options); this._writableState = new WritableState(options, this); diff --git a/test/simple/test-stream-big-push.js b/test/simple/test-stream-big-push.js index e3787e4..8cd2127 100644 --- a/test/simple/test-stream-big-push.js +++ b/test/simple/test-stream-big-push.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var str = 'asdfasdfasdfasdfasdf'; var r = new stream.Readable({ diff --git a/test/simple/test-stream-end-paused.js b/test/simple/test-stream-end-paused.js index bb73777..d40efc7 100644 --- a/test/simple/test-stream-end-paused.js +++ b/test/simple/test-stream-end-paused.js @@ -25,7 +25,7 @@ var gotEnd = false; // Make sure we don't miss the end event for paused 0-length streams -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var stream = new Readable(); var calledRead = false; stream._read = function() { diff --git a/test/simple/test-stream-pipe-after-end.js b/test/simple/test-stream-pipe-after-end.js index b46ee90..0be8366 100644 --- a/test/simple/test-stream-pipe-after-end.js +++ b/test/simple/test-stream-pipe-after-end.js @@ -22,8 +22,8 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var util = require('util'); util.inherits(TestReadable, Readable); diff --git a/test/simple/test-stream-pipe-cleanup.js b/test/simple/test-stream-pipe-cleanup.js deleted file mode 100644 index f689358..0000000 --- a/test/simple/test-stream-pipe-cleanup.js +++ /dev/null @@ -1,122 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. - -// This test asserts that Stream.prototype.pipe does not leave listeners -// hanging on the source or dest. - -var common = require('../common'); -var stream = require('stream'); -var assert = require('assert'); -var util = require('util'); - -function Writable() { - this.writable = true; - this.endCalls = 0; - stream.Stream.call(this); -} -util.inherits(Writable, stream.Stream); -Writable.prototype.end = function() { - this.endCalls++; -}; - -Writable.prototype.destroy = function() { - this.endCalls++; -}; - -function Readable() { - this.readable = true; - stream.Stream.call(this); -} -util.inherits(Readable, stream.Stream); - -function Duplex() { - this.readable = true; - Writable.call(this); -} -util.inherits(Duplex, Writable); - -var i = 0; -var limit = 100; - -var w = new Writable(); - -var r; - -for (i = 0; i < limit; i++) { - r = new Readable(); - r.pipe(w); - r.emit('end'); -} -assert.equal(0, r.listeners('end').length); -assert.equal(limit, w.endCalls); - -w.endCalls = 0; - -for (i = 0; i < limit; i++) { - r = new Readable(); - r.pipe(w); - r.emit('close'); -} -assert.equal(0, r.listeners('close').length); -assert.equal(limit, w.endCalls); - -w.endCalls = 0; - -r = new Readable(); - -for (i = 0; i < limit; i++) { - w = new Writable(); - r.pipe(w); - w.emit('close'); -} -assert.equal(0, w.listeners('close').length); - -r = new Readable(); -w = new Writable(); -var d = new Duplex(); -r.pipe(d); // pipeline A -d.pipe(w); // pipeline B -assert.equal(r.listeners('end').length, 2); // A.onend, A.cleanup -assert.equal(r.listeners('close').length, 2); // A.onclose, A.cleanup -assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup -assert.equal(d.listeners('close').length, 3); // A.cleanup, B.onclose, B.cleanup -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 1); // B.cleanup - -r.emit('end'); -assert.equal(d.endCalls, 1); -assert.equal(w.endCalls, 0); -assert.equal(r.listeners('end').length, 0); -assert.equal(r.listeners('close').length, 0); -assert.equal(d.listeners('end').length, 2); // B.onend, B.cleanup -assert.equal(d.listeners('close').length, 2); // B.onclose, B.cleanup -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 1); // B.cleanup - -d.emit('end'); -assert.equal(d.endCalls, 1); -assert.equal(w.endCalls, 1); -assert.equal(r.listeners('end').length, 0); -assert.equal(r.listeners('close').length, 0); -assert.equal(d.listeners('end').length, 0); -assert.equal(d.listeners('close').length, 0); -assert.equal(w.listeners('end').length, 0); -assert.equal(w.listeners('close').length, 0); diff --git a/test/simple/test-stream-pipe-error-handling.js b/test/simple/test-stream-pipe-error-handling.js index c5d724b..c7d6b7d 100644 --- a/test/simple/test-stream-pipe-error-handling.js +++ b/test/simple/test-stream-pipe-error-handling.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var Stream = require('stream').Stream; +var Stream = require('../../').Stream; (function testErrorListenerCatches() { var source = new Stream(); diff --git a/test/simple/test-stream-pipe-event.js b/test/simple/test-stream-pipe-event.js index cb9d5fe..56f8d61 100644 --- a/test/simple/test-stream-pipe-event.js +++ 
b/test/simple/test-stream-pipe-event.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common'); -var stream = require('stream'); +var stream = require('../../'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream-push-order.js b/test/simple/test-stream-push-order.js index f2e6ec2..a5c9bf9 100644 --- a/test/simple/test-stream-push-order.js +++ b/test/simple/test-stream-push-order.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var assert = require('assert'); var s = new Readable({ diff --git a/test/simple/test-stream-push-strings.js b/test/simple/test-stream-push-strings.js index 06f43dc..1701a9a 100644 --- a/test/simple/test-stream-push-strings.js +++ b/test/simple/test-stream-push-strings.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var util = require('util'); util.inherits(MyStream, Readable); diff --git a/test/simple/test-stream-readable-event.js b/test/simple/test-stream-readable-event.js index ba6a577..a8e6f7b 100644 --- a/test/simple/test-stream-readable-event.js +++ b/test/simple/test-stream-readable-event.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; (function first() { // First test, not reading when the readable is added. diff --git a/test/simple/test-stream-readable-flow-recursion.js b/test/simple/test-stream-readable-flow-recursion.js index 2891ad6..11689ba 100644 --- a/test/simple/test-stream-readable-flow-recursion.js +++ b/test/simple/test-stream-readable-flow-recursion.js @@ -27,7 +27,7 @@ var assert = require('assert'); // more data continuously, but without triggering a nextTick // warning or RangeError. -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; // throw an error if we trigger a nextTick warning. process.throwDeprecation = true; diff --git a/test/simple/test-stream-unshift-empty-chunk.js b/test/simple/test-stream-unshift-empty-chunk.js index 0c96476..7827538 100644 --- a/test/simple/test-stream-unshift-empty-chunk.js +++ b/test/simple/test-stream-unshift-empty-chunk.js @@ -24,7 +24,7 @@ var assert = require('assert'); // This test verifies that stream.unshift(Buffer(0)) or // stream.unshift('') does not set state.reading=false. -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable(); var nChunks = 10; diff --git a/test/simple/test-stream-unshift-read-race.js b/test/simple/test-stream-unshift-read-race.js index 83fd9fa..17c18aa 100644 --- a/test/simple/test-stream-unshift-read-race.js +++ b/test/simple/test-stream-unshift-read-race.js @@ -29,7 +29,7 @@ var assert = require('assert'); // 3. push() after the EOF signaling null is an error. // 4. _read() is not called after pushing the EOF null chunk. -var stream = require('stream'); +var stream = require('../../'); var hwm = 10; var r = stream.Readable({ highWaterMark: hwm }); var chunks = 10; @@ -51,7 +51,14 @@ r._read = function(n) { function push(fast) { assert(!pushedNull, 'push() after null push'); - var c = pos >= data.length ? 
null : data.slice(pos, pos + n); + var c; + if (pos >= data.length) + c = null; + else { + if (n + pos > data.length) + n = data.length - pos; + c = data.slice(pos, pos + n); + } pushedNull = c === null; if (fast) { pos += n; diff --git a/test/simple/test-stream-writev.js b/test/simple/test-stream-writev.js index 5b49e6e..b5321f3 100644 --- a/test/simple/test-stream-writev.js +++ b/test/simple/test-stream-writev.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var queue = []; for (var decode = 0; decode < 2; decode++) { diff --git a/test/simple/test-stream2-basic.js b/test/simple/test-stream2-basic.js index 3814bf0..248c1be 100644 --- a/test/simple/test-stream2-basic.js +++ b/test/simple/test-stream2-basic.js @@ -21,7 +21,7 @@ var common = require('../common.js'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream2-compatibility.js b/test/simple/test-stream2-compatibility.js index 6cdd4e9..f0fa84b 100644 --- a/test/simple/test-stream2-compatibility.js +++ b/test/simple/test-stream2-compatibility.js @@ -21,7 +21,7 @@ var common = require('../common.js'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream2-finish-pipe.js b/test/simple/test-stream2-finish-pipe.js index 39b274f..006a19b 100644 --- a/test/simple/test-stream2-finish-pipe.js +++ b/test/simple/test-stream2-finish-pipe.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var stream = require('stream'); +var stream = require('../../'); var Buffer = require('buffer').Buffer; var r = new stream.Readable(); diff --git a/test/simple/test-stream2-fs.js b/test/simple/test-stream2-fs.js deleted file mode 100644 index e162406..0000000 --- a/test/simple/test-stream2-fs.js +++ /dev/null @@ -1,72 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. 
- - -var common = require('../common.js'); -var R = require('_stream_readable'); -var assert = require('assert'); - -var fs = require('fs'); -var FSReadable = fs.ReadStream; - -var path = require('path'); -var file = path.resolve(common.fixturesDir, 'x1024.txt'); - -var size = fs.statSync(file).size; - -var expectLengths = [1024]; - -var util = require('util'); -var Stream = require('stream'); - -util.inherits(TestWriter, Stream); - -function TestWriter() { - Stream.apply(this); - this.buffer = []; - this.length = 0; -} - -TestWriter.prototype.write = function(c) { - this.buffer.push(c.toString()); - this.length += c.length; - return true; -}; - -TestWriter.prototype.end = function(c) { - if (c) this.buffer.push(c.toString()); - this.emit('results', this.buffer); -} - -var r = new FSReadable(file); -var w = new TestWriter(); - -w.on('results', function(res) { - console.error(res, w.length); - assert.equal(w.length, size); - var l = 0; - assert.deepEqual(res.map(function (c) { - return c.length; - }), expectLengths); - console.log('ok'); -}); - -r.pipe(w); diff --git a/test/simple/test-stream2-httpclient-response-end.js b/test/simple/test-stream2-httpclient-response-end.js deleted file mode 100644 index 15cffc2..0000000 --- a/test/simple/test-stream2-httpclient-response-end.js +++ /dev/null @@ -1,52 +0,0 @@ -// Copyright Joyent, Inc. and other Node contributors. -// -// Permission is hereby granted, free of charge, to any person obtaining a -// copy of this software and associated documentation files (the -// "Software"), to deal in the Software without restriction, including -// without limitation the rights to use, copy, modify, merge, publish, -// distribute, sublicense, and/or sell copies of the Software, and to permit -// persons to whom the Software is furnished to do so, subject to the -// following conditions: -// -// The above copyright notice and this permission notice shall be included -// in all copies or substantial portions of the Software. -// -// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS -// OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF -// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN -// NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, -// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR -// OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE -// USE OR OTHER DEALINGS IN THE SOFTWARE. 
- -var common = require('../common.js'); -var assert = require('assert'); -var http = require('http'); -var msg = 'Hello'; -var readable_event = false; -var end_event = false; -var server = http.createServer(function(req, res) { - res.writeHead(200, {'Content-Type': 'text/plain'}); - res.end(msg); -}).listen(common.PORT, function() { - http.get({port: common.PORT}, function(res) { - var data = ''; - res.on('readable', function() { - console.log('readable event'); - readable_event = true; - data += res.read(); - }); - res.on('end', function() { - console.log('end event'); - end_event = true; - assert.strictEqual(msg, data); - server.close(); - }); - }); -}); - -process.on('exit', function() { - assert(readable_event); - assert(end_event); -}); - diff --git a/test/simple/test-stream2-large-read-stall.js b/test/simple/test-stream2-large-read-stall.js index 2fbfbca..667985b 100644 --- a/test/simple/test-stream2-large-read-stall.js +++ b/test/simple/test-stream2-large-read-stall.js @@ -30,7 +30,7 @@ var PUSHSIZE = 20; var PUSHCOUNT = 1000; var HWM = 50; -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable({ highWaterMark: HWM }); @@ -39,23 +39,23 @@ var rs = r._readableState; r._read = push; r.on('readable', function() { - console.error('>> readable'); + //console.error('>> readable'); do { - console.error(' > read(%d)', READSIZE); + //console.error(' > read(%d)', READSIZE); var ret = r.read(READSIZE); - console.error(' < %j (%d remain)', ret && ret.length, rs.length); + //console.error(' < %j (%d remain)', ret && ret.length, rs.length); } while (ret && ret.length === READSIZE); - console.error('<< after read()', - ret && ret.length, - rs.needReadable, - rs.length); + //console.error('<< after read()', + // ret && ret.length, + // rs.needReadable, + // rs.length); }); var endEmitted = false; r.on('end', function() { endEmitted = true; - console.error('end'); + //console.error('end'); }); var pushes = 0; @@ -64,11 +64,11 @@ function push() { return; if (pushes++ === PUSHCOUNT) { - console.error(' push(EOF)'); + //console.error(' push(EOF)'); return r.push(null); } - console.error(' push #%d', pushes); + //console.error(' push #%d', pushes); if (r.push(new Buffer(PUSHSIZE))) setTimeout(push); } diff --git a/test/simple/test-stream2-objects.js b/test/simple/test-stream2-objects.js index 3e6931d..ff47d89 100644 --- a/test/simple/test-stream2-objects.js +++ b/test/simple/test-stream2-objects.js @@ -21,8 +21,8 @@ var common = require('../common.js'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var assert = require('assert'); // tiny node-tap lookalike. 
diff --git a/test/simple/test-stream2-pipe-error-handling.js b/test/simple/test-stream2-pipe-error-handling.js index cf7531c..e3f3e4e 100644 --- a/test/simple/test-stream2-pipe-error-handling.js +++ b/test/simple/test-stream2-pipe-error-handling.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); (function testErrorListenerCatches() { var count = 1000; diff --git a/test/simple/test-stream2-pipe-error-once-listener.js b/test/simple/test-stream2-pipe-error-once-listener.js index 5e8e3cb..53b2616 100755 --- a/test/simple/test-stream2-pipe-error-once-listener.js +++ b/test/simple/test-stream2-pipe-error-once-listener.js @@ -24,7 +24,7 @@ var common = require('../common.js'); var assert = require('assert'); var util = require('util'); -var stream = require('stream'); +var stream = require('../../'); var Read = function() { diff --git a/test/simple/test-stream2-push.js b/test/simple/test-stream2-push.js index b63edc3..eb2b0e9 100644 --- a/test/simple/test-stream2-push.js +++ b/test/simple/test-stream2-push.js @@ -20,7 +20,7 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var stream = require('stream'); +var stream = require('../../'); var Readable = stream.Readable; var Writable = stream.Writable; var assert = require('assert'); diff --git a/test/simple/test-stream2-read-sync-stack.js b/test/simple/test-stream2-read-sync-stack.js index e8a7305..9740a47 100644 --- a/test/simple/test-stream2-read-sync-stack.js +++ b/test/simple/test-stream2-read-sync-stack.js @@ -21,7 +21,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; var r = new Readable(); var N = 256 * 1024; diff --git a/test/simple/test-stream2-readable-empty-buffer-no-eof.js b/test/simple/test-stream2-readable-empty-buffer-no-eof.js index cd30178..4b1659d 100644 --- a/test/simple/test-stream2-readable-empty-buffer-no-eof.js +++ b/test/simple/test-stream2-readable-empty-buffer-no-eof.js @@ -22,10 +22,9 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('stream').Readable; +var Readable = require('../../').Readable; test1(); -test2(); function test1() { var r = new Readable(); @@ -88,31 +87,3 @@ function test1() { console.log('ok'); }); } - -function test2() { - var r = new Readable({ encoding: 'base64' }); - var reads = 5; - r._read = function(n) { - if (!reads--) - return r.push(null); // EOF - else - return r.push(new Buffer('x')); - }; - - var results = []; - function flow() { - var chunk; - while (null !== (chunk = r.read())) - results.push(chunk + ''); - } - r.on('readable', flow); - r.on('end', function() { - results.push('EOF'); - }); - flow(); - - process.on('exit', function() { - assert.deepEqual(results, [ 'eHh4', 'eHg=', 'EOF' ]); - console.log('ok'); - }); -} diff --git a/test/simple/test-stream2-readable-from-list.js b/test/simple/test-stream2-readable-from-list.js index 7c96ffe..04a96f5 100644 --- a/test/simple/test-stream2-readable-from-list.js +++ b/test/simple/test-stream2-readable-from-list.js @@ -21,7 +21,7 @@ var assert = require('assert'); var common = require('../common.js'); -var fromList = require('_stream_readable')._fromList; +var fromList = require('../../lib/_stream_readable')._fromList; // tiny node-tap lookalike. 
var tests = []; diff --git a/test/simple/test-stream2-readable-legacy-drain.js b/test/simple/test-stream2-readable-legacy-drain.js index 675da8e..51fd3d5 100644 --- a/test/simple/test-stream2-readable-legacy-drain.js +++ b/test/simple/test-stream2-readable-legacy-drain.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Stream = require('stream'); +var Stream = require('../../'); var Readable = Stream.Readable; var r = new Readable(); diff --git a/test/simple/test-stream2-readable-non-empty-end.js b/test/simple/test-stream2-readable-non-empty-end.js index 7314ae7..c971898 100644 --- a/test/simple/test-stream2-readable-non-empty-end.js +++ b/test/simple/test-stream2-readable-non-empty-end.js @@ -21,7 +21,7 @@ var assert = require('assert'); var common = require('../common.js'); -var Readable = require('_stream_readable'); +var Readable = require('../../lib/_stream_readable'); var len = 0; var chunks = new Array(10); diff --git a/test/simple/test-stream2-readable-wrap-empty.js b/test/simple/test-stream2-readable-wrap-empty.js index 2e5cf25..fd8a3dc 100644 --- a/test/simple/test-stream2-readable-wrap-empty.js +++ b/test/simple/test-stream2-readable-wrap-empty.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); +var Readable = require('../../lib/_stream_readable'); var EE = require('events').EventEmitter; var oldStream = new EE(); diff --git a/test/simple/test-stream2-readable-wrap.js b/test/simple/test-stream2-readable-wrap.js index 90eea01..6b177f7 100644 --- a/test/simple/test-stream2-readable-wrap.js +++ b/test/simple/test-stream2-readable-wrap.js @@ -22,8 +22,8 @@ var common = require('../common'); var assert = require('assert'); -var Readable = require('_stream_readable'); -var Writable = require('_stream_writable'); +var Readable = require('../../lib/_stream_readable'); +var Writable = require('../../lib/_stream_writable'); var EE = require('events').EventEmitter; var testRuns = 0, completedRuns = 0; diff --git a/test/simple/test-stream2-set-encoding.js b/test/simple/test-stream2-set-encoding.js index 5d2c32a..685531b 100644 --- a/test/simple/test-stream2-set-encoding.js +++ b/test/simple/test-stream2-set-encoding.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var R = require('_stream_readable'); +var R = require('../../lib/_stream_readable'); var util = require('util'); // tiny node-tap lookalike. diff --git a/test/simple/test-stream2-transform.js b/test/simple/test-stream2-transform.js index 9c9ddd8..a0cacc6 100644 --- a/test/simple/test-stream2-transform.js +++ b/test/simple/test-stream2-transform.js @@ -21,8 +21,8 @@ var assert = require('assert'); var common = require('../common.js'); -var PassThrough = require('_stream_passthrough'); -var Transform = require('_stream_transform'); +var PassThrough = require('../../').PassThrough; +var Transform = require('../../').Transform; // tiny node-tap lookalike. 
var tests = []; diff --git a/test/simple/test-stream2-unpipe-drain.js b/test/simple/test-stream2-unpipe-drain.js index d66dc3c..365b327 100644 --- a/test/simple/test-stream2-unpipe-drain.js +++ b/test/simple/test-stream2-unpipe-drain.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var crypto = require('crypto'); var util = require('util'); diff --git a/test/simple/test-stream2-unpipe-leak.js b/test/simple/test-stream2-unpipe-leak.js index 99f8746..17c92ae 100644 --- a/test/simple/test-stream2-unpipe-leak.js +++ b/test/simple/test-stream2-unpipe-leak.js @@ -22,7 +22,7 @@ var common = require('../common.js'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var chunk = new Buffer('hallo'); diff --git a/test/simple/test-stream2-writable.js b/test/simple/test-stream2-writable.js index 704100c..209c3a6 100644 --- a/test/simple/test-stream2-writable.js +++ b/test/simple/test-stream2-writable.js @@ -20,8 +20,8 @@ // USE OR OTHER DEALINGS IN THE SOFTWARE. var common = require('../common.js'); -var W = require('_stream_writable'); -var D = require('_stream_duplex'); +var W = require('../../').Writable; +var D = require('../../').Duplex; var assert = require('assert'); var util = require('util'); diff --git a/test/simple/test-stream3-pause-then-read.js b/test/simple/test-stream3-pause-then-read.js index b91bde3..2f72c15 100644 --- a/test/simple/test-stream3-pause-then-read.js +++ b/test/simple/test-stream3-pause-then-read.js @@ -22,7 +22,7 @@ var common = require('../common'); var assert = require('assert'); -var stream = require('stream'); +var stream = require('../../'); var Readable = stream.Readable; var Writable = stream.Writable; npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/0000755000000000000000000000000012631326456031457 5ustar 00000000000000././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000755000000000000000000000000012631326456032073 5ustar 00000000000000././@LongLink0000000000000000000000000000015300000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/packag0000644000000000000000000000327212631326456032066 0ustar 00000000000000{ "name": "readable-stream", "version": "1.1.13", "description": "Streams3, a user-land copy of the stream library from Node.js v0.11.x", "main": "readable.js", "dependencies": { "core-util-is": "~1.0.0", "isarray": "0.0.1", "string_decoder": "~0.10.x", "inherits": "~2.0.1" }, "devDependencies": { "tap": "~0.2.6" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/readable-stream.git" }, "keywords": [ "readable", "stream", "pipe" ], "browser": { "util": false }, "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "gitHead": "3b672fd7ae92acf5b4ffdbabf74b372a0a56b051", "bugs": { "url": "https://github.com/isaacs/readable-stream/issues" }, "homepage": "https://github.com/isaacs/readable-stream", "_id": "readable-stream@1.1.13", "_shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e", "_from": "readable-stream@>=1.1.13 <2.0.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" }, { "name": "tootallnate", "email": "nathan@tootallnate.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "f6eef764f514c89e2b9e23146a75ba106756d23e", "tarball": "http://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-1.1.13.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000015500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/passthrough.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/passth0000644000000000000000000000007112631326456032134 0ustar 00000000000000module.exports = require("./lib/_stream_passthrough.js") ././@LongLink0000000000000000000000000000015200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/readable.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/readab0000644000000000000000000000055112631326456032053 0ustar 00000000000000exports = module.exports = require('./lib/_stream_readable.js'); exports.Stream = require('stream'); exports.Readable = exports; exports.Writable = require('./lib/_stream_writable.js'); exports.Duplex = require('./lib/_stream_duplex.js'); exports.Transform = require('./lib/_stream_transform.js'); exports.PassThrough = require('./lib/_stream_passthrough.js'); ././@LongLink0000000000000000000000000000015300000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/transform.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/transf0000644000000000000000000000006712631326456032134 0ustar 00000000000000module.exports = require("./lib/_stream_transform.js") ././@LongLink0000000000000000000000000000015200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/writable.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/writab0000644000000000000000000000006612631326456032126 0ustar 00000000000000module.exports = require("./lib/_stream_writable.js") ././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_duplex.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_s0000644000000000000000000000537312631326456032013 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. 
// // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // a duplex stream is just a stream that is both readable and writable. // Since JS doesn't have multiple prototypal inheritance, this class // prototypally inherits from Readable, and then parasitically from // Writable. module.exports = Duplex; /**/ var objectKeys = Object.keys || function (obj) { var keys = []; for (var key in obj) keys.push(key); return keys; } /**/ /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var Readable = require('./_stream_readable'); var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); forEach(objectKeys(Writable.prototype), function(method) { if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method]; }); function Duplex(options) { if (!(this instanceof Duplex)) return new Duplex(options); Readable.call(this, options); Writable.call(this, options); if (options && options.readable === false) this.readable = false; if (options && options.writable === false) this.writable = false; this.allowHalfOpen = true; if (options && options.allowHalfOpen === false) this.allowHalfOpen = false; this.once('end', onend); } // the no-half-open enforcer function onend() { // if we allow half-open state, or if the writable side ended, // then we're ok. if (this.allowHalfOpen || this._writableState.ended) return; // no more data can be written. // But allow more writes to happen in this tick. process.nextTick(this.end.bind(this)); } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_passthrough.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_s0000644000000000000000000000327712631326456032014 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. 
// // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // a passthrough stream. // basically just the most minimal sort of Transform stream. // Every written chunk gets output as-is. module.exports = PassThrough; var Transform = require('./_stream_transform'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(PassThrough, Transform); function PassThrough(options) { if (!(this instanceof PassThrough)) return new PassThrough(options); Transform.call(this, options); } PassThrough.prototype._transform = function(chunk, encoding, cb) { cb(null, chunk); }; ././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_readable.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_s0000644000000000000000000006254712631326456032021 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. 
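// ---------------------------------------------------------------------------
// Illustrative sketch (not part of the vendored file): the PassThrough class
// defined above echoes every written chunk, unchanged, to the readable side.
// Assumes this package resolves as 'readable-stream'.
//
// var PassThrough = require('readable-stream').PassThrough;
// var pt = new PassThrough();
// pt.on('data', function(chunk) {
//   console.log('got: %s', chunk); // each written chunk comes out as-is
// });
// pt.write('hello');
// pt.end('world');
// ---------------------------------------------------------------------------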
module.exports = Readable; /**/ var isArray = require('isarray'); /**/ /**/ var Buffer = require('buffer').Buffer; /**/ Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; /**/ if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { return emitter.listeners(type).length; }; /**/ var Stream = require('stream'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var StringDecoder; /**/ var debug = require('util'); if (debug && debug.debuglog) { debug = debug.debuglog('stream'); } else { debug = function () {}; } /**/ util.inherits(Readable, Stream); function ReadableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // the point at which it stops calling _read() to fill the buffer // Note: 0 is a valid value, means "don't call _read preemptively ever" var hwm = options.highWaterMark; var defaultHwm = options.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.buffer = []; this.length = 0; this.pipes = null; this.pipesCount = 0; this.flowing = null; this.ended = false; this.endEmitted = false; this.reading = false; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // whenever we return null, then we set a flag to say // that we're awaiting a 'readable' event emission. this.needReadable = false; this.emittedReadable = false; this.readableListening = false; // object stream flag. Used to make read(n) ignore n and to // make all the buffer merging and length checks go away this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.readableObjectMode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // when piping, we only care about 'readable' events that happen // after read()ing all the bytes and not getting any pushback. this.ranOut = false; // the number of writers that are awaiting a drain event in .pipe()s this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled this.readingMore = false; this.decoder = null; this.encoding = null; if (options.encoding) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this.decoder = new StringDecoder(options.encoding); this.encoding = options.encoding; } } function Readable(options) { var Duplex = require('./_stream_duplex'); if (!(this instanceof Readable)) return new Readable(options); this._readableState = new ReadableState(options, this); // legacy this.readable = true; Stream.call(this); } // Manually shove something into the read() buffer. // This returns true if the highWaterMark has not been hit yet, // similar to how Writable.write() returns true if you should // write() some more. 
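// ---------------------------------------------------------------------------
// Illustrative sketch (not part of the vendored file): push() as a
// backpressure signal. A _read() implementation keeps pushing until push()
// returns false, i.e. until the internal buffer reaches highWaterMark.
// Assumes this package resolves as 'readable-stream'.
//
// var Readable = require('readable-stream').Readable;
// var r = new Readable({ highWaterMark: 4 });
// var sent = 0;
// r._read = function() {
//   var more = true;
//   while (more && sent < 10)
//     more = r.push(new Buffer([sent++])); // false once the buffer is full
//   if (sent >= 10) r.push(null);          // signal EOF
// };
// r.on('data', function(chunk) { console.log('byte', chunk[0]); });
// ---------------------------------------------------------------------------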
Readable.prototype.push = function(chunk, encoding) { var state = this._readableState; if (util.isString(chunk) && !state.objectMode) { encoding = encoding || state.defaultEncoding; if (encoding !== state.encoding) { chunk = new Buffer(chunk, encoding); encoding = ''; } } return readableAddChunk(this, state, chunk, encoding, false); }; // Unshift should *always* be something directly out of read() Readable.prototype.unshift = function(chunk) { var state = this._readableState; return readableAddChunk(this, state, chunk, '', true); }; function readableAddChunk(stream, state, chunk, encoding, addToFront) { var er = chunkInvalid(state, chunk); if (er) { stream.emit('error', er); } else if (util.isNullOrUndefined(chunk)) { state.reading = false; if (!state.ended) onEofChunk(stream, state); } else if (state.objectMode || chunk && chunk.length > 0) { if (state.ended && !addToFront) { var e = new Error('stream.push() after EOF'); stream.emit('error', e); } else if (state.endEmitted && addToFront) { var e = new Error('stream.unshift() after end event'); stream.emit('error', e); } else { if (state.decoder && !addToFront && !encoding) chunk = state.decoder.write(chunk); if (!addToFront) state.reading = false; // if we want the data now, just emit it. if (state.flowing && state.length === 0 && !state.sync) { stream.emit('data', chunk); stream.read(0); } else { // update the buffer info. state.length += state.objectMode ? 1 : chunk.length; if (addToFront) state.buffer.unshift(chunk); else state.buffer.push(chunk); if (state.needReadable) emitReadable(stream); } maybeReadMore(stream, state); } } else if (!addToFront) { state.reading = false; } return needMoreData(state); } // if it's past the high water mark, we can push in some more. // Also, if we have no data yet, we can stand some // more bytes. This is to work around cases where hwm=0, // such as the repl. Also, if the push() triggered a // readable event, and the user called read(largeNumber) such that // needReadable was set, then we ought to push more, so that another // 'readable' event will be triggered. function needMoreData(state) { return !state.ended && (state.needReadable || state.length < state.highWaterMark || state.length === 0); } // backwards compatibility. Readable.prototype.setEncoding = function(enc) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this._readableState.decoder = new StringDecoder(enc); this._readableState.encoding = enc; return this; }; // Don't raise the hwm > 128MB var MAX_HWM = 0x800000; function roundUpToNextPowerOf2(n) { if (n >= MAX_HWM) { n = MAX_HWM; } else { // Get the next highest power of 2 n--; for (var p = 1; p < 32; p <<= 1) n |= n >> p; n++; } return n; } function howMuchToRead(n, state) { if (state.length === 0 && state.ended) return 0; if (state.objectMode) return n === 0 ? 0 : 1; if (isNaN(n) || util.isNull(n)) { // only flow one buffer at a time if (state.flowing && state.buffer.length) return state.buffer[0].length; else return state.length; } if (n <= 0) return 0; // If we're asking for more than the target buffer level, // then raise the water mark. Bump up to the next highest // power of 2, to prevent increasing it excessively in tiny // amounts. if (n > state.highWaterMark) state.highWaterMark = roundUpToNextPowerOf2(n); // don't have that much. return null, unless we've ended. 
if (n > state.length) { if (!state.ended) { state.needReadable = true; return 0; } else return state.length; } return n; } // you can override either this method, or the async _read(n) below. Readable.prototype.read = function(n) { debug('read', n); var state = this._readableState; var nOrig = n; if (!util.isNumber(n) || n > 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we // already have a bunch of data in the buffer, then just trigger // the 'readable' event and move on. if (n === 0 && state.needReadable && (state.length >= state.highWaterMark || state.ended)) { debug('read: emitReadable', state.length, state.ended); if (state.length === 0 && state.ended) endReadable(this); else emitReadable(this); return null; } n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up. if (n === 0 && state.ended) { if (state.length === 0) endReadable(this); return null; } // All the actual chunk generation logic needs to be // *below* the call to _read. The reason is that in certain // synthetic stream cases, such as passthrough streams, _read // may be a completely synchronous operation which may change // the state of the read buffer, providing enough data when // before there was *not* enough. // // So, the steps are: // 1. Figure out what the state of things will be after we do // a read from the buffer. // // 2. If that resulting state will trigger a _read, then call _read. // Note that this may be asynchronous, or synchronous. Yes, it is // deeply ugly to write APIs this way, but that still doesn't mean // that the Readable class should behave improperly, as streams are // designed to be sync/async agnostic. // Take note if the _read call is sync or async (ie, if the read call // has returned yet), so that we know whether or not it's safe to emit // 'readable' etc. // // 3. Actually pull the requested chunks out of the buffer and return. // if we need a readable event, then we need to do some reading. var doRead = state.needReadable; debug('need readable', doRead); // if we currently have less than the highWaterMark, then also read some if (state.length === 0 || state.length - n < state.highWaterMark) { doRead = true; debug('length less than watermark', doRead); } // however, if we've ended, then there's no point, and if we're already // reading, then it's unnecessary. if (state.ended || state.reading) { doRead = false; debug('reading or ended', doRead); } if (doRead) { debug('do read'); state.reading = true; state.sync = true; // if the length is currently zero, then we *need* a readable event. if (state.length === 0) state.needReadable = true; // call internal read method this._read(state.highWaterMark); state.sync = false; } // If _read pushed data synchronously, then `reading` will be false, // and we need to re-evaluate how much data we can return to the user. if (doRead && !state.reading) n = howMuchToRead(nOrig, state); var ret; if (n > 0) ret = fromList(n, state); else ret = null; if (util.isNull(ret)) { state.needReadable = true; n = 0; } state.length -= n; // If we have nothing in the buffer, then we want to know // as soon as we *do* get something into the buffer. if (state.length === 0 && !state.ended) state.needReadable = true; // If we tried to read() past the EOF, then emit end on the next tick. 
if (nOrig !== n && state.ended && state.length === 0) endReadable(this); if (!util.isNull(ret)) this.emit('data', ret); return ret; }; function chunkInvalid(state, chunk) { var er = null; if (!util.isBuffer(chunk) && !util.isString(chunk) && !util.isNullOrUndefined(chunk) && !state.objectMode) { er = new TypeError('Invalid non-string/buffer chunk'); } return er; } function onEofChunk(stream, state) { if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); state.length += state.objectMode ? 1 : chunk.length; } } state.ended = true; // emit 'readable' now to make sure it gets picked up. emitReadable(stream); } // Don't emit readable right away in sync mode, because this can trigger // another read() call => stack overflow. This way, it might trigger // a nextTick recursion warning, but that's not so bad. function emitReadable(stream) { var state = stream._readableState; state.needReadable = false; if (!state.emittedReadable) { debug('emitReadable', state.flowing); state.emittedReadable = true; if (state.sync) process.nextTick(function() { emitReadable_(stream); }); else emitReadable_(stream); } } function emitReadable_(stream) { debug('emit readable'); stream.emit('readable'); flow(stream); } // at this point, the user has presumably seen the 'readable' event, // and called read() to consume some data. that may have triggered // in turn another _read(n) call, in which case reading = true if // it's in progress. // However, if we're not ended, or reading, and the length < hwm, // then go ahead and try to read some more preemptively. function maybeReadMore(stream, state) { if (!state.readingMore) { state.readingMore = true; process.nextTick(function() { maybeReadMore_(stream, state); }); } } function maybeReadMore_(stream, state) { var len = state.length; while (!state.reading && !state.flowing && !state.ended && state.length < state.highWaterMark) { debug('maybeReadMore read 0'); stream.read(0); if (len === state.length) // didn't get any data, stop spinning. break; else len = state.length; } state.readingMore = false; } // abstract method. to be overridden in specific implementation classes. // call cb(er, data) where data is <= n in length. // for virtual (non-string, non-buffer) streams, "length" is somewhat // arbitrary, and perhaps not very meaningful. Readable.prototype._read = function(n) { this.emit('error', new Error('not implemented')); }; Readable.prototype.pipe = function(dest, pipeOpts) { var src = this; var state = this._readableState; switch (state.pipesCount) { case 0: state.pipes = dest; break; case 1: state.pipes = [state.pipes, dest]; break; default: state.pipes.push(dest); break; } state.pipesCount += 1; debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts); var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr; var endFn = doEnd ? onend : cleanup; if (state.endEmitted) process.nextTick(endFn); else src.once('end', endFn); dest.on('unpipe', onunpipe); function onunpipe(readable) { debug('onunpipe'); if (readable === src) { cleanup(); } } function onend() { debug('onend'); dest.end(); } // when the dest drains, it reduces the awaitDrain counter // on the source. This would be more elegant with a .once() // handler in flow(), but adding and removing repeatedly is // too slow. 
var ondrain = pipeOnDrain(src); dest.on('drain', ondrain); function cleanup() { debug('cleanup'); // cleanup event handlers once the pipe is broken dest.removeListener('close', onclose); dest.removeListener('finish', onfinish); dest.removeListener('drain', ondrain); dest.removeListener('error', onerror); dest.removeListener('unpipe', onunpipe); src.removeListener('end', onend); src.removeListener('end', cleanup); src.removeListener('data', ondata); // if the reader is waiting for a drain event from this // specific writer, then it would cause it to never start // flowing again. // So, if this is awaiting a drain, then we just call it now. // If we don't know, then assume that we are waiting for one. if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain(); } src.on('data', ondata); function ondata(chunk) { debug('ondata'); var ret = dest.write(chunk); if (false === ret) { debug('false write response, pause', src._readableState.awaitDrain); src._readableState.awaitDrain++; src.pause(); } } // if the dest has an error, then stop piping into it. // however, don't suppress the throwing behavior for this. function onerror(er) { debug('onerror', er); unpipe(); dest.removeListener('error', onerror); if (EE.listenerCount(dest, 'error') === 0) dest.emit('error', er); } // This is a brutally ugly hack to make sure that our error handler // is attached before any userland ones. NEVER DO THIS. if (!dest._events || !dest._events.error) dest.on('error', onerror); else if (isArray(dest._events.error)) dest._events.error.unshift(onerror); else dest._events.error = [onerror, dest._events.error]; // Both close and finish should trigger unpipe, but only once. function onclose() { dest.removeListener('finish', onfinish); unpipe(); } dest.once('close', onclose); function onfinish() { debug('onfinish'); dest.removeListener('close', onclose); unpipe(); } dest.once('finish', onfinish); function unpipe() { debug('unpipe'); src.unpipe(dest); } // tell the dest that it's being piped to dest.emit('pipe', src); // start the flow if it hasn't been started already. if (!state.flowing) { debug('pipe resume'); src.resume(); } return dest; }; function pipeOnDrain(src) { return function() { var state = src._readableState; debug('pipeOnDrain', state.awaitDrain); if (state.awaitDrain) state.awaitDrain--; if (state.awaitDrain === 0 && EE.listenerCount(src, 'data')) { state.flowing = true; flow(src); } }; } Readable.prototype.unpipe = function(dest) { var state = this._readableState; // if we're not piping anywhere, then do nothing. if (state.pipesCount === 0) return this; // just one destination. most common case. if (state.pipesCount === 1) { // passed in one, but it's not the right one. if (dest && dest !== state.pipes) return this; if (!dest) dest = state.pipes; // got a match. state.pipes = null; state.pipesCount = 0; state.flowing = false; if (dest) dest.emit('unpipe', this); return this; } // slow case. multiple pipe destinations. if (!dest) { // remove all. var dests = state.pipes; var len = state.pipesCount; state.pipes = null; state.pipesCount = 0; state.flowing = false; for (var i = 0; i < len; i++) dests[i].emit('unpipe', this); return this; } // try to find the right one. 
var i = indexOf(state.pipes, dest); if (i === -1) return this; state.pipes.splice(i, 1); state.pipesCount -= 1; if (state.pipesCount === 1) state.pipes = state.pipes[0]; dest.emit('unpipe', this); return this; }; // set up data events if they are asked for // Ensure readable listeners eventually get something Readable.prototype.on = function(ev, fn) { var res = Stream.prototype.on.call(this, ev, fn); // If listening to data, and it has not explicitly been paused, // then call resume to start the flow of data on the next tick. if (ev === 'data' && false !== this._readableState.flowing) { this.resume(); } if (ev === 'readable' && this.readable) { var state = this._readableState; if (!state.readableListening) { state.readableListening = true; state.emittedReadable = false; state.needReadable = true; if (!state.reading) { var self = this; process.nextTick(function() { debug('readable nexttick read 0'); self.read(0); }); } else if (state.length) { emitReadable(this, state); } } } return res; }; Readable.prototype.addListener = Readable.prototype.on; // pause() and resume() are remnants of the legacy readable stream API // If the user uses them, then switch into old mode. Readable.prototype.resume = function() { var state = this._readableState; if (!state.flowing) { debug('resume'); state.flowing = true; if (!state.reading) { debug('resume read 0'); this.read(0); } resume(this, state); } return this; }; function resume(stream, state) { if (!state.resumeScheduled) { state.resumeScheduled = true; process.nextTick(function() { resume_(stream, state); }); } } function resume_(stream, state) { state.resumeScheduled = false; stream.emit('resume'); flow(stream); if (state.flowing && !state.reading) stream.read(0); } Readable.prototype.pause = function() { debug('call pause flowing=%j', this._readableState.flowing); if (false !== this._readableState.flowing) { debug('pause'); this._readableState.flowing = false; this.emit('pause'); } return this; }; function flow(stream) { var state = stream._readableState; debug('flow', state.flowing); if (state.flowing) { do { var chunk = stream.read(); } while (null !== chunk && state.flowing); } } // wrap an old-style stream as the async data source. // This is *not* part of the readable stream interface. // It is an ugly unfortunate mess of history. Readable.prototype.wrap = function(stream) { var state = this._readableState; var paused = false; var self = this; stream.on('end', function() { debug('wrapped end'); if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) self.push(chunk); } self.push(null); }); stream.on('data', function(chunk) { debug('wrapped data'); if (state.decoder) chunk = state.decoder.write(chunk); if (!chunk || !state.objectMode && !chunk.length) return; var ret = self.push(chunk); if (!ret) { paused = true; stream.pause(); } }); // proxy all the other methods. // important when wrapping filters and duplexes. for (var i in stream) { if (util.isFunction(stream[i]) && util.isUndefined(this[i])) { this[i] = function(method) { return function() { return stream[method].apply(stream, arguments); }}(i); } } // proxy certain important events. var events = ['error', 'close', 'destroy', 'pause', 'resume']; forEach(events, function(ev) { stream.on(ev, self.emit.bind(self, ev)); }); // when we try to consume some more bytes, simply unpause the // underlying stream. 
self._read = function(n) { debug('wrapped _read', n); if (paused) { paused = false; stream.resume(); } }; return self; }; // exposed for testing purposes only. Readable._fromList = fromList; // Pluck off n bytes from an array of buffers. // Length is the combined lengths of all the buffers in the list. function fromList(n, state) { var list = state.buffer; var length = state.length; var stringMode = !!state.decoder; var objectMode = !!state.objectMode; var ret; // nothing in the list, definitely empty. if (list.length === 0) return null; if (length === 0) ret = null; else if (objectMode) ret = list.shift(); else if (!n || n >= length) { // read it all, truncate the array. if (stringMode) ret = list.join(''); else ret = Buffer.concat(list, length); list.length = 0; } else { // read just some of it. if (n < list[0].length) { // just take a part of the first list item. // slice is the same for buffers and strings. var buf = list[0]; ret = buf.slice(0, n); list[0] = buf.slice(n); } else if (n === list[0].length) { // first list is a perfect match ret = list.shift(); } else { // complex case. // we have enough to cover it, but it spans past the first buffer. if (stringMode) ret = ''; else ret = new Buffer(n); var c = 0; for (var i = 0, l = list.length; i < l && c < n; i++) { var buf = list[0]; var cpy = Math.min(n - c, buf.length); if (stringMode) ret += buf.slice(0, cpy); else buf.copy(ret, c, 0, cpy); if (cpy < buf.length) list[0] = buf.slice(cpy); else list.shift(); c += cpy; } } } return ret; } function endReadable(stream) { var state = stream._readableState; // If we get here before consuming all the bytes, then that is a // bug in node. Should never happen. if (state.length > 0) throw new Error('endReadable called on non-empty stream'); if (!state.endEmitted) { state.ended = true; process.nextTick(function() { // Check that we didn't get one last unshift. if (!state.endEmitted && state.length === 0) { state.endEmitted = true; stream.readable = false; stream.emit('end'); } }); } } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } function indexOf (xs, x) { for (var i = 0, l = xs.length; i < l; i++) { if (xs[i] === x) return i; } return -1; } ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_transform.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_s0000644000000000000000000001626612631326456032016 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // a transform stream is a readable/writable stream where you do // something with the data. Sometimes it's called a "filter", // but that's not a great name for it, since that implies a thing where // some bits pass through, and others are simply ignored. (That would // be a valid example of a transform, of course.) // // While the output is causally related to the input, it's not a // necessarily symmetric or synchronous transformation. For example, // a zlib stream might take multiple plain-text writes(), and then // emit a single compressed chunk some time in the future. // // Here's how this works: // // The Transform stream has all the aspects of the readable and writable // stream classes. When you write(chunk), that calls _write(chunk,cb) // internally, and returns false if there's a lot of pending writes // buffered up. When you call read(), that calls _read(n) until // there's enough pending readable data buffered up. // // In a transform stream, the written data is placed in a buffer. When // _read(n) is called, it transforms the queued up data, calling the // buffered _write cb's as it consumes chunks. If consuming a single // written chunk would result in multiple output chunks, then the first // outputted bit calls the readcb, and subsequent chunks just go into // the read buffer, and will cause it to emit 'readable' if necessary. // // This way, back-pressure is actually determined by the reading side, // since _read has to be called to start processing a new chunk. However, // a pathological inflate type of transform can cause excessive buffering // here. For example, imagine a stream where every byte of input is // interpreted as an integer from 0-255, and then results in that many // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in // 1kb of data being output. In this case, you could write a very small // amount of input, and end up with a very large amount of output. In // such a pathological inflating mechanism, there'd be no way to tell // the system to stop doing the transform. A single 4MB write could // cause the system to run out of memory. // // However, even in such a pathological case, only a single written chunk // would be consumed, and then the rest would wait (un-transformed) until // the results of the previous transformed chunk were consumed. 
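// ---------------------------------------------------------------------------
// Illustrative sketch (not part of the vendored file): a minimal Transform
// subclass. Each written chunk is uppercased and pushed to the readable
// side; calling cb() tells the machinery the written chunk was consumed.
// Assumes this package resolves as 'readable-stream'.
//
// var Transform = require('readable-stream').Transform;
// var util = require('util');
//
// util.inherits(Upper, Transform);
// function Upper(options) {
//   Transform.call(this, options);
// }
// Upper.prototype._transform = function(chunk, encoding, cb) {
//   this.push(chunk.toString().toUpperCase()); // zero or more push() calls
//   cb();                                      // done with this chunk
// };
//
// process.stdin.pipe(new Upper()).pipe(process.stdout);
// ---------------------------------------------------------------------------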
module.exports = Transform; var Duplex = require('./_stream_duplex'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(Transform, Duplex); function TransformState(options, stream) { this.afterTransform = function(er, data) { return afterTransform(stream, er, data); }; this.needTransform = false; this.transforming = false; this.writecb = null; this.writechunk = null; } function afterTransform(stream, er, data) { var ts = stream._transformState; ts.transforming = false; var cb = ts.writecb; if (!cb) return stream.emit('error', new Error('no writecb in Transform class')); ts.writechunk = null; ts.writecb = null; if (!util.isNullOrUndefined(data)) stream.push(data); if (cb) cb(er); var rs = stream._readableState; rs.reading = false; if (rs.needReadable || rs.length < rs.highWaterMark) { stream._read(rs.highWaterMark); } } function Transform(options) { if (!(this instanceof Transform)) return new Transform(options); Duplex.call(this, options); this._transformState = new TransformState(options, this); // when the writable side finishes, then flush out anything remaining. var stream = this; // start out asking for a readable event once data is transformed. this._readableState.needReadable = true; // we have implemented the _read method, and done the other things // that Readable wants before the first _read call, so unset the // sync guard flag. this._readableState.sync = false; this.once('prefinish', function() { if (util.isFunction(this._flush)) this._flush(function(er) { done(stream, er); }); else done(stream); }); } Transform.prototype.push = function(chunk, encoding) { this._transformState.needTransform = false; return Duplex.prototype.push.call(this, chunk, encoding); }; // This is the part where you do stuff! // override this function in implementation classes. // 'chunk' is an input chunk. // // Call `push(newChunk)` to pass along transformed output // to the readable side. You may call 'push' zero or more times. // // Call `cb(err)` when you are done with this chunk. If you pass // an error, then that'll put the hurt on the whole operation. If you // never call cb(), then you'll never get another chunk. Transform.prototype._transform = function(chunk, encoding, cb) { throw new Error('not implemented'); }; Transform.prototype._write = function(chunk, encoding, cb) { var ts = this._transformState; ts.writecb = cb; ts.writechunk = chunk; ts.writeencoding = encoding; if (!ts.transforming) { var rs = this._readableState; if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark); } }; // Doesn't matter what the args are here. // _transform does all the work. // That we got here means that the readable side wants more data. Transform.prototype._read = function(n) { var ts = this._transformState; if (!util.isNull(ts.writechunk) && ts.writecb && !ts.transforming) { ts.transforming = true; this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform); } else { // mark that we need a transform, so that any data that comes in // will get processed, now that we've asked for it. 
ts.needTransform = true; } }; function done(stream, er) { if (er) return stream.emit('error', er); // if there's nothing in the write buffer, then that means // that nothing more will ever be provided var ws = stream._writableState; var ts = stream._transformState; if (ws.length) throw new Error('calling transform done when ws.length != 0'); if (ts.transforming) throw new Error('calling transform done when still transforming'); return stream.push(null); } ././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_stream_writable.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/lib/_s0000644000000000000000000003141512631326456032007 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // A bit simpler than readable streams. // Implement an async ._write(chunk, cb), and it'll handle all // the drain event emission and buffering. module.exports = Writable; /**/ var Buffer = require('buffer').Buffer; /**/ Writable.WritableState = WritableState; /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var Stream = require('stream'); util.inherits(Writable, Stream); function WriteReq(chunk, encoding, cb) { this.chunk = chunk; this.encoding = encoding; this.callback = cb; } function WritableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // the point at which write() starts returning false // Note: 0 is a valid value, means that we always return false if // the entire buffer is not flushed immediately on write() var hwm = options.highWaterMark; var defaultHwm = options.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // object stream flag to indicate whether or not this stream // contains buffers or objects. this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.needDrain = false; // at the start of calling end() this.ending = false; // when end() has been called, and returned this.ended = false; // when 'finish' is emitted this.finished = false; // should we decode strings into buffers before passing to _write? 
// this is here so that some node-core streams can optimize string // handling at a lower level. var noDecode = options.decodeStrings === false; this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement // of how much we're waiting to get pushed to some underlying // socket or file. this.length = 0; // a flag to see when we're in the middle of a write. this.writing = false; // when true all writes will be buffered until .uncork() call this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // a flag to know if we're processing previously buffered items, which // may call the _write() callback in the same tick, so that we don't // end up in an overlapped onwrite situation. this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb) this.onwrite = function(er) { onwrite(stream, er); }; // the callback that the user supplies to write(chunk,encoding,cb) this.writecb = null; // the amount that is being written when _write is called. this.writelen = 0; this.buffer = []; // number of pending user-supplied write callbacks // this must be 0 before 'finish' can be emitted this.pendingcb = 0; // emit prefinish if the only thing we're waiting for is _write cbs // This is relevant for synchronous Transform streams this.prefinished = false; // True if the error was already emitted and should not be thrown again this.errorEmitted = false; } function Writable(options) { var Duplex = require('./_stream_duplex'); // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. if (!(this instanceof Writable) && !(this instanceof Duplex)) return new Writable(options); this._writableState = new WritableState(options, this); // legacy. this.writable = true; Stream.call(this); } // Otherwise people can pipe Writable streams, which is just wrong. Writable.prototype.pipe = function() { this.emit('error', new Error('Cannot pipe. Not readable.')); }; function writeAfterEnd(stream, state, cb) { var er = new Error('write after end'); // TODO: defer error events consistently everywhere, not just the cb stream.emit('error', er); process.nextTick(function() { cb(er); }); } // If we get something that is not a buffer, string, null, or undefined, // and we're not in objectMode, then that's an error. // Otherwise stream chunks are all considered to be of length=1, and the // watermarks determine how many objects to keep in the buffer, rather than // how many bytes or characters. 
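// ---------------------------------------------------------------------------
// Illustrative sketch (not part of the vendored file): in objectMode the
// watermark counts pending objects rather than bytes, so write() starts
// returning false once highWaterMark objects are buffered. Assumes this
// package resolves as 'readable-stream'.
//
// var Writable = require('readable-stream').Writable;
// var w = new Writable({ objectMode: true, highWaterMark: 2 });
// w._write = function(obj, encoding, cb) {
//   console.log('flushed', obj);
//   setImmediate(cb); // pretend the underlying flush is asynchronous
// };
// console.log(w.write({ n: 1 })); // true: below the watermark
// console.log(w.write({ n: 2 })); // false: 2 objects now pending
// ---------------------------------------------------------------------------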
function validChunk(stream, state, chunk, cb) { var valid = true; if (!util.isBuffer(chunk) && !util.isString(chunk) && !util.isNullOrUndefined(chunk) && !state.objectMode) { var er = new TypeError('Invalid non-string/buffer chunk'); stream.emit('error', er); process.nextTick(function() { cb(er); }); valid = false; } return valid; } Writable.prototype.write = function(chunk, encoding, cb) { var state = this._writableState; var ret = false; if (util.isFunction(encoding)) { cb = encoding; encoding = null; } if (util.isBuffer(chunk)) encoding = 'buffer'; else if (!encoding) encoding = state.defaultEncoding; if (!util.isFunction(cb)) cb = function() {}; if (state.ended) writeAfterEnd(this, state, cb); else if (validChunk(this, state, chunk, cb)) { state.pendingcb++; ret = writeOrBuffer(this, state, chunk, encoding, cb); } return ret; }; Writable.prototype.cork = function() { var state = this._writableState; state.corked++; }; Writable.prototype.uncork = function() { var state = this._writableState; if (state.corked) { state.corked--; if (!state.writing && !state.corked && !state.finished && !state.bufferProcessing && state.buffer.length) clearBuffer(this, state); } }; function decodeChunk(state, chunk, encoding) { if (!state.objectMode && state.decodeStrings !== false && util.isString(chunk)) { chunk = new Buffer(chunk, encoding); } return chunk; } // if we're already writing something, then just put this // in the queue, and wait our turn. Otherwise, call _write // If we return false, then we need a drain event, so set that flag. function writeOrBuffer(stream, state, chunk, encoding, cb) { chunk = decodeChunk(state, chunk, encoding); if (util.isBuffer(chunk)) encoding = 'buffer'; var len = state.objectMode ? 1 : chunk.length; state.length += len; var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false. 
if (!ret) state.needDrain = true; if (state.writing || state.corked) state.buffer.push(new WriteReq(chunk, encoding, cb)); else doWrite(stream, state, false, len, chunk, encoding, cb); return ret; } function doWrite(stream, state, writev, len, chunk, encoding, cb) { state.writelen = len; state.writecb = cb; state.writing = true; state.sync = true; if (writev) stream._writev(chunk, state.onwrite); else stream._write(chunk, encoding, state.onwrite); state.sync = false; } function onwriteError(stream, state, sync, er, cb) { if (sync) process.nextTick(function() { state.pendingcb--; cb(er); }); else { state.pendingcb--; cb(er); } stream._writableState.errorEmitted = true; stream.emit('error', er); } function onwriteStateUpdate(state) { state.writing = false; state.writecb = null; state.length -= state.writelen; state.writelen = 0; } function onwrite(stream, er) { var state = stream._writableState; var sync = state.sync; var cb = state.writecb; onwriteStateUpdate(state); if (er) onwriteError(stream, state, sync, er, cb); else { // Check if we're actually ready to finish, but don't emit yet var finished = needFinish(stream, state); if (!finished && !state.corked && !state.bufferProcessing && state.buffer.length) { clearBuffer(stream, state); } if (sync) { process.nextTick(function() { afterWrite(stream, state, finished, cb); }); } else { afterWrite(stream, state, finished, cb); } } } function afterWrite(stream, state, finished, cb) { if (!finished) onwriteDrain(stream, state); state.pendingcb--; cb(); finishMaybe(stream, state); } // Must force callback to be called on nextTick, so that we don't // emit 'drain' before the write() consumer gets the 'false' return // value, and has a chance to attach a 'drain' listener. function onwriteDrain(stream, state) { if (state.length === 0 && state.needDrain) { state.needDrain = false; stream.emit('drain'); } } // if there's something in the buffer waiting, then process it function clearBuffer(stream, state) { state.bufferProcessing = true; if (stream._writev && state.buffer.length > 1) { // Fast case, write everything using _writev() var cbs = []; for (var c = 0; c < state.buffer.length; c++) cbs.push(state.buffer[c].callback); // count the one we are adding, as well. // TODO(isaacs) clean this up state.pendingcb++; doWrite(stream, state, true, state.length, state.buffer, '', function(err) { for (var i = 0; i < cbs.length; i++) { state.pendingcb--; cbs[i](err); } }); // Clear buffer state.buffer = []; } else { // Slow case, write chunks one-by-one for (var c = 0; c < state.buffer.length; c++) { var entry = state.buffer[c]; var chunk = entry.chunk; var encoding = entry.encoding; var cb = entry.callback; var len = state.objectMode ? 1 : chunk.length; doWrite(stream, state, false, len, chunk, encoding, cb); // if we didn't call the onwrite immediately, then // it means that we need to wait until it does. // also, that means that the chunk and cb are currently // being processed, so move the buffer counter past them. 
if (state.writing) { c++; break; } } if (c < state.buffer.length) state.buffer = state.buffer.slice(c); else state.buffer.length = 0; } state.bufferProcessing = false; } Writable.prototype._write = function(chunk, encoding, cb) { cb(new Error('not implemented')); }; Writable.prototype._writev = null; Writable.prototype.end = function(chunk, encoding, cb) { var state = this._writableState; if (util.isFunction(chunk)) { cb = chunk; chunk = null; encoding = null; } else if (util.isFunction(encoding)) { cb = encoding; encoding = null; } if (!util.isNullOrUndefined(chunk)) this.write(chunk, encoding); // .end() fully uncorks if (state.corked) { state.corked = 1; this.uncork(); } // ignore unnecessary end() calls. if (!state.ending && !state.finished) endWritable(this, state, cb); }; function needFinish(stream, state) { return (state.ending && state.length === 0 && !state.finished && !state.writing); } function prefinish(stream, state) { if (!state.prefinished) { state.prefinished = true; stream.emit('prefinish'); } } function finishMaybe(stream, state) { var need = needFinish(stream, state); if (need) { if (state.pendingcb === 0) { prefinish(stream, state); state.finished = true; stream.emit('finish'); } else prefinish(stream, state); } return need; } function endWritable(stream, state, cb) { state.ending = true; finishMaybe(stream, state); if (cb) { if (state.finished) process.nextTick(cb); else stream.once('finish', cb); } state.ended = true; } ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000755000000000000000000000000012631326456032073 5ustar 00000000000000././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000755000000000000000000000000012631326456032073 5ustar 00000000000000././@LongLink0000000000000000000000000000017300000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000755000000000000000000000000012631326456032073 5ustar 00000000000000././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000010312631326456032067 0ustar 00000000000000# core-util-is The `util.is*` functions introduced in Node v0.12. 
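A minimal usage sketch (assuming the conventional exports such as
`isString`, `isBuffer`, and `isNullOrUndefined`, which are the ones this
package's dependents call):

    var util = require('core-util-is');
    util.isString('hi');            // true
    util.isBuffer(new Buffer(0));   // true
    util.isNullOrUndefined(null);   // true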
././@LongLink0000000000000000000000000000020400000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/float.patchnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000003762612631326456032113 0ustar 00000000000000diff --git a/lib/util.js b/lib/util.js index a03e874..9074e8e 100644 --- a/lib/util.js +++ b/lib/util.js @@ -19,430 +19,6 @@ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. -var formatRegExp = /%[sdj%]/g; -exports.format = function(f) { - if (!isString(f)) { - var objects = []; - for (var i = 0; i < arguments.length; i++) { - objects.push(inspect(arguments[i])); - } - return objects.join(' '); - } - - var i = 1; - var args = arguments; - var len = args.length; - var str = String(f).replace(formatRegExp, function(x) { - if (x === '%%') return '%'; - if (i >= len) return x; - switch (x) { - case '%s': return String(args[i++]); - case '%d': return Number(args[i++]); - case '%j': - try { - return JSON.stringify(args[i++]); - } catch (_) { - return '[Circular]'; - } - default: - return x; - } - }); - for (var x = args[i]; i < len; x = args[++i]) { - if (isNull(x) || !isObject(x)) { - str += ' ' + x; - } else { - str += ' ' + inspect(x); - } - } - return str; -}; - - -// Mark that a method should not be used. -// Returns a modified function which warns once by default. -// If --no-deprecation is set, then it is a no-op. -exports.deprecate = function(fn, msg) { - // Allow for deprecating things in the process of starting up. - if (isUndefined(global.process)) { - return function() { - return exports.deprecate(fn, msg).apply(this, arguments); - }; - } - - if (process.noDeprecation === true) { - return fn; - } - - var warned = false; - function deprecated() { - if (!warned) { - if (process.throwDeprecation) { - throw new Error(msg); - } else if (process.traceDeprecation) { - console.trace(msg); - } else { - console.error(msg); - } - warned = true; - } - return fn.apply(this, arguments); - } - - return deprecated; -}; - - -var debugs = {}; -var debugEnviron; -exports.debuglog = function(set) { - if (isUndefined(debugEnviron)) - debugEnviron = process.env.NODE_DEBUG || ''; - set = set.toUpperCase(); - if (!debugs[set]) { - if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { - var pid = process.pid; - debugs[set] = function() { - var msg = exports.format.apply(exports, arguments); - console.error('%s %d: %s', set, pid, msg); - }; - } else { - debugs[set] = function() {}; - } - } - return debugs[set]; -}; - - -/** - * Echos the value of a value. Trys to print the value out - * in the best way possible given the different types. - * - * @param {Object} obj The object to print out. - * @param {Object} opts Optional options object that alters the output. - */ -/* legacy: obj, showHidden, depth, colors*/ -function inspect(obj, opts) { - // default options - var ctx = { - seen: [], - stylize: stylizeNoColor - }; - // legacy... - if (arguments.length >= 3) ctx.depth = arguments[2]; - if (arguments.length >= 4) ctx.colors = arguments[3]; - if (isBoolean(opts)) { - // legacy... 
- ctx.showHidden = opts; - } else if (opts) { - // got an "options" object - exports._extend(ctx, opts); - } - // set default options - if (isUndefined(ctx.showHidden)) ctx.showHidden = false; - if (isUndefined(ctx.depth)) ctx.depth = 2; - if (isUndefined(ctx.colors)) ctx.colors = false; - if (isUndefined(ctx.customInspect)) ctx.customInspect = true; - if (ctx.colors) ctx.stylize = stylizeWithColor; - return formatValue(ctx, obj, ctx.depth); -} -exports.inspect = inspect; - - -// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics -inspect.colors = { - 'bold' : [1, 22], - 'italic' : [3, 23], - 'underline' : [4, 24], - 'inverse' : [7, 27], - 'white' : [37, 39], - 'grey' : [90, 39], - 'black' : [30, 39], - 'blue' : [34, 39], - 'cyan' : [36, 39], - 'green' : [32, 39], - 'magenta' : [35, 39], - 'red' : [31, 39], - 'yellow' : [33, 39] -}; - -// Don't use 'blue' not visible on cmd.exe -inspect.styles = { - 'special': 'cyan', - 'number': 'yellow', - 'boolean': 'yellow', - 'undefined': 'grey', - 'null': 'bold', - 'string': 'green', - 'date': 'magenta', - // "name": intentionally not styling - 'regexp': 'red' -}; - - -function stylizeWithColor(str, styleType) { - var style = inspect.styles[styleType]; - - if (style) { - return '\u001b[' + inspect.colors[style][0] + 'm' + str + - '\u001b[' + inspect.colors[style][1] + 'm'; - } else { - return str; - } -} - - -function stylizeNoColor(str, styleType) { - return str; -} - - -function arrayToHash(array) { - var hash = {}; - - array.forEach(function(val, idx) { - hash[val] = true; - }); - - return hash; -} - - -function formatValue(ctx, value, recurseTimes) { - // Provide a hook for user-specified inspect functions. - // Check that value is an object with an inspect function on it - if (ctx.customInspect && - value && - isFunction(value.inspect) && - // Filter out the util module, it's inspect function is special - value.inspect !== exports.inspect && - // Also filter out any prototype objects using the circular check. - !(value.constructor && value.constructor.prototype === value)) { - var ret = value.inspect(recurseTimes, ctx); - if (!isString(ret)) { - ret = formatValue(ctx, ret, recurseTimes); - } - return ret; - } - - // Primitive types cannot have properties - var primitive = formatPrimitive(ctx, value); - if (primitive) { - return primitive; - } - - // Look up the keys of the object. - var keys = Object.keys(value); - var visibleKeys = arrayToHash(keys); - - if (ctx.showHidden) { - keys = Object.getOwnPropertyNames(value); - } - - // Some type of object without properties can be shortcutted. - if (keys.length === 0) { - if (isFunction(value)) { - var name = value.name ? ': ' + value.name : ''; - return ctx.stylize('[Function' + name + ']', 'special'); - } - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } - if (isDate(value)) { - return ctx.stylize(Date.prototype.toString.call(value), 'date'); - } - if (isError(value)) { - return formatError(value); - } - } - - var base = '', array = false, braces = ['{', '}']; - - // Make Array say that they are Array - if (isArray(value)) { - array = true; - braces = ['[', ']']; - } - - // Make functions say that they are functions - if (isFunction(value)) { - var n = value.name ? 
': ' + value.name : ''; - base = ' [Function' + n + ']'; - } - - // Make RegExps say that they are RegExps - if (isRegExp(value)) { - base = ' ' + RegExp.prototype.toString.call(value); - } - - // Make dates with properties first say the date - if (isDate(value)) { - base = ' ' + Date.prototype.toUTCString.call(value); - } - - // Make error with message first say the error - if (isError(value)) { - base = ' ' + formatError(value); - } - - if (keys.length === 0 && (!array || value.length == 0)) { - return braces[0] + base + braces[1]; - } - - if (recurseTimes < 0) { - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } else { - return ctx.stylize('[Object]', 'special'); - } - } - - ctx.seen.push(value); - - var output; - if (array) { - output = formatArray(ctx, value, recurseTimes, visibleKeys, keys); - } else { - output = keys.map(function(key) { - return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array); - }); - } - - ctx.seen.pop(); - - return reduceToSingleString(output, base, braces); -} - - -function formatPrimitive(ctx, value) { - if (isUndefined(value)) - return ctx.stylize('undefined', 'undefined'); - if (isString(value)) { - var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '') - .replace(/'/g, "\\'") - .replace(/\\"/g, '"') + '\''; - return ctx.stylize(simple, 'string'); - } - if (isNumber(value)) { - // Format -0 as '-0'. Strict equality won't distinguish 0 from -0, - // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 . - if (value === 0 && 1 / value < 0) - return ctx.stylize('-0', 'number'); - return ctx.stylize('' + value, 'number'); - } - if (isBoolean(value)) - return ctx.stylize('' + value, 'boolean'); - // For some reason typeof null is "object", so special case here. 
- if (isNull(value)) - return ctx.stylize('null', 'null'); -} - - -function formatError(value) { - return '[' + Error.prototype.toString.call(value) + ']'; -} - - -function formatArray(ctx, value, recurseTimes, visibleKeys, keys) { - var output = []; - for (var i = 0, l = value.length; i < l; ++i) { - if (hasOwnProperty(value, String(i))) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - String(i), true)); - } else { - output.push(''); - } - } - keys.forEach(function(key) { - if (!key.match(/^\d+$/)) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - key, true)); - } - }); - return output; -} - - -function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) { - var name, str, desc; - desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] }; - if (desc.get) { - if (desc.set) { - str = ctx.stylize('[Getter/Setter]', 'special'); - } else { - str = ctx.stylize('[Getter]', 'special'); - } - } else { - if (desc.set) { - str = ctx.stylize('[Setter]', 'special'); - } - } - if (!hasOwnProperty(visibleKeys, key)) { - name = '[' + key + ']'; - } - if (!str) { - if (ctx.seen.indexOf(desc.value) < 0) { - if (isNull(recurseTimes)) { - str = formatValue(ctx, desc.value, null); - } else { - str = formatValue(ctx, desc.value, recurseTimes - 1); - } - if (str.indexOf('\n') > -1) { - if (array) { - str = str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n').substr(2); - } else { - str = '\n' + str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n'); - } - } - } else { - str = ctx.stylize('[Circular]', 'special'); - } - } - if (isUndefined(name)) { - if (array && key.match(/^\d+$/)) { - return str; - } - name = JSON.stringify('' + key); - if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) { - name = name.substr(1, name.length - 2); - name = ctx.stylize(name, 'name'); - } else { - name = name.replace(/'/g, "\\'") - .replace(/\\"/g, '"') - .replace(/(^"|"$)/g, "'"); - name = ctx.stylize(name, 'string'); - } - } - - return name + ': ' + str; -} - - -function reduceToSingleString(output, base, braces) { - var numLinesEst = 0; - var length = output.reduce(function(prev, cur) { - numLinesEst++; - if (cur.indexOf('\n') >= 0) numLinesEst++; - return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1; - }, 0); - - if (length > 60) { - return braces[0] + - (base === '' ? '' : base + '\n ') + - ' ' + - output.join(',\n ') + - ' ' + - braces[1]; - } - - return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1]; -} - - // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. function isArray(ar) { @@ -522,166 +98,10 @@ function isPrimitive(arg) { exports.isPrimitive = isPrimitive; function isBuffer(arg) { - return arg instanceof Buffer; + return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); -} - - -function pad(n) { - return n < 10 ? 
'0' + n.toString(10) : n.toString(10); -} - - -var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', - 'Oct', 'Nov', 'Dec']; - -// 26 Feb 16:19:34 -function timestamp() { - var d = new Date(); - var time = [pad(d.getHours()), - pad(d.getMinutes()), - pad(d.getSeconds())].join(':'); - return [d.getDate(), months[d.getMonth()], time].join(' '); -} - - -// log is just a thin wrapper to console.log that prepends a timestamp -exports.log = function() { - console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments)); -}; - - -/** - * Inherit the prototype methods from one constructor into another. - * - * The Function.prototype.inherits from lang.js rewritten as a standalone - * function (not on Function.prototype). NOTE: If this file is to be loaded - * during bootstrapping this function needs to be rewritten using some native - * functions as prototype setup using normal JavaScript does not work as - * expected during bootstrapping (see mirror.js in r114903). - * - * @param {function} ctor Constructor function which needs to inherit the - * prototype. - * @param {function} superCtor Constructor function to inherit prototype from. - */ -exports.inherits = function(ctor, superCtor) { - ctor.super_ = superCtor; - ctor.prototype = Object.create(superCtor.prototype, { - constructor: { - value: ctor, - enumerable: false, - writable: true, - configurable: true - } - }); -}; - -exports._extend = function(origin, add) { - // Don't do anything if add isn't an object - if (!add || !isObject(add)) return origin; - - var keys = Object.keys(add); - var i = keys.length; - while (i--) { - origin[keys[i]] = add[keys[i]]; - } - return origin; -}; - -function hasOwnProperty(obj, prop) { - return Object.prototype.hasOwnProperty.call(obj, prop); -} - - -// Deprecated old stuff. 
- -exports.p = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - console.error(exports.inspect(arguments[i])); - } -}, 'util.p: Use console.error() instead'); - - -exports.exec = exports.deprecate(function() { - return require('child_process').exec.apply(this, arguments); -}, 'util.exec is now called `child_process.exec`.'); - - -exports.print = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(String(arguments[i])); - } -}, 'util.print: Use console.log instead'); - - -exports.puts = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(arguments[i] + '\n'); - } -}, 'util.puts: Use console.log instead'); - - -exports.debug = exports.deprecate(function(x) { - process.stderr.write('DEBUG: ' + x + '\n'); -}, 'util.debug: Use console.error instead'); - - -exports.error = exports.deprecate(function(x) { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stderr.write(arguments[i] + '\n'); - } -}, 'util.error: Use console.error instead'); - - -exports.pump = exports.deprecate(function(readStream, writeStream, callback) { - var callbackCalled = false; - - function call(a, b, c) { - if (callback && !callbackCalled) { - callback(a, b, c); - callbackCalled = true; - } - } - - readStream.addListener('data', function(chunk) { - if (writeStream.write(chunk) === false) readStream.pause(); - }); - - writeStream.addListener('drain', function() { - readStream.resume(); - }); - - readStream.addListener('end', function() { - writeStream.end(); - }); - - readStream.addListener('close', function() { - call(); - }); - - readStream.addListener('error', function(err) { - writeStream.end(); - call(err); - }); - - writeStream.addListener('error', function(err) { - readStream.destroy(); - call(err); - }); -}, 'util.pump(): Use readableStream.pipe() instead'); - - -var uv; -exports._errnoException = function(err, syscall) { - if (isUndefined(uv)) uv = process.binding('uv'); - var errname = uv.errname(err); - var e = new Error(syscall + ' ' + errname); - e.code = errname; - e.errno = errname; - e.syscall = syscall; - return e; -}; +}././@LongLink0000000000000000000000000000017500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/lib/npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000755000000000000000000000000012631326456032073 5ustar 00000000000000././@LongLink0000000000000000000000000000020500000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000250612631326456032100 0ustar 00000000000000{ "name": "core-util-is", "version": "1.0.1", "description": "The `util.is*` functions introduced in Node v0.12.", "main": "lib/util.js", "repository": { "type": "git", "url": "git://github.com/isaacs/core-util-is.git" }, "keywords": [ "util", "isBuffer", "isArray", "isNumber", "isString", "isRegExp", "isThis", "isThat", "polyfill" ], "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "bugs": { "url": "https://github.com/isaacs/core-util-is/issues" }, "readme": "# core-util-is\n\nThe `util.is*` functions introduced in Node v0.12.\n", "readmeFilename": "README.md", "homepage": "https://github.com/isaacs/core-util-is", "_id": "core-util-is@1.0.1", "dist": { "shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "tarball": "http://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz" }, "_from": "core-util-is@>=1.0.0 <1.1.0", "_npmVersion": "1.3.23", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz" } ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/util.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000570412631326456032103 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && objectToString(e) === '[object Error]'; } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return arg instanceof Buffer; } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); } ././@LongLink0000000000000000000000000000020400000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/core-util-is/lib/util.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000574012631326456032103 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && (objectToString(e) === '[object Error]' || e instanceof Error); } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); }././@LongLink0000000000000000000000000000017500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000302512631326456032075 0ustar 00000000000000 # isarray `Array#isArray` for older browsers. ## Usage ```js var isArray = require('isarray'); console.log(isArray([])); // => true console.log(isArray({})); // => false ``` ## Installation With [npm](http://npmjs.org) do ```bash $ npm install isarray ``` Then bundle for the browser with [browserify](https://github.com/substack/browserify). With [component](http://component.io) do ```bash $ component install juliangruber/isarray ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/build/npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000755000000000000000000000000012631326456032073 5ustar 00000000000000././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/component.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000072612631326456032102 0ustar 00000000000000{ "name" : "isarray", "description" : "Array#isArray for older browsers", "version" : "0.0.1", "repository" : "juliangruber/isarray", "homepage": "https://github.com/juliangruber/isarray", "main" : "index.js", "scripts" : [ "index.js" ], "dependencies" : {}, "keywords": ["browser","isarray","array"], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT" } ././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000017012631326456032073 0ustar 00000000000000module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000241012631326456032072 0ustar 00000000000000{ "name": "isarray", "description": "Array#isArray for older browsers", "version": "0.0.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/isarray.git" }, "homepage": "https://github.com/juliangruber/isarray", "main": "index.js", "scripts": { "test": "tap test/*.js" }, "dependencies": {}, "devDependencies": { "tap": "*" }, "keywords": [ "browser", "isarray", "array" ], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "_id": "isarray@0.0.1", "dist": { "shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "tarball": "http://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz" }, "_from": "isarray@0.0.1", "_npmVersion": "1.2.18", "_npmUser": { "name": "juliangruber", "email": "julian@juliangruber.com" }, "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" } ], "directories": {}, "_shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "_resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz", "bugs": { "url": "https://github.com/juliangruber/isarray/issues" }, "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/isarray/build/build.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000777112631326456032111 0ustar 00000000000000 /** * Require the given path. * * @param {String} path * @return {Object} exports * @api public */ function require(path, parent, orig) { var resolved = require.resolve(path); // lookup failed if (null == resolved) { orig = orig || path; parent = parent || 'root'; var err = new Error('Failed to require "' + orig + '" from "' + parent + '"'); err.path = orig; err.parent = parent; err.require = true; throw err; } var module = require.modules[resolved]; // perform real require() // by invoking the module's // registered function if (!module.exports) { module.exports = {}; module.client = module.component = true; module.call(this, module.exports, require.relative(resolved), module); } return module.exports; } /** * Registered modules. */ require.modules = {}; /** * Registered aliases. */ require.aliases = {}; /** * Resolve `path`. * * Lookup: * * - PATH/index.js * - PATH.js * - PATH * * @param {String} path * @return {String} path or null * @api private */ require.resolve = function(path) { if (path.charAt(0) === '/') path = path.slice(1); var index = path + '/index.js'; var paths = [ path, path + '.js', path + '.json', path + '/index.js', path + '/index.json' ]; for (var i = 0; i < paths.length; i++) { var path = paths[i]; if (require.modules.hasOwnProperty(path)) return path; } if (require.aliases.hasOwnProperty(index)) { return require.aliases[index]; } }; /** * Normalize `path` relative to the current path. * * @param {String} curr * @param {String} path * @return {String} * @api private */ require.normalize = function(curr, path) { var segs = []; if ('.' != path.charAt(0)) return path; curr = curr.split('/'); path = path.split('/'); for (var i = 0; i < path.length; ++i) { if ('..' == path[i]) { curr.pop(); } else if ('.' != path[i] && '' != path[i]) { segs.push(path[i]); } } return curr.concat(segs).join('/'); }; /** * Register module at `path` with callback `definition`. * * @param {String} path * @param {Function} definition * @api private */ require.register = function(path, definition) { require.modules[path] = definition; }; /** * Alias a module definition. * * @param {String} from * @param {String} to * @api private */ require.alias = function(from, to) { if (!require.modules.hasOwnProperty(from)) { throw new Error('Failed to alias "' + from + '", it does not exist'); } require.aliases[to] = from; }; /** * Return a require function relative to the `parent` path. * * @param {String} parent * @return {Function} * @api private */ require.relative = function(parent) { var p = require.normalize(parent, '..'); /** * lastIndexOf helper. */ function lastIndexOf(arr, obj) { var i = arr.length; while (i--) { if (arr[i] === obj) return i; } return -1; } /** * The relative require() itself. */ function localRequire(path) { var resolved = localRequire.resolve(path); return require(resolved, parent, path); } /** * Resolve relative to the parent. */ localRequire.resolve = function(path) { var c = path.charAt(0); if ('/' == c) return path.slice(1); if ('.' 
== c) return require.normalize(p, path); // resolve deps by returning // the dep in the nearest "deps" // directory var segs = parent.split('/'); var i = lastIndexOf(segs, 'deps') + 1; if (!i) i = 0; path = segs.slice(0, i + 1).join('/') + '/deps/' + path; return path; }; /** * Check if module is defined at `path`. */ localRequire.exists = function(path) { return require.modules.hasOwnProperty(localRequire.resolve(path)); }; return localRequire; }; require.register("isarray/index.js", function(exports, require, module){ module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; }); require.alias("isarray/index.js", "isarray/index.js"); ././@LongLink0000000000000000000000000000020500000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/.npmignorenpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000001312631326456032067 0ustar 00000000000000build test ././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000206412631326456032077 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000020400000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000076212631326456032102 0ustar 00000000000000**string_decoder.js** (`require('string_decoder')`) from Node.js core Copyright Joyent, Inc. and other Node contributors. See LICENCE file for details. Version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. 
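A usage sketch (illustrative only): `write` buffers a trailing partial multi-byte character and only ever returns complete characters.

```javascript
// Illustrative sketch: '€' is three UTF-8 bytes (0xe2 0x82 0xac), split across two writes.
var StringDecoder = require('string_decoder').StringDecoder
var decoder = new StringDecoder('utf8')

console.log(decoder.write(new Buffer([0xe2, 0x82]))) // '' (incomplete char is buffered)
console.log(decoder.write(new Buffer([0xac])))       // '€' (emitted once complete)
```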
**Prefer the stable version over the unstable.** The *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000001716412631326456032106 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. var Buffer = require('buffer').Buffer; var isBufferEncoding = Buffer.isEncoding || function(encoding) { switch (encoding && encoding.toLowerCase()) { case 'hex': case 'utf8': case 'utf-8': case 'ascii': case 'binary': case 'base64': case 'ucs2': case 'ucs-2': case 'utf16le': case 'utf-16le': case 'raw': return true; default: return false; } } function assertEncoding(encoding) { if (encoding && !isBufferEncoding(encoding)) { throw new Error('Unknown encoding: ' + encoding); } } // StringDecoder provides an interface for efficiently splitting a series of // buffers into a series of JS strings without breaking apart multi-byte // characters. CESU-8 is handled as part of the UTF-8 encoding. // // @TODO Handling all encodings inside a single object makes it very difficult // to reason about this code, so it should be split up in the future. // @TODO There should be a utf8-strict encoding that rejects invalid UTF-8 code // points as used by CESU-8. var StringDecoder = exports.StringDecoder = function(encoding) { this.encoding = (encoding || 'utf8').toLowerCase().replace(/[-_]/, ''); assertEncoding(encoding); switch (this.encoding) { case 'utf8': // CESU-8 represents each of Surrogate Pair by 3-bytes this.surrogateSize = 3; break; case 'ucs2': case 'utf16le': // UTF-16 represents each of Surrogate Pair by 2-bytes this.surrogateSize = 2; this.detectIncompleteChar = utf16DetectIncompleteChar; break; case 'base64': // Base-64 stores 3 bytes in 4 chars, and pads the remainder. this.surrogateSize = 3; this.detectIncompleteChar = base64DetectIncompleteChar; break; default: this.write = passThroughWrite; return; } // Enough space to store all bytes of a single character. UTF-8 needs 4 // bytes, but CESU-8 may require up to 6 (3 bytes per surrogate). 
this.charBuffer = new Buffer(6); // Number of bytes received for the current incomplete multi-byte character. this.charReceived = 0; // Number of bytes expected for the current incomplete multi-byte character. this.charLength = 0; }; // write decodes the given buffer and returns it as JS string that is // guaranteed to not contain any partial multi-byte characters. Any partial // character found at the end of the buffer is buffered up, and will be // returned when calling write again with the remaining bytes. // // Note: Converting a Buffer containing an orphan surrogate to a String // currently works, but converting a String to a Buffer (via `new Buffer`, or // Buffer#write) will replace incomplete surrogates with the unicode // replacement character. See https://codereview.chromium.org/121173009/ . StringDecoder.prototype.write = function(buffer) { var charStr = ''; // if our last write ended with an incomplete multibyte character while (this.charLength) { // determine how many remaining bytes this buffer has to offer for this char var available = (buffer.length >= this.charLength - this.charReceived) ? this.charLength - this.charReceived : buffer.length; // add the new bytes to the char buffer buffer.copy(this.charBuffer, this.charReceived, 0, available); this.charReceived += available; if (this.charReceived < this.charLength) { // still not enough chars in this buffer? wait for more ... return ''; } // remove bytes belonging to the current character from the buffer buffer = buffer.slice(available, buffer.length); // get the character that was split charStr = this.charBuffer.slice(0, this.charLength).toString(this.encoding); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character var charCode = charStr.charCodeAt(charStr.length - 1); if (charCode >= 0xD800 && charCode <= 0xDBFF) { this.charLength += this.surrogateSize; charStr = ''; continue; } this.charReceived = this.charLength = 0; // if there are no more bytes in this buffer, just emit our char if (buffer.length === 0) { return charStr; } break; } // determine and set charLength / charReceived this.detectIncompleteChar(buffer); var end = buffer.length; if (this.charLength) { // buffer the incomplete character bytes we got buffer.copy(this.charBuffer, 0, buffer.length - this.charReceived, end); end -= this.charReceived; } charStr += buffer.toString(this.encoding, 0, end); var end = charStr.length - 1; var charCode = charStr.charCodeAt(end); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character if (charCode >= 0xD800 && charCode <= 0xDBFF) { var size = this.surrogateSize; this.charLength += size; this.charReceived += size; this.charBuffer.copy(this.charBuffer, size, 0, size); buffer.copy(this.charBuffer, 0, 0, size); return charStr.substring(0, end); } // or just emit the charStr return charStr; }; // detectIncompleteChar determines if there is an incomplete UTF-8 character at // the end of the given buffer. If so, it sets this.charLength to the byte // length that character, and sets this.charReceived to the number of bytes // that are available for this character. StringDecoder.prototype.detectIncompleteChar = function(buffer) { // determine how many bytes we have to check at the end of this buffer var i = (buffer.length >= 3) ? 3 : buffer.length; // Figure out if one of the last i bytes of our buffer announces an // incomplete char. 
for (; i > 0; i--) { var c = buffer[buffer.length - i]; // See http://en.wikipedia.org/wiki/UTF-8#Description // 110XXXXX if (i == 1 && c >> 5 == 0x06) { this.charLength = 2; break; } // 1110XXXX if (i <= 2 && c >> 4 == 0x0E) { this.charLength = 3; break; } // 11110XXX if (i <= 3 && c >> 3 == 0x1E) { this.charLength = 4; break; } } this.charReceived = i; }; StringDecoder.prototype.end = function(buffer) { var res = ''; if (buffer && buffer.length) res = this.write(buffer); if (this.charReceived) { var cr = this.charReceived; var buf = this.charBuffer; var enc = this.encoding; res += buf.slice(0, cr).toString(enc); } return res; }; function passThroughWrite(buffer) { return buffer.toString(this.encoding); } function utf16DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 2; this.charLength = this.charReceived ? 2 : 0; } function base64DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 3; this.charLength = this.charReceived ? 3 : 0; } ././@LongLink0000000000000000000000000000020700000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_modules/string_decoder/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/node_modules/readable-stream/node_m0000644000000000000000000000252712631326456032103 0ustar 00000000000000{ "name": "string_decoder", "version": "0.10.31", "description": "The string_decoder module from Node core", "main": "index.js", "dependencies": {}, "devDependencies": { "tap": "~0.4.8" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/rvagg/string_decoder.git" }, "homepage": "https://github.com/rvagg/string_decoder", "keywords": [ "string", "decoder", "browser", "browserify" ], "license": "MIT", "gitHead": "d46d4fd87cf1d06e031c23f1ba170ca7d4ade9a0", "bugs": { "url": "https://github.com/rvagg/string_decoder/issues" }, "_id": "string_decoder@0.10.31", "_shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "_from": "string_decoder@>=0.10.0 <0.11.0", "_npmVersion": "1.4.23", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "substack", "email": "mail@substack.net" }, { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "tarball": "http://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/test/tracker.js0000644000000000000000000000317412631326456026161 0ustar 00000000000000"use strict" var test = require("tap").test var Tracker = require("../index.js").Tracker var timeoutError = new Error("timeout") var testEvent = function (obj,event,next) { var timeout = setTimeout(function(){ obj.removeListener(event, eventHandler) next(timeoutError) }, 10) var eventHandler = function () { var args = Array.prototype.slice.call(arguments) args.unshift(null) clearTimeout(timeout) next.apply(null, args) } obj.once(event, eventHandler) } test("Tracker", function (t) { t.plan(10) var name = "test" var track = new Tracker(name) t.is(track.completed(), 0, "Nothing todo is 0 completion") var todo = 100 track = new Tracker(name, todo) t.is(track.completed(), 0, "Nothing done is 0 completion") testEvent(track, "change", afterCompleteWork) track.completeWork(100) function afterCompleteWork(er, onChangeName) { t.is(er, null, "completeWork: on change event fired") t.is(onChangeName, name, "completeWork: on change emits the correct name") } t.is(track.completed(), 1, "completeWork: 100% completed") testEvent(track, "change", afterAddWork) track.addWork(100) function afterAddWork(er, onChangeName) { t.is(er, null, "addWork: on change event fired") t.is(onChangeName, name, "addWork: on change emits the correct name") } t.is(track.completed(), 0.5, "addWork: 50% completed") track.completeWork(200) t.is(track.completed(), 1, "completeWork: Over completion is still only 100% complete") track = new Tracker(name, todo) track.completeWork(50) track.finish() t.is(track.completed(), 1, "finish: Explicitly finishing moves to 100%") }) npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/test/trackergroup.js0000644000000000000000000000633412631326456027237 0ustar 00000000000000"use strict" var test = require("tap").test var Tracker = require("../index.js").Tracker var TrackerGroup = require("../index.js").TrackerGroup var timeoutError = new Error("timeout") var testEvent = function (obj,event,next) { var timeout = setTimeout(function(){ obj.removeListener(event, eventHandler) next(timeoutError) }, 10) var eventHandler = function () { var args = Array.prototype.slice.call(arguments) args.unshift(null) clearTimeout(timeout) next.apply(null, args) } obj.once(event, eventHandler) } test("TrackerGroup", function (t) { var name = "test" var track = new TrackerGroup(name) t.is(track.completed(), 0, "Nothing todo is 0 completion") testEvent(track, "change", afterFinishEmpty) track.finish() var a, b function afterFinishEmpty(er, onChangeName) { t.is(er, null, "finishEmpty: on change event fired") t.is(onChangeName, name, "finishEmpty: on change emits the correct name") t.is(track.completed(), 1, "finishEmpty: Finishing an empty group actually finishes it") track = new TrackerGroup(name) a = track.newItem("a", 10, 1) b = track.newItem("b", 10, 1) t.is(track.completed(), 0, "Initially empty") testEvent(track, "change", afterCompleteWork) a.completeWork(5) } function afterCompleteWork(er, onChangeName) { t.is(er, null, "on change event fired") t.is(onChangeName, "a", "on change emits the correct name") t.is(track.completed(), 0.25, "Complete half of one is a quarter overall") testEvent(track, "change", afterFinishAll) track.finish() } function afterFinishAll(er, onChangeName) { t.is(er, null, "finishAll: on change event fired") t.is(onChangeName, name, "finishAll: on change emits the correct name") t.is(track.completed(), 1, "Finishing everything 
") track = new TrackerGroup(name) a = track.newItem("a", 10, 2) b = track.newItem("b", 10, 1) t.is(track.completed(), 0, "weighted: Initially empty") testEvent(track, "change", afterWeightedCompleteWork) a.completeWork(5) } function afterWeightedCompleteWork(er, onChangeName) { t.is(er, null, "weighted: on change event fired") t.is(onChangeName, "a", "weighted: on change emits the correct name") t.is(Math.round(track.completed()*100), 33, "weighted: Complete half of double weighted") testEvent(track, "change", afterWeightedFinishAll) track.finish() } function afterWeightedFinishAll(er, onChangeName) { t.is(er, null, "weightedFinishAll: on change event fired") t.is(onChangeName, name, "weightedFinishAll: on change emits the correct name") t.is(track.completed(), 1, "weightedFinishaAll: Finishing everything ") track = new TrackerGroup(name) a = track.newGroup("a", 10) b = track.newGroup("b", 10) var a1 = a.newItem("a.1",10) a1.completeWork(5) t.is(track.completed(), 0.25, "nested: Initially quarter done") testEvent(track, "change", afterNestedComplete) b.finish() } function afterNestedComplete(er, onChangeName) { t.is(er, null, "nestedComplete: on change event fired") t.is(onChangeName, "b", "nestedComplete: on change emits the correct name") t.is(track.completed(), 0.75, "nestedComplete: Finishing everything ") t.end() } }) npm_3.5.2.orig/node_modules/npmlog/node_modules/are-we-there-yet/test/trackerstream.js0000644000000000000000000000344212631326456027373 0ustar 00000000000000"use strict" var test = require("tap").test var util = require("util") var stream = require("readable-stream") var TrackerStream = require("../index.js").TrackerStream var timeoutError = new Error("timeout") var testEvent = function (obj,event,next) { var timeout = setTimeout(function(){ obj.removeListener(event, eventHandler) next(timeoutError) }, 10) var eventHandler = function () { var args = Array.prototype.slice.call(arguments) args.unshift(null) clearTimeout(timeout) next.apply(null, args) } obj.once(event, eventHandler) } var Sink = function () { stream.Writable.apply(this,arguments) } util.inherits(Sink, stream.Writable) Sink.prototype._write = function (data, encoding, cb) { cb() } test("TrackerStream", function (t) { t.plan(9) var name = "test" var track = new TrackerStream(name) t.is(track.completed(), 0, "Nothing todo is 0 completion") var todo = 10 track = new TrackerStream(name, todo) t.is(track.completed(), 0, "Nothing done is 0 completion") track.pipe(new Sink()) testEvent(track, "change", afterCompleteWork) track.write("0123456789") function afterCompleteWork(er, onChangeName) { t.is(er, null, "write: on change event fired") t.is(onChangeName, name, "write: on change emits the correct name") t.is(track.completed(), 1, "write: 100% completed") testEvent(track, "change", afterAddWork) track.addWork(10) } function afterAddWork(er, onChangeName) { t.is(er, null, "addWork: on change event fired") t.is(track.completed(), 0.5, "addWork: 50% completed") testEvent(track, "change", afterAllWork) track.write("ABCDEFGHIJKLMNOPQRST") } function afterAllWork(er) { t.is(er, null, "allWork: on change event fired") t.is(track.completed(), 1, "allWork: 100% completed") } }) npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/.npmignore0000644000000000000000000000114212631326456023205 0ustar 00000000000000# Logs logs *.log # Runtime data pids *.pid *.seed # Directory for instrumented libs generated by jscoverage/JSCover lib-cov # Coverage directory used by tools like istanbul coverage # Grunt intermediate storage 
(http://gruntjs.com/creating-plugins#storing-task-files) .grunt # Compiled binary addons (http://nodejs.org/api/addons.html) build/Release # Dependency directory # Commenting this out is preferred by some people, see # https://www.npmjs.org/doc/misc/npm-faq.html#should-i-check-my-node_modules-folder-into-git- node_modules # Users Environment Variables .lock-wscript # Editor cruft *~ .#* npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/LICENSE0000644000000000000000000000135712631326456022223 0ustar 00000000000000Copyright (c) 2014, Rebecca Turner Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/README.md0000644000000000000000000001403612631326456022473 0ustar 00000000000000gauge ===== A nearly stateless terminal-based horizontal gauge / progress bar. ```javascript var Gauge = require("gauge") var gauge = new Gauge() gauge.show("test", 0.20) gauge.pulse("this") gauge.hide() ``` ![](example.png) ### `var gauge = new Gauge([options], [ansiStream])` * **options** – *(optional)* An option object. (See [below] for details.) * **ansiStream** – *(optional)* A stream that's been blessed by the [ansi] module to include various commands for controlling the cursor in a terminal. [ansi]: https://www.npmjs.com/package/ansi [below]: #theme-objects Constructs a new gauge. Gauges are drawn on a single line, and are not drawn if the current terminal isn't a tty. If you resize your terminal in a way that can be detected then the gauge will be redrawn at the new size. As a general rule, growing your terminal will be clean, but shrinking it will leave cruft behind, as there isn't enough information to know where the previously drawn output is now located. The **options** object can have the following properties, all of which are optional: * maxUpdateFrequency: defaults to 50 msec; the gauge will not be drawn more than once in this period of time. This applies to `show` and `pulse` calls, but if you `hide` and then `show` the gauge it will be redrawn regardless of the time since the last draw. * theme: defaults to `Gauge.unicode` if the terminal supports unicode according to [has-unicode], otherwise it defaults to `Gauge.ascii`. Details on the [theme object](#theme-objects) are documented elsewhere. * template: see [documentation elsewhere](#template-objects) for defaults and details. [has-unicode]: https://www.npmjs.com/package/has-unicode If **ansiStream** isn't passed in, then one will be constructed from stderr with `ansi(process.stderr)`. ### `gauge.show([name, [completed]])` * **name** – *(optional)* The name of the current thing contributing to progress. Defaults to the last value used, or "". * **completed** – *(optional)* The portion completed as a value between 0 and 1. Defaults to the last value used, or 0. If `process.stdout.isTTY` is false then this does nothing.
If completed is 0 and `gauge.pulse` has never been called, then similarly nothing will be printed. If `maxUpdateFrequency` msec haven't passed since the last call to `show` or `pulse`, then similarly nothing will be printed. (Actually, the update is deferred until `maxUpdateFrequency` msec have passed; if nothing else has happened by then, the deferred gauge update will happen.) ### `gauge.hide()` Removes the gauge from the terminal. ### `gauge.pulse([name])` * **name** – *(optional)* The specific thing that triggered this pulse. Spins the spinner in the gauge to show output. If **name** is included then it will be combined with the last name passed to `gauge.show` using the subsection property of the theme (typically a right-facing arrow). ### `gauge.disable()` Hides the gauge and ignores further calls to `show` or `pulse`. ### `gauge.enable()` Shows the gauge and resumes updating when `show` or `pulse` is called. ### `gauge.setTheme(theme)` Changes the active theme; it will be displayed with the next show or pulse. ### `gauge.setTemplate(template)` Changes the active template; it will be displayed with the next show or pulse. ### Theme Objects There are two theme objects available as a part of the module, `Gauge.unicode` and `Gauge.ascii`. Theme objects have the following properties: | Property | Unicode | ASCII | | ---------- | ------- | ----- | | startgroup | ╢ | \| | | endgroup | ╟ | \| | | complete | █ | # | | incomplete | ░ | - | | spinner | ▀▐▄▌ | -\\\|/ | | subsection | → | -> | *startgroup*, *endgroup* and *subsection* can be as many characters as you want. *complete* and *incomplete* should be a single character width each. *spinner* is a list of characters to use in turn when displaying an activity spinner. The Gauge will spin through as many characters as you give here. ### Template Objects A template is an array of objects and strings that, after being evaluated, will be turned into the gauge line. The default template is: ```javascript [ {type: "name", separated: true, maxLength: 25, minLength: 25, align: "left"}, {type: "spinner", separated: true}, {type: "startgroup"}, {type: "completionbar"}, {type: "endgroup"} ] ``` The various template elements can either be **plain strings**, in which case they will be included verbatim in the output. If the template element is an object, it can have the following keys: * *type* can be: * `name` – The most recent name passed to `show`; if this is in response to a `pulse` then the name passed to `pulse` will be appended along with the subsection property from the theme. * `spinner` – If you've ever called `pulse` this will be one of the characters from the spinner property of the theme. * `startgroup` – The `startgroup` property from the theme. * `completionbar` – The progress bar itself. * `endgroup` – The `endgroup` property from the theme. * *separated* – If true, the element will be separated with spaces from things on either side (and margins count as space, so it won't be indented), but only if it's included. * *maxLength* – The maximum length for this element. If its value is longer it will be truncated. * *minLength* – The minimum length for this element. If its value is shorter it will be padded according to the *align* value. * *align* – (Default: left) Possible values are "left", "right" and "center". Works as you'd expect from word processors. * *length* – Provides a single value for both *minLength* and *maxLength*. If both *length* and *minLength* or *maxLength* are specified, the latter take precedence.
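As an illustrative sketch (not from the original README), a custom template that pads the name to 20 columns and omits the spinner could be supplied at construction time or later via `setTemplate`:

```javascript
// Illustrative sketch: a template built only from the element types documented above.
var Gauge = require("gauge")
var gauge = new Gauge({
  template: [
    {type: "name", separated: true, length: 20, align: "left"},
    {type: "startgroup"},
    {type: "completionbar"},
    {type: "endgroup"}
  ]
})
gauge.show("fetch metadata", 0.4) // name column, then a bar at 40%
```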
### Tracking Completion If you have more than one thing going on that you want to track the completion of, you may find the related [are-we-there-yet] helpful. Its `change` event can be wired up to the `show` method to get a more traditional progress bar interface. [are-we-there-yet]: https://www.npmjs.com/package/are-we-there-yet npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/example.png0000644000000000000000000004635112631326456023362 0ustar 00000000000000[binary PNG image data omitted]
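As the Tracking Completion note above suggests, [are-we-there-yet] can drive the gauge. The sketch below assumes that package's `TrackerGroup` API (`newItem`, `completeWork`, `finish`) and its `change` event firing with `(name, completed)`, as in the versions npmlog used at the time; treat the names as illustrative rather than authoritative:

```javascript
var Gauge = require("gauge")
var TrackerGroup = require("are-we-there-yet").TrackerGroup

var gauge = new Gauge()
var top = new TrackerGroup()

// Redraw the gauge whenever aggregate completion changes.
top.on("change", function (name, completed) {
  gauge.show(name, completed)
})

var download = top.newItem("download", 2) // two units of work (illustrative)
download.completeWork(1)                  // gauge redraws at 50%
download.finish()                         // gauge redraws at 100%
```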
npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/0000755000000000000000000000000012631326456023665 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/package.json0000644000000000000000000000263712631326456023506 0ustar 00000000000000{ "name": "gauge", "version": "1.2.2", "description": "A terminal based horizontal guage", "main": "progress-bar.js", "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/iarna/gauge.git" }, "keywords": [ "progressbar", "progress", "gauge" ], "author": { "name": "Rebecca Turner", "email": "me@re-becca.org" }, "license": "ISC", "bugs": { "url": "https://github.com/iarna/gauge/issues" }, "homepage": "https://github.com/iarna/gauge", "dependencies": { "ansi": "^0.3.0", "has-unicode": "^1.0.0", "lodash.pad": "^3.0.0", "lodash.padleft": "^3.0.0", "lodash.padright": "^3.0.0" }, "devDependencies": { "tap": "^0.4.13" }, "gitHead": "9f7eeeeed3b74a70f30b721d570435f6ffbc0168", "_id": "gauge@1.2.2", "_shasum": "05b6730a19a8fcad3c340a142f0945222a3f815b", "_from": "gauge@>=1.2.0 <1.3.0", "_npmVersion": "3.1.0", "_nodeVersion": "0.10.38", "_npmUser": { "name": "iarna", "email": "me@re-becca.org" }, "dist": { "shasum": "05b6730a19a8fcad3c340a142f0945222a3f815b", "tarball": "http://registry.npmjs.org/gauge/-/gauge-1.2.2.tgz" }, "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/gauge/-/gauge-1.2.2.tgz", "readme": "ERROR: No README data found!"
} npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/progress-bar.js0000644000000000000000000001336512631326456024164 0ustar 00000000000000"use strict" var hasUnicode = require("has-unicode") var ansi = require("ansi") var align = { center: require("lodash.pad"), left: require("lodash.padright"), right: require("lodash.padleft") } var defaultStream = process.stderr function isTTY() { return process.stderr.isTTY } function getWritableTTYColumns() { // Writing to the final column wraps the line // We have to use stdout here, because Node's magic SIGWINCH handler only // updates process.stdout, not process.stderr return process.stdout.columns - 1 } var ProgressBar = module.exports = function (options, cursor) { if (! options) options = {} if (! cursor && options.write) { cursor = options options = {} } if (! cursor) { cursor = ansi(defaultStream) } this.cursor = cursor this.showing = false this.theme = options.theme || (hasUnicode() ? ProgressBar.unicode : ProgressBar.ascii) this.template = options.template || [ {type: "name", separated: true, length: 25}, {type: "spinner", separated: true}, {type: "startgroup"}, {type: "completionbar"}, {type: "endgroup"} ] this.updatefreq = options.maxUpdateFrequency || 50 this.lastName = "" this.lastCompleted = 0 this.spun = 0 this.last = new Date(0) var self = this this._handleSizeChange = function () { if (!self.showing) return self.hide() self.show() } } ProgressBar.prototype = {} ProgressBar.unicode = { startgroup: "╢", endgroup: "╟", complete: "█", incomplete: "░", spinner: "▀▐▄▌", subsection: "→" } ProgressBar.ascii = { startgroup: "|", endgroup: "|", complete: "#", incomplete: "-", spinner: "-\\|/", subsection: "->" } ProgressBar.prototype.setTheme = function(theme) { this.theme = theme } ProgressBar.prototype.setTemplate = function(template) { this.template = template } ProgressBar.prototype._enableResizeEvents = function() { process.stdout.on('resize', this._handleSizeChange) } ProgressBar.prototype._disableResizeEvents = function() { process.stdout.removeListener('resize', this._handleSizeChange) } ProgressBar.prototype.disable = function() { this.hide() this.disabled = true } ProgressBar.prototype.enable = function() { this.disabled = false this.show() } ProgressBar.prototype.hide = function() { if (!isTTY()) return if (this.disabled) return this.cursor.show() if (this.showing) this.cursor.up(1) this.cursor.horizontalAbsolute(0).eraseLine() this.showing = false } // Repeat str count times. var repeat = function (str, count) { var out = "" for (var ii = 0; ii < count; ++ii) out += str return out } Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/README.md0000644000000000000000000000103212631326456027155 0ustar 00000000000000# lodash.pad v3.1.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.pad` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.pad ``` In Node.js/io.js: ```js var pad = require('lodash.pad'); ``` See the [documentation](https://lodash.com/docs#pad) or [package source](https://github.com/lodash/lodash/blob/3.1.1-npm-packages/lodash.pad) for more details. npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/index.js0000644000000000000000000000325312631326456027352 0ustar 00000000000000/** * lodash 3.1.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'), createPadding = require('lodash._createpadding'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Pads `string` on the left and right sides if it's shorter than `length`. * Padding characters are truncated if they can't be evenly divided by `length`. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to pad. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the padded string. 
* @example * * _.pad('abc', 8); * // => ' abc ' * * _.pad('abc', 8, '_-'); * // => '_-abc_-_' * * _.pad('abc', 3); * // => 'abc' */ function pad(string, length, chars) { string = baseToString(string); length = +length; var strLength = string.length; if (strLength >= length || !nativeIsFinite(length)) { return string; } var mid = (length - strLength) / 2, leftLength = nativeFloor(mid), rightLength = nativeCeil(mid); chars = createPadding('', rightLength, chars); return chars.slice(0, leftLength) + string + chars; } module.exports = pad; npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/0000755000000000000000000000000012631326456030357 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/package.json0000644000000000000000000000457212631326456030200 0ustar 00000000000000{ "name": "lodash.pad", "version": "3.1.1", "description": "The modern build of lodash’s `_.pad` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0", "lodash._createpadding": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.pad@3.1.1", "_shasum": "2e078ebc33b331d2ba34bf8732af129fd5c04624", "_from": "lodash.pad@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "2e078ebc33b331d2ba34bf8732af129fd5c04624", "tarball": "http://registry.npmjs.org/lodash.pad/-/lodash.pad-3.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.pad/-/lodash.pad-3.1.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000016100000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._b0000755000000000000000000000000012631326456032131 5ustar 00000000000000././@LongLink0000000000000000000000000000016200000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000755000000000000000000000000012631326456032132 5ustar 00000000000000././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._b0000644000000000000000000000232112631326456032131 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._b0000644000000000000000000000105312631326456032132 0ustar 00000000000000# lodash._basetostring v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseToString` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basetostring ``` In Node.js/io.js: ```js var baseToString = require('lodash._basetostring'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basetostring) for more details. 
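Spelled out as a quick sketch (return values follow the module's own JSDoc: `null` and `undefined` become the empty string, anything else is coerced with `value + ''`):

```js
var baseToString = require('lodash._basetostring');

baseToString('abc');     // => 'abc'
baseToString(42);        // => '42'  (coerced via value + '')
baseToString(null);      // => ''
baseToString(undefined); // => ''
```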
././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._b0000644000000000000000000000134212631326456032133 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Converts `value` to a string if it's not one. An empty string is returned * for `null` or `undefined` values. * * @private * @param {*} value The value to process. * @returns {string} Returns the string. */ function baseToString(value) { return value == null ? '' : (value + ''); } module.exports = baseToString; ././@LongLink0000000000000000000000000000017500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._basetostring/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._b0000644000000000000000000000442512631326456032140 0ustar 00000000000000{ "name": "lodash._basetostring", "version": "3.0.1", "description": "The modern build of lodash’s internal `baseToString` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basetostring@3.0.1", "_shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "_from": "lodash._basetostring@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "tarball": "http://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000017100000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000644000000000000000000000232112631326456032132 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017300000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000644000000000000000000000106112631326456032132 0ustar 00000000000000# lodash._createpadding v3.6.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createPadding` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._createpadding ``` In Node.js/io.js: ```js var createPadding = require('lodash._createpadding'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash._createpadding) for more details. ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000644000000000000000000000254212631326456032137 0ustar 00000000000000/** * lodash 3.6.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var repeat = require('lodash.repeat'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeIsFinite = global.isFinite; /** * Creates the padding required for `string` based on the given `length`. * The `chars` string is truncated if the number of characters exceeds `length`. 
* * @private * @param {string} string The string to create padding for. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the pad for `string`. */ function createPadding(string, length, chars) { var strLength = string.length; length = +length; if (strLength >= length || !nativeIsFinite(length)) { return ''; } var padLength = length - strLength; chars = chars == null ? ' ' : (chars + ''); return repeat(chars, nativeCeil(padLength / chars.length)).slice(0, padLength); } module.exports = createPadding; ././@LongLink0000000000000000000000000000017700000000000011222 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000755000000000000000000000000012631326456032132 5ustar 00000000000000././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000644000000000000000000000452412631326456032141 0ustar 00000000000000{ "name": "lodash._createpadding", "version": "3.6.1", "description": "The modern build of lodash’s internal `createPadding` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.repeat": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._createpadding@3.6.1", "_shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "_from": "lodash._createpadding@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "tarball": "http://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000021500000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000755000000000000000000000000012631326456032132 5ustar 00000000000000././@LongLink0000000000000000000000000000022400000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000644000000000000000000000232112631326456032132 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000022600000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000644000000000000000000000105712631326456032137 0ustar 00000000000000# lodash.repeat v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.repeat` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.repeat ``` In Node.js/io.js: ```js var repeat = require('lodash.repeat'); ``` See the [documentation](https://lodash.com/docs#repeat) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.repeat) for more details. 
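As a small usage sketch mirroring the examples in the package's own JSDoc:

```js
var repeat = require('lodash.repeat');

repeat('*', 3);   // => '***'
repeat('abc', 2); // => 'abcabc'
repeat('abc', 0); // => ''
```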
././@LongLink0000000000000000000000000000022500000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000644000000000000000000000273712631326456032145 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Repeats the given string `n` times. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to repeat. * @param {number} [n=0] The number of times to repeat the string. * @returns {string} Returns the repeated string. * @example * * _.repeat('*', 3); * // => '***' * * _.repeat('abc', 2); * // => 'abcabc' * * _.repeat('abc', 0); * // => '' */ function repeat(string, n) { var result = ''; string = baseToString(string); n = +n; if (n < 1 || !string || !nativeIsFinite(n)) { return result; } // Leverage the exponentiation by squaring algorithm for a faster repeat. // See https://en.wikipedia.org/wiki/Exponentiation_by_squaring for more details. do { if (n % 2) { result += string; } n = nativeFloor(n / 2); string += string; } while (n); return result; } module.exports = repeat; ././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._createpadding/node_modules/lodash.repeat/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.pad/node_modules/lodash._c0000644000000000000000000000455312631326456032143 0ustar 00000000000000{ "name": "lodash.repeat", "version": "3.0.1", "description": "The modern build of lodash’s `_.repeat` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.repeat@3.0.1", "_shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "_from": "lodash.repeat@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": 
"john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "tarball": "http://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/LICENSE.txt0000644000000000000000000000232112631326456030376 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/README.md0000644000000000000000000000106612631326456030037 0ustar 00000000000000# lodash.padleft v3.1.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.padLeft` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.padleft ``` In Node.js/io.js: ```js var padLeft = require('lodash.padleft'); ``` See the [documentation](https://lodash.com/docs#padLeft) or [package source](https://github.com/lodash/lodash/blob/3.1.1-npm-packages/lodash.padleft) for more details. npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/index.js0000644000000000000000000000277612631326456030236 0ustar 00000000000000/** * lodash 3.1.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'), createPadding = require('lodash._createpadding'); /** * Creates a function for `_.padLeft` or `_.padRight`. * * @private * @param {boolean} [fromRight] Specify padding from the right. * @returns {Function} Returns the new pad function. */ function createPadDir(fromRight) { return function(string, length, chars) { string = baseToString(string); return (fromRight ? 
string : '') + createPadding(string, length, chars) + (fromRight ? '' : string); }; } /** * Pads `string` on the left side if it is shorter than `length`. Padding * characters are truncated if they exceed `length`. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to pad. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the padded string. * @example * * _.padLeft('abc', 6); * // => ' abc' * * _.padLeft('abc', 6, '_-'); * // => '_-_abc' * * _.padLeft('abc', 3); * // => 'abc' */ var padLeft = createPadDir(); module.exports = padLeft; npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/0000755000000000000000000000000012631326456031232 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/package.json0000644000000000000000000000463112631326456031047 0ustar 00000000000000{ "name": "lodash.padleft", "version": "3.1.1", "description": "The modern build of lodash’s `_.padLeft` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0", "lodash._createpadding": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.padleft@3.1.1", "_shasum": "150151f1e0245edba15d50af2d71f1d5cff46530", "_from": "lodash.padleft@>=3.0.0 <4.0.0", "_npmVersion": "2.9.0", "_nodeVersion": "0.12.2", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "150151f1e0245edba15d50af2d71f1d5cff46530", "tarball": "http://registry.npmjs.org/lodash.padleft/-/lodash.padleft-3.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.padleft/-/lodash.padleft-3.1.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000755000000000000000000000000012631326456032255 5ustar 00000000000000././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000755000000000000000000000000012631326456032255 5ustar 00000000000000././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000232112631326456032255 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000105312631326456032256 0ustar 00000000000000# lodash._basetostring v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseToString` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basetostring ``` In Node.js/io.js: ```js var baseToString = require('lodash._basetostring'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basetostring) for more details. 
././@LongLink0000000000000000000000000000017500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000134212631326456032257 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Converts `value` to a string if it's not one. An empty string is returned * for `null` or `undefined` values. * * @private * @param {*} value The value to process. * @returns {string} Returns the string. */ function baseToString(value) { return value == null ? '' : (value + ''); } module.exports = baseToString; ././@LongLink0000000000000000000000000000020100000000000011206 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._basetostring/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000442512631326456032264 0ustar 00000000000000{ "name": "lodash._basetostring", "version": "3.0.1", "description": "The modern build of lodash’s internal `baseToString` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basetostring@3.0.1", "_shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "_from": "lodash._basetostring@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "tarball": "http://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000017500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000232112631326456032255 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017700000000000011222 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000106112631326456032255 0ustar 00000000000000# lodash._createpadding v3.6.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createPadding` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._createpadding ``` In Node.js/io.js: ```js var createPadding = require('lodash._createpadding'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash._createpadding) for more details. ././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000254212631326456032262 0ustar 00000000000000/** * lodash 3.6.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var repeat = require('lodash.repeat'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeIsFinite = global.isFinite; /** * Creates the padding required for `string` based on the given `length`. * The `chars` string is truncated if the number of characters exceeds `length`. 
* * @private * @param {string} string The string to create padding for. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the pad for `string`. */ function createPadding(string, length, chars) { var strLength = string.length; length = +length; if (strLength >= length || !nativeIsFinite(length)) { return ''; } var padLength = length - strLength; chars = chars == null ? ' ' : (chars + ''); return repeat(chars, nativeCeil(padLength / chars.length)).slice(0, padLength); } module.exports = createPadding; ././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000755000000000000000000000000012631326456032255 5ustar 00000000000000././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000452412631326456032264 0ustar 00000000000000{ "name": "lodash._createpadding", "version": "3.6.1", "description": "The modern build of lodash’s internal `createPadding` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.repeat": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._createpadding@3.6.1", "_shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "_from": "lodash._createpadding@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "tarball": "http://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000022100000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000755000000000000000000000000012631326456032255 5ustar 00000000000000././@LongLink0000000000000000000000000000023000000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000232112631326456032255 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000023200000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000105712631326456032262 0ustar 00000000000000# lodash.repeat v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.repeat` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.repeat ``` In Node.js/io.js: ```js var repeat = require('lodash.repeat'); ``` See the [documentation](https://lodash.com/docs#repeat) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.repeat) for more details. 
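The `lodash._createpadding` source above leans on `lodash.repeat` (bundled next) for the heavy lifting: repeat the pad characters just past the required width, then slice the overshoot off. A minimal dependency-free sketch of that repeat-then-slice strategy — illustrative only, with the hypothetical name `makePadding` and ES6 `String.prototype.repeat` standing in for the bundled `lodash.repeat`:

```js
// Sketch of the repeat-then-slice padding strategy used by
// lodash._createpadding. Not the shipped code.
function makePadding(str, length, chars) {
  length = +length;
  // No padding needed: the string already fills the target width,
  // or the requested length is not a finite number.
  if (str.length >= length || !isFinite(length)) {
    return '';
  }
  var padLength = length - str.length;
  chars = chars == null ? ' ' : (chars + '');
  // Repeat whole copies of `chars` until padLength is covered,
  // then truncate the overshoot.
  return chars.repeat(Math.ceil(padLength / chars.length)).slice(0, padLength);
}

console.log(makePadding('abc', 8, '_-')); // => '_-_-_'
console.log(makePadding('abc', 2, '_-')); // => '' (already wide enough)
```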
././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000273712631326456032270 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Repeats the given string `n` times. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to repeat. * @param {number} [n=0] The number of times to repeat the string. * @returns {string} Returns the repeated string. * @example * * _.repeat('*', 3); * // => '***' * * _.repeat('abc', 2); * // => 'abcabc' * * _.repeat('abc', 0); * // => '' */ function repeat(string, n) { var result = ''; string = baseToString(string); n = +n; if (n < 1 || !string || !nativeIsFinite(n)) { return result; } // Leverage the exponentiation by squaring algorithm for a faster repeat. // See https://en.wikipedia.org/wiki/Exponentiation_by_squaring for more details. do { if (n % 2) { result += string; } n = nativeFloor(n / 2); string += string; } while (n); return result; } module.exports = repeat; ././@LongLink0000000000000000000000000000023500000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodash._createpadding/node_modules/lodash.repeat/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padleft/node_modules/lodas0000644000000000000000000000455312631326456032266 0ustar 00000000000000{ "name": "lodash.repeat", "version": "3.0.1", "description": "The modern build of lodash’s `_.repeat` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.repeat@3.0.1", "_shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "_from": "lodash.repeat@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": 
"john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "tarball": "http://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/LICENSE.txt0000644000000000000000000000232112631326456030561 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/README.md0000644000000000000000000000107512631326456030222 0ustar 00000000000000# lodash.padright v3.1.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.padRight` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.padright ``` In Node.js/io.js: ```js var padRight = require('lodash.padright'); ``` See the [documentation](https://lodash.com/docs#padRight) or [package source](https://github.com/lodash/lodash/blob/3.1.1-npm-packages/lodash.padright) for more details. npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/index.js0000644000000000000000000000301012631326456030377 0ustar 00000000000000/** * lodash 3.1.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'), createPadding = require('lodash._createpadding'); /** * Creates a function for `_.padLeft` or `_.padRight`. * * @private * @param {boolean} [fromRight] Specify padding from the right. * @returns {Function} Returns the new pad function. 
*/ function createPadDir(fromRight) { return function(string, length, chars) { string = baseToString(string); return (fromRight ? string : '') + createPadding(string, length, chars) + (fromRight ? '' : string); }; } /** * Pads `string` on the right side if it is shorter than `length`. Padding * characters are truncated if they exceed `length`. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to pad. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the padded string. * @example * * _.padRight('abc', 6); * // => 'abc ' * * _.padRight('abc', 6, '_-'); * // => 'abc_-_' * * _.padRight('abc', 3); * // => 'abc' */ var padRight = createPadDir(true); module.exports = padRight; npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/0000755000000000000000000000000012631326456031415 5ustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/package.json0000644000000000000000000000464112631326456031233 0ustar 00000000000000{ "name": "lodash.padright", "version": "3.1.1", "description": "The modern build of lodash’s `_.padRight` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0", "lodash._createpadding": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.padright@3.1.1", "_shasum": "79f7770baaa39738c040aeb5465e8d88f2aacec0", "_from": "lodash.padright@>=3.0.0 <4.0.0", "_npmVersion": "2.9.0", "_nodeVersion": "0.12.2", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "79f7770baaa39738c040aeb5465e8d88f2aacec0", "tarball": "http://registry.npmjs.org/lodash.padright/-/lodash.padright-3.1.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.padright/-/lodash.padright-3.1.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000755000000000000000000000000012631326456032255 5ustar 00000000000000././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000755000000000000000000000000012631326456032255 5ustar 00000000000000././@LongLink0000000000000000000000000000017500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000232112631326456032255 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017700000000000011222 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000105312631326456032256 0ustar 00000000000000# lodash._basetostring v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `baseToString` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._basetostring ``` In Node.js/io.js: ```js var baseToString = require('lodash._basetostring'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash._basetostring) for more details. 
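`lodash._basetostring` exists purely to make string coercion null-safe: `null` and `undefined` become the empty string, and everything else takes the `value + ''` path, which still honors a custom `toString`. A hedged usage sketch of that contract — `toStr` below is an inline stand-in for the module exported by the `index.js` that follows, not the module itself:

```js
// Stand-in restating the baseToString contract, for illustration only.
function toStr(value) {
  // A loose `== null` check is true for both null and undefined.
  return value == null ? '' : (value + '');
}

console.log(toStr(null) === '');      // => true
console.log(toStr(undefined) === ''); // => true
console.log(toStr(42));               // => '42'
console.log(toStr({ toString: function () { return 'custom' } })); // => 'custom'
```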
././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000134212631326456032257 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ /** * Converts `value` to a string if it's not one. An empty string is returned * for `null` or `undefined` values. * * @private * @param {*} value The value to process. * @returns {string} Returns the string. */ function baseToString(value) { return value == null ? '' : (value + ''); } module.exports = baseToString; ././@LongLink0000000000000000000000000000020200000000000011207 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._basetostring/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000442512631326456032264 0ustar 00000000000000{ "name": "lodash._basetostring", "version": "3.0.1", "description": "The modern build of lodash’s internal `baseToString` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._basetostring@3.0.1", "_shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "_from": "lodash._basetostring@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "d1861d877f824a52f669832dcaf3ee15566a07d5", "tarball": "http://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._basetostring/-/lodash._basetostring-3.0.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000232112631326456032255 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000106112631326456032255 0ustar 00000000000000# lodash._createpadding v3.6.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) internal `createPadding` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash._createpadding ``` In Node.js/io.js: ```js var createPadding = require('lodash._createpadding'); ``` See the [package source](https://github.com/lodash/lodash/blob/3.6.1-npm-packages/lodash._createpadding) for more details. ././@LongLink0000000000000000000000000000017700000000000011222 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000254212631326456032262 0ustar 00000000000000/** * lodash 3.6.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var repeat = require('lodash.repeat'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeCeil = Math.ceil, nativeIsFinite = global.isFinite; /** * Creates the padding required for `string` based on the given `length`. * The `chars` string is truncated if the number of characters exceeds `length`. 
* * @private * @param {string} string The string to create padding for. * @param {number} [length=0] The padding length. * @param {string} [chars=' '] The string used as padding. * @returns {string} Returns the pad for `string`. */ function createPadding(string, length, chars) { var strLength = string.length; length = +length; if (strLength >= length || !nativeIsFinite(length)) { return ''; } var padLength = length - strLength; chars = chars == null ? ' ' : (chars + ''); return repeat(chars, nativeCeil(padLength / chars.length)).slice(0, padLength); } module.exports = createPadding; ././@LongLink0000000000000000000000000000020400000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000755000000000000000000000000012631326456032255 5ustar 00000000000000././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000452412631326456032264 0ustar 00000000000000{ "name": "lodash._createpadding", "version": "3.6.1", "description": "The modern build of lodash’s internal `createPadding` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash.repeat": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash._createpadding@3.6.1", "_shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "_from": "lodash._createpadding@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "4907b438595adc54ee8935527a6c424c02c81a87", "tarball": "http://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash._createpadding/-/lodash._createpadding-3.6.1.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000022200000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000755000000000000000000000000012631326456032255 5ustar 00000000000000././@LongLink0000000000000000000000000000023100000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/LICENSEnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000232112631326456032255 0ustar 00000000000000Copyright 2012-2015 The Dojo Foundation Based on Underscore.js, copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000023300000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/README.mdnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000105712631326456032262 0ustar 00000000000000# lodash.repeat v3.0.1 The [modern build](https://github.com/lodash/lodash/wiki/Build-Differences) of [lodash’s](https://lodash.com/) `_.repeat` exported as a [Node.js](http://nodejs.org/)/[io.js](https://iojs.org/) module. ## Installation Using npm: ```bash $ {sudo -H} npm i -g npm $ npm i --save lodash.repeat ``` In Node.js/io.js: ```js var repeat = require('lodash.repeat'); ``` See the [documentation](https://lodash.com/docs#repeat) or [package source](https://github.com/lodash/lodash/blob/3.0.1-npm-packages/lodash.repeat) for more details. 
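The `index.js` that follows implements `_.repeat` with exponentiation by squaring: each pass doubles the working string and halves `n`, folding the current run into the result whenever the low bit of `n` is set, so building the output takes O(log n) concatenations rather than n. A hedged, instrumented copy of just that doubling loop (`repeatTraced` is an illustrative name, and the guards for `n < 1`, empty strings, and non-finite `n` are omitted for brevity):

```js
// Instrumented copy of the doubling loop behind lodash.repeat, showing
// how few passes exponentiation by squaring needs. Illustrative only.
function repeatTraced(string, n) {
  var result = '';
  var passes = 0;
  do {
    passes++;
    if (n % 2) {
      result += string; // low bit of n is set: fold the current run in
    }
    n = Math.floor(n / 2);
    string += string;   // double the run: s -> ss -> ssss -> ...
  } while (n);
  console.log('passes:', passes);
  return result;
}

console.log(repeatTraced('ab', 5)); // logs "passes: 3", returns 'ababababab'
```

Five repetitions cost three passes here; a naive loop would concatenate five times, and the gap widens as `n` grows.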
././@LongLink0000000000000000000000000000023200000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/index.jsnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000273712631326456032270 0ustar 00000000000000/** * lodash 3.0.1 (Custom Build) * Build: `lodash modern modularize exports="npm" -o ./` * Copyright 2012-2015 The Dojo Foundation * Based on Underscore.js 1.8.3 * Copyright 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors * Available under MIT license */ var baseToString = require('lodash._basetostring'); /* Native method references for those with the same name as other `lodash` methods. */ var nativeFloor = Math.floor, nativeIsFinite = global.isFinite; /** * Repeats the given string `n` times. * * @static * @memberOf _ * @category String * @param {string} [string=''] The string to repeat. * @param {number} [n=0] The number of times to repeat the string. * @returns {string} Returns the repeated string. * @example * * _.repeat('*', 3); * // => '***' * * _.repeat('abc', 2); * // => 'abcabc' * * _.repeat('abc', 0); * // => '' */ function repeat(string, n) { var result = ''; string = baseToString(string); n = +n; if (n < 1 || !string || !nativeIsFinite(n)) { return result; } // Leverage the exponentiation by squaring algorithm for a faster repeat. // See https://en.wikipedia.org/wiki/Exponentiation_by_squaring for more details. do { if (n % 2) { result += string; } n = nativeFloor(n / 2); string += string; } while (n); return result; } module.exports = repeat; ././@LongLink0000000000000000000000000000023600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/lodash._createpadding/node_modules/lodash.repeat/package.jsonnpm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/node_modules/lodash.padright/node_modules/loda0000644000000000000000000000455312631326456032266 0ustar 00000000000000{ "name": "lodash.repeat", "version": "3.0.1", "description": "The modern build of lodash’s `_.repeat` as a module.", "homepage": "https://lodash.com/", "icon": "https://lodash.com/icon.svg", "license": "MIT", "keywords": [ "lodash", "lodash-modularized", "stdlib", "util" ], "author": { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, "contributors": [ { "name": "John-David Dalton", "email": "john.david.dalton@gmail.com", "url": "http://allyoucanleet.com/" }, { "name": "Benjamin Tan", "email": "demoneaux@gmail.com", "url": "https://d10.github.io/" }, { "name": "Blaine Bublitz", "email": "blaine@iceddev.com", "url": "http://www.iceddev.com/" }, { "name": "Kit Cambridge", "email": "github@kitcambridge.be", "url": "http://kitcambridge.be/" }, { "name": "Mathias Bynens", "email": "mathias@qiwi.be", "url": "https://mathiasbynens.be/" } ], "repository": { "type": "git", "url": "git+https://github.com/lodash/lodash.git" }, "scripts": { "test": "echo \"See https://travis-ci.org/lodash/lodash-cli for testing details.\"" }, "dependencies": { "lodash._basetostring": "^3.0.0" }, "bugs": { "url": "https://github.com/lodash/lodash/issues" }, "_id": "lodash.repeat@3.0.1", "_shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "_from": "lodash.repeat@>=3.0.0 <4.0.0", "_npmVersion": "2.12.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jdalton", "email": 
"john.david.dalton@gmail.com" }, "maintainers": [ { "name": "jdalton", "email": "john.david.dalton@gmail.com" }, { "name": "d10", "email": "demoneaux@gmail.com" }, { "name": "kitcambridge", "email": "github@kitcambridge.be" }, { "name": "mathias", "email": "mathias@qiwi.be" }, { "name": "phated", "email": "blaine@iceddev.com" } ], "dist": { "shasum": "f4b98dc7ef67256ce61e7874e1865edb208e0edf", "tarball": "http://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/lodash.repeat/-/lodash.repeat-3.0.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/npmlog/node_modules/gauge/test/progress-bar.js0000644000000000000000000001257212631326456025142 0ustar 00000000000000"use strict" var test = require("tap").test var ProgressBar = require("../progress-bar.js") var cursor = [] var C var bar = new ProgressBar({theme: ProgressBar.ascii}, C = { show: function () { cursor.push(["show"]) return C }, hide: function () { cursor.push(["hide"]) return C }, up: function (lines) { cursor.push(["up",lines]) return C }, horizontalAbsolute: function (col) { cursor.push(["horizontalAbsolute", col]) return C }, eraseLine: function () { cursor.push(["eraseLine"]) return C }, write: function (line) { cursor.push(["write", line]) return C } }) function isOutput(t, msg, output) { var tests = [] for (var ii = 0; ii P | |----|\n' ], [ 'show' ] ]) }) test("window resizing", function (t) { t.plan(16) process.stderr.isTTY = true process.stdout.columns = 32 bar.show("NAME", 0.1) cursor = [] bar.last = new Date(0) bar.pulse() isOutput(t, "32 columns", [ [ 'up', 1 ], [ 'hide' ], [ 'horizontalAbsolute', 0 ], [ 'write', 'NAME / |##------------------|\n' ], [ 'show' ] ]) process.stdout.columns = 16 bar.show("NAME", 0.5) cursor = [] bar.last = new Date(0) bar.pulse() isOutput(t, "16 columns", [ [ 'up', 1 ], [ 'hide' ], [ 'horizontalAbsolute', 0 ], [ 'write', 'NAME - |##--|\n' ], [ 'show' ] ]); }); npm_3.5.2.orig/node_modules/npmlog/test/basic.js0000644000000000000000000002210312631326456020037 0ustar 00000000000000var tap = require('tap') var log = require('../') var result = [] var logEvents = [] var logInfoEvents = [] var logPrefixEvents = [] var util = require('util') var resultExpect = [ '\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[7msill\u001b[0m \u001b[0m\u001b[35msilly prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[34m\u001b[40mverb\u001b[0m \u001b[0m\u001b[35mverbose prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32minfo\u001b[0m \u001b[0m\u001b[35minfo prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32m\u001b[40mhttp\u001b[0m \u001b[0m\u001b[35mhttp prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[30m\u001b[43mWARN\u001b[0m \u001b[0m\u001b[35mwarn prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35merror prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32minfo\u001b[0m \u001b[0m\u001b[35minfo prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[32m\u001b[40mhttp\u001b[0m \u001b[0m\u001b[35mhttp prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m 
\u001b[0m\u001b[30m\u001b[43mWARN\u001b[0m \u001b[0m\u001b[35mwarn prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35merror prefix\u001b[0m x = {"foo":{"bar":"baz"}}\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m This is a longer\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m message, with some details\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m and maybe a stack.\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u001b[31m\u001b[40mERR!\u001b[0m \u001b[0m\u001b[35m404\u001b[0m \n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u0007noise\u001b[0m\u001b[35m\u001b[0m LOUD NOISES\n', '\u001b[0m\u001b[37m\u001b[40mnpm\u001b[0m \u001b[0m\u0007noise\u001b[0m \u001b[0m\u001b[35merror\u001b[0m erroring\n', '\u001b[0m' ] var logPrefixEventsExpect = [ { id: 2, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 9, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 16, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] } ] // should be the same. var logInfoEventsExpect = logPrefixEventsExpect var logEventsExpect = [ { id: 0, level: 'silly', prefix: 'silly prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 1, level: 'verbose', prefix: 'verbose prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 2, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 3, level: 'http', prefix: 'http prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 4, level: 'warn', prefix: 'warn prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 5, level: 'error', prefix: 'error prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 6, level: 'silent', prefix: 'silent prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 7, level: 'silly', prefix: 'silly prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 8, level: 'verbose', prefix: 'verbose prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 9, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 10, level: 'http', prefix: 'http prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 11, level: 'warn', prefix: 'warn prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 12, level: 'error', prefix: 'error prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 13, level: 'silent', prefix: 'silent prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 
'baz' } } ] }, { id: 14, level: 'silly', prefix: 'silly prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 15, level: 'verbose', prefix: 'verbose prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 16, level: 'info', prefix: 'info prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 17, level: 'http', prefix: 'http prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 18, level: 'warn', prefix: 'warn prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 19, level: 'error', prefix: 'error prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 20, level: 'silent', prefix: 'silent prefix', message: 'x = {"foo":{"bar":"baz"}}', messageRaw: [ 'x = %j', { foo: { bar: 'baz' } } ] }, { id: 21, level: 'error', prefix: '404', message: 'This is a longer\nmessage, with some details\nand maybe a stack.\n', messageRaw: [ 'This is a longer\nmessage, with some details\nand maybe a stack.\n' ] }, { id: 22, level: 'noise', prefix: false, message: 'LOUD NOISES', messageRaw: [ 'LOUD NOISES' ] }, { id: 23, level: 'noise', prefix: 'error', message: 'erroring', messageRaw: [ 'erroring' ] } ] var Stream = require('stream').Stream var s = new Stream() s.write = function (m) { result.push(m) } s.writable = true s.isTTY = true s.end = function () {} log.stream = s log.heading = 'npm' tap.test('basic', function (t) { log.on('log', logEvents.push.bind(logEvents)) log.on('log.info', logInfoEvents.push.bind(logInfoEvents)) log.on('info prefix', logPrefixEvents.push.bind(logPrefixEvents)) console.error('log.level=silly') log.level = 'silly' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=silent') log.level = 'silent' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) console.error('log.level=info') log.level = 'info' log.silly('silly prefix', 'x = %j', {foo:{bar:'baz'}}) log.verbose('verbose prefix', 'x = %j', {foo:{bar:'baz'}}) log.info('info prefix', 'x = %j', {foo:{bar:'baz'}}) log.http('http prefix', 'x = %j', {foo:{bar:'baz'}}) log.warn('warn prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('error prefix', 'x = %j', {foo:{bar:'baz'}}) log.silent('silent prefix', 'x = %j', {foo:{bar:'baz'}}) log.error('404', 'This is a longer\n'+ 'message, with some details\n'+ 'and maybe a stack.\n') log.addLevel('noise', 10000, {beep: true}) log.noise(false, 'LOUD NOISES') log.noise('error', 'erroring') t.deepEqual(result.join('').trim(), resultExpect.join('').trim(), 'result') t.deepEqual(log.record, logEventsExpect, 'record') t.deepEqual(logEvents, logEventsExpect, 'logEvents') t.deepEqual(logInfoEvents, logInfoEventsExpect, 
'logInfoEvents') t.deepEqual(logPrefixEvents, logPrefixEventsExpect, 'logPrefixEvents') t.end() }) npm_3.5.2.orig/node_modules/npmlog/test/progress.js0000644000000000000000000000621612631326456020631 0ustar 00000000000000'use strict' var test = require('tap').test var log = require('../log.js') var actions = [] log.gauge = { enable: function () { actions.push(['enable']) }, disable: function () { actions.push(['disable']) }, hide: function () { actions.push(['hide']) }, show: function (name, completed) { actions.push(['show', name, completed]) }, pulse: function (name) { actions.push(['pulse', name]) } } function didActions(t, msg, output) { var tests = [] for (var ii = 0; ii < output.length; ++ ii) { for (var jj = 0; jj < output[ii].length; ++ jj) { tests.push({cmd: ii, arg: jj}) } } t.is(actions.length, output.length, msg) tests.forEach(function (test) { t.is(actions[test.cmd] ? actions[test.cmd][test.arg] : null, output[test.cmd][test.arg], msg + ': ' + output[test.cmd] + (test.arg ? ' arg #'+test.arg : '')) }) actions = [] } test('enableProgress', function (t) { t.plan(6) log.enableProgress() didActions(t, 'enableProgress', [ [ 'enable' ], [ 'show', undefined, 0 ] ]) log.enableProgress() didActions(t, 'enableProgress again', []) }) test('disableProgress', function (t) { t.plan(4) log.disableProgress() didActions(t, 'disableProgress', [ [ 'hide' ], [ 'disable' ] ]) log.disableProgress() didActions(t, 'disableProgress again', []) }) test('showProgress', function (t) { t.plan(5) log.showProgress('foo') didActions(t, 'showProgress disabled', []) log.enableProgress() actions = [] log.showProgress('foo') didActions(t, 'showProgress', [ [ 'show', 'foo', 0 ] ]) }) test('clearProgress', function (t) { t.plan(3) log.clearProgress() didActions(t, 'clearProgress', [ [ 'hide' ] ]) log.disableProgress() actions = [] log.clearProgress() didActions(t, 'clearProgress disabled', [ ]) }) test("newItem", function (t) { t.plan(12) log.enableProgress() actions = [] var a = log.newItem("test", 10) didActions(t, "newItem", [ [ 'show', undefined, 0 ] ]) a.completeWork(5) didActions(t, "newItem:completeWork", [ [ 'show', 'test', 0.5 ] ]) a.finish() didActions(t, "newItem:finish", [ [ 'show', 'test', 1 ] ]) }) // test that log objects proxy through. And test that completion status filters up test("newGroup", function (t) { t.plan(23) var a = log.newGroup("newGroup") didActions(t, "newGroup", [ [ 'show', undefined, 0.5 ] ]) a.warn("test", "this is a test") didActions(t, "newGroup:warn", [ [ 'pulse', 'test' ], [ 'hide' ], [ 'show', undefined, 0.5 ] ]) var b = a.newItem("newGroup2", 10) didActions(t, "newGroup:newItem", [ [ 'show', 'newGroup', 0.5 ] ]) b.completeWork(5) didActions(t, "newGroup:completeWork", [ [ 'show', 'newGroup2', 0.75 ] ]) a.finish() didActions(t, "newGroup:finish", [ [ 'show', 'newGroup', 1 ] ]) }) test("newStream", function (t) { t.plan(13) var a = log.newStream("newStream", 10) didActions(t, "newStream", [ [ 'show', undefined, 0.6666666666666666 ] ]) a.write("abcde") didActions(t, "newStream", [ [ 'show', 'newStream', 0.8333333333333333 ] ]) a.write("fghij") didActions(t, "newStream", [ [ 'show', 'newStream', 1 ] ]) t.is(log.tracker.completed(), 1, "Overall completion") }) npm_3.5.2.orig/node_modules/once/LICENSE0000644000000000000000000000137512631326456016106 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. 
Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/once/README.md0000644000000000000000000000176412631326456016362 0ustar 00000000000000# once Only call a function once. ## usage ```javascript var once = require('once') function load (file, cb) { cb = once(cb) loader.load('file') loader.once('load', cb) loader.once('error', cb) } ``` Or add to the Function.prototype in a responsible way: ```javascript // only has to be done once require('once').proto() function load (file, cb) { cb = cb.once() loader.load('file') loader.once('load', cb) loader.once('error', cb) } ``` Ironically, the prototype feature makes this module twice as complicated as necessary. To check whether your function has been called, use `fn.called`. Once the function is called for the first time the return value of the original function is saved in `fn.value` and subsequent calls will continue to return this value. ```javascript var once = require('once') function load (cb) { cb = once(cb) var stream = createStream() stream.once('data', cb) stream.once('end', function () { if (!cb.called) cb(new Error('not found')) }) } ``` npm_3.5.2.orig/node_modules/once/once.js0000644000000000000000000000064112631326456016356 0ustar 00000000000000var wrappy = require('wrappy') module.exports = wrappy(once) once.proto = once(function () { Object.defineProperty(Function.prototype, 'once', { value: function () { return once(this) }, configurable: true }) }) function once (fn) { var f = function () { if (f.called) return f.value f.called = true return f.value = fn.apply(this, arguments) } f.called = false return f } npm_3.5.2.orig/node_modules/once/package.json0000644000000000000000000000372612631326456017367 0ustar 00000000000000{ "name": "once", "version": "1.3.2", "description": "Run a function exactly one time", "main": "once.js", "directories": { "test": "test" }, "dependencies": { "wrappy": "1" }, "devDependencies": { "tap": "~0.3.0" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git://github.com/isaacs/once.git" }, "keywords": [ "once", "function", "one", "single" ], "author": { "name": "Isaac Z.
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "readme": "# once\n\nOnly call a function once.\n\n## usage\n\n```javascript\nvar once = require('once')\n\nfunction load (file, cb) {\n cb = once(cb)\n loader.load('file')\n loader.once('load', cb)\n loader.once('error', cb)\n}\n```\n\nOr add to the Function.prototype in a responsible way:\n\n```javascript\n// only has to be done once\nrequire('once').proto()\n\nfunction load (file, cb) {\n cb = cb.once()\n loader.load('file')\n loader.once('load', cb)\n loader.once('error', cb)\n}\n```\n\nIronically, the prototype feature makes this module twice as\ncomplicated as necessary.\n\nTo check whether you function has been called, use `fn.called`. Once the\nfunction is called for the first time the return value of the original\nfunction is saved in `fn.value` and subsequent calls will continue to\nreturn this value.\n\n```javascript\nvar once = require('once')\n\nfunction load (cb) {\n cb = once(cb)\n var stream = createStream()\n stream.once('data', cb)\n stream.once('end', function () {\n if (!cb.called) cb(new Error('not found'))\n })\n}\n```\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/once/issues" }, "homepage": "https://github.com/isaacs/once#readme", "_id": "once@1.3.2", "_shasum": "d8feeca93b039ec1dcdee7741c92bdac5e28081b", "_resolved": "https://registry.npmjs.org/once/-/once-1.3.2.tgz", "_from": "once@>=1.3.2 <1.4.0" } npm_3.5.2.orig/node_modules/once/test/0000755000000000000000000000000012631326456016052 5ustar 00000000000000npm_3.5.2.orig/node_modules/once/test/once.js0000644000000000000000000000070412631326456017335 0ustar 00000000000000var test = require('tap').test var once = require('../once.js') test('once', function (t) { var f = 0 function fn (g) { t.equal(f, 0) f ++ return f + g + this } fn.ownProperty = {} var foo = once(fn) t.equal(fn.ownProperty, foo.ownProperty) t.notOk(foo.called) for (var i = 0; i < 1E3; i++) { t.same(f, i === 0 ? 0 : 1) var g = foo.call(1, 1) t.ok(foo.called) t.same(g, 3) t.same(f, 1) } t.end() }) npm_3.5.2.orig/node_modules/opener/LICENSE.txt0000644000000000000000000000133612631326456017265 0ustar 00000000000000Copyright © 2012–2015 Domenic Denicola This work is free. You can redistribute it and/or modify it under the terms of the Do What The Fuck You Want To Public License, Version 2, as published by Sam Hocevar. See below for more details. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE Version 2, December 2004 Copyright (C) 2004 Sam Hocevar Everyone is permitted to copy and distribute verbatim or modified copies of this license document, and changing it is allowed as long as the name is changed. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. You just DO WHAT THE FUCK YOU WANT TO. npm_3.5.2.orig/node_modules/opener/README.md0000644000000000000000000000242512631326456016721 0ustar 00000000000000# It Opens Stuff That is, in your desktop environment. 
This will make *actual windows pop up*, with stuff in them: ```bash npm install opener -g opener http://google.com opener ./my-file.txt opener firefox opener npm run lint ``` Also if you want to use it programmatically you can do that too: ```js var opener = require("opener"); opener("http://google.com"); opener("./my-file.txt"); opener("firefox"); opener("npm run lint"); ``` Plus, it returns the child process created, so you can do things like let your script exit while the window stays open: ```js var editor = opener("documentation.odt"); editor.unref(); // These other unrefs may be necessary if your OS's opener process // exits before the process it started is complete. editor.stdin.unref(); editor.stdout.unref(); editor.stderr.unref(); ``` ## Use It for Good Like opening the user's browser with a test harness in your package's test script: ```json { "scripts": { "test": "opener ./test/runner.html" }, "devDependencies": { "opener": "*" } } ``` ## Why Because Windows has `start`, Macs have `open`, and *nix has `xdg-open`. At least [according to some guy on StackOverflow](http://stackoverflow.com/q/1480971/3191). And I like things that work on all three. Like Node.js. And Opener. npm_3.5.2.orig/node_modules/opener/opener.js0000755000000000000000000000410212631326456017265 0ustar 00000000000000#!/usr/bin/env node "use strict"; var childProcess = require("child_process"); function opener(args, options, callback) { // http://stackoverflow.com/q/1480971/3191, but see below for Windows. var command = process.platform === "win32" ? "cmd" : process.platform === "darwin" ? "open" : "xdg-open"; if (typeof args === "string") { args = [args]; } if (typeof options === "function") { callback = options; options = {}; } if (options && typeof options === "object" && options.command) { if (process.platform === "win32") { // *always* use cmd on windows args = [options.command].concat(args); } else { command = options.command; } } if (process.platform === "win32") { // On Windows, we really want to use the "start" command. But, the rules regarding arguments with spaces, and // escaping them with quotes, can get really arcane. So the easiest way to deal with this is to pass off the // responsibility to "cmd /c", which has that logic built in. // // Furthermore, if "cmd /c" double-quoted the first parameter, then "start" will interpret it as a window title, // so we need to add a dummy empty-string window title: http://stackoverflow.com/a/154090/3191 // // Additionally, on Windows ampersand needs to be escaped when passed to "start" args = args.map(function(value) { return value.replace(/&/g, '^&'); }); args = ["/c", "start", '""'].concat(args); } return childProcess.execFile(command, args, options, callback); } // Export `opener` for programmatic access. // You might use this to e.g. open a website: `opener("http://google.com")` module.exports = opener; // If we're being called from the command line, just execute, using the command-line arguments. 
if (require.main && require.main.id === module.id) { opener(process.argv.slice(2), function (error) { if (error) { throw error; } }); } npm_3.5.2.orig/node_modules/opener/package.json0000644000000000000000000000253712631326456017734 0ustar 00000000000000{ "name": "opener", "description": "Opens stuff, like webpages and files and executables, cross-platform", "version": "1.4.1", "author": { "name": "Domenic Denicola", "email": "d@domenic.me", "url": "https://domenic.me/" }, "license": "WTFPL", "repository": { "type": "git", "url": "git+https://github.com/domenic/opener.git" }, "main": "opener.js", "bin": { "opener": "opener.js" }, "files": [ "opener.js" ], "scripts": { "lint": "jshint opener.js" }, "devDependencies": { "jshint": "^2.6.3" }, "gitHead": "d0ee95b19951703462fa593baa16e81fdff7827c", "bugs": { "url": "https://github.com/domenic/opener/issues" }, "homepage": "https://github.com/domenic/opener", "_id": "opener@1.4.1", "_shasum": "897590acd1aed3311b703b58bccb4d43f56f2895", "_from": "opener@>=1.4.1 <1.5.0", "_npmVersion": "2.7.0", "_nodeVersion": "1.5.1", "_npmUser": { "name": "domenic", "email": "d@domenic.me" }, "maintainers": [ { "name": "domenic", "email": "domenic@domenicdenicola.com" } ], "dist": { "shasum": "897590acd1aed3311b703b58bccb4d43f56f2895", "tarball": "http://registry.npmjs.org/opener/-/opener-1.4.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/opener/-/opener-1.4.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/osenv/.npmignore0000644000000000000000000000014612631326456017301 0ustar 00000000000000*.swp .*.swp .DS_Store *~ .project .settings npm-debug.log coverage.html .idea lib-cov node_modules npm_3.5.2.orig/node_modules/osenv/.travis.yml0000644000000000000000000000020612631326456017410 0ustar 00000000000000language: node_js language: node_js node_js: - '0.8' - '0.10' - '0.12' - 'iojs' before_install: - npm install -g npm@latest npm_3.5.2.orig/node_modules/osenv/LICENSE0000644000000000000000000000137512631326456016314 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/osenv/README.md0000644000000000000000000000267412631326456016571 0ustar 00000000000000# osenv Look up environment settings specific to different operating systems. ## Usage ```javascript var osenv = require('osenv') var path = osenv.path() var user = osenv.user() // etc. // Some things are not reliably in the env, and have a fallback command: var h = osenv.hostname(function (er, hostname) { h = hostname }) // This will still cause it to be memoized, so calling osenv.hostname() // is now an immediate operation. // You can always send a cb, which will get called in the nextTick // if it's been memoized, or wait for the fallback data if it wasn't // found in the environment. 
osenv.hostname(function (er, hostname) { if (er) console.error('error looking up hostname') else console.log('this machine calls itself %s', hostname) }) ``` ## osenv.hostname() The machine name. Calls `hostname` if not found. ## osenv.user() The currently logged-in user. Calls `whoami` if not found. ## osenv.prompt() Either PS1 on unix, or PROMPT on Windows. ## osenv.tmpdir() The place where temporary files should be created. ## osenv.home() No place like it. ## osenv.path() An array of the places that the operating system will search for executables. ## osenv.editor() Return the executable name of the editor program. This uses the EDITOR and VISUAL environment variables, and falls back to `vi` on Unix, or `notepad.exe` on Windows. ## osenv.shell() The SHELL on Unix, which Windows calls the ComSpec. Defaults to 'bash' or 'cmd'. npm_3.5.2.orig/node_modules/osenv/node_modules/0000755000000000000000000000000012631326456017756 5ustar 00000000000000npm_3.5.2.orig/node_modules/osenv/osenv.js0000644000000000000000000000351012631326456016770 0ustar 00000000000000var isWindows = process.platform === 'win32' var path = require('path') var exec = require('child_process').exec var osTmpdir = require('os-tmpdir') var osHomedir = require('os-homedir') // looking up envs is a bit costly. // Also, sometimes we want to have a fallback // Pass in a callback to wait for the fallback on failures // After the first lookup, always returns the same thing. function memo (key, lookup, fallback) { var fell = false var falling = false exports[key] = function (cb) { var val = lookup() if (!val && !fell && !falling && fallback) { fell = true falling = true exec(fallback, function (er, output, stderr) { falling = false if (er) return // oh well, we tried val = output.trim() }) } exports[key] = function (cb) { if (cb) process.nextTick(cb.bind(null, null, val)) return val } if (cb && !falling) process.nextTick(cb.bind(null, null, val)) return val } } memo('user', function () { return ( isWindows ? process.env.USERDOMAIN + '\\' + process.env.USERNAME : process.env.USER ) }, 'whoami') memo('prompt', function () { return isWindows ? process.env.PROMPT : process.env.PS1 }) memo('hostname', function () { return isWindows ? process.env.COMPUTERNAME : process.env.HOSTNAME }, 'hostname') memo('tmpdir', function () { return osTmpdir() }) memo('home', function () { return osHomedir() }) memo('path', function () { return (process.env.PATH || process.env.Path || process.env.path).split(isWindows ? ';' : ':') }) memo('editor', function () { return process.env.EDITOR || process.env.VISUAL || (isWindows ? 'notepad.exe' : 'vi') }) memo('shell', function () { return isWindows ? process.env.ComSpec || 'cmd' : process.env.SHELL || 'bash' }) npm_3.5.2.orig/node_modules/osenv/package.json0000644000000000000000000000504612631326456017574 0ustar 00000000000000{ "name": "osenv", "version": "0.1.3", "main": "osenv.js", "directories": { "test": "test" }, "dependencies": { "os-homedir": "^1.0.0", "os-tmpdir": "^1.0.0" }, "devDependencies": { "tap": "^1.2.0" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/npm/osenv.git" }, "keywords": [ "environment", "variable", "home", "tmpdir", "path", "prompt", "ps1" ], "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "description": "Look up environment settings specific to different operating systems", "readme": "# osenv\n\nLook up environment settings specific to different operating systems.\n\n## Usage\n\n```javascript\nvar osenv = require('osenv')\nvar path = osenv.path()\nvar user = osenv.user()\n// etc.\n\n// Some things are not reliably in the env, and have a fallback command:\nvar h = osenv.hostname(function (er, hostname) {\n h = hostname\n})\n// This will still cause it to be memoized, so calling osenv.hostname()\n// is now an immediate operation.\n\n// You can always send a cb, which will get called in the nextTick\n// if it's been memoized, or wait for the fallback data if it wasn't\n// found in the environment.\nosenv.hostname(function (er, hostname) {\n if (er) console.error('error looking up hostname')\n else console.log('this machine calls itself %s', hostname)\n})\n```\n\n## osenv.hostname()\n\nThe machine name. Calls `hostname` if not found.\n\n## osenv.user()\n\nThe currently logged-in user. Calls `whoami` if not found.\n\n## osenv.prompt()\n\nEither PS1 on unix, or PROMPT on Windows.\n\n## osenv.tmpdir()\n\nThe place where temporary files should be created.\n\n## osenv.home()\n\nNo place like it.\n\n## osenv.path()\n\nAn array of the places that the operating system will search for\nexecutables.\n\n## osenv.editor() \n\nReturn the executable name of the editor program. This uses the EDITOR\nand VISUAL environment variables, and falls back to `vi` on Unix, or\n`notepad.exe` on Windows.\n\n## osenv.shell()\n\nThe SHELL on Unix, which Windows calls the ComSpec. Defaults to 'bash'\nor 'cmd'.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/npm/osenv/issues" }, "homepage": "https://github.com/npm/osenv#readme", "_id": "osenv@0.1.3", "_shasum": "83cf05c6d6458fc4d5ac6362ea325d92f2754217", "_resolved": "https://registry.npmjs.org/osenv/-/osenv-0.1.3.tgz", "_from": "osenv@>=0.1.3 <0.2.0" } npm_3.5.2.orig/node_modules/osenv/test/0000755000000000000000000000000012631326456016260 5ustar 00000000000000npm_3.5.2.orig/node_modules/osenv/x.tap0000644000000000000000000000167112631326456016263 0ustar 00000000000000TAP version 13 # Subtest: test/unix.js TAP version 13 # Subtest: basic unix sanity test ok 1 - should be equal ok 2 - should be equal ok 3 - should be equal ok 4 - should be equivalent ok 5 - should be equal ok 6 - should be equal ok 7 - should be equal ok 8 - should be equal ok 9 - should be equal ok 10 - should be equal ok 11 - should be equal ok 12 - should be equal ok 13 - should be equal ok 14 - should be equal 1..14 ok 1 - basic unix sanity test # time=10.712ms 1..1 # time=18.422ms ok 1 - test/unix.js # time=169.827ms # Subtest: test/windows.js TAP version 13 1..0 # Skip windows tests, this is not windows ok 2 - test/windows.js # SKIP Skip windows tests, this is not windows # Subtest: test/nada.js TAP version 13 1..0 ok 2 - test/nada.js 1..3 # time=274.247ms npm_3.5.2.orig/node_modules/osenv/node_modules/os-homedir/0000755000000000000000000000000012631326456022024 5ustar 00000000000000npm_3.5.2.orig/node_modules/osenv/node_modules/os-tmpdir/0000755000000000000000000000000012631326456021674 5ustar 00000000000000npm_3.5.2.orig/node_modules/osenv/node_modules/os-homedir/index.js0000644000000000000000000000114012631326456023465 0ustar 00000000000000'use strict'; var os = require('os'); function homedir() { var env = process.env; var home = env.HOME; var user = env.LOGNAME || env.USER 
|| env.LNAME || env.USERNAME; if (process.platform === 'win32') { return env.USERPROFILE || env.HOMEDRIVE + env.HOMEPATH || home || null; } if (process.platform === 'darwin') { return home || (user ? '/Users/' + user : null); } if (process.platform === 'linux') { return home || (process.getuid() === 0 ? '/root' : (user ? '/home/' + user : null)); } return home || null; } module.exports = typeof os.homedir === 'function' ? os.homedir : homedir; npm_3.5.2.orig/node_modules/osenv/node_modules/os-homedir/license0000644000000000000000000000213712631326456023374 0ustar 00000000000000The MIT License (MIT) Copyright (c) Sindre Sorhus (sindresorhus.com) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/osenv/node_modules/os-homedir/package.json0000644000000000000000000000363112631326456024315 0ustar 00000000000000{ "name": "os-homedir", "version": "1.0.1", "description": "io.js 2.3.0 os.homedir() ponyfill", "license": "MIT", "repository": { "type": "git", "url": "git+https://github.com/sindresorhus/os-homedir.git" }, "author": { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "sindresorhus.com" }, "engines": { "node": ">=0.10.0" }, "scripts": { "test": "node test.js" }, "files": [ "index.js" ], "keywords": [ "built-in", "core", "ponyfill", "polyfill", "shim", "os", "homedir", "home", "dir", "directory", "folder", "user", "path" ], "devDependencies": { "ava": "0.0.4", "path-exists": "^1.0.0" }, "readme": "# os-homedir [![Build Status](https://travis-ci.org/sindresorhus/os-homedir.svg?branch=master)](https://travis-ci.org/sindresorhus/os-homedir)\n\n> io.js 2.3.0 [`os.homedir()`](https://iojs.org/api/os.html#os_os_homedir) ponyfill\n\n> Ponyfill: A polyfill that doesn't overwrite the native method\n\n\n## Install\n\n```\n$ npm install --save os-homedir\n```\n\n\n## Usage\n\n```js\nvar osHomedir = require('os-homedir');\n\nconsole.log(osHomedir());\n//=> /Users/sindresorhus\n```\n\n\n## Related\n\n- [user-home](https://github.com/sindresorhus/user-home) - Same as this module but caches the result\n- [home-or-tmp](https://github.com/sindresorhus/home-or-tmp) - Get the user home directory with fallback to the system temp directory\n\n\n## License\n\nMIT © [Sindre Sorhus](http://sindresorhus.com)\n", "readmeFilename": "readme.md", "bugs": { "url": "https://github.com/sindresorhus/os-homedir/issues" }, "homepage": "https://github.com/sindresorhus/os-homedir#readme", "_id": "os-homedir@1.0.1", "_shasum": "0d62bdf44b916fd3bbdcf2cab191948fb094f007", "_resolved": 
"https://registry.npmjs.org/os-homedir/-/os-homedir-1.0.1.tgz", "_from": "os-homedir@>=1.0.0 <2.0.0" } npm_3.5.2.orig/node_modules/osenv/node_modules/os-homedir/readme.md0000644000000000000000000000140312631326456023601 0ustar 00000000000000# os-homedir [![Build Status](https://travis-ci.org/sindresorhus/os-homedir.svg?branch=master)](https://travis-ci.org/sindresorhus/os-homedir) > io.js 2.3.0 [`os.homedir()`](https://iojs.org/api/os.html#os_os_homedir) ponyfill > Ponyfill: A polyfill that doesn't overwrite the native method ## Install ``` $ npm install --save os-homedir ``` ## Usage ```js var osHomedir = require('os-homedir'); console.log(osHomedir()); //=> /Users/sindresorhus ``` ## Related - [user-home](https://github.com/sindresorhus/user-home) - Same as this module but caches the result - [home-or-tmp](https://github.com/sindresorhus/home-or-tmp) - Get the user home directory with fallback to the system temp directory ## License MIT © [Sindre Sorhus](http://sindresorhus.com) npm_3.5.2.orig/node_modules/osenv/node_modules/os-tmpdir/index.js0000644000000000000000000000107512631326456023344 0ustar 00000000000000'use strict'; var isWindows = process.platform === 'win32'; var trailingSlashRe = isWindows ? /[^:]\\$/ : /.\/$/; // https://github.com/nodejs/io.js/blob/3e7a14381497a3b73dda68d05b5130563cdab420/lib/os.js#L25-L43 module.exports = function () { var path; if (isWindows) { path = process.env.TEMP || process.env.TMP || (process.env.SystemRoot || process.env.windir) + '\\temp'; } else { path = process.env.TMPDIR || process.env.TMP || process.env.TEMP || '/tmp'; } if (trailingSlashRe.test(path)) { path = path.slice(0, -1); } return path; }; npm_3.5.2.orig/node_modules/osenv/node_modules/os-tmpdir/license0000644000000000000000000000213712631326456023244 0ustar 00000000000000The MIT License (MIT) Copyright (c) Sindre Sorhus (sindresorhus.com) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
npm_3.5.2.orig/node_modules/osenv/node_modules/os-tmpdir/package.json0000644000000000000000000000370712631326456024171 0ustar 00000000000000{ "name": "os-tmpdir", "version": "1.0.1", "description": "Node.js os.tmpdir() ponyfill", "license": "MIT", "repository": { "type": "git", "url": "git+https://github.com/sindresorhus/os-tmpdir.git" }, "author": { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "sindresorhus.com" }, "engines": { "node": ">=0.10.0" }, "scripts": { "test": "node test.js" }, "files": [ "index.js" ], "keywords": [ "built-in", "core", "ponyfill", "polyfill", "shim", "os", "tmpdir", "tempdir", "tmp", "temp", "dir", "directory", "env", "environment" ], "devDependencies": { "ava": "0.0.4" }, "readme": "# os-tmpdir [![Build Status](https://travis-ci.org/sindresorhus/os-tmpdir.svg?branch=master)](https://travis-ci.org/sindresorhus/os-tmpdir)\n\n> Node.js [`os.tmpdir()`](https://nodejs.org/api/os.html#os_os_tmpdir) ponyfill\n\n> Ponyfill: A polyfill that doesn't overwrite the native method\n\nUse this instead of `require('os').tmpdir()` to get a consistent behaviour on different Node.js versions (even 0.8).\n\n*This is actually taken from io.js 2.0.2 as it contains some fixes that haven't bubbled up to Node.js yet.*\n\n\n## Install\n\n```\n$ npm install --save os-tmpdir\n```\n\n\n## Usage\n\n```js\nvar osTmpdir = require('os-tmpdir');\n\nosTmpdir();\n//=> /var/folders/m3/5574nnhn0yj488ccryqr7tc80000gn/T\n```\n\n\n## API\n\nSee the [`os.tmpdir()` docs](https://nodejs.org/api/os.html#os_os_tmpdir).\n\n\n## License\n\nMIT © [Sindre Sorhus](http://sindresorhus.com)\n", "readmeFilename": "readme.md", "bugs": { "url": "https://github.com/sindresorhus/os-tmpdir/issues" }, "homepage": "https://github.com/sindresorhus/os-tmpdir#readme", "_id": "os-tmpdir@1.0.1", "_shasum": "e9b423a1edaf479882562e92ed71d7743a071b6e", "_resolved": "https://registry.npmjs.org/os-tmpdir/-/os-tmpdir-1.0.1.tgz", "_from": "os-tmpdir@>=1.0.0 <2.0.0" } npm_3.5.2.orig/node_modules/osenv/node_modules/os-tmpdir/readme.md0000644000000000000000000000150712631326456023456 0ustar 00000000000000# os-tmpdir [![Build Status](https://travis-ci.org/sindresorhus/os-tmpdir.svg?branch=master)](https://travis-ci.org/sindresorhus/os-tmpdir) > Node.js [`os.tmpdir()`](https://nodejs.org/api/os.html#os_os_tmpdir) ponyfill > Ponyfill: A polyfill that doesn't overwrite the native method Use this instead of `require('os').tmpdir()` to get a consistent behaviour on different Node.js versions (even 0.8). *This is actually taken from io.js 2.0.2 as it contains some fixes that haven't bubbled up to Node.js yet.* ## Install ``` $ npm install --save os-tmpdir ``` ## Usage ```js var osTmpdir = require('os-tmpdir'); osTmpdir(); //=> /var/folders/m3/5574nnhn0yj488ccryqr7tc80000gn/T ``` ## API See the [`os.tmpdir()` docs](https://nodejs.org/api/os.html#os_os_tmpdir). ## License MIT © [Sindre Sorhus](http://sindresorhus.com) npm_3.5.2.orig/node_modules/osenv/test/unix.js0000644000000000000000000000430712631326456017605 0ustar 00000000000000// only run this test on unix // pretending to be another platform is too hacky, since it breaks // how the underlying system looks up module paths and runs // child processes, and all that stuff is cached.
if (process.platform === 'win32') { console.log('TAP Version 13\n' + '1..0\n' + '# Skip unix tests, this is not unix\n') return } var tap = require('tap') // like unix, but funny process.env.USER = 'sirUser' process.env.HOME = '/home/sirUser' process.env.HOSTNAME = 'my-machine' process.env.TMPDIR = '/tmpdir' process.env.TMP = '/tmp' process.env.TEMP = '/temp' process.env.PATH = '/opt/local/bin:/usr/local/bin:/usr/bin/:bin' process.env.PS1 = '(o_o) $ ' process.env.EDITOR = 'edit' process.env.VISUAL = 'visualedit' process.env.SHELL = 'zsh' tap.test('basic unix sanity test', function (t) { var osenv = require('../osenv.js') t.equal(osenv.user(), process.env.USER) t.equal(osenv.home(), process.env.HOME) t.equal(osenv.hostname(), process.env.HOSTNAME) t.same(osenv.path(), process.env.PATH.split(':')) t.equal(osenv.prompt(), process.env.PS1) t.equal(osenv.tmpdir(), process.env.TMPDIR) // mildly evil, but it's for a test. process.env.TMPDIR = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') t.equal(osenv.tmpdir(), process.env.TMP) process.env.TMP = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') t.equal(osenv.tmpdir(), process.env.TEMP) process.env.TEMP = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') osenv.home = function () { return null } t.equal(osenv.tmpdir(), '/tmp') t.equal(osenv.editor(), 'edit') process.env.EDITOR = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') t.equal(osenv.editor(), 'visualedit') process.env.VISUAL = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') t.equal(osenv.editor(), 'vi') t.equal(osenv.shell(), 'zsh') process.env.SHELL = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') t.equal(osenv.shell(), 'bash') t.end() }) npm_3.5.2.orig/node_modules/osenv/test/windows.js0000644000000000000000000000463412631326456020317 0ustar 00000000000000// only run this test on windows // pretending to be another platform is too hacky, since it breaks // how the underlying system looks up module paths and runs // child processes, and all that stuff is cached. if (process.platform !== 'win32') { console.log('TAP version 13\n' + '1..0 # Skip windows tests, this is not windows\n') return } // load this before clubbing the platform name. var tap = require('tap') process.env.windir = 'c:\\windows' process.env.USERDOMAIN = 'some-domain' process.env.USERNAME = 'sirUser' process.env.USERPROFILE = 'C:\\Users\\sirUser' process.env.COMPUTERNAME = 'my-machine' process.env.TMPDIR = 'C:\\tmpdir' process.env.TMP = 'C:\\tmp' process.env.TEMP = 'C:\\temp' process.env.Path = 'C:\\Program Files\\;C:\\Binary Stuff\\bin' process.env.PROMPT = '(o_o) $ ' process.env.EDITOR = 'edit' process.env.VISUAL = 'visualedit' process.env.ComSpec = 'some-com' tap.test('basic windows sanity test', function (t) { var osenv = require('../osenv.js') t.equal(osenv.user(), process.env.USERDOMAIN + '\\' + process.env.USERNAME) t.equal(osenv.home(), process.env.USERPROFILE) t.equal(osenv.hostname(), process.env.COMPUTERNAME) t.same(osenv.path(), process.env.Path.split(';')) t.equal(osenv.prompt(), process.env.PROMPT) t.equal(osenv.tmpdir(), process.env.TMPDIR) // mildly evil, but it's for a test. 
process.env.TMPDIR = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') t.equal(osenv.tmpdir(), process.env.TMP) process.env.TMP = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') t.equal(osenv.tmpdir(), process.env.TEMP) process.env.TEMP = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') osenv.home = function () { return null } t.equal(osenv.tmpdir(), 'c:\\windows\\temp') t.equal(osenv.editor(), 'edit') process.env.EDITOR = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') t.equal(osenv.editor(), 'visualedit') process.env.VISUAL = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') t.equal(osenv.editor(), 'notepad.exe') t.equal(osenv.shell(), 'some-com') process.env.ComSpec = '' delete require.cache[require.resolve('../osenv.js')] var osenv = require('../osenv.js') t.equal(osenv.shell(), 'cmd') t.end() }) npm_3.5.2.orig/node_modules/path-is-inside/LICENSE.txt0000644000000000000000000000135512631326456020614 0ustar 00000000000000Copyright © 2013–2014 Domenic Denicola This work is free. You can redistribute it and/or modify it under the terms of the Do What The Fuck You Want To Public License, Version 2, as published by Sam Hocevar. See below for more details. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE Version 2, December 2004 Copyright (C) 2004 Sam Hocevar Everyone is permitted to copy and distribute verbatim or modified copies of this license document, and changing it is allowed as long as the name is changed. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. You just DO WHAT THE FUCK YOU WANT TO. npm_3.5.2.orig/node_modules/path-is-inside/README.md0000644000000000000000000000273712631326456020253 0ustar 00000000000000# Is This Path Inside This Other Path? It turns out this question isn't trivial to answer using Node's built-in path APIs. A naive `indexOf`-based solution will fail sometimes on Windows, which is case-insensitive (see e.g. [isaacs/npm#4214][]). You might then think to be clever with `path.resolve`, but you have to be careful to account for situations where the paths have different drive letters, or else you'll cause bugs like [isaacs/npm#4313][]. And let's not even get started on trailing slashes. The **path-is-inside** package will give you a robust, cross-platform way of detecting whether a given path is inside another path. ## Usage Pretty simple. First the path being tested; then the potential parent. Like so: ```js var pathIsInside = require("path-is-inside"); pathIsInside("/x/y/z", "/x/y") // true pathIsInside("/x/y", "/x/y/z") // false ``` ## OS-Specific Behavior Like Node's built-in path module, path-is-inside treats all file paths on Windows as case-insensitive, whereas it treats all file paths on *-nix operating systems as case-sensitive. Keep this in mind especially when working on a Mac, where, despite Node's defaults, the OS usually treats paths case-insensitively.
In practice, this means: ```js // On Windows pathIsInside("C:\\X\\Y\\Z", "C:\\x\\y") // true // On *-nix, including Mac OS X pathIsInside("/X/Y/Z", "/x/y") // false ``` [isaacs/npm#4214]: https://github.com/isaacs/npm/pull/4214 [isaacs/npm#4313]: https://github.com/isaacs/npm/issues/4313 npm_3.5.2.orig/node_modules/path-is-inside/lib/0000755000000000000000000000000012631326456017533 5ustar 00000000000000npm_3.5.2.orig/node_modules/path-is-inside/package.json0000644000000000000000000000506712631326456021263 0ustar 00000000000000{ "name": "path-is-inside", "description": "Tests whether one path is inside another path", "keywords": [ "path", "directory", "folder", "inside", "relative" ], "version": "1.0.1", "author": { "name": "Domenic Denicola", "email": "domenic@domenicdenicola.com", "url": "http://domenic.me" }, "license": "WTFPL", "repository": { "type": "git", "url": "git://github.com/domenic/path-is-inside.git" }, "bugs": { "url": "http://github.com/domenic/path-is-inside/issues" }, "main": "lib/path-is-inside.js", "scripts": { "test": "mocha", "lint": "jshint lib" }, "devDependencies": { "jshint": "~2.3.0", "mocha": "~1.15.1" }, "readme": "# Is This Path Inside This Other Path?\n\nIt turns out this question isn't trivial to answer using Node's built-in path APIs. A naive `indexOf`-based solution will fail sometimes on Windows, which is case-insensitive (see e.g. [isaacs/npm#4214][]). You might then think to be clever with `path.resolve`, but you have to be careful to account for situations whether the paths have different drive letters, or else you'll cause bugs like [isaacs/npm#4313][]. And let's not even get started on trailing slashes.\n\nThe **path-is-inside** package will give you a robust, cross-platform way of detecting whether a given path is inside another path.\n\n## Usage\n\nPretty simple. First the path being tested; then the potential parent. Like so:\n\n```js\nvar pathIsInside = require(\"path-is-inside\");\n\npathIsInside(\"/x/y/z\", \"/x/y\") // true\npathIsInside(\"/x/y\", \"/x/y/z\") // false\n```\n\n## OS-Specific Behavior\n\nLike Node's built-in path module, path-is-inside treats all file paths on Windows as case-insensitive, whereas it treats all file paths on *-nix operating systems as case-sensitive. Keep this in mind especially when working on a Mac, where, despite Node's defaults, the OS usually treats paths case-insensitively.\n\nIn practice, this means:\n\n```js\n// On Windows\n\npathIsInside(\"C:\\\\X\\\\Y\\\\Z\", \"C:\\\\x\\\\y\") // true\n\n// On *-nix, including Mac OS X\n\npathIsInside(\"/X/Y/Z\", \"/x/y\") // false\n```\n\n[isaacs/npm#4214]: https://github.com/isaacs/npm/pull/4214\n[isaacs/npm#4313]: https://github.com/isaacs/npm/issues/4313\n", "readmeFilename": "README.md", "homepage": "https://github.com/domenic/path-is-inside#readme", "_id": "path-is-inside@1.0.1", "_shasum": "98d8f1d030bf04bd7aeee4a1ba5485d40318fd89", "_resolved": "https://registry.npmjs.org/path-is-inside/-/path-is-inside-1.0.1.tgz", "_from": "path-is-inside@>=1.0.1 <1.1.0" } npm_3.5.2.orig/node_modules/path-is-inside/lib/path-is-inside.js0000644000000000000000000000153212631326456022710 0ustar 00000000000000"use strict"; var path = require("path"); module.exports = function (thePath, potentialParent) { // For inside-directory checking, we want to allow trailing slashes, so normalize. thePath = stripTrailingSep(thePath); potentialParent = stripTrailingSep(potentialParent); // Node treats only Windows as case-insensitive in its path module; we follow those conventions. 
if (process.platform === "win32") { thePath = thePath.toLowerCase(); potentialParent = potentialParent.toLowerCase(); } return thePath.lastIndexOf(potentialParent, 0) === 0 && ( thePath[potentialParent.length] === path.sep || thePath[potentialParent.length] === undefined ); }; function stripTrailingSep(thePath) { if (thePath[thePath.length - 1] === path.sep) { return thePath.slice(0, -1); } return thePath; } npm_3.5.2.orig/node_modules/read/LICENSE0000644000000000000000000000137512631326456016075 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/read/README.md0000644000000000000000000000327112631326456016344 0ustar 00000000000000## read For reading user input from stdin. Similar to the `readline` builtin's `question()` method, but with a few more features. ## USAGE ```javascript var read = require("read") read(options, callback) ``` The callback gets called with either the user input, or the default specified, or an error, as `callback(error, result, isDefault)` node style. ## OPTIONS Every option is optional. * `prompt` What to write to stdout before reading input. * `silent` Don't echo the output as the user types it. * `replace` Replace silenced characters with the supplied character value. * `timeout` Number of ms to wait for user input before giving up. * `default` The default value if the user enters nothing. * `edit` Allow the user to edit the default value. * `terminal` Treat the output as a TTY, whether it is or not. * `input` Readable stream to get input data from. (default `process.stdin`) * `output` Writeable stream to write prompts to. (default: `process.stdout`) If silent is true, and the input is a TTY, then read will set raw mode, and read character by character. ## COMPATIBILITY This module works sort of with node 0.6. It does not work with node versions less than 0.6. It is best on node 0.8. On node version 0.6, it will remove all listeners on the input stream's `data` and `keypress` events, because the readline module did not fully clean up after itself in that version of node, and did not make it possible to clean up after it in a way that has no potential for side effects. Additionally, some of the readline options (like `terminal`) will not function in versions of node before 0.8, because they were not implemented in the builtin readline module. ## CONTRIBUTING Patches welcome. 
npm_3.5.2.orig/node_modules/read/lib/0000755000000000000000000000000012631326456015630 5ustar 00000000000000npm_3.5.2.orig/node_modules/read/node_modules/0000755000000000000000000000000012631326456017537 5ustar 00000000000000npm_3.5.2.orig/node_modules/read/package.json0000644000000000000000000000517712631326456017362 0ustar 00000000000000{ "name": "read", "version": "1.0.7", "main": "lib/read.js", "dependencies": { "mute-stream": "~0.0.4" }, "devDependencies": { "tap": "^1.2.0" }, "engines": { "node": ">=0.8" }, "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "description": "read(1) for node programs", "repository": { "type": "git", "url": "git://github.com/isaacs/read.git" }, "license": "ISC", "scripts": { "test": "tap test/*.js" }, "files": [ "lib/read.js" ], "readme": "## read\n\nFor reading user input from stdin.\n\nSimilar to the `readline` builtin's `question()` method, but with a\nfew more features.\n\n## USAGE\n\n```javascript\nvar read = require(\"read\")\nread(options, callback)\n```\n\nThe callback gets called with either the user input, or the default\nspecified, or an error, as `callback(error, result, isDefault)`\nnode style.\n\n## OPTIONS\n\nEvery option is optional.\n\n* `prompt` What to write to stdout before reading input.\n* `silent` Don't echo the output as the user types it.\n* `replace` Replace silenced characters with the supplied character value.\n* `timeout` Number of ms to wait for user input before giving up.\n* `default` The default value if the user enters nothing.\n* `edit` Allow the user to edit the default value.\n* `terminal` Treat the output as a TTY, whether it is or not.\n* `input` Readable stream to get input data from. (default `process.stdin`)\n* `output` Writeable stream to write prompts to. (default: `process.stdout`)\n\nIf silent is true, and the input is a TTY, then read will set raw\nmode, and read character by character.\n\n## COMPATIBILITY\n\nThis module works sort of with node 0.6. It does not work with node\nversions less than 0.6. 
It is best on node 0.8.\n\nOn node version 0.6, it will remove all listeners on the input\nstream's `data` and `keypress` events, because the readline module did\nnot fully clean up after itself in that version of node, and did not\nmake it possible to clean up after it in a way that has no potential\nfor side effects.\n\nAdditionally, some of the readline options (like `terminal`) will not\nfunction in versions of node before 0.8, because they were not\nimplemented in the builtin readline module.\n\n## CONTRIBUTING\n\nPatches welcome.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/read/issues" }, "homepage": "https://github.com/isaacs/read#readme", "_id": "read@1.0.7", "_shasum": "b3da19bd052431a97671d44a42634adf710b40c4", "_resolved": "https://registry.npmjs.org/read/-/read-1.0.7.tgz", "_from": "read@>=1.0.7 <1.1.0" } npm_3.5.2.orig/node_modules/read/lib/read.js0000644000000000000000000000457712631326456017116 0ustar 00000000000000 module.exports = read var readline = require('readline') var Mute = require('mute-stream') function read (opts, cb) { if (opts.num) { throw new Error('read() no longer accepts a char number limit') } if (typeof opts.default !== 'undefined' && typeof opts.default !== 'string' && typeof opts.default !== 'number') { throw new Error('default value must be string or number') } var input = opts.input || process.stdin var output = opts.output || process.stdout var prompt = (opts.prompt || '').trim() + ' ' var silent = opts.silent var editDef = false var timeout = opts.timeout var def = opts.default || '' if (def) { if (silent) { prompt += '(
new BufferList([ callback ]) * bl.length * bl.append(buffer) * bl.get(index) * bl.slice([ start[, end ] ]) * bl.copy(dest, [ destStart, [ srcStart [, srcEnd ] ] ]) * bl.duplicate() * bl.consume(bytes) * bl.toString([encoding, [ start, [ end ]]]) * bl.readDoubleBE(), bl.readDoubleLE(), bl.readFloatBE(), bl.readFloatLE(), bl.readInt32BE(), bl.readInt32LE(), bl.readUInt32BE(), bl.readUInt32LE(), bl.readInt16BE(), bl.readInt16LE(), bl.readUInt16BE(), bl.readUInt16LE(), bl.readInt8(), bl.readUInt8() * Streams -------------------------------------------------------- ### new BufferList([ callback | buffer | buffer array ]) The constructor takes an optional callback; if supplied, the callback will be called with an error argument followed by the list's entire contents as a single `Buffer`, when `bl.end()` is called (i.e. from a piped stream). This is a convenient method of collecting the entire contents of a stream, particularly when the stream is *chunky*, such as a network stream. Normally, no arguments are required for the constructor, but you can initialise the list by passing in a single `Buffer` object or an array of `Buffer` objects. `new` is not strictly required; if you don't instantiate a new object, it will be done automatically for you so you can create a new instance simply with: ```js var bl = require('bl') var myinstance = bl() // equivalent to: var BufferList = require('bl') var myinstance = new BufferList() ``` -------------------------------------------------------- ### bl.length Get the length of the list in bytes. This is the sum of the lengths of all of the buffers contained in the list, minus any initial offset for a semi-consumed buffer at the beginning. Should accurately represent the total number of bytes that can be read from the list. -------------------------------------------------------- ### bl.append(buffer) `append(buffer)` adds an additional buffer or BufferList to the internal list. -------------------------------------------------------- ### bl.get(index) `get()` will return the byte at the specified index. -------------------------------------------------------- ### bl.slice([ start, [ end ] ]) `slice()` returns a new `Buffer` object containing the bytes within the range specified. Both `start` and `end` are optional and will default to the beginning and end of the list respectively. If the requested range spans a single internal buffer then a slice of that buffer will be returned which shares the original memory range of that Buffer. If the range spans multiple buffers then copy operations will likely occur to give you a uniform Buffer. -------------------------------------------------------- ### bl.copy(dest, [ destStart, [ srcStart [, srcEnd ] ] ]) `copy()` copies the content of the list into the `dest` buffer, starting from `destStart` and containing the bytes within the range specified with `srcStart` to `srcEnd`. `destStart`, `srcStart` and `srcEnd` are optional and will default to the beginning of the `dest` buffer, and the beginning and end of the list respectively. -------------------------------------------------------- ### bl.duplicate() `duplicate()` performs a **shallow-copy** of the list. The internal Buffers remain the same, so if you change the underlying Buffers, the change will be reflected in both the original and the duplicate.
This method is needed if you want to call `consume()` or `pipe()` and still keep the original list. Example: ```js var bl = new BufferList() bl.append('hello') bl.append(' world') bl.append('\n') bl.duplicate().pipe(process.stdout, { end: false }) console.log(bl.toString()) ``` -------------------------------------------------------- ### bl.consume(bytes) `consume()` will shift bytes *off the start of the list*. The number of bytes consumed doesn't need to line up with the sizes of the internal Buffers; initial offsets will be calculated accordingly in order to give you a consistent view of the data. -------------------------------------------------------- ### bl.toString([encoding, [ start, [ end ]]]) `toString()` will return a string representation of the buffer. The optional `start` and `end` arguments are passed on to `slice()`, while the `encoding` is passed on to `toString()` of the resulting Buffer. See the [Buffer#toString()](http://nodejs.org/docs/latest/api/buffer.html#buffer_buf_tostring_encoding_start_end) documentation for more information. -------------------------------------------------------- ### bl.readDoubleBE(), bl.readDoubleLE(), bl.readFloatBE(), bl.readFloatLE(), bl.readInt32BE(), bl.readInt32LE(), bl.readUInt32BE(), bl.readUInt32LE(), bl.readInt16BE(), bl.readInt16LE(), bl.readUInt16BE(), bl.readUInt16LE(), bl.readInt8(), bl.readUInt8() All of the standard byte-reading methods of the `Buffer` interface are implemented and will operate across internal Buffer boundaries transparently. See the [Buffer](http://nodejs.org/docs/latest/api/buffer.html) documentation for how these work. -------------------------------------------------------- ### Streams **bl** is a Node **[Duplex Stream](http://nodejs.org/docs/latest/api/stream.html#stream_class_stream_duplex)**, so it can be read from and written to like a standard Node stream. You can also `pipe()` to and from a **bl** instance. -------------------------------------------------------- ## Contributors **bl** is brought to you by the following hackers: * [Rod Vagg](https://github.com/rvagg) * [Matteo Collina](https://github.com/mcollina) * [Jarett Cruger](https://github.com/jcrugzz) ## License & copyright Copyright (c) 2013-2014 bl contributors (listed above). bl is licensed under the MIT license. All rights not explicitly granted in the MIT license are reserved. See the included LICENSE.md file for more details.
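Two short sketches of the patterns described above (the file name is illustrative only):

```js
var bl = require('bl')
var fs = require('fs')

// 1. Collect an entire stream: per bl.js below, the callback receives the
// concatenated contents as a single Buffer once the stream ends.
fs.createReadStream('some-file.txt')
  .pipe(bl(function (err, data) {
    if (err) throw err
    console.log(data.toString('utf8'))
  }))

// 2. The byte-reading methods work transparently across the internal
// Buffer boundaries.
var list = bl()
list.append(new Buffer([ 0x12 ]))
list.append(new Buffer([ 0x34, 0x56 ]))
console.log(list.readUInt16BE(0).toString(16)) //=> '1234', read across two buffers
```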
npm_3.5.2.orig/node_modules/request/node_modules/bl/bl.js0000644000000000000000000001152512631326456021650 0ustar 00000000000000var DuplexStream = require('readable-stream/duplex') , util = require('util') function BufferList (callback) { if (!(this instanceof BufferList)) return new BufferList(callback) this._bufs = [] this.length = 0 if (typeof callback == 'function') { this._callback = callback var piper = function (err) { if (this._callback) { this._callback(err) this._callback = null } }.bind(this) this.on('pipe', function (src) { src.on('error', piper) }) this.on('unpipe', function (src) { src.removeListener('error', piper) }) } else if (Buffer.isBuffer(callback)) this.append(callback) else if (Array.isArray(callback)) { callback.forEach(function (b) { Buffer.isBuffer(b) && this.append(b) }.bind(this)) } DuplexStream.call(this) } util.inherits(BufferList, DuplexStream) BufferList.prototype._offset = function (offset) { var tot = 0, i = 0, _t for (; i < this._bufs.length; i++) { _t = tot + this._bufs[i].length if (offset < _t) return [ i, offset - tot ] tot = _t } } BufferList.prototype.append = function (buf) { var isBuffer = Buffer.isBuffer(buf) || buf instanceof BufferList this._bufs.push(isBuffer ? buf : new Buffer(buf)) this.length += buf.length return this } BufferList.prototype._write = function (buf, encoding, callback) { this.append(buf) if (callback) callback() } BufferList.prototype._read = function (size) { if (!this.length) return this.push(null) size = Math.min(size, this.length) this.push(this.slice(0, size)) this.consume(size) } BufferList.prototype.end = function (chunk) { DuplexStream.prototype.end.call(this, chunk) if (this._callback) { this._callback(null, this.slice()) this._callback = null } } BufferList.prototype.get = function (index) { return this.slice(index, index + 1)[0] } BufferList.prototype.slice = function (start, end) { return this.copy(null, 0, start, end) } BufferList.prototype.copy = function (dst, dstStart, srcStart, srcEnd) { if (typeof srcStart != 'number' || srcStart < 0) srcStart = 0 if (typeof srcEnd != 'number' || srcEnd > this.length) srcEnd = this.length if (srcStart >= this.length) return dst || new Buffer(0) if (srcEnd <= 0) return dst || new Buffer(0) var copy = !!dst , off = this._offset(srcStart) , len = srcEnd - srcStart , bytes = len , bufoff = (copy && dstStart) || 0 , start = off[1] , l , i // copy/slice everything if (srcStart === 0 && srcEnd == this.length) { if (!copy) // slice, just return a full concat return Buffer.concat(this._bufs) // copy, need to copy individual buffers for (i = 0; i < this._bufs.length; i++) { this._bufs[i].copy(dst, bufoff) bufoff += this._bufs[i].length } return dst } // easy, cheap case where it's a subset of one of the buffers if (bytes <= this._bufs[off[0]].length - start) { return copy ? 
this._bufs[off[0]].copy(dst, dstStart, start, start + bytes) : this._bufs[off[0]].slice(start, start + bytes) } if (!copy) // a slice, we need something to copy in to dst = new Buffer(len) for (i = off[0]; i < this._bufs.length; i++) { l = this._bufs[i].length - start if (bytes > l) { this._bufs[i].copy(dst, bufoff, start) } else { this._bufs[i].copy(dst, bufoff, start, start + bytes) break } bufoff += l bytes -= l if (start) start = 0 } return dst } BufferList.prototype.toString = function (encoding, start, end) { return this.slice(start, end).toString(encoding) } BufferList.prototype.consume = function (bytes) { while (this._bufs.length) { if (bytes > this._bufs[0].length) { bytes -= this._bufs[0].length this.length -= this._bufs[0].length this._bufs.shift() } else { this._bufs[0] = this._bufs[0].slice(bytes) this.length -= bytes break } } return this } BufferList.prototype.duplicate = function () { var i = 0 , copy = new BufferList() for (; i < this._bufs.length; i++) copy.append(this._bufs[i]) return copy } BufferList.prototype.destroy = function () { this._bufs.length = 0; this.length = 0; this.push(null); } ;(function () { var methods = { 'readDoubleBE' : 8 , 'readDoubleLE' : 8 , 'readFloatBE' : 4 , 'readFloatLE' : 4 , 'readInt32BE' : 4 , 'readInt32LE' : 4 , 'readUInt32BE' : 4 , 'readUInt32LE' : 4 , 'readInt16BE' : 2 , 'readInt16LE' : 2 , 'readUInt16BE' : 2 , 'readUInt16LE' : 2 , 'readInt8' : 1 , 'readUInt8' : 1 } for (var m in methods) { (function (m) { BufferList.prototype[m] = function (offset) { return this.slice(offset, offset + methods[m])[m](0) } }(m)) } }()) module.exports = BufferList npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/0000755000000000000000000000000012631326456023366 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/package.json0000644000000000000000000000322212631326456023176 0ustar 00000000000000{ "name": "bl", "version": "1.0.0", "description": "Buffer List: collect buffers and access with a standard readable Buffer interface, streamable too!", "main": "bl.js", "scripts": { "test": "node test/test.js | faucet", "test-local": "brtapsauce-local test/basic-test.js" }, "repository": { "type": "git", "url": "git+https://github.com/rvagg/bl.git" }, "homepage": "https://github.com/rvagg/bl", "authors": [ "Rod Vagg (https://github.com/rvagg)", "Matteo Collina (https://github.com/mcollina)", "Jarett Cruger (https://github.com/jcrugzz)" ], "keywords": [ "buffer", "buffers", "stream", "awesomesauce" ], "license": "MIT", "dependencies": { "readable-stream": "~2.0.0" }, "devDependencies": { "tape": "~2.12.3", "hash_file": "~0.1.1", "faucet": "~0.0.1", "brtapsauce": "~0.3.0" }, "gitHead": "1794938be6697a6d1e02cd942a4eea59b353347a", "bugs": { "url": "https://github.com/rvagg/bl/issues" }, "_id": "bl@1.0.0", "_shasum": "ada9a8a89a6d7ac60862f7dec7db207873e0c3f5", "_from": "bl@>=1.0.0 <1.1.0", "_npmVersion": "2.9.0", "_nodeVersion": "2.0.1-nightly20150618d2e4e03444", "_npmUser": { "name": "rvagg", "email": "rod@vagg.org" }, "maintainers": [ { "name": "rvagg", "email": "rod@vagg.org" } ], "dist": { "shasum": "ada9a8a89a6d7ac60862f7dec7db207873e0c3f5", "tarball": "http://registry.npmjs.org/bl/-/bl-1.0.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/bl/-/bl-1.0.0.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/request/node_modules/bl/test/0000755000000000000000000000000012631326456021670 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/0000755000000000000000000000000012631326456026416 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/.npmignore0000644000000000000000000000004412631326456030413 0ustar 00000000000000build/ test/ examples/ fs.js zlib.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/.travis.yml0000644000000000000000000000264112631326456030532 0ustar 00000000000000sudo: false language: node_js before_install: - npm install -g npm notifications: email: false matrix: include: - node_js: '0.8' env: TASK=test - node_js: '0.10' env: TASK=test - node_js: '0.11' env: TASK=test - node_js: '0.12' env: TASK=test - node_js: 'iojs' env: TASK=test - node_js: 'iojs' env: TASK=browser BROWSER_NAME=opera BROWSER_VERSION="11..latest" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=ie BROWSER_VERSION="9..latest" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=chrome BROWSER_VERSION="39..beta" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=firefox BROWSER_VERSION="34..beta" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=ipad BROWSER_VERSION="6.0..latest" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=iphone BROWSER_VERSION="6.0..latest" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=safari BROWSER_VERSION="5..latest" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=android BROWSER_VERSION="4.0..latest" script: "npm run $TASK" env: global: - secure: rE2Vvo7vnjabYNULNyLFxOyt98BoJexDqsiOnfiD6kLYYsiQGfr/sbZkPMOFm9qfQG7pjqx+zZWZjGSswhTt+626C0t/njXqug7Yps4c3dFblzGfreQHp7wNX5TFsvrxd6dAowVasMp61sJcRnB2w8cUzoe3RAYUDHyiHktwqMc= - secure: g9YINaKAdMatsJ28G9jCGbSaguXCyxSTy+pBO6Ch0Cf57ZLOTka3HqDj8p3nV28LUIHZ3ut5WO43CeYKwt4AUtLpBS3a0dndHdY6D83uY6b2qh5hXlrcbeQTq2cvw2y95F7hm4D1kwrgZ7ViqaKggRcEupAL69YbJnxeUDKWEdI= npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/.zuul.yml0000644000000000000000000000001112631326456030206 0ustar 00000000000000ui: tape npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/LICENSE0000644000000000000000000000211012631326456027415 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/README.md0000644000000000000000000000361012631326456027675 0ustar 00000000000000# readable-stream ***Node-core streams for userland*** [![Build Status](https://travis-ci.org/nodejs/readable-stream.svg?branch=master)](https://travis-ci.org/nodejs/readable-stream) [![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/) [![NPM](https://nodei.co/npm-dl/readable-stream.png?&months=6&height=3)](https://nodei.co/npm/readable-stream/) [![Sauce Test Status](https://saucelabs.com/browser-matrix/readable-stream.svg)](https://saucelabs.com/u/readable-stream) ```bash npm install --save readable-stream ``` ***Node-core streams for userland*** This package is a mirror of the Streams2 and Streams3 implementations in Node-core, including [documentation](doc/stream.markdown). If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core, for background see [this blogpost](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html). As of version 2.0.0 **readable-stream** uses semantic versioning. # Streams WG Team Members * **Chris Dickinson** ([@chrisdickinson](https://github.com/chrisdickinson)) <christopher.s.dickinson@gmail.com> - Release GPG key: 9554F04D7259F04124DE6B476D5A82AC7E37093B * **Calvin Metcalf** ([@calvinmetcalf](https://github.com/calvinmetcalf)) <calvin.metcalf@gmail.com> - Release GPG key: F3EF5F62A87FC27A22E643F714CE4FF5015AA242 * **Rod Vagg** ([@rvagg](https://github.com/rvagg)) <rod@vagg.org> - Release GPG key: DD8F2338BAE7501E3DD5AC78C273792F7D83545D * **Sam Newman** ([@sonewman](https://github.com/sonewman)) <newmansam@outlook.com> * **Mathias Buus** ([@mafintosh](https://github.com/mafintosh)) <mathiasbuus@gmail.com> * **Domenic Denicola** ([@domenic](https://github.com/domenic)) <d@domenic.me> npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/doc/0000755000000000000000000000000012631326456027163 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/duplex.js0000644000000000000000000000006412631326456030255 0ustar 00000000000000module.exports = require("./lib/_stream_duplex.js") npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/0000755000000000000000000000000012631326456027164 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/0000755000000000000000000000000012631326456031073 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/package.json0000644000000000000000000000623512631326456030712 0ustar 00000000000000{ "name": "readable-stream", "version": "2.0.2", "description": "Streams3, a user-land copy of the stream library from iojs v2.x", "main": "readable.js", "dependencies": { "core-util-is": "~1.0.0", "inherits": "~2.0.1", "isarray": "0.0.1", "process-nextick-args": "~1.0.0", "string_decoder": "~0.10.x", "util-deprecate": "~1.0.1" }, "devDependencies": { "tap": "~0.2.6", "tape": "~4.0.0", "zuul": "~3.0.0" }, "scripts": { "test": "tap test/parallel/*.js", "browser": "zuul --browser-name $BROWSER_NAME --browser-version $BROWSER_VERSION -- test/browser.js" }, "repository": { "type": "git", "url": "git://github.com/nodejs/readable-stream.git" }, "keywords": [ 
"readable", "stream", "pipe" ], "browser": { "util": false }, "license": "MIT", "readme": "# readable-stream\n\n***Node-core streams for userland*** [![Build Status](https://travis-ci.org/nodejs/readable-stream.svg?branch=master)](https://travis-ci.org/nodejs/readable-stream)\n\n\n[![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/)\n[![NPM](https://nodei.co/npm-dl/readable-stream.png?&months=6&height=3)](https://nodei.co/npm/readable-stream/)\n\n\n[![Sauce Test Status](https://saucelabs.com/browser-matrix/readable-stream.svg)](https://saucelabs.com/u/readable-stream)\n\n```bash\nnpm install --save readable-stream\n```\n\n***Node-core streams for userland***\n\nThis package is a mirror of the Streams2 and Streams3 implementations in\nNode-core, including [documentation](doc/stream.markdown).\n\nIf you want to guarantee a stable streams base, regardless of what version of\nNode you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *\"stream\"* module in Node-core, for background see [this blogpost](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html).\n\nAs of version 2.0.0 **readable-stream** uses semantic versioning. \n\n# Streams WG Team Members\n\n* **Chris Dickinson** ([@chrisdickinson](https://github.com/chrisdickinson)) <christopher.s.dickinson@gmail.com>\n - Release GPG key: 9554F04D7259F04124DE6B476D5A82AC7E37093B\n* **Calvin Metcalf** ([@calvinmetcalf](https://github.com/calvinmetcalf)) <calvin.metcalf@gmail.com>\n - Release GPG key: F3EF5F62A87FC27A22E643F714CE4FF5015AA242\n* **Rod Vagg** ([@rvagg](https://github.com/rvagg)) <rod@vagg.org>\n - Release GPG key: DD8F2338BAE7501E3DD5AC78C273792F7D83545D\n* **Sam Newman** ([@sonewman](https://github.com/sonewman)) <newmansam@outlook.com>\n* **Mathias Buus** ([@mafintosh](https://github.com/mafintosh)) <mathiasbuus@gmail.com>\n* **Domenic Denicola** ([@domenic](https://github.com/domenic)) <d@domenic.me>\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/nodejs/readable-stream/issues" }, "homepage": "https://github.com/nodejs/readable-stream#readme", "_id": "readable-stream@2.0.2", "_shasum": "bec81beae8cf455168bc2e5b2b31f5bcfaed9b1b", "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.0.2.tgz", "_from": "readable-stream@>=2.0.0 <2.1.0" } npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/passthrough.js0000644000000000000000000000007112631326456031321 0ustar 00000000000000module.exports = require("./lib/_stream_passthrough.js") npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/readable.js0000644000000000000000000000101112631326456030504 0ustar 00000000000000var Stream = (function (){ try { return require('st' + 'ream'); // hack to fix a circular dependency issue when used with browserify } catch(_){} }()); exports = module.exports = require('./lib/_stream_readable.js'); exports.Stream = Stream || exports; exports.Readable = exports; exports.Writable = require('./lib/_stream_writable.js'); exports.Duplex = require('./lib/_stream_duplex.js'); exports.Transform = require('./lib/_stream_transform.js'); exports.PassThrough = require('./lib/_stream_passthrough.js'); npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/transform.js0000644000000000000000000000006712631326456030772 0ustar 00000000000000module.exports = require("./lib/_stream_transform.js") 
npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/writable.js0000644000000000000000000000006612631326456030567 0ustar 00000000000000module.exports = require("./lib/_stream_writable.js") npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/doc/stream.markdown0000644000000000000000000014777612631326456032250 0ustar 00000000000000# Stream Stability: 2 - Stable A stream is an abstract interface implemented by various objects in io.js. For example a [request to an HTTP server](https://iojs.org/dist/v2.3.0/doc/api/http.html#http_http_incomingmessage) is a stream, as is [stdout][]. Streams are readable, writable, or both. All streams are instances of [EventEmitter][] You can load the Stream base classes by doing `require('stream')`. There are base classes provided for [Readable][] streams, [Writable][] streams, [Duplex][] streams, and [Transform][] streams. This document is split up into 3 sections. The first explains the parts of the API that you need to be aware of to use streams in your programs. If you never implement a streaming API yourself, you can stop there. The second section explains the parts of the API that you need to use if you implement your own custom streams yourself. The API is designed to make this easy for you to do. The third section goes into more depth about how streams work, including some of the internal mechanisms and functions that you should probably not modify unless you definitely know what you are doing. ## API for Stream Consumers Streams can be either [Readable][], [Writable][], or both ([Duplex][]). All streams are EventEmitters, but they also have other custom methods and properties depending on whether they are Readable, Writable, or Duplex. If a stream is both Readable and Writable, then it implements all of the methods and events below. So, a [Duplex][] or [Transform][] stream is fully described by this API, though their implementation may be somewhat different. It is not necessary to implement Stream interfaces in order to consume streams in your programs. If you **are** implementing streaming interfaces in your own program, please also refer to [API for Stream Implementors][] below. Almost all io.js programs, no matter how simple, use Streams in some way. Here is an example of using Streams in an io.js program: ```javascript var http = require('http'); var server = http.createServer(function (req, res) { // req is an http.IncomingMessage, which is a Readable Stream // res is an http.ServerResponse, which is a Writable Stream var body = ''; // we want to get the data as utf8 strings // If you don't set an encoding, then you'll get Buffer objects req.setEncoding('utf8'); // Readable streams emit 'data' events once a listener is added req.on('data', function (chunk) { body += chunk; }); // the end event tells you that you have entire body req.on('end', function () { try { var data = JSON.parse(body); } catch (er) { // uh oh! bad json! res.statusCode = 400; return res.end('error: ' + er.message); } // write back something interesting to the user: res.write(typeof data); res.end(); }); }); server.listen(1337); // $ curl localhost:1337 -d '{}' // object // $ curl localhost:1337 -d '"foo"' // string // $ curl localhost:1337 -d 'not json' // error: Unexpected token o ``` ### Class: stream.Readable The Readable stream interface is the abstraction for a *source* of data that you are reading from. In other words, data comes *out* of a Readable stream. 
A Readable stream will not start emitting data until you indicate that you are ready to receive it. Readable streams have two "modes": a **flowing mode** and a **paused mode**. When in flowing mode, data is read from the underlying system and provided to your program as fast as possible. In paused mode, you must explicitly call `stream.read()` to get chunks of data out. Streams start out in paused mode. **Note**: If no data event handlers are attached, and there are no [`pipe()`][] destinations, and the stream is switched into flowing mode, then data will be lost. You can switch to flowing mode by doing any of the following: * Adding a [`'data'` event][] handler to listen for data. * Calling the [`resume()`][] method to explicitly open the flow. * Calling the [`pipe()`][] method to send the data to a [Writable][]. You can switch back to paused mode by doing either of the following: * If there are no pipe destinations, by calling the [`pause()`][] method. * If there are pipe destinations, by removing any [`'data'` event][] handlers, and removing all pipe destinations by calling the [`unpipe()`][] method. Note that, for backwards compatibility reasons, removing `'data'` event handlers will **not** automatically pause the stream. Also, if there are piped destinations, then calling `pause()` will not guarantee that the stream will *remain* paused once those destinations drain and ask for more data. Examples of readable streams include: * [http responses, on the client](https://iojs.org/dist/v2.3.0/doc/api/http.html#http_http_incomingmessage) * [http requests, on the server](https://iojs.org/dist/v2.3.0/doc/api/http.html#http_http_incomingmessage) * [fs read streams](https://iojs.org/dist/v2.3.0/doc/api/fs.html#fs_class_fs_readstream) * [zlib streams][] * [crypto streams][] * [tcp sockets][] * [child process stdout and stderr][] * [process.stdin][] #### Event: 'readable' When a chunk of data can be read from the stream, it will emit a `'readable'` event. In some cases, listening for a `'readable'` event will cause some data to be read into the internal buffer from the underlying system, if it hadn't already. ```javascript var readable = getReadableStreamSomehow(); readable.on('readable', function() { // there is some data to read now }); ``` Once the internal buffer is drained, a `readable` event will fire again when more data is available. #### Event: 'data' * `chunk` {Buffer | String} The chunk of data. Attaching a `data` event listener to a stream that has not been explicitly paused will switch the stream into flowing mode. Data will then be passed as soon as it is available. If you just want to get all the data out of the stream as fast as possible, this is the best way to do so. ```javascript var readable = getReadableStreamSomehow(); readable.on('data', function(chunk) { console.log('got %d bytes of data', chunk.length); }); ``` #### Event: 'end' This event fires when there will be no more data to read. Note that the `end` event **will not fire** unless the data is completely consumed. This can be done by switching into flowing mode, or by calling `read()` repeatedly until you get to the end. ```javascript var readable = getReadableStreamSomehow(); readable.on('data', function(chunk) { console.log('got %d bytes of data', chunk.length); }); readable.on('end', function() { console.log('there will be no more data.'); }); ``` #### Event: 'close' Emitted when the underlying resource (for example, the backing file descriptor) has been closed. Not all streams will emit this. 
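For example, an `fs` read stream emits `'close'` once its backing file descriptor has been released. A short sketch (the file name is illustrative):

```javascript
var fs = require('fs');

var readable = fs.createReadStream('file.txt');
readable.on('data', function(chunk) {
  console.log('got %d bytes of data', chunk.length);
});
readable.on('close', function() {
  // the underlying file descriptor has been closed
  console.log('underlying resource closed');
});
```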
#### Event: 'error' * {Error Object} Emitted if there was an error receiving data. #### readable.read([size]) * `size` {Number} Optional argument to specify how much data to read. * Return {String | Buffer | null} The `read()` method pulls some data out of the internal buffer and returns it. If there is no data available, then it will return `null`. If you pass in a `size` argument, then it will return that many bytes. If `size` bytes are not available, then it will return `null`. If you do not specify a `size` argument, then it will return all the data in the internal buffer. This method should only be called in paused mode. In flowing mode, this method is called automatically until the internal buffer is drained. ```javascript var readable = getReadableStreamSomehow(); readable.on('readable', function() { var chunk; while (null !== (chunk = readable.read())) { console.log('got %d bytes of data', chunk.length); } }); ``` If this method returns a data chunk, then it will also trigger the emission of a [`'data'` event][]. #### readable.setEncoding(encoding) * `encoding` {String} The encoding to use. * Return: `this` Call this function to cause the stream to return strings of the specified encoding instead of Buffer objects. For example, if you do `readable.setEncoding('utf8')`, then the output data will be interpreted as UTF-8 data, and returned as strings. If you do `readable.setEncoding('hex')`, then the data will be encoded in hexadecimal string format. This properly handles multi-byte characters that would otherwise be potentially mangled if you simply pulled the Buffers directly and called `buf.toString(encoding)` on them. If you want to read the data as strings, always use this method. ```javascript var readable = getReadableStreamSomehow(); readable.setEncoding('utf8'); readable.on('data', function(chunk) { assert.equal(typeof chunk, 'string'); console.log('got %d characters of string data', chunk.length); }); ``` #### readable.resume() * Return: `this` This method will cause the readable stream to resume emitting `data` events. This method will switch the stream into flowing mode. If you do *not* want to consume the data from a stream, but you *do* want to get to its `end` event, you can call [`readable.resume()`][] to open the flow of data. ```javascript var readable = getReadableStreamSomehow(); readable.resume(); readable.on('end', function() { console.log('got to the end, but did not read anything'); }); ``` #### readable.pause() * Return: `this` This method will cause a stream in flowing mode to stop emitting `data` events, switching out of flowing mode. Any data that becomes available will remain in the internal buffer. ```javascript var readable = getReadableStreamSomehow(); readable.on('data', function(chunk) { console.log('got %d bytes of data', chunk.length); readable.pause(); console.log('there will be no more data for 1 second'); setTimeout(function() { console.log('now data will start flowing again'); readable.resume(); }, 1000); }); ``` #### readable.isPaused() * Return: `Boolean` This method returns whether or not the `readable` has been **explicitly** paused by client code (using `readable.pause()` without a corresponding `readable.resume()`). 
```javascript var readable = new stream.Readable readable.isPaused() // === false readable.pause() readable.isPaused() // === true readable.resume() readable.isPaused() // === false ``` #### readable.pipe(destination[, options]) * `destination` {[Writable][] Stream} The destination for writing data * `options` {Object} Pipe options * `end` {Boolean} End the writer when the reader ends. Default = `true` This method pulls all the data out of a readable stream, and writes it to the supplied destination, automatically managing the flow so that the destination is not overwhelmed by a fast readable stream. Multiple destinations can be piped to safely. ```javascript var readable = getReadableStreamSomehow(); var writable = fs.createWriteStream('file.txt'); // All the data from readable goes into 'file.txt' readable.pipe(writable); ``` This function returns the destination stream, so you can set up pipe chains like so: ```javascript var r = fs.createReadStream('file.txt'); var z = zlib.createGzip(); var w = fs.createWriteStream('file.txt.gz'); r.pipe(z).pipe(w); ``` For example, emulating the Unix `cat` command: ```javascript process.stdin.pipe(process.stdout); ``` By default [`end()`][] is called on the destination when the source stream emits `end`, so that `destination` is no longer writable. Pass `{ end: false }` as `options` to keep the destination stream open. This keeps `writer` open so that "Goodbye" can be written at the end. ```javascript reader.pipe(writer, { end: false }); reader.on('end', function() { writer.end('Goodbye\n'); }); ``` Note that `process.stderr` and `process.stdout` are never closed until the process exits, regardless of the specified options. #### readable.unpipe([destination]) * `destination` {[Writable][] Stream} Optional specific stream to unpipe This method will remove the hooks set up for a previous `pipe()` call. If the destination is not specified, then all pipes are removed. If the destination is specified, but no pipe is set up for it, then this is a no-op. ```javascript var readable = getReadableStreamSomehow(); var writable = fs.createWriteStream('file.txt'); // All the data from readable goes into 'file.txt', // but only for the first second readable.pipe(writable); setTimeout(function() { console.log('stop writing to file.txt'); readable.unpipe(writable); console.log('manually close the file stream'); writable.end(); }, 1000); ``` #### readable.unshift(chunk) * `chunk` {Buffer | String} Chunk of data to unshift onto the read queue This is useful in certain cases where a stream is being consumed by a parser, which needs to "un-consume" some data that it has optimistically pulled out of the source, so that the stream can be passed on to some other party. If you find that you must often call `stream.unshift(chunk)` in your programs, consider implementing a [Transform][] stream instead. (See API for Stream Implementors, below.) 
```javascript // Pull off a header delimited by \n\n // use unshift() if we get too much // Call the callback with (error, header, stream) var StringDecoder = require('string_decoder').StringDecoder; function parseHeader(stream, callback) { stream.on('error', callback); stream.on('readable', onReadable); var decoder = new StringDecoder('utf8'); var header = ''; function onReadable() { var chunk; while (null !== (chunk = stream.read())) { var str = decoder.write(chunk); if (str.match(/\n\n/)) { // found the header boundary var split = str.split(/\n\n/); header += split.shift(); var remaining = split.join('\n\n'); var buf = new Buffer(remaining, 'utf8'); if (buf.length) stream.unshift(buf); stream.removeListener('error', callback); stream.removeListener('readable', onReadable); // now the body of the message can be read from the stream. callback(null, header, stream); } else { // still reading the header. header += str; } } } } ``` #### readable.wrap(stream) * `stream` {Stream} An "old style" readable stream Versions of Node.js prior to v0.10 had streams that did not implement the entire Streams API as it is today. (See "Compatibility" below for more information.) If you are using an older io.js library that emits `'data'` events and has a [`pause()`][] method that is advisory only, then you can use the `wrap()` method to create a [Readable][] stream that uses the old stream as its data source. You will very rarely ever need to call this function, but it exists as a convenience for interacting with old io.js programs and libraries. For example: ```javascript var OldReader = require('./old-api-module.js').OldReader; var oreader = new OldReader; var Readable = require('stream').Readable; var myReader = new Readable().wrap(oreader); myReader.on('readable', function() { myReader.read(); // etc. }); ``` ### Class: stream.Writable The Writable stream interface is an abstraction for a *destination* that you are writing data *to*. Examples of writable streams include: * [http requests, on the client](https://iojs.org/dist/v2.3.0/doc/api/http.html#http_class_http_clientrequest) * [http responses, on the server](https://iojs.org/dist/v2.3.0/doc/api/http.html#http_class_http_serverresponse) * [fs write streams](https://iojs.org/dist/v2.3.0/doc/api/fs.html#fs_class_fs_writestream) * [zlib streams][] * [crypto streams][] * [tcp sockets][] * [child process stdin](https://iojs.org/dist/v2.3.0/doc/api/child_process.html#child_process_child_stdin) * [process.stdout][], [process.stderr][] #### writable.write(chunk[, encoding][, callback]) * `chunk` {String | Buffer} The data to write * `encoding` {String} The encoding, if `chunk` is a String * `callback` {Function} Callback for when this chunk of data is flushed * Returns: {Boolean} True if the data was handled completely. This method writes some data to the underlying system, and calls the supplied callback once the data has been fully handled. The return value indicates if you should continue writing right now. If the data had to be buffered internally, then it will return `false`. Otherwise, it will return `true`. This return value is strictly advisory. You MAY continue to write, even if it returns `false`. However, writes will be buffered in memory, so it is best not to do this excessively. Instead, wait for the `drain` event before writing more data. #### Event: 'drain' If a [`writable.write(chunk)`][] call returns false, then the `drain` event will indicate when it is appropriate to begin writing more data to the stream. 
```javascript // Write the data to the supplied writable stream 1MM times. // Be attentive to back-pressure. function writeOneMillionTimes(writer, data, encoding, callback) { var i = 1000000; write(); function write() { var ok = true; do { i -= 1; if (i === 0) { // last time! writer.write(data, encoding, callback); } else { // see if we should continue, or wait // don't pass the callback, because we're not done yet. ok = writer.write(data, encoding); } } while (i > 0 && ok); if (i > 0) { // had to stop early! // write some more once it drains writer.once('drain', write); } } } ``` #### writable.cork() Forces buffering of all writes. Buffered data will be flushed either at `.uncork()` or at `.end()` call. #### writable.uncork() Flush all data, buffered since `.cork()` call. #### writable.setDefaultEncoding(encoding) * `encoding` {String} The new default encoding Sets the default encoding for a writable stream. #### writable.end([chunk][, encoding][, callback]) * `chunk` {String | Buffer} Optional data to write * `encoding` {String} The encoding, if `chunk` is a String * `callback` {Function} Optional callback for when the stream is finished Call this method when no more data will be written to the stream. If supplied, the callback is attached as a listener on the `finish` event. Calling [`write()`][] after calling [`end()`][] will raise an error. ```javascript // write 'hello, ' and then end with 'world!' var file = fs.createWriteStream('example.txt'); file.write('hello, '); file.end('world!'); // writing more now is not allowed! ``` #### Event: 'finish' When the [`end()`][] method has been called, and all data has been flushed to the underlying system, this event is emitted. ```javascript var writer = getWritableStreamSomehow(); for (var i = 0; i < 100; i ++) { writer.write('hello, #' + i + '!\n'); } writer.end('this is the end\n'); writer.on('finish', function() { console.error('all writes are now complete.'); }); ``` #### Event: 'pipe' * `src` {[Readable][] Stream} source stream that is piping to this writable This is emitted whenever the `pipe()` method is called on a readable stream, adding this writable to its set of destinations. ```javascript var writer = getWritableStreamSomehow(); var reader = getReadableStreamSomehow(); writer.on('pipe', function(src) { console.error('something is piping into the writer'); assert.equal(src, reader); }); reader.pipe(writer); ``` #### Event: 'unpipe' * `src` {[Readable][] Stream} The source stream that [unpiped][] this writable This is emitted whenever the [`unpipe()`][] method is called on a readable stream, removing this writable from its set of destinations. ```javascript var writer = getWritableStreamSomehow(); var reader = getReadableStreamSomehow(); writer.on('unpipe', function(src) { console.error('something has stopped piping into the writer'); assert.equal(src, reader); }); reader.pipe(writer); reader.unpipe(writer); ``` #### Event: 'error' * {Error object} Emitted if there was an error when writing or piping data. ### Class: stream.Duplex Duplex streams are streams that implement both the [Readable][] and [Writable][] interfaces. See above for usage. Examples of Duplex streams include: * [tcp sockets][] * [zlib streams][] * [crypto streams][] ### Class: stream.Transform Transform streams are [Duplex][] streams where the output is in some way computed from the input. They implement both the [Readable][] and [Writable][] interfaces. See above for usage. 
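As a consumer, you can treat a Transform stream as just another segment in a pipe chain. This sketch (using the core `fs` and `zlib` modules; the file name is illustrative) gzips a file: raw bytes are written into the transform, and compressed bytes are read out the other side.

```javascript
var fs = require('fs');
var zlib = require('zlib');

// zlib.createGzip() returns a Transform stream.
fs.createReadStream('file.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('file.txt.gz'));
```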
Examples of Transform streams include: * [zlib streams][] * [crypto streams][] ## API for Stream Implementors To implement any sort of stream, the pattern is the same: 1. Extend the appropriate parent class in your own subclass. (The [`util.inherits`][] method is particularly helpful for this.) 2. Call the appropriate parent class constructor in your constructor, to be sure that the internal mechanisms are set up properly. 3. Implement one or more specific methods, as detailed below. The class to extend and the method(s) to implement depend on the sort of stream class you are writing:

| Use-case | Class | Method(s) to implement |
| -------- | ----- | ---------------------- |
| Reading only | [Readable](#stream_class_stream_readable_1) | [_read][] |
| Writing only | [Writable](#stream_class_stream_writable_1) | [_write][], _writev |
| Reading and writing | [Duplex](#stream_class_stream_duplex_1) | [_read][], [_write][], _writev |
| Operate on written data, then read the result | [Transform](#stream_class_stream_transform_1) | _transform, _flush |

      In your implementation code, it is very important to never call the methods described in [API for Stream Consumers][] above. Otherwise, you can potentially cause adverse side effects in programs that consume your streaming interfaces. ### Class: stream.Readable `stream.Readable` is an abstract class designed to be extended with an underlying implementation of the [`_read(size)`][] method. Please see above under [API for Stream Consumers][] for how to consume streams in your programs. What follows is an explanation of how to implement Readable streams in your programs. #### Example: A Counting Stream This is a basic example of a Readable stream. It emits the numerals from 1 to 1,000,000 in ascending order, and then ends. ```javascript var Readable = require('stream').Readable; var util = require('util'); util.inherits(Counter, Readable); function Counter(opt) { Readable.call(this, opt); this._max = 1000000; this._index = 1; } Counter.prototype._read = function() { var i = this._index++; if (i > this._max) this.push(null); else { var str = '' + i; var buf = new Buffer(str, 'ascii'); this.push(buf); } }; ``` #### Example: SimpleProtocol v1 (Sub-optimal) This is similar to the `parseHeader` function described above, but implemented as a custom stream. Also, note that this implementation does not convert the incoming data to a string. However, this would be better implemented as a [Transform][] stream. See below for a better implementation. ```javascript // A parser for a simple data protocol. // The "header" is a JSON object, followed by 2 \n characters, and // then a message body. // // NOTE: This can be done more simply as a Transform stream! // Using Readable directly for this is sub-optimal. See the // alternative example below under the Transform section. var Readable = require('stream').Readable; var util = require('util'); util.inherits(SimpleProtocol, Readable); function SimpleProtocol(source, options) { if (!(this instanceof SimpleProtocol)) return new SimpleProtocol(source, options); Readable.call(this, options); this._inBody = false; this._sawFirstCr = false; // source is a readable stream, such as a socket or file this._source = source; var self = this; source.on('end', function() { self.push(null); }); // give it a kick whenever the source is readable // read(0) will not consume any bytes source.on('readable', function() { self.read(0); }); this._rawHeader = []; this.header = null; } SimpleProtocol.prototype._read = function(n) { if (!this._inBody) { var chunk = this._source.read(); // if the source doesn't have data, we don't have data yet. if (chunk === null) return this.push(''); // check if the chunk has a \n\n var split = -1; for (var i = 0; i < chunk.length; i++) { if (chunk[i] === 10) { // '\n' if (this._sawFirstCr) { split = i; break; } else { this._sawFirstCr = true; } } else { this._sawFirstCr = false; } } if (split === -1) { // still waiting for the \n\n // stash the chunk, and try again. this._rawHeader.push(chunk); this.push(''); } else { this._inBody = true; var h = chunk.slice(0, split); this._rawHeader.push(h); var header = Buffer.concat(this._rawHeader).toString(); try { this.header = JSON.parse(header); } catch (er) { this.emit('error', new Error('invalid simple protocol data')); return; } // now, because we got some extra data, unshift the rest // back into the read queue so that our consumer will see it. var b = chunk.slice(split); this.unshift(b); // and let them know that we are done parsing the header. 
this.emit('header', this.header); } } else { // from there on, just provide the data to our consumer. // careful not to push(null), since that would indicate EOF. var chunk = this._source.read(); if (chunk) this.push(chunk); } }; // Usage: // var parser = new SimpleProtocol(source); // Now parser is a readable stream that will emit 'header' // with the parsed header data. ``` #### new stream.Readable([options]) * `options` {Object} * `highWaterMark` {Number} The maximum number of bytes to store in the internal buffer before ceasing to read from the underlying resource. Default=16kb, or 16 for `objectMode` streams * `encoding` {String} If specified, then buffers will be decoded to strings using the specified encoding. Default=null * `objectMode` {Boolean} Whether this stream should behave as a stream of objects. Meaning that stream.read(n) returns a single value instead of a Buffer of size n. Default=false In classes that extend the Readable class, make sure to call the Readable constructor so that the buffering settings can be properly initialized. #### readable.\_read(size) * `size` {Number} Number of bytes to read asynchronously Note: **Implement this function, but do NOT call it directly.** This function should NOT be called directly. It should be implemented by child classes, and only called by the internal Readable class methods. All Readable stream implementations must provide a `_read` method to fetch data from the underlying resource. This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes. When data is available, put it into the read queue by calling `readable.push(chunk)`. If `push` returns false, then you should stop reading. When `_read` is called again, you should start pushing more data. The `size` argument is advisory. Implementations where a "read" is a single call that returns data can use this to know how much data to fetch. Implementations where that is not relevant, such as TCP or TLS, may ignore this argument, and simply provide data whenever it becomes available. There is no need, for example to "wait" until `size` bytes are available before calling [`stream.push(chunk)`][]. #### readable.push(chunk[, encoding]) * `chunk` {Buffer | null | String} Chunk of data to push into the read queue * `encoding` {String} Encoding of String chunks. Must be a valid Buffer encoding, such as `'utf8'` or `'ascii'` * return {Boolean} Whether or not more pushes should be performed Note: **This function should be called by Readable implementors, NOT by consumers of Readable streams.** The `_read()` function will not be called again until at least one `push(chunk)` call is made. The `Readable` class works by putting data into a read queue to be pulled out later by calling the `read()` method when the `'readable'` event fires. The `push()` method will explicitly insert some data into the read queue. If it is called with `null` then it will signal the end of the data (EOF). This API is designed to be as flexible as possible. For example, you may be wrapping a lower-level source which has some sort of pause/resume mechanism, and a data callback. 
In those cases, you could wrap the low-level source object by doing something like this: ```javascript // source is an object with readStop() and readStart() methods, // and an `ondata` member that gets called when it has data, and // an `onend` member that gets called when the data is over. util.inherits(SourceWrapper, Readable); function SourceWrapper(options) { Readable.call(this, options); this._source = getLowlevelSourceObject(); var self = this; // Every time there's data, we push it into the internal buffer. this._source.ondata = function(chunk) { // if push() returns false, then we need to stop reading from source if (!self.push(chunk)) self._source.readStop(); }; // When the source ends, we push the EOF-signaling `null` chunk this._source.onend = function() { self.push(null); }; } // _read will be called when the stream wants to pull more data in // the advisory size argument is ignored in this case. SourceWrapper.prototype._read = function(size) { this._source.readStart(); }; ``` ### Class: stream.Writable `stream.Writable` is an abstract class designed to be extended with an underlying implementation of the [`_write(chunk, encoding, callback)`][] method. Please see above under [API for Stream Consumers][] for how to consume writable streams in your programs. What follows is an explanation of how to implement Writable streams in your programs. #### new stream.Writable([options]) * `options` {Object} * `highWaterMark` {Number} Buffer level when [`write()`][] starts returning false. Default=16kb, or 16 for `objectMode` streams * `decodeStrings` {Boolean} Whether or not to decode strings into Buffers before passing them to [`_write()`][]. Default=true * `objectMode` {Boolean} Whether or not the `write(anyObj)` is a valid operation. If set you can write arbitrary data instead of only `Buffer` / `String` data. Default=false In classes that extend the Writable class, make sure to call the constructor so that the buffering settings can be properly initialized. #### writable.\_write(chunk, encoding, callback) * `chunk` {Buffer | String} The chunk to be written. Will **always** be a buffer unless the `decodeStrings` option was set to `false`. * `encoding` {String} If the chunk is a string, then this is the encoding type. If chunk is a buffer, then this is the special value - 'buffer', ignore it in this case. * `callback` {Function} Call this function (optionally with an error argument) when you are done processing the supplied chunk. All Writable stream implementations must provide a [`_write()`][] method to send data to the underlying resource. Note: **This function MUST NOT be called directly.** It should be implemented by child classes, and called by the internal Writable class methods only. Call the callback using the standard `callback(error)` pattern to signal that the write completed successfully or with an error. If the `decodeStrings` flag is set in the constructor options, then `chunk` may be a string rather than a Buffer, and `encoding` will indicate the sort of string that it is. This is to support implementations that have an optimized handling for certain string data encodings. If you do not explicitly set the `decodeStrings` option to `false`, then you can safely ignore the `encoding` argument, and assume that `chunk` will always be a Buffer. This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes. 
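As a minimal sketch of the pattern (the destination here, an in-memory array, is a hypothetical stand-in for a real resource such as a file descriptor or socket):

```javascript
var Writable = require('stream').Writable;
var util = require('util');

util.inherits(MemoryWriter, Writable);

function MemoryWriter(options) {
  Writable.call(this, options); // initialize the buffering settings
  this.chunks = []; // hypothetical destination for the written data
}

// Called by the internal Writable machinery, never directly by user code.
MemoryWriter.prototype._write = function(chunk, encoding, callback) {
  this.chunks.push(chunk);
  callback(); // signal that this chunk has been fully handled
};
```

A `new MemoryWriter()` can then serve as the destination of any `pipe()` call, or be written to directly with `write()` and `end()`.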
#### writable.\_writev(chunks, callback) * `chunks` {Array} The chunks to be written. Each chunk has following format: `{ chunk: ..., encoding: ... }`. * `callback` {Function} Call this function (optionally with an error argument) when you are done processing the supplied chunks. Note: **This function MUST NOT be called directly.** It may be implemented by child classes, and called by the internal Writable class methods only. This function is completely optional to implement. In most cases it is unnecessary. If implemented, it will be called with all the chunks that are buffered in the write queue. ### Class: stream.Duplex A "duplex" stream is one that is both Readable and Writable, such as a TCP socket connection. Note that `stream.Duplex` is an abstract class designed to be extended with an underlying implementation of the `_read(size)` and [`_write(chunk, encoding, callback)`][] methods as you would with a Readable or Writable stream class. Since JavaScript doesn't have multiple prototypal inheritance, this class prototypally inherits from Readable, and then parasitically from Writable. It is thus up to the user to implement both the lowlevel `_read(n)` method as well as the lowlevel [`_write(chunk, encoding, callback)`][] method on extension duplex classes. #### new stream.Duplex(options) * `options` {Object} Passed to both Writable and Readable constructors. Also has the following fields: * `allowHalfOpen` {Boolean} Default=true. If set to `false`, then the stream will automatically end the readable side when the writable side ends and vice versa. * `readableObjectMode` {Boolean} Default=false. Sets `objectMode` for readable side of the stream. Has no effect if `objectMode` is `true`. * `writableObjectMode` {Boolean} Default=false. Sets `objectMode` for writable side of the stream. Has no effect if `objectMode` is `true`. In classes that extend the Duplex class, make sure to call the constructor so that the buffering settings can be properly initialized. ### Class: stream.Transform A "transform" stream is a duplex stream where the output is causally connected in some way to the input, such as a [zlib][] stream or a [crypto][] stream. There is no requirement that the output be the same size as the input, the same number of chunks, or arrive at the same time. For example, a Hash stream will only ever have a single chunk of output which is provided when the input is ended. A zlib stream will produce output that is either much smaller or much larger than its input. Rather than implement the [`_read()`][] and [`_write()`][] methods, Transform classes must implement the `_transform()` method, and may optionally also implement the `_flush()` method. (See below.) #### new stream.Transform([options]) * `options` {Object} Passed to both Writable and Readable constructors. In classes that extend the Transform class, make sure to call the constructor so that the buffering settings can be properly initialized. #### transform.\_transform(chunk, encoding, callback) * `chunk` {Buffer | String} The chunk to be transformed. Will **always** be a buffer unless the `decodeStrings` option was set to `false`. * `encoding` {String} If the chunk is a string, then this is the encoding type. If chunk is a buffer, then this is the special value - 'buffer', ignore it in this case. * `callback` {Function} Call this function (optionally with an error argument and data) when you are done processing the supplied chunk. 
Note: **This function MUST NOT be called directly.** It should be implemented by child classes, and called by the internal Transform class methods only. All Transform stream implementations must provide a `_transform` method to accept input and produce output. `_transform` should do whatever has to be done in this specific Transform class, to handle the bytes being written, and pass them off to the readable portion of the interface. Do asynchronous I/O, process things, and so on. Call `transform.push(outputChunk)` 0 or more times to generate output from this input chunk, depending on how much data you want to output as a result of this chunk. Call the callback function only when the current chunk is completely consumed. Note that there may or may not be output as a result of any particular input chunk. If you supply output as the second argument to the callback, it will be passed to push method, in other words the following are equivalent: ```javascript transform.prototype._transform = function (data, encoding, callback) { this.push(data); callback(); } transform.prototype._transform = function (data, encoding, callback) { callback(null, data); } ``` This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes. #### transform.\_flush(callback) * `callback` {Function} Call this function (optionally with an error argument) when you are done flushing any remaining data. Note: **This function MUST NOT be called directly.** It MAY be implemented by child classes, and if so, will be called by the internal Transform class methods only. In some cases, your transform operation may need to emit a bit more data at the end of the stream. For example, a `Zlib` compression stream will store up some internal state so that it can optimally compress the output. At the end, however, it needs to do the best it can with what is left, so that the data will be complete. In those cases, you can implement a `_flush` method, which will be called at the very end, after all the written data is consumed, but before emitting `end` to signal the end of the readable side. Just like with `_transform`, call `transform.push(chunk)` zero or more times, as appropriate, and call `callback` when the flush operation is complete. This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes. #### Events: 'finish' and 'end' The [`finish`][] and [`end`][] events are from the parent Writable and Readable classes respectively. The `finish` event is fired after `.end()` is called and all chunks have been processed by `_transform`, `end` is fired after all data has been output which is after the callback in `_flush` has been called. #### Example: `SimpleProtocol` parser v2 The example above of a simple protocol parser can be implemented simply by using the higher level [Transform][] stream class, similar to the `parseHeader` and `SimpleProtocol v1` examples above. In this example, rather than providing the input as an argument, it would be piped into the parser, which is a more idiomatic io.js stream approach. 
```javascript var util = require('util'); var Transform = require('stream').Transform; util.inherits(SimpleProtocol, Transform); function SimpleProtocol(options) { if (!(this instanceof SimpleProtocol)) return new SimpleProtocol(options); Transform.call(this, options); this._inBody = false; this._sawFirstCr = false; this._rawHeader = []; this.header = null; } SimpleProtocol.prototype._transform = function(chunk, encoding, done) { if (!this._inBody) { // check if the chunk has a \n\n var split = -1; for (var i = 0; i < chunk.length; i++) { if (chunk[i] === 10) { // '\n' if (this._sawFirstCr) { split = i; break; } else { this._sawFirstCr = true; } } else { this._sawFirstCr = false; } } if (split === -1) { // still waiting for the \n\n // stash the chunk, and try again. this._rawHeader.push(chunk); } else { this._inBody = true; var h = chunk.slice(0, split); this._rawHeader.push(h); var header = Buffer.concat(this._rawHeader).toString(); try { this.header = JSON.parse(header); } catch (er) { this.emit('error', new Error('invalid simple protocol data')); return; } // and let them know that we are done parsing the header. this.emit('header', this.header); // now, because we got some extra data, emit this first. this.push(chunk.slice(split)); } } else { // from there on, just provide the data to our consumer as-is. this.push(chunk); } done(); }; // Usage: // var parser = new SimpleProtocol(); // source.pipe(parser) // Now parser is a readable stream that will emit 'header' // with the parsed header data. ``` ### Class: stream.PassThrough This is a trivial implementation of a [Transform][] stream that simply passes the input bytes across to the output. Its purpose is mainly for examples and testing, but there are occasionally use cases where it can come in handy as a building block for novel sorts of streams. ## Simplified Constructor API In simple cases there is now the added benefit of being able to construct a stream without inheritance. This can be done by passing the appropriate methods as constructor options: Examples: ### Readable ```javascript var readable = new stream.Readable({ read: function(n) { // sets this._read under the hood } }); ``` ### Writable ```javascript var writable = new stream.Writable({ write: function(chunk, encoding, next) { // sets this._write under the hood } }); // or var writable = new stream.Writable({ writev: function(chunks, next) { // sets this._writev under the hood } }); ``` ### Duplex ```javascript var duplex = new stream.Duplex({ read: function(n) { // sets this._read under the hood }, write: function(chunk, encoding, next) { // sets this._write under the hood } }); // or var duplex = new stream.Duplex({ read: function(n) { // sets this._read under the hood }, writev: function(chunks, next) { // sets this._writev under the hood } }); ``` ### Transform ```javascript var transform = new stream.Transform({ transform: function(chunk, encoding, next) { // sets this._transform under the hood }, flush: function(done) { // sets this._flush under the hood } }); ``` ## Streams: Under the Hood ### Buffering Both Writable and Readable streams will buffer data on an internal object called `_writableState.buffer` or `_readableState.buffer`, respectively. The amount of data that will potentially be buffered depends on the `highWaterMark` option which is passed into the constructor. Buffering in Readable streams happens when the implementation calls [`stream.push(chunk)`][]. 
If the consumer of the Stream does not call `stream.read()`, then the data will sit in the internal queue until it is consumed. Buffering in Writable streams happens when the user calls [`stream.write(chunk)`][] repeatedly, even when `write()` returns `false`. The purpose of streams, especially with the `pipe()` method, is to limit the buffering of data to acceptable levels, so that sources and destinations of varying speed will not overwhelm the available memory. ### `stream.read(0)` There are some cases where you want to trigger a refresh of the underlying readable stream mechanisms, without actually consuming any data. In that case, you can call `stream.read(0)`, which will always return null. If the internal read buffer is below the `highWaterMark`, and the stream is not currently reading, then calling `read(0)` will trigger a low-level `_read` call. There is almost never a need to do this. However, you will see some cases in io.js's internals where this is done, particularly in the Readable stream class internals. ### `stream.push('')` Pushing a zero-byte string or Buffer (when not in [Object mode][]) has an interesting side effect. Because it *is* a call to [`stream.push()`][], it will end the `reading` process. However, it does *not* add any data to the readable buffer, so there's nothing for a user to consume. Very rarely, there are cases where you have no data to provide now, but the consumer of your stream (or, perhaps, another bit of your own code) will know when to check again, by calling `stream.read(0)`. In those cases, you *may* call `stream.push('')`. So far, the only use case for this functionality is in the [tls.CryptoStream][] class, which is deprecated in io.js v1.0. If you find that you have to use `stream.push('')`, please consider another approach, because it almost certainly indicates that something is horribly wrong. ### Compatibility with Older Node.js Versions In versions of Node.js prior to v0.10, the Readable stream interface was simpler, but also less powerful and less useful. * Rather than waiting for you to call the `read()` method, `'data'` events would start emitting immediately. If you needed to do some I/O to decide how to handle data, then you had to store the chunks in some kind of buffer so that they would not be lost. * The [`pause()`][] method was advisory, rather than guaranteed. This meant that you still had to be prepared to receive `'data'` events even when the stream was in a paused state. In io.js v1.0 and Node.js v0.10, the Readable class described below was added. For backwards compatibility with older Node.js programs, Readable streams switch into "flowing mode" when a `'data'` event handler is added, or when the [`resume()`][] method is called. The effect is that, even if you are not using the new `read()` method and `'readable'` event, you no longer have to worry about losing `'data'` chunks. Most programs will continue to function normally. However, this introduces an edge case in the following conditions: * No [`'data'` event][] handler is added. * The [`resume()`][] method is never called. * The stream is not piped to any writable destination. For example, consider the following code: ```javascript // WARNING! BROKEN! net.createServer(function(socket) { // we add an 'end' method, but never consume the data socket.on('end', function() { // It will never get here. socket.end('I got your message (but didnt read it)\n'); }); }).listen(1337); ``` In versions of Node.js prior to v0.10, the incoming message data would be simply discarded. 
However, in io.js v1.0 and Node.js v0.10 and beyond, the socket will remain paused forever. The workaround in this situation is to call the `resume()` method to start the flow of data: ```javascript // Workaround net.createServer(function(socket) { socket.on('end', function() { socket.end('I got your message (but didnt read it)\n'); }); // start the flow of data, discarding it. socket.resume(); }).listen(1337); ``` In addition to new Readable streams switching into flowing mode, pre-v0.10 style streams can be wrapped in a Readable class using the `wrap()` method. ### Object Mode Normally, Streams operate on Strings and Buffers exclusively. Streams that are in **object mode** can emit generic JavaScript values other than Buffers and Strings. A Readable stream in object mode will always return a single item from a call to `stream.read(size)`, regardless of what the size argument is. A Writable stream in object mode will always ignore the `encoding` argument to `stream.write(data, encoding)`. The special value `null` still retains its special value for object mode streams. That is, for object mode readable streams, `null` as a return value from `stream.read()` indicates that there is no more data, and [`stream.push(null)`][] will signal the end of stream data (`EOF`). No streams in io.js core are object mode streams. This pattern is only used by userland streaming libraries. You should set `objectMode` in your stream child class constructor on the options object. Setting `objectMode` mid-stream is not safe. For Duplex streams `objectMode` can be set exclusively for readable or writable side with `readableObjectMode` and `writableObjectMode` respectively. These options can be used to implement parsers and serializers with Transform streams. ```javascript var util = require('util'); var StringDecoder = require('string_decoder').StringDecoder; var Transform = require('stream').Transform; util.inherits(JSONParseStream, Transform); // Gets \n-delimited JSON string data, and emits the parsed objects function JSONParseStream() { if (!(this instanceof JSONParseStream)) return new JSONParseStream(); Transform.call(this, { readableObjectMode : true }); this._buffer = ''; this._decoder = new StringDecoder('utf8'); } JSONParseStream.prototype._transform = function(chunk, encoding, cb) { this._buffer += this._decoder.write(chunk); // split on newlines var lines = this._buffer.split(/\r?\n/); // keep the last partial line buffered this._buffer = lines.pop(); for (var l = 0; l < lines.length; l++) { var line = lines[l]; try { var obj = JSON.parse(line); } catch (er) { this.emit('error', er); return; } // push the parsed object out to the readable consumer this.push(obj); } cb(); }; JSONParseStream.prototype._flush = function(cb) { // Just handle any leftover var rem = this._buffer.trim(); if (rem) { try { var obj = JSON.parse(rem); } catch (er) { this.emit('error', er); return; } // push the parsed object out to the readable consumer this.push(obj); } cb(); }; ``` [EventEmitter]: https://iojs.org/dist/v2.3.0/doc/api/events.html#events_class_events_eventemitter [Object mode]: #stream_object_mode [`stream.push(chunk)`]: #stream_readable_push_chunk_encoding [`stream.push(null)`]: #stream_readable_push_chunk_encoding [`stream.push()`]: #stream_readable_push_chunk_encoding [`unpipe()`]: #stream_readable_unpipe_destination [unpiped]: #stream_readable_unpipe_destination [tcp sockets]: https://iojs.org/dist/v2.3.0/doc/api/net.html#net_class_net_socket [zlib streams]: zlib.html [zlib]: zlib.html [crypto streams]: 
crypto.html [crypto]: crypto.html [tls.CryptoStream]: https://iojs.org/dist/v2.3.0/doc/api/tls.html#tls_class_cryptostream [process.stdin]: https://iojs.org/dist/v2.3.0/doc/api/process.html#process_process_stdin [stdout]: https://iojs.org/dist/v2.3.0/doc/api/process.html#process_process_stdout [process.stdout]: https://iojs.org/dist/v2.3.0/doc/api/process.html#process_process_stdout [process.stderr]: https://iojs.org/dist/v2.3.0/doc/api/process.html#process_process_stderr [child process stdout and stderr]: https://iojs.org/dist/v2.3.0/doc/api/child_process.html#child_process_child_stdout [API for Stream Consumers]: #stream_api_for_stream_consumers [API for Stream Implementors]: #stream_api_for_stream_implementors [Readable]: #stream_class_stream_readable [Writable]: #stream_class_stream_writable [Duplex]: #stream_class_stream_duplex [Transform]: #stream_class_stream_transform [`end`]: #stream_event_end [`finish`]: #stream_event_finish [`_read(size)`]: #stream_readable_read_size_1 [`_read()`]: #stream_readable_read_size_1 [_read]: #stream_readable_read_size_1 [`writable.write(chunk)`]: #stream_writable_write_chunk_encoding_callback [`write(chunk, encoding, callback)`]: #stream_writable_write_chunk_encoding_callback [`write()`]: #stream_writable_write_chunk_encoding_callback [`stream.write(chunk)`]: #stream_writable_write_chunk_encoding_callback [`_write(chunk, encoding, callback)`]: #stream_writable_write_chunk_encoding_callback_1 [`_write()`]: #stream_writable_write_chunk_encoding_callback_1 [_write]: #stream_writable_write_chunk_encoding_callback_1 [`util.inherits`]: https://iojs.org/dist/v2.3.0/doc/api/util.html#util_util_inherits_constructor_superconstructor [`end()`]: #stream_writable_end_chunk_encoding_callback [`'data'` event]: #stream_event_data [`resume()`]: #stream_readable_resume [`readable.resume()`]: #stream_readable_resume [`pause()`]: #stream_readable_pause [`unpipe()`]: #stream_readable_unpipe_destination [`pipe()`]: #stream_readable_pipe_destination_options npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/doc/wg-meetings/0000755000000000000000000000000012631326456031411 5ustar 00000000000000././@LongLink0000000000000000000000000000015700000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/doc/wg-meetings/2015-01-30.mdnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/doc/wg-meetings/2010000644000000000000000000000435012631326456031640 0ustar 00000000000000# streams WG Meeting 2015-01-30 ## Links * **Google Hangouts Video**: http://www.youtube.com/watch?v=I9nDOSGfwZg * **GitHub Issue**: https://github.com/iojs/readable-stream/issues/106 * **Original Minutes Google Doc**: https://docs.google.com/document/d/17aTgLnjMXIrfjgNaTUnHQO7m3xgzHR2VXBTmi03Qii4/ ## Agenda Extracted from https://github.com/iojs/readable-stream/labels/wg-agenda prior to meeting. * adopt a charter [#105](https://github.com/iojs/readable-stream/issues/105) * release and versioning strategy [#101](https://github.com/iojs/readable-stream/issues/101) * simpler stream creation [#102](https://github.com/iojs/readable-stream/issues/102) * proposal: deprecate implicit flowing of streams [#99](https://github.com/iojs/readable-stream/issues/99) ## Minutes ### adopt a charter * group: +1's all around ### What versioning scheme should be adopted? 
* group: +1’s 3.0.0 * domenic+group: pulling in patches from other sources where appropriate * mikeal: version independently, suggesting versions for io.js * mikeal+domenic: work with TC to notify in advance of changes simpler stream creation ### streamline creation of streams * sam: streamline creation of streams * domenic: nice simple solution posted but, we lose the opportunity to change the model may not be backwards incompatible (double check keys) **action item:** domenic will check ### remove implicit flowing of streams on(‘data’) * add isFlowing / isPaused * mikeal: worrying that we’re documenting polyfill methods – confuses users * domenic: more reflective API is probably good, with warning labels for users * new section for mad scientists (reflective stream access) * calvin: name the “third state” * mikeal: maybe borrow the name from whatwg? * domenic: we’re missing the “third state” * consensus: kind of difficult to name the third state * mikeal: figure out differences in states / compat * mathias: always flow on data – eliminates third state * explore what it breaks **action items:** * ask isaac for ability to list packages by what public io.js APIs they use (esp. Stream) * ask rod/build for infrastructure * **chris**: explore the “flow on data” approach * add isPaused/isFlowing * add new docs section * move isPaused to that section ././@LongLink0000000000000000000000000000014700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/_stream_duplex.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/_stream_duplex.0000644000000000000000000000351412631326456032203 0ustar 00000000000000// a duplex stream is just a stream that is both readable and writable. // Since JS doesn't have multiple prototypal inheritance, this class // prototypally inherits from Readable, and then parasitically from // Writable. 'use strict'; /**/ var objectKeys = Object.keys || function (obj) { var keys = []; for (var key in obj) keys.push(key); return keys; } /**/ module.exports = Duplex; /**/ var processNextTick = require('process-nextick-args'); /**/ /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var Readable = require('./_stream_readable'); var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); var keys = objectKeys(Writable.prototype); for (var v = 0; v < keys.length; v++) { var method = keys[v]; if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method]; } function Duplex(options) { if (!(this instanceof Duplex)) return new Duplex(options); Readable.call(this, options); Writable.call(this, options); if (options && options.readable === false) this.readable = false; if (options && options.writable === false) this.writable = false; this.allowHalfOpen = true; if (options && options.allowHalfOpen === false) this.allowHalfOpen = false; this.once('end', onend); } // the no-half-open enforcer function onend() { // if we allow half-open state, or if the writable side ended, // then we're ok. if (this.allowHalfOpen || this._writableState.ended) return; // no more data can be written. // But allow more writes to happen in this tick. 
processNextTick(onEndNT, this); } function onEndNT(self) { self.end(); } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } ././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/_stream_passthrough.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/_stream_passthr0000644000000000000000000000114012631326456032301 0ustar 00000000000000// a passthrough stream. // basically just the most minimal sort of Transform stream. // Every written chunk gets output as-is. 'use strict'; module.exports = PassThrough; var Transform = require('./_stream_transform'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(PassThrough, Transform); function PassThrough(options) { if (!(this instanceof PassThrough)) return new PassThrough(options); Transform.call(this, options); } PassThrough.prototype._transform = function(chunk, encoding, cb) { cb(null, chunk); }; ././@LongLink0000000000000000000000000000015100000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/_stream_readable.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/_stream_readabl0000644000000000000000000006144512631326456032225 0ustar 00000000000000'use strict'; module.exports = Readable; /**/ var processNextTick = require('process-nextick-args'); /**/ /**/ var isArray = require('isarray'); /**/ /**/ var Buffer = require('buffer').Buffer; /**/ Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; /**/ if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { return emitter.listeners(type).length; }; /**/ /**/ var Stream; (function (){try{ Stream = require('st' + 'ream'); }catch(_){}finally{ if (!Stream) Stream = require('events').EventEmitter; }}()) /**/ var Buffer = require('buffer').Buffer; /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ /**/ var debug = require('util'); if (debug && debug.debuglog) { debug = debug.debuglog('stream'); } else { debug = function () {}; } /**/ var StringDecoder; util.inherits(Readable, Stream); function ReadableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // object stream flag. Used to make read(n) ignore n and to // make all the buffer merging and length checks go away this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.readableObjectMode; // the point at which it stops calling _read() to fill the buffer // Note: 0 is a valid value, means "don't call _read preemptively ever" var hwm = options.highWaterMark; var defaultHwm = this.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.buffer = []; this.length = 0; this.pipes = null; this.pipesCount = 0; this.flowing = null; this.ended = false; this.endEmitted = false; this.reading = false; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // whenever we return null, then we set a flag to say // that we're awaiting a 'readable' event emission. 
this.needReadable = false; this.emittedReadable = false; this.readableListening = false; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // when piping, we only care about 'readable' events that happen // after read()ing all the bytes and not getting any pushback. this.ranOut = false; // the number of writers that are awaiting a drain event in .pipe()s this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled this.readingMore = false; this.decoder = null; this.encoding = null; if (options.encoding) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this.decoder = new StringDecoder(options.encoding); this.encoding = options.encoding; } } function Readable(options) { var Duplex = require('./_stream_duplex'); if (!(this instanceof Readable)) return new Readable(options); this._readableState = new ReadableState(options, this); // legacy this.readable = true; if (options && typeof options.read === 'function') this._read = options.read; Stream.call(this); } // Manually shove something into the read() buffer. // This returns true if the highWaterMark has not been hit yet, // similar to how Writable.write() returns true if you should // write() some more. Readable.prototype.push = function(chunk, encoding) { var state = this._readableState; if (!state.objectMode && typeof chunk === 'string') { encoding = encoding || state.defaultEncoding; if (encoding !== state.encoding) { chunk = new Buffer(chunk, encoding); encoding = ''; } } return readableAddChunk(this, state, chunk, encoding, false); }; // Unshift should *always* be something directly out of read() Readable.prototype.unshift = function(chunk) { var state = this._readableState; return readableAddChunk(this, state, chunk, '', true); }; Readable.prototype.isPaused = function() { return this._readableState.flowing === false; }; function readableAddChunk(stream, state, chunk, encoding, addToFront) { var er = chunkInvalid(state, chunk); if (er) { stream.emit('error', er); } else if (chunk === null) { state.reading = false; onEofChunk(stream, state); } else if (state.objectMode || chunk && chunk.length > 0) { if (state.ended && !addToFront) { var e = new Error('stream.push() after EOF'); stream.emit('error', e); } else if (state.endEmitted && addToFront) { var e = new Error('stream.unshift() after end event'); stream.emit('error', e); } else { if (state.decoder && !addToFront && !encoding) chunk = state.decoder.write(chunk); if (!addToFront) state.reading = false; // if we want the data now, just emit it. if (state.flowing && state.length === 0 && !state.sync) { stream.emit('data', chunk); stream.read(0); } else { // update the buffer info. state.length += state.objectMode ? 1 : chunk.length; if (addToFront) state.buffer.unshift(chunk); else state.buffer.push(chunk); if (state.needReadable) emitReadable(stream); } maybeReadMore(stream, state); } } else if (!addToFront) { state.reading = false; } return needMoreData(state); } // if it's past the high water mark, we can push in some more. // Also, if we have no data yet, we can stand some // more bytes. This is to work around cases where hwm=0, // such as the repl. Also, if the push() triggered a // readable event, and the user called read(largeNumber) such that // needReadable was set, then we ought to push more, so that another // 'readable' event will be triggered. 
function needMoreData(state) { return !state.ended && (state.needReadable || state.length < state.highWaterMark || state.length === 0); } // backwards compatibility. Readable.prototype.setEncoding = function(enc) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this._readableState.decoder = new StringDecoder(enc); this._readableState.encoding = enc; return this; }; // Don't raise the hwm > 8MB (0x800000 bytes) var MAX_HWM = 0x800000; function roundUpToNextPowerOf2(n) { if (n >= MAX_HWM) { n = MAX_HWM; } else { // Get the next highest power of 2 n--; for (var p = 1; p < 32; p <<= 1) n |= n >> p; n++; } return n; } function howMuchToRead(n, state) { if (state.length === 0 && state.ended) return 0; if (state.objectMode) return n === 0 ? 0 : 1; if (n === null || isNaN(n)) { // only flow one buffer at a time if (state.flowing && state.buffer.length) return state.buffer[0].length; else return state.length; } if (n <= 0) return 0; // If we're asking for more than the target buffer level, // then raise the water mark. Bump up to the next highest // power of 2, to prevent increasing it excessively in tiny // amounts. if (n > state.highWaterMark) state.highWaterMark = roundUpToNextPowerOf2(n); // don't have that much. return null, unless we've ended. if (n > state.length) { if (!state.ended) { state.needReadable = true; return 0; } else { return state.length; } } return n; } // you can override either this method, or the async _read(n) below. Readable.prototype.read = function(n) { debug('read', n); var state = this._readableState; var nOrig = n; if (typeof n !== 'number' || n > 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we // already have a bunch of data in the buffer, then just trigger // the 'readable' event and move on. if (n === 0 && state.needReadable && (state.length >= state.highWaterMark || state.ended)) { debug('read: emitReadable', state.length, state.ended); if (state.length === 0 && state.ended) endReadable(this); else emitReadable(this); return null; } n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up. if (n === 0 && state.ended) { if (state.length === 0) endReadable(this); return null; } // All the actual chunk generation logic needs to be // *below* the call to _read. The reason is that in certain // synthetic stream cases, such as passthrough streams, _read // may be a completely synchronous operation which may change // the state of the read buffer, providing enough data when // before there was *not* enough. // // So, the steps are: // 1. Figure out what the state of things will be after we do // a read from the buffer. // // 2. If that resulting state will trigger a _read, then call _read. // Note that this may be asynchronous, or synchronous. Yes, it is // deeply ugly to write APIs this way, but that still doesn't mean // that the Readable class should behave improperly, as streams are // designed to be sync/async agnostic. // Take note if the _read call is sync or async (ie, if the read call // has returned yet), so that we know whether or not it's safe to emit // 'readable' etc. // // 3. Actually pull the requested chunks out of the buffer and return. // if we need a readable event, then we need to do some reading.
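// As a concrete illustration (assumed figures, not from the source):
// with the default highWaterMark of 16384, state.length === 500 and a
// read(100), needReadable may well be false, but 500 - 100 is still
// below the watermark, so doRead flips to true below and _read() is
// asked to refill the buffer -- unless we are already reading or ended.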
var doRead = state.needReadable; debug('need readable', doRead); // if we currently have less than the highWaterMark, then also read some if (state.length === 0 || state.length - n < state.highWaterMark) { doRead = true; debug('length less than watermark', doRead); } // however, if we've ended, then there's no point, and if we're already // reading, then it's unnecessary. if (state.ended || state.reading) { doRead = false; debug('reading or ended', doRead); } if (doRead) { debug('do read'); state.reading = true; state.sync = true; // if the length is currently zero, then we *need* a readable event. if (state.length === 0) state.needReadable = true; // call internal read method this._read(state.highWaterMark); state.sync = false; } // If _read pushed data synchronously, then `reading` will be false, // and we need to re-evaluate how much data we can return to the user. if (doRead && !state.reading) n = howMuchToRead(nOrig, state); var ret; if (n > 0) ret = fromList(n, state); else ret = null; if (ret === null) { state.needReadable = true; n = 0; } state.length -= n; // If we have nothing in the buffer, then we want to know // as soon as we *do* get something into the buffer. if (state.length === 0 && !state.ended) state.needReadable = true; // If we tried to read() past the EOF, then emit end on the next tick. if (nOrig !== n && state.ended && state.length === 0) endReadable(this); if (ret !== null) this.emit('data', ret); return ret; }; function chunkInvalid(state, chunk) { var er = null; if (!(Buffer.isBuffer(chunk)) && typeof chunk !== 'string' && chunk !== null && chunk !== undefined && !state.objectMode) { er = new TypeError('Invalid non-string/buffer chunk'); } return er; } function onEofChunk(stream, state) { if (state.ended) return; if (state.decoder) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); state.length += state.objectMode ? 1 : chunk.length; } } state.ended = true; // emit 'readable' now to make sure it gets picked up. emitReadable(stream); } // Don't emit readable right away in sync mode, because this can trigger // another read() call => stack overflow. This way, it might trigger // a nextTick recursion warning, but that's not so bad. function emitReadable(stream) { var state = stream._readableState; state.needReadable = false; if (!state.emittedReadable) { debug('emitReadable', state.flowing); state.emittedReadable = true; if (state.sync) processNextTick(emitReadable_, stream); else emitReadable_(stream); } } function emitReadable_(stream) { debug('emit readable'); stream.emit('readable'); flow(stream); } // at this point, the user has presumably seen the 'readable' event, // and called read() to consume some data. that may have triggered // in turn another _read(n) call, in which case reading = true if // it's in progress. // However, if we're not ended, or reading, and the length < hwm, // then go ahead and try to read some more preemptively. function maybeReadMore(stream, state) { if (!state.readingMore) { state.readingMore = true; processNextTick(maybeReadMore_, stream, state); } } function maybeReadMore_(stream, state) { var len = state.length; while (!state.reading && !state.flowing && !state.ended && state.length < state.highWaterMark) { debug('maybeReadMore read 0'); stream.read(0); if (len === state.length) // didn't get any data, stop spinning. break; else len = state.length; } state.readingMore = false; } // abstract method. to be overridden in specific implementation classes. 
// call cb(er, data) where data is <= n in length. // for virtual (non-string, non-buffer) streams, "length" is somewhat // arbitrary, and perhaps not very meaningful. Readable.prototype._read = function(n) { this.emit('error', new Error('not implemented')); }; Readable.prototype.pipe = function(dest, pipeOpts) { var src = this; var state = this._readableState; switch (state.pipesCount) { case 0: state.pipes = dest; break; case 1: state.pipes = [state.pipes, dest]; break; default: state.pipes.push(dest); break; } state.pipesCount += 1; debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts); var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr; var endFn = doEnd ? onend : cleanup; if (state.endEmitted) processNextTick(endFn); else src.once('end', endFn); dest.on('unpipe', onunpipe); function onunpipe(readable) { debug('onunpipe'); if (readable === src) { cleanup(); } } function onend() { debug('onend'); dest.end(); } // when the dest drains, it reduces the awaitDrain counter // on the source. This would be more elegant with a .once() // handler in flow(), but adding and removing repeatedly is // too slow. var ondrain = pipeOnDrain(src); dest.on('drain', ondrain); function cleanup() { debug('cleanup'); // cleanup event handlers once the pipe is broken dest.removeListener('close', onclose); dest.removeListener('finish', onfinish); dest.removeListener('drain', ondrain); dest.removeListener('error', onerror); dest.removeListener('unpipe', onunpipe); src.removeListener('end', onend); src.removeListener('end', cleanup); src.removeListener('data', ondata); // if the reader is waiting for a drain event from this // specific writer, then it would cause it to never start // flowing again. // So, if this is awaiting a drain, then we just call it now. // If we don't know, then assume that we are waiting for one. if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain(); } src.on('data', ondata); function ondata(chunk) { debug('ondata'); var ret = dest.write(chunk); if (false === ret) { debug('false write response, pause', src._readableState.awaitDrain); src._readableState.awaitDrain++; src.pause(); } } // if the dest has an error, then stop piping into it. // however, don't suppress the throwing behavior for this. function onerror(er) { debug('onerror', er); unpipe(); dest.removeListener('error', onerror); if (EE.listenerCount(dest, 'error') === 0) dest.emit('error', er); } // This is a brutally ugly hack to make sure that our error handler // is attached before any userland ones. NEVER DO THIS. if (!dest._events || !dest._events.error) dest.on('error', onerror); else if (isArray(dest._events.error)) dest._events.error.unshift(onerror); else dest._events.error = [onerror, dest._events.error]; // Both close and finish should trigger unpipe, but only once. function onclose() { dest.removeListener('finish', onfinish); unpipe(); } dest.once('close', onclose); function onfinish() { debug('onfinish'); dest.removeListener('close', onclose); unpipe(); } dest.once('finish', onfinish); function unpipe() { debug('unpipe'); src.unpipe(dest); } // tell the dest that it's being piped to dest.emit('pipe', src); // start the flow if it hasn't been started already. 
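// (from the consumer's side, all of the above is what a plain
// src.pipe(dest) wires up. the one option it honors is sketched here:
// src.pipe(dest, { end: false }) skips the onend handler installed
// earlier, so dest stays open for further writes after src ends.)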
if (!state.flowing) { debug('pipe resume'); src.resume(); } return dest; }; function pipeOnDrain(src) { return function() { var state = src._readableState; debug('pipeOnDrain', state.awaitDrain); if (state.awaitDrain) state.awaitDrain--; if (state.awaitDrain === 0 && EE.listenerCount(src, 'data')) { state.flowing = true; flow(src); } }; } Readable.prototype.unpipe = function(dest) { var state = this._readableState; // if we're not piping anywhere, then do nothing. if (state.pipesCount === 0) return this; // just one destination. most common case. if (state.pipesCount === 1) { // passed in one, but it's not the right one. if (dest && dest !== state.pipes) return this; if (!dest) dest = state.pipes; // got a match. state.pipes = null; state.pipesCount = 0; state.flowing = false; if (dest) dest.emit('unpipe', this); return this; } // slow case. multiple pipe destinations. if (!dest) { // remove all. var dests = state.pipes; var len = state.pipesCount; state.pipes = null; state.pipesCount = 0; state.flowing = false; for (var i = 0; i < len; i++) dests[i].emit('unpipe', this); return this; } // try to find the right one. var i = indexOf(state.pipes, dest); if (i === -1) return this; state.pipes.splice(i, 1); state.pipesCount -= 1; if (state.pipesCount === 1) state.pipes = state.pipes[0]; dest.emit('unpipe', this); return this; }; // set up data events if they are asked for // Ensure readable listeners eventually get something Readable.prototype.on = function(ev, fn) { var res = Stream.prototype.on.call(this, ev, fn); // If listening to data, and it has not explicitly been paused, // then call resume to start the flow of data on the next tick. if (ev === 'data' && false !== this._readableState.flowing) { this.resume(); } if (ev === 'readable' && this.readable) { var state = this._readableState; if (!state.readableListening) { state.readableListening = true; state.emittedReadable = false; state.needReadable = true; if (!state.reading) { processNextTick(nReadingNextTick, this); } else if (state.length) { emitReadable(this, state); } } } return res; }; Readable.prototype.addListener = Readable.prototype.on; function nReadingNextTick(self) { debug('readable nexttick read 0'); self.read(0); } // pause() and resume() are remnants of the legacy readable stream API // If the user uses them, then switch into old mode. Readable.prototype.resume = function() { var state = this._readableState; if (!state.flowing) { debug('resume'); state.flowing = true; resume(this, state); } return this; }; function resume(stream, state) { if (!state.resumeScheduled) { state.resumeScheduled = true; processNextTick(resume_, stream, state); } } function resume_(stream, state) { if (!state.reading) { debug('resume read 0'); stream.read(0); } state.resumeScheduled = false; stream.emit('resume'); flow(stream); if (state.flowing && !state.reading) stream.read(0); } Readable.prototype.pause = function() { debug('call pause flowing=%j', this._readableState.flowing); if (false !== this._readableState.flowing) { debug('pause'); this._readableState.flowing = false; this.emit('pause'); } return this; }; function flow(stream) { var state = stream._readableState; debug('flow', state.flowing); if (state.flowing) { do { var chunk = stream.read(); } while (null !== chunk && state.flowing); } } // wrap an old-style stream as the async data source. // This is *not* part of the readable stream interface. // It is an ugly unfortunate mess of history. 
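// A sketch of typical use (OldStyleStream stands in for any pre-streams2
// emitter with 'data'/'end' events and pause()/resume(); consume() is a
// hypothetical handler, not part of this module):
//
//     var legacy = new OldStyleStream();
//     var modern = new Readable({ highWaterMark: 16 * 1024 });
//     modern.wrap(legacy);
//     modern.on('readable', function() {
//       var chunk;
//       while ((chunk = modern.read()) !== null) consume(chunk);
//     });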
Readable.prototype.wrap = function(stream) { var state = this._readableState; var paused = false; var self = this; stream.on('end', function() { debug('wrapped end'); if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) self.push(chunk); } self.push(null); }); stream.on('data', function(chunk) { debug('wrapped data'); if (state.decoder) chunk = state.decoder.write(chunk); // don't skip over falsy values in objectMode if (state.objectMode && (chunk === null || chunk === undefined)) return; else if (!state.objectMode && (!chunk || !chunk.length)) return; var ret = self.push(chunk); if (!ret) { paused = true; stream.pause(); } }); // proxy all the other methods. // important when wrapping filters and duplexes. for (var i in stream) { if (this[i] === undefined && typeof stream[i] === 'function') { this[i] = function(method) { return function() { return stream[method].apply(stream, arguments); }; }(i); } } // proxy certain important events. var events = ['error', 'close', 'destroy', 'pause', 'resume']; forEach(events, function(ev) { stream.on(ev, self.emit.bind(self, ev)); }); // when we try to consume some more bytes, simply unpause the // underlying stream. self._read = function(n) { debug('wrapped _read', n); if (paused) { paused = false; stream.resume(); } }; return self; }; // exposed for testing purposes only. Readable._fromList = fromList; // Pluck off n bytes from an array of buffers. // Length is the combined lengths of all the buffers in the list. function fromList(n, state) { var list = state.buffer; var length = state.length; var stringMode = !!state.decoder; var objectMode = !!state.objectMode; var ret; // nothing in the list, definitely empty. if (list.length === 0) return null; if (length === 0) ret = null; else if (objectMode) ret = list.shift(); else if (!n || n >= length) { // read it all, truncate the array. if (stringMode) ret = list.join(''); else ret = Buffer.concat(list, length); list.length = 0; } else { // read just some of it. if (n < list[0].length) { // just take a part of the first list item. // slice is the same for buffers and strings. var buf = list[0]; ret = buf.slice(0, n); list[0] = buf.slice(n); } else if (n === list[0].length) { // first list is a perfect match ret = list.shift(); } else { // complex case. // we have enough to cover it, but it spans past the first buffer. if (stringMode) ret = ''; else ret = new Buffer(n); var c = 0; for (var i = 0, l = list.length; i < l && c < n; i++) { var buf = list[0]; var cpy = Math.min(n - c, buf.length); if (stringMode) ret += buf.slice(0, cpy); else buf.copy(ret, c, 0, cpy); if (cpy < buf.length) list[0] = buf.slice(cpy); else list.shift(); c += cpy; } } } return ret; } function endReadable(stream) { var state = stream._readableState; // If we get here before consuming all the bytes, then that is a // bug in node. Should never happen. if (state.length > 0) throw new Error('endReadable called on non-empty stream'); if (!state.endEmitted) { state.ended = true; processNextTick(endReadableNT, state, stream); } } function endReadableNT(state, stream) { // Check that we didn't get one last unshift. 
if (!state.endEmitted && state.length === 0) { state.endEmitted = true; stream.readable = false; stream.emit('end'); } } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } function indexOf (xs, x) { for (var i = 0, l = xs.length; i < l; i++) { if (xs[i] === x) return i; } return -1; } ././@LongLink0000000000000000000000000000015200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/_stream_transform.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/_stream_transfo0000644000000000000000000001441512631326456032302 0ustar 00000000000000// a transform stream is a readable/writable stream where you do // something with the data. Sometimes it's called a "filter", // but that's not a great name for it, since that implies a thing where // some bits pass through, and others are simply ignored. (That would // be a valid example of a transform, of course.) // // While the output is causally related to the input, it's not a // necessarily symmetric or synchronous transformation. For example, // a zlib stream might take multiple plain-text writes(), and then // emit a single compressed chunk some time in the future. // // Here's how this works: // // The Transform stream has all the aspects of the readable and writable // stream classes. When you write(chunk), that calls _write(chunk,cb) // internally, and returns false if there's a lot of pending writes // buffered up. When you call read(), that calls _read(n) until // there's enough pending readable data buffered up. // // In a transform stream, the written data is placed in a buffer. When // _read(n) is called, it transforms the queued up data, calling the // buffered _write cb's as it consumes chunks. If consuming a single // written chunk would result in multiple output chunks, then the first // outputted bit calls the readcb, and subsequent chunks just go into // the read buffer, and will cause it to emit 'readable' if necessary. // // This way, back-pressure is actually determined by the reading side, // since _read has to be called to start processing a new chunk. However, // a pathological inflate type of transform can cause excessive buffering // here. For example, imagine a stream where every byte of input is // interpreted as an integer from 0-255, and then results in that many // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in // 1kb of data being output. In this case, you could write a very small // amount of input, and end up with a very large amount of output. In // such a pathological inflating mechanism, there'd be no way to tell // the system to stop doing the transform. A single 4MB write could // cause the system to run out of memory. // // However, even in such a pathological case, only a single written chunk // would be consumed, and then the rest would wait (un-transformed) until // the results of the previous transformed chunk were consumed. 
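// A minimal concrete instance of the above, as a sketch (it uses the
// options.transform shorthand supported by the constructor below):
//
//     var Transform = require('readable-stream').Transform;
//     var upper = new Transform({
//       transform: function(chunk, encoding, cb) {
//         // one chunk in, one chunk out, so no buffering builds up
//         // on either side.
//         cb(null, chunk.toString().toUpperCase());
//       }
//     });
//     process.stdin.pipe(upper).pipe(process.stdout);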
'use strict'; module.exports = Transform; var Duplex = require('./_stream_duplex'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(Transform, Duplex); function TransformState(stream) { this.afterTransform = function(er, data) { return afterTransform(stream, er, data); }; this.needTransform = false; this.transforming = false; this.writecb = null; this.writechunk = null; } function afterTransform(stream, er, data) { var ts = stream._transformState; ts.transforming = false; var cb = ts.writecb; if (!cb) return stream.emit('error', new Error('no writecb in Transform class')); ts.writechunk = null; ts.writecb = null; if (data !== null && data !== undefined) stream.push(data); if (cb) cb(er); var rs = stream._readableState; rs.reading = false; if (rs.needReadable || rs.length < rs.highWaterMark) { stream._read(rs.highWaterMark); } } function Transform(options) { if (!(this instanceof Transform)) return new Transform(options); Duplex.call(this, options); this._transformState = new TransformState(this); // when the writable side finishes, then flush out anything remaining. var stream = this; // start out asking for a readable event once data is transformed. this._readableState.needReadable = true; // we have implemented the _read method, and done the other things // that Readable wants before the first _read call, so unset the // sync guard flag. this._readableState.sync = false; if (options) { if (typeof options.transform === 'function') this._transform = options.transform; if (typeof options.flush === 'function') this._flush = options.flush; } this.once('prefinish', function() { if (typeof this._flush === 'function') this._flush(function(er) { done(stream, er); }); else done(stream); }); } Transform.prototype.push = function(chunk, encoding) { this._transformState.needTransform = false; return Duplex.prototype.push.call(this, chunk, encoding); }; // This is the part where you do stuff! // override this function in implementation classes. // 'chunk' is an input chunk. // // Call `push(newChunk)` to pass along transformed output // to the readable side. You may call 'push' zero or more times. // // Call `cb(err)` when you are done with this chunk. If you pass // an error, then that'll put the hurt on the whole operation. If you // never call cb(), then you'll never get another chunk. Transform.prototype._transform = function(chunk, encoding, cb) { throw new Error('not implemented'); }; Transform.prototype._write = function(chunk, encoding, cb) { var ts = this._transformState; ts.writecb = cb; ts.writechunk = chunk; ts.writeencoding = encoding; if (!ts.transforming) { var rs = this._readableState; if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark); } }; // Doesn't matter what the args are here. // _transform does all the work. // That we got here means that the readable side wants more data. Transform.prototype._read = function(n) { var ts = this._transformState; if (ts.writechunk !== null && ts.writecb && !ts.transforming) { ts.transforming = true; this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform); } else { // mark that we need a transform, so that any data that comes in // will get processed, now that we've asked for it. 
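// (concretely: if read() lands here before any write() has arrived,
// the flag below makes the next _write() call _read() again -- see
// _write() just above -- so the freshly written chunk is transformed
// immediately rather than sitting in the write buffer.)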
ts.needTransform = true; } }; function done(stream, er) { if (er) return stream.emit('error', er); // if there's nothing in the write buffer, then that means // that nothing more will ever be provided var ws = stream._writableState; var ts = stream._transformState; if (ws.length) throw new Error('calling transform done when ws.length != 0'); if (ts.transforming) throw new Error('calling transform done when still transforming'); return stream.push(null); } ././@LongLink0000000000000000000000000000015100000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/_stream_writable.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/lib/_stream_writabl0000644000000000000000000003243512631326456032274 0ustar 00000000000000// A bit simpler than readable streams. // Implement an async ._write(chunk, cb), and it'll handle all // the drain event emission and buffering. 'use strict'; module.exports = Writable; /**/ var processNextTick = require('process-nextick-args'); /**/ /**/ var Buffer = require('buffer').Buffer; /**/ Writable.WritableState = WritableState; /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ /**/ var Stream; (function (){try{ Stream = require('st' + 'ream'); }catch(_){}finally{ if (!Stream) Stream = require('events').EventEmitter; }}()) /**/ var Buffer = require('buffer').Buffer; util.inherits(Writable, Stream); function nop() {} function WriteReq(chunk, encoding, cb) { this.chunk = chunk; this.encoding = encoding; this.callback = cb; this.next = null; } function WritableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // object stream flag to indicate whether or not this stream // contains buffers or objects. this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // the point at which write() starts returning false // Note: 0 is a valid value, means that we always return false if // the entire buffer is not flushed immediately on write() var hwm = options.highWaterMark; var defaultHwm = this.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.needDrain = false; // at the start of calling end() this.ending = false; // when end() has been called, and returned this.ended = false; // when 'finish' is emitted this.finished = false; // should we decode strings into buffers before passing to _write? // this is here so that some node-core streams can optimize string // handling at a lower level. var noDecode = options.decodeStrings === false; this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement // of how much we're waiting to get pushed to some underlying // socket or file. this.length = 0; // a flag to see when we're in the middle of a write. this.writing = false; // when true all writes will be buffered until .uncork() call this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. 
We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // a flag to know if we're processing previously buffered items, which // may call the _write() callback in the same tick, so that we don't // end up in an overlapped onwrite situation. this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb) this.onwrite = function(er) { onwrite(stream, er); }; // the callback that the user supplies to write(chunk,encoding,cb) this.writecb = null; // the amount that is being written when _write is called. this.writelen = 0; this.bufferedRequest = null; this.lastBufferedRequest = null; // number of pending user-supplied write callbacks // this must be 0 before 'finish' can be emitted this.pendingcb = 0; // emit prefinish if the only thing we're waiting for is _write cbs // This is relevant for synchronous Transform streams this.prefinished = false; // True if the error was already emitted and should not be thrown again this.errorEmitted = false; } WritableState.prototype.getBuffer = function writableStateGetBuffer() { var current = this.bufferedRequest; var out = []; while (current) { out.push(current); current = current.next; } return out; }; (function (){try { Object.defineProperty(WritableState.prototype, 'buffer', { get: require('util-deprecate')(function() { return this.getBuffer(); }, '_writableState.buffer is deprecated. Use ' + '_writableState.getBuffer() instead.') }); }catch(_){}}()); function Writable(options) { var Duplex = require('./_stream_duplex'); // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. if (!(this instanceof Writable) && !(this instanceof Duplex)) return new Writable(options); this._writableState = new WritableState(options, this); // legacy. this.writable = true; if (options) { if (typeof options.write === 'function') this._write = options.write; if (typeof options.writev === 'function') this._writev = options.writev; } Stream.call(this); } // Otherwise people can pipe Writable streams, which is just wrong. Writable.prototype.pipe = function() { this.emit('error', new Error('Cannot pipe. Not readable.')); }; function writeAfterEnd(stream, cb) { var er = new Error('write after end'); // TODO: defer error events consistently everywhere, not just the cb stream.emit('error', er); processNextTick(cb, er); } // If we get something that is not a buffer, string, null, or undefined, // and we're not in objectMode, then that's an error. // Otherwise stream chunks are all considered to be of length=1, and the // watermarks determine how many objects to keep in the buffer, rather than // how many bytes or characters. 
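// A sketch of that object-counting behavior (assumed numbers, not from
// the source):
//
//     var Writable = require('readable-stream').Writable;
//     var ws = new Writable({
//       objectMode: true,
//       highWaterMark: 2,
//       write: function(obj, encoding, cb) { setImmediate(cb); }
//     });
//
// Here ws.write() returns false as soon as the pending count reaches
// the watermark of 2, regardless of how many bytes each object
// happens to occupy.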
function validChunk(stream, state, chunk, cb) { var valid = true; if (!(Buffer.isBuffer(chunk)) && typeof chunk !== 'string' && chunk !== null && chunk !== undefined && !state.objectMode) { var er = new TypeError('Invalid non-string/buffer chunk'); stream.emit('error', er); processNextTick(cb, er); valid = false; } return valid; } Writable.prototype.write = function(chunk, encoding, cb) { var state = this._writableState; var ret = false; if (typeof encoding === 'function') { cb = encoding; encoding = null; } if (Buffer.isBuffer(chunk)) encoding = 'buffer'; else if (!encoding) encoding = state.defaultEncoding; if (typeof cb !== 'function') cb = nop; if (state.ended) writeAfterEnd(this, cb); else if (validChunk(this, state, chunk, cb)) { state.pendingcb++; ret = writeOrBuffer(this, state, chunk, encoding, cb); } return ret; }; Writable.prototype.cork = function() { var state = this._writableState; state.corked++; }; Writable.prototype.uncork = function() { var state = this._writableState; if (state.corked) { state.corked--; if (!state.writing && !state.corked && !state.finished && !state.bufferProcessing && state.bufferedRequest) clearBuffer(this, state); } }; Writable.prototype.setDefaultEncoding = function setDefaultEncoding(encoding) { // node::ParseEncoding() requires lower case. if (typeof encoding === 'string') encoding = encoding.toLowerCase(); if (!(['hex', 'utf8', 'utf-8', 'ascii', 'binary', 'base64', 'ucs2', 'ucs-2','utf16le', 'utf-16le', 'raw'] .indexOf((encoding + '').toLowerCase()) > -1)) throw new TypeError('Unknown encoding: ' + encoding); this._writableState.defaultEncoding = encoding; }; function decodeChunk(state, chunk, encoding) { if (!state.objectMode && state.decodeStrings !== false && typeof chunk === 'string') { chunk = new Buffer(chunk, encoding); } return chunk; } // if we're already writing something, then just put this // in the queue, and wait our turn. Otherwise, call _write // If we return false, then we need a drain event, so set that flag. function writeOrBuffer(stream, state, chunk, encoding, cb) { chunk = decodeChunk(state, chunk, encoding); if (Buffer.isBuffer(chunk)) encoding = 'buffer'; var len = state.objectMode ? 1 : chunk.length; state.length += len; var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false. 
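// (the caller's half of that contract, sketched: when write() returns
// false, stop writing and wait for 'drain' --
//
//     if (!ws.write(chunk)) ws.once('drain', resumeWriting);
//
// where resumeWriting is a hypothetical callback; the needDrain flag
// set just below is what arms that 'drain' emission.)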
if (!ret) state.needDrain = true; if (state.writing || state.corked) { var last = state.lastBufferedRequest; state.lastBufferedRequest = new WriteReq(chunk, encoding, cb); if (last) { last.next = state.lastBufferedRequest; } else { state.bufferedRequest = state.lastBufferedRequest; } } else { doWrite(stream, state, false, len, chunk, encoding, cb); } return ret; } function doWrite(stream, state, writev, len, chunk, encoding, cb) { state.writelen = len; state.writecb = cb; state.writing = true; state.sync = true; if (writev) stream._writev(chunk, state.onwrite); else stream._write(chunk, encoding, state.onwrite); state.sync = false; } function onwriteError(stream, state, sync, er, cb) { --state.pendingcb; if (sync) processNextTick(cb, er); else cb(er); stream._writableState.errorEmitted = true; stream.emit('error', er); } function onwriteStateUpdate(state) { state.writing = false; state.writecb = null; state.length -= state.writelen; state.writelen = 0; } function onwrite(stream, er) { var state = stream._writableState; var sync = state.sync; var cb = state.writecb; onwriteStateUpdate(state); if (er) onwriteError(stream, state, sync, er, cb); else { // Check if we're actually ready to finish, but don't emit yet var finished = needFinish(state); if (!finished && !state.corked && !state.bufferProcessing && state.bufferedRequest) { clearBuffer(stream, state); } if (sync) { processNextTick(afterWrite, stream, state, finished, cb); } else { afterWrite(stream, state, finished, cb); } } } function afterWrite(stream, state, finished, cb) { if (!finished) onwriteDrain(stream, state); state.pendingcb--; cb(); finishMaybe(stream, state); } // Must force callback to be called on nextTick, so that we don't // emit 'drain' before the write() consumer gets the 'false' return // value, and has a chance to attach a 'drain' listener. function onwriteDrain(stream, state) { if (state.length === 0 && state.needDrain) { state.needDrain = false; stream.emit('drain'); } } // if there's something in the buffer waiting, then process it function clearBuffer(stream, state) { state.bufferProcessing = true; var entry = state.bufferedRequest; if (stream._writev && entry && entry.next) { // Fast case, write everything using _writev() var buffer = []; var cbs = []; while (entry) { cbs.push(entry.callback); buffer.push(entry); entry = entry.next; } // count the one we are adding, as well. // TODO(isaacs) clean this up state.pendingcb++; state.lastBufferedRequest = null; doWrite(stream, state, true, state.length, buffer, '', function(err) { for (var i = 0; i < cbs.length; i++) { state.pendingcb--; cbs[i](err); } }); // Clear buffer } else { // Slow case, write chunks one-by-one while (entry) { var chunk = entry.chunk; var encoding = entry.encoding; var cb = entry.callback; var len = state.objectMode ? 1 : chunk.length; doWrite(stream, state, false, len, chunk, encoding, cb); entry = entry.next; // if we didn't call the onwrite immediately, then // it means that we need to wait until it does. // also, that means that the chunk and cb are currently // being processed, so move the buffer counter past them. 
if (state.writing) { break; } } if (entry === null) state.lastBufferedRequest = null; } state.bufferedRequest = entry; state.bufferProcessing = false; } Writable.prototype._write = function(chunk, encoding, cb) { cb(new Error('not implemented')); }; Writable.prototype._writev = null; Writable.prototype.end = function(chunk, encoding, cb) { var state = this._writableState; if (typeof chunk === 'function') { cb = chunk; chunk = null; encoding = null; } else if (typeof encoding === 'function') { cb = encoding; encoding = null; } if (chunk !== null && chunk !== undefined) this.write(chunk, encoding); // .end() fully uncorks if (state.corked) { state.corked = 1; this.uncork(); } // ignore unnecessary end() calls. if (!state.ending && !state.finished) endWritable(this, state, cb); }; function needFinish(state) { return (state.ending && state.length === 0 && state.bufferedRequest === null && !state.finished && !state.writing); } function prefinish(stream, state) { if (!state.prefinished) { state.prefinished = true; stream.emit('prefinish'); } } function finishMaybe(stream, state) { var need = needFinish(state); if (need) { if (state.pendingcb === 0) { prefinish(stream, state); state.finished = true; stream.emit('finish'); } else { prefinish(stream, state); } } return need; } function endWritable(stream, state, cb) { state.ending = true; finishMaybe(stream, state); if (cb) { if (state.finished) processNextTick(cb); else stream.once('finish', cb); } state.ended = true; } ././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-util-is/npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-u0000755000000000000000000000000012631326456032206 5ustar 00000000000000././@LongLink0000000000000000000000000000014700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarray/npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarra0000755000000000000000000000000012631326456032275 5ustar 00000000000000././@LongLink0000000000000000000000000000016400000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/process-nextick-args/npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/proces0000755000000000000000000000000012631326456032307 5ustar 00000000000000././@LongLink0000000000000000000000000000015600000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string_decoder/npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string0000755000000000000000000000000012631326456032322 5ustar 00000000000000././@LongLink0000000000000000000000000000015600000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-deprecate/npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-d0000755000000000000000000000000012631326456032212 5ustar 00000000000000././@LongLink0000000000000000000000000000016500000000000011217 Lustar 
00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-util-is/README.mdnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-u0000644000000000000000000000010312631326456032202 0ustar 00000000000000# core-util-is The `util.is*` functions introduced in Node v0.12. ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-util-is/float.patchnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-u0000644000000000000000000003762612631326456032226 0ustar 00000000000000diff --git a/lib/util.js b/lib/util.js index a03e874..9074e8e 100644 --- a/lib/util.js +++ b/lib/util.js @@ -19,430 +19,6 @@ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. -var formatRegExp = /%[sdj%]/g; -exports.format = function(f) { - if (!isString(f)) { - var objects = []; - for (var i = 0; i < arguments.length; i++) { - objects.push(inspect(arguments[i])); - } - return objects.join(' '); - } - - var i = 1; - var args = arguments; - var len = args.length; - var str = String(f).replace(formatRegExp, function(x) { - if (x === '%%') return '%'; - if (i >= len) return x; - switch (x) { - case '%s': return String(args[i++]); - case '%d': return Number(args[i++]); - case '%j': - try { - return JSON.stringify(args[i++]); - } catch (_) { - return '[Circular]'; - } - default: - return x; - } - }); - for (var x = args[i]; i < len; x = args[++i]) { - if (isNull(x) || !isObject(x)) { - str += ' ' + x; - } else { - str += ' ' + inspect(x); - } - } - return str; -}; - - -// Mark that a method should not be used. -// Returns a modified function which warns once by default. -// If --no-deprecation is set, then it is a no-op. -exports.deprecate = function(fn, msg) { - // Allow for deprecating things in the process of starting up. - if (isUndefined(global.process)) { - return function() { - return exports.deprecate(fn, msg).apply(this, arguments); - }; - } - - if (process.noDeprecation === true) { - return fn; - } - - var warned = false; - function deprecated() { - if (!warned) { - if (process.throwDeprecation) { - throw new Error(msg); - } else if (process.traceDeprecation) { - console.trace(msg); - } else { - console.error(msg); - } - warned = true; - } - return fn.apply(this, arguments); - } - - return deprecated; -}; - - -var debugs = {}; -var debugEnviron; -exports.debuglog = function(set) { - if (isUndefined(debugEnviron)) - debugEnviron = process.env.NODE_DEBUG || ''; - set = set.toUpperCase(); - if (!debugs[set]) { - if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { - var pid = process.pid; - debugs[set] = function() { - var msg = exports.format.apply(exports, arguments); - console.error('%s %d: %s', set, pid, msg); - }; - } else { - debugs[set] = function() {}; - } - } - return debugs[set]; -}; - - -/** - * Echos the value of a value. Trys to print the value out - * in the best way possible given the different types. - * - * @param {Object} obj The object to print out. - * @param {Object} opts Optional options object that alters the output. - */ -/* legacy: obj, showHidden, depth, colors*/ -function inspect(obj, opts) { - // default options - var ctx = { - seen: [], - stylize: stylizeNoColor - }; - // legacy... 
- if (arguments.length >= 3) ctx.depth = arguments[2]; - if (arguments.length >= 4) ctx.colors = arguments[3]; - if (isBoolean(opts)) { - // legacy... - ctx.showHidden = opts; - } else if (opts) { - // got an "options" object - exports._extend(ctx, opts); - } - // set default options - if (isUndefined(ctx.showHidden)) ctx.showHidden = false; - if (isUndefined(ctx.depth)) ctx.depth = 2; - if (isUndefined(ctx.colors)) ctx.colors = false; - if (isUndefined(ctx.customInspect)) ctx.customInspect = true; - if (ctx.colors) ctx.stylize = stylizeWithColor; - return formatValue(ctx, obj, ctx.depth); -} -exports.inspect = inspect; - - -// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics -inspect.colors = { - 'bold' : [1, 22], - 'italic' : [3, 23], - 'underline' : [4, 24], - 'inverse' : [7, 27], - 'white' : [37, 39], - 'grey' : [90, 39], - 'black' : [30, 39], - 'blue' : [34, 39], - 'cyan' : [36, 39], - 'green' : [32, 39], - 'magenta' : [35, 39], - 'red' : [31, 39], - 'yellow' : [33, 39] -}; - -// Don't use 'blue' not visible on cmd.exe -inspect.styles = { - 'special': 'cyan', - 'number': 'yellow', - 'boolean': 'yellow', - 'undefined': 'grey', - 'null': 'bold', - 'string': 'green', - 'date': 'magenta', - // "name": intentionally not styling - 'regexp': 'red' -}; - - -function stylizeWithColor(str, styleType) { - var style = inspect.styles[styleType]; - - if (style) { - return '\u001b[' + inspect.colors[style][0] + 'm' + str + - '\u001b[' + inspect.colors[style][1] + 'm'; - } else { - return str; - } -} - - -function stylizeNoColor(str, styleType) { - return str; -} - - -function arrayToHash(array) { - var hash = {}; - - array.forEach(function(val, idx) { - hash[val] = true; - }); - - return hash; -} - - -function formatValue(ctx, value, recurseTimes) { - // Provide a hook for user-specified inspect functions. - // Check that value is an object with an inspect function on it - if (ctx.customInspect && - value && - isFunction(value.inspect) && - // Filter out the util module, it's inspect function is special - value.inspect !== exports.inspect && - // Also filter out any prototype objects using the circular check. - !(value.constructor && value.constructor.prototype === value)) { - var ret = value.inspect(recurseTimes, ctx); - if (!isString(ret)) { - ret = formatValue(ctx, ret, recurseTimes); - } - return ret; - } - - // Primitive types cannot have properties - var primitive = formatPrimitive(ctx, value); - if (primitive) { - return primitive; - } - - // Look up the keys of the object. - var keys = Object.keys(value); - var visibleKeys = arrayToHash(keys); - - if (ctx.showHidden) { - keys = Object.getOwnPropertyNames(value); - } - - // Some type of object without properties can be shortcutted. - if (keys.length === 0) { - if (isFunction(value)) { - var name = value.name ? ': ' + value.name : ''; - return ctx.stylize('[Function' + name + ']', 'special'); - } - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } - if (isDate(value)) { - return ctx.stylize(Date.prototype.toString.call(value), 'date'); - } - if (isError(value)) { - return formatError(value); - } - } - - var base = '', array = false, braces = ['{', '}']; - - // Make Array say that they are Array - if (isArray(value)) { - array = true; - braces = ['[', ']']; - } - - // Make functions say that they are functions - if (isFunction(value)) { - var n = value.name ? 
': ' + value.name : ''; - base = ' [Function' + n + ']'; - } - - // Make RegExps say that they are RegExps - if (isRegExp(value)) { - base = ' ' + RegExp.prototype.toString.call(value); - } - - // Make dates with properties first say the date - if (isDate(value)) { - base = ' ' + Date.prototype.toUTCString.call(value); - } - - // Make error with message first say the error - if (isError(value)) { - base = ' ' + formatError(value); - } - - if (keys.length === 0 && (!array || value.length == 0)) { - return braces[0] + base + braces[1]; - } - - if (recurseTimes < 0) { - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } else { - return ctx.stylize('[Object]', 'special'); - } - } - - ctx.seen.push(value); - - var output; - if (array) { - output = formatArray(ctx, value, recurseTimes, visibleKeys, keys); - } else { - output = keys.map(function(key) { - return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array); - }); - } - - ctx.seen.pop(); - - return reduceToSingleString(output, base, braces); -} - - -function formatPrimitive(ctx, value) { - if (isUndefined(value)) - return ctx.stylize('undefined', 'undefined'); - if (isString(value)) { - var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '') - .replace(/'/g, "\\'") - .replace(/\\"/g, '"') + '\''; - return ctx.stylize(simple, 'string'); - } - if (isNumber(value)) { - // Format -0 as '-0'. Strict equality won't distinguish 0 from -0, - // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 . - if (value === 0 && 1 / value < 0) - return ctx.stylize('-0', 'number'); - return ctx.stylize('' + value, 'number'); - } - if (isBoolean(value)) - return ctx.stylize('' + value, 'boolean'); - // For some reason typeof null is "object", so special case here. 
- if (isNull(value)) - return ctx.stylize('null', 'null'); -} - - -function formatError(value) { - return '[' + Error.prototype.toString.call(value) + ']'; -} - - -function formatArray(ctx, value, recurseTimes, visibleKeys, keys) { - var output = []; - for (var i = 0, l = value.length; i < l; ++i) { - if (hasOwnProperty(value, String(i))) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - String(i), true)); - } else { - output.push(''); - } - } - keys.forEach(function(key) { - if (!key.match(/^\d+$/)) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - key, true)); - } - }); - return output; -} - - -function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) { - var name, str, desc; - desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] }; - if (desc.get) { - if (desc.set) { - str = ctx.stylize('[Getter/Setter]', 'special'); - } else { - str = ctx.stylize('[Getter]', 'special'); - } - } else { - if (desc.set) { - str = ctx.stylize('[Setter]', 'special'); - } - } - if (!hasOwnProperty(visibleKeys, key)) { - name = '[' + key + ']'; - } - if (!str) { - if (ctx.seen.indexOf(desc.value) < 0) { - if (isNull(recurseTimes)) { - str = formatValue(ctx, desc.value, null); - } else { - str = formatValue(ctx, desc.value, recurseTimes - 1); - } - if (str.indexOf('\n') > -1) { - if (array) { - str = str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n').substr(2); - } else { - str = '\n' + str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n'); - } - } - } else { - str = ctx.stylize('[Circular]', 'special'); - } - } - if (isUndefined(name)) { - if (array && key.match(/^\d+$/)) { - return str; - } - name = JSON.stringify('' + key); - if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) { - name = name.substr(1, name.length - 2); - name = ctx.stylize(name, 'name'); - } else { - name = name.replace(/'/g, "\\'") - .replace(/\\"/g, '"') - .replace(/(^"|"$)/g, "'"); - name = ctx.stylize(name, 'string'); - } - } - - return name + ': ' + str; -} - - -function reduceToSingleString(output, base, braces) { - var numLinesEst = 0; - var length = output.reduce(function(prev, cur) { - numLinesEst++; - if (cur.indexOf('\n') >= 0) numLinesEst++; - return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1; - }, 0); - - if (length > 60) { - return braces[0] + - (base === '' ? '' : base + '\n ') + - ' ' + - output.join(',\n ') + - ' ' + - braces[1]; - } - - return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1]; -} - - // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. function isArray(ar) { @@ -522,166 +98,10 @@ function isPrimitive(arg) { exports.isPrimitive = isPrimitive; function isBuffer(arg) { - return arg instanceof Buffer; + return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); -} - - -function pad(n) { - return n < 10 ? 
'0' + n.toString(10) : n.toString(10); -} - - -var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', - 'Oct', 'Nov', 'Dec']; - -// 26 Feb 16:19:34 -function timestamp() { - var d = new Date(); - var time = [pad(d.getHours()), - pad(d.getMinutes()), - pad(d.getSeconds())].join(':'); - return [d.getDate(), months[d.getMonth()], time].join(' '); -} - - -// log is just a thin wrapper to console.log that prepends a timestamp -exports.log = function() { - console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments)); -}; - - -/** - * Inherit the prototype methods from one constructor into another. - * - * The Function.prototype.inherits from lang.js rewritten as a standalone - * function (not on Function.prototype). NOTE: If this file is to be loaded - * during bootstrapping this function needs to be rewritten using some native - * functions as prototype setup using normal JavaScript does not work as - * expected during bootstrapping (see mirror.js in r114903). - * - * @param {function} ctor Constructor function which needs to inherit the - * prototype. - * @param {function} superCtor Constructor function to inherit prototype from. - */ -exports.inherits = function(ctor, superCtor) { - ctor.super_ = superCtor; - ctor.prototype = Object.create(superCtor.prototype, { - constructor: { - value: ctor, - enumerable: false, - writable: true, - configurable: true - } - }); -}; - -exports._extend = function(origin, add) { - // Don't do anything if add isn't an object - if (!add || !isObject(add)) return origin; - - var keys = Object.keys(add); - var i = keys.length; - while (i--) { - origin[keys[i]] = add[keys[i]]; - } - return origin; -}; - -function hasOwnProperty(obj, prop) { - return Object.prototype.hasOwnProperty.call(obj, prop); -} - - -// Deprecated old stuff. 
- -exports.p = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - console.error(exports.inspect(arguments[i])); - } -}, 'util.p: Use console.error() instead'); - - -exports.exec = exports.deprecate(function() { - return require('child_process').exec.apply(this, arguments); -}, 'util.exec is now called `child_process.exec`.'); - - -exports.print = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(String(arguments[i])); - } -}, 'util.print: Use console.log instead'); - - -exports.puts = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(arguments[i] + '\n'); - } -}, 'util.puts: Use console.log instead'); - - -exports.debug = exports.deprecate(function(x) { - process.stderr.write('DEBUG: ' + x + '\n'); -}, 'util.debug: Use console.error instead'); - - -exports.error = exports.deprecate(function(x) { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stderr.write(arguments[i] + '\n'); - } -}, 'util.error: Use console.error instead'); - - -exports.pump = exports.deprecate(function(readStream, writeStream, callback) { - var callbackCalled = false; - - function call(a, b, c) { - if (callback && !callbackCalled) { - callback(a, b, c); - callbackCalled = true; - } - } - - readStream.addListener('data', function(chunk) { - if (writeStream.write(chunk) === false) readStream.pause(); - }); - - writeStream.addListener('drain', function() { - readStream.resume(); - }); - - readStream.addListener('end', function() { - writeStream.end(); - }); - - readStream.addListener('close', function() { - call(); - }); - - readStream.addListener('error', function(err) { - writeStream.end(); - call(err); - }); - - writeStream.addListener('error', function(err) { - readStream.destroy(); - call(err); - }); -}, 'util.pump(): Use readableStream.pipe() instead'); - - -var uv; -exports._errnoException = function(err, syscall) { - if (isUndefined(uv)) uv = process.binding('uv'); - var errname = uv.errname(err); - var e = new Error(syscall + ' ' + errname); - e.code = errname; - e.errno = errname; - e.syscall = syscall; - return e; -}; +}././@LongLink0000000000000000000000000000016000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-util-is/lib/npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-u0000755000000000000000000000000012631326456032206 5ustar 00000000000000././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-util-is/package.jsonnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-u0000644000000000000000000000175012631326456032213 0ustar 00000000000000{ "name": "core-util-is", "version": "1.0.1", "description": "The `util.is*` functions introduced in Node v0.12.", "main": "lib/util.js", "repository": { "type": "git", "url": "git://github.com/isaacs/core-util-is.git" }, "keywords": [ "util", "isBuffer", "isArray", "isNumber", "isString", "isRegExp", "isThis", "isThat", "polyfill" ], "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "bugs": { "url": "https://github.com/isaacs/core-util-is/issues" }, "readme": "# core-util-is\n\nThe `util.is*` functions introduced in Node v0.12.\n", "readmeFilename": "README.md", "homepage": "https://github.com/isaacs/core-util-is#readme", "_id": "core-util-is@1.0.1", "_shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz", "_from": "core-util-is@>=1.0.0 <1.1.0" } ././@LongLink0000000000000000000000000000016300000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-util-is/util.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-u0000644000000000000000000000570412631326456032216 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && objectToString(e) === '[object Error]'; } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return arg instanceof Buffer; } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); } ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-util-is/lib/util.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/core-u0000644000000000000000000000574012631326456032216 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && (objectToString(e) === '[object Error]' || e instanceof Error); } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); }././@LongLink0000000000000000000000000000016000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarray/README.mdnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarra0000644000000000000000000000302512631326456032277 0ustar 00000000000000 # isarray `Array#isArray` for older browsers. ## Usage ```js var isArray = require('isarray'); console.log(isArray([])); // => true console.log(isArray({})); // => false ``` ## Installation With [npm](http://npmjs.org) do ```bash $ npm install isarray ``` Then bundle for the browser with [browserify](https://github.com/substack/browserify). With [component](http://component.io) do ```bash $ component install juliangruber/isarray ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000015500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarray/build/npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarra0000755000000000000000000000000012631326456032275 5ustar 00000000000000././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarray/component.jsonnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarra0000644000000000000000000000072612631326456032304 0ustar 00000000000000{ "name" : "isarray", "description" : "Array#isArray for older browsers", "version" : "0.0.1", "repository" : "juliangruber/isarray", "homepage": "https://github.com/juliangruber/isarray", "main" : "index.js", "scripts" : [ "index.js" ], "dependencies" : {}, "keywords": ["browser","isarray","array"], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT" } ././@LongLink0000000000000000000000000000015700000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarray/index.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarra0000644000000000000000000000017012631326456032275 0ustar 00000000000000module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; ././@LongLink0000000000000000000000000000016300000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarray/package.jsonnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarra0000644000000000000000000000472712631326456032311 0ustar 00000000000000{ "name": "isarray", "description": "Array#isArray for older browsers", "version": "0.0.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/isarray.git" }, "homepage": "https://github.com/juliangruber/isarray", "main": "index.js", "scripts": { "test": "tap test/*.js" }, "dependencies": {}, "devDependencies": { "tap": "*" }, "keywords": [ "browser", "isarray", "array" ], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "readme": "\n# isarray\n\n`Array#isArray` for older browsers.\n\n## Usage\n\n```js\nvar isArray = require('isarray');\n\nconsole.log(isArray([])); // => true\nconsole.log(isArray({})); // => false\n```\n\n## Installation\n\nWith [npm](http://npmjs.org) do\n\n```bash\n$ npm install isarray\n```\n\nThen bundle for the browser with\n[browserify](https://github.com/substack/browserify).\n\nWith [component](http://component.io) do\n\n```bash\n$ component install juliangruber/isarray\n```\n\n## License\n\n(MIT)\n\nCopyright (c) 2013 Julian Gruber <julian@juliangruber.com>\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without 
restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/juliangruber/isarray/issues" }, "_id": "isarray@0.0.1", "_shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "_resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz", "_from": "isarray@0.0.1" } ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarray/build/build.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/isarra0000644000000000000000000000777112631326456032313 0ustar 00000000000000 /** * Require the given path. * * @param {String} path * @return {Object} exports * @api public */ function require(path, parent, orig) { var resolved = require.resolve(path); // lookup failed if (null == resolved) { orig = orig || path; parent = parent || 'root'; var err = new Error('Failed to require "' + orig + '" from "' + parent + '"'); err.path = orig; err.parent = parent; err.require = true; throw err; } var module = require.modules[resolved]; // perform real require() // by invoking the module's // registered function if (!module.exports) { module.exports = {}; module.client = module.component = true; module.call(this, module.exports, require.relative(resolved), module); } return module.exports; } /** * Registered modules. */ require.modules = {}; /** * Registered aliases. */ require.aliases = {}; /** * Resolve `path`. * * Lookup: * * - PATH/index.js * - PATH.js * - PATH * * @param {String} path * @return {String} path or null * @api private */ require.resolve = function(path) { if (path.charAt(0) === '/') path = path.slice(1); var index = path + '/index.js'; var paths = [ path, path + '.js', path + '.json', path + '/index.js', path + '/index.json' ]; for (var i = 0; i < paths.length; i++) { var path = paths[i]; if (require.modules.hasOwnProperty(path)) return path; } if (require.aliases.hasOwnProperty(index)) { return require.aliases[index]; } }; /** * Normalize `path` relative to the current path. * * @param {String} curr * @param {String} path * @return {String} * @api private */ require.normalize = function(curr, path) { var segs = []; if ('.' != path.charAt(0)) return path; curr = curr.split('/'); path = path.split('/'); for (var i = 0; i < path.length; ++i) { if ('..' == path[i]) { curr.pop(); } else if ('.' != path[i] && '' != path[i]) { segs.push(path[i]); } } return curr.concat(segs).join('/'); }; /** * Register module at `path` with callback `definition`. 
* * @param {String} path * @param {Function} definition * @api private */ require.register = function(path, definition) { require.modules[path] = definition; }; /** * Alias a module definition. * * @param {String} from * @param {String} to * @api private */ require.alias = function(from, to) { if (!require.modules.hasOwnProperty(from)) { throw new Error('Failed to alias "' + from + '", it does not exist'); } require.aliases[to] = from; }; /** * Return a require function relative to the `parent` path. * * @param {String} parent * @return {Function} * @api private */ require.relative = function(parent) { var p = require.normalize(parent, '..'); /** * lastIndexOf helper. */ function lastIndexOf(arr, obj) { var i = arr.length; while (i--) { if (arr[i] === obj) return i; } return -1; } /** * The relative require() itself. */ function localRequire(path) { var resolved = localRequire.resolve(path); return require(resolved, parent, path); } /** * Resolve relative to the parent. */ localRequire.resolve = function(path) { var c = path.charAt(0); if ('/' == c) return path.slice(1); if ('.' == c) return require.normalize(p, path); // resolve deps by returning // the dep in the nearest "deps" // directory var segs = parent.split('/'); var i = lastIndexOf(segs, 'deps') + 1; if (!i) i = 0; path = segs.slice(0, i + 1).join('/') + '/deps/' + path; return path; }; /** * Check if module is defined at `path`. */ localRequire.exists = function(path) { return require.modules.hasOwnProperty(localRequire.resolve(path)); }; return localRequire; }; require.register("isarray/index.js", function(exports, require, module){ module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; }); require.alias("isarray/index.js", "isarray/index.js"); ././@LongLink0000000000000000000000000000017700000000000011222 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/process-nextick-args/.travis.ymlnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/proces0000644000000000000000000000012112631326456032303 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10" - "0.11" - "0.12" - "iojs" ././@LongLink0000000000000000000000000000017400000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/process-nextick-args/index.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/proces0000644000000000000000000000040712631326456032312 0ustar 00000000000000'use strict'; module.exports = nextTick; function nextTick(fn) { var args = new Array(arguments.length - 1); var i = 0; while (i < args.length) { args[i++] = arguments[i]; } process.nextTick(function afterTick() { fn.apply(null, args); }); } ././@LongLink0000000000000000000000000000017600000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/process-nextick-args/license.mdnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/proces0000644000000000000000000000205012631326456032306 0ustar 00000000000000# Copyright (c) 2015 Calvin Metcalf Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, 
publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. **THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.** ././@LongLink0000000000000000000000000000020000000000000011205 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/process-nextick-args/package.jsonnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/proces0000644000000000000000000000244012631326456032311 0ustar 00000000000000{ "name": "process-nextick-args", "version": "1.0.3", "description": "process.nextTick but always with args", "main": "index.js", "scripts": { "test": "node test.js" }, "repository": { "type": "git", "url": "git+https://github.com/calvinmetcalf/process-nextick-args.git" }, "author": "", "license": "MIT", "bugs": { "url": "https://github.com/calvinmetcalf/process-nextick-args/issues" }, "homepage": "https://github.com/calvinmetcalf/process-nextick-args", "devDependencies": { "tap": "~0.2.6" }, "readme": "process-nextick-args\n=====\n\n[![Build Status](https://travis-ci.org/calvinmetcalf/process-nextick-args.svg?branch=master)](https://travis-ci.org/calvinmetcalf/process-nextick-args)\n\n```bash\nnpm install --save process-nextick-args\n```\n\nAlways be able to pass arguments to process.nextTick, no matter the platform\n\n```js\nvar nextTick = require('process-nextick-args');\n\nnextTick(function (a, b, c) {\n console.log(a, b, c);\n}, 'step', 3, 'profit');\n```\n", "readmeFilename": "readme.md", "_id": "process-nextick-args@1.0.3", "_shasum": "e272eed825d5e9f4ea74d8d73b1fe311c3beb630", "_resolved": "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-1.0.3.tgz", "_from": "process-nextick-args@>=1.0.0 <1.1.0" } ././@LongLink0000000000000000000000000000017500000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/process-nextick-args/readme.mdnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/proces0000644000000000000000000000070312631326456032311 0ustar 00000000000000process-nextick-args ===== [![Build Status](https://travis-ci.org/calvinmetcalf/process-nextick-args.svg?branch=master)](https://travis-ci.org/calvinmetcalf/process-nextick-args) ```bash npm install --save process-nextick-args ``` Always be able to pass arguments to process.nextTick, no matter the platform ```js var nextTick = require('process-nextick-args'); nextTick(function (a, b, c) { console.log(a, b, c); }, 'step', 3, 'profit'); ``` ././@LongLink0000000000000000000000000000017300000000000011216 Lustar 
00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/process-nextick-args/test.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/proces0000644000000000000000000000101612631326456032307 0ustar 00000000000000var test = require("tap").test; var nextTick = require('./'); test('should work', function (t) { t.plan(5); nextTick(function (a) { t.ok(a); nextTick(function (thing) { t.equals(thing, 7); }, 7); }, true); nextTick(function (a, b, c) { t.equals(a, 'step'); t.equals(b, 3); t.equals(c, 'profit'); }, 'step', 3, 'profit'); }); test('correct number of arguments', function (t) { t.plan(1); nextTick(function () { t.equals(2, arguments.length, 'correct number'); }, 1, 2); }); ././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string_decoder/.npmignorenpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string0000644000000000000000000000001312631326456032316 0ustar 00000000000000build test ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string_decoder/LICENSEnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string0000644000000000000000000000206412631326456032326 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string_decoder/README.mdnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string0000644000000000000000000000076212631326456032331 0ustar 00000000000000**string_decoder.js** (`require('string_decoder')`) from Node.js core Copyright Joyent, Inc. and other Node contributors. See LICENCE file for details. Version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. 
**Prefer the stable version over the unstable.** The *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.././@LongLink0000000000000000000000000000016600000000000011220 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string_decoder/index.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string0000644000000000000000000001716412631326456032335 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. var Buffer = require('buffer').Buffer; var isBufferEncoding = Buffer.isEncoding || function(encoding) { switch (encoding && encoding.toLowerCase()) { case 'hex': case 'utf8': case 'utf-8': case 'ascii': case 'binary': case 'base64': case 'ucs2': case 'ucs-2': case 'utf16le': case 'utf-16le': case 'raw': return true; default: return false; } } function assertEncoding(encoding) { if (encoding && !isBufferEncoding(encoding)) { throw new Error('Unknown encoding: ' + encoding); } } // StringDecoder provides an interface for efficiently splitting a series of // buffers into a series of JS strings without breaking apart multi-byte // characters. CESU-8 is handled as part of the UTF-8 encoding. // // @TODO Handling all encodings inside a single object makes it very difficult // to reason about this code, so it should be split up in the future. // @TODO There should be a utf8-strict encoding that rejects invalid UTF-8 code // points as used by CESU-8. var StringDecoder = exports.StringDecoder = function(encoding) { this.encoding = (encoding || 'utf8').toLowerCase().replace(/[-_]/, ''); assertEncoding(encoding); switch (this.encoding) { case 'utf8': // CESU-8 represents each of Surrogate Pair by 3-bytes this.surrogateSize = 3; break; case 'ucs2': case 'utf16le': // UTF-16 represents each of Surrogate Pair by 2-bytes this.surrogateSize = 2; this.detectIncompleteChar = utf16DetectIncompleteChar; break; case 'base64': // Base-64 stores 3 bytes in 4 chars, and pads the remainder. this.surrogateSize = 3; this.detectIncompleteChar = base64DetectIncompleteChar; break; default: this.write = passThroughWrite; return; } // Enough space to store all bytes of a single character. UTF-8 needs 4 // bytes, but CESU-8 may require up to 6 (3 bytes per surrogate). 
this.charBuffer = new Buffer(6); // Number of bytes received for the current incomplete multi-byte character. this.charReceived = 0; // Number of bytes expected for the current incomplete multi-byte character. this.charLength = 0; }; // write decodes the given buffer and returns it as JS string that is // guaranteed to not contain any partial multi-byte characters. Any partial // character found at the end of the buffer is buffered up, and will be // returned when calling write again with the remaining bytes. // // Note: Converting a Buffer containing an orphan surrogate to a String // currently works, but converting a String to a Buffer (via `new Buffer`, or // Buffer#write) will replace incomplete surrogates with the unicode // replacement character. See https://codereview.chromium.org/121173009/ . StringDecoder.prototype.write = function(buffer) { var charStr = ''; // if our last write ended with an incomplete multibyte character while (this.charLength) { // determine how many remaining bytes this buffer has to offer for this char var available = (buffer.length >= this.charLength - this.charReceived) ? this.charLength - this.charReceived : buffer.length; // add the new bytes to the char buffer buffer.copy(this.charBuffer, this.charReceived, 0, available); this.charReceived += available; if (this.charReceived < this.charLength) { // still not enough chars in this buffer? wait for more ... return ''; } // remove bytes belonging to the current character from the buffer buffer = buffer.slice(available, buffer.length); // get the character that was split charStr = this.charBuffer.slice(0, this.charLength).toString(this.encoding); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character var charCode = charStr.charCodeAt(charStr.length - 1); if (charCode >= 0xD800 && charCode <= 0xDBFF) { this.charLength += this.surrogateSize; charStr = ''; continue; } this.charReceived = this.charLength = 0; // if there are no more bytes in this buffer, just emit our char if (buffer.length === 0) { return charStr; } break; } // determine and set charLength / charReceived this.detectIncompleteChar(buffer); var end = buffer.length; if (this.charLength) { // buffer the incomplete character bytes we got buffer.copy(this.charBuffer, 0, buffer.length - this.charReceived, end); end -= this.charReceived; } charStr += buffer.toString(this.encoding, 0, end); var end = charStr.length - 1; var charCode = charStr.charCodeAt(end); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character if (charCode >= 0xD800 && charCode <= 0xDBFF) { var size = this.surrogateSize; this.charLength += size; this.charReceived += size; this.charBuffer.copy(this.charBuffer, size, 0, size); buffer.copy(this.charBuffer, 0, 0, size); return charStr.substring(0, end); } // or just emit the charStr return charStr; }; // detectIncompleteChar determines if there is an incomplete UTF-8 character at // the end of the given buffer. If so, it sets this.charLength to the byte // length that character, and sets this.charReceived to the number of bytes // that are available for this character. StringDecoder.prototype.detectIncompleteChar = function(buffer) { // determine how many bytes we have to check at the end of this buffer var i = (buffer.length >= 3) ? 3 : buffer.length; // Figure out if one of the last i bytes of our buffer announces an // incomplete char. 
for (; i > 0; i--) { var c = buffer[buffer.length - i]; // See http://en.wikipedia.org/wiki/UTF-8#Description // 110XXXXX if (i == 1 && c >> 5 == 0x06) { this.charLength = 2; break; } // 1110XXXX if (i <= 2 && c >> 4 == 0x0E) { this.charLength = 3; break; } // 11110XXX if (i <= 3 && c >> 3 == 0x1E) { this.charLength = 4; break; } } this.charReceived = i; }; StringDecoder.prototype.end = function(buffer) { var res = ''; if (buffer && buffer.length) res = this.write(buffer); if (this.charReceived) { var cr = this.charReceived; var buf = this.charBuffer; var enc = this.encoding; res += buf.slice(0, cr).toString(enc); } return res; }; function passThroughWrite(buffer) { return buffer.toString(this.encoding); } function utf16DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 2; this.charLength = this.charReceived ? 2 : 0; } function base64DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 3; this.charLength = this.charReceived ? 3 : 0; } ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string_decoder/package.jsonnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/string0000644000000000000000000000254012631326456032325 0ustar 00000000000000{ "name": "string_decoder", "version": "0.10.31", "description": "The string_decoder module from Node core", "main": "index.js", "dependencies": {}, "devDependencies": { "tap": "~0.4.8" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/rvagg/string_decoder.git" }, "homepage": "https://github.com/rvagg/string_decoder", "keywords": [ "string", "decoder", "browser", "browserify" ], "license": "MIT", "readme": "**string_decoder.js** (`require('string_decoder')`) from Node.js core\n\nCopyright Joyent, Inc. and other Node contributors. See LICENCE file for details.\n\nVersion numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. 
**Prefer the stable version over the unstable.**\n\nThe *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/rvagg/string_decoder/issues" }, "_id": "string_decoder@0.10.31", "_shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz", "_from": "string_decoder@>=0.10.0 <0.11.0" } ././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-deprecate/History.mdnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-d0000644000000000000000000000043212631326456032213 0ustar 00000000000000 1.0.2 / 2015-10-07 ================== * use try/catch when checking `localStorage` (#3, @kumavis) 1.0.1 / 2014-11-25 ================== * browser: use `console.warn()` for deprecation calls * browser: more jsdocs 1.0.0 / 2014-04-30 ================== * initial commit ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-deprecate/LICENSEnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-d0000644000000000000000000000211612631326456032214 0ustar 00000000000000(The MIT License) Copyright (c) 2014 Nathan Rajlich Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000016700000000000011221 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-deprecate/README.mdnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-d0000644000000000000000000000320212631326456032211 0ustar 00000000000000util-deprecate ============== ### The Node.js `util.deprecate()` function with browser support In Node.js, this module simply re-exports the `util.deprecate()` function. In the web browser (i.e. via browserify), a browser-specific implementation of the `util.deprecate()` function is used. ## API A `deprecate()` function is the only thing exposed by this module. 
``` javascript // setup: exports.foo = deprecate(foo, 'foo() is deprecated, use bar() instead'); // users see: foo(); // foo() is deprecated, use bar() instead foo(); foo(); ``` ## License (The MIT License) Copyright (c) 2014 Nathan Rajlich Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000017000000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-deprecate/browser.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-d0000644000000000000000000000311612631326456032215 0ustar 00000000000000 /** * Module exports. */ module.exports = deprecate; /** * Mark that a method should not be used. * Returns a modified function which warns once by default. * * If `localStorage.noDeprecation = true` is set, then it is a no-op. * * If `localStorage.throwDeprecation = true` is set, then deprecated functions * will throw an Error when invoked. * * If `localStorage.traceDeprecation = true` is set, then deprecated functions * will invoke `console.trace()` instead of `console.error()`. * * @param {Function} fn - the function to deprecate * @param {String} msg - the string to print to the console when `fn` is invoked * @returns {Function} a new "deprecated" version of `fn` * @api public */ function deprecate (fn, msg) { if (config('noDeprecation')) { return fn; } var warned = false; function deprecated() { if (!warned) { if (config('throwDeprecation')) { throw new Error(msg); } else if (config('traceDeprecation')) { console.trace(msg); } else { console.warn(msg); } warned = true; } return fn.apply(this, arguments); } return deprecated; } /** * Checks `localStorage` for boolean values for the given `name`. 
* * @param {String} name * @returns {Boolean} * @api private */ function config (name) { // accessing global.localStorage can trigger a DOMException in sandboxed iframes try { if (!global.localStorage) return false; } catch (_) { return false; } var val = global.localStorage[name]; if (null == val) return false; return String(val).toLowerCase() === 'true'; } ././@LongLink0000000000000000000000000000016500000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-deprecate/node.jsnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-d0000644000000000000000000000017312631326456032215 0ustar 00000000000000 /** * For Node.js, simply re-export the core `util.deprecate` function. */ module.exports = require('util').deprecate; ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-deprecate/package.jsonnpm_3.5.2.orig/node_modules/request/node_modules/bl/node_modules/readable-stream/node_modules/util-d0000644000000000000000000000271612631326456032222 0ustar 00000000000000{ "name": "util-deprecate", "version": "1.0.2", "description": "The Node.js `util.deprecate()` function with browser support", "main": "node.js", "browser": "browser.js", "scripts": { "test": "echo \"Error: no test specified\" && exit 1" }, "repository": { "type": "git", "url": "git://github.com/TooTallNate/util-deprecate.git" }, "keywords": [ "util", "deprecate", "browserify", "browser", "node" ], "author": { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://n8.io/" }, "license": "MIT", "bugs": { "url": "https://github.com/TooTallNate/util-deprecate/issues" }, "homepage": "https://github.com/TooTallNate/util-deprecate", "gitHead": "475fb6857cd23fafff20c1be846c1350abf8e6d4", "_id": "util-deprecate@1.0.2", "_shasum": "450d4dc9fa70de732762fbd2d4a28981419a0ccf", "_from": "util-deprecate@>=1.0.1 <1.1.0", "_npmVersion": "2.14.4", "_nodeVersion": "4.1.2", "_npmUser": { "name": "tootallnate", "email": "nathan@tootallnate.net" }, "maintainers": [ { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "dist": { "shasum": "450d4dc9fa70de732762fbd2d4a28981419a0ccf", "tarball": "http://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/request/node_modules/bl/test/basic-test.js0000644000000000000000000003075712631326456024300 0ustar 00000000000000var tape = require('tape') , crypto = require('crypto') , fs = require('fs') , hash = require('hash_file') , BufferList = require('../') , encodings = ('hex utf8 utf-8 ascii binary base64' + (process.browser ? 
'' : ' ucs2 ucs-2 utf16le utf-16le')).split(' ') tape('single bytes from single buffer', function (t) { var bl = new BufferList() bl.append(new Buffer('abcd')) t.equal(bl.length, 4) t.equal(bl.get(0), 97) t.equal(bl.get(1), 98) t.equal(bl.get(2), 99) t.equal(bl.get(3), 100) t.end() }) tape('single bytes from multiple buffers', function (t) { var bl = new BufferList() bl.append(new Buffer('abcd')) bl.append(new Buffer('efg')) bl.append(new Buffer('hi')) bl.append(new Buffer('j')) t.equal(bl.length, 10) t.equal(bl.get(0), 97) t.equal(bl.get(1), 98) t.equal(bl.get(2), 99) t.equal(bl.get(3), 100) t.equal(bl.get(4), 101) t.equal(bl.get(5), 102) t.equal(bl.get(6), 103) t.equal(bl.get(7), 104) t.equal(bl.get(8), 105) t.equal(bl.get(9), 106) t.end() }) tape('multi bytes from single buffer', function (t) { var bl = new BufferList() bl.append(new Buffer('abcd')) t.equal(bl.length, 4) t.equal(bl.slice(0, 4).toString('ascii'), 'abcd') t.equal(bl.slice(0, 3).toString('ascii'), 'abc') t.equal(bl.slice(1, 4).toString('ascii'), 'bcd') t.end() }) tape('multiple bytes from multiple buffers', function (t) { var bl = new BufferList() bl.append(new Buffer('abcd')) bl.append(new Buffer('efg')) bl.append(new Buffer('hi')) bl.append(new Buffer('j')) t.equal(bl.length, 10) t.equal(bl.slice(0, 10).toString('ascii'), 'abcdefghij') t.equal(bl.slice(3, 10).toString('ascii'), 'defghij') t.equal(bl.slice(3, 6).toString('ascii'), 'def') t.equal(bl.slice(3, 8).toString('ascii'), 'defgh') t.equal(bl.slice(5, 10).toString('ascii'), 'fghij') t.end() }) tape('multiple bytes from multiple buffer lists', function (t) { var bl = new BufferList() bl.append(new BufferList([new Buffer('abcd'), new Buffer('efg')])) bl.append(new BufferList([new Buffer('hi'), new Buffer('j')])) t.equal(bl.length, 10) t.equal(bl.slice(0, 10).toString('ascii'), 'abcdefghij') t.equal(bl.slice(3, 10).toString('ascii'), 'defghij') t.equal(bl.slice(3, 6).toString('ascii'), 'def') t.equal(bl.slice(3, 8).toString('ascii'), 'defgh') t.equal(bl.slice(5, 10).toString('ascii'), 'fghij') t.end() }) tape('consuming from multiple buffers', function (t) { var bl = new BufferList() bl.append(new Buffer('abcd')) bl.append(new Buffer('efg')) bl.append(new Buffer('hi')) bl.append(new Buffer('j')) t.equal(bl.length, 10) t.equal(bl.slice(0, 10).toString('ascii'), 'abcdefghij') bl.consume(3) t.equal(bl.length, 7) t.equal(bl.slice(0, 7).toString('ascii'), 'defghij') bl.consume(2) t.equal(bl.length, 5) t.equal(bl.slice(0, 5).toString('ascii'), 'fghij') bl.consume(1) t.equal(bl.length, 4) t.equal(bl.slice(0, 4).toString('ascii'), 'ghij') bl.consume(1) t.equal(bl.length, 3) t.equal(bl.slice(0, 3).toString('ascii'), 'hij') bl.consume(2) t.equal(bl.length, 1) t.equal(bl.slice(0, 1).toString('ascii'), 'j') t.end() }) tape('test readUInt8 / readInt8', function (t) { var buf1 = new Buffer(1) , buf2 = new Buffer(3) , buf3 = new Buffer(3) , bl = new BufferList() buf2[1] = 0x3 buf2[2] = 0x4 buf3[0] = 0x23 buf3[1] = 0x42 bl.append(buf1) bl.append(buf2) bl.append(buf3) t.equal(bl.readUInt8(2), 0x3) t.equal(bl.readInt8(2), 0x3) t.equal(bl.readUInt8(3), 0x4) t.equal(bl.readInt8(3), 0x4) t.equal(bl.readUInt8(4), 0x23) t.equal(bl.readInt8(4), 0x23) t.equal(bl.readUInt8(5), 0x42) t.equal(bl.readInt8(5), 0x42) t.end() }) tape('test readUInt16LE / readUInt16BE / readInt16LE / readInt16BE', function (t) { var buf1 = new Buffer(1) , buf2 = new Buffer(3) , buf3 = new Buffer(3) , bl = new BufferList() buf2[1] = 0x3 buf2[2] = 0x4 buf3[0] = 0x23 buf3[1] = 0x42 bl.append(buf1) bl.append(buf2) 
bl.append(buf3) t.equal(bl.readUInt16BE(2), 0x0304) t.equal(bl.readUInt16LE(2), 0x0403) t.equal(bl.readInt16BE(2), 0x0304) t.equal(bl.readInt16LE(2), 0x0403) t.equal(bl.readUInt16BE(3), 0x0423) t.equal(bl.readUInt16LE(3), 0x2304) t.equal(bl.readInt16BE(3), 0x0423) t.equal(bl.readInt16LE(3), 0x2304) t.equal(bl.readUInt16BE(4), 0x2342) t.equal(bl.readUInt16LE(4), 0x4223) t.equal(bl.readInt16BE(4), 0x2342) t.equal(bl.readInt16LE(4), 0x4223) t.end() }) tape('test readUInt32LE / readUInt32BE / readInt32LE / readInt32BE', function (t) { var buf1 = new Buffer(1) , buf2 = new Buffer(3) , buf3 = new Buffer(3) , bl = new BufferList() buf2[1] = 0x3 buf2[2] = 0x4 buf3[0] = 0x23 buf3[1] = 0x42 bl.append(buf1) bl.append(buf2) bl.append(buf3) t.equal(bl.readUInt32BE(2), 0x03042342) t.equal(bl.readUInt32LE(2), 0x42230403) t.equal(bl.readInt32BE(2), 0x03042342) t.equal(bl.readInt32LE(2), 0x42230403) t.end() }) tape('test readFloatLE / readFloatBE', function (t) { var buf1 = new Buffer(1) , buf2 = new Buffer(3) , buf3 = new Buffer(3) , bl = new BufferList() buf2[1] = 0x00 buf2[2] = 0x00 buf3[0] = 0x80 buf3[1] = 0x3f bl.append(buf1) bl.append(buf2) bl.append(buf3) t.equal(bl.readFloatLE(2), 0x01) t.end() }) tape('test readDoubleLE / readDoubleBE', function (t) { var buf1 = new Buffer(1) , buf2 = new Buffer(3) , buf3 = new Buffer(10) , bl = new BufferList() buf2[1] = 0x55 buf2[2] = 0x55 buf3[0] = 0x55 buf3[1] = 0x55 buf3[2] = 0x55 buf3[3] = 0x55 buf3[4] = 0xd5 buf3[5] = 0x3f bl.append(buf1) bl.append(buf2) bl.append(buf3) t.equal(bl.readDoubleLE(2), 0.3333333333333333) t.end() }) tape('test toString', function (t) { var bl = new BufferList() bl.append(new Buffer('abcd')) bl.append(new Buffer('efg')) bl.append(new Buffer('hi')) bl.append(new Buffer('j')) t.equal(bl.toString('ascii', 0, 10), 'abcdefghij') t.equal(bl.toString('ascii', 3, 10), 'defghij') t.equal(bl.toString('ascii', 3, 6), 'def') t.equal(bl.toString('ascii', 3, 8), 'defgh') t.equal(bl.toString('ascii', 5, 10), 'fghij') t.end() }) tape('test toString encoding', function (t) { var bl = new BufferList() , b = new Buffer('abcdefghij\xff\x00') bl.append(new Buffer('abcd')) bl.append(new Buffer('efg')) bl.append(new Buffer('hi')) bl.append(new Buffer('j')) bl.append(new Buffer('\xff\x00')) encodings.forEach(function (enc) { t.equal(bl.toString(enc), b.toString(enc), enc) }) t.end() }) !process.browser && tape('test stream', function (t) { var random = crypto.randomBytes(65534) , rndhash = hash(random, 'md5') , md5sum = crypto.createHash('md5') , bl = new BufferList(function (err, buf) { t.ok(Buffer.isBuffer(buf)) t.ok(err === null) t.equal(rndhash, hash(bl.slice(), 'md5')) t.equal(rndhash, hash(buf, 'md5')) bl.pipe(fs.createWriteStream('/tmp/bl_test_rnd_out.dat')) .on('close', function () { var s = fs.createReadStream('/tmp/bl_test_rnd_out.dat') s.on('data', md5sum.update.bind(md5sum)) s.on('end', function() { t.equal(rndhash, md5sum.digest('hex'), 'woohoo! 
correct hash!') t.end() }) }) }) fs.writeFileSync('/tmp/bl_test_rnd.dat', random) fs.createReadStream('/tmp/bl_test_rnd.dat').pipe(bl) }) tape('instantiation with Buffer', function (t) { var buf = crypto.randomBytes(1024) , buf2 = crypto.randomBytes(1024) , b = BufferList(buf) t.equal(buf.toString('hex'), b.slice().toString('hex'), 'same buffer') b = BufferList([ buf, buf2 ]) t.equal(b.slice().toString('hex'), Buffer.concat([ buf, buf2 ]).toString('hex'), 'same buffer') t.end() }) tape('test String appendage', function (t) { var bl = new BufferList() , b = new Buffer('abcdefghij\xff\x00') bl.append('abcd') bl.append('efg') bl.append('hi') bl.append('j') bl.append('\xff\x00') encodings.forEach(function (enc) { t.equal(bl.toString(enc), b.toString(enc)) }) t.end() }) tape('write nothing, should get empty buffer', function (t) { t.plan(3) BufferList(function (err, data) { t.notOk(err, 'no error') t.ok(Buffer.isBuffer(data), 'got a buffer') t.equal(0, data.length, 'got a zero-length buffer') t.end() }).end() }) tape('unicode string', function (t) { t.plan(2) var inp1 = '\u2600' , inp2 = '\u2603' , exp = inp1 + ' and ' + inp2 , bl = BufferList() bl.write(inp1) bl.write(' and ') bl.write(inp2) t.equal(exp, bl.toString()) t.equal(new Buffer(exp).toString('hex'), bl.toString('hex')) }) tape('should emit finish', function (t) { var source = BufferList() , dest = BufferList() source.write('hello') source.pipe(dest) dest.on('finish', function () { t.equal(dest.toString('utf8'), 'hello') t.end() }) }) tape('basic copy', function (t) { var buf = crypto.randomBytes(1024) , buf2 = new Buffer(1024) , b = BufferList(buf) b.copy(buf2) t.equal(b.slice().toString('hex'), buf2.toString('hex'), 'same buffer') t.end() }) tape('copy after many appends', function (t) { var buf = crypto.randomBytes(512) , buf2 = new Buffer(1024) , b = BufferList(buf) b.append(buf) b.copy(buf2) t.equal(b.slice().toString('hex'), buf2.toString('hex'), 'same buffer') t.end() }) tape('copy at a precise position', function (t) { var buf = crypto.randomBytes(1004) , buf2 = new Buffer(1024) , b = BufferList(buf) b.copy(buf2, 20) t.equal(b.slice().toString('hex'), buf2.slice(20).toString('hex'), 'same buffer') t.end() }) tape('copy starting from a precise location', function (t) { var buf = crypto.randomBytes(10) , buf2 = new Buffer(5) , b = BufferList(buf) b.copy(buf2, 0, 5) t.equal(b.slice(5).toString('hex'), buf2.toString('hex'), 'same buffer') t.end() }) tape('copy in an interval', function (t) { var rnd = crypto.randomBytes(10) , b = BufferList(rnd) // put the random bytes there , actual = new Buffer(3) , expected = new Buffer(3) rnd.copy(expected, 0, 5, 8) b.copy(actual, 0, 5, 8) t.equal(actual.toString('hex'), expected.toString('hex'), 'same buffer') t.end() }) tape('copy an interval between two buffers', function (t) { var buf = crypto.randomBytes(10) , buf2 = new Buffer(10) , b = BufferList(buf) b.append(buf) b.copy(buf2, 0, 5, 15) t.equal(b.slice(5, 15).toString('hex'), buf2.toString('hex'), 'same buffer') t.end() }) tape('duplicate', function (t) { t.plan(2) var bl = new BufferList('abcdefghij\xff\x00') , dup = bl.duplicate() t.equal(bl.prototype, dup.prototype) t.equal(bl.toString('hex'), dup.toString('hex')) }) tape('destroy no pipe', function (t) { t.plan(2) var bl = new BufferList('alsdkfja;lsdkfja;lsdk') bl.destroy() t.equal(bl._bufs.length, 0) t.equal(bl.length, 0) }) !process.browser && tape('destroy with pipe before read end', function (t) { t.plan(2) var bl = new BufferList() fs.createReadStream(__dirname + '/sauce.js') 
.pipe(bl) bl.destroy() t.equal(bl._bufs.length, 0) t.equal(bl.length, 0) }) !process.browser && tape('destroy with pipe before read end with race', function (t) { t.plan(2) var bl = new BufferList() fs.createReadStream(__dirname + '/sauce.js') .pipe(bl) setTimeout(function () { bl.destroy() setTimeout(function () { t.equal(bl._bufs.length, 0) t.equal(bl.length, 0) }, 500) }, 500) }) !process.browser && tape('destroy with pipe after read end', function (t) { t.plan(2) var bl = new BufferList() fs.createReadStream(__dirname + '/sauce.js') .on('end', onEnd) .pipe(bl) function onEnd () { bl.destroy() t.equal(bl._bufs.length, 0) t.equal(bl.length, 0) } }) !process.browser && tape('destroy with pipe while writing to a destination', function (t) { t.plan(4) var bl = new BufferList() , ds = new BufferList() fs.createReadStream(__dirname + '/sauce.js') .on('end', onEnd) .pipe(bl) function onEnd () { bl.pipe(ds) setTimeout(function () { bl.destroy() t.equals(bl._bufs.length, 0) t.equals(bl.length, 0) ds.destroy() t.equals(bl._bufs.length, 0) t.equals(bl.length, 0) }, 100) } }) !process.browser && tape('handle error', function (t) { t.plan(2) fs.createReadStream('/does/not/exist').pipe(BufferList(function (err, data) { t.ok(err instanceof Error, 'has error') t.notOk(data, 'no data') })) }) npm_3.5.2.orig/node_modules/request/node_modules/bl/test/sauce.js0000644000000000000000000000344512631326456023334 0ustar 00000000000000#!/usr/bin/env node const user = process.env.SAUCE_USER , key = process.env.SAUCE_KEY , path = require('path') , brtapsauce = require('brtapsauce') , testFile = path.join(__dirname, 'basic-test.js') , capabilities = [ { browserName: 'chrome' , platform: 'Windows XP', version: '' } , { browserName: 'firefox' , platform: 'Windows 8' , version: '' } , { browserName: 'firefox' , platform: 'Windows XP', version: '4' } , { browserName: 'internet explorer' , platform: 'Windows 8' , version: '10' } , { browserName: 'internet explorer' , platform: 'Windows 7' , version: '9' } , { browserName: 'internet explorer' , platform: 'Windows 7' , version: '8' } , { browserName: 'internet explorer' , platform: 'Windows XP', version: '7' } , { browserName: 'internet explorer' , platform: 'Windows XP', version: '6' } , { browserName: 'safari' , platform: 'Windows 7' , version: '5' } , { browserName: 'safari' , platform: 'OS X 10.8' , version: '6' } , { browserName: 'opera' , platform: 'Windows 7' , version: '' } , { browserName: 'opera' , platform: 'Windows 7' , version: '11' } , { browserName: 'ipad' , platform: 'OS X 10.8' , version: '6' } , { browserName: 'android' , platform: 'Linux' , version: '4.0', 'device-type': 'tablet' } ] if (!user) throw new Error('Must set a SAUCE_USER env var') if (!key) throw new Error('Must set a SAUCE_KEY env var') brtapsauce({ name : 'Traversty' , user : user , key : key , brsrc : testFile , capabilities : capabilities , options : { timeout: 60 * 6 } })npm_3.5.2.orig/node_modules/request/node_modules/bl/test/test.js0000644000000000000000000000044312631326456023206 0ustar 00000000000000require('./basic-test') if (!process.env.SAUCE_KEY || !process.env.SAUCE_USER) return console.log('SAUCE_KEY and/or SAUCE_USER not set, not running sauce tests') if (!/v0\.10/.test(process.version)) return console.log('Not Node v0.10.x, not running sauce tests') require('./sauce.js')npm_3.5.2.orig/node_modules/request/node_modules/caseless/LICENSE0000644000000000000000000002163112631326456023126 0ustar 00000000000000Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ 
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. 
Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. 
This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONSnpm_3.5.2.orig/node_modules/request/node_modules/caseless/README.md0000644000000000000000000000222512631326456023376 0ustar 00000000000000## Caseless -- wrap an object to set and get properties with caseless semantics but also preserve casing. This library is incredibly useful when working with HTTP headers. It allows you to get/set/check for headers in a caseless manner while also preserving the casing of headers the first time they are set. ## Usage ```javascript var headers = {} , c = caseless(headers) ; c.set('a-Header', 'asdf') c.get('a-header') === 'asdf' ``` ## has(key) Has takes a name and, if it finds a matching header, returns that header name with the preserved casing it was set with. ```javascript c.has('a-header') === 'a-Header' ``` ## set(key, value[, clobber=true]) Set is fairly straightforward, except that if the header exists and clobber is disabled it will add `','+value` to the existing header. ```javascript c.set('a-Header', 'fdas') c.set('a-HEADER', 'more', false) c.get('a-header') === 'fdas,more' ``` ## swap(key) Swaps the casing of a header with the new one that is passed in.
```javascript var headers = {} , c = caseless(headers) ; c.set('a-Header', 'fdas') c.swap('a-HEADER') c.has('a-header') === 'a-HEADER' headers === {'a-HEADER': 'fdas'} ``` npm_3.5.2.orig/node_modules/request/node_modules/caseless/index.js0000644000000000000000000000333712631326456023571 0ustar 00000000000000function Caseless (dict) { this.dict = dict || {} } Caseless.prototype.set = function (name, value, clobber) { if (typeof name === 'object') { for (var i in name) { this.set(i, name[i], value) } } else { if (typeof clobber === 'undefined') clobber = true var has = this.has(name) if (!clobber && has) this.dict[has] = this.dict[has] + ',' + value else this.dict[has || name] = value return has } } Caseless.prototype.has = function (name) { var keys = Object.keys(this.dict) , name = name.toLowerCase() ; for (var i=0;i=0.11.0 <0.12.0", "_npmVersion": "2.8.3", "_nodeVersion": "1.8.1", "_npmUser": { "name": "mikeal", "email": "mikeal.rogers@gmail.com" }, "maintainers": [ { "name": "mikeal", "email": "mikeal.rogers@gmail.com" }, { "name": "nylen", "email": "jnylen@gmail.com" }, { "name": "simov", "email": "simeonvelichkov@gmail.com" } ], "dist": { "shasum": "715b96ea9841593cc33067923f5ec60ebda4f7d7", "tarball": "http://registry.npmjs.org/caseless/-/caseless-0.11.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/caseless/-/caseless-0.11.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/request/node_modules/caseless/test.js0000644000000000000000000000173312631326456023437 0ustar 00000000000000var tape = require('tape') , caseless = require('./') ; tape('set get has', function (t) { var headers = {} , c = caseless(headers) ; t.plan(17) c.set('a-Header', 'asdf') t.equal(c.get('a-header'), 'asdf') t.equal(c.has('a-header'), 'a-Header') t.ok(!c.has('nothing')) // old bug where we used the wrong regex t.ok(!c.has('a-hea')) c.set('a-header', 'fdsa') t.equal(c.get('a-header'), 'fdsa') t.equal(c.get('a-Header'), 'fdsa') c.set('a-HEADER', 'more', false) t.equal(c.get('a-header'), 'fdsa,more') t.deepEqual(headers, {'a-Header': 'fdsa,more'}) c.swap('a-HEADER') t.deepEqual(headers, {'a-HEADER': 'fdsa,more'}) c.set('deleteme', 'foobar') t.ok(c.has('deleteme')) t.ok(c.del('deleteme')) t.notOk(c.has('deleteme')) t.notOk(c.has('idonotexist')) t.ok(c.del('idonotexist')) c.set('tva', 'test1') c.set('tva-header', 'test2') t.equal(c.has('tva'), 'tva') t.notOk(c.has('header')) t.equal(c.get('tva'), 'test1') }) npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/License0000644000000000000000000000207512631326456024676 0ustar 00000000000000Copyright (c) 2011 Debuggable Limited Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/Readme.md0000644000000000000000000001070712631326456025111 0ustar 00000000000000# combined-stream A stream that emits multiple other streams one after another. **NB** Currently `combined-stream` works with streams version 1 only. There is ongoing effort to switch this library to streams version 2. Any help is welcome. :) Meanwhile you can explore other libraries that provide streams2 support with more or less compatibility with `combined-stream`. - [combined-stream2](https://www.npmjs.com/package/combined-stream2): A drop-in streams2-compatible replacement for the combined-stream module. - [multistream](https://www.npmjs.com/package/multistream): A stream that emits multiple other streams one after another. ## Installation ``` bash npm install combined-stream ``` ## Usage Here is a simple example that shows how you can use combined-stream to combine two files into one: ``` javascript var CombinedStream = require('combined-stream'); var fs = require('fs'); var combinedStream = CombinedStream.create(); combinedStream.append(fs.createReadStream('file1.txt')); combinedStream.append(fs.createReadStream('file2.txt')); combinedStream.pipe(fs.createWriteStream('combined.txt')); ``` While the example above works great, it will pause all source streams until they are needed. If you don't want that to happen, you can set `pauseStreams` to `false`: ``` javascript var CombinedStream = require('combined-stream'); var fs = require('fs'); var combinedStream = CombinedStream.create({pauseStreams: false}); combinedStream.append(fs.createReadStream('file1.txt')); combinedStream.append(fs.createReadStream('file2.txt')); combinedStream.pipe(fs.createWriteStream('combined.txt')); ``` However, what if you don't have all the source streams yet, or you don't want to allocate the resources (file descriptors, memory, etc.) for them right away? Well, in that case you can simply provide a callback that supplies the stream by calling a `next()` function: ``` javascript var CombinedStream = require('combined-stream'); var fs = require('fs'); var combinedStream = CombinedStream.create(); combinedStream.append(function(next) { next(fs.createReadStream('file1.txt')); }); combinedStream.append(function(next) { next(fs.createReadStream('file2.txt')); }); combinedStream.pipe(fs.createWriteStream('combined.txt')); ``` ## API ### CombinedStream.create([options]) Returns a new combined stream object. Available options are: * `maxDataSize` * `pauseStreams` The effect of those options is described below. ### combinedStream.pauseStreams = `true` Whether to apply back pressure to the underlying streams. If set to `false`, the underlying streams will never be paused. If set to `true`, the underlying streams will be paused right after being appended, as well as when `delayedStream.pipe()` wants to throttle. ### combinedStream.maxDataSize = `2 * 1024 * 1024` The maximum number of bytes (or characters) to buffer for all source streams. If this value is exceeded, `combinedStream` emits an `'error'` event. ### combinedStream.dataSize = `0` The number of bytes (or characters) currently buffered by `combinedStream`. ### combinedStream.append(stream) Appends the given `stream` to the combinedStream object.
If `pauseStreams` is set to `true`, this stream will also be paused right away. `stream` can also be a function that takes one parameter called `next`. `next` is a function that must be invoked in order to provide the next stream; see the example above. Regardless of how the `stream` is appended, combined-stream always attaches an `'error'` listener to it, so you don't have to do that manually. Special case: `stream` can also be a String or Buffer. ### combinedStream.write(data) You should not call this, `combinedStream` takes care of piping the appended streams into itself for you. ### combinedStream.resume() Causes `combinedStream` to start draining the streams it manages. The function is idempotent, and also emits a `'resume'` event each time, which usually goes to the stream that is currently being drained. ### combinedStream.pause(); If `combinedStream.pauseStreams` is set to `false`, this does nothing. Otherwise a `'pause'` event is emitted; this goes to the stream that is currently being drained, so you can use it to apply back pressure. ### combinedStream.end(); Sets `combinedStream.writable` to false, emits an `'end'` event, and removes all streams from the queue. ### combinedStream.destroy(); Same as `combinedStream.end()`, except it emits a `'close'` event instead of `'end'`. ## License combined-stream is licensed under the MIT license. npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/lib/0000755000000000000000000000000012631326456024133 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/0000755000000000000000000000000012631326456026042 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/package.json0000644000000000000000000000331212631326456025652 0ustar 00000000000000{ "author": { "name": "Felix Geisendörfer", "email": "felix@debuggable.com", "url": "http://debuggable.com/" }, "name": "combined-stream", "description": "A stream that emits multiple other streams one after another.", "version": "1.0.5", "homepage": "https://github.com/felixge/node-combined-stream", "repository": { "type": "git", "url": "git://github.com/felixge/node-combined-stream.git" }, "main": "./lib/combined_stream", "scripts": { "test": "node test/run.js" }, "engines": { "node": ">= 0.8" }, "dependencies": { "delayed-stream": "~1.0.0" }, "devDependencies": { "far": "~0.0.7" }, "license": "MIT", "gitHead": "cfc7b815d090a109bcedb5bb0f6713148d55a6b7", "bugs": { "url": "https://github.com/felixge/node-combined-stream/issues" }, "_id": "combined-stream@1.0.5", "_shasum": "938370a57b4a51dea2c77c15d5c5fdf895164009", "_from": "combined-stream@>=1.0.5 <1.1.0", "_npmVersion": "2.10.1", "_nodeVersion": "0.12.4", "_npmUser": { "name": "alexindigo", "email": "iam@alexindigo.com" }, "dist": { "shasum": "938370a57b4a51dea2c77c15d5c5fdf895164009", "tarball": "http://registry.npmjs.org/combined-stream/-/combined-stream-1.0.5.tgz" }, "maintainers": [ { "name": "felixge", "email": "felix@debuggable.com" }, { "name": "celer", "email": "dtyree77@gmail.com" }, { "name": "alexindigo", "email": "iam@alexindigo.com" }, { "name": "apechimp", "email": "apeherder@gmail.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/combined-stream/-/combined-stream-1.0.5.tgz", "readme": "ERROR: No README data found!"
} npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/lib/combined_stream.js0000644000000000000000000001031412631326456027623 0ustar 00000000000000var util = require('util'); var Stream = require('stream').Stream; var DelayedStream = require('delayed-stream'); module.exports = CombinedStream; function CombinedStream() { this.writable = false; this.readable = true; this.dataSize = 0; this.maxDataSize = 2 * 1024 * 1024; this.pauseStreams = true; this._released = false; this._streams = []; this._currentStream = null; } util.inherits(CombinedStream, Stream); CombinedStream.create = function(options) { var combinedStream = new this(); options = options || {}; for (var option in options) { combinedStream[option] = options[option]; } return combinedStream; }; CombinedStream.isStreamLike = function(stream) { return (typeof stream !== 'function') && (typeof stream !== 'string') && (typeof stream !== 'boolean') && (typeof stream !== 'number') && (!Buffer.isBuffer(stream)); }; CombinedStream.prototype.append = function(stream) { var isStreamLike = CombinedStream.isStreamLike(stream); if (isStreamLike) { if (!(stream instanceof DelayedStream)) { var newStream = DelayedStream.create(stream, { maxDataSize: Infinity, pauseStream: this.pauseStreams, }); stream.on('data', this._checkDataSize.bind(this)); stream = newStream; } this._handleErrors(stream); if (this.pauseStreams) { stream.pause(); } } this._streams.push(stream); return this; }; CombinedStream.prototype.pipe = function(dest, options) { Stream.prototype.pipe.call(this, dest, options); this.resume(); return dest; }; CombinedStream.prototype._getNext = function() { this._currentStream = null; var stream = this._streams.shift(); if (typeof stream == 'undefined') { this.end(); return; } if (typeof stream !== 'function') { this._pipeNext(stream); return; } var getStream = stream; getStream(function(stream) { var isStreamLike = CombinedStream.isStreamLike(stream); if (isStreamLike) { stream.on('data', this._checkDataSize.bind(this)); this._handleErrors(stream); } this._pipeNext(stream); }.bind(this)); }; CombinedStream.prototype._pipeNext = function(stream) { this._currentStream = stream; var isStreamLike = CombinedStream.isStreamLike(stream); if (isStreamLike) { stream.on('end', this._getNext.bind(this)); stream.pipe(this, {end: false}); return; } var value = stream; this.write(value); this._getNext(); }; CombinedStream.prototype._handleErrors = function(stream) { var self = this; stream.on('error', function(err) { self._emitError(err); }); }; CombinedStream.prototype.write = function(data) { this.emit('data', data); }; CombinedStream.prototype.pause = function() { if (!this.pauseStreams) { return; } if(this.pauseStreams && this._currentStream && typeof(this._currentStream.pause) == 'function') this._currentStream.pause(); this.emit('pause'); }; CombinedStream.prototype.resume = function() { if (!this._released) { this._released = true; this.writable = true; this._getNext(); } if(this.pauseStreams && this._currentStream && typeof(this._currentStream.resume) == 'function') this._currentStream.resume(); this.emit('resume'); }; CombinedStream.prototype.end = function() { this._reset(); this.emit('end'); }; CombinedStream.prototype.destroy = function() { this._reset(); this.emit('close'); }; CombinedStream.prototype._reset = function() { this.writable = false; this._streams = []; this._currentStream = null; }; CombinedStream.prototype._checkDataSize = function() { this._updateDataSize(); if (this.dataSize <= this.maxDataSize) { return; } var 
message = 'DelayedStream#maxDataSize of ' + this.maxDataSize + ' bytes exceeded.'; this._emitError(new Error(message)); }; CombinedStream.prototype._updateDataSize = function() { this.dataSize = 0; var self = this; this._streams.forEach(function(stream) { if (!stream.dataSize) { return; } self.dataSize += stream.dataSize; }); if (this._currentStream && this._currentStream.dataSize) { this.dataSize += this._currentStream.dataSize; } }; CombinedStream.prototype._emitError = function(err) { this._reset(); this.emit('error', err); }; npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/0000755000000000000000000000000012631326456030742 5ustar 00000000000000././@LongLink0000000000000000000000000000015000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/.npmignorenpm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/.npmign0000644000000000000000000000000512631326456032226 0ustar 00000000000000test npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/License0000644000000000000000000000207512631326456032253 0ustar 00000000000000Copyright (c) 2011 Debuggable Limited Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. ././@LongLink0000000000000000000000000000014600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/Makefilenpm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/Makefil0000644000000000000000000000007112631326456032233 0ustar 00000000000000SHELL := /bin/bash test: @./test/run.js .PHONY: test ././@LongLink0000000000000000000000000000014700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/Readme.mdnpm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/Readme.0000644000000000000000000000743712631326456032153 0ustar 00000000000000# delayed-stream Buffers events from a stream until you are ready to handle them. ## Installation ``` bash npm install delayed-stream ``` ## Usage The following example shows how to write a http echo server that delays its response by 1000 ms. 
``` javascript var DelayedStream = require('delayed-stream'); var http = require('http'); http.createServer(function(req, res) { var delayed = DelayedStream.create(req); setTimeout(function() { res.writeHead(200); delayed.pipe(res); }, 1000); }); ``` If you are not using `Stream#pipe`, you can also manually release the buffered events by calling `delayedStream.resume()`: ``` javascript var delayed = DelayedStream.create(req); setTimeout(function() { // Emit all buffered events and resume underlying source delayed.resume(); }, 1000); ``` ## Implementation In order to use this meta stream properly, here are a few things you should know about the implementation. ### Event Buffering / Proxying All events of the `source` stream are hijacked by overwriting the `source.emit` method. Until node implements a catch-all event listener, this is the only way. However, delayed-stream still continues to emit all events it captures on the `source`, regardless of whether you have released the delayed stream yet or not. Upon creation, delayed-stream captures all `source` events and stores them in an internal event buffer. Once `delayedStream.release()` is called, all buffered events are emitted on the `delayedStream`, and the event buffer is cleared. After that, delayed-stream merely acts as a proxy for the underlying source. ### Error handling Error events on `source` are buffered / proxied just like any other events. However, `delayedStream.create` attaches a no-op `'error'` listener to the `source`. This way you only have to handle errors on the `delayedStream` object, rather than in two places. ### Buffer limits delayed-stream provides a `maxDataSize` property that can be used to limit the amount of data being buffered. In order to protect you from bad `source` streams that don't react to `source.pause()`, this feature is enabled by default. ## API ### DelayedStream.create(source, [options]) Returns a new `delayedStream`. Available options are: * `pauseStream` * `maxDataSize` The description for those properties can be found below. ### delayedStream.source The `source` stream managed by this object. This is useful if you are passing your `delayedStream` around, and you still want to access properties on the `source` object. ### delayedStream.pauseStream = true Whether to pause the underlying `source` when calling `DelayedStream.create()`. Modifying this property afterwards has no effect. ### delayedStream.maxDataSize = 1024 * 1024 The amount of data to buffer before emitting an `error`. If the underlying source is emitting `Buffer` objects, the `maxDataSize` refers to bytes. If the underlying source is emitting JavaScript strings, the size refers to characters. If you know what you are doing, you can set this property to `Infinity` to disable this feature. You can also modify this property during runtime. ### delayedStream.dataSize = 0 The amount of data buffered so far. ### delayedStream.readable An ECMA5 getter that returns the value of `source.readable`. ### delayedStream.resume() If the `delayedStream` has not been released so far, `delayedStream.release()` is called. In either case, `source.resume()` is called. ### delayedStream.pause() Calls `source.pause()`. ### delayedStream.pipe(dest) Calls `delayedStream.resume()` and then proxies the arguments to `source.pipe`. ### delayedStream.release() Emits and clears all events that have been buffered up so far. This does not resume the underlying source; use `delayedStream.resume()` instead.
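To make `release()` and `resume()` concrete, here is a minimal sketch of manual buffering; the file name, the 100 ms delay, and the explicit `maxDataSize` value are illustrative assumptions for this sketch, not part of the library:

``` javascript
var DelayedStream = require('delayed-stream');
var fs = require('fs');

var source = fs.createReadStream('example.txt'); // illustrative source stream
var delayed = DelayedStream.create(source, {
  pauseStream: false,      // assumption for this sketch: let the source flow; events buffer regardless
  maxDataSize: 1024 * 1024 // emit 'error' if more than 1 MiB accumulates in the buffer
});

delayed.on('data', function (chunk) {
  console.log('received %d bytes', chunk.length);
});

setTimeout(function () {
  // Re-emits everything buffered so far; afterwards, events are proxied live.
  delayed.release();
}, 100);
```

Note that `release()` alone does not resume a paused source; pair it with `resume()` (or create the stream with `pauseStream: false`, as above) when the source was paused.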
## License delayed-stream is licensed under the MIT license. npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/lib/0000755000000000000000000000000012631326456031510 5ustar 00000000000000././@LongLink0000000000000000000000000000015200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/package.jsonnpm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/package0000644000000000000000000000317512631326456032266 0ustar 00000000000000{ "author": { "name": "Felix Geisendörfer", "email": "felix@debuggable.com", "url": "http://debuggable.com/" }, "contributors": [ { "name": "Mike Atkins", "email": "apeherder@gmail.com" } ], "name": "delayed-stream", "description": "Buffers events from a stream until you are ready to handle them.", "license": "MIT", "version": "1.0.0", "homepage": "https://github.com/felixge/node-delayed-stream", "repository": { "type": "git", "url": "git://github.com/felixge/node-delayed-stream.git" }, "main": "./lib/delayed_stream", "engines": { "node": ">=0.4.0" }, "scripts": { "test": "make test" }, "dependencies": {}, "devDependencies": { "fake": "0.2.0", "far": "0.0.1" }, "gitHead": "07a9dc99fb8f1a488160026b9ad77493f766fb84", "bugs": { "url": "https://github.com/felixge/node-delayed-stream/issues" }, "_id": "delayed-stream@1.0.0", "_shasum": "df3ae199acadfb7d440aaae0b29e2272b24ec619", "_from": "delayed-stream@>=1.0.0 <1.1.0", "_npmVersion": "2.8.3", "_nodeVersion": "1.6.4", "_npmUser": { "name": "apechimp", "email": "apeherder@gmail.com" }, "dist": { "shasum": "df3ae199acadfb7d440aaae0b29e2272b24ec619", "tarball": "http://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz" }, "maintainers": [ { "name": "felixge", "email": "felix@debuggable.com" }, { "name": "apechimp", "email": "apeherder@gmail.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/delayed-stream/-/delayed-stream-1.0.0.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000016300000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/lib/delayed_stream.jsnpm_3.5.2.orig/node_modules/request/node_modules/combined-stream/node_modules/delayed-stream/lib/del0000644000000000000000000000441712631326456032205 0ustar 00000000000000var Stream = require('stream').Stream; var util = require('util'); module.exports = DelayedStream; function DelayedStream() { this.source = null; this.dataSize = 0; this.maxDataSize = 1024 * 1024; this.pauseStream = true; this._maxDataSizeExceeded = false; this._released = false; this._bufferedEvents = []; } util.inherits(DelayedStream, Stream); DelayedStream.create = function(source, options) { var delayedStream = new this(); options = options || {}; for (var option in options) { delayedStream[option] = options[option]; } delayedStream.source = source; var realEmit = source.emit; source.emit = function() { delayedStream._handleEmit(arguments); return realEmit.apply(source, arguments); }; source.on('error', function() {}); if (delayedStream.pauseStream) { source.pause(); } return delayedStream; }; Object.defineProperty(DelayedStream.prototype, 'readable', { configurable: true, enumerable: true, get: function() { return this.source.readable; } }); DelayedStream.prototype.setEncoding = function() { return this.source.setEncoding.apply(this.source, arguments); }; DelayedStream.prototype.resume = function() { if (!this._released) { this.release(); } this.source.resume(); }; DelayedStream.prototype.pause = function() { this.source.pause(); }; DelayedStream.prototype.release = function() { this._released = true; this._bufferedEvents.forEach(function(args) { this.emit.apply(this, args); }.bind(this)); this._bufferedEvents = []; }; DelayedStream.prototype.pipe = function() { var r = Stream.prototype.pipe.apply(this, arguments); this.resume(); return r; }; DelayedStream.prototype._handleEmit = function(args) { if (this._released) { this.emit.apply(this, args); return; } if (args[0] === 'data') { this.dataSize += args[1].length; this._checkIfMaxDataSizeExceeded(); } this._bufferedEvents.push(args); }; DelayedStream.prototype._checkIfMaxDataSizeExceeded = function() { if (this._maxDataSizeExceeded) { return; } if (this.dataSize <= this.maxDataSize) { return; } this._maxDataSizeExceeded = true; var message = 'DelayedStream#maxDataSize of ' + this.maxDataSize + ' bytes exceeded.' 
this.emit('error', new Error(message)); }; npm_3.5.2.orig/node_modules/request/node_modules/extend/.jscs.json0000644000000000000000000000477312631326456023531 0ustar 00000000000000{ "additionalRules": [], "requireSemicolons": true, "disallowMultipleSpaces": true, "disallowIdentifierNames": [], "requireCurlyBraces": ["if", "else", "for", "while", "do", "try", "catch"], "requireSpaceAfterKeywords": ["if", "else", "for", "while", "do", "switch", "return", "try", "catch", "function"], "disallowSpaceAfterKeywords": [], "requireSpacesInAnonymousFunctionExpression": { "beforeOpeningRoundBrace": true, "beforeOpeningCurlyBrace": true }, "requireSpacesInNamedFunctionExpression": { "beforeOpeningCurlyBrace": true }, "disallowSpacesInNamedFunctionExpression": { "beforeOpeningRoundBrace": true }, "requireSpacesInFunctionDeclaration": { "beforeOpeningCurlyBrace": true }, "disallowSpacesInFunctionDeclaration": { "beforeOpeningRoundBrace": true }, "requireSpaceBetweenArguments": true, "disallowSpacesInsideParentheses": true, "disallowSpacesInsideArrayBrackets": true, "disallowQuotedKeysInObjects": "allButReserved", "disallowSpaceAfterObjectKeys": true, "requireCommaBeforeLineBreak": true, "disallowSpaceAfterPrefixUnaryOperators": ["++", "--", "+", "-", "~", "!"], "requireSpaceAfterPrefixUnaryOperators": [], "disallowSpaceBeforePostfixUnaryOperators": ["++", "--"], "requireSpaceBeforePostfixUnaryOperators": [], "disallowSpaceBeforeBinaryOperators": [], "requireSpaceBeforeBinaryOperators": ["+", "-", "/", "*", "=", "==", "===", "!=", "!=="], "requireSpaceAfterBinaryOperators": ["+", "-", "/", "*", "=", "==", "===", "!=", "!=="], "disallowSpaceAfterBinaryOperators": [], "disallowImplicitTypeConversion": ["binary", "string"], "disallowKeywords": ["with", "eval"], "requireKeywordsOnNewLine": [], "disallowKeywordsOnNewLine": ["else"], "requireLineFeedAtFileEnd": true, "disallowTrailingWhitespace": true, "disallowTrailingComma": true, "excludeFiles": ["node_modules/**", "vendor/**"], "disallowMultipleLineStrings": true, "requireDotNotation": true, "requireParenthesesAroundIIFE": true, "validateLineBreaks": "LF", "validateQuoteMarks": { "escape": true, "mark": "'" }, "disallowOperatorBeforeLineBreak": [], "requireSpaceBeforeKeywords": [ "do", "for", "if", "else", "switch", "case", "try", "catch", "finally", "while", "with", "return" ], "validateAlignedFunctionParameters": { "lineBreakAfterOpeningBraces": true, "lineBreakBeforeClosingBraces": true }, "requirePaddingNewLinesBeforeExport": true, "validateNewlineAfterArrayElements": { "maximum": 6 }, "requirePaddingNewLinesAfterUseStrict": true } npm_3.5.2.orig/node_modules/request/node_modules/extend/.npmignore0000644000000000000000000000000412631326456023574 0ustar 00000000000000testnpm_3.5.2.orig/node_modules/request/node_modules/extend/.travis.yml0000644000000000000000000000155712631326456023724 0ustar 00000000000000language: node_js node_js: - "iojs-v2.3" - "iojs-v2.2" - "iojs-v2.1" - "iojs-v2.0" - "iojs-v1.8" - "iojs-v1.7" - "iojs-v1.6" - "iojs-v1.5" - "iojs-v1.4" - "iojs-v1.3" - "iojs-v1.2" - "iojs-v1.1" - "iojs-v1.0" - "0.12" - "0.11" - "0.10" - "0.9" - "0.8" - "0.6" - "0.4" before_install: - '[ "${TRAVIS_NODE_VERSION}" = "0.6" ] || npm install -g npm@1.4.28 && npm install -g npm' sudo: false matrix: fast_finish: true allow_failures: - node_js: "iojs-v2.2" - node_js: "iojs-v2.1" - node_js: "iojs-v2.0" - node_js: "iojs-v1.7" - node_js: "iojs-v1.6" - node_js: "iojs-v1.5" - node_js: "iojs-v1.4" - node_js: "iojs-v1.3" - node_js: "iojs-v1.2" - node_js: "iojs-v1.1" 
- node_js: "iojs-v1.0" - node_js: "0.11" - node_js: "0.9" - node_js: "0.8" - node_js: "0.6" - node_js: "0.4" npm_3.5.2.orig/node_modules/request/node_modules/extend/CHANGELOG.md0000644000000000000000000000416212631326456023417 0ustar 000000000000003.0.0 / 2015-07-01 ================== * [Possible breaking change] Use global "strict" directive (#32) * [Tests] `int` is an ES3 reserved word * [Tests] Test up to `io.js` `v2.3` * [Tests] Add `npm run eslint` * [Dev Deps] Update `covert`, `jscs` 2.0.1 / 2015-04-25 ================== * Use an inline `isArray` check, for ES3 browsers. (#27) * Some old browsers fail when an identifier is `toString` * Test latest `node` and `io.js` versions on `travis-ci`; speed up builds * Add license info to package.json (#25) * Update `tape`, `jscs` * Adding a CHANGELOG 2.0.0 / 2014-10-01 ================== * Increase code coverage to 100%; run code coverage as part of tests * Add `npm run lint`; Run linter as part of tests * Remove nodeType and setInterval checks in isPlainObject * Updating `tape`, `jscs`, `covert` * General style and README cleanup 1.3.0 / 2014-06-20 ================== * Add component.json for browser support (#18) * Use SVG for badges in README (#16) * Updating `tape`, `covert` * Updating travis-ci to work with multiple node versions * Fix `deep === false` bug (returning target as {}) (#14) * Fixing constructor checks in isPlainObject * Adding additional test coverage * Adding `npm run coverage` * Add LICENSE (#13) * Adding a warning about `false`, per #11 * General style and whitespace cleanup 1.2.1 / 2013-09-14 ================== * Fixing hasOwnProperty bugs that would only have shown up in specific browsers. Fixes #8 * Updating `tape` 1.2.0 / 2013-09-02 ================== * Updating the README: add badges * Adding a missing variable reference. * Using `tape` instead of `buster` for tests; add more tests (#7) * Adding node 0.10 to Travis CI (#6) * Enabling "npm test" and cleaning up package.json (#5) * Add Travis CI. 1.1.3 / 2012-12-06 ================== * Added unit tests. * Ensure extend function is named. (Looks nicer in a stack trace.) * README cleanup. 1.1.1 / 2012-11-07 ================== * README cleanup. * Added installation instructions. * Added a missing semicolon 1.0.0 / 2012-04-08 ================== * Initial commit npm_3.5.2.orig/node_modules/request/node_modules/extend/LICENSE0000644000000000000000000000207112631326456022610 0ustar 00000000000000The MIT License (MIT) Copyright (c) 2014 Stefan Thomas Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
npm_3.5.2.orig/node_modules/request/node_modules/extend/README.md0000644000000000000000000000447012631326456023067 0ustar 00000000000000[![Build Status][travis-svg]][travis-url] [![dependency status][deps-svg]][deps-url] [![dev dependency status][dev-deps-svg]][dev-deps-url] # extend() for Node.js [![Version Badge][npm-version-png]][npm-url] `node-extend` is a port of the classic extend() method from jQuery. It behaves as you expect. It is simple, tried and true. ## Installation This package is available on [npm][npm-url] as: `extend` ``` sh npm install extend ``` ## Usage **Syntax:** extend **(** [`deep`], `target`, `object1`, [`objectN`] **)** *Extend one object with one or more others, returning the modified object.* Keep in mind that the target object will be modified, and will be returned from extend(). If a boolean true is specified as the first argument, extend performs a deep copy, recursively copying any objects it finds. Otherwise, the copy will share structure with the original object(s). Undefined properties are not copied. However, properties inherited from the object's prototype will be copied over. Warning: passing `false` as the first argument is not supported. ### Arguments * `deep` *Boolean* (optional) If set, the merge becomes recursive (i.e. deep copy). * `target` *Object* The object to extend. * `object1` *Object* The object that will be merged into the first. * `objectN` *Object* (Optional) More objects to merge into the first. ## License `node-extend` is licensed under the [MIT License][mit-license-url]. ## Acknowledgements All credit to the jQuery authors for perfecting this amazing utility. Ported to Node.js by [Stefan Thomas][github-justmoon] with contributions by [Jonathan Buchanan][github-insin] and [Jordan Harband][github-ljharb]. 
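As a quick illustration of the `deep` flag described under Usage above, here is a minimal sketch (the object shapes are invented for this example):

``` javascript
var extend = require('extend');

var defaults = { retries: 3, log: { level: 'info', pretty: true } };
var overrides = { log: { level: 'debug' } };

// Shallow merge: the nested `log` object from `overrides` replaces the
// one from `defaults` wholesale.
var shallow = extend({}, defaults, overrides);
console.log(shallow.log.pretty); // undefined

// Deep merge: nested plain objects are merged recursively instead.
var deep = extend(true, {}, defaults, overrides);
console.log(deep.log.level);  // 'debug'
console.log(deep.log.pretty); // true
```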
[travis-svg]: https://travis-ci.org/justmoon/node-extend.svg [travis-url]: https://travis-ci.org/justmoon/node-extend [npm-url]: https://npmjs.org/package/extend [mit-license-url]: http://opensource.org/licenses/MIT [github-justmoon]: https://github.com/justmoon [github-insin]: https://github.com/insin [github-ljharb]: https://github.com/ljharb [npm-version-png]: http://vb.teelaun.ch/justmoon/node-extend.svg [deps-svg]: https://david-dm.org/justmoon/node-extend.svg [deps-url]: https://david-dm.org/justmoon/node-extend [dev-deps-svg]: https://david-dm.org/justmoon/node-extend/dev-status.svg [dev-deps-url]: https://david-dm.org/justmoon/node-extend#info=devDependencies npm_3.5.2.orig/node_modules/request/node_modules/extend/component.json0000644000000000000000000000110512631326456024475 0ustar 00000000000000{ "name": "extend", "author": "Stefan Thomas (http://www.justmoon.net)", "version": "3.0.0", "description": "Port of jQuery.extend for node.js and the browser.", "scripts": [ "index.js" ], "contributors": [ { "name": "Jordan Harband", "url": "https://github.com/ljharb" } ], "keywords": [ "extend", "clone", "merge" ], "repository" : { "type": "git", "url": "https://github.com/justmoon/node-extend.git" }, "dependencies": { }, "devDependencies": { "tape" : "~3.0.0", "covert": "~0.4.0", "jscs": "~1.6.2" } } npm_3.5.2.orig/node_modules/request/node_modules/extend/index.js0000644000000000000000000000433212631326456023252 0ustar 00000000000000'use strict'; var hasOwn = Object.prototype.hasOwnProperty; var toStr = Object.prototype.toString; var isArray = function isArray(arr) { if (typeof Array.isArray === 'function') { return Array.isArray(arr); } return toStr.call(arr) === '[object Array]'; }; var isPlainObject = function isPlainObject(obj) { if (!obj || toStr.call(obj) !== '[object Object]') { return false; } var hasOwnConstructor = hasOwn.call(obj, 'constructor'); var hasIsPrototypeOf = obj.constructor && obj.constructor.prototype && hasOwn.call(obj.constructor.prototype, 'isPrototypeOf'); // Not own constructor property must be Object if (obj.constructor && !hasOwnConstructor && !hasIsPrototypeOf) { return false; } // Own properties are enumerated firstly, so to speed up, // if last one is own, then all properties are own. var key; for (key in obj) {/**/} return typeof key === 'undefined' || hasOwn.call(obj, key); }; module.exports = function extend() { var options, name, src, copy, copyIsArray, clone, target = arguments[0], i = 1, length = arguments.length, deep = false; // Handle a deep copy situation if (typeof target === 'boolean') { deep = target; target = arguments[1] || {}; // skip the boolean and the target i = 2; } else if ((typeof target !== 'object' && typeof target !== 'function') || target == null) { target = {}; } for (; i < length; ++i) { options = arguments[i]; // Only deal with non-null/undefined values if (options != null) { // Extend the base object for (name in options) { src = target[name]; copy = options[name]; // Prevent never-ending loop if (target !== copy) { // Recurse if we're merging plain objects or arrays if (deep && copy && (isPlainObject(copy) || (copyIsArray = isArray(copy)))) { if (copyIsArray) { copyIsArray = false; clone = src && isArray(src) ? src : []; } else { clone = src && isPlainObject(src) ? 
src : {}; } // Never move original objects, clone them target[name] = extend(deep, clone, copy); // Don't bring in undefined values } else if (typeof copy !== 'undefined') { target[name] = copy; } } } } } // Return the modified object return target; }; npm_3.5.2.orig/node_modules/request/node_modules/extend/package.json0000644000000000000000000000355312631326456024077 0ustar 00000000000000{ "name": "extend", "author": { "name": "Stefan Thomas", "email": "justmoon@members.fsf.org", "url": "http://www.justmoon.net" }, "version": "3.0.0", "description": "Port of jQuery.extend for node.js and the browser", "main": "index", "scripts": { "test": "npm run lint && node test/index.js && npm run coverage-quiet", "coverage": "covert test/index.js", "coverage-quiet": "covert test/index.js --quiet", "lint": "npm run jscs && npm run eslint", "jscs": "jscs *.js */*.js", "eslint": "eslint *.js */*.js" }, "contributors": [ { "name": "Jordan Harband", "url": "https://github.com/ljharb" } ], "keywords": [ "extend", "clone", "merge" ], "repository": { "type": "git", "url": "git+https://github.com/justmoon/node-extend.git" }, "dependencies": {}, "devDependencies": { "tape": "^4.0.0", "covert": "^1.1.0", "jscs": "^1.13.1", "eslint": "^0.24.0" }, "license": "MIT", "gitHead": "148e7270cab2e9413af2cd0cab147070d755ed6d", "bugs": { "url": "https://github.com/justmoon/node-extend/issues" }, "homepage": "https://github.com/justmoon/node-extend#readme", "_id": "extend@3.0.0", "_shasum": "5a474353b9f3353ddd8176dfd37b91c83a46f1d4", "_from": "extend@>=3.0.0 <3.1.0", "_npmVersion": "2.11.3", "_nodeVersion": "2.3.1", "_npmUser": { "name": "ljharb", "email": "ljharb@gmail.com" }, "dist": { "shasum": "5a474353b9f3353ddd8176dfd37b91c83a46f1d4", "tarball": "http://registry.npmjs.org/extend/-/extend-3.0.0.tgz" }, "maintainers": [ { "name": "justmoon", "email": "justmoon@members.fsf.org" }, { "name": "ljharb", "email": "ljharb@gmail.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/extend/-/extend-3.0.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/request/node_modules/forever-agent/LICENSE0000644000000000000000000002166412631326456024076 0ustar 00000000000000Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. 
"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONSnpm_3.5.2.orig/node_modules/request/node_modules/forever-agent/README.md0000644000000000000000000000024312631326456024336 0ustar 00000000000000forever-agent ============= HTTP Agent that keeps socket connections alive between keep-alive requests. Formerly part of mikeal/request, now a standalone module. npm_3.5.2.orig/node_modules/request/node_modules/forever-agent/index.js0000644000000000000000000001012012631326456024521 0ustar 00000000000000module.exports = ForeverAgent ForeverAgent.SSL = ForeverAgentSSL var util = require('util') , Agent = require('http').Agent , net = require('net') , tls = require('tls') , AgentSSL = require('https').Agent function getConnectionName(host, port) { var name = '' if (typeof host === 'string') { name = host + ':' + port } else { // For node.js v0.12.0 and iojs-v1.5.1, host is an object. And any existing localAddress is part of the connection name. name = host.host + ':' + host.port + ':' + (host.localAddress ? (host.localAddress + ':') : ':') } return name } function ForeverAgent(options) { var self = this self.options = options || {} self.requests = {} self.sockets = {} self.freeSockets = {} self.maxSockets = self.options.maxSockets || Agent.defaultMaxSockets self.minSockets = self.options.minSockets || ForeverAgent.defaultMinSockets self.on('free', function(socket, host, port) { var name = getConnectionName(host, port) if (self.requests[name] && self.requests[name].length) { self.requests[name].shift().onSocket(socket) } else if (self.sockets[name].length < self.minSockets) { if (!self.freeSockets[name]) self.freeSockets[name] = [] self.freeSockets[name].push(socket) // if an error happens while we don't use the socket anyway, meh, throw the socket away var onIdleError = function() { socket.destroy() } socket._onIdleError = onIdleError socket.on('error', onIdleError) } else { // If there are no pending requests just destroy the // socket and it will get removed from the pool. This // gets us out of timeout issues and allows us to // default to Connection:keep-alive.
socket.destroy() } }) } util.inherits(ForeverAgent, Agent) ForeverAgent.defaultMinSockets = 5 ForeverAgent.prototype.createConnection = net.createConnection ForeverAgent.prototype.addRequestNoreuse = Agent.prototype.addRequest ForeverAgent.prototype.addRequest = function(req, host, port) { var name = getConnectionName(host, port) if (typeof host !== 'string') { var options = host port = options.port host = options.host } if (this.freeSockets[name] && this.freeSockets[name].length > 0 && !req.useChunkedEncodingByDefault) { var idleSocket = this.freeSockets[name].pop() idleSocket.removeListener('error', idleSocket._onIdleError) delete idleSocket._onIdleError req._reusedSocket = true req.onSocket(idleSocket) } else { this.addRequestNoreuse(req, host, port) } } ForeverAgent.prototype.removeSocket = function(s, name, host, port) { if (this.sockets[name]) { var index = this.sockets[name].indexOf(s) if (index !== -1) { this.sockets[name].splice(index, 1) } if (this.sockets[name].length === 0) { // don't leak delete this.sockets[name] delete this.requests[name] } } if (this.freeSockets[name]) { var index = this.freeSockets[name].indexOf(s) if (index !== -1) { this.freeSockets[name].splice(index, 1) if (this.freeSockets[name].length === 0) { delete this.freeSockets[name] } } } if (this.requests[name] && this.requests[name].length) { // If we have pending requests and a socket gets closed a new one // needs to be created to take over in the pool for the one that closed. this.createSocket(name, host, port).emit('free') } } function ForeverAgentSSL (options) { ForeverAgent.call(this, options) } util.inherits(ForeverAgentSSL, ForeverAgent) ForeverAgentSSL.prototype.createConnection = createConnectionSSL ForeverAgentSSL.prototype.addRequestNoreuse = AgentSSL.prototype.addRequest function createConnectionSSL (port, host, options) { if (typeof port === 'object') { options = port; } else if (typeof host === 'object') { options = host; } else if (typeof options !== 'object') { options = {}; } if (typeof port === 'number') { options.port = port; } if (typeof host === 'string') { options.host = host; } return tls.connect(options); } npm_3.5.2.orig/node_modules/request/node_modules/forever-agent/package.json0000644000000000000000000000220212631326456025342 0ustar 00000000000000{ "author": { "name": "Mikeal Rogers", "email": "mikeal.rogers@gmail.com", "url": "http://www.futurealoof.com" }, "name": "forever-agent", "description": "HTTP Agent that keeps socket connections alive between keep-alive requests. Formerly part of mikeal/request, now a standalone module.", "version": "0.6.1", "license": "Apache-2.0", "repository": { "url": "git+https://github.com/mikeal/forever-agent.git" }, "main": "index.js", "dependencies": {}, "devDependencies": {}, "optionalDependencies": {}, "engines": { "node": "*" }, "readme": "forever-agent\n=============\n\nHTTP Agent that keeps socket connections alive between keep-alive requests. 
Formerly part of mikeal/request, now a standalone module.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/mikeal/forever-agent/issues" }, "homepage": "https://github.com/mikeal/forever-agent#readme", "_id": "forever-agent@0.6.1", "_shasum": "fbc71f0c41adeb37f96c577ad1ed42d8fdacca91", "_resolved": "https://registry.npmjs.org/forever-agent/-/forever-agent-0.6.1.tgz", "_from": "forever-agent@>=0.6.1 <0.7.0" } npm_3.5.2.orig/node_modules/request/node_modules/form-data/License0000644000000000000000000000213612631326456023475 0ustar 00000000000000Copyright (c) 2012 Felix Geisendörfer (felix@debuggable.com) and contributors Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/request/node_modules/form-data/Readme.md0000644000000000000000000001332412631326456023710 0ustar 00000000000000# Form-Data [![Join the chat at https://gitter.im/form-data/form-data](http://form-data.github.io/images/gitterbadge.svg)](https://gitter.im/form-data/form-data) [![Build Status](https://img.shields.io/travis/form-data/form-data/master.svg)](https://travis-ci.org/form-data/form-data) [![Dependency Status](https://img.shields.io/david/form-data/form-data.svg)](https://david-dm.org/form-data/form-data) A library to create readable ```"multipart/form-data"``` streams. Can be used to submit forms and file uploads to other web applications. The API of this library is inspired by the [XMLHttpRequest-2 FormData Interface][xhr2-fd]. [xhr2-fd]: http://dev.w3.org/2006/webapi/XMLHttpRequest-2/Overview.html#the-formdata-interface [streams2-thing]: http://nodejs.org/api/stream.html#stream_compatibility_with_older_node_versions ## Install ``` npm install form-data ``` ## Usage In this example we are constructing a form with 3 fields that contain a string, a buffer and a file stream. 
``` javascript var FormData = require('form-data'); var fs = require('fs'); var form = new FormData(); form.append('my_field', 'my value'); form.append('my_buffer', new Buffer(10)); form.append('my_file', fs.createReadStream('/foo/bar.jpg')); ``` You can also use an http response stream: ``` javascript var FormData = require('form-data'); var http = require('http'); var form = new FormData(); http.request('http://nodejs.org/images/logo.png', function(response) { form.append('my_field', 'my value'); form.append('my_buffer', new Buffer(10)); form.append('my_logo', response); }); ``` Or @mikeal's [request](https://github.com/request/request) stream: ``` javascript var FormData = require('form-data'); var request = require('request'); var form = new FormData(); form.append('my_field', 'my value'); form.append('my_buffer', new Buffer(10)); form.append('my_logo', request('http://nodejs.org/images/logo.png')); ``` To submit this form to a web application, call the ```submit(url, [callback])``` method: ``` javascript form.submit('http://example.org/', function(err, res) { // res – response object (http.IncomingMessage) // res.resume(); }); ``` For more advanced request manipulation, the ```submit()``` method returns an ```http.ClientRequest``` object, or you can choose from one of the alternative submission methods. ### Alternative submission methods You can use node's http client interface: ``` javascript var http = require('http'); var request = http.request({ method: 'post', host: 'example.org', path: '/upload', headers: form.getHeaders() }); form.pipe(request); request.on('response', function(res) { console.log(res.statusCode); }); ``` Or if you would prefer the `'Content-Length'` header to be set for you: ``` javascript form.submit('example.org/upload', function(err, res) { console.log(res.statusCode); }); ``` To use custom headers and a pre-known length in parts: ``` javascript var CRLF = '\r\n'; var form = new FormData(); var options = { header: CRLF + '--' + form.getBoundary() + CRLF + 'X-Custom-Header: 123' + CRLF + CRLF, knownLength: 1 }; form.append('my_buffer', buffer, options); form.submit('http://example.com/', function(err, res) { if (err) throw err; console.log('Done'); }); ``` Form-Data can recognize and fetch all the required information from common types of streams (```fs.readStream```, ```http.response``` and ```mikeal's request```); for other types of streams you'd need to provide "file"-related information manually: ``` javascript someModule.stream(function(err, stdout, stderr) { if (err) throw err; var form = new FormData(); form.append('file', stdout, { filename: 'unicycle.jpg', contentType: 'image/jpg', knownLength: 19806 }); form.submit('http://example.com/', function(err, res) { if (err) throw err; console.log('Done'); }); }); ``` For edge cases, like a POST request to a URL with a query string or passing HTTP auth credentials, an object can be passed to `form.submit()` as the first parameter: ``` javascript form.submit({ host: 'example.com', path: '/probably.php?extra=params', auth: 'username:password' }, function(err, res) { console.log(res.statusCode); }); ``` If you also need to send custom HTTP headers with the POST request, you can use the `headers` key in the first parameter of `form.submit()`: ``` javascript form.submit({ host: 'example.com', path: '/surelynot.php', headers: {'x-test-header': 'test-header-value'} }, function(err, res) { console.log(res.statusCode); }); ``` ### Integration with other libraries #### Request Form submission using
[request](https://github.com/request/request): ```javascript var formData = { my_field: 'my_value', my_file: fs.createReadStream(__dirname + '/unicycle.jpg'), }; request.post({url:'http://service.com/upload', formData: formData}, function(err, httpResponse, body) { if (err) { return console.error('upload failed:', err); } console.log('Upload successful! Server responded with:', body); }); ``` For more details see [request readme](https://github.com/request/request#multipartform-data-multipart-form-uploads). #### node-fetch You can also submit a form using [node-fetch](https://github.com/bitinn/node-fetch): ```javascript var form = new FormData(); form.append('a', 1); fetch('http://example.com', { method: 'POST', body: form }) .then(function(res) { return res.json(); }).then(function(json) { console.log(json); }); ``` ## Notes - The ```getLengthSync()``` method DOESN'T calculate length for streams; use the ```knownLength``` option as a workaround. - If it feels like FormData hangs after submit and you're on ```node-0.10```, please check [Compatibility with Older Node Versions][streams2-thing] ## License Form-Data is licensed under the MIT license. npm_3.5.2.orig/node_modules/request/node_modules/form-data/lib/0000755000000000000000000000000012631326456022734 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/form-data/node_modules/0000755000000000000000000000000012631326456024643 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/form-data/package.json0000644000000000000000000000413612631326456024460 0ustar 00000000000000{ "author": { "name": "Felix Geisendörfer", "email": "felix@debuggable.com", "url": "http://debuggable.com/" }, "name": "form-data", "description": "A library to create readable \"multipart/form-data\" streams. Can be used to submit forms and file uploads to other web applications.", "version": "1.0.0-rc3", "repository": { "type": "git", "url": "git://github.com/form-data/form-data.git" }, "main": "./lib/form_data", "browser": "./lib/browser", "scripts": { "test": "./test/run.js" }, "pre-commit": [ "test" ], "engines": { "node": ">= 0.10" }, "dependencies": { "async": "^1.4.0", "combined-stream": "^1.0.5", "mime-types": "^2.1.3" }, "license": "MIT", "devDependencies": { "fake": "^0.2.2", "far": "^0.0.7", "formidable": "^1.0.17", "pre-commit": "^1.0.10", "request": "^2.60.0" }, "gitHead": "c174f1b7f3a78a00ec5af0360469280445e37804", "bugs": { "url": "https://github.com/form-data/form-data/issues" }, "homepage": "https://github.com/form-data/form-data#readme", "_id": "form-data@1.0.0-rc3", "_shasum": "d35bc62e7fbc2937ae78f948aaa0d38d90607577", "_from": "form-data@>=1.0.0-rc3 <1.1.0", "_npmVersion": "2.11.0", "_nodeVersion": "2.2.1", "_npmUser": { "name": "dylanpiercey", "email": "pierceydylan@gmail.com" }, "dist": { "shasum": "d35bc62e7fbc2937ae78f948aaa0d38d90607577", "tarball": "http://registry.npmjs.org/form-data/-/form-data-1.0.0-rc3.tgz" }, "maintainers": [ { "name": "felixge", "email": "felix@debuggable.com" }, { "name": "idralyuk", "email": "igor@buran.us" }, { "name": "alexindigo", "email": "iam@alexindigo.com" }, { "name": "mikeal", "email": "mikeal.rogers@gmail.com" }, { "name": "celer", "email": "dtyree77@gmail.com" }, { "name": "dylanpiercey", "email": "pierceydylan@gmail.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/form-data/-/form-data-1.0.0-rc3.tgz", "readme": "ERROR: No README data found!"
} npm_3.5.2.orig/node_modules/request/node_modules/form-data/lib/browser.js0000644000000000000000000000003212631326456024750 0ustar 00000000000000module.exports = FormData;npm_3.5.2.orig/node_modules/request/node_modules/form-data/lib/form_data.js0000644000000000000000000002376712631326456025235 0ustar 00000000000000var CombinedStream = require('combined-stream'); var util = require('util'); var path = require('path'); var http = require('http'); var https = require('https'); var parseUrl = require('url').parse; var fs = require('fs'); var mime = require('mime-types'); var async = require('async'); module.exports = FormData; function FormData() { this._overheadLength = 0; this._valueLength = 0; this._lengthRetrievers = []; CombinedStream.call(this); } util.inherits(FormData, CombinedStream); FormData.LINE_BREAK = '\r\n'; FormData.DEFAULT_CONTENT_TYPE = 'application/octet-stream'; FormData.prototype.append = function(field, value, options) { options = (typeof options === 'string') ? { filename: options } : options || {}; var append = CombinedStream.prototype.append.bind(this); // all that streamy business can't handle numbers if (typeof value == 'number') value = ''+value; // https://github.com/felixge/node-form-data/issues/38 if (util.isArray(value)) { // Please convert your array into a string // the way the web server expects it this._error(new Error('Arrays are not supported.')); return; } var header = this._multiPartHeader(field, value, options); var footer = this._multiPartFooter(field, value, options); append(header); append(value); append(footer); // pass along options.knownLength this._trackLength(header, value, options); }; FormData.prototype._trackLength = function(header, value, options) { var valueLength = 0; // used w/ getLengthSync(), when length is known. // e.g. for streaming directly from a remote server, // w/ a known file size, and not wanting to wait for // the incoming file to finish to get its size. if (options.knownLength != null) { valueLength += +options.knownLength; } else if (Buffer.isBuffer(value)) { valueLength = value.length; } else if (typeof value === 'string') { valueLength = Buffer.byteLength(value); } this._valueLength += valueLength; // @check why add CRLF? does this account for custom/multiple CRLFs? this._overheadLength += Buffer.byteLength(header) + FormData.LINE_BREAK.length; // skip if empty, or if it has no path and is not an http response if (!value || ( !value.path && !(value.readable && value.hasOwnProperty('httpVersion')) )) { return; } // no need to bother with the length if it's already known if (!options.knownLength) this._lengthRetrievers.push(function(next) { if (value.hasOwnProperty('fd')) { // take the read range into account // `end` = Infinity -> read file till the end // // TODO: Looks like there is a bug in Node fs.createReadStream // it doesn't respect `end` options without `start` options // Fix it when node fixes it. // https://github.com/joyent/node/issues/7819 if (value.end != undefined && value.end != Infinity && value.start != undefined) { // when end specified // no need to calculate range // inclusive, starts with 0 next(null, value.end+1 - (value.start ? value.start : 0)); // not that fast snoopy } else { // still need to fetch file size from fs fs.stat(value.path, function(err, stat) { var fileSize; if (err) { next(err); return; } // update final size based on the range options fileSize = stat.size - (value.start ?
value.start : 0); next(null, fileSize); }); } // or http response } else if (value.hasOwnProperty('httpVersion')) { next(null, +value.headers['content-length']); // or request stream http://github.com/mikeal/request } else if (value.hasOwnProperty('httpModule')) { // wait till response comes back value.on('response', function(response) { value.pause(); next(null, +response.headers['content-length']); }); value.resume(); // something else } else { next('Unknown stream'); } }); }; FormData.prototype._multiPartHeader = function(field, value, options) { // custom header specified (as string)? // it becomes responsible for boundary // (e.g. to handle extra CRLFs on .NET servers) if (options.header != null) { return options.header; } var contents = ''; var headers = { 'Content-Disposition': ['form-data', 'name="' + field + '"'], 'Content-Type': [] }; // fs- and request- streams have path property // or use custom filename and/or contentType // TODO: Use request's response mime-type if (options.filename || value.path) { headers['Content-Disposition'].push( 'filename="' + path.basename(options.filename || value.path) + '"' ); headers['Content-Type'].push( options.contentType || mime.lookup(options.filename || value.path) || FormData.DEFAULT_CONTENT_TYPE ); // an http response has no path } else if (value.readable && value.hasOwnProperty('httpVersion')) { headers['Content-Disposition'].push( 'filename="' + path.basename(value.client._httpMessage.path) + '"' ); headers['Content-Type'].push( options.contentType || value.headers['content-type'] || FormData.DEFAULT_CONTENT_TYPE ); } else if (Buffer.isBuffer(value)) { headers['Content-Type'].push( options.contentType || FormData.DEFAULT_CONTENT_TYPE ); } else if (options.contentType) { headers['Content-Type'].push(options.contentType); } for (var prop in headers) { if (headers[prop].length) { contents += prop + ': ' + headers[prop].join('; ') + FormData.LINE_BREAK; } } return '--' + this.getBoundary() + FormData.LINE_BREAK + contents + FormData.LINE_BREAK; }; FormData.prototype._multiPartFooter = function(field, value, options) { return function(next) { var footer = FormData.LINE_BREAK; var lastPart = (this._streams.length === 0); if (lastPart) { footer += this._lastBoundary(); } next(footer); }.bind(this); }; FormData.prototype._lastBoundary = function() { return '--' + this.getBoundary() + '--' + FormData.LINE_BREAK; }; FormData.prototype.getHeaders = function(userHeaders) { var formHeaders = { 'content-type': 'multipart/form-data; boundary=' + this.getBoundary() }; for (var header in userHeaders) { formHeaders[header.toLowerCase()] = userHeaders[header]; } return formHeaders; } FormData.prototype.getCustomHeaders = function(contentType) { contentType = contentType ? contentType : 'multipart/form-data'; var formHeaders = { 'content-type': contentType + '; boundary=' + this.getBoundary(), 'content-length': this.getLengthSync() }; return formHeaders; } FormData.prototype.getBoundary = function() { if (!this._boundary) { this._generateBoundary(); } return this._boundary; }; FormData.prototype._generateBoundary = function() { // This generates a 50 character boundary similar to those used by Firefox. // They are optimized for Boyer-Moore parsing.
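// (the dash prefix below plus the 24 appended characters make up the 50; Math.floor(Math.random() * 10) is always 0-9, so .toString(16) only ever yields decimal digits)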
var boundary = '--------------------------'; for (var i = 0; i < 24; i++) { boundary += Math.floor(Math.random() * 10).toString(16); } this._boundary = boundary; }; // Note: getLengthSync DOESN'T calculate stream lengths // As a workaround one can calculate the file size manually // and add it as the knownLength option FormData.prototype.getLengthSync = function(debug) { var knownLength = this._overheadLength + this._valueLength; // Don't get confused, there are 3 "internal" streams for each keyval pair // so it basically checks if there is any value added to the form if (this._streams.length) { knownLength += this._lastBoundary().length; } // https://github.com/felixge/node-form-data/issues/40 if (this._lengthRetrievers.length) { // Some async length retrievers are present, // so a synchronous length calculation would be wrong. // Please use getLength(callback) to get the proper length this._error(new Error('Cannot calculate proper length in synchronous way.')); } return knownLength; }; FormData.prototype.getLength = function(cb) { var knownLength = this._overheadLength + this._valueLength; if (this._streams.length) { knownLength += this._lastBoundary().length; } if (!this._lengthRetrievers.length) { process.nextTick(cb.bind(this, null, knownLength)); return; } async.parallel(this._lengthRetrievers, function(err, values) { if (err) { cb(err); return; } values.forEach(function(length) { knownLength += length; }); cb(null, knownLength); }); }; FormData.prototype.submit = function(params, cb) { var request , options , defaults = { method : 'post' }; // parse the provided url if it's a string // or treat it as an options object if (typeof params == 'string') { params = parseUrl(params); options = populate({ port: params.port, path: params.pathname, host: params.hostname }, defaults); } else // use custom params { options = populate(params, defaults); // if no port provided, use the default one if (!options.port) { options.port = options.protocol == 'https:' ?
443 : 80; } } // put that good code in getHeaders to some use options.headers = this.getHeaders(params.headers); // https if specified, fallback to http in any other case if (options.protocol == 'https:') { request = https.request(options); } else { request = http.request(options); } // get content length and fire away this.getLength(function(err, length) { // TODO: Add chunked encoding when no length (if err) // add content length request.setHeader('Content-Length', length); this.pipe(request); if (cb) { request.on('error', cb); request.on('response', cb.bind(this, null)); } }.bind(this)); return request; }; FormData.prototype._error = function(err) { if (this.error) return; this.error = err; this.pause(); this.emit('error', err); }; /* * Santa's little helpers */ // populates missing values function populate(dst, src) { for (var prop in src) { if (!dst[prop]) dst[prop] = src[prop]; } return dst; } npm_3.5.2.orig/node_modules/request/node_modules/form-data/node_modules/async/0000755000000000000000000000000012631326456025760 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/form-data/node_modules/async/CHANGELOG.md0000644000000000000000000001026512631326456027575 0ustar 00000000000000# v1.4.2 - Ensure coverage files don't get published on npm (#879) # v1.4.1 - Add in overlooked `detectLimit` method (#866) - Removed unnecessary files from npm releases (#861) - Removed usage of a reserved word to prevent :boom: in older environments (#870) # v1.4.0 - `asyncify` now supports promises (#840) - Added `Limit` versions of `filter` and `reject` (#836) - Add `Limit` versions of `detect`, `some` and `every` (#828, #829) - `some`, `every` and `detect` now short circuit early (#828, #829) - Improve detection of the global object (#804), enabling use in WebWorkers - `whilst` now called with arguments from iterator (#823) - `during` now gets called with arguments from iterator (#824) - Code simplifications and optimizations aplenty ([diff](https://github.com/caolan/async/compare/v1.3.0...v1.4.0)) # v1.3.0 New Features: - Added `constant` - Added `asyncify`/`wrapSync` for making sync functions work with callbacks. (#671, #806) - Added `during` and `doDuring`, which are like `whilst` with an async truth test. (#800) - `retry` now accepts an `interval` parameter to specify a delay between retries. (#793) - `async` should work better in Web Workers due to better `root` detection (#804) - Callbacks are now optional in `whilst`, `doWhilst`, `until`, and `doUntil` (#642) - Various internal updates (#786, #801, #802, #803) - Various doc fixes (#790, #794) Bug Fixes: - `cargo` now exposes the `payload` size, and `cargo.payload` can be changed on the fly after the `cargo` is created. (#740, #744, #783) # v1.2.1 Bug Fix: - Small regression with synchronous iterator behavior in `eachSeries` with a 1-element array. Before 1.1.0, `eachSeries`'s callback was called on the same tick, which this patch restores. In 2.0.0, it will be called on the next tick. (#782) # v1.2.0 New Features: - Added `timesLimit` (#743) - `concurrency` can be changed after initialization in `queue` by setting `q.concurrency`. The new concurrency will be reflected the next time a task is processed. (#747, #772) Bug Fixes: - Fixed a regression in `each` and family with empty arrays that have additional properties. (#775, #777) # v1.1.1 Bug Fix: - Small regression with synchronous iterator behavior in `eachSeries` with a 1-element array. 
Before 1.1.0, `eachSeries`'s callback was called on the same tick, which this patch restores. In 2.0.0, it will be called on the next tick. (#782) # v1.1.0 New Features: - `cargo` now supports all of the same methods and event callbacks as `queue`. - Added `ensureAsync` - A wrapper that ensures an async function calls its callback on a later tick. (#769) - Optimized `map`, `eachOf`, and `waterfall` families of functions - Passing a `null` or `undefined` array to `map`, `each`, `parallel` and families will be treated as an empty array (#667). - The callback is now optional for the composed results of `compose` and `seq`. (#618) - Reduced file size by 4kb, (minified version by 1kb) - Added code coverage through `nyc` and `coveralls` (#768) Bug Fixes: - `forever` will no longer stack overflow with a synchronous iterator (#622) - `eachLimit` and other limit functions will stop iterating once an error occurs (#754) - Always pass `null` in callbacks when there is no error (#439) - Ensure proper conditions when calling `drain()` after pushing an empty data set to a queue (#668) - `each` and family will properly handle an empty array (#578) - `eachSeries` and family will finish if the underlying array is modified during execution (#557) - `queue` will throw if a non-function is passed to `q.push()` (#593) - Doc fixes (#629, #766) # v1.0.0 No known breaking changes, we are simply complying with semver from here on out. Changes: - Start using a changelog! - Add `forEachOf` for iterating over Objects (or to iterate Arrays with indexes available) (#168 #704 #321) - Detect deadlocks in `auto` (#663) - Better support for require.js (#527) - Throw if queue created with concurrency `0` (#714) - Fix unneeded iteration in `queue.resume()` (#758) - Guard against timer mocking overriding `setImmediate` (#609 #611) - Miscellaneous doc fixes (#542 #596 #615 #628 #631 #690 #729) - Use single noop function internally (#546) - Optimize internal `_each`, `_map` and `_keys` functions. npm_3.5.2.orig/node_modules/request/node_modules/form-data/node_modules/async/LICENSE0000644000000000000000000000204712631326456026770 0ustar 00000000000000Copyright (c) 2010-2014 Caolan McMahon Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
npm_3.5.2.orig/node_modules/request/node_modules/form-data/node_modules/async/lib/0000755000000000000000000000000012631326456026526 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/form-data/node_modules/async/package.json0000644000000000000000000000442012631326456030246 0ustar 00000000000000{ "name": "async", "description": "Higher-order functions and common patterns for asynchronous code", "main": "lib/async.js", "files": [ "lib" ], "author": { "name": "Caolan McMahon" }, "version": "1.4.2", "keywords": [ "async", "callback", "utility", "module" ], "repository": { "type": "git", "url": "git+https://github.com/caolan/async.git" }, "bugs": { "url": "https://github.com/caolan/async/issues" }, "license": "MIT", "devDependencies": { "benchmark": "github:bestiejs/benchmark.js", "bluebird": "^2.9.32", "chai": "^3.1.0", "coveralls": "^2.11.2", "es6-promise": "^2.3.0", "jscs": "^1.13.1", "jshint": "~2.8.0", "karma": "^0.13.2", "karma-browserify": "^4.2.1", "karma-firefox-launcher": "^0.1.6", "karma-mocha": "^0.2.0", "karma-mocha-reporter": "^1.0.2", "lodash": "^3.9.0", "mkdirp": "~0.5.1", "mocha": "^2.2.5", "native-promise-only": "^0.8.0-a", "nodeunit": ">0.0.0", "nyc": "^2.1.0", "rsvp": "^3.0.18", "uglify-js": "~2.4.0", "xyz": "^0.5.0", "yargs": "~3.9.1" }, "jam": { "main": "lib/async.js", "include": [ "lib/async.js", "README.md", "LICENSE" ], "categories": [ "Utilities" ] }, "scripts": { "mocha-node-test": "mocha mocha_test/", "mocha-browser-test": "karma start", "mocha-test": "npm run mocha-node-test && npm run mocha-browser-test", "nodeunit-test": "nodeunit test/test-async.js", "test": "npm run-script lint && npm run nodeunit-test && npm run mocha-test", "lint": "jshint lib/*.js test/*.js perf/*.js && jscs lib/*.js test/*.js perf/*.js", "coverage": "nyc npm test && nyc report", "coveralls": "nyc npm test && nyc report --reporter=text-lcov | coveralls" }, "spm": { "main": "lib/async.js" }, "volo": { "main": "lib/async.js", "ignore": [ "**/.*", "node_modules", "bower_components", "test", "tests" ] }, "readme": "ERROR: No README data found!", "homepage": "https://github.com/caolan/async#readme", "_id": "async@1.4.2", "_shasum": "6c9edcb11ced4f0dd2f2d40db0d49a109c088aab", "_resolved": "https://registry.npmjs.org/async/-/async-1.4.2.tgz", "_from": "async@>=1.4.0 <2.0.0" } npm_3.5.2.orig/node_modules/request/node_modules/form-data/node_modules/async/lib/async.js0000644000000000000000000011055112631326456030204 0ustar 00000000000000/*! * async * https://github.com/caolan/async * * Copyright 2010-2014 Caolan McMahon * Released under the MIT license */ (function () { var async = {}; function noop() {} function identity(v) { return v; } function toBool(v) { return !!v; } function notId(v) { return !v; } // global on the server, window in the browser var previous_async; // Establish the root object, `window` (`self`) in the browser, `global` // on the server, or `this` in some virtual machines. We use `self` // instead of `window` for `WebWorker` support. 
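// (a hand-rolled version of what later JavaScript runtimes standardized as `globalThis`)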
var root = typeof self === 'object' && self.self === self && self || typeof global === 'object' && global.global === global && global || this; if (root != null) { previous_async = root.async; } async.noConflict = function () { root.async = previous_async; return async; }; function only_once(fn) { return function() { if (fn === null) throw new Error("Callback was already called."); fn.apply(this, arguments); fn = null; }; } function _once(fn) { return function() { if (fn === null) return; fn.apply(this, arguments); fn = null; }; } //// cross-browser compatibility functions //// var _toString = Object.prototype.toString; var _isArray = Array.isArray || function (obj) { return _toString.call(obj) === '[object Array]'; }; // Ported from underscore.js isObject var _isObject = function(obj) { var type = typeof obj; return type === 'function' || type === 'object' && !!obj; }; function _isArrayLike(arr) { return _isArray(arr) || ( // has a positive integer length property typeof arr.length === "number" && arr.length >= 0 && arr.length % 1 === 0 ); } function _each(coll, iterator) { return _isArrayLike(coll) ? _arrayEach(coll, iterator) : _forEachOf(coll, iterator); } function _arrayEach(arr, iterator) { var index = -1, length = arr.length; while (++index < length) { iterator(arr[index], index, arr); } } function _map(arr, iterator) { var index = -1, length = arr.length, result = Array(length); while (++index < length) { result[index] = iterator(arr[index], index, arr); } return result; } function _range(count) { return _map(Array(count), function (v, i) { return i; }); } function _reduce(arr, iterator, memo) { _arrayEach(arr, function (x, i, a) { memo = iterator(memo, x, i, a); }); return memo; } function _forEachOf(object, iterator) { _arrayEach(_keys(object), function (key) { iterator(object[key], key); }); } function _indexOf(arr, item) { for (var i = 0; i < arr.length; i++) { if (arr[i] === item) return i; } return -1; } var _keys = Object.keys || function (obj) { var keys = []; for (var k in obj) { if (obj.hasOwnProperty(k)) { keys.push(k); } } return keys; }; function _keyIterator(coll) { var i = -1; var len; var keys; if (_isArrayLike(coll)) { len = coll.length; return function next() { i++; return i < len ? i : null; }; } else { keys = _keys(coll); len = keys.length; return function next() { i++; return i < len ? keys[i] : null; }; } } // Similar to ES6's rest param (http://ariya.ofilabs.com/2013/03/es6-and-rest-parameter.html) // This accumulates the arguments passed into an array, after a given index. // From underscore.js (https://github.com/jashkenas/underscore/pull/2140). function _restParam(func, startIndex) { startIndex = startIndex == null ?
func.length - 1 : +startIndex; return function() { var length = Math.max(arguments.length - startIndex, 0); var rest = Array(length); for (var index = 0; index < length; index++) { rest[index] = arguments[index + startIndex]; } switch (startIndex) { case 0: return func.call(this, rest); case 1: return func.call(this, arguments[0], rest); } // Currently unused but handle cases outside of the switch statement: // var args = Array(startIndex + 1); // for (index = 0; index < startIndex; index++) { // args[index] = arguments[index]; // } // args[startIndex] = rest; // return func.apply(this, args); }; } function _withoutIndex(iterator) { return function (value, index, callback) { return iterator(value, callback); }; } //// exported async module functions //// //// nextTick implementation with browser-compatible fallback //// // capture the global reference to guard against fakeTimer mocks var _setImmediate = typeof setImmediate === 'function' && setImmediate; var _delay = _setImmediate ? function(fn) { // not a direct alias for IE10 compatibility _setImmediate(fn); } : function(fn) { setTimeout(fn, 0); }; if (typeof process === 'object' && typeof process.nextTick === 'function') { async.nextTick = process.nextTick; } else { async.nextTick = _delay; } async.setImmediate = _setImmediate ? _delay : async.nextTick; async.forEach = async.each = function (arr, iterator, callback) { return async.eachOf(arr, _withoutIndex(iterator), callback); }; async.forEachSeries = async.eachSeries = function (arr, iterator, callback) { return async.eachOfSeries(arr, _withoutIndex(iterator), callback); }; async.forEachLimit = async.eachLimit = function (arr, limit, iterator, callback) { return _eachOfLimit(limit)(arr, _withoutIndex(iterator), callback); }; async.forEachOf = async.eachOf = function (object, iterator, callback) { callback = _once(callback || noop); object = object || []; var size = _isArrayLike(object) ? 
object.length : _keys(object).length; var completed = 0; if (!size) { return callback(null); } _each(object, function (value, key) { iterator(object[key], key, only_once(done)); }); function done(err) { if (err) { callback(err); } else { completed += 1; if (completed >= size) { callback(null); } } } }; async.forEachOfSeries = async.eachOfSeries = function (obj, iterator, callback) { callback = _once(callback || noop); obj = obj || []; var nextKey = _keyIterator(obj); var key = nextKey(); function iterate() { var sync = true; if (key === null) { return callback(null); } iterator(obj[key], key, only_once(function (err) { if (err) { callback(err); } else { key = nextKey(); if (key === null) { return callback(null); } else { if (sync) { async.nextTick(iterate); } else { iterate(); } } } })); sync = false; } iterate(); }; async.forEachOfLimit = async.eachOfLimit = function (obj, limit, iterator, callback) { _eachOfLimit(limit)(obj, iterator, callback); }; function _eachOfLimit(limit) { return function (obj, iterator, callback) { callback = _once(callback || noop); obj = obj || []; var nextKey = _keyIterator(obj); if (limit <= 0) { return callback(null); } var done = false; var running = 0; var errored = false; (function replenish () { if (done && running <= 0) { return callback(null); } while (running < limit && !errored) { var key = nextKey(); if (key === null) { done = true; if (running <= 0) { callback(null); } return; } running += 1; iterator(obj[key], key, only_once(function (err) { running -= 1; if (err) { callback(err); errored = true; } else { replenish(); } })); } })(); }; } function doParallel(fn) { return function (obj, iterator, callback) { return fn(async.eachOf, obj, iterator, callback); }; } function doParallelLimit(fn) { return function (obj, limit, iterator, callback) { return fn(_eachOfLimit(limit), obj, iterator, callback); }; } function doSeries(fn) { return function (obj, iterator, callback) { return fn(async.eachOfSeries, obj, iterator, callback); }; } function _asyncMap(eachfn, arr, iterator, callback) { callback = _once(callback || noop); var results = []; eachfn(arr, function (value, index, callback) { iterator(value, function (err, v) { results[index] = v; callback(err); }); }, function (err) { callback(err, results); }); } async.map = doParallel(_asyncMap); async.mapSeries = doSeries(_asyncMap); async.mapLimit = doParallelLimit(_asyncMap); // reduce only has a series version, as doing reduce in parallel won't // work in many situations. 
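// For example (a minimal usage sketch, not part of the original source):
//   async.reduce([1, 2, 3], 0,
//     function (memo, item, callback) { callback(null, memo + item); },
//     function (err, result) { /* result -> 6 */ });
// Each step waits for the previous iterator's callback, which is why only a series version exists.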
async.inject = async.foldl = async.reduce = function (arr, memo, iterator, callback) { async.eachOfSeries(arr, function (x, i, callback) { iterator(memo, x, function (err, v) { memo = v; callback(err); }); }, function (err) { callback(err || null, memo); }); }; async.foldr = async.reduceRight = function (arr, memo, iterator, callback) { var reversed = _map(arr, identity).reverse(); async.reduce(reversed, memo, iterator, callback); }; function _filter(eachfn, arr, iterator, callback) { var results = []; eachfn(arr, function (x, index, callback) { iterator(x, function (v) { if (v) { results.push({index: index, value: x}); } callback(); }); }, function () { callback(_map(results.sort(function (a, b) { return a.index - b.index; }), function (x) { return x.value; })); }); } async.select = async.filter = doParallel(_filter); async.selectLimit = async.filterLimit = doParallelLimit(_filter); async.selectSeries = async.filterSeries = doSeries(_filter); function _reject(eachfn, arr, iterator, callback) { _filter(eachfn, arr, function(value, cb) { iterator(value, function(v) { cb(!v); }); }, callback); } async.reject = doParallel(_reject); async.rejectLimit = doParallelLimit(_reject); async.rejectSeries = doSeries(_reject); function _createTester(eachfn, check, getResult) { return function(arr, limit, iterator, cb) { function done() { if (cb) cb(getResult(false, void 0)); } function iteratee(x, _, callback) { if (!cb) return callback(); iterator(x, function (v) { if (cb && check(v)) { cb(getResult(true, x)); cb = iterator = false; } callback(); }); } if (arguments.length > 3) { eachfn(arr, limit, iteratee, done); } else { cb = iterator; iterator = limit; eachfn(arr, iteratee, done); } }; } async.any = async.some = _createTester(async.eachOf, toBool, identity); async.someLimit = _createTester(async.eachOfLimit, toBool, identity); async.all = async.every = _createTester(async.eachOf, notId, notId); async.everyLimit = _createTester(async.eachOfLimit, notId, notId); function _findGetResult(v, x) { return x; } async.detect = _createTester(async.eachOf, identity, _findGetResult); async.detectSeries = _createTester(async.eachOfSeries, identity, _findGetResult); async.detectLimit = _createTester(async.eachOfLimit, identity, _findGetResult); async.sortBy = function (arr, iterator, callback) { async.map(arr, function (x, callback) { iterator(x, function (err, criteria) { if (err) { callback(err); } else { callback(null, {value: x, criteria: criteria}); } }); }, function (err, results) { if (err) { return callback(err); } else { callback(null, _map(results.sort(comparator), function (x) { return x.value; })); } }); function comparator(left, right) { var a = left.criteria, b = right.criteria; return a < b ? -1 : a > b ? 1 : 0; } }; async.auto = function (tasks, callback) { callback = _once(callback || noop); var keys = _keys(tasks); var remainingTasks = keys.length; if (!remainingTasks) { return callback(null); } var results = {}; var listeners = []; function addListener(fn) { listeners.unshift(fn); } function removeListener(fn) { var idx = _indexOf(listeners, fn); if (idx >= 0) listeners.splice(idx, 1); } function taskComplete() { remainingTasks--; _arrayEach(listeners.slice(0), function (fn) { fn(); }); } addListener(function () { if (!remainingTasks) { callback(null, results); } }); _arrayEach(keys, function (k) { var task = _isArray(tasks[k]) ? 
tasks[k]: [tasks[k]]; var taskCallback = _restParam(function(err, args) { if (args.length <= 1) { args = args[0]; } if (err) { var safeResults = {}; _forEachOf(results, function(val, rkey) { safeResults[rkey] = val; }); safeResults[k] = args; callback(err, safeResults); } else { results[k] = args; async.setImmediate(taskComplete); } }); var requires = task.slice(0, task.length - 1); // prevent deadlocks var len = requires.length; var dep; while (len--) { if (!(dep = tasks[requires[len]])) { throw new Error('Has nonexistent dependency'); } if (_isArray(dep) && _indexOf(dep, k) >= 0) { throw new Error('Has cyclic dependencies'); } } function ready() { return _reduce(requires, function (a, x) { return (a && results.hasOwnProperty(x)); }, true) && !results.hasOwnProperty(k); } if (ready()) { task[task.length - 1](taskCallback, results); } else { addListener(listener); } function listener() { if (ready()) { removeListener(listener); task[task.length - 1](taskCallback, results); } } }); }; async.retry = function(times, task, callback) { var DEFAULT_TIMES = 5; var DEFAULT_INTERVAL = 0; var attempts = []; var opts = { times: DEFAULT_TIMES, interval: DEFAULT_INTERVAL }; function parseTimes(acc, t){ if(typeof t === 'number'){ acc.times = parseInt(t, 10) || DEFAULT_TIMES; } else if(typeof t === 'object'){ acc.times = parseInt(t.times, 10) || DEFAULT_TIMES; acc.interval = parseInt(t.interval, 10) || DEFAULT_INTERVAL; } else { throw new Error('Unsupported argument type for \'times\': ' + typeof t); } } var length = arguments.length; if (length < 1 || length > 3) { throw new Error('Invalid arguments - must be either (task), (task, callback), (times, task) or (times, task, callback)'); } else if (length <= 2 && typeof times === 'function') { callback = task; task = times; } if (typeof times !== 'function') { parseTimes(opts, times); } opts.callback = callback; opts.task = task; function wrappedTask(wrappedCallback, wrappedResults) { function retryAttempt(task, finalAttempt) { return function(seriesCallback) { task(function(err, result){ seriesCallback(!err || finalAttempt, {err: err, result: result}); }, wrappedResults); }; } function retryInterval(interval){ return function(seriesCallback){ setTimeout(function(){ seriesCallback(null); }, interval); }; } while (opts.times) { var finalAttempt = !(opts.times-=1); attempts.push(retryAttempt(opts.task, finalAttempt)); if(!finalAttempt && opts.interval > 0){ attempts.push(retryInterval(opts.interval)); } } async.series(attempts, function(done, data){ data = data[data.length - 1]; (wrappedCallback || opts.callback)(data.err, data.result); }); } // If a callback is passed, run this as a control flow return opts.callback ? wrappedTask() : wrappedTask; }; async.waterfall = function (tasks, callback) { callback = _once(callback || noop); if (!_isArray(tasks)) { var err = new Error('First argument to waterfall must be an array of functions'); return callback(err); } if (!tasks.length) { return callback(); } function wrapIterator(iterator) { return _restParam(function (err, args) { if (err) { callback.apply(null, [err].concat(args)); } else { var next = iterator.next(); if (next) { args.push(wrapIterator(next)); } else { args.push(callback); } ensureAsync(iterator).apply(null, args); } }); } wrapIterator(async.iterator(tasks))(); }; function _parallel(eachfn, tasks, callback) { callback = callback || noop; var results = _isArrayLike(tasks) ?
[] : {}; eachfn(tasks, function (task, key, callback) { task(_restParam(function (err, args) { if (args.length <= 1) { args = args[0]; } results[key] = args; callback(err); })); }, function (err) { callback(err, results); }); } async.parallel = function (tasks, callback) { _parallel(async.eachOf, tasks, callback); }; async.parallelLimit = function(tasks, limit, callback) { _parallel(_eachOfLimit(limit), tasks, callback); }; async.series = function(tasks, callback) { _parallel(async.eachOfSeries, tasks, callback); }; async.iterator = function (tasks) { function makeCallback(index) { function fn() { if (tasks.length) { tasks[index].apply(null, arguments); } return fn.next(); } fn.next = function () { return (index < tasks.length - 1) ? makeCallback(index + 1): null; }; return fn; } return makeCallback(0); }; async.apply = _restParam(function (fn, args) { return _restParam(function (callArgs) { return fn.apply( null, args.concat(callArgs) ); }); }); function _concat(eachfn, arr, fn, callback) { var result = []; eachfn(arr, function (x, index, cb) { fn(x, function (err, y) { result = result.concat(y || []); cb(err); }); }, function (err) { callback(err, result); }); } async.concat = doParallel(_concat); async.concatSeries = doSeries(_concat); async.whilst = function (test, iterator, callback) { callback = callback || noop; if (test()) { var next = _restParam(function(err, args) { if (err) { callback(err); } else if (test.apply(this, args)) { iterator(next); } else { callback(null); } }); iterator(next); } else { callback(null); } }; async.doWhilst = function (iterator, test, callback) { var calls = 0; return async.whilst(function() { return ++calls <= 1 || test.apply(this, arguments); }, iterator, callback); }; async.until = function (test, iterator, callback) { return async.whilst(function() { return !test.apply(this, arguments); }, iterator, callback); }; async.doUntil = function (iterator, test, callback) { return async.doWhilst(iterator, function() { return !test.apply(this, arguments); }, callback); }; async.during = function (test, iterator, callback) { callback = callback || noop; var next = _restParam(function(err, args) { if (err) { callback(err); } else { args.push(check); test.apply(this, args); } }); var check = function(err, truth) { if (err) { callback(err); } else if (truth) { iterator(next); } else { callback(null); } }; test(check); }; async.doDuring = function (iterator, test, callback) { var calls = 0; async.during(function(next) { if (calls++ < 1) { next(null, true); } else { test.apply(this, arguments); } }, iterator, callback); }; function _queue(worker, concurrency, payload) { if (concurrency == null) { concurrency = 1; } else if(concurrency === 0) { throw new Error('Concurrency must not be zero'); } function _insert(q, data, pos, callback) { if (callback != null && typeof callback !== "function") { throw new Error("task callback must be a function"); } q.started = true; if (!_isArray(data)) { data = [data]; } if(data.length === 0 && q.idle()) { // call drain immediately if there are no tasks return async.setImmediate(function() { q.drain(); }); } _arrayEach(data, function(task) { var item = { data: task, callback: callback || noop }; if (pos) { q.tasks.unshift(item); } else { q.tasks.push(item); } if (q.tasks.length === q.concurrency) { q.saturated(); } }); async.setImmediate(q.process); } function _next(q, tasks) { return function(){ workers -= 1; var args = arguments; _arrayEach(tasks, function (task) { task.callback.apply(task, args); }); if (q.tasks.length + workers 
=== 0) { q.drain(); } q.process(); }; } var workers = 0; var q = { tasks: [], concurrency: concurrency, payload: payload, saturated: noop, empty: noop, drain: noop, started: false, paused: false, push: function (data, callback) { _insert(q, data, false, callback); }, kill: function () { q.drain = noop; q.tasks = []; }, unshift: function (data, callback) { _insert(q, data, true, callback); }, process: function () { if (!q.paused && workers < q.concurrency && q.tasks.length) { while(workers < q.concurrency && q.tasks.length){ var tasks = q.payload ? q.tasks.splice(0, q.payload) : q.tasks.splice(0, q.tasks.length); var data = _map(tasks, function (task) { return task.data; }); if (q.tasks.length === 0) { q.empty(); } workers += 1; var cb = only_once(_next(q, tasks)); worker(data, cb); } } }, length: function () { return q.tasks.length; }, running: function () { return workers; }, idle: function() { return q.tasks.length + workers === 0; }, pause: function () { q.paused = true; }, resume: function () { if (q.paused === false) { return; } q.paused = false; var resumeCount = Math.min(q.concurrency, q.tasks.length); // Need to call q.process once per concurrent // worker to preserve full concurrency after pause for (var w = 1; w <= resumeCount; w++) { async.setImmediate(q.process); } } }; return q; } async.queue = function (worker, concurrency) { var q = _queue(function (items, cb) { worker(items[0], cb); }, concurrency, 1); return q; }; async.priorityQueue = function (worker, concurrency) { function _compareTasks(a, b){ return a.priority - b.priority; } function _binarySearch(sequence, item, compare) { var beg = -1, end = sequence.length - 1; while (beg < end) { var mid = beg + ((end - beg + 1) >>> 1); if (compare(item, sequence[mid]) >= 0) { beg = mid; } else { end = mid - 1; } } return beg; } function _insert(q, data, priority, callback) { if (callback != null && typeof callback !== "function") { throw new Error("task callback must be a function"); } q.started = true; if (!_isArray(data)) { data = [data]; } if(data.length === 0) { // call drain immediately if there are no tasks return async.setImmediate(function() { q.drain(); }); } _arrayEach(data, function(task) { var item = { data: task, priority: priority, callback: typeof callback === 'function' ? 
callback : noop }; q.tasks.splice(_binarySearch(q.tasks, item, _compareTasks) + 1, 0, item); if (q.tasks.length === q.concurrency) { q.saturated(); } async.setImmediate(q.process); }); } // Start with a normal queue var q = async.queue(worker, concurrency); // Override push to accept second parameter representing priority q.push = function (data, priority, callback) { _insert(q, data, priority, callback); }; // Remove unshift function delete q.unshift; return q; }; async.cargo = function (worker, payload) { return _queue(worker, 1, payload); }; function _console_fn(name) { return _restParam(function (fn, args) { fn.apply(null, args.concat([_restParam(function (err, args) { if (typeof console === 'object') { if (err) { if (console.error) { console.error(err); } } else if (console[name]) { _arrayEach(args, function (x) { console[name](x); }); } } })])); }); } async.log = _console_fn('log'); async.dir = _console_fn('dir'); /*async.info = _console_fn('info'); async.warn = _console_fn('warn'); async.error = _console_fn('error');*/ async.memoize = function (fn, hasher) { var memo = {}; var queues = {}; hasher = hasher || identity; var memoized = _restParam(function memoized(args) { var callback = args.pop(); var key = hasher.apply(null, args); if (key in memo) { async.nextTick(function () { callback.apply(null, memo[key]); }); } else if (key in queues) { queues[key].push(callback); } else { queues[key] = [callback]; fn.apply(null, args.concat([_restParam(function (args) { memo[key] = args; var q = queues[key]; delete queues[key]; for (var i = 0, l = q.length; i < l; i++) { q[i].apply(null, args); } })])); } }); memoized.memo = memo; memoized.unmemoized = fn; return memoized; }; async.unmemoize = function (fn) { return function () { return (fn.unmemoized || fn).apply(null, arguments); }; }; function _times(mapper) { return function (count, iterator, callback) { mapper(_range(count), iterator, callback); }; } async.times = _times(async.map); async.timesSeries = _times(async.mapSeries); async.timesLimit = function (count, limit, iterator, callback) { return async.mapLimit(_range(count), limit, iterator, callback); }; async.seq = function (/* functions... */) { var fns = arguments; return _restParam(function (args) { var that = this; var callback = args[args.length - 1]; if (typeof callback == 'function') { args.pop(); } else { callback = noop; } async.reduce(fns, args, function (newargs, fn, cb) { fn.apply(that, newargs.concat([_restParam(function (err, nextargs) { cb(err, nextargs); })])); }, function (err, results) { callback.apply(that, [err].concat(results)); }); }); }; async.compose = function (/* functions... 
*/) { return async.seq.apply(null, Array.prototype.reverse.call(arguments)); } function _applyEach(eachfn) { return _restParam(function(fns, args) { var go = _restParam(function(args) { var that = this; var callback = args.pop(); return eachfn(fns, function (fn, _, cb) { fn.apply(that, args.concat([cb])); }, callback); }); if (args.length) { return go.apply(this, args); } else { return go; } }); } async.applyEach = _applyEach(async.eachOf); async.applyEachSeries = _applyEach(async.eachOfSeries); async.forever = function (fn, callback) { var done = only_once(callback || noop); var task = ensureAsync(fn); function next(err) { if (err) { return done(err); } task(next); } next(); }; function ensureAsync(fn) { return _restParam(function (args) { var callback = args.pop(); args.push(function () { var innerArgs = arguments; if (sync) { async.setImmediate(function () { callback.apply(null, innerArgs); }); } else { callback.apply(null, innerArgs); } }); var sync = true; fn.apply(this, args); sync = false; }); } async.ensureAsync = ensureAsync; async.constant = _restParam(function(values) { var args = [null].concat(values); return function (callback) { return callback.apply(this, args); }; }); async.wrapSync = async.asyncify = function asyncify(func) { return _restParam(function (args) { var callback = args.pop(); var result; try { result = func.apply(this, args); } catch (e) { return callback(e); } // if result is a Promise object if (_isObject(result) && typeof result.then === "function") { result.then(function(value) { callback(null, value); })["catch"](function(err) { callback(err.message ? err : new Error(err)); }); } else { callback(null, result); } }); }; // Node.js if (typeof module === 'object' && module.exports) { module.exports = async; } // AMD / RequireJS else if (typeof define === 'function' && define.amd) { define([], function () { return async; }); } // included directly via <script> tag else { root.async = async; } }()); describe('escapeJavaScript()', function () { it('encodes / characters', function (done) { var encoded = Hoek.escapeJavaScript('<script>alert(1)</script>'); expect(encoded).to.equal('\\x3cscript\\x3ealert\\x281\\x29\\x3c\\x2fscript\\x3e'); done(); }); it('encodes \' characters', function (done) { var encoded = Hoek.escapeJavaScript('something(\'param\')'); expect(encoded).to.equal('something\\x28\\x27param\\x27\\x29'); done(); }); it('encodes large unicode characters with the correct padding', function (done) { var encoded = Hoek.escapeJavaScript(String.fromCharCode(500) + String.fromCharCode(1000)); expect(encoded).to.equal('\\u0500\\u1000'); done(); }); it('doesn\'t throw an exception when passed null', function (done) { var encoded = Hoek.escapeJavaScript(null); expect(encoded).to.equal(''); done(); }); }); describe('escapeHtml()', function () { it('encodes / characters', function (done) { var encoded = Hoek.escapeHtml('<script>alert(1)</script>'); expect(encoded).to.equal('&lt;script&gt;alert&#x28;1&#x29;&lt;&#x2f;script&gt;'); done(); }); it('encodes < and > as named characters', function (done) { var encoded = Hoek.escapeHtml(' ``` Or in node.js: ``` npm install node-uuid ``` ```javascript var uuid = require('node-uuid'); ``` Then create some ids ... ```javascript // Generate a v1 (time-based) id uuid.v1(); // -> '6c84fb90-12c4-11e1-840d-7b25c5ee775a' // Generate a v4 (random) id uuid.v4(); // -> '110ec58a-a0f2-4ac4-8393-c866d813b8d1' ``` ## API ### uuid.v1([`options` [, `buffer` [, `offset`]]]) Generate and return a RFC4122 v1 (timestamp-based) UUID. * `options` - (Object) Optional uuid state to apply. Properties may include: * `node` - (Array) Node id as Array of 6 bytes (per 4.1.6). Default: Randomly generated ID. See note 1. * `clockseq` - (Number between 0 - 0x3fff) RFC clock sequence.
Default: An internally maintained clockseq is used. * `msecs` - (Number | Date) Time in milliseconds since unix Epoch. Default: The current time is used. * `nsecs` - (Number between 0-9999) additional time, in 100-nanosecond units. Ignored if `msecs` is unspecified. Default: internal uuid counter is used, as per 4.2.1.2. * `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written. * `offset` - (Number) Starting index in `buffer` at which to begin writing. Returns `buffer`, if specified, otherwise the string form of the UUID. Notes: 1. The randomly generated node id is only guaranteed to stay constant for the lifetime of the current JS runtime. (Future versions of this module may use persistent storage mechanisms to extend this guarantee.) Example: Generate string UUID with fully-specified options ```javascript uuid.v1({ node: [0x01, 0x23, 0x45, 0x67, 0x89, 0xab], clockseq: 0x1234, msecs: new Date('2011-11-01').getTime(), nsecs: 5678 }); // -> "710b962e-041c-11e1-9234-0123456789ab" ``` Example: In-place generation of two binary IDs ```javascript // Generate two ids in an array var arr = new Array(32); // -> [] uuid.v1(null, arr, 0); // -> [02 a2 ce 90 14 32 11 e1 85 58 0b 48 8e 4f c1 15] uuid.v1(null, arr, 16); // -> [02 a2 ce 90 14 32 11 e1 85 58 0b 48 8e 4f c1 15 02 a3 1c b0 14 32 11 e1 85 58 0b 48 8e 4f c1 15] // Optionally use uuid.unparse() to stringify the ids uuid.unparse(arr); // -> '02a2ce90-1432-11e1-8558-0b488e4fc115' uuid.unparse(arr, 16); // -> '02a31cb0-1432-11e1-8558-0b488e4fc115' ``` ### uuid.v4([`options` [, `buffer` [, `offset`]]]) Generate and return a RFC4122 v4 UUID. * `options` - (Object) Optional uuid state to apply. Properties may include: * `random` - (Number[16]) Array of 16 numbers (0-255) to use in place of randomly generated values * `rng` - (Function) Random # generator to use. Set to one of the built-in generators - `uuid.mathRNG` (all platforms), `uuid.nodeRNG` (node.js only), `uuid.whatwgRNG` (WebKit only) - or a custom function that returns an array[16] of byte values. * `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written. * `offset` - (Number) Starting index in `buffer` at which to begin writing. Returns `buffer`, if specified, otherwise the string form of the UUID. Example: Generate string UUID with fully-specified options ```javascript uuid.v4({ random: [ 0x10, 0x91, 0x56, 0xbe, 0xc4, 0xfb, 0xc1, 0xea, 0x71, 0xb4, 0xef, 0xe1, 0x67, 0x1c, 0x58, 0x36 ] }); // -> "109156be-c4fb-41ea-b1b4-efe1671c5836" ``` Example: Generate two IDs in a single buffer ```javascript var buffer = new Array(32); // (or 'new Buffer' in node.js) uuid.v4(null, buffer, 0); uuid.v4(null, buffer, 16); ``` ### uuid.parse(id[, buffer[, offset]]) ### uuid.unparse(buffer[, offset]) Parse and unparse UUIDs * `id` - (String) UUID(-like) string * `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written. Default: A new Array or Buffer is used * `offset` - (Number) Starting index in `buffer` at which to begin writing. Default: 0 Example parsing and unparsing a UUID string ```javascript var bytes = uuid.parse('797ff043-11eb-11e1-80d6-510998755d10'); // -> <Buffer 79 7f f0 43 11 eb 11 e1 80 d6 51 09 98 75 5d 10> var string = uuid.unparse(bytes); // -> '797ff043-11eb-11e1-80d6-510998755d10' ``` ### uuid.noConflict() (Browsers only) Set `uuid` property back to its previous value. Returns the node-uuid object.
Example: ```javascript var myUuid = uuid.noConflict(); myUuid.v1(); // -> '6c84fb90-12c4-11e1-840d-7b25c5ee775a' ``` ## Deprecated APIs Support for the following v1.2 APIs is available in v1.3, but is deprecated and will be removed in the next major version. ### uuid([format [, buffer [, offset]]]) uuid() has become uuid.v4(), and the `format` argument is now implicit in the `buffer` argument. (i.e. if you specify a buffer, the format is assumed to be binary). ### uuid.BufferClass The class of container created when generating binary uuid data if no buffer argument is specified. This is expected to go away, with no replacement API. ## Command Line Interface To use the executable, it's probably best to install this library globally. `npm install -g node-uuid` Usage: ``` USAGE: uuid [version] [options] options: --help Display this message and exit ``` `version` must be an RFC4122 version that is supported by this library, which is currently version 1 and version 4 (denoted by "v1" and "v4", respectively). `version` defaults to version 4 when not supplied. ### Examples ``` > uuid 3a91f950-dec8-4688-ba14-5b7bbfc7a563 ``` ``` > uuid v1 9d0b43e0-7696-11e3-964b-250efa37a98e ``` ``` > uuid v4 6790ac7c-24ac-4f98-8464-42f6d98a53ae ``` ## Testing In node.js ``` npm test ``` In Browser ``` open test/test.html ``` ### Benchmarking Requires node.js ``` npm install uuid uuid-js node benchmark/benchmark.js ``` For a more complete discussion of node-uuid performance, please see the `benchmark/README.md` file, and the [benchmark wiki](https://github.com/broofa/node-uuid/wiki/Benchmark) For browser performance [checkout the JSPerf tests](http://jsperf.com/node-uuid-performance). ## Release notes ### 1.4.0 * Improved module context detection * Removed public RNG functions ### 1.3.2 * Improve tests and handling of v1() options (Issue #24) * Expose RNG option to allow for perf testing with different generators ### 1.3.0 * Support for version 1 ids, thanks to [@ctavan](https://github.com/ctavan)! 
* Support for node.js crypto API * De-emphasizing performance in favor of a) cryptographic quality PRNGs where available and b) more manageable code npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/benchmark/0000755000000000000000000000000012631326456024137 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/bin/0000755000000000000000000000000012631326456022755 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/bower.json0000644000000000000000000000066312631326456024223 0ustar 00000000000000{ "name": "node-uuid", "version": "1.4.3", "homepage": "https://github.com/broofa/node-uuid", "authors": [ "Robert Kieffer " ], "description": "Rigorous implementation of RFC4122 (v1 and v4) UUIDs.", "main": "uuid.js", "keywords": [ "uuid", "gid", "rfc4122" ], "license": "MIT", "ignore": [ "**/.*", "node_modules", "bower_components", "test", "tests" ] } npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/component.json0000644000000000000000000000073212631326456025104 0ustar 00000000000000{ "name": "node-uuid", "repo": "broofa/node-uuid", "description": "Rigorous implementation of RFC4122 (v1 and v4) UUIDs.", "version": "1.4.3", "author": "Robert Kieffer ", "contributors": [ {"name": "Christoph Tavan ", "github": "https://github.com/ctavan"} ], "keywords": ["uuid", "guid", "rfc4122"], "dependencies": {}, "development": {}, "main": "uuid.js", "scripts": [ "uuid.js" ], "license": "MIT" } npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/package.json0000644000000000000000000002064312631326456024500 0ustar 00000000000000{ "name": "node-uuid", "description": "Rigorous implementation of RFC4122 (v1 and v4) UUIDs.", "url": "http://github.com/broofa/node-uuid", "keywords": [ "uuid", "guid", "rfc4122" ], "author": { "name": "Robert Kieffer", "email": "robert@broofa.com" }, "contributors": [ { "name": "Christoph Tavan", "email": "dev@tavan.de" } ], "bin": { "uuid": "./bin/uuid" }, "scripts": { "test": "node test/test.js" }, "lib": ".", "main": "./uuid.js", "repository": { "type": "git", "url": "git+https://github.com/broofa/node-uuid.git" }, "version": "1.4.3", "licenses": [ { "type": "MIT", "url": "https://raw.github.com/broofa/node-uuid/master/LICENSE.md" } ], "readme": "# node-uuid\n\nSimple, fast generation of [RFC4122](http://www.ietf.org/rfc/rfc4122.txt) UUIDS.\n\nFeatures:\n\n* Generate RFC4122 version 1 or version 4 UUIDs\n* Runs in node.js and all browsers.\n* Registered as a [ComponentJS](https://github.com/component/component) [component](https://github.com/component/component/wiki/Components) ('broofa/node-uuid').\n* Cryptographically strong random # generation on supporting platforms\n* 1.1K minified and gzip'ed (Want something smaller? Check this [crazy shit](https://gist.github.com/982883) out! 
)\n* [Annotated source code](http://broofa.github.com/node-uuid/docs/uuid.html)\n* Comes with a Command Line Interface for generating uuids on the command line\n\n## Getting Started\n\nInstall it in your browser:\n\n```html\n\n```\n\nOr in node.js:\n\n```\nnpm install node-uuid\n```\n\n```javascript\nvar uuid = require('node-uuid');\n```\n\nThen create some ids ...\n\n```javascript\n// Generate a v1 (time-based) id\nuuid.v1(); // -> '6c84fb90-12c4-11e1-840d-7b25c5ee775a'\n\n// Generate a v4 (random) id\nuuid.v4(); // -> '110ec58a-a0f2-4ac4-8393-c866d813b8d1'\n```\n\n## API\n\n### uuid.v1([`options` [, `buffer` [, `offset`]]])\n\nGenerate and return a RFC4122 v1 (timestamp-based) UUID.\n\n* `options` - (Object) Optional uuid state to apply. Properties may include:\n\n * `node` - (Array) Node id as Array of 6 bytes (per 4.1.6). Default: Randomly generated ID. See note 1.\n * `clockseq` - (Number between 0 - 0x3fff) RFC clock sequence. Default: An internally maintained clockseq is used.\n * `msecs` - (Number | Date) Time in milliseconds since unix Epoch. Default: The current time is used.\n * `nsecs` - (Number between 0-9999) additional time, in 100-nanosecond units. Ignored if `msecs` is unspecified. Default: internal uuid counter is used, as per 4.2.1.2.\n\n* `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written.\n* `offset` - (Number) Starting index in `buffer` at which to begin writing.\n\nReturns `buffer`, if specified, otherwise the string form of the UUID\n\nNotes:\n\n1. The randomly generated node id is only guaranteed to stay constant for the lifetime of the current JS runtime. (Future versions of this module may use persistent storage mechanisms to extend this guarantee.)\n\nExample: Generate string UUID with fully-specified options\n\n```javascript\nuuid.v1({\n node: [0x01, 0x23, 0x45, 0x67, 0x89, 0xab],\n clockseq: 0x1234,\n msecs: new Date('2011-11-01').getTime(),\n nsecs: 5678\n}); // -> \"710b962e-041c-11e1-9234-0123456789ab\"\n```\n\nExample: In-place generation of two binary IDs\n\n```javascript\n// Generate two ids in an array\nvar arr = new Array(32); // -> []\nuuid.v1(null, arr, 0); // -> [02 a2 ce 90 14 32 11 e1 85 58 0b 48 8e 4f c1 15]\nuuid.v1(null, arr, 16); // -> [02 a2 ce 90 14 32 11 e1 85 58 0b 48 8e 4f c1 15 02 a3 1c b0 14 32 11 e1 85 58 0b 48 8e 4f c1 15]\n\n// Optionally use uuid.unparse() to get stringify the ids\nuuid.unparse(buffer); // -> '02a2ce90-1432-11e1-8558-0b488e4fc115'\nuuid.unparse(buffer, 16) // -> '02a31cb0-1432-11e1-8558-0b488e4fc115'\n```\n\n### uuid.v4([`options` [, `buffer` [, `offset`]]])\n\nGenerate and return a RFC4122 v4 UUID.\n\n* `options` - (Object) Optional uuid state to apply. Properties may include:\n\n * `random` - (Number[16]) Array of 16 numbers (0-255) to use in place of randomly generated values\n * `rng` - (Function) Random # generator to use. 
Set to one of the built-in generators - `uuid.mathRNG` (all platforms), `uuid.nodeRNG` (node.js only), `uuid.whatwgRNG` (WebKit only) - or a custom function that returns an array[16] of byte values.\n\n* `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written.\n* `offset` - (Number) Starting index in `buffer` at which to begin writing.\n\nReturns `buffer`, if specified, otherwise the string form of the UUID\n\nExample: Generate string UUID with fully-specified options\n\n```javascript\nuuid.v4({\n random: [\n 0x10, 0x91, 0x56, 0xbe, 0xc4, 0xfb, 0xc1, 0xea,\n 0x71, 0xb4, 0xef, 0xe1, 0x67, 0x1c, 0x58, 0x36\n ]\n});\n// -> \"109156be-c4fb-41ea-b1b4-efe1671c5836\"\n```\n\nExample: Generate two IDs in a single buffer\n\n```javascript\nvar buffer = new Array(32); // (or 'new Buffer' in node.js)\nuuid.v4(null, buffer, 0);\nuuid.v4(null, buffer, 16);\n```\n\n### uuid.parse(id[, buffer[, offset]])\n### uuid.unparse(buffer[, offset])\n\nParse and unparse UUIDs\n\n * `id` - (String) UUID(-like) string\n * `buffer` - (Array | Buffer) Array or buffer where UUID bytes are to be written. Default: A new Array or Buffer is used\n * `offset` - (Number) Starting index in `buffer` at which to begin writing. Default: 0\n\nExample parsing and unparsing a UUID string\n\n```javascript\nvar bytes = uuid.parse('797ff043-11eb-11e1-80d6-510998755d10'); // -> \nvar string = uuid.unparse(bytes); // -> '797ff043-11eb-11e1-80d6-510998755d10'\n```\n\n### uuid.noConflict()\n\n(Browsers only) Set `uuid` property back to it's previous value.\n\nReturns the node-uuid object.\n\nExample:\n\n```javascript\nvar myUuid = uuid.noConflict();\nmyUuid.v1(); // -> '6c84fb90-12c4-11e1-840d-7b25c5ee775a'\n```\n\n## Deprecated APIs\n\nSupport for the following v1.2 APIs is available in v1.3, but is deprecated and will be removed in the next major version.\n\n### uuid([format [, buffer [, offset]]])\n\nuuid() has become uuid.v4(), and the `format` argument is now implicit in the `buffer` argument. (i.e. if you specify a buffer, the format is assumed to be binary).\n\n### uuid.BufferClass\n\nThe class of container created when generating binary uuid data if no buffer argument is specified. This is expected to go away, with no replacement API.\n\n## Command Line Interface\n\nTo use the executable, it's probably best to install this library globally.\n\n`npm install -g node-uuid`\n\nUsage:\n\n```\nUSAGE: uuid [version] [options]\n\n\noptions:\n\n--help Display this message and exit\n```\n\n`version` must be an RFC4122 version that is supported by this library, which is currently version 1 and version 4 (denoted by \"v1\" and \"v4\", respectively). 
`version` defaults to version 4 when not supplied.\n\n### Examples\n\n```\n> uuid\n3a91f950-dec8-4688-ba14-5b7bbfc7a563\n```\n\n```\n> uuid v1\n9d0b43e0-7696-11e3-964b-250efa37a98e\n```\n\n```\n> uuid v4\n6790ac7c-24ac-4f98-8464-42f6d98a53ae\n```\n\n## Testing\n\nIn node.js\n\n```\nnpm test\n```\n\nIn Browser\n\n```\nopen test/test.html\n```\n\n### Benchmarking\n\nRequires node.js\n\n```\nnpm install uuid uuid-js\nnode benchmark/benchmark.js\n```\n\nFor a more complete discussion of node-uuid performance, please see the `benchmark/README.md` file, and the [benchmark wiki](https://github.com/broofa/node-uuid/wiki/Benchmark)\n\nFor browser performance [checkout the JSPerf tests](http://jsperf.com/node-uuid-performance).\n\n## Release notes\n\n### 1.4.0\n\n* Improved module context detection\n* Removed public RNG functions\n\n### 1.3.2\n\n* Improve tests and handling of v1() options (Issue #24)\n* Expose RNG option to allow for perf testing with different generators\n\n### 1.3.0\n\n* Support for version 1 ids, thanks to [@ctavan](https://github.com/ctavan)!\n* Support for node.js crypto API\n* De-emphasizing performance in favor of a) cryptographic quality PRNGs where available and b) more manageable code\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/broofa/node-uuid/issues" }, "homepage": "https://github.com/broofa/node-uuid#readme", "_id": "node-uuid@1.4.3", "_shasum": "319bb7a56e7cb63f00b5c0cd7851cd4b4ddf1df9", "_resolved": "https://registry.npmjs.org/node-uuid/-/node-uuid-1.4.3.tgz", "_from": "node-uuid@>=1.4.3 <1.5.0" } npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/test/0000755000000000000000000000000012631326456023164 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/uuid.js0000644000000000000000000001611312631326456023513 0ustar 00000000000000// uuid.js // // Copyright (c) 2010-2012 Robert Kieffer // MIT License - http://opensource.org/licenses/mit-license.php (function() { var _global = this; // Unique ID creation requires a high quality random # generator. We feature // detect to determine the best RNG source, normalizing to a function that // returns 128-bits of randomness, since that's what's usually required var _rng; // Node.js crypto-based RNG - http://nodejs.org/docs/v0.6.2/api/crypto.html // // Moderately fast, high quality if (typeof(_global.require) == 'function') { try { var _rb = _global.require('crypto').randomBytes; _rng = _rb && function() {return _rb(16);}; } catch(e) {} } if (!_rng && _global.crypto && crypto.getRandomValues) { // WHATWG crypto-based RNG - http://wiki.whatwg.org/wiki/Crypto // // Moderately fast, high quality var _rnds8 = new Uint8Array(16); _rng = function whatwgRNG() { crypto.getRandomValues(_rnds8); return _rnds8; }; } if (!_rng) { // Math.random()-based (RNG) // // If all else fails, use Math.random(). It's fast, but is of unspecified // quality. var _rnds = new Array(16); _rng = function() { for (var i = 0, r; i < 16; i++) { if ((i & 0x03) === 0) r = Math.random() * 0x100000000; _rnds[i] = r >>> ((i & 0x03) << 3) & 0xff; } return _rnds; }; } // Buffer class to use var BufferClass = typeof(_global.Buffer) == 'function' ? 
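// prefer Node's Buffer when it exists; otherwise fall back to a plain Array (browsers)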
_global.Buffer : Array; // Maps for number <-> hex string conversion var _byteToHex = []; var _hexToByte = {}; for (var i = 0; i < 256; i++) { _byteToHex[i] = (i + 0x100).toString(16).substr(1); _hexToByte[_byteToHex[i]] = i; } // **`parse()` - Parse a UUID into its component bytes** function parse(s, buf, offset) { var i = (buf && offset) || 0, ii = 0; buf = buf || []; s.toLowerCase().replace(/[0-9a-f]{2}/g, function(oct) { if (ii < 16) { // Don't overflow! buf[i + ii++] = _hexToByte[oct]; } }); // Zero out remaining bytes if string was short while (ii < 16) { buf[i + ii++] = 0; } return buf; } // **`unparse()` - Convert UUID byte array (ala parse()) into a string** function unparse(buf, offset) { var i = offset || 0, bth = _byteToHex; return bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]] + '-' + bth[buf[i++]] + bth[buf[i++]] + '-' + bth[buf[i++]] + bth[buf[i++]] + '-' + bth[buf[i++]] + bth[buf[i++]] + '-' + bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]] + bth[buf[i++]]; } // **`v1()` - Generate time-based UUID** // // Inspired by https://github.com/LiosK/UUID.js // and http://docs.python.org/library/uuid.html // random #'s we need to init node and clockseq var _seedBytes = _rng(); // Per 4.5, create a 48-bit node id (47 random bits + multicast bit = 1) var _nodeId = [ _seedBytes[0] | 0x01, _seedBytes[1], _seedBytes[2], _seedBytes[3], _seedBytes[4], _seedBytes[5] ]; // Per 4.2.2, randomize (14 bit) clockseq var _clockseq = (_seedBytes[6] << 8 | _seedBytes[7]) & 0x3fff; // Previous uuid creation time var _lastMSecs = 0, _lastNSecs = 0; // See https://github.com/broofa/node-uuid for API details function v1(options, buf, offset) { var i = buf && offset || 0; var b = buf || []; options = options || {}; var clockseq = options.clockseq != null ? options.clockseq : _clockseq; // UUID timestamps are 100 nano-second units since the Gregorian epoch, // (1582-10-15 00:00). JS numbers aren't precise enough for this, so // time is handled internally as 'msecs' (integer milliseconds) and 'nsecs' // (100-nanoseconds offset from msecs) since unix epoch, 1970-01-01 00:00. var msecs = options.msecs != null ? options.msecs : new Date().getTime(); // Per 4.2.1.2, use count of uuid's generated during the current clock // cycle to simulate higher resolution clock var nsecs = options.nsecs != null ?
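// i.e. honor a caller-supplied nsecs, else advance the intra-millisecond counter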
options.nsecs : _lastNSecs + 1; // Time since last uuid creation (in msecs) var dt = (msecs - _lastMSecs) + (nsecs - _lastNSecs)/10000; // Per 4.2.1.2, Bump clockseq on clock regression if (dt < 0 && options.clockseq == null) { clockseq = clockseq + 1 & 0x3fff; } // Reset nsecs if clock regresses (new clockseq) or we've moved onto a new // time interval if ((dt < 0 || msecs > _lastMSecs) && options.nsecs == null) { nsecs = 0; } // Per 4.2.1.2 Throw error if too many uuids are requested if (nsecs >= 10000) { throw new Error('uuid.v1(): Can\'t create more than 10M uuids/sec'); } _lastMSecs = msecs; _lastNSecs = nsecs; _clockseq = clockseq; // Per 4.1.4 - Convert from unix epoch to Gregorian epoch msecs += 12219292800000; // `time_low` var tl = ((msecs & 0xfffffff) * 10000 + nsecs) % 0x100000000; b[i++] = tl >>> 24 & 0xff; b[i++] = tl >>> 16 & 0xff; b[i++] = tl >>> 8 & 0xff; b[i++] = tl & 0xff; // `time_mid` var tmh = (msecs / 0x100000000 * 10000) & 0xfffffff; b[i++] = tmh >>> 8 & 0xff; b[i++] = tmh & 0xff; // `time_high_and_version` b[i++] = tmh >>> 24 & 0xf | 0x10; // include version b[i++] = tmh >>> 16 & 0xff; // `clock_seq_hi_and_reserved` (Per 4.2.2 - include variant) b[i++] = clockseq >>> 8 | 0x80; // `clock_seq_low` b[i++] = clockseq & 0xff; // `node` var node = options.node || _nodeId; for (var n = 0; n < 6; n++) { b[i + n] = node[n]; } return buf ? buf : unparse(b); } // **`v4()` - Generate random UUID** // See https://github.com/broofa/node-uuid for API details function v4(options, buf, offset) { // Deprecated - 'format' argument, as supported in v1.2 var i = buf && offset || 0; if (typeof(options) == 'string') { buf = options == 'binary' ? new BufferClass(16) : null; options = null; } options = options || {}; var rnds = options.random || (options.rng || _rng)(); // Per 4.4, set bits for version and `clock_seq_hi_and_reserved` rnds[6] = (rnds[6] & 0x0f) | 0x40; rnds[8] = (rnds[8] & 0x3f) | 0x80; // Copy bytes to buffer, if provided if (buf) { for (var ii = 0; ii < 16; ii++) { buf[i + ii] = rnds[ii]; } } return buf || unparse(rnds); } // Export public API var uuid = v4; uuid.v1 = v1; uuid.v4 = v4; uuid.parse = parse; uuid.unparse = unparse; uuid.BufferClass = BufferClass; if (typeof(module) != 'undefined' && module.exports) { // Publish as node.js module module.exports = uuid; } else if (typeof define === 'function' && define.amd) { // Publish as AMD module define(function() {return uuid;}); } else { // Publish as global (in browsers) var _previousRoot = _global.uuid; // **`noConflict()` - (browser only) to reset global 'uuid' var** uuid.noConflict = function() { _global.uuid = _previousRoot; return uuid; }; _global.uuid = uuid; } }).call(this); npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/benchmark/README.md0000644000000000000000000000375712631326456025432 0ustar 00000000000000# node-uuid Benchmarks ### Results To see the results of our benchmarks visit https://github.com/broofa/node-uuid/wiki/Benchmark ### Run them yourself node-uuid comes with some benchmarks to measure performance of generating UUIDs. These can be run using node.js. node-uuid is benchmarked against some other uuid modules available through npm, namely `uuid` and `uuid-js`.
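Before running the full suite, a quick throughput check can be done in a few lines of node. This is a minimal sketch, not part of the benchmark suite: it assumes it is run from the module root (so `./uuid` resolves) and uses an arbitrary iteration count.

```javascript
// Minimal sketch: time N string-form v4 generations and report a rate.
var uuid = require('./uuid');

var N = 1e5;
var start = Date.now();
for (var i = 0; i < N; i++) {
  uuid.v4();
}
var elapsed = Date.now() - start;
console.log('uuid.v4(): ' + Math.round(N * 1000 / elapsed) + ' uuids/second');
```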
To prepare and run the benchmark, issue: ``` npm install uuid uuid-js node benchmark/benchmark.js ``` You'll see an output like this one: ``` # v4 nodeuuid.v4(): 854700 uuids/second nodeuuid.v4('binary'): 788643 uuids/second nodeuuid.v4('binary', buffer): 1336898 uuids/second uuid(): 479386 uuids/second uuid('binary'): 582072 uuids/second uuidjs.create(4): 312304 uuids/second # v1 nodeuuid.v1(): 938086 uuids/second nodeuuid.v1('binary'): 683060 uuids/second nodeuuid.v1('binary', buffer): 1644736 uuids/second uuidjs.create(1): 190621 uuids/second ``` * The `uuid()` entries are for Nikhil Marathe's [uuid module](https://bitbucket.org/nikhilm/uuidjs) which is a wrapper around the native libuuid library. * The `uuidjs()` entries are for Patrick Negri's [uuid-js module](https://github.com/pnegri/uuid-js) which is a pure javascript implementation based on [UUID.js](https://github.com/LiosK/UUID.js) by LiosK. If you want to get more reliable results, you can run the benchmark multiple times and write the output into a log file: ``` for i in {0..9}; do node benchmark/benchmark.js >> benchmark/bench_0.4.12.log; done; ``` If you're interested in how performance varies between different node versions, you can issue the above command multiple times. You can then use the shell script `bench.sh` provided in this directory to calculate the averages over all benchmark runs and draw a nice plot: ``` (cd benchmark/ && ./bench.sh) ``` This assumes you have [gnuplot](http://www.gnuplot.info/) and [ImageMagick](http://www.imagemagick.org/) installed. You'll then find a nice `bench.png` graph in the `benchmark/` directory. npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/benchmark/bench.gnu0000644000000000000000000001327612631326456025740 0ustar 00000000000000#!/opt/local/bin/gnuplot -persist # # # G N U P L O T # Version 4.4 patchlevel 3 # last modified March 2011 # System: Darwin 10.8.0 # # Copyright (C) 1986-1993, 1998, 2004, 2007-2010 # Thomas Williams, Colin Kelley and many others # # gnuplot home: http://www.gnuplot.info # faq, bugs, etc: type "help seeking-assistance" # immediate help: type "help" # plot window: hit 'h' set terminal postscript eps noenhanced defaultplex \ leveldefault color colortext \ solid linewidth 1.2 butt noclip \ palfuncparam 2000,0.003 \ "Helvetica" 14 set output 'bench.eps' unset clip points set clip one unset clip two set bar 1.000000 front set border 31 front linetype -1 linewidth 1.000 set xdata set ydata set zdata set x2data set y2data set timefmt x "%d/%m/%y,%H:%M" set timefmt y "%d/%m/%y,%H:%M" set timefmt z "%d/%m/%y,%H:%M" set timefmt x2 "%d/%m/%y,%H:%M" set timefmt y2 "%d/%m/%y,%H:%M" set timefmt cb "%d/%m/%y,%H:%M" set boxwidth set style fill empty border set style rectangle back fc lt -3 fillstyle solid 1.00 border lt -1 set style circle radius graph 0.02, first 0, 0 set dummy x,y set format x "% g" set format y "% g" set format x2 "% g" set format y2 "% g" set format z "% g" set format cb "% g" set angles radians unset grid set key title "" set key outside left top horizontal Right noreverse enhanced autotitles columnhead nobox set key noinvert samplen 4 spacing 1 width 0 height 0 set key maxcolumns 2 maxrows 0 unset label unset arrow set style increment default unset style line set style line 1 linetype 1 linewidth 2.000 pointtype 1 pointsize default pointinterval 0 unset style arrow set style histogram clustered gap 2 title offset character 0, 0, 0 unset logscale set offsets graph 0.05, 0.15, 0, 0 set pointsize 1.5 set pointintervalbox 1 set encoding default
unset polar unset parametric unset decimalsign set view 60, 30, 1, 1 set samples 100, 100 set isosamples 10, 10 set surface unset contour set clabel '%8.3g' set mapping cartesian set datafile separator whitespace unset hidden3d set cntrparam order 4 set cntrparam linear set cntrparam levels auto 5 set cntrparam points 5 set size ratio 0 1,1 set origin 0,0 set style data points set style function lines set xzeroaxis linetype -2 linewidth 1.000 set yzeroaxis linetype -2 linewidth 1.000 set zzeroaxis linetype -2 linewidth 1.000 set x2zeroaxis linetype -2 linewidth 1.000 set y2zeroaxis linetype -2 linewidth 1.000 set ticslevel 0.5 set mxtics default set mytics default set mztics default set mx2tics default set my2tics default set mcbtics default set xtics border in scale 1,0.5 mirror norotate offset character 0, 0, 0 set xtics norangelimit set xtics () set ytics border in scale 1,0.5 mirror norotate offset character 0, 0, 0 set ytics autofreq norangelimit set ztics border in scale 1,0.5 nomirror norotate offset character 0, 0, 0 set ztics autofreq norangelimit set nox2tics set noy2tics set cbtics border in scale 1,0.5 mirror norotate offset character 0, 0, 0 set cbtics autofreq norangelimit set title "" set title offset character 0, 0, 0 font "" norotate set timestamp bottom set timestamp "" set timestamp offset character 0, 0, 0 font "" norotate set rrange [ * : * ] noreverse nowriteback # (currently [8.98847e+307:-8.98847e+307] ) set autoscale rfixmin set autoscale rfixmax set trange [ * : * ] noreverse nowriteback # (currently [-5.00000:5.00000] ) set autoscale tfixmin set autoscale tfixmax set urange [ * : * ] noreverse nowriteback # (currently [-10.0000:10.0000] ) set autoscale ufixmin set autoscale ufixmax set vrange [ * : * ] noreverse nowriteback # (currently [-10.0000:10.0000] ) set autoscale vfixmin set autoscale vfixmax set xlabel "" set xlabel offset character 0, 0, 0 font "" textcolor lt -1 norotate set x2label "" set x2label offset character 0, 0, 0 font "" textcolor lt -1 norotate set xrange [ * : * ] noreverse nowriteback # (currently [-0.150000:3.15000] ) set autoscale xfixmin set autoscale xfixmax set x2range [ * : * ] noreverse nowriteback # (currently [0.00000:3.00000] ) set autoscale x2fixmin set autoscale x2fixmax set ylabel "" set ylabel offset character 0, 0, 0 font "" textcolor lt -1 rotate by -270 set y2label "" set y2label offset character 0, 0, 0 font "" textcolor lt -1 rotate by -270 set yrange [ 0.00000 : 1.90000e+06 ] noreverse nowriteback # (currently [:] ) set autoscale yfixmin set autoscale yfixmax set y2range [ * : * ] noreverse nowriteback # (currently [0.00000:1.90000e+06] ) set autoscale y2fixmin set autoscale y2fixmax set zlabel "" set zlabel offset character 0, 0, 0 font "" textcolor lt -1 norotate set zrange [ * : * ] noreverse nowriteback # (currently [-10.0000:10.0000] ) set autoscale zfixmin set autoscale zfixmax set cblabel "" set cblabel offset character 0, 0, 0 font "" textcolor lt -1 rotate by -270 set cbrange [ * : * ] noreverse nowriteback # (currently [8.98847e+307:-8.98847e+307] ) set autoscale cbfixmin set autoscale cbfixmax set zero 1e-08 set lmargin -1 set bmargin -1 set rmargin -1 set tmargin -1 set pm3d explicit at s set pm3d scansautomatic set pm3d interpolate 1,1 flush begin noftriangles nohidden3d corners2color mean set palette positive nops_allcF maxcolors 0 gamma 1.5 color model RGB set palette rgbformulae 7, 5, 15 set colorbox default set colorbox vertical origin screen 0.9, 0.2, 0 size screen 0.05, 0.6, 0 front bdefault set 
loadpath set fontpath set fit noerrorvariables GNUTERM = "aqua" plot 'bench_results.txt' using 2:xticlabel(1) w lp lw 2, '' using 3:xticlabel(1) w lp lw 2, '' using 4:xticlabel(1) w lp lw 2, '' using 5:xticlabel(1) w lp lw 2, '' using 6:xticlabel(1) w lp lw 2, '' using 7:xticlabel(1) w lp lw 2, '' using 8:xticlabel(1) w lp lw 2, '' using 9:xticlabel(1) w lp lw 2 # EOF npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/benchmark/bench.sh0000755000000000000000000000211512631326456025554 0ustar 00000000000000#!/bin/bash # for a given node version run: # for i in {0..9}; do node benchmark.js >> bench_0.6.2.log; done; PATTERNS=('nodeuuid.v1()' "nodeuuid.v1('binary'," 'nodeuuid.v4()' "nodeuuid.v4('binary'," "uuid()" "uuid('binary')" 'uuidjs.create(1)' 'uuidjs.create(4)' '140byte') FILES=(node_uuid_v1_string node_uuid_v1_buf node_uuid_v4_string node_uuid_v4_buf libuuid_v4_string libuuid_v4_binary uuidjs_v1_string uuidjs_v4_string 140byte_es) INDICES=(2 3 2 3 2 2 2 2 2) VERSIONS=$( ls bench_*.log | sed -e 's/^bench_\([0-9\.]*\)\.log/\1/' | tr "\\n" " " ) TMPJOIN="tmp_join" OUTPUT="bench_results.txt" for I in ${!FILES[*]}; do F=${FILES[$I]} P=${PATTERNS[$I]} INDEX=${INDICES[$I]} echo "version $F" > $F for V in $VERSIONS; do (VAL=$( grep "$P" bench_$V.log | LC_ALL=en_US awk '{ sum += $'$INDEX' } END { print sum/NR }' ); echo $V $VAL) >> $F done if [ $I == 0 ]; then cat $F > $TMPJOIN else join $TMPJOIN $F > $OUTPUT cp $OUTPUT $TMPJOIN fi rm $F done rm $TMPJOIN gnuplot bench.gnu convert -density 200 -resize 800x560 -flatten bench.eps bench.png rm bench.eps npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/benchmark/benchmark-native.c0000644000000000000000000000114512631326456027522 0ustar 00000000000000/* Test performance of native C UUID generation To Compile: cc -luuid benchmark-native.c -o benchmark-native */ #include <stdio.h> #include <unistd.h> #include <sys/time.h> #include <uuid/uuid.h> int main() { uuid_t myid; char buf[36+1]; int i; struct timeval t; double start, finish; gettimeofday(&t, NULL); start = t.tv_sec + t.tv_usec/1e6; int n = 2e5; for (i = 0; i < n; i++) { uuid_generate(myid); uuid_unparse(myid, buf); } gettimeofday(&t, NULL); finish = t.tv_sec + t.tv_usec/1e6; double dur = finish - start; printf("%d uuids/sec", (int)(n/dur)); return 0; } npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/benchmark/benchmark.js0000644000000000000000000000427512631326456026437 0ustar 00000000000000try { var nodeuuid = require('../uuid'); } catch (e) { console.error('node-uuid require failed - skipping tests'); } try { var uuid = require('uuid'); } catch (e) { console.error('uuid require failed - skipping tests'); } try { var uuidjs = require('uuid-js'); } catch (e) { console.error('uuid-js require failed - skipping tests'); } var N = 5e5; function rate(msg, t) { console.log(msg + ': ' + (N / (Date.now() - t) * 1e3 | 0) + ' uuids/second'); } console.log('# v4'); // node-uuid - string form if (nodeuuid) { for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4(); rate('nodeuuid.v4() - using node.js crypto RNG', t); for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4({rng: nodeuuid.mathRNG}); rate('nodeuuid.v4() - using Math.random() RNG', t); for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4('binary'); rate('nodeuuid.v4(\'binary\')', t); var buffer = new nodeuuid.BufferClass(16); for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v4('binary', buffer); rate('nodeuuid.v4(\'binary\', buffer)', t); } // libuuid - string form if (uuid) { for (var i = 0, t = Date.now(); i < N; i++) uuid(); rate('uuid()', t); for (var
i = 0, t = Date.now(); i < N; i++) uuid('binary'); rate('uuid(\'binary\')', t); } // uuid-js - string form if (uuidjs) { for (var i = 0, t = Date.now(); i < N; i++) uuidjs.create(4); rate('uuidjs.create(4)', t); } // 140byte.es for (var i = 0, t = Date.now(); i < N; i++) 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g,function(s,r){r=Math.random()*16|0;return (s=='x'?r:r&0x3|0x8).toString(16)}); rate('140byte.es_v4', t); console.log(''); console.log('# v1'); // node-uuid - v1 string form if (nodeuuid) { for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v1(); rate('nodeuuid.v1()', t); for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v1('binary'); rate('nodeuuid.v1(\'binary\')', t); var buffer = new nodeuuid.BufferClass(16); for (var i = 0, t = Date.now(); i < N; i++) nodeuuid.v1('binary', buffer); rate('nodeuuid.v1(\'binary\', buffer)', t); } // uuid-js - v1 string form if (uuidjs) { for (var i = 0, t = Date.now(); i < N; i++) uuidjs.create(1); rate('uuidjs.create(1)', t); } npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/bin/uuid0000755000000000000000000000112512631326456023650 0ustar 00000000000000#!/usr/bin/env node var path = require('path'); var uuid = require(path.join(__dirname, '..')); var arg = process.argv[2]; if ('--help' === arg) { console.log('\n USAGE: uuid [version] [options]\n\n'); console.log(' options:\n'); console.log(' --help Display this message and exit\n'); process.exit(0); } if (null == arg) { console.log(uuid()); process.exit(0); } if ('v1' !== arg && 'v4' !== arg) { console.error('Version must be RFC4122 version 1 or version 4, denoted as "v1" or "v4"'); process.exit(1); } console.log(uuid[arg]()); process.exit(0); npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/test/compare_v1.js0000644000000000000000000000271312631326456025561 0ustar 00000000000000var assert = require('assert'), nodeuuid = require('../uuid'), uuidjs = require('uuid-js'), libuuid = require('uuid').generate, util = require('util'), exec = require('child_process').exec, os = require('os'); // On Mac Os X / macports there's only the ossp-uuid package that provides uuid // On Linux there's uuid-runtime which provides uuidgen var uuidCmd = os.type() === 'Darwin' ? 
'uuid -1' : 'uuidgen -t'; function compare(ids) { console.log(ids); for (var i = 0; i < ids.length; i++) { var id = ids[i].split('-'); id = [id[2], id[1], id[0]].join(''); ids[i] = id; } var sorted = ([].concat(ids)).sort(); if (sorted.toString() !== ids.toString()) { console.log('Warning: sorted !== ids'); } else { console.log('everything in order!'); } } // Test time order of v1 uuids var ids = []; while (ids.length < 10e3) ids.push(nodeuuid.v1()); var max = 10; console.log('node-uuid:'); ids = []; for (var i = 0; i < max; i++) ids.push(nodeuuid.v1()); compare(ids); console.log(''); console.log('uuidjs:'); ids = []; for (var i = 0; i < max; i++) ids.push(uuidjs.create(1).toString()); compare(ids); console.log(''); console.log('libuuid:'); ids = []; var count = 0; var last = function() { compare(ids); } var cb = function(err, stdout, stderr) { ids.push(stdout.substring(0, stdout.length-1)); count++; if (count < max) { return next(); } last(); }; var next = function() { exec(uuidCmd, cb); }; next(); npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/test/test.html0000644000000000000000000000052412631326456025032 0ustar 00000000000000 npm_3.5.2.orig/node_modules/request/node_modules/node-uuid/test/test.js0000644000000000000000000001403012631326456024477 0ustar 00000000000000if (!this.uuid) { // node.js uuid = require('../uuid'); } // // x-platform log/assert shims // function _log(msg, type) { type = type || 'log'; if (typeof(document) != 'undefined') { document.write('
<div class="' + type + '">' + msg.replace(/\n/g, '<br />') + '</div>
      '); } if (typeof(console) != 'undefined') { var color = { log: '\033[39m', warn: '\033[33m', error: '\033[31m' }; console[type](color[type] + msg + color.log); } } function log(msg) {_log(msg, 'log');} function warn(msg) {_log(msg, 'warn');} function error(msg) {_log(msg, 'error');} function assert(res, msg) { if (!res) { error('FAIL: ' + msg); } else { log('Pass: ' + msg); } } // // Unit tests // // Verify ordering of v1 ids created with explicit times var TIME = 1321644961388; // 2011-11-18 11:36:01.388-08:00 function compare(name, ids) { ids = ids.map(function(id) { return id.split('-').reverse().join('-'); }).sort(); var sorted = ([].concat(ids)).sort(); assert(sorted.toString() == ids.toString(), name + ' have expected order'); } // Verify ordering of v1 ids created using default behavior compare('uuids with current time', [ uuid.v1(), uuid.v1(), uuid.v1(), uuid.v1(), uuid.v1() ]); // Verify ordering of v1 ids created with explicit times compare('uuids with time option', [ uuid.v1({msecs: TIME - 10*3600*1000}), uuid.v1({msecs: TIME - 1}), uuid.v1({msecs: TIME}), uuid.v1({msecs: TIME + 1}), uuid.v1({msecs: TIME + 28*24*3600*1000}) ]); assert( uuid.v1({msecs: TIME}) != uuid.v1({msecs: TIME}), 'IDs created at same msec are different' ); // Verify throw if too many ids created var thrown = false; try { uuid.v1({msecs: TIME, nsecs: 10000}); } catch (e) { thrown = true; } assert(thrown, 'Exception thrown when > 10K ids created in 1 ms'); // Verify clock regression bumps clockseq var uidt = uuid.v1({msecs: TIME}); var uidtb = uuid.v1({msecs: TIME - 1}); assert( parseInt(uidtb.split('-')[3], 16) - parseInt(uidt.split('-')[3], 16) === 1, 'Clock regression by msec increments the clockseq' ); // Verify clock regression bumps clockseq var uidtn = uuid.v1({msecs: TIME, nsecs: 10}); var uidtnb = uuid.v1({msecs: TIME, nsecs: 9}); assert( parseInt(uidtnb.split('-')[3], 16) - parseInt(uidtn.split('-')[3], 16) === 1, 'Clock regression by nsec increments the clockseq' ); // Verify explicit options produce expected id var id = uuid.v1({ msecs: 1321651533573, nsecs: 5432, clockseq: 0x385c, node: [ 0x61, 0xcd, 0x3c, 0xbb, 0x32, 0x10 ] }); assert(id == 'd9428888-122b-11e1-b85c-61cd3cbb3210', 'Explicit options produce expected id'); // Verify adjacent ids across a msec boundary are 1 time unit apart var u0 = uuid.v1({msecs: TIME, nsecs: 9999}); var u1 = uuid.v1({msecs: TIME + 1, nsecs: 0}); var before = u0.split('-')[0], after = u1.split('-')[0]; var dt = parseInt(after, 16) - parseInt(before, 16); assert(dt === 1, 'Ids spanning 1ms boundary are 100ns apart'); // // Test parse/unparse // id = '00112233445566778899aabbccddeeff'; assert(uuid.unparse(uuid.parse(id.substr(0,10))) == '00112233-4400-0000-0000-000000000000', 'Short parse'); assert(uuid.unparse(uuid.parse('(this is the uuid -> ' + id + id)) == '00112233-4455-6677-8899-aabbccddeeff', 'Dirty parse'); // // Perf tests // var generators = { v1: uuid.v1, v4: uuid.v4 }; var UUID_FORMAT = { v1: /[0-9a-f]{8}-[0-9a-f]{4}-1[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}/i, v4: /[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}/i }; var N = 1e4; // Get %'age an actual value differs from the ideal value function divergence(actual, ideal) { return Math.round(100*100*(actual - ideal)/ideal)/100; } function rate(msg, t) { log(msg + ': ' + (N / (Date.now() - t) * 1e3 | 0) + ' uuids\/second'); } for (var version in generators) { var counts = {}, max = 0; var generator = generators[version]; var format = UUID_FORMAT[version]; log('\nSanity check ' 
+ N + ' ' + version + ' uuids'); for (var i = 0, ok = 0; i < N; i++) { id = generator(); if (!format.test(id)) { throw Error(id + ' is not a valid UUID string'); } if (id != uuid.unparse(uuid.parse(id))) { assert(fail, id + ' is not a valid id'); } // Count digits for our randomness check if (version == 'v4') { var digits = id.replace(/-/g, '').split(''); for (var j = digits.length-1; j >= 0; j--) { var c = digits[j]; max = Math.max(max, counts[c] = (counts[c] || 0) + 1); } } } // Check randomness for v4 UUIDs if (version == 'v4') { // Limit that we get worried about randomness. (Purely empirical choice, this!) var limit = 2*100*Math.sqrt(1/N); log('\nChecking v4 randomness. Distribution of Hex Digits (% deviation from ideal)'); for (var i = 0; i < 16; i++) { var c = i.toString(16); var bar = '', n = counts[c], p = Math.round(n/max*100|0); // 1-3,5-8, and D-F: 1:16 odds over 30 digits var ideal = N*30/16; if (i == 4) { // 4: 1:1 odds on 1 digit, plus 1:16 odds on 30 digits ideal = N*(1 + 30/16); } else if (i >= 8 && i <= 11) { // 8-B: 1:4 odds on 1 digit, plus 1:16 odds on 30 digits ideal = N*(1/4 + 30/16); } else { // Otherwise: 1:16 odds on 30 digits ideal = N*30/16; } var d = divergence(n, ideal); // Draw bar using UTF squares (just for grins) var s = n/max*50 | 0; while (s--) bar += '='; assert(Math.abs(d) < limit, c + ' |' + bar + '| ' + counts[c] + ' (' + d + '% < ' + limit + '%)'); } } } // Perf tests for (var version in generators) { log('\nPerformance testing ' + version + ' UUIDs'); var generator = generators[version]; var buf = new uuid.BufferClass(16); for (var i = 0, t = Date.now(); i < N; i++) generator(); rate('uuid.' + version + '()', t); for (var i = 0, t = Date.now(); i < N; i++) generator('binary'); rate('uuid.' + version + '(\'binary\')', t); for (var i = 0, t = Date.now(); i < N; i++) generator('binary', buf); rate('uuid.' + version + '(\'binary\', buffer)', t); } npm_3.5.2.orig/node_modules/request/node_modules/oauth-sign/LICENSE0000644000000000000000000002166412631326456023410 0ustar 00000000000000Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. 
"Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONSnpm_3.5.2.orig/node_modules/request/node_modules/oauth-sign/README.md0000644000000000000000000000015312631326456023650 0ustar 00000000000000oauth-sign ========== OAuth 1 signing. Formerly a vendor lib in mikeal/request, now a standalone module. npm_3.5.2.orig/node_modules/request/node_modules/oauth-sign/index.js0000644000000000000000000000673412631326456024051 0ustar 00000000000000var crypto = require('crypto') , qs = require('querystring') ; function sha1 (key, body) { return crypto.createHmac('sha1', key).update(body).digest('base64') } function rsa (key, body) { return crypto.createSign("RSA-SHA1").update(body).sign(key, 'base64'); } function rfc3986 (str) { return encodeURIComponent(str) .replace(/!/g,'%21') .replace(/\*/g,'%2A') .replace(/\(/g,'%28') .replace(/\)/g,'%29') .replace(/'/g,'%27') ; } // Maps object to bi-dimensional array // Converts { foo: 'A', bar: [ 'b', 'B' ]} to // [ ['foo', 'A'], ['bar', 'b'], ['bar', 'B'] ] function map (obj) { var key, val, arr = [] for (key in obj) { val = obj[key] if (Array.isArray(val)) for (var i = 0; i < val.length; i++) arr.push([key, val[i]]) else if (typeof val === "object") for (var prop in val) arr.push([key + '[' + prop + ']', val[prop]]); else arr.push([key, val]) } return arr } // Compare function for sort function compare (a, b) { return a > b ? 1 : a < b ? -1 : 0 } function generateBase (httpMethod, base_uri, params) { // adapted from https://dev.twitter.com/docs/auth/oauth and // https://dev.twitter.com/docs/auth/creating-signature // Parameter normalization // http://tools.ietf.org/html/rfc5849#section-3.4.1.3.2 var normalized = map(params) // 1. First, the name and value of each parameter are encoded .map(function (p) { return [ rfc3986(p[0]), rfc3986(p[1] || '') ] }) // 2. The parameters are sorted by name, using ascending byte value // ordering. If two or more parameters share the same name, they // are sorted by their value. .sort(function (a, b) { return compare(a[0], b[0]) || compare(a[1], b[1]) }) // 3. The name of each parameter is concatenated to its corresponding // value using an "=" character (ASCII code 61) as a separator, even // if the value is empty. 
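//    e.g. the pair ['oauth_nonce', '7d8f3e4a'] becomes 'oauth_nonce=7d8f3e4a'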
.map(function (p) { return p.join('=') }) // 4. The sorted name/value pairs are concatenated together into a // single string by using an "&" character (ASCII code 38) as // separator. .join('&') var base = [ rfc3986(httpMethod ? httpMethod.toUpperCase() : 'GET'), rfc3986(base_uri), rfc3986(normalized) ].join('&') return base } function hmacsign (httpMethod, base_uri, params, consumer_secret, token_secret) { var base = generateBase(httpMethod, base_uri, params) var key = [ consumer_secret || '', token_secret || '' ].map(rfc3986).join('&') return sha1(key, base) } function rsasign (httpMethod, base_uri, params, private_key, token_secret) { var base = generateBase(httpMethod, base_uri, params) var key = private_key || '' return rsa(key, base) } function plaintext (consumer_secret, token_secret) { var key = [ consumer_secret || '', token_secret || '' ].map(rfc3986).join('&') return key } function sign (signMethod, httpMethod, base_uri, params, consumer_secret, token_secret) { var method var skipArgs = 1 switch (signMethod) { case 'RSA-SHA1': method = rsasign break case 'HMAC-SHA1': method = hmacsign break case 'PLAINTEXT': method = plaintext skipArgs = 4 break default: throw new Error("Signature method not supported: " + signMethod) } return method.apply(null, [].slice.call(arguments, skipArgs)) } exports.hmacsign = hmacsign exports.rsasign = rsasign exports.plaintext = plaintext exports.sign = sign exports.rfc3986 = rfc3986 npm_3.5.2.orig/node_modules/request/node_modules/oauth-sign/package.json0000644000000000000000000000304412631326456024661 0ustar 00000000000000{ "author": { "name": "Mikeal Rogers", "email": "mikeal.rogers@gmail.com", "url": "http://www.futurealoof.com" }, "name": "oauth-sign", "description": "OAuth 1 signing. Formerly a vendor lib in mikeal/request, now a standalone module.", "version": "0.8.0", "license": "Apache-2.0", "repository": { "url": "git+https://github.com/mikeal/oauth-sign.git" }, "main": "index.js", "dependencies": {}, "devDependencies": {}, "optionalDependencies": {}, "engines": { "node": "*" }, "scripts": { "test": "node test.js" }, "gitHead": "e1f2b42ff039901ce977f8e81918767d97d496b5", "bugs": { "url": "https://github.com/mikeal/oauth-sign/issues" }, "homepage": "https://github.com/mikeal/oauth-sign#readme", "_id": "oauth-sign@0.8.0", "_shasum": "938fdc875765ba527137d8aec9d178e24debc553", "_from": "oauth-sign@>=0.8.0 <0.9.0", "_npmVersion": "2.10.1", "_nodeVersion": "0.12.4", "_npmUser": { "name": "simov", "email": "simeonvelichkov@gmail.com" }, "maintainers": [ { "name": "mikeal", "email": "mikeal.rogers@gmail.com" }, { "name": "nylen", "email": "jnylen@gmail.com" }, { "name": "simov", "email": "simeonvelichkov@gmail.com" } ], "dist": { "shasum": "938fdc875765ba527137d8aec9d178e24debc553", "tarball": "http://registry.npmjs.org/oauth-sign/-/oauth-sign-0.8.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/oauth-sign/-/oauth-sign-0.8.0.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/request/node_modules/oauth-sign/test.js0000644000000000000000000000674212631326456023720 0ustar 00000000000000var oauth = require('./index') , hmacsign = oauth.hmacsign , assert = require('assert') , qs = require('querystring') ; // Tests from Twitter documentation https://dev.twitter.com/docs/auth/oauth var reqsign = hmacsign('POST', 'https://api.twitter.com/oauth/request_token', { oauth_callback: 'http://localhost:3005/the_dance/process_callback?service_provider_id=11' , oauth_consumer_key: 'GDdmIQH6jhtmLUypg82g' , oauth_nonce: 'QP70eNmVz8jvdPevU3oJD2AfF7R7odC2XJcn4XlZJqk' , oauth_signature_method: 'HMAC-SHA1' , oauth_timestamp: '1272323042' , oauth_version: '1.0' }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98") console.log(reqsign) console.log('8wUi7m5HFQy76nowoCThusfgB+Q=') assert.equal(reqsign, '8wUi7m5HFQy76nowoCThusfgB+Q=') var accsign = hmacsign('POST', 'https://api.twitter.com/oauth/access_token', { oauth_consumer_key: 'GDdmIQH6jhtmLUypg82g' , oauth_nonce: '9zWH6qe0qG7Lc1telCn7FhUbLyVdjEaL3MO5uHxn8' , oauth_signature_method: 'HMAC-SHA1' , oauth_token: '8ldIZyxQeVrFZXFOZH5tAwj6vzJYuLQpl0WUEYtWc' , oauth_timestamp: '1272323047' , oauth_verifier: 'pDNg57prOHapMbhv25RNf75lVRd6JDsni1AJJIDYoTY' , oauth_version: '1.0' }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98", "x6qpRnlEmW9JbQn4PQVVeVG8ZLPEx6A0TOebgwcuA") console.log(accsign) console.log('PUw/dHA4fnlJYM6RhXk5IU/0fCc=') assert.equal(accsign, 'PUw/dHA4fnlJYM6RhXk5IU/0fCc=') var upsign = hmacsign('POST', 'http://api.twitter.com/1/statuses/update.json', { oauth_consumer_key: "GDdmIQH6jhtmLUypg82g" , oauth_nonce: "oElnnMTQIZvqvlfXM56aBLAf5noGD0AQR3Fmi7Q6Y" , oauth_signature_method: "HMAC-SHA1" , oauth_token: "819797-Jxq8aYUDRmykzVKrgoLhXSq67TEa5ruc4GJC2rWimw" , oauth_timestamp: "1272325550" , oauth_version: "1.0" , status: 'setting up my twitter 私のさえずりを設定する' }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98", "J6zix3FfA9LofH0awS24M3HcBYXO5nI1iYe8EfBA") console.log(upsign) console.log('yOahq5m0YjDDjfjxHaXEsW9D+X0=') assert.equal(upsign, 'yOahq5m0YjDDjfjxHaXEsW9D+X0=') // handle objects in params (useful for Wordpress REST API) var upsign = hmacsign('POST', 'http://wordpress.com/wp-json', { oauth_consumer_key: "GDdmIQH6jhtmLUypg82g" , oauth_nonce: "oElnnMTQIZvqvlfXM56aBLAf5noGD0AQR3Fmi7Q6Y" , oauth_signature_method: "HMAC-SHA1" , oauth_token: "819797-Jxq8aYUDRmykzVKrgoLhXSq67TEa5ruc4GJC2rWimw" , oauth_timestamp: "1272325550" , oauth_version: "1.0" , filter: { number: "-1" } }, "MCD8BKwGdgPHvAuvgvz4EQpqDAtx89grbuNMRd7Eh98", "J6zix3FfA9LofH0awS24M3HcBYXO5nI1iYe8EfBA") console.log(upsign) console.log('YrJFBdwnjuIitGpKrxLUplcuuUQ=') assert.equal(upsign, 'YrJFBdwnjuIitGpKrxLUplcuuUQ=') // example in rfc5849 var params = qs.parse('b5=%3D%253D&a3=a&c%40=&a2=r%20b' + '&' + 'c2&a3=2+q') params.oauth_consumer_key = '9djdj82h48djs9d2' params.oauth_token = 'kkk9d7dh3k39sjv7' params.oauth_nonce = '7d8f3e4a' params.oauth_signature_method = 'HMAC-SHA1' params.oauth_timestamp = '137131201' var rfc5849sign = hmacsign('POST', 'http://example.com/request', params, "j49sk3j29djd", "dh893hdasih9") console.log(rfc5849sign) console.log('r6/TJjbCOr97/+UU0NsvSne7s5g=') assert.equal(rfc5849sign, 'r6/TJjbCOr97/+UU0NsvSne7s5g=') // PLAINTEXT var plainSign = oauth.sign('PLAINTEXT', 'GET', 'http://dummy.com', {}, 'consumer_secret', 'token_secret') console.log(plainSign) assert.equal(plainSign, 'consumer_secret&token_secret') plainSign = oauth.plaintext('consumer_secret', 'token_secret') console.log(plainSign) assert.equal(plainSign, 
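// the PLAINTEXT signature is just rfc3986(consumer_secret) + '&' + rfc3986(token_secret):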
'consumer_secret&token_secret') npm_3.5.2.orig/node_modules/request/node_modules/qs/.eslintignore0000644000000000000000000000000512631326456023435 0ustar 00000000000000dist npm_3.5.2.orig/node_modules/request/node_modules/qs/.npmignore0000644000000000000000000000027712631326456022744 0ustar 00000000000000.idea *.iml npm-debug.log dump.rdb node_modules results.tap results.xml npm-shrinkwrap.json config.json .DS_Store */.DS_Store */*/.DS_Store ._* */._* */*/._* coverage.* lib-cov complexity.md npm_3.5.2.orig/node_modules/request/node_modules/qs/.travis.yml0000644000000000000000000000010012631326456023037 0ustar 00000000000000language: node_js node_js: - 0.10 - 4.0 - 4 sudo: false npm_3.5.2.orig/node_modules/request/node_modules/qs/CHANGELOG.md0000644000000000000000000001470012631326456022552 0ustar 00000000000000 ## [**5.1.0**](https://github.com/hapijs/qs/issues?milestone=29&state=open) - [**#117**](https://github.com/hapijs/qs/issues/117) make URI encoding stringified results optional - [**#106**](https://github.com/hapijs/qs/issues/106) Add flag `skipNulls` to optionally skip null values in stringify ## [**5.0.0**](https://github.com/hapijs/qs/issues?milestone=28&state=closed) - [**#114**](https://github.com/hapijs/qs/issues/114) default allowDots to false - [**#100**](https://github.com/hapijs/qs/issues/100) include dist to npm ## [**4.0.0**](https://github.com/hapijs/qs/issues?milestone=26&state=closed) - [**#98**](https://github.com/hapijs/qs/issues/98) make returning plain objects and allowing prototype overwriting properties optional ## [**3.1.0**](https://github.com/hapijs/qs/issues?milestone=24&state=closed) - [**#89**](https://github.com/hapijs/qs/issues/89) Add option to disable "Transform dot notation to bracket notation" ## [**3.0.0**](https://github.com/hapijs/qs/issues?milestone=23&state=closed) - [**#80**](https://github.com/hapijs/qs/issues/80) qs.parse silently drops properties - [**#77**](https://github.com/hapijs/qs/issues/77) Perf boost - [**#60**](https://github.com/hapijs/qs/issues/60) Add explicit option to disable array parsing - [**#74**](https://github.com/hapijs/qs/issues/74) Bad parse when turning array into object - [**#81**](https://github.com/hapijs/qs/issues/81) Add a `filter` option - [**#68**](https://github.com/hapijs/qs/issues/68) Fixed issue with recursion and passing strings into objects. 
- [**#66**](https://github.com/hapijs/qs/issues/66) Add mixed array and object dot notation support Closes: #47 - [**#76**](https://github.com/hapijs/qs/issues/76) RFC 3986 - [**#85**](https://github.com/hapijs/qs/issues/85) No equal sign - [**#84**](https://github.com/hapijs/qs/issues/84) update license attribute ## [**2.4.1**](https://github.com/hapijs/qs/issues?milestone=20&state=closed) - [**#73**](https://github.com/hapijs/qs/issues/73) Property 'hasOwnProperty' of object # is not a function ## [**2.4.0**](https://github.com/hapijs/qs/issues?milestone=19&state=closed) - [**#70**](https://github.com/hapijs/qs/issues/70) Add arrayFormat option ## [**2.3.3**](https://github.com/hapijs/qs/issues?milestone=18&state=closed) - [**#59**](https://github.com/hapijs/qs/issues/59) make sure array indexes are >= 0, closes #57 - [**#58**](https://github.com/hapijs/qs/issues/58) make qs usable for browser loader ## [**2.3.2**](https://github.com/hapijs/qs/issues?milestone=17&state=closed) - [**#55**](https://github.com/hapijs/qs/issues/55) allow merging a string into an object ## [**2.3.1**](https://github.com/hapijs/qs/issues?milestone=16&state=closed) - [**#52**](https://github.com/hapijs/qs/issues/52) Return "undefined" and "false" instead of throwing "TypeError". ## [**2.3.0**](https://github.com/hapijs/qs/issues?milestone=15&state=closed) - [**#50**](https://github.com/hapijs/qs/issues/50) add option to omit array indices, closes #46 ## [**2.2.5**](https://github.com/hapijs/qs/issues?milestone=14&state=closed) - [**#39**](https://github.com/hapijs/qs/issues/39) Is there an alternative to Buffer.isBuffer? - [**#49**](https://github.com/hapijs/qs/issues/49) refactor utils.merge, fixes #45 - [**#41**](https://github.com/hapijs/qs/issues/41) avoid browserifying Buffer, for #39 ## [**2.2.4**](https://github.com/hapijs/qs/issues?milestone=13&state=closed) - [**#38**](https://github.com/hapijs/qs/issues/38) how to handle object keys beginning with a number ## [**2.2.3**](https://github.com/hapijs/qs/issues?milestone=12&state=closed) - [**#37**](https://github.com/hapijs/qs/issues/37) parser discards first empty value in array - [**#36**](https://github.com/hapijs/qs/issues/36) Update to lab 4.x ## [**2.2.2**](https://github.com/hapijs/qs/issues?milestone=11&state=closed) - [**#33**](https://github.com/hapijs/qs/issues/33) Error when plain object in a value - [**#34**](https://github.com/hapijs/qs/issues/34) use Object.prototype.hasOwnProperty.call instead of obj.hasOwnProperty - [**#24**](https://github.com/hapijs/qs/issues/24) Changelog? Semver? ## [**2.2.1**](https://github.com/hapijs/qs/issues?milestone=10&state=closed) - [**#32**](https://github.com/hapijs/qs/issues/32) account for circular references properly, closes #31 - [**#31**](https://github.com/hapijs/qs/issues/31) qs.parse stackoverflow on circular objects ## [**2.2.0**](https://github.com/hapijs/qs/issues?milestone=9&state=closed) - [**#26**](https://github.com/hapijs/qs/issues/26) Don't use Buffer global if it's not present - [**#30**](https://github.com/hapijs/qs/issues/30) Bug when merging non-object values into arrays - [**#29**](https://github.com/hapijs/qs/issues/29) Don't call Utils.clone at the top of Utils.merge - [**#23**](https://github.com/hapijs/qs/issues/23) Ability to not limit parameters? 
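Several of the options introduced above are easiest to see side by side. A minimal sketch (assuming a 5.x qs, where the `encode` option from #117 exists; expected output shown in comments):

```javascript
var Qs = require('qs');

// #70 added the arrayFormat option; #50 added omitting array indices.
Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'indices', encode: false });  // 'a[0]=b&a[1]=c'
Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'brackets', encode: false }); // 'a[]=b&a[]=c'
Qs.stringify({ a: ['b', 'c'] }, { indices: false, encode: false });          // 'a=b&a=c'

// #22 (2.1.0, below) enabled RegExp delimiters when parsing.
Qs.parse('a=b; c=d', { delimiter: /[;,] */ }); // { a: 'b', c: 'd' }
```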
## [**2.1.0**](https://github.com/hapijs/qs/issues?milestone=8&state=closed) - [**#22**](https://github.com/hapijs/qs/issues/22) Enable using a RegExp as delimiter ## [**2.0.0**](https://github.com/hapijs/qs/issues?milestone=7&state=closed) - [**#18**](https://github.com/hapijs/qs/issues/18) Why is there arrayLimit? - [**#20**](https://github.com/hapijs/qs/issues/20) Configurable parametersLimit - [**#21**](https://github.com/hapijs/qs/issues/21) make all limits optional, for #18, for #20 ## [**1.2.2**](https://github.com/hapijs/qs/issues?milestone=6&state=closed) - [**#19**](https://github.com/hapijs/qs/issues/19) Don't overwrite null values ## [**1.2.1**](https://github.com/hapijs/qs/issues?milestone=5&state=closed) - [**#16**](https://github.com/hapijs/qs/issues/16) ignore non-string delimiters - [**#15**](https://github.com/hapijs/qs/issues/15) Close code block ## [**1.2.0**](https://github.com/hapijs/qs/issues?milestone=4&state=closed) - [**#12**](https://github.com/hapijs/qs/issues/12) Add optional delim argument - [**#13**](https://github.com/hapijs/qs/issues/13) fix #11: flattened keys in array are now correctly parsed ## [**1.1.0**](https://github.com/hapijs/qs/issues?milestone=3&state=closed) - [**#7**](https://github.com/hapijs/qs/issues/7) Empty values of a POST array disappear after being submitted - [**#9**](https://github.com/hapijs/qs/issues/9) Should not omit equals signs (=) when value is null - [**#6**](https://github.com/hapijs/qs/issues/6) Minor grammar fix in README ## [**1.0.2**](https://github.com/hapijs/qs/issues?milestone=2&state=closed) - [**#5**](https://github.com/hapijs/qs/issues/5) array holes incorrectly copied into object on large index npm_3.5.2.orig/node_modules/request/node_modules/qs/CONTRIBUTING.md0000644000000000000000000000015112631326456023165 0ustar 00000000000000Please view our [hapijs contributing guide](https://github.com/hapijs/hapi/blob/master/CONTRIBUTING.md). npm_3.5.2.orig/node_modules/request/node_modules/qs/LICENSE0000644000000000000000000000316612631326456021752 0ustar 00000000000000Copyright (c) 2014 Nathan LaFreniere and other contributors. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
* * * The complete list of contributors can be found at: https://github.com/hapijs/qs/graphs/contributors npm_3.5.2.orig/node_modules/request/node_modules/qs/README.md0000644000000000000000000002033112631326456022215 0ustar 00000000000000# qs A querystring parsing and stringifying library with some added security. [![Build Status](https://secure.travis-ci.org/hapijs/qs.svg)](http://travis-ci.org/hapijs/qs) Lead Maintainer: [Nathan LaFreniere](https://github.com/nlf) The **qs** module was originally created and maintained by [TJ Holowaychuk](https://github.com/visionmedia/node-querystring). ## Usage ```javascript var Qs = require('qs'); var obj = Qs.parse('a=c'); // { a: 'c' } var str = Qs.stringify(obj); // 'a=c' ``` ### Parsing Objects ```javascript Qs.parse(string, [options]); ``` **qs** allows you to create nested objects within your query strings, by surrounding the name of sub-keys with square brackets `[]`. For example, the string `'foo[bar]=baz'` converts to: ```javascript { foo: { bar: 'baz' } } ``` When using the `plainObjects` option the parsed value is returned as a plain object, created via `Object.create(null)` and as such you should be aware that prototype methods will not exist on it and a user may set those names to whatever value they like: ```javascript Qs.parse('a.hasOwnProperty=b', { plainObjects: true }); // { a: { hasOwnProperty: 'b' } } ``` By default parameters that would overwrite properties on the object prototype are ignored, if you wish to keep the data from those fields either use `plainObjects` as mentioned above, or set `allowPrototypes` to `true` which will allow user input to overwrite those properties. *WARNING* It is generally a bad idea to enable this option as it can cause problems when attempting to use the properties that have been overwritten. Always be careful with this option. ```javascript Qs.parse('a.hasOwnProperty=b', { allowPrototypes: true }); // { a: { hasOwnProperty: 'b' } } ``` URI encoded strings work too: ```javascript Qs.parse('a%5Bb%5D=c'); // { a: { b: 'c' } } ``` You can also nest your objects, like `'foo[bar][baz]=foobarbaz'`: ```javascript { foo: { bar: { baz: 'foobarbaz' } } } ``` By default, when nesting objects **qs** will only parse up to 5 children deep. This means if you attempt to parse a string like `'a[b][c][d][e][f][g][h][i]=j'` your resulting object will be: ```javascript { a: { b: { c: { d: { e: { f: { '[g][h][i]': 'j' } } } } } } } ``` This depth can be overridden by passing a `depth` option to `Qs.parse(string, [options])`: ```javascript Qs.parse('a[b][c][d][e][f][g][h][i]=j', { depth: 1 }); // { a: { b: { '[c][d][e][f][g][h][i]': 'j' } } } ``` The depth limit helps mitigate abuse when **qs** is used to parse user input, and it is recommended to keep it a reasonably small number. For similar reasons, by default **qs** will only parse up to 1000 parameters. 
This can be overridden by passing a `parameterLimit` option: ```javascript Qs.parse('a=b&c=d', { parameterLimit: 1 }); // { a: 'b' } ``` An optional delimiter can also be passed: ```javascript Qs.parse('a=b;c=d', { delimiter: ';' }); // { a: 'b', c: 'd' } ``` Delimiters can be a regular expression too: ```javascript Qs.parse('a=b;c=d,e=f', { delimiter: /[;,]/ }); // { a: 'b', c: 'd', e: 'f' } ``` Option `allowDots` can be used to enable dot notation: ```javascript Qs.parse('a.b=c', { allowDots: true }); // { a: { b: 'c' } } ``` ### Parsing Arrays **qs** can also parse arrays using a similar `[]` notation: ```javascript Qs.parse('a[]=b&a[]=c'); // { a: ['b', 'c'] } ``` You may specify an index as well: ```javascript Qs.parse('a[1]=c&a[0]=b'); // { a: ['b', 'c'] } ``` Note that the only difference between an index in an array and a key in an object is that the value between the brackets must be a number to create an array. When creating arrays with specific indices, **qs** will compact a sparse array to only the existing values preserving their order: ```javascript Qs.parse('a[1]=b&a[15]=c'); // { a: ['b', 'c'] } ``` Note that an empty string is also a value, and will be preserved: ```javascript Qs.parse('a[]=&a[]=b'); // { a: ['', 'b'] } Qs.parse('a[0]=b&a[1]=&a[2]=c'); // { a: ['b', '', 'c'] } ``` **qs** will also limit specifying indices in an array to a maximum index of `20`. Any array members with an index of greater than `20` will instead be converted to an object with the index as the key: ```javascript Qs.parse('a[100]=b'); // { a: { '100': 'b' } } ``` This limit can be overridden by passing an `arrayLimit` option: ```javascript Qs.parse('a[1]=b', { arrayLimit: 0 }); // { a: { '1': 'b' } } ``` To disable array parsing entirely, set `parseArrays` to `false`. ```javascript Qs.parse('a[]=b', { parseArrays: false }); // { a: { '0': 'b' } } ``` If you mix notations, **qs** will merge the two items into an object: ```javascript Qs.parse('a[0]=b&a[b]=c'); // { a: { '0': 'b', b: 'c' } } ``` You can also create arrays of objects: ```javascript Qs.parse('a[][b]=c'); // { a: [{ b: 'c' }] } ``` ### Stringifying ```javascript Qs.stringify(object, [options]); ``` When stringifying, **qs** by default URI encodes output. Objects are stringified as you would expect: ```javascript Qs.stringify({ a: 'b' }); // 'a=b' Qs.stringify({ a: { b: 'c' } }); // 'a%5Bb%5D=c' ``` This encoding can be disabled by setting the `encode` option to `false`: ```javascript Qs.stringify({ a: { b: 'c' } }, { encode: false }); // 'a[b]=c' ``` Examples beyond this point will be shown as though the output is not URI encoded for clarity. Please note that the return values in these cases *will* be URI encoded during real usage. 
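As a quick round-trip sketch tying the parsing and stringifying halves together (output shown in comments):

```javascript
var Qs = require('qs');

var obj = { a: { b: 'c d' } };

Qs.stringify(obj);                    // 'a%5Bb%5D=c%20d' (URI encoded by default)
Qs.stringify(obj, { encode: false }); // 'a[b]=c d'

// Parsing the encoded form recovers the original structure:
Qs.parse(Qs.stringify(obj));          // { a: { b: 'c d' } }
```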
When arrays are stringified, by default they are given explicit indices: ```javascript Qs.stringify({ a: ['b', 'c', 'd'] }); // 'a[0]=b&a[1]=c&a[2]=d' ``` You may override this by setting the `indices` option to `false`: ```javascript Qs.stringify({ a: ['b', 'c', 'd'] }, { indices: false }); // 'a=b&a=c&a=d' ``` You may use the `arrayFormat` option to specify the format of the output array ```javascript Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'indices' }) // 'a[0]=b&a[1]=c' Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'brackets' }) // 'a[]=b&a[]=c' Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'repeat' }) // 'a=b&a=c' ``` Empty strings and null values will omit the value, but the equals sign (=) remains in place: ```javascript Qs.stringify({ a: '' }); // 'a=' ``` Properties that are set to `undefined` will be omitted entirely: ```javascript Qs.stringify({ a: null, b: undefined }); // 'a=' ``` The delimiter may be overridden with stringify as well: ```javascript Qs.stringify({ a: 'b', c: 'd' }, { delimiter: ';' }); // 'a=b;c=d' ``` Finally, you can use the `filter` option to restrict which keys will be included in the stringified output. If you pass a function, it will be called for each key to obtain the replacement value. Otherwise, if you pass an array, it will be used to select properties and array indices for stringification: ```javascript function filterFunc(prefix, value) { if (prefix == 'b') { // Return an `undefined` value to omit a property. return; } if (prefix == 'e[f]') { return value.getTime(); } if (prefix == 'e[g][0]') { return value * 2; } return value; } Qs.stringify({ a: 'b', c: 'd', e: { f: new Date(123), g: [2] } }, { filter: filterFunc }) // 'a=b&c=d&e[f]=123&e[g][0]=4' Qs.stringify({ a: 'b', c: 'd', e: 'f' }, { filter: ['a', 'e'] }) // 'a=b&e=f' Qs.stringify({ a: ['b', 'c', 'd'], e: 'f' }, { filter: ['a', 0, 2] }) // 'a[0]=b&a[2]=d' ``` ### Handling of `null` values By default, `null` values are treated like empty strings: ```javascript Qs.stringify({ a: null, b: '' }); // 'a=&b=' ``` Parsing does not distinguish between parameters with and without equal signs. Both are converted to empty strings. ```javascript Qs.parse('a&b=') // { a: '', b: '' } ``` To distinguish between `null` values and empty strings use the `strictNullHandling` flag. 
In the result string the `null` values have no `=` sign: ```javascript Qs.stringify({ a: null, b: '' }, { strictNullHandling: true }); // 'a&b=' ``` To parse values without `=` back to `null` use the `strictNullHandling` flag: ```javascript Qs.parse('a&b=', { strictNullHandling: true }); // { a: null, b: '' } ``` To completely skip rendering keys with `null` values, use the `skipNulls` flag: ```javascript qs.stringify({ a: 'b', c: null}, { skipNulls: true }) // 'a=b' ``` npm_3.5.2.orig/node_modules/request/node_modules/qs/bower.json0000644000000000000000000000070112631326456022746 0ustar 00000000000000{ "name": "qs", "main": "dist/qs.js", "version": "5.1.0", "homepage": "https://github.com/hapijs/qs", "authors": [ "Nathan LaFreniere " ], "description": "A querystring parser that supports nesting and arrays, with a depth limit", "keywords": [ "querystring", "qs" ], "license": "BSD-3-Clause", "ignore": [ "**/.*", "node_modules", "bower_components", "test", "tests" ] } npm_3.5.2.orig/node_modules/request/node_modules/qs/component.json0000644000000000000000000000054112631326456023634 0ustar 00000000000000{ "name": "qs", "repository": "hapijs/qs", "description": "query-string parser / stringifier with nesting support", "version": "5.1.0", "keywords": ["querystring", "query", "parser"], "main": "lib/index.js", "scripts": [ "lib/index.js", "lib/parse.js", "lib/stringify.js", "lib/utils.js" ], "license": "BSD-3-Clause" } npm_3.5.2.orig/node_modules/request/node_modules/qs/dist/0000755000000000000000000000000012631326456021702 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/qs/lib/0000755000000000000000000000000012631326456021505 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/qs/package.json0000644000000000000000000000302512631326456023225 0ustar 00000000000000{ "name": "qs", "description": "A querystring parser that supports nesting and arrays, with a depth limit", "homepage": "https://github.com/hapijs/qs", "version": "5.2.0", "repository": { "type": "git", "url": "git+https://github.com/hapijs/qs.git" }, "main": "lib/index.js", "keywords": [ "querystring", "qs" ], "engines": ">=0.10.40", "dependencies": {}, "devDependencies": { "browserify": "^10.2.1", "code": "1.x.x", "lab": "5.x.x" }, "scripts": { "test": "lab -a code -t 100 -L", "test-tap": "lab -a code -r tap -o tests.tap", "test-cov-html": "lab -a code -r html -o coverage.html", "dist": "browserify --standalone Qs lib/index.js > dist/qs.js" }, "license": "BSD-3-Clause", "gitHead": "a341cdf2fadba5ede1ce6c95c7051f6f31f37b81", "bugs": { "url": "https://github.com/hapijs/qs/issues" }, "_id": "qs@5.2.0", "_shasum": "a9f31142af468cb72b25b30136ba2456834916be", "_from": "qs@>=5.2.0 <5.3.0", "_npmVersion": "3.3.5", "_nodeVersion": "0.10.40", "_npmUser": { "name": "nlf", "email": "quitlahok@gmail.com" }, "dist": { "shasum": "a9f31142af468cb72b25b30136ba2456834916be", "tarball": "http://registry.npmjs.org/qs/-/qs-5.2.0.tgz" }, "maintainers": [ { "name": "nlf", "email": "quitlahok@gmail.com" }, { "name": "hueniverse", "email": "eran@hueniverse.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/qs/-/qs-5.2.0.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/request/node_modules/qs/test/0000755000000000000000000000000012631326456021716 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/qs/dist/qs.js0000644000000000000000000003437112631326456022673 0ustar 00000000000000(function(f){if(typeof exports==="object"&&typeof module!=="undefined"){module.exports=f()}else if(typeof define==="function"&&define.amd){define([],f)}else{var g;if(typeof window!=="undefined"){g=window}else if(typeof global!=="undefined"){g=global}else if(typeof self!=="undefined"){g=self}else{g=this}g.Qs = f()}})(function(){var define,module,exports;return (function e(t,n,r){function s(o,u){if(!n[o]){if(!t[o]){var a=typeof require=="function"&&require;if(!u&&a)return a(o,!0);if(i)return i(o,!0);var f=new Error("Cannot find module '"+o+"'");throw f.code="MODULE_NOT_FOUND",f}var l=n[o]={exports:{}};t[o][0].call(l.exports,function(e){var n=t[o][1][e];return s(n?n:e)},l,l.exports,e,t,n,r)}return n[o].exports}var i=typeof require=="function"&&require;for(var o=0;o= 0 && (options.parseArrays && index <= options.arrayLimit)) { obj = []; obj[index] = internals.parseObject(chain, val, options); } else { obj[cleanRoot] = internals.parseObject(chain, val, options); } } return obj; }; internals.parseKeys = function (key, val, options) { if (!key) { return; } // Transform dot notation to bracket notation if (options.allowDots) { key = key.replace(/\.([^\.\[]+)/g, '[$1]'); } // The regex chunks var parent = /^([^\[\]]*)/; var child = /(\[[^\[\]]*\])/g; // Get the parent var segment = parent.exec(key); // Stash the parent if it exists var keys = []; if (segment[1]) { // If we aren't using plain objects, optionally prefix keys // that would overwrite object prototype properties if (!options.plainObjects && Object.prototype.hasOwnProperty(segment[1])) { if (!options.allowPrototypes) { return; } } keys.push(segment[1]); } // Loop through children appending to the array until we hit depth var i = 0; while ((segment = child.exec(key)) !== null && i < options.depth) { ++i; if (!options.plainObjects && Object.prototype.hasOwnProperty(segment[1].replace(/\[|\]/g, ''))) { if (!options.allowPrototypes) { continue; } } keys.push(segment[1]); } // If there's a remainder, just add whatever is left if (segment) { keys.push('[' + key.slice(segment.index) + ']'); } return internals.parseObject(keys, val, options); }; module.exports = function (str, options) { options = options || {}; options.delimiter = typeof options.delimiter === 'string' || Utils.isRegExp(options.delimiter) ? options.delimiter : internals.delimiter; options.depth = typeof options.depth === 'number' ? options.depth : internals.depth; options.arrayLimit = typeof options.arrayLimit === 'number' ? options.arrayLimit : internals.arrayLimit; options.parseArrays = options.parseArrays !== false; options.allowDots = typeof options.allowDots === 'boolean' ? options.allowDots : internals.allowDots; options.plainObjects = typeof options.plainObjects === 'boolean' ? options.plainObjects : internals.plainObjects; options.allowPrototypes = typeof options.allowPrototypes === 'boolean' ? options.allowPrototypes : internals.allowPrototypes; options.parameterLimit = typeof options.parameterLimit === 'number' ? options.parameterLimit : internals.parameterLimit; options.strictNullHandling = typeof options.strictNullHandling === 'boolean' ? options.strictNullHandling : internals.strictNullHandling; if (str === '' || str === null || typeof str === 'undefined') { return options.plainObjects ? 
Object.create(null) : {}; } var tempObj = typeof str === 'string' ? internals.parseValues(str, options) : str; var obj = options.plainObjects ? Object.create(null) : {}; // Iterate over the keys and setup the new object var keys = Object.keys(tempObj); for (var i = 0, il = keys.length; i < il; ++i) { var key = keys[i]; var newObj = internals.parseKeys(key, tempObj[key], options); obj = Utils.merge(obj, newObj, options); } return Utils.compact(obj); }; },{"./utils":4}],3:[function(require,module,exports){ // Load modules var Utils = require('./utils'); // Declare internals var internals = { delimiter: '&', arrayPrefixGenerators: { brackets: function (prefix, key) { return prefix + '[]'; }, indices: function (prefix, key) { return prefix + '[' + key + ']'; }, repeat: function (prefix, key) { return prefix; } }, strictNullHandling: false, skipNulls: false, encode: true }; internals.stringify = function (obj, prefix, generateArrayPrefix, strictNullHandling, skipNulls, encode, filter) { if (typeof filter === 'function') { obj = filter(prefix, obj); } else if (Utils.isBuffer(obj)) { obj = obj.toString(); } else if (obj instanceof Date) { obj = obj.toISOString(); } else if (obj === null) { if (strictNullHandling) { return encode ? Utils.encode(prefix) : prefix; } obj = ''; } if (typeof obj === 'string' || typeof obj === 'number' || typeof obj === 'boolean') { if (encode) { return [Utils.encode(prefix) + '=' + Utils.encode(obj)]; } return [prefix + '=' + obj]; } var values = []; if (typeof obj === 'undefined') { return values; } var objKeys = Array.isArray(filter) ? filter : Object.keys(obj); for (var i = 0, il = objKeys.length; i < il; ++i) { var key = objKeys[i]; if (skipNulls && obj[key] === null) { continue; } if (Array.isArray(obj)) { values = values.concat(internals.stringify(obj[key], generateArrayPrefix(prefix, key), generateArrayPrefix, strictNullHandling, skipNulls, encode, filter)); } else { values = values.concat(internals.stringify(obj[key], prefix + '[' + key + ']', generateArrayPrefix, strictNullHandling, skipNulls, encode, filter)); } } return values; }; module.exports = function (obj, options) { options = options || {}; var delimiter = typeof options.delimiter === 'undefined' ? internals.delimiter : options.delimiter; var strictNullHandling = typeof options.strictNullHandling === 'boolean' ? options.strictNullHandling : internals.strictNullHandling; var skipNulls = typeof options.skipNulls === 'boolean' ? options.skipNulls : internals.skipNulls; var encode = typeof options.encode === 'boolean' ? options.encode : internals.encode; var objKeys; var filter; if (typeof options.filter === 'function') { filter = options.filter; obj = filter('', obj); } else if (Array.isArray(options.filter)) { objKeys = filter = options.filter; } var keys = []; if (typeof obj !== 'object' || obj === null) { return ''; } var arrayFormat; if (options.arrayFormat in internals.arrayPrefixGenerators) { arrayFormat = options.arrayFormat; } else if ('indices' in options) { arrayFormat = options.indices ? 
'indices' : 'repeat'; } else { arrayFormat = 'indices'; } var generateArrayPrefix = internals.arrayPrefixGenerators[arrayFormat]; if (!objKeys) { objKeys = Object.keys(obj); } for (var i = 0, il = objKeys.length; i < il; ++i) { var key = objKeys[i]; if (skipNulls && obj[key] === null) { continue; } keys = keys.concat(internals.stringify(obj[key], key, generateArrayPrefix, strictNullHandling, skipNulls, encode, filter)); } return keys.join(delimiter); }; },{"./utils":4}],4:[function(require,module,exports){ // Load modules // Declare internals var internals = {}; internals.hexTable = new Array(256); for (var h = 0; h < 256; ++h) { internals.hexTable[h] = '%' + ((h < 16 ? '0' : '') + h.toString(16)).toUpperCase(); } exports.arrayToObject = function (source, options) { var obj = options.plainObjects ? Object.create(null) : {}; for (var i = 0, il = source.length; i < il; ++i) { if (typeof source[i] !== 'undefined') { obj[i] = source[i]; } } return obj; }; exports.merge = function (target, source, options) { if (!source) { return target; } if (typeof source !== 'object') { if (Array.isArray(target)) { target.push(source); } else if (typeof target === 'object') { target[source] = true; } else { target = [target, source]; } return target; } if (typeof target !== 'object') { target = [target].concat(source); return target; } if (Array.isArray(target) && !Array.isArray(source)) { target = exports.arrayToObject(target, options); } var keys = Object.keys(source); for (var k = 0, kl = keys.length; k < kl; ++k) { var key = keys[k]; var value = source[key]; if (!Object.prototype.hasOwnProperty.call(target, key)) { target[key] = value; } else { target[key] = exports.merge(target[key], value, options); } } return target; }; exports.decode = function (str) { try { return decodeURIComponent(str.replace(/\+/g, ' ')); } catch (e) { return str; } }; exports.encode = function (str) { // This code was originally written by Brian White (mscdex) for the io.js core querystring library. // It has been adapted here for stricter adherence to RFC 3986 if (str.length === 0) { return str; } if (typeof str !== 'string') { str = '' + str; } var out = ''; for (var i = 0, il = str.length; i < il; ++i) { var c = str.charCodeAt(i); if (c === 0x2D || // - c === 0x2E || // . 
c === 0x5F || // _ c === 0x7E || // ~ (c >= 0x30 && c <= 0x39) || // 0-9 (c >= 0x41 && c <= 0x5A) || // a-z (c >= 0x61 && c <= 0x7A)) { // A-Z out += str[i]; continue; } if (c < 0x80) { out += internals.hexTable[c]; continue; } if (c < 0x800) { out += internals.hexTable[0xC0 | (c >> 6)] + internals.hexTable[0x80 | (c & 0x3F)]; continue; } if (c < 0xD800 || c >= 0xE000) { out += internals.hexTable[0xE0 | (c >> 12)] + internals.hexTable[0x80 | ((c >> 6) & 0x3F)] + internals.hexTable[0x80 | (c & 0x3F)]; continue; } ++i; c = 0x10000 + (((c & 0x3FF) << 10) | (str.charCodeAt(i) & 0x3FF)); out += internals.hexTable[0xF0 | (c >> 18)] + internals.hexTable[0x80 | ((c >> 12) & 0x3F)] + internals.hexTable[0x80 | ((c >> 6) & 0x3F)] + internals.hexTable[0x80 | (c & 0x3F)]; } return out; }; exports.compact = function (obj, refs) { if (typeof obj !== 'object' || obj === null) { return obj; } refs = refs || []; var lookup = refs.indexOf(obj); if (lookup !== -1) { return refs[lookup]; } refs.push(obj); if (Array.isArray(obj)) { var compacted = []; for (var i = 0, il = obj.length; i < il; ++i) { if (typeof obj[i] !== 'undefined') { compacted.push(obj[i]); } } return compacted; } var keys = Object.keys(obj); for (i = 0, il = keys.length; i < il; ++i) { var key = keys[i]; obj[key] = exports.compact(obj[key], refs); } return obj; }; exports.isRegExp = function (obj) { return Object.prototype.toString.call(obj) === '[object RegExp]'; }; exports.isBuffer = function (obj) { if (obj === null || typeof obj === 'undefined') { return false; } return !!(obj.constructor && obj.constructor.isBuffer && obj.constructor.isBuffer(obj)); }; },{}]},{},[1])(1) });npm_3.5.2.orig/node_modules/request/node_modules/qs/lib/index.js0000644000000000000000000000031012631326456023144 0ustar 00000000000000// Load modules var Stringify = require('./stringify'); var Parse = require('./parse'); // Declare internals var internals = {}; module.exports = { stringify: Stringify, parse: Parse }; npm_3.5.2.orig/node_modules/request/node_modules/qs/lib/parse.js0000644000000000000000000001236312631326456023162 0ustar 00000000000000// Load modules var Utils = require('./utils'); // Declare internals var internals = { delimiter: '&', depth: 5, arrayLimit: 20, parameterLimit: 1000, strictNullHandling: false, plainObjects: false, allowPrototypes: false, allowDots: false }; internals.parseValues = function (str, options) { var obj = {}; var parts = str.split(options.delimiter, options.parameterLimit === Infinity ? undefined : options.parameterLimit); for (var i = 0, il = parts.length; i < il; ++i) { var part = parts[i]; var pos = part.indexOf(']=') === -1 ? part.indexOf('=') : part.indexOf(']=') + 1; if (pos === -1) { obj[Utils.decode(part)] = ''; if (options.strictNullHandling) { obj[Utils.decode(part)] = null; } } else { var key = Utils.decode(part.slice(0, pos)); var val = Utils.decode(part.slice(pos + 1)); if (!Object.prototype.hasOwnProperty.call(obj, key)) { obj[key] = val; } else { obj[key] = [].concat(obj[key]).concat(val); } } } return obj; }; internals.parseObject = function (chain, val, options) { if (!chain.length) { return val; } var root = chain.shift(); var obj; if (root === '[]') { obj = []; obj = obj.concat(internals.parseObject(chain, val, options)); } else { obj = options.plainObjects ? Object.create(null) : {}; var cleanRoot = root[0] === '[' && root[root.length - 1] === ']' ? 
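            // strip the surrounding brackets so a purely numeric segment
            // (e.g. "[3]") can be recognized as an array index below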
root.slice(1, root.length - 1) : root; var index = parseInt(cleanRoot, 10); var indexString = '' + index; if (!isNaN(index) && root !== cleanRoot && indexString === cleanRoot && index >= 0 && (options.parseArrays && index <= options.arrayLimit)) { obj = []; obj[index] = internals.parseObject(chain, val, options); } else { obj[cleanRoot] = internals.parseObject(chain, val, options); } } return obj; }; internals.parseKeys = function (key, val, options) { if (!key) { return; } // Transform dot notation to bracket notation if (options.allowDots) { key = key.replace(/\.([^\.\[]+)/g, '[$1]'); } // The regex chunks var parent = /^([^\[\]]*)/; var child = /(\[[^\[\]]*\])/g; // Get the parent var segment = parent.exec(key); // Stash the parent if it exists var keys = []; if (segment[1]) { // If we aren't using plain objects, optionally prefix keys // that would overwrite object prototype properties if (!options.plainObjects && Object.prototype.hasOwnProperty(segment[1])) { if (!options.allowPrototypes) { return; } } keys.push(segment[1]); } // Loop through children appending to the array until we hit depth var i = 0; while ((segment = child.exec(key)) !== null && i < options.depth) { ++i; if (!options.plainObjects && Object.prototype.hasOwnProperty(segment[1].replace(/\[|\]/g, ''))) { if (!options.allowPrototypes) { continue; } } keys.push(segment[1]); } // If there's a remainder, just add whatever is left if (segment) { keys.push('[' + key.slice(segment.index) + ']'); } return internals.parseObject(keys, val, options); }; module.exports = function (str, options) { options = options || {}; options.delimiter = typeof options.delimiter === 'string' || Utils.isRegExp(options.delimiter) ? options.delimiter : internals.delimiter; options.depth = typeof options.depth === 'number' ? options.depth : internals.depth; options.arrayLimit = typeof options.arrayLimit === 'number' ? options.arrayLimit : internals.arrayLimit; options.parseArrays = options.parseArrays !== false; options.allowDots = typeof options.allowDots === 'boolean' ? options.allowDots : internals.allowDots; options.plainObjects = typeof options.plainObjects === 'boolean' ? options.plainObjects : internals.plainObjects; options.allowPrototypes = typeof options.allowPrototypes === 'boolean' ? options.allowPrototypes : internals.allowPrototypes; options.parameterLimit = typeof options.parameterLimit === 'number' ? options.parameterLimit : internals.parameterLimit; options.strictNullHandling = typeof options.strictNullHandling === 'boolean' ? options.strictNullHandling : internals.strictNullHandling; if (str === '' || str === null || typeof str === 'undefined') { return options.plainObjects ? Object.create(null) : {}; } var tempObj = typeof str === 'string' ? internals.parseValues(str, options) : str; var obj = options.plainObjects ? 
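            // a prototype-less object, so parsed keys such as
            // "hasOwnProperty" cannot shadow Object.prototype members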
Object.create(null) : {}; // Iterate over the keys and setup the new object var keys = Object.keys(tempObj); for (var i = 0, il = keys.length; i < il; ++i) { var key = keys[i]; var newObj = internals.parseKeys(key, tempObj[key], options); obj = Utils.merge(obj, newObj, options); } return Utils.compact(obj); }; npm_3.5.2.orig/node_modules/request/node_modules/qs/lib/stringify.js0000644000000000000000000000750212631326456024065 0ustar 00000000000000// Load modules var Utils = require('./utils'); // Declare internals var internals = { delimiter: '&', arrayPrefixGenerators: { brackets: function (prefix, key) { return prefix + '[]'; }, indices: function (prefix, key) { return prefix + '[' + key + ']'; }, repeat: function (prefix, key) { return prefix; } }, strictNullHandling: false, skipNulls: false, encode: true }; internals.stringify = function (obj, prefix, generateArrayPrefix, strictNullHandling, skipNulls, encode, filter, sort) { if (typeof filter === 'function') { obj = filter(prefix, obj); } else if (Utils.isBuffer(obj)) { obj = obj.toString(); } else if (obj instanceof Date) { obj = obj.toISOString(); } else if (obj === null) { if (strictNullHandling) { return encode ? Utils.encode(prefix) : prefix; } obj = ''; } if (typeof obj === 'string' || typeof obj === 'number' || typeof obj === 'boolean') { if (encode) { return [Utils.encode(prefix) + '=' + Utils.encode(obj)]; } return [prefix + '=' + obj]; } var values = []; if (typeof obj === 'undefined') { return values; } var objKeys; if (Array.isArray(filter)) { objKeys = filter; } else { var keys = Object.keys(obj); objKeys = sort ? keys.sort(sort) : keys; } for (var i = 0, il = objKeys.length; i < il; ++i) { var key = objKeys[i]; if (skipNulls && obj[key] === null) { continue; } if (Array.isArray(obj)) { values = values.concat(internals.stringify(obj[key], generateArrayPrefix(prefix, key), generateArrayPrefix, strictNullHandling, skipNulls, encode, filter)); } else { values = values.concat(internals.stringify(obj[key], prefix + '[' + key + ']', generateArrayPrefix, strictNullHandling, skipNulls, encode, filter)); } } return values; }; module.exports = function (obj, options) { options = options || {}; var delimiter = typeof options.delimiter === 'undefined' ? internals.delimiter : options.delimiter; var strictNullHandling = typeof options.strictNullHandling === 'boolean' ? options.strictNullHandling : internals.strictNullHandling; var skipNulls = typeof options.skipNulls === 'boolean' ? options.skipNulls : internals.skipNulls; var encode = typeof options.encode === 'boolean' ? options.encode : internals.encode; var sort = typeof options.sort === 'function' ? options.sort : null; var objKeys; var filter; if (typeof options.filter === 'function') { filter = options.filter; obj = filter('', obj); } else if (Array.isArray(options.filter)) { objKeys = filter = options.filter; } var keys = []; if (typeof obj !== 'object' || obj === null) { return ''; } var arrayFormat; if (options.arrayFormat in internals.arrayPrefixGenerators) { arrayFormat = options.arrayFormat; } else if ('indices' in options) { arrayFormat = options.indices ? 
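        // legacy boolean `indices` option, honored only when no arrayFormat
        // is given: true maps to 'indices', false to 'repeat'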
'indices' : 'repeat'; } else { arrayFormat = 'indices'; } var generateArrayPrefix = internals.arrayPrefixGenerators[arrayFormat]; if (!objKeys) { objKeys = Object.keys(obj); } if (sort) { objKeys.sort(sort); } for (var i = 0, il = objKeys.length; i < il; ++i) { var key = objKeys[i]; if (skipNulls && obj[key] === null) { continue; } keys = keys.concat(internals.stringify(obj[key], key, generateArrayPrefix, strictNullHandling, skipNulls, encode, filter, sort)); } return keys.join(delimiter); }; npm_3.5.2.orig/node_modules/request/node_modules/qs/lib/utils.js0000644000000000000000000001033712631326456023207 0ustar 00000000000000// Load modules // Declare internals var internals = {}; internals.hexTable = new Array(256); for (var h = 0; h < 256; ++h) { internals.hexTable[h] = '%' + ((h < 16 ? '0' : '') + h.toString(16)).toUpperCase(); } exports.arrayToObject = function (source, options) { var obj = options.plainObjects ? Object.create(null) : {}; for (var i = 0, il = source.length; i < il; ++i) { if (typeof source[i] !== 'undefined') { obj[i] = source[i]; } } return obj; }; exports.merge = function (target, source, options) { if (!source) { return target; } if (typeof source !== 'object') { if (Array.isArray(target)) { target.push(source); } else if (typeof target === 'object') { target[source] = true; } else { target = [target, source]; } return target; } if (typeof target !== 'object') { target = [target].concat(source); return target; } if (Array.isArray(target) && !Array.isArray(source)) { target = exports.arrayToObject(target, options); } var keys = Object.keys(source); for (var k = 0, kl = keys.length; k < kl; ++k) { var key = keys[k]; var value = source[key]; if (!Object.prototype.hasOwnProperty.call(target, key)) { target[key] = value; } else { target[key] = exports.merge(target[key], value, options); } } return target; }; exports.decode = function (str) { try { return decodeURIComponent(str.replace(/\+/g, ' ')); } catch (e) { return str; } }; exports.encode = function (str) { // This code was originally written by Brian White (mscdex) for the io.js core querystring library. // It has been adapted here for stricter adherence to RFC 3986 if (str.length === 0) { return str; } if (typeof str !== 'string') { str = '' + str; } var out = ''; for (var i = 0, il = str.length; i < il; ++i) { var c = str.charCodeAt(i); if (c === 0x2D || // - c === 0x2E || // . 
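        // (together with the ranges below, this admits exactly the RFC 3986
        // unreserved set: '-', '.', '_', '~', 0-9 (0x30-0x39),
        // A-Z (0x41-0x5A) and a-z (0x61-0x7A))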
c === 0x5F || // _ c === 0x7E || // ~ (c >= 0x30 && c <= 0x39) || // 0-9 (c >= 0x41 && c <= 0x5A) || // a-z (c >= 0x61 && c <= 0x7A)) { // A-Z out += str[i]; continue; } if (c < 0x80) { out += internals.hexTable[c]; continue; } if (c < 0x800) { out += internals.hexTable[0xC0 | (c >> 6)] + internals.hexTable[0x80 | (c & 0x3F)]; continue; } if (c < 0xD800 || c >= 0xE000) { out += internals.hexTable[0xE0 | (c >> 12)] + internals.hexTable[0x80 | ((c >> 6) & 0x3F)] + internals.hexTable[0x80 | (c & 0x3F)]; continue; } ++i; c = 0x10000 + (((c & 0x3FF) << 10) | (str.charCodeAt(i) & 0x3FF)); out += internals.hexTable[0xF0 | (c >> 18)] + internals.hexTable[0x80 | ((c >> 12) & 0x3F)] + internals.hexTable[0x80 | ((c >> 6) & 0x3F)] + internals.hexTable[0x80 | (c & 0x3F)]; } return out; }; exports.compact = function (obj, refs) { if (typeof obj !== 'object' || obj === null) { return obj; } refs = refs || []; var lookup = refs.indexOf(obj); if (lookup !== -1) { return refs[lookup]; } refs.push(obj); if (Array.isArray(obj)) { var compacted = []; for (var i = 0, il = obj.length; i < il; ++i) { if (typeof obj[i] !== 'undefined') { compacted.push(obj[i]); } } return compacted; } var keys = Object.keys(obj); for (i = 0, il = keys.length; i < il; ++i) { var key = keys[i]; obj[key] = exports.compact(obj[key], refs); } return obj; }; exports.isRegExp = function (obj) { return Object.prototype.toString.call(obj) === '[object RegExp]'; }; exports.isBuffer = function (obj) { if (obj === null || typeof obj === 'undefined') { return false; } return !!(obj.constructor && obj.constructor.isBuffer && obj.constructor.isBuffer(obj)); }; npm_3.5.2.orig/node_modules/request/node_modules/qs/test/parse.js0000644000000000000000000004146612631326456023401 0ustar 00000000000000/* eslint no-extend-native:0 */ // Load modules var Code = require('code'); var Lab = require('lab'); var Qs = require('../'); // Declare internals var internals = {}; // Test shortcuts var lab = exports.lab = Lab.script(); var expect = Code.expect; var describe = lab.experiment; var it = lab.test; describe('parse()', function () { it('parses a simple string', function (done) { expect(Qs.parse('0=foo')).to.deep.equal({ '0': 'foo' }); expect(Qs.parse('foo=c++')).to.deep.equal({ foo: 'c ' }); expect(Qs.parse('a[>=]=23')).to.deep.equal({ a: { '>=': '23' } }); expect(Qs.parse('a[<=>]==23')).to.deep.equal({ a: { '<=>': '=23' } }); expect(Qs.parse('a[==]=23')).to.deep.equal({ a: { '==': '23' } }); expect(Qs.parse('foo', { strictNullHandling: true })).to.deep.equal({ foo: null }); expect(Qs.parse('foo' )).to.deep.equal({ foo: '' }); expect(Qs.parse('foo=')).to.deep.equal({ foo: '' }); expect(Qs.parse('foo=bar')).to.deep.equal({ foo: 'bar' }); expect(Qs.parse(' foo = bar = baz ')).to.deep.equal({ ' foo ': ' bar = baz ' }); expect(Qs.parse('foo=bar=baz')).to.deep.equal({ foo: 'bar=baz' }); expect(Qs.parse('foo=bar&bar=baz')).to.deep.equal({ foo: 'bar', bar: 'baz' }); expect(Qs.parse('foo2=bar2&baz2=')).to.deep.equal({ foo2: 'bar2', baz2: '' }); expect(Qs.parse('foo=bar&baz', { strictNullHandling: true })).to.deep.equal({ foo: 'bar', baz: null }); expect(Qs.parse('foo=bar&baz')).to.deep.equal({ foo: 'bar', baz: '' }); expect(Qs.parse('cht=p3&chd=t:60,40&chs=250x100&chl=Hello|World')).to.deep.equal({ cht: 'p3', chd: 't:60,40', chs: '250x100', chl: 'Hello|World' }); done(); }); it('allows enabling dot notation', function (done) { expect(Qs.parse('a.b=c')).to.deep.equal({ 'a.b': 'c' }); expect(Qs.parse('a.b=c', { allowDots: true })).to.deep.equal({ a: { b: 'c' } }); 
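        // illustrative addition (not in the original suite): dot notation
        // nests to arbitrary depth, subject to the same `depth` limit
        expect(Qs.parse('a.b.c=d', { allowDots: true })).to.deep.equal({ a: { b: { c: 'd' } } });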
done(); }); it('parses a single nested string', function (done) { expect(Qs.parse('a[b]=c')).to.deep.equal({ a: { b: 'c' } }); done(); }); it('parses a double nested string', function (done) { expect(Qs.parse('a[b][c]=d')).to.deep.equal({ a: { b: { c: 'd' } } }); done(); }); it('defaults to a depth of 5', function (done) { expect(Qs.parse('a[b][c][d][e][f][g][h]=i')).to.deep.equal({ a: { b: { c: { d: { e: { f: { '[g][h]': 'i' } } } } } } }); done(); }); it('only parses one level when depth = 1', function (done) { expect(Qs.parse('a[b][c]=d', { depth: 1 })).to.deep.equal({ a: { b: { '[c]': 'd' } } }); expect(Qs.parse('a[b][c][d]=e', { depth: 1 })).to.deep.equal({ a: { b: { '[c][d]': 'e' } } }); done(); }); it('parses a simple array', function (done) { expect(Qs.parse('a=b&a=c')).to.deep.equal({ a: ['b', 'c'] }); done(); }); it('parses an explicit array', function (done) { expect(Qs.parse('a[]=b')).to.deep.equal({ a: ['b'] }); expect(Qs.parse('a[]=b&a[]=c')).to.deep.equal({ a: ['b', 'c'] }); expect(Qs.parse('a[]=b&a[]=c&a[]=d')).to.deep.equal({ a: ['b', 'c', 'd'] }); done(); }); it('parses a mix of simple and explicit arrays', function (done) { expect(Qs.parse('a=b&a[]=c')).to.deep.equal({ a: ['b', 'c'] }); expect(Qs.parse('a[]=b&a=c')).to.deep.equal({ a: ['b', 'c'] }); expect(Qs.parse('a[0]=b&a=c')).to.deep.equal({ a: ['b', 'c'] }); expect(Qs.parse('a=b&a[0]=c')).to.deep.equal({ a: ['b', 'c'] }); expect(Qs.parse('a[1]=b&a=c')).to.deep.equal({ a: ['b', 'c'] }); expect(Qs.parse('a=b&a[1]=c')).to.deep.equal({ a: ['b', 'c'] }); done(); }); it('parses a nested array', function (done) { expect(Qs.parse('a[b][]=c&a[b][]=d')).to.deep.equal({ a: { b: ['c', 'd'] } }); expect(Qs.parse('a[>=]=25')).to.deep.equal({ a: { '>=': '25' } }); done(); }); it('allows to specify array indices', function (done) { expect(Qs.parse('a[1]=c&a[0]=b&a[2]=d')).to.deep.equal({ a: ['b', 'c', 'd'] }); expect(Qs.parse('a[1]=c&a[0]=b')).to.deep.equal({ a: ['b', 'c'] }); expect(Qs.parse('a[1]=c')).to.deep.equal({ a: ['c'] }); done(); }); it('limits specific array indices to 20', function (done) { expect(Qs.parse('a[20]=a')).to.deep.equal({ a: ['a'] }); expect(Qs.parse('a[21]=a')).to.deep.equal({ a: { '21': 'a' } }); done(); }); it('supports keys that begin with a number', function (done) { expect(Qs.parse('a[12b]=c')).to.deep.equal({ a: { '12b': 'c' } }); done(); }); it('supports encoded = signs', function (done) { expect(Qs.parse('he%3Dllo=th%3Dere')).to.deep.equal({ 'he=llo': 'th=ere' }); done(); }); it('is ok with url encoded strings', function (done) { expect(Qs.parse('a[b%20c]=d')).to.deep.equal({ a: { 'b c': 'd' } }); expect(Qs.parse('a[b]=c%20d')).to.deep.equal({ a: { b: 'c d' } }); done(); }); it('allows brackets in the value', function (done) { expect(Qs.parse('pets=["tobi"]')).to.deep.equal({ pets: '["tobi"]' }); expect(Qs.parse('operators=[">=", "<="]')).to.deep.equal({ operators: '[">=", "<="]' }); done(); }); it('allows empty values', function (done) { expect(Qs.parse('')).to.deep.equal({}); expect(Qs.parse(null)).to.deep.equal({}); expect(Qs.parse(undefined)).to.deep.equal({}); done(); }); it('transforms arrays to objects', function (done) { expect(Qs.parse('foo[0]=bar&foo[bad]=baz')).to.deep.equal({ foo: { '0': 'bar', bad: 'baz' } }); expect(Qs.parse('foo[bad]=baz&foo[0]=bar')).to.deep.equal({ foo: { bad: 'baz', '0': 'bar' } }); expect(Qs.parse('foo[bad]=baz&foo[]=bar')).to.deep.equal({ foo: { bad: 'baz', '0': 'bar' } }); expect(Qs.parse('foo[]=bar&foo[bad]=baz')).to.deep.equal({ foo: { '0': 'bar', bad: 'baz' } 
}); expect(Qs.parse('foo[bad]=baz&foo[]=bar&foo[]=foo')).to.deep.equal({ foo: { bad: 'baz', '0': 'bar', '1': 'foo' } }); expect(Qs.parse('foo[0][a]=a&foo[0][b]=b&foo[1][a]=aa&foo[1][b]=bb')).to.deep.equal({ foo: [{ a: 'a', b: 'b' }, { a: 'aa', b: 'bb' }] }); expect(Qs.parse('a[]=b&a[t]=u&a[hasOwnProperty]=c')).to.deep.equal({ a: { '0': 'b', t: 'u', c: true } }); expect(Qs.parse('a[]=b&a[hasOwnProperty]=c&a[x]=y')).to.deep.equal({ a: { '0': 'b', '1': 'c', x: 'y' } }); done(); }); it('transforms arrays to objects (dot notation)', function (done) { expect(Qs.parse('foo[0].baz=bar&fool.bad=baz', { allowDots: true })).to.deep.equal({ foo: [{ baz: 'bar' }], fool: { bad: 'baz' } }); expect(Qs.parse('foo[0].baz=bar&fool.bad.boo=baz', { allowDots: true })).to.deep.equal({ foo: [{ baz: 'bar' }], fool: { bad: { boo: 'baz' } } }); expect(Qs.parse('foo[0][0].baz=bar&fool.bad=baz', { allowDots: true })).to.deep.equal({ foo: [[{ baz: 'bar' }]], fool: { bad: 'baz' } }); expect(Qs.parse('foo[0].baz[0]=15&foo[0].bar=2', { allowDots: true })).to.deep.equal({ foo: [{ baz: ['15'], bar: '2' }] }); expect(Qs.parse('foo[0].baz[0]=15&foo[0].baz[1]=16&foo[0].bar=2', { allowDots: true })).to.deep.equal({ foo: [{ baz: ['15', '16'], bar: '2' }] }); expect(Qs.parse('foo.bad=baz&foo[0]=bar', { allowDots: true })).to.deep.equal({ foo: { bad: 'baz', '0': 'bar' } }); expect(Qs.parse('foo.bad=baz&foo[]=bar', { allowDots: true })).to.deep.equal({ foo: { bad: 'baz', '0': 'bar' } }); expect(Qs.parse('foo[]=bar&foo.bad=baz', { allowDots: true })).to.deep.equal({ foo: { '0': 'bar', bad: 'baz' } }); expect(Qs.parse('foo.bad=baz&foo[]=bar&foo[]=foo', { allowDots: true })).to.deep.equal({ foo: { bad: 'baz', '0': 'bar', '1': 'foo' } }); expect(Qs.parse('foo[0].a=a&foo[0].b=b&foo[1].a=aa&foo[1].b=bb', { allowDots: true })).to.deep.equal({ foo: [{ a: 'a', b: 'b' }, { a: 'aa', b: 'bb' }] }); done(); }); it('can add keys to objects', function (done) { expect(Qs.parse('a[b]=c&a=d')).to.deep.equal({ a: { b: 'c', d: true } }); done(); }); it('correctly prunes undefined values when converting an array to an object', function (done) { expect(Qs.parse('a[2]=b&a[99999999]=c')).to.deep.equal({ a: { '2': 'b', '99999999': 'c' } }); done(); }); it('supports malformed uri characters', function (done) { expect(Qs.parse('{%:%}', { strictNullHandling: true })).to.deep.equal({ '{%:%}': null }); expect(Qs.parse('{%:%}=')).to.deep.equal({ '{%:%}': '' }); expect(Qs.parse('foo=%:%}')).to.deep.equal({ foo: '%:%}' }); done(); }); it('doesn\'t produce empty keys', function (done) { expect(Qs.parse('_r=1&')).to.deep.equal({ '_r': '1' }); done(); }); it('cannot access Object prototype', function (done) { Qs.parse('constructor[prototype][bad]=bad'); Qs.parse('bad[constructor][prototype][bad]=bad'); expect(typeof Object.prototype.bad).to.equal('undefined'); done(); }); it('parses arrays of objects', function (done) { expect(Qs.parse('a[][b]=c')).to.deep.equal({ a: [{ b: 'c' }] }); expect(Qs.parse('a[0][b]=c')).to.deep.equal({ a: [{ b: 'c' }] }); done(); }); it('allows for empty strings in arrays', function (done) { expect(Qs.parse('a[]=b&a[]=&a[]=c')).to.deep.equal({ a: ['b', '', 'c'] }); expect(Qs.parse('a[0]=b&a[1]&a[2]=c&a[19]=', { strictNullHandling: true })).to.deep.equal({ a: ['b', null, 'c', ''] }); expect(Qs.parse('a[0]=b&a[1]=&a[2]=c&a[19]', { strictNullHandling: true })).to.deep.equal({ a: ['b', '', 'c', null] }); expect(Qs.parse('a[]=&a[]=b&a[]=c')).to.deep.equal({ a: ['', 'b', 'c'] }); done(); }); it('compacts sparse arrays', function (done) { 
expect(Qs.parse('a[10]=1&a[2]=2')).to.deep.equal({ a: ['2', '1'] }); done(); }); it('parses semi-parsed strings', function (done) { expect(Qs.parse({ 'a[b]': 'c' })).to.deep.equal({ a: { b: 'c' } }); expect(Qs.parse({ 'a[b]': 'c', 'a[d]': 'e' })).to.deep.equal({ a: { b: 'c', d: 'e' } }); done(); }); it('parses buffers correctly', function (done) { var b = new Buffer('test'); expect(Qs.parse({ a: b })).to.deep.equal({ a: b }); done(); }); it('continues parsing when no parent is found', function (done) { expect(Qs.parse('[]=&a=b')).to.deep.equal({ '0': '', a: 'b' }); expect(Qs.parse('[]&a=b', { strictNullHandling: true })).to.deep.equal({ '0': null, a: 'b' }); expect(Qs.parse('[foo]=bar')).to.deep.equal({ foo: 'bar' }); done(); }); it('does not error when parsing a very long array', function (done) { var str = 'a[]=a'; while (Buffer.byteLength(str) < 128 * 1024) { str += '&' + str; } expect(function () { Qs.parse(str); }).to.not.throw(); done(); }); it('should not throw when a native prototype has an enumerable property', { parallel: false }, function (done) { Object.prototype.crash = ''; Array.prototype.crash = ''; expect(Qs.parse.bind(null, 'a=b')).to.not.throw(); expect(Qs.parse('a=b')).to.deep.equal({ a: 'b' }); expect(Qs.parse.bind(null, 'a[][b]=c')).to.not.throw(); expect(Qs.parse('a[][b]=c')).to.deep.equal({ a: [{ b: 'c' }] }); delete Object.prototype.crash; delete Array.prototype.crash; done(); }); it('parses a string with an alternative string delimiter', function (done) { expect(Qs.parse('a=b;c=d', { delimiter: ';' })).to.deep.equal({ a: 'b', c: 'd' }); done(); }); it('parses a string with an alternative RegExp delimiter', function (done) { expect(Qs.parse('a=b; c=d', { delimiter: /[;,] */ })).to.deep.equal({ a: 'b', c: 'd' }); done(); }); it('does not use non-splittable objects as delimiters', function (done) { expect(Qs.parse('a=b&c=d', { delimiter: true })).to.deep.equal({ a: 'b', c: 'd' }); done(); }); it('allows overriding parameter limit', function (done) { expect(Qs.parse('a=b&c=d', { parameterLimit: 1 })).to.deep.equal({ a: 'b' }); done(); }); it('allows setting the parameter limit to Infinity', function (done) { expect(Qs.parse('a=b&c=d', { parameterLimit: Infinity })).to.deep.equal({ a: 'b', c: 'd' }); done(); }); it('allows overriding array limit', function (done) { expect(Qs.parse('a[0]=b', { arrayLimit: -1 })).to.deep.equal({ a: { '0': 'b' } }); expect(Qs.parse('a[-1]=b', { arrayLimit: -1 })).to.deep.equal({ a: { '-1': 'b' } }); expect(Qs.parse('a[0]=b&a[1]=c', { arrayLimit: 0 })).to.deep.equal({ a: { '0': 'b', '1': 'c' } }); done(); }); it('allows disabling array parsing', function (done) { expect(Qs.parse('a[0]=b&a[1]=c', { parseArrays: false })).to.deep.equal({ a: { '0': 'b', '1': 'c' } }); done(); }); it('parses an object', function (done) { var input = { 'user[name]': { 'pop[bob]': 3 }, 'user[email]': null }; var expected = { 'user': { 'name': { 'pop[bob]': 3 }, 'email': null } }; var result = Qs.parse(input); expect(result).to.deep.equal(expected); done(); }); it('parses an object in dot notation', function (done) { var input = { 'user.name': { 'pop[bob]': 3 }, 'user.email.': null }; var expected = { 'user': { 'name': { 'pop[bob]': 3 }, 'email': null } }; var result = Qs.parse(input, { allowDots: true }); expect(result).to.deep.equal(expected); done(); }); it('parses an object and not child values', function (done) { var input = { 'user[name]': { 'pop[bob]': { 'test': 3 } }, 'user[email]': null }; var expected = { 'user': { 'name': { 'pop[bob]': { 'test': 3 } }, 
'email': null } }; var result = Qs.parse(input); expect(result).to.deep.equal(expected); done(); }); it('does not blow up when Buffer global is missing', function (done) { var tempBuffer = global.Buffer; delete global.Buffer; var result = Qs.parse('a=b&c=d'); global.Buffer = tempBuffer; expect(result).to.deep.equal({ a: 'b', c: 'd' }); done(); }); it('does not crash when parsing circular references', function (done) { var a = {}; a.b = a; var parsed; expect(function () { parsed = Qs.parse({ 'foo[bar]': 'baz', 'foo[baz]': a }); }).to.not.throw(); expect(parsed).to.contain('foo'); expect(parsed.foo).to.contain('bar', 'baz'); expect(parsed.foo.bar).to.equal('baz'); expect(parsed.foo.baz).to.deep.equal(a); done(); }); it('parses plain objects correctly', function (done) { var a = Object.create(null); a.b = 'c'; expect(Qs.parse(a)).to.deep.equal({ b: 'c' }); var result = Qs.parse({ a: a }); expect(result).to.contain('a'); expect(result.a).to.deep.equal(a); done(); }); it('parses dates correctly', function (done) { var now = new Date(); expect(Qs.parse({ a: now })).to.deep.equal({ a: now }); done(); }); it('parses regular expressions correctly', function (done) { var re = /^test$/; expect(Qs.parse({ a: re })).to.deep.equal({ a: re }); done(); }); it('can allow overwriting prototype properties', function (done) { expect(Qs.parse('a[hasOwnProperty]=b', { allowPrototypes: true })).to.deep.equal({ a: { hasOwnProperty: 'b' } }, { prototype: false }); expect(Qs.parse('hasOwnProperty=b', { allowPrototypes: true })).to.deep.equal({ hasOwnProperty: 'b' }, { prototype: false }); done(); }); it('can return plain objects', function (done) { var expected = Object.create(null); expected.a = Object.create(null); expected.a.b = 'c'; expected.a.hasOwnProperty = 'd'; expect(Qs.parse('a[b]=c&a[hasOwnProperty]=d', { plainObjects: true })).to.deep.equal(expected); expect(Qs.parse(null, { plainObjects: true })).to.deep.equal(Object.create(null)); var expectedArray = Object.create(null); expectedArray.a = Object.create(null); expectedArray.a['0'] = 'b'; expectedArray.a.c = 'd'; expect(Qs.parse('a[]=b&a[c]=d', { plainObjects: true })).to.deep.equal(expectedArray); done(); }); }); npm_3.5.2.orig/node_modules/request/node_modules/qs/test/stringify.js0000644000000000000000000002274112631326456024300 0ustar 00000000000000/* eslint no-extend-native:0 */ // Load modules var Code = require('code'); var Lab = require('lab'); var Qs = require('../'); // Declare internals var internals = {}; // Test shortcuts var lab = exports.lab = Lab.script(); var expect = Code.expect; var describe = lab.experiment; var it = lab.test; describe('stringify()', function () { it('stringifies a querystring object', function (done) { expect(Qs.stringify({ a: 'b' })).to.equal('a=b'); expect(Qs.stringify({ a: 1 })).to.equal('a=1'); expect(Qs.stringify({ a: 1, b: 2 })).to.equal('a=1&b=2'); expect(Qs.stringify({ a: 'A_Z' })).to.equal('a=A_Z'); expect(Qs.stringify({ a: '€' })).to.equal('a=%E2%82%AC'); expect(Qs.stringify({ a: '' })).to.equal('a=%EE%80%80'); expect(Qs.stringify({ a: 'א' })).to.equal('a=%D7%90'); expect(Qs.stringify({ a: '𐐷' })).to.equal('a=%F0%90%90%B7'); done(); }); it('stringifies a nested object', function (done) { expect(Qs.stringify({ a: { b: 'c' } })).to.equal('a%5Bb%5D=c'); expect(Qs.stringify({ a: { b: { c: { d: 'e' } } } })).to.equal('a%5Bb%5D%5Bc%5D%5Bd%5D=e'); done(); }); it('stringifies an array value', function (done) { expect(Qs.stringify({ a: ['b', 'c', 'd'] })).to.equal('a%5B0%5D=b&a%5B1%5D=c&a%5B2%5D=d'); done(); }); 
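    // illustrative addition (not part of the original suite): the three
    // arrayFormat modes side by side; `encode: false` keeps the expected
    // strings readable.
    it('illustrates the three arrayFormat modes', function (done) {

        expect(Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'indices', encode: false })).to.equal('a[0]=b&a[1]=c');
        expect(Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'brackets', encode: false })).to.equal('a[]=b&a[]=c');
        expect(Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'repeat', encode: false })).to.equal('a=b&a=c');
        done();
    });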
it('omits nulls when asked', function (done) { expect(Qs.stringify({ a: 'b', c: null }, { skipNulls: true })).to.equal('a=b'); done(); }); it('omits nested nulls when asked', function (done) { expect(Qs.stringify({ a: { b: 'c', d: null } }, { skipNulls: true })).to.equal('a%5Bb%5D=c'); done(); }); it('omits array indices when asked', function (done) { expect(Qs.stringify({ a: ['b', 'c', 'd'] }, { indices: false })).to.equal('a=b&a=c&a=d'); done(); }); it('stringifies a nested array value', function (done) { expect(Qs.stringify({ a: { b: ['c', 'd'] } })).to.equal('a%5Bb%5D%5B0%5D=c&a%5Bb%5D%5B1%5D=d'); done(); }); it('stringifies an object inside an array', function (done) { expect(Qs.stringify({ a: [{ b: 'c' }] })).to.equal('a%5B0%5D%5Bb%5D=c'); expect(Qs.stringify({ a: [{ b: { c: [1] } }] })).to.equal('a%5B0%5D%5Bb%5D%5Bc%5D%5B0%5D=1'); done(); }); it('does not omit object keys when indices = false', function (done) { expect(Qs.stringify({ a: [{ b: 'c' }] }, { indices: false })).to.equal('a%5Bb%5D=c'); done(); }); it('uses indices notation for arrays when indices=true', function (done) { expect(Qs.stringify({ a: ['b', 'c'] }, { indices: true })).to.equal('a%5B0%5D=b&a%5B1%5D=c'); done(); }); it('uses indices notation for arrays when no arrayFormat is specified', function (done) { expect(Qs.stringify({ a: ['b', 'c'] })).to.equal('a%5B0%5D=b&a%5B1%5D=c'); done(); }); it('uses indices notation for arrays when arrayFormat=indices', function (done) { expect(Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'indices' })).to.equal('a%5B0%5D=b&a%5B1%5D=c'); done(); }); it('uses repeat notation for arrays when arrayFormat=repeat', function (done) { expect(Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'repeat' })).to.equal('a=b&a=c'); done(); }); it('uses brackets notation for arrays when arrayFormat=brackets', function (done) { expect(Qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'brackets' })).to.equal('a%5B%5D=b&a%5B%5D=c'); done(); }); it('stringifies a complicated object', function (done) { expect(Qs.stringify({ a: { b: 'c', d: 'e' } })).to.equal('a%5Bb%5D=c&a%5Bd%5D=e'); done(); }); it('stringifies an empty value', function (done) { expect(Qs.stringify({ a: '' })).to.equal('a='); expect(Qs.stringify({ a: null }, { strictNullHandling: true })).to.equal('a'); expect(Qs.stringify({ a: '', b: '' })).to.equal('a=&b='); expect(Qs.stringify({ a: null, b: '' }, { strictNullHandling: true })).to.equal('a&b='); expect(Qs.stringify({ a: { b: '' } })).to.equal('a%5Bb%5D='); expect(Qs.stringify({ a: { b: null } }, { strictNullHandling: true })).to.equal('a%5Bb%5D'); expect(Qs.stringify({ a: { b: null } }, { strictNullHandling: false })).to.equal('a%5Bb%5D='); done(); }); it('stringifies an empty object', function (done) { var obj = Object.create(null); obj.a = 'b'; expect(Qs.stringify(obj)).to.equal('a=b'); done(); }); it('returns an empty string for invalid input', function (done) { expect(Qs.stringify(undefined)).to.equal(''); expect(Qs.stringify(false)).to.equal(''); expect(Qs.stringify(null)).to.equal(''); expect(Qs.stringify('')).to.equal(''); done(); }); it('stringifies an object with an empty object as a child', function (done) { var obj = { a: Object.create(null) }; obj.a.b = 'c'; expect(Qs.stringify(obj)).to.equal('a%5Bb%5D=c'); done(); }); it('drops keys with a value of undefined', function (done) { expect(Qs.stringify({ a: undefined })).to.equal(''); expect(Qs.stringify({ a: { b: undefined, c: null } }, { strictNullHandling: true })).to.equal('a%5Bc%5D'); expect(Qs.stringify({ a: { b:
undefined, c: null } }, { strictNullHandling: false })).to.equal('a%5Bc%5D='); expect(Qs.stringify({ a: { b: undefined, c: '' } })).to.equal('a%5Bc%5D='); done(); }); it('url encodes values', function (done) { expect(Qs.stringify({ a: 'b c' })).to.equal('a=b%20c'); done(); }); it('stringifies a date', function (done) { var now = new Date(); var str = 'a=' + encodeURIComponent(now.toISOString()); expect(Qs.stringify({ a: now })).to.equal(str); done(); }); it('stringifies the weird object from qs', function (done) { expect(Qs.stringify({ 'my weird field': '~q1!2"\'w$5&7/z8)?' })).to.equal('my%20weird%20field=~q1%212%22%27w%245%267%2Fz8%29%3F'); done(); }); it('skips properties that are part of the object prototype', function (done) { Object.prototype.crash = 'test'; expect(Qs.stringify({ a: 'b' })).to.equal('a=b'); expect(Qs.stringify({ a: { b: 'c' } })).to.equal('a%5Bb%5D=c'); delete Object.prototype.crash; done(); }); it('stringifies boolean values', function (done) { expect(Qs.stringify({ a: true })).to.equal('a=true'); expect(Qs.stringify({ a: { b: true } })).to.equal('a%5Bb%5D=true'); expect(Qs.stringify({ b: false })).to.equal('b=false'); expect(Qs.stringify({ b: { c: false } })).to.equal('b%5Bc%5D=false'); done(); }); it('stringifies buffer values', function (done) { expect(Qs.stringify({ a: new Buffer('test') })).to.equal('a=test'); expect(Qs.stringify({ a: { b: new Buffer('test') } })).to.equal('a%5Bb%5D=test'); done(); }); it('stringifies an object using an alternative delimiter', function (done) { expect(Qs.stringify({ a: 'b', c: 'd' }, { delimiter: ';' })).to.equal('a=b;c=d'); done(); }); it('doesn\'t blow up when Buffer global is missing', function (done) { var tempBuffer = global.Buffer; delete global.Buffer; var result = Qs.stringify({ a: 'b', c: 'd' }); global.Buffer = tempBuffer; expect(result).to.equal('a=b&c=d'); done(); }); it('selects properties when filter=array', function (done) { expect(Qs.stringify({ a: 'b' }, { filter: ['a'] })).to.equal('a=b'); expect(Qs.stringify({ a: 1 }, { filter: [] })).to.equal(''); expect(Qs.stringify({ a: { b: [1, 2, 3, 4], c: 'd' }, c: 'f' }, { filter: ['a', 'b', 0, 2] })).to.equal('a%5Bb%5D%5B0%5D=1&a%5Bb%5D%5B2%5D=3'); done(); }); it('supports custom representations when filter=function', function (done) { var calls = 0; var obj = { a: 'b', c: 'd', e: { f: new Date(1257894000000) } }; var filterFunc = function (prefix, value) { calls++; if (calls === 1) { expect(prefix).to.be.empty(); expect(value).to.equal(obj); } else if (prefix === 'c') { return; } else if (value instanceof Date) { expect(prefix).to.equal('e[f]'); return value.getTime(); } return value; }; expect(Qs.stringify(obj, { filter: filterFunc })).to.equal('a=b&e%5Bf%5D=1257894000000'); expect(calls).to.equal(5); done(); }); it('can disable uri encoding', function (done) { expect(Qs.stringify({ a: 'b' }, { encode: false })).to.equal('a=b'); expect(Qs.stringify({ a: { b: 'c' } }, { encode: false })).to.equal('a[b]=c'); expect(Qs.stringify({ a: 'b', c: null }, { strictNullHandling: true, encode: false })).to.equal('a=b&c'); done(); }); it('can sort the keys', function (done) { var sort = function alphabeticalSort (a, b) { return a.localeCompare(b); }; expect(Qs.stringify({ a: 'c', z: 'y', b : 'f' }, { sort : sort })).to.equal('a=c&b=f&z=y'); expect(Qs.stringify({ a: 'c', z: { j: 'a', i:'b' }, b : 'f' }, { sort : sort })).to.equal('a=c&b=f&z%5Bi%5D=b&z%5Bj%5D=a'); done(); }); }); 
npm_3.5.2.orig/node_modules/request/node_modules/qs/test/utils.js0000644000000000000000000000077312631326456023423 0ustar 00000000000000// Load modules var Code = require('code'); var Lab = require('lab'); var Utils = require('../lib/utils'); // Declare internals var internals = {}; // Test shortcuts var lab = exports.lab = Lab.script(); var expect = Code.expect; var describe = lab.experiment; var it = lab.test; describe('merge()', function () { it('can merge two objects with the same key', function (done) { expect(Utils.merge({ a: 'b' }, { a: 'c' })).to.deep.equal({ a: ['b', 'c'] }); done(); }); }); npm_3.5.2.orig/node_modules/request/node_modules/stringstream/.npmignore0000644000000000000000000000014012631326456025030 0ustar 00000000000000lib-cov *.seed *.log *.csv *.dat *.out *.pid *.gz pids logs results node_modules npm-debug.lognpm_3.5.2.orig/node_modules/request/node_modules/stringstream/.travis.yml0000644000000000000000000000005312631326456025145 0ustar 00000000000000language: node_js node_js: - 0.4 - 0.6 npm_3.5.2.orig/node_modules/request/node_modules/stringstream/LICENSE.txt0000644000000000000000000000025212631326456024660 0ustar 00000000000000Copyright 2012 Michael Hart (michael.hart.au@gmail.com) This project is free software released under the MIT license: http://www.opensource.org/licenses/mit-license.php npm_3.5.2.orig/node_modules/request/node_modules/stringstream/README.md0000644000000000000000000000204612631326456024317 0ustar 00000000000000# Decode streams into strings The Right Way(tm) ```javascript var fs = require('fs') var zlib = require('zlib') var strs = require('stringstream') var utf8Stream = fs.createReadStream('massiveLogFile.gz') .pipe(zlib.createGunzip()) .pipe(strs('utf8')) ``` No need to deal with `setEncoding()` weirdness, just compose streams like they were supposed to be! Handles input and output encoding: ```javascript // Stream from utf8 to hex to base64... Why not, ay. var hex64Stream = fs.createReadStream('myFile') .pipe(strs('utf8', 'hex')) .pipe(strs('hex', 'base64')) ``` Also deals with `base64` output correctly by aligning each emitted data chunk so that there are no dangling `=` characters: ```javascript var stream = fs.createReadStream('myFile').pipe(strs('base64')) var base64Str = '' stream.on('data', function(data) { base64Str += data }) stream.on('end', function() { console.log('My base64 encoded file is: ' + base64Str) // Wouldn't work with setEncoding() console.log('Original file is: ' + new Buffer(base64Str, 'base64')) }) ``` npm_3.5.2.orig/node_modules/request/node_modules/stringstream/example.js0000644000000000000000000000145612631326456025035 0ustar 00000000000000var fs = require('fs') var zlib = require('zlib') var strs = require('stringstream') var utf8Stream = fs.createReadStream('massiveLogFile.gz') .pipe(zlib.createGunzip()) .pipe(strs('utf8')) utf8Stream.pipe(process.stdout) // Stream from utf8 to hex to base64... Why not, ay. 
var hex64Stream = fs.createReadStream('myFile') .pipe(strs('utf8', 'hex')) .pipe(strs('hex', 'base64')) hex64Stream.pipe(process.stdout) // Deals with base64 correctly by aligning chunks var stream = fs.createReadStream('myFile').pipe(strs('base64')) var base64Str = '' stream.on('data', function(data) { base64Str += data }) stream.on('end', function() { console.log('My base64 encoded file is: ' + base64Str) // Wouldn't work with setEncoding() console.log('Original file is: ' + new Buffer(base64Str, 'base64')) }) npm_3.5.2.orig/node_modules/request/node_modules/stringstream/package.json0000644000000000000000000000365612631326456025336 0ustar 00000000000000{ "name": "stringstream", "version": "0.0.4", "description": "Encode and decode streams into string streams", "author": { "name": "Michael Hart", "email": "michael.hart.au@gmail.com", "url": "http://github.com/mhart" }, "main": "stringstream.js", "keywords": [ "string", "stream", "base64", "gzip" ], "repository": { "type": "git", "url": "git+https://github.com/mhart/StringStream.git" }, "license": "MIT", "readme": "# Decode streams into strings The Right Way(tm)\n\n```javascript\nvar fs = require('fs')\nvar zlib = require('zlib')\nvar strs = require('stringstream')\n\nvar utf8Stream = fs.createReadStream('massiveLogFile.gz')\n .pipe(zlib.createGunzip())\n .pipe(strs('utf8'))\n```\n\nNo need to deal with `setEncoding()` weirdness, just compose streams\nlike they were supposed to be!\n\nHandles input and output encoding:\n\n```javascript\n// Stream from utf8 to hex to base64... Why not, ay.\nvar hex64Stream = fs.createReadStream('myFile')\n .pipe(strs('utf8', 'hex'))\n .pipe(strs('hex', 'base64'))\n```\n\nAlso deals with `base64` output correctly by aligning each emitted data\nchunk so that there are no dangling `=` characters:\n\n```javascript\nvar stream = fs.createReadStream('myFile').pipe(strs('base64'))\n\nvar base64Str = ''\n\nstream.on('data', function(data) { base64Str += data })\nstream.on('end', function() {\n console.log('My base64 encoded file is: ' + base64Str) // Wouldn't work with setEncoding()\n console.log('Original file is: ' + new Buffer(base64Str, 'base64'))\n})\n```\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/mhart/StringStream/issues" }, "homepage": "https://github.com/mhart/StringStream#readme", "_id": "stringstream@0.0.4", "_shasum": "0f0e3423f942960b5692ac324a57dd093bc41a92", "_resolved": "https://registry.npmjs.org/stringstream/-/stringstream-0.0.4.tgz", "_from": "stringstream@>=0.0.4 <0.1.0" } npm_3.5.2.orig/node_modules/request/node_modules/stringstream/stringstream.js0000644000000000000000000000535012631326456026121 0ustar 00000000000000var util = require('util') var Stream = require('stream') var StringDecoder = require('string_decoder').StringDecoder module.exports = StringStream module.exports.AlignedStringDecoder = AlignedStringDecoder function StringStream(from, to) { if (!(this instanceof StringStream)) return new StringStream(from, to) Stream.call(this) if (from == null) from = 'utf8' this.readable = this.writable = true this.paused = false this.toEncoding = (to == null ? from : to) this.fromEncoding = (to == null ? 
'' : from) this.decoder = new AlignedStringDecoder(this.toEncoding) } util.inherits(StringStream, Stream) StringStream.prototype.write = function(data) { if (!this.writable) { var err = new Error('stream not writable') err.code = 'EPIPE' this.emit('error', err) return false } if (this.fromEncoding) { if (Buffer.isBuffer(data)) data = data.toString() data = new Buffer(data, this.fromEncoding) } var string = this.decoder.write(data) if (string.length) this.emit('data', string) return !this.paused } StringStream.prototype.flush = function() { if (this.decoder.flush) { var string = this.decoder.flush() if (string.length) this.emit('data', string) } } StringStream.prototype.end = function() { if (!this.writable && !this.readable) return this.flush() this.emit('end') this.writable = this.readable = false this.destroy() } StringStream.prototype.destroy = function() { this.decoder = null this.writable = this.readable = false this.emit('close') } StringStream.prototype.pause = function() { this.paused = true } StringStream.prototype.resume = function () { if (this.paused) this.emit('drain') this.paused = false } function AlignedStringDecoder(encoding) { StringDecoder.call(this, encoding) switch (this.encoding) { case 'base64': this.write = alignedWrite this.alignedBuffer = new Buffer(3) this.alignedBytes = 0 break } } util.inherits(AlignedStringDecoder, StringDecoder) AlignedStringDecoder.prototype.flush = function() { if (!this.alignedBuffer || !this.alignedBytes) return '' var leftover = this.alignedBuffer.toString(this.encoding, 0, this.alignedBytes) this.alignedBytes = 0 return leftover } function alignedWrite(buffer) { var rem = (this.alignedBytes + buffer.length) % this.alignedBuffer.length if (!rem && !this.alignedBytes) return buffer.toString(this.encoding) var returnBuffer = new Buffer(this.alignedBytes + buffer.length - rem) this.alignedBuffer.copy(returnBuffer, 0, 0, this.alignedBytes) buffer.copy(returnBuffer, this.alignedBytes, 0, buffer.length - rem) buffer.copy(this.alignedBuffer, 0, buffer.length - rem, buffer.length) this.alignedBytes = rem return returnBuffer.toString(this.encoding) } npm_3.5.2.orig/node_modules/request/node_modules/tough-cookie/LICENSE0000644000000000000000000000363412631326456023724 0ustar 00000000000000Copyright (c) 2015, Salesforce.com, Inc. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of Salesforce.com nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. === The following exceptions apply: === `public_suffix_list.dat` was obtained from <http://publicsuffix.org/list/public_suffix_list.dat> via <http://publicsuffix.org>. The license for this file is MPL/2.0. The header of that file reads as follows: // This Source Code Form is subject to the terms of the Mozilla Public // License, v. 2.0. If a copy of the MPL was not distributed with this // file, You can obtain one at http://mozilla.org/MPL/2.0/. npm_3.5.2.orig/node_modules/request/node_modules/tough-cookie/README.md0000644000000000000000000006124212631326456024175 0ustar 00000000000000[RFC6265](https://tools.ietf.org/html/rfc6265) Cookies and CookieJar for Node.js [![Build Status](https://travis-ci.org/SalesforceEng/tough-cookie.png?branch=master)](https://travis-ci.org/SalesforceEng/tough-cookie) [![NPM Stats](https://nodei.co/npm/tough-cookie.png?downloads=true&stars=true)](https://npmjs.org/package/tough-cookie) ![NPM Downloads](https://nodei.co/npm-dl/tough-cookie.png?months=9) # Synopsis ``` javascript var tough = require('tough-cookie'); var Cookie = tough.Cookie; var cookie = Cookie.parse(header); cookie.value = 'somethingdifferent'; header = cookie.toString(); var cookiejar = new tough.CookieJar(); cookiejar.setCookie(cookie, 'http://currentdomain.example.com/path', cb); // ... cookiejar.getCookies('http://example.com/otherpath',function(err,cookies) { res.headers['cookie'] = cookies.join('; '); }); ``` # Installation It's _so_ easy! `npm install tough-cookie` Why the name? NPM modules `cookie`, `cookies` and `cookiejar` were already taken. # API ## tough Functions on the module you get from `require('tough-cookie')`. All can be used as pure functions and don't need to be "bound". **Note**: prior to 1.0.x, several of these functions took a `strict` parameter. This has since been removed from the API as it was no longer necessary. ### `parseDate(string)` Parse a cookie date string into a `Date`. Parses according to RFC6265 Section 5.1.1, not `Date.parse()`. ### `formatDate(date)` Format a Date into an RFC1123 string (the RFC6265-recommended format). ### `canonicalDomain(str)` Transforms a domain-name into a canonical domain-name. The canonical domain-name is a trimmed, lowercased, stripped-of-leading-dot and optionally punycode-encoded domain-name (Section 5.1.2 of RFC6265). For the most part, this function is idempotent (can be run again on its output without ill effects). ### `domainMatch(str,domStr[,canonicalize=true])` Answers "does this real domain match the domain in a cookie?". The `str` is the "current" domain-name and the `domStr` is the "cookie" domain-name. Matches according to RFC6265 Section 5.1.3, but it helps to think of it as a "suffix match". The `canonicalize` parameter controls whether the other two parameters are run through `canonicalDomain` first. ### `defaultPath(path)` Given a current request/response path, gives the Path appropriate for storing in a cookie. This is basically the "directory" of a "file" in the path, but is specified by Section 5.1.4 of the RFC. The `path` parameter MUST be _only_ the pathname part of a URI (i.e.
excludes the hostname, query, fragment, etc.). This is the `.pathname` property of node's `uri.parse()` output. ### `pathMatch(reqPath,cookiePath)` Answers "does the request-path path-match a given cookie-path?" as per RFC6265 Section 5.1.4. Returns a boolean. This is essentially a prefix-match where `cookiePath` is a prefix of `reqPath`. ### `parse(cookieString[, options])` alias for `Cookie.parse(cookieString[, options])` ### `fromJSON(string)` alias for `Cookie.fromJSON(string)` ### `getPublicSuffix(hostname)` Returns the public suffix of this hostname. The public suffix is the shortest domain-name upon which a cookie can be set. Returns `null` if the hostname cannot have cookies set for it. For example: `www.example.com` and `www.subdomain.example.com` both have public suffix `example.com`. For further information, see http://publicsuffix.org/. This module derives its list from that site. ### `cookieCompare(a,b)` For use with `.sort()`, sorts a list of cookies into the recommended order given in the RFC (Section 5.4 step 2). The sort algorithm is, in order of precedence: * Longest `.path` * oldest `.creation` (which has a 1ms precision, same as `Date`) * lowest `.creationIndex` (to get beyond the 1ms precision) ``` javascript var cookies = [ /* unsorted array of Cookie objects */ ]; cookies = cookies.sort(cookieCompare); ``` **Note**: Since JavaScript's `Date` is limited to a 1ms precision, cookies within the same millisecond are entirely possible. This is especially true when using the `now` option to `.setCookie()`. The `.creationIndex` property is a per-process global counter, assigned during construction with `new Cookie()`. This preserves the spirit of the RFC sorting: older cookies go first. This works great for `MemoryCookieStore`, since `Set-Cookie` headers are parsed in order, but may not be so great for distributed systems. Sophisticated `Store`s may wish to set this to some other _logical clock_ such that if cookies A and B are created in the same millisecond, but cookie A is created before cookie B, then `A.creationIndex < B.creationIndex`. If you want to alter the global counter, which you probably _shouldn't_ do, it's stored in `Cookie.cookiesCreated`. ### `permuteDomain(domain)` Generates a list of all possible domains that `domainMatch()` the parameter. May be handy for implementing cookie stores. ### `permutePath(path)` Generates a list of all possible paths that `pathMatch()` the parameter. May be handy for implementing cookie stores. ## Cookie Exported via `tough.Cookie`. ### `Cookie.parse(cookieString[, options])` Parses a single Cookie or Set-Cookie HTTP header into a `Cookie` object. Returns `undefined` if the string can't be parsed. The options parameter is not required and currently has only one property: * _loose_ - boolean - if `true`, enables parsing of key-less cookies like `=abc` and `=`, which are not RFC-compliant. If options is not an object, it is ignored, which means you can use `Array#map` with it. Here's how to process the Set-Cookie header(s) on a node HTTP/HTTPS response: ``` javascript if (res.headers['set-cookie'] instanceof Array) cookies = res.headers['set-cookie'].map(Cookie.parse); else cookies = [Cookie.parse(res.headers['set-cookie'])]; ``` ### Properties Cookie object properties: * _key_ - string - the name or key of the cookie (default "") * _value_ - string - the value of the cookie (default "") * _expires_ - `Date` - if set, the `Expires=` attribute of the cookie (defaults to the string `"Infinity"`).
See `setExpires()` * _maxAge_ - seconds - if set, the `Max-Age=` attribute _in seconds_ of the cookie. May also be set to strings `"Infinity"` and `"-Infinity"` for non-expiry and immediate-expiry, respectively. See `setMaxAge()` * _domain_ - string - the `Domain=` attribute of the cookie * _path_ - string - the `Path=` of the cookie * _secure_ - boolean - the `Secure` cookie flag * _httpOnly_ - boolean - the `HttpOnly` cookie flag * _extensions_ - `Array` - any unrecognized cookie attributes as strings (even if they contain equal-signs) * _creation_ - `Date` - when this cookie was constructed * _creationIndex_ - number - set at construction, used to provide greater sort precision (please see `cookieCompare(a,b)` for a full explanation) After a cookie has been passed through `CookieJar.setCookie()` it will have the following additional attributes: * _hostOnly_ - boolean - is this a host-only cookie (i.e. no Domain field was set, but was instead implied) * _pathIsDefault_ - boolean - if true, there was no Path field on the cookie and `defaultPath()` was used to derive one. * _creation_ - `Date` - **modified** from construction to when the cookie was added to the jar * _lastAccessed_ - `Date` - last time the cookie got accessed. Will affect cookie cleaning once implemented. Using `cookiejar.getCookies(...)` will update this attribute. ### `Cookie([{properties}])` Receives an options object that can contain any of the above Cookie properties, and uses the default for unspecified properties. ### `.toString()` encode to a Set-Cookie header value. The Expires cookie field is set using `formatDate()`, but is omitted entirely if `.expires` is `Infinity`. ### `.cookieString()` encode to a Cookie header value (i.e. the `.key` and `.value` properties joined with '='). ### `.setExpires(String)` sets the expiry based on a date-string passed through `parseDate()`. If parseDate returns `null` (i.e. can't parse this date string), `.expires` is set to the string `"Infinity"`. ### `.setMaxAge(number)` sets the maxAge in seconds. Coerces `-Infinity` to `"-Infinity"` and `Infinity` to `"Infinity"` so it JSON serializes correctly. ### `.expiryTime([now=Date.now()])` ### `.expiryDate([now=Date.now()])` expiryTime() computes the absolute unix-epoch milliseconds that this cookie expires. expiryDate() works similarly, except it returns a `Date` object. Note that in both cases the `now` parameter should be milliseconds. Max-Age takes precedence over Expires (as per the RFC). The `.creation` attribute -- or, by default, the `now` parameter -- is used to offset the `.maxAge` attribute. If Expires (`.expires`) is set, that's returned. Otherwise, `expiryTime()` returns `Infinity` and `expiryDate()` returns a `Date` object for "Tue, 19 Jan 2038 03:14:07 GMT" (latest date that can be expressed by a 32-bit `time_t`; the common limit for most user-agents). ### `.TTL([now=Date.now()])` compute the TTL relative to `now` (milliseconds). The same precedence rules as for `expiryTime`/`expiryDate` apply. The "number" `Infinity` is returned for cookies without an explicit expiry and `0` is returned if the cookie is expired. Otherwise a time-to-live in milliseconds is returned. ### `.canonicalizedDomain()` ### `.cdomain()` return the canonicalized `.domain` field. This is lower-cased and punycode (RFC3490) encoded if the domain has any non-ASCII characters. ### `.toJSON()` For convenience in using `JSON.stringify(cookie)`. Returns a plain-old `Object` that can be JSON-serialized.
Any `Date` properties (i.e., `.expires`, `.creation`, and `.lastAccessed`) are exported in ISO format (`.toISOString()`). **NOTE**: Custom `Cookie` properties will be discarded. In tough-cookie 1.x, since there was no `.toJSON` method explicitly defined, all enumerable properties were captured. If you want a property to be serialized, add the property name to the `Cookie.serializableProperties` Array. ### `Cookie.fromJSON(strOrObj)` Does the reverse of `cookie.toJSON()`. If passed a string, will `JSON.parse()` that first. Any `Date` properties (i.e., `.expires`, `.creation`, and `.lastAccessed`) are parsed via `Date.parse()`, not the tough-cookie `parseDate`, since it's JavaScript/JSON-y timestamps being handled at this layer. Returns `null` upon JSON parsing error. ### `.clone()` Does a deep clone of this cookie, exactly implemented as `Cookie.fromJSON(cookie.toJSON())`. ### `.validate()` Status: *IN PROGRESS*. Works for a few things, but is by no means comprehensive. validates cookie attributes for semantic correctness. Useful for "lint" checking any Set-Cookie headers you generate. For now, it returns a boolean, but eventually could return a reason string -- you can future-proof with this construct: ``` javascript if (cookie.validate() === true) { // it's tasty } else { // yuck! } ``` ## CookieJar Exported via `tough.CookieJar`. ### `CookieJar([store],[options])` Simply use `new CookieJar()`. If you'd like to use a custom store, pass that to the constructor otherwise a `MemoryCookieStore` will be created and used. The `options` object can be omitted and can have the following properties: * _rejectPublicSuffixes_ - boolean - default `true` - reject cookies with domains like "com" and "co.uk" * _looseMode_ - boolean - default `false` - accept malformed cookies like `bar` and `=bar`, which have an implied empty name. This is not in the standard, but is used sometimes on the web and is accepted by (most) browsers. Since eventually this module would like to support database/remote/etc. CookieJars, continuation passing style is used for CookieJar methods. ### `.setCookie(cookieOrString, currentUrl, [{options},] cb(err,cookie))` Attempt to set the cookie in the cookie jar. If the operation fails, an error will be given to the callback `cb`, otherwise the cookie is passed through. The cookie will have updated `.creation`, `.lastAccessed` and `.hostOnly` properties. The `options` object can be omitted and can have the following properties: * _http_ - boolean - default `true` - indicates if this is an HTTP or non-HTTP API. Affects HttpOnly cookies. * _secure_ - boolean - autodetect from url - indicates if this is a "Secure" API. If the currentUrl starts with `https:` or `wss:` then this is defaulted to `true`, otherwise `false`. * _now_ - Date - default `new Date()` - what to use for the creation/access time of cookies * _ignoreError_ - boolean - default `false` - silently ignore things like parse errors and invalid domains. `Store` errors aren't ignored by this option. As per the RFC, the `.hostOnly` property is set if there was no "Domain=" parameter in the cookie string (or `.domain` was null on the Cookie object). The `.domain` property is set to the fully-qualified hostname of `currentUrl` in this case. Matching this cookie requires an exact hostname match (not a `domainMatch` as per usual). ### `.setCookieSync(cookieOrString, currentUrl, [{options}])` Synchronous version of `setCookie`; only works with synchronous stores (e.g. the default `MemoryCookieStore`). 
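As a quick sketch of the synchronous calling style with the default `MemoryCookieStore` (the cookie string and URLs here are made up for the example):

``` javascript
var tough = require('tough-cookie');

// The default MemoryCookieStore is synchronous, so the *Sync API is available:
var jar = new tough.CookieJar();

// Parsing, validation, and domain/path scoping all happen on the way in:
jar.setCookieSync('session=abc123; Path=/; HttpOnly', 'http://example.com/login');

// Only cookies matching the current URL come back out
// (see .getCookieStringSync() below):
console.log(jar.getCookieStringSync('http://example.com/account')); // "session=abc123"
```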
### `.getCookies(currentUrl, [{options},] cb(err,cookies))` Retrieve the list of cookies that can be sent in a Cookie header for the current url. If an error is encountered, that's passed as `err` to the callback, otherwise an `Array` of `Cookie` objects is passed. The array is sorted with `cookieCompare()` unless the `{sort:false}` option is given. The `options` object can be omitted and can have the following properties: * _http_ - boolean - default `true` - indicates if this is an HTTP or non-HTTP API. Affects HttpOnly cookies. * _secure_ - boolean - autodetect from url - indicates if this is a "Secure" API. If the currentUrl starts with `https:` or `wss:` then this is defaulted to `true`, otherwise `false`. * _now_ - Date - default `new Date()` - what to use for the creation/access time of cookies * _expire_ - boolean - default `true` - perform expiry-time checking of cookies and asynchronously remove expired cookies from the store. Using `false` will return expired cookies and **not** remove them from the store (which is useful for replaying Set-Cookie headers, potentially). * _allPaths_ - boolean - default `false` - if `true`, do not scope cookies by path. The default uses RFC-compliant path scoping. **Note**: may not be supported by the underlying store (the default `MemoryCookieStore` supports it). The `.lastAccessed` property of the returned cookies will have been updated. ### `.getCookiesSync(currentUrl, [{options}])` Synchronous version of `getCookies`; only works with synchronous stores (e.g. the default `MemoryCookieStore`). ### `.getCookieString(...)` Accepts the same options as `.getCookies()` but passes a string suitable for a Cookie header rather than an array to the callback. Simply maps the `Cookie` array via `.cookieString()`. ### `.getCookieStringSync(...)` Synchronous version of `getCookieString`; only works with synchronous stores (e.g. the default `MemoryCookieStore`). ### `.getSetCookieStrings(...)` Returns an array of strings suitable for **Set-Cookie** headers. Accepts the same options as `.getCookies()`. Simply maps the cookie array via `.toString()`. ### `.getSetCookieStringsSync(...)` Synchronous version of `getSetCookieStrings`; only works with synchronous stores (e.g. the default `MemoryCookieStore`). ### `.serialize(cb(err,serializedObject))` Serialize the Jar if the underlying store supports `.getAllCookies`. **NOTE**: Custom `Cookie` properties will be discarded. If you want a property to be serialized, add the property name to the `Cookie.serializableProperties` Array. See [Serialization Format]. ### `.serializeSync()` Sync version of .serialize ### `.toJSON()` Alias of .serializeSync() for the convenience of `JSON.stringify(cookiejar)`. ### `CookieJar.deserialize(serialized, [store], cb(err,object))` A new Jar is created and the serialized Cookies are added to the underlying store. Each `Cookie` is added via `store.putCookie` in the order in which they appear in the serialization. The `store` argument is optional, but should be an instance of `Store`. By default, a new instance of `MemoryCookieStore` is created. As a convenience, if `serialized` is a string, it is passed through `JSON.parse` first. If that throws an error, this is passed to the callback. ### `CookieJar.deserializeSync(serialized, [store])` Sync version of `.deserialize`. _Note_ that the `store` must be synchronous for this to work. ### `CookieJar.fromJSON(string)` Alias of `.deserializeSync` to provide consistency with `Cookie.fromJSON()`. 
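The serialization methods compose with the synchronous API; a minimal round-trip sketch (the cookie and domain are hypothetical):

``` javascript
var tough = require('tough-cookie');

var jar = new tough.CookieJar();
jar.setCookieSync('token=t1; Domain=example.com; Path=/', 'http://example.com/');

// .toJSON() delegates to .serializeSync(), so JSON.stringify() just works:
var saved = JSON.stringify(jar);

// Later (or in another process), rebuild an equivalent jar:
var restored = tough.CookieJar.deserializeSync(saved);

// Domain= was set, so subdomains domain-match too:
console.log(restored.getCookieStringSync('http://www.example.com/')); // "token=t1"
```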
### `.clone([store,]cb(err,newJar))` Produces a deep clone of this jar. Modifications to the original won't affect the clone, and vice versa. The `store` argument is optional, but should be an instance of `Store`. By default, a new instance of `MemoryCookieStore` is created. Transferring between store types is supported so long as the source implements `.getAllCookies()` and the destination implements `.putCookie()`. ### `.cloneSync([store])` Synchronous version of `.clone`, returning a new `CookieJar` instance. The `store` argument is optional, but must be a _synchronous_ `Store` instance if specified. If not passed, a new instance of `MemoryCookieStore` is used. The _source_ and _destination_ must both be synchronous `Store`s. If one or both stores are asynchronous, use `.clone` instead. Recall that `MemoryCookieStore` supports both synchronous and asynchronous API calls. ## Store Base class for CookieJar stores. Available as `tough.Store`. ## Store API The storage model for each `CookieJar` instance can be replaced with a custom implementation. The default is `MemoryCookieStore` which can be found in the `lib/memstore.js` file. The API uses continuation-passing-style to allow for asynchronous stores. Stores should inherit from the base `Store` class, which is available as `require('tough-cookie').Store`. Stores are asynchronous by default, but if `store.synchronous` is set to `true`, then the `*Sync` methods of the containing `CookieJar` can be used (the store itself must still implement the continuation-passing-style methods). All `domain` parameters will have been normalized before calling. The Cookie store must have all of the following methods. ### `store.findCookie(domain, path, key, cb(err,cookie))` Retrieve a cookie with the given domain, path and key (a.k.a. name). The RFC maintains that exactly one of these cookies should exist in a store. If the store is using versioning, this means that the latest/newest such cookie should be returned. Callback takes an error and the resulting `Cookie` object. If no cookie is found then `null` MUST be passed instead (i.e. not an error). ### `store.findCookies(domain, path, cb(err,cookies))` Locates cookies matching the given domain and path. This is most often called in the context of `cookiejar.getCookies()` above. If no cookies are found, the callback MUST be passed an empty array. The resulting list will be checked for applicability to the current request according to the RFC (domain-match, path-match, http-only-flag, secure-flag, expiry, etc.), so it's OK to use an optimistic search algorithm when implementing this method. However, the search algorithm used SHOULD try to find cookies that `domainMatch()` the domain and `pathMatch()` the path in order to limit the amount of checking that needs to be done. As of version 0.9.12, the `allPaths` option to `cookiejar.getCookies()` above will cause the path here to be `null`. If the path is `null`, path-matching MUST NOT be performed (i.e. domain-matching only). ### `store.putCookie(cookie, cb(err))` Adds a new cookie to the store. The implementation SHOULD replace any existing cookie with the same `.domain`, `.path`, and `.key` properties -- depending on the nature of the implementation, it's possible that between the call to `findCookie` and `putCookie` a duplicate `putCookie` can occur. The `cookie` object MUST NOT be modified; the caller will have already updated the `.creation` and `.lastAccessed` properties. Pass an error if the cookie cannot be stored.
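To make the contract concrete, here is a skeletal synchronous store (`TrivialStore` and its flat index are purely illustrative, not part of the library; a real store must also implement the remaining methods described below):

``` javascript
var util = require('util');
var Store = require('tough-cookie').Store;

function TrivialStore() {
  Store.call(this);
  this.synchronous = true; // opts in to the CookieJar *Sync methods
  this.idx = {};           // 'domain;path;key' -> Cookie
}
util.inherits(TrivialStore, Store);

TrivialStore.prototype.putCookie = function (cookie, cb) {
  // Same .domain/.path/.key replaces any existing cookie, per the contract above:
  this.idx[cookie.domain + ';' + cookie.path + ';' + cookie.key] = cookie;
  cb(null);
};

TrivialStore.prototype.findCookie = function (domain, path, key, cb) {
  // Exactly one cookie per (domain, path, key); pass null, not an error, when absent:
  cb(null, this.idx[domain + ';' + path + ';' + key] || null);
};
```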
### `store.updateCookie(oldCookie, newCookie, cb(err))` Update an existing cookie. The implementation MUST update the `.value` for a cookie with the same `.domain`, `.path` and `.key`. The implementation SHOULD check that the old value in the store is equivalent to `oldCookie` - how the conflict is resolved is up to the store. The `.lastAccessed` property will always be different between the two objects (to the precision possible via JavaScript's clock). Both `.creation` and `.creationIndex` are guaranteed to be the same. Stores MAY ignore or defer the `.lastAccessed` change at the cost of affecting how cookies are selected for automatic deletion (e.g., least-recently-used, which is up to the store to implement). Stores may wish to optimize changing the `.value` of the cookie in the store versus storing a new cookie. If the implementation doesn't define this method a stub that calls `putCookie(newCookie,cb)` will be added to the store object. The `newCookie` and `oldCookie` objects MUST NOT be modified. Pass an error if the newCookie cannot be stored. ### `store.removeCookie(domain, path, key, cb(err))` Remove a cookie from the store (see notes on `findCookie` about the uniqueness constraint). The implementation MUST NOT pass an error if the cookie doesn't exist; only pass an error due to the failure to remove an existing cookie. ### `store.removeCookies(domain, path, cb(err))` Removes matching cookies from the store. The `path` parameter is optional, and if missing means all paths in a domain should be removed. Pass an error ONLY if removing any existing cookies failed. ### `store.getAllCookies(cb(err, cookies))` Produces an `Array` of all cookies during `jar.serialize()`. The items in the array can be true `Cookie` objects or generic `Object`s with the [Serialization Format] data structure. Cookies SHOULD be returned in creation order to preserve sorting via `cookieCompare()`. For reference, `MemoryCookieStore` will sort by `.creationIndex` since it uses true `Cookie` objects internally. If you don't return the cookies in creation order, they'll still be sorted by creation time, but this only has a precision of 1ms. See `cookieCompare` for more detail. Pass an error if retrieval fails. ## MemoryCookieStore Inherits from `Store`. A just-in-memory CookieJar synchronous store implementation, used by default. Despite being a synchronous implementation, it's usable with both the synchronous and asynchronous forms of the `CookieJar` API. # Serialization Format **NOTE**: if you want to have custom `Cookie` properties serialized, add the property name to `Cookie.serializableProperties`. ```js { // The version of tough-cookie that serialized this jar. version: 'tough-cookie@1.x.y', // add the store type, to make humans happy: storeType: 'MemoryCookieStore', // CookieJar configuration: rejectPublicSuffixes: true, // ... future items go here // Gets filled from jar.store.getAllCookies(): cookies: [ { key: 'string', value: 'string', // ... /* other Cookie.serializableProperties go here */ } ] } ``` # Copyright and License (tl;dr: BSD-3-Clause with some MPL/2.0) ```text Copyright (c) 2015, Salesforce.com, Inc. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2.
Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of Salesforce.com nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. ``` Portions may be licensed under different licenses (in particular `public_suffix_list.dat` is MPL/2.0); please read that file and the LICENSE file for full details. npm_3.5.2.orig/node_modules/request/node_modules/tough-cookie/lib/0000755000000000000000000000000012631326456023457 5ustar 00000000000000npm_3.5.2.orig/node_modules/request/node_modules/tough-cookie/package.json0000644000000000000000000000401312631326456025175 0ustar 00000000000000{ "author": { "name": "Jeremy Stashewsky", "email": "jstashewsky@salesforce.com" }, "contributors": [ { "name": "Alexander Savin" }, { "name": "Ian Livingstone" }, { "name": "Ivan Nikulin" }, { "name": "Lalit Kapoor" }, { "name": "Sam Thompson" }, { "name": "Sebastian Mayr" } ], "license": "BSD-3-Clause", "name": "tough-cookie", "description": "RFC6265 Cookies and Cookie Jar for node.js", "keywords": [ "HTTP", "cookie", "cookies", "set-cookie", "cookiejar", "jar", "RFC6265", "RFC2965" ], "version": "2.2.0", "homepage": "https://github.com/SalesforceEng/tough-cookie", "repository": { "type": "git", "url": "git://github.com/SalesforceEng/tough-cookie.git" }, "bugs": { "url": "https://github.com/SalesforceEng/tough-cookie/issues" }, "main": "./lib/cookie", "files": [ "lib" ], "scripts": { "suffixup": "curl -o public_suffix_list.dat https://publicsuffix.org/list/public_suffix_list.dat && ./generate-pubsuffix.js", "test": "vows test/*_test.js" }, "engines": { "node": ">=0.10.0" }, "devDependencies": { "async": "^1.4.2", "vows": "^0.8.1" }, "gitHead": "fb1456177c9b51445afa34656eb314c70c2adcd2", "_id": "tough-cookie@2.2.0", "_shasum": "d4ce661075e5fddb7f20341d3f9931a6fbbadde0", "_from": "tough-cookie@>=2.2.0 <2.3.0", "_npmVersion": "2.11.2", "_nodeVersion": "0.12.5", "_npmUser": { "name": "jstash", "email": "jstash@gmail.com" }, "dist": { "shasum": "d4ce661075e5fddb7f20341d3f9931a6fbbadde0", "tarball": "http://registry.npmjs.org/tough-cookie/-/tough-cookie-2.2.0.tgz" }, "maintainers": [ { "name": "jstash", "email": "jeremy@goinstant.com" }, { "name": "goinstant", "email": "services@goinstant.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/tough-cookie/-/tough-cookie-2.2.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/request/node_modules/tough-cookie/lib/cookie.js0000644000000000000000000011126212631326456025271 0ustar 00000000000000/*! * Copyright (c) 2015, Salesforce.com, Inc. 
* All rights reserved. * * Redistribution and use in source and binary forms, with or without * modification, are permitted provided that the following conditions are met: * * 1. Redistributions of source code must retain the above copyright notice, * this list of conditions and the following disclaimer. * * 2. Redistributions in binary form must reproduce the above copyright notice, * this list of conditions and the following disclaimer in the documentation * and/or other materials provided with the distribution. * * 3. Neither the name of Salesforce.com nor the names of its contributors may * be used to endorse or promote products derived from this software without * specific prior written permission. * * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE * ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE * LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE * POSSIBILITY OF SUCH DAMAGE. */ 'use strict'; var net = require('net'); var urlParse = require('url').parse; var pubsuffix = require('./pubsuffix'); var Store = require('./store').Store; var MemoryCookieStore = require('./memstore').MemoryCookieStore; var pathMatch = require('./pathMatch').pathMatch; var VERSION = require('../package.json').version; var punycode; try { punycode = require('punycode'); } catch(e) { console.warn("cookie: can't load punycode; won't use punycode for domain normalization"); } var DATE_DELIM = /[\x09\x20-\x2F\x3B-\x40\x5B-\x60\x7B-\x7E]/; // From RFC6265 S4.1.1 // note that it excludes \x3B ";" var COOKIE_OCTET = /[\x21\x23-\x2B\x2D-\x3A\x3C-\x5B\x5D-\x7E]/; var COOKIE_OCTETS = new RegExp('^'+COOKIE_OCTET.source+'+$'); var CONTROL_CHARS = /[\x00-\x1F]/; // Double quotes are part of the value (see: S4.1.1). 
// '\r', '\n' and '\0' should be treated as a terminator in the "relaxed" mode // (see: https://github.com/ChromiumWebApps/chromium/blob/b3d3b4da8bb94c1b2e061600df106d590fda3620/net/cookies/parsed_cookie.cc#L60) // '=' and ';' are attribute/values separators // (see: https://github.com/ChromiumWebApps/chromium/blob/b3d3b4da8bb94c1b2e061600df106d590fda3620/net/cookies/parsed_cookie.cc#L64) var COOKIE_PAIR = /^(([^=;]+))\s*=\s*(("?)[^\n\r\0]*\3)/ // Used to parse non-RFC-compliant cookies like '=abc' when given the `loose` // option in Cookie.parse: var LOOSE_COOKIE_PAIR = /^((?:=)?([^=;]*)\s*=\s*)?(("?)[^\n\r\0]*\3)/; // RFC6265 S4.1.1 defines path value as 'any CHAR except CTLs or ";"' // Note ';' is \x3B var PATH_VALUE = /[\x20-\x3A\x3C-\x7E]+/; // Used for checking whether or not there is a trailing semi-colon var TRAILING_SEMICOLON = /;+$/; var DAY_OF_MONTH = /^(\d{1,2})[^\d]*$/; var TIME = /^(\d{1,2})[^\d]*:(\d{1,2})[^\d]*:(\d{1,2})[^\d]*$/; var MONTH = /^(Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)/i; var MONTH_TO_NUM = { jan:0, feb:1, mar:2, apr:3, may:4, jun:5, jul:6, aug:7, sep:8, oct:9, nov:10, dec:11 }; var NUM_TO_MONTH = [ 'Jan','Feb','Mar','Apr','May','Jun','Jul','Aug','Sep','Oct','Nov','Dec' ]; var NUM_TO_DAY = [ 'Sun','Mon','Tue','Wed','Thu','Fri','Sat' ]; var YEAR = /^(\d{2}|\d{4})$/; // 2 to 4 digits var MAX_TIME = 2147483647000; // 31-bit max var MIN_TIME = 0; // 31-bit min // RFC6265 S5.1.1 date parser: function parseDate(str) { if (!str) { return; } /* RFC6265 S5.1.1: * 2. Process each date-token sequentially in the order the date-tokens * appear in the cookie-date */ var tokens = str.split(DATE_DELIM); if (!tokens) { return; } var hour = null; var minutes = null; var seconds = null; var day = null; var month = null; var year = null; for (var i=0; i<tokens.length; i++) { var token = tokens[i].trim(); if (!token.length) { continue; } var result; /* 2.1. If the found-time flag is not set and the token matches the time * production, set the found-time flag and set the hour-value, minute-value, * and second-value to the numbers denoted by the digits in the date-token, * respectively. Skip the remaining sub-steps and continue to the next * date-token. */ if (seconds === null) { result = TIME.exec(token); if (result) { hour = parseInt(result[1], 10); minutes = parseInt(result[2], 10); seconds = parseInt(result[3], 10); /* RFC6265 S5.1.1.5: * [fail if] the hour-value is greater than 23, the minute-value is * greater than 59, or the second-value is greater than 59. */ if (hour > 23 || minutes > 59 || seconds > 59) { return; } continue; } } /* 2.2. If the found-day-of-month flag is not set and the date-token matches * the day-of-month production, set the found-day-of-month flag and set * the day-of-month-value to the number denoted by the date-token. Skip * the remaining sub-steps and continue to the next date-token. */ if (day === null) { result = DAY_OF_MONTH.exec(token); if (result) { day = parseInt(result, 10); /* RFC6265 S5.1.1.5: * [fail if] the day-of-month-value is less than 1 or greater than 31 */ if(day < 1 || day > 31) { return; } continue; } } /* 2.3. If the found-month flag is not set and the date-token matches the * month production, set the found-month flag and set the month-value to * the month denoted by the date-token. Skip the remaining sub-steps and * continue to the next date-token. */ if (month === null) { result = MONTH.exec(token); if (result) { month = MONTH_TO_NUM[result[1].toLowerCase()]; continue; } } /* 2.4. If the found-year flag is not set and the date-token matches the year * production, set the found-year flag and set the year-value to the number * denoted by the date-token. Skip the remaining sub-steps and continue to * the next date-token. */ if (year === null) { result = YEAR.exec(token); if (result) { year = parseInt(result[0], 10); /* From S5.1.1: * 3. If the year-value is greater than or equal to 70 and less * than or equal to 99, increment the year-value by 1900. * 4. If the year-value is greater than or equal to 0 and less * than or equal to 69, increment the year-value by 2000. */ if (70 <= year && year <= 99) { year += 1900; } else if (0 <= year && year <= 69) { year += 2000; } if (year < 1601) { return; // 5. ...
the year-value is less than 1601 } } } } if (seconds === null || day === null || month === null || year === null) { return; // 5. ... at least one of the found-day-of-month, found-month, found- // year, or found-time flags is not set, } return new Date(Date.UTC(year, month, day, hour, minutes, seconds)); } function formatDate(date) { var d = date.getUTCDate(); d = d >= 10 ? d : '0'+d; var h = date.getUTCHours(); h = h >= 10 ? h : '0'+h; var m = date.getUTCMinutes(); m = m >= 10 ? m : '0'+m; var s = date.getUTCSeconds(); s = s >= 10 ? s : '0'+s; return NUM_TO_DAY[date.getUTCDay()] + ', ' + d+' '+ NUM_TO_MONTH[date.getUTCMonth()] +' '+ date.getUTCFullYear() +' '+ h+':'+m+':'+s+' GMT'; } // S5.1.2 Canonicalized Host Names function canonicalDomain(str) { if (str == null) { return null; } str = str.trim().replace(/^\./,''); // S4.1.2.3 & S5.2.3: ignore leading . // convert to IDN if any non-ASCII characters if (punycode && /[^\u0001-\u007f]/.test(str)) { str = punycode.toASCII(str); } return str.toLowerCase(); } // S5.1.3 Domain Matching function domainMatch(str, domStr, canonicalize) { if (str == null || domStr == null) { return null; } if (canonicalize !== false) { str = canonicalDomain(str); domStr = canonicalDomain(domStr); } /* * "The domain string and the string are identical. (Note that both the * domain string and the string will have been canonicalized to lower case at * this point)" */ if (str == domStr) { return true; } /* "All of the following [three] conditions hold:" (order adjusted from the RFC) */ /* "* The string is a host name (i.e., not an IP address)." */ if (net.isIP(str)) { return false; } /* "* The domain string is a suffix of the string" */ var idx = str.indexOf(domStr); if (idx <= 0) { return false; // it's a non-match (-1) or prefix (0) } // e.g "a.b.c".indexOf("b.c") === 2 // 5 === 3+2 if (str.length !== domStr.length + idx) { // it's not a suffix return false; } /* "* The last character of the string that is not included in the domain * string is a %x2E (".") character." */ if (str.substr(idx-1,1) !== '.') { return false; } return true; } // RFC6265 S5.1.4 Paths and Path-Match /* * "The user agent MUST use an algorithm equivalent to the following algorithm * to compute the default-path of a cookie:" * * Assumption: the path (and not query part or absolute uri) is passed in. */ function defaultPath(path) { // "2. If the uri-path is empty or if the first character of the uri-path is not // a %x2F ("/") character, output %x2F ("/") and skip the remaining steps. if (!path || path.substr(0,1) !== "/") { return "/"; } // "3. If the uri-path contains no more than one %x2F ("/") character, output // %x2F ("/") and skip the remaining step." if (path === "/") { return path; } var rightSlash = path.lastIndexOf("/"); if (rightSlash === 0) { return "/"; } // "4. Output the characters of the uri-path from the first character up to, // but not including, the right-most %x2F ("/")." return path.slice(0, rightSlash); } function parse(str, options) { if (!options || typeof options !== 'object') { options = {}; } str = str.trim(); // S4.1.1 Trailing semi-colons are not part of the specification. var semiColonCheck = TRAILING_SEMICOLON.exec(str); if (semiColonCheck) { str = str.slice(0, semiColonCheck.index); } // We use a regex to parse the "name-value-pair" part of S5.2 var firstSemi = str.indexOf(';'); // S5.2 step 1 var pairRe = options.loose ? LOOSE_COOKIE_PAIR : COOKIE_PAIR; var result = pairRe.exec(firstSemi === -1 ? 
str : str.substr(0,firstSemi)); // Rx satisfies the "the name string is empty" and "lacks a %x3D ("=")" // constraints as well as trimming any whitespace. if (!result) { return; } var c = new Cookie(); if (result[1]) { c.key = result[2].trim(); } else { c.key = ''; } c.value = result[3].trim(); if (CONTROL_CHARS.test(c.key) || CONTROL_CHARS.test(c.value)) { return; } if (firstSemi === -1) { return c; } // S5.2.3 "unparsed-attributes consist of the remainder of the set-cookie-string // (including the %x3B (";") in question)." plus later on in the same section // "discard the first ";" and trim". var unparsed = str.slice(firstSemi).replace(/^\s*;\s*/,'').trim(); // "If the unparsed-attributes string is empty, skip the rest of these // steps." if (unparsed.length === 0) { return c; } /* * S5.2 says that when looping over the items "[p]rocess the attribute-name * and attribute-value according to the requirements in the following * subsections" for every item. Plus, for many of the individual attributes * in S5.3 it says to use the "attribute-value of the last attribute in the * cookie-attribute-list". Therefore, in this implementation, we overwrite * the previous value. */ var cookie_avs = unparsed.split(/\s*;\s*/); while (cookie_avs.length) { var av = cookie_avs.shift(); var av_sep = av.indexOf('='); var av_key, av_value; if (av_sep === -1) { av_key = av; av_value = null; } else { av_key = av.substr(0,av_sep); av_value = av.substr(av_sep+1); } av_key = av_key.trim().toLowerCase(); if (av_value) { av_value = av_value.trim(); } switch(av_key) { case 'expires': // S5.2.1 if (av_value) { var exp = parseDate(av_value); // "If the attribute-value failed to parse as a cookie date, ignore the // cookie-av." if (exp) { // over and underflow not realistically a concern: V8's getTime() seems to // store something larger than a 32-bit time_t (even with 32-bit node) c.expires = exp; } } break; case 'max-age': // S5.2.2 if (av_value) { // "If the first character of the attribute-value is not a DIGIT or a "-" // character ...[or]... If the remainder of attribute-value contains a // non-DIGIT character, ignore the cookie-av." if (/^-?[0-9]+$/.test(av_value)) { var delta = parseInt(av_value, 10); // "If delta-seconds is less than or equal to zero (0), let expiry-time // be the earliest representable date and time." c.setMaxAge(delta); } } break; case 'domain': // S5.2.3 // "If the attribute-value is empty, the behavior is undefined. However, // the user agent SHOULD ignore the cookie-av entirely." if (av_value) { // S5.2.3 "Let cookie-domain be the attribute-value without the leading %x2E // (".") character." var domain = av_value.trim().replace(/^\./, ''); if (domain) { // "Convert the cookie-domain to lower case." c.domain = domain.toLowerCase(); } } break; case 'path': // S5.2.4 /* * "If the attribute-value is empty or if the first character of the * attribute-value is not %x2F ("/"): * Let cookie-path be the default-path. * Otherwise: * Let cookie-path be the attribute-value." * * We'll represent the default-path as null since it depends on the * context of the parsing. */ c.path = av_value && av_value[0] === "/" ? av_value : null; break; case 'secure': // S5.2.5 /* * "If the attribute-name case-insensitively matches the string "Secure", * the user agent MUST append an attribute to the cookie-attribute-list * with an attribute-name of Secure and an empty attribute-value." 
// avoid the V8 deoptimization monster!
function jsonParse(str) {
  var obj;
  try {
    obj = JSON.parse(str);
  } catch (e) {
    return e;
  }
  return obj;
}

function fromJSON(str) {
  if (!str) {
    return null;
  }

  var obj;
  if (typeof str === 'string') {
    obj = jsonParse(str);
    if (obj instanceof Error) {
      return null;
    }
  } else {
    // assume it's an Object
    obj = str;
  }

  var c = new Cookie();
  for (var i=0; i<Cookie.serializableProperties.length; i++) {
    var prop = Cookie.serializableProperties[i];
    if (obj[prop] === undefined ||
        obj[prop] === Cookie.prototype[prop])
    {
      continue; // leave as prototype default
    }

    if (prop === 'expires' ||
        prop === 'creation' ||
        prop === 'lastAccessed')
    {
      if (obj[prop] === null) {
        c[prop] = null;
      } else {
        c[prop] = obj[prop] == "Infinity" ?
          "Infinity" : new Date(obj[prop]);
      }
    } else {
      c[prop] = obj[prop];
    }
  }

  return c;
}

// Gives the permutation of all possible pathMatch()es of a given path. The
// array is in longest-to-shortest order.  Handy for indexing.
function permutePath(path) {
  if (path === '/') {
    return ['/'];
  }
  if (path.lastIndexOf('/') === path.length-1) {
    path = path.substr(0, path.length-1);
  }
  var permutations = [path];
  while (path.length > 1) {
    var lindex = path.lastIndexOf('/');
    if (lindex === 0) {
      break;
    }
    path = path.substr(0, lindex);
    permutations.push(path);
  }
  permutations.push('/');
  return permutations;
}

function getCookieContext(url) {
  if (url instanceof Object) {
    return url;
  }
  // NOTE: decodeURI will throw on malformed URIs (see GH-32).
  // Therefore, we will just skip decoding for such URIs.
  try {
    url = decodeURI(url);
  } catch(err) {
    // Silently swallow error
  }

  return urlParse(url);
}

function Cookie(options) {
  options = options || {};

  Object.keys(options).forEach(function(prop) {
    if (Cookie.prototype.hasOwnProperty(prop) &&
        Cookie.prototype[prop] !== options[prop] &&
        prop.substr(0,1) !== '_')
    {
      this[prop] = options[prop];
    }
  }, this);

  this.creation = this.creation || new Date();

  // used to break creation ties in cookieCompare():
  Object.defineProperty(this, 'creationIndex', {
    configurable: false,
    enumerable: false, // important for assert.deepEqual checks
    writable: true,
    value: ++Cookie.cookiesCreated
  });
}

Cookie.cookiesCreated = 0; // incremented each time a cookie is created

Cookie.parse = parse;
Cookie.fromJSON = fromJSON;

Cookie.prototype.key = "";
Cookie.prototype.value = "";

// the order in which the RFC has them:
Cookie.prototype.expires = "Infinity"; // coerces to literal Infinity
Cookie.prototype.maxAge = null; // takes precedence over expires for TTL
Cookie.prototype.domain = null;
Cookie.prototype.path = null;
Cookie.prototype.secure = false;
Cookie.prototype.httpOnly = false;
Cookie.prototype.extensions = null;

// set by the CookieJar:
Cookie.prototype.hostOnly = null; // boolean when set
Cookie.prototype.pathIsDefault = null; // boolean when set
Cookie.prototype.creation = null; // Date when set; defaulted by Cookie.parse
Cookie.prototype.lastAccessed = null; // Date when set
Object.defineProperty(Cookie.prototype, 'creationIndex', {
  configurable: true,
  enumerable: false,
  writable: true,
  value: 0
});

Cookie.serializableProperties = Object.keys(Cookie.prototype)
  .filter(function(prop) {
    return !(
      Cookie.prototype[prop] instanceof Function ||
      prop === 'creationIndex' ||
      prop.substr(0,1) === '_'
    );
  });

Cookie.prototype.inspect = function inspect() {
  var now = Date.now();
  return 'Cookie="'+this.toString() +
    '; hostOnly='+(this.hostOnly != null ? this.hostOnly : '?') +
    '; aAge='+(this.lastAccessed ? (now-this.lastAccessed.getTime())+'ms' : '?') +
    '; cAge='+(this.creation ? (now-this.creation.getTime())+'ms' : '?') +
    '"';
};

Cookie.prototype.toJSON = function() {
  var obj = {};

  var props = Cookie.serializableProperties;
  for (var i=0; i<props.length; i++) {

module.exports.getPublicSuffix = function getPublicSuffix(domain) {

  if (parts.length > suffixLen) {
    var publicSuffix = parts.slice(0,suffixLen+1).reverse().join('.');
    return converted ?
punycode.toUnicode(publicSuffix) : publicSuffix; } return null; }; // The following generated structure is used under the MPL version 2.0 // See public-suffix.txt for more information var index = module.exports.index = Object.freeze( {"ac":true,"com.ac":true,"edu.ac":true,"gov.ac":true,"net.ac":true,"mil.ac":true,"org.ac":true,"ad":true,"nom.ad":true,"ae":true,"co.ae":true,"net.ae":true,"org.ae":true,"sch.ae":true,"ac.ae":true,"gov.ae":true,"mil.ae":true,"aero":true,"accident-investigation.aero":true,"accident-prevention.aero":true,"aerobatic.aero":true,"aeroclub.aero":true,"aerodrome.aero":true,"agents.aero":true,"aircraft.aero":true,"airline.aero":true,"airport.aero":true,"air-surveillance.aero":true,"airtraffic.aero":true,"air-traffic-control.aero":true,"ambulance.aero":true,"amusement.aero":true,"association.aero":true,"author.aero":true,"ballooning.aero":true,"broker.aero":true,"caa.aero":true,"cargo.aero":true,"catering.aero":true,"certification.aero":true,"championship.aero":true,"charter.aero":true,"civilaviation.aero":true,"club.aero":true,"conference.aero":true,"consultant.aero":true,"consulting.aero":true,"control.aero":true,"council.aero":true,"crew.aero":true,"design.aero":true,"dgca.aero":true,"educator.aero":true,"emergency.aero":true,"engine.aero":true,"engineer.aero":true,"entertainment.aero":true,"equipment.aero":true,"exchange.aero":true,"express.aero":true,"federation.aero":true,"flight.aero":true,"freight.aero":true,"fuel.aero":true,"gliding.aero":true,"government.aero":true,"groundhandling.aero":true,"group.aero":true,"hanggliding.aero":true,"homebuilt.aero":true,"insurance.aero":true,"journal.aero":true,"journalist.aero":true,"leasing.aero":true,"logistics.aero":true,"magazine.aero":true,"maintenance.aero":true,"marketplace.aero":true,"media.aero":true,"microlight.aero":true,"modelling.aero":true,"navigation.aero":true,"parachuting.aero":true,"paragliding.aero":true,"passenger-association.aero":true,"pilot.aero":true,"press.aero":true,"production.aero":true,"recreation.aero":true,"repbody.aero":true,"res.aero":true,"research.aero":true,"rotorcraft.aero":true,"safety.aero":true,"scientist.aero":true,"services.aero":true,"show.aero":true,"skydiving.aero":true,"software.aero":true,"student.aero":true,"taxi.aero":true,"trader.aero":true,"trading.aero":true,"trainer.aero":true,"union.aero":true,"workinggroup.aero":true,"works.aero":true,"af":true,"gov.af":true,"com.af":true,"org.af":true,"net.af":true,"edu.af":true,"ag":true,"com.ag":true,"org.ag":true,"net.ag":true,"co.ag":true,"nom.ag":true,"ai":true,"off.ai":true,"com.ai":true,"net.ai":true,"org.ai":true,"al":true,"com.al":true,"edu.al":true,"gov.al":true,"mil.al":true,"net.al":true,"org.al":true,"am":true,"an":true,"com.an":true,"net.an":true,"org.an":true,"edu.an":true,"ao":true,"ed.ao":true,"gv.ao":true,"og.ao":true,"co.ao":true,"pb.ao":true,"it.ao":true,"aq":true,"ar":true,"com.ar":true,"edu.ar":true,"gob.ar":true,"gov.ar":true,"int.ar":true,"mil.ar":true,"net.ar":true,"org.ar":true,"tur.ar":true,"arpa":true,"e164.arpa":true,"in-addr.arpa":true,"ip6.arpa":true,"iris.arpa":true,"uri.arpa":true,"urn.arpa":true,"as":true,"gov.as":true,"asia":true,"at":true,"ac.at":true,"co.at":true,"gv.at":true,"or.at":true,"au":true,"com.au":true,"net.au":true,"org.au":true,"edu.au":true,"gov.au":true,"asn.au":true,"id.au":true,"info.au":true,"conf.au":true,"oz.au":true,"act.au":true,"nsw.au":true,"nt.au":true,"qld.au":true,"sa.au":true,"tas.au":true,"vic.au":true,"wa.au":true,"act.edu.au":true,"nsw.edu.au":true,"nt.edu.au":true,"qld.
edu.au":true,"sa.edu.au":true,"tas.edu.au":true,"vic.edu.au":true,"wa.edu.au":true,"qld.gov.au":true,"sa.gov.au":true,"tas.gov.au":true,"vic.gov.au":true,"wa.gov.au":true,"aw":true,"com.aw":true,"ax":true,"az":true,"com.az":true,"net.az":true,"int.az":true,"gov.az":true,"org.az":true,"edu.az":true,"info.az":true,"pp.az":true,"mil.az":true,"name.az":true,"pro.az":true,"biz.az":true,"ba":true,"org.ba":true,"net.ba":true,"edu.ba":true,"gov.ba":true,"mil.ba":true,"unsa.ba":true,"unbi.ba":true,"co.ba":true,"com.ba":true,"rs.ba":true,"bb":true,"biz.bb":true,"co.bb":true,"com.bb":true,"edu.bb":true,"gov.bb":true,"info.bb":true,"net.bb":true,"org.bb":true,"store.bb":true,"tv.bb":true,"*.bd":true,"be":true,"ac.be":true,"bf":true,"gov.bf":true,"bg":true,"a.bg":true,"b.bg":true,"c.bg":true,"d.bg":true,"e.bg":true,"f.bg":true,"g.bg":true,"h.bg":true,"i.bg":true,"j.bg":true,"k.bg":true,"l.bg":true,"m.bg":true,"n.bg":true,"o.bg":true,"p.bg":true,"q.bg":true,"r.bg":true,"s.bg":true,"t.bg":true,"u.bg":true,"v.bg":true,"w.bg":true,"x.bg":true,"y.bg":true,"z.bg":true,"0.bg":true,"1.bg":true,"2.bg":true,"3.bg":true,"4.bg":true,"5.bg":true,"6.bg":true,"7.bg":true,"8.bg":true,"9.bg":true,"bh":true,"com.bh":true,"edu.bh":true,"net.bh":true,"org.bh":true,"gov.bh":true,"bi":true,"co.bi":true,"com.bi":true,"edu.bi":true,"or.bi":true,"org.bi":true,"biz":true,"bj":true,"asso.bj":true,"barreau.bj":true,"gouv.bj":true,"bm":true,"com.bm":true,"edu.bm":true,"gov.bm":true,"net.bm":true,"org.bm":true,"*.bn":true,"bo":true,"com.bo":true,"edu.bo":true,"gov.bo":true,"gob.bo":true,"int.bo":true,"org.bo":true,"net.bo":true,"mil.bo":true,"tv.bo":true,"br":true,"adm.br":true,"adv.br":true,"agr.br":true,"am.br":true,"arq.br":true,"art.br":true,"ato.br":true,"b.br":true,"bio.br":true,"blog.br":true,"bmd.br":true,"cim.br":true,"cng.br":true,"cnt.br":true,"com.br":true,"coop.br":true,"ecn.br":true,"eco.br":true,"edu.br":true,"emp.br":true,"eng.br":true,"esp.br":true,"etc.br":true,"eti.br":true,"far.br":true,"flog.br":true,"fm.br":true,"fnd.br":true,"fot.br":true,"fst.br":true,"g12.br":true,"ggf.br":true,"gov.br":true,"imb.br":true,"ind.br":true,"inf.br":true,"jor.br":true,"jus.br":true,"leg.br":true,"lel.br":true,"mat.br":true,"med.br":true,"mil.br":true,"mp.br":true,"mus.br":true,"net.br":true,"*.nom.br":true,"not.br":true,"ntr.br":true,"odo.br":true,"org.br":true,"ppg.br":true,"pro.br":true,"psc.br":true,"psi.br":true,"qsl.br":true,"radio.br":true,"rec.br":true,"slg.br":true,"srv.br":true,"taxi.br":true,"teo.br":true,"tmp.br":true,"trd.br":true,"tur.br":true,"tv.br":true,"vet.br":true,"vlog.br":true,"wiki.br":true,"zlg.br":true,"bs":true,"com.bs":true,"net.bs":true,"org.bs":true,"edu.bs":true,"gov.bs":true,"bt":true,"com.bt":true,"edu.bt":true,"gov.bt":true,"net.bt":true,"org.bt":true,"bv":true,"bw":true,"co.bw":true,"org.bw":true,"by":true,"gov.by":true,"mil.by":true,"com.by":true,"of.by":true,"bz":true,"com.bz":true,"net.bz":true,"org.bz":true,"edu.bz":true,"gov.bz":true,"ca":true,"ab.ca":true,"bc.ca":true,"mb.ca":true,"nb.ca":true,"nf.ca":true,"nl.ca":true,"ns.ca":true,"nt.ca":true,"nu.ca":true,"on.ca":true,"pe.ca":true,"qc.ca":true,"sk.ca":true,"yk.ca":true,"gc.ca":true,"cat":true,"cc":true,"cd":true,"gov.cd":true,"cf":true,"cg":true,"ch":true,"ci":true,"org.ci":true,"or.ci":true,"com.ci":true,"co.ci":true,"edu.ci":true,"ed.ci":true,"ac.ci":true,"net.ci":true,"go.ci":true,"asso.ci":true,"xn--aroport-bya.ci":true,"int.ci":true,"presse.ci":true,"md.ci":true,"gouv.ci":true,"*.ck":true,"www.ck":false,"cl":true,"gov.cl":true,"gob.cl
":true,"co.cl":true,"mil.cl":true,"cm":true,"co.cm":true,"com.cm":true,"gov.cm":true,"net.cm":true,"cn":true,"ac.cn":true,"com.cn":true,"edu.cn":true,"gov.cn":true,"net.cn":true,"org.cn":true,"mil.cn":true,"xn--55qx5d.cn":true,"xn--io0a7i.cn":true,"xn--od0alg.cn":true,"ah.cn":true,"bj.cn":true,"cq.cn":true,"fj.cn":true,"gd.cn":true,"gs.cn":true,"gz.cn":true,"gx.cn":true,"ha.cn":true,"hb.cn":true,"he.cn":true,"hi.cn":true,"hl.cn":true,"hn.cn":true,"jl.cn":true,"js.cn":true,"jx.cn":true,"ln.cn":true,"nm.cn":true,"nx.cn":true,"qh.cn":true,"sc.cn":true,"sd.cn":true,"sh.cn":true,"sn.cn":true,"sx.cn":true,"tj.cn":true,"xj.cn":true,"xz.cn":true,"yn.cn":true,"zj.cn":true,"hk.cn":true,"mo.cn":true,"tw.cn":true,"co":true,"arts.co":true,"com.co":true,"edu.co":true,"firm.co":true,"gov.co":true,"info.co":true,"int.co":true,"mil.co":true,"net.co":true,"nom.co":true,"org.co":true,"rec.co":true,"web.co":true,"com":true,"coop":true,"cr":true,"ac.cr":true,"co.cr":true,"ed.cr":true,"fi.cr":true,"go.cr":true,"or.cr":true,"sa.cr":true,"cu":true,"com.cu":true,"edu.cu":true,"org.cu":true,"net.cu":true,"gov.cu":true,"inf.cu":true,"cv":true,"cw":true,"com.cw":true,"edu.cw":true,"net.cw":true,"org.cw":true,"cx":true,"gov.cx":true,"ac.cy":true,"biz.cy":true,"com.cy":true,"ekloges.cy":true,"gov.cy":true,"ltd.cy":true,"name.cy":true,"net.cy":true,"org.cy":true,"parliament.cy":true,"press.cy":true,"pro.cy":true,"tm.cy":true,"cz":true,"de":true,"dj":true,"dk":true,"dm":true,"com.dm":true,"net.dm":true,"org.dm":true,"edu.dm":true,"gov.dm":true,"do":true,"art.do":true,"com.do":true,"edu.do":true,"gob.do":true,"gov.do":true,"mil.do":true,"net.do":true,"org.do":true,"sld.do":true,"web.do":true,"dz":true,"com.dz":true,"org.dz":true,"net.dz":true,"gov.dz":true,"edu.dz":true,"asso.dz":true,"pol.dz":true,"art.dz":true,"ec":true,"com.ec":true,"info.ec":true,"net.ec":true,"fin.ec":true,"k12.ec":true,"med.ec":true,"pro.ec":true,"org.ec":true,"edu.ec":true,"gov.ec":true,"gob.ec":true,"mil.ec":true,"edu":true,"ee":true,"edu.ee":true,"gov.ee":true,"riik.ee":true,"lib.ee":true,"med.ee":true,"com.ee":true,"pri.ee":true,"aip.ee":true,"org.ee":true,"fie.ee":true,"eg":true,"com.eg":true,"edu.eg":true,"eun.eg":true,"gov.eg":true,"mil.eg":true,"name.eg":true,"net.eg":true,"org.eg":true,"sci.eg":true,"*.er":true,"es":true,"com.es":true,"nom.es":true,"org.es":true,"gob.es":true,"edu.es":true,"et":true,"com.et":true,"gov.et":true,"org.et":true,"edu.et":true,"biz.et":true,"name.et":true,"info.et":true,"net.et":true,"eu":true,"fi":true,"aland.fi":true,"*.fj":true,"*.fk":true,"fm":true,"fo":true,"fr":true,"com.fr":true,"asso.fr":true,"nom.fr":true,"prd.fr":true,"presse.fr":true,"tm.fr":true,"aeroport.fr":true,"assedic.fr":true,"avocat.fr":true,"avoues.fr":true,"cci.fr":true,"chambagri.fr":true,"chirurgiens-dentistes.fr":true,"experts-comptables.fr":true,"geometre-expert.fr":true,"gouv.fr":true,"greta.fr":true,"huissier-justice.fr":true,"medecin.fr":true,"notaires.fr":true,"pharmacien.fr":true,"port.fr":true,"veterinaire.fr":true,"ga":true,"gb":true,"gd":true,"ge":true,"com.ge":true,"edu.ge":true,"gov.ge":true,"org.ge":true,"mil.ge":true,"net.ge":true,"pvt.ge":true,"gf":true,"gg":true,"co.gg":true,"net.gg":true,"org.gg":true,"gh":true,"com.gh":true,"edu.gh":true,"gov.gh":true,"org.gh":true,"mil.gh":true,"gi":true,"com.gi":true,"ltd.gi":true,"gov.gi":true,"mod.gi":true,"edu.gi":true,"org.gi":true,"gl":true,"co.gl":true,"com.gl":true,"edu.gl":true,"net.gl":true,"org.gl":true,"gm":true,"gn":true,"ac.gn":true,"com.gn":true,"edu.gn":true,"gov.gn":true,"
org.gn":true,"net.gn":true,"gov":true,"gp":true,"com.gp":true,"net.gp":true,"mobi.gp":true,"edu.gp":true,"org.gp":true,"asso.gp":true,"gq":true,"gr":true,"com.gr":true,"edu.gr":true,"net.gr":true,"org.gr":true,"gov.gr":true,"gs":true,"gt":true,"com.gt":true,"edu.gt":true,"gob.gt":true,"ind.gt":true,"mil.gt":true,"net.gt":true,"org.gt":true,"*.gu":true,"gw":true,"gy":true,"co.gy":true,"com.gy":true,"net.gy":true,"hk":true,"com.hk":true,"edu.hk":true,"gov.hk":true,"idv.hk":true,"net.hk":true,"org.hk":true,"xn--55qx5d.hk":true,"xn--wcvs22d.hk":true,"xn--lcvr32d.hk":true,"xn--mxtq1m.hk":true,"xn--gmqw5a.hk":true,"xn--ciqpn.hk":true,"xn--gmq050i.hk":true,"xn--zf0avx.hk":true,"xn--io0a7i.hk":true,"xn--mk0axi.hk":true,"xn--od0alg.hk":true,"xn--od0aq3b.hk":true,"xn--tn0ag.hk":true,"xn--uc0atv.hk":true,"xn--uc0ay4a.hk":true,"hm":true,"hn":true,"com.hn":true,"edu.hn":true,"org.hn":true,"net.hn":true,"mil.hn":true,"gob.hn":true,"hr":true,"iz.hr":true,"from.hr":true,"name.hr":true,"com.hr":true,"ht":true,"com.ht":true,"shop.ht":true,"firm.ht":true,"info.ht":true,"adult.ht":true,"net.ht":true,"pro.ht":true,"org.ht":true,"med.ht":true,"art.ht":true,"coop.ht":true,"pol.ht":true,"asso.ht":true,"edu.ht":true,"rel.ht":true,"gouv.ht":true,"perso.ht":true,"hu":true,"co.hu":true,"info.hu":true,"org.hu":true,"priv.hu":true,"sport.hu":true,"tm.hu":true,"2000.hu":true,"agrar.hu":true,"bolt.hu":true,"casino.hu":true,"city.hu":true,"erotica.hu":true,"erotika.hu":true,"film.hu":true,"forum.hu":true,"games.hu":true,"hotel.hu":true,"ingatlan.hu":true,"jogasz.hu":true,"konyvelo.hu":true,"lakas.hu":true,"media.hu":true,"news.hu":true,"reklam.hu":true,"sex.hu":true,"shop.hu":true,"suli.hu":true,"szex.hu":true,"tozsde.hu":true,"utazas.hu":true,"video.hu":true,"id":true,"ac.id":true,"biz.id":true,"co.id":true,"desa.id":true,"go.id":true,"mil.id":true,"my.id":true,"net.id":true,"or.id":true,"sch.id":true,"web.id":true,"ie":true,"gov.ie":true,"il":true,"ac.il":true,"co.il":true,"gov.il":true,"idf.il":true,"k12.il":true,"muni.il":true,"net.il":true,"org.il":true,"im":true,"ac.im":true,"co.im":true,"com.im":true,"ltd.co.im":true,"net.im":true,"org.im":true,"plc.co.im":true,"tt.im":true,"tv.im":true,"in":true,"co.in":true,"firm.in":true,"net.in":true,"org.in":true,"gen.in":true,"ind.in":true,"nic.in":true,"ac.in":true,"edu.in":true,"res.in":true,"gov.in":true,"mil.in":true,"info":true,"int":true,"eu.int":true,"io":true,"com.io":true,"iq":true,"gov.iq":true,"edu.iq":true,"mil.iq":true,"com.iq":true,"org.iq":true,"net.iq":true,"ir":true,"ac.ir":true,"co.ir":true,"gov.ir":true,"id.ir":true,"net.ir":true,"org.ir":true,"sch.ir":true,"xn--mgba3a4f16a.ir":true,"xn--mgba3a4fra.ir":true,"is":true,"net.is":true,"com.is":true,"edu.is":true,"gov.is":true,"org.is":true,"int.is":true,"it":true,"gov.it":true,"edu.it":true,"abr.it":true,"abruzzo.it":true,"aosta-valley.it":true,"aostavalley.it":true,"bas.it":true,"basilicata.it":true,"cal.it":true,"calabria.it":true,"cam.it":true,"campania.it":true,"emilia-romagna.it":true,"emiliaromagna.it":true,"emr.it":true,"friuli-v-giulia.it":true,"friuli-ve-giulia.it":true,"friuli-vegiulia.it":true,"friuli-venezia-giulia.it":true,"friuli-veneziagiulia.it":true,"friuli-vgiulia.it":true,"friuliv-giulia.it":true,"friulive-giulia.it":true,"friulivegiulia.it":true,"friulivenezia-giulia.it":true,"friuliveneziagiulia.it":true,"friulivgiulia.it":true,"fvg.it":true,"laz.it":true,"lazio.it":true,"lig.it":true,"liguria.it":true,"lom.it":true,"lombardia.it":true,"lombardy.it":true,"lucania.it":true,"mar.it":true,"marc
he.it":true,"mol.it":true,"molise.it":true,"piedmont.it":true,"piemonte.it":true,"pmn.it":true,"pug.it":true,"puglia.it":true,"sar.it":true,"sardegna.it":true,"sardinia.it":true,"sic.it":true,"sicilia.it":true,"sicily.it":true,"taa.it":true,"tos.it":true,"toscana.it":true,"trentino-a-adige.it":true,"trentino-aadige.it":true,"trentino-alto-adige.it":true,"trentino-altoadige.it":true,"trentino-s-tirol.it":true,"trentino-stirol.it":true,"trentino-sud-tirol.it":true,"trentino-sudtirol.it":true,"trentino-sued-tirol.it":true,"trentino-suedtirol.it":true,"trentinoa-adige.it":true,"trentinoaadige.it":true,"trentinoalto-adige.it":true,"trentinoaltoadige.it":true,"trentinos-tirol.it":true,"trentinostirol.it":true,"trentinosud-tirol.it":true,"trentinosudtirol.it":true,"trentinosued-tirol.it":true,"trentinosuedtirol.it":true,"tuscany.it":true,"umb.it":true,"umbria.it":true,"val-d-aosta.it":true,"val-daosta.it":true,"vald-aosta.it":true,"valdaosta.it":true,"valle-aosta.it":true,"valle-d-aosta.it":true,"valle-daosta.it":true,"valleaosta.it":true,"valled-aosta.it":true,"valledaosta.it":true,"vallee-aoste.it":true,"valleeaoste.it":true,"vao.it":true,"vda.it":true,"ven.it":true,"veneto.it":true,"ag.it":true,"agrigento.it":true,"al.it":true,"alessandria.it":true,"alto-adige.it":true,"altoadige.it":true,"an.it":true,"ancona.it":true,"andria-barletta-trani.it":true,"andria-trani-barletta.it":true,"andriabarlettatrani.it":true,"andriatranibarletta.it":true,"ao.it":true,"aosta.it":true,"aoste.it":true,"ap.it":true,"aq.it":true,"aquila.it":true,"ar.it":true,"arezzo.it":true,"ascoli-piceno.it":true,"ascolipiceno.it":true,"asti.it":true,"at.it":true,"av.it":true,"avellino.it":true,"ba.it":true,"balsan.it":true,"bari.it":true,"barletta-trani-andria.it":true,"barlettatraniandria.it":true,"belluno.it":true,"benevento.it":true,"bergamo.it":true,"bg.it":true,"bi.it":true,"biella.it":true,"bl.it":true,"bn.it":true,"bo.it":true,"bologna.it":true,"bolzano.it":true,"bozen.it":true,"br.it":true,"brescia.it":true,"brindisi.it":true,"bs.it":true,"bt.it":true,"bz.it":true,"ca.it":true,"cagliari.it":true,"caltanissetta.it":true,"campidano-medio.it":true,"campidanomedio.it":true,"campobasso.it":true,"carbonia-iglesias.it":true,"carboniaiglesias.it":true,"carrara-massa.it":true,"carraramassa.it":true,"caserta.it":true,"catania.it":true,"catanzaro.it":true,"cb.it":true,"ce.it":true,"cesena-forli.it":true,"cesenaforli.it":true,"ch.it":true,"chieti.it":true,"ci.it":true,"cl.it":true,"cn.it":true,"co.it":true,"como.it":true,"cosenza.it":true,"cr.it":true,"cremona.it":true,"crotone.it":true,"cs.it":true,"ct.it":true,"cuneo.it":true,"cz.it":true,"dell-ogliastra.it":true,"dellogliastra.it":true,"en.it":true,"enna.it":true,"fc.it":true,"fe.it":true,"fermo.it":true,"ferrara.it":true,"fg.it":true,"fi.it":true,"firenze.it":true,"florence.it":true,"fm.it":true,"foggia.it":true,"forli-cesena.it":true,"forlicesena.it":true,"fr.it":true,"frosinone.it":true,"ge.it":true,"genoa.it":true,"genova.it":true,"go.it":true,"gorizia.it":true,"gr.it":true,"grosseto.it":true,"iglesias-carbonia.it":true,"iglesiascarbonia.it":true,"im.it":true,"imperia.it":true,"is.it":true,"isernia.it":true,"kr.it":true,"la-spezia.it":true,"laquila.it":true,"laspezia.it":true,"latina.it":true,"lc.it":true,"le.it":true,"lecce.it":true,"lecco.it":true,"li.it":true,"livorno.it":true,"lo.it":true,"lodi.it":true,"lt.it":true,"lu.it":true,"lucca.it":true,"macerata.it":true,"mantova.it":true,"massa-carrara.it":true,"massacarrara.it":true,"matera.it":true,"mb.it":true,"mc.it":true,"
me.it":true,"medio-campidano.it":true,"mediocampidano.it":true,"messina.it":true,"mi.it":true,"milan.it":true,"milano.it":true,"mn.it":true,"mo.it":true,"modena.it":true,"monza-brianza.it":true,"monza-e-della-brianza.it":true,"monza.it":true,"monzabrianza.it":true,"monzaebrianza.it":true,"monzaedellabrianza.it":true,"ms.it":true,"mt.it":true,"na.it":true,"naples.it":true,"napoli.it":true,"no.it":true,"novara.it":true,"nu.it":true,"nuoro.it":true,"og.it":true,"ogliastra.it":true,"olbia-tempio.it":true,"olbiatempio.it":true,"or.it":true,"oristano.it":true,"ot.it":true,"pa.it":true,"padova.it":true,"padua.it":true,"palermo.it":true,"parma.it":true,"pavia.it":true,"pc.it":true,"pd.it":true,"pe.it":true,"perugia.it":true,"pesaro-urbino.it":true,"pesarourbino.it":true,"pescara.it":true,"pg.it":true,"pi.it":true,"piacenza.it":true,"pisa.it":true,"pistoia.it":true,"pn.it":true,"po.it":true,"pordenone.it":true,"potenza.it":true,"pr.it":true,"prato.it":true,"pt.it":true,"pu.it":true,"pv.it":true,"pz.it":true,"ra.it":true,"ragusa.it":true,"ravenna.it":true,"rc.it":true,"re.it":true,"reggio-calabria.it":true,"reggio-emilia.it":true,"reggiocalabria.it":true,"reggioemilia.it":true,"rg.it":true,"ri.it":true,"rieti.it":true,"rimini.it":true,"rm.it":true,"rn.it":true,"ro.it":true,"roma.it":true,"rome.it":true,"rovigo.it":true,"sa.it":true,"salerno.it":true,"sassari.it":true,"savona.it":true,"si.it":true,"siena.it":true,"siracusa.it":true,"so.it":true,"sondrio.it":true,"sp.it":true,"sr.it":true,"ss.it":true,"suedtirol.it":true,"sv.it":true,"ta.it":true,"taranto.it":true,"te.it":true,"tempio-olbia.it":true,"tempioolbia.it":true,"teramo.it":true,"terni.it":true,"tn.it":true,"to.it":true,"torino.it":true,"tp.it":true,"tr.it":true,"trani-andria-barletta.it":true,"trani-barletta-andria.it":true,"traniandriabarletta.it":true,"tranibarlettaandria.it":true,"trapani.it":true,"trentino.it":true,"trento.it":true,"treviso.it":true,"trieste.it":true,"ts.it":true,"turin.it":true,"tv.it":true,"ud.it":true,"udine.it":true,"urbino-pesaro.it":true,"urbinopesaro.it":true,"va.it":true,"varese.it":true,"vb.it":true,"vc.it":true,"ve.it":true,"venezia.it":true,"venice.it":true,"verbania.it":true,"vercelli.it":true,"verona.it":true,"vi.it":true,"vibo-valentia.it":true,"vibovalentia.it":true,"vicenza.it":true,"viterbo.it":true,"vr.it":true,"vs.it":true,"vt.it":true,"vv.it":true,"je":true,"co.je":true,"net.je":true,"org.je":true,"*.jm":true,"jo":true,"com.jo":true,"org.jo":true,"net.jo":true,"edu.jo":true,"sch.jo":true,"gov.jo":true,"mil.jo":true,"name.jo":true,"jobs":true,"jp":true,"ac.jp":true,"ad.jp":true,"co.jp":true,"ed.jp":true,"go.jp":true,"gr.jp":true,"lg.jp":true,"ne.jp":true,"or.jp":true,"aichi.jp":true,"akita.jp":true,"aomori.jp":true,"chiba.jp":true,"ehime.jp":true,"fukui.jp":true,"fukuoka.jp":true,"fukushima.jp":true,"gifu.jp":true,"gunma.jp":true,"hiroshima.jp":true,"hokkaido.jp":true,"hyogo.jp":true,"ibaraki.jp":true,"ishikawa.jp":true,"iwate.jp":true,"kagawa.jp":true,"kagoshima.jp":true,"kanagawa.jp":true,"kochi.jp":true,"kumamoto.jp":true,"kyoto.jp":true,"mie.jp":true,"miyagi.jp":true,"miyazaki.jp":true,"nagano.jp":true,"nagasaki.jp":true,"nara.jp":true,"niigata.jp":true,"oita.jp":true,"okayama.jp":true,"okinawa.jp":true,"osaka.jp":true,"saga.jp":true,"saitama.jp":true,"shiga.jp":true,"shimane.jp":true,"shizuoka.jp":true,"tochigi.jp":true,"tokushima.jp":true,"tokyo.jp":true,"tottori.jp":true,"toyama.jp":true,"wakayama.jp":true,"yamagata.jp":true,"yamaguchi.jp":true,"yamanashi.jp":true,"xn--4pvxs.jp":true,"xn--vgu402c
.jp":true,"xn--c3s14m.jp":true,"xn--f6qx53a.jp":true,"xn--8pvr4u.jp":true,"xn--uist22h.jp":true,"xn--djrs72d6uy.jp":true,"xn--mkru45i.jp":true,"xn--0trq7p7nn.jp":true,"xn--8ltr62k.jp":true,"xn--2m4a15e.jp":true,"xn--efvn9s.jp":true,"xn--32vp30h.jp":true,"xn--4it797k.jp":true,"xn--1lqs71d.jp":true,"xn--5rtp49c.jp":true,"xn--5js045d.jp":true,"xn--ehqz56n.jp":true,"xn--1lqs03n.jp":true,"xn--qqqt11m.jp":true,"xn--kbrq7o.jp":true,"xn--pssu33l.jp":true,"xn--ntsq17g.jp":true,"xn--uisz3g.jp":true,"xn--6btw5a.jp":true,"xn--1ctwo.jp":true,"xn--6orx2r.jp":true,"xn--rht61e.jp":true,"xn--rht27z.jp":true,"xn--djty4k.jp":true,"xn--nit225k.jp":true,"xn--rht3d.jp":true,"xn--klty5x.jp":true,"xn--kltx9a.jp":true,"xn--kltp7d.jp":true,"xn--uuwu58a.jp":true,"xn--zbx025d.jp":true,"xn--ntso0iqx3a.jp":true,"xn--elqq16h.jp":true,"xn--4it168d.jp":true,"xn--klt787d.jp":true,"xn--rny31h.jp":true,"xn--7t0a264c.jp":true,"xn--5rtq34k.jp":true,"xn--k7yn95e.jp":true,"xn--tor131o.jp":true,"xn--d5qv7z876c.jp":true,"*.kawasaki.jp":true,"*.kitakyushu.jp":true,"*.kobe.jp":true,"*.nagoya.jp":true,"*.sapporo.jp":true,"*.sendai.jp":true,"*.yokohama.jp":true,"city.kawasaki.jp":false,"city.kitakyushu.jp":false,"city.kobe.jp":false,"city.nagoya.jp":false,"city.sapporo.jp":false,"city.sendai.jp":false,"city.yokohama.jp":false,"aisai.aichi.jp":true,"ama.aichi.jp":true,"anjo.aichi.jp":true,"asuke.aichi.jp":true,"chiryu.aichi.jp":true,"chita.aichi.jp":true,"fuso.aichi.jp":true,"gamagori.aichi.jp":true,"handa.aichi.jp":true,"hazu.aichi.jp":true,"hekinan.aichi.jp":true,"higashiura.aichi.jp":true,"ichinomiya.aichi.jp":true,"inazawa.aichi.jp":true,"inuyama.aichi.jp":true,"isshiki.aichi.jp":true,"iwakura.aichi.jp":true,"kanie.aichi.jp":true,"kariya.aichi.jp":true,"kasugai.aichi.jp":true,"kira.aichi.jp":true,"kiyosu.aichi.jp":true,"komaki.aichi.jp":true,"konan.aichi.jp":true,"kota.aichi.jp":true,"mihama.aichi.jp":true,"miyoshi.aichi.jp":true,"nishio.aichi.jp":true,"nisshin.aichi.jp":true,"obu.aichi.jp":true,"oguchi.aichi.jp":true,"oharu.aichi.jp":true,"okazaki.aichi.jp":true,"owariasahi.aichi.jp":true,"seto.aichi.jp":true,"shikatsu.aichi.jp":true,"shinshiro.aichi.jp":true,"shitara.aichi.jp":true,"tahara.aichi.jp":true,"takahama.aichi.jp":true,"tobishima.aichi.jp":true,"toei.aichi.jp":true,"togo.aichi.jp":true,"tokai.aichi.jp":true,"tokoname.aichi.jp":true,"toyoake.aichi.jp":true,"toyohashi.aichi.jp":true,"toyokawa.aichi.jp":true,"toyone.aichi.jp":true,"toyota.aichi.jp":true,"tsushima.aichi.jp":true,"yatomi.aichi.jp":true,"akita.akita.jp":true,"daisen.akita.jp":true,"fujisato.akita.jp":true,"gojome.akita.jp":true,"hachirogata.akita.jp":true,"happou.akita.jp":true,"higashinaruse.akita.jp":true,"honjo.akita.jp":true,"honjyo.akita.jp":true,"ikawa.akita.jp":true,"kamikoani.akita.jp":true,"kamioka.akita.jp":true,"katagami.akita.jp":true,"kazuno.akita.jp":true,"kitaakita.akita.jp":true,"kosaka.akita.jp":true,"kyowa.akita.jp":true,"misato.akita.jp":true,"mitane.akita.jp":true,"moriyoshi.akita.jp":true,"nikaho.akita.jp":true,"noshiro.akita.jp":true,"odate.akita.jp":true,"oga.akita.jp":true,"ogata.akita.jp":true,"semboku.akita.jp":true,"yokote.akita.jp":true,"yurihonjo.akita.jp":true,"aomori.aomori.jp":true,"gonohe.aomori.jp":true,"hachinohe.aomori.jp":true,"hashikami.aomori.jp":true,"hiranai.aomori.jp":true,"hirosaki.aomori.jp":true,"itayanagi.aomori.jp":true,"kuroishi.aomori.jp":true,"misawa.aomori.jp":true,"mutsu.aomori.jp":true,"nakadomari.aomori.jp":true,"noheji.aomori.jp":true,"oirase.aomori.jp":true,"owani.aomori.jp":true,"rokunohe.aomori.jp":true
,"sannohe.aomori.jp":true,"shichinohe.aomori.jp":true,"shingo.aomori.jp":true,"takko.aomori.jp":true,"towada.aomori.jp":true,"tsugaru.aomori.jp":true,"tsuruta.aomori.jp":true,"abiko.chiba.jp":true,"asahi.chiba.jp":true,"chonan.chiba.jp":true,"chosei.chiba.jp":true,"choshi.chiba.jp":true,"chuo.chiba.jp":true,"funabashi.chiba.jp":true,"futtsu.chiba.jp":true,"hanamigawa.chiba.jp":true,"ichihara.chiba.jp":true,"ichikawa.chiba.jp":true,"ichinomiya.chiba.jp":true,"inzai.chiba.jp":true,"isumi.chiba.jp":true,"kamagaya.chiba.jp":true,"kamogawa.chiba.jp":true,"kashiwa.chiba.jp":true,"katori.chiba.jp":true,"katsuura.chiba.jp":true,"kimitsu.chiba.jp":true,"kisarazu.chiba.jp":true,"kozaki.chiba.jp":true,"kujukuri.chiba.jp":true,"kyonan.chiba.jp":true,"matsudo.chiba.jp":true,"midori.chiba.jp":true,"mihama.chiba.jp":true,"minamiboso.chiba.jp":true,"mobara.chiba.jp":true,"mutsuzawa.chiba.jp":true,"nagara.chiba.jp":true,"nagareyama.chiba.jp":true,"narashino.chiba.jp":true,"narita.chiba.jp":true,"noda.chiba.jp":true,"oamishirasato.chiba.jp":true,"omigawa.chiba.jp":true,"onjuku.chiba.jp":true,"otaki.chiba.jp":true,"sakae.chiba.jp":true,"sakura.chiba.jp":true,"shimofusa.chiba.jp":true,"shirako.chiba.jp":true,"shiroi.chiba.jp":true,"shisui.chiba.jp":true,"sodegaura.chiba.jp":true,"sosa.chiba.jp":true,"tako.chiba.jp":true,"tateyama.chiba.jp":true,"togane.chiba.jp":true,"tohnosho.chiba.jp":true,"tomisato.chiba.jp":true,"urayasu.chiba.jp":true,"yachimata.chiba.jp":true,"yachiyo.chiba.jp":true,"yokaichiba.chiba.jp":true,"yokoshibahikari.chiba.jp":true,"yotsukaido.chiba.jp":true,"ainan.ehime.jp":true,"honai.ehime.jp":true,"ikata.ehime.jp":true,"imabari.ehime.jp":true,"iyo.ehime.jp":true,"kamijima.ehime.jp":true,"kihoku.ehime.jp":true,"kumakogen.ehime.jp":true,"masaki.ehime.jp":true,"matsuno.ehime.jp":true,"matsuyama.ehime.jp":true,"namikata.ehime.jp":true,"niihama.ehime.jp":true,"ozu.ehime.jp":true,"saijo.ehime.jp":true,"seiyo.ehime.jp":true,"shikokuchuo.ehime.jp":true,"tobe.ehime.jp":true,"toon.ehime.jp":true,"uchiko.ehime.jp":true,"uwajima.ehime.jp":true,"yawatahama.ehime.jp":true,"echizen.fukui.jp":true,"eiheiji.fukui.jp":true,"fukui.fukui.jp":true,"ikeda.fukui.jp":true,"katsuyama.fukui.jp":true,"mihama.fukui.jp":true,"minamiechizen.fukui.jp":true,"obama.fukui.jp":true,"ohi.fukui.jp":true,"ono.fukui.jp":true,"sabae.fukui.jp":true,"sakai.fukui.jp":true,"takahama.fukui.jp":true,"tsuruga.fukui.jp":true,"wakasa.fukui.jp":true,"ashiya.fukuoka.jp":true,"buzen.fukuoka.jp":true,"chikugo.fukuoka.jp":true,"chikuho.fukuoka.jp":true,"chikujo.fukuoka.jp":true,"chikushino.fukuoka.jp":true,"chikuzen.fukuoka.jp":true,"chuo.fukuoka.jp":true,"dazaifu.fukuoka.jp":true,"fukuchi.fukuoka.jp":true,"hakata.fukuoka.jp":true,"higashi.fukuoka.jp":true,"hirokawa.fukuoka.jp":true,"hisayama.fukuoka.jp":true,"iizuka.fukuoka.jp":true,"inatsuki.fukuoka.jp":true,"kaho.fukuoka.jp":true,"kasuga.fukuoka.jp":true,"kasuya.fukuoka.jp":true,"kawara.fukuoka.jp":true,"keisen.fukuoka.jp":true,"koga.fukuoka.jp":true,"kurate.fukuoka.jp":true,"kurogi.fukuoka.jp":true,"kurume.fukuoka.jp":true,"minami.fukuoka.jp":true,"miyako.fukuoka.jp":true,"miyama.fukuoka.jp":true,"miyawaka.fukuoka.jp":true,"mizumaki.fukuoka.jp":true,"munakata.fukuoka.jp":true,"nakagawa.fukuoka.jp":true,"nakama.fukuoka.jp":true,"nishi.fukuoka.jp":true,"nogata.fukuoka.jp":true,"ogori.fukuoka.jp":true,"okagaki.fukuoka.jp":true,"okawa.fukuoka.jp":true,"oki.fukuoka.jp":true,"omuta.fukuoka.jp":true,"onga.fukuoka.jp":true,"onojo.fukuoka.jp":true,"oto.fukuoka.jp":true,"saigawa.fukuoka.jp":true,"sas
aguri.fukuoka.jp":true,"shingu.fukuoka.jp":true,"shinyoshitomi.fukuoka.jp":true,"shonai.fukuoka.jp":true,"soeda.fukuoka.jp":true,"sue.fukuoka.jp":true,"tachiarai.fukuoka.jp":true,"tagawa.fukuoka.jp":true,"takata.fukuoka.jp":true,"toho.fukuoka.jp":true,"toyotsu.fukuoka.jp":true,"tsuiki.fukuoka.jp":true,"ukiha.fukuoka.jp":true,"umi.fukuoka.jp":true,"usui.fukuoka.jp":true,"yamada.fukuoka.jp":true,"yame.fukuoka.jp":true,"yanagawa.fukuoka.jp":true,"yukuhashi.fukuoka.jp":true,"aizubange.fukushima.jp":true,"aizumisato.fukushima.jp":true,"aizuwakamatsu.fukushima.jp":true,"asakawa.fukushima.jp":true,"bandai.fukushima.jp":true,"date.fukushima.jp":true,"fukushima.fukushima.jp":true,"furudono.fukushima.jp":true,"futaba.fukushima.jp":true,"hanawa.fukushima.jp":true,"higashi.fukushima.jp":true,"hirata.fukushima.jp":true,"hirono.fukushima.jp":true,"iitate.fukushima.jp":true,"inawashiro.fukushima.jp":true,"ishikawa.fukushima.jp":true,"iwaki.fukushima.jp":true,"izumizaki.fukushima.jp":true,"kagamiishi.fukushima.jp":true,"kaneyama.fukushima.jp":true,"kawamata.fukushima.jp":true,"kitakata.fukushima.jp":true,"kitashiobara.fukushima.jp":true,"koori.fukushima.jp":true,"koriyama.fukushima.jp":true,"kunimi.fukushima.jp":true,"miharu.fukushima.jp":true,"mishima.fukushima.jp":true,"namie.fukushima.jp":true,"nango.fukushima.jp":true,"nishiaizu.fukushima.jp":true,"nishigo.fukushima.jp":true,"okuma.fukushima.jp":true,"omotego.fukushima.jp":true,"ono.fukushima.jp":true,"otama.fukushima.jp":true,"samegawa.fukushima.jp":true,"shimogo.fukushima.jp":true,"shirakawa.fukushima.jp":true,"showa.fukushima.jp":true,"soma.fukushima.jp":true,"sukagawa.fukushima.jp":true,"taishin.fukushima.jp":true,"tamakawa.fukushima.jp":true,"tanagura.fukushima.jp":true,"tenei.fukushima.jp":true,"yabuki.fukushima.jp":true,"yamato.fukushima.jp":true,"yamatsuri.fukushima.jp":true,"yanaizu.fukushima.jp":true,"yugawa.fukushima.jp":true,"anpachi.gifu.jp":true,"ena.gifu.jp":true,"gifu.gifu.jp":true,"ginan.gifu.jp":true,"godo.gifu.jp":true,"gujo.gifu.jp":true,"hashima.gifu.jp":true,"hichiso.gifu.jp":true,"hida.gifu.jp":true,"higashishirakawa.gifu.jp":true,"ibigawa.gifu.jp":true,"ikeda.gifu.jp":true,"kakamigahara.gifu.jp":true,"kani.gifu.jp":true,"kasahara.gifu.jp":true,"kasamatsu.gifu.jp":true,"kawaue.gifu.jp":true,"kitagata.gifu.jp":true,"mino.gifu.jp":true,"minokamo.gifu.jp":true,"mitake.gifu.jp":true,"mizunami.gifu.jp":true,"motosu.gifu.jp":true,"nakatsugawa.gifu.jp":true,"ogaki.gifu.jp":true,"sakahogi.gifu.jp":true,"seki.gifu.jp":true,"sekigahara.gifu.jp":true,"shirakawa.gifu.jp":true,"tajimi.gifu.jp":true,"takayama.gifu.jp":true,"tarui.gifu.jp":true,"toki.gifu.jp":true,"tomika.gifu.jp":true,"wanouchi.gifu.jp":true,"yamagata.gifu.jp":true,"yaotsu.gifu.jp":true,"yoro.gifu.jp":true,"annaka.gunma.jp":true,"chiyoda.gunma.jp":true,"fujioka.gunma.jp":true,"higashiagatsuma.gunma.jp":true,"isesaki.gunma.jp":true,"itakura.gunma.jp":true,"kanna.gunma.jp":true,"kanra.gunma.jp":true,"katashina.gunma.jp":true,"kawaba.gunma.jp":true,"kiryu.gunma.jp":true,"kusatsu.gunma.jp":true,"maebashi.gunma.jp":true,"meiwa.gunma.jp":true,"midori.gunma.jp":true,"minakami.gunma.jp":true,"naganohara.gunma.jp":true,"nakanojo.gunma.jp":true,"nanmoku.gunma.jp":true,"numata.gunma.jp":true,"oizumi.gunma.jp":true,"ora.gunma.jp":true,"ota.gunma.jp":true,"shibukawa.gunma.jp":true,"shimonita.gunma.jp":true,"shinto.gunma.jp":true,"showa.gunma.jp":true,"takasaki.gunma.jp":true,"takayama.gunma.jp":true,"tamamura.gunma.jp":true,"tatebayashi.gunma.jp":true,"tomioka.gunma.jp":true,"tsukiyono.gun
ma.jp":true,"tsumagoi.gunma.jp":true,"ueno.gunma.jp":true,"yoshioka.gunma.jp":true,"asaminami.hiroshima.jp":true,"daiwa.hiroshima.jp":true,"etajima.hiroshima.jp":true,"fuchu.hiroshima.jp":true,"fukuyama.hiroshima.jp":true,"hatsukaichi.hiroshima.jp":true,"higashihiroshima.hiroshima.jp":true,"hongo.hiroshima.jp":true,"jinsekikogen.hiroshima.jp":true,"kaita.hiroshima.jp":true,"kui.hiroshima.jp":true,"kumano.hiroshima.jp":true,"kure.hiroshima.jp":true,"mihara.hiroshima.jp":true,"miyoshi.hiroshima.jp":true,"naka.hiroshima.jp":true,"onomichi.hiroshima.jp":true,"osakikamijima.hiroshima.jp":true,"otake.hiroshima.jp":true,"saka.hiroshima.jp":true,"sera.hiroshima.jp":true,"seranishi.hiroshima.jp":true,"shinichi.hiroshima.jp":true,"shobara.hiroshima.jp":true,"takehara.hiroshima.jp":true,"abashiri.hokkaido.jp":true,"abira.hokkaido.jp":true,"aibetsu.hokkaido.jp":true,"akabira.hokkaido.jp":true,"akkeshi.hokkaido.jp":true,"asahikawa.hokkaido.jp":true,"ashibetsu.hokkaido.jp":true,"ashoro.hokkaido.jp":true,"assabu.hokkaido.jp":true,"atsuma.hokkaido.jp":true,"bibai.hokkaido.jp":true,"biei.hokkaido.jp":true,"bifuka.hokkaido.jp":true,"bihoro.hokkaido.jp":true,"biratori.hokkaido.jp":true,"chippubetsu.hokkaido.jp":true,"chitose.hokkaido.jp":true,"date.hokkaido.jp":true,"ebetsu.hokkaido.jp":true,"embetsu.hokkaido.jp":true,"eniwa.hokkaido.jp":true,"erimo.hokkaido.jp":true,"esan.hokkaido.jp":true,"esashi.hokkaido.jp":true,"fukagawa.hokkaido.jp":true,"fukushima.hokkaido.jp":true,"furano.hokkaido.jp":true,"furubira.hokkaido.jp":true,"haboro.hokkaido.jp":true,"hakodate.hokkaido.jp":true,"hamatonbetsu.hokkaido.jp":true,"hidaka.hokkaido.jp":true,"higashikagura.hokkaido.jp":true,"higashikawa.hokkaido.jp":true,"hiroo.hokkaido.jp":true,"hokuryu.hokkaido.jp":true,"hokuto.hokkaido.jp":true,"honbetsu.hokkaido.jp":true,"horokanai.hokkaido.jp":true,"horonobe.hokkaido.jp":true,"ikeda.hokkaido.jp":true,"imakane.hokkaido.jp":true,"ishikari.hokkaido.jp":true,"iwamizawa.hokkaido.jp":true,"iwanai.hokkaido.jp":true,"kamifurano.hokkaido.jp":true,"kamikawa.hokkaido.jp":true,"kamishihoro.hokkaido.jp":true,"kamisunagawa.hokkaido.jp":true,"kamoenai.hokkaido.jp":true,"kayabe.hokkaido.jp":true,"kembuchi.hokkaido.jp":true,"kikonai.hokkaido.jp":true,"kimobetsu.hokkaido.jp":true,"kitahiroshima.hokkaido.jp":true,"kitami.hokkaido.jp":true,"kiyosato.hokkaido.jp":true,"koshimizu.hokkaido.jp":true,"kunneppu.hokkaido.jp":true,"kuriyama.hokkaido.jp":true,"kuromatsunai.hokkaido.jp":true,"kushiro.hokkaido.jp":true,"kutchan.hokkaido.jp":true,"kyowa.hokkaido.jp":true,"mashike.hokkaido.jp":true,"matsumae.hokkaido.jp":true,"mikasa.hokkaido.jp":true,"minamifurano.hokkaido.jp":true,"mombetsu.hokkaido.jp":true,"moseushi.hokkaido.jp":true,"mukawa.hokkaido.jp":true,"muroran.hokkaido.jp":true,"naie.hokkaido.jp":true,"nakagawa.hokkaido.jp":true,"nakasatsunai.hokkaido.jp":true,"nakatombetsu.hokkaido.jp":true,"nanae.hokkaido.jp":true,"nanporo.hokkaido.jp":true,"nayoro.hokkaido.jp":true,"nemuro.hokkaido.jp":true,"niikappu.hokkaido.jp":true,"niki.hokkaido.jp":true,"nishiokoppe.hokkaido.jp":true,"noboribetsu.hokkaido.jp":true,"numata.hokkaido.jp":true,"obihiro.hokkaido.jp":true,"obira.hokkaido.jp":true,"oketo.hokkaido.jp":true,"okoppe.hokkaido.jp":true,"otaru.hokkaido.jp":true,"otobe.hokkaido.jp":true,"otofuke.hokkaido.jp":true,"otoineppu.hokkaido.jp":true,"oumu.hokkaido.jp":true,"ozora.hokkaido.jp":true,"pippu.hokkaido.jp":true,"rankoshi.hokkaido.jp":true,"rebun.hokkaido.jp":true,"rikubetsu.hokkaido.jp":true,"rishiri.hokkaido.jp":true,"rishirifuji.hokkaido.jp":true,"s
aroma.hokkaido.jp":true,"sarufutsu.hokkaido.jp":true,"shakotan.hokkaido.jp":true,"shari.hokkaido.jp":true,"shibecha.hokkaido.jp":true,"shibetsu.hokkaido.jp":true,"shikabe.hokkaido.jp":true,"shikaoi.hokkaido.jp":true,"shimamaki.hokkaido.jp":true,"shimizu.hokkaido.jp":true,"shimokawa.hokkaido.jp":true,"shinshinotsu.hokkaido.jp":true,"shintoku.hokkaido.jp":true,"shiranuka.hokkaido.jp":true,"shiraoi.hokkaido.jp":true,"shiriuchi.hokkaido.jp":true,"sobetsu.hokkaido.jp":true,"sunagawa.hokkaido.jp":true,"taiki.hokkaido.jp":true,"takasu.hokkaido.jp":true,"takikawa.hokkaido.jp":true,"takinoue.hokkaido.jp":true,"teshikaga.hokkaido.jp":true,"tobetsu.hokkaido.jp":true,"tohma.hokkaido.jp":true,"tomakomai.hokkaido.jp":true,"tomari.hokkaido.jp":true,"toya.hokkaido.jp":true,"toyako.hokkaido.jp":true,"toyotomi.hokkaido.jp":true,"toyoura.hokkaido.jp":true,"tsubetsu.hokkaido.jp":true,"tsukigata.hokkaido.jp":true,"urakawa.hokkaido.jp":true,"urausu.hokkaido.jp":true,"uryu.hokkaido.jp":true,"utashinai.hokkaido.jp":true,"wakkanai.hokkaido.jp":true,"wassamu.hokkaido.jp":true,"yakumo.hokkaido.jp":true,"yoichi.hokkaido.jp":true,"aioi.hyogo.jp":true,"akashi.hyogo.jp":true,"ako.hyogo.jp":true,"amagasaki.hyogo.jp":true,"aogaki.hyogo.jp":true,"asago.hyogo.jp":true,"ashiya.hyogo.jp":true,"awaji.hyogo.jp":true,"fukusaki.hyogo.jp":true,"goshiki.hyogo.jp":true,"harima.hyogo.jp":true,"himeji.hyogo.jp":true,"ichikawa.hyogo.jp":true,"inagawa.hyogo.jp":true,"itami.hyogo.jp":true,"kakogawa.hyogo.jp":true,"kamigori.hyogo.jp":true,"kamikawa.hyogo.jp":true,"kasai.hyogo.jp":true,"kasuga.hyogo.jp":true,"kawanishi.hyogo.jp":true,"miki.hyogo.jp":true,"minamiawaji.hyogo.jp":true,"nishinomiya.hyogo.jp":true,"nishiwaki.hyogo.jp":true,"ono.hyogo.jp":true,"sanda.hyogo.jp":true,"sannan.hyogo.jp":true,"sasayama.hyogo.jp":true,"sayo.hyogo.jp":true,"shingu.hyogo.jp":true,"shinonsen.hyogo.jp":true,"shiso.hyogo.jp":true,"sumoto.hyogo.jp":true,"taishi.hyogo.jp":true,"taka.hyogo.jp":true,"takarazuka.hyogo.jp":true,"takasago.hyogo.jp":true,"takino.hyogo.jp":true,"tamba.hyogo.jp":true,"tatsuno.hyogo.jp":true,"toyooka.hyogo.jp":true,"yabu.hyogo.jp":true,"yashiro.hyogo.jp":true,"yoka.hyogo.jp":true,"yokawa.hyogo.jp":true,"ami.ibaraki.jp":true,"asahi.ibaraki.jp":true,"bando.ibaraki.jp":true,"chikusei.ibaraki.jp":true,"daigo.ibaraki.jp":true,"fujishiro.ibaraki.jp":true,"hitachi.ibaraki.jp":true,"hitachinaka.ibaraki.jp":true,"hitachiomiya.ibaraki.jp":true,"hitachiota.ibaraki.jp":true,"ibaraki.ibaraki.jp":true,"ina.ibaraki.jp":true,"inashiki.ibaraki.jp":true,"itako.ibaraki.jp":true,"iwama.ibaraki.jp":true,"joso.ibaraki.jp":true,"kamisu.ibaraki.jp":true,"kasama.ibaraki.jp":true,"kashima.ibaraki.jp":true,"kasumigaura.ibaraki.jp":true,"koga.ibaraki.jp":true,"miho.ibaraki.jp":true,"mito.ibaraki.jp":true,"moriya.ibaraki.jp":true,"naka.ibaraki.jp":true,"namegata.ibaraki.jp":true,"oarai.ibaraki.jp":true,"ogawa.ibaraki.jp":true,"omitama.ibaraki.jp":true,"ryugasaki.ibaraki.jp":true,"sakai.ibaraki.jp":true,"sakuragawa.ibaraki.jp":true,"shimodate.ibaraki.jp":true,"shimotsuma.ibaraki.jp":true,"shirosato.ibaraki.jp":true,"sowa.ibaraki.jp":true,"suifu.ibaraki.jp":true,"takahagi.ibaraki.jp":true,"tamatsukuri.ibaraki.jp":true,"tokai.ibaraki.jp":true,"tomobe.ibaraki.jp":true,"tone.ibaraki.jp":true,"toride.ibaraki.jp":true,"tsuchiura.ibaraki.jp":true,"tsukuba.ibaraki.jp":true,"uchihara.ibaraki.jp":true,"ushiku.ibaraki.jp":true,"yachiyo.ibaraki.jp":true,"yamagata.ibaraki.jp":true,"yawara.ibaraki.jp":true,"yuki.ibaraki.jp":true,"anamizu.ishikawa.jp":true,"hakui.ishikawa.jp":tr
ue,"hakusan.ishikawa.jp":true,"kaga.ishikawa.jp":true,"kahoku.ishikawa.jp":true,"kanazawa.ishikawa.jp":true,"kawakita.ishikawa.jp":true,"komatsu.ishikawa.jp":true,"nakanoto.ishikawa.jp":true,"nanao.ishikawa.jp":true,"nomi.ishikawa.jp":true,"nonoichi.ishikawa.jp":true,"noto.ishikawa.jp":true,"shika.ishikawa.jp":true,"suzu.ishikawa.jp":true,"tsubata.ishikawa.jp":true,"tsurugi.ishikawa.jp":true,"uchinada.ishikawa.jp":true,"wajima.ishikawa.jp":true,"fudai.iwate.jp":true,"fujisawa.iwate.jp":true,"hanamaki.iwate.jp":true,"hiraizumi.iwate.jp":true,"hirono.iwate.jp":true,"ichinohe.iwate.jp":true,"ichinoseki.iwate.jp":true,"iwaizumi.iwate.jp":true,"iwate.iwate.jp":true,"joboji.iwate.jp":true,"kamaishi.iwate.jp":true,"kanegasaki.iwate.jp":true,"karumai.iwate.jp":true,"kawai.iwate.jp":true,"kitakami.iwate.jp":true,"kuji.iwate.jp":true,"kunohe.iwate.jp":true,"kuzumaki.iwate.jp":true,"miyako.iwate.jp":true,"mizusawa.iwate.jp":true,"morioka.iwate.jp":true,"ninohe.iwate.jp":true,"noda.iwate.jp":true,"ofunato.iwate.jp":true,"oshu.iwate.jp":true,"otsuchi.iwate.jp":true,"rikuzentakata.iwate.jp":true,"shiwa.iwate.jp":true,"shizukuishi.iwate.jp":true,"sumita.iwate.jp":true,"tanohata.iwate.jp":true,"tono.iwate.jp":true,"yahaba.iwate.jp":true,"yamada.iwate.jp":true,"ayagawa.kagawa.jp":true,"higashikagawa.kagawa.jp":true,"kanonji.kagawa.jp":true,"kotohira.kagawa.jp":true,"manno.kagawa.jp":true,"marugame.kagawa.jp":true,"mitoyo.kagawa.jp":true,"naoshima.kagawa.jp":true,"sanuki.kagawa.jp":true,"tadotsu.kagawa.jp":true,"takamatsu.kagawa.jp":true,"tonosho.kagawa.jp":true,"uchinomi.kagawa.jp":true,"utazu.kagawa.jp":true,"zentsuji.kagawa.jp":true,"akune.kagoshima.jp":true,"amami.kagoshima.jp":true,"hioki.kagoshima.jp":true,"isa.kagoshima.jp":true,"isen.kagoshima.jp":true,"izumi.kagoshima.jp":true,"kagoshima.kagoshima.jp":true,"kanoya.kagoshima.jp":true,"kawanabe.kagoshima.jp":true,"kinko.kagoshima.jp":true,"kouyama.kagoshima.jp":true,"makurazaki.kagoshima.jp":true,"matsumoto.kagoshima.jp":true,"minamitane.kagoshima.jp":true,"nakatane.kagoshima.jp":true,"nishinoomote.kagoshima.jp":true,"satsumasendai.kagoshima.jp":true,"soo.kagoshima.jp":true,"tarumizu.kagoshima.jp":true,"yusui.kagoshima.jp":true,"aikawa.kanagawa.jp":true,"atsugi.kanagawa.jp":true,"ayase.kanagawa.jp":true,"chigasaki.kanagawa.jp":true,"ebina.kanagawa.jp":true,"fujisawa.kanagawa.jp":true,"hadano.kanagawa.jp":true,"hakone.kanagawa.jp":true,"hiratsuka.kanagawa.jp":true,"isehara.kanagawa.jp":true,"kaisei.kanagawa.jp":true,"kamakura.kanagawa.jp":true,"kiyokawa.kanagawa.jp":true,"matsuda.kanagawa.jp":true,"minamiashigara.kanagawa.jp":true,"miura.kanagawa.jp":true,"nakai.kanagawa.jp":true,"ninomiya.kanagawa.jp":true,"odawara.kanagawa.jp":true,"oi.kanagawa.jp":true,"oiso.kanagawa.jp":true,"sagamihara.kanagawa.jp":true,"samukawa.kanagawa.jp":true,"tsukui.kanagawa.jp":true,"yamakita.kanagawa.jp":true,"yamato.kanagawa.jp":true,"yokosuka.kanagawa.jp":true,"yugawara.kanagawa.jp":true,"zama.kanagawa.jp":true,"zushi.kanagawa.jp":true,"aki.kochi.jp":true,"geisei.kochi.jp":true,"hidaka.kochi.jp":true,"higashitsuno.kochi.jp":true,"ino.kochi.jp":true,"kagami.kochi.jp":true,"kami.kochi.jp":true,"kitagawa.kochi.jp":true,"kochi.kochi.jp":true,"mihara.kochi.jp":true,"motoyama.kochi.jp":true,"muroto.kochi.jp":true,"nahari.kochi.jp":true,"nakamura.kochi.jp":true,"nankoku.kochi.jp":true,"nishitosa.kochi.jp":true,"niyodogawa.kochi.jp":true,"ochi.kochi.jp":true,"okawa.kochi.jp":true,"otoyo.kochi.jp":true,"otsuki.kochi.jp":true,"sakawa.kochi.jp":true,"sukumo.kochi.jp":true,"susaki.k
ochi.jp":true,"tosa.kochi.jp":true,"tosashimizu.kochi.jp":true,"toyo.kochi.jp":true,"tsuno.kochi.jp":true,"umaji.kochi.jp":true,"yasuda.kochi.jp":true,"yusuhara.kochi.jp":true,"amakusa.kumamoto.jp":true,"arao.kumamoto.jp":true,"aso.kumamoto.jp":true,"choyo.kumamoto.jp":true,"gyokuto.kumamoto.jp":true,"hitoyoshi.kumamoto.jp":true,"kamiamakusa.kumamoto.jp":true,"kashima.kumamoto.jp":true,"kikuchi.kumamoto.jp":true,"kosa.kumamoto.jp":true,"kumamoto.kumamoto.jp":true,"mashiki.kumamoto.jp":true,"mifune.kumamoto.jp":true,"minamata.kumamoto.jp":true,"minamioguni.kumamoto.jp":true,"nagasu.kumamoto.jp":true,"nishihara.kumamoto.jp":true,"oguni.kumamoto.jp":true,"ozu.kumamoto.jp":true,"sumoto.kumamoto.jp":true,"takamori.kumamoto.jp":true,"uki.kumamoto.jp":true,"uto.kumamoto.jp":true,"yamaga.kumamoto.jp":true,"yamato.kumamoto.jp":true,"yatsushiro.kumamoto.jp":true,"ayabe.kyoto.jp":true,"fukuchiyama.kyoto.jp":true,"higashiyama.kyoto.jp":true,"ide.kyoto.jp":true,"ine.kyoto.jp":true,"joyo.kyoto.jp":true,"kameoka.kyoto.jp":true,"kamo.kyoto.jp":true,"kita.kyoto.jp":true,"kizu.kyoto.jp":true,"kumiyama.kyoto.jp":true,"kyotamba.kyoto.jp":true,"kyotanabe.kyoto.jp":true,"kyotango.kyoto.jp":true,"maizuru.kyoto.jp":true,"minami.kyoto.jp":true,"minamiyamashiro.kyoto.jp":true,"miyazu.kyoto.jp":true,"muko.kyoto.jp":true,"nagaokakyo.kyoto.jp":true,"nakagyo.kyoto.jp":true,"nantan.kyoto.jp":true,"oyamazaki.kyoto.jp":true,"sakyo.kyoto.jp":true,"seika.kyoto.jp":true,"tanabe.kyoto.jp":true,"uji.kyoto.jp":true,"ujitawara.kyoto.jp":true,"wazuka.kyoto.jp":true,"yamashina.kyoto.jp":true,"yawata.kyoto.jp":true,"asahi.mie.jp":true,"inabe.mie.jp":true,"ise.mie.jp":true,"kameyama.mie.jp":true,"kawagoe.mie.jp":true,"kiho.mie.jp":true,"kisosaki.mie.jp":true,"kiwa.mie.jp":true,"komono.mie.jp":true,"kumano.mie.jp":true,"kuwana.mie.jp":true,"matsusaka.mie.jp":true,"meiwa.mie.jp":true,"mihama.mie.jp":true,"minamiise.mie.jp":true,"misugi.mie.jp":true,"miyama.mie.jp":true,"nabari.mie.jp":true,"shima.mie.jp":true,"suzuka.mie.jp":true,"tado.mie.jp":true,"taiki.mie.jp":true,"taki.mie.jp":true,"tamaki.mie.jp":true,"toba.mie.jp":true,"tsu.mie.jp":true,"udono.mie.jp":true,"ureshino.mie.jp":true,"watarai.mie.jp":true,"yokkaichi.mie.jp":true,"furukawa.miyagi.jp":true,"higashimatsushima.miyagi.jp":true,"ishinomaki.miyagi.jp":true,"iwanuma.miyagi.jp":true,"kakuda.miyagi.jp":true,"kami.miyagi.jp":true,"kawasaki.miyagi.jp":true,"kesennuma.miyagi.jp":true,"marumori.miyagi.jp":true,"matsushima.miyagi.jp":true,"minamisanriku.miyagi.jp":true,"misato.miyagi.jp":true,"murata.miyagi.jp":true,"natori.miyagi.jp":true,"ogawara.miyagi.jp":true,"ohira.miyagi.jp":true,"onagawa.miyagi.jp":true,"osaki.miyagi.jp":true,"rifu.miyagi.jp":true,"semine.miyagi.jp":true,"shibata.miyagi.jp":true,"shichikashuku.miyagi.jp":true,"shikama.miyagi.jp":true,"shiogama.miyagi.jp":true,"shiroishi.miyagi.jp":true,"tagajo.miyagi.jp":true,"taiwa.miyagi.jp":true,"tome.miyagi.jp":true,"tomiya.miyagi.jp":true,"wakuya.miyagi.jp":true,"watari.miyagi.jp":true,"yamamoto.miyagi.jp":true,"zao.miyagi.jp":true,"aya.miyazaki.jp":true,"ebino.miyazaki.jp":true,"gokase.miyazaki.jp":true,"hyuga.miyazaki.jp":true,"kadogawa.miyazaki.jp":true,"kawaminami.miyazaki.jp":true,"kijo.miyazaki.jp":true,"kitagawa.miyazaki.jp":true,"kitakata.miyazaki.jp":true,"kitaura.miyazaki.jp":true,"kobayashi.miyazaki.jp":true,"kunitomi.miyazaki.jp":true,"kushima.miyazaki.jp":true,"mimata.miyazaki.jp":true,"miyakonojo.miyazaki.jp":true,"miyazaki.miyazaki.jp":true,"morotsuka.miyazaki.jp":true,"nichinan.miyazaki.jp":true,"nishim
era.miyazaki.jp":true,"nobeoka.miyazaki.jp":true,"saito.miyazaki.jp":true,"shiiba.miyazaki.jp":true,"shintomi.miyazaki.jp":true,"takaharu.miyazaki.jp":true,"takanabe.miyazaki.jp":true,"takazaki.miyazaki.jp":true,"tsuno.miyazaki.jp":true,"achi.nagano.jp":true,"agematsu.nagano.jp":true,"anan.nagano.jp":true,"aoki.nagano.jp":true,"asahi.nagano.jp":true,"azumino.nagano.jp":true,"chikuhoku.nagano.jp":true,"chikuma.nagano.jp":true,"chino.nagano.jp":true,"fujimi.nagano.jp":true,"hakuba.nagano.jp":true,"hara.nagano.jp":true,"hiraya.nagano.jp":true,"iida.nagano.jp":true,"iijima.nagano.jp":true,"iiyama.nagano.jp":true,"iizuna.nagano.jp":true,"ikeda.nagano.jp":true,"ikusaka.nagano.jp":true,"ina.nagano.jp":true,"karuizawa.nagano.jp":true,"kawakami.nagano.jp":true,"kiso.nagano.jp":true,"kisofukushima.nagano.jp":true,"kitaaiki.nagano.jp":true,"komagane.nagano.jp":true,"komoro.nagano.jp":true,"matsukawa.nagano.jp":true,"matsumoto.nagano.jp":true,"miasa.nagano.jp":true,"minamiaiki.nagano.jp":true,"minamimaki.nagano.jp":true,"minamiminowa.nagano.jp":true,"minowa.nagano.jp":true,"miyada.nagano.jp":true,"miyota.nagano.jp":true,"mochizuki.nagano.jp":true,"nagano.nagano.jp":true,"nagawa.nagano.jp":true,"nagiso.nagano.jp":true,"nakagawa.nagano.jp":true,"nakano.nagano.jp":true,"nozawaonsen.nagano.jp":true,"obuse.nagano.jp":true,"ogawa.nagano.jp":true,"okaya.nagano.jp":true,"omachi.nagano.jp":true,"omi.nagano.jp":true,"ookuwa.nagano.jp":true,"ooshika.nagano.jp":true,"otaki.nagano.jp":true,"otari.nagano.jp":true,"sakae.nagano.jp":true,"sakaki.nagano.jp":true,"saku.nagano.jp":true,"sakuho.nagano.jp":true,"shimosuwa.nagano.jp":true,"shinanomachi.nagano.jp":true,"shiojiri.nagano.jp":true,"suwa.nagano.jp":true,"suzaka.nagano.jp":true,"takagi.nagano.jp":true,"takamori.nagano.jp":true,"takayama.nagano.jp":true,"tateshina.nagano.jp":true,"tatsuno.nagano.jp":true,"togakushi.nagano.jp":true,"togura.nagano.jp":true,"tomi.nagano.jp":true,"ueda.nagano.jp":true,"wada.nagano.jp":true,"yamagata.nagano.jp":true,"yamanouchi.nagano.jp":true,"yasaka.nagano.jp":true,"yasuoka.nagano.jp":true,"chijiwa.nagasaki.jp":true,"futsu.nagasaki.jp":true,"goto.nagasaki.jp":true,"hasami.nagasaki.jp":true,"hirado.nagasaki.jp":true,"iki.nagasaki.jp":true,"isahaya.nagasaki.jp":true,"kawatana.nagasaki.jp":true,"kuchinotsu.nagasaki.jp":true,"matsuura.nagasaki.jp":true,"nagasaki.nagasaki.jp":true,"obama.nagasaki.jp":true,"omura.nagasaki.jp":true,"oseto.nagasaki.jp":true,"saikai.nagasaki.jp":true,"sasebo.nagasaki.jp":true,"seihi.nagasaki.jp":true,"shimabara.nagasaki.jp":true,"shinkamigoto.nagasaki.jp":true,"togitsu.nagasaki.jp":true,"tsushima.nagasaki.jp":true,"unzen.nagasaki.jp":true,"ando.nara.jp":true,"gose.nara.jp":true,"heguri.nara.jp":true,"higashiyoshino.nara.jp":true,"ikaruga.nara.jp":true,"ikoma.nara.jp":true,"kamikitayama.nara.jp":true,"kanmaki.nara.jp":true,"kashiba.nara.jp":true,"kashihara.nara.jp":true,"katsuragi.nara.jp":true,"kawai.nara.jp":true,"kawakami.nara.jp":true,"kawanishi.nara.jp":true,"koryo.nara.jp":true,"kurotaki.nara.jp":true,"mitsue.nara.jp":true,"miyake.nara.jp":true,"nara.nara.jp":true,"nosegawa.nara.jp":true,"oji.nara.jp":true,"ouda.nara.jp":true,"oyodo.nara.jp":true,"sakurai.nara.jp":true,"sango.nara.jp":true,"shimoichi.nara.jp":true,"shimokitayama.nara.jp":true,"shinjo.nara.jp":true,"soni.nara.jp":true,"takatori.nara.jp":true,"tawaramoto.nara.jp":true,"tenkawa.nara.jp":true,"tenri.nara.jp":true,"uda.nara.jp":true,"yamatokoriyama.nara.jp":true,"yamatotakada.nara.jp":true,"yamazoe.nara.jp":true,"yoshino.nara.jp":true,"aga.ni
igata.jp":true,"agano.niigata.jp":true,"gosen.niigata.jp":true,"itoigawa.niigata.jp":true,"izumozaki.niigata.jp":true,"joetsu.niigata.jp":true,"kamo.niigata.jp":true,"kariwa.niigata.jp":true,"kashiwazaki.niigata.jp":true,"minamiuonuma.niigata.jp":true,"mitsuke.niigata.jp":true,"muika.niigata.jp":true,"murakami.niigata.jp":true,"myoko.niigata.jp":true,"nagaoka.niigata.jp":true,"niigata.niigata.jp":true,"ojiya.niigata.jp":true,"omi.niigata.jp":true,"sado.niigata.jp":true,"sanjo.niigata.jp":true,"seiro.niigata.jp":true,"seirou.niigata.jp":true,"sekikawa.niigata.jp":true,"shibata.niigata.jp":true,"tagami.niigata.jp":true,"tainai.niigata.jp":true,"tochio.niigata.jp":true,"tokamachi.niigata.jp":true,"tsubame.niigata.jp":true,"tsunan.niigata.jp":true,"uonuma.niigata.jp":true,"yahiko.niigata.jp":true,"yoita.niigata.jp":true,"yuzawa.niigata.jp":true,"beppu.oita.jp":true,"bungoono.oita.jp":true,"bungotakada.oita.jp":true,"hasama.oita.jp":true,"hiji.oita.jp":true,"himeshima.oita.jp":true,"hita.oita.jp":true,"kamitsue.oita.jp":true,"kokonoe.oita.jp":true,"kuju.oita.jp":true,"kunisaki.oita.jp":true,"kusu.oita.jp":true,"oita.oita.jp":true,"saiki.oita.jp":true,"taketa.oita.jp":true,"tsukumi.oita.jp":true,"usa.oita.jp":true,"usuki.oita.jp":true,"yufu.oita.jp":true,"akaiwa.okayama.jp":true,"asakuchi.okayama.jp":true,"bizen.okayama.jp":true,"hayashima.okayama.jp":true,"ibara.okayama.jp":true,"kagamino.okayama.jp":true,"kasaoka.okayama.jp":true,"kibichuo.okayama.jp":true,"kumenan.okayama.jp":true,"kurashiki.okayama.jp":true,"maniwa.okayama.jp":true,"misaki.okayama.jp":true,"nagi.okayama.jp":true,"niimi.okayama.jp":true,"nishiawakura.okayama.jp":true,"okayama.okayama.jp":true,"satosho.okayama.jp":true,"setouchi.okayama.jp":true,"shinjo.okayama.jp":true,"shoo.okayama.jp":true,"soja.okayama.jp":true,"takahashi.okayama.jp":true,"tamano.okayama.jp":true,"tsuyama.okayama.jp":true,"wake.okayama.jp":true,"yakage.okayama.jp":true,"aguni.okinawa.jp":true,"ginowan.okinawa.jp":true,"ginoza.okinawa.jp":true,"gushikami.okinawa.jp":true,"haebaru.okinawa.jp":true,"higashi.okinawa.jp":true,"hirara.okinawa.jp":true,"iheya.okinawa.jp":true,"ishigaki.okinawa.jp":true,"ishikawa.okinawa.jp":true,"itoman.okinawa.jp":true,"izena.okinawa.jp":true,"kadena.okinawa.jp":true,"kin.okinawa.jp":true,"kitadaito.okinawa.jp":true,"kitanakagusuku.okinawa.jp":true,"kumejima.okinawa.jp":true,"kunigami.okinawa.jp":true,"minamidaito.okinawa.jp":true,"motobu.okinawa.jp":true,"nago.okinawa.jp":true,"naha.okinawa.jp":true,"nakagusuku.okinawa.jp":true,"nakijin.okinawa.jp":true,"nanjo.okinawa.jp":true,"nishihara.okinawa.jp":true,"ogimi.okinawa.jp":true,"okinawa.okinawa.jp":true,"onna.okinawa.jp":true,"shimoji.okinawa.jp":true,"taketomi.okinawa.jp":true,"tarama.okinawa.jp":true,"tokashiki.okinawa.jp":true,"tomigusuku.okinawa.jp":true,"tonaki.okinawa.jp":true,"urasoe.okinawa.jp":true,"uruma.okinawa.jp":true,"yaese.okinawa.jp":true,"yomitan.okinawa.jp":true,"yonabaru.okinawa.jp":true,"yonaguni.okinawa.jp":true,"zamami.okinawa.jp":true,"abeno.osaka.jp":true,"chihayaakasaka.osaka.jp":true,"chuo.osaka.jp":true,"daito.osaka.jp":true,"fujiidera.osaka.jp":true,"habikino.osaka.jp":true,"hannan.osaka.jp":true,"higashiosaka.osaka.jp":true,"higashisumiyoshi.osaka.jp":true,"higashiyodogawa.osaka.jp":true,"hirakata.osaka.jp":true,"ibaraki.osaka.jp":true,"ikeda.osaka.jp":true,"izumi.osaka.jp":true,"izumiotsu.osaka.jp":true,"izumisano.osaka.jp":true,"kadoma.osaka.jp":true,"kaizuka.osaka.jp":true,"kanan.osaka.jp":true,"kashiwara.osaka.jp":true,"katano.osaka.jp":true,"kaw
achinagano.osaka.jp":true,"kishiwada.osaka.jp":true,"kita.osaka.jp":true,"kumatori.osaka.jp":true,"matsubara.osaka.jp":true,"minato.osaka.jp":true,"minoh.osaka.jp":true,"misaki.osaka.jp":true,"moriguchi.osaka.jp":true,"neyagawa.osaka.jp":true,"nishi.osaka.jp":true,"nose.osaka.jp":true,"osakasayama.osaka.jp":true,"sakai.osaka.jp":true,"sayama.osaka.jp":true,"sennan.osaka.jp":true,"settsu.osaka.jp":true,"shijonawate.osaka.jp":true,"shimamoto.osaka.jp":true,"suita.osaka.jp":true,"tadaoka.osaka.jp":true,"taishi.osaka.jp":true,"tajiri.osaka.jp":true,"takaishi.osaka.jp":true,"takatsuki.osaka.jp":true,"tondabayashi.osaka.jp":true,"toyonaka.osaka.jp":true,"toyono.osaka.jp":true,"yao.osaka.jp":true,"ariake.saga.jp":true,"arita.saga.jp":true,"fukudomi.saga.jp":true,"genkai.saga.jp":true,"hamatama.saga.jp":true,"hizen.saga.jp":true,"imari.saga.jp":true,"kamimine.saga.jp":true,"kanzaki.saga.jp":true,"karatsu.saga.jp":true,"kashima.saga.jp":true,"kitagata.saga.jp":true,"kitahata.saga.jp":true,"kiyama.saga.jp":true,"kouhoku.saga.jp":true,"kyuragi.saga.jp":true,"nishiarita.saga.jp":true,"ogi.saga.jp":true,"omachi.saga.jp":true,"ouchi.saga.jp":true,"saga.saga.jp":true,"shiroishi.saga.jp":true,"taku.saga.jp":true,"tara.saga.jp":true,"tosu.saga.jp":true,"yoshinogari.saga.jp":true,"arakawa.saitama.jp":true,"asaka.saitama.jp":true,"chichibu.saitama.jp":true,"fujimi.saitama.jp":true,"fujimino.saitama.jp":true,"fukaya.saitama.jp":true,"hanno.saitama.jp":true,"hanyu.saitama.jp":true,"hasuda.saitama.jp":true,"hatogaya.saitama.jp":true,"hatoyama.saitama.jp":true,"hidaka.saitama.jp":true,"higashichichibu.saitama.jp":true,"higashimatsuyama.saitama.jp":true,"honjo.saitama.jp":true,"ina.saitama.jp":true,"iruma.saitama.jp":true,"iwatsuki.saitama.jp":true,"kamiizumi.saitama.jp":true,"kamikawa.saitama.jp":true,"kamisato.saitama.jp":true,"kasukabe.saitama.jp":true,"kawagoe.saitama.jp":true,"kawaguchi.saitama.jp":true,"kawajima.saitama.jp":true,"kazo.saitama.jp":true,"kitamoto.saitama.jp":true,"koshigaya.saitama.jp":true,"kounosu.saitama.jp":true,"kuki.saitama.jp":true,"kumagaya.saitama.jp":true,"matsubushi.saitama.jp":true,"minano.saitama.jp":true,"misato.saitama.jp":true,"miyashiro.saitama.jp":true,"miyoshi.saitama.jp":true,"moroyama.saitama.jp":true,"nagatoro.saitama.jp":true,"namegawa.saitama.jp":true,"niiza.saitama.jp":true,"ogano.saitama.jp":true,"ogawa.saitama.jp":true,"ogose.saitama.jp":true,"okegawa.saitama.jp":true,"omiya.saitama.jp":true,"otaki.saitama.jp":true,"ranzan.saitama.jp":true,"ryokami.saitama.jp":true,"saitama.saitama.jp":true,"sakado.saitama.jp":true,"satte.saitama.jp":true,"sayama.saitama.jp":true,"shiki.saitama.jp":true,"shiraoka.saitama.jp":true,"soka.saitama.jp":true,"sugito.saitama.jp":true,"toda.saitama.jp":true,"tokigawa.saitama.jp":true,"tokorozawa.saitama.jp":true,"tsurugashima.saitama.jp":true,"urawa.saitama.jp":true,"warabi.saitama.jp":true,"yashio.saitama.jp":true,"yokoze.saitama.jp":true,"yono.saitama.jp":true,"yorii.saitama.jp":true,"yoshida.saitama.jp":true,"yoshikawa.saitama.jp":true,"yoshimi.saitama.jp":true,"aisho.shiga.jp":true,"gamo.shiga.jp":true,"higashiomi.shiga.jp":true,"hikone.shiga.jp":true,"koka.shiga.jp":true,"konan.shiga.jp":true,"kosei.shiga.jp":true,"koto.shiga.jp":true,"kusatsu.shiga.jp":true,"maibara.shiga.jp":true,"moriyama.shiga.jp":true,"nagahama.shiga.jp":true,"nishiazai.shiga.jp":true,"notogawa.shiga.jp":true,"omihachiman.shiga.jp":true,"otsu.shiga.jp":true,"ritto.shiga.jp":true,"ryuoh.shiga.jp":true,"takashima.shiga.jp":true,"takatsuki.shiga.jp":true,"torahime.shi
ga.jp":true,"toyosato.shiga.jp":true,"yasu.shiga.jp":true,"akagi.shimane.jp":true,"ama.shimane.jp":true,"gotsu.shimane.jp":true,"hamada.shimane.jp":true,"higashiizumo.shimane.jp":true,"hikawa.shimane.jp":true,"hikimi.shimane.jp":true,"izumo.shimane.jp":true,"kakinoki.shimane.jp":true,"masuda.shimane.jp":true,"matsue.shimane.jp":true,"misato.shimane.jp":true,"nishinoshima.shimane.jp":true,"ohda.shimane.jp":true,"okinoshima.shimane.jp":true,"okuizumo.shimane.jp":true,"shimane.shimane.jp":true,"tamayu.shimane.jp":true,"tsuwano.shimane.jp":true,"unnan.shimane.jp":true,"yakumo.shimane.jp":true,"yasugi.shimane.jp":true,"yatsuka.shimane.jp":true,"arai.shizuoka.jp":true,"atami.shizuoka.jp":true,"fuji.shizuoka.jp":true,"fujieda.shizuoka.jp":true,"fujikawa.shizuoka.jp":true,"fujinomiya.shizuoka.jp":true,"fukuroi.shizuoka.jp":true,"gotemba.shizuoka.jp":true,"haibara.shizuoka.jp":true,"hamamatsu.shizuoka.jp":true,"higashiizu.shizuoka.jp":true,"ito.shizuoka.jp":true,"iwata.shizuoka.jp":true,"izu.shizuoka.jp":true,"izunokuni.shizuoka.jp":true,"kakegawa.shizuoka.jp":true,"kannami.shizuoka.jp":true,"kawanehon.shizuoka.jp":true,"kawazu.shizuoka.jp":true,"kikugawa.shizuoka.jp":true,"kosai.shizuoka.jp":true,"makinohara.shizuoka.jp":true,"matsuzaki.shizuoka.jp":true,"minamiizu.shizuoka.jp":true,"mishima.shizuoka.jp":true,"morimachi.shizuoka.jp":true,"nishiizu.shizuoka.jp":true,"numazu.shizuoka.jp":true,"omaezaki.shizuoka.jp":true,"shimada.shizuoka.jp":true,"shimizu.shizuoka.jp":true,"shimoda.shizuoka.jp":true,"shizuoka.shizuoka.jp":true,"susono.shizuoka.jp":true,"yaizu.shizuoka.jp":true,"yoshida.shizuoka.jp":true,"ashikaga.tochigi.jp":true,"bato.tochigi.jp":true,"haga.tochigi.jp":true,"ichikai.tochigi.jp":true,"iwafune.tochigi.jp":true,"kaminokawa.tochigi.jp":true,"kanuma.tochigi.jp":true,"karasuyama.tochigi.jp":true,"kuroiso.tochigi.jp":true,"mashiko.tochigi.jp":true,"mibu.tochigi.jp":true,"moka.tochigi.jp":true,"motegi.tochigi.jp":true,"nasu.tochigi.jp":true,"nasushiobara.tochigi.jp":true,"nikko.tochigi.jp":true,"nishikata.tochigi.jp":true,"nogi.tochigi.jp":true,"ohira.tochigi.jp":true,"ohtawara.tochigi.jp":true,"oyama.tochigi.jp":true,"sakura.tochigi.jp":true,"sano.tochigi.jp":true,"shimotsuke.tochigi.jp":true,"shioya.tochigi.jp":true,"takanezawa.tochigi.jp":true,"tochigi.tochigi.jp":true,"tsuga.tochigi.jp":true,"ujiie.tochigi.jp":true,"utsunomiya.tochigi.jp":true,"yaita.tochigi.jp":true,"aizumi.tokushima.jp":true,"anan.tokushima.jp":true,"ichiba.tokushima.jp":true,"itano.tokushima.jp":true,"kainan.tokushima.jp":true,"komatsushima.tokushima.jp":true,"matsushige.tokushima.jp":true,"mima.tokushima.jp":true,"minami.tokushima.jp":true,"miyoshi.tokushima.jp":true,"mugi.tokushima.jp":true,"nakagawa.tokushima.jp":true,"naruto.tokushima.jp":true,"sanagochi.tokushima.jp":true,"shishikui.tokushima.jp":true,"tokushima.tokushima.jp":true,"wajiki.tokushima.jp":true,"adachi.tokyo.jp":true,"akiruno.tokyo.jp":true,"akishima.tokyo.jp":true,"aogashima.tokyo.jp":true,"arakawa.tokyo.jp":true,"bunkyo.tokyo.jp":true,"chiyoda.tokyo.jp":true,"chofu.tokyo.jp":true,"chuo.tokyo.jp":true,"edogawa.tokyo.jp":true,"fuchu.tokyo.jp":true,"fussa.tokyo.jp":true,"hachijo.tokyo.jp":true,"hachioji.tokyo.jp":true,"hamura.tokyo.jp":true,"higashikurume.tokyo.jp":true,"higashimurayama.tokyo.jp":true,"higashiyamato.tokyo.jp":true,"hino.tokyo.jp":true,"hinode.tokyo.jp":true,"hinohara.tokyo.jp":true,"inagi.tokyo.jp":true,"itabashi.tokyo.jp":true,"katsushika.tokyo.jp":true,"kita.tokyo.jp":true,"kiyose.tokyo.jp":true,"kodaira.tokyo.jp":true,"koganei.tok
yo.jp":true,"kokubunji.tokyo.jp":true,"komae.tokyo.jp":true,"koto.tokyo.jp":true,"kouzushima.tokyo.jp":true,"kunitachi.tokyo.jp":true,"machida.tokyo.jp":true,"meguro.tokyo.jp":true,"minato.tokyo.jp":true,"mitaka.tokyo.jp":true,"mizuho.tokyo.jp":true,"musashimurayama.tokyo.jp":true,"musashino.tokyo.jp":true,"nakano.tokyo.jp":true,"nerima.tokyo.jp":true,"ogasawara.tokyo.jp":true,"okutama.tokyo.jp":true,"ome.tokyo.jp":true,"oshima.tokyo.jp":true,"ota.tokyo.jp":true,"setagaya.tokyo.jp":true,"shibuya.tokyo.jp":true,"shinagawa.tokyo.jp":true,"shinjuku.tokyo.jp":true,"suginami.tokyo.jp":true,"sumida.tokyo.jp":true,"tachikawa.tokyo.jp":true,"taito.tokyo.jp":true,"tama.tokyo.jp":true,"toshima.tokyo.jp":true,"chizu.tottori.jp":true,"hino.tottori.jp":true,"kawahara.tottori.jp":true,"koge.tottori.jp":true,"kotoura.tottori.jp":true,"misasa.tottori.jp":true,"nanbu.tottori.jp":true,"nichinan.tottori.jp":true,"sakaiminato.tottori.jp":true,"tottori.tottori.jp":true,"wakasa.tottori.jp":true,"yazu.tottori.jp":true,"yonago.tottori.jp":true,"asahi.toyama.jp":true,"fuchu.toyama.jp":true,"fukumitsu.toyama.jp":true,"funahashi.toyama.jp":true,"himi.toyama.jp":true,"imizu.toyama.jp":true,"inami.toyama.jp":true,"johana.toyama.jp":true,"kamiichi.toyama.jp":true,"kurobe.toyama.jp":true,"nakaniikawa.toyama.jp":true,"namerikawa.toyama.jp":true,"nanto.toyama.jp":true,"nyuzen.toyama.jp":true,"oyabe.toyama.jp":true,"taira.toyama.jp":true,"takaoka.toyama.jp":true,"tateyama.toyama.jp":true,"toga.toyama.jp":true,"tonami.toyama.jp":true,"toyama.toyama.jp":true,"unazuki.toyama.jp":true,"uozu.toyama.jp":true,"yamada.toyama.jp":true,"arida.wakayama.jp":true,"aridagawa.wakayama.jp":true,"gobo.wakayama.jp":true,"hashimoto.wakayama.jp":true,"hidaka.wakayama.jp":true,"hirogawa.wakayama.jp":true,"inami.wakayama.jp":true,"iwade.wakayama.jp":true,"kainan.wakayama.jp":true,"kamitonda.wakayama.jp":true,"katsuragi.wakayama.jp":true,"kimino.wakayama.jp":true,"kinokawa.wakayama.jp":true,"kitayama.wakayama.jp":true,"koya.wakayama.jp":true,"koza.wakayama.jp":true,"kozagawa.wakayama.jp":true,"kudoyama.wakayama.jp":true,"kushimoto.wakayama.jp":true,"mihama.wakayama.jp":true,"misato.wakayama.jp":true,"nachikatsuura.wakayama.jp":true,"shingu.wakayama.jp":true,"shirahama.wakayama.jp":true,"taiji.wakayama.jp":true,"tanabe.wakayama.jp":true,"wakayama.wakayama.jp":true,"yuasa.wakayama.jp":true,"yura.wakayama.jp":true,"asahi.yamagata.jp":true,"funagata.yamagata.jp":true,"higashine.yamagata.jp":true,"iide.yamagata.jp":true,"kahoku.yamagata.jp":true,"kaminoyama.yamagata.jp":true,"kaneyama.yamagata.jp":true,"kawanishi.yamagata.jp":true,"mamurogawa.yamagata.jp":true,"mikawa.yamagata.jp":true,"murayama.yamagata.jp":true,"nagai.yamagata.jp":true,"nakayama.yamagata.jp":true,"nanyo.yamagata.jp":true,"nishikawa.yamagata.jp":true,"obanazawa.yamagata.jp":true,"oe.yamagata.jp":true,"oguni.yamagata.jp":true,"ohkura.yamagata.jp":true,"oishida.yamagata.jp":true,"sagae.yamagata.jp":true,"sakata.yamagata.jp":true,"sakegawa.yamagata.jp":true,"shinjo.yamagata.jp":true,"shirataka.yamagata.jp":true,"shonai.yamagata.jp":true,"takahata.yamagata.jp":true,"tendo.yamagata.jp":true,"tozawa.yamagata.jp":true,"tsuruoka.yamagata.jp":true,"yamagata.yamagata.jp":true,"yamanobe.yamagata.jp":true,"yonezawa.yamagata.jp":true,"yuza.yamagata.jp":true,"abu.yamaguchi.jp":true,"hagi.yamaguchi.jp":true,"hikari.yamaguchi.jp":true,"hofu.yamaguchi.jp":true,"iwakuni.yamaguchi.jp":true,"kudamatsu.yamaguchi.jp":true,"mitou.yamaguchi.jp":true,"nagato.yamaguchi.jp":true,"oshima.yamaguchi.jp":true,"shi
monoseki.yamaguchi.jp":true,"shunan.yamaguchi.jp":true,"tabuse.yamaguchi.jp":true,"tokuyama.yamaguchi.jp":true,"toyota.yamaguchi.jp":true,"ube.yamaguchi.jp":true,"yuu.yamaguchi.jp":true,"chuo.yamanashi.jp":true,"doshi.yamanashi.jp":true,"fuefuki.yamanashi.jp":true,"fujikawa.yamanashi.jp":true,"fujikawaguchiko.yamanashi.jp":true,"fujiyoshida.yamanashi.jp":true,"hayakawa.yamanashi.jp":true,"hokuto.yamanashi.jp":true,"ichikawamisato.yamanashi.jp":true,"kai.yamanashi.jp":true,"kofu.yamanashi.jp":true,"koshu.yamanashi.jp":true,"kosuge.yamanashi.jp":true,"minami-alps.yamanashi.jp":true,"minobu.yamanashi.jp":true,"nakamichi.yamanashi.jp":true,"nanbu.yamanashi.jp":true,"narusawa.yamanashi.jp":true,"nirasaki.yamanashi.jp":true,"nishikatsura.yamanashi.jp":true,"oshino.yamanashi.jp":true,"otsuki.yamanashi.jp":true,"showa.yamanashi.jp":true,"tabayama.yamanashi.jp":true,"tsuru.yamanashi.jp":true,"uenohara.yamanashi.jp":true,"yamanakako.yamanashi.jp":true,"yamanashi.yamanashi.jp":true,"*.ke":true,"kg":true,"org.kg":true,"net.kg":true,"com.kg":true,"edu.kg":true,"gov.kg":true,"mil.kg":true,"*.kh":true,"ki":true,"edu.ki":true,"biz.ki":true,"net.ki":true,"org.ki":true,"gov.ki":true,"info.ki":true,"com.ki":true,"km":true,"org.km":true,"nom.km":true,"gov.km":true,"prd.km":true,"tm.km":true,"edu.km":true,"mil.km":true,"ass.km":true,"com.km":true,"coop.km":true,"asso.km":true,"presse.km":true,"medecin.km":true,"notaires.km":true,"pharmaciens.km":true,"veterinaire.km":true,"gouv.km":true,"kn":true,"net.kn":true,"org.kn":true,"edu.kn":true,"gov.kn":true,"kp":true,"com.kp":true,"edu.kp":true,"gov.kp":true,"org.kp":true,"rep.kp":true,"tra.kp":true,"kr":true,"ac.kr":true,"co.kr":true,"es.kr":true,"go.kr":true,"hs.kr":true,"kg.kr":true,"mil.kr":true,"ms.kr":true,"ne.kr":true,"or.kr":true,"pe.kr":true,"re.kr":true,"sc.kr":true,"busan.kr":true,"chungbuk.kr":true,"chungnam.kr":true,"daegu.kr":true,"daejeon.kr":true,"gangwon.kr":true,"gwangju.kr":true,"gyeongbuk.kr":true,"gyeonggi.kr":true,"gyeongnam.kr":true,"incheon.kr":true,"jeju.kr":true,"jeonbuk.kr":true,"jeonnam.kr":true,"seoul.kr":true,"ulsan.kr":true,"*.kw":true,"ky":true,"edu.ky":true,"gov.ky":true,"com.ky":true,"org.ky":true,"net.ky":true,"kz":true,"org.kz":true,"edu.kz":true,"net.kz":true,"gov.kz":true,"mil.kz":true,"com.kz":true,"la":true,"int.la":true,"net.la":true,"info.la":true,"edu.la":true,"gov.la":true,"per.la":true,"com.la":true,"org.la":true,"lb":true,"com.lb":true,"edu.lb":true,"gov.lb":true,"net.lb":true,"org.lb":true,"lc":true,"com.lc":true,"net.lc":true,"co.lc":true,"org.lc":true,"edu.lc":true,"gov.lc":true,"li":true,"lk":true,"gov.lk":true,"sch.lk":true,"net.lk":true,"int.lk":true,"com.lk":true,"org.lk":true,"edu.lk":true,"ngo.lk":true,"soc.lk":true,"web.lk":true,"ltd.lk":true,"assn.lk":true,"grp.lk":true,"hotel.lk":true,"ac.lk":true,"lr":true,"com.lr":true,"edu.lr":true,"gov.lr":true,"org.lr":true,"net.lr":true,"ls":true,"co.ls":true,"org.ls":true,"lt":true,"gov.lt":true,"lu":true,"lv":true,"com.lv":true,"edu.lv":true,"gov.lv":true,"org.lv":true,"mil.lv":true,"id.lv":true,"net.lv":true,"asn.lv":true,"conf.lv":true,"ly":true,"com.ly":true,"net.ly":true,"gov.ly":true,"plc.ly":true,"edu.ly":true,"sch.ly":true,"med.ly":true,"org.ly":true,"id.ly":true,"ma":true,"co.ma":true,"net.ma":true,"gov.ma":true,"org.ma":true,"ac.ma":true,"press.ma":true,"mc":true,"tm.mc":true,"asso.mc":true,"md":true,"me":true,"co.me":true,"net.me":true,"org.me":true,"edu.me":true,"ac.me":true,"gov.me":true,"its.me":true,"priv.me":true,"mg":true,"org.mg":true,"nom.mg":true,"go
v.mg":true,"prd.mg":true,"tm.mg":true,"edu.mg":true,"mil.mg":true,"com.mg":true,"co.mg":true,"mh":true,"mil":true,"mk":true,"com.mk":true,"org.mk":true,"net.mk":true,"edu.mk":true,"gov.mk":true,"inf.mk":true,"name.mk":true,"ml":true,"com.ml":true,"edu.ml":true,"gouv.ml":true,"gov.ml":true,"net.ml":true,"org.ml":true,"presse.ml":true,"*.mm":true,"mn":true,"gov.mn":true,"edu.mn":true,"org.mn":true,"mo":true,"com.mo":true,"net.mo":true,"org.mo":true,"edu.mo":true,"gov.mo":true,"mobi":true,"mp":true,"mq":true,"mr":true,"gov.mr":true,"ms":true,"com.ms":true,"edu.ms":true,"gov.ms":true,"net.ms":true,"org.ms":true,"mt":true,"com.mt":true,"edu.mt":true,"net.mt":true,"org.mt":true,"mu":true,"com.mu":true,"net.mu":true,"org.mu":true,"gov.mu":true,"ac.mu":true,"co.mu":true,"or.mu":true,"museum":true,"academy.museum":true,"agriculture.museum":true,"air.museum":true,"airguard.museum":true,"alabama.museum":true,"alaska.museum":true,"amber.museum":true,"ambulance.museum":true,"american.museum":true,"americana.museum":true,"americanantiques.museum":true,"americanart.museum":true,"amsterdam.museum":true,"and.museum":true,"annefrank.museum":true,"anthro.museum":true,"anthropology.museum":true,"antiques.museum":true,"aquarium.museum":true,"arboretum.museum":true,"archaeological.museum":true,"archaeology.museum":true,"architecture.museum":true,"art.museum":true,"artanddesign.museum":true,"artcenter.museum":true,"artdeco.museum":true,"arteducation.museum":true,"artgallery.museum":true,"arts.museum":true,"artsandcrafts.museum":true,"asmatart.museum":true,"assassination.museum":true,"assisi.museum":true,"association.museum":true,"astronomy.museum":true,"atlanta.museum":true,"austin.museum":true,"australia.museum":true,"automotive.museum":true,"aviation.museum":true,"axis.museum":true,"badajoz.museum":true,"baghdad.museum":true,"bahn.museum":true,"bale.museum":true,"baltimore.museum":true,"barcelona.museum":true,"baseball.museum":true,"basel.museum":true,"baths.museum":true,"bauern.museum":true,"beauxarts.museum":true,"beeldengeluid.museum":true,"bellevue.museum":true,"bergbau.museum":true,"berkeley.museum":true,"berlin.museum":true,"bern.museum":true,"bible.museum":true,"bilbao.museum":true,"bill.museum":true,"birdart.museum":true,"birthplace.museum":true,"bonn.museum":true,"boston.museum":true,"botanical.museum":true,"botanicalgarden.museum":true,"botanicgarden.museum":true,"botany.museum":true,"brandywinevalley.museum":true,"brasil.museum":true,"bristol.museum":true,"british.museum":true,"britishcolumbia.museum":true,"broadcast.museum":true,"brunel.museum":true,"brussel.museum":true,"brussels.museum":true,"bruxelles.museum":true,"building.museum":true,"burghof.museum":true,"bus.museum":true,"bushey.museum":true,"cadaques.museum":true,"california.museum":true,"cambridge.museum":true,"can.museum":true,"canada.museum":true,"capebreton.museum":true,"carrier.museum":true,"cartoonart.museum":true,"casadelamoneda.museum":true,"castle.museum":true,"castres.museum":true,"celtic.museum":true,"center.museum":true,"chattanooga.museum":true,"cheltenham.museum":true,"chesapeakebay.museum":true,"chicago.museum":true,"children.museum":true,"childrens.museum":true,"childrensgarden.museum":true,"chiropractic.museum":true,"chocolate.museum":true,"christiansburg.museum":true,"cincinnati.museum":true,"cinema.museum":true,"circus.museum":true,"civilisation.museum":true,"civilization.museum":true,"civilwar.museum":true,"clinton.museum":true,"clock.museum":true,"coal.museum":true,"coastaldefence.museum":true,"cody.museum":true,"coldwa
r.museum":true,"collection.museum":true,"colonialwilliamsburg.museum":true,"coloradoplateau.museum":true,"columbia.museum":true,"columbus.museum":true,"communication.museum":true,"communications.museum":true,"community.museum":true,"computer.museum":true,"computerhistory.museum":true,"xn--comunicaes-v6a2o.museum":true,"contemporary.museum":true,"contemporaryart.museum":true,"convent.museum":true,"copenhagen.museum":true,"corporation.museum":true,"xn--correios-e-telecomunicaes-ghc29a.museum":true,"corvette.museum":true,"costume.museum":true,"countryestate.museum":true,"county.museum":true,"crafts.museum":true,"cranbrook.museum":true,"creation.museum":true,"cultural.museum":true,"culturalcenter.museum":true,"culture.museum":true,"cyber.museum":true,"cymru.museum":true,"dali.museum":true,"dallas.museum":true,"database.museum":true,"ddr.museum":true,"decorativearts.museum":true,"delaware.museum":true,"delmenhorst.museum":true,"denmark.museum":true,"depot.museum":true,"design.museum":true,"detroit.museum":true,"dinosaur.museum":true,"discovery.museum":true,"dolls.museum":true,"donostia.museum":true,"durham.museum":true,"eastafrica.museum":true,"eastcoast.museum":true,"education.museum":true,"educational.museum":true,"egyptian.museum":true,"eisenbahn.museum":true,"elburg.museum":true,"elvendrell.museum":true,"embroidery.museum":true,"encyclopedic.museum":true,"england.museum":true,"entomology.museum":true,"environment.museum":true,"environmentalconservation.museum":true,"epilepsy.museum":true,"essex.museum":true,"estate.museum":true,"ethnology.museum":true,"exeter.museum":true,"exhibition.museum":true,"family.museum":true,"farm.museum":true,"farmequipment.museum":true,"farmers.museum":true,"farmstead.museum":true,"field.museum":true,"figueres.museum":true,"filatelia.museum":true,"film.museum":true,"fineart.museum":true,"finearts.museum":true,"finland.museum":true,"flanders.museum":true,"florida.museum":true,"force.museum":true,"fortmissoula.museum":true,"fortworth.museum":true,"foundation.museum":true,"francaise.museum":true,"frankfurt.museum":true,"franziskaner.museum":true,"freemasonry.museum":true,"freiburg.museum":true,"fribourg.museum":true,"frog.museum":true,"fundacio.museum":true,"furniture.museum":true,"gallery.museum":true,"garden.museum":true,"gateway.museum":true,"geelvinck.museum":true,"gemological.museum":true,"geology.museum":true,"georgia.museum":true,"giessen.museum":true,"glas.museum":true,"glass.museum":true,"gorge.museum":true,"grandrapids.museum":true,"graz.museum":true,"guernsey.museum":true,"halloffame.museum":true,"hamburg.museum":true,"handson.museum":true,"harvestcelebration.museum":true,"hawaii.museum":true,"health.museum":true,"heimatunduhren.museum":true,"hellas.museum":true,"helsinki.museum":true,"hembygdsforbund.museum":true,"heritage.museum":true,"histoire.museum":true,"historical.museum":true,"historicalsociety.museum":true,"historichouses.museum":true,"historisch.museum":true,"historisches.museum":true,"history.museum":true,"historyofscience.museum":true,"horology.museum":true,"house.museum":true,"humanities.museum":true,"illustration.museum":true,"imageandsound.museum":true,"indian.museum":true,"indiana.museum":true,"indianapolis.museum":true,"indianmarket.museum":true,"intelligence.museum":true,"interactive.museum":true,"iraq.museum":true,"iron.museum":true,"isleofman.museum":true,"jamison.museum":true,"jefferson.museum":true,"jerusalem.museum":true,"jewelry.museum":true,"jewish.museum":true,"jewishart.museum":true,"jfk.museum":true,"journalism.museum":true,"jud
aica.museum":true,"judygarland.museum":true,"juedisches.museum":true,"juif.museum":true,"karate.museum":true,"karikatur.museum":true,"kids.museum":true,"koebenhavn.museum":true,"koeln.museum":true,"kunst.museum":true,"kunstsammlung.museum":true,"kunstunddesign.museum":true,"labor.museum":true,"labour.museum":true,"lajolla.museum":true,"lancashire.museum":true,"landes.museum":true,"lans.museum":true,"xn--lns-qla.museum":true,"larsson.museum":true,"lewismiller.museum":true,"lincoln.museum":true,"linz.museum":true,"living.museum":true,"livinghistory.museum":true,"localhistory.museum":true,"london.museum":true,"losangeles.museum":true,"louvre.museum":true,"loyalist.museum":true,"lucerne.museum":true,"luxembourg.museum":true,"luzern.museum":true,"mad.museum":true,"madrid.museum":true,"mallorca.museum":true,"manchester.museum":true,"mansion.museum":true,"mansions.museum":true,"manx.museum":true,"marburg.museum":true,"maritime.museum":true,"maritimo.museum":true,"maryland.museum":true,"marylhurst.museum":true,"media.museum":true,"medical.museum":true,"medizinhistorisches.museum":true,"meeres.museum":true,"memorial.museum":true,"mesaverde.museum":true,"michigan.museum":true,"midatlantic.museum":true,"military.museum":true,"mill.museum":true,"miners.museum":true,"mining.museum":true,"minnesota.museum":true,"missile.museum":true,"missoula.museum":true,"modern.museum":true,"moma.museum":true,"money.museum":true,"monmouth.museum":true,"monticello.museum":true,"montreal.museum":true,"moscow.museum":true,"motorcycle.museum":true,"muenchen.museum":true,"muenster.museum":true,"mulhouse.museum":true,"muncie.museum":true,"museet.museum":true,"museumcenter.museum":true,"museumvereniging.museum":true,"music.museum":true,"national.museum":true,"nationalfirearms.museum":true,"nationalheritage.museum":true,"nativeamerican.museum":true,"naturalhistory.museum":true,"naturalhistorymuseum.museum":true,"naturalsciences.museum":true,"nature.museum":true,"naturhistorisches.museum":true,"natuurwetenschappen.museum":true,"naumburg.museum":true,"naval.museum":true,"nebraska.museum":true,"neues.museum":true,"newhampshire.museum":true,"newjersey.museum":true,"newmexico.museum":true,"newport.museum":true,"newspaper.museum":true,"newyork.museum":true,"niepce.museum":true,"norfolk.museum":true,"north.museum":true,"nrw.museum":true,"nuernberg.museum":true,"nuremberg.museum":true,"nyc.museum":true,"nyny.museum":true,"oceanographic.museum":true,"oceanographique.museum":true,"omaha.museum":true,"online.museum":true,"ontario.museum":true,"openair.museum":true,"oregon.museum":true,"oregontrail.museum":true,"otago.museum":true,"oxford.museum":true,"pacific.museum":true,"paderborn.museum":true,"palace.museum":true,"paleo.museum":true,"palmsprings.museum":true,"panama.museum":true,"paris.museum":true,"pasadena.museum":true,"pharmacy.museum":true,"philadelphia.museum":true,"philadelphiaarea.museum":true,"philately.museum":true,"phoenix.museum":true,"photography.museum":true,"pilots.museum":true,"pittsburgh.museum":true,"planetarium.museum":true,"plantation.museum":true,"plants.museum":true,"plaza.museum":true,"portal.museum":true,"portland.museum":true,"portlligat.museum":true,"posts-and-telecommunications.museum":true,"preservation.museum":true,"presidio.museum":true,"press.museum":true,"project.museum":true,"public.museum":true,"pubol.museum":true,"quebec.museum":true,"railroad.museum":true,"railway.museum":true,"research.museum":true,"resistance.museum":true,"riodejaneiro.museum":true,"rochester.museum":true,"rockart.museum":true,"rom
a.museum":true,"russia.museum":true,"saintlouis.museum":true,"salem.museum":true,"salvadordali.museum":true,"salzburg.museum":true,"sandiego.museum":true,"sanfrancisco.museum":true,"santabarbara.museum":true,"santacruz.museum":true,"santafe.museum":true,"saskatchewan.museum":true,"satx.museum":true,"savannahga.museum":true,"schlesisches.museum":true,"schoenbrunn.museum":true,"schokoladen.museum":true,"school.museum":true,"schweiz.museum":true,"science.museum":true,"scienceandhistory.museum":true,"scienceandindustry.museum":true,"sciencecenter.museum":true,"sciencecenters.museum":true,"science-fiction.museum":true,"sciencehistory.museum":true,"sciences.museum":true,"sciencesnaturelles.museum":true,"scotland.museum":true,"seaport.museum":true,"settlement.museum":true,"settlers.museum":true,"shell.museum":true,"sherbrooke.museum":true,"sibenik.museum":true,"silk.museum":true,"ski.museum":true,"skole.museum":true,"society.museum":true,"sologne.museum":true,"soundandvision.museum":true,"southcarolina.museum":true,"southwest.museum":true,"space.museum":true,"spy.museum":true,"square.museum":true,"stadt.museum":true,"stalbans.museum":true,"starnberg.museum":true,"state.museum":true,"stateofdelaware.museum":true,"station.museum":true,"steam.museum":true,"steiermark.museum":true,"stjohn.museum":true,"stockholm.museum":true,"stpetersburg.museum":true,"stuttgart.museum":true,"suisse.museum":true,"surgeonshall.museum":true,"surrey.museum":true,"svizzera.museum":true,"sweden.museum":true,"sydney.museum":true,"tank.museum":true,"tcm.museum":true,"technology.museum":true,"telekommunikation.museum":true,"television.museum":true,"texas.museum":true,"textile.museum":true,"theater.museum":true,"time.museum":true,"timekeeping.museum":true,"topology.museum":true,"torino.museum":true,"touch.museum":true,"town.museum":true,"transport.museum":true,"tree.museum":true,"trolley.museum":true,"trust.museum":true,"trustee.museum":true,"uhren.museum":true,"ulm.museum":true,"undersea.museum":true,"university.museum":true,"usa.museum":true,"usantiques.museum":true,"usarts.museum":true,"uscountryestate.museum":true,"usculture.museum":true,"usdecorativearts.museum":true,"usgarden.museum":true,"ushistory.museum":true,"ushuaia.museum":true,"uslivinghistory.museum":true,"utah.museum":true,"uvic.museum":true,"valley.museum":true,"vantaa.museum":true,"versailles.museum":true,"viking.museum":true,"village.museum":true,"virginia.museum":true,"virtual.museum":true,"virtuel.museum":true,"vlaanderen.museum":true,"volkenkunde.museum":true,"wales.museum":true,"wallonie.museum":true,"war.museum":true,"washingtondc.museum":true,"watchandclock.museum":true,"watch-and-clock.museum":true,"western.museum":true,"westfalen.museum":true,"whaling.museum":true,"wildlife.museum":true,"williamsburg.museum":true,"windmill.museum":true,"workshop.museum":true,"york.museum":true,"yorkshire.museum":true,"yosemite.museum":true,"youth.museum":true,"zoological.museum":true,"zoology.museum":true,"xn--9dbhblg6di.museum":true,"xn--h1aegh.museum":true,"mv":true,"aero.mv":true,"biz.mv":true,"com.mv":true,"coop.mv":true,"edu.mv":true,"gov.mv":true,"info.mv":true,"int.mv":true,"mil.mv":true,"museum.mv":true,"name.mv":true,"net.mv":true,"org.mv":true,"pro.mv":true,"mw":true,"ac.mw":true,"biz.mw":true,"co.mw":true,"com.mw":true,"coop.mw":true,"edu.mw":true,"gov.mw":true,"int.mw":true,"museum.mw":true,"net.mw":true,"org.mw":true,"mx":true,"com.mx":true,"org.mx":true,"gob.mx":true,"edu.mx":true,"net.mx":true,"my":true,"com.my":true,"net.my":true,"org.my":true,"gov.my"
:true,"edu.my":true,"mil.my":true,"name.my":true,"*.mz":true,"teledata.mz":false,"na":true,"info.na":true,"pro.na":true,"name.na":true,"school.na":true,"or.na":true,"dr.na":true,"us.na":true,"mx.na":true,"ca.na":true,"in.na":true,"cc.na":true,"tv.na":true,"ws.na":true,"mobi.na":true,"co.na":true,"com.na":true,"org.na":true,"name":true,"nc":true,"asso.nc":true,"ne":true,"net":true,"nf":true,"com.nf":true,"net.nf":true,"per.nf":true,"rec.nf":true,"web.nf":true,"arts.nf":true,"firm.nf":true,"info.nf":true,"other.nf":true,"store.nf":true,"ng":true,"com.ng":true,"edu.ng":true,"name.ng":true,"net.ng":true,"org.ng":true,"sch.ng":true,"gov.ng":true,"mil.ng":true,"mobi.ng":true,"*.ni":true,"nl":true,"bv.nl":true,"no":true,"fhs.no":true,"vgs.no":true,"fylkesbibl.no":true,"folkebibl.no":true,"museum.no":true,"idrett.no":true,"priv.no":true,"mil.no":true,"stat.no":true,"dep.no":true,"kommune.no":true,"herad.no":true,"aa.no":true,"ah.no":true,"bu.no":true,"fm.no":true,"hl.no":true,"hm.no":true,"jan-mayen.no":true,"mr.no":true,"nl.no":true,"nt.no":true,"of.no":true,"ol.no":true,"oslo.no":true,"rl.no":true,"sf.no":true,"st.no":true,"svalbard.no":true,"tm.no":true,"tr.no":true,"va.no":true,"vf.no":true,"gs.aa.no":true,"gs.ah.no":true,"gs.bu.no":true,"gs.fm.no":true,"gs.hl.no":true,"gs.hm.no":true,"gs.jan-mayen.no":true,"gs.mr.no":true,"gs.nl.no":true,"gs.nt.no":true,"gs.of.no":true,"gs.ol.no":true,"gs.oslo.no":true,"gs.rl.no":true,"gs.sf.no":true,"gs.st.no":true,"gs.svalbard.no":true,"gs.tm.no":true,"gs.tr.no":true,"gs.va.no":true,"gs.vf.no":true,"akrehamn.no":true,"xn--krehamn-dxa.no":true,"algard.no":true,"xn--lgrd-poac.no":true,"arna.no":true,"brumunddal.no":true,"bryne.no":true,"bronnoysund.no":true,"xn--brnnysund-m8ac.no":true,"drobak.no":true,"xn--drbak-wua.no":true,"egersund.no":true,"fetsund.no":true,"floro.no":true,"xn--flor-jra.no":true,"fredrikstad.no":true,"hokksund.no":true,"honefoss.no":true,"xn--hnefoss-q1a.no":true,"jessheim.no":true,"jorpeland.no":true,"xn--jrpeland-54a.no":true,"kirkenes.no":true,"kopervik.no":true,"krokstadelva.no":true,"langevag.no":true,"xn--langevg-jxa.no":true,"leirvik.no":true,"mjondalen.no":true,"xn--mjndalen-64a.no":true,"mo-i-rana.no":true,"mosjoen.no":true,"xn--mosjen-eya.no":true,"nesoddtangen.no":true,"orkanger.no":true,"osoyro.no":true,"xn--osyro-wua.no":true,"raholt.no":true,"xn--rholt-mra.no":true,"sandnessjoen.no":true,"xn--sandnessjen-ogb.no":true,"skedsmokorset.no":true,"slattum.no":true,"spjelkavik.no":true,"stathelle.no":true,"stavern.no":true,"stjordalshalsen.no":true,"xn--stjrdalshalsen-sqb.no":true,"tananger.no":true,"tranby.no":true,"vossevangen.no":true,"afjord.no":true,"xn--fjord-lra.no":true,"agdenes.no":true,"al.no":true,"xn--l-1fa.no":true,"alesund.no":true,"xn--lesund-hua.no":true,"alstahaug.no":true,"alta.no":true,"xn--lt-liac.no":true,"alaheadju.no":true,"xn--laheadju-7ya.no":true,"alvdal.no":true,"amli.no":true,"xn--mli-tla.no":true,"amot.no":true,"xn--mot-tla.no":true,"andebu.no":true,"andoy.no":true,"xn--andy-ira.no":true,"andasuolo.no":true,"ardal.no":true,"xn--rdal-poa.no":true,"aremark.no":true,"arendal.no":true,"xn--s-1fa.no":true,"aseral.no":true,"xn--seral-lra.no":true,"asker.no":true,"askim.no":true,"askvoll.no":true,"askoy.no":true,"xn--asky-ira.no":true,"asnes.no":true,"xn--snes-poa.no":true,"audnedaln.no":true,"aukra.no":true,"aure.no":true,"aurland.no":true,"aurskog-holand.no":true,"xn--aurskog-hland-jnb.no":true,"austevoll.no":true,"austrheim.no":true,"averoy.no":true,"xn--avery-yua.no":true,"balestrand.no":true,"ballangen.no
":true,"balat.no":true,"xn--blt-elab.no":true,"balsfjord.no":true,"bahccavuotna.no":true,"xn--bhccavuotna-k7a.no":true,"bamble.no":true,"bardu.no":true,"beardu.no":true,"beiarn.no":true,"bajddar.no":true,"xn--bjddar-pta.no":true,"baidar.no":true,"xn--bidr-5nac.no":true,"berg.no":true,"bergen.no":true,"berlevag.no":true,"xn--berlevg-jxa.no":true,"bearalvahki.no":true,"xn--bearalvhki-y4a.no":true,"bindal.no":true,"birkenes.no":true,"bjarkoy.no":true,"xn--bjarky-fya.no":true,"bjerkreim.no":true,"bjugn.no":true,"bodo.no":true,"xn--bod-2na.no":true,"badaddja.no":true,"xn--bdddj-mrabd.no":true,"budejju.no":true,"bokn.no":true,"bremanger.no":true,"bronnoy.no":true,"xn--brnny-wuac.no":true,"bygland.no":true,"bykle.no":true,"barum.no":true,"xn--brum-voa.no":true,"bo.telemark.no":true,"xn--b-5ga.telemark.no":true,"bo.nordland.no":true,"xn--b-5ga.nordland.no":true,"bievat.no":true,"xn--bievt-0qa.no":true,"bomlo.no":true,"xn--bmlo-gra.no":true,"batsfjord.no":true,"xn--btsfjord-9za.no":true,"bahcavuotna.no":true,"xn--bhcavuotna-s4a.no":true,"dovre.no":true,"drammen.no":true,"drangedal.no":true,"dyroy.no":true,"xn--dyry-ira.no":true,"donna.no":true,"xn--dnna-gra.no":true,"eid.no":true,"eidfjord.no":true,"eidsberg.no":true,"eidskog.no":true,"eidsvoll.no":true,"eigersund.no":true,"elverum.no":true,"enebakk.no":true,"engerdal.no":true,"etne.no":true,"etnedal.no":true,"evenes.no":true,"evenassi.no":true,"xn--eveni-0qa01ga.no":true,"evje-og-hornnes.no":true,"farsund.no":true,"fauske.no":true,"fuossko.no":true,"fuoisku.no":true,"fedje.no":true,"fet.no":true,"finnoy.no":true,"xn--finny-yua.no":true,"fitjar.no":true,"fjaler.no":true,"fjell.no":true,"flakstad.no":true,"flatanger.no":true,"flekkefjord.no":true,"flesberg.no":true,"flora.no":true,"fla.no":true,"xn--fl-zia.no":true,"folldal.no":true,"forsand.no":true,"fosnes.no":true,"frei.no":true,"frogn.no":true,"froland.no":true,"frosta.no":true,"frana.no":true,"xn--frna-woa.no":true,"froya.no":true,"xn--frya-hra.no":true,"fusa.no":true,"fyresdal.no":true,"forde.no":true,"xn--frde-gra.no":true,"gamvik.no":true,"gangaviika.no":true,"xn--ggaviika-8ya47h.no":true,"gaular.no":true,"gausdal.no":true,"gildeskal.no":true,"xn--gildeskl-g0a.no":true,"giske.no":true,"gjemnes.no":true,"gjerdrum.no":true,"gjerstad.no":true,"gjesdal.no":true,"gjovik.no":true,"xn--gjvik-wua.no":true,"gloppen.no":true,"gol.no":true,"gran.no":true,"grane.no":true,"granvin.no":true,"gratangen.no":true,"grimstad.no":true,"grong.no":true,"kraanghke.no":true,"xn--kranghke-b0a.no":true,"grue.no":true,"gulen.no":true,"hadsel.no":true,"halden.no":true,"halsa.no":true,"hamar.no":true,"hamaroy.no":true,"habmer.no":true,"xn--hbmer-xqa.no":true,"hapmir.no":true,"xn--hpmir-xqa.no":true,"hammerfest.no":true,"hammarfeasta.no":true,"xn--hmmrfeasta-s4ac.no":true,"haram.no":true,"hareid.no":true,"harstad.no":true,"hasvik.no":true,"aknoluokta.no":true,"xn--koluokta-7ya57h.no":true,"hattfjelldal.no":true,"aarborte.no":true,"haugesund.no":true,"hemne.no":true,"hemnes.no":true,"hemsedal.no":true,"heroy.more-og-romsdal.no":true,"xn--hery-ira.xn--mre-og-romsdal-qqb.no":true,"heroy.nordland.no":true,"xn--hery-ira.nordland.no":true,"hitra.no":true,"hjartdal.no":true,"hjelmeland.no":true,"hobol.no":true,"xn--hobl-ira.no":true,"hof.no":true,"hol.no":true,"hole.no":true,"holmestrand.no":true,"holtalen.no":true,"xn--holtlen-hxa.no":true,"hornindal.no":true,"horten.no":true,"hurdal.no":true,"hurum.no":true,"hvaler.no":true,"hyllestad.no":true,"hagebostad.no":true,"xn--hgebostad-g3a.no":true,"hoyanger.no":true,"xn--hyanger-q1a
.no":true,"hoylandet.no":true,"xn--hylandet-54a.no":true,"ha.no":true,"xn--h-2fa.no":true,"ibestad.no":true,"inderoy.no":true,"xn--indery-fya.no":true,"iveland.no":true,"jevnaker.no":true,"jondal.no":true,"jolster.no":true,"xn--jlster-bya.no":true,"karasjok.no":true,"karasjohka.no":true,"xn--krjohka-hwab49j.no":true,"karlsoy.no":true,"galsa.no":true,"xn--gls-elac.no":true,"karmoy.no":true,"xn--karmy-yua.no":true,"kautokeino.no":true,"guovdageaidnu.no":true,"klepp.no":true,"klabu.no":true,"xn--klbu-woa.no":true,"kongsberg.no":true,"kongsvinger.no":true,"kragero.no":true,"xn--krager-gya.no":true,"kristiansand.no":true,"kristiansund.no":true,"krodsherad.no":true,"xn--krdsherad-m8a.no":true,"kvalsund.no":true,"rahkkeravju.no":true,"xn--rhkkervju-01af.no":true,"kvam.no":true,"kvinesdal.no":true,"kvinnherad.no":true,"kviteseid.no":true,"kvitsoy.no":true,"xn--kvitsy-fya.no":true,"kvafjord.no":true,"xn--kvfjord-nxa.no":true,"giehtavuoatna.no":true,"kvanangen.no":true,"xn--kvnangen-k0a.no":true,"navuotna.no":true,"xn--nvuotna-hwa.no":true,"kafjord.no":true,"xn--kfjord-iua.no":true,"gaivuotna.no":true,"xn--givuotna-8ya.no":true,"larvik.no":true,"lavangen.no":true,"lavagis.no":true,"loabat.no":true,"xn--loabt-0qa.no":true,"lebesby.no":true,"davvesiida.no":true,"leikanger.no":true,"leirfjord.no":true,"leka.no":true,"leksvik.no":true,"lenvik.no":true,"leangaviika.no":true,"xn--leagaviika-52b.no":true,"lesja.no":true,"levanger.no":true,"lier.no":true,"lierne.no":true,"lillehammer.no":true,"lillesand.no":true,"lindesnes.no":true,"lindas.no":true,"xn--linds-pra.no":true,"lom.no":true,"loppa.no":true,"lahppi.no":true,"xn--lhppi-xqa.no":true,"lund.no":true,"lunner.no":true,"luroy.no":true,"xn--lury-ira.no":true,"luster.no":true,"lyngdal.no":true,"lyngen.no":true,"ivgu.no":true,"lardal.no":true,"lerdal.no":true,"xn--lrdal-sra.no":true,"lodingen.no":true,"xn--ldingen-q1a.no":true,"lorenskog.no":true,"xn--lrenskog-54a.no":true,"loten.no":true,"xn--lten-gra.no":true,"malvik.no":true,"masoy.no":true,"xn--msy-ula0h.no":true,"muosat.no":true,"xn--muost-0qa.no":true,"mandal.no":true,"marker.no":true,"marnardal.no":true,"masfjorden.no":true,"meland.no":true,"meldal.no":true,"melhus.no":true,"meloy.no":true,"xn--mely-ira.no":true,"meraker.no":true,"xn--merker-kua.no":true,"moareke.no":true,"xn--moreke-jua.no":true,"midsund.no":true,"midtre-gauldal.no":true,"modalen.no":true,"modum.no":true,"molde.no":true,"moskenes.no":true,"moss.no":true,"mosvik.no":true,"malselv.no":true,"xn--mlselv-iua.no":true,"malatvuopmi.no":true,"xn--mlatvuopmi-s4a.no":true,"namdalseid.no":true,"aejrie.no":true,"namsos.no":true,"namsskogan.no":true,"naamesjevuemie.no":true,"xn--nmesjevuemie-tcba.no":true,"laakesvuemie.no":true,"nannestad.no":true,"narvik.no":true,"narviika.no":true,"naustdal.no":true,"nedre-eiker.no":true,"nes.akershus.no":true,"nes.buskerud.no":true,"nesna.no":true,"nesodden.no":true,"nesseby.no":true,"unjarga.no":true,"xn--unjrga-rta.no":true,"nesset.no":true,"nissedal.no":true,"nittedal.no":true,"nord-aurdal.no":true,"nord-fron.no":true,"nord-odal.no":true,"norddal.no":true,"nordkapp.no":true,"davvenjarga.no":true,"xn--davvenjrga-y4a.no":true,"nordre-land.no":true,"nordreisa.no":true,"raisa.no":true,"xn--risa-5na.no":true,"nore-og-uvdal.no":true,"notodden.no":true,"naroy.no":true,"xn--nry-yla5g.no":true,"notteroy.no":true,"xn--nttery-byae.no":true,"odda.no":true,"oksnes.no":true,"xn--ksnes-uua.no":true,"oppdal.no":true,"oppegard.no":true,"xn--oppegrd-ixa.no":true,"orkdal.no":true,"orland.no":true,"xn--rland-uua.no":true,"ors
kog.no":true,"xn--rskog-uua.no":true,"orsta.no":true,"xn--rsta-fra.no":true,"os.hedmark.no":true,"os.hordaland.no":true,"osen.no":true,"osteroy.no":true,"xn--ostery-fya.no":true,"ostre-toten.no":true,"xn--stre-toten-zcb.no":true,"overhalla.no":true,"ovre-eiker.no":true,"xn--vre-eiker-k8a.no":true,"oyer.no":true,"xn--yer-zna.no":true,"oygarden.no":true,"xn--ygarden-p1a.no":true,"oystre-slidre.no":true,"xn--ystre-slidre-ujb.no":true,"porsanger.no":true,"porsangu.no":true,"xn--porsgu-sta26f.no":true,"porsgrunn.no":true,"radoy.no":true,"xn--rady-ira.no":true,"rakkestad.no":true,"rana.no":true,"ruovat.no":true,"randaberg.no":true,"rauma.no":true,"rendalen.no":true,"rennebu.no":true,"rennesoy.no":true,"xn--rennesy-v1a.no":true,"rindal.no":true,"ringebu.no":true,"ringerike.no":true,"ringsaker.no":true,"rissa.no":true,"risor.no":true,"xn--risr-ira.no":true,"roan.no":true,"rollag.no":true,"rygge.no":true,"ralingen.no":true,"xn--rlingen-mxa.no":true,"rodoy.no":true,"xn--rdy-0nab.no":true,"romskog.no":true,"xn--rmskog-bya.no":true,"roros.no":true,"xn--rros-gra.no":true,"rost.no":true,"xn--rst-0na.no":true,"royken.no":true,"xn--ryken-vua.no":true,"royrvik.no":true,"xn--ryrvik-bya.no":true,"rade.no":true,"xn--rde-ula.no":true,"salangen.no":true,"siellak.no":true,"saltdal.no":true,"salat.no":true,"xn--slt-elab.no":true,"xn--slat-5na.no":true,"samnanger.no":true,"sande.more-og-romsdal.no":true,"sande.xn--mre-og-romsdal-qqb.no":true,"sande.vestfold.no":true,"sandefjord.no":true,"sandnes.no":true,"sandoy.no":true,"xn--sandy-yua.no":true,"sarpsborg.no":true,"sauda.no":true,"sauherad.no":true,"sel.no":true,"selbu.no":true,"selje.no":true,"seljord.no":true,"sigdal.no":true,"siljan.no":true,"sirdal.no":true,"skaun.no":true,"skedsmo.no":true,"ski.no":true,"skien.no":true,"skiptvet.no":true,"skjervoy.no":true,"xn--skjervy-v1a.no":true,"skierva.no":true,"xn--skierv-uta.no":true,"skjak.no":true,"xn--skjk-soa.no":true,"skodje.no":true,"skanland.no":true,"xn--sknland-fxa.no":true,"skanit.no":true,"xn--sknit-yqa.no":true,"smola.no":true,"xn--smla-hra.no":true,"snillfjord.no":true,"snasa.no":true,"xn--snsa-roa.no":true,"snoasa.no":true,"snaase.no":true,"xn--snase-nra.no":true,"sogndal.no":true,"sokndal.no":true,"sola.no":true,"solund.no":true,"songdalen.no":true,"sortland.no":true,"spydeberg.no":true,"stange.no":true,"stavanger.no":true,"steigen.no":true,"steinkjer.no":true,"stjordal.no":true,"xn--stjrdal-s1a.no":true,"stokke.no":true,"stor-elvdal.no":true,"stord.no":true,"stordal.no":true,"storfjord.no":true,"omasvuotna.no":true,"strand.no":true,"stranda.no":true,"stryn.no":true,"sula.no":true,"suldal.no":true,"sund.no":true,"sunndal.no":true,"surnadal.no":true,"sveio.no":true,"svelvik.no":true,"sykkylven.no":true,"sogne.no":true,"xn--sgne-gra.no":true,"somna.no":true,"xn--smna-gra.no":true,"sondre-land.no":true,"xn--sndre-land-0cb.no":true,"sor-aurdal.no":true,"xn--sr-aurdal-l8a.no":true,"sor-fron.no":true,"xn--sr-fron-q1a.no":true,"sor-odal.no":true,"xn--sr-odal-q1a.no":true,"sor-varanger.no":true,"xn--sr-varanger-ggb.no":true,"matta-varjjat.no":true,"xn--mtta-vrjjat-k7af.no":true,"sorfold.no":true,"xn--srfold-bya.no":true,"sorreisa.no":true,"xn--srreisa-q1a.no":true,"sorum.no":true,"xn--srum-gra.no":true,"tana.no":true,"deatnu.no":true,"time.no":true,"tingvoll.no":true,"tinn.no":true,"tjeldsund.no":true,"dielddanuorri.no":true,"tjome.no":true,"xn--tjme-hra.no":true,"tokke.no":true,"tolga.no":true,"torsken.no":true,"tranoy.no":true,"xn--trany-yua.no":true,"tromso.no":true,"xn--troms-zua.no":true,"tromsa.no":true,"ro
msa.no":true,"trondheim.no":true,"troandin.no":true,"trysil.no":true,"trana.no":true,"xn--trna-woa.no":true,"trogstad.no":true,"xn--trgstad-r1a.no":true,"tvedestrand.no":true,"tydal.no":true,"tynset.no":true,"tysfjord.no":true,"divtasvuodna.no":true,"divttasvuotna.no":true,"tysnes.no":true,"tysvar.no":true,"xn--tysvr-vra.no":true,"tonsberg.no":true,"xn--tnsberg-q1a.no":true,"ullensaker.no":true,"ullensvang.no":true,"ulvik.no":true,"utsira.no":true,"vadso.no":true,"xn--vads-jra.no":true,"cahcesuolo.no":true,"xn--hcesuolo-7ya35b.no":true,"vaksdal.no":true,"valle.no":true,"vang.no":true,"vanylven.no":true,"vardo.no":true,"xn--vard-jra.no":true,"varggat.no":true,"xn--vrggt-xqad.no":true,"vefsn.no":true,"vaapste.no":true,"vega.no":true,"vegarshei.no":true,"xn--vegrshei-c0a.no":true,"vennesla.no":true,"verdal.no":true,"verran.no":true,"vestby.no":true,"vestnes.no":true,"vestre-slidre.no":true,"vestre-toten.no":true,"vestvagoy.no":true,"xn--vestvgy-ixa6o.no":true,"vevelstad.no":true,"vik.no":true,"vikna.no":true,"vindafjord.no":true,"volda.no":true,"voss.no":true,"varoy.no":true,"xn--vry-yla5g.no":true,"vagan.no":true,"xn--vgan-qoa.no":true,"voagat.no":true,"vagsoy.no":true,"xn--vgsy-qoa0j.no":true,"vaga.no":true,"xn--vg-yiab.no":true,"valer.ostfold.no":true,"xn--vler-qoa.xn--stfold-9xa.no":true,"valer.hedmark.no":true,"xn--vler-qoa.hedmark.no":true,"*.np":true,"nr":true,"biz.nr":true,"info.nr":true,"gov.nr":true,"edu.nr":true,"org.nr":true,"net.nr":true,"com.nr":true,"nu":true,"nz":true,"ac.nz":true,"co.nz":true,"cri.nz":true,"geek.nz":true,"gen.nz":true,"govt.nz":true,"health.nz":true,"iwi.nz":true,"kiwi.nz":true,"maori.nz":true,"mil.nz":true,"xn--mori-qsa.nz":true,"net.nz":true,"org.nz":true,"parliament.nz":true,"school.nz":true,"om":true,"co.om":true,"com.om":true,"edu.om":true,"gov.om":true,"med.om":true,"museum.om":true,"net.om":true,"org.om":true,"pro.om":true,"org":true,"pa":true,"ac.pa":true,"gob.pa":true,"com.pa":true,"org.pa":true,"sld.pa":true,"edu.pa":true,"net.pa":true,"ing.pa":true,"abo.pa":true,"med.pa":true,"nom.pa":true,"pe":true,"edu.pe":true,"gob.pe":true,"nom.pe":true,"mil.pe":true,"org.pe":true,"com.pe":true,"net.pe":true,"pf":true,"com.pf":true,"org.pf":true,"edu.pf":true,"*.pg":true,"ph":true,"com.ph":true,"net.ph":true,"org.ph":true,"gov.ph":true,"edu.ph":true,"ngo.ph":true,"mil.ph":true,"i.ph":true,"pk":true,"com.pk":true,"net.pk":true,"edu.pk":true,"org.pk":true,"fam.pk":true,"biz.pk":true,"web.pk":true,"gov.pk":true,"gob.pk":true,"gok.pk":true,"gon.pk":true,"gop.pk":true,"gos.pk":true,"info.pk":true,"pl":true,"com.pl":true,"net.pl":true,"org.pl":true,"aid.pl":true,"agro.pl":true,"atm.pl":true,"auto.pl":true,"biz.pl":true,"edu.pl":true,"gmina.pl":true,"gsm.pl":true,"info.pl":true,"mail.pl":true,"miasta.pl":true,"media.pl":true,"mil.pl":true,"nieruchomosci.pl":true,"nom.pl":true,"pc.pl":true,"powiat.pl":true,"priv.pl":true,"realestate.pl":true,"rel.pl":true,"sex.pl":true,"shop.pl":true,"sklep.pl":true,"sos.pl":true,"szkola.pl":true,"targi.pl":true,"tm.pl":true,"tourism.pl":true,"travel.pl":true,"turystyka.pl":true,"gov.pl":true,"ap.gov.pl":true,"ic.gov.pl":true,"is.gov.pl":true,"us.gov.pl":true,"kmpsp.gov.pl":true,"kppsp.gov.pl":true,"kwpsp.gov.pl":true,"psp.gov.pl":true,"wskr.gov.pl":true,"kwp.gov.pl":true,"mw.gov.pl":true,"ug.gov.pl":true,"um.gov.pl":true,"umig.gov.pl":true,"ugim.gov.pl":true,"upow.gov.pl":true,"uw.gov.pl":true,"starostwo.gov.pl":true,"pa.gov.pl":true,"po.gov.pl":true,"psse.gov.pl":true,"pup.gov.pl":true,"rzgw.gov.pl":true,"sa.gov.pl":true,"so.gov.pl":t
rue,"sr.gov.pl":true,"wsa.gov.pl":true,"sko.gov.pl":true,"uzs.gov.pl":true,"wiih.gov.pl":true,"winb.gov.pl":true,"pinb.gov.pl":true,"wios.gov.pl":true,"witd.gov.pl":true,"wzmiuw.gov.pl":true,"piw.gov.pl":true,"wiw.gov.pl":true,"griw.gov.pl":true,"wif.gov.pl":true,"oum.gov.pl":true,"sdn.gov.pl":true,"zp.gov.pl":true,"uppo.gov.pl":true,"mup.gov.pl":true,"wuoz.gov.pl":true,"konsulat.gov.pl":true,"oirm.gov.pl":true,"augustow.pl":true,"babia-gora.pl":true,"bedzin.pl":true,"beskidy.pl":true,"bialowieza.pl":true,"bialystok.pl":true,"bielawa.pl":true,"bieszczady.pl":true,"boleslawiec.pl":true,"bydgoszcz.pl":true,"bytom.pl":true,"cieszyn.pl":true,"czeladz.pl":true,"czest.pl":true,"dlugoleka.pl":true,"elblag.pl":true,"elk.pl":true,"glogow.pl":true,"gniezno.pl":true,"gorlice.pl":true,"grajewo.pl":true,"ilawa.pl":true,"jaworzno.pl":true,"jelenia-gora.pl":true,"jgora.pl":true,"kalisz.pl":true,"kazimierz-dolny.pl":true,"karpacz.pl":true,"kartuzy.pl":true,"kaszuby.pl":true,"katowice.pl":true,"kepno.pl":true,"ketrzyn.pl":true,"klodzko.pl":true,"kobierzyce.pl":true,"kolobrzeg.pl":true,"konin.pl":true,"konskowola.pl":true,"kutno.pl":true,"lapy.pl":true,"lebork.pl":true,"legnica.pl":true,"lezajsk.pl":true,"limanowa.pl":true,"lomza.pl":true,"lowicz.pl":true,"lubin.pl":true,"lukow.pl":true,"malbork.pl":true,"malopolska.pl":true,"mazowsze.pl":true,"mazury.pl":true,"mielec.pl":true,"mielno.pl":true,"mragowo.pl":true,"naklo.pl":true,"nowaruda.pl":true,"nysa.pl":true,"olawa.pl":true,"olecko.pl":true,"olkusz.pl":true,"olsztyn.pl":true,"opoczno.pl":true,"opole.pl":true,"ostroda.pl":true,"ostroleka.pl":true,"ostrowiec.pl":true,"ostrowwlkp.pl":true,"pila.pl":true,"pisz.pl":true,"podhale.pl":true,"podlasie.pl":true,"polkowice.pl":true,"pomorze.pl":true,"pomorskie.pl":true,"prochowice.pl":true,"pruszkow.pl":true,"przeworsk.pl":true,"pulawy.pl":true,"radom.pl":true,"rawa-maz.pl":true,"rybnik.pl":true,"rzeszow.pl":true,"sanok.pl":true,"sejny.pl":true,"slask.pl":true,"slupsk.pl":true,"sosnowiec.pl":true,"stalowa-wola.pl":true,"skoczow.pl":true,"starachowice.pl":true,"stargard.pl":true,"suwalki.pl":true,"swidnica.pl":true,"swiebodzin.pl":true,"swinoujscie.pl":true,"szczecin.pl":true,"szczytno.pl":true,"tarnobrzeg.pl":true,"tgory.pl":true,"turek.pl":true,"tychy.pl":true,"ustka.pl":true,"walbrzych.pl":true,"warmia.pl":true,"warszawa.pl":true,"waw.pl":true,"wegrow.pl":true,"wielun.pl":true,"wlocl.pl":true,"wloclawek.pl":true,"wodzislaw.pl":true,"wolomin.pl":true,"wroclaw.pl":true,"zachpomor.pl":true,"zagan.pl":true,"zarow.pl":true,"zgora.pl":true,"zgorzelec.pl":true,"pm":true,"pn":true,"gov.pn":true,"co.pn":true,"org.pn":true,"edu.pn":true,"net.pn":true,"post":true,"pr":true,"com.pr":true,"net.pr":true,"org.pr":true,"gov.pr":true,"edu.pr":true,"isla.pr":true,"pro.pr":true,"biz.pr":true,"info.pr":true,"name.pr":true,"est.pr":true,"prof.pr":true,"ac.pr":true,"pro":true,"aca.pro":true,"bar.pro":true,"cpa.pro":true,"jur.pro":true,"law.pro":true,"med.pro":true,"eng.pro":true,"ps":true,"edu.ps":true,"gov.ps":true,"sec.ps":true,"plo.ps":true,"com.ps":true,"org.ps":true,"net.ps":true,"pt":true,"net.pt":true,"gov.pt":true,"org.pt":true,"edu.pt":true,"int.pt":true,"publ.pt":true,"com.pt":true,"nome.pt":true,"pw":true,"co.pw":true,"ne.pw":true,"or.pw":true,"ed.pw":true,"go.pw":true,"belau.pw":true,"py":true,"com.py":true,"coop.py":true,"edu.py":true,"gov.py":true,"mil.py":true,"net.py":true,"org.py":true,"qa":true,"com.qa":true,"edu.qa":true,"gov.qa":true,"mil.qa":true,"name.qa":true,"net.qa":true,"org.qa":true,"sch.qa":true,"re":true,"co
m.re":true,"asso.re":true,"nom.re":true,"ro":true,"com.ro":true,"org.ro":true,"tm.ro":true,"nt.ro":true,"nom.ro":true,"info.ro":true,"rec.ro":true,"arts.ro":true,"firm.ro":true,"store.ro":true,"www.ro":true,"rs":true,"co.rs":true,"org.rs":true,"edu.rs":true,"ac.rs":true,"gov.rs":true,"in.rs":true,"ru":true,"ac.ru":true,"com.ru":true,"edu.ru":true,"int.ru":true,"net.ru":true,"org.ru":true,"pp.ru":true,"adygeya.ru":true,"altai.ru":true,"amur.ru":true,"arkhangelsk.ru":true,"astrakhan.ru":true,"bashkiria.ru":true,"belgorod.ru":true,"bir.ru":true,"bryansk.ru":true,"buryatia.ru":true,"cbg.ru":true,"chel.ru":true,"chelyabinsk.ru":true,"chita.ru":true,"chukotka.ru":true,"chuvashia.ru":true,"dagestan.ru":true,"dudinka.ru":true,"e-burg.ru":true,"grozny.ru":true,"irkutsk.ru":true,"ivanovo.ru":true,"izhevsk.ru":true,"jar.ru":true,"joshkar-ola.ru":true,"kalmykia.ru":true,"kaluga.ru":true,"kamchatka.ru":true,"karelia.ru":true,"kazan.ru":true,"kchr.ru":true,"kemerovo.ru":true,"khabarovsk.ru":true,"khakassia.ru":true,"khv.ru":true,"kirov.ru":true,"koenig.ru":true,"komi.ru":true,"kostroma.ru":true,"krasnoyarsk.ru":true,"kuban.ru":true,"kurgan.ru":true,"kursk.ru":true,"lipetsk.ru":true,"magadan.ru":true,"mari.ru":true,"mari-el.ru":true,"marine.ru":true,"mordovia.ru":true,"msk.ru":true,"murmansk.ru":true,"nalchik.ru":true,"nnov.ru":true,"nov.ru":true,"novosibirsk.ru":true,"nsk.ru":true,"omsk.ru":true,"orenburg.ru":true,"oryol.ru":true,"palana.ru":true,"penza.ru":true,"perm.ru":true,"ptz.ru":true,"rnd.ru":true,"ryazan.ru":true,"sakhalin.ru":true,"samara.ru":true,"saratov.ru":true,"simbirsk.ru":true,"smolensk.ru":true,"spb.ru":true,"stavropol.ru":true,"stv.ru":true,"surgut.ru":true,"tambov.ru":true,"tatarstan.ru":true,"tom.ru":true,"tomsk.ru":true,"tsaritsyn.ru":true,"tsk.ru":true,"tula.ru":true,"tuva.ru":true,"tver.ru":true,"tyumen.ru":true,"udm.ru":true,"udmurtia.ru":true,"ulan-ude.ru":true,"vladikavkaz.ru":true,"vladimir.ru":true,"vladivostok.ru":true,"volgograd.ru":true,"vologda.ru":true,"voronezh.ru":true,"vrn.ru":true,"vyatka.ru":true,"yakutia.ru":true,"yamal.ru":true,"yaroslavl.ru":true,"yekaterinburg.ru":true,"yuzhno-sakhalinsk.ru":true,"amursk.ru":true,"baikal.ru":true,"cmw.ru":true,"fareast.ru":true,"jamal.ru":true,"kms.ru":true,"k-uralsk.ru":true,"kustanai.ru":true,"kuzbass.ru":true,"magnitka.ru":true,"mytis.ru":true,"nakhodka.ru":true,"nkz.ru":true,"norilsk.ru":true,"oskol.ru":true,"pyatigorsk.ru":true,"rubtsovsk.ru":true,"snz.ru":true,"syzran.ru":true,"vdonsk.ru":true,"zgrad.ru":true,"gov.ru":true,"mil.ru":true,"test.ru":true,"rw":true,"gov.rw":true,"net.rw":true,"edu.rw":true,"ac.rw":true,"com.rw":true,"co.rw":true,"int.rw":true,"mil.rw":true,"gouv.rw":true,"sa":true,"com.sa":true,"net.sa":true,"org.sa":true,"gov.sa":true,"med.sa":true,"pub.sa":true,"edu.sa":true,"sch.sa":true,"sb":true,"com.sb":true,"edu.sb":true,"gov.sb":true,"net.sb":true,"org.sb":true,"sc":true,"com.sc":true,"gov.sc":true,"net.sc":true,"org.sc":true,"edu.sc":true,"sd":true,"com.sd":true,"net.sd":true,"org.sd":true,"edu.sd":true,"med.sd":true,"tv.sd":true,"gov.sd":true,"info.sd":true,"se":true,"a.se":true,"ac.se":true,"b.se":true,"bd.se":true,"brand.se":true,"c.se":true,"d.se":true,"e.se":true,"f.se":true,"fh.se":true,"fhsk.se":true,"fhv.se":true,"g.se":true,"h.se":true,"i.se":true,"k.se":true,"komforb.se":true,"kommunalforbund.se":true,"komvux.se":true,"l.se":true,"lanbib.se":true,"m.se":true,"n.se":true,"naturbruksgymn.se":true,"o.se":true,"org.se":true,"p.se":true,"parti.se":true,"pp.se":true,"press.se":true,"r.se":true,"s.s
e":true,"t.se":true,"tm.se":true,"u.se":true,"w.se":true,"x.se":true,"y.se":true,"z.se":true,"sg":true,"com.sg":true,"net.sg":true,"org.sg":true,"gov.sg":true,"edu.sg":true,"per.sg":true,"sh":true,"com.sh":true,"net.sh":true,"gov.sh":true,"org.sh":true,"mil.sh":true,"si":true,"sj":true,"sk":true,"sl":true,"com.sl":true,"net.sl":true,"edu.sl":true,"gov.sl":true,"org.sl":true,"sm":true,"sn":true,"art.sn":true,"com.sn":true,"edu.sn":true,"gouv.sn":true,"org.sn":true,"perso.sn":true,"univ.sn":true,"so":true,"com.so":true,"net.so":true,"org.so":true,"sr":true,"st":true,"co.st":true,"com.st":true,"consulado.st":true,"edu.st":true,"embaixada.st":true,"gov.st":true,"mil.st":true,"net.st":true,"org.st":true,"principe.st":true,"saotome.st":true,"store.st":true,"su":true,"adygeya.su":true,"arkhangelsk.su":true,"balashov.su":true,"bashkiria.su":true,"bryansk.su":true,"dagestan.su":true,"grozny.su":true,"ivanovo.su":true,"kalmykia.su":true,"kaluga.su":true,"karelia.su":true,"khakassia.su":true,"krasnodar.su":true,"kurgan.su":true,"lenug.su":true,"mordovia.su":true,"msk.su":true,"murmansk.su":true,"nalchik.su":true,"nov.su":true,"obninsk.su":true,"penza.su":true,"pokrovsk.su":true,"sochi.su":true,"spb.su":true,"togliatti.su":true,"troitsk.su":true,"tula.su":true,"tuva.su":true,"vladikavkaz.su":true,"vladimir.su":true,"vologda.su":true,"sv":true,"com.sv":true,"edu.sv":true,"gob.sv":true,"org.sv":true,"red.sv":true,"sx":true,"gov.sx":true,"sy":true,"edu.sy":true,"gov.sy":true,"net.sy":true,"mil.sy":true,"com.sy":true,"org.sy":true,"sz":true,"co.sz":true,"ac.sz":true,"org.sz":true,"tc":true,"td":true,"tel":true,"tf":true,"tg":true,"th":true,"ac.th":true,"co.th":true,"go.th":true,"in.th":true,"mi.th":true,"net.th":true,"or.th":true,"tj":true,"ac.tj":true,"biz.tj":true,"co.tj":true,"com.tj":true,"edu.tj":true,"go.tj":true,"gov.tj":true,"int.tj":true,"mil.tj":true,"name.tj":true,"net.tj":true,"nic.tj":true,"org.tj":true,"test.tj":true,"web.tj":true,"tk":true,"tl":true,"gov.tl":true,"tm":true,"com.tm":true,"co.tm":true,"org.tm":true,"net.tm":true,"nom.tm":true,"gov.tm":true,"mil.tm":true,"edu.tm":true,"tn":true,"com.tn":true,"ens.tn":true,"fin.tn":true,"gov.tn":true,"ind.tn":true,"intl.tn":true,"nat.tn":true,"net.tn":true,"org.tn":true,"info.tn":true,"perso.tn":true,"tourism.tn":true,"edunet.tn":true,"rnrt.tn":true,"rns.tn":true,"rnu.tn":true,"mincom.tn":true,"agrinet.tn":true,"defense.tn":true,"turen.tn":true,"to":true,"com.to":true,"gov.to":true,"net.to":true,"org.to":true,"edu.to":true,"mil.to":true,"tp":true,"tr":true,"com.tr":true,"info.tr":true,"biz.tr":true,"net.tr":true,"org.tr":true,"web.tr":true,"gen.tr":true,"tv.tr":true,"av.tr":true,"dr.tr":true,"bbs.tr":true,"name.tr":true,"tel.tr":true,"gov.tr":true,"bel.tr":true,"pol.tr":true,"mil.tr":true,"k12.tr":true,"edu.tr":true,"kep.tr":true,"nc.tr":true,"gov.nc.tr":true,"travel":true,"tt":true,"co.tt":true,"com.tt":true,"org.tt":true,"net.tt":true,"biz.tt":true,"info.tt":true,"pro.tt":true,"int.tt":true,"coop.tt":true,"jobs.tt":true,"mobi.tt":true,"travel.tt":true,"museum.tt":true,"aero.tt":true,"name.tt":true,"gov.tt":true,"edu.tt":true,"tv":true,"tw":true,"edu.tw":true,"gov.tw":true,"mil.tw":true,"com.tw":true,"net.tw":true,"org.tw":true,"idv.tw":true,"game.tw":true,"ebiz.tw":true,"club.tw":true,"xn--zf0ao64a.tw":true,"xn--uc0atv.tw":true,"xn--czrw28b.tw":true,"tz":true,"ac.tz":true,"co.tz":true,"go.tz":true,"hotel.tz":true,"info.tz":true,"me.tz":true,"mil.tz":true,"mobi.tz":true,"ne.tz":true,"or.tz":true,"sc.tz":true,"tv.tz":true,"ua":true,"com.ua":true
,"edu.ua":true,"gov.ua":true,"in.ua":true,"net.ua":true,"org.ua":true,"cherkassy.ua":true,"cherkasy.ua":true,"chernigov.ua":true,"chernihiv.ua":true,"chernivtsi.ua":true,"chernovtsy.ua":true,"ck.ua":true,"cn.ua":true,"cr.ua":true,"crimea.ua":true,"cv.ua":true,"dn.ua":true,"dnepropetrovsk.ua":true,"dnipropetrovsk.ua":true,"dominic.ua":true,"donetsk.ua":true,"dp.ua":true,"if.ua":true,"ivano-frankivsk.ua":true,"kh.ua":true,"kharkiv.ua":true,"kharkov.ua":true,"kherson.ua":true,"khmelnitskiy.ua":true,"khmelnytskyi.ua":true,"kiev.ua":true,"kirovograd.ua":true,"km.ua":true,"kr.ua":true,"krym.ua":true,"ks.ua":true,"kv.ua":true,"kyiv.ua":true,"lg.ua":true,"lt.ua":true,"lugansk.ua":true,"lutsk.ua":true,"lv.ua":true,"lviv.ua":true,"mk.ua":true,"mykolaiv.ua":true,"nikolaev.ua":true,"od.ua":true,"odesa.ua":true,"odessa.ua":true,"pl.ua":true,"poltava.ua":true,"rivne.ua":true,"rovno.ua":true,"rv.ua":true,"sb.ua":true,"sebastopol.ua":true,"sevastopol.ua":true,"sm.ua":true,"sumy.ua":true,"te.ua":true,"ternopil.ua":true,"uz.ua":true,"uzhgorod.ua":true,"vinnica.ua":true,"vinnytsia.ua":true,"vn.ua":true,"volyn.ua":true,"yalta.ua":true,"zaporizhzhe.ua":true,"zaporizhzhia.ua":true,"zhitomir.ua":true,"zhytomyr.ua":true,"zp.ua":true,"zt.ua":true,"ug":true,"co.ug":true,"or.ug":true,"ac.ug":true,"sc.ug":true,"go.ug":true,"ne.ug":true,"com.ug":true,"org.ug":true,"uk":true,"ac.uk":true,"co.uk":true,"gov.uk":true,"ltd.uk":true,"me.uk":true,"net.uk":true,"nhs.uk":true,"org.uk":true,"plc.uk":true,"police.uk":true,"*.sch.uk":true,"us":true,"dni.us":true,"fed.us":true,"isa.us":true,"kids.us":true,"nsn.us":true,"ak.us":true,"al.us":true,"ar.us":true,"as.us":true,"az.us":true,"ca.us":true,"co.us":true,"ct.us":true,"dc.us":true,"de.us":true,"fl.us":true,"ga.us":true,"gu.us":true,"hi.us":true,"ia.us":true,"id.us":true,"il.us":true,"in.us":true,"ks.us":true,"ky.us":true,"la.us":true,"ma.us":true,"md.us":true,"me.us":true,"mi.us":true,"mn.us":true,"mo.us":true,"ms.us":true,"mt.us":true,"nc.us":true,"nd.us":true,"ne.us":true,"nh.us":true,"nj.us":true,"nm.us":true,"nv.us":true,"ny.us":true,"oh.us":true,"ok.us":true,"or.us":true,"pa.us":true,"pr.us":true,"ri.us":true,"sc.us":true,"sd.us":true,"tn.us":true,"tx.us":true,"ut.us":true,"vi.us":true,"vt.us":true,"va.us":true,"wa.us":true,"wi.us":true,"wv.us":true,"wy.us":true,"k12.ak.us":true,"k12.al.us":true,"k12.ar.us":true,"k12.as.us":true,"k12.az.us":true,"k12.ca.us":true,"k12.co.us":true,"k12.ct.us":true,"k12.dc.us":true,"k12.de.us":true,"k12.fl.us":true,"k12.ga.us":true,"k12.gu.us":true,"k12.ia.us":true,"k12.id.us":true,"k12.il.us":true,"k12.in.us":true,"k12.ks.us":true,"k12.ky.us":true,"k12.la.us":true,"k12.ma.us":true,"k12.md.us":true,"k12.me.us":true,"k12.mi.us":true,"k12.mn.us":true,"k12.mo.us":true,"k12.ms.us":true,"k12.mt.us":true,"k12.nc.us":true,"k12.ne.us":true,"k12.nh.us":true,"k12.nj.us":true,"k12.nm.us":true,"k12.nv.us":true,"k12.ny.us":true,"k12.oh.us":true,"k12.ok.us":true,"k12.or.us":true,"k12.pa.us":true,"k12.pr.us":true,"k12.ri.us":true,"k12.sc.us":true,"k12.tn.us":true,"k12.tx.us":true,"k12.ut.us":true,"k12.vi.us":true,"k12.vt.us":true,"k12.va.us":true,"k12.wa.us":true,"k12.wi.us":true,"k12.wy.us":true,"cc.ak.us":true,"cc.al.us":true,"cc.ar.us":true,"cc.as.us":true,"cc.az.us":true,"cc.ca.us":true,"cc.co.us":true,"cc.ct.us":true,"cc.dc.us":true,"cc.de.us":true,"cc.fl.us":true,"cc.ga.us":true,"cc.gu.us":true,"cc.hi.us":true,"cc.ia.us":true,"cc.id.us":true,"cc.il.us":true,"cc.in.us":true,"cc.ks.us":true,"cc.ky.us":true,"cc.la.us":true,"cc.ma.us":true,"cc.md.us":true,
"cc.me.us":true,"cc.mi.us":true,"cc.mn.us":true,"cc.mo.us":true,"cc.ms.us":true,"cc.mt.us":true,"cc.nc.us":true,"cc.nd.us":true,"cc.ne.us":true,"cc.nh.us":true,"cc.nj.us":true,"cc.nm.us":true,"cc.nv.us":true,"cc.ny.us":true,"cc.oh.us":true,"cc.ok.us":true,"cc.or.us":true,"cc.pa.us":true,"cc.pr.us":true,"cc.ri.us":true,"cc.sc.us":true,"cc.sd.us":true,"cc.tn.us":true,"cc.tx.us":true,"cc.ut.us":true,"cc.vi.us":true,"cc.vt.us":true,"cc.va.us":true,"cc.wa.us":true,"cc.wi.us":true,"cc.wv.us":true,"cc.wy.us":true,"lib.ak.us":true,"lib.al.us":true,"lib.ar.us":true,"lib.as.us":true,"lib.az.us":true,"lib.ca.us":true,"lib.co.us":true,"lib.ct.us":true,"lib.dc.us":true,"lib.de.us":true,"lib.fl.us":true,"lib.ga.us":true,"lib.gu.us":true,"lib.hi.us":true,"lib.ia.us":true,"lib.id.us":true,"lib.il.us":true,"lib.in.us":true,"lib.ks.us":true,"lib.ky.us":true,"lib.la.us":true,"lib.ma.us":true,"lib.md.us":true,"lib.me.us":true,"lib.mi.us":true,"lib.mn.us":true,"lib.mo.us":true,"lib.ms.us":true,"lib.mt.us":true,"lib.nc.us":true,"lib.nd.us":true,"lib.ne.us":true,"lib.nh.us":true,"lib.nj.us":true,"lib.nm.us":true,"lib.nv.us":true,"lib.ny.us":true,"lib.oh.us":true,"lib.ok.us":true,"lib.or.us":true,"lib.pa.us":true,"lib.pr.us":true,"lib.ri.us":true,"lib.sc.us":true,"lib.sd.us":true,"lib.tn.us":true,"lib.tx.us":true,"lib.ut.us":true,"lib.vi.us":true,"lib.vt.us":true,"lib.va.us":true,"lib.wa.us":true,"lib.wi.us":true,"lib.wy.us":true,"pvt.k12.ma.us":true,"chtr.k12.ma.us":true,"paroch.k12.ma.us":true,"uy":true,"com.uy":true,"edu.uy":true,"gub.uy":true,"mil.uy":true,"net.uy":true,"org.uy":true,"uz":true,"co.uz":true,"com.uz":true,"net.uz":true,"org.uz":true,"va":true,"vc":true,"com.vc":true,"net.vc":true,"org.vc":true,"gov.vc":true,"mil.vc":true,"edu.vc":true,"ve":true,"arts.ve":true,"co.ve":true,"com.ve":true,"e12.ve":true,"edu.ve":true,"firm.ve":true,"gob.ve":true,"gov.ve":true,"info.ve":true,"int.ve":true,"mil.ve":true,"net.ve":true,"org.ve":true,"rec.ve":true,"store.ve":true,"tec.ve":true,"web.ve":true,"vg":true,"vi":true,"co.vi":true,"com.vi":true,"k12.vi":true,"net.vi":true,"org.vi":true,"vn":true,"com.vn":true,"net.vn":true,"org.vn":true,"edu.vn":true,"gov.vn":true,"int.vn":true,"ac.vn":true,"biz.vn":true,"info.vn":true,"name.vn":true,"pro.vn":true,"health.vn":true,"vu":true,"com.vu":true,"edu.vu":true,"net.vu":true,"org.vu":true,"wf":true,"ws":true,"com.ws":true,"net.ws":true,"org.ws":true,"gov.ws":true,"edu.ws":true,"yt":true,"xn--mgbaam7a8h":true,"xn--y9a3aq":true,"xn--54b7fta0cc":true,"xn--90ais":true,"xn--fiqs8s":true,"xn--fiqz9s":true,"xn--lgbbat1ad8j":true,"xn--wgbh1c":true,"xn--node":true,"xn--qxam":true,"xn--j6w193g":true,"xn--h2brj9c":true,"xn--mgbbh1a71e":true,"xn--fpcrj9c3d":true,"xn--gecrj9c":true,"xn--s9brj9c":true,"xn--45brj9c":true,"xn--xkc2dl3a5ee0h":true,"xn--mgba3a4f16a":true,"xn--mgba3a4fra":true,"xn--mgbtx2b":true,"xn--mgbayh7gpa":true,"xn--3e0b707e":true,"xn--80ao21a":true,"xn--fzc2c9e2c":true,"xn--xkc2al3hye2a":true,"xn--mgbc0a9azcg":true,"xn--d1alf":true,"xn--l1acc":true,"xn--mix891f":true,"xn--mix082f":true,"xn--mgbx4cd0ab":true,"xn--mgb9awbf":true,"xn--mgbai9azgqp6j":true,"xn--mgbai9a5eva00b":true,"xn--ygbi2ammx":true,"xn--90a3ac":true,"xn--o1ac.xn--90a3ac":true,"xn--c1avg.xn--90a3ac":true,"xn--90azh.xn--90a3ac":true,"xn--d1at.xn--90a3ac":true,"xn--o1ach.xn--90a3ac":true,"xn--80au.xn--90a3ac":true,"xn--p1ai":true,"xn--wgbl6a":true,"xn--mgberp4a5d4ar":true,"xn--mgberp4a5d4a87g":true,"xn--mgbqly7c0a67fbc":true,"xn--mgbqly7cvafr":true,"xn--mgbpl2fh":true,"xn--yfro4i67o":true,"xn--clchc0ea0b
2g2a9gcd":true,"xn--ogbpf8fl":true,"xn--mgbtf8fl":true,"xn--o3cw4h":true,"xn--pgbs0dh":true,"xn--kpry57d":true,"xn--kprw13d":true,"xn--nnx388a":true,"xn--j1amh":true,"xn--mgb2ddes":true,"xxx":true,"*.ye":true,"ac.za":true,"agrica.za":true,"alt.za":true,"co.za":true,"edu.za":true,"gov.za":true,"grondar.za":true,"law.za":true,"mil.za":true,"net.za":true,"ngo.za":true,"nis.za":true,"nom.za":true,"org.za":true,"school.za":true,"tm.za":true,"web.za":true,"*.zm":true,"*.zw":true,"aaa":true,"aarp":true,"abarth":true,"abb":true,"abbott":true,"abbvie":true,"abc":true,"able":true,"abogado":true,"abudhabi":true,"academy":true,"accenture":true,"accountant":true,"accountants":true,"aco":true,"active":true,"actor":true,"adac":true,"ads":true,"adult":true,"aeg":true,"aetna":true,"afamilycompany":true,"afl":true,"africa":true,"africamagic":true,"agakhan":true,"agency":true,"aig":true,"aigo":true,"airbus":true,"airforce":true,"airtel":true,"akdn":true,"alfaromeo":true,"alibaba":true,"alipay":true,"allfinanz":true,"allstate":true,"ally":true,"alsace":true,"alstom":true,"americanexpress":true,"americanfamily":true,"amex":true,"amfam":true,"amica":true,"amsterdam":true,"analytics":true,"android":true,"anquan":true,"anz":true,"aol":true,"apartments":true,"app":true,"apple":true,"aquarelle":true,"aramco":true,"archi":true,"army":true,"arte":true,"asda":true,"associates":true,"athleta":true,"attorney":true,"auction":true,"audi":true,"audible":true,"audio":true,"auspost":true,"author":true,"auto":true,"autos":true,"avianca":true,"aws":true,"axa":true,"azure":true,"baby":true,"baidu":true,"banamex":true,"bananarepublic":true,"band":true,"bank":true,"bar":true,"barcelona":true,"barclaycard":true,"barclays":true,"barefoot":true,"bargains":true,"basketball":true,"bauhaus":true,"bayern":true,"bbc":true,"bbt":true,"bbva":true,"bcg":true,"bcn":true,"beats":true,"beer":true,"bentley":true,"berlin":true,"best":true,"bestbuy":true,"bet":true,"bharti":true,"bible":true,"bid":true,"bike":true,"bing":true,"bingo":true,"bio":true,"black":true,"blackfriday":true,"blanco":true,"blockbuster":true,"blog":true,"bloomberg":true,"blue":true,"bms":true,"bmw":true,"bnl":true,"bnpparibas":true,"boats":true,"boehringer":true,"bofa":true,"bom":true,"bond":true,"boo":true,"book":true,"booking":true,"boots":true,"bosch":true,"bostik":true,"bot":true,"boutique":true,"bradesco":true,"bridgestone":true,"broadway":true,"broker":true,"brother":true,"brussels":true,"budapest":true,"bugatti":true,"build":true,"builders":true,"business":true,"buy":true,"buzz":true,"bzh":true,"cab":true,"cafe":true,"cal":true,"call":true,"calvinklein":true,"camera":true,"camp":true,"cancerresearch":true,"canon":true,"capetown":true,"capital":true,"capitalone":true,"car":true,"caravan":true,"cards":true,"care":true,"career":true,"careers":true,"cars":true,"cartier":true,"casa":true,"case":true,"caseih":true,"cash":true,"casino":true,"catering":true,"cba":true,"cbn":true,"cbre":true,"cbs":true,"ceb":true,"center":true,"ceo":true,"cern":true,"cfa":true,"cfd":true,"chanel":true,"channel":true,"chase":true,"chat":true,"cheap":true,"chintai":true,"chloe":true,"christmas":true,"chrome":true,"chrysler":true,"church":true,"cipriani":true,"circle":true,"cisco":true,"citadel":true,"citi":true,"citic":true,"city":true,"cityeats":true,"claims":true,"cleaning":true,"click":true,"clinic":true,"clothing":true,"cloud":true,"club":true,"clubmed":true,"coach":true,"codes":true,"coffee":true,"college":true,"cologne":true,"comcast":true,"commbank":true,"community":true,"company":true,"co
mputer":true,"comsec":true,"condos":true,"construction":true,"consulting":true,"contact":true,"contractors":true,"cooking":true,"cookingchannel":true,"cool":true,"corsica":true,"country":true,"coupon":true,"coupons":true,"courses":true,"credit":true,"creditcard":true,"creditunion":true,"cricket":true,"crown":true,"crs":true,"cruises":true,"csc":true,"cuisinella":true,"cymru":true,"cyou":true,"dabur":true,"dad":true,"dance":true,"date":true,"dating":true,"datsun":true,"day":true,"dclk":true,"dds":true,"deal":true,"dealer":true,"deals":true,"degree":true,"delivery":true,"dell":true,"deloitte":true,"delta":true,"democrat":true,"dental":true,"dentist":true,"desi":true,"design":true,"dev":true,"dhl":true,"diamonds":true,"diet":true,"digital":true,"direct":true,"directory":true,"discount":true,"discover":true,"dish":true,"dnp":true,"docs":true,"dodge":true,"dog":true,"doha":true,"domains":true,"doosan":true,"dot":true,"download":true,"drive":true,"dstv":true,"dtv":true,"dubai":true,"duck":true,"dunlop":true,"duns":true,"dupont":true,"durban":true,"dvag":true,"dwg":true,"earth":true,"eat":true,"edeka":true,"education":true,"email":true,"emerck":true,"emerson":true,"energy":true,"engineer":true,"engineering":true,"enterprises":true,"epost":true,"epson":true,"equipment":true,"ericsson":true,"erni":true,"esq":true,"estate":true,"esurance":true,"etisalat":true,"eurovision":true,"eus":true,"events":true,"everbank":true,"exchange":true,"expert":true,"exposed":true,"express":true,"extraspace":true,"fage":true,"fail":true,"fairwinds":true,"faith":true,"family":true,"fan":true,"fans":true,"farm":true,"farmers":true,"fashion":true,"fast":true,"fedex":true,"feedback":true,"ferrari":true,"ferrero":true,"fiat":true,"fidelity":true,"fido":true,"film":true,"final":true,"finance":true,"financial":true,"fire":true,"firestone":true,"firmdale":true,"fish":true,"fishing":true,"fit":true,"fitness":true,"flickr":true,"flights":true,"flir":true,"florist":true,"flowers":true,"flsmidth":true,"fly":true,"foo":true,"foodnetwork":true,"football":true,"ford":true,"forex":true,"forsale":true,"forum":true,"foundation":true,"fox":true,"fresenius":true,"frl":true,"frogans":true,"frontdoor":true,"frontier":true,"ftr":true,"fujitsu":true,"fujixerox":true,"fund":true,"furniture":true,"futbol":true,"fyi":true,"gal":true,"gallery":true,"gallo":true,"gallup":true,"game":true,"games":true,"gap":true,"garden":true,"gbiz":true,"gdn":true,"gea":true,"gent":true,"genting":true,"george":true,"ggee":true,"gift":true,"gifts":true,"gives":true,"giving":true,"glade":true,"glass":true,"gle":true,"global":true,"globo":true,"gmail":true,"gmo":true,"gmx":true,"godaddy":true,"gold":true,"goldpoint":true,"golf":true,"goo":true,"goodhands":true,"goodyear":true,"goog":true,"google":true,"gop":true,"got":true,"gotv":true,"grainger":true,"graphics":true,"gratis":true,"green":true,"gripe":true,"group":true,"guardian":true,"gucci":true,"guge":true,"guide":true,"guitars":true,"guru":true,"hamburg":true,"hangout":true,"haus":true,"hbo":true,"hdfc":true,"hdfcbank":true,"health":true,"healthcare":true,"help":true,"helsinki":true,"here":true,"hermes":true,"hgtv":true,"hiphop":true,"hisamitsu":true,"hitachi":true,"hiv":true,"hkt":true,"hockey":true,"holdings":true,"holiday":true,"homedepot":true,"homegoods":true,"homes":true,"homesense":true,"honda":true,"honeywell":true,"horse":true,"host":true,"hosting":true,"hot":true,"hoteles":true,"hotmail":true,"house":true,"how":true,"hsbc":true,"htc":true,"hughes":true,"hyatt":true,"hyundai":true,"ibm":true,"icbc":true,"i
ce":true,"icu":true,"ieee":true,"ifm":true,"iinet":true,"ikano":true,"imamat":true,"imdb":true,"immo":true,"immobilien":true,"industries":true,"infiniti":true,"ing":true,"ink":true,"institute":true,"insurance":true,"insure":true,"intel":true,"international":true,"intuit":true,"investments":true,"ipiranga":true,"irish":true,"iselect":true,"ismaili":true,"ist":true,"istanbul":true,"itau":true,"itv":true,"iveco":true,"iwc":true,"jaguar":true,"java":true,"jcb":true,"jcp":true,"jeep":true,"jetzt":true,"jewelry":true,"jio":true,"jlc":true,"jll":true,"jmp":true,"jnj":true,"joburg":true,"jot":true,"joy":true,"jpmorgan":true,"jprs":true,"juegos":true,"juniper":true,"kaufen":true,"kddi":true,"kerryhotels":true,"kerrylogistics":true,"kerryproperties":true,"kfh":true,"kia":true,"kim":true,"kinder":true,"kindle":true,"kitchen":true,"kiwi":true,"koeln":true,"komatsu":true,"kosher":true,"kpmg":true,"kpn":true,"krd":true,"kred":true,"kuokgroup":true,"kyknet":true,"kyoto":true,"lacaixa":true,"ladbrokes":true,"lamborghini":true,"lancaster":true,"lancia":true,"lancome":true,"land":true,"landrover":true,"lanxess":true,"lasalle":true,"lat":true,"latino":true,"latrobe":true,"law":true,"lawyer":true,"lds":true,"lease":true,"leclerc":true,"lefrak":true,"legal":true,"lego":true,"lexus":true,"lgbt":true,"liaison":true,"lidl":true,"life":true,"lifeinsurance":true,"lifestyle":true,"lighting":true,"like":true,"lilly":true,"limited":true,"limo":true,"lincoln":true,"linde":true,"link":true,"lipsy":true,"live":true,"living":true,"lixil":true,"loan":true,"loans":true,"locker":true,"locus":true,"loft":true,"lol":true,"london":true,"lotte":true,"lotto":true,"love":true,"lpl":true,"lplfinancial":true,"ltd":true,"ltda":true,"lundbeck":true,"lupin":true,"luxe":true,"luxury":true,"macys":true,"madrid":true,"maif":true,"maison":true,"makeup":true,"man":true,"management":true,"mango":true,"market":true,"marketing":true,"markets":true,"marriott":true,"marshalls":true,"maserati":true,"mattel":true,"mba":true,"mcd":true,"mcdonalds":true,"mckinsey":true,"med":true,"media":true,"meet":true,"melbourne":true,"meme":true,"memorial":true,"men":true,"menu":true,"meo":true,"metlife":true,"miami":true,"microsoft":true,"mini":true,"mint":true,"mit":true,"mitsubishi":true,"mlb":true,"mls":true,"mma":true,"mnet":true,"mobily":true,"moda":true,"moe":true,"moi":true,"mom":true,"monash":true,"money":true,"monster":true,"montblanc":true,"mopar":true,"mormon":true,"mortgage":true,"moscow":true,"moto":true,"motorcycles":true,"mov":true,"movie":true,"movistar":true,"msd":true,"mtn":true,"mtpc":true,"mtr":true,"multichoice":true,"mutual":true,"mutuelle":true,"mzansimagic":true,"nab":true,"nadex":true,"nagoya":true,"naspers":true,"nationwide":true,"natura":true,"navy":true,"nba":true,"nec":true,"netbank":true,"netflix":true,"network":true,"neustar":true,"new":true,"newholland":true,"news":true,"next":true,"nextdirect":true,"nexus":true,"nfl":true,"ngo":true,"nhk":true,"nico":true,"nike":true,"nikon":true,"ninja":true,"nissan":true,"nokia":true,"northwesternmutual":true,"norton":true,"now":true,"nowruz":true,"nowtv":true,"nra":true,"nrw":true,"ntt":true,"nyc":true,"obi":true,"observer":true,"off":true,"office":true,"okinawa":true,"olayan":true,"olayangroup":true,"oldnavy":true,"ollo":true,"omega":true,"one":true,"ong":true,"onl":true,"online":true,"onyourside":true,"ooo":true,"open":true,"oracle":true,"orange":true,"organic":true,"orientexpress":true,"osaka":true,"otsuka":true,"ott":true,"ovh":true,"page":true,"pamperedchef":true,"panasonic":true,"panerai
":true,"paris":true,"pars":true,"partners":true,"parts":true,"party":true,"passagens":true,"pay":true,"payu":true,"pccw":true,"pet":true,"pfizer":true,"pharmacy":true,"philips":true,"photo":true,"photography":true,"photos":true,"physio":true,"piaget":true,"pics":true,"pictet":true,"pictures":true,"pid":true,"pin":true,"ping":true,"pink":true,"pioneer":true,"pizza":true,"place":true,"play":true,"playstation":true,"plumbing":true,"plus":true,"pnc":true,"pohl":true,"poker":true,"politie":true,"porn":true,"pramerica":true,"praxi":true,"press":true,"prime":true,"prod":true,"productions":true,"prof":true,"progressive":true,"promo":true,"properties":true,"property":true,"protection":true,"pru":true,"prudential":true,"pub":true,"qpon":true,"quebec":true,"quest":true,"qvc":true,"racing":true,"raid":true,"read":true,"realestate":true,"realtor":true,"realty":true,"recipes":true,"red":true,"redstone":true,"redumbrella":true,"rehab":true,"reise":true,"reisen":true,"reit":true,"reliance":true,"ren":true,"rent":true,"rentals":true,"repair":true,"report":true,"republican":true,"rest":true,"restaurant":true,"review":true,"reviews":true,"rexroth":true,"rich":true,"richardli":true,"ricoh":true,"rightathome":true,"ril":true,"rio":true,"rip":true,"rocher":true,"rocks":true,"rodeo":true,"rogers":true,"room":true,"rsvp":true,"ruhr":true,"run":true,"rwe":true,"ryukyu":true,"saarland":true,"safe":true,"safety":true,"sakura":true,"sale":true,"salon":true,"samsclub":true,"samsung":true,"sandvik":true,"sandvikcoromant":true,"sanofi":true,"sap":true,"sapo":true,"sarl":true,"sas":true,"save":true,"saxo":true,"sbi":true,"sbs":true,"sca":true,"scb":true,"schaeffler":true,"schmidt":true,"scholarships":true,"school":true,"schule":true,"schwarz":true,"science":true,"scjohnson":true,"scor":true,"scot":true,"seat":true,"secure":true,"security":true,"seek":true,"sener":true,"services":true,"ses":true,"seven":true,"sew":true,"sex":true,"sexy":true,"sfr":true,"shangrila":true,"sharp":true,"shaw":true,"shell":true,"shia":true,"shiksha":true,"shoes":true,"shouji":true,"show":true,"showtime":true,"shriram":true,"silk":true,"sina":true,"singles":true,"site":true,"ski":true,"skin":true,"sky":true,"skype":true,"sling":true,"smart":true,"smile":true,"sncf":true,"soccer":true,"social":true,"softbank":true,"software":true,"sohu":true,"solar":true,"solutions":true,"song":true,"sony":true,"soy":true,"space":true,"spiegel":true,"spot":true,"spreadbetting":true,"srl":true,"srt":true,"stada":true,"staples":true,"star":true,"starhub":true,"statebank":true,"statefarm":true,"statoil":true,"stc":true,"stcgroup":true,"stockholm":true,"storage":true,"store":true,"studio":true,"study":true,"style":true,"sucks":true,"supersport":true,"supplies":true,"supply":true,"support":true,"surf":true,"surgery":true,"suzuki":true,"swatch":true,"swiftcover":true,"swiss":true,"sydney":true,"symantec":true,"systems":true,"tab":true,"taipei":true,"talk":true,"taobao":true,"target":true,"tatamotors":true,"tatar":true,"tattoo":true,"tax":true,"taxi":true,"tci":true,"tdk":true,"team":true,"tech":true,"technology":true,"telecity":true,"telefonica":true,"temasek":true,"tennis":true,"teva":true,"thd":true,"theater":true,"theatre":true,"theguardian":true,"tiaa":true,"tickets":true,"tienda":true,"tiffany":true,"tips":true,"tires":true,"tirol":true,"tjmaxx":true,"tjx":true,"tkmaxx":true,"tmall":true,"today":true,"tokyo":true,"tools":true,"top":true,"toray":true,"toshiba":true,"total":true,"tours":true,"town":true,"toyota":true,"toys":true,"trade":true,"trading":true,"training
":true,"travelchannel":true,"travelers":true,"travelersinsurance":true,"trust":true,"trv":true,"tube":true,"tui":true,"tunes":true,"tushu":true,"tvs":true,"ubank":true,"ubs":true,"uconnect":true,"university":true,"uno":true,"uol":true,"ups":true,"vacations":true,"vana":true,"vanguard":true,"vegas":true,"ventures":true,"verisign":true,"versicherung":true,"vet":true,"viajes":true,"video":true,"vig":true,"viking":true,"villas":true,"vin":true,"vip":true,"virgin":true,"visa":true,"vision":true,"vista":true,"vistaprint":true,"viva":true,"vivo":true,"vlaanderen":true,"vodka":true,"volkswagen":true,"vote":true,"voting":true,"voto":true,"voyage":true,"vuelos":true,"wales":true,"walmart":true,"walter":true,"wang":true,"wanggou":true,"warman":true,"watch":true,"watches":true,"weather":true,"weatherchannel":true,"webcam":true,"weber":true,"website":true,"wed":true,"wedding":true,"weibo":true,"weir":true,"whoswho":true,"wien":true,"wiki":true,"williamhill":true,"win":true,"windows":true,"wine":true,"winners":true,"wme":true,"wolterskluwer":true,"woodside":true,"work":true,"works":true,"world":true,"wtc":true,"wtf":true,"xbox":true,"xerox":true,"xfinity":true,"xihuan":true,"xin":true,"xn--11b4c3d":true,"xn--1ck2e1b":true,"xn--1qqw23a":true,"xn--30rr7y":true,"xn--3bst00m":true,"xn--3ds443g":true,"xn--3oq18vl8pn36a":true,"xn--3pxu8k":true,"xn--42c2d9a":true,"xn--45q11c":true,"xn--4gbrim":true,"xn--4gq48lf9j":true,"xn--55qw42g":true,"xn--55qx5d":true,"xn--5su34j936bgsg":true,"xn--5tzm5g":true,"xn--6frz82g":true,"xn--6qq986b3xl":true,"xn--80adxhks":true,"xn--80asehdb":true,"xn--80aswg":true,"xn--8y0a063a":true,"xn--9dbq2a":true,"xn--9et52u":true,"xn--9krt00a":true,"xn--b4w605ferd":true,"xn--bck1b9a5dre4c":true,"xn--c1avg":true,"xn--c2br7g":true,"xn--cck2b3b":true,"xn--cg4bki":true,"xn--czr694b":true,"xn--czrs0t":true,"xn--czru2d":true,"xn--d1acj3b":true,"xn--eckvdtc9d":true,"xn--efvy88h":true,"xn--estv75g":true,"xn--fct429k":true,"xn--fhbei":true,"xn--fiq228c5hs":true,"xn--fiq64b":true,"xn--fjq720a":true,"xn--flw351e":true,"xn--fzys8d69uvgm":true,"xn--g2xx48c":true,"xn--gckr3f0f":true,"xn--hxt814e":true,"xn--i1b6b1a6a2e":true,"xn--imr513n":true,"xn--io0a7i":true,"xn--j1aef":true,"xn--jlq61u9w7b":true,"xn--jvr189m":true,"xn--kcrx77d1x4a":true,"xn--kpu716f":true,"xn--kput3i":true,"xn--mgba3a3ejt":true,"xn--mgba7c0bbn0a":true,"xn--mgbaakc7dvf":true,"xn--mgbab2bd":true,"xn--mgbb9fbpob":true,"xn--mgbca7dzdo":true,"xn--mgbt3dhd":true,"xn--mk1bu44c":true,"xn--mxtq1m":true,"xn--ngbc5azd":true,"xn--ngbe9e0a":true,"xn--nqv7f":true,"xn--nqv7fs00ema":true,"xn--nyqy26a":true,"xn--p1acf":true,"xn--pbt977c":true,"xn--pssy2u":true,"xn--q9jyb4c":true,"xn--qcka1pmc":true,"xn--rhqv96g":true,"xn--rovu88b":true,"xn--ses554g":true,"xn--t60b56a":true,"xn--tckwe":true,"xn--unup4y":true,"xn--vermgensberater-ctb":true,"xn--vermgensberatung-pwb":true,"xn--vhquv":true,"xn--vuq861b":true,"xn--w4r85el8fhu5dnra":true,"xn--w4rs40l":true,"xn--xhq521b":true,"xn--zfr164b":true,"xperia":true,"xyz":true,"yachts":true,"yahoo":true,"yamaxun":true,"yandex":true,"yodobashi":true,"yoga":true,"yokohama":true,"you":true,"youtube":true,"yun":true,"zappos":true,"zara":true,"zero":true,"zip":true,"zippo":true,"zone":true,"zuerich":true,"cloudfront.net":true,"ap-northeast-1.compute.amazonaws.com":true,"ap-southeast-1.compute.amazonaws.com":true,"ap-southeast-2.compute.amazonaws.com":true,"cn-north-1.compute.amazonaws.cn":true,"compute.amazonaws.cn":true,"compute.amazonaws.com":true,"compute-1.amazonaws.com":true,"eu-west-1.compute.amazonaws.com":true,"eu
-central-1.compute.amazonaws.com":true,"sa-east-1.compute.amazonaws.com":true,"us-east-1.amazonaws.com":true,"us-gov-west-1.compute.amazonaws.com":true,"us-west-1.compute.amazonaws.com":true,"us-west-2.compute.amazonaws.com":true,"z-1.compute-1.amazonaws.com":true,"z-2.compute-1.amazonaws.com":true,"elasticbeanstalk.com":true,"elb.amazonaws.com":true,"s3.amazonaws.com":true,"s3-ap-northeast-1.amazonaws.com":true,"s3-ap-southeast-1.amazonaws.com":true,"s3-ap-southeast-2.amazonaws.com":true,"s3-external-1.amazonaws.com":true,"s3-external-2.amazonaws.com":true,"s3-fips-us-gov-west-1.amazonaws.com":true,"s3-eu-central-1.amazonaws.com":true,"s3-eu-west-1.amazonaws.com":true,"s3-sa-east-1.amazonaws.com":true,"s3-us-gov-west-1.amazonaws.com":true,"s3-us-west-1.amazonaws.com":true,"s3-us-west-2.amazonaws.com":true,"s3.cn-north-1.amazonaws.com.cn":true,"s3.eu-central-1.amazonaws.com":true,"betainabox.com":true,"ae.org":true,"ar.com":true,"br.com":true,"cn.com":true,"com.de":true,"com.se":true,"de.com":true,"eu.com":true,"gb.com":true,"gb.net":true,"hu.com":true,"hu.net":true,"jp.net":true,"jpn.com":true,"kr.com":true,"mex.com":true,"no.com":true,"qc.com":true,"ru.com":true,"sa.com":true,"se.com":true,"se.net":true,"uk.com":true,"uk.net":true,"us.com":true,"uy.com":true,"za.bz":true,"za.com":true,"africa.com":true,"gr.com":true,"in.net":true,"us.org":true,"co.com":true,"c.la":true,"cloudcontrolled.com":true,"cloudcontrolapp.com":true,"co.ca":true,"c.cdn77.org":true,"cdn77-ssl.net":true,"r.cdn77.net":true,"rsc.cdn77.org":true,"ssl.origin.cdn77-secure.org":true,"co.nl":true,"co.no":true,"*.platform.sh":true,"cupcake.is":true,"dreamhosters.com":true,"duckdns.org":true,"dyndns-at-home.com":true,"dyndns-at-work.com":true,"dyndns-blog.com":true,"dyndns-free.com":true,"dyndns-home.com":true,"dyndns-ip.com":true,"dyndns-mail.com":true,"dyndns-office.com":true,"dyndns-pics.com":true,"dyndns-remote.com":true,"dyndns-server.com":true,"dyndns-web.com":true,"dyndns-wiki.com":true,"dyndns-work.com":true,"dyndns.biz":true,"dyndns.info":true,"dyndns.org":true,"dyndns.tv":true,"at-band-camp.net":true,"ath.cx":true,"barrel-of-knowledge.info":true,"barrell-of-knowledge.info":true,"better-than.tv":true,"blogdns.com":true,"blogdns.net":true,"blogdns.org":true,"blogsite.org":true,"boldlygoingnowhere.org":true,"broke-it.net":true,"buyshouses.net":true,"cechire.com":true,"dnsalias.com":true,"dnsalias.net":true,"dnsalias.org":true,"dnsdojo.com":true,"dnsdojo.net":true,"dnsdojo.org":true,"does-it.net":true,"doesntexist.com":true,"doesntexist.org":true,"dontexist.com":true,"dontexist.net":true,"dontexist.org":true,"doomdns.com":true,"doomdns.org":true,"dvrdns.org":true,"dyn-o-saur.com":true,"dynalias.com":true,"dynalias.net":true,"dynalias.org":true,"dynathome.net":true,"dyndns.ws":true,"endofinternet.net":true,"endofinternet.org":true,"endoftheinternet.org":true,"est-a-la-maison.com":true,"est-a-la-masion.com":true,"est-le-patron.com":true,"est-mon-blogueur.com":true,"for-better.biz":true,"for-more.biz":true,"for-our.info":true,"for-some.biz":true,"for-the.biz":true,"forgot.her.name":true,"forgot.his.name":true,"from-ak.com":true,"from-al.com":true,"from-ar.com":true,"from-az.net":true,"from-ca.com":true,"from-co.net":true,"from-ct.com":true,"from-dc.com":true,"from-de.com":true,"from-fl.com":true,"from-ga.com":true,"from-hi.com":true,"from-ia.com":true,"from-id.com":true,"from-il.com":true,"from-in.com":true,"from-ks.com":true,"from-ky.com":true,"from-la.net":true,"from-ma.com":true,"from-md.com":true,"from-me.org":true,"fro
m-mi.com":true,"from-mn.com":true,"from-mo.com":true,"from-ms.com":true,"from-mt.com":true,"from-nc.com":true,"from-nd.com":true,"from-ne.com":true,"from-nh.com":true,"from-nj.com":true,"from-nm.com":true,"from-nv.com":true,"from-ny.net":true,"from-oh.com":true,"from-ok.com":true,"from-or.com":true,"from-pa.com":true,"from-pr.com":true,"from-ri.com":true,"from-sc.com":true,"from-sd.com":true,"from-tn.com":true,"from-tx.com":true,"from-ut.com":true,"from-va.com":true,"from-vt.com":true,"from-wa.com":true,"from-wi.com":true,"from-wv.com":true,"from-wy.com":true,"ftpaccess.cc":true,"fuettertdasnetz.de":true,"game-host.org":true,"game-server.cc":true,"getmyip.com":true,"gets-it.net":true,"go.dyndns.org":true,"gotdns.com":true,"gotdns.org":true,"groks-the.info":true,"groks-this.info":true,"ham-radio-op.net":true,"here-for-more.info":true,"hobby-site.com":true,"hobby-site.org":true,"home.dyndns.org":true,"homedns.org":true,"homeftp.net":true,"homeftp.org":true,"homeip.net":true,"homelinux.com":true,"homelinux.net":true,"homelinux.org":true,"homeunix.com":true,"homeunix.net":true,"homeunix.org":true,"iamallama.com":true,"in-the-band.net":true,"is-a-anarchist.com":true,"is-a-blogger.com":true,"is-a-bookkeeper.com":true,"is-a-bruinsfan.org":true,"is-a-bulls-fan.com":true,"is-a-candidate.org":true,"is-a-caterer.com":true,"is-a-celticsfan.org":true,"is-a-chef.com":true,"is-a-chef.net":true,"is-a-chef.org":true,"is-a-conservative.com":true,"is-a-cpa.com":true,"is-a-cubicle-slave.com":true,"is-a-democrat.com":true,"is-a-designer.com":true,"is-a-doctor.com":true,"is-a-financialadvisor.com":true,"is-a-geek.com":true,"is-a-geek.net":true,"is-a-geek.org":true,"is-a-green.com":true,"is-a-guru.com":true,"is-a-hard-worker.com":true,"is-a-hunter.com":true,"is-a-knight.org":true,"is-a-landscaper.com":true,"is-a-lawyer.com":true,"is-a-liberal.com":true,"is-a-libertarian.com":true,"is-a-linux-user.org":true,"is-a-llama.com":true,"is-a-musician.com":true,"is-a-nascarfan.com":true,"is-a-nurse.com":true,"is-a-painter.com":true,"is-a-patsfan.org":true,"is-a-personaltrainer.com":true,"is-a-photographer.com":true,"is-a-player.com":true,"is-a-republican.com":true,"is-a-rockstar.com":true,"is-a-socialist.com":true,"is-a-soxfan.org":true,"is-a-student.com":true,"is-a-teacher.com":true,"is-a-techie.com":true,"is-a-therapist.com":true,"is-an-accountant.com":true,"is-an-actor.com":true,"is-an-actress.com":true,"is-an-anarchist.com":true,"is-an-artist.com":true,"is-an-engineer.com":true,"is-an-entertainer.com":true,"is-by.us":true,"is-certified.com":true,"is-found.org":true,"is-gone.com":true,"is-into-anime.com":true,"is-into-cars.com":true,"is-into-cartoons.com":true,"is-into-games.com":true,"is-leet.com":true,"is-lost.org":true,"is-not-certified.com":true,"is-saved.org":true,"is-slick.com":true,"is-uberleet.com":true,"is-very-bad.org":true,"is-very-evil.org":true,"is-very-good.org":true,"is-very-nice.org":true,"is-very-sweet.org":true,"is-with-theband.com":true,"isa-geek.com":true,"isa-geek.net":true,"isa-geek.org":true,"isa-hockeynut.com":true,"issmarterthanyou.com":true,"isteingeek.de":true,"istmein.de":true,"kicks-ass.net":true,"kicks-ass.org":true,"knowsitall.info":true,"land-4-sale.us":true,"lebtimnetz.de":true,"leitungsen.de":true,"likes-pie.com":true,"likescandy.com":true,"merseine.nu":true,"mine.nu":true,"misconfused.org":true,"mypets.ws":true,"myphotos.cc":true,"neat-url.com":true,"office-on-the.net":true,"on-the-web.tv":true,"podzone.net":true,"podzone.org":true,"readmyblog.org":true,"saves-the-whales.com":true,"sc
rapper-site.net":true,"scrapping.cc":true,"selfip.biz":true,"selfip.com":true,"selfip.info":true,"selfip.net":true,"selfip.org":true,"sells-for-less.com":true,"sells-for-u.com":true,"sells-it.net":true,"sellsyourhome.org":true,"servebbs.com":true,"servebbs.net":true,"servebbs.org":true,"serveftp.net":true,"serveftp.org":true,"servegame.org":true,"shacknet.nu":true,"simple-url.com":true,"space-to-rent.com":true,"stuff-4-sale.org":true,"stuff-4-sale.us":true,"teaches-yoga.com":true,"thruhere.net":true,"traeumtgerade.de":true,"webhop.biz":true,"webhop.info":true,"webhop.net":true,"webhop.org":true,"worse-than.tv":true,"writesthisblog.com":true,"eu.org":true,"al.eu.org":true,"asso.eu.org":true,"at.eu.org":true,"au.eu.org":true,"be.eu.org":true,"bg.eu.org":true,"ca.eu.org":true,"cd.eu.org":true,"ch.eu.org":true,"cn.eu.org":true,"cy.eu.org":true,"cz.eu.org":true,"de.eu.org":true,"dk.eu.org":true,"edu.eu.org":true,"ee.eu.org":true,"es.eu.org":true,"fi.eu.org":true,"fr.eu.org":true,"gr.eu.org":true,"hr.eu.org":true,"hu.eu.org":true,"ie.eu.org":true,"il.eu.org":true,"in.eu.org":true,"int.eu.org":true,"is.eu.org":true,"it.eu.org":true,"jp.eu.org":true,"kr.eu.org":true,"lt.eu.org":true,"lu.eu.org":true,"lv.eu.org":true,"mc.eu.org":true,"me.eu.org":true,"mk.eu.org":true,"mt.eu.org":true,"my.eu.org":true,"net.eu.org":true,"ng.eu.org":true,"nl.eu.org":true,"no.eu.org":true,"nz.eu.org":true,"paris.eu.org":true,"pl.eu.org":true,"pt.eu.org":true,"q-a.eu.org":true,"ro.eu.org":true,"ru.eu.org":true,"se.eu.org":true,"si.eu.org":true,"sk.eu.org":true,"tr.eu.org":true,"uk.eu.org":true,"us.eu.org":true,"a.ssl.fastly.net":true,"b.ssl.fastly.net":true,"global.ssl.fastly.net":true,"a.prod.fastly.net":true,"global.prod.fastly.net":true,"firebaseapp.com":true,"flynnhub.com":true,"service.gov.uk":true,"github.io":true,"githubusercontent.com":true,"ro.com":true,"appspot.com":true,"blogspot.ae":true,"blogspot.al":true,"blogspot.am":true,"blogspot.ba":true,"blogspot.be":true,"blogspot.bg":true,"blogspot.bj":true,"blogspot.ca":true,"blogspot.cf":true,"blogspot.ch":true,"blogspot.cl":true,"blogspot.co.at":true,"blogspot.co.id":true,"blogspot.co.il":true,"blogspot.co.ke":true,"blogspot.co.nz":true,"blogspot.co.uk":true,"blogspot.co.za":true,"blogspot.com":true,"blogspot.com.ar":true,"blogspot.com.au":true,"blogspot.com.br":true,"blogspot.com.by":true,"blogspot.com.co":true,"blogspot.com.cy":true,"blogspot.com.ee":true,"blogspot.com.eg":true,"blogspot.com.es":true,"blogspot.com.mt":true,"blogspot.com.ng":true,"blogspot.com.tr":true,"blogspot.com.uy":true,"blogspot.cv":true,"blogspot.cz":true,"blogspot.de":true,"blogspot.dk":true,"blogspot.fi":true,"blogspot.fr":true,"blogspot.gr":true,"blogspot.hk":true,"blogspot.hr":true,"blogspot.hu":true,"blogspot.ie":true,"blogspot.in":true,"blogspot.is":true,"blogspot.it":true,"blogspot.jp":true,"blogspot.kr":true,"blogspot.li":true,"blogspot.lt":true,"blogspot.lu":true,"blogspot.md":true,"blogspot.mk":true,"blogspot.mr":true,"blogspot.mx":true,"blogspot.my":true,"blogspot.nl":true,"blogspot.no":true,"blogspot.pe":true,"blogspot.pt":true,"blogspot.qa":true,"blogspot.re":true,"blogspot.ro":true,"blogspot.rs":true,"blogspot.ru":true,"blogspot.se":true,"blogspot.sg":true,"blogspot.si":true,"blogspot.sk":true,"blogspot.sn":true,"blogspot.td":true,"blogspot.tw":true,"blogspot.ug":true,"blogspot.vn":true,"codespot.com":true,"googleapis.com":true,"googlecode.com":true,"pagespeedmobilizer.com":true,"withgoogle.com":true,"withyoutube.com":true,"herokuapp.com":true,"herokussl.com":true,"iki.fi":tr
ue,"biz.at":true,"info.at":true,"co.pl":true,"azurewebsites.net":true,"azure-mobile.net":true,"cloudapp.net":true,"bmoattachments.org":true,"4u.com":true,"nfshost.com":true,"nyc.mn":true,"nid.io":true,"operaunite.com":true,"outsystemscloud.com":true,"art.pl":true,"gliwice.pl":true,"krakow.pl":true,"poznan.pl":true,"wroc.pl":true,"zakopane.pl":true,"pantheon.io":true,"gotpantheon.com":true,"priv.at":true,"qa2.com":true,"rhcloud.com":true,"sandcats.io":true,"biz.ua":true,"co.ua":true,"pp.ua":true,"sinaapp.com":true,"vipsinaapp.com":true,"1kapp.com":true,"gda.pl":true,"gdansk.pl":true,"gdynia.pl":true,"med.pl":true,"sopot.pl":true,"hk.com":true,"hk.org":true,"ltd.hk":true,"inc.hk":true,"yolasite.com":true,"za.net":true,"za.org":true}); // END of automatically generated file npm_3.5.2.orig/node_modules/request/node_modules/tough-cookie/lib/store.js0000644000000000000000000000543112631326456025154 0ustar 00000000000000/*! * Copyright (c) 2015, Salesforce.com, Inc. * All rights reserved. * * Redistribution and use in source and binary forms, with or without * modification, are permitted provided that the following conditions are met: * * 1. Redistributions of source code must retain the above copyright notice, * this list of conditions and the following disclaimer. * * 2. Redistributions in binary form must reproduce the above copyright notice, * this list of conditions and the following disclaimer in the documentation * and/or other materials provided with the distribution. * * 3. Neither the name of Salesforce.com nor the names of its contributors may * be used to endorse or promote products derived from this software without * specific prior written permission. * * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE * ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE * LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE * POSSIBILITY OF SUCH DAMAGE. */ 'use strict'; /*jshint unused:false */ function Store() { } exports.Store = Store; // Stores may be synchronous, but are still required to use a // Continuation-Passing Style API. The CookieJar itself will expose a "*Sync" // API that converts from synchronous-callbacks to imperative style. 
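//
// Purely as an illustrative sketch (not part of tough-cookie itself), a
// conforming synchronous store might look like the hypothetical MemoryStore
// below: it does its work inline, yet still reports every result through the
// (err, result) callback as this API requires.
//
//   var util = require('util');
//   function MemoryStore() {
//     Store.call(this);
//     this.idx = {}; // cookies keyed by "domain;path;key"
//   }
//   util.inherits(MemoryStore, Store);
//   MemoryStore.prototype.synchronous = true;
//   MemoryStore.prototype.putCookie = function(cookie, cb) {
//     this.idx[cookie.domain + ';' + cookie.path + ';' + cookie.key] = cookie;
//     cb(null);
//   };
//   MemoryStore.prototype.findCookie = function(domain, path, key, cb) {
//     cb(null, this.idx[domain + ';' + path + ';' + key] || null);
//   };
//   // ...the remaining Store methods would follow the same pattern.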
Store.prototype.synchronous = false; Store.prototype.findCookie = function(domain, path, key, cb) { throw new Error('findCookie is not implemented'); }; Store.prototype.findCookies = function(domain, path, cb) { throw new Error('findCookies is not implemented'); }; Store.prototype.putCookie = function(cookie, cb) { throw new Error('putCookie is not implemented'); }; Store.prototype.updateCookie = function(oldCookie, newCookie, cb) { // recommended default implementation: // return this.putCookie(newCookie, cb); throw new Error('updateCookie is not implemented'); }; Store.prototype.removeCookie = function(domain, path, key, cb) { throw new Error('removeCookie is not implemented'); }; Store.prototype.removeCookies = function(domain, path, cb) { throw new Error('removeCookies is not implemented'); }; Store.prototype.getAllCookies = function(cb) { throw new Error('getAllCookies is not implemented (therefore jar cannot be serialized)'); }; npm_3.5.2.orig/node_modules/request/node_modules/tunnel-agent/LICENSE0000644000000000000000000002166412631326456023733 0ustar 00000000000000Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. 
For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: You must give any other recipients of the Work or Derivative Works a copy of this License; and You must cause any modified files to carry prominent notices stating that You changed the files; and You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. 
You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. END OF TERMS AND CONDITIONSnpm_3.5.2.orig/node_modules/request/node_modules/tunnel-agent/README.md0000644000000000000000000000016112631326456024172 0ustar 00000000000000tunnel-agent ============ HTTP proxy tunneling agent. Formerly part of mikeal/request, now a standalone module. 
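A rough usage sketch (the proxy address below is a made-up placeholder, and the target host is just an example): construct an agent for the protocol pair you need and pass it as the `agent` option of an ordinary `http`/`https` request.

``` javascript
var https = require('https')
var tunnelAgent = require('tunnel-agent')

// Tunnel HTTPS requests through a plain-HTTP proxy using CONNECT.
var agent = tunnelAgent.httpsOverHttp({
  proxy: { host: 'proxy.example.com', port: 8080 } // assumed proxy address
})

https.get({ host: 'registry.npmjs.org', path: '/', agent: agent }, function (res) {
  console.log('status: %d', res.statusCode)
})
```

The four exported helpers (`httpOverHttp`, `httpsOverHttp`, `httpOverHttps`, `httpsOverHttps`) differ only in whether the hop to the proxy and the tunneled connection themselves use TLS.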
npm_3.5.2.orig/node_modules/request/node_modules/tunnel-agent/index.js0000644000000000000000000001521012631326456024361 0ustar 00000000000000'use strict' var net = require('net') , tls = require('tls') , http = require('http') , https = require('https') , events = require('events') , assert = require('assert') , util = require('util') ; exports.httpOverHttp = httpOverHttp exports.httpsOverHttp = httpsOverHttp exports.httpOverHttps = httpOverHttps exports.httpsOverHttps = httpsOverHttps function httpOverHttp(options) { var agent = new TunnelingAgent(options) agent.request = http.request return agent } function httpsOverHttp(options) { var agent = new TunnelingAgent(options) agent.request = http.request agent.createSocket = createSecureSocket return agent } function httpOverHttps(options) { var agent = new TunnelingAgent(options) agent.request = https.request return agent } function httpsOverHttps(options) { var agent = new TunnelingAgent(options) agent.request = https.request agent.createSocket = createSecureSocket return agent } function TunnelingAgent(options) { var self = this self.options = options || {} self.proxyOptions = self.options.proxy || {} self.maxSockets = self.options.maxSockets || http.Agent.defaultMaxSockets self.requests = [] self.sockets = [] self.on('free', function onFree(socket, host, port) { for (var i = 0, len = self.requests.length; i < len; ++i) { var pending = self.requests[i] if (pending.host === host && pending.port === port) { // Detect the request to connect same origin server, // reuse the connection. self.requests.splice(i, 1) pending.request.onSocket(socket) return } } socket.destroy() self.removeSocket(socket) }) } util.inherits(TunnelingAgent, events.EventEmitter) TunnelingAgent.prototype.addRequest = function addRequest(req, options) { var self = this // Legacy API: addRequest(req, host, port, path) if (typeof options === 'string') { options = { host: options, port: arguments[2], path: arguments[3] }; } if (self.sockets.length >= this.maxSockets) { // We are over limit so we'll add it to the queue. self.requests.push({host: options.host, port: options.port, request: req}) return } // If we are under maxSockets create a new one. 
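// (Once that request finishes and the socket emits 'free', the constructor's
// 'free' listener above hands the idle tunnel to a queued request for the
// same host:port, or destroys it if none is waiting.)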
self.createConnection({host: options.host, port: options.port, request: req}) } TunnelingAgent.prototype.createConnection = function createConnection(pending) { var self = this self.createSocket(pending, function(socket) { socket.on('free', onFree) socket.on('close', onCloseOrRemove) socket.on('agentRemove', onCloseOrRemove) pending.request.onSocket(socket) function onFree() { self.emit('free', socket, pending.host, pending.port) } function onCloseOrRemove(err) { self.removeSocket(socket) socket.removeListener('free', onFree) socket.removeListener('close', onCloseOrRemove) socket.removeListener('agentRemove', onCloseOrRemove) } }) } TunnelingAgent.prototype.createSocket = function createSocket(options, cb) { var self = this var placeholder = {} self.sockets.push(placeholder) var connectOptions = mergeOptions({}, self.proxyOptions, { method: 'CONNECT' , path: options.host + ':' + options.port , agent: false } ) if (connectOptions.proxyAuth) { connectOptions.headers = connectOptions.headers || {} connectOptions.headers['Proxy-Authorization'] = 'Basic ' + new Buffer(connectOptions.proxyAuth).toString('base64') } debug('making CONNECT request') var connectReq = self.request(connectOptions) connectReq.useChunkedEncodingByDefault = false // for v0.6 connectReq.once('response', onResponse) // for v0.6 connectReq.once('upgrade', onUpgrade) // for v0.6 connectReq.once('connect', onConnect) // for v0.7 or later connectReq.once('error', onError) connectReq.end() function onResponse(res) { // Very hacky. This is necessary to avoid http-parser leaks. res.upgrade = true } function onUpgrade(res, socket, head) { // Hacky. process.nextTick(function() { onConnect(res, socket, head) }) } function onConnect(res, socket, head) { connectReq.removeAllListeners() socket.removeAllListeners() if (res.statusCode === 200) { assert.equal(head.length, 0) debug('tunneling connection has established') self.sockets[self.sockets.indexOf(placeholder)] = socket cb(socket) } else { debug('tunneling socket could not be established, statusCode=%d', res.statusCode) var error = new Error('tunneling socket could not be established, ' + 'statusCode=' + res.statusCode) error.code = 'ECONNRESET' options.request.emit('error', error) self.removeSocket(placeholder) } } function onError(cause) { connectReq.removeAllListeners() debug('tunneling socket could not be established, cause=%s\n', cause.message, cause.stack) var error = new Error('tunneling socket could not be established, ' + 'cause=' + cause.message) error.code = 'ECONNRESET' options.request.emit('error', error) self.removeSocket(placeholder) } } TunnelingAgent.prototype.removeSocket = function removeSocket(socket) { var pos = this.sockets.indexOf(socket) if (pos === -1) return this.sockets.splice(pos, 1) var pending = this.requests.shift() if (pending) { // If we have pending requests and a socket gets closed a new one // needs to be created to take over in the pool for the one that closed. 
this.createConnection(pending) } } function createSecureSocket(options, cb) { var self = this TunnelingAgent.prototype.createSocket.call(self, options, function(socket) { // 0 is dummy port for v0.6 var secureSocket = tls.connect(0, mergeOptions({}, self.options, { servername: options.host , socket: socket } )) self.sockets[self.sockets.indexOf(socket)] = secureSocket cb(secureSocket) }) } function mergeOptions(target) { for (var i = 1, len = arguments.length; i < len; ++i) { var overrides = arguments[i] if (typeof overrides === 'object') { var keys = Object.keys(overrides) for (var j = 0, keyLen = keys.length; j < keyLen; ++j) { var k = keys[j] if (overrides[k] !== undefined) { target[k] = overrides[k] } } } } return target } var debug if (process.env.NODE_DEBUG && /\btunnel\b/.test(process.env.NODE_DEBUG)) { debug = function() { var args = Array.prototype.slice.call(arguments) if (typeof args[0] === 'string') { args[0] = 'TUNNEL: ' + args[0] } else { args.unshift('TUNNEL:') } console.error.apply(console, args) } } else { debug = function() {} } exports.debug = debug // for test npm_3.5.2.orig/node_modules/request/node_modules/tunnel-agent/package.json0000644000000000000000000000177512631326456025215 0ustar 00000000000000{ "author": { "name": "Mikeal Rogers", "email": "mikeal.rogers@gmail.com", "url": "http://www.futurealoof.com" }, "name": "tunnel-agent", "description": "HTTP proxy tunneling agent. Formerly part of mikeal/request, now a standalone module.", "version": "0.4.1", "repository": { "url": "git+https://github.com/mikeal/tunnel-agent.git" }, "main": "index.js", "dependencies": {}, "devDependencies": {}, "optionalDependencies": {}, "engines": { "node": "*" }, "readme": "tunnel-agent\n============\n\nHTTP proxy tunneling agent. Formerly part of mikeal/request, now a standalone module.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/mikeal/tunnel-agent/issues" }, "homepage": "https://github.com/mikeal/tunnel-agent#readme", "_id": "tunnel-agent@0.4.1", "_shasum": "bbeecff4d679ce753db9462761a88dfcec3c5ab3", "_resolved": "https://registry.npmjs.org/tunnel-agent/-/tunnel-agent-0.4.1.tgz", "_from": "tunnel-agent@>=0.4.1 <0.5.0" } npm_3.5.2.orig/node_modules/retry/.npmignore0000644000000000000000000000003612631326456017312 0ustar 00000000000000/node_modules/* npm-debug.log npm_3.5.2.orig/node_modules/retry/License0000644000000000000000000000216312631326456016623 0ustar 00000000000000Copyright (c) 2011: Tim Koschützki (tim@debuggable.com) Felix Geisendörfer (felix@debuggable.com) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/retry/Makefile0000644000000000000000000000045412631326456016757 0ustar 00000000000000SHELL := /bin/bash test: @node test/runner.js release-major: test npm version major -m "Release %s" git push npm publish release-minor: test npm version minor -m "Release %s" git push npm publish release-patch: test npm version patch -m "Release %s" git push npm publish .PHONY: test npm_3.5.2.orig/node_modules/retry/Readme.md0000644000000000000000000001553012631326456017037 0ustar 00000000000000# retry Abstraction for exponential and custom retry strategies for failed operations. ## Installation npm install retry ## Current Status This module has been tested and is ready to be used. ## Tutorial The example below will retry a potentially failing `dns.resolve` operation `10` times using an exponential backoff strategy. With the default settings, this means the last attempt is made after `17 minutes and 3 seconds`. ``` javascript var dns = require('dns'); var retry = require('retry'); function faultTolerantResolve(address, cb) { var operation = retry.operation(); operation.attempt(function(currentAttempt) { dns.resolve(address, function(err, addresses) { if (operation.retry(err)) { return; } cb(err ? operation.mainError() : null, addresses); }); }); } faultTolerantResolve('nodejs.org', function(err, addresses) { console.log(err, addresses); }); ``` Of course you can also configure the factors that go into the exponential backoff. See the API documentation below for all available settings. currentAttempt is an int representing the number of attempts so far. ``` javascript var operation = retry.operation({ retries: 5, factor: 3, minTimeout: 1 * 1000, maxTimeout: 60 * 1000, randomize: true, }); ``` ## API ### retry.operation([options]) Creates a new `RetryOperation` object. See the `retry.timeouts()` function below for available `options`. ### retry.timeouts([options]) Returns an array of timeouts. All time `options` and return values are in milliseconds. If `options` is an array, a copy of that array is returned. `options` is a JS object that can contain any of the following keys: * `retries`: The maximum amount of times to retry the operation. Default is `10`. * `factor`: The exponential factor to use. Default is `2`. * `minTimeout`: The number of milliseconds before starting the first retry. Default is `1000`. * `maxTimeout`: The maximum number of milliseconds between two retries. Default is `Infinity`. * `randomize`: Randomizes the timeouts by multiplying with a factor between `1` to `2`. Default is `false`. The formula used to calculate the individual timeouts is: ``` var Math.min(random * minTimeout * Math.pow(factor, attempt), maxTimeout); ``` Have a look at [this article][article] for a better explanation of approach. If you want to tune your `factor` / `times` settings to attempt the last retry after a certain amount of time, you can use wolfram alpha. For example in order to tune for `10` attempts in `5 minutes`, you can use this equation: ![screenshot](https://github.com/tim-kos/node-retry/raw/master/equation.gif) Explaining the various values from left to right: * `k = 0 ... 
9`: The `retries` value (10) * `1000`: The `minTimeout` value in ms (1000) * `x^k`: No need to change this, `x` will be your resulting factor * `5 * 60 * 1000`: The desired total amount of time for retrying in ms (5 minutes) To make this a little easier for you, use wolfram alpha to do the calculations. [article]: http://dthain.blogspot.com/2009/02/exponential-backoff-in-distributed.html ### retry.createTimeout(attempt, opts) Returns a new `timeout` (integer in milliseconds) based on the given parameters. `attempt` is an integer representing the retry for which the timeout should be calculated. If your retry operation was executed 4 times you had one attempt and 3 retries. If you then want to calculate a new timeout, you should set `attempt` to 4 (attempts are zero-indexed). `opts` can include `factor`, `minTimeout`, `randomize` (boolean) and `maxTimeout`. They are documented above. `retry.createTimeout()` is used internally by `retry.timeouts()` and is public for you to be able to create your own timeouts for reinserting an item, see [issue #13](https://github.com/tim-kos/node-retry/issues/13). ### retry.wrap(obj, [options], [methodNames]) Wraps all functions of the `obj` with retry. Optionally you can pass operation options and an array of method names which need to be wrapped. ``` retry.wrap(obj) retry.wrap(obj, ['method1', 'method2']); retry.wrap(obj, {retries: 3}); retry.wrap(obj, {retries: 3}, ['method1', 'method2']); ``` The `options` object can take any options that the usual call to `retry.operation` can take. ### new RetryOperation(timeouts) Creates a new `RetryOperation` where `timeouts` is an array where each value is a timeout given in milliseconds. #### retryOperation.errors() Returns an array of all errors that have been passed to `retryOperation.retry()` so far. #### retryOperation.mainError() A reference to the error object that occurred most frequently. Errors are compared using the `error.message` property. If multiple error messages occurred the same number of times, the last error object with that message is returned. If no errors occurred so far, the value is `null`. #### retryOperation.attempt(fn, timeoutOps) Defines the function `fn` that is to be retried and executes it for the first time right away. The `fn` function can receive an optional `currentAttempt` argument that represents the number of attempts to execute `fn` so far. Optionally defines `timeoutOps` which is an object having a property `timeout` in milliseconds and a property `cb` callback function. Whenever your retry operation takes longer than `timeout` to execute, the timeout callback function `cb` is called. #### retryOperation.try(fn) This is an alias for `retryOperation.attempt(fn)`. It is deprecated. #### retryOperation.start(fn) This is an alias for `retryOperation.attempt(fn)`. It is deprecated. #### retryOperation.retry(error) Returns `false` when no `error` value is given, or the maximum number of retries has been reached. Otherwise it returns `true`, and retries the operation after the timeout for the current attempt number. #### retryOperation.attempts() Returns an int representing the number of attempts it took to call `fn` before it was successful. ## License retry is licensed under the MIT license. # Changelog 0.7.0 Some bugfixes and made retry.createTimeout() public. Fixed issues [#10](https://github.com/tim-kos/node-retry/issues/10), [#12](https://github.com/tim-kos/node-retry/issues/12), and [#13](https://github.com/tim-kos/node-retry/issues/13).
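As a quick illustration of the `retry.timeouts()` / `retry.createTimeout()` API documented above (a minimal sketch, assuming the bundled retry module; nothing here performs real work):

``` javascript
var retry = require('retry');

// The documented defaults: 10 retries, factor 2, minTimeout 1000ms.
// Summing these delays gives the "17 minutes and 3 seconds" quoted in
// the tutorial: 1000 * (2^10 - 1) = 1,023,000 ms.
var schedule = retry.timeouts();
console.log(schedule); // [1000, 2000, 4000, ..., 512000]

// createTimeout() is the building block; attempt is zero-indexed. Note
// that it applies no defaults itself, so pass maxTimeout explicitly.
var fifth = retry.createTimeout(4, {
  minTimeout: 1000,
  factor: 2,
  maxTimeout: Infinity,
  randomize: false
});
console.log(fifth); // 16000
```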
0.6.0 Introduced optional timeoutOps parameter for the attempt() function, which is an object having a property timeout in milliseconds and a property cb callback function. Whenever your retry operation takes longer than timeout to execute, the timeout callback function cb is called. 0.5.0 Some minor refactorings. 0.4.0 Changed retryOperation.try() to retryOperation.attempt(). Deprecated the aliases start() and try() for it. 0.3.0 Added retryOperation.start() which is an alias for retryOperation.try(). 0.2.0 Added attempts() function and parameter to retryOperation.try() representing the number of attempts it took to call fn(). npm_3.5.2.orig/node_modules/retry/equation.gif0000644000000000000000000000227112631326456017632 0ustar 00000000000000[binary GIF image data omitted; the text resumes partway through the following package.json] =0.8.0 <0.9.0", "_npmVersion": "2.1.7", "_nodeVersion": "0.10.33", "_npmUser": { "name": "tim-kos", "email": "tim@debuggable.com" }, "maintainers": [ { "name": "tim-kos", "email": "tim@debuggable.com" } ], "dist": { "shasum": "2367628dc0edb247b1eab649dc53ac8628ac2d5f", "tarball": "http://registry.npmjs.org/retry/-/retry-0.8.0.tgz" }, "_resolved": "https://registry.npmjs.org/retry/-/retry-0.8.0.tgz" } npm_3.5.2.orig/node_modules/retry/test/0000755000000000000000000000000012631326456016273 5ustar 00000000000000npm_3.5.2.orig/node_modules/retry/example/dns.js0000644000000000000000000000126012631326456020070 0ustar 00000000000000var dns = require('dns'); var retry = require('../lib/retry'); function faultTolerantResolve(address, cb) { var opts = { retries: 2, factor: 2, minTimeout: 1 * 1000, maxTimeout: 2 * 1000, randomize: true }; var operation = retry.operation(opts); operation.attempt(function(currentAttempt) { dns.resolve(address, function(err, addresses) { if (operation.retry(err)) { return; } cb(operation.mainError(), operation.errors(), addresses); }); }); } faultTolerantResolve('nodejs.org', function(err, errors, addresses) { console.warn('err:'); console.log(err); console.warn('addresses:'); console.log(addresses); }); npm_3.5.2.orig/node_modules/retry/lib/retry.js0000644000000000000000000000411312631326456017564 0ustar 00000000000000var RetryOperation = require('./retry_operation'); exports.operation = function(options) { var retryForever = false; if (options && options.forever === true) retryForever = true; var timeouts = exports.timeouts(options); return new RetryOperation(timeouts, retryForever); }; exports.timeouts = function(options) { if (options instanceof Array) { return [].concat(options); } var opts = { retries: 10, factor: 2, minTimeout: 1 * 1000, maxTimeout: Infinity, randomize: false }; for (var key in options) { opts[key] = options[key]; } if (opts.minTimeout > opts.maxTimeout) { throw new Error('minTimeout is greater than maxTimeout'); } var timeouts = []; for (var i = 0; i < opts.retries; i++) { timeouts.push(this.createTimeout(i, opts)); } // sort the array numerically ascending timeouts.sort(function(a,b) { return a - b; }); return timeouts; }; exports.createTimeout = function(attempt, opts) { var random = (opts.randomize) ?
(Math.random() + 1) : 1; var timeout = Math.round(random * opts.minTimeout * Math.pow(opts.factor, attempt)); timeout = Math.min(timeout, opts.maxTimeout); return timeout; }; exports.wrap = function(obj, options, methods) { if (options instanceof Array) { methods = options; options = null; } if (!methods) { methods = []; for (var key in obj) { if (typeof obj[key] === 'function') { methods.push(key); } } } for (var i = 0; i < methods.length; i++) { var method = methods[i]; var original = obj[method]; obj[method] = function retryWrapper() { var op = exports.operation(options); var args = Array.prototype.slice.call(arguments); var callback = args.pop(); args.push(function(err) { if (op.retry(err)) { return; } if (err) { arguments[0] = op.mainError(); } callback.apply(this, arguments); }); op.attempt(function() { original.apply(obj, args); }); }; obj[method].options = options; } }; npm_3.5.2.orig/node_modules/retry/lib/retry_operation.js0000644000000000000000000000510612631326456021647 0ustar 00000000000000function RetryOperation(timeouts, retryForever) { this._timeouts = timeouts; this._fn = null; this._errors = []; this._attempts = 1; this._operationTimeout = null; this._operationTimeoutCb = null; this._timeout = null; if (!!retryForever) { this._cachedTimeouts = this._timeouts.slice(0); } } module.exports = RetryOperation; RetryOperation.prototype.retry = function(err) { if (this._timeout) { clearTimeout(this._timeout); } if (!err) { return false; } this._errors.push(err); var timeout = this._timeouts.shift(); if (timeout === undefined) { if (this._cachedTimeouts) { // retry forever, only keep last error this._errors.splice(this._errors.length - 1, this._errors.length); this._timeouts = this._cachedTimeouts.slice(0); timeout = this._timeouts.shift(); } else { return false; } } var self = this; setTimeout(function() { self._attempts++; if (self._operationTimeoutCb) { self._timeout = setTimeout(function() { self._operationTimeoutCb(self._attempts); }, self._operationTimeout); } self._fn(self._attempts); }, timeout); return true; }; RetryOperation.prototype.attempt = function(fn, timeoutOps) { this._fn = fn; if (timeoutOps) { if (timeoutOps.timeout) { this._operationTimeout = timeoutOps.timeout; } if (timeoutOps.cb) { this._operationTimeoutCb = timeoutOps.cb; } } var self = this; if (this._operationTimeoutCb) { this._timeout = setTimeout(function() { self._operationTimeoutCb(); }, self._operationTimeout); } this._fn(this._attempts); }; RetryOperation.prototype.try = function(fn) { console.log('Using RetryOperation.try() is deprecated'); this.attempt(fn); }; RetryOperation.prototype.start = function(fn) { console.log('Using RetryOperation.start() is deprecated'); this.attempt(fn); }; RetryOperation.prototype.start = RetryOperation.prototype.try; RetryOperation.prototype.errors = function() { return this._errors; }; RetryOperation.prototype.attempts = function() { return this._attempts; }; RetryOperation.prototype.mainError = function() { if (this._errors.length === 0) { return null; } var counts = {}; var mainError = null; var mainErrorCount = 0; for (var i = 0; i < this._errors.length; i++) { var error = this._errors[i]; var message = error.message; var count = (counts[message] || 0) + 1; counts[message] = count; if (count >= mainErrorCount) { mainError = error; mainErrorCount = count; } } return mainError; }; npm_3.5.2.orig/node_modules/retry/test/common.js0000644000000000000000000000032012631326456020114 0ustar 00000000000000var common = module.exports; var path = require('path'); var 
rootDir = path.join(__dirname, '..'); common.dir = { lib: rootDir + '/lib' }; common.assert = require('assert'); common.fake = require('fake');npm_3.5.2.orig/node_modules/retry/test/integration/0000755000000000000000000000000012631326456020616 5ustar 00000000000000npm_3.5.2.orig/node_modules/retry/test/runner.js0000644000000000000000000000014612631326456020143 0ustar 00000000000000var far = require('far').create(); far.add(__dirname); far.include(/\/test-.*\.js$/); far.execute(); npm_3.5.2.orig/node_modules/retry/test/integration/test-retry-operation.js0000644000000000000000000000531212631326456025275 0ustar 00000000000000var common = require('../common'); var assert = common.assert; var fake = common.fake.create(); var retry = require(common.dir.lib + '/retry'); (function testErrors() { var operation = retry.operation(); var error = new Error('some error'); var error2 = new Error('some other error'); operation._errors.push(error); operation._errors.push(error2); assert.deepEqual(operation.errors(), [error, error2]); })(); (function testMainErrorReturnsMostFrequentError() { var operation = retry.operation(); var error = new Error('some error'); var error2 = new Error('some other error'); operation._errors.push(error); operation._errors.push(error2); operation._errors.push(error); assert.strictEqual(operation.mainError(), error); })(); (function testMainErrorReturnsLastErrorOnEqualCount() { var operation = retry.operation(); var error = new Error('some error'); var error2 = new Error('some other error'); operation._errors.push(error); operation._errors.push(error2); assert.strictEqual(operation.mainError(), error2); })(); (function testAttempt() { var operation = retry.operation(); var fn = new Function(); var timeoutOpts = { timeout: 1, cb: function() {} }; operation.attempt(fn, timeoutOpts); assert.strictEqual(fn, operation._fn); assert.strictEqual(timeoutOpts.timeout, operation._operationTimeout); assert.strictEqual(timeoutOpts.cb, operation._operationTimeoutCb); })(); (function testRetry() { var times = 3; var error = new Error('some error'); var operation = retry.operation([1, 2, 3]); var attempts = 0; var finalCallback = fake.callback('finalCallback'); fake.expectAnytime(finalCallback); var fn = function() { operation.attempt(function(currentAttempt) { attempts++; assert.equal(currentAttempt, attempts); if (operation.retry(error)) { return; } assert.strictEqual(attempts, 4); assert.strictEqual(operation.attempts(), attempts); assert.strictEqual(operation.mainError(), error); finalCallback(); }); }; fn(); })(); (function testRetryForever() { var error = new Error('some error'); var operation = retry.operation({ retries: 3, forever: true }); var attempts = 0; var finalCallback = fake.callback('finalCallback'); fake.expectAnytime(finalCallback); var fn = function() { operation.attempt(function(currentAttempt) { attempts++; assert.equal(currentAttempt, attempts); if (attempts !== 6 && operation.retry(error)) { return; } assert.strictEqual(attempts, 6); assert.strictEqual(operation.attempts(), attempts); assert.strictEqual(operation.mainError(), error); finalCallback(); }); }; fn(); })(); npm_3.5.2.orig/node_modules/retry/test/integration/test-retry-wrap.js0000644000000000000000000000401512631326456024245 0ustar 00000000000000var common = require('../common'); var assert = common.assert; var fake = common.fake.create(); var retry = require(common.dir.lib + '/retry'); function getLib() { return { fn1: function() {}, fn2: function() {}, fn3: function() {} }; } (function wrapAll() { var lib = 
getLib(); retry.wrap(lib); assert.equal(lib.fn1.name, 'retryWrapper'); assert.equal(lib.fn2.name, 'retryWrapper'); assert.equal(lib.fn3.name, 'retryWrapper'); }()); (function wrapAllPassOptions() { var lib = getLib(); retry.wrap(lib, {retries: 2}); assert.equal(lib.fn1.name, 'retryWrapper'); assert.equal(lib.fn2.name, 'retryWrapper'); assert.equal(lib.fn3.name, 'retryWrapper'); assert.equal(lib.fn1.options.retries, 2); assert.equal(lib.fn2.options.retries, 2); assert.equal(lib.fn3.options.retries, 2); }()); (function wrapDefined() { var lib = getLib(); retry.wrap(lib, ['fn2', 'fn3']); assert.notEqual(lib.fn1.name, 'retryWrapper'); assert.equal(lib.fn2.name, 'retryWrapper'); assert.equal(lib.fn3.name, 'retryWrapper'); }()); (function wrapDefinedAndPassOptions() { var lib = getLib(); retry.wrap(lib, {retries: 2}, ['fn2', 'fn3']); assert.notEqual(lib.fn1.name, 'retryWrapper'); assert.equal(lib.fn2.name, 'retryWrapper'); assert.equal(lib.fn3.name, 'retryWrapper'); assert.equal(lib.fn2.options.retries, 2); assert.equal(lib.fn3.options.retries, 2); }()); (function runWrappedWithoutError() { var callbackCalled; var lib = {method: function(a, b, callback) { assert.equal(a, 1); assert.equal(b, 2); assert.equal(typeof callback, 'function'); callback(); }}; retry.wrap(lib); lib.method(1, 2, function() { callbackCalled = true; }); assert.ok(callbackCalled); }()); (function runWrappedWithError() { var callbackCalled; var lib = {method: function(callback) { callback(new Error('Some error')); }}; retry.wrap(lib, {retries: 1}); lib.method(function(err) { callbackCalled = true; assert.ok(err instanceof Error); }); assert.ok(!callbackCalled); }()); npm_3.5.2.orig/node_modules/retry/test/integration/test-timeouts.js0000644000000000000000000000336312631326456024007 0ustar 00000000000000var common = require('../common'); var assert = common.assert; var retry = require(common.dir.lib + '/retry'); (function testDefaultValues() { var timeouts = retry.timeouts(); assert.equal(timeouts.length, 10); assert.equal(timeouts[0], 1000); assert.equal(timeouts[1], 2000); assert.equal(timeouts[2], 4000); })(); (function testDefaultValuesWithRandomize() { var minTimeout = 5000; var timeouts = retry.timeouts({ minTimeout: minTimeout, randomize: true }); assert.equal(timeouts.length, 10); assert.ok(timeouts[0] > minTimeout); assert.ok(timeouts[1] > timeouts[0]); assert.ok(timeouts[2] > timeouts[1]); })(); (function testPassedTimeoutsAreUsed() { var timeoutsArray = [1000, 2000, 3000]; var timeouts = retry.timeouts(timeoutsArray); assert.deepEqual(timeouts, timeoutsArray); assert.notStrictEqual(timeouts, timeoutsArray); })(); (function testTimeoutsAreWithinBoundaries() { var minTimeout = 1000; var maxTimeout = 10000; var timeouts = retry.timeouts({ minTimeout: minTimeout, maxTimeout: maxTimeout }); for (var i = 0; i < timeouts; i++) { assert.ok(timeouts[i] >= minTimeout); assert.ok(timeouts[i] <= maxTimeout); } })(); (function testTimeoutsAreIncremental() { var timeouts = retry.timeouts(); var lastTimeout = timeouts[0]; for (var i = 0; i < timeouts; i++) { assert.ok(timeouts[i] > lastTimeout); lastTimeout = timeouts[i]; } })(); (function testTimeoutsAreIncrementalForFactorsLessThanOne() { var timeouts = retry.timeouts({ retries: 3, factor: 0.5 }); var expected = [250, 500, 1000]; assert.deepEqual(expected, timeouts); })(); (function testRetries() { var timeouts = retry.timeouts({retries: 2}); assert.strictEqual(timeouts.length, 2); })(); npm_3.5.2.orig/node_modules/rimraf/LICENSE0000644000000000000000000000137512631326456016442 
0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/rimraf/README.md0000644000000000000000000000317112631326456016710 0ustar 00000000000000[![Build Status](https://travis-ci.org/isaacs/rimraf.svg?branch=master)](https://travis-ci.org/isaacs/rimraf) [![Dependency Status](https://david-dm.org/isaacs/rimraf.svg)](https://david-dm.org/isaacs/rimraf) [![devDependency Status](https://david-dm.org/isaacs/rimraf/dev-status.svg)](https://david-dm.org/isaacs/rimraf#info=devDependencies) The [UNIX command](http://en.wikipedia.org/wiki/Rm_(Unix)) `rm -rf` for node. Install with `npm install rimraf`, or just drop rimraf.js somewhere. ## API `rimraf(f, callback)` The callback will be called with an error if there is one. Certain errors are handled for you: * Windows: `EBUSY` and `ENOTEMPTY` - rimraf will back off a maximum of `opts.maxBusyTries` times before giving up, adding 100ms of wait between each attempt. The default `maxBusyTries` is 3. * `ENOENT` - If the file doesn't exist, rimraf will return successfully, since your desired outcome is already the case. * `EMFILE` - Since `readdir` requires opening a file descriptor, it's possible to hit `EMFILE` if too many file descriptors are in use. In the sync case, there's nothing to be done for this. But in the async case, rimraf will gradually back off with timeouts up to `opts.emfileWait` ms, which defaults to 1000. ## rimraf.sync It can remove stuff synchronously, too. But that's not so good. Use the async API. It's better. ## CLI If installed with `npm install rimraf -g` it can be used as a global command `rimraf [ ...]` which is useful for cross platform support. ## mkdirp If you need to create a directory recursively, check out [mkdirp](https://github.com/substack/node-mkdirp). npm_3.5.2.orig/node_modules/rimraf/bin.js0000755000000000000000000000150612631326456016542 0ustar 00000000000000#!/usr/bin/env node var rimraf = require('./') var help = false var dashdash = false var args = process.argv.slice(2).filter(function(arg) { if (dashdash) return !!arg else if (arg === '--') dashdash = true else if (arg.match(/^(-+|\/)(h(elp)?|\?)$/)) help = true else return !!arg }); if (help || args.length === 0) { // If they didn't ask for help, then this is not a "success" var log = help ? console.log : console.error log('Usage: rimraf [ ...]') log('') log(' Deletes all files and folders at "path" recursively.') log('') log('Options:') log('') log(' -h, --help Display this usage info') process.exit(help ? 
0 : 1) } else go(0) function go (n) { if (n >= args.length) return rimraf(args[n], function (er) { if (er) throw er go(n+1) }) } npm_3.5.2.orig/node_modules/rimraf/package.json0000644000000000000000000000522112631326456017715 0ustar 00000000000000{ "name": "rimraf", "version": "2.4.3", "main": "rimraf.js", "description": "A deep deletion module for node (like `rm -rf`)", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "repository": { "type": "git", "url": "git://github.com/isaacs/rimraf.git" }, "scripts": { "test": "tap test/*.js" }, "bin": { "rimraf": "./bin.js" }, "dependencies": { "glob": "^5.0.14" }, "files": [ "LICENSE", "README.md", "bin.js", "rimraf.js" ], "devDependencies": { "mkdirp": "^0.5.1", "tap": "^1.3.1" }, "readme": "[verbatim duplicate of README.md above, omitted]", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/isaacs/rimraf/issues" }, "homepage": "https://github.com/isaacs/rimraf#readme", "_id": "rimraf@2.4.3", "_shasum": "e5b51c9437a4c582adb955e9f28cf8d945e272af", "_resolved": "https://registry.npmjs.org/rimraf/-/rimraf-2.4.3.tgz", "_from": "rimraf@>=2.4.3 <2.5.0" } npm_3.5.2.orig/node_modules/rimraf/rimraf.js0000644000000000000000000001740112631326456017250 0ustar 00000000000000module.exports = rimraf rimraf.sync = rimrafSync var assert = require("assert") var path = require("path") var fs = require("fs") var glob = require("glob") var globOpts = { nosort: true, nocomment: true, nonegate: true, silent: true } // for EMFILE handling var timeout = 0 var isWindows = (process.platform === "win32") function defaults (options) { var methods = [ 'unlink', 'chmod', 'stat', 'lstat', 'rmdir', 'readdir' ] methods.forEach(function(m) { options[m] = options[m] || fs[m] m = m + 'Sync' options[m] = options[m] || fs[m] }) options.maxBusyTries = options.maxBusyTries || 3 options.emfileWait = options.emfileWait || 1000 options.disableGlob = options.disableGlob || false } function rimraf (p, options, cb) { if (typeof options === 'function') { cb = options options = {} } assert(p, 'rimraf: missing path') assert.equal(typeof p, 'string', 'rimraf: path should be a string') assert(options, 'rimraf: missing options') assert.equal(typeof options, 'object', 'rimraf: options should be object') assert.equal(typeof cb, 'function', 'rimraf: callback function required') defaults(options) var busyTries = 0 var errState = null var n = 0 if (options.disableGlob || !glob.hasMagic(p)) return afterGlob(null, [p]) fs.lstat(p, function (er, stat) { if (!er) return afterGlob(null, [p]) glob(p, globOpts, afterGlob) }) function next (er) { errState = errState || er if (--n === 0) cb(errState) } function afterGlob (er, results) { if (er) return cb(er) n = results.length if (n === 0) return cb() results.forEach(function (p) { rimraf_(p, options, function CB (er) { if (er) { if (isWindows && (er.code === "EBUSY" || er.code === "ENOTEMPTY" || er.code === "EPERM") && busyTries < options.maxBusyTries) { busyTries ++ var time = busyTries * 100 // try again, with the same exact callback as this one. return setTimeout(function () { rimraf_(p, options, CB) }, time) } // this one won't happen if graceful-fs is used. if (er.code === "EMFILE" && timeout < options.emfileWait) { return setTimeout(function () { rimraf_(p, options, CB) }, timeout ++) } // already gone if (er.code === "ENOENT") er = null } timeout = 0 next(er) }) }) } } // Two possible strategies. // 1. Assume it's a file. unlink it, then do the dir stuff on EPERM or EISDIR // 2. Assume it's a directory. readdir, then do the file stuff on ENOTDIR // // Both result in an extra syscall when you guess wrong. However, there // are likely far more normal files in the world than directories. This // is based on the assumption that the average number of files per // directory is >= 1. // // If anyone ever complains about this, then I guess the strategy could // be made configurable somehow. But until then, YAGNI.
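// Usage sketch (an illustration added here, not part of the original
// rimraf source): the guess-file-first strategy described above is
// invisible to callers, who only see the simple async API:
//
//   var rimraf = require("rimraf")
//   rimraf("/tmp/some-build-output", function (er) {
//     if (er) throw er
//     // ENOENT counted as success: the path is gone either way
//   })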
function rimraf_ (p, options, cb) { assert(p) assert(options) assert(typeof cb === 'function') // sunos lets the root user unlink directories, which is... weird. // so we have to lstat here and make sure it's not a dir. options.lstat(p, function (er, st) { if (er && er.code === "ENOENT") return cb(null) if (st && st.isDirectory()) return rmdir(p, options, er, cb) options.unlink(p, function (er) { if (er) { if (er.code === "ENOENT") return cb(null) if (er.code === "EPERM") return (isWindows) ? fixWinEPERM(p, options, er, cb) : rmdir(p, options, er, cb) if (er.code === "EISDIR") return rmdir(p, options, er, cb) } return cb(er) }) }) } function fixWinEPERM (p, options, er, cb) { assert(p) assert(options) assert(typeof cb === 'function') if (er) assert(er instanceof Error) options.chmod(p, 666, function (er2) { if (er2) cb(er2.code === "ENOENT" ? null : er) else options.stat(p, function(er3, stats) { if (er3) cb(er3.code === "ENOENT" ? null : er) else if (stats.isDirectory()) rmdir(p, options, er, cb) else options.unlink(p, cb) }) }) } function fixWinEPERMSync (p, options, er) { assert(p) assert(options) if (er) assert(er instanceof Error) try { options.chmodSync(p, 666) } catch (er2) { if (er2.code === "ENOENT") return else throw er } try { var stats = options.statSync(p) } catch (er3) { if (er3.code === "ENOENT") return else throw er } if (stats.isDirectory()) rmdirSync(p, options, er) else options.unlinkSync(p) } function rmdir (p, options, originalEr, cb) { assert(p) assert(options) if (originalEr) assert(originalEr instanceof Error) assert(typeof cb === 'function') // try to rmdir first, and only readdir on ENOTEMPTY or EEXIST (SunOS) // if we guessed wrong, and it's not a directory, then // raise the original error. options.rmdir(p, function (er) { if (er && (er.code === "ENOTEMPTY" || er.code === "EEXIST" || er.code === "EPERM")) rmkids(p, options, cb) else if (er && er.code === "ENOTDIR") cb(originalEr) else cb(er) }) } function rmkids(p, options, cb) { assert(p) assert(options) assert(typeof cb === 'function') options.readdir(p, function (er, files) { if (er) return cb(er) var n = files.length if (n === 0) return options.rmdir(p, cb) var errState files.forEach(function (f) { rimraf(path.join(p, f), options, function (er) { if (errState) return if (er) return cb(errState = er) if (--n === 0) options.rmdir(p, cb) }) }) }) } // this looks simpler, and is strictly *faster*, but will // tie up the JavaScript thread and fail on excessively // deep directory trees. function rimrafSync (p, options) { options = options || {} defaults(options) assert(p, 'rimraf: missing path') assert.equal(typeof p, 'string', 'rimraf: path should be a string') assert(options, 'rimraf: missing options') assert.equal(typeof options, 'object', 'rimraf: options should be object') var results if (options.disableGlob || !glob.hasMagic(p)) { results = [p] } else { try { fs.lstatSync(p) results = [p] } catch (er) { results = glob.sync(p, globOpts) } } if (!results.length) return for (var i = 0; i < results.length; i++) { var p = results[i] try { var st = options.lstatSync(p) } catch (er) { if (er.code === "ENOENT") return } try { // sunos lets the root user unlink directories, which is... weird. if (st && st.isDirectory()) rmdirSync(p, options, null) else options.unlinkSync(p) } catch (er) { if (er.code === "ENOENT") return if (er.code === "EPERM") return isWindows ? 
fixWinEPERMSync(p, options, er) : rmdirSync(p, options, er) if (er.code !== "EISDIR") throw er rmdirSync(p, options, er) } } } function rmdirSync (p, options, originalEr) { assert(p) assert(options) if (originalEr) assert(originalEr instanceof Error) try { options.rmdirSync(p) } catch (er) { if (er.code === "ENOENT") return if (er.code === "ENOTDIR") throw originalEr if (er.code === "ENOTEMPTY" || er.code === "EEXIST" || er.code === "EPERM") rmkidsSync(p, options) } } function rmkidsSync (p, options) { assert(p) assert(options) options.readdirSync(p).forEach(function (f) { rimrafSync(path.join(p, f), options) }) options.rmdirSync(p, options) } npm_3.5.2.orig/node_modules/semver/.npmignore0000644000000000000000000000006112631326456017444 0ustar 00000000000000node_modules/ coverage/ .nyc_output/ nyc_output/ npm_3.5.2.orig/node_modules/semver/.travis.yml0000644000000000000000000000007412631326456017562 0ustar 00000000000000language: node_js node_js: - '0.10' - '0.12' - 'iojs' npm_3.5.2.orig/node_modules/semver/LICENSE0000644000000000000000000000137512631326456016463 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/semver/README.md0000644000000000000000000002705112631326456016734 0ustar 00000000000000semver(1) -- The semantic versioner for npm =========================================== ## Usage $ npm install semver semver.valid('1.2.3') // '1.2.3' semver.valid('a.b.c') // null semver.clean(' =v1.2.3 ') // '1.2.3' semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true semver.gt('1.2.3', '9.8.7') // false semver.lt('1.2.3', '9.8.7') // true As a command-line utility: $ semver -h Usage: semver [ [...]] [-r | -i | --preid | -l | -rv] Test if version(s) satisfy the supplied range(s), and sort them. Multiple versions or ranges may be supplied, unless increment option is specified. In that case, only a single version may be used, and it is incremented by the specified level Program exits successfully if any valid version satisfies all supplied ranges, and prints all satisfying versions. If no versions are valid, or ranges are not satisfied, then exits failure. Versions are printed in ascending order, so supplying multiple versions to the utility will just sort them. ## Versions A "version" is described by the `v2.0.0` specification found at . A leading `"="` or `"v"` character is stripped off and ignored. ## Ranges A `version range` is a set of `comparators` which specify versions that satisfy the range. A `comparator` is composed of an `operator` and a `version`. The set of primitive `operators` is: * `<` Less than * `<=` Less than or equal to * `>` Greater than * `>=` Greater than or equal to * `=` Equal. If no operator is specified, then equality is assumed, so this operator is optional, but MAY be included. 
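For instance (a minimal sketch of these operators in action, using the `satisfies()` function documented under Functions below):

```javascript
var semver = require('semver');

semver.satisfies('1.2.8', '>=1.2.7'); // true
semver.satisfies('1.2.6', '>=1.2.7'); // false
semver.satisfies('1.2.7', '1.2.7');   // true - bare version implies '='
```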
For example, the comparator `>=1.2.7` would match the versions `1.2.7`, `1.2.8`, `2.5.3`, and `1.3.9`, but not the versions `1.2.6` or `1.1.0`. Comparators can be joined by whitespace to form a `comparator set`, which is satisfied by the **intersection** of all of the comparators it includes. A range is composed of one or more comparator sets, joined by `||`. A version matches a range if and only if every comparator in at least one of the `||`-separated comparator sets is satisfied by the version. For example, the range `>=1.2.7 <1.3.0` would match the versions `1.2.7`, `1.2.8`, and `1.2.99`, but not the versions `1.2.6`, `1.3.0`, or `1.1.0`. The range `1.2.7 || >=1.2.9 <2.0.0` would match the versions `1.2.7`, `1.2.9`, and `1.4.6`, but not the versions `1.2.8` or `2.0.0`. ### Prerelease Tags If a version has a prerelease tag (for example, `1.2.3-alpha.3`) then it will only be allowed to satisfy comparator sets if at least one comparator with the same `[major, minor, patch]` tuple also has a prerelease tag. For example, the range `>1.2.3-alpha.3` would be allowed to match the version `1.2.3-alpha.7`, but it would *not* be satisfied by `3.4.5-alpha.9`, even though `3.4.5-alpha.9` is technically "greater than" `1.2.3-alpha.3` according to the SemVer sort rules. The version range only accepts prerelease tags on the `1.2.3` version. The version `3.4.5` *would* satisfy the range, because it does not have a prerelease flag, and `3.4.5` is greater than `1.2.3-alpha.7`. The purpose for this behavior is twofold. First, prerelease versions frequently are updated very quickly, and contain many breaking changes that are (by the author's design) not yet fit for public consumption. Therefore, by default, they are excluded from range matching semantics. Second, a user who has opted into using a prerelease version has clearly indicated the intent to use *that specific* set of alpha/beta/rc versions. By including a prerelease tag in the range, the user is indicating that they are aware of the risk. However, it is still not appropriate to assume that they have opted into taking a similar risk on the *next* set of prerelease versions. #### Prerelease Identifiers The method `.inc` takes an additional `identifier` string argument that will append the value of the string as a prerelease identifier: ```javascript > semver.inc('1.2.3', 'pre', 'beta') '1.2.4-beta.0' ``` command-line example: ```shell $ semver 1.2.3 -i prerelease --preid beta 1.2.4-beta.0 ``` Which then can be used to increment further: ```shell $ semver 1.2.4-beta.0 -i prerelease 1.2.4-beta.1 ``` ### Advanced Range Syntax Advanced range syntax desugars to primitive comparators in deterministic ways. Advanced ranges may be combined in the same way as primitive comparators using white space or `||`. #### Hyphen Ranges `X.Y.Z - A.B.C` Specifies an inclusive set. * `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4` If a partial version is provided as the first version in the inclusive range, then the missing pieces are replaced with zeroes. * `1.2 - 2.3.4` := `>=1.2.0 <=2.3.4` If a partial version is provided as the second version in the inclusive range, then all versions that start with the supplied parts of the tuple are accepted, but nothing that would be greater than the provided tuple parts. * `1.2.3 - 2.3` := `>=1.2.3 <2.4.0` * `1.2.3 - 2` := `>=1.2.3 <3.0.0` #### X-Ranges `1.2.x` `1.X` `1.2.*` `*` Any of `X`, `x`, or `*` may be used to "stand in" for one of the numeric values in the `[major, minor, patch]` tuple. 
* `*` := `>=0.0.0` (Any version satisfies) * `1.x` := `>=1.0.0 <2.0.0` (Matching major version) * `1.2.x` := `>=1.2.0 <1.3.0` (Matching major and minor versions) A partial version range is treated as an X-Range, so the special character is in fact optional. * `""` (empty string) := `*` := `>=0.0.0` * `1` := `1.x.x` := `>=1.0.0 <2.0.0` * `1.2` := `1.2.x` := `>=1.2.0 <1.3.0` #### Tilde Ranges `~1.2.3` `~1.2` `~1` Allows patch-level changes if a minor version is specified on the comparator. Allows minor-level changes if not. * `~1.2.3` := `>=1.2.3 <1.(2+1).0` := `>=1.2.3 <1.3.0` * `~1.2` := `>=1.2.0 <1.(2+1).0` := `>=1.2.0 <1.3.0` (Same as `1.2.x`) * `~1` := `>=1.0.0 <(1+1).0.0` := `>=1.0.0 <2.0.0` (Same as `1.x`) * `~0.2.3` := `>=0.2.3 <0.(2+1).0` := `>=0.2.3 <0.3.0` * `~0.2` := `>=0.2.0 <0.(2+1).0` := `>=0.2.0 <0.3.0` (Same as `0.2.x`) * `~0` := `>=0.0.0 <(0+1).0.0` := `>=0.0.0 <1.0.0` (Same as `0.x`) * `~1.2.3-beta.2` := `>=1.2.3-beta.2 <1.3.0` Note that prereleases in the `1.2.3` version will be allowed, if they are greater than or equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but `1.2.4-beta.2` would not, because it is a prerelease of a different `[major, minor, patch]` tuple. #### Caret Ranges `^1.2.3` `^0.2.5` `^0.0.4` Allows changes that do not modify the left-most non-zero digit in the `[major, minor, patch]` tuple. In other words, this allows patch and minor updates for versions `1.0.0` and above, patch updates for versions `0.X >=0.1.0`, and *no* updates for versions `0.0.X`. Many authors treat a `0.x` version as if the `x` were the major "breaking-change" indicator. Caret ranges are ideal when an author may make breaking changes between `0.2.4` and `0.3.0` releases, which is a common practice. However, it presumes that there will *not* be breaking changes between `0.2.4` and `0.2.5`. It allows for changes that are presumed to be additive (but non-breaking), according to commonly observed practices. * `^1.2.3` := `>=1.2.3 <2.0.0` * `^0.2.3` := `>=0.2.3 <0.3.0` * `^0.0.3` := `>=0.0.3 <0.0.4` * `^1.2.3-beta.2` := `>=1.2.3-beta.2 <2.0.0` Note that prereleases in the `1.2.3` version will be allowed, if they are greater than or equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but `1.2.4-beta.2` would not, because it is a prerelease of a different `[major, minor, patch]` tuple. * `^0.0.3-beta` := `>=0.0.3-beta <0.0.4` Note that prereleases in the `0.0.3` version *only* will be allowed, if they are greater than or equal to `beta`. So, `0.0.3-pr.2` would be allowed. When parsing caret ranges, a missing `patch` value desugars to the number `0`, but will allow flexibility within that value, even if the major and minor versions are both `0`. * `^1.2.x` := `>=1.2.0 <2.0.0` * `^0.0.x` := `>=0.0.0 <0.1.0` * `^0.0` := `>=0.0.0 <0.1.0` A missing `minor` and `patch` values will desugar to zero, but also allow flexibility within those values, even if the major version is zero. * `^1.x` := `>=1.0.0 <2.0.0` * `^0.x` := `>=0.0.0 <1.0.0` ## Functions All methods and classes take a final `loose` boolean argument that, if true, will be more forgiving about not-quite-valid semver strings. The resulting output will always be 100% strict, of course. Strict-mode Comparators and Ranges will be strict about the SemVer strings that they parse. * `valid(v)`: Return the parsed version, or null if it's not valid. 
* `inc(v, release)`: Return the version incremented by the release type (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`), or null if it's not valid * `premajor` in one call will bump the version up to the next major version and down to a prerelease of that major version. `preminor`, and `prepatch` work the same way. * If called from a non-prerelease version, the `prerelease` will work the same as `prepatch`. It increments the patch version, then makes a prerelease. If the input version is already a prerelease it simply increments it. * `major(v)`: Return the major version number. * `minor(v)`: Return the minor version number. * `patch(v)`: Return the patch version number. ### Comparison * `gt(v1, v2)`: `v1 > v2` * `gte(v1, v2)`: `v1 >= v2` * `lt(v1, v2)`: `v1 < v2` * `lte(v1, v2)`: `v1 <= v2` * `eq(v1, v2)`: `v1 == v2` This is true if they're logically equivalent, even if they're not the exact same string. You already know how to compare strings. * `neq(v1, v2)`: `v1 != v2` The opposite of `eq`. * `cmp(v1, comparator, v2)`: Pass in a comparison string, and it'll call the corresponding function above. `"==="` and `"!=="` do simple string comparison, but are included for completeness. Throws if an invalid comparison string is provided. * `compare(v1, v2)`: Return `0` if `v1 == v2`, or `1` if `v1` is greater, or `-1` if `v2` is greater. Sorts in ascending order if passed to `Array.sort()`. * `rcompare(v1, v2)`: The reverse of compare. Sorts an array of versions in descending order when passed to `Array.sort()`. * `diff(v1, v2)`: Returns difference between two versions by the release type (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`), or null if the versions are the same. ### Ranges * `validRange(range)`: Return the valid range or null if it's not valid * `satisfies(version, range)`: Return true if the version satisfies the range. * `maxSatisfying(versions, range)`: Return the highest version in the list that satisfies the range, or `null` if none of them do. * `gtr(version, range)`: Return `true` if version is greater than all the versions possible in the range. * `ltr(version, range)`: Return `true` if version is less than all the versions possible in the range. * `outside(version, range, hilo)`: Return true if the version is outside the bounds of the range in either the high or low direction. The `hilo` argument must be either the string `'>'` or `'<'`. (This is the function called by `gtr` and `ltr`.) Note that, since ranges may be non-contiguous, a version might not be greater than a range, less than a range, *or* satisfy a range! For example, the range `1.2 <1.2.9 || >2.0.0` would have a hole from `1.2.9` until `2.0.0`, so the version `1.2.10` would not be greater than the range (because `2.0.1` satisfies, which is higher), nor less than the range (since `1.2.8` satisfies, which is lower), and it also does not satisfy the range. If you want to know if a version satisfies or does not satisfy a range, use the `satisfies(version, range)` function. 
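To make that caveat concrete, here is a small sketch (using only the functions documented above) of a version that neither satisfies a range nor sits entirely above or below it:

```javascript
var semver = require('semver');
var range = '1.2 <1.2.9 || >2.0.0';

semver.satisfies('1.2.10', range); // false - falls into the hole
semver.gtr('1.2.10', range);       // false - 2.0.1 in the range is higher
semver.ltr('1.2.10', range);       // false - 1.2.8 in the range is lower
```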
npm_3.5.2.orig/node_modules/semver/bin/0000755000000000000000000000000012631326456016220 5ustar 00000000000000npm_3.5.2.orig/node_modules/semver/package.json0000644000000000000000000003113712631326456017743 0ustar 00000000000000{ "name": "semver", "version": "5.0.3", "description": "The semantic version parser used by npm.", "main": "semver.js", "scripts": { "test": "tap test/*.js" }, "devDependencies": { "tap": "^1.3.4" }, "license": "ISC", "repository": { "type": "git", "url": "git+https://github.com/npm/node-semver.git" }, "bin": { "semver": "./bin/semver" }, "readme": "[verbatim duplicate of README.md above, omitted]", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/npm/node-semver/issues" }, "homepage": "https://github.com/npm/node-semver#readme", "_id": "semver@5.0.3", "_shasum": "77466de589cd5d3c95f138aa78bc569a3cb5d27a", "_resolved": "https://registry.npmjs.org/semver/-/semver-5.0.3.tgz", "_from": "semver@>=5.0.3 <5.1.0" } npm_3.5.2.orig/node_modules/semver/semver.js0000644000000000000000000010002112631326456017301 0ustar 00000000000000exports = module.exports = SemVer; // The debug function is excluded entirely from the minified version. /* nomin */ var debug; /* nomin */ if (typeof process === 'object' && /* nomin */ process.env && /* nomin */ process.env.NODE_DEBUG && /* nomin */ /\bsemver\b/i.test(process.env.NODE_DEBUG)) /* nomin */ debug = function() { /* nomin */ var args = Array.prototype.slice.call(arguments, 0); /* nomin */ args.unshift('SEMVER'); /* nomin */ console.log.apply(console, args); /* nomin */ }; /* nomin */ else /* nomin */ debug = function() {}; // Note: this is the semver.org version of the spec that it implements // Not necessarily the package version of this code.
exports.SEMVER_SPEC_VERSION = '2.0.0'; var MAX_LENGTH = 256; var MAX_SAFE_INTEGER = Number.MAX_SAFE_INTEGER || 9007199254740991; // The actual regexps go on exports.re var re = exports.re = []; var src = exports.src = []; var R = 0; // The following Regular Expressions can be used for tokenizing, // validating, and parsing SemVer version strings. // ## Numeric Identifier // A single `0`, or a non-zero digit followed by zero or more digits. var NUMERICIDENTIFIER = R++; src[NUMERICIDENTIFIER] = '0|[1-9]\\d*'; var NUMERICIDENTIFIERLOOSE = R++; src[NUMERICIDENTIFIERLOOSE] = '[0-9]+'; // ## Non-numeric Identifier // Zero or more digits, followed by a letter or hyphen, and then zero or // more letters, digits, or hyphens. var NONNUMERICIDENTIFIER = R++; src[NONNUMERICIDENTIFIER] = '\\d*[a-zA-Z-][a-zA-Z0-9-]*'; // ## Main Version // Three dot-separated numeric identifiers. var MAINVERSION = R++; src[MAINVERSION] = '(' + src[NUMERICIDENTIFIER] + ')\\.' + '(' + src[NUMERICIDENTIFIER] + ')\\.' + '(' + src[NUMERICIDENTIFIER] + ')'; var MAINVERSIONLOOSE = R++; src[MAINVERSIONLOOSE] = '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' + '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' + '(' + src[NUMERICIDENTIFIERLOOSE] + ')'; // ## Pre-release Version Identifier // A numeric identifier, or a non-numeric identifier. var PRERELEASEIDENTIFIER = R++; src[PRERELEASEIDENTIFIER] = '(?:' + src[NUMERICIDENTIFIER] + '|' + src[NONNUMERICIDENTIFIER] + ')'; var PRERELEASEIDENTIFIERLOOSE = R++; src[PRERELEASEIDENTIFIERLOOSE] = '(?:' + src[NUMERICIDENTIFIERLOOSE] + '|' + src[NONNUMERICIDENTIFIER] + ')'; // ## Pre-release Version // Hyphen, followed by one or more dot-separated pre-release version // identifiers. var PRERELEASE = R++; src[PRERELEASE] = '(?:-(' + src[PRERELEASEIDENTIFIER] + '(?:\\.' + src[PRERELEASEIDENTIFIER] + ')*))'; var PRERELEASELOOSE = R++; src[PRERELEASELOOSE] = '(?:-?(' + src[PRERELEASEIDENTIFIERLOOSE] + '(?:\\.' + src[PRERELEASEIDENTIFIERLOOSE] + ')*))'; // ## Build Metadata Identifier // Any combination of digits, letters, or hyphens. var BUILDIDENTIFIER = R++; src[BUILDIDENTIFIER] = '[0-9A-Za-z-]+'; // ## Build Metadata // Plus sign, followed by one or more period-separated build metadata // identifiers. var BUILD = R++; src[BUILD] = '(?:\\+(' + src[BUILDIDENTIFIER] + '(?:\\.' + src[BUILDIDENTIFIER] + ')*))'; // ## Full Version String // A main version, followed optionally by a pre-release version and // build metadata. // Note that the only major, minor, patch, and pre-release sections of // the version string are capturing groups. The build metadata is not a // capturing group, because it should not ever be used in version // comparison. var FULL = R++; var FULLPLAIN = 'v?' + src[MAINVERSION] + src[PRERELEASE] + '?' + src[BUILD] + '?'; src[FULL] = '^' + FULLPLAIN + '$'; // like full, but allows v1.2.3 and =1.2.3, which people do sometimes. // also, 1.0.0alpha1 (prerelease without the hyphen) which is pretty // common in the npm registry. var LOOSEPLAIN = '[v=\\s]*' + src[MAINVERSIONLOOSE] + src[PRERELEASELOOSE] + '?' + src[BUILD] + '?'; var LOOSE = R++; src[LOOSE] = '^' + LOOSEPLAIN + '$'; var GTLT = R++; src[GTLT] = '((?:<|>)?=?)'; // Something like "2.*" or "1.2.x". // Note that "x.x" is a valid xRange identifer, meaning "any version" // Only the first item is strictly required. 
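// For illustration (a sketch added here, not part of the original source):
// once compiled below, these patterns let the public API desugar x-ranges,
// e.g. require('./semver').validRange('1.2.x') returns '>=1.2.0 <1.3.0',
// and validRange('1.x') returns '>=1.0.0 <2.0.0'.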
var XRANGEIDENTIFIERLOOSE = R++; src[XRANGEIDENTIFIERLOOSE] = src[NUMERICIDENTIFIERLOOSE] + '|x|X|\\*'; var XRANGEIDENTIFIER = R++; src[XRANGEIDENTIFIER] = src[NUMERICIDENTIFIER] + '|x|X|\\*'; var XRANGEPLAIN = R++; src[XRANGEPLAIN] = '[v=\\s]*(' + src[XRANGEIDENTIFIER] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' + '(?:' + src[PRERELEASE] + ')?' + src[BUILD] + '?' + ')?)?'; var XRANGEPLAINLOOSE = R++; src[XRANGEPLAINLOOSE] = '[v=\\s]*(' + src[XRANGEIDENTIFIERLOOSE] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' + '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' + '(?:' + src[PRERELEASELOOSE] + ')?' + src[BUILD] + '?' + ')?)?'; var XRANGE = R++; src[XRANGE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAIN] + '$'; var XRANGELOOSE = R++; src[XRANGELOOSE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAINLOOSE] + '$'; // Tilde ranges. // Meaning is "reasonably at or greater than" var LONETILDE = R++; src[LONETILDE] = '(?:~>?)'; var TILDETRIM = R++; src[TILDETRIM] = '(\\s*)' + src[LONETILDE] + '\\s+'; re[TILDETRIM] = new RegExp(src[TILDETRIM], 'g'); var tildeTrimReplace = '$1~'; var TILDE = R++; src[TILDE] = '^' + src[LONETILDE] + src[XRANGEPLAIN] + '$'; var TILDELOOSE = R++; src[TILDELOOSE] = '^' + src[LONETILDE] + src[XRANGEPLAINLOOSE] + '$'; // Caret ranges. // Meaning is "at least and backwards compatible with" var LONECARET = R++; src[LONECARET] = '(?:\\^)'; var CARETTRIM = R++; src[CARETTRIM] = '(\\s*)' + src[LONECARET] + '\\s+'; re[CARETTRIM] = new RegExp(src[CARETTRIM], 'g'); var caretTrimReplace = '$1^'; var CARET = R++; src[CARET] = '^' + src[LONECARET] + src[XRANGEPLAIN] + '$'; var CARETLOOSE = R++; src[CARETLOOSE] = '^' + src[LONECARET] + src[XRANGEPLAINLOOSE] + '$'; // A simple gt/lt/eq thing, or just "" to indicate "any version" var COMPARATORLOOSE = R++; src[COMPARATORLOOSE] = '^' + src[GTLT] + '\\s*(' + LOOSEPLAIN + ')$|^$'; var COMPARATOR = R++; src[COMPARATOR] = '^' + src[GTLT] + '\\s*(' + FULLPLAIN + ')$|^$'; // An expression to strip any whitespace between the gtlt and the thing // it modifies, so that `> 1.2.3` ==> `>1.2.3` var COMPARATORTRIM = R++; src[COMPARATORTRIM] = '(\\s*)' + src[GTLT] + '\\s*(' + LOOSEPLAIN + '|' + src[XRANGEPLAIN] + ')'; // this one has to use the /g flag re[COMPARATORTRIM] = new RegExp(src[COMPARATORTRIM], 'g'); var comparatorTrimReplace = '$1$2$3'; // Something like `1.2.3 - 1.2.4` // Note that these all use the loose form, because they'll be // checked against either the strict or loose comparator form // later. var HYPHENRANGE = R++; src[HYPHENRANGE] = '^\\s*(' + src[XRANGEPLAIN] + ')' + '\\s+-\\s+' + '(' + src[XRANGEPLAIN] + ')' + '\\s*$'; var HYPHENRANGELOOSE = R++; src[HYPHENRANGELOOSE] = '^\\s*(' + src[XRANGEPLAINLOOSE] + ')' + '\\s+-\\s+' + '(' + src[XRANGEPLAINLOOSE] + ')' + '\\s*$'; // Star ranges basically just allow anything at all. var STAR = R++; src[STAR] = '(<|>)?=?\\s*\\*'; // Compile to actual regexp objects. // All are flag-free, unless they were created above with a flag. for (var i = 0; i < R; i++) { debug(i, src[i]); if (!re[i]) re[i] = new RegExp(src[i]); } exports.parse = parse; function parse(version, loose) { if (version instanceof SemVer) return version; if (typeof version !== 'string') return null; if (version.length > MAX_LENGTH) return null; var r = loose ? re[LOOSE] : re[FULL]; if (!r.test(version)) return null; try { return new SemVer(version, loose); } catch (er) { return null; } } exports.valid = valid; function valid(version, loose) { var v = parse(version, loose); return v ? 
v.version : null; } exports.clean = clean; function clean(version, loose) { var s = parse(version.trim().replace(/^[=v]+/, ''), loose); return s ? s.version : null; } exports.SemVer = SemVer; function SemVer(version, loose) { if (version instanceof SemVer) { if (version.loose === loose) return version; else version = version.version; } else if (typeof version !== 'string') { throw new TypeError('Invalid Version: ' + version); } if (version.length > MAX_LENGTH) throw new TypeError('version is longer than ' + MAX_LENGTH + ' characters') if (!(this instanceof SemVer)) return new SemVer(version, loose); debug('SemVer', version, loose); this.loose = loose; var m = version.trim().match(loose ? re[LOOSE] : re[FULL]); if (!m) throw new TypeError('Invalid Version: ' + version); this.raw = version; // these are actually numbers this.major = +m[1]; this.minor = +m[2]; this.patch = +m[3]; if (this.major > MAX_SAFE_INTEGER || this.major < 0) throw new TypeError('Invalid major version') if (this.minor > MAX_SAFE_INTEGER || this.minor < 0) throw new TypeError('Invalid minor version') if (this.patch > MAX_SAFE_INTEGER || this.patch < 0) throw new TypeError('Invalid patch version') // numberify any prerelease numeric ids if (!m[4]) this.prerelease = []; else this.prerelease = m[4].split('.').map(function(id) { if (/^[0-9]+$/.test(id)) { var num = +id if (num >= 0 && num < MAX_SAFE_INTEGER) return num } return id; }); this.build = m[5] ? m[5].split('.') : []; this.format(); } SemVer.prototype.format = function() { this.version = this.major + '.' + this.minor + '.' + this.patch; if (this.prerelease.length) this.version += '-' + this.prerelease.join('.'); return this.version; }; SemVer.prototype.inspect = function() { return '<SemVer "' + this + '">'; }; SemVer.prototype.toString = function() { return this.version; }; SemVer.prototype.compare = function(other) { debug('SemVer.compare', this.version, this.loose, other); if (!(other instanceof SemVer)) other = new SemVer(other, this.loose); return this.compareMain(other) || this.comparePre(other); }; SemVer.prototype.compareMain = function(other) { if (!(other instanceof SemVer)) other = new SemVer(other, this.loose); return compareIdentifiers(this.major, other.major) || compareIdentifiers(this.minor, other.minor) || compareIdentifiers(this.patch, other.patch); }; SemVer.prototype.comparePre = function(other) { if (!(other instanceof SemVer)) other = new SemVer(other, this.loose); // NOT having a prerelease is > having one if (this.prerelease.length && !other.prerelease.length) return -1; else if (!this.prerelease.length && other.prerelease.length) return 1; else if (!this.prerelease.length && !other.prerelease.length) return 0; var i = 0; do { var a = this.prerelease[i]; var b = other.prerelease[i]; debug('prerelease compare', i, a, b); if (a === undefined && b === undefined) return 0; else if (b === undefined) return 1; else if (a === undefined) return -1; else if (a === b) continue; else return compareIdentifiers(a, b); } while (++i); }; // preminor will bump the version up to the next minor release, and immediately // down to pre-release. premajor and prepatch work the same way.
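// For example (an illustrative sketch, not in the original source), starting
// from '1.2.3':
//   new SemVer('1.2.3').inc('preminor').version         // => '1.3.0-0'
//   new SemVer('1.2.3').inc('preminor', 'beta').version // => '1.3.0-beta.0'
//   new SemVer('1.2.3').inc('premajor', 'beta').version // => '2.0.0-beta.0'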
SemVer.prototype.inc = function(release, identifier) { switch (release) { case 'premajor': this.prerelease.length = 0; this.patch = 0; this.minor = 0; this.major++; this.inc('pre', identifier); break; case 'preminor': this.prerelease.length = 0; this.patch = 0; this.minor++; this.inc('pre', identifier); break; case 'prepatch': // If this is already a prerelease, it will bump to the next version // drop any prereleases that might already exist, since they are not // relevant at this point. this.prerelease.length = 0; this.inc('patch', identifier); this.inc('pre', identifier); break; // If the input is a non-prerelease version, this acts the same as // prepatch. case 'prerelease': if (this.prerelease.length === 0) this.inc('patch', identifier); this.inc('pre', identifier); break; case 'major': // If this is a pre-major version, bump up to the same major version. // Otherwise increment major. // 1.0.0-5 bumps to 1.0.0 // 1.1.0 bumps to 2.0.0 if (this.minor !== 0 || this.patch !== 0 || this.prerelease.length === 0) this.major++; this.minor = 0; this.patch = 0; this.prerelease = []; break; case 'minor': // If this is a pre-minor version, bump up to the same minor version. // Otherwise increment minor. // 1.2.0-5 bumps to 1.2.0 // 1.2.1 bumps to 1.3.0 if (this.patch !== 0 || this.prerelease.length === 0) this.minor++; this.patch = 0; this.prerelease = []; break; case 'patch': // If this is not a pre-release version, it will increment the patch. // If it is a pre-release it will bump up to the same patch version. // 1.2.0-5 patches to 1.2.0 // 1.2.0 patches to 1.2.1 if (this.prerelease.length === 0) this.patch++; this.prerelease = []; break; // This probably shouldn't be used publicly. // 1.0.0 "pre" would become 1.0.0-0 which is the wrong direction. case 'pre': if (this.prerelease.length === 0) this.prerelease = [0]; else { var i = this.prerelease.length; while (--i >= 0) { if (typeof this.prerelease[i] === 'number') { this.prerelease[i]++; i = -2; } } if (i === -1) // didn't increment anything this.prerelease.push(0); } if (identifier) { // 1.2.0-beta.1 bumps to 1.2.0-beta.2, // 1.2.0-beta.fooblz or 1.2.0-beta bumps to 1.2.0-beta.0 if (this.prerelease[0] === identifier) { if (isNaN(this.prerelease[1])) this.prerelease = [identifier, 0]; } else this.prerelease = [identifier, 0]; } break; default: throw new Error('invalid increment argument: ' + release); } this.format(); this.raw = this.version; return this; }; exports.inc = inc; function inc(version, release, loose, identifier) { if (typeof(loose) === 'string') { identifier = loose; loose = undefined; } try { return new SemVer(version, loose).inc(release, identifier).version; } catch (er) { return null; } } exports.diff = diff; function diff(version1, version2) { if (eq(version1, version2)) { return null; } else { var v1 = parse(version1); var v2 = parse(version2); if (v1.prerelease.length || v2.prerelease.length) { for (var key in v1) { if (key === 'major' || key === 'minor' || key === 'patch') { if (v1[key] !== v2[key]) { return 'pre'+key; } } } return 'prerelease'; } for (var key in v1) { if (key === 'major' || key === 'minor' || key === 'patch') { if (v1[key] !== v2[key]) { return key; } } } } } exports.compareIdentifiers = compareIdentifiers; var numeric = /^[0-9]+$/; function compareIdentifiers(a, b) { var anum = numeric.test(a); var bnum = numeric.test(b); if (anum && bnum) { a = +a; b = +b; } return (anum && !bnum) ? -1 : (bnum && !anum) ? 1 : a < b ? -1 : a > b ? 
1 : 0; } exports.rcompareIdentifiers = rcompareIdentifiers; function rcompareIdentifiers(a, b) { return compareIdentifiers(b, a); } exports.major = major; function major(a, loose) { return new SemVer(a, loose).major; } exports.minor = minor; function minor(a, loose) { return new SemVer(a, loose).minor; } exports.patch = patch; function patch(a, loose) { return new SemVer(a, loose).patch; } exports.compare = compare; function compare(a, b, loose) { return new SemVer(a, loose).compare(b); } exports.compareLoose = compareLoose; function compareLoose(a, b) { return compare(a, b, true); } exports.rcompare = rcompare; function rcompare(a, b, loose) { return compare(b, a, loose); } exports.sort = sort; function sort(list, loose) { return list.sort(function(a, b) { return exports.compare(a, b, loose); }); } exports.rsort = rsort; function rsort(list, loose) { return list.sort(function(a, b) { return exports.rcompare(a, b, loose); }); } exports.gt = gt; function gt(a, b, loose) { return compare(a, b, loose) > 0; } exports.lt = lt; function lt(a, b, loose) { return compare(a, b, loose) < 0; } exports.eq = eq; function eq(a, b, loose) { return compare(a, b, loose) === 0; } exports.neq = neq; function neq(a, b, loose) { return compare(a, b, loose) !== 0; } exports.gte = gte; function gte(a, b, loose) { return compare(a, b, loose) >= 0; } exports.lte = lte; function lte(a, b, loose) { return compare(a, b, loose) <= 0; } exports.cmp = cmp; function cmp(a, op, b, loose) { var ret; switch (op) { case '===': if (typeof a === 'object') a = a.version; if (typeof b === 'object') b = b.version; ret = a === b; break; case '!==': if (typeof a === 'object') a = a.version; if (typeof b === 'object') b = b.version; ret = a !== b; break; case '': case '=': case '==': ret = eq(a, b, loose); break; case '!=': ret = neq(a, b, loose); break; case '>': ret = gt(a, b, loose); break; case '>=': ret = gte(a, b, loose); break; case '<': ret = lt(a, b, loose); break; case '<=': ret = lte(a, b, loose); break; default: throw new TypeError('Invalid operator: ' + op); } return ret; } exports.Comparator = Comparator; function Comparator(comp, loose) { if (comp instanceof Comparator) { if (comp.loose === loose) return comp; else comp = comp.value; } if (!(this instanceof Comparator)) return new Comparator(comp, loose); debug('comparator', comp, loose); this.loose = loose; this.parse(comp); if (this.semver === ANY) this.value = ''; else this.value = this.operator + this.semver.version; debug('comp', this); } var ANY = {}; Comparator.prototype.parse = function(comp) { var r = this.loose ? re[COMPARATORLOOSE] : re[COMPARATOR]; var m = comp.match(r); if (!m) throw new TypeError('Invalid comparator: ' + comp); this.operator = m[1]; if (this.operator === '=') this.operator = ''; // if it literally is just '>' or '' then allow anything. 
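// (Illustration, not in the original source: new Comparator('').semver is the
// ANY sentinel, so new Comparator('').test('0.0.1-alpha') returns true; every
// version passes an empty comparator.)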
if (!m[2]) this.semver = ANY; else this.semver = new SemVer(m[2], this.loose); }; Comparator.prototype.inspect = function() { return '<SemVer Comparator "' + this + '">'; }; Comparator.prototype.toString = function() { return this.value; }; Comparator.prototype.test = function(version) { debug('Comparator.test', version, this.loose); if (this.semver === ANY) return true; if (typeof version === 'string') version = new SemVer(version, this.loose); return cmp(version, this.operator, this.semver, this.loose); }; exports.Range = Range; function Range(range, loose) { if ((range instanceof Range) && range.loose === loose) return range; if (!(this instanceof Range)) return new Range(range, loose); this.loose = loose; // First, split based on boolean or || this.raw = range; this.set = range.split(/\s*\|\|\s*/).map(function(range) { return this.parseRange(range.trim()); }, this).filter(function(c) { // throw out any that are not relevant for whatever reason return c.length; }); if (!this.set.length) { throw new TypeError('Invalid SemVer Range: ' + range); } this.format(); } Range.prototype.inspect = function() { return '<SemVer Range "' + this.range + '">'; }; Range.prototype.format = function() { this.range = this.set.map(function(comps) { return comps.join(' ').trim(); }).join('||').trim(); return this.range; }; Range.prototype.toString = function() { return this.range; }; Range.prototype.parseRange = function(range) { var loose = this.loose; range = range.trim(); debug('range', range, loose); // `1.2.3 - 1.2.4` => `>=1.2.3 <=1.2.4` var hr = loose ? re[HYPHENRANGELOOSE] : re[HYPHENRANGE]; range = range.replace(hr, hyphenReplace); debug('hyphen replace', range); // `> 1.2.3 < 1.2.5` => `>1.2.3 <1.2.5` range = range.replace(re[COMPARATORTRIM], comparatorTrimReplace); debug('comparator trim', range, re[COMPARATORTRIM]); // `~ 1.2.3` => `~1.2.3` range = range.replace(re[TILDETRIM], tildeTrimReplace); // `^ 1.2.3` => `^1.2.3` range = range.replace(re[CARETTRIM], caretTrimReplace); // normalize spaces range = range.split(/\s+/).join(' '); // At this point, the range is completely trimmed and // ready to be split into comparators. var compRe = loose ? re[COMPARATORLOOSE] : re[COMPARATOR]; var set = range.split(' ').map(function(comp) { return parseComparator(comp, loose); }).join(' ').split(/\s+/); if (this.loose) { // in loose mode, throw out any that are not valid comparators set = set.filter(function(comp) { return !!comp.match(compRe); }); } set = set.map(function(comp) { return new Comparator(comp, loose); }); return set; }; // Mostly just for testing and legacy API reasons exports.toComparators = toComparators; function toComparators(range, loose) { return new Range(range, loose).set.map(function(comp) { return comp.map(function(c) { return c.value; }).join(' ').trim().split(' '); }); } // comprised of xranges, tildes, stars, and gtlt's at this point. // already replaced the hyphen ranges // turn into a set of JUST comparators.
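// An illustrative trace of the rewrites above (not in the original source):
//   '1.2.3 - 1.2.5'  --hyphen replace-->   '>=1.2.3 <=1.2.5'
//   '^ 2.1'          --caret trim------>   '^2.1'
//   '^2.1'           --parseComparator->   '>=2.1.0 <3.0.0'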
function parseComparator(comp, loose) { debug('comp', comp); comp = replaceCarets(comp, loose); debug('caret', comp); comp = replaceTildes(comp, loose); debug('tildes', comp); comp = replaceXRanges(comp, loose); debug('xrange', comp); comp = replaceStars(comp, loose); debug('stars', comp); return comp; } function isX(id) { return !id || id.toLowerCase() === 'x' || id === '*'; } // ~, ~> --> * (any, kinda silly) // ~2, ~2.x, ~2.x.x, ~>2, ~>2.x ~>2.x.x --> >=2.0.0 <3.0.0 // ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0 <2.1.0 // ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0 <1.3.0 // ~1.2.3, ~>1.2.3 --> >=1.2.3 <1.3.0 // ~1.2.0, ~>1.2.0 --> >=1.2.0 <1.3.0 function replaceTildes(comp, loose) { return comp.trim().split(/\s+/).map(function(comp) { return replaceTilde(comp, loose); }).join(' '); } function replaceTilde(comp, loose) { var r = loose ? re[TILDELOOSE] : re[TILDE]; return comp.replace(r, function(_, M, m, p, pr) { debug('tilde', comp, _, M, m, p, pr); var ret; if (isX(M)) ret = ''; else if (isX(m)) ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'; else if (isX(p)) // ~1.2 == >=1.2.0- <1.3.0- ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'; else if (pr) { debug('replaceTilde pr', pr); if (pr.charAt(0) !== '-') pr = '-' + pr; ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + M + '.' + (+m + 1) + '.0'; } else // ~1.2.3 == >=1.2.3 <1.3.0 ret = '>=' + M + '.' + m + '.' + p + ' <' + M + '.' + (+m + 1) + '.0'; debug('tilde return', ret); return ret; }); } // ^ --> * (any, kinda silly) // ^2, ^2.x, ^2.x.x --> >=2.0.0 <3.0.0 // ^2.0, ^2.0.x --> >=2.0.0 <3.0.0 // ^1.2, ^1.2.x --> >=1.2.0 <2.0.0 // ^1.2.3 --> >=1.2.3 <2.0.0 // ^1.2.0 --> >=1.2.0 <2.0.0 function replaceCarets(comp, loose) { return comp.trim().split(/\s+/).map(function(comp) { return replaceCaret(comp, loose); }).join(' '); } function replaceCaret(comp, loose) { debug('caret', comp, loose); var r = loose ? re[CARETLOOSE] : re[CARET]; return comp.replace(r, function(_, M, m, p, pr) { debug('caret', comp, _, M, m, p, pr); var ret; if (isX(M)) ret = ''; else if (isX(m)) ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'; else if (isX(p)) { if (M === '0') ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'; else ret = '>=' + M + '.' + m + '.0 <' + (+M + 1) + '.0.0'; } else if (pr) { debug('replaceCaret pr', pr); if (pr.charAt(0) !== '-') pr = '-' + pr; if (M === '0') { if (m === '0') ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + M + '.' + m + '.' + (+p + 1); else ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + M + '.' + (+m + 1) + '.0'; } else ret = '>=' + M + '.' + m + '.' + p + pr + ' <' + (+M + 1) + '.0.0'; } else { debug('no pr'); if (M === '0') { if (m === '0') ret = '>=' + M + '.' + m + '.' + p + ' <' + M + '.' + m + '.' + (+p + 1); else ret = '>=' + M + '.' + m + '.' + p + ' <' + M + '.' + (+m + 1) + '.0'; } else ret = '>=' + M + '.' + m + '.' + p + ' <' + (+M + 1) + '.0.0'; } debug('caret return', ret); return ret; }); } function replaceXRanges(comp, loose) { debug('replaceXRanges', comp, loose); return comp.split(/\s+/).map(function(comp) { return replaceXRange(comp, loose); }).join(' '); } function replaceXRange(comp, loose) { comp = comp.trim(); var r = loose ? 
re[XRANGELOOSE] : re[XRANGE]; return comp.replace(r, function(ret, gtlt, M, m, p, pr) { debug('xRange', comp, ret, gtlt, M, m, p, pr); var xM = isX(M); var xm = xM || isX(m); var xp = xm || isX(p); var anyX = xp; if (gtlt === '=' && anyX) gtlt = ''; if (xM) { if (gtlt === '>' || gtlt === '<') { // nothing is allowed ret = '<0.0.0'; } else { // nothing is forbidden ret = '*'; } } else if (gtlt && anyX) { // replace X with 0 if (xm) m = 0; if (xp) p = 0; if (gtlt === '>') { // >1 => >=2.0.0 // >1.2 => >=1.3.0 // >1.2.3 => >= 1.2.4 gtlt = '>='; if (xm) { M = +M + 1; m = 0; p = 0; } else if (xp) { m = +m + 1; p = 0; } } else if (gtlt === '<=') { // <=0.7.x is actually <0.8.0, since any 0.7.x should // pass. Similarly, <=7.x is actually <8.0.0, etc. gtlt = '<' if (xm) M = +M + 1 else m = +m + 1 } ret = gtlt + M + '.' + m + '.' + p; } else if (xm) { ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0'; } else if (xp) { ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0'; } debug('xRange return', ret); return ret; }); } // Because * is AND-ed with everything else in the comparator, // and '' means "any version", just remove the *s entirely. function replaceStars(comp, loose) { debug('replaceStars', comp, loose); // Looseness is ignored here. star is always as loose as it gets! return comp.trim().replace(re[STAR], ''); } // This function is passed to string.replace(re[HYPHENRANGE]) // M, m, patch, prerelease, build // 1.2 - 3.4.5 => >=1.2.0 <=3.4.5 // 1.2.3 - 3.4 => >=1.2.3 <3.5.0 Any 3.4.x will do // 1.2 - 3.4 => >=1.2.0 <3.5.0 function hyphenReplace($0, from, fM, fm, fp, fpr, fb, to, tM, tm, tp, tpr, tb) { if (isX(fM)) from = ''; else if (isX(fm)) from = '>=' + fM + '.0.0'; else if (isX(fp)) from = '>=' + fM + '.' + fm + '.0'; else from = '>=' + from; if (isX(tM)) to = ''; else if (isX(tm)) to = '<' + (+tM + 1) + '.0.0'; else if (isX(tp)) to = '<' + tM + '.' + (+tm + 1) + '.0'; else if (tpr) to = '<=' + tM + '.' + tm + '.' + tp + '-' + tpr; else to = '<=' + to; return (from + ' ' + to).trim(); } // if ANY of the sets match ALL of its comparators, then pass Range.prototype.test = function(version) { if (!version) return false; if (typeof version === 'string') version = new SemVer(version, this.loose); for (var i = 0; i < this.set.length; i++) { if (testSet(this.set[i], version)) return true; } return false; }; function testSet(set, version) { for (var i = 0; i < set.length; i++) { if (!set[i].test(version)) return false; } if (version.prerelease.length) { // Find the set of versions that are allowed to have prereleases // For example, ^1.2.3-pr.1 desugars to >=1.2.3-pr.1 <2.0.0 // That should allow `1.2.3-pr.2` to pass. // However, `1.2.4-alpha.notready` should NOT be allowed, // even though it's within the range set by the comparators. for (var i = 0; i < set.length; i++) { debug(set[i].semver); if (set[i].semver === ANY) continue; if (set[i].semver.prerelease.length > 0) { var allowed = set[i].semver; if (allowed.major === version.major && allowed.minor === version.minor && allowed.patch === version.patch) return true; } } // Version has a -pre, but it's not one of the ones we like.
return false; } return true; } exports.satisfies = satisfies; function satisfies(version, range, loose) { try { range = new Range(range, loose); } catch (er) { return false; } return range.test(version); } exports.maxSatisfying = maxSatisfying; function maxSatisfying(versions, range, loose) { return versions.filter(function(version) { return satisfies(version, range, loose); }).sort(function(a, b) { return rcompare(a, b, loose); })[0] || null; } exports.validRange = validRange; function validRange(range, loose) { try { // Return '*' instead of '' so that truthiness works. // This will throw if it's invalid anyway return new Range(range, loose).range || '*'; } catch (er) { return null; } } // Determine if version is less than all the versions possible in the range exports.ltr = ltr; function ltr(version, range, loose) { return outside(version, range, '<', loose); } // Determine if version is greater than all the versions possible in the range. exports.gtr = gtr; function gtr(version, range, loose) { return outside(version, range, '>', loose); } exports.outside = outside; function outside(version, range, hilo, loose) { version = new SemVer(version, loose); range = new Range(range, loose); var gtfn, ltefn, ltfn, comp, ecomp; switch (hilo) { case '>': gtfn = gt; ltefn = lte; ltfn = lt; comp = '>'; ecomp = '>='; break; case '<': gtfn = lt; ltefn = gte; ltfn = gt; comp = '<'; ecomp = '<='; break; default: throw new TypeError('Must provide a hilo val of "<" or ">"'); } // If it satisfies the range it is not outside if (satisfies(version, range, loose)) { return false; } // From now on, variable terms are as if we're in "gtr" mode. // but note that everything is flipped for the "ltr" function. for (var i = 0; i < range.set.length; ++i) { var comparators = range.set[i]; var high = null; var low = null; comparators.forEach(function(comparator) { if (comparator.semver === ANY) { comparator = new Comparator('>=0.0.0') } high = high || comparator; low = low || comparator; if (gtfn(comparator.semver, high.semver, loose)) { high = comparator; } else if (ltfn(comparator.semver, low.semver, loose)) { low = comparator; } }); // If the edge version comparator has an operator then our version // isn't outside it if (high.operator === comp || high.operator === ecomp) { return false; } // If the lowest version comparator has an operator and our version // is less than it then it isn't higher than the range if ((!low.operator || low.operator === comp) && ltefn(version, low.semver)) { return false; } else if (low.operator === ecomp && ltfn(version, low.semver)) { return false; } } return true; } npm_3.5.2.orig/node_modules/semver/test/0000755000000000000000000000000012631326456016427 5ustar 00000000000000npm_3.5.2.orig/node_modules/semver/bin/semver0000755000000000000000000000777412631326456017456 0ustar 00000000000000#!/usr/bin/env node // Standalone semver comparison program. // Exits successfully and prints matching version(s) if // any supplied version is valid and passes all tests.
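// Illustrative invocations (not part of the original file):
//   $ semver 1.2.3 2.4.6 -r '>=1.0.0 <2.0.0'   # prints '1.2.3' and exits 0
//   $ semver 1.2.3 -i preminor --preid beta    # prints '1.3.0-beta.0'
//   $ semver not-a-version                     # prints nothing, exits 1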
var argv = process.argv.slice(2) , versions = [] , range = [] , gt = [] , lt = [] , eq = [] , inc = null , version = require("../package.json").version , loose = false , identifier = undefined , semver = require("../semver") , reverse = false main() function main () { if (!argv.length) return help() while (argv.length) { var a = argv.shift() var i = a.indexOf('=') if (i !== -1) { a = a.slice(0, i) argv.unshift(a.slice(i + 1)) } switch (a) { case "-rv": case "-rev": case "--rev": case "--reverse": reverse = true break case "-l": case "--loose": loose = true break case "-v": case "--version": versions.push(argv.shift()) break case "-i": case "--inc": case "--increment": switch (argv[0]) { case "major": case "minor": case "patch": case "prerelease": case "premajor": case "preminor": case "prepatch": inc = argv.shift() break default: inc = "patch" break } break case "--preid": identifier = argv.shift() break case "-r": case "--range": range.push(argv.shift()) break case "-h": case "--help": case "-?": return help() default: versions.push(a) break } } versions = versions.filter(function (v) { return semver.valid(v, loose) }) if (!versions.length) return fail() if (inc && (versions.length !== 1 || range.length)) return failInc() for (var i = 0, l = range.length; i < l ; i ++) { versions = versions.filter(function (v) { return semver.satisfies(v, range[i], loose) }) if (!versions.length) return fail() } return success(versions) } function failInc () { console.error("--inc can only be used on a single version with no range") fail() } function fail () { process.exit(1) } function success () { var compare = reverse ? "rcompare" : "compare" versions.sort(function (a, b) { return semver[compare](a, b, loose) }).map(function (v) { return semver.clean(v, loose) }).map(function (v) { return inc ? semver.inc(v, inc, loose, identifier) : v }).forEach(function (v,i,_) { console.log(v) }) } function help () { console.log(["SemVer " + version ,"" ,"A JavaScript implementation of the http://semver.org/ specification" ,"Copyright Isaac Z. Schlueter" ,"" ,"Usage: semver [options] <version> [<version> [...]]" ,"Prints valid versions sorted by SemVer precedence" ,"" ,"Options:" ,"-r --range <range>" ," Print versions that match the specified range." ,"" ,"-i --increment [<level>]" ," Increment a version by the specified level. Level can" ," be one of: major, minor, patch, premajor, preminor," ," prepatch, or prerelease. Default level is 'patch'." ," Only one version may be specified." ,"" ,"--preid <identifier>" ," Identifier to be used to prefix premajor, preminor," ," prepatch or prerelease version increments." ,"" ,"-l --loose" ," Interpret versions and ranges loosely" ,"" ,"Program exits successfully if any valid version satisfies" ,"all supplied ranges, and prints all satisfying versions." ,"" ,"If no satisfying versions are found, then exits failure." ,"" ,"Versions are printed in ascending order, so supplying" ,"multiple versions to the utility will just sort them." ].join("\n")) } npm_3.5.2.orig/node_modules/semver/test/big-numbers.js0000644000000000000000000000150212631326456021175 0ustar 00000000000000var test = require('tap').test var semver = require('../') test('long version is too long', function (t) { var v = '1.2.' + new Array(256).join('1') t.throws(function () { new semver.SemVer(v) }) t.equal(semver.valid(v, false), null) t.equal(semver.valid(v, true), null) t.equal(semver.inc(v, 'patch'), null) t.end() }) test('big number is like too long version', function (t) { var v = '1.2.'
+ new Array(100).join('1') t.throws(function () { new semver.SemVer(v) }) t.equal(semver.valid(v, false), null) t.equal(semver.valid(v, true), null) t.equal(semver.inc(v, 'patch'), null) t.end() }) test('parsing null does not throw', function (t) { t.equal(semver.parse(null), null) t.equal(semver.parse({}), null) t.equal(semver.parse(new semver.SemVer('1.2.3')).version, '1.2.3') t.end() }) npm_3.5.2.orig/node_modules/semver/test/clean.js0000644000000000000000000000130312631326456020044 0ustar 00000000000000var tap = require('tap'); var test = tap.test; var semver = require('../semver.js'); var clean = semver.clean; test('\nclean tests', function(t) { // [range, version] // Version should be detectable despite extra characters [ ['1.2.3', '1.2.3'], [' 1.2.3 ', '1.2.3'], [' 1.2.3-4 ', '1.2.3-4'], [' 1.2.3-pre ', '1.2.3-pre'], [' =v1.2.3 ', '1.2.3'], ['v1.2.3', '1.2.3'], [' v1.2.3 ', '1.2.3'], ['\t1.2.3', '1.2.3'], ['>1.2.3', null], ['~1.2.3', null], ['<=1.2.3', null], ['1.2.x', null] ].forEach(function(tuple) { var range = tuple[0]; var version = tuple[1]; var msg = 'clean(' + range + ') = ' + version; t.equal(clean(range), version, msg); }); t.end(); }); npm_3.5.2.orig/node_modules/semver/test/gtr.js0000644000000000000000000001150512631326456017563 0ustar 00000000000000var tap = require('tap'); var test = tap.test; var semver = require('../semver.js'); var gtr = semver.gtr; test('\ngtr tests', function(t) { // [range, version, loose] // Version should be greater than range [ ['~1.2.2', '1.3.0'], ['~0.6.1-1', '0.7.1-1'], ['1.0.0 - 2.0.0', '2.0.1'], ['1.0.0', '1.0.1-beta1'], ['1.0.0', '2.0.0'], ['<=2.0.0', '2.1.1'], ['<=2.0.0', '3.2.9'], ['<2.0.0', '2.0.0'], ['0.1.20 || 1.2.4', '1.2.5'], ['2.x.x', '3.0.0'], ['1.2.x', '1.3.0'], ['1.2.x || 2.x', '3.0.0'], ['2.*.*', '5.0.1'], ['1.2.*', '1.3.3'], ['1.2.* || 2.*', '4.0.0'], ['2', '3.0.0'], ['2.3', '2.4.2'], ['~2.4', '2.5.0'], // >=2.4.0 <2.5.0 ['~2.4', '2.5.5'], ['~>3.2.1', '3.3.0'], // >=3.2.1 <3.3.0 ['~1', '2.2.3'], // >=1.0.0 <2.0.0 ['~>1', '2.2.4'], ['~> 1', '3.2.3'], ['~1.0', '1.1.2'], // >=1.0.0 <1.1.0 ['~ 1.0', '1.1.0'], ['<1.2', '1.2.0'], ['< 1.2', '1.2.1'], ['1', '2.0.0beta', true], ['~v0.5.4-pre', '0.6.0'], ['~v0.5.4-pre', '0.6.1-pre'], ['=0.7.x', '0.8.0'], ['=0.7.x', '0.8.0-asdf'], ['<0.7.x', '0.7.0'], ['~1.2.2', '1.3.0'], ['1.0.0 - 2.0.0', '2.2.3'], ['1.0.0', '1.0.1'], ['<=2.0.0', '3.0.0'], ['<=2.0.0', '2.9999.9999'], ['<=2.0.0', '2.2.9'], ['<2.0.0', '2.9999.9999'], ['<2.0.0', '2.2.9'], ['2.x.x', '3.1.3'], ['1.2.x', '1.3.3'], ['1.2.x || 2.x', '3.1.3'], ['2.*.*', '3.1.3'], ['1.2.*', '1.3.3'], ['1.2.* || 2.*', '3.1.3'], ['2', '3.1.2'], ['2.3', '2.4.1'], ['~2.4', '2.5.0'], // >=2.4.0 <2.5.0 ['~>3.2.1', '3.3.2'], // >=3.2.1 <3.3.0 ['~1', '2.2.3'], // >=1.0.0 <2.0.0 ['~>1', '2.2.3'], ['~1.0', '1.1.0'], // >=1.0.0 <1.1.0 ['<1', '1.0.0'], ['1', '2.0.0beta', true], ['<1', '1.0.0beta', true], ['< 1', '1.0.0beta', true], ['=0.7.x', '0.8.2'], ['<0.7.x', '0.7.2'] ].forEach(function(tuple) { var range = tuple[0]; var version = tuple[1]; var loose = tuple[2] || false; var msg = 'gtr(' + version + ', ' + range + ', ' + loose + ')'; t.ok(gtr(version, range, loose), msg); }); t.end(); }); test('\nnegative gtr tests', function(t) { // [range, version, loose] // Version should NOT be greater than range [ ['~0.6.1-1', '0.6.1-1'], ['1.0.0 - 2.0.0', '1.2.3'], ['1.0.0 - 2.0.0', '0.9.9'], ['1.0.0', '1.0.0'], ['>=*', '0.2.4'], ['', '1.0.0', true], ['*', '1.2.3'], ['*', 'v1.2.3-foo'], ['>=1.0.0', '1.0.0'], ['>=1.0.0', '1.0.1'], ['>=1.0.0', '1.1.0'], ['>1.0.0', 
'1.0.1'], ['>1.0.0', '1.1.0'], ['<=2.0.0', '2.0.0'], ['<=2.0.0', '1.9999.9999'], ['<=2.0.0', '0.2.9'], ['<2.0.0', '1.9999.9999'], ['<2.0.0', '0.2.9'], ['>= 1.0.0', '1.0.0'], ['>= 1.0.0', '1.0.1'], ['>= 1.0.0', '1.1.0'], ['> 1.0.0', '1.0.1'], ['> 1.0.0', '1.1.0'], ['<= 2.0.0', '2.0.0'], ['<= 2.0.0', '1.9999.9999'], ['<= 2.0.0', '0.2.9'], ['< 2.0.0', '1.9999.9999'], ['<\t2.0.0', '0.2.9'], ['>=0.1.97', 'v0.1.97'], ['>=0.1.97', '0.1.97'], ['0.1.20 || 1.2.4', '1.2.4'], ['0.1.20 || >1.2.4', '1.2.4'], ['0.1.20 || 1.2.4', '1.2.3'], ['0.1.20 || 1.2.4', '0.1.20'], ['>=0.2.3 || <0.0.1', '0.0.0'], ['>=0.2.3 || <0.0.1', '0.2.3'], ['>=0.2.3 || <0.0.1', '0.2.4'], ['||', '1.3.4'], ['2.x.x', '2.1.3'], ['1.2.x', '1.2.3'], ['1.2.x || 2.x', '2.1.3'], ['1.2.x || 2.x', '1.2.3'], ['x', '1.2.3'], ['2.*.*', '2.1.3'], ['1.2.*', '1.2.3'], ['1.2.* || 2.*', '2.1.3'], ['1.2.* || 2.*', '1.2.3'], ['1.2.* || 2.*', '1.2.3'], ['*', '1.2.3'], ['2', '2.1.2'], ['2.3', '2.3.1'], ['~2.4', '2.4.0'], // >=2.4.0 <2.5.0 ['~2.4', '2.4.5'], ['~>3.2.1', '3.2.2'], // >=3.2.1 <3.3.0 ['~1', '1.2.3'], // >=1.0.0 <2.0.0 ['~>1', '1.2.3'], ['~> 1', '1.2.3'], ['~1.0', '1.0.2'], // >=1.0.0 <1.1.0 ['~ 1.0', '1.0.2'], ['>=1', '1.0.0'], ['>= 1', '1.0.0'], ['<1.2', '1.1.1'], ['< 1.2', '1.1.1'], ['1', '1.0.0beta', true], ['~v0.5.4-pre', '0.5.5'], ['~v0.5.4-pre', '0.5.4'], ['=0.7.x', '0.7.2'], ['>=0.7.x', '0.7.2'], ['=0.7.x', '0.7.0-asdf'], ['>=0.7.x', '0.7.0-asdf'], ['<=0.7.x', '0.6.2'], ['>0.2.3 >0.2.4 <=0.2.5', '0.2.5'], ['>=0.2.3 <=0.2.4', '0.2.4'], ['1.0.0 - 2.0.0', '2.0.0'], ['^1', '0.0.0-0'], ['^3.0.0', '2.0.0'], ['^1.0.0 || ~2.0.1', '2.0.0'], ['^0.1.0 || ~3.0.1 || 5.0.0', '3.2.0'], ['^0.1.0 || ~3.0.1 || 5.0.0', '1.0.0beta', true], ['^0.1.0 || ~3.0.1 || 5.0.0', '5.0.0-0', true], ['^0.1.0 || ~3.0.1 || >4 <=5.0.0', '3.5.0'] ].forEach(function(tuple) { var range = tuple[0]; var version = tuple[1]; var loose = tuple[2] || false; var msg = '!gtr(' + version + ', ' + range + ', ' + loose + ')'; t.notOk(gtr(version, range, loose), msg); }); t.end(); }); npm_3.5.2.orig/node_modules/semver/test/index.js0000644000000000000000000005575312631326456020113 0ustar 00000000000000'use strict'; var tap = require('tap'); var test = tap.test; var semver = require('../semver.js'); var eq = semver.eq; var gt = semver.gt; var lt = semver.lt; var neq = semver.neq; var cmp = semver.cmp; var gte = semver.gte; var lte = semver.lte; var satisfies = semver.satisfies; var validRange = semver.validRange; var inc = semver.inc; var diff = semver.diff; var replaceStars = semver.replaceStars; var toComparators = semver.toComparators; var SemVer = semver.SemVer; var Range = semver.Range; test('\ncomparison tests', function(t) { // [version1, version2] // version1 should be greater than version2 [['0.0.0', '0.0.0-foo'], ['0.0.1', '0.0.0'], ['1.0.0', '0.9.9'], ['0.10.0', '0.9.0'], ['0.99.0', '0.10.0'], ['2.0.0', '1.2.3'], ['v0.0.0', '0.0.0-foo', true], ['v0.0.1', '0.0.0', true], ['v1.0.0', '0.9.9', true], ['v0.10.0', '0.9.0', true], ['v0.99.0', '0.10.0', true], ['v2.0.0', '1.2.3', true], ['0.0.0', 'v0.0.0-foo', true], ['0.0.1', 'v0.0.0', true], ['1.0.0', 'v0.9.9', true], ['0.10.0', 'v0.9.0', true], ['0.99.0', 'v0.10.0', true], ['2.0.0', 'v1.2.3', true], ['1.2.3', '1.2.3-asdf'], ['1.2.3', '1.2.3-4'], ['1.2.3', '1.2.3-4-foo'], ['1.2.3-5-foo', '1.2.3-5'], ['1.2.3-5', '1.2.3-4'], ['1.2.3-5-foo', '1.2.3-5-Foo'], ['3.0.0', '2.7.2+asdf'], ['1.2.3-a.10', '1.2.3-a.5'], ['1.2.3-a.b', '1.2.3-a.5'], ['1.2.3-a.b', '1.2.3-a'], ['1.2.3-a.b.c.10.d.5', '1.2.3-a.b.c.5.d.100'], ['1.2.3-r2', 
'1.2.3-r100'], ['1.2.3-r100', '1.2.3-R2'] ].forEach(function(v) { var v0 = v[0]; var v1 = v[1]; var loose = v[2]; t.ok(gt(v0, v1, loose), "gt('" + v0 + "', '" + v1 + "')"); t.ok(lt(v1, v0, loose), "lt('" + v1 + "', '" + v0 + "')"); t.ok(!gt(v1, v0, loose), "!gt('" + v1 + "', '" + v0 + "')"); t.ok(!lt(v0, v1, loose), "!lt('" + v0 + "', '" + v1 + "')"); t.ok(eq(v0, v0, loose), "eq('" + v0 + "', '" + v0 + "')"); t.ok(eq(v1, v1, loose), "eq('" + v1 + "', '" + v1 + "')"); t.ok(neq(v0, v1, loose), "neq('" + v0 + "', '" + v1 + "')"); t.ok(cmp(v1, '==', v1, loose), "cmp('" + v1 + "' == '" + v1 + "')"); t.ok(cmp(v0, '>=', v1, loose), "cmp('" + v0 + "' >= '" + v1 + "')"); t.ok(cmp(v1, '<=', v0, loose), "cmp('" + v1 + "' <= '" + v0 + "')"); t.ok(cmp(v0, '!=', v1, loose), "cmp('" + v0 + "' != '" + v1 + "')"); }); t.end(); }); test('\nequality tests', function(t) { // [version1, version2] // version1 should be equivalent to version2 [['1.2.3', 'v1.2.3', true], ['1.2.3', '=1.2.3', true], ['1.2.3', 'v 1.2.3', true], ['1.2.3', '= 1.2.3', true], ['1.2.3', ' v1.2.3', true], ['1.2.3', ' =1.2.3', true], ['1.2.3', ' v 1.2.3', true], ['1.2.3', ' = 1.2.3', true], ['1.2.3-0', 'v1.2.3-0', true], ['1.2.3-0', '=1.2.3-0', true], ['1.2.3-0', 'v 1.2.3-0', true], ['1.2.3-0', '= 1.2.3-0', true], ['1.2.3-0', ' v1.2.3-0', true], ['1.2.3-0', ' =1.2.3-0', true], ['1.2.3-0', ' v 1.2.3-0', true], ['1.2.3-0', ' = 1.2.3-0', true], ['1.2.3-1', 'v1.2.3-1', true], ['1.2.3-1', '=1.2.3-1', true], ['1.2.3-1', 'v 1.2.3-1', true], ['1.2.3-1', '= 1.2.3-1', true], ['1.2.3-1', ' v1.2.3-1', true], ['1.2.3-1', ' =1.2.3-1', true], ['1.2.3-1', ' v 1.2.3-1', true], ['1.2.3-1', ' = 1.2.3-1', true], ['1.2.3-beta', 'v1.2.3-beta', true], ['1.2.3-beta', '=1.2.3-beta', true], ['1.2.3-beta', 'v 1.2.3-beta', true], ['1.2.3-beta', '= 1.2.3-beta', true], ['1.2.3-beta', ' v1.2.3-beta', true], ['1.2.3-beta', ' =1.2.3-beta', true], ['1.2.3-beta', ' v 1.2.3-beta', true], ['1.2.3-beta', ' = 1.2.3-beta', true], ['1.2.3-beta+build', ' = 1.2.3-beta+otherbuild', true], ['1.2.3+build', ' = 1.2.3+otherbuild', true], ['1.2.3-beta+build', '1.2.3-beta+otherbuild'], ['1.2.3+build', '1.2.3+otherbuild'], [' v1.2.3+build', '1.2.3+otherbuild'] ].forEach(function(v) { var v0 = v[0]; var v1 = v[1]; var loose = v[2]; t.ok(eq(v0, v1, loose), "eq('" + v0 + "', '" + v1 + "')"); t.ok(!neq(v0, v1, loose), "!neq('" + v0 + "', '" + v1 + "')"); t.ok(cmp(v0, '==', v1, loose), 'cmp(' + v0 + '==' + v1 + ')'); t.ok(!cmp(v0, '!=', v1, loose), '!cmp(' + v0 + '!=' + v1 + ')'); t.ok(!cmp(v0, '===', v1, loose), '!cmp(' + v0 + '===' + v1 + ')'); t.ok(cmp(v0, '!==', v1, loose), 'cmp(' + v0 + '!==' + v1 + ')'); t.ok(!gt(v0, v1, loose), "!gt('" + v0 + "', '" + v1 + "')"); t.ok(gte(v0, v1, loose), "gte('" + v0 + "', '" + v1 + "')"); t.ok(!lt(v0, v1, loose), "!lt('" + v0 + "', '" + v1 + "')"); t.ok(lte(v0, v1, loose), "lte('" + v0 + "', '" + v1 + "')"); }); t.end(); }); test('\nrange tests', function(t) { // [range, version] // version should be included by range [['1.0.0 - 2.0.0', '1.2.3'], ['^1.2.3+build', '1.2.3'], ['^1.2.3+build', '1.3.0'], ['1.2.3-pre+asdf - 2.4.3-pre+asdf', '1.2.3'], ['1.2.3pre+asdf - 2.4.3-pre+asdf', '1.2.3', true], ['1.2.3-pre+asdf - 2.4.3pre+asdf', '1.2.3', true], ['1.2.3pre+asdf - 2.4.3pre+asdf', '1.2.3', true], ['1.2.3-pre+asdf - 2.4.3-pre+asdf', '1.2.3-pre.2'], ['1.2.3-pre+asdf - 2.4.3-pre+asdf', '2.4.3-alpha'], ['1.2.3+asdf - 2.4.3+asdf', '1.2.3'], ['1.0.0', '1.0.0'], ['>=*', '0.2.4'], ['', '1.0.0'], ['*', '1.2.3'], ['*', 'v1.2.3', true], ['>=1.0.0', '1.0.0'], 
['>=1.0.0', '1.0.1'], ['>=1.0.0', '1.1.0'], ['>1.0.0', '1.0.1'], ['>1.0.0', '1.1.0'], ['<=2.0.0', '2.0.0'], ['<=2.0.0', '1.9999.9999'], ['<=2.0.0', '0.2.9'], ['<2.0.0', '1.9999.9999'], ['<2.0.0', '0.2.9'], ['>= 1.0.0', '1.0.0'], ['>= 1.0.0', '1.0.1'], ['>= 1.0.0', '1.1.0'], ['> 1.0.0', '1.0.1'], ['> 1.0.0', '1.1.0'], ['<= 2.0.0', '2.0.0'], ['<= 2.0.0', '1.9999.9999'], ['<= 2.0.0', '0.2.9'], ['< 2.0.0', '1.9999.9999'], ['<\t2.0.0', '0.2.9'], ['>=0.1.97', 'v0.1.97', true], ['>=0.1.97', '0.1.97'], ['0.1.20 || 1.2.4', '1.2.4'], ['>=0.2.3 || <0.0.1', '0.0.0'], ['>=0.2.3 || <0.0.1', '0.2.3'], ['>=0.2.3 || <0.0.1', '0.2.4'], ['||', '1.3.4'], ['2.x.x', '2.1.3'], ['1.2.x', '1.2.3'], ['1.2.x || 2.x', '2.1.3'], ['1.2.x || 2.x', '1.2.3'], ['x', '1.2.3'], ['2.*.*', '2.1.3'], ['1.2.*', '1.2.3'], ['1.2.* || 2.*', '2.1.3'], ['1.2.* || 2.*', '1.2.3'], ['*', '1.2.3'], ['2', '2.1.2'], ['2.3', '2.3.1'], ['~2.4', '2.4.0'], // >=2.4.0 <2.5.0 ['~2.4', '2.4.5'], ['~>3.2.1', '3.2.2'], // >=3.2.1 <3.3.0, ['~1', '1.2.3'], // >=1.0.0 <2.0.0 ['~>1', '1.2.3'], ['~> 1', '1.2.3'], ['~1.0', '1.0.2'], // >=1.0.0 <1.1.0, ['~ 1.0', '1.0.2'], ['~ 1.0.3', '1.0.12'], ['>=1', '1.0.0'], ['>= 1', '1.0.0'], ['<1.2', '1.1.1'], ['< 1.2', '1.1.1'], ['~v0.5.4-pre', '0.5.5'], ['~v0.5.4-pre', '0.5.4'], ['=0.7.x', '0.7.2'], ['<=0.7.x', '0.7.2'], ['>=0.7.x', '0.7.2'], ['<=0.7.x', '0.6.2'], ['~1.2.1 >=1.2.3', '1.2.3'], ['~1.2.1 =1.2.3', '1.2.3'], ['~1.2.1 1.2.3', '1.2.3'], ['~1.2.1 >=1.2.3 1.2.3', '1.2.3'], ['~1.2.1 1.2.3 >=1.2.3', '1.2.3'], ['~1.2.1 1.2.3', '1.2.3'], ['>=1.2.1 1.2.3', '1.2.3'], ['1.2.3 >=1.2.1', '1.2.3'], ['>=1.2.3 >=1.2.1', '1.2.3'], ['>=1.2.1 >=1.2.3', '1.2.3'], ['>=1.2', '1.2.8'], ['^1.2.3', '1.8.1'], ['^0.1.2', '0.1.2'], ['^0.1', '0.1.2'], ['^1.2', '1.4.2'], ['^1.2 ^1', '1.4.2'], ['^1.2.3-alpha', '1.2.3-pre'], ['^1.2.0-alpha', '1.2.0-pre'], ['^0.0.1-alpha', '0.0.1-beta'] ].forEach(function(v) { var range = v[0]; var ver = v[1]; var loose = v[2]; t.ok(satisfies(ver, range, loose), range + ' satisfied by ' + ver); }); t.end(); }); test('\nnegative range tests', function(t) { // [range, version] // version should not be included by range [['1.0.0 - 2.0.0', '2.2.3'], ['1.2.3+asdf - 2.4.3+asdf', '1.2.3-pre.2'], ['1.2.3+asdf - 2.4.3+asdf', '2.4.3-alpha'], ['^1.2.3+build', '2.0.0'], ['^1.2.3+build', '1.2.0'], ['^1.2.3', '1.2.3-pre'], ['^1.2', '1.2.0-pre'], ['>1.2', '1.3.0-beta'], ['<=1.2.3', '1.2.3-beta'], ['^1.2.3', '1.2.3-beta'], ['=0.7.x', '0.7.0-asdf'], ['>=0.7.x', '0.7.0-asdf'], ['1', '1.0.0beta', true], ['<1', '1.0.0beta', true], ['< 1', '1.0.0beta', true], ['1.0.0', '1.0.1'], ['>=1.0.0', '0.0.0'], ['>=1.0.0', '0.0.1'], ['>=1.0.0', '0.1.0'], ['>1.0.0', '0.0.1'], ['>1.0.0', '0.1.0'], ['<=2.0.0', '3.0.0'], ['<=2.0.0', '2.9999.9999'], ['<=2.0.0', '2.2.9'], ['<2.0.0', '2.9999.9999'], ['<2.0.0', '2.2.9'], ['>=0.1.97', 'v0.1.93', true], ['>=0.1.97', '0.1.93'], ['0.1.20 || 1.2.4', '1.2.3'], ['>=0.2.3 || <0.0.1', '0.0.3'], ['>=0.2.3 || <0.0.1', '0.2.2'], ['2.x.x', '1.1.3'], ['2.x.x', '3.1.3'], ['1.2.x', '1.3.3'], ['1.2.x || 2.x', '3.1.3'], ['1.2.x || 2.x', '1.1.3'], ['2.*.*', '1.1.3'], ['2.*.*', '3.1.3'], ['1.2.*', '1.3.3'], ['1.2.* || 2.*', '3.1.3'], ['1.2.* || 2.*', '1.1.3'], ['2', '1.1.2'], ['2.3', '2.4.1'], ['~2.4', '2.5.0'], // >=2.4.0 <2.5.0 ['~2.4', '2.3.9'], ['~>3.2.1', '3.3.2'], // >=3.2.1 <3.3.0 ['~>3.2.1', '3.2.0'], // >=3.2.1 <3.3.0 ['~1', '0.2.3'], // >=1.0.0 <2.0.0 ['~>1', '2.2.3'], ['~1.0', '1.1.0'], // >=1.0.0 <1.1.0 ['<1', '1.0.0'], ['>=1.2', '1.1.1'], ['1', '2.0.0beta', true], ['~v0.5.4-beta', '0.5.4-alpha'], 
['=0.7.x', '0.8.2'], ['>=0.7.x', '0.6.2'], ['<0.7.x', '0.7.2'], ['<1.2.3', '1.2.3-beta'], ['=1.2.3', '1.2.3-beta'], ['>1.2', '1.2.8'], ['^1.2.3', '2.0.0-alpha'], ['^1.2.3', '1.2.2'], ['^1.2', '1.1.9'], ['*', 'v1.2.3-foo', true], // invalid ranges never satisfied! ['blerg', '1.2.3'], ['git+https://user:password0123@github.com/foo', '123.0.0', true], ['^1.2.3', '2.0.0-pre'] ].forEach(function(v) { var range = v[0]; var ver = v[1]; var loose = v[2]; var found = satisfies(ver, range, loose); t.ok(!found, ver + ' not satisfied by ' + range); }); t.end(); }); test('\nincrement versions test', function(t) { // [version, inc, result, identifier] // inc(version, inc) -> result [['1.2.3', 'major', '2.0.0'], ['1.2.3', 'minor', '1.3.0'], ['1.2.3', 'patch', '1.2.4'], ['1.2.3tag', 'major', '2.0.0', true], ['1.2.3-tag', 'major', '2.0.0'], ['1.2.3', 'fake', null], ['1.2.0-0', 'patch', '1.2.0'], ['fake', 'major', null], ['1.2.3-4', 'major', '2.0.0'], ['1.2.3-4', 'minor', '1.3.0'], ['1.2.3-4', 'patch', '1.2.3'], ['1.2.3-alpha.0.beta', 'major', '2.0.0'], ['1.2.3-alpha.0.beta', 'minor', '1.3.0'], ['1.2.3-alpha.0.beta', 'patch', '1.2.3'], ['1.2.4', 'prerelease', '1.2.5-0'], ['1.2.3-0', 'prerelease', '1.2.3-1'], ['1.2.3-alpha.0', 'prerelease', '1.2.3-alpha.1'], ['1.2.3-alpha.1', 'prerelease', '1.2.3-alpha.2'], ['1.2.3-alpha.2', 'prerelease', '1.2.3-alpha.3'], ['1.2.3-alpha.0.beta', 'prerelease', '1.2.3-alpha.1.beta'], ['1.2.3-alpha.1.beta', 'prerelease', '1.2.3-alpha.2.beta'], ['1.2.3-alpha.2.beta', 'prerelease', '1.2.3-alpha.3.beta'], ['1.2.3-alpha.10.0.beta', 'prerelease', '1.2.3-alpha.10.1.beta'], ['1.2.3-alpha.10.1.beta', 'prerelease', '1.2.3-alpha.10.2.beta'], ['1.2.3-alpha.10.2.beta', 'prerelease', '1.2.3-alpha.10.3.beta'], ['1.2.3-alpha.10.beta.0', 'prerelease', '1.2.3-alpha.10.beta.1'], ['1.2.3-alpha.10.beta.1', 'prerelease', '1.2.3-alpha.10.beta.2'], ['1.2.3-alpha.10.beta.2', 'prerelease', '1.2.3-alpha.10.beta.3'], ['1.2.3-alpha.9.beta', 'prerelease', '1.2.3-alpha.10.beta'], ['1.2.3-alpha.10.beta', 'prerelease', '1.2.3-alpha.11.beta'], ['1.2.3-alpha.11.beta', 'prerelease', '1.2.3-alpha.12.beta'], ['1.2.0', 'prepatch', '1.2.1-0'], ['1.2.0-1', 'prepatch', '1.2.1-0'], ['1.2.0', 'preminor', '1.3.0-0'], ['1.2.3-1', 'preminor', '1.3.0-0'], ['1.2.0', 'premajor', '2.0.0-0'], ['1.2.3-1', 'premajor', '2.0.0-0'], ['1.2.0-1', 'minor', '1.2.0'], ['1.0.0-1', 'major', '1.0.0'], ['1.2.3', 'major', '2.0.0', false, 'dev'], ['1.2.3', 'minor', '1.3.0', false, 'dev'], ['1.2.3', 'patch', '1.2.4', false, 'dev'], ['1.2.3tag', 'major', '2.0.0', true, 'dev'], ['1.2.3-tag', 'major', '2.0.0', false, 'dev'], ['1.2.3', 'fake', null, false, 'dev'], ['1.2.0-0', 'patch', '1.2.0', false, 'dev'], ['fake', 'major', null, false, 'dev'], ['1.2.3-4', 'major', '2.0.0', false, 'dev'], ['1.2.3-4', 'minor', '1.3.0', false, 'dev'], ['1.2.3-4', 'patch', '1.2.3', false, 'dev'], ['1.2.3-alpha.0.beta', 'major', '2.0.0', false, 'dev'], ['1.2.3-alpha.0.beta', 'minor', '1.3.0', false, 'dev'], ['1.2.3-alpha.0.beta', 'patch', '1.2.3', false, 'dev'], ['1.2.4', 'prerelease', '1.2.5-dev.0', false, 'dev'], ['1.2.3-0', 'prerelease', '1.2.3-dev.0', false, 'dev'], ['1.2.3-alpha.0', 'prerelease', '1.2.3-dev.0', false, 'dev'], ['1.2.3-alpha.0', 'prerelease', '1.2.3-alpha.1', false, 'alpha'], ['1.2.3-alpha.0.beta', 'prerelease', '1.2.3-dev.0', false, 'dev'], ['1.2.3-alpha.0.beta', 'prerelease', '1.2.3-alpha.1.beta', false, 'alpha'], ['1.2.3-alpha.10.0.beta', 'prerelease', '1.2.3-dev.0', false, 'dev'], ['1.2.3-alpha.10.0.beta', 'prerelease', 
'1.2.3-alpha.10.1.beta', false, 'alpha'], ['1.2.3-alpha.10.1.beta', 'prerelease', '1.2.3-alpha.10.2.beta', false, 'alpha'], ['1.2.3-alpha.10.2.beta', 'prerelease', '1.2.3-alpha.10.3.beta', false, 'alpha'], ['1.2.3-alpha.10.beta.0', 'prerelease', '1.2.3-dev.0', false, 'dev'], ['1.2.3-alpha.10.beta.0', 'prerelease', '1.2.3-alpha.10.beta.1', false, 'alpha'], ['1.2.3-alpha.10.beta.1', 'prerelease', '1.2.3-alpha.10.beta.2', false, 'alpha'], ['1.2.3-alpha.10.beta.2', 'prerelease', '1.2.3-alpha.10.beta.3', false, 'alpha'], ['1.2.3-alpha.9.beta', 'prerelease', '1.2.3-dev.0', false, 'dev'], ['1.2.3-alpha.9.beta', 'prerelease', '1.2.3-alpha.10.beta', false, 'alpha'], ['1.2.3-alpha.10.beta', 'prerelease', '1.2.3-alpha.11.beta', false, 'alpha'], ['1.2.3-alpha.11.beta', 'prerelease', '1.2.3-alpha.12.beta', false, 'alpha'], ['1.2.0', 'prepatch', '1.2.1-dev.0', false, 'dev'], ['1.2.0-1', 'prepatch', '1.2.1-dev.0', false, 'dev'], ['1.2.0', 'preminor', '1.3.0-dev.0', false, 'dev'], ['1.2.3-1', 'preminor', '1.3.0-dev.0', false, 'dev'], ['1.2.0', 'premajor', '2.0.0-dev.0', false, 'dev'], ['1.2.3-1', 'premajor', '2.0.0-dev.0', false, 'dev'], ['1.2.0-1', 'minor', '1.2.0', false, 'dev'], ['1.0.0-1', 'major', '1.0.0', false, 'dev'], ['1.2.3-dev.bar', 'prerelease', '1.2.3-dev.0', false, 'dev'] ].forEach(function(v) { var pre = v[0]; var what = v[1]; var wanted = v[2]; var loose = v[3]; var id = v[4]; var found = inc(pre, what, loose, id); var cmd = 'inc(' + pre + ', ' + what + ', ' + id + ')'; t.equal(found, wanted, cmd + ' === ' + wanted); var parsed = semver.parse(pre, loose); if (wanted) { parsed.inc(what, id); t.equal(parsed.version, wanted, cmd + ' object version updated'); t.equal(parsed.raw, wanted, cmd + ' object raw field updated'); } else if (parsed) { t.throws(function () { parsed.inc(what, id) }) } else { t.equal(parsed, null) } }); t.end(); }); test('\ndiff versions test', function(t) { // [version1, version2, result] // diff(version1, version2) -> result [['1.2.3', '0.2.3', 'major'], ['1.4.5', '0.2.3', 'major'], ['1.2.3', '2.0.0-pre', 'premajor'], ['1.2.3', '1.3.3', 'minor'], ['1.0.1', '1.1.0-pre', 'preminor'], ['1.2.3', '1.2.4', 'patch'], ['1.2.3', '1.2.4-pre', 'prepatch'], ['0.0.1', '0.0.1-pre', 'prerelease'], ['0.0.1', '0.0.1-pre-2', 'prerelease'], ['1.1.0', '1.1.0-pre', 'prerelease'], ['1.1.0-pre-1', '1.1.0-pre-2', 'prerelease'], ['1.0.0', '1.0.0', null] ].forEach(function(v) { var version1 = v[0]; var version2 = v[1]; var wanted = v[2]; var found = diff(version1, version2); var cmd = 'diff(' + version1 + ', ' + version2 + ')'; t.equal(found, wanted, cmd + ' === ' + wanted); }); t.end(); }); test('\nvalid range test', function(t) { // [range, result] // validRange(range) -> result // translate ranges into their canonical form [['1.0.0 - 2.0.0', '>=1.0.0 <=2.0.0'], ['1.0.0', '1.0.0'], ['>=*', '*'], ['', '*'], ['*', '*'], ['*', '*'], ['>=1.0.0', '>=1.0.0'], ['>1.0.0', '>1.0.0'], ['<=2.0.0', '<=2.0.0'], ['1', '>=1.0.0 <2.0.0'], ['<=2.0.0', '<=2.0.0'], ['<=2.0.0', '<=2.0.0'], ['<2.0.0', '<2.0.0'], ['<2.0.0', '<2.0.0'], ['>= 1.0.0', '>=1.0.0'], ['>= 1.0.0', '>=1.0.0'], ['>= 1.0.0', '>=1.0.0'], ['> 1.0.0', '>1.0.0'], ['> 1.0.0', '>1.0.0'], ['<= 2.0.0', '<=2.0.0'], ['<= 2.0.0', '<=2.0.0'], ['<= 2.0.0', '<=2.0.0'], ['< 2.0.0', '<2.0.0'], ['< 2.0.0', '<2.0.0'], ['>=0.1.97', '>=0.1.97'], ['>=0.1.97', '>=0.1.97'], ['0.1.20 || 1.2.4', '0.1.20||1.2.4'], ['>=0.2.3 || <0.0.1', '>=0.2.3||<0.0.1'], ['>=0.2.3 || <0.0.1', '>=0.2.3||<0.0.1'], ['>=0.2.3 || <0.0.1', '>=0.2.3||<0.0.1'], ['||', '||'], ['2.x.x', 
'>=2.0.0 <3.0.0'], ['1.2.x', '>=1.2.0 <1.3.0'], ['1.2.x || 2.x', '>=1.2.0 <1.3.0||>=2.0.0 <3.0.0'], ['1.2.x || 2.x', '>=1.2.0 <1.3.0||>=2.0.0 <3.0.0'], ['x', '*'], ['2.*.*', '>=2.0.0 <3.0.0'], ['1.2.*', '>=1.2.0 <1.3.0'], ['1.2.* || 2.*', '>=1.2.0 <1.3.0||>=2.0.0 <3.0.0'], ['*', '*'], ['2', '>=2.0.0 <3.0.0'], ['2.3', '>=2.3.0 <2.4.0'], ['~2.4', '>=2.4.0 <2.5.0'], ['~2.4', '>=2.4.0 <2.5.0'], ['~>3.2.1', '>=3.2.1 <3.3.0'], ['~1', '>=1.0.0 <2.0.0'], ['~>1', '>=1.0.0 <2.0.0'], ['~> 1', '>=1.0.0 <2.0.0'], ['~1.0', '>=1.0.0 <1.1.0'], ['~ 1.0', '>=1.0.0 <1.1.0'], ['^0', '>=0.0.0 <1.0.0'], ['^ 1', '>=1.0.0 <2.0.0'], ['^0.1', '>=0.1.0 <0.2.0'], ['^1.0', '>=1.0.0 <2.0.0'], ['^1.2', '>=1.2.0 <2.0.0'], ['^0.0.1', '>=0.0.1 <0.0.2'], ['^0.0.1-beta', '>=0.0.1-beta <0.0.2'], ['^0.1.2', '>=0.1.2 <0.2.0'], ['^1.2.3', '>=1.2.3 <2.0.0'], ['^1.2.3-beta.4', '>=1.2.3-beta.4 <2.0.0'], ['<1', '<1.0.0'], ['< 1', '<1.0.0'], ['>=1', '>=1.0.0'], ['>= 1', '>=1.0.0'], ['<1.2', '<1.2.0'], ['< 1.2', '<1.2.0'], ['1', '>=1.0.0 <2.0.0'], ['>01.02.03', '>1.2.3', true], ['>01.02.03', null], ['~1.2.3beta', '>=1.2.3-beta <1.3.0', true], ['~1.2.3beta', null], ['^ 1.2 ^ 1', '>=1.2.0 <2.0.0 >=1.0.0 <2.0.0'] ].forEach(function(v) { var pre = v[0]; var wanted = v[1]; var loose = v[2]; var found = validRange(pre, loose); t.equal(found, wanted, 'validRange(' + pre + ') === ' + wanted); }); t.end(); }); test('\ncomparators test', function(t) { // [range, comparators] // turn range into a set of individual comparators [['1.0.0 - 2.0.0', [['>=1.0.0', '<=2.0.0']]], ['1.0.0', [['1.0.0']]], ['>=*', [['']]], ['', [['']]], ['*', [['']]], ['*', [['']]], ['>=1.0.0', [['>=1.0.0']]], ['>=1.0.0', [['>=1.0.0']]], ['>=1.0.0', [['>=1.0.0']]], ['>1.0.0', [['>1.0.0']]], ['>1.0.0', [['>1.0.0']]], ['<=2.0.0', [['<=2.0.0']]], ['1', [['>=1.0.0', '<2.0.0']]], ['<=2.0.0', [['<=2.0.0']]], ['<=2.0.0', [['<=2.0.0']]], ['<2.0.0', [['<2.0.0']]], ['<2.0.0', [['<2.0.0']]], ['>= 1.0.0', [['>=1.0.0']]], ['>= 1.0.0', [['>=1.0.0']]], ['>= 1.0.0', [['>=1.0.0']]], ['> 1.0.0', [['>1.0.0']]], ['> 1.0.0', [['>1.0.0']]], ['<= 2.0.0', [['<=2.0.0']]], ['<= 2.0.0', [['<=2.0.0']]], ['<= 2.0.0', [['<=2.0.0']]], ['< 2.0.0', [['<2.0.0']]], ['<\t2.0.0', [['<2.0.0']]], ['>=0.1.97', [['>=0.1.97']]], ['>=0.1.97', [['>=0.1.97']]], ['0.1.20 || 1.2.4', [['0.1.20'], ['1.2.4']]], ['>=0.2.3 || <0.0.1', [['>=0.2.3'], ['<0.0.1']]], ['>=0.2.3 || <0.0.1', [['>=0.2.3'], ['<0.0.1']]], ['>=0.2.3 || <0.0.1', [['>=0.2.3'], ['<0.0.1']]], ['||', [[''], ['']]], ['2.x.x', [['>=2.0.0', '<3.0.0']]], ['1.2.x', [['>=1.2.0', '<1.3.0']]], ['1.2.x || 2.x', [['>=1.2.0', '<1.3.0'], ['>=2.0.0', '<3.0.0']]], ['1.2.x || 2.x', [['>=1.2.0', '<1.3.0'], ['>=2.0.0', '<3.0.0']]], ['x', [['']]], ['2.*.*', [['>=2.0.0', '<3.0.0']]], ['1.2.*', [['>=1.2.0', '<1.3.0']]], ['1.2.* || 2.*', [['>=1.2.0', '<1.3.0'], ['>=2.0.0', '<3.0.0']]], ['1.2.* || 2.*', [['>=1.2.0', '<1.3.0'], ['>=2.0.0', '<3.0.0']]], ['*', [['']]], ['2', [['>=2.0.0', '<3.0.0']]], ['2.3', [['>=2.3.0', '<2.4.0']]], ['~2.4', [['>=2.4.0', '<2.5.0']]], ['~2.4', [['>=2.4.0', '<2.5.0']]], ['~>3.2.1', [['>=3.2.1', '<3.3.0']]], ['~1', [['>=1.0.0', '<2.0.0']]], ['~>1', [['>=1.0.0', '<2.0.0']]], ['~> 1', [['>=1.0.0', '<2.0.0']]], ['~1.0', [['>=1.0.0', '<1.1.0']]], ['~ 1.0', [['>=1.0.0', '<1.1.0']]], ['~ 1.0.3', [['>=1.0.3', '<1.1.0']]], ['~> 1.0.3', [['>=1.0.3', '<1.1.0']]], ['<1', [['<1.0.0']]], ['< 1', [['<1.0.0']]], ['>=1', [['>=1.0.0']]], ['>= 1', [['>=1.0.0']]], ['<1.2', [['<1.2.0']]], ['< 1.2', [['<1.2.0']]], ['1', [['>=1.0.0', '<2.0.0']]], ['1 2', [['>=1.0.0', 
'<2.0.0', '>=2.0.0', '<3.0.0']]], ['1.2 - 3.4.5', [['>=1.2.0', '<=3.4.5']]], ['1.2.3 - 3.4', [['>=1.2.3', '<3.5.0']]], ['1.2.3 - 3', [['>=1.2.3', '<4.0.0']]], ['>*', [['<0.0.0']]], ['<*', [['<0.0.0']]] ].forEach(function(v) { var pre = v[0]; var wanted = v[1]; var found = toComparators(v[0]); var jw = JSON.stringify(wanted); t.equivalent(found, wanted, 'toComparators(' + pre + ') === ' + jw); }); t.end(); }); test('\ninvalid version numbers', function(t) { ['1.2.3.4', 'NOT VALID', 1.2, null, 'Infinity.NaN.Infinity' ].forEach(function(v) { t.throws(function() { new SemVer(v); }, {name:'TypeError', message:'Invalid Version: ' + v}); }); t.end(); }); test('\nstrict vs loose version numbers', function(t) { [['=1.2.3', '1.2.3'], ['01.02.03', '1.2.3'], ['1.2.3-beta.01', '1.2.3-beta.1'], [' =1.2.3', '1.2.3'], ['1.2.3foo', '1.2.3-foo'] ].forEach(function(v) { var loose = v[0]; var strict = v[1]; t.throws(function() { new SemVer(loose); }); var lv = new SemVer(loose, true); t.equal(lv.version, strict); t.ok(eq(loose, strict, true)); t.throws(function() { eq(loose, strict); }); t.throws(function() { new SemVer(strict).compare(loose); }); }); t.end(); }); test('\nstrict vs loose ranges', function(t) { [['>=01.02.03', '>=1.2.3'], ['~1.02.03beta', '>=1.2.3-beta <1.3.0'] ].forEach(function(v) { var loose = v[0]; var comps = v[1]; t.throws(function() { new Range(loose); }); t.equal(new Range(loose, true).range, comps); }); t.end(); }); test('\nmax satisfying', function(t) { [[['1.2.3', '1.2.4'], '1.2', '1.2.4'], [['1.2.4', '1.2.3'], '1.2', '1.2.4'], [['1.2.3', '1.2.4', '1.2.5', '1.2.6'], '~1.2.3', '1.2.6'], [['1.1.0', '1.2.0', '1.2.1', '1.3.0', '2.0.0b1', '2.0.0b2', '2.0.0b3', '2.0.0', '2.1.0'], '~2.0.0', '2.0.0', true] ].forEach(function(v) { var versions = v[0]; var range = v[1]; var expect = v[2]; var loose = v[3]; var actual = semver.maxSatisfying(versions, range, loose); t.equal(actual, expect); }); t.end(); }); npm_3.5.2.orig/node_modules/semver/test/ltr.js0000644000000000000000000001216512631326456017573 0ustar 00000000000000var tap = require('tap'); var test = tap.test; var semver = require('../semver.js'); var ltr = semver.ltr; test('\nltr tests', function(t) { // [range, version, loose] // Version should be less than range [ ['~1.2.2', '1.2.1'], ['~0.6.1-1', '0.6.1-0'], ['1.0.0 - 2.0.0', '0.0.1'], ['1.0.0-beta.2', '1.0.0-beta.1'], ['1.0.0', '0.0.0'], ['>=2.0.0', '1.1.1'], ['>=2.0.0', '1.2.9'], ['>2.0.0', '2.0.0'], ['0.1.20 || 1.2.4', '0.1.5'], ['2.x.x', '1.0.0'], ['1.2.x', '1.1.0'], ['1.2.x || 2.x', '1.0.0'], ['2.*.*', '1.0.1'], ['1.2.*', '1.1.3'], ['1.2.* || 2.*', '1.1.9999'], ['2', '1.0.0'], ['2.3', '2.2.2'], ['~2.4', '2.3.0'], // >=2.4.0 <2.5.0 ['~2.4', '2.3.5'], ['~>3.2.1', '3.2.0'], // >=3.2.1 <3.3.0 ['~1', '0.2.3'], // >=1.0.0 <2.0.0 ['~>1', '0.2.4'], ['~> 1', '0.2.3'], ['~1.0', '0.1.2'], // >=1.0.0 <1.1.0 ['~ 1.0', '0.1.0'], ['>1.2', '1.2.0'], ['> 1.2', '1.2.1'], ['1', '0.0.0beta', true], ['~v0.5.4-pre', '0.5.4-alpha'], ['~v0.5.4-pre', '0.5.4-alpha'], ['=0.7.x', '0.6.0'], ['=0.7.x', '0.6.0-asdf'], ['>=0.7.x', '0.6.0'], ['~1.2.2', '1.2.1'], ['1.0.0 - 2.0.0', '0.2.3'], ['1.0.0', '0.0.1'], ['>=2.0.0', '1.0.0'], ['>=2.0.0', '1.9999.9999'], ['>=2.0.0', '1.2.9'], ['>2.0.0', '2.0.0'], ['>2.0.0', '1.2.9'], ['2.x.x', '1.1.3'], ['1.2.x', '1.1.3'], ['1.2.x || 2.x', '1.1.3'], ['2.*.*', '1.1.3'], ['1.2.*', '1.1.3'], ['1.2.* || 2.*', '1.1.3'], ['2', '1.9999.9999'], ['2.3', '2.2.1'], ['~2.4', '2.3.0'], // >=2.4.0 <2.5.0 ['~>3.2.1', '2.3.2'], // >=3.2.1 <3.3.0 ['~1', '0.2.3'], // >=1.0.0 <2.0.0 ['~>1', 
'0.2.3'], ['~1.0', '0.0.0'], // >=1.0.0 <1.1.0 ['>1', '1.0.0'], ['2', '1.0.0beta', true], ['>1', '1.0.0beta', true], ['> 1', '1.0.0beta', true], ['=0.7.x', '0.6.2'], ['=0.7.x', '0.7.0-asdf'], ['^1', '1.0.0-0'], ['>=0.7.x', '0.7.0-asdf'], ['1', '1.0.0beta', true], ['>=0.7.x', '0.6.2'], ['>1.2.3', '1.3.0-alpha'] ].forEach(function(tuple) { var range = tuple[0]; var version = tuple[1]; var loose = tuple[2] || false; var msg = 'ltr(' + version + ', ' + range + ', ' + loose + ')'; t.ok(ltr(version, range, loose), msg); }); t.end(); }); test('\nnegative ltr tests', function(t) { // [range, version, loose] // Version should NOT be less than range [ ['~ 1.0', '1.1.0'], ['~0.6.1-1', '0.6.1-1'], ['1.0.0 - 2.0.0', '1.2.3'], ['1.0.0 - 2.0.0', '2.9.9'], ['1.0.0', '1.0.0'], ['>=*', '0.2.4'], ['', '1.0.0', true], ['*', '1.2.3'], ['>=1.0.0', '1.0.0'], ['>=1.0.0', '1.0.1'], ['>=1.0.0', '1.1.0'], ['>1.0.0', '1.0.1'], ['>1.0.0', '1.1.0'], ['<=2.0.0', '2.0.0'], ['<=2.0.0', '1.9999.9999'], ['<=2.0.0', '0.2.9'], ['<2.0.0', '1.9999.9999'], ['<2.0.0', '0.2.9'], ['>= 1.0.0', '1.0.0'], ['>= 1.0.0', '1.0.1'], ['>= 1.0.0', '1.1.0'], ['> 1.0.0', '1.0.1'], ['> 1.0.0', '1.1.0'], ['<= 2.0.0', '2.0.0'], ['<= 2.0.0', '1.9999.9999'], ['<= 2.0.0', '0.2.9'], ['< 2.0.0', '1.9999.9999'], ['<\t2.0.0', '0.2.9'], ['>=0.1.97', 'v0.1.97'], ['>=0.1.97', '0.1.97'], ['0.1.20 || 1.2.4', '1.2.4'], ['0.1.20 || >1.2.4', '1.2.4'], ['0.1.20 || 1.2.4', '1.2.3'], ['0.1.20 || 1.2.4', '0.1.20'], ['>=0.2.3 || <0.0.1', '0.0.0'], ['>=0.2.3 || <0.0.1', '0.2.3'], ['>=0.2.3 || <0.0.1', '0.2.4'], ['||', '1.3.4'], ['2.x.x', '2.1.3'], ['1.2.x', '1.2.3'], ['1.2.x || 2.x', '2.1.3'], ['1.2.x || 2.x', '1.2.3'], ['x', '1.2.3'], ['2.*.*', '2.1.3'], ['1.2.*', '1.2.3'], ['1.2.* || 2.*', '2.1.3'], ['1.2.* || 2.*', '1.2.3'], ['1.2.* || 2.*', '1.2.3'], ['*', '1.2.3'], ['2', '2.1.2'], ['2.3', '2.3.1'], ['~2.4', '2.4.0'], // >=2.4.0 <2.5.0 ['~2.4', '2.4.5'], ['~>3.2.1', '3.2.2'], // >=3.2.1 <3.3.0 ['~1', '1.2.3'], // >=1.0.0 <2.0.0 ['~>1', '1.2.3'], ['~> 1', '1.2.3'], ['~1.0', '1.0.2'], // >=1.0.0 <1.1.0 ['~ 1.0', '1.0.2'], ['>=1', '1.0.0'], ['>= 1', '1.0.0'], ['<1.2', '1.1.1'], ['< 1.2', '1.1.1'], ['~v0.5.4-pre', '0.5.5'], ['~v0.5.4-pre', '0.5.4'], ['=0.7.x', '0.7.2'], ['>=0.7.x', '0.7.2'], ['<=0.7.x', '0.6.2'], ['>0.2.3 >0.2.4 <=0.2.5', '0.2.5'], ['>=0.2.3 <=0.2.4', '0.2.4'], ['1.0.0 - 2.0.0', '2.0.0'], ['^3.0.0', '4.0.0'], ['^1.0.0 || ~2.0.1', '2.0.0'], ['^0.1.0 || ~3.0.1 || 5.0.0', '3.2.0'], ['^0.1.0 || ~3.0.1 || 5.0.0', '1.0.0beta', true], ['^0.1.0 || ~3.0.1 || 5.0.0', '5.0.0-0', true], ['^0.1.0 || ~3.0.1 || >4 <=5.0.0', '3.5.0'], ['^1.0.0alpha', '1.0.0beta', true], ['~1.0.0alpha', '1.0.0beta', true], ['^1.0.0-alpha', '1.0.0beta', true], ['~1.0.0-alpha', '1.0.0beta', true], ['^1.0.0-alpha', '1.0.0-beta'], ['~1.0.0-alpha', '1.0.0-beta'], ['=0.1.0', '1.0.0'] ].forEach(function(tuple) { var range = tuple[0]; var version = tuple[1]; var loose = tuple[2] || false; var msg = '!ltr(' + version + ', ' + range + ', ' + loose + ')'; t.notOk(ltr(version, range, loose), msg); }); t.end(); }); npm_3.5.2.orig/node_modules/semver/test/major-minor-patch.js0000644000000000000000000000337512631326456022324 0ustar 00000000000000var tap = require('tap'); var test = tap.test; var semver = require('../semver.js'); test('\nmajor tests', function(t) { // [range, version] // Version should be detectable despite extra characters [ ['1.2.3', 1], [' 1.2.3 ', 1], [' 2.2.3-4 ', 2], [' 3.2.3-pre ', 3], ['v5.2.3', 5], [' v8.2.3 ', 8], ['\t13.2.3', 13], ['=21.2.3', 21, true], ['v=34.2.3', 34, 
true] ].forEach(function(tuple) { var range = tuple[0]; var version = tuple[1]; var loose = tuple[2] || false; var msg = 'major(' + range + ') = ' + version; t.equal(semver.major(range, loose), version, msg); }); t.end(); }); test('\nminor tests', function(t) { // [range, version] // Version should be detectable despite extra characters [ ['1.1.3', 1], [' 1.1.3 ', 1], [' 1.2.3-4 ', 2], [' 1.3.3-pre ', 3], ['v1.5.3', 5], [' v1.8.3 ', 8], ['\t1.13.3', 13], ['=1.21.3', 21, true], ['v=1.34.3', 34, true] ].forEach(function(tuple) { var range = tuple[0]; var version = tuple[1]; var loose = tuple[2] || false; var msg = 'minor(' + range + ') = ' + version; t.equal(semver.minor(range, loose), version, msg); }); t.end(); }); test('\npatch tests', function(t) { // [range, version] // Version should be detectable despite extra characters [ ['1.2.1', 1], [' 1.2.1 ', 1], [' 1.2.2-4 ', 2], [' 1.2.3-pre ', 3], ['v1.2.5', 5], [' v1.2.8 ', 8], ['\t1.2.13', 13], ['=1.2.21', 21, true], ['v=1.2.34', 34, true] ].forEach(function(tuple) { var range = tuple[0]; var version = tuple[1]; var loose = tuple[2] || false; var msg = 'patch(' + range + ') = ' + version; t.equal(semver.patch(range, loose), version, msg); }); t.end(); }); npm_3.5.2.orig/node_modules/sha/.npmignore0000644000000000000000000000005012631326456016714 0ustar 00000000000000node_modules test .gitignore .travis.ymlnpm_3.5.2.orig/node_modules/sha/LICENSE0000644000000000000000000000442612631326456015735 0ustar 00000000000000Copyright (c) 2013 Forbes Lindesay The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. The MIT License (MIT) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.npm_3.5.2.orig/node_modules/sha/README.md0000644000000000000000000000332012631326456016177 0ustar 00000000000000# sha Check and get file hashes (using any algorithm) [![Build Status](https://img.shields.io/travis/ForbesLindesay/sha/master.svg)](https://travis-ci.org/ForbesLindesay/sha) [![Dependency Status](https://img.shields.io/gemnasium/ForbesLindesay/sha.svg)](https://gemnasium.com/ForbesLindesay/sha) [![NPM version](https://img.shields.io/npm/v/sha.svg)](http://badge.fury.io/js/sha) ## Installation $ npm install sha ## API ### check(fileName, expected, [options,] cb) / checkSync(filename, expected, [options]) Asynchronously check that `fileName` has a "hash" of `expected`. The callback will be called with either `null` or an error (indicating that they did not match). Options: - algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash` ### get(fileName, [options,] cb) / getSync(filename, [options]) Asynchronously get the "hash" of `fileName`. The callback will be called with an optional `error` object and the (lower cased) hex digest of the hash. Options: - algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash` ### stream(expected, [options]) Check the hash of a stream without ever buffering it. This is a pass through stream so you can do things like: ```js fs.createReadStream('src') .pipe(sha.stream('expected')) .pipe(fs.createWriteStream('dest')) ``` `dest` will be a complete copy of `src` and an error will be emitted if the hash did not match `'expected'`. Options: - algorithm: defaults to `sha1` and can be any of the algorithms supported by `crypto.createHash` ## License You may use this software under the BSD or MIT. Take your pick. 
If you want me to release it under another license, open a pull request.npm_3.5.2.orig/node_modules/sha/index.js0000644000000000000000000000537012631326456016374 0ustar 00000000000000'use strict' var Transform = require('stream').Transform || require('readable-stream').Transform var crypto = require('crypto') var fs = require('graceful-fs') exports.check = check exports.checkSync = checkSync exports.get = get exports.getSync = getSync exports.stream = stream function check(file, expected, options, cb) { if (typeof options === 'function') { cb = options options = undefined } expected = expected.toLowerCase().trim() get(file, options, function (er, actual) { if (er) { if (er.message) er.message += ' while getting shasum for ' + file return cb(er) } if (actual === expected) return cb(null) cb(new Error( 'shasum check failed for ' + file + '\n' + 'Expected: ' + expected + '\n' + 'Actual: ' + actual)) }) } function checkSync(file, expected, options) { expected = expected.toLowerCase().trim() var actual try { actual = getSync(file, options) } catch (er) { if (er.message) er.message += ' while getting shasum for ' + file throw er } if (actual !== expected) { var ex = new Error( 'shasum check failed for ' + file + '\n' + 'Expected: ' + expected + '\n' + 'Actual: ' + actual) throw ex } } function get(file, options, cb) { if (typeof options === 'function') { cb = options options = undefined } options = options || {} var algorithm = options.algorithm || 'sha1' var hash = crypto.createHash(algorithm) var source = fs.createReadStream(file) var errState = null source .on('error', function (er) { if (errState) return return cb(errState = er) }) .on('data', function (chunk) { if (errState) return hash.update(chunk) }) .on('end', function () { if (errState) return var actual = hash.digest("hex").toLowerCase().trim() cb(null, actual) }) } function getSync(file, options) { options = options || {} var algorithm = options.algorithm || 'sha1' var hash = crypto.createHash(algorithm) var source = fs.readFileSync(file) hash.update(source) return hash.digest("hex").toLowerCase().trim() } function stream(expected, options) { expected = expected.toLowerCase().trim() options = options || {} var algorithm = options.algorithm || 'sha1' var hash = crypto.createHash(algorithm) var stream = new Transform() stream._transform = function (chunk, encoding, callback) { hash.update(chunk) stream.push(chunk) callback() } stream._flush = function (cb) { var actual = hash.digest("hex").toLowerCase().trim() if (actual === expected) return cb(null) cb(new Error( 'shasum check failed for:\n' + ' Expected: ' + expected + '\n' + ' Actual: ' + actual)) this.push(null) } return stream }npm_3.5.2.orig/node_modules/sha/node_modules/0000755000000000000000000000000012631326456017377 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/package.json0000644000000000000000000000247012631326456017213 0ustar 00000000000000{ "name": "sha", "version": "2.0.1", "description": "Check and get file hashes", "scripts": { "test": "mocha -R spec" }, "repository": { "type": "git", "url": "https://github.com/ForbesLindesay/sha.git" }, "license": "(BSD-2-Clause OR MIT)", "dependencies": { "graceful-fs": "^4.1.2", "readable-stream": "^2.0.2" }, "devDependencies": { "mocha": "~1.9.0" }, "gitHead": "ce7c72ba753d886fb46c396cbadcbfc8eac25b4f", "bugs": { "url": "https://github.com/ForbesLindesay/sha/issues" }, "homepage": "https://github.com/ForbesLindesay/sha", "_id": "sha@2.0.1", "_shasum": "6030822fbd2c9823949f8f72ed6411ee5cf25aae", "_from": "sha@>=2.0.1 <2.1.0", 
"_npmVersion": "2.7.1", "_nodeVersion": "1.6.2", "_npmUser": { "name": "forbeslindesay", "email": "forbes@lindesay.co.uk" }, "maintainers": [ { "name": "forbeslindesay", "email": "forbes@lindesay.co.uk" }, { "name": "kenan", "email": "kenan@kenany.me" }, { "name": "thefourtheye", "email": "thechargingvolcano@gmail.com" } ], "dist": { "shasum": "6030822fbd2c9823949f8f72ed6411ee5cf25aae", "tarball": "http://registry.npmjs.org/sha/-/sha-2.0.1.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/sha/-/sha-2.0.1.tgz" } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/0000755000000000000000000000000012631326456022427 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/.npmignore0000644000000000000000000000004412631326456024424 0ustar 00000000000000build/ test/ examples/ fs.js zlib.jsnpm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/.travis.yml0000644000000000000000000000264112631326456024543 0ustar 00000000000000sudo: false language: node_js before_install: - npm install -g npm notifications: email: false matrix: include: - node_js: '0.8' env: TASK=test - node_js: '0.10' env: TASK=test - node_js: '0.11' env: TASK=test - node_js: '0.12' env: TASK=test - node_js: 'iojs' env: TASK=test - node_js: 'iojs' env: TASK=browser BROWSER_NAME=opera BROWSER_VERSION="11..latest" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=ie BROWSER_VERSION="9..latest" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=chrome BROWSER_VERSION="39..beta" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=firefox BROWSER_VERSION="34..beta" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=ipad BROWSER_VERSION="6.0..latest" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=iphone BROWSER_VERSION="6.0..latest" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=safari BROWSER_VERSION="5..latest" - node_js: 'iojs' env: TASK=browser BROWSER_NAME=android BROWSER_VERSION="4.0..latest" script: "npm run $TASK" env: global: - secure: rE2Vvo7vnjabYNULNyLFxOyt98BoJexDqsiOnfiD6kLYYsiQGfr/sbZkPMOFm9qfQG7pjqx+zZWZjGSswhTt+626C0t/njXqug7Yps4c3dFblzGfreQHp7wNX5TFsvrxd6dAowVasMp61sJcRnB2w8cUzoe3RAYUDHyiHktwqMc= - secure: g9YINaKAdMatsJ28G9jCGbSaguXCyxSTy+pBO6Ch0Cf57ZLOTka3HqDj8p3nV28LUIHZ3ut5WO43CeYKwt4AUtLpBS3a0dndHdY6D83uY6b2qh5hXlrcbeQTq2cvw2y95F7hm4D1kwrgZ7ViqaKggRcEupAL69YbJnxeUDKWEdI= npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/.zuul.yml0000644000000000000000000000001112631326456024217 0ustar 00000000000000ui: tape npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/LICENSE0000644000000000000000000000211012631326456023426 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. All rights reserved. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/README.md0000644000000000000000000000361012631326456023706 0ustar 00000000000000# readable-stream ***Node-core streams for userland*** [![Build Status](https://travis-ci.org/nodejs/readable-stream.svg?branch=master)](https://travis-ci.org/nodejs/readable-stream) [![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/) [![NPM](https://nodei.co/npm-dl/readable-stream.png?&months=6&height=3)](https://nodei.co/npm/readable-stream/) [![Sauce Test Status](https://saucelabs.com/browser-matrix/readable-stream.svg)](https://saucelabs.com/u/readable-stream) ```bash npm install --save readable-stream ``` ***Node-core streams for userland*** This package is a mirror of the Streams2 and Streams3 implementations in Node-core, including [documentation](doc/stream.markdown). If you want to guarantee a stable streams base, regardless of what version of Node you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *"stream"* module in Node-core, for background see [this blogpost](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html). As of version 2.0.0 **readable-stream** uses semantic versioning. # Streams WG Team Members * **Chris Dickinson** ([@chrisdickinson](https://github.com/chrisdickinson)) <christopher.s.dickinson@gmail.com> - Release GPG key: 9554F04D7259F04124DE6B476D5A82AC7E37093B * **Calvin Metcalf** ([@calvinmetcalf](https://github.com/calvinmetcalf)) <calvin.metcalf@gmail.com> - Release GPG key: F3EF5F62A87FC27A22E643F714CE4FF5015AA242 * **Rod Vagg** ([@rvagg](https://github.com/rvagg)) <rod@vagg.org> - Release GPG key: DD8F2338BAE7501E3DD5AC78C273792F7D83545D * **Sam Newman** ([@sonewman](https://github.com/sonewman)) <newmansam@outlook.com> * **Mathias Buus** ([@mafintosh](https://github.com/mafintosh)) <mathiasbuus@gmail.com> * **Domenic Denicola** ([@domenic](https://github.com/domenic)) <d@domenic.me> npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/doc/0000755000000000000000000000000012631326456023174 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/duplex.js0000644000000000000000000000006412631326456024266 0ustar 00000000000000module.exports = require("./lib/_stream_duplex.js") npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/lib/0000755000000000000000000000000012631326456023175 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/0000755000000000000000000000000012631326456025104 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/package.json0000644000000000000000000000623512631326456024723 0ustar 00000000000000{ "name": "readable-stream", "version": "2.0.2", "description": "Streams3, a user-land copy of the stream library from iojs v2.x", "main": "readable.js", "dependencies": { "core-util-is": "~1.0.0", "inherits": "~2.0.1", "isarray": "0.0.1", "process-nextick-args": "~1.0.0", "string_decoder": "~0.10.x", "util-deprecate": "~1.0.1" }, "devDependencies": { "tap": "~0.2.6", "tape": "~4.0.0", "zuul": "~3.0.0" }, "scripts": { "test": "tap test/parallel/*.js", "browser": "zuul --browser-name $BROWSER_NAME --browser-version 
$BROWSER_VERSION -- test/browser.js" }, "repository": { "type": "git", "url": "git://github.com/nodejs/readable-stream.git" }, "keywords": [ "readable", "stream", "pipe" ], "browser": { "util": false }, "license": "MIT", "readme": "# readable-stream\n\n***Node-core streams for userland*** [![Build Status](https://travis-ci.org/nodejs/readable-stream.svg?branch=master)](https://travis-ci.org/nodejs/readable-stream)\n\n\n[![NPM](https://nodei.co/npm/readable-stream.png?downloads=true&downloadRank=true)](https://nodei.co/npm/readable-stream/)\n[![NPM](https://nodei.co/npm-dl/readable-stream.png?&months=6&height=3)](https://nodei.co/npm/readable-stream/)\n\n\n[![Sauce Test Status](https://saucelabs.com/browser-matrix/readable-stream.svg)](https://saucelabs.com/u/readable-stream)\n\n```bash\nnpm install --save readable-stream\n```\n\n***Node-core streams for userland***\n\nThis package is a mirror of the Streams2 and Streams3 implementations in\nNode-core, including [documentation](doc/stream.markdown).\n\nIf you want to guarantee a stable streams base, regardless of what version of\nNode you, or the users of your libraries are using, use **readable-stream** *only* and avoid the *\"stream\"* module in Node-core, for background see [this blogpost](http://r.va.gg/2014/06/why-i-dont-use-nodes-core-stream-module.html).\n\nAs of version 2.0.0 **readable-stream** uses semantic versioning. \n\n# Streams WG Team Members\n\n* **Chris Dickinson** ([@chrisdickinson](https://github.com/chrisdickinson)) <christopher.s.dickinson@gmail.com>\n - Release GPG key: 9554F04D7259F04124DE6B476D5A82AC7E37093B\n* **Calvin Metcalf** ([@calvinmetcalf](https://github.com/calvinmetcalf)) <calvin.metcalf@gmail.com>\n - Release GPG key: F3EF5F62A87FC27A22E643F714CE4FF5015AA242\n* **Rod Vagg** ([@rvagg](https://github.com/rvagg)) <rod@vagg.org>\n - Release GPG key: DD8F2338BAE7501E3DD5AC78C273792F7D83545D\n* **Sam Newman** ([@sonewman](https://github.com/sonewman)) <newmansam@outlook.com>\n* **Mathias Buus** ([@mafintosh](https://github.com/mafintosh)) <mathiasbuus@gmail.com>\n* **Domenic Denicola** ([@domenic](https://github.com/domenic)) <d@domenic.me>\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/nodejs/readable-stream/issues" }, "homepage": "https://github.com/nodejs/readable-stream#readme", "_id": "readable-stream@2.0.2", "_shasum": "bec81beae8cf455168bc2e5b2b31f5bcfaed9b1b", "_resolved": "https://registry.npmjs.org/readable-stream/-/readable-stream-2.0.2.tgz", "_from": "readable-stream@>=2.0.2 <3.0.0" } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/passthrough.js0000644000000000000000000000007112631326456025332 0ustar 00000000000000module.exports = require("./lib/_stream_passthrough.js") npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/readable.js0000644000000000000000000000101112631326456024515 0ustar 00000000000000var Stream = (function (){ try { return require('st' + 'ream'); // hack to fix a circular dependency issue when used with browserify } catch(_){} }()); exports = module.exports = require('./lib/_stream_readable.js'); exports.Stream = Stream || exports; exports.Readable = exports; exports.Writable = require('./lib/_stream_writable.js'); exports.Duplex = require('./lib/_stream_duplex.js'); exports.Transform = require('./lib/_stream_transform.js'); exports.PassThrough = require('./lib/_stream_passthrough.js'); npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/transform.js0000644000000000000000000000006712631326456025003 0ustar 
00000000000000module.exports = require("./lib/_stream_transform.js") npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/writable.js0000644000000000000000000000006612631326456024600 0ustar 00000000000000module.exports = require("./lib/_stream_writable.js") npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/doc/stream.markdown0000644000000000000000000014777612631326456026261 0ustar 00000000000000# Stream Stability: 2 - Stable A stream is an abstract interface implemented by various objects in io.js. For example a [request to an HTTP server](https://iojs.org/dist/v2.3.0/doc/api/http.html#http_http_incomingmessage) is a stream, as is [stdout][]. Streams are readable, writable, or both. All streams are instances of [EventEmitter][] You can load the Stream base classes by doing `require('stream')`. There are base classes provided for [Readable][] streams, [Writable][] streams, [Duplex][] streams, and [Transform][] streams. This document is split up into 3 sections. The first explains the parts of the API that you need to be aware of to use streams in your programs. If you never implement a streaming API yourself, you can stop there. The second section explains the parts of the API that you need to use if you implement your own custom streams yourself. The API is designed to make this easy for you to do. The third section goes into more depth about how streams work, including some of the internal mechanisms and functions that you should probably not modify unless you definitely know what you are doing. ## API for Stream Consumers Streams can be either [Readable][], [Writable][], or both ([Duplex][]). All streams are EventEmitters, but they also have other custom methods and properties depending on whether they are Readable, Writable, or Duplex. If a stream is both Readable and Writable, then it implements all of the methods and events below. So, a [Duplex][] or [Transform][] stream is fully described by this API, though their implementation may be somewhat different. It is not necessary to implement Stream interfaces in order to consume streams in your programs. If you **are** implementing streaming interfaces in your own program, please also refer to [API for Stream Implementors][] below. Almost all io.js programs, no matter how simple, use Streams in some way. Here is an example of using Streams in an io.js program: ```javascript var http = require('http'); var server = http.createServer(function (req, res) { // req is an http.IncomingMessage, which is a Readable Stream // res is an http.ServerResponse, which is a Writable Stream var body = ''; // we want to get the data as utf8 strings // If you don't set an encoding, then you'll get Buffer objects req.setEncoding('utf8'); // Readable streams emit 'data' events once a listener is added req.on('data', function (chunk) { body += chunk; }); // the end event tells you that you have entire body req.on('end', function () { try { var data = JSON.parse(body); } catch (er) { // uh oh! bad json! res.statusCode = 400; return res.end('error: ' + er.message); } // write back something interesting to the user: res.write(typeof data); res.end(); }); }); server.listen(1337); // $ curl localhost:1337 -d '{}' // object // $ curl localhost:1337 -d '"foo"' // string // $ curl localhost:1337 -d 'not json' // error: Unexpected token o ``` ### Class: stream.Readable The Readable stream interface is the abstraction for a *source* of data that you are reading from. In other words, data comes *out* of a Readable stream. 
A Readable stream will not start emitting data until you indicate that you are ready to receive it. Readable streams have two "modes": a **flowing mode** and a **paused mode**. When in flowing mode, data is read from the underlying system and provided to your program as fast as possible. In paused mode, you must explicitly call `stream.read()` to get chunks of data out. Streams start out in paused mode. **Note**: If no data event handlers are attached, and there are no [`pipe()`][] destinations, and the stream is switched into flowing mode, then data will be lost. You can switch to flowing mode by doing any of the following: * Adding a [`'data'` event][] handler to listen for data. * Calling the [`resume()`][] method to explicitly open the flow. * Calling the [`pipe()`][] method to send the data to a [Writable][]. You can switch back to paused mode by doing either of the following: * If there are no pipe destinations, by calling the [`pause()`][] method. * If there are pipe destinations, by removing any [`'data'` event][] handlers, and removing all pipe destinations by calling the [`unpipe()`][] method. Note that, for backwards compatibility reasons, removing `'data'` event handlers will **not** automatically pause the stream. Also, if there are piped destinations, then calling `pause()` will not guarantee that the stream will *remain* paused once those destinations drain and ask for more data. Examples of readable streams include: * [http responses, on the client](https://iojs.org/dist/v2.3.0/doc/api/http.html#http_http_incomingmessage) * [http requests, on the server](https://iojs.org/dist/v2.3.0/doc/api/http.html#http_http_incomingmessage) * [fs read streams](https://iojs.org/dist/v2.3.0/doc/api/fs.html#fs_class_fs_readstream) * [zlib streams][] * [crypto streams][] * [tcp sockets][] * [child process stdout and stderr][] * [process.stdin][] #### Event: 'readable' When a chunk of data can be read from the stream, it will emit a `'readable'` event. In some cases, listening for a `'readable'` event will cause some data to be read into the internal buffer from the underlying system, if it hadn't already. ```javascript var readable = getReadableStreamSomehow(); readable.on('readable', function() { // there is some data to read now }); ``` Once the internal buffer is drained, a `readable` event will fire again when more data is available. #### Event: 'data' * `chunk` {Buffer | String} The chunk of data. Attaching a `data` event listener to a stream that has not been explicitly paused will switch the stream into flowing mode. Data will then be passed as soon as it is available. If you just want to get all the data out of the stream as fast as possible, this is the best way to do so. ```javascript var readable = getReadableStreamSomehow(); readable.on('data', function(chunk) { console.log('got %d bytes of data', chunk.length); }); ``` #### Event: 'end' This event fires when there will be no more data to read. Note that the `end` event **will not fire** unless the data is completely consumed. This can be done by switching into flowing mode, or by calling `read()` repeatedly until you get to the end. ```javascript var readable = getReadableStreamSomehow(); readable.on('data', function(chunk) { console.log('got %d bytes of data', chunk.length); }); readable.on('end', function() { console.log('there will be no more data.'); }); ``` #### Event: 'close' Emitted when the underlying resource (for example, the backing file descriptor) has been closed. Not all streams will emit this. 
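For example (a minimal sketch, not from the original text, assuming a readable file named `file.txt` exists), an `fs` read stream emits `'close'` once its backing file descriptor has been released:

```javascript
var fs = require('fs');

var readable = fs.createReadStream('file.txt');
readable.resume(); // consume the data so the stream can finish
readable.on('close', function() {
  // the underlying file descriptor has been closed
  console.log('underlying resource closed');
});
```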
#### Event: 'error' * {Error Object} Emitted if there was an error receiving data. #### readable.read([size]) * `size` {Number} Optional argument to specify how much data to read. * Return {String | Buffer | null} The `read()` method pulls some data out of the internal buffer and returns it. If there is no data available, then it will return `null`. If you pass in a `size` argument, then it will return that many bytes. If `size` bytes are not available, then it will return `null`. If you do not specify a `size` argument, then it will return all the data in the internal buffer. This method should only be called in paused mode. In flowing mode, this method is called automatically until the internal buffer is drained. ```javascript var readable = getReadableStreamSomehow(); readable.on('readable', function() { var chunk; while (null !== (chunk = readable.read())) { console.log('got %d bytes of data', chunk.length); } }); ``` If this method returns a data chunk, then it will also trigger the emission of a [`'data'` event][]. #### readable.setEncoding(encoding) * `encoding` {String} The encoding to use. * Return: `this` Call this function to cause the stream to return strings of the specified encoding instead of Buffer objects. For example, if you do `readable.setEncoding('utf8')`, then the output data will be interpreted as UTF-8 data, and returned as strings. If you do `readable.setEncoding('hex')`, then the data will be encoded in hexadecimal string format. This properly handles multi-byte characters that would otherwise be potentially mangled if you simply pulled the Buffers directly and called `buf.toString(encoding)` on them. If you want to read the data as strings, always use this method. ```javascript var readable = getReadableStreamSomehow(); readable.setEncoding('utf8'); readable.on('data', function(chunk) { assert.equal(typeof chunk, 'string'); console.log('got %d characters of string data', chunk.length); }); ``` #### readable.resume() * Return: `this` This method will cause the readable stream to resume emitting `data` events. This method will switch the stream into flowing mode. If you do *not* want to consume the data from a stream, but you *do* want to get to its `end` event, you can call [`readable.resume()`][] to open the flow of data. ```javascript var readable = getReadableStreamSomehow(); readable.resume(); readable.on('end', function() { console.log('got to the end, but did not read anything'); }); ``` #### readable.pause() * Return: `this` This method will cause a stream in flowing mode to stop emitting `data` events, switching out of flowing mode. Any data that becomes available will remain in the internal buffer. ```javascript var readable = getReadableStreamSomehow(); readable.on('data', function(chunk) { console.log('got %d bytes of data', chunk.length); readable.pause(); console.log('there will be no more data for 1 second'); setTimeout(function() { console.log('now data will start flowing again'); readable.resume(); }, 1000); }); ``` #### readable.isPaused() * Return: `Boolean` This method returns whether or not the `readable` has been **explicitly** paused by client code (using `readable.pause()` without a corresponding `readable.resume()`). 
```javascript var readable = new stream.Readable readable.isPaused() // === false readable.pause() readable.isPaused() // === true readable.resume() readable.isPaused() // === false ``` #### readable.pipe(destination[, options]) * `destination` {[Writable][] Stream} The destination for writing data * `options` {Object} Pipe options * `end` {Boolean} End the writer when the reader ends. Default = `true` This method pulls all the data out of a readable stream, and writes it to the supplied destination, automatically managing the flow so that the destination is not overwhelmed by a fast readable stream. Multiple destinations can be piped to safely. ```javascript var readable = getReadableStreamSomehow(); var writable = fs.createWriteStream('file.txt'); // All the data from readable goes into 'file.txt' readable.pipe(writable); ``` This function returns the destination stream, so you can set up pipe chains like so: ```javascript var r = fs.createReadStream('file.txt'); var z = zlib.createGzip(); var w = fs.createWriteStream('file.txt.gz'); r.pipe(z).pipe(w); ``` For example, emulating the Unix `cat` command: ```javascript process.stdin.pipe(process.stdout); ``` By default [`end()`][] is called on the destination when the source stream emits `end`, so that `destination` is no longer writable. Pass `{ end: false }` as `options` to keep the destination stream open. This keeps `writer` open so that "Goodbye" can be written at the end. ```javascript reader.pipe(writer, { end: false }); reader.on('end', function() { writer.end('Goodbye\n'); }); ``` Note that `process.stderr` and `process.stdout` are never closed until the process exits, regardless of the specified options. #### readable.unpipe([destination]) * `destination` {[Writable][] Stream} Optional specific stream to unpipe This method will remove the hooks set up for a previous `pipe()` call. If the destination is not specified, then all pipes are removed. If the destination is specified, but no pipe is set up for it, then this is a no-op. ```javascript var readable = getReadableStreamSomehow(); var writable = fs.createWriteStream('file.txt'); // All the data from readable goes into 'file.txt', // but only for the first second readable.pipe(writable); setTimeout(function() { console.log('stop writing to file.txt'); readable.unpipe(writable); console.log('manually close the file stream'); writable.end(); }, 1000); ``` #### readable.unshift(chunk) * `chunk` {Buffer | String} Chunk of data to unshift onto the read queue This is useful in certain cases where a stream is being consumed by a parser, which needs to "un-consume" some data that it has optimistically pulled out of the source, so that the stream can be passed on to some other party. If you find that you must often call `stream.unshift(chunk)` in your programs, consider implementing a [Transform][] stream instead. (See API for Stream Implementors, below.) 
```javascript // Pull off a header delimited by \n\n // use unshift() if we get too much // Call the callback with (error, header, stream) var StringDecoder = require('string_decoder').StringDecoder; function parseHeader(stream, callback) { stream.on('error', callback); stream.on('readable', onReadable); var decoder = new StringDecoder('utf8'); var header = ''; function onReadable() { var chunk; while (null !== (chunk = stream.read())) { var str = decoder.write(chunk); if (str.match(/\n\n/)) { // found the header boundary var split = str.split(/\n\n/); header += split.shift(); var remaining = split.join('\n\n'); var buf = new Buffer(remaining, 'utf8'); if (buf.length) stream.unshift(buf); stream.removeListener('error', callback); stream.removeListener('readable', onReadable); // now the body of the message can be read from the stream. callback(null, header, stream); } else { // still reading the header. header += str; } } } } ``` #### readable.wrap(stream) * `stream` {Stream} An "old style" readable stream Versions of Node.js prior to v0.10 had streams that did not implement the entire Streams API as it is today. (See "Compatibility" below for more information.) If you are using an older io.js library that emits `'data'` events and has a [`pause()`][] method that is advisory only, then you can use the `wrap()` method to create a [Readable][] stream that uses the old stream as its data source. You will very rarely ever need to call this function, but it exists as a convenience for interacting with old io.js programs and libraries. For example: ```javascript var OldReader = require('./old-api-module.js').OldReader; var oreader = new OldReader; var Readable = require('stream').Readable; var myReader = new Readable().wrap(oreader); myReader.on('readable', function() { myReader.read(); // etc. }); ``` ### Class: stream.Writable The Writable stream interface is an abstraction for a *destination* that you are writing data *to*. Examples of writable streams include: * [http requests, on the client](https://iojs.org/dist/v2.3.0/doc/api/http.html#http_class_http_clientrequest) * [http responses, on the server](https://iojs.org/dist/v2.3.0/doc/api/http.html#http_class_http_serverresponse) * [fs write streams](https://iojs.org/dist/v2.3.0/doc/api/fs.html#fs_class_fs_writestream) * [zlib streams][] * [crypto streams][] * [tcp sockets][] * [child process stdin](https://iojs.org/dist/v2.3.0/doc/api/child_process.html#child_process_child_stdin) * [process.stdout][], [process.stderr][] #### writable.write(chunk[, encoding][, callback]) * `chunk` {String | Buffer} The data to write * `encoding` {String} The encoding, if `chunk` is a String * `callback` {Function} Callback for when this chunk of data is flushed * Returns: {Boolean} True if the data was handled completely. This method writes some data to the underlying system, and calls the supplied callback once the data has been fully handled. The return value indicates if you should continue writing right now. If the data had to be buffered internally, then it will return `false`. Otherwise, it will return `true`. This return value is strictly advisory. You MAY continue to write, even if it returns `false`. However, writes will be buffered in memory, so it is best not to do this excessively. Instead, wait for the `drain` event before writing more data. #### Event: 'drain' If a [`writable.write(chunk)`][] call returns false, then the `drain` event will indicate when it is appropriate to begin writing more data to the stream. 
```javascript // Write the data to the supplied writable stream 1MM times. // Be attentive to back-pressure. function writeOneMillionTimes(writer, data, encoding, callback) { var i = 1000000; write(); function write() { var ok = true; do { i -= 1; if (i === 0) { // last time! writer.write(data, encoding, callback); } else { // see if we should continue, or wait // don't pass the callback, because we're not done yet. ok = writer.write(data, encoding); } } while (i > 0 && ok); if (i > 0) { // had to stop early! // write some more once it drains writer.once('drain', write); } } } ``` #### writable.cork() Forces buffering of all writes. Buffered data will be flushed either at `.uncork()` or at `.end()` call. #### writable.uncork() Flush all data, buffered since `.cork()` call. #### writable.setDefaultEncoding(encoding) * `encoding` {String} The new default encoding Sets the default encoding for a writable stream. #### writable.end([chunk][, encoding][, callback]) * `chunk` {String | Buffer} Optional data to write * `encoding` {String} The encoding, if `chunk` is a String * `callback` {Function} Optional callback for when the stream is finished Call this method when no more data will be written to the stream. If supplied, the callback is attached as a listener on the `finish` event. Calling [`write()`][] after calling [`end()`][] will raise an error. ```javascript // write 'hello, ' and then end with 'world!' var file = fs.createWriteStream('example.txt'); file.write('hello, '); file.end('world!'); // writing more now is not allowed! ``` #### Event: 'finish' When the [`end()`][] method has been called, and all data has been flushed to the underlying system, this event is emitted. ```javascript var writer = getWritableStreamSomehow(); for (var i = 0; i < 100; i ++) { writer.write('hello, #' + i + '!\n'); } writer.end('this is the end\n'); writer.on('finish', function() { console.error('all writes are now complete.'); }); ``` #### Event: 'pipe' * `src` {[Readable][] Stream} source stream that is piping to this writable This is emitted whenever the `pipe()` method is called on a readable stream, adding this writable to its set of destinations. ```javascript var writer = getWritableStreamSomehow(); var reader = getReadableStreamSomehow(); writer.on('pipe', function(src) { console.error('something is piping into the writer'); assert.equal(src, reader); }); reader.pipe(writer); ``` #### Event: 'unpipe' * `src` {[Readable][] Stream} The source stream that [unpiped][] this writable This is emitted whenever the [`unpipe()`][] method is called on a readable stream, removing this writable from its set of destinations. ```javascript var writer = getWritableStreamSomehow(); var reader = getReadableStreamSomehow(); writer.on('unpipe', function(src) { console.error('something has stopped piping into the writer'); assert.equal(src, reader); }); reader.pipe(writer); reader.unpipe(writer); ``` #### Event: 'error' * {Error object} Emitted if there was an error when writing or piping data. ### Class: stream.Duplex Duplex streams are streams that implement both the [Readable][] and [Writable][] interfaces. See above for usage. Examples of Duplex streams include: * [tcp sockets][] * [zlib streams][] * [crypto streams][] ### Class: stream.Transform Transform streams are [Duplex][] streams where the output is in some way computed from the input. They implement both the [Readable][] and [Writable][] interfaces. See above for usage. 
Examples of Transform streams include: * [zlib streams][] * [crypto streams][] ## API for Stream Implementors To implement any sort of stream, the pattern is the same: 1. Extend the appropriate parent class in your own subclass. (The [`util.inherits`][] method is particularly helpful for this.) 2. Call the appropriate parent class constructor in your constructor, to be sure that the internal mechanisms are set up properly. 2. Implement one or more specific methods, as detailed below. The class to extend and the method(s) to implement depend on the sort of stream class you are writing:

| Use-case | Class | Method(s) to implement |
|----------|-------|------------------------|
| Reading only | [Readable](#stream_class_stream_readable_1) | [_read][] |
| Writing only | [Writable](#stream_class_stream_writable_1) | [_write][], _writev |
| Reading and writing | [Duplex](#stream_class_stream_duplex_1) | [_read][], [_write][], _writev |
| Operate on written data, then read the result | [Transform](#stream_class_stream_transform_1) | _transform, _flush |

      In your implementation code, it is very important to never call the methods described in [API for Stream Consumers][] above. Otherwise, you can potentially cause adverse side effects in programs that consume your streaming interfaces. ### Class: stream.Readable `stream.Readable` is an abstract class designed to be extended with an underlying implementation of the [`_read(size)`][] method. Please see above under [API for Stream Consumers][] for how to consume streams in your programs. What follows is an explanation of how to implement Readable streams in your programs. #### Example: A Counting Stream This is a basic example of a Readable stream. It emits the numerals from 1 to 1,000,000 in ascending order, and then ends. ```javascript var Readable = require('stream').Readable; var util = require('util'); util.inherits(Counter, Readable); function Counter(opt) { Readable.call(this, opt); this._max = 1000000; this._index = 1; } Counter.prototype._read = function() { var i = this._index++; if (i > this._max) this.push(null); else { var str = '' + i; var buf = new Buffer(str, 'ascii'); this.push(buf); } }; ``` #### Example: SimpleProtocol v1 (Sub-optimal) This is similar to the `parseHeader` function described above, but implemented as a custom stream. Also, note that this implementation does not convert the incoming data to a string. However, this would be better implemented as a [Transform][] stream. See below for a better implementation. ```javascript // A parser for a simple data protocol. // The "header" is a JSON object, followed by 2 \n characters, and // then a message body. // // NOTE: This can be done more simply as a Transform stream! // Using Readable directly for this is sub-optimal. See the // alternative example below under the Transform section. var Readable = require('stream').Readable; var util = require('util'); util.inherits(SimpleProtocol, Readable); function SimpleProtocol(source, options) { if (!(this instanceof SimpleProtocol)) return new SimpleProtocol(source, options); Readable.call(this, options); this._inBody = false; this._sawFirstCr = false; // source is a readable stream, such as a socket or file this._source = source; var self = this; source.on('end', function() { self.push(null); }); // give it a kick whenever the source is readable // read(0) will not consume any bytes source.on('readable', function() { self.read(0); }); this._rawHeader = []; this.header = null; } SimpleProtocol.prototype._read = function(n) { if (!this._inBody) { var chunk = this._source.read(); // if the source doesn't have data, we don't have data yet. if (chunk === null) return this.push(''); // check if the chunk has a \n\n var split = -1; for (var i = 0; i < chunk.length; i++) { if (chunk[i] === 10) { // '\n' if (this._sawFirstCr) { split = i; break; } else { this._sawFirstCr = true; } } else { this._sawFirstCr = false; } } if (split === -1) { // still waiting for the \n\n // stash the chunk, and try again. this._rawHeader.push(chunk); this.push(''); } else { this._inBody = true; var h = chunk.slice(0, split); this._rawHeader.push(h); var header = Buffer.concat(this._rawHeader).toString(); try { this.header = JSON.parse(header); } catch (er) { this.emit('error', new Error('invalid simple protocol data')); return; } // now, because we got some extra data, unshift the rest // back into the read queue so that our consumer will see it. var b = chunk.slice(split); this.unshift(b); // and let them know that we are done parsing the header. 
this.emit('header', this.header); } } else { // from there on, just provide the data to our consumer. // careful not to push(null), since that would indicate EOF. var chunk = this._source.read(); if (chunk) this.push(chunk); } }; // Usage: // var parser = new SimpleProtocol(source); // Now parser is a readable stream that will emit 'header' // with the parsed header data. ``` #### new stream.Readable([options]) * `options` {Object} * `highWaterMark` {Number} The maximum number of bytes to store in the internal buffer before ceasing to read from the underlying resource. Default=16kb, or 16 for `objectMode` streams * `encoding` {String} If specified, then buffers will be decoded to strings using the specified encoding. Default=null * `objectMode` {Boolean} Whether this stream should behave as a stream of objects. Meaning that stream.read(n) returns a single value instead of a Buffer of size n. Default=false In classes that extend the Readable class, make sure to call the Readable constructor so that the buffering settings can be properly initialized. #### readable.\_read(size) * `size` {Number} Number of bytes to read asynchronously Note: **Implement this function, but do NOT call it directly.** This function should NOT be called directly. It should be implemented by child classes, and only called by the internal Readable class methods. All Readable stream implementations must provide a `_read` method to fetch data from the underlying resource. This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes. When data is available, put it into the read queue by calling `readable.push(chunk)`. If `push` returns false, then you should stop reading. When `_read` is called again, you should start pushing more data. The `size` argument is advisory. Implementations where a "read" is a single call that returns data can use this to know how much data to fetch. Implementations where that is not relevant, such as TCP or TLS, may ignore this argument, and simply provide data whenever it becomes available. There is no need, for example to "wait" until `size` bytes are available before calling [`stream.push(chunk)`][]. #### readable.push(chunk[, encoding]) * `chunk` {Buffer | null | String} Chunk of data to push into the read queue * `encoding` {String} Encoding of String chunks. Must be a valid Buffer encoding, such as `'utf8'` or `'ascii'` * return {Boolean} Whether or not more pushes should be performed Note: **This function should be called by Readable implementors, NOT by consumers of Readable streams.** The `_read()` function will not be called again until at least one `push(chunk)` call is made. The `Readable` class works by putting data into a read queue to be pulled out later by calling the `read()` method when the `'readable'` event fires. The `push()` method will explicitly insert some data into the read queue. If it is called with `null` then it will signal the end of the data (EOF). This API is designed to be as flexible as possible. For example, you may be wrapping a lower-level source which has some sort of pause/resume mechanism, and a data callback. 
In those cases, you could wrap the low-level source object by doing something like this: ```javascript // source is an object with readStop() and readStart() methods, // and an `ondata` member that gets called when it has data, and // an `onend` member that gets called when the data is over. util.inherits(SourceWrapper, Readable); function SourceWrapper(options) { Readable.call(this, options); this._source = getLowlevelSourceObject(); var self = this; // Every time there's data, we push it into the internal buffer. this._source.ondata = function(chunk) { // if push() returns false, then we need to stop reading from source if (!self.push(chunk)) self._source.readStop(); }; // When the source ends, we push the EOF-signaling `null` chunk this._source.onend = function() { self.push(null); }; } // _read will be called when the stream wants to pull more data in // the advisory size argument is ignored in this case. SourceWrapper.prototype._read = function(size) { this._source.readStart(); }; ``` ### Class: stream.Writable `stream.Writable` is an abstract class designed to be extended with an underlying implementation of the [`_write(chunk, encoding, callback)`][] method. Please see above under [API for Stream Consumers][] for how to consume writable streams in your programs. What follows is an explanation of how to implement Writable streams in your programs. #### new stream.Writable([options]) * `options` {Object} * `highWaterMark` {Number} Buffer level when [`write()`][] starts returning false. Default=16kb, or 16 for `objectMode` streams * `decodeStrings` {Boolean} Whether or not to decode strings into Buffers before passing them to [`_write()`][]. Default=true * `objectMode` {Boolean} Whether or not the `write(anyObj)` is a valid operation. If set you can write arbitrary data instead of only `Buffer` / `String` data. Default=false In classes that extend the Writable class, make sure to call the constructor so that the buffering settings can be properly initialized. #### writable.\_write(chunk, encoding, callback) * `chunk` {Buffer | String} The chunk to be written. Will **always** be a buffer unless the `decodeStrings` option was set to `false`. * `encoding` {String} If the chunk is a string, then this is the encoding type. If chunk is a buffer, then this is the special value - 'buffer', ignore it in this case. * `callback` {Function} Call this function (optionally with an error argument) when you are done processing the supplied chunk. All Writable stream implementations must provide a [`_write()`][] method to send data to the underlying resource. Note: **This function MUST NOT be called directly.** It should be implemented by child classes, and called by the internal Writable class methods only. Call the callback using the standard `callback(error)` pattern to signal that the write completed successfully or with an error. If the `decodeStrings` flag is set in the constructor options, then `chunk` may be a string rather than a Buffer, and `encoding` will indicate the sort of string that it is. This is to support implementations that have an optimized handling for certain string data encodings. If you do not explicitly set the `decodeStrings` option to `false`, then you can safely ignore the `encoding` argument, and assume that `chunk` will always be a Buffer. This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes. 
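To illustrate (a minimal sketch, not part of the original document; the `LogWriter` name is hypothetical), a Writable subclass only needs to call the parent constructor and supply a `_write()` method:

```javascript
var Writable = require('stream').Writable;
var util = require('util');

util.inherits(LogWriter, Writable);

function LogWriter(options) {
  Writable.call(this, options);
}

// Receives each chunk in turn; signal completion (or an error) via callback
LogWriter.prototype._write = function(chunk, encoding, callback) {
  // with the default decodeStrings=true, chunk is always a Buffer here
  console.log('got chunk: %s', chunk.toString());
  callback();
};

var writer = new LogWriter();
writer.write('hello, ');
writer.end('world!');
```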
#### writable.\_writev(chunks, callback) * `chunks` {Array} The chunks to be written. Each chunk has following format: `{ chunk: ..., encoding: ... }`. * `callback` {Function} Call this function (optionally with an error argument) when you are done processing the supplied chunks. Note: **This function MUST NOT be called directly.** It may be implemented by child classes, and called by the internal Writable class methods only. This function is completely optional to implement. In most cases it is unnecessary. If implemented, it will be called with all the chunks that are buffered in the write queue. ### Class: stream.Duplex A "duplex" stream is one that is both Readable and Writable, such as a TCP socket connection. Note that `stream.Duplex` is an abstract class designed to be extended with an underlying implementation of the `_read(size)` and [`_write(chunk, encoding, callback)`][] methods as you would with a Readable or Writable stream class. Since JavaScript doesn't have multiple prototypal inheritance, this class prototypally inherits from Readable, and then parasitically from Writable. It is thus up to the user to implement both the lowlevel `_read(n)` method as well as the lowlevel [`_write(chunk, encoding, callback)`][] method on extension duplex classes. #### new stream.Duplex(options) * `options` {Object} Passed to both Writable and Readable constructors. Also has the following fields: * `allowHalfOpen` {Boolean} Default=true. If set to `false`, then the stream will automatically end the readable side when the writable side ends and vice versa. * `readableObjectMode` {Boolean} Default=false. Sets `objectMode` for readable side of the stream. Has no effect if `objectMode` is `true`. * `writableObjectMode` {Boolean} Default=false. Sets `objectMode` for writable side of the stream. Has no effect if `objectMode` is `true`. In classes that extend the Duplex class, make sure to call the constructor so that the buffering settings can be properly initialized. ### Class: stream.Transform A "transform" stream is a duplex stream where the output is causally connected in some way to the input, such as a [zlib][] stream or a [crypto][] stream. There is no requirement that the output be the same size as the input, the same number of chunks, or arrive at the same time. For example, a Hash stream will only ever have a single chunk of output which is provided when the input is ended. A zlib stream will produce output that is either much smaller or much larger than its input. Rather than implement the [`_read()`][] and [`_write()`][] methods, Transform classes must implement the `_transform()` method, and may optionally also implement the `_flush()` method. (See below.) #### new stream.Transform([options]) * `options` {Object} Passed to both Writable and Readable constructors. In classes that extend the Transform class, make sure to call the constructor so that the buffering settings can be properly initialized. #### transform.\_transform(chunk, encoding, callback) * `chunk` {Buffer | String} The chunk to be transformed. Will **always** be a buffer unless the `decodeStrings` option was set to `false`. * `encoding` {String} If the chunk is a string, then this is the encoding type. If chunk is a buffer, then this is the special value - 'buffer', ignore it in this case. * `callback` {Function} Call this function (optionally with an error argument and data) when you are done processing the supplied chunk. 
Note: **This function MUST NOT be called directly.** It should be implemented by child classes, and called by the internal Transform class methods only.

All Transform stream implementations must provide a `_transform` method to accept input and produce output.

`_transform` should do whatever has to be done in this specific Transform class to handle the bytes being written, and pass them off to the readable portion of the interface. Do asynchronous I/O, process things, and so on.

Call `transform.push(outputChunk)` 0 or more times to generate output from this input chunk, depending on how much data you want to output as a result of this chunk.

Call the callback function only when the current chunk is completely consumed. Note that there may or may not be output as a result of any particular input chunk. If you supply output as the second argument to the callback, it will be passed to the push method; in other words, the following are equivalent:

```javascript
transform.prototype._transform = function (data, encoding, callback) {
  this.push(data);
  callback();
};

transform.prototype._transform = function (data, encoding, callback) {
  callback(null, data);
};
```

This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes.

#### transform.\_flush(callback)

* `callback` {Function} Call this function (optionally with an error argument) when you are done flushing any remaining data.

Note: **This function MUST NOT be called directly.** It MAY be implemented by child classes, and if so, will be called by the internal Transform class methods only.

In some cases, your transform operation may need to emit a bit more data at the end of the stream. For example, a `Zlib` compression stream will store up some internal state so that it can optimally compress the output. At the end, however, it needs to do the best it can with what is left, so that the data will be complete.

In those cases, you can implement a `_flush` method, which will be called at the very end, after all the written data is consumed, but before emitting `end` to signal the end of the readable side. Just like with `_transform`, call `transform.push(chunk)` zero or more times, as appropriate, and call `callback` when the flush operation is complete.

This method is prefixed with an underscore because it is internal to the class that defines it, and should not be called directly by user programs. However, you **are** expected to override this method in your own extension classes.

#### Events: 'finish' and 'end'

The [`finish`][] and [`end`][] events are from the parent Writable and Readable classes, respectively. The `finish` event is fired after `.end()` is called and all chunks have been processed by `_transform`; `end` is fired after all data has been output, which is after the callback in `_flush` has been called.

#### Example: `SimpleProtocol` parser v2

The simple protocol parser example above can be implemented more simply by using the higher-level [Transform][] stream class, similar to the `parseHeader` and `SimpleProtocol v1` examples above.

In this example, rather than providing the input as an argument, it would be piped into the parser, which is a more idiomatic io.js stream approach.
```javascript var util = require('util'); var Transform = require('stream').Transform; util.inherits(SimpleProtocol, Transform); function SimpleProtocol(options) { if (!(this instanceof SimpleProtocol)) return new SimpleProtocol(options); Transform.call(this, options); this._inBody = false; this._sawFirstCr = false; this._rawHeader = []; this.header = null; } SimpleProtocol.prototype._transform = function(chunk, encoding, done) { if (!this._inBody) { // check if the chunk has a \n\n var split = -1; for (var i = 0; i < chunk.length; i++) { if (chunk[i] === 10) { // '\n' if (this._sawFirstCr) { split = i; break; } else { this._sawFirstCr = true; } } else { this._sawFirstCr = false; } } if (split === -1) { // still waiting for the \n\n // stash the chunk, and try again. this._rawHeader.push(chunk); } else { this._inBody = true; var h = chunk.slice(0, split); this._rawHeader.push(h); var header = Buffer.concat(this._rawHeader).toString(); try { this.header = JSON.parse(header); } catch (er) { this.emit('error', new Error('invalid simple protocol data')); return; } // and let them know that we are done parsing the header. this.emit('header', this.header); // now, because we got some extra data, emit this first. this.push(chunk.slice(split)); } } else { // from there on, just provide the data to our consumer as-is. this.push(chunk); } done(); }; // Usage: // var parser = new SimpleProtocol(); // source.pipe(parser) // Now parser is a readable stream that will emit 'header' // with the parsed header data. ``` ### Class: stream.PassThrough This is a trivial implementation of a [Transform][] stream that simply passes the input bytes across to the output. Its purpose is mainly for examples and testing, but there are occasionally use cases where it can come in handy as a building block for novel sorts of streams. ## Simplified Constructor API In simple cases there is now the added benefit of being able to construct a stream without inheritance. This can be done by passing the appropriate methods as constructor options: Examples: ### Readable ```javascript var readable = new stream.Readable({ read: function(n) { // sets this._read under the hood } }); ``` ### Writable ```javascript var writable = new stream.Writable({ write: function(chunk, encoding, next) { // sets this._write under the hood } }); // or var writable = new stream.Writable({ writev: function(chunks, next) { // sets this._writev under the hood } }); ``` ### Duplex ```javascript var duplex = new stream.Duplex({ read: function(n) { // sets this._read under the hood }, write: function(chunk, encoding, next) { // sets this._write under the hood } }); // or var duplex = new stream.Duplex({ read: function(n) { // sets this._read under the hood }, writev: function(chunks, next) { // sets this._writev under the hood } }); ``` ### Transform ```javascript var transform = new stream.Transform({ transform: function(chunk, encoding, next) { // sets this._transform under the hood }, flush: function(done) { // sets this._flush under the hood } }); ``` ## Streams: Under the Hood ### Buffering Both Writable and Readable streams will buffer data on an internal object called `_writableState.buffer` or `_readableState.buffer`, respectively. The amount of data that will potentially be buffered depends on the `highWaterMark` option which is passed into the constructor. Buffering in Readable streams happens when the implementation calls [`stream.push(chunk)`][]. 
If the consumer of the Stream does not call `stream.read()`, then the data will sit in the internal queue until it is consumed. Buffering in Writable streams happens when the user calls [`stream.write(chunk)`][] repeatedly, even when `write()` returns `false`. The purpose of streams, especially with the `pipe()` method, is to limit the buffering of data to acceptable levels, so that sources and destinations of varying speed will not overwhelm the available memory. ### `stream.read(0)` There are some cases where you want to trigger a refresh of the underlying readable stream mechanisms, without actually consuming any data. In that case, you can call `stream.read(0)`, which will always return null. If the internal read buffer is below the `highWaterMark`, and the stream is not currently reading, then calling `read(0)` will trigger a low-level `_read` call. There is almost never a need to do this. However, you will see some cases in io.js's internals where this is done, particularly in the Readable stream class internals. ### `stream.push('')` Pushing a zero-byte string or Buffer (when not in [Object mode][]) has an interesting side effect. Because it *is* a call to [`stream.push()`][], it will end the `reading` process. However, it does *not* add any data to the readable buffer, so there's nothing for a user to consume. Very rarely, there are cases where you have no data to provide now, but the consumer of your stream (or, perhaps, another bit of your own code) will know when to check again, by calling `stream.read(0)`. In those cases, you *may* call `stream.push('')`. So far, the only use case for this functionality is in the [tls.CryptoStream][] class, which is deprecated in io.js v1.0. If you find that you have to use `stream.push('')`, please consider another approach, because it almost certainly indicates that something is horribly wrong. ### Compatibility with Older Node.js Versions In versions of Node.js prior to v0.10, the Readable stream interface was simpler, but also less powerful and less useful. * Rather than waiting for you to call the `read()` method, `'data'` events would start emitting immediately. If you needed to do some I/O to decide how to handle data, then you had to store the chunks in some kind of buffer so that they would not be lost. * The [`pause()`][] method was advisory, rather than guaranteed. This meant that you still had to be prepared to receive `'data'` events even when the stream was in a paused state. In io.js v1.0 and Node.js v0.10, the Readable class described below was added. For backwards compatibility with older Node.js programs, Readable streams switch into "flowing mode" when a `'data'` event handler is added, or when the [`resume()`][] method is called. The effect is that, even if you are not using the new `read()` method and `'readable'` event, you no longer have to worry about losing `'data'` chunks. Most programs will continue to function normally. However, this introduces an edge case in the following conditions: * No [`'data'` event][] handler is added. * The [`resume()`][] method is never called. * The stream is not piped to any writable destination. For example, consider the following code: ```javascript // WARNING! BROKEN! net.createServer(function(socket) { // we add an 'end' method, but never consume the data socket.on('end', function() { // It will never get here. socket.end('I got your message (but didnt read it)\n'); }); }).listen(1337); ``` In versions of Node.js prior to v0.10, the incoming message data would be simply discarded. 
However, in io.js v1.0 and Node.js v0.10 and beyond, the socket will remain paused forever. The workaround in this situation is to call the `resume()` method to start the flow of data: ```javascript // Workaround net.createServer(function(socket) { socket.on('end', function() { socket.end('I got your message (but didnt read it)\n'); }); // start the flow of data, discarding it. socket.resume(); }).listen(1337); ``` In addition to new Readable streams switching into flowing mode, pre-v0.10 style streams can be wrapped in a Readable class using the `wrap()` method. ### Object Mode Normally, Streams operate on Strings and Buffers exclusively. Streams that are in **object mode** can emit generic JavaScript values other than Buffers and Strings. A Readable stream in object mode will always return a single item from a call to `stream.read(size)`, regardless of what the size argument is. A Writable stream in object mode will always ignore the `encoding` argument to `stream.write(data, encoding)`. The special value `null` still retains its special value for object mode streams. That is, for object mode readable streams, `null` as a return value from `stream.read()` indicates that there is no more data, and [`stream.push(null)`][] will signal the end of stream data (`EOF`). No streams in io.js core are object mode streams. This pattern is only used by userland streaming libraries. You should set `objectMode` in your stream child class constructor on the options object. Setting `objectMode` mid-stream is not safe. For Duplex streams `objectMode` can be set exclusively for readable or writable side with `readableObjectMode` and `writableObjectMode` respectively. These options can be used to implement parsers and serializers with Transform streams. ```javascript var util = require('util'); var StringDecoder = require('string_decoder').StringDecoder; var Transform = require('stream').Transform; util.inherits(JSONParseStream, Transform); // Gets \n-delimited JSON string data, and emits the parsed objects function JSONParseStream() { if (!(this instanceof JSONParseStream)) return new JSONParseStream(); Transform.call(this, { readableObjectMode : true }); this._buffer = ''; this._decoder = new StringDecoder('utf8'); } JSONParseStream.prototype._transform = function(chunk, encoding, cb) { this._buffer += this._decoder.write(chunk); // split on newlines var lines = this._buffer.split(/\r?\n/); // keep the last partial line buffered this._buffer = lines.pop(); for (var l = 0; l < lines.length; l++) { var line = lines[l]; try { var obj = JSON.parse(line); } catch (er) { this.emit('error', er); return; } // push the parsed object out to the readable consumer this.push(obj); } cb(); }; JSONParseStream.prototype._flush = function(cb) { // Just handle any leftover var rem = this._buffer.trim(); if (rem) { try { var obj = JSON.parse(rem); } catch (er) { this.emit('error', er); return; } // push the parsed object out to the readable consumer this.push(obj); } cb(); }; ``` [EventEmitter]: https://iojs.org/dist/v2.3.0/doc/api/events.html#events_class_events_eventemitter [Object mode]: #stream_object_mode [`stream.push(chunk)`]: #stream_readable_push_chunk_encoding [`stream.push(null)`]: #stream_readable_push_chunk_encoding [`stream.push()`]: #stream_readable_push_chunk_encoding [`unpipe()`]: #stream_readable_unpipe_destination [unpiped]: #stream_readable_unpipe_destination [tcp sockets]: https://iojs.org/dist/v2.3.0/doc/api/net.html#net_class_net_socket [zlib streams]: zlib.html [zlib]: zlib.html [crypto streams]: 
crypto.html [crypto]: crypto.html [tls.CryptoStream]: https://iojs.org/dist/v2.3.0/doc/api/tls.html#tls_class_cryptostream [process.stdin]: https://iojs.org/dist/v2.3.0/doc/api/process.html#process_process_stdin [stdout]: https://iojs.org/dist/v2.3.0/doc/api/process.html#process_process_stdout [process.stdout]: https://iojs.org/dist/v2.3.0/doc/api/process.html#process_process_stdout [process.stderr]: https://iojs.org/dist/v2.3.0/doc/api/process.html#process_process_stderr [child process stdout and stderr]: https://iojs.org/dist/v2.3.0/doc/api/child_process.html#child_process_child_stdout [API for Stream Consumers]: #stream_api_for_stream_consumers [API for Stream Implementors]: #stream_api_for_stream_implementors [Readable]: #stream_class_stream_readable [Writable]: #stream_class_stream_writable [Duplex]: #stream_class_stream_duplex [Transform]: #stream_class_stream_transform [`end`]: #stream_event_end [`finish`]: #stream_event_finish [`_read(size)`]: #stream_readable_read_size_1 [`_read()`]: #stream_readable_read_size_1 [_read]: #stream_readable_read_size_1 [`writable.write(chunk)`]: #stream_writable_write_chunk_encoding_callback [`write(chunk, encoding, callback)`]: #stream_writable_write_chunk_encoding_callback [`write()`]: #stream_writable_write_chunk_encoding_callback [`stream.write(chunk)`]: #stream_writable_write_chunk_encoding_callback [`_write(chunk, encoding, callback)`]: #stream_writable_write_chunk_encoding_callback_1 [`_write()`]: #stream_writable_write_chunk_encoding_callback_1 [_write]: #stream_writable_write_chunk_encoding_callback_1 [`util.inherits`]: https://iojs.org/dist/v2.3.0/doc/api/util.html#util_util_inherits_constructor_superconstructor [`end()`]: #stream_writable_end_chunk_encoding_callback [`'data'` event]: #stream_event_data [`resume()`]: #stream_readable_resume [`readable.resume()`]: #stream_readable_resume [`pause()`]: #stream_readable_pause [`unpipe()`]: #stream_readable_unpipe_destination [`pipe()`]: #stream_readable_pipe_destination_options npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/doc/wg-meetings/0000755000000000000000000000000012631326456025422 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/doc/wg-meetings/2015-01-30.md0000644000000000000000000000435012631326456026773 0ustar 00000000000000# streams WG Meeting 2015-01-30 ## Links * **Google Hangouts Video**: http://www.youtube.com/watch?v=I9nDOSGfwZg * **GitHub Issue**: https://github.com/iojs/readable-stream/issues/106 * **Original Minutes Google Doc**: https://docs.google.com/document/d/17aTgLnjMXIrfjgNaTUnHQO7m3xgzHR2VXBTmi03Qii4/ ## Agenda Extracted from https://github.com/iojs/readable-stream/labels/wg-agenda prior to meeting. * adopt a charter [#105](https://github.com/iojs/readable-stream/issues/105) * release and versioning strategy [#101](https://github.com/iojs/readable-stream/issues/101) * simpler stream creation [#102](https://github.com/iojs/readable-stream/issues/102) * proposal: deprecate implicit flowing of streams [#99](https://github.com/iojs/readable-stream/issues/99) ## Minutes ### adopt a charter * group: +1's all around ### What versioning scheme should be adopted? 
* group: +1’s 3.0.0 * domenic+group: pulling in patches from other sources where appropriate * mikeal: version independently, suggesting versions for io.js * mikeal+domenic: work with TC to notify in advance of changes simpler stream creation ### streamline creation of streams * sam: streamline creation of streams * domenic: nice simple solution posted but, we lose the opportunity to change the model may not be backwards incompatible (double check keys) **action item:** domenic will check ### remove implicit flowing of streams on(‘data’) * add isFlowing / isPaused * mikeal: worrying that we’re documenting polyfill methods – confuses users * domenic: more reflective API is probably good, with warning labels for users * new section for mad scientists (reflective stream access) * calvin: name the “third state” * mikeal: maybe borrow the name from whatwg? * domenic: we’re missing the “third state” * consensus: kind of difficult to name the third state * mikeal: figure out differences in states / compat * mathias: always flow on data – eliminates third state * explore what it breaks **action items:** * ask isaac for ability to list packages by what public io.js APIs they use (esp. Stream) * ask rod/build for infrastructure * **chris**: explore the “flow on data” approach * add isPaused/isFlowing * add new docs section * move isPaused to that section npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/lib/_stream_duplex.js0000644000000000000000000000351412631326456026551 0ustar 00000000000000// a duplex stream is just a stream that is both readable and writable. // Since JS doesn't have multiple prototypal inheritance, this class // prototypally inherits from Readable, and then parasitically from // Writable. 'use strict'; /**/ var objectKeys = Object.keys || function (obj) { var keys = []; for (var key in obj) keys.push(key); return keys; } /**/ module.exports = Duplex; /**/ var processNextTick = require('process-nextick-args'); /**/ /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ var Readable = require('./_stream_readable'); var Writable = require('./_stream_writable'); util.inherits(Duplex, Readable); var keys = objectKeys(Writable.prototype); for (var v = 0; v < keys.length; v++) { var method = keys[v]; if (!Duplex.prototype[method]) Duplex.prototype[method] = Writable.prototype[method]; } function Duplex(options) { if (!(this instanceof Duplex)) return new Duplex(options); Readable.call(this, options); Writable.call(this, options); if (options && options.readable === false) this.readable = false; if (options && options.writable === false) this.writable = false; this.allowHalfOpen = true; if (options && options.allowHalfOpen === false) this.allowHalfOpen = false; this.once('end', onend); } // the no-half-open enforcer function onend() { // if we allow half-open state, or if the writable side ended, // then we're ok. if (this.allowHalfOpen || this._writableState.ended) return; // no more data can be written. // But allow more writes to happen in this tick. processNextTick(onEndNT, this); } function onEndNT(self) { self.end(); } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/lib/_stream_passthrough.js0000644000000000000000000000114012631326456027610 0ustar 00000000000000// a passthrough stream. // basically just the most minimal sort of Transform stream. // Every written chunk gets output as-is. 
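// As a usage sketch (illustrative only, not part of this module): a
// PassThrough can be dropped into a pipeline to observe data without
// altering it.
//
//   var PassThrough = require('readable-stream').PassThrough;
//   var tap = new PassThrough();
//   tap.on('data', function(chunk) {
//     console.log('saw %d bytes', chunk.length);
//   });
//   process.stdin.pipe(tap).pipe(process.stdout);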
'use strict'; module.exports = PassThrough; var Transform = require('./_stream_transform'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(PassThrough, Transform); function PassThrough(options) { if (!(this instanceof PassThrough)) return new PassThrough(options); Transform.call(this, options); } PassThrough.prototype._transform = function(chunk, encoding, cb) { cb(null, chunk); }; npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/lib/_stream_readable.js0000644000000000000000000006144512631326456027016 0ustar 00000000000000'use strict'; module.exports = Readable; /**/ var processNextTick = require('process-nextick-args'); /**/ /**/ var isArray = require('isarray'); /**/ /**/ var Buffer = require('buffer').Buffer; /**/ Readable.ReadableState = ReadableState; var EE = require('events').EventEmitter; /**/ if (!EE.listenerCount) EE.listenerCount = function(emitter, type) { return emitter.listeners(type).length; }; /**/ /**/ var Stream; (function (){try{ Stream = require('st' + 'ream'); }catch(_){}finally{ if (!Stream) Stream = require('events').EventEmitter; }}()) /**/ var Buffer = require('buffer').Buffer; /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ /**/ var debug = require('util'); if (debug && debug.debuglog) { debug = debug.debuglog('stream'); } else { debug = function () {}; } /**/ var StringDecoder; util.inherits(Readable, Stream); function ReadableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // object stream flag. Used to make read(n) ignore n and to // make all the buffer merging and length checks go away this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.readableObjectMode; // the point at which it stops calling _read() to fill the buffer // Note: 0 is a valid value, means "don't call _read preemptively ever" var hwm = options.highWaterMark; var defaultHwm = this.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.buffer = []; this.length = 0; this.pipes = null; this.pipesCount = 0; this.flowing = null; this.ended = false; this.endEmitted = false; this.reading = false; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. this.sync = true; // whenever we return null, then we set a flag to say // that we're awaiting a 'readable' event emission. this.needReadable = false; this.emittedReadable = false; this.readableListening = false; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // when piping, we only care about 'readable' events that happen // after read()ing all the bytes and not getting any pushback. 
this.ranOut = false; // the number of writers that are awaiting a drain event in .pipe()s this.awaitDrain = 0; // if true, a maybeReadMore has been scheduled this.readingMore = false; this.decoder = null; this.encoding = null; if (options.encoding) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this.decoder = new StringDecoder(options.encoding); this.encoding = options.encoding; } } function Readable(options) { var Duplex = require('./_stream_duplex'); if (!(this instanceof Readable)) return new Readable(options); this._readableState = new ReadableState(options, this); // legacy this.readable = true; if (options && typeof options.read === 'function') this._read = options.read; Stream.call(this); } // Manually shove something into the read() buffer. // This returns true if the highWaterMark has not been hit yet, // similar to how Writable.write() returns true if you should // write() some more. Readable.prototype.push = function(chunk, encoding) { var state = this._readableState; if (!state.objectMode && typeof chunk === 'string') { encoding = encoding || state.defaultEncoding; if (encoding !== state.encoding) { chunk = new Buffer(chunk, encoding); encoding = ''; } } return readableAddChunk(this, state, chunk, encoding, false); }; // Unshift should *always* be something directly out of read() Readable.prototype.unshift = function(chunk) { var state = this._readableState; return readableAddChunk(this, state, chunk, '', true); }; Readable.prototype.isPaused = function() { return this._readableState.flowing === false; }; function readableAddChunk(stream, state, chunk, encoding, addToFront) { var er = chunkInvalid(state, chunk); if (er) { stream.emit('error', er); } else if (chunk === null) { state.reading = false; onEofChunk(stream, state); } else if (state.objectMode || chunk && chunk.length > 0) { if (state.ended && !addToFront) { var e = new Error('stream.push() after EOF'); stream.emit('error', e); } else if (state.endEmitted && addToFront) { var e = new Error('stream.unshift() after end event'); stream.emit('error', e); } else { if (state.decoder && !addToFront && !encoding) chunk = state.decoder.write(chunk); if (!addToFront) state.reading = false; // if we want the data now, just emit it. if (state.flowing && state.length === 0 && !state.sync) { stream.emit('data', chunk); stream.read(0); } else { // update the buffer info. state.length += state.objectMode ? 1 : chunk.length; if (addToFront) state.buffer.unshift(chunk); else state.buffer.push(chunk); if (state.needReadable) emitReadable(stream); } maybeReadMore(stream, state); } } else if (!addToFront) { state.reading = false; } return needMoreData(state); } // if it's past the high water mark, we can push in some more. // Also, if we have no data yet, we can stand some // more bytes. This is to work around cases where hwm=0, // such as the repl. Also, if the push() triggered a // readable event, and the user called read(largeNumber) such that // needReadable was set, then we ought to push more, so that another // 'readable' event will be triggered. function needMoreData(state) { return !state.ended && (state.needReadable || state.length < state.highWaterMark || state.length === 0); } // backwards compatibility. 
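// setEncoding() makes subsequent read() calls return decoded strings of
// the given encoding rather than Buffers. The StringDecoder takes care
// of multi-byte characters that are split across chunk boundaries.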
Readable.prototype.setEncoding = function(enc) { if (!StringDecoder) StringDecoder = require('string_decoder/').StringDecoder; this._readableState.decoder = new StringDecoder(enc); this._readableState.encoding = enc; return this; }; // Don't raise the hwm > 128MB var MAX_HWM = 0x800000; function roundUpToNextPowerOf2(n) { if (n >= MAX_HWM) { n = MAX_HWM; } else { // Get the next highest power of 2 n--; for (var p = 1; p < 32; p <<= 1) n |= n >> p; n++; } return n; } function howMuchToRead(n, state) { if (state.length === 0 && state.ended) return 0; if (state.objectMode) return n === 0 ? 0 : 1; if (n === null || isNaN(n)) { // only flow one buffer at a time if (state.flowing && state.buffer.length) return state.buffer[0].length; else return state.length; } if (n <= 0) return 0; // If we're asking for more than the target buffer level, // then raise the water mark. Bump up to the next highest // power of 2, to prevent increasing it excessively in tiny // amounts. if (n > state.highWaterMark) state.highWaterMark = roundUpToNextPowerOf2(n); // don't have that much. return null, unless we've ended. if (n > state.length) { if (!state.ended) { state.needReadable = true; return 0; } else { return state.length; } } return n; } // you can override either this method, or the async _read(n) below. Readable.prototype.read = function(n) { debug('read', n); var state = this._readableState; var nOrig = n; if (typeof n !== 'number' || n > 0) state.emittedReadable = false; // if we're doing read(0) to trigger a readable event, but we // already have a bunch of data in the buffer, then just trigger // the 'readable' event and move on. if (n === 0 && state.needReadable && (state.length >= state.highWaterMark || state.ended)) { debug('read: emitReadable', state.length, state.ended); if (state.length === 0 && state.ended) endReadable(this); else emitReadable(this); return null; } n = howMuchToRead(n, state); // if we've ended, and we're now clear, then finish it up. if (n === 0 && state.ended) { if (state.length === 0) endReadable(this); return null; } // All the actual chunk generation logic needs to be // *below* the call to _read. The reason is that in certain // synthetic stream cases, such as passthrough streams, _read // may be a completely synchronous operation which may change // the state of the read buffer, providing enough data when // before there was *not* enough. // // So, the steps are: // 1. Figure out what the state of things will be after we do // a read from the buffer. // // 2. If that resulting state will trigger a _read, then call _read. // Note that this may be asynchronous, or synchronous. Yes, it is // deeply ugly to write APIs this way, but that still doesn't mean // that the Readable class should behave improperly, as streams are // designed to be sync/async agnostic. // Take note if the _read call is sync or async (ie, if the read call // has returned yet), so that we know whether or not it's safe to emit // 'readable' etc. // // 3. Actually pull the requested chunks out of the buffer and return. // if we need a readable event, then we need to do some reading. var doRead = state.needReadable; debug('need readable', doRead); // if we currently have less than the highWaterMark, then also read some if (state.length === 0 || state.length - n < state.highWaterMark) { doRead = true; debug('length less than watermark', doRead); } // however, if we've ended, then there's no point, and if we're already // reading, then it's unnecessary. 
if (state.ended || state.reading) { doRead = false; debug('reading or ended', doRead); } if (doRead) { debug('do read'); state.reading = true; state.sync = true; // if the length is currently zero, then we *need* a readable event. if (state.length === 0) state.needReadable = true; // call internal read method this._read(state.highWaterMark); state.sync = false; } // If _read pushed data synchronously, then `reading` will be false, // and we need to re-evaluate how much data we can return to the user. if (doRead && !state.reading) n = howMuchToRead(nOrig, state); var ret; if (n > 0) ret = fromList(n, state); else ret = null; if (ret === null) { state.needReadable = true; n = 0; } state.length -= n; // If we have nothing in the buffer, then we want to know // as soon as we *do* get something into the buffer. if (state.length === 0 && !state.ended) state.needReadable = true; // If we tried to read() past the EOF, then emit end on the next tick. if (nOrig !== n && state.ended && state.length === 0) endReadable(this); if (ret !== null) this.emit('data', ret); return ret; }; function chunkInvalid(state, chunk) { var er = null; if (!(Buffer.isBuffer(chunk)) && typeof chunk !== 'string' && chunk !== null && chunk !== undefined && !state.objectMode) { er = new TypeError('Invalid non-string/buffer chunk'); } return er; } function onEofChunk(stream, state) { if (state.ended) return; if (state.decoder) { var chunk = state.decoder.end(); if (chunk && chunk.length) { state.buffer.push(chunk); state.length += state.objectMode ? 1 : chunk.length; } } state.ended = true; // emit 'readable' now to make sure it gets picked up. emitReadable(stream); } // Don't emit readable right away in sync mode, because this can trigger // another read() call => stack overflow. This way, it might trigger // a nextTick recursion warning, but that's not so bad. function emitReadable(stream) { var state = stream._readableState; state.needReadable = false; if (!state.emittedReadable) { debug('emitReadable', state.flowing); state.emittedReadable = true; if (state.sync) processNextTick(emitReadable_, stream); else emitReadable_(stream); } } function emitReadable_(stream) { debug('emit readable'); stream.emit('readable'); flow(stream); } // at this point, the user has presumably seen the 'readable' event, // and called read() to consume some data. that may have triggered // in turn another _read(n) call, in which case reading = true if // it's in progress. // However, if we're not ended, or reading, and the length < hwm, // then go ahead and try to read some more preemptively. function maybeReadMore(stream, state) { if (!state.readingMore) { state.readingMore = true; processNextTick(maybeReadMore_, stream, state); } } function maybeReadMore_(stream, state) { var len = state.length; while (!state.reading && !state.flowing && !state.ended && state.length < state.highWaterMark) { debug('maybeReadMore read 0'); stream.read(0); if (len === state.length) // didn't get any data, stop spinning. break; else len = state.length; } state.readingMore = false; } // abstract method. to be overridden in specific implementation classes. // call cb(er, data) where data is <= n in length. // for virtual (non-string, non-buffer) streams, "length" is somewhat // arbitrary, and perhaps not very meaningful. 
Readable.prototype._read = function(n) { this.emit('error', new Error('not implemented')); }; Readable.prototype.pipe = function(dest, pipeOpts) { var src = this; var state = this._readableState; switch (state.pipesCount) { case 0: state.pipes = dest; break; case 1: state.pipes = [state.pipes, dest]; break; default: state.pipes.push(dest); break; } state.pipesCount += 1; debug('pipe count=%d opts=%j', state.pipesCount, pipeOpts); var doEnd = (!pipeOpts || pipeOpts.end !== false) && dest !== process.stdout && dest !== process.stderr; var endFn = doEnd ? onend : cleanup; if (state.endEmitted) processNextTick(endFn); else src.once('end', endFn); dest.on('unpipe', onunpipe); function onunpipe(readable) { debug('onunpipe'); if (readable === src) { cleanup(); } } function onend() { debug('onend'); dest.end(); } // when the dest drains, it reduces the awaitDrain counter // on the source. This would be more elegant with a .once() // handler in flow(), but adding and removing repeatedly is // too slow. var ondrain = pipeOnDrain(src); dest.on('drain', ondrain); function cleanup() { debug('cleanup'); // cleanup event handlers once the pipe is broken dest.removeListener('close', onclose); dest.removeListener('finish', onfinish); dest.removeListener('drain', ondrain); dest.removeListener('error', onerror); dest.removeListener('unpipe', onunpipe); src.removeListener('end', onend); src.removeListener('end', cleanup); src.removeListener('data', ondata); // if the reader is waiting for a drain event from this // specific writer, then it would cause it to never start // flowing again. // So, if this is awaiting a drain, then we just call it now. // If we don't know, then assume that we are waiting for one. if (state.awaitDrain && (!dest._writableState || dest._writableState.needDrain)) ondrain(); } src.on('data', ondata); function ondata(chunk) { debug('ondata'); var ret = dest.write(chunk); if (false === ret) { debug('false write response, pause', src._readableState.awaitDrain); src._readableState.awaitDrain++; src.pause(); } } // if the dest has an error, then stop piping into it. // however, don't suppress the throwing behavior for this. function onerror(er) { debug('onerror', er); unpipe(); dest.removeListener('error', onerror); if (EE.listenerCount(dest, 'error') === 0) dest.emit('error', er); } // This is a brutally ugly hack to make sure that our error handler // is attached before any userland ones. NEVER DO THIS. if (!dest._events || !dest._events.error) dest.on('error', onerror); else if (isArray(dest._events.error)) dest._events.error.unshift(onerror); else dest._events.error = [onerror, dest._events.error]; // Both close and finish should trigger unpipe, but only once. function onclose() { dest.removeListener('finish', onfinish); unpipe(); } dest.once('close', onclose); function onfinish() { debug('onfinish'); dest.removeListener('close', onclose); unpipe(); } dest.once('finish', onfinish); function unpipe() { debug('unpipe'); src.unpipe(dest); } // tell the dest that it's being piped to dest.emit('pipe', src); // start the flow if it hasn't been started already. 
if (!state.flowing) { debug('pipe resume'); src.resume(); } return dest; }; function pipeOnDrain(src) { return function() { var state = src._readableState; debug('pipeOnDrain', state.awaitDrain); if (state.awaitDrain) state.awaitDrain--; if (state.awaitDrain === 0 && EE.listenerCount(src, 'data')) { state.flowing = true; flow(src); } }; } Readable.prototype.unpipe = function(dest) { var state = this._readableState; // if we're not piping anywhere, then do nothing. if (state.pipesCount === 0) return this; // just one destination. most common case. if (state.pipesCount === 1) { // passed in one, but it's not the right one. if (dest && dest !== state.pipes) return this; if (!dest) dest = state.pipes; // got a match. state.pipes = null; state.pipesCount = 0; state.flowing = false; if (dest) dest.emit('unpipe', this); return this; } // slow case. multiple pipe destinations. if (!dest) { // remove all. var dests = state.pipes; var len = state.pipesCount; state.pipes = null; state.pipesCount = 0; state.flowing = false; for (var i = 0; i < len; i++) dests[i].emit('unpipe', this); return this; } // try to find the right one. var i = indexOf(state.pipes, dest); if (i === -1) return this; state.pipes.splice(i, 1); state.pipesCount -= 1; if (state.pipesCount === 1) state.pipes = state.pipes[0]; dest.emit('unpipe', this); return this; }; // set up data events if they are asked for // Ensure readable listeners eventually get something Readable.prototype.on = function(ev, fn) { var res = Stream.prototype.on.call(this, ev, fn); // If listening to data, and it has not explicitly been paused, // then call resume to start the flow of data on the next tick. if (ev === 'data' && false !== this._readableState.flowing) { this.resume(); } if (ev === 'readable' && this.readable) { var state = this._readableState; if (!state.readableListening) { state.readableListening = true; state.emittedReadable = false; state.needReadable = true; if (!state.reading) { processNextTick(nReadingNextTick, this); } else if (state.length) { emitReadable(this, state); } } } return res; }; Readable.prototype.addListener = Readable.prototype.on; function nReadingNextTick(self) { debug('readable nexttick read 0'); self.read(0); } // pause() and resume() are remnants of the legacy readable stream API // If the user uses them, then switch into old mode. Readable.prototype.resume = function() { var state = this._readableState; if (!state.flowing) { debug('resume'); state.flowing = true; resume(this, state); } return this; }; function resume(stream, state) { if (!state.resumeScheduled) { state.resumeScheduled = true; processNextTick(resume_, stream, state); } } function resume_(stream, state) { if (!state.reading) { debug('resume read 0'); stream.read(0); } state.resumeScheduled = false; stream.emit('resume'); flow(stream); if (state.flowing && !state.reading) stream.read(0); } Readable.prototype.pause = function() { debug('call pause flowing=%j', this._readableState.flowing); if (false !== this._readableState.flowing) { debug('pause'); this._readableState.flowing = false; this.emit('pause'); } return this; }; function flow(stream) { var state = stream._readableState; debug('flow', state.flowing); if (state.flowing) { do { var chunk = stream.read(); } while (null !== chunk && state.flowing); } } // wrap an old-style stream as the async data source. // This is *not* part of the readable stream interface. // It is an ugly unfortunate mess of history. 
Readable.prototype.wrap = function(stream) { var state = this._readableState; var paused = false; var self = this; stream.on('end', function() { debug('wrapped end'); if (state.decoder && !state.ended) { var chunk = state.decoder.end(); if (chunk && chunk.length) self.push(chunk); } self.push(null); }); stream.on('data', function(chunk) { debug('wrapped data'); if (state.decoder) chunk = state.decoder.write(chunk); // don't skip over falsy values in objectMode if (state.objectMode && (chunk === null || chunk === undefined)) return; else if (!state.objectMode && (!chunk || !chunk.length)) return; var ret = self.push(chunk); if (!ret) { paused = true; stream.pause(); } }); // proxy all the other methods. // important when wrapping filters and duplexes. for (var i in stream) { if (this[i] === undefined && typeof stream[i] === 'function') { this[i] = function(method) { return function() { return stream[method].apply(stream, arguments); }; }(i); } } // proxy certain important events. var events = ['error', 'close', 'destroy', 'pause', 'resume']; forEach(events, function(ev) { stream.on(ev, self.emit.bind(self, ev)); }); // when we try to consume some more bytes, simply unpause the // underlying stream. self._read = function(n) { debug('wrapped _read', n); if (paused) { paused = false; stream.resume(); } }; return self; }; // exposed for testing purposes only. Readable._fromList = fromList; // Pluck off n bytes from an array of buffers. // Length is the combined lengths of all the buffers in the list. function fromList(n, state) { var list = state.buffer; var length = state.length; var stringMode = !!state.decoder; var objectMode = !!state.objectMode; var ret; // nothing in the list, definitely empty. if (list.length === 0) return null; if (length === 0) ret = null; else if (objectMode) ret = list.shift(); else if (!n || n >= length) { // read it all, truncate the array. if (stringMode) ret = list.join(''); else ret = Buffer.concat(list, length); list.length = 0; } else { // read just some of it. if (n < list[0].length) { // just take a part of the first list item. // slice is the same for buffers and strings. var buf = list[0]; ret = buf.slice(0, n); list[0] = buf.slice(n); } else if (n === list[0].length) { // first list is a perfect match ret = list.shift(); } else { // complex case. // we have enough to cover it, but it spans past the first buffer. if (stringMode) ret = ''; else ret = new Buffer(n); var c = 0; for (var i = 0, l = list.length; i < l && c < n; i++) { var buf = list[0]; var cpy = Math.min(n - c, buf.length); if (stringMode) ret += buf.slice(0, cpy); else buf.copy(ret, c, 0, cpy); if (cpy < buf.length) list[0] = buf.slice(cpy); else list.shift(); c += cpy; } } } return ret; } function endReadable(stream) { var state = stream._readableState; // If we get here before consuming all the bytes, then that is a // bug in node. Should never happen. if (state.length > 0) throw new Error('endReadable called on non-empty stream'); if (!state.endEmitted) { state.ended = true; processNextTick(endReadableNT, state, stream); } } function endReadableNT(state, stream) { // Check that we didn't get one last unshift. 
if (!state.endEmitted && state.length === 0) { state.endEmitted = true; stream.readable = false; stream.emit('end'); } } function forEach (xs, f) { for (var i = 0, l = xs.length; i < l; i++) { f(xs[i], i); } } function indexOf (xs, x) { for (var i = 0, l = xs.length; i < l; i++) { if (xs[i] === x) return i; } return -1; } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/lib/_stream_transform.js0000644000000000000000000001441512631326456027265 0ustar 00000000000000// a transform stream is a readable/writable stream where you do // something with the data. Sometimes it's called a "filter", // but that's not a great name for it, since that implies a thing where // some bits pass through, and others are simply ignored. (That would // be a valid example of a transform, of course.) // // While the output is causally related to the input, it's not a // necessarily symmetric or synchronous transformation. For example, // a zlib stream might take multiple plain-text writes(), and then // emit a single compressed chunk some time in the future. // // Here's how this works: // // The Transform stream has all the aspects of the readable and writable // stream classes. When you write(chunk), that calls _write(chunk,cb) // internally, and returns false if there's a lot of pending writes // buffered up. When you call read(), that calls _read(n) until // there's enough pending readable data buffered up. // // In a transform stream, the written data is placed in a buffer. When // _read(n) is called, it transforms the queued up data, calling the // buffered _write cb's as it consumes chunks. If consuming a single // written chunk would result in multiple output chunks, then the first // outputted bit calls the readcb, and subsequent chunks just go into // the read buffer, and will cause it to emit 'readable' if necessary. // // This way, back-pressure is actually determined by the reading side, // since _read has to be called to start processing a new chunk. However, // a pathological inflate type of transform can cause excessive buffering // here. For example, imagine a stream where every byte of input is // interpreted as an integer from 0-255, and then results in that many // bytes of output. Writing the 4 bytes {ff,ff,ff,ff} would result in // 1kb of data being output. In this case, you could write a very small // amount of input, and end up with a very large amount of output. In // such a pathological inflating mechanism, there'd be no way to tell // the system to stop doing the transform. A single 4MB write could // cause the system to run out of memory. // // However, even in such a pathological case, only a single written chunk // would be consumed, and then the rest would wait (un-transformed) until // the results of the previous transformed chunk were consumed. 
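// As an illustrative sketch (not part of this module), a minimal
// upper-casing transform built on this class could look like:
//
//   var Transform = require('readable-stream').Transform;
//   var util = require('util');
//   util.inherits(Upper, Transform);
//   function Upper(options) { Transform.call(this, options); }
//   Upper.prototype._transform = function(chunk, encoding, cb) {
//     // passing data as the second callback argument pushes it
//     cb(null, chunk.toString().toUpperCase());
//   };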
'use strict'; module.exports = Transform; var Duplex = require('./_stream_duplex'); /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ util.inherits(Transform, Duplex); function TransformState(stream) { this.afterTransform = function(er, data) { return afterTransform(stream, er, data); }; this.needTransform = false; this.transforming = false; this.writecb = null; this.writechunk = null; } function afterTransform(stream, er, data) { var ts = stream._transformState; ts.transforming = false; var cb = ts.writecb; if (!cb) return stream.emit('error', new Error('no writecb in Transform class')); ts.writechunk = null; ts.writecb = null; if (data !== null && data !== undefined) stream.push(data); if (cb) cb(er); var rs = stream._readableState; rs.reading = false; if (rs.needReadable || rs.length < rs.highWaterMark) { stream._read(rs.highWaterMark); } } function Transform(options) { if (!(this instanceof Transform)) return new Transform(options); Duplex.call(this, options); this._transformState = new TransformState(this); // when the writable side finishes, then flush out anything remaining. var stream = this; // start out asking for a readable event once data is transformed. this._readableState.needReadable = true; // we have implemented the _read method, and done the other things // that Readable wants before the first _read call, so unset the // sync guard flag. this._readableState.sync = false; if (options) { if (typeof options.transform === 'function') this._transform = options.transform; if (typeof options.flush === 'function') this._flush = options.flush; } this.once('prefinish', function() { if (typeof this._flush === 'function') this._flush(function(er) { done(stream, er); }); else done(stream); }); } Transform.prototype.push = function(chunk, encoding) { this._transformState.needTransform = false; return Duplex.prototype.push.call(this, chunk, encoding); }; // This is the part where you do stuff! // override this function in implementation classes. // 'chunk' is an input chunk. // // Call `push(newChunk)` to pass along transformed output // to the readable side. You may call 'push' zero or more times. // // Call `cb(err)` when you are done with this chunk. If you pass // an error, then that'll put the hurt on the whole operation. If you // never call cb(), then you'll never get another chunk. Transform.prototype._transform = function(chunk, encoding, cb) { throw new Error('not implemented'); }; Transform.prototype._write = function(chunk, encoding, cb) { var ts = this._transformState; ts.writecb = cb; ts.writechunk = chunk; ts.writeencoding = encoding; if (!ts.transforming) { var rs = this._readableState; if (ts.needTransform || rs.needReadable || rs.length < rs.highWaterMark) this._read(rs.highWaterMark); } }; // Doesn't matter what the args are here. // _transform does all the work. // That we got here means that the readable side wants more data. Transform.prototype._read = function(n) { var ts = this._transformState; if (ts.writechunk !== null && ts.writecb && !ts.transforming) { ts.transforming = true; this._transform(ts.writechunk, ts.writeencoding, ts.afterTransform); } else { // mark that we need a transform, so that any data that comes in // will get processed, now that we've asked for it. 
ts.needTransform = true; } }; function done(stream, er) { if (er) return stream.emit('error', er); // if there's nothing in the write buffer, then that means // that nothing more will ever be provided var ws = stream._writableState; var ts = stream._transformState; if (ws.length) throw new Error('calling transform done when ws.length != 0'); if (ts.transforming) throw new Error('calling transform done when still transforming'); return stream.push(null); } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/lib/_stream_writable.js0000644000000000000000000003243512631326456027065 0ustar 00000000000000// A bit simpler than readable streams. // Implement an async ._write(chunk, cb), and it'll handle all // the drain event emission and buffering. 'use strict'; module.exports = Writable; /**/ var processNextTick = require('process-nextick-args'); /**/ /**/ var Buffer = require('buffer').Buffer; /**/ Writable.WritableState = WritableState; /**/ var util = require('core-util-is'); util.inherits = require('inherits'); /**/ /**/ var Stream; (function (){try{ Stream = require('st' + 'ream'); }catch(_){}finally{ if (!Stream) Stream = require('events').EventEmitter; }}()) /**/ var Buffer = require('buffer').Buffer; util.inherits(Writable, Stream); function nop() {} function WriteReq(chunk, encoding, cb) { this.chunk = chunk; this.encoding = encoding; this.callback = cb; this.next = null; } function WritableState(options, stream) { var Duplex = require('./_stream_duplex'); options = options || {}; // object stream flag to indicate whether or not this stream // contains buffers or objects. this.objectMode = !!options.objectMode; if (stream instanceof Duplex) this.objectMode = this.objectMode || !!options.writableObjectMode; // the point at which write() starts returning false // Note: 0 is a valid value, means that we always return false if // the entire buffer is not flushed immediately on write() var hwm = options.highWaterMark; var defaultHwm = this.objectMode ? 16 : 16 * 1024; this.highWaterMark = (hwm || hwm === 0) ? hwm : defaultHwm; // cast to ints. this.highWaterMark = ~~this.highWaterMark; this.needDrain = false; // at the start of calling end() this.ending = false; // when end() has been called, and returned this.ended = false; // when 'finish' is emitted this.finished = false; // should we decode strings into buffers before passing to _write? // this is here so that some node-core streams can optimize string // handling at a lower level. var noDecode = options.decodeStrings === false; this.decodeStrings = !noDecode; // Crypto is kind of old and crusty. Historically, its default string // encoding is 'binary' so we have to make this configurable. // Everything else in the universe uses 'utf8', though. this.defaultEncoding = options.defaultEncoding || 'utf8'; // not an actual buffer we keep track of, but a measurement // of how much we're waiting to get pushed to some underlying // socket or file. this.length = 0; // a flag to see when we're in the middle of a write. this.writing = false; // when true all writes will be buffered until .uncork() call this.corked = 0; // a flag to be able to tell if the onwrite cb is called immediately, // or on a later tick. We set this to true at first, because any // actions that shouldn't happen until "later" should generally also // not happen before the first write call. 
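// Note: the method this class actually invokes is
// ._write(chunk, encoding, cb). An illustrative subclass sketch
// (not part of this module):
//
//   var Writable = require('readable-stream').Writable;
//   var util = require('util');
//   util.inherits(NullSink, Writable);
//   function NullSink(options) { Writable.call(this, options); }
//   NullSink.prototype._write = function(chunk, encoding, cb) {
//     cb(); // discard the chunk and signal readiness for the next one
//   };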
this.sync = true; // a flag to know if we're processing previously buffered items, which // may call the _write() callback in the same tick, so that we don't // end up in an overlapped onwrite situation. this.bufferProcessing = false; // the callback that's passed to _write(chunk,cb) this.onwrite = function(er) { onwrite(stream, er); }; // the callback that the user supplies to write(chunk,encoding,cb) this.writecb = null; // the amount that is being written when _write is called. this.writelen = 0; this.bufferedRequest = null; this.lastBufferedRequest = null; // number of pending user-supplied write callbacks // this must be 0 before 'finish' can be emitted this.pendingcb = 0; // emit prefinish if the only thing we're waiting for is _write cbs // This is relevant for synchronous Transform streams this.prefinished = false; // True if the error was already emitted and should not be thrown again this.errorEmitted = false; } WritableState.prototype.getBuffer = function writableStateGetBuffer() { var current = this.bufferedRequest; var out = []; while (current) { out.push(current); current = current.next; } return out; }; (function (){try { Object.defineProperty(WritableState.prototype, 'buffer', { get: require('util-deprecate')(function() { return this.getBuffer(); }, '_writableState.buffer is deprecated. Use ' + '_writableState.getBuffer() instead.') }); }catch(_){}}()); function Writable(options) { var Duplex = require('./_stream_duplex'); // Writable ctor is applied to Duplexes, though they're not // instanceof Writable, they're instanceof Readable. if (!(this instanceof Writable) && !(this instanceof Duplex)) return new Writable(options); this._writableState = new WritableState(options, this); // legacy. this.writable = true; if (options) { if (typeof options.write === 'function') this._write = options.write; if (typeof options.writev === 'function') this._writev = options.writev; } Stream.call(this); } // Otherwise people can pipe Writable streams, which is just wrong. Writable.prototype.pipe = function() { this.emit('error', new Error('Cannot pipe. Not readable.')); }; function writeAfterEnd(stream, cb) { var er = new Error('write after end'); // TODO: defer error events consistently everywhere, not just the cb stream.emit('error', er); processNextTick(cb, er); } // If we get something that is not a buffer, string, null, or undefined, // and we're not in objectMode, then that's an error. // Otherwise stream chunks are all considered to be of length=1, and the // watermarks determine how many objects to keep in the buffer, rather than // how many bytes or characters. 
function validChunk(stream, state, chunk, cb) { var valid = true; if (!(Buffer.isBuffer(chunk)) && typeof chunk !== 'string' && chunk !== null && chunk !== undefined && !state.objectMode) { var er = new TypeError('Invalid non-string/buffer chunk'); stream.emit('error', er); processNextTick(cb, er); valid = false; } return valid; } Writable.prototype.write = function(chunk, encoding, cb) { var state = this._writableState; var ret = false; if (typeof encoding === 'function') { cb = encoding; encoding = null; } if (Buffer.isBuffer(chunk)) encoding = 'buffer'; else if (!encoding) encoding = state.defaultEncoding; if (typeof cb !== 'function') cb = nop; if (state.ended) writeAfterEnd(this, cb); else if (validChunk(this, state, chunk, cb)) { state.pendingcb++; ret = writeOrBuffer(this, state, chunk, encoding, cb); } return ret; }; Writable.prototype.cork = function() { var state = this._writableState; state.corked++; }; Writable.prototype.uncork = function() { var state = this._writableState; if (state.corked) { state.corked--; if (!state.writing && !state.corked && !state.finished && !state.bufferProcessing && state.bufferedRequest) clearBuffer(this, state); } }; Writable.prototype.setDefaultEncoding = function setDefaultEncoding(encoding) { // node::ParseEncoding() requires lower case. if (typeof encoding === 'string') encoding = encoding.toLowerCase(); if (!(['hex', 'utf8', 'utf-8', 'ascii', 'binary', 'base64', 'ucs2', 'ucs-2','utf16le', 'utf-16le', 'raw'] .indexOf((encoding + '').toLowerCase()) > -1)) throw new TypeError('Unknown encoding: ' + encoding); this._writableState.defaultEncoding = encoding; }; function decodeChunk(state, chunk, encoding) { if (!state.objectMode && state.decodeStrings !== false && typeof chunk === 'string') { chunk = new Buffer(chunk, encoding); } return chunk; } // if we're already writing something, then just put this // in the queue, and wait our turn. Otherwise, call _write // If we return false, then we need a drain event, so set that flag. function writeOrBuffer(stream, state, chunk, encoding, cb) { chunk = decodeChunk(state, chunk, encoding); if (Buffer.isBuffer(chunk)) encoding = 'buffer'; var len = state.objectMode ? 1 : chunk.length; state.length += len; var ret = state.length < state.highWaterMark; // we must ensure that previous needDrain will not be reset to false. 
if (!ret) state.needDrain = true; if (state.writing || state.corked) { var last = state.lastBufferedRequest; state.lastBufferedRequest = new WriteReq(chunk, encoding, cb); if (last) { last.next = state.lastBufferedRequest; } else { state.bufferedRequest = state.lastBufferedRequest; } } else { doWrite(stream, state, false, len, chunk, encoding, cb); } return ret; } function doWrite(stream, state, writev, len, chunk, encoding, cb) { state.writelen = len; state.writecb = cb; state.writing = true; state.sync = true; if (writev) stream._writev(chunk, state.onwrite); else stream._write(chunk, encoding, state.onwrite); state.sync = false; } function onwriteError(stream, state, sync, er, cb) { --state.pendingcb; if (sync) processNextTick(cb, er); else cb(er); stream._writableState.errorEmitted = true; stream.emit('error', er); } function onwriteStateUpdate(state) { state.writing = false; state.writecb = null; state.length -= state.writelen; state.writelen = 0; } function onwrite(stream, er) { var state = stream._writableState; var sync = state.sync; var cb = state.writecb; onwriteStateUpdate(state); if (er) onwriteError(stream, state, sync, er, cb); else { // Check if we're actually ready to finish, but don't emit yet var finished = needFinish(state); if (!finished && !state.corked && !state.bufferProcessing && state.bufferedRequest) { clearBuffer(stream, state); } if (sync) { processNextTick(afterWrite, stream, state, finished, cb); } else { afterWrite(stream, state, finished, cb); } } } function afterWrite(stream, state, finished, cb) { if (!finished) onwriteDrain(stream, state); state.pendingcb--; cb(); finishMaybe(stream, state); } // Must force callback to be called on nextTick, so that we don't // emit 'drain' before the write() consumer gets the 'false' return // value, and has a chance to attach a 'drain' listener. function onwriteDrain(stream, state) { if (state.length === 0 && state.needDrain) { state.needDrain = false; stream.emit('drain'); } } // if there's something in the buffer waiting, then process it function clearBuffer(stream, state) { state.bufferProcessing = true; var entry = state.bufferedRequest; if (stream._writev && entry && entry.next) { // Fast case, write everything using _writev() var buffer = []; var cbs = []; while (entry) { cbs.push(entry.callback); buffer.push(entry); entry = entry.next; } // count the one we are adding, as well. // TODO(isaacs) clean this up state.pendingcb++; state.lastBufferedRequest = null; doWrite(stream, state, true, state.length, buffer, '', function(err) { for (var i = 0; i < cbs.length; i++) { state.pendingcb--; cbs[i](err); } }); // Clear buffer } else { // Slow case, write chunks one-by-one while (entry) { var chunk = entry.chunk; var encoding = entry.encoding; var cb = entry.callback; var len = state.objectMode ? 1 : chunk.length; doWrite(stream, state, false, len, chunk, encoding, cb); entry = entry.next; // if we didn't call the onwrite immediately, then // it means that we need to wait until it does. // also, that means that the chunk and cb are currently // being processed, so move the buffer counter past them. 
if (state.writing) { break; } } if (entry === null) state.lastBufferedRequest = null; } state.bufferedRequest = entry; state.bufferProcessing = false; } Writable.prototype._write = function(chunk, encoding, cb) { cb(new Error('not implemented')); }; Writable.prototype._writev = null; Writable.prototype.end = function(chunk, encoding, cb) { var state = this._writableState; if (typeof chunk === 'function') { cb = chunk; chunk = null; encoding = null; } else if (typeof encoding === 'function') { cb = encoding; encoding = null; } if (chunk !== null && chunk !== undefined) this.write(chunk, encoding); // .end() fully uncorks if (state.corked) { state.corked = 1; this.uncork(); } // ignore unnecessary end() calls. if (!state.ending && !state.finished) endWritable(this, state, cb); }; function needFinish(state) { return (state.ending && state.length === 0 && state.bufferedRequest === null && !state.finished && !state.writing); } function prefinish(stream, state) { if (!state.prefinished) { state.prefinished = true; stream.emit('prefinish'); } } function finishMaybe(stream, state) { var need = needFinish(state); if (need) { if (state.pendingcb === 0) { prefinish(stream, state); state.finished = true; stream.emit('finish'); } else { prefinish(stream, state); } } return need; } function endWritable(stream, state, cb) { state.ending = true; finishMaybe(stream, state); if (cb) { if (state.finished) processNextTick(cb); else stream.once('finish', cb); } state.ended = true; } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/0000755000000000000000000000000012631326456027420 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/isarray/0000755000000000000000000000000012631326456026556 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/0000755000000000000000000000000012631326456031157 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/0000755000000000000000000000000012631326456030077 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/util-deprecate/0000755000000000000000000000000012631326456030013 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/README.md0000644000000000000000000000010312631326456030671 0ustar 00000000000000# core-util-is The `util.is*` functions introduced in Node v0.12. npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/float.patch0000644000000000000000000003762612631326456031564 0ustar 00000000000000diff --git a/lib/util.js b/lib/util.js index a03e874..9074e8e 100644 --- a/lib/util.js +++ b/lib/util.js @@ -19,430 +19,6 @@ // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. 
-var formatRegExp = /%[sdj%]/g; -exports.format = function(f) { - if (!isString(f)) { - var objects = []; - for (var i = 0; i < arguments.length; i++) { - objects.push(inspect(arguments[i])); - } - return objects.join(' '); - } - - var i = 1; - var args = arguments; - var len = args.length; - var str = String(f).replace(formatRegExp, function(x) { - if (x === '%%') return '%'; - if (i >= len) return x; - switch (x) { - case '%s': return String(args[i++]); - case '%d': return Number(args[i++]); - case '%j': - try { - return JSON.stringify(args[i++]); - } catch (_) { - return '[Circular]'; - } - default: - return x; - } - }); - for (var x = args[i]; i < len; x = args[++i]) { - if (isNull(x) || !isObject(x)) { - str += ' ' + x; - } else { - str += ' ' + inspect(x); - } - } - return str; -}; - - -// Mark that a method should not be used. -// Returns a modified function which warns once by default. -// If --no-deprecation is set, then it is a no-op. -exports.deprecate = function(fn, msg) { - // Allow for deprecating things in the process of starting up. - if (isUndefined(global.process)) { - return function() { - return exports.deprecate(fn, msg).apply(this, arguments); - }; - } - - if (process.noDeprecation === true) { - return fn; - } - - var warned = false; - function deprecated() { - if (!warned) { - if (process.throwDeprecation) { - throw new Error(msg); - } else if (process.traceDeprecation) { - console.trace(msg); - } else { - console.error(msg); - } - warned = true; - } - return fn.apply(this, arguments); - } - - return deprecated; -}; - - -var debugs = {}; -var debugEnviron; -exports.debuglog = function(set) { - if (isUndefined(debugEnviron)) - debugEnviron = process.env.NODE_DEBUG || ''; - set = set.toUpperCase(); - if (!debugs[set]) { - if (new RegExp('\\b' + set + '\\b', 'i').test(debugEnviron)) { - var pid = process.pid; - debugs[set] = function() { - var msg = exports.format.apply(exports, arguments); - console.error('%s %d: %s', set, pid, msg); - }; - } else { - debugs[set] = function() {}; - } - } - return debugs[set]; -}; - - -/** - * Echos the value of a value. Trys to print the value out - * in the best way possible given the different types. - * - * @param {Object} obj The object to print out. - * @param {Object} opts Optional options object that alters the output. - */ -/* legacy: obj, showHidden, depth, colors*/ -function inspect(obj, opts) { - // default options - var ctx = { - seen: [], - stylize: stylizeNoColor - }; - // legacy... - if (arguments.length >= 3) ctx.depth = arguments[2]; - if (arguments.length >= 4) ctx.colors = arguments[3]; - if (isBoolean(opts)) { - // legacy... 
- ctx.showHidden = opts; - } else if (opts) { - // got an "options" object - exports._extend(ctx, opts); - } - // set default options - if (isUndefined(ctx.showHidden)) ctx.showHidden = false; - if (isUndefined(ctx.depth)) ctx.depth = 2; - if (isUndefined(ctx.colors)) ctx.colors = false; - if (isUndefined(ctx.customInspect)) ctx.customInspect = true; - if (ctx.colors) ctx.stylize = stylizeWithColor; - return formatValue(ctx, obj, ctx.depth); -} -exports.inspect = inspect; - - -// http://en.wikipedia.org/wiki/ANSI_escape_code#graphics -inspect.colors = { - 'bold' : [1, 22], - 'italic' : [3, 23], - 'underline' : [4, 24], - 'inverse' : [7, 27], - 'white' : [37, 39], - 'grey' : [90, 39], - 'black' : [30, 39], - 'blue' : [34, 39], - 'cyan' : [36, 39], - 'green' : [32, 39], - 'magenta' : [35, 39], - 'red' : [31, 39], - 'yellow' : [33, 39] -}; - -// Don't use 'blue' not visible on cmd.exe -inspect.styles = { - 'special': 'cyan', - 'number': 'yellow', - 'boolean': 'yellow', - 'undefined': 'grey', - 'null': 'bold', - 'string': 'green', - 'date': 'magenta', - // "name": intentionally not styling - 'regexp': 'red' -}; - - -function stylizeWithColor(str, styleType) { - var style = inspect.styles[styleType]; - - if (style) { - return '\u001b[' + inspect.colors[style][0] + 'm' + str + - '\u001b[' + inspect.colors[style][1] + 'm'; - } else { - return str; - } -} - - -function stylizeNoColor(str, styleType) { - return str; -} - - -function arrayToHash(array) { - var hash = {}; - - array.forEach(function(val, idx) { - hash[val] = true; - }); - - return hash; -} - - -function formatValue(ctx, value, recurseTimes) { - // Provide a hook for user-specified inspect functions. - // Check that value is an object with an inspect function on it - if (ctx.customInspect && - value && - isFunction(value.inspect) && - // Filter out the util module, it's inspect function is special - value.inspect !== exports.inspect && - // Also filter out any prototype objects using the circular check. - !(value.constructor && value.constructor.prototype === value)) { - var ret = value.inspect(recurseTimes, ctx); - if (!isString(ret)) { - ret = formatValue(ctx, ret, recurseTimes); - } - return ret; - } - - // Primitive types cannot have properties - var primitive = formatPrimitive(ctx, value); - if (primitive) { - return primitive; - } - - // Look up the keys of the object. - var keys = Object.keys(value); - var visibleKeys = arrayToHash(keys); - - if (ctx.showHidden) { - keys = Object.getOwnPropertyNames(value); - } - - // Some type of object without properties can be shortcutted. - if (keys.length === 0) { - if (isFunction(value)) { - var name = value.name ? ': ' + value.name : ''; - return ctx.stylize('[Function' + name + ']', 'special'); - } - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } - if (isDate(value)) { - return ctx.stylize(Date.prototype.toString.call(value), 'date'); - } - if (isError(value)) { - return formatError(value); - } - } - - var base = '', array = false, braces = ['{', '}']; - - // Make Array say that they are Array - if (isArray(value)) { - array = true; - braces = ['[', ']']; - } - - // Make functions say that they are functions - if (isFunction(value)) { - var n = value.name ? 
': ' + value.name : ''; - base = ' [Function' + n + ']'; - } - - // Make RegExps say that they are RegExps - if (isRegExp(value)) { - base = ' ' + RegExp.prototype.toString.call(value); - } - - // Make dates with properties first say the date - if (isDate(value)) { - base = ' ' + Date.prototype.toUTCString.call(value); - } - - // Make error with message first say the error - if (isError(value)) { - base = ' ' + formatError(value); - } - - if (keys.length === 0 && (!array || value.length == 0)) { - return braces[0] + base + braces[1]; - } - - if (recurseTimes < 0) { - if (isRegExp(value)) { - return ctx.stylize(RegExp.prototype.toString.call(value), 'regexp'); - } else { - return ctx.stylize('[Object]', 'special'); - } - } - - ctx.seen.push(value); - - var output; - if (array) { - output = formatArray(ctx, value, recurseTimes, visibleKeys, keys); - } else { - output = keys.map(function(key) { - return formatProperty(ctx, value, recurseTimes, visibleKeys, key, array); - }); - } - - ctx.seen.pop(); - - return reduceToSingleString(output, base, braces); -} - - -function formatPrimitive(ctx, value) { - if (isUndefined(value)) - return ctx.stylize('undefined', 'undefined'); - if (isString(value)) { - var simple = '\'' + JSON.stringify(value).replace(/^"|"$/g, '') - .replace(/'/g, "\\'") - .replace(/\\"/g, '"') + '\''; - return ctx.stylize(simple, 'string'); - } - if (isNumber(value)) { - // Format -0 as '-0'. Strict equality won't distinguish 0 from -0, - // so instead we use the fact that 1 / -0 < 0 whereas 1 / 0 > 0 . - if (value === 0 && 1 / value < 0) - return ctx.stylize('-0', 'number'); - return ctx.stylize('' + value, 'number'); - } - if (isBoolean(value)) - return ctx.stylize('' + value, 'boolean'); - // For some reason typeof null is "object", so special case here. 
- if (isNull(value)) - return ctx.stylize('null', 'null'); -} - - -function formatError(value) { - return '[' + Error.prototype.toString.call(value) + ']'; -} - - -function formatArray(ctx, value, recurseTimes, visibleKeys, keys) { - var output = []; - for (var i = 0, l = value.length; i < l; ++i) { - if (hasOwnProperty(value, String(i))) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - String(i), true)); - } else { - output.push(''); - } - } - keys.forEach(function(key) { - if (!key.match(/^\d+$/)) { - output.push(formatProperty(ctx, value, recurseTimes, visibleKeys, - key, true)); - } - }); - return output; -} - - -function formatProperty(ctx, value, recurseTimes, visibleKeys, key, array) { - var name, str, desc; - desc = Object.getOwnPropertyDescriptor(value, key) || { value: value[key] }; - if (desc.get) { - if (desc.set) { - str = ctx.stylize('[Getter/Setter]', 'special'); - } else { - str = ctx.stylize('[Getter]', 'special'); - } - } else { - if (desc.set) { - str = ctx.stylize('[Setter]', 'special'); - } - } - if (!hasOwnProperty(visibleKeys, key)) { - name = '[' + key + ']'; - } - if (!str) { - if (ctx.seen.indexOf(desc.value) < 0) { - if (isNull(recurseTimes)) { - str = formatValue(ctx, desc.value, null); - } else { - str = formatValue(ctx, desc.value, recurseTimes - 1); - } - if (str.indexOf('\n') > -1) { - if (array) { - str = str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n').substr(2); - } else { - str = '\n' + str.split('\n').map(function(line) { - return ' ' + line; - }).join('\n'); - } - } - } else { - str = ctx.stylize('[Circular]', 'special'); - } - } - if (isUndefined(name)) { - if (array && key.match(/^\d+$/)) { - return str; - } - name = JSON.stringify('' + key); - if (name.match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)) { - name = name.substr(1, name.length - 2); - name = ctx.stylize(name, 'name'); - } else { - name = name.replace(/'/g, "\\'") - .replace(/\\"/g, '"') - .replace(/(^"|"$)/g, "'"); - name = ctx.stylize(name, 'string'); - } - } - - return name + ': ' + str; -} - - -function reduceToSingleString(output, base, braces) { - var numLinesEst = 0; - var length = output.reduce(function(prev, cur) { - numLinesEst++; - if (cur.indexOf('\n') >= 0) numLinesEst++; - return prev + cur.replace(/\u001b\[\d\d?m/g, '').length + 1; - }, 0); - - if (length > 60) { - return braces[0] + - (base === '' ? '' : base + '\n ') + - ' ' + - output.join(',\n ') + - ' ' + - braces[1]; - } - - return braces[0] + base + ' ' + output.join(', ') + ' ' + braces[1]; -} - - // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. function isArray(ar) { @@ -522,166 +98,10 @@ function isPrimitive(arg) { exports.isPrimitive = isPrimitive; function isBuffer(arg) { - return arg instanceof Buffer; + return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); -} - - -function pad(n) { - return n < 10 ? 
'0' + n.toString(10) : n.toString(10); -} - - -var months = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', - 'Oct', 'Nov', 'Dec']; - -// 26 Feb 16:19:34 -function timestamp() { - var d = new Date(); - var time = [pad(d.getHours()), - pad(d.getMinutes()), - pad(d.getSeconds())].join(':'); - return [d.getDate(), months[d.getMonth()], time].join(' '); -} - - -// log is just a thin wrapper to console.log that prepends a timestamp -exports.log = function() { - console.log('%s - %s', timestamp(), exports.format.apply(exports, arguments)); -}; - - -/** - * Inherit the prototype methods from one constructor into another. - * - * The Function.prototype.inherits from lang.js rewritten as a standalone - * function (not on Function.prototype). NOTE: If this file is to be loaded - * during bootstrapping this function needs to be rewritten using some native - * functions as prototype setup using normal JavaScript does not work as - * expected during bootstrapping (see mirror.js in r114903). - * - * @param {function} ctor Constructor function which needs to inherit the - * prototype. - * @param {function} superCtor Constructor function to inherit prototype from. - */ -exports.inherits = function(ctor, superCtor) { - ctor.super_ = superCtor; - ctor.prototype = Object.create(superCtor.prototype, { - constructor: { - value: ctor, - enumerable: false, - writable: true, - configurable: true - } - }); -}; - -exports._extend = function(origin, add) { - // Don't do anything if add isn't an object - if (!add || !isObject(add)) return origin; - - var keys = Object.keys(add); - var i = keys.length; - while (i--) { - origin[keys[i]] = add[keys[i]]; - } - return origin; -}; - -function hasOwnProperty(obj, prop) { - return Object.prototype.hasOwnProperty.call(obj, prop); -} - - -// Deprecated old stuff. 
- -exports.p = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - console.error(exports.inspect(arguments[i])); - } -}, 'util.p: Use console.error() instead'); - - -exports.exec = exports.deprecate(function() { - return require('child_process').exec.apply(this, arguments); -}, 'util.exec is now called `child_process.exec`.'); - - -exports.print = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(String(arguments[i])); - } -}, 'util.print: Use console.log instead'); - - -exports.puts = exports.deprecate(function() { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stdout.write(arguments[i] + '\n'); - } -}, 'util.puts: Use console.log instead'); - - -exports.debug = exports.deprecate(function(x) { - process.stderr.write('DEBUG: ' + x + '\n'); -}, 'util.debug: Use console.error instead'); - - -exports.error = exports.deprecate(function(x) { - for (var i = 0, len = arguments.length; i < len; ++i) { - process.stderr.write(arguments[i] + '\n'); - } -}, 'util.error: Use console.error instead'); - - -exports.pump = exports.deprecate(function(readStream, writeStream, callback) { - var callbackCalled = false; - - function call(a, b, c) { - if (callback && !callbackCalled) { - callback(a, b, c); - callbackCalled = true; - } - } - - readStream.addListener('data', function(chunk) { - if (writeStream.write(chunk) === false) readStream.pause(); - }); - - writeStream.addListener('drain', function() { - readStream.resume(); - }); - - readStream.addListener('end', function() { - writeStream.end(); - }); - - readStream.addListener('close', function() { - call(); - }); - - readStream.addListener('error', function(err) { - writeStream.end(); - call(err); - }); - - writeStream.addListener('error', function(err) { - readStream.destroy(); - call(err); - }); -}, 'util.pump(): Use readableStream.pipe() instead'); - - -var uv; -exports._errnoException = function(err, syscall) { - if (isUndefined(uv)) uv = process.binding('uv'); - var errname = uv.errname(err); - var e = new Error(syscall + ' ' + errname); - e.code = errname; - e.errno = errname; - e.syscall = syscall; - return e; -}; +}npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/lib/0000755000000000000000000000000012631326456030166 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/package.json0000644000000000000000000000175012631326456031711 0ustar 00000000000000{ "name": "core-util-is", "version": "1.0.1", "description": "The `util.is*` functions introduced in Node v0.12.", "main": "lib/util.js", "repository": { "type": "git", "url": "git://github.com/isaacs/core-util-is.git" }, "keywords": [ "util", "isBuffer", "isArray", "isNumber", "isString", "isRegExp", "isThis", "isThat", "polyfill" ], "author": { "name": "Isaac Z. 
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "MIT", "bugs": { "url": "https://github.com/isaacs/core-util-is/issues" }, "readme": "# core-util-is\n\nThe `util.is*` functions introduced in Node v0.12.\n", "readmeFilename": "README.md", "homepage": "https://github.com/isaacs/core-util-is#readme", "_id": "core-util-is@1.0.1", "_shasum": "6b07085aef9a3ccac6ee53bf9d3df0c1521a5538", "_resolved": "https://registry.npmjs.org/core-util-is/-/core-util-is-1.0.1.tgz", "_from": "core-util-is@>=1.0.0 <1.1.0" } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/util.js0000644000000000000000000000570412631326456030741 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
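// Editor's illustration (not in the original source): `instanceof` can be
// fooled with a bare prototype, which is why the checks below compare the
// [[Class]] tag via Object.prototype.toString instead, e.g.
//
//   var fake = Object.create(Date.prototype);
//   fake instanceof Date;                    // true -- misleading
//   Object.prototype.toString.call(fake);    // '[object Object]'
//   // so isDate(fake) below correctly returns false
//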
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && objectToString(e) === '[object Error]'; } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return arg instanceof Buffer; } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/core-util-is/lib/util.js0000644000000000000000000000574012631326456031507 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. // NOTE: These type checking functions intentionally don't use `instanceof` // because it is fragile and can be easily faked with `Object.create()`. 
function isArray(ar) { return Array.isArray(ar); } exports.isArray = isArray; function isBoolean(arg) { return typeof arg === 'boolean'; } exports.isBoolean = isBoolean; function isNull(arg) { return arg === null; } exports.isNull = isNull; function isNullOrUndefined(arg) { return arg == null; } exports.isNullOrUndefined = isNullOrUndefined; function isNumber(arg) { return typeof arg === 'number'; } exports.isNumber = isNumber; function isString(arg) { return typeof arg === 'string'; } exports.isString = isString; function isSymbol(arg) { return typeof arg === 'symbol'; } exports.isSymbol = isSymbol; function isUndefined(arg) { return arg === void 0; } exports.isUndefined = isUndefined; function isRegExp(re) { return isObject(re) && objectToString(re) === '[object RegExp]'; } exports.isRegExp = isRegExp; function isObject(arg) { return typeof arg === 'object' && arg !== null; } exports.isObject = isObject; function isDate(d) { return isObject(d) && objectToString(d) === '[object Date]'; } exports.isDate = isDate; function isError(e) { return isObject(e) && (objectToString(e) === '[object Error]' || e instanceof Error); } exports.isError = isError; function isFunction(arg) { return typeof arg === 'function'; } exports.isFunction = isFunction; function isPrimitive(arg) { return arg === null || typeof arg === 'boolean' || typeof arg === 'number' || typeof arg === 'string' || typeof arg === 'symbol' || // ES6 symbol typeof arg === 'undefined'; } exports.isPrimitive = isPrimitive; function isBuffer(arg) { return Buffer.isBuffer(arg); } exports.isBuffer = isBuffer; function objectToString(o) { return Object.prototype.toString.call(o); }npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/isarray/README.md0000644000000000000000000000302512631326456030035 0ustar 00000000000000 # isarray `Array#isArray` for older browsers. ## Usage ```js var isArray = require('isarray'); console.log(isArray([])); // => true console.log(isArray({})); // => false ``` ## Installation With [npm](http://npmjs.org) do ```bash $ npm install isarray ``` Then bundle for the browser with [browserify](https://github.com/substack/browserify). With [component](http://component.io) do ```bash $ component install juliangruber/isarray ``` ## License (MIT) Copyright (c) 2013 Julian Gruber <julian@juliangruber.com> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
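An editor's aside on the fallback in `index.js` below (a sketch, not part of the package): the `Object.prototype.toString` check is used because it recognizes any true array, including one created in another frame or realm, where `arr instanceof Array` can fail.

```js
var isArray = require('isarray');

Object.prototype.toString.call([1, 2, 3]); // => '[object Array]'
isArray([1, 2, 3]);                        // => true
isArray({ length: 3 });                    // => false (array-like, not an array)
```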
npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/isarray/build/0000755000000000000000000000000012631326456027655 5ustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/isarray/component.json0000644000000000000000000000072612631326456031460 0ustar 00000000000000{ "name" : "isarray", "description" : "Array#isArray for older browsers", "version" : "0.0.1", "repository" : "juliangruber/isarray", "homepage": "https://github.com/juliangruber/isarray", "main" : "index.js", "scripts" : [ "index.js" ], "dependencies" : {}, "keywords": ["browser","isarray","array"], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT" } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/isarray/index.js0000644000000000000000000000017012631326456030221 0ustar 00000000000000module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/isarray/package.json0000644000000000000000000000472712631326456031056 0ustar 00000000000000{ "name": "isarray", "description": "Array#isArray for older browsers", "version": "0.0.1", "repository": { "type": "git", "url": "git://github.com/juliangruber/isarray.git" }, "homepage": "https://github.com/juliangruber/isarray", "main": "index.js", "scripts": { "test": "tap test/*.js" }, "dependencies": {}, "devDependencies": { "tap": "*" }, "keywords": [ "browser", "isarray", "array" ], "author": { "name": "Julian Gruber", "email": "mail@juliangruber.com", "url": "http://juliangruber.com" }, "license": "MIT", "readme": "\n# isarray\n\n`Array#isArray` for older browsers.\n\n## Usage\n\n```js\nvar isArray = require('isarray');\n\nconsole.log(isArray([])); // => true\nconsole.log(isArray({})); // => false\n```\n\n## Installation\n\nWith [npm](http://npmjs.org) do\n\n```bash\n$ npm install isarray\n```\n\nThen bundle for the browser with\n[browserify](https://github.com/substack/browserify).\n\nWith [component](http://component.io) do\n\n```bash\n$ component install juliangruber/isarray\n```\n\n## License\n\n(MIT)\n\nCopyright (c) 2013 Julian Gruber <julian@juliangruber.com>\n\nPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \"Software\"), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/juliangruber/isarray/issues" }, "_id": "isarray@0.0.1", "_shasum": "8a18acfca9a8f4177e09abfc6038939b05d1eedf", "_resolved": "https://registry.npmjs.org/isarray/-/isarray-0.0.1.tgz", "_from": "isarray@0.0.1" } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/isarray/build/build.js0000644000000000000000000000777112631326456031326 0ustar 00000000000000 /** * Require the given path. * * @param {String} path * @return {Object} exports * @api public */ function require(path, parent, orig) { var resolved = require.resolve(path); // lookup failed if (null == resolved) { orig = orig || path; parent = parent || 'root'; var err = new Error('Failed to require "' + orig + '" from "' + parent + '"'); err.path = orig; err.parent = parent; err.require = true; throw err; } var module = require.modules[resolved]; // perform real require() // by invoking the module's // registered function if (!module.exports) { module.exports = {}; module.client = module.component = true; module.call(this, module.exports, require.relative(resolved), module); } return module.exports; } /** * Registered modules. */ require.modules = {}; /** * Registered aliases. */ require.aliases = {}; /** * Resolve `path`. * * Lookup: * * - PATH/index.js * - PATH.js * - PATH * * @param {String} path * @return {String} path or null * @api private */ require.resolve = function(path) { if (path.charAt(0) === '/') path = path.slice(1); var index = path + '/index.js'; var paths = [ path, path + '.js', path + '.json', path + '/index.js', path + '/index.json' ]; for (var i = 0; i < paths.length; i++) { var path = paths[i]; if (require.modules.hasOwnProperty(path)) return path; } if (require.aliases.hasOwnProperty(index)) { return require.aliases[index]; } }; /** * Normalize `path` relative to the current path. * * @param {String} curr * @param {String} path * @return {String} * @api private */ require.normalize = function(curr, path) { var segs = []; if ('.' != path.charAt(0)) return path; curr = curr.split('/'); path = path.split('/'); for (var i = 0; i < path.length; ++i) { if ('..' == path[i]) { curr.pop(); } else if ('.' != path[i] && '' != path[i]) { segs.push(path[i]); } } return curr.concat(segs).join('/'); }; /** * Register module at `path` with callback `definition`. * * @param {String} path * @param {Function} definition * @api private */ require.register = function(path, definition) { require.modules[path] = definition; }; /** * Alias a module definition. * * @param {String} from * @param {String} to * @api private */ require.alias = function(from, to) { if (!require.modules.hasOwnProperty(from)) { throw new Error('Failed to alias "' + from + '", it does not exist'); } require.aliases[to] = from; }; /** * Return a require function relative to the `parent` path. * * @param {String} parent * @return {Function} * @api private */ require.relative = function(parent) { var p = require.normalize(parent, '..'); /** * lastIndexOf helper. */ function lastIndexOf(arr, obj) { var i = arr.length; while (i--) { if (arr[i] === obj) return i; } return -1; } /** * The relative require() itself. 
*/ function localRequire(path) { var resolved = localRequire.resolve(path); return require(resolved, parent, path); } /** * Resolve relative to the parent. */ localRequire.resolve = function(path) { var c = path.charAt(0); if ('/' == c) return path.slice(1); if ('.' == c) return require.normalize(p, path); // resolve deps by returning // the dep in the nearest "deps" // directory var segs = parent.split('/'); var i = lastIndexOf(segs, 'deps') + 1; if (!i) i = 0; path = segs.slice(0, i + 1).join('/') + '/deps/' + path; return path; }; /** * Check if module is defined at `path`. */ localRequire.exists = function(path) { return require.modules.hasOwnProperty(localRequire.resolve(path)); }; return localRequire; }; require.register("isarray/index.js", function(exports, require, module){ module.exports = Array.isArray || function (arr) { return Object.prototype.toString.call(arr) == '[object Array]'; }; }); require.alias("isarray/index.js", "isarray/index.js"); ././@LongLink0000000000000000000000000000015300000000000011214 Lustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/.travis.ymlnpm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/.trav0000644000000000000000000000012112631326456032126 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10" - "0.11" - "0.12" - "iojs" ././@LongLink0000000000000000000000000000015000000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/index.jsnpm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/index0000644000000000000000000000040712631326456032212 0ustar 00000000000000'use strict'; module.exports = nextTick; function nextTick(fn) { var args = new Array(arguments.length - 1); var i = 0; while (i < args.length) { args[i++] = arguments[i]; } process.nextTick(function afterTick() { fn.apply(null, args); }); } ././@LongLink0000000000000000000000000000015200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/license.mdnpm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/licen0000644000000000000000000000205012631326456032171 0ustar 00000000000000# Copyright (c) 2015 Calvin Metcalf Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. **THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.** ././@LongLink0000000000000000000000000000015400000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/package.jsonnpm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/packa0000644000000000000000000000244012631326456032161 0ustar 00000000000000{ "name": "process-nextick-args", "version": "1.0.3", "description": "process.nextTick but always with args", "main": "index.js", "scripts": { "test": "node test.js" }, "repository": { "type": "git", "url": "git+https://github.com/calvinmetcalf/process-nextick-args.git" }, "author": "", "license": "MIT", "bugs": { "url": "https://github.com/calvinmetcalf/process-nextick-args/issues" }, "homepage": "https://github.com/calvinmetcalf/process-nextick-args", "devDependencies": { "tap": "~0.2.6" }, "readme": "process-nextick-args\n=====\n\n[![Build Status](https://travis-ci.org/calvinmetcalf/process-nextick-args.svg?branch=master)](https://travis-ci.org/calvinmetcalf/process-nextick-args)\n\n```bash\nnpm install --save process-nextick-args\n```\n\nAlways be able to pass arguments to process.nextTick, no matter the platform\n\n```js\nvar nextTick = require('process-nextick-args');\n\nnextTick(function (a, b, c) {\n console.log(a, b, c);\n}, 'step', 3, 'profit');\n```\n", "readmeFilename": "readme.md", "_id": "process-nextick-args@1.0.3", "_shasum": "e272eed825d5e9f4ea74d8d73b1fe311c3beb630", "_resolved": "https://registry.npmjs.org/process-nextick-args/-/process-nextick-args-1.0.3.tgz", "_from": "process-nextick-args@>=1.0.0 <1.1.0" } ././@LongLink0000000000000000000000000000015100000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/readme.mdnpm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/readm0000644000000000000000000000070312631326456032172 0ustar 00000000000000process-nextick-args ===== [![Build Status](https://travis-ci.org/calvinmetcalf/process-nextick-args.svg?branch=master)](https://travis-ci.org/calvinmetcalf/process-nextick-args) ```bash npm install --save process-nextick-args ``` Always be able to pass arguments to process.nextTick, no matter the platform ```js var nextTick = require('process-nextick-args'); nextTick(function (a, b, c) { console.log(a, b, c); }, 'step', 3, 'profit'); ``` ././@LongLink0000000000000000000000000000014700000000000011217 Lustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/test.jsnpm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/process-nextick-args/test.0000644000000000000000000000101612631326456032135 0ustar 00000000000000var test = require("tap").test; var nextTick = require('./'); test('should work', function (t) { t.plan(5); nextTick(function (a) { t.ok(a); nextTick(function (thing) { t.equals(thing, 7); }, 7); }, true); nextTick(function (a, b, c) { t.equals(a, 'step'); t.equals(b, 3); t.equals(c, 'profit'); }, 'step', 3, 'profit'); }); test('correct number of arguments', function (t) { t.plan(1); nextTick(function () { t.equals(2, arguments.length, 'correct number'); }, 1, 2); }); 
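The argument-copying loop in `index.js` above is terse; here is an editor's walkthrough of the same idiom (an illustration, not shipped code). The left-hand index of `args[i++] = arguments[i]` is read before the increment and the right-hand index after it, so the loop copies `arguments[1..n]` into `args[0..n-1]`, skipping the callback.

```js
function demo(fn /*, ...args */) {
  var args = new Array(arguments.length - 1);
  var i = 0;
  while (i < args.length) {
    args[i++] = arguments[i]; // first pass: args[0] = arguments[1]
  }
  return args;
}
// demo(console.log, 'a', 'b') => ['a', 'b']
```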
npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/.npmignore0000644000000000000000000000001312631326456032070 0ustar 00000000000000build test npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/LICENSE0000644000000000000000000000206412631326456031106 0ustar 00000000000000Copyright Joyent, Inc. and other Node contributors. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/README.md0000644000000000000000000000076212631326456031363 0ustar 00000000000000**string_decoder.js** (`require('string_decoder')`) from Node.js core Copyright Joyent, Inc. and other Node contributors. See LICENCE file for details. Version numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. **Prefer the stable version over the unstable.** The *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/index.js0000644000000000000000000001716412631326456031555 0ustar 00000000000000// Copyright Joyent, Inc. and other Node contributors. // // Permission is hereby granted, free of charge, to any person obtaining a // copy of this software and associated documentation files (the // "Software"), to deal in the Software without restriction, including // without limitation the rights to use, copy, modify, merge, publish, // distribute, sublicense, and/or sell copies of the Software, and to permit // persons to whom the Software is furnished to do so, subject to the // following conditions: // // The above copyright notice and this permission notice shall be included // in all copies or substantial portions of the Software. // // THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS // OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF // MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN // NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, // DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR // OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE // USE OR OTHER DEALINGS IN THE SOFTWARE. 
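// Editor's sketch of typical usage (illustration only, not upstream code):
//
//   var StringDecoder = require('string_decoder').StringDecoder;
//   var decoder = new StringDecoder('utf8');
//   var euro = new Buffer([0xe2, 0x82, 0xac]);  // the euro sign is 3 bytes in UTF-8
//   decoder.write(euro.slice(0, 1));            // ''  -- incomplete, buffered
//   decoder.write(euro.slice(1));               // '€' -- character completed
//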
var Buffer = require('buffer').Buffer; var isBufferEncoding = Buffer.isEncoding || function(encoding) { switch (encoding && encoding.toLowerCase()) { case 'hex': case 'utf8': case 'utf-8': case 'ascii': case 'binary': case 'base64': case 'ucs2': case 'ucs-2': case 'utf16le': case 'utf-16le': case 'raw': return true; default: return false; } } function assertEncoding(encoding) { if (encoding && !isBufferEncoding(encoding)) { throw new Error('Unknown encoding: ' + encoding); } } // StringDecoder provides an interface for efficiently splitting a series of // buffers into a series of JS strings without breaking apart multi-byte // characters. CESU-8 is handled as part of the UTF-8 encoding. // // @TODO Handling all encodings inside a single object makes it very difficult // to reason about this code, so it should be split up in the future. // @TODO There should be a utf8-strict encoding that rejects invalid UTF-8 code // points as used by CESU-8. var StringDecoder = exports.StringDecoder = function(encoding) { this.encoding = (encoding || 'utf8').toLowerCase().replace(/[-_]/, ''); assertEncoding(encoding); switch (this.encoding) { case 'utf8': // CESU-8 represents each of Surrogate Pair by 3-bytes this.surrogateSize = 3; break; case 'ucs2': case 'utf16le': // UTF-16 represents each of Surrogate Pair by 2-bytes this.surrogateSize = 2; this.detectIncompleteChar = utf16DetectIncompleteChar; break; case 'base64': // Base-64 stores 3 bytes in 4 chars, and pads the remainder. this.surrogateSize = 3; this.detectIncompleteChar = base64DetectIncompleteChar; break; default: this.write = passThroughWrite; return; } // Enough space to store all bytes of a single character. UTF-8 needs 4 // bytes, but CESU-8 may require up to 6 (3 bytes per surrogate). this.charBuffer = new Buffer(6); // Number of bytes received for the current incomplete multi-byte character. this.charReceived = 0; // Number of bytes expected for the current incomplete multi-byte character. this.charLength = 0; }; // write decodes the given buffer and returns it as JS string that is // guaranteed to not contain any partial multi-byte characters. Any partial // character found at the end of the buffer is buffered up, and will be // returned when calling write again with the remaining bytes. // // Note: Converting a Buffer containing an orphan surrogate to a String // currently works, but converting a String to a Buffer (via `new Buffer`, or // Buffer#write) will replace incomplete surrogates with the unicode // replacement character. See https://codereview.chromium.org/121173009/ . StringDecoder.prototype.write = function(buffer) { var charStr = ''; // if our last write ended with an incomplete multibyte character while (this.charLength) { // determine how many remaining bytes this buffer has to offer for this char var available = (buffer.length >= this.charLength - this.charReceived) ? this.charLength - this.charReceived : buffer.length; // add the new bytes to the char buffer buffer.copy(this.charBuffer, this.charReceived, 0, available); this.charReceived += available; if (this.charReceived < this.charLength) { // still not enough chars in this buffer? wait for more ... 
return ''; } // remove bytes belonging to the current character from the buffer buffer = buffer.slice(available, buffer.length); // get the character that was split charStr = this.charBuffer.slice(0, this.charLength).toString(this.encoding); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character var charCode = charStr.charCodeAt(charStr.length - 1); if (charCode >= 0xD800 && charCode <= 0xDBFF) { this.charLength += this.surrogateSize; charStr = ''; continue; } this.charReceived = this.charLength = 0; // if there are no more bytes in this buffer, just emit our char if (buffer.length === 0) { return charStr; } break; } // determine and set charLength / charReceived this.detectIncompleteChar(buffer); var end = buffer.length; if (this.charLength) { // buffer the incomplete character bytes we got buffer.copy(this.charBuffer, 0, buffer.length - this.charReceived, end); end -= this.charReceived; } charStr += buffer.toString(this.encoding, 0, end); var end = charStr.length - 1; var charCode = charStr.charCodeAt(end); // CESU-8: lead surrogate (D800-DBFF) is also the incomplete character if (charCode >= 0xD800 && charCode <= 0xDBFF) { var size = this.surrogateSize; this.charLength += size; this.charReceived += size; this.charBuffer.copy(this.charBuffer, size, 0, size); buffer.copy(this.charBuffer, 0, 0, size); return charStr.substring(0, end); } // or just emit the charStr return charStr; }; // detectIncompleteChar determines if there is an incomplete UTF-8 character at // the end of the given buffer. If so, it sets this.charLength to the byte // length that character, and sets this.charReceived to the number of bytes // that are available for this character. StringDecoder.prototype.detectIncompleteChar = function(buffer) { // determine how many bytes we have to check at the end of this buffer var i = (buffer.length >= 3) ? 3 : buffer.length; // Figure out if one of the last i bytes of our buffer announces an // incomplete char. for (; i > 0; i--) { var c = buffer[buffer.length - i]; // See http://en.wikipedia.org/wiki/UTF-8#Description // 110XXXXX if (i == 1 && c >> 5 == 0x06) { this.charLength = 2; break; } // 1110XXXX if (i <= 2 && c >> 4 == 0x0E) { this.charLength = 3; break; } // 11110XXX if (i <= 3 && c >> 3 == 0x1E) { this.charLength = 4; break; } } this.charReceived = i; }; StringDecoder.prototype.end = function(buffer) { var res = ''; if (buffer && buffer.length) res = this.write(buffer); if (this.charReceived) { var cr = this.charReceived; var buf = this.charBuffer; var enc = this.encoding; res += buf.slice(0, cr).toString(enc); } return res; }; function passThroughWrite(buffer) { return buffer.toString(this.encoding); } function utf16DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 2; this.charLength = this.charReceived ? 2 : 0; } function base64DetectIncompleteChar(buffer) { this.charReceived = buffer.length % 3; this.charLength = this.charReceived ? 
3 : 0; } ././@LongLink0000000000000000000000000000014600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/package.jsonnpm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/string_decoder/package.jso0000644000000000000000000000254012631326456032210 0ustar 00000000000000{ "name": "string_decoder", "version": "0.10.31", "description": "The string_decoder module from Node core", "main": "index.js", "dependencies": {}, "devDependencies": { "tap": "~0.4.8" }, "scripts": { "test": "tap test/simple/*.js" }, "repository": { "type": "git", "url": "git://github.com/rvagg/string_decoder.git" }, "homepage": "https://github.com/rvagg/string_decoder", "keywords": [ "string", "decoder", "browser", "browserify" ], "license": "MIT", "readme": "**string_decoder.js** (`require('string_decoder')`) from Node.js core\n\nCopyright Joyent, Inc. and other Node contributors. See LICENCE file for details.\n\nVersion numbers match the versions found in Node core, e.g. 0.10.24 matches Node 0.10.24, likewise 0.11.10 matches Node 0.11.10. **Prefer the stable version over the unstable.**\n\nThe *build/* directory contains a build script that will scrape the source from the [joyent/node](https://github.com/joyent/node) repo given a specific Node version.", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/rvagg/string_decoder/issues" }, "_id": "string_decoder@0.10.31", "_shasum": "62e203bc41766c6c28c9fc84301dab1c5310fa94", "_resolved": "https://registry.npmjs.org/string_decoder/-/string_decoder-0.10.31.tgz", "_from": "string_decoder@>=0.10.0 <0.11.0" } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/util-deprecate/History.md0000644000000000000000000000043212631326456031775 0ustar 00000000000000 1.0.2 / 2015-10-07 ================== * use try/catch when checking `localStorage` (#3, @kumavis) 1.0.1 / 2014-11-25 ================== * browser: use `console.warn()` for deprecation calls * browser: more jsdocs 1.0.0 / 2014-04-30 ================== * initial commit npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/util-deprecate/LICENSE0000644000000000000000000000211612631326456031020 0ustar 00000000000000(The MIT License) Copyright (c) 2014 Nathan Rajlich Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/util-deprecate/README.md0000644000000000000000000000320212631326456031267 0ustar 00000000000000util-deprecate ============== ### The Node.js `util.deprecate()` function with browser support In Node.js, this module simply re-exports the `util.deprecate()` function. In the web browser (i.e. via browserify), a browser-specific implementation of the `util.deprecate()` function is used. ## API A `deprecate()` function is the only thing exposed by this module. ``` javascript // setup: exports.foo = deprecate(foo, 'foo() is deprecated, use bar() instead'); // users see: foo(); // foo() is deprecated, use bar() instead foo(); foo(); ``` ## License (The MIT License) Copyright (c) 2014 Nathan Rajlich Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/util-deprecate/browser.js0000644000000000000000000000311612631326456032035 0ustar 00000000000000 /** * Module exports. */ module.exports = deprecate; /** * Mark that a method should not be used. * Returns a modified function which warns once by default. * * If `localStorage.noDeprecation = true` is set, then it is a no-op. * * If `localStorage.throwDeprecation = true` is set, then deprecated functions * will throw an Error when invoked. * * If `localStorage.traceDeprecation = true` is set, then deprecated functions * will invoke `console.trace()` instead of `console.error()`. * * @param {Function} fn - the function to deprecate * @param {String} msg - the string to print to the console when `fn` is invoked * @returns {Function} a new "deprecated" version of `fn` * @api public */ function deprecate (fn, msg) { if (config('noDeprecation')) { return fn; } var warned = false; function deprecated() { if (!warned) { if (config('throwDeprecation')) { throw new Error(msg); } else if (config('traceDeprecation')) { console.trace(msg); } else { console.warn(msg); } warned = true; } return fn.apply(this, arguments); } return deprecated; } /** * Checks `localStorage` for boolean values for the given `name`. 
* * @param {String} name * @returns {Boolean} * @api private */ function config (name) { // accessing global.localStorage can trigger a DOMException in sandboxed iframes try { if (!global.localStorage) return false; } catch (_) { return false; } var val = global.localStorage[name]; if (null == val) return false; return String(val).toLowerCase() === 'true'; } npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/util-deprecate/node.js0000644000000000000000000000017312631326456031277 0ustar 00000000000000 /** * For Node.js, simply re-export the core `util.deprecate` function. */ module.exports = require('util').deprecate; ././@LongLink0000000000000000000000000000014600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/util-deprecate/package.jsonnpm_3.5.2.orig/node_modules/sha/node_modules/readable-stream/node_modules/util-deprecate/package.jso0000644000000000000000000000264212631326456032127 0ustar 00000000000000{ "name": "util-deprecate", "version": "1.0.2", "description": "The Node.js `util.deprecate()` function with browser support", "main": "node.js", "browser": "browser.js", "scripts": { "test": "echo \"Error: no test specified\" && exit 1" }, "repository": { "type": "git", "url": "git://github.com/TooTallNate/util-deprecate.git" }, "keywords": [ "util", "deprecate", "browserify", "browser", "node" ], "author": { "name": "Nathan Rajlich", "email": "nathan@tootallnate.net", "url": "http://n8.io/" }, "license": "MIT", "bugs": { "url": "https://github.com/TooTallNate/util-deprecate/issues" }, "homepage": "https://github.com/TooTallNate/util-deprecate", "gitHead": "475fb6857cd23fafff20c1be846c1350abf8e6d4", "_id": "util-deprecate@1.0.2", "_shasum": "450d4dc9fa70de732762fbd2d4a28981419a0ccf", "_from": "util-deprecate@>=1.0.1 <1.1.0", "_npmVersion": "2.14.4", "_nodeVersion": "4.1.2", "_npmUser": { "name": "tootallnate", "email": "nathan@tootallnate.net" }, "maintainers": [ { "name": "tootallnate", "email": "nathan@tootallnate.net" } ], "dist": { "shasum": "450d4dc9fa70de732762fbd2d4a28981419a0ccf", "tarball": "http://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/util-deprecate/-/util-deprecate-1.0.2.tgz" } npm_3.5.2.orig/node_modules/slide/LICENSE0000644000000000000000000000135412631326456016257 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/slide/README.md0000644000000000000000000000765512631326456016543 0ustar 00000000000000# Controlling Flow: callbacks are easy ## What's actually hard? - Doing a bunch of things in a specific order. - Knowing when stuff is done. - Handling failures. 
- Breaking up functionality into parts (avoid nested inline callbacks)

## Common Mistakes

- Abandoning convention and consistency.
- Putting all callbacks inline.
- Using libraries without grokking them.
- Trying to make async code look sync.

## Define Conventions

- Two kinds of functions: *actors* take action, *callbacks* get results.
- Essentially the continuation pattern. Resulting code *looks* similar to fibers, but is *much* simpler to implement.
- Node works this way in the low-level APIs already, and it's very flexible.

## Callbacks

- Simple responders.
- Must always be prepared to handle errors; that's why it's the first argument.
- Often inline anonymous, but not always.
- Can trap and call other callbacks with modified data, or pass errors upwards.

## Actors

- Last argument is a callback.
- If any error occurs, and can't be handled, pass it to the callback and return.
- Must not throw. Return value ignored.
- return x ==> return cb(null, x)
- throw er ==> return cb(er)

```javascript
// return true if a path is either
// a symlink or a directory.
function isLinkOrDir (path, cb) {
  fs.lstat(path, function (er, s) {
    if (er) return cb(er)
    return cb(null, s.isDirectory() || s.isSymbolicLink())
  })
}
```

# asyncMap

## Use cases

- I have a list of 10 files, and need to read all of them, and then continue when they're all done.
- I have a dozen URLs, and need to fetch them all, and then continue when they're all done.
- I have 4 connected users, and need to send a message to all of them, and then continue when that's done.
- I have a list of n things, and I need to do something with all of them, in parallel, and get the results once they're all complete.

## Solution

```javascript
var asyncMap = require("slide").asyncMap

function writeFiles (files, what, cb) {
  asyncMap(files, function (f, cb) {
    fs.writeFile(f, what, cb)
  }, cb)
}

writeFiles([my, file, list], "foo", cb)
```

# chain

## Use cases

- I have to do a bunch of things, in order. Get db credentials out of a file, read the data from the db, write that data to another file.
- If anything fails, do not continue.
- I still have to provide an array of functions, which is a lot of boilerplate, and a pita if your functions take args like

```javascript
function (cb) { blah(a, b, c, cb) }
```

- Results are discarded, which is a bit lame.
- No way to branch.

## Solution

- Reduces boilerplate by converting an array of [fn, args] to an actor that takes no arguments (except cb).
- A bit like Function#bind, but tailored for our use-case (see the sketch after this list).
- bindActor(obj, "method", a, b, c)
- bindActor(fn, a, b, c)
- bindActor(obj, fn, a, b, c)
- Branching, skipping over falsey arguments:

```javascript
chain([ doThing && [thing, a, b, c]
      , isFoo && [doFoo, "foo"]
      , subChain && [chain, [one, two]]
      ], cb)
```

- Tracking results: results are stored in an optional array passed as argument; the last result is always in results[results.length - 1].
- Treat chain.first and chain.last as placeholders for the first/last result up until that point.
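To make the bindActor shapes listed above concrete, here is a small sketch (the `db` object and `log` function are hypothetical stand-ins, not part of slide):

```javascript
var bindActor = require("slide").bindActor

// hypothetical stand-ins, purely for illustration:
var db = { open: function (file, cb) { cb(null, "handle:" + file) }
         , close: function (cb) { cb(null) } }
function log (msg, cb) { console.log(msg); cb(null) }

var openDb  = bindActor(db, "open", "/tmp/my.db") // obj + method name
var sayHi   = bindActor(log, "starting up")       // bare function
var closeDb = bindActor(db, db.close)             // obj + function

// each is now an actor that takes only a callback:
openDb(function (er, handle) { console.log(er, handle) })
```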
## Non-trivial example - Read number files in a directory - Add the results together - Ping a web service with the result - Write the response to a file - Delete the number files ```javascript var chain = require("slide").chain function myProgram (cb) { var res = [], last = chain.last, first = chain.first chain([ [fs, "readdir", "the-directory"] , [readFiles, "the-directory", last] , [sum, last] , [ping, "POST", "example.com", 80, "/foo", last] , [fs, "writeFile", "result.txt", last] , [rmFiles, "./the-directory", first] ], res, cb) } ``` # Conclusion: Convention Profits - Consistent API from top to bottom. - Sneak in at any point to inject functionality. Testable, reusable, ... - When ruby and python users whine, you can smile condescendingly. npm_3.5.2.orig/node_modules/slide/index.js0000644000000000000000000000004612631326456016714 0ustar 00000000000000module.exports=require("./lib/slide") npm_3.5.2.orig/node_modules/slide/lib/0000755000000000000000000000000012631326456016015 5ustar 00000000000000npm_3.5.2.orig/node_modules/slide/package.json0000644000000000000000000000263012631326456017536 0ustar 00000000000000{ "name": "slide", "version": "1.1.6", "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "contributors": [ { "name": "S. Sriram", "email": "ssriram@gmail.com", "url": "http://www.565labs.com" } ], "description": "A flow control lib small enough to fit on in a slide presentation. Derived live at Oak.JS", "main": "./lib/slide.js", "dependencies": {}, "devDependencies": {}, "engines": { "node": "*" }, "repository": { "type": "git", "url": "git://github.com/isaacs/slide-flow-control.git" }, "license": "ISC", "gitHead": "8345e51ee41e35825abc1a40750ea11462f57028", "bugs": { "url": "https://github.com/isaacs/slide-flow-control/issues" }, "homepage": "https://github.com/isaacs/slide-flow-control", "_id": "slide@1.1.6", "scripts": {}, "_shasum": "56eb027d65b4d2dce6cb2e2d32c4d4afc9e1d707", "_from": "slide@>=1.1.6 <1.2.0", "_npmVersion": "2.0.0-beta.3", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "56eb027d65b4d2dce6cb2e2d32c4d4afc9e1d707", "tarball": "http://registry.npmjs.org/slide/-/slide-1.1.6.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/slide/-/slide-1.1.6.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/slide/lib/async-map-ordered.js0000644000000000000000000000330112631326456021662 0ustar 00000000000000 throw new Error("TODO: Not yet implemented.") /* usage: Like asyncMap, but only can take a single cb, and guarantees the order of the results. */ module.exports = asyncMapOrdered function asyncMapOrdered (list, fn, cb_) { if (typeof cb_ !== "function") throw new Error( "No callback provided to asyncMapOrdered") if (typeof fn !== "function") throw new Error( "No map function provided to asyncMapOrdered") if (list === undefined || list === null) return cb_(null, []) if (!Array.isArray(list)) list = [list] if (!list.length) return cb_(null, []) var errState = null , l = list.length , a = l , res = [] , resCount = 0 , maxArgLen = 0 function cb (index) { return function () { if (errState) return var er = arguments[0] var argLen = arguments.length maxArgLen = Math.max(maxArgLen, argLen) res[index] = argLen === 1 ? [er] : Array.apply(null, arguments) // see if any new things have been added. 
if (list.length > l) { var newList = list.slice(l) a += (list.length - l) var oldLen = l l = list.length process.nextTick(function () { newList.forEach(function (ar, i) { fn(ar, cb(i + oldLen)) }) }) } if (er || --a === 0) { errState = er cb_.apply(null, [errState].concat(flip(res, resCount, maxArgLen))) } }} // expect the supplied cb function to be called // "n" times for each thing in the array. list.forEach(function (ar) { steps.forEach(function (fn, i) { fn(ar, cb(i)) }) }) } function flip (res, resCount, argLen) { var flat = [] // res = [[er, x, y], [er, x1, y1], [er, x2, y2, z2]] // return [[x, x1, x2], [y, y1, y2], [undefined, undefined, z2]] npm_3.5.2.orig/node_modules/slide/lib/async-map.js0000644000000000000000000000267712631326456020257 0ustar 00000000000000 /* usage: // do something to a list of things asyncMap(myListOfStuff, function (thing, cb) { doSomething(thing.foo, cb) }, cb) // do more than one thing to each item asyncMap(list, fooFn, barFn, cb) */ module.exports = asyncMap function asyncMap () { var steps = Array.prototype.slice.call(arguments) , list = steps.shift() || [] , cb_ = steps.pop() if (typeof cb_ !== "function") throw new Error( "No callback provided to asyncMap") if (!list) return cb_(null, []) if (!Array.isArray(list)) list = [list] var n = steps.length , data = [] // 2d array , errState = null , l = list.length , a = l * n if (!a) return cb_(null, []) function cb (er) { if (er && !errState) errState = er var argLen = arguments.length for (var i = 1; i < argLen; i ++) if (arguments[i] !== undefined) { data[i - 1] = (data[i - 1] || []).concat(arguments[i]) } // see if any new things have been added. if (list.length > l) { var newList = list.slice(l) a += (list.length - l) * n l = list.length process.nextTick(function () { newList.forEach(function (ar) { steps.forEach(function (fn) { fn(ar, cb) }) }) }) } if (--a === 0) cb_.apply(null, [errState].concat(data)) } // expect the supplied cb function to be called // "n" times for each thing in the array. list.forEach(function (ar) { steps.forEach(function (fn) { fn(ar, cb) }) }) } npm_3.5.2.orig/node_modules/slide/lib/bind-actor.js0000644000000000000000000000057612631326456020405 0ustar 00000000000000module.exports = bindActor function bindActor () { var args = Array.prototype.slice.call (arguments) // jswtf. , obj = null , fn if (typeof args[0] === "object") { obj = args.shift() fn = args.shift() if (typeof fn === "string") fn = obj[ fn ] } else fn = args.shift() return function (cb) { fn.apply(obj, args.concat(cb)) } } npm_3.5.2.orig/node_modules/slide/lib/chain.js0000644000000000000000000000122412631326456017434 0ustar 00000000000000module.exports = chain var bindActor = require("./bind-actor.js") chain.first = {} ; chain.last = {} function chain (things, cb) { var res = [] ;(function LOOP (i, len) { if (i >= len) return cb(null,res) if (Array.isArray(things[i])) things[i] = bindActor.apply(null, things[i].map(function(i){ return (i===chain.first) ? res[0] : (i===chain.last) ? 
res[res.length - 1] : i })) if (!things[i]) return LOOP(i + 1, len) things[i](function (er, data) { if (er) return cb(er, res) if (data !== undefined) res = res.concat(data) LOOP(i + 1, len) }) })(0, things.length) } npm_3.5.2.orig/node_modules/slide/lib/slide.js0000644000000000000000000000017112631326456017452 0ustar 00000000000000exports.asyncMap = require("./async-map") exports.bindActor = require("./bind-actor") exports.chain = require("./chain") npm_3.5.2.orig/node_modules/sorted-object/LICENSE.txt0000644000000000000000000000134612631326456020542 0ustar 00000000000000Copyright © 2014 Domenic Denicola This work is free. You can redistribute it and/or modify it under the terms of the Do What The Fuck You Want To Public License, Version 2, as published by Sam Hocevar. See below for more details. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE Version 2, December 2004 Copyright (C) 2004 Sam Hocevar Everyone is permitted to copy and distribute verbatim or modified copies of this license document, and changing it is allowed as long as the name is changed. DO WHAT THE FUCK YOU WANT TO PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. You just DO WHAT THE FUCK YOU WANT TO. npm_3.5.2.orig/node_modules/sorted-object/README.md0000644000000000000000000000177412631326456020203 0ustar 00000000000000# Get a Version of an Object with Sorted Keys Although objects in JavaScript are theoretically unsorted, in practice most engines use insertion order—at least, ignoring numeric keys. This manifests itself most prominently when dealing with an object's JSON serialization. So, for example, you might be trying to serialize some object to a JSON file. But every time you write it, it ends up being output in a different order, depending on how you created it in the first place! This makes for some ugly diffs. **sorted-object** gives you the answer. Just use this package to create a version of your object with its keys sorted before serializing, and you'll get a consistent order every time. 
```js var sortedObject = require("sorted-object"); var objectToSerialize = generateStuffNondeterministically(); // Before: fs.writeFileSync("dest.json", JSON.stringify(objectToSerialize)); // After: var sortedVersion = sortedObject(objectToSerialize); fs.writeFileSync("dest.json", JSON.stringify(sortedVersion)); ``` npm_3.5.2.orig/node_modules/sorted-object/lib/0000755000000000000000000000000012631326456017461 5ustar 00000000000000npm_3.5.2.orig/node_modules/sorted-object/package.json0000644000000000000000000000254212631326456021204 0ustar 00000000000000{ "name": "sorted-object", "description": "Returns a copy of an object with its keys sorted", "keywords": [ "sort", "keys", "object" ], "version": "1.0.0", "author": { "name": "Domenic Denicola", "email": "domenic@domenicdenicola.com", "url": "http://domenic.me/" }, "license": "WTFPL", "repository": { "type": "git", "url": "git://github.com/domenic/sorted-object.git" }, "bugs": { "url": "http://github.com/domenic/sorted-object/issues" }, "main": "lib/sorted-object.js", "scripts": { "test": "tape test/tests.js", "lint": "jshint lib && jshint test" }, "devDependencies": { "jshint": "~2.4.3", "tape": "~2.4.2" }, "homepage": "https://github.com/domenic/sorted-object", "_id": "sorted-object@1.0.0", "dist": { "shasum": "5d1f4f9c1fb2cd48965967304e212eb44cfb6d05", "tarball": "http://registry.npmjs.org/sorted-object/-/sorted-object-1.0.0.tgz" }, "_from": "sorted-object@>=1.0.0 <1.1.0", "_npmVersion": "1.3.25", "_npmUser": { "name": "domenic", "email": "domenic@domenicdenicola.com" }, "maintainers": [ { "name": "domenic", "email": "domenic@domenicdenicola.com" } ], "directories": {}, "_shasum": "5d1f4f9c1fb2cd48965967304e212eb44cfb6d05", "_resolved": "https://registry.npmjs.org/sorted-object/-/sorted-object-1.0.0.tgz" } npm_3.5.2.orig/node_modules/sorted-object/lib/sorted-object.js0000644000000000000000000000032212631326456022560 0ustar 00000000000000"use strict"; module.exports = function (input) { var output = Object.create(null); Object.keys(input).sort().forEach(function (key) { output[key] = input[key]; }); return output; }; npm_3.5.2.orig/node_modules/strip-ansi/index.js0000644000000000000000000000024112631326456017702 0ustar 00000000000000'use strict'; var ansiRegex = require('ansi-regex')(); module.exports = function (str) { return typeof str === 'string' ? str.replace(ansiRegex, '') : str; }; npm_3.5.2.orig/node_modules/strip-ansi/license0000644000000000000000000000213712631326456017610 0ustar 00000000000000The MIT License (MIT) Copyright (c) Sindre Sorhus (sindresorhus.com) Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/strip-ansi/package.json0000644000000000000000000000454112631326456020532 0ustar 00000000000000{ "name": "strip-ansi", "version": "3.0.0", "description": "Strip ANSI escape codes", "license": "MIT", "repository": { "type": "git", "url": "git+https://github.com/sindresorhus/strip-ansi.git" }, "author": { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "sindresorhus.com" }, "maintainers": [ { "name": "Sindre Sorhus", "email": "sindresorhus@gmail.com", "url": "sindresorhus.com" }, { "name": "Joshua Appelman", "email": "jappelman@xebia.com", "url": "jbnicolai.com" } ], "engines": { "node": ">=0.10.0" }, "scripts": { "test": "node test.js" }, "files": [ "index.js" ], "keywords": [ "strip", "trim", "remove", "ansi", "styles", "color", "colour", "colors", "terminal", "console", "string", "tty", "escape", "formatting", "rgb", "256", "shell", "xterm", "log", "logging", "command-line", "text" ], "dependencies": { "ansi-regex": "^2.0.0" }, "devDependencies": { "ava": "0.0.4" }, "readme": "# strip-ansi [![Build Status](https://travis-ci.org/sindresorhus/strip-ansi.svg?branch=master)](https://travis-ci.org/sindresorhus/strip-ansi)\n\n> Strip [ANSI escape codes](http://en.wikipedia.org/wiki/ANSI_escape_code)\n\n\n## Install\n\n```\n$ npm install --save strip-ansi\n```\n\n\n## Usage\n\n```js\nvar stripAnsi = require('strip-ansi');\n\nstripAnsi('\\u001b[4mcake\\u001b[0m');\n//=> 'cake'\n```\n\n\n## Related\n\n- [strip-ansi-cli](https://github.com/sindresorhus/strip-ansi-cli) - CLI for this module\n- [has-ansi](https://github.com/sindresorhus/has-ansi) - Check if a string has ANSI escape codes\n- [ansi-regex](https://github.com/sindresorhus/ansi-regex) - Regular expression for matching ANSI escape codes\n- [chalk](https://github.com/sindresorhus/chalk) - Terminal string styling done right\n\n\n## License\n\nMIT © [Sindre Sorhus](http://sindresorhus.com)\n", "readmeFilename": "readme.md", "bugs": { "url": "https://github.com/sindresorhus/strip-ansi/issues" }, "homepage": "https://github.com/sindresorhus/strip-ansi#readme", "_id": "strip-ansi@3.0.0", "_shasum": "7510b665567ca914ccb5d7e072763ac968be3724", "_resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-3.0.0.tgz", "_from": "strip-ansi@3.0.0" } npm_3.5.2.orig/node_modules/strip-ansi/readme.md0000644000000000000000000000151312631326456020017 0ustar 00000000000000# strip-ansi [![Build Status](https://travis-ci.org/sindresorhus/strip-ansi.svg?branch=master)](https://travis-ci.org/sindresorhus/strip-ansi) > Strip [ANSI escape codes](http://en.wikipedia.org/wiki/ANSI_escape_code) ## Install ``` $ npm install --save strip-ansi ``` ## Usage ```js var stripAnsi = require('strip-ansi'); stripAnsi('\u001b[4mcake\u001b[0m'); //=> 'cake' ``` ## Related - [strip-ansi-cli](https://github.com/sindresorhus/strip-ansi-cli) - CLI for this module - [has-ansi](https://github.com/sindresorhus/has-ansi) - Check if a string has ANSI escape codes - [ansi-regex](https://github.com/sindresorhus/ansi-regex) - Regular expression for matching ANSI escape codes - [chalk](https://github.com/sindresorhus/chalk) - Terminal string styling done right ## License MIT © [Sindre Sorhus](http://sindresorhus.com) 
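One behavior of the strip-ansi `index.js` above is easy to miss: input that isn't a string is returned as-is rather than coerced or thrown on. A quick sketch (illustrative, not from the package):

```js
var stripAnsi = require('strip-ansi');

stripAnsi('\u001b[31mred\u001b[39m'); //=> 'red'

// non-strings pass through unchanged:
stripAnsi(42);   //=> 42
stripAnsi(null); //=> null
```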
npm_3.5.2.orig/node_modules/tar/.npmignore0000644000000000000000000000007712631326456016740 0ustar 00000000000000.*.swp node_modules examples/extract/ test/tmp/ test/fixtures/ npm_3.5.2.orig/node_modules/tar/.travis.yml0000644000000000000000000000005512631326456017046 0ustar 00000000000000language: node_js node_js: - 0.10 - 0.11 npm_3.5.2.orig/node_modules/tar/LICENSE0000644000000000000000000000137212631326456015745 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/tar/README.md0000644000000000000000000000242212631326456016214 0ustar 00000000000000# node-tar

Tar for Node.js.

[![NPM](https://nodei.co/npm/tar.png)](https://nodei.co/npm/tar/)

## API

See `examples/` for usage examples.

### var tar = require('tar')

Returns an object with `.Pack`, `.Extract` and `.Parse` methods.

### tar.Pack([properties])

Returns a through stream. Use [fstream](https://npmjs.org/package/fstream) to write files into the pack stream and you will receive tar archive data from the pack stream.

This only works with directories; it does not work with individual files.

The optional `properties` object is used to set properties in the tar 'Global Extended Header'. If the `fromBase` property is set to true, the tar will contain files relative to the path passed, and not with the path included.

### tar.Extract([options])

Returns a through stream. Write tar data to the stream and the files in the tarball will be extracted onto the filesystem.

`options` can be:

```js
{
  path: '/path/to/extract/tar/into',
  strip: 0, // how many path segments to strip from the root when extracting
}
```

`options` also get passed to the `fstream.Writer` instance that `tar` uses internally.

### tar.Parse()

Returns a writable stream. Write tar data to it and it will emit `entry` events for each entry parsed from the tarball. This is used by `tar.Extract`.

npm_3.5.2.orig/node_modules/tar/examples/0000755000000000000000000000000012631326456016553 5ustar 00000000000000npm_3.5.2.orig/node_modules/tar/lib/0000755000000000000000000000000012631326456015503 5ustar 00000000000000npm_3.5.2.orig/node_modules/tar/node_modules/0000755000000000000000000000000012631326456017412 5ustar 00000000000000npm_3.5.2.orig/node_modules/tar/package.json0000644000000000000000000000307012631326456017223 0ustar 00000000000000{ "author": { "name": "Isaac Z.
Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "tar", "description": "tar for node", "version": "2.2.1", "repository": { "type": "git", "url": "git://github.com/isaacs/node-tar.git" }, "main": "tar.js", "scripts": { "test": "tap test/*.js" }, "dependencies": { "block-stream": "*", "fstream": "^1.0.2", "inherits": "2" }, "devDependencies": { "graceful-fs": "^4.1.2", "rimraf": "1.x", "tap": "0.x", "mkdirp": "^0.5.0" }, "license": "ISC", "gitHead": "52237e39d2eb68d22a32d9a98f1d762189fe6a3d", "bugs": { "url": "https://github.com/isaacs/node-tar/issues" }, "homepage": "https://github.com/isaacs/node-tar#readme", "_id": "tar@2.2.1", "_shasum": "8e4d2a256c0e2185c6b18ad694aec968b83cb1d1", "_from": "tar@>=2.2.1 <2.3.0", "_npmVersion": "2.14.3", "_nodeVersion": "2.2.2", "_npmUser": { "name": "zkat", "email": "kat@sykosomatic.org" }, "dist": { "shasum": "8e4d2a256c0e2185c6b18ad694aec968b83cb1d1", "tarball": "http://registry.npmjs.org/tar/-/tar-2.2.1.tgz" }, "maintainers": [ { "name": "isaacs", "email": "isaacs@npmjs.com" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" }, { "name": "soldair", "email": "soldair@gmail.com" }, { "name": "zkat", "email": "kat@sykosomatic.org" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/tar/-/tar-2.2.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/tar/tar.js0000644000000000000000000001002412631326456016056 0ustar 00000000000000// field paths that every tar file must have. // header is padded to 512 bytes. var f = 0 , fields = {} , path = fields.path = f++ , mode = fields.mode = f++ , uid = fields.uid = f++ , gid = fields.gid = f++ , size = fields.size = f++ , mtime = fields.mtime = f++ , cksum = fields.cksum = f++ , type = fields.type = f++ , linkpath = fields.linkpath = f++ , headerSize = 512 , blockSize = 512 , fieldSize = [] fieldSize[path] = 100 fieldSize[mode] = 8 fieldSize[uid] = 8 fieldSize[gid] = 8 fieldSize[size] = 12 fieldSize[mtime] = 12 fieldSize[cksum] = 8 fieldSize[type] = 1 fieldSize[linkpath] = 100 // "ustar\0" may introduce another bunch of headers. // these are optional, and will be nulled out if not present. var ustar = fields.ustar = f++ , ustarver = fields.ustarver = f++ , uname = fields.uname = f++ , gname = fields.gname = f++ , devmaj = fields.devmaj = f++ , devmin = fields.devmin = f++ , prefix = fields.prefix = f++ , fill = fields.fill = f++ // terminate fields. fields[f] = null fieldSize[ustar] = 6 fieldSize[ustarver] = 2 fieldSize[uname] = 32 fieldSize[gname] = 32 fieldSize[devmaj] = 8 fieldSize[devmin] = 8 fieldSize[prefix] = 155 fieldSize[fill] = 12 // nb: prefix field may in fact be 130 bytes of prefix, // a null char, 12 bytes for atime, 12 bytes for ctime. // // To recognize this format: // 1. prefix[130] === ' ' or '\0' // 2. atime and ctime are octal numeric values // 3. atime and ctime have ' ' in their last byte var fieldEnds = {} , fieldOffs = {} , fe = 0 for (var i = 0; i < f; i ++) { fieldOffs[i] = fe fieldEnds[i] = (fe += fieldSize[i]) } // build a translation table of field paths. 
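// (Illustrative note, not in the original source: after the loop below
// runs, the table maps both directions, e.g. fields.path === 0 and
// fields[0] === "path", so a field can be looked up by name or by
// its position in the header.)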
Object.keys(fields).forEach(function (f) { if (fields[f] !== null) fields[fields[f]] = f }) // different values of the 'type' field // paths match the values of Stats.isX() functions, where appropriate var types = { 0: "File" , "\0": "OldFile" // like 0 , "": "OldFile" , 1: "Link" , 2: "SymbolicLink" , 3: "CharacterDevice" , 4: "BlockDevice" , 5: "Directory" , 6: "FIFO" , 7: "ContiguousFile" // like 0 // posix headers , g: "GlobalExtendedHeader" // k=v for the rest of the archive , x: "ExtendedHeader" // k=v for the next file // vendor-specific stuff , A: "SolarisACL" // skip , D: "GNUDumpDir" // like 5, but with data, which should be skipped , I: "Inode" // metadata only, skip , K: "NextFileHasLongLinkpath" // data = link path of next file , L: "NextFileHasLongPath" // data = path of next file , M: "ContinuationFile" // skip , N: "OldGnuLongPath" // like L , S: "SparseFile" // skip , V: "TapeVolumeHeader" // skip , X: "OldExtendedHeader" // like x } Object.keys(types).forEach(function (t) { types[types[t]] = types[types[t]] || t }) // values for the mode field var modes = { suid: 04000 // set uid on extraction , sgid: 02000 // set gid on extraction , svtx: 01000 // set restricted deletion flag on dirs on extraction , uread: 0400 , uwrite: 0200 , uexec: 0100 , gread: 040 , gwrite: 020 , gexec: 010 , oread: 4 , owrite: 2 , oexec: 1 , all: 07777 } var numeric = { mode: true , uid: true , gid: true , size: true , mtime: true , devmaj: true , devmin: true , cksum: true , atime: true , ctime: true , dev: true , ino: true , nlink: true } Object.keys(modes).forEach(function (t) { modes[modes[t]] = modes[modes[t]] || t }) var knownExtended = { atime: true , charset: true , comment: true , ctime: true , gid: true , gname: true , linkpath: true , mtime: true , path: true , realtime: true , security: true , size: true , uid: true , uname: true } exports.fields = fields exports.fieldSize = fieldSize exports.fieldOffs = fieldOffs exports.fieldEnds = fieldEnds exports.types = types exports.modes = modes exports.numeric = numeric exports.headerSize = headerSize exports.blockSize = blockSize exports.knownExtended = knownExtended exports.Pack = require("./lib/pack.js") exports.Parse = require("./lib/parse.js") exports.Extract = require("./lib/extract.js") npm_3.5.2.orig/node_modules/tar/test/0000755000000000000000000000000012631326456015714 5ustar 00000000000000npm_3.5.2.orig/node_modules/tar/examples/extracter.js0000644000000000000000000000060112631326456021107 0ustar 00000000000000var tar = require("../tar.js") , fs = require("fs") function onError(err) { console.error('An error occurred:', err) } function onEnd() { console.log('Extracted!') } var extractor = tar.Extract({path: __dirname + "/extract"}) .on('error', onError) .on('end', onEnd); fs.createReadStream(__dirname + "/../test/fixtures/c.tar") .on('error', onError) .pipe(extractor); npm_3.5.2.orig/node_modules/tar/examples/packer.js0000644000000000000000000000075012631326456020360 0ustar 00000000000000var tar = require("../tar.js") , fstream = require("fstream") , fs = require("fs") var dirDest = fs.createWriteStream('dir.tar') function onError(err) { console.error('An error occurred:', err) } function onEnd() { console.log('Packed!') } var packer = tar.Pack({ noProprietary: true }) .on('error', onError) .on('end', onEnd); // This must be a "directory" fstream.Reader({ path: __dirname, type: "Directory" }) .on('error', onError) .pipe(packer) .pipe(dirDest) 
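The tar README above mentions the Pack `fromBase` property, but neither shipped example exercises it. A variant of packer.js under that option might look like this (a sketch only, not one of the shipped examples):

```js
var tar = require("../tar.js")
  , fstream = require("fstream")
  , fs = require("fs")

// Sketch only: with fromBase, entry paths are stored relative to the
// directory being packed instead of including the directory itself.
fstream.Reader({ path: __dirname, type: "Directory" })
  .on('error', function (err) { console.error('An error occurred:', err) })
  .pipe(tar.Pack({ noProprietary: true, fromBase: true }))
  .pipe(fs.createWriteStream('dir-from-base.tar'))
```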
npm_3.5.2.orig/node_modules/tar/examples/reader.js0000644000000000000000000000175412631326456020362 0ustar 00000000000000var tar = require("../tar.js") , fs = require("fs") fs.createReadStream(__dirname + "/../test/fixtures/c.tar") .pipe(tar.Parse()) .on("extendedHeader", function (e) { console.error("extended pax header", e.props) e.on("end", function () { console.error("extended pax fields:", e.fields) }) }) .on("ignoredEntry", function (e) { console.error("ignoredEntry?!?", e.props) }) .on("longLinkpath", function (e) { console.error("longLinkpath entry", e.props) e.on("end", function () { console.error("value=%j", e.body.toString()) }) }) .on("longPath", function (e) { console.error("longPath entry", e.props) e.on("end", function () { console.error("value=%j", e.body.toString()) }) }) .on("entry", function (e) { console.error("entry", e.props) e.on("data", function (c) { console.error(" >>>" + c.toString().replace(/\n/g, "\\n")) }) e.on("end", function () { console.error(" <<<EOF") }) }) npm_3.5.2.orig/node_modules/tar/lib/entry-writer.js module.exports = EntryWriter var TarHeader = require("./header.js") , inherits = require("inherits") , BlockStream = require("block-stream") , ExtendedHeaderWriter , Stream = require("stream").Stream , EOF = {} inherits(EntryWriter, Stream) function EntryWriter (props) { var me = this if (!(me instanceof EntryWriter)) { return new EntryWriter(props) } Stream.apply(this) me.writable = true me.readable = true me._stream = new BlockStream(512) me._stream.on("data", function (c) { me.emit("data", c) }) me._stream.on("drain", function () { me.emit("drain") }) me._stream.on("end", function () { me.emit("end") me.emit("close") }) me.props = props if (props.type === "Directory") { props.size = 0 } props.ustar = "ustar\0" props.ustarver = "00" me.path = props.path me._buffer = [] me._didHeader = false me._meta = false me.on("pipe", function () { me._process() }) } EntryWriter.prototype.write = function (c) { // console.error(".. ew write") if (this._ended) return this.emit("error", new Error("write after end")) this._buffer.push(c) this._process() this._needDrain = this._buffer.length > 0 return !this._needDrain } EntryWriter.prototype.end = function (c) { // console.error(".. ew end") if (c) this._buffer.push(c) this._buffer.push(EOF) this._ended = true this._process() this._needDrain = this._buffer.length > 0 } EntryWriter.prototype.pause = function () { // console.error(".. ew pause") this._paused = true this.emit("pause") } EntryWriter.prototype.resume = function () { // console.error(".. ew resume") this._paused = false this.emit("resume") this._process() } EntryWriter.prototype.add = function (entry) { // console.error(".. ew add") if (!this.parent) return this.emit("error", new Error("no parent")) // make sure that the _header and such is emitted, and clear out // the _currentEntry link on the parent. if (!this._ended) this.end() return this.parent.add(entry) } EntryWriter.prototype._header = function () { // console.error(".. ew header") if (this._didHeader) return this._didHeader = true var headerBlock = TarHeader.encode(this.props) if (this.props.needExtended && !this._meta) { var me = this ExtendedHeaderWriter = ExtendedHeaderWriter || require("./extended-header-writer.js") ExtendedHeaderWriter(this.props) .on("data", function (c) { me.emit("data", c) }) .on("error", function (er) { me.emit("error", er) }) .end() } // console.error(".. .. ew headerBlock emitting") this.emit("data", headerBlock) this.emit("header") } EntryWriter.prototype._process = function () { // console.error(".. .. ew process") if (!this._didHeader && !this._meta) { this._header() } if (this._paused || this._processing) { // console.error(".. .. .. paused=%j, processing=%j", this._paused, this._processing) return } this._processing = true var buf = this._buffer for (var i = 0; i < buf.length; i ++) { // console.error(".. .. .. i=%d", i) var c = buf[i] if (c === EOF) this._stream.end() else this._stream.write(c) if (this._paused) { // console.error(".. .. .. paused mid-emission") this._processing = false if (i < buf.length) { this._needDrain = true this._buffer = buf.slice(i + 1) } return } } // console.error(".. .. .. emitted") this._buffer.length = 0 this._processing = false // console.error(".. .. .. emitting drain") this.emit("drain") } EntryWriter.prototype.destroy = function () {} npm_3.5.2.orig/node_modules/tar/lib/entry.js0000644000000000000000000001224212631326456017203 0ustar 00000000000000// A passthrough read/write stream that sets its properties // based on a header, extendedHeader, and globalHeader // // Can be either a file system object of some sort, or // a pax/ustar metadata entry.
module.exports = Entry var TarHeader = require("./header.js") , tar = require("../tar") , assert = require("assert").ok , Stream = require("stream").Stream , inherits = require("inherits") , fstream = require("fstream").Abstract function Entry (header, extended, global) { Stream.call(this) this.readable = true this.writable = true this._needDrain = false this._paused = false this._reading = false this._ending = false this._ended = false this._remaining = 0 this._abort = false this._queue = [] this._index = 0 this._queueLen = 0 this._read = this._read.bind(this) this.props = {} this._header = header this._extended = extended || {} // globals can change throughout the course of // a file parse operation. Freeze it at its current state. this._global = {} var me = this Object.keys(global || {}).forEach(function (g) { me._global[g] = global[g] }) this._setProps() } inherits(Entry, Stream) Entry.prototype.write = function (c) { if (this._ending) this.error("write() after end()", null, true) if (this._remaining === 0) { this.error("invalid bytes past eof") } // often we'll get a bunch of \0 at the end of the last write, // since chunks will always be 512 bytes when reading a tarball. if (c.length > this._remaining) { c = c.slice(0, this._remaining) } this._remaining -= c.length // put it on the stack. var ql = this._queueLen this._queue.push(c) this._queueLen ++ this._read() // either paused, or buffered if (this._paused || ql > 0) { this._needDrain = true return false } return true } Entry.prototype.end = function (c) { if (c) this.write(c) this._ending = true this._read() } Entry.prototype.pause = function () { this._paused = true this.emit("pause") } Entry.prototype.resume = function () { // console.error(" Tar Entry resume", this.path) this.emit("resume") this._paused = false this._read() return this._queueLen - this._index > 1 } // This is bound to the instance Entry.prototype._read = function () { // console.error(" Tar Entry _read", this.path) if (this._paused || this._reading || this._ended) return // set this flag so that event handlers don't inadvertently // get multiple _read() calls running. this._reading = true // have any data to emit? while (this._index < this._queueLen && !this._paused) { var chunk = this._queue[this._index ++] this.emit("data", chunk) } // check if we're drained if (this._index >= this._queueLen) { this._queue.length = this._queueLen = this._index = 0 if (this._needDrain) { this._needDrain = false this.emit("drain") } if (this._ending) { this._ended = true this.emit("end") } } // if the queue gets too big, then pluck off whatever we can. // this should be fairly rare. var mql = this._maxQueueLen if (this._queueLen > mql && this._index > 0) { mql = Math.min(this._index, mql) this._index -= mql this._queueLen -= mql this._queue = this._queue.slice(mql) } this._reading = false } Entry.prototype._setProps = function () { // props = extended->global->header->{} var header = this._header , extended = this._extended , global = this._global , props = this.props // first get the values from the normal header. var fields = tar.fields for (var f = 0; fields[f] !== null; f ++) { var field = fields[f] , val = header[field] if (typeof val !== "undefined") props[field] = val } // next, the global header for this file. // numeric values, etc, will have already been parsed. 
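// (Illustrative note, not in the original source: global is applied
// before extended in the loop below, so a per-file extended-header
// value overrides a global one, and both override the plain ustar
// header fields copied above.)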
;[global, extended].forEach(function (p) { Object.keys(p).forEach(function (f) { if (typeof p[f] !== "undefined") props[f] = p[f] }) }) // no nulls allowed in path or linkpath ;["path", "linkpath"].forEach(function (p) { if (props.hasOwnProperty(p)) { props[p] = props[p].split("\0")[0] } }) // set date fields to be a proper date ;["mtime", "ctime", "atime"].forEach(function (p) { if (props.hasOwnProperty(p)) { props[p] = new Date(props[p] * 1000) } }) // set the type so that we know what kind of file to create var type switch (tar.types[props.type]) { case "OldFile": case "ContiguousFile": type = "File" break case "GNUDumpDir": type = "Directory" break case undefined: type = "Unknown" break case "Link": case "SymbolicLink": case "CharacterDevice": case "BlockDevice": case "Directory": case "FIFO": default: type = tar.types[props.type] } this.type = type this.path = props.path this.size = props.size // size is special, since it signals when the file needs to end. this._remaining = props.size } // the parser may not call write if _abort is true. // useful for skipping data from some files quickly. Entry.prototype.abort = function(){ this._abort = true } Entry.prototype.warn = fstream.warn Entry.prototype.error = fstream.error npm_3.5.2.orig/node_modules/tar/lib/extended-header-writer.js0000644000000000000000000001232112631326456022400 0ustar 00000000000000 module.exports = ExtendedHeaderWriter var inherits = require("inherits") , EntryWriter = require("./entry-writer.js") inherits(ExtendedHeaderWriter, EntryWriter) var tar = require("../tar.js") , path = require("path") , TarHeader = require("./header.js") // props is the props of the thing we need to write an // extended header for. // Don't be shy with it. Just encode everything. function ExtendedHeaderWriter (props) { // console.error(">> ehw ctor") var me = this if (!(me instanceof ExtendedHeaderWriter)) { return new ExtendedHeaderWriter(props) } me.fields = props var p = { path : ("PaxHeader" + path.join("/", props.path || "")) .replace(/\\/g, "/").substr(0, 100) , mode : props.mode || 0666 , uid : props.uid || 0 , gid : props.gid || 0 , size : 0 // will be set later , mtime : props.mtime || Date.now() / 1000 , type : "x" , linkpath : "" , ustar : "ustar\0" , ustarver : "00" , uname : props.uname || "" , gname : props.gname || "" , devmaj : props.devmaj || 0 , devmin : props.devmin || 0 } EntryWriter.call(me, p) // console.error(">> ehw props", me.props) me.props = p me._meta = true } ExtendedHeaderWriter.prototype.end = function () { // console.error(">> ehw end") var me = this if (me._ended) return me._ended = true me._encodeFields() if (me.props.size === 0) { // nothing to write! me._ready = true me._stream.end() return } me._stream.write(TarHeader.encode(me.props)) me.body.forEach(function (l) { me._stream.write(l) }) me._ready = true // console.error(">> ehw _process calling end()", me.props) this._stream.end() } ExtendedHeaderWriter.prototype._encodeFields = function () { // console.error(">> ehw _encodeFields") this.body = [] if (this.fields.prefix) { this.fields.path = this.fields.prefix + "/" + this.fields.path this.fields.prefix = "" } encodeFields(this.fields, "", this.body, this.fields.noProprietary) var me = this this.body.forEach(function (l) { me.props.size += l.length }) } function encodeFields (fields, prefix, body, nop) { // console.error(">> >> ehw encodeFields") // "%d %s=%s\n", <length>, <keyword>, <value> // The length is a decimal number, and includes itself and the \n // Numeric values are decimal strings.
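// (Worked example, added for illustration and not in the original
// source: for k="path", v="foo" the record body is " path=foo\n",
// 10 bytes. The length prefix counts its own digits, so the full
// record is "12 path=foo\n": 2 digits + 10 bytes = 12. If the body
// were 98 bytes, adding 2 digits gives 100, which itself needs 3
// digits, so the length bumps to 101; encodeField() below handles
// exactly that case.)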
Object.keys(fields).forEach(function (k) { var val = fields[k] , numeric = tar.numeric[k] if (prefix) k = prefix + "." + k // already including NODETAR.type, don't need File=true also if (k === fields.type && val === true) return switch (k) { // don't include anything that's always handled just fine // in the normal header, or only meaningful in the context // of nodetar case "mode": case "cksum": case "ustar": case "ustarver": case "prefix": case "basename": case "dirname": case "needExtended": case "block": case "filter": return case "rdev": if (val === 0) return break case "nlink": case "dev": // Truly a hero among men, Creator of Star! case "ino": // Speak his name with reverent awe! It is: k = "SCHILY." + k break default: break } if (val && typeof val === "object" && !Buffer.isBuffer(val)) encodeFields(val, k, body, nop) else if (val === null || val === undefined) return else body.push.apply(body, encodeField(k, val, nop)) }) return body } function encodeField (k, v, nop) { // lowercase keys must be valid, otherwise prefix with // "NODETAR." if (k.charAt(0) === k.charAt(0).toLowerCase()) { var m = k.split(".")[0] if (!tar.knownExtended[m]) k = "NODETAR." + k } // no proprietary if (nop && k.charAt(0) !== k.charAt(0).toLowerCase()) { return [] } if (typeof v === "number") v = v.toString(10) var s = new Buffer(" " + k + "=" + v + "\n") , digits = Math.floor(Math.log(s.length) / Math.log(10)) + 1 // console.error("1 s=%j digits=%j s.length=%d", s.toString(), digits, s.length) // if adding that many digits will make it go over that length, // then add one to it. For example, if the string is: // " foo=bar\n" // then that's 9 characters. With the "9", that bumps the length // up to 10. However, this is invalid: // "10 foo=bar\n" // because that's actually 11 characters, since "10" adds another // character to the length, and the length includes the number // itself. In that case, just bump it up again. if (s.length + digits >= Math.pow(10, digits)) digits += 1 // console.error("2 s=%j digits=%j s.length=%d", s.toString(), digits, s.length) var len = digits + s.length // console.error("3 s=%j digits=%j s.length=%d len=%d", s.toString(), digits, s.length, len) var lenBuf = new Buffer("" + len) if (lenBuf.length + s.length !== len) { throw new Error("Bad length calculation\n"+ "len="+len+"\n"+ "lenBuf="+JSON.stringify(lenBuf.toString())+"\n"+ "lenBuf.length="+lenBuf.length+"\n"+ "digits="+digits+"\n"+ "s="+JSON.stringify(s.toString())+"\n"+ "s.length="+s.length) } return [lenBuf, s] } npm_3.5.2.orig/node_modules/tar/lib/extended-header.js0000644000000000000000000000675312631326456021070 0ustar 00000000000000// An Entry consisting of: // // "%d %s=%s\n", <length>, <keyword>, <value> // // The length is a decimal number, and includes itself and the \n // \0 does not terminate anything. Only the length terminates the string. // Numeric values are decimal strings.
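// (Worked example, added for illustration and not in the original
// source: parsing "20 mtime=1350244358\n" moves through SIZE over
// "20", KEY over "mtime", and VAL over "1350244358"; " mtime=" plus
// the value and the trailing \n is 18 bytes, and the 2 length digits
// bring the total to 20. Because "mtime" appears in tar.numeric, the
// value is stored as the number 1350244358 rather than a string.)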
module.exports = ExtendedHeader var Entry = require("./entry.js") , inherits = require("inherits") , tar = require("../tar.js") , numeric = tar.numeric , keyTrans = { "SCHILY.dev": "dev" , "SCHILY.ino": "ino" , "SCHILY.nlink": "nlink" } function ExtendedHeader () { Entry.apply(this, arguments) this.on("data", this._parse) this.fields = {} this._position = 0 this._fieldPos = 0 this._state = SIZE this._sizeBuf = [] this._keyBuf = [] this._valBuf = [] this._size = -1 this._key = "" } inherits(ExtendedHeader, Entry) ExtendedHeader.prototype._parse = parse var s = 0 , states = ExtendedHeader.states = {} , SIZE = states.SIZE = s++ , KEY = states.KEY = s++ , VAL = states.VAL = s++ , ERR = states.ERR = s++ Object.keys(states).forEach(function (s) { states[states[s]] = states[s] }) states[s] = null // char code values for comparison var _0 = "0".charCodeAt(0) , _9 = "9".charCodeAt(0) , point = ".".charCodeAt(0) , a = "a".charCodeAt(0) , Z = "Z".charCodeAt(0) , a = "a".charCodeAt(0) , z = "z".charCodeAt(0) , space = " ".charCodeAt(0) , eq = "=".charCodeAt(0) , cr = "\n".charCodeAt(0) function parse (c) { if (this._state === ERR) return for ( var i = 0, l = c.length ; i < l ; this._position++, this._fieldPos++, i++) { // console.error("top of loop, size="+this._size) var b = c[i] if (this._size >= 0 && this._fieldPos > this._size) { error(this, "field exceeds length="+this._size) return } switch (this._state) { case ERR: return case SIZE: // console.error("parsing size, b=%d, rest=%j", b, c.slice(i).toString()) if (b === space) { this._state = KEY // this._fieldPos = this._sizeBuf.length this._size = parseInt(new Buffer(this._sizeBuf).toString(), 10) this._sizeBuf.length = 0 continue } if (b < _0 || b > _9) { error(this, "expected [" + _0 + ".." + _9 + "], got " + b) return } this._sizeBuf.push(b) continue case KEY: // can be any char except =, not > size. if (b === eq) { this._state = VAL this._key = new Buffer(this._keyBuf).toString() if (keyTrans[this._key]) this._key = keyTrans[this._key] this._keyBuf.length = 0 continue } this._keyBuf.push(b) continue case VAL: // field must end with cr if (this._fieldPos === this._size - 1) { // console.error("finished with "+this._key) if (b !== cr) { error(this, "expected \\n at end of field") return } var val = new Buffer(this._valBuf).toString() if (numeric[this._key]) { val = parseFloat(val) } this.fields[this._key] = val this._valBuf.length = 0 this._state = SIZE this._size = -1 this._fieldPos = -1 continue } this._valBuf.push(b) continue } } } function error (me, msg) { msg = "invalid header: " + msg + "\nposition=" + me._position + "\nfield position=" + me._fieldPos me.error(msg) me.state = ERR } npm_3.5.2.orig/node_modules/tar/lib/extract.js0000644000000000000000000000477312631326456017526 0ustar 00000000000000// give it a tarball and a path, and it'll dump the contents module.exports = Extract var tar = require("../tar.js") , fstream = require("fstream") , inherits = require("inherits") , path = require("path") function Extract (opts) { if (!(this instanceof Extract)) return new Extract(opts) tar.Parse.apply(this) if (typeof opts !== "object") { opts = { path: opts } } // better to drop in cwd? seems more standard. 
opts.path = opts.path || path.resolve("node-tar-extract") opts.type = "Directory" opts.Directory = true // similar to --strip or --strip-components opts.strip = +opts.strip if (!opts.strip || opts.strip <= 0) opts.strip = 0 this._fst = fstream.Writer(opts) this.pause() var me = this // Hardlinks in tarballs are relative to the root // of the tarball. So, they need to be resolved against // the target directory in order to be created properly. me.on("entry", function (entry) { // if there's a "strip" argument, then strip off that many // path components. if (opts.strip) { var p = entry.path.split("/").slice(opts.strip).join("/") entry.path = entry.props.path = p if (entry.linkpath) { var lp = entry.linkpath.split("/").slice(opts.strip).join("/") entry.linkpath = entry.props.linkpath = lp } } if (entry.type === "Link") { entry.linkpath = entry.props.linkpath = path.join(opts.path, path.join("/", entry.props.linkpath)) } if (entry.type === "SymbolicLink") { var dn = path.dirname(entry.path) || "" var linkpath = entry.props.linkpath var target = path.resolve(opts.path, dn, linkpath) if (target.indexOf(opts.path) !== 0) { linkpath = path.join(opts.path, path.join("/", linkpath)) } entry.linkpath = entry.props.linkpath = linkpath } }) this._fst.on("ready", function () { me.pipe(me._fst, { end: false }) me.resume() }) this._fst.on('error', function(err) { me.emit('error', err) }) this._fst.on('drain', function() { me.emit('drain') }) // this._fst.on("end", function () { // console.error("\nEEEE Extract End", me._fst.path) // }) this._fst.on("close", function () { // console.error("\nEEEE Extract End", me._fst.path) me.emit("finish") me.emit("end") me.emit("close") }) } inherits(Extract, tar.Parse) Extract.prototype._streamEnd = function () { var me = this if (!me._ended || me._entry) me.error("unexpected eof") me._fst.end() // my .end() is coming later. } npm_3.5.2.orig/node_modules/tar/lib/global-header-writer.js0000644000000000000000000000060412631326456022041 0ustar 00000000000000module.exports = GlobalHeaderWriter var ExtendedHeaderWriter = require("./extended-header-writer.js") , inherits = require("inherits") inherits(GlobalHeaderWriter, ExtendedHeaderWriter) function GlobalHeaderWriter (props) { if (!(this instanceof GlobalHeaderWriter)) { return new GlobalHeaderWriter(props) } ExtendedHeaderWriter.call(this, props) this.props.type = "g" } npm_3.5.2.orig/node_modules/tar/lib/header.js0000644000000000000000000002543212631326456017277 0ustar 00000000000000// parse a 512-byte header block to a data object, or vice-versa // If the data won't fit nicely in a simple header, then generate // the appropriate extended header file, and return that. module.exports = TarHeader var tar = require("../tar.js") , fields = tar.fields , fieldOffs = tar.fieldOffs , fieldEnds = tar.fieldEnds , fieldSize = tar.fieldSize , numeric = tar.numeric , assert = require("assert").ok , space = " ".charCodeAt(0) , slash = "/".charCodeAt(0) , bslash = process.platform === "win32" ? "\\".charCodeAt(0) : null function TarHeader (block) { if (!(this instanceof TarHeader)) return new TarHeader(block) if (block) this.decode(block) } TarHeader.prototype = { decode : decode , encode: encode , calcSum: calcSum , checkSum: checkSum } TarHeader.parseNumeric = parseNumeric TarHeader.encode = encode TarHeader.decode = decode // note that this will only do the normal ustar header, not any kind // of extended posix header file. 
If something doesn't fit comfortably, // then it will set obj.needExtended = true, and set the block to // the closest approximation. function encode (obj) { if (!obj && !(this instanceof TarHeader)) throw new Error( "encode must be called on a TarHeader, or supplied an object") obj = obj || this var block = obj.block = new Buffer(512) // if the object has a "prefix", then that's actually an extension of // the path field. if (obj.prefix) { // console.error("%% header encoding, got a prefix", obj.prefix) obj.path = obj.prefix + "/" + obj.path // console.error("%% header encoding, prefixed path", obj.path) obj.prefix = "" } obj.needExtended = false if (obj.mode) { if (typeof obj.mode === "string") obj.mode = parseInt(obj.mode, 8) obj.mode = obj.mode & 0777 } for (var f = 0; fields[f] !== null; f ++) { var field = fields[f] , off = fieldOffs[f] , end = fieldEnds[f] , ret switch (field) { case "cksum": // special, done below, after all the others break case "prefix": // special, this is an extension of the "path" field. // console.error("%% header encoding, skip prefix later") break case "type": // convert from long name to a single char. var type = obj.type || "0" if (type.length > 1) { type = tar.types[obj.type] if (!type) type = "0" } writeText(block, off, end, type) break case "path": // uses the "prefix" field if > 100 bytes, but <= 255 var pathLen = Buffer.byteLength(obj.path) , pathFSize = fieldSize[fields.path] , prefFSize = fieldSize[fields.prefix] // paths between 100 and 255 should use the prefix field. // longer than 255 if (pathLen > pathFSize && pathLen <= pathFSize + prefFSize) { // need to find a slash somewhere in the middle so that // path and prefix both fit in their respective fields var searchStart = pathLen - 1 - pathFSize , searchEnd = prefFSize , found = false , pathBuf = new Buffer(obj.path) for ( var s = searchStart ; (s <= searchEnd) ; s ++ ) { if (pathBuf[s] === slash || pathBuf[s] === bslash) { found = s break } } if (found !== false) { prefix = pathBuf.slice(0, found).toString("utf8") path = pathBuf.slice(found + 1).toString("utf8") ret = writeText(block, off, end, path) off = fieldOffs[fields.prefix] end = fieldEnds[fields.prefix] // console.error("%% header writing prefix", off, end, prefix) ret = writeText(block, off, end, prefix) || ret break } } // paths less than 100 chars don't need a prefix // and paths longer than 255 need an extended header and will fail // on old implementations no matter what we do here. // Null out the prefix, and fallthrough to default. // console.error("%% header writing no prefix") var poff = fieldOffs[fields.prefix] , pend = fieldEnds[fields.prefix] writeText(block, poff, pend, "") // fallthrough // all other fields are numeric or text default: ret = numeric[field] ? writeNumeric(block, off, end, obj[field]) : writeText(block, off, end, obj[field] || "") break } obj.needExtended = obj.needExtended || ret } var off = fieldOffs[fields.cksum] , end = fieldEnds[fields.cksum] writeNumeric(block, off, end, calcSum.call(this, block)) return block } // if it's a negative number, or greater than will fit, // then use write256. 
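// (Illustrative note, not in the original source: a 12-byte field such
// as size or mtime can hold at most 077777777777 = 8589934591 as
// octal text; anything larger, or any negative value such as a
// pre-1970 mtime, takes the base-256 path below, marked by 0x80 in
// the first byte for positive values and 0xFF for negative
// two's-complement values.)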
var MAXNUM = { 12: 077777777777 , 11: 07777777777 , 8 : 07777777 , 7 : 0777777 } function writeNumeric (block, off, end, num) { var writeLen = end - off , maxNum = MAXNUM[writeLen] || 0 num = num || 0 // console.error(" numeric", num) if (num instanceof Date || Object.prototype.toString.call(num) === "[object Date]") { num = num.getTime() / 1000 } if (num > maxNum || num < 0) { write256(block, off, end, num) // need an extended header if negative or too big. return true } // god, tar is so annoying // if the string is small enough, you should put a space // between the octal string and the \0, but if it doesn't // fit, then don't. var numStr = Math.floor(num).toString(8) if (num < MAXNUM[writeLen - 1]) numStr += " " // pad with "0" chars if (numStr.length < writeLen) { numStr = (new Array(writeLen - numStr.length).join("0")) + numStr } if (numStr.length !== writeLen - 1) { throw new Error("invalid length: " + JSON.stringify(numStr) + "\n" + "expected: "+writeLen) } block.write(numStr, off, writeLen, "utf8") block[end - 1] = 0 } function write256 (block, off, end, num) { var buf = block.slice(off, end) var positive = num >= 0 buf[0] = positive ? 0x80 : 0xFF // get the number as a base-256 tuple if (!positive) num *= -1 var tuple = [] do { var n = num % 256 tuple.push(n) num = (num - n) / 256 } while (num) var bytes = tuple.length var fill = buf.length - bytes for (var i = 1; i < fill; i ++) { buf[i] = positive ? 0 : 0xFF } // tuple is a base256 number, with [0] as the *least* significant byte // if it's negative, then we need to flip all the bits once we hit the // first non-zero bit. The 2's-complement is (0x100 - n), and the 1's- // complement is (0xFF - n). var zero = true for (i = bytes; i > 0; i --) { var byte = tuple[bytes - i] if (positive) buf[fill + i] = byte else if (zero && byte === 0) buf[fill + i] = 0 else if (zero) { zero = false buf[fill + i] = 0x100 - byte } else buf[fill + i] = 0xFF - byte } } function writeText (block, off, end, str) { // strings are written as utf8, then padded with \0 var strLen = Buffer.byteLength(str) , writeLen = Math.min(strLen, end - off) // non-ascii fields need extended headers // long fields get truncated , needExtended = strLen !== str.length || strLen > writeLen // write the string, and null-pad if (writeLen > 0) block.write(str, off, writeLen, "utf8") for (var i = off + writeLen; i < end; i ++) block[i] = 0 return needExtended } function calcSum (block) { block = block || this.block assert(Buffer.isBuffer(block) && block.length === 512) if (!block) throw new Error("Need block to checksum") // now figure out what it would be if the cksum was " " var sum = 0 , start = fieldOffs[fields.cksum] , end = fieldEnds[fields.cksum] for (var i = 0; i < fieldOffs[fields.cksum]; i ++) { sum += block[i] } for (var i = start; i < end; i ++) { sum += space } for (var i = end; i < 512; i ++) { sum += block[i] } return sum } function checkSum (block) { var sum = calcSum.call(this, block) block = block || this.block var cksum = block.slice(fieldOffs[fields.cksum], fieldEnds[fields.cksum]) cksum = parseNumeric(cksum) return cksum === sum } function decode (block) { block = block || this.block assert(Buffer.isBuffer(block) && block.length === 512) this.block = block this.cksumValid = this.checkSum() var prefix = null // slice off each field. for (var f = 0; fields[f] !== null; f ++) { var field = fields[f] , val = block.slice(fieldOffs[f], fieldEnds[f]) switch (field) { case "ustar": // if not ustar, then everything after that is just padding. 
if (val.toString() !== "ustar\0") { this.ustar = false return } else { // console.error("ustar:", val, val.toString()) this.ustar = val.toString() } break // prefix is special, since it might signal the xstar header case "prefix": var atime = parseNumeric(val.slice(131, 131 + 12)) , ctime = parseNumeric(val.slice(131 + 12, 131 + 12 + 12)) if ((val[130] === 0 || val[130] === space) && typeof atime === "number" && typeof ctime === "number" && val[131 + 12] === space && val[131 + 12 + 12] === space) { this.atime = atime this.ctime = ctime val = val.slice(0, 130) } prefix = val.toString("utf8").replace(/\0+$/, "") // console.error("%% header reading prefix", prefix) break // all other fields are null-padding text // or a number. default: if (numeric[field]) { this[field] = parseNumeric(val) } else { this[field] = val.toString("utf8").replace(/\0+$/, "") } break } } // if we got a prefix, then prepend it to the path. if (prefix) { this.path = prefix + "/" + this.path // console.error("%% header got a prefix", this.path) } } function parse256 (buf) { // first byte MUST be either 80 or FF // 80 for positive, FF for 2's comp var positive if (buf[0] === 0x80) positive = true else if (buf[0] === 0xFF) positive = false else return null // build up a base-256 tuple from the least sig to the highest var zero = false , tuple = [] for (var i = buf.length - 1; i > 0; i --) { var byte = buf[i] if (positive) tuple.push(byte) else if (zero && byte === 0) tuple.push(0) else if (zero) { zero = false tuple.push(0x100 - byte) } else tuple.push(0xFF - byte) } for (var sum = 0, i = 0, l = tuple.length; i < l; i ++) { sum += tuple[i] * Math.pow(256, i) } return positive ? sum : -1 * sum } function parseNumeric (f) { if (f[0] & 0x80) return parse256(f) var str = f.toString("utf8").split("\0")[0].trim() , res = parseInt(str, 8) return isNaN(res) ? null : res } npm_3.5.2.orig/node_modules/tar/lib/pack.js0000644000000000000000000001317712631326456016770 0ustar 00000000000000// pipe in an fstream, and it'll make a tarball. // key-value pair argument is global extended header props. 
module.exports = Pack var EntryWriter = require("./entry-writer.js") , Stream = require("stream").Stream , path = require("path") , inherits = require("inherits") , GlobalHeaderWriter = require("./global-header-writer.js") , collect = require("fstream").collect , eof = new Buffer(512) for (var i = 0; i < 512; i ++) eof[i] = 0 inherits(Pack, Stream) function Pack (props) { // console.error("-- p ctor") var me = this if (!(me instanceof Pack)) return new Pack(props) if (props) me._noProprietary = props.noProprietary else me._noProprietary = false me._global = props me.readable = true me.writable = true me._buffer = [] // console.error("-- -- set current to null in ctor") me._currentEntry = null me._processing = false me._pipeRoot = null me.on("pipe", function (src) { if (src.root === me._pipeRoot) return me._pipeRoot = src src.on("end", function () { me._pipeRoot = null }) me.add(src) }) } Pack.prototype.addGlobal = function (props) { // console.error("-- p addGlobal") if (this._didGlobal) return this._didGlobal = true var me = this GlobalHeaderWriter(props) .on("data", function (c) { me.emit("data", c) }) .end() } Pack.prototype.add = function (stream) { if (this._global && !this._didGlobal) this.addGlobal(this._global) if (this._ended) return this.emit("error", new Error("add after end")) collect(stream) this._buffer.push(stream) this._process() this._needDrain = this._buffer.length > 0 return !this._needDrain } Pack.prototype.pause = function () { this._paused = true if (this._currentEntry) this._currentEntry.pause() this.emit("pause") } Pack.prototype.resume = function () { this._paused = false if (this._currentEntry) this._currentEntry.resume() this.emit("resume") this._process() } Pack.prototype.end = function () { this._ended = true this._buffer.push(eof) this._process() } Pack.prototype._process = function () { var me = this if (me._paused || me._processing) { return } var entry = me._buffer.shift() if (!entry) { if (me._needDrain) { me.emit("drain") } return } if (entry.ready === false) { // console.error("-- entry is not ready", entry) me._buffer.unshift(entry) entry.on("ready", function () { // console.error("-- -- ready!", entry) me._process() }) return } me._processing = true if (entry === eof) { // need 2 ending null blocks. me.emit("data", eof) me.emit("data", eof) me.emit("end") me.emit("close") return } // Change the path to be relative to the root dir that was // added to the tarball. // // XXX This should be more like how -C works, so you can // explicitly set a root dir, and also explicitly set a pathname // in the tarball to use. That way we can skip a lot of extra // work when resolving symlinks for bundled dependencies in npm. var root = path.dirname((entry.root || entry).path); if (me._global && me._global.fromBase && entry.root && entry.root.path) { // user set 'fromBase: true' indicating tar root should be directory itself root = entry.root.path; } var wprops = {} Object.keys(entry.props || {}).forEach(function (k) { wprops[k] = entry.props[k] }) if (me._noProprietary) wprops.noProprietary = true wprops.path = path.relative(root, entry.path || '') // actually not a matter of opinion or taste. 
if (process.platform === "win32") { wprops.path = wprops.path.replace(/\\/g, "/") } if (!wprops.type) wprops.type = 'Directory' switch (wprops.type) { // sockets not supported case "Socket": return case "Directory": wprops.path += "/" wprops.size = 0 break case "Link": var lp = path.resolve(path.dirname(entry.path), entry.linkpath) wprops.linkpath = path.relative(root, lp) || "." wprops.size = 0 break case "SymbolicLink": var lp = path.resolve(path.dirname(entry.path), entry.linkpath) wprops.linkpath = path.relative(path.dirname(entry.path), lp) || "." wprops.size = 0 break } // console.error("-- new writer", wprops) // if (!wprops.type) { // // console.error("-- no type?", entry.constructor.name, entry) // } // console.error("-- -- set current to new writer", wprops.path) var writer = me._currentEntry = EntryWriter(wprops) writer.parent = me // writer.on("end", function () { // // console.error("-- -- writer end", writer.path) // }) writer.on("data", function (c) { me.emit("data", c) }) writer.on("header", function () { Buffer.prototype.toJSON = function () { return this.toString().split(/\0/).join(".") } // console.error("-- -- writer header %j", writer.props) if (writer.props.size === 0) nextEntry() }) writer.on("close", nextEntry) var ended = false function nextEntry () { if (ended) return ended = true // console.error("-- -- writer close", writer.path) // console.error("-- -- set current to null", wprops.path) me._currentEntry = null me._processing = false me._process() } writer.on("error", function (er) { // console.error("-- -- writer error", writer.path) me.emit("error", er) }) // if it's the root, then there's no need to add its entries, // or data, since they'll be added directly. if (entry === me._pipeRoot) { // console.error("-- is the root, don't auto-add") writer.add = null } entry.pipe(writer) } Pack.prototype.destroy = function () {} Pack.prototype.write = function () {} npm_3.5.2.orig/node_modules/tar/lib/parse.js0000644000000000000000000001545212631326456017162 0ustar 00000000000000 // A writable stream. // It emits "entry" events, which provide a readable stream that has // header info attached. module.exports = Parse.create = Parse var stream = require("stream") , Stream = stream.Stream , BlockStream = require("block-stream") , tar = require("../tar.js") , TarHeader = require("./header.js") , Entry = require("./entry.js") , BufferEntry = require("./buffer-entry.js") , ExtendedHeader = require("./extended-header.js") , assert = require("assert").ok , inherits = require("inherits") , fstream = require("fstream") // reading a tar is a lot like reading a directory // However, we're actually not going to run the ctor, // since it does a stat and various other stuff. // This inheritance gives us the pause/resume/pipe // behavior that is desired. inherits(Parse, fstream.Reader) function Parse () { var me = this if (!(me instanceof Parse)) return new Parse() // doesn't apply fstream.Reader ctor? 
// no, because we don't want to stat/etc, we just
// want to get the entry/add logic from .pipe()
Stream.apply(me)

me.writable = true
me.readable = true
me._stream = new BlockStream(512)
me.position = 0
me._ended = false

me._stream.on("error", function (e) {
  me.emit("error", e)
})

me._stream.on("data", function (c) {
  me._process(c)
})

me._stream.on("end", function () {
  me._streamEnd()
})

me._stream.on("drain", function () {
  me.emit("drain")
})
}

// overridden in Extract class, since it needs to
// wait for its DirWriter part to finish before
// emitting "end"
Parse.prototype._streamEnd = function () {
  var me = this
  if (!me._ended || me._entry) me.error("unexpected eof")
  me.emit("end")
}

// a tar reader is actually a filter, not just a readable stream.
// So, you should pipe a tarball stream into it, and it needs these
// write/end methods to do that.
Parse.prototype.write = function (c) {
  if (this._ended) {
    // gnutar puts a LOT of nulls at the end.
    // you can keep writing these things forever.
    // Just ignore them.
    for (var i = 0, l = c.length; i < l; i ++) {
      if (c[i] !== 0) return this.error("write() after end()")
    }
    return
  }
  return this._stream.write(c)
}

Parse.prototype.end = function (c) {
  this._ended = true
  return this._stream.end(c)
}

// don't need to do anything, since we're just
// proxying the data up from the _stream.
// Just need to override the parent's "Not Implemented"
// error-thrower.
Parse.prototype._read = function () {}

Parse.prototype._process = function (c) {
  assert(c && c.length === 512, "block size should be 512")

  // one of three cases.
  // 1. A new header
  // 2. A part of a file/extended header
  // 3. One of two or more EOF null blocks

  if (this._entry) {
    var entry = this._entry
    if (!entry._abort) entry.write(c)
    else {
      entry._remaining -= c.length
      if (entry._remaining < 0) entry._remaining = 0
    }
    if (entry._remaining === 0) {
      entry.end()
      this._entry = null
    }
  } else {
    // either zeroes or a header
    var zero = true
    for (var i = 0; i < 512 && zero; i ++) {
      zero = c[i] === 0
    }

    // eof is *at least* 2 blocks of nulls, and then the end of the
    // file.  you can put blocks of nulls between entries anywhere,
    // so appending one tarball to another is technically valid.
    // ending without the eof null blocks is not allowed, however.
    if (zero) {
      if (this._eofStarted) this._ended = true
      this._eofStarted = true
    } else {
      this._eofStarted = false
      this._startEntry(c)
    }
  }

  this.position += 512
}

// take a header chunk, start the right kind of entry.
Parse.prototype._startEntry = function (c) {
  var header = new TarHeader(c)
    , self = this
    , entry
    , ev
    , EntryType
    , onend
    , meta = false

  if (null === header.size || !header.cksumValid) {
    var e = new Error("invalid tar file")
    e.header = header
    e.tar_file_offset = this.position
    e.tar_block = this.position / 512
    return this.emit("error", e)
  }

  switch (tar.types[header.type]) {
    case "File":
    case "OldFile":
    case "Link":
    case "SymbolicLink":
    case "CharacterDevice":
    case "BlockDevice":
    case "Directory":
    case "FIFO":
    case "ContiguousFile":
    case "GNUDumpDir":
      // start a file.
      // pass in any extended headers
      // These ones consumers are typically most interested in.
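      // e.g., a minimal consumption sketch (an illustrative assumption of
      // how callers typically wire this up, not taken from this file):
      //
      //   fs.createReadStream("archive.tar")
      //     .pipe(tar.Parse())
      //     .on("entry", function (entry) {
      //       console.log(entry.props.path, entry.props.size)
      //       entry.on("data", function (chunk) { /* file contents */ })
      //     })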
EntryType = Entry ev = "entry" break case "GlobalExtendedHeader": // extended headers that apply to the rest of the tarball EntryType = ExtendedHeader onend = function () { self._global = self._global || {} Object.keys(entry.fields).forEach(function (k) { self._global[k] = entry.fields[k] }) } ev = "globalExtendedHeader" meta = true break case "ExtendedHeader": case "OldExtendedHeader": // extended headers that apply to the next entry EntryType = ExtendedHeader onend = function () { self._extended = entry.fields } ev = "extendedHeader" meta = true break case "NextFileHasLongLinkpath": // set linkpath= in extended header EntryType = BufferEntry onend = function () { self._extended = self._extended || {} self._extended.linkpath = entry.body } ev = "longLinkpath" meta = true break case "NextFileHasLongPath": case "OldGnuLongPath": // set path= in file-extended header EntryType = BufferEntry onend = function () { self._extended = self._extended || {} self._extended.path = entry.body } ev = "longPath" meta = true break default: // all the rest we skip, but still set the _entry // member, so that we can skip over their data appropriately. // emit an event to say that this is an ignored entry type? EntryType = Entry ev = "ignoredEntry" break } var global, extended if (meta) { global = extended = null } else { var global = this._global var extended = this._extended // extendedHeader only applies to one entry, so once we start // an entry, it's over. this._extended = null } entry = new EntryType(header, extended, global) entry.meta = meta // only proxy data events of normal files. if (!meta) { entry.on("data", function (c) { me.emit("data", c) }) } if (onend) entry.on("end", onend) this._entry = entry var me = this entry.on("pause", function () { me.pause() }) entry.on("resume", function () { me.resume() }) if (this.listeners("*").length) { this.emit("*", ev, entry) } this.emit(ev, entry) // Zero-byte entry. End immediately. if (entry.props.size === 0) { entry.end() this._entry = null } } npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/0000755000000000000000000000000012631326456021775 5ustar 00000000000000npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/LICENCE0000644000000000000000000000244612631326456022770 0ustar 00000000000000Copyright (c) Isaac Z. Schlueter All rights reserved. The BSD License Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE NETBSD FOUNDATION, INC. AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE FOUNDATION OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/LICENSE0000644000000000000000000000137512631326456023010 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/README.md0000644000000000000000000000056112631326456023256 0ustar 00000000000000# block-stream A stream of blocks. Write data into it, and it'll output data in buffer blocks the size you specify, padding with zeroes if necessary. ```javascript var block = new BlockStream(512) fs.createReadStream("some-file").pipe(block) block.pipe(fs.createWriteStream("block-file")) ``` When `.end()` or `.flush()` is called, it'll pad the block with zeroes. npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/bench/0000755000000000000000000000000012631326456023054 5ustar 00000000000000npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/block-stream.js0000644000000000000000000001463312631326456024725 0ustar 00000000000000// write data to it, and it'll emit data in 512 byte blocks. // if you .end() or .flush(), it'll emit whatever it's got, // padded with nulls to 512 bytes. module.exports = BlockStream var Stream = require("stream").Stream , inherits = require("inherits") , assert = require("assert").ok , debug = process.env.DEBUG ? 
console.error : function () {} function BlockStream (size, opt) { this.writable = this.readable = true this._opt = opt || {} this._chunkSize = size || 512 this._offset = 0 this._buffer = [] this._bufferLength = 0 if (this._opt.nopad) this._zeroes = false else { this._zeroes = new Buffer(this._chunkSize) for (var i = 0; i < this._chunkSize; i ++) { this._zeroes[i] = 0 } } } inherits(BlockStream, Stream) BlockStream.prototype.write = function (c) { // debug(" BS write", c) if (this._ended) throw new Error("BlockStream: write after end") if (c && !Buffer.isBuffer(c)) c = new Buffer(c + "") if (c.length) { this._buffer.push(c) this._bufferLength += c.length } // debug("pushed onto buffer", this._bufferLength) if (this._bufferLength >= this._chunkSize) { if (this._paused) { // debug(" BS paused, return false, need drain") this._needDrain = true return false } this._emitChunk() } return true } BlockStream.prototype.pause = function () { // debug(" BS pausing") this._paused = true } BlockStream.prototype.resume = function () { // debug(" BS resume") this._paused = false return this._emitChunk() } BlockStream.prototype.end = function (chunk) { // debug("end", chunk) if (typeof chunk === "function") cb = chunk, chunk = null if (chunk) this.write(chunk) this._ended = true this.flush() } BlockStream.prototype.flush = function () { this._emitChunk(true) } BlockStream.prototype._emitChunk = function (flush) { // debug("emitChunk flush=%j emitting=%j paused=%j", flush, this._emitting, this._paused) // emit a chunk if (flush && this._zeroes) { // debug(" BS push zeroes", this._bufferLength) // push a chunk of zeroes var padBytes = (this._bufferLength % this._chunkSize) if (padBytes !== 0) padBytes = this._chunkSize - padBytes if (padBytes > 0) { // debug("padBytes", padBytes, this._zeroes.slice(0, padBytes)) this._buffer.push(this._zeroes.slice(0, padBytes)) this._bufferLength += padBytes // debug(this._buffer[this._buffer.length - 1].length, this._bufferLength) } } if (this._emitting || this._paused) return this._emitting = true // debug(" BS entering loops") var bufferIndex = 0 while (this._bufferLength >= this._chunkSize && (flush || !this._paused)) { // debug(" BS data emission loop", this._bufferLength) var out , outOffset = 0 , outHas = this._chunkSize while (outHas > 0 && (flush || !this._paused) ) { // debug(" BS data inner emit loop", this._bufferLength) var cur = this._buffer[bufferIndex] , curHas = cur.length - this._offset // debug("cur=", cur) // debug("curHas=%j", curHas) // If it's not big enough to fill the whole thing, then we'll need // to copy multiple buffers into one. However, if it is big enough, // then just slice out the part we want, to save unnecessary copying. // Also, need to copy if we've already done some copying, since buffers // can't be joined like cons strings. if (out || curHas < outHas) { out = out || new Buffer(this._chunkSize) cur.copy(out, outOffset, this._offset, this._offset + Math.min(curHas, outHas)) } else if (cur.length === outHas && this._offset === 0) { // shortcut -- cur is exactly long enough, and no offset. out = cur } else { // slice out the piece of cur that we need. out = cur.slice(this._offset, this._offset + outHas) } if (curHas > outHas) { // means that the current buffer couldn't be completely output // update this._offset to reflect how much WAS written this._offset += outHas outHas = 0 } else { // output the entire current chunk. 
// toss it away
        outHas -= curHas
        outOffset += curHas
        bufferIndex ++
        this._offset = 0
      }
    }
    this._bufferLength -= this._chunkSize
    assert(out.length === this._chunkSize)
    // debug("emitting data", out)
    // debug("   BS emitting, paused=%j", this._paused, this._bufferLength)
    this.emit("data", out)
    out = null
  }
  // debug("   BS out of loops", this._bufferLength)

  // whatever is left, it's not enough to fill up a block, or we're paused
  this._buffer = this._buffer.slice(bufferIndex)
  if (this._paused) {
    // debug("   BS paused, leaving", this._bufferLength)
    this._needDrain = true
    this._emitting = false
    return
  }

  // if flushing, and not using null-padding, then need to emit the last
  // chunk(s) sitting in the queue.  We know that it's not enough to
  // fill up a whole block, because otherwise it would have been emitted
  // above, but there may be some offset.
  var l = this._buffer.length
  if (flush && !this._zeroes && l) {
    if (l === 1) {
      if (this._offset) {
        this.emit("data", this._buffer[0].slice(this._offset))
      } else {
        this.emit("data", this._buffer[0])
      }
    } else {
      var outHas = this._bufferLength
        , out = new Buffer(outHas)
        , outOffset = 0
      for (var i = 0; i < l; i ++) {
        var cur = this._buffer[i]
          , curHas = cur.length - this._offset
        cur.copy(out, outOffset, this._offset)
        this._offset = 0
        outOffset += curHas
        this._bufferLength -= curHas
      }
      this.emit("data", out)
    }
    // truncate
    this._buffer.length = 0
    this._bufferLength = 0
    this._offset = 0
  }

  // now either drained or ended
  // debug("either draining, or ended", this._bufferLength, this._ended)

  // means that we've flushed out all that we can so far.
  if (this._needDrain) {
    // debug("emitting drain", this._bufferLength)
    this._needDrain = false
    this.emit("drain")
  }

  if ((this._bufferLength === 0) && this._ended && !this._endEmitted) {
    // debug("emitting end", this._bufferLength)
    this._endEmitted = true
    this.emit("end")
  }

  this._emitting = false
  // debug("   BS no longer emitting", flush, this._paused, this._emitting, this._bufferLength, this._chunkSize)
}
npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/package.json0000644000000000000000000000252612631326456024270 0ustar 00000000000000{
  "author": {
    "name": "Isaac Z. Schlueter",
    "email": "i@izs.me",
    "url": "http://blog.izs.me/"
  },
  "name": "block-stream",
  "description": "a stream of blocks",
  "version": "0.0.8",
  "repository": {
    "type": "git",
    "url": "git://github.com/isaacs/block-stream.git"
  },
  "engines": {
    "node": "0.4 || >=0.5.8"
  },
  "main": "block-stream.js",
  "dependencies": {
    "inherits": "~2.0.0"
  },
  "devDependencies": {
    "tap": "0.x"
  },
  "scripts": {
    "test": "tap test/"
  },
  "license": "ISC",
  "gitHead": "b35520314f4763af0788d65a846bb43d9c0a8f02",
  "bugs": {
    "url": "https://github.com/isaacs/block-stream/issues"
  },
  "homepage": "https://github.com/isaacs/block-stream#readme",
  "_id": "block-stream@0.0.8",
  "_shasum": "0688f46da2bbf9cff0c4f68225a0cb95cbe8a46b",
  "_from": "block-stream@*",
  "_npmVersion": "2.10.0",
  "_nodeVersion": "2.0.1",
  "_npmUser": {
    "name": "isaacs",
    "email": "isaacs@npmjs.com"
  },
  "dist": {
    "shasum": "0688f46da2bbf9cff0c4f68225a0cb95cbe8a46b",
    "tarball": "http://registry.npmjs.org/block-stream/-/block-stream-0.0.8.tgz"
  },
  "maintainers": [
    {
      "name": "isaacs",
      "email": "i@izs.me"
    }
  ],
  "directories": {},
  "_resolved": "https://registry.npmjs.org/block-stream/-/block-stream-0.0.8.tgz",
  "readme": "ERROR: No README data found!"
} npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/test/0000755000000000000000000000000012631326456022754 5ustar 00000000000000npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/bench/block-stream-pause.js0000644000000000000000000000403112631326456027106 0ustar 00000000000000var BlockStream = require("../block-stream.js") var blockSizes = [16, 25, 1024] , writeSizes = [4, 8, 15, 16, 17, 64, 100] , writeCounts = [1, 10, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize, {nopad: true }) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 f.on("data", function (c) { timeouts ++ actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation f.pause() setTimeout(function () { timeouts -- var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } f.resume() }, 100) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = writeSize * writeCount * 2 t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 100) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/bench/block-stream.js0000644000000000000000000000376612631326456026011 0ustar 00000000000000var BlockStream = require("../block-stream.js") var blockSizes = [16, 25, 1024] , writeSizes = [4, 8, 15, 16, 17, 64, 100] , writeCounts = [1, 10, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize, {nopad: true }) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 f.on("data", function (c) { timeouts ++ actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation setTimeout(function () { timeouts -- var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. 
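// (illustrative note: overwriting the chunk after the assertions is a
// leak check -- if BlockStream were reusing or re-emitting this buffer,
// a later chunk would show the "x" bytes written below.)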
for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } }, 100) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = writeSize * writeCount * 2 t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 100) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/bench/dropper-pause.js0000644000000000000000000000401612631326456026201 0ustar 00000000000000var BlockStream = require("dropper") var blockSizes = [16, 25, 1024] , writeSizes = [4, 8, 15, 16, 17, 64, 100] , writeCounts = [1, 10, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize, {nopad: true }) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 f.on("data", function (c) { timeouts ++ actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation f.pause() setTimeout(function () { timeouts -- var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. 
for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } f.resume() }, 100) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = writeSize * writeCount * 2 t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 100) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/bench/dropper.js0000644000000000000000000000375312631326456025075 0ustar 00000000000000var BlockStream = require("dropper") var blockSizes = [16, 25, 1024] , writeSizes = [4, 8, 15, 16, 17, 64, 100] , writeCounts = [1, 10, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize, {nopad: true }) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 f.on("data", function (c) { timeouts ++ actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation setTimeout(function () { timeouts -- var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. 
for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } }, 100) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = writeSize * writeCount * 2 t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 100) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/test/basic.js0000644000000000000000000000133212631326456024372 0ustar 00000000000000var tap = require("tap") , BlockStream = require("../block-stream.js") tap.test("basic test", function (t) { var b = new BlockStream(16) var fs = require("fs") var fstr = fs.createReadStream(__filename, {encoding: "utf8"}) fstr.pipe(b) var stat t.doesNotThrow(function () { stat = fs.statSync(__filename) }, "stat should not throw") var totalBytes = 0 b.on("data", function (c) { t.equal(c.length, 16, "chunks should be 16 bytes long") t.type(c, Buffer, "chunks should be buffer objects") totalBytes += c.length }) b.on("end", function () { var expectedBytes = stat.size + (16 - stat.size % 16) t.equal(totalBytes, expectedBytes, "Should be multiple of 16") t.end() }) }) npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/test/nopad-thorough.js0000644000000000000000000000400412631326456026246 0ustar 00000000000000var BlockStream = require("../block-stream.js") var blockSizes = [16]//, 25]//, 1024] , writeSizes = [4, 15, 16, 17, 64 ]//, 64, 100] , writeCounts = [1, 10]//, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize, {nopad: true }) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 f.on("data", function (c) { timeouts ++ actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation setTimeout(function () { timeouts -- var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. 
for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } }, 100) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = writeSize * writeCount * 2 t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 100) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/test/nopad.js0000644000000000000000000000225612631326456024420 0ustar 00000000000000var BlockStream = require("../") var tap = require("tap") tap.test("don't pad, small writes", function (t) { var f = new BlockStream(16, { nopad: true }) t.plan(1) f.on("data", function (c) { t.equal(c.toString(), "abc", "should get 'abc'") }) f.on("end", function () { t.end() }) f.write(new Buffer("a")) f.write(new Buffer("b")) f.write(new Buffer("c")) f.end() }) tap.test("don't pad, exact write", function (t) { var f = new BlockStream(16, { nopad: true }) t.plan(1) var first = true f.on("data", function (c) { if (first) { first = false t.equal(c.toString(), "abcdefghijklmnop", "first chunk") } else { t.fail("should only get one") } }) f.on("end", function () { t.end() }) f.end(new Buffer("abcdefghijklmnop")) }) tap.test("don't pad, big write", function (t) { var f = new BlockStream(16, { nopad: true }) t.plan(2) var first = true f.on("data", function (c) { if (first) { first = false t.equal(c.toString(), "abcdefghijklmnop", "first chunk") } else { t.equal(c.toString(), "q") } }) f.on("end", function () { t.end() }) f.end(new Buffer("abcdefghijklmnopq")) }) npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/test/pause-resume.js0000644000000000000000000000416412631326456025732 0ustar 00000000000000var BlockStream = require("../block-stream.js") var blockSizes = [16] , writeSizes = [15, 16, 17] , writeCounts = [1, 10]//, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 var paused = false f.on("data", function (c) { timeouts ++ t.notOk(paused, "should not be paused when emitting data") actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation paused = true f.pause() process.nextTick(function () { var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. 
for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } paused = false f.resume() timeouts -- }) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = expectChunks * blockSize t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 200) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/test/thorough.js0000644000000000000000000000376212631326456025161 0ustar 00000000000000var BlockStream = require("../block-stream.js") var blockSizes = [16]//, 25]//, 1024] , writeSizes = [4, 15, 16, 17, 64 ]//, 64, 100] , writeCounts = [1, 10]//, 100] , tap = require("tap") writeCounts.forEach(function (writeCount) { blockSizes.forEach(function (blockSize) { writeSizes.forEach(function (writeSize) { tap.test("writeSize=" + writeSize + " blockSize="+blockSize + " writeCount="+writeCount, function (t) { var f = new BlockStream(blockSize) var actualChunks = 0 var actualBytes = 0 var timeouts = 0 f.on("data", function (c) { timeouts ++ actualChunks ++ actualBytes += c.length // make sure that no data gets corrupted, and basic sanity var before = c.toString() // simulate a slow write operation setTimeout(function () { timeouts -- var after = c.toString() t.equal(after, before, "should not change data") // now corrupt it, to find leaks. for (var i = 0; i < c.length; i ++) { c[i] = "x".charCodeAt(0) } }, 100) }) f.on("end", function () { // round up to the nearest block size var expectChunks = Math.ceil(writeSize * writeCount * 2 / blockSize) var expectBytes = expectChunks * blockSize t.equal(actualBytes, expectBytes, "bytes=" + expectBytes + " writeSize=" + writeSize) t.equal(actualChunks, expectChunks, "chunks=" + expectChunks + " writeSize=" + writeSize) // wait for all the timeout checks to finish, then end the test setTimeout(function WAIT () { if (timeouts > 0) return setTimeout(WAIT) t.end() }, 100) }) for (var i = 0; i < writeCount; i ++) { var a = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) a[j] = "a".charCodeAt(0) var b = new Buffer(writeSize); for (var j = 0; j < writeSize; j ++) b[j] = "b".charCodeAt(0) f.write(a) f.write(b) } f.end() }) }) }) }) npm_3.5.2.orig/node_modules/tar/node_modules/block-stream/test/two-stream.js0000644000000000000000000000307712631326456025423 0ustar 00000000000000var log = console.log, assert = require( 'assert' ), BlockStream = require("../block-stream.js"), isize = 0, tsize = 0, fsize = 0, psize = 0, i = 0, filter = null, paper = null, stack = null, // a source data buffer tsize = 1 * 1024; // <- 1K stack = new Buffer( tsize ); for ( ; i < tsize; i++) stack[i] = "x".charCodeAt(0); isize = 1 * 1024; // <- initial packet size with 4K no bug! 
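// (a clarifying note on the setup above and below: the test feeds tsize =
// 1 KiB of "x" bytes through a 2 KiB nopad block-stream ("filter") piped
// into a ~171-byte one ("paper"), then asserts that each stage emitted
// the fexpected/pexpected chunk counts.)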
fsize = 2 * 1024 ; // <- first block-stream size
psize = Math.ceil( isize / 6 ); // <- second block-stream size

fexpected = Math.ceil( tsize / fsize ); // <- packets expected for first
pexpected = Math.ceil( tsize / psize ); // <- packets expected for second

filter = new BlockStream( fsize, { nopad : true } );
paper = new BlockStream( psize, { nopad : true } );

var fcounter = 0;
filter.on( 'data', function (c) {
  // verify that they're not null-padded
  for (var i = 0; i < c.length; i ++) {
    assert.strictEqual(c[i], "x".charCodeAt(0))
  }
  ++fcounter;
} );

var pcounter = 0;
paper.on( 'data', function (c) {
  // verify that they're not null-padded
  for (var i = 0; i < c.length; i ++) {
    assert.strictEqual(c[i], "x".charCodeAt(0))
  }
  ++pcounter;
} );

filter.pipe( paper );

filter.on( 'end', function () {
  log("fcounter: %s === %s", fcounter, fexpected)
  assert.strictEqual( fcounter, fexpected );
} );

paper.on( 'end', function () {
  log("pcounter: %s === %s", pcounter, pexpected);
  assert.strictEqual( pcounter, pexpected );
} );

for ( i = 0, j = isize; j <= tsize; j += isize ) {
  filter.write( stack.slice( j - isize, j ) );
}

filter.end();
npm_3.5.2.orig/node_modules/tar/test/00-setup-fixtures.js0000644000000000000000000000277112631326456021505 0ustar 00000000000000// the fixtures have some weird stuff that is painful
// to include directly in the repo for various reasons.
//
// So, unpack the fixtures with the system tar first.
//
// This means, of course, that it'll only work if you
// already have a tar implementation, and some of them
// will not properly unpack the fixtures anyway.
//
// But, since usually those tests will fail on Windows
// and other systems with less capable filesystems anyway,
// at least this way we don't cause inconveniences by
// merely cloning the repo or installing the package.

var tap = require("tap")
, child_process = require("child_process")
, rimraf = require("rimraf")
, test = tap.test
, path = require("path")

test("clean fixtures", function (t) {
  rimraf(path.resolve(__dirname, "fixtures"), function (er) {
    t.ifError(er, "rimraf ./fixtures/")
    t.end()
  })
})

test("clean tmp", function (t) {
  rimraf(path.resolve(__dirname, "tmp"), function (er) {
    t.ifError(er, "rimraf ./tmp/")
    t.end()
  })
})

test("extract fixtures", function (t) {
  var c = child_process.spawn("tar"
         ,["xzvf", "fixtures.tgz"]
         ,{ cwd: __dirname })

  c.stdout.on("data", errwrite)
  c.stderr.on("data", errwrite)
  function errwrite (chunk) {
    process.stderr.write(chunk)
  }

  c.on("exit", function (code) {
    t.equal(code, 0, "extract fixtures should exit with 0")
    if (code) {
      t.comment("Note, all tests from here on out will fail because of this.")
    }
    t.end()
  })
})
npm_3.5.2.orig/node_modules/tar/test/cb-never-called-1.0.1.tgz0000644000000000000000000001000012631326456022011 0ustar 00000000000000[4096 bytes of binary gzip-compressed fixture data omitted -- not representable as text]
npm_3.5.2.orig/node_modules/tar/test/header.js0000644000000000000000000002422512631326456017507 0ustar 00000000000000var tap = require("tap")
var TarHeader = require("../lib/header.js")
var tar = require("../tar.js")
var fs = require("fs")

var headers =
  { "a.txt file header":
    [ "612e7478740000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000030303036343420003035373736312000303030303032342000303030303030303034303120313136353133363033333320303132343531002030000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000757374617200303069736161637300000000000000000000000000000000000000000000000000007374616666000000000000000000000000000000000000000000000000000000303030303030200030303030303020000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000"
    , { cksumValid: true
      , path: 'a.txt'
      , mode: 420
      , uid: 24561
      , gid: 20
      , size: 257
      , mtime: 1319493851
      , cksum: 5417
      , type: '0'
      , linkpath: ''
      , ustar: 'ustar\0'
      , ustarver: '00'
      , uname: 'isaacs'
      , gname: 'staff'
      , devmaj: 0
      , devmin: 0
      , fill: '' }
    ]

  , "omega pax": // the extended header from omega tar.
[ "5061784865616465722fcea92e74787400000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000030303036343420003035373736312000303030303234200030303030303030303137302031313534333731303631312030313530353100207800000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000757374617200303069736161637300000000000000000000000000000000000000000000000000007374616666000000000000000000000000000000000000000000000000000000303030303030200030303030303020000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000" , { cksumValid: true , path: 'PaxHeader/Ω.txt' , mode: 420 , uid: 24561 , gid: 20 , size: 120 , mtime: 1301254537 , cksum: 6697 , type: 'x' , linkpath: '' , ustar: 'ustar\0' , ustarver: '00' , uname: 'isaacs' , gname: 'staff' , devmaj: 0 , devmin: 0 , fill: '' } ] , "omega file header": [ "cea92e7478740000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000030303036343420003035373736312000303030303234200030303030303030303030322031313534333731303631312030313330373200203000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000757374617200303069736161637300000000000000000000000000000000000000000000000000007374616666000000000000000000000000000000000000000000000000000000303030303030200030303030303020000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000" , { cksumValid: true , path: 'Ω.txt' , mode: 420 , uid: 24561 , gid: 20 , size: 2 , mtime: 1301254537 , cksum: 5690 , type: '0' , linkpath: '' , ustar: 'ustar\0' , ustarver: '00' , uname: 'isaacs' , gname: 'staff' , devmaj: 0 , devmin: 0 , fill: '' } ] , "foo.js file header": [ 
"666f6f2e6a730000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000030303036343420003035373736312000303030303234200030303030303030303030342031313534333637303734312030313236313700203000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000757374617200303069736161637300000000000000000000000000000000000000000000000000007374616666000000000000000000000000000000000000000000000000000000303030303030200030303030303020000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000" , { cksumValid: true , path: 'foo.js' , mode: 420 , uid: 24561 , gid: 20 , size: 4 , mtime: 1301246433 , cksum: 5519 , type: '0' , linkpath: '' , ustar: 'ustar\0' , ustarver: '00' , uname: 'isaacs' , gname: 'staff' , devmaj: 0 , devmin: 0 , fill: '' } ] , "b.txt file header": [ "622e747874000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000030303036343420003035373736312000303030303234200030303030303030313030302031313635313336303637372030313234363100203000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000757374617200303069736161637300000000000000000000000000000000000000000000000000007374616666000000000000000000000000000000000000000000000000000000303030303030200030303030303020000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000" , { cksumValid: true , path: 'b.txt' , mode: 420 , uid: 24561 , gid: 20 , size: 512 , mtime: 1319494079 , cksum: 5425 , type: '0' , linkpath: '' , ustar: 'ustar\0' , ustarver: '00' , uname: 'isaacs' , gname: 'staff' , devmaj: 0 , devmin: 0 , fill: '' } ] , "deep nested file": [ 
"636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363636363633030303634342000303537373631200030303030323420003030303030303030313434203131363532313531353333203034333331340020300000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000075737461720030306973616163730000000000000000000000000000000000000000000000000000737461666600000000000000000000000000000000000000000000000000000030303030303020003030303030302000722f652f612f6c2f6c2f792f2d2f642f652f652f702f2d2f662f6f2f6c2f642f652f722f2d2f702f612f742f680000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000" , { cksumValid: true, path: 'r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/a/t/h/cccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc' , mode: 420 , uid: 24561 , gid: 20 , size: 100 , mtime: 1319687003 , cksum: 18124 , type: '0' , linkpath: '' , ustar: 'ustar\0' , ustarver: '00' , uname: 'isaacs' , gname: 'staff' , devmaj: 0 , devmin: 0 , fill: '' } ] } tap.test("parsing", function (t) { Object.keys(headers).forEach(function (name) { var h = headers[name] , header = new Buffer(h[0], "hex") , expect = h[1] , parsed = new TarHeader(header) // console.error(parsed) t.has(parsed, expect, "parse " + name) }) t.end() }) tap.test("encoding", function (t) { Object.keys(headers).forEach(function (name) { var h = headers[name] , expect = new Buffer(h[0], "hex") , encoded = TarHeader.encode(h[1]) // might have slightly different bytes, since the standard // isn't very strict, but should have the same semantics // checkSum will be different, but cksumValid will be true var th = new TarHeader(encoded) delete h[1].block delete h[1].needExtended delete h[1].cksum t.has(th, h[1], "fields "+name) }) t.end() }) // test these manually. they're a bit rare to find in the wild tap.test("parseNumeric tests", function (t) { var parseNumeric = TarHeader.parseNumeric , numbers = { "303737373737373700": 2097151 , "30373737373737373737373700": 8589934591 , "303030303036343400": 420 , "800000ffffffffffff": 281474976710655 , "ffffff000000000001": -281474976710654 , "ffffff000000000000": -281474976710655 , "800000000000200000": 2097152 , "8000000000001544c5": 1393861 , "ffffffffffff1544c5": -15383354 } Object.keys(numbers).forEach(function (n) { var b = new Buffer(n, "hex") t.equal(parseNumeric(b), numbers[n], n + " === " + numbers[n]) }) t.end() }) npm_3.5.2.orig/node_modules/tar/test/pack-no-proprietary.js0000644000000000000000000004640012631326456022164 0ustar 00000000000000// This is exactly like test/pack.js, except that it's excluding // any proprietary headers. // // This loses some information about the filesystem, but creates // tarballs that are supported by more versions of tar, especially // old non-spec-compliant copies of gnutar. // the symlink file is excluded from git, because it makes // windows freak the hell out. 
var fs = require("fs") , path = require("path") , symlink = path.resolve(__dirname, "fixtures/symlink") try { fs.unlinkSync(symlink) } catch (e) {} fs.symlinkSync("./hardlink-1", symlink) process.on("exit", function () { fs.unlinkSync(symlink) }) var tap = require("tap") , tar = require("../tar.js") , pkg = require("../package.json") , Pack = tar.Pack , fstream = require("fstream") , Reader = fstream.Reader , Writer = fstream.Writer , input = path.resolve(__dirname, "fixtures/") , target = path.resolve(__dirname, "tmp/pack.tar") , uid = process.getuid ? process.getuid() : 0 , gid = process.getgid ? process.getgid() : 0 , entries = // the global header and root fixtures/ dir are going to get // a different date each time, so omit that bit. // Also, dev/ino values differ across machines, so that's not // included. [ [ 'entry', { path: 'fixtures/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'extendedHeader', { path: 'PaxHeader/fixtures/200cccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: uid, gid: gid, type: 'x', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' }, { path: 'fixtures/200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', uid: uid, gid: gid, size: 200 } ] , [ 'entry', { path: 'fixtures/200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: uid, gid: gid, size: 200, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/a.txt', mode: 420, uid: uid, gid: gid, size: 257, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/b.txt', mode: 420, uid: uid, gid: gid, size: 512, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/c.txt', mode: 420, uid: uid, gid: gid, size: 513, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/cc.txt', mode: 420, uid: uid, gid: gid, size: 513, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/dir/', mode: 488, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/dir/sub/', mode: 488, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/foo.js', mode: 420, uid: uid, gid: gid, size: 4, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/hardlink-1', mode: 420, uid: uid, gid: gid, size: 200, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, 
fill: '' } ] , [ 'entry', { path: 'fixtures/hardlink-2', mode: 420, uid: uid, gid: gid, size: 0, type: '1', linkpath: 'fixtures/hardlink-1', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/omega.txt', mode: 420, uid: uid, gid: gid, size: 2, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/packtest/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/packtest/omega.txt', mode: 420, uid: uid, gid: gid, size: 2, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/packtest/star.4.html', mode: 420, uid: uid, gid: gid, size: 54081, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'extendedHeader', { path: 'PaxHeader/fixtures/packtest/Ω.txt', mode: 420, uid: uid, gid: gid, type: 'x', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' }, { path: 'fixtures/packtest/Ω.txt', uid: uid, gid: gid, size: 2 } ] , [ 'entry', { path: 'fixtures/packtest/Ω.txt', mode: 420, uid: uid, gid: gid, size: 2, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: 
'00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/a/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/a/t/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/a/t/h/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/a/t/h/cccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: uid, gid: gid, size: 100, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/symlink', uid: uid, gid: gid, size: 0, type: '2', linkpath: 'hardlink-1', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 
0, fill: '' } ] , [ 'extendedHeader', { path: 'PaxHeader/fixtures/Ω.txt', mode: 420, uid: uid, gid: gid, type: 'x', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' }, { path: "fixtures/Ω.txt" , uid: uid , gid: gid , size: 2 } ] , [ 'entry', { path: 'fixtures/Ω.txt', mode: 420, uid: uid, gid: gid, size: 2, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] ] // first, make sure that the hardlinks are actually hardlinks, or this // won't work. Git has a way of replacing them with a copy. var hard1 = path.resolve(__dirname, "fixtures/hardlink-1") , hard2 = path.resolve(__dirname, "fixtures/hardlink-2") , fs = require("fs") try { fs.unlinkSync(hard2) } catch (e) {} fs.linkSync(hard1, hard2) tap.test("with global header", { timeout: 10000 }, function (t) { runTest(t, true) }) tap.test("without global header", { timeout: 10000 }, function (t) { runTest(t, false) }) function alphasort (a, b) { return a === b ? 0 : a.toLowerCase() > b.toLowerCase() ? 1 : a.toLowerCase() < b.toLowerCase() ? -1 : a > b ? 1 : -1 } function runTest (t, doGH) { var reader = Reader({ path: input , filter: function () { return !this.path.match(/\.(tar|hex)$/) } , sort: alphasort }) var props = doGH ? pkg : {} props.noProprietary = true var pack = Pack(props) var writer = Writer(target) // global header should be skipped regardless, since it has no content. var entry = 0 t.ok(reader, "reader ok") t.ok(pack, "pack ok") t.ok(writer, "writer ok") pack.pipe(writer) var parse = tar.Parse() t.ok(parse, "parser should be ok") pack.on("data", function (c) { // console.error("PACK DATA") if (c.length !== 512) { // this one is too noisy, only assert if it'll be relevant t.equal(c.length, 512, "parser should emit data in 512byte blocks") } parse.write(c) }) pack.on("end", function () { // console.error("PACK END") t.pass("parser ends") parse.end() }) pack.on("error", function (er) { t.fail("pack error", er) }) parse.on("error", function (er) { t.fail("parse error", er) }) writer.on("error", function (er) { t.fail("writer error", er) }) reader.on("error", function (er) { t.fail("reader error", er) }) parse.on("*", function (ev, e) { var wanted = entries[entry++] if (!wanted) { t.fail("unexpected event: "+ev) return } t.equal(ev, wanted[0], "event type should be "+wanted[0]) if (ev !== wanted[0] || e.path !== wanted[1].path) { console.error("wanted", wanted) console.error([ev, e.props]) e.on("end", function () { console.error(e.fields) throw "break" }) } t.has(e.props, wanted[1], "properties "+wanted[1].path) if (wanted[2]) { e.on("end", function () { if (!e.fields) { t.ok(e.fields, "should get fields") } else { t.has(e.fields, wanted[2], "should get expected fields") } }) } }) reader.pipe(pack) writer.on("close", function () { t.equal(entry, entries.length, "should get all expected entries") t.pass("it finished") t.end() }) } npm_3.5.2.orig/node_modules/tar/test/pack.js0000644000000000000000000005213612631326456017177 0ustar 00000000000000 // the symlink file is excluded from git, because it makes // windows freak the hell out. 
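// A reading aid for the expected entries below (not an assertion): when
// Pack() is handed a props object, here the parsed package.json, it
// first emits a type-'g' globalExtendedHeader with the props flattened
// onto "NODETAR."-prefixed keys, roughly:
//
//   [ 'globalExtendedHeader', { type: 'g', path: 'PaxHeader/', ... },
//     { 'NODETAR.name': pkg.name, 'NODETAR.version': pkg.version } ]
//
// Passing an empty props object (the doGH === false case) suppresses
// that header, which is why runTest starts its entry counter at 1 in
// that mode.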
var fs = require("fs") , path = require("path") , symlink = path.resolve(__dirname, "fixtures/symlink") try { fs.unlinkSync(symlink) } catch (e) {} fs.symlinkSync("./hardlink-1", symlink) process.on("exit", function () { fs.unlinkSync(symlink) }) var tap = require("tap") , tar = require("../tar.js") , pkg = require("../package.json") , Pack = tar.Pack , fstream = require("fstream") , Reader = fstream.Reader , Writer = fstream.Writer , input = path.resolve(__dirname, "fixtures/") , target = path.resolve(__dirname, "tmp/pack.tar") , uid = process.getuid ? process.getuid() : 0 , gid = process.getgid ? process.getgid() : 0 , entries = // the global header and root fixtures/ dir are going to get // a different date each time, so omit that bit. // Also, dev/ino values differ across machines, so that's not // included. [ [ 'globalExtendedHeader', { path: 'PaxHeader/', mode: 438, uid: 0, gid: 0, type: 'g', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' }, { "NODETAR.author": pkg.author, "NODETAR.name": pkg.name, "NODETAR.description": pkg.description, "NODETAR.version": pkg.version, "NODETAR.repository.type": pkg.repository.type, "NODETAR.repository.url": pkg.repository.url, "NODETAR.main": pkg.main, "NODETAR.scripts.test": pkg.scripts.test } ] , [ 'entry', { path: 'fixtures/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'extendedHeader', { path: 'PaxHeader/fixtures/200cccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: uid, gid: gid, type: 'x', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' }, { path: 'fixtures/200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', 'NODETAR.depth': '1', 'NODETAR.type': 'File', nlink: 1, uid: uid, gid: gid, size: 200, 'NODETAR.blksize': '4096', 'NODETAR.blocks': '8' } ] , [ 'entry', { path: 'fixtures/200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: uid, gid: gid, size: 200, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '', 'NODETAR.depth': '1', 'NODETAR.type': 'File', nlink: 1, 'NODETAR.blksize': '4096', 'NODETAR.blocks': '8' } ] , [ 'entry', { path: 'fixtures/a.txt', mode: 420, uid: uid, gid: gid, size: 257, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/b.txt', mode: 420, uid: uid, gid: gid, size: 512, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/c.txt', mode: 420, uid: uid, gid: gid, size: 513, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/cc.txt', mode: 420, uid: uid, gid: gid, size: 513, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/dir/', mode: 488, uid: uid, gid: gid, size: 0, type: '5', 
linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/dir/sub/', mode: 488, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/foo.js', mode: 420, uid: uid, gid: gid, size: 4, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/hardlink-1', mode: 420, uid: uid, gid: gid, size: 200, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/hardlink-2', mode: 420, uid: uid, gid: gid, size: 0, type: '1', linkpath: 'fixtures/hardlink-1', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/omega.txt', mode: 420, uid: uid, gid: gid, size: 2, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/packtest/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/packtest/omega.txt', mode: 420, uid: uid, gid: gid, size: 2, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/packtest/star.4.html', mode: 420, uid: uid, gid: gid, size: 54081, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'extendedHeader', { path: 'PaxHeader/fixtures/packtest/Ω.txt', mode: 420, uid: uid, gid: gid, type: 'x', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' }, { path: 'fixtures/packtest/Ω.txt', 'NODETAR.depth': '2', 'NODETAR.type': 'File', nlink: 1, uid: uid, gid: gid, size: 2, 'NODETAR.blksize': '4096', 'NODETAR.blocks': '8' } ] , [ 'entry', { path: 'fixtures/packtest/Ω.txt', mode: 420, uid: uid, gid: gid, size: 2, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '', 'NODETAR.depth': '2', 'NODETAR.type': 'File', nlink: 1, 'NODETAR.blksize': '4096', 'NODETAR.blocks': '8' } ] , [ 'entry', { path: 'fixtures/r/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/', mode: 493, uid: uid, gid: 
gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/a/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/a/t/', mode: 493, 
uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/a/t/h/', mode: 493, uid: uid, gid: gid, size: 0, type: '5', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/a/t/h/cccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: uid, gid: gid, size: 100, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'entry', { path: 'fixtures/symlink', uid: uid, gid: gid, size: 0, type: '2', linkpath: 'hardlink-1', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' } ] , [ 'extendedHeader', { path: 'PaxHeader/fixtures/Ω.txt', mode: 420, uid: uid, gid: gid, type: 'x', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '' }, { path: "fixtures/Ω.txt" , "NODETAR.depth": "1" , "NODETAR.type": "File" , nlink: 1 , uid: uid , gid: gid , size: 2 , "NODETAR.blksize": "4096" , "NODETAR.blocks": "8" } ] , [ 'entry', { path: 'fixtures/Ω.txt', mode: 420, uid: uid, gid: gid, size: 2, type: '0', linkpath: '', ustar: 'ustar\u0000', ustarver: '00', uname: '', gname: '', devmaj: 0, devmin: 0, fill: '', 'NODETAR.depth': '1', 'NODETAR.type': 'File', nlink: 1, 'NODETAR.blksize': '4096', 'NODETAR.blocks': '8' } ] ] // first, make sure that the hardlinks are actually hardlinks, or this // won't work. Git has a way of replacing them with a copy. var hard1 = path.resolve(__dirname, "fixtures/hardlink-1") , hard2 = path.resolve(__dirname, "fixtures/hardlink-2") , fs = require("fs") try { fs.unlinkSync(hard2) } catch (e) {} fs.linkSync(hard1, hard2) tap.test("with global header", { timeout: 10000 }, function (t) { runTest(t, true) }) tap.test("without global header", { timeout: 10000 }, function (t) { runTest(t, false) }) tap.test("with from base", { timeout: 10000 }, function (t) { runTest(t, true, true) }) function alphasort (a, b) { return a === b ? 0 : a.toLowerCase() > b.toLowerCase() ? 1 : a.toLowerCase() < b.toLowerCase() ? -1 : a > b ? 1 : -1 } function runTest (t, doGH, doFromBase) { var reader = Reader({ path: input , filter: function () { return !this.path.match(/\.(tar|hex)$/) } , sort: alphasort }) var props = doGH ? pkg : {} if(doFromBase) props.fromBase = true; var pack = Pack(props) var writer = Writer(target) // skip the global header if we're not doing that. var entry = doGH ? 
0 : 1 t.ok(reader, "reader ok") t.ok(pack, "pack ok") t.ok(writer, "writer ok") pack.pipe(writer) var parse = tar.Parse() t.ok(parse, "parser should be ok") pack.on("data", function (c) { // console.error("PACK DATA") if (c.length !== 512) { // this one is too noisy, only assert if it'll be relevant t.equal(c.length, 512, "parser should emit data in 512byte blocks") } parse.write(c) }) pack.on("end", function () { // console.error("PACK END") t.pass("parser ends") parse.end() }) pack.on("error", function (er) { t.fail("pack error", er) }) parse.on("error", function (er) { t.fail("parse error", er) }) writer.on("error", function (er) { t.fail("writer error", er) }) reader.on("error", function (er) { t.fail("reader error", er) }) parse.on("*", function (ev, e) { var wanted = entries[entry++] if (!wanted) { t.fail("unexpected event: "+ev) return } t.equal(ev, wanted[0], "event type should be "+wanted[0]) if(doFromBase) { if(wanted[1].path.indexOf('fixtures/') && wanted[1].path.length == 100) wanted[1].path = wanted[1].path.replace('fixtures/', '') + 'ccccccccc' if(wanted[1]) wanted[1].path = wanted[1].path.replace('fixtures/', '').replace('//', '/') if(wanted[1].path == '') wanted[1].path = '/' if(wanted[2] && wanted[2].path) wanted[2].path = wanted[2].path.replace('fixtures', '').replace(/^\//, '') wanted[1].linkpath = wanted[1].linkpath.replace('fixtures/', '') } if (ev !== wanted[0] || e.path !== wanted[1].path) { console.error("wanted", wanted) console.error([ev, e.props]) e.on("end", function () { console.error(e.fields) throw "break" }) } t.has(e.props, wanted[1], "properties "+wanted[1].path) if (wanted[2]) { e.on("end", function () { if (!e.fields) { t.ok(e.fields, "should get fields") } else { t.has(e.fields, wanted[2], "should get expected fields") } }) } }) reader.pipe(pack) writer.on("close", function () { t.equal(entry, entries.length, "should get all expected entries") t.pass("it finished") t.end() }) } npm_3.5.2.orig/node_modules/tar/test/parse-discard.js0000644000000000000000000000120312631326456020767 0ustar 00000000000000var tap = require("tap") , tar = require("../tar.js") , fs = require("fs") , path = require("path") , file = path.resolve(__dirname, "fixtures/c.tar") tap.test("parser test", function (t) { var parser = tar.Parse() var total = 0 var dataTotal = 0 parser.on("end", function () { t.equals(total-513,dataTotal,'should have discarded only c.txt') t.end() }) fs.createReadStream(file) .pipe(parser) .on('entry',function(entry){ if(entry.path === 'c.txt') entry.abort() total += entry.size; entry.on('data',function(data){ dataTotal += data.length }) }) }) npm_3.5.2.orig/node_modules/tar/test/parse.js0000644000000000000000000002435112631326456017371 0ustar 00000000000000var tap = require("tap") , tar = require("../tar.js") , fs = require("fs") , path = require("path") , file = path.resolve(__dirname, "fixtures/c.tar") , index = 0 , expect = [ [ 'entry', { path: 'c.txt', mode: 420, uid: 24561, gid: 20, size: 513, mtime: new Date('Wed, 26 Oct 2011 01:10:58 GMT'), cksum: 5422, type: '0', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '' }, undefined ], [ 'entry', { path: 'cc.txt', mode: 420, uid: 24561, gid: 20, size: 513, mtime: new Date('Wed, 26 Oct 2011 01:11:02 GMT'), cksum: 5525, type: '0', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '' }, undefined ], [ 'entry', { path: 
'r/e/a/l/l/y/-/d/e/e/p/-/f/o/l/d/e/r/-/p/a/t/h/cccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: 24561, gid: 20, size: 100, mtime: new Date('Thu, 27 Oct 2011 03:43:23 GMT'), cksum: 18124, type: '0', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '' }, undefined ], [ 'entry', { path: 'Ω.txt', mode: 420, uid: 24561, gid: 20, size: 2, mtime: new Date('Thu, 27 Oct 2011 17:51:49 GMT'), cksum: 5695, type: '0', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '' }, undefined ], [ 'extendedHeader', { path: 'PaxHeader/Ω.txt', mode: 420, uid: 24561, gid: 20, size: 120, mtime: new Date('Thu, 27 Oct 2011 17:51:49 GMT'), cksum: 6702, type: 'x', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '' }, { path: 'Ω.txt', ctime: 1319737909, atime: 1319739061, dev: 234881026, ino: 51693379, nlink: 1 } ], [ 'entry', { path: 'Ω.txt', mode: 420, uid: 24561, gid: 20, size: 2, mtime: new Date('Thu, 27 Oct 2011 17:51:49 GMT'), cksum: 5695, type: '0', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '', ctime: new Date('Thu, 27 Oct 2011 17:51:49 GMT'), atime: new Date('Thu, 27 Oct 2011 18:11:01 GMT'), dev: 234881026, ino: 51693379, nlink: 1 }, undefined ], [ 'extendedHeader', { path: 'PaxHeader/200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: 24561, gid: 20, size: 353, mtime: new Date('Thu, 27 Oct 2011 03:41:08 GMT'), cksum: 14488, type: 'x', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '' }, { path: '200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', ctime: 1319686868, atime: 1319741254, 'LIBARCHIVE.creationtime': '1319686852', dev: 234881026, ino: 51681874, nlink: 1 } ], [ 'entry', { path: '200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: 24561, gid: 20, size: 200, mtime: new Date('Thu, 27 Oct 2011 03:41:08 GMT'), cksum: 14570, type: '0', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '', ctime: new Date('Thu, 27 Oct 2011 03:41:08 GMT'), atime: new Date('Thu, 27 Oct 2011 18:47:34 GMT'), 'LIBARCHIVE.creationtime': '1319686852', dev: 234881026, ino: 51681874, nlink: 1 }, undefined ], [ 'longPath', { path: '././@LongLink', mode: 0, uid: 0, gid: 0, size: 201, mtime: new Date('Thu, 01 Jan 1970 00:00:00 GMT'), cksum: 4976, type: 'L', linkpath: '', ustar: false }, '200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc' ], [ 'entry', { path: '200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: 1000, gid: 1000, size: 201, mtime: new Date('Thu, 27 Oct 2011 22:21:50 GMT'), cksum: 14086, type: '0', linkpath: '', 
ustar: false }, undefined ], [ 'longLinkpath', { path: '././@LongLink', mode: 0, uid: 0, gid: 0, size: 201, mtime: new Date('Thu, 01 Jan 1970 00:00:00 GMT'), cksum: 4975, type: 'K', linkpath: '', ustar: false }, '200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc' ], [ 'longPath', { path: '././@LongLink', mode: 0, uid: 0, gid: 0, size: 201, mtime: new Date('Thu, 01 Jan 1970 00:00:00 GMT'), cksum: 4976, type: 'L', linkpath: '', ustar: false }, '200LLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLL' ], [ 'entry', { path: '200LLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLL', mode: 511, uid: 1000, gid: 1000, size: 0, mtime: new Date('Fri, 28 Oct 2011 23:05:17 GMT'), cksum: 21603, type: '2', linkpath: '200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', ustar: false }, undefined ], [ 'extendedHeader', { path: 'PaxHeader/200-hard', mode: 420, uid: 24561, gid: 20, size: 143, mtime: new Date('Thu, 27 Oct 2011 03:41:08 GMT'), cksum: 6533, type: 'x', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '' }, { ctime: 1320617144, atime: 1320617232, 'LIBARCHIVE.creationtime': '1319686852', dev: 234881026, ino: 51681874, nlink: 2 } ], [ 'entry', { path: '200-hard', mode: 420, uid: 24561, gid: 20, size: 200, mtime: new Date('Thu, 27 Oct 2011 03:41:08 GMT'), cksum: 5526, type: '0', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '', ctime: new Date('Sun, 06 Nov 2011 22:05:44 GMT'), atime: new Date('Sun, 06 Nov 2011 22:07:12 GMT'), 'LIBARCHIVE.creationtime': '1319686852', dev: 234881026, ino: 51681874, nlink: 2 }, undefined ], [ 'extendedHeader', { path: 'PaxHeader/200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: 24561, gid: 20, size: 353, mtime: new Date('Thu, 27 Oct 2011 03:41:08 GMT'), cksum: 14488, type: 'x', linkpath: '', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '' }, { path: '200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', ctime: 1320617144, atime: 1320617406, 'LIBARCHIVE.creationtime': '1319686852', dev: 234881026, ino: 51681874, nlink: 2 } ], [ 'entry', { path: '200ccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccccc', mode: 420, uid: 24561, gid: 20, size: 0, mtime: new Date('Thu, 27 Oct 2011 03:41:08 GMT'), cksum: 15173, type: '1', linkpath: '200-hard', ustar: 'ustar\0', ustarver: '00', uname: 'isaacs', gname: 'staff', devmaj: 0, devmin: 0, fill: '', ctime: new Date('Sun, 06 Nov 2011 22:05:44 GMT'), atime: new Date('Sun, 06 Nov 2011 22:10:06 GMT'), 'LIBARCHIVE.creationtime': '1319686852', 
dev: 234881026, ino: 51681874, nlink: 2 }, undefined ] ] tap.test("parser test", function (t) { var parser = tar.Parse() parser.on("end", function () { t.equal(index, expect.length, "saw all expected events") t.end() }) fs.createReadStream(file) .pipe(parser) .on("*", function (ev, entry) { var wanted = expect[index] if (!wanted) { return t.fail("Unexpected event: " + ev) } var result = [ev, entry.props] entry.on("end", function () { result.push(entry.fields || entry.body) t.equal(ev, wanted[0], index + " event type") t.equivalent(entry.props, wanted[1], wanted[1].path + " entry properties") if (wanted[2]) { t.equivalent(result[2], wanted[2], "metadata values") } index ++ }) }) }) npm_3.5.2.orig/node_modules/tar/test/zz-cleanup.js0000644000000000000000000000065612631326456020351 0ustar 00000000000000// clean up the fixtures var tap = require("tap") , rimraf = require("rimraf") , test = tap.test , path = require("path") test("clean fixtures", function (t) { rimraf(path.resolve(__dirname, "fixtures"), function (er) { t.ifError(er, "rimraf ./fixtures/") t.end() }) }) test("clean tmp", function (t) { rimraf(path.resolve(__dirname, "tmp"), function (er) { t.ifError(er, "rimraf ./tmp/") t.end() }) }) npm_3.5.2.orig/node_modules/text-table/.travis.yml0000644000000000000000000000006012631326456020325 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10" npm_3.5.2.orig/node_modules/text-table/LICENSE0000644000000000000000000000206112631326456017224 0ustar 00000000000000This software is released under the MIT license: Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/text-table/example/0000755000000000000000000000000012631326456017653 5ustar 00000000000000npm_3.5.2.orig/node_modules/text-table/index.js0000644000000000000000000000457412631326456017677 0ustar 00000000000000module.exports = function (rows_, opts) { if (!opts) opts = {}; var hsep = opts.hsep === undefined ? ' ' : opts.hsep; var align = opts.align || []; var stringLength = opts.stringLength || function (s) { return String(s).length; } ; var dotsizes = reduce(rows_, function (acc, row) { forEach(row, function (c, ix) { var n = dotindex(c); if (!acc[ix] || n > acc[ix]) acc[ix] = n; }); return acc; }, []); var rows = map(rows_, function (row) { return map(row, function (c_, ix) { var c = String(c_); if (align[ix] === '.') { var index = dotindex(c); var size = dotsizes[ix] + (/\./.test(c) ? 
1 : 2) - (stringLength(c) - index) ; return c + Array(size).join(' '); } else return c; }); }); var sizes = reduce(rows, function (acc, row) { forEach(row, function (c, ix) { var n = stringLength(c); if (!acc[ix] || n > acc[ix]) acc[ix] = n; }); return acc; }, []); return map(rows, function (row) { return map(row, function (c, ix) { var n = (sizes[ix] - stringLength(c)) || 0; var s = Array(Math.max(n + 1, 1)).join(' '); if (align[ix] === 'r' || align[ix] === '.') { return s + c; } if (align[ix] === 'c') { return Array(Math.ceil(n / 2 + 1)).join(' ') + c + Array(Math.floor(n / 2 + 1)).join(' ') ; } return c + s; }).join(hsep).replace(/\s+$/, ''); }).join('\n'); }; function dotindex (c) { var m = /\.[^.]*$/.exec(c); return m ? m.index + 1 : c.length; } function reduce (xs, f, init) { if (xs.reduce) return xs.reduce(f, init); var i = 0; var acc = arguments.length >= 3 ? init : xs[i++]; for (; i < xs.length; i++) { f(acc, xs[i], i); } return acc; } function forEach (xs, f) { if (xs.forEach) return xs.forEach(f); for (var i = 0; i < xs.length; i++) { f.call(xs, xs[i], i); } } function map (xs, f) { if (xs.map) return xs.map(f); var res = []; for (var i = 0; i < xs.length; i++) { res.push(f.call(xs, xs[i], i)); } return res; } npm_3.5.2.orig/node_modules/text-table/package.json0000644000000000000000000000737012631326456020515 0ustar 00000000000000{ "name": "text-table", "version": "0.2.0", "description": "borderless text tables with alignment", "main": "index.js", "devDependencies": { "tap": "~0.4.0", "tape": "~1.0.2", "cli-color": "~0.2.3" }, "scripts": { "test": "tap test/*.js" }, "testling": { "files": "test/*.js", "browsers": [ "ie/6..latest", "chrome/20..latest", "firefox/10..latest", "safari/latest", "opera/11.0..latest", "iphone/6", "ipad/6" ] }, "repository": { "type": "git", "url": "git://github.com/substack/text-table.git" }, "homepage": "https://github.com/substack/text-table", "keywords": [ "text", "table", "align", "ascii", "rows", "tabular" ], "author": { "name": "James Halliday", "email": "mail@substack.net", "url": "http://substack.net" }, "license": "MIT", "readme": "# text-table\n\ngenerate borderless text table strings suitable for printing to stdout\n\n[![build status](https://secure.travis-ci.org/substack/text-table.png)](http://travis-ci.org/substack/text-table)\n\n[![browser support](https://ci.testling.com/substack/text-table.png)](http://ci.testling.com/substack/text-table)\n\n# example\n\n## default align\n\n``` js\nvar table = require('text-table');\nvar t = table([\n [ 'master', '0123456789abcdef' ],\n [ 'staging', 'fedcba9876543210' ]\n]);\nconsole.log(t);\n```\n\n```\nmaster 0123456789abcdef\nstaging fedcba9876543210\n```\n\n## left-right align\n\n``` js\nvar table = require('text-table');\nvar t = table([\n [ 'beep', '1024' ],\n [ 'boop', '33450' ],\n [ 'foo', '1006' ],\n [ 'bar', '45' ]\n], { align: [ 'l', 'r' ] });\nconsole.log(t);\n```\n\n```\nbeep 1024\nboop 33450\nfoo 1006\nbar 45\n```\n\n## dotted align\n\n``` js\nvar table = require('text-table');\nvar t = table([\n [ 'beep', '1024' ],\n [ 'boop', '334.212' ],\n [ 'foo', '1006' ],\n [ 'bar', '45.6' ],\n [ 'baz', '123.' ]\n], { align: [ 'l', '.' 
] });\nconsole.log(t);\n```\n\n```\nbeep 1024\nboop 334.212\nfoo 1006\nbar 45.6\nbaz 123.\n```\n\n## centered\n\n``` js\nvar table = require('text-table');\nvar t = table([\n [ 'beep', '1024', 'xyz' ],\n [ 'boop', '3388450', 'tuv' ],\n [ 'foo', '10106', 'qrstuv' ],\n [ 'bar', '45', 'lmno' ]\n], { align: [ 'l', 'c', 'l' ] });\nconsole.log(t);\n```\n\n```\nbeep 1024 xyz\nboop 3388450 tuv\nfoo 10106 qrstuv\nbar 45 lmno\n```\n\n# methods\n\n``` js\nvar table = require('text-table')\n```\n\n## var s = table(rows, opts={})\n\nReturn a formatted table string `s` from an array of `rows` and some options\n`opts`.\n\n`rows` should be an array of arrays containing strings, numbers, or other\nprintable values.\n\noptions can be:\n\n* `opts.hsep` - separator to use between columns, default `' '`\n* `opts.align` - array of alignment types for each column, default `['l','l',...]`\n* `opts.stringLength` - callback function to use when calculating the string length\n\nalignment types are:\n\n* `'l'` - left\n* `'r'` - right\n* `'c'` - center\n* `'.'` - decimal\n\n# install\n\nWith [npm](https://npmjs.org) do:\n\n```\nnpm install text-table\n```\n\n# Use with ANSI-colors\n\nSince the string length of ANSI color schemes does not equal the length\nJavaScript sees internally it is necessary to pass the a custom string length\ncalculator during the main function call.\n\nSee the `test/ansi-colors.js` file for an example.\n\n# license\n\nMIT\n", "readmeFilename": "readme.markdown", "bugs": { "url": "https://github.com/substack/text-table/issues" }, "_id": "text-table@0.2.0", "_shasum": "7f5ee823ae805207c00af2df4a84ec3fcfa570b4", "_resolved": "https://registry.npmjs.org/text-table/-/text-table-0.2.0.tgz", "_from": "text-table@>=0.2.0 <0.3.0" } npm_3.5.2.orig/node_modules/text-table/readme.markdown0000644000000000000000000000464612631326456021233 0ustar 00000000000000# text-table generate borderless text table strings suitable for printing to stdout [![build status](https://secure.travis-ci.org/substack/text-table.png)](http://travis-ci.org/substack/text-table) [![browser support](https://ci.testling.com/substack/text-table.png)](http://ci.testling.com/substack/text-table) # example ## default align ``` js var table = require('text-table'); var t = table([ [ 'master', '0123456789abcdef' ], [ 'staging', 'fedcba9876543210' ] ]); console.log(t); ``` ``` master 0123456789abcdef staging fedcba9876543210 ``` ## left-right align ``` js var table = require('text-table'); var t = table([ [ 'beep', '1024' ], [ 'boop', '33450' ], [ 'foo', '1006' ], [ 'bar', '45' ] ], { align: [ 'l', 'r' ] }); console.log(t); ``` ``` beep 1024 boop 33450 foo 1006 bar 45 ``` ## dotted align ``` js var table = require('text-table'); var t = table([ [ 'beep', '1024' ], [ 'boop', '334.212' ], [ 'foo', '1006' ], [ 'bar', '45.6' ], [ 'baz', '123.' ] ], { align: [ 'l', '.' ] }); console.log(t); ``` ``` beep 1024 boop 334.212 foo 1006 bar 45.6 baz 123. ``` ## centered ``` js var table = require('text-table'); var t = table([ [ 'beep', '1024', 'xyz' ], [ 'boop', '3388450', 'tuv' ], [ 'foo', '10106', 'qrstuv' ], [ 'bar', '45', 'lmno' ] ], { align: [ 'l', 'c', 'l' ] }); console.log(t); ``` ``` beep 1024 xyz boop 3388450 tuv foo 10106 qrstuv bar 45 lmno ``` # methods ``` js var table = require('text-table') ``` ## var s = table(rows, opts={}) Return a formatted table string `s` from an array of `rows` and some options `opts`. `rows` should be an array of arrays containing strings, numbers, or other printable values. 
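For instance, a small illustrative call combining `rows` with the `hsep` option described below (sample data only):

``` js
var table = require('text-table');
var s = table([
    [ 'a', '1' ],
    [ 'bb', '22' ]
], { hsep: ' | ' });
console.log(s);
```

```
a  | 1
bb | 22
```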
options can be: * `opts.hsep` - separator to use between columns, default `' '` * `opts.align` - array of alignment types for each column, default `['l','l',...]` * `opts.stringLength` - callback function to use when calculating the string length alignment types are: * `'l'` - left * `'r'` - right * `'c'` - center * `'.'` - decimal # install With [npm](https://npmjs.org) do: ``` npm install text-table ``` # Use with ANSI-colors Since the string length of ANSI color schemes does not equal the length JavaScript sees internally it is necessary to pass the a custom string length calculator during the main function call. See the `test/ansi-colors.js` file for an example. # license MIT npm_3.5.2.orig/node_modules/text-table/test/0000755000000000000000000000000012631326456017177 5ustar 00000000000000npm_3.5.2.orig/node_modules/text-table/example/align.js0000644000000000000000000000026512631326456021306 0ustar 00000000000000var table = require('../'); var t = table([ [ 'beep', '1024' ], [ 'boop', '33450' ], [ 'foo', '1006' ], [ 'bar', '45' ] ], { align: [ 'l', 'r' ] }); console.log(t); npm_3.5.2.orig/node_modules/text-table/example/center.js0000644000000000000000000000033512631326456021472 0ustar 00000000000000var table = require('../'); var t = table([ [ 'beep', '1024', 'xyz' ], [ 'boop', '3388450', 'tuv' ], [ 'foo', '10106', 'qrstuv' ], [ 'bar', '45', 'lmno' ] ], { align: [ 'l', 'c', 'l' ] }); console.log(t); npm_3.5.2.orig/node_modules/text-table/example/dotalign.js0000644000000000000000000000032012631326456022005 0ustar 00000000000000var table = require('../'); var t = table([ [ 'beep', '1024' ], [ 'boop', '334.212' ], [ 'foo', '1006' ], [ 'bar', '45.6' ], [ 'baz', '123.' ] ], { align: [ 'l', '.' ] }); console.log(t); npm_3.5.2.orig/node_modules/text-table/example/doubledot.js0000644000000000000000000000031612631326456022172 0ustar 00000000000000var table = require('../'); var t = table([ [ '0.1.2' ], [ '11.22.33' ], [ '5.6.7' ], [ '1.22222' ], [ '12345.' ], [ '5555.' ], [ '123' ] ], { align: [ '.' 
] }); console.log(t); npm_3.5.2.orig/node_modules/text-table/example/table.js0000644000000000000000000000021412631326456021275 0ustar 00000000000000var table = require('../'); var t = table([ [ 'master', '0123456789abcdef' ], [ 'staging', 'fedcba9876543210' ] ]); console.log(t); npm_3.5.2.orig/node_modules/text-table/test/align.js0000644000000000000000000000061112631326456020625 0ustar 00000000000000var test = require('tape'); var table = require('../'); test('align', function (t) { t.plan(1); var s = table([ [ 'beep', '1024' ], [ 'boop', '33450' ], [ 'foo', '1006' ], [ 'bar', '45' ] ], { align: [ 'l', 'r' ] }); t.equal(s, [ 'beep 1024', 'boop 33450', 'foo 1006', 'bar 45' ].join('\n')); }); npm_3.5.2.orig/node_modules/text-table/test/ansi-colors.js0000644000000000000000000000157412631326456021775 0ustar 00000000000000var test = require('tape'); var table = require('../'); var color = require('cli-color'); var ansiTrim = require('cli-color/lib/trim'); test('center', function (t) { t.plan(1); var opts = { align: [ 'l', 'c', 'l' ], stringLength: function(s) { return ansiTrim(s).length } }; var s = table([ [ color.red('Red'), color.green('Green'), color.blue('Blue') ], [ color.bold('Bold'), color.underline('Underline'), color.italic('Italic') ], [ color.inverse('Inverse'), color.strike('Strike'), color.blink('Blink') ], [ 'bar', '45', 'lmno' ] ], opts); t.equal(ansiTrim(s), [ 'Red Green Blue', 'Bold Underline Italic', 'Inverse Strike Blink', 'bar 45 lmno' ].join('\n')); }); npm_3.5.2.orig/node_modules/text-table/test/center.js0000644000000000000000000000072212631326456021016 0ustar 00000000000000var test = require('tape'); var table = require('../'); test('center', function (t) { t.plan(1); var s = table([ [ 'beep', '1024', 'xyz' ], [ 'boop', '3388450', 'tuv' ], [ 'foo', '10106', 'qrstuv' ], [ 'bar', '45', 'lmno' ] ], { align: [ 'l', 'c', 'l' ] }); t.equal(s, [ 'beep 1024 xyz', 'boop 3388450 tuv', 'foo 10106 qrstuv', 'bar 45 lmno' ].join('\n')); }); npm_3.5.2.orig/node_modules/text-table/test/dotalign.js0000644000000000000000000000070512631326456021340 0ustar 00000000000000var test = require('tape'); var table = require('../'); test('dot align', function (t) { t.plan(1); var s = table([ [ 'beep', '1024' ], [ 'boop', '334.212' ], [ 'foo', '1006' ], [ 'bar', '45.6' ], [ 'baz', '123.' ] ], { align: [ 'l', '.' ] }); t.equal(s, [ 'beep 1024', 'boop 334.212', 'foo 1006', 'bar 45.6', 'baz 123.' ].join('\n')); }); npm_3.5.2.orig/node_modules/text-table/test/doubledot.js0000644000000000000000000000073412631326456021522 0ustar 00000000000000var test = require('tape'); var table = require('../'); test('dot align', function (t) { t.plan(1); var s = table([ [ '0.1.2' ], [ '11.22.33' ], [ '5.6.7' ], [ '1.22222' ], [ '12345.' ], [ '5555.' ], [ '123' ] ], { align: [ '.' ] }); t.equal(s, [ ' 0.1.2', '11.22.33', ' 5.6.7', ' 1.22222', '12345.', ' 5555.', ' 123' ].join('\n')); }); npm_3.5.2.orig/node_modules/text-table/test/table.js0000644000000000000000000000050612631326456020625 0ustar 00000000000000var test = require('tape'); var table = require('../'); test('table', function (t) { t.plan(1); var s = table([ [ 'master', '0123456789abcdef' ], [ 'staging', 'fedcba9876543210' ] ]); t.equal(s, [ 'master 0123456789abcdef', 'staging fedcba9876543210' ].join('\n')); }); npm_3.5.2.orig/node_modules/uid-number/LICENSE0000644000000000000000000000135412631326456017226 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. 
Schlueter Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/uid-number/README.md0000644000000000000000000000053112631326456017474 0ustar 00000000000000Use this module to convert a username/groupname to a uid/gid number. Usage: ``` npm install uid-number ``` Then, in your node program: ```javascript var uidNumber = require("uid-number") uidNumber("isaacs", function (er, uid, gid) { // gid is null because we didn't ask for a group name // uid === 24561 because that's my number. }) ``` npm_3.5.2.orig/node_modules/uid-number/get-uid-gid.js0000755000000000000000000000120412631326456020653 0ustar 00000000000000if (module !== require.main) { throw new Error("This file should not be loaded with require()") } if (!process.getuid || !process.getgid) { throw new Error("this file should not be called without uid/gid support") } var argv = process.argv.slice(2) , user = argv[0] || process.getuid() , group = argv[1] || process.getgid() if (!isNaN(user)) user = +user if (!isNaN(group)) group = +group console.error([user, group]) try { process.setgid(group) process.setuid(user) console.log(JSON.stringify({uid:+process.getuid(), gid:+process.getgid()})) } catch (ex) { console.log(JSON.stringify({error:ex.message,errno:ex.errno})) } npm_3.5.2.orig/node_modules/uid-number/package.json0000644000000000000000000000235712631326456020513 0ustar 00000000000000{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "name": "uid-number", "description": "Convert a username/group name to a uid/gid number", "version": "0.0.6", "repository": { "type": "git", "url": "git://github.com/isaacs/uid-number.git" }, "main": "uid-number.js", "dependencies": {}, "devDependencies": {}, "optionalDependencies": {}, "engines": { "node": "*" }, "license": "ISC", "gitHead": "aab48f5d6bda85794946b26d945d2ee452e0e9ab", "bugs": { "url": "https://github.com/isaacs/uid-number/issues" }, "homepage": "https://github.com/isaacs/uid-number", "_id": "uid-number@0.0.6", "scripts": {}, "_shasum": "0ea10e8035e8eb5b8e4449f06da1c730663baa81", "_from": "uid-number@0.0.6", "_npmVersion": "2.1.3", "_nodeVersion": "0.10.31", "_npmUser": { "name": "isaacs", "email": "i@izs.me" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "dist": { "shasum": "0ea10e8035e8eb5b8e4449f06da1c730663baa81", "tarball": "http://registry.npmjs.org/uid-number/-/uid-number-0.0.6.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/uid-number/-/uid-number-0.0.6.tgz" } npm_3.5.2.orig/node_modules/uid-number/uid-number.js0000644000000000000000000000333112631326456020623 0ustar 00000000000000module.exports = uidNumber // This module calls into get-uid-gid.js, which sets the // uid and gid to the supplied argument, in order to find out their // numeric value. 
This can't be done in the main node process, // because otherwise node would be running as that user from this // point on. var child_process = require("child_process") , path = require("path") , uidSupport = process.getuid && process.setuid , uidCache = {} , gidCache = {} function uidNumber (uid, gid, cb) { if (!uidSupport) return cb() if (typeof cb !== "function") cb = gid, gid = null if (typeof cb !== "function") cb = uid, uid = null if (gid == null) gid = process.getgid() if (uid == null) uid = process.getuid() if (!isNaN(gid)) gid = gidCache[gid] = +gid if (!isNaN(uid)) uid = uidCache[uid] = +uid if (uidCache.hasOwnProperty(uid)) uid = uidCache[uid] if (gidCache.hasOwnProperty(gid)) gid = gidCache[gid] if (typeof gid === "number" && typeof uid === "number") { return process.nextTick(cb.bind(null, null, uid, gid)) } var getter = require.resolve("./get-uid-gid.js") child_process.execFile( process.execPath , [getter, uid, gid] , function (code, out, stderr) { if (code) { var er = new Error("could not get uid/gid\n" + stderr) er.code = code return cb(er) } try { out = JSON.parse(out+"") } catch (ex) { return cb(ex) } if (out.error) { var er = new Error(out.error) er.errno = out.errno return cb(er) } if (isNaN(out.uid) || isNaN(out.gid)) return cb(new Error( "Could not get uid/gid: "+JSON.stringify(out))) cb(null, uidCache[uid] = +out.uid, gidCache[gid] = +out.gid) }) } npm_3.5.2.orig/node_modules/umask/.npmignore0000644000000000000000000000111312631326456017262 0ustar 00000000000000# Logs logs *.log # Runtime data pids *.pid *.seed # Directory for instrumented libs generated by jscoverage/JSCover lib-cov # Coverage directory used by tools like istanbul coverage # Grunt intermediate storage (http://gruntjs.com/creating-plugins#storing-task-files) .grunt # Compiled binary addons (http://nodejs.org/api/addons.html) build/Release # Dependency directory # Commenting this out is preferred by some people, see # https://www.npmjs.org/doc/misc/npm-faq.html#should-i-check-my-node_modules-folder-into-git- node_modules # Users Environment Variables .lock-wscript npm_3.5.2.orig/node_modules/umask/ChangeLog0000644000000000000000000000032012631326456017034 0ustar 000000000000002015-01-15 Sam Mikes * index.js: (convert_fromString) accept decimal strings provided they don't begin with '0' 2015-01-14 Sam Mikes * index.js: initial rev npm_3.5.2.orig/node_modules/umask/LICENSE0000644000000000000000000000206512631326456016277 0ustar 00000000000000The MIT License (MIT) Copyright (c) 2015 Sam Mikes Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
npm_3.5.2.orig/node_modules/umask/README.md0000644000000000000000000000353412631326456016553 0ustar 00000000000000# umask Convert umask from string <-> number. ## Installation & Use ``` $ npm install -S umask var umask = require('umask'); console.log(umask.toString(18)); // 0022 console.log(umask.fromString('0777')) // 511 ``` ## API ### `toString( val )` Converts `val` to a 0-padded octal string. `val` is assumed to be a Number in the correct range (0..511) ### `fromString( val, [cb] )` Converts `val` to a Number that can be used as a umask. `val` can be of the following forms: * String containing octal number (leading 0) * String containing decimal number * Number In all cases above, the value obtained is then converted to an integer and checked against the legal `umask` range 0..511 `fromString` can be used as a simple converter, with no error feedback, by omitting the optional callback argument `cb`: ``` var mask = umask.fromString(val); // mask is now the umask described by val or // the default, 0022 (18 dec) ``` The callback arguments are `(err, val)` where `err` is either `null` or an Error object and `val` is either the converted umask or the default umask, `0022`. ``` umask.fromString(val, function (err, val) { if (err) { console.error("invalid umask: " + err.message) } /* do something with val */ }); ``` The callback, if provided, is always called **synchronously**. ### `validate( data, k, val )` This is a validation function of the form expected by `nopt`. If `val` is a valid umask, the function returns true and sets `data[k]`. If `val` is not a valid umask, the function returns false. The `validate` function is stricter than `fromString`: it only accepts Number or octal String values, and the String value must begin with `0`. The `validate` function does **not** accept Strings containing decimal numbers. # Maintainer Sam Mikes # License MITnpm_3.5.2.orig/node_modules/umask/index.js0000644000000000000000000000373012631326456016737 0ustar 00000000000000'use strict'; var util = require("util"); function toString(val) { val = val.toString(8); while (val.length < 4) { val = "0" + val; } return val; } var defaultUmask = 18; // 0022; var defaultUmaskString = toString(defaultUmask); function validate(data, k, val) { // must be either an integer or an octal string.
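  // A Number value (such as 18, i.e. "0022" in octal) is stored as-is;
  // a String is only accepted when it begins with "0" and parses as octal,
  // so decimal strings like "18" fail validate() (fromString allows them).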
if (typeof val === "number" && !isNaN(val)) { data[k] = val; return true; } if (typeof val === "string") { if (val.charAt(0) !== "0") { return false; } data[k] = parseInt(val, 8); return true; } return false; } function convert_fromString(val, cb) { if (typeof val === "string") { // check for octal string first if (val.charAt(0) === '0' && /^[0-7]+$/.test(val)) { val = parseInt(val, 8); } else if (val.charAt(0) !== '0' && /^[0-9]+$/.test(val)) { // legacy support for decimal strings val = parseInt(val, 10); } else { return cb(new Error(util.format("Expected octal string, got %j, defaulting to %j", val, defaultUmaskString)), defaultUmask); } } else if (typeof val !== "number") { return cb(new Error(util.format("Expected number or octal string, got %j, defaulting to %j", val, defaultUmaskString)), defaultUmask); } val = Math.floor(val); if ((val < 0) || (val > 511)) { return cb(new Error(util.format("Must be in range 0..511 (0000..0777), got %j", val)), defaultUmask); } cb(null, val); } function fromString(val, cb) { // synchronous callback, no zalgo convert_fromString(val, cb || function (err, result) { /*jslint unparam:true*/ val = result; }); return val; } exports.toString = toString; exports.fromString = fromString; exports.validate = validate; npm_3.5.2.orig/node_modules/umask/package.json0000644000000000000000000000240712631326456017560 0ustar 00000000000000{ "name": "umask", "version": "1.1.0", "description": "convert umask from string <-> number", "main": "index.js", "scripts": { "test": "lab -ct 100", "lint": "jslint --terse --latest *.js test/*.js" }, "repository": { "type": "git", "url": "https://github.com/smikes/umask.git" }, "keywords": [ "umask" ], "author": { "name": "Sam Mikes", "email": "smikes@cubane.com" }, "license": "MIT", "bugs": { "url": "https://github.com/smikes/umask/issues" }, "homepage": "https://github.com/smikes/umask", "devDependencies": { "code": "^1.2.1", "jslint": "^0.7.2", "lab": "^5.2.0" }, "gitHead": "63d821e4d0b06ef9a4b727c5fbe5976e9534d76e", "_id": "umask@1.1.0", "_shasum": "f29cebf01df517912bb58ff9c4e50fde8e33320d", "_from": "umask@>=1.1.0 <1.2.0", "_npmVersion": "2.2.0", "_nodeVersion": "0.10.35", "_npmUser": { "name": "smikes", "email": "smikes@cubane.com" }, "maintainers": [ { "name": "smikes", "email": "smikes@cubane.com" } ], "dist": { "shasum": "f29cebf01df517912bb58ff9c4e50fde8e33320d", "tarball": "http://registry.npmjs.org/umask/-/umask-1.1.0.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/umask/-/umask-1.1.0.tgz" } npm_3.5.2.orig/node_modules/umask/test/0000755000000000000000000000000012631326456016246 5ustar 00000000000000npm_3.5.2.orig/node_modules/umask/test/simple.js0000644000000000000000000001230112631326456020072 0ustar 00000000000000'use strict'; var umask = require('..'); var Code = require('code'); var Lab = require('lab'); var lab = Lab.script(); exports.lab = lab; var describe = lab.describe; var it = lab.it; var expect = Code.expect; describe('validates umask', function () { // signature of validator: validate(obj, key, val) // store valid value in obj[key] // return false if invalid it('accepts numbers', function (done) { var o = {}, result = false; result = umask.validate(o, 'umask', 0); expect(result).to.equal(true); expect(o.umask).to.equal(0); result = umask.validate(o, 'umask', 511); expect(result).to.equal(true); expect(o.umask).to.equal(511); done(); }); it('accepts strings', function (done) { var o = {}, result; result = umask.validate(o, 'umask', "0"); expect(result).to.equal(true); 
expect(o.umask).to.equal(0); result = umask.validate(o, 'umask', "0777"); expect(result).to.equal(true); expect(o.umask).to.equal(511); done(); }); it('rejects other types', function (done) { expect(umask.validate(undefined, undefined, false)).to.equal(false); expect(umask.validate(undefined, undefined, {})).to.equal(false); done(); }); it('rejects non-octalish strings', function (done) { expect(umask.validate(undefined, undefined, "1")).to.equal(false); done(); }); it('rejects NaN strings', function (done) { expect(umask.validate(undefined, undefined, NaN)).to.equal(false); done(); }); }); describe('umask to string', function () { it("converts umask to string", function (done) { expect(umask.toString(0)).to.equal("0000"); expect(umask.toString(1)).to.equal("0001"); expect(umask.toString(7)).to.equal("0007"); expect(umask.toString(8)).to.equal("0010"); expect(umask.toString(511)).to.equal("0777"); expect(umask.toString(18)).to.equal("0022"); expect(umask.toString(16)).to.equal("0020"); done(); }); }); describe('umask from string', function () { it('converts valid values', function (done) { expect(umask.fromString("0000")).to.equal(0); expect(umask.fromString("0")).to.equal(0); expect(umask.fromString("0777")).to.equal(511); expect(umask.fromString("0024")).to.equal(20); expect(umask.fromString(0)).to.equal(0); expect(umask.fromString(20)).to.equal(20); expect(umask.fromString(21)).to.equal(21); expect(umask.fromString(511)).to.equal(511); done(); }); it('converts valid values', function (done) { expect(umask.fromString("0000")).to.equal(0); expect(umask.fromString("0")).to.equal(0); expect(umask.fromString("010")).to.equal(8); expect(umask.fromString("0777")).to.equal(511); expect(umask.fromString("0024")).to.equal(20); expect(umask.fromString("8")).to.equal(8); expect(umask.fromString("9")).to.equal(9); expect(umask.fromString("18")).to.equal(18); expect(umask.fromString("16")).to.equal(16); expect(umask.fromString(0)).to.equal(0); expect(umask.fromString(20)).to.equal(20); expect(umask.fromString(21)).to.equal(21); expect(umask.fromString(511)).to.equal(511); expect(umask.fromString(0.1)).to.equal(0); expect(umask.fromString(511.1)).to.equal(511); done(); }); it('errors on empty string', function (done) { umask.fromString("", function (err, val) { expect(err.message).to.equal('Expected octal string, got "", defaulting to "0022"'); expect(val).to.equal(18); done(); }); }); it('errors on invalid octal string', function (done) { umask.fromString("099", function (err, val) { expect(err.message).to.equal('Expected octal string, got "099", defaulting to "0022"'); expect(val).to.equal(18); done(); }); }); it('errors when non-string, non-number (boolean)', function (done) { umask.fromString(false, function (err, val) { expect(err.message).to.equal('Expected number or octal string, got false, defaulting to "0022"'); expect(val).to.equal(18); done(); }); }); it('errors when non-string, non-number (object)', function (done) { umask.fromString({}, function (err, val) { expect(err.message).to.equal('Expected number or octal string, got {}, defaulting to "0022"'); expect(val).to.equal(18); done(); }); }); it('errors when out of range (<0)', function (done) { umask.fromString(-1, function (err, val) { expect(err.message).to.equal('Must be in range 0..511 (0000..0777), got -1'); expect(val).to.equal(18); done(); }); }); it('errors when out of range (>511)', function (done) { umask.fromString(512, function (err, val) { expect(err.message).to.equal('Must be in range 0..511 (0000..0777), got 512'); 
expect(val).to.equal(18); done(); }); }); }); npm_3.5.2.orig/node_modules/unique-filename/.npmignore0000644000000000000000000000003512631326456021230 0ustar 00000000000000*~ .#* DEADJOE node_modules npm_3.5.2.orig/node_modules/unique-filename/README.md0000644000000000000000000000234612631326456020517 0ustar 00000000000000unique-filename =============== Generate a unique filename for use in temporary directories or caches. ``` var uniqueFilename = require('unique-filename') // returns something like: /tmp/912ec803b2ce49e4a541068d495ab570 var randomTmpfile = uniqueFilename(os.tmpdir()) // returns something like: /tmp/my-test-912ec803b2ce49e4a541068d495ab570 var randomPrefixedTmpfile = uniqueFilename(os.tmpdir(), 'my-test') var uniqueTmpfile = uniqueFilename('/tmp', 'testing', '/my/thing/to/uniq/on') ``` ### uniqueFilename(*dir*, *fileprefix*, *uniqstr*) → String Returns the full path of a unique filename that looks like: `dir/prefix-912ec803b2ce49e4a541068d495ab570` or `dir/912ec803b2ce49e4a541068d495ab570` *dir* – The path you want the filename in. `os.tmpdir()` is a good choice for this. *fileprefix* – A string to prepend to the unique part of the filename. The parameter is required if *uniqstr* is also passed in but is otherwise optional and can be `undefined`/`null`/`''`. If present and not empty then this string plus a hyphen are prepended to the unique part. *uniqstr* – Optional, if not passed the unique part of the resulting filename will be random. If passed in it will be generated from this string in a reproducible way. npm_3.5.2.orig/node_modules/unique-filename/index.js0000644000000000000000000000032712631326456020702 0ustar 00000000000000'use strict' var path = require('path') var uniqueSlug = require('unique-slug') module.exports = function (filepath, prefix, uniq) { return path.join(filepath, (prefix ?
prefix + '-' : '') + uniqueSlug(uniq)) } npm_3.5.2.orig/node_modules/unique-filename/node_modules/0000755000000000000000000000000012631326456021710 5ustar 00000000000000npm_3.5.2.orig/node_modules/unique-filename/package.json0000644000000000000000000000403212631326456021520 0ustar 00000000000000{ "_args": [ [ "unique-filename@^1.1.0", "/Users/ogd/Documents/projects/npm/npm" ] ], "_from": "unique-filename@>=1.1.0 <2.0.0", "_id": "unique-filename@1.1.0", "_inCache": true, "_installable": true, "_location": "/unique-filename", "_nodeVersion": "4.2.2", "_npmUser": { "email": "me@re-becca.org", "name": "iarna" }, "_npmVersion": "2.14.13", "_phantomChildren": { "imurmurhash": "0.1.4" }, "_requested": { "name": "unique-filename", "raw": "unique-filename@^1.1.0", "rawSpec": "^1.1.0", "scope": null, "spec": ">=1.1.0 <2.0.0", "type": "range" }, "_requiredBy": [ "/" ], "_resolved": "https://registry.npmjs.org/unique-filename/-/unique-filename-1.1.0.tgz", "_shasum": "d05f2fe4032560871f30e93cbe735eea201514f3", "_shrinkwrap": null, "_spec": "unique-filename@^1.1.0", "_where": "/Users/ogd/Documents/projects/npm/npm", "author": { "email": "me@re-becca.org", "name": "Rebecca Turner", "url": "http://re-becca.org/" }, "bugs": { "url": "https://github.com/iarna/unique-filename/issues" }, "dependencies": { "unique-slug": "^2.0.0" }, "description": "Generate a unique filename for use in temporary directories or caches.", "devDependencies": { "standard": "^5.4.1", "tap": "^2.3.1" }, "directories": {}, "dist": { "shasum": "d05f2fe4032560871f30e93cbe735eea201514f3", "tarball": "http://registry.npmjs.org/unique-filename/-/unique-filename-1.1.0.tgz" }, "gitHead": "cb31644c71f842258a8019e0e6ef8f2b8533a5c0", "homepage": "https://github.com/iarna/unique-filename", "keywords": [], "license": "ISC", "main": "index.js", "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" } ], "name": "unique-filename", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git+https://github.com/iarna/unique-filename.git" }, "scripts": { "test": "standard && tap test" }, "version": "1.1.0" } npm_3.5.2.orig/node_modules/unique-filename/test/0000755000000000000000000000000012631326456020212 5ustar 00000000000000npm_3.5.2.orig/node_modules/unique-filename/node_modules/unique-slug/0000755000000000000000000000000012631326456024166 5ustar 00000000000000npm_3.5.2.orig/node_modules/unique-filename/node_modules/unique-slug/.npmignore0000644000000000000000000000005212631326456026162 0ustar 00000000000000*~ .#* DEADJOE node_modules .nyc_output/ npm_3.5.2.orig/node_modules/unique-filename/node_modules/unique-slug/.travis.yml0000644000000000000000000000021312631326456026273 0ustar 00000000000000language: node_js sudo: false before_install: - "npm -g install npm" node_js: - "0.8" - "0.10" - "0.12" - "iojs" - "4" - "5" npm_3.5.2.orig/node_modules/unique-filename/node_modules/unique-slug/README.md0000644000000000000000000000071312631326456025446 0ustar 00000000000000unique-slug =========== Generate a unique character string suitable for use in files and URLs. ``` var uniqueSlug = require('unique-slug') var randomSlug = uniqueSlug() var fileSlug = uniqueSlug('/etc/passwd') ``` ### uniqueSlug(*str*) → String (8 chars) If *str* is passed in then the return value will be its murmur hash in hex. If *str* is not passed in, it will be 4 bytes converted into 8 hex characters, generated by `crypto.pseudoRandomBytes`.
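Because the slug for a given string is simply its murmur hash, the same input always produces the same slug, which makes it useful as a stable cache key. A small usage sketch (the input path below is only an illustration):

```
var uniqueSlug = require('unique-slug')

// Deterministic: the same string always hashes to the same 8-char slug.
var key = uniqueSlug('/my/source/file.js')
console.log(key === uniqueSlug('/my/source/file.js')) // true

// Random: with no argument each call returns a fresh slug,
// e.g. '7f3a9c01' (8 hex characters).
console.log(uniqueSlug())
```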
npm_3.5.2.orig/node_modules/unique-filename/node_modules/unique-slug/index.js0000644000000000000000000000126412631326456025636 0ustar 00000000000000'use strict' var crypto = require('crypto') var MurmurHash3 = require('imurmurhash') module.exports = function (uniq) { if (uniq) { var hash = new MurmurHash3(uniq) return ('00000000' + hash.result().toString(16)).substr(-8) } else { // Called without a callback, because this interface should neither block // nor error (by contrast with randomBytes which will throw an exception // without enough entropy). // // However, due to a change in Node 0.10.27+, pseudoRandomBytes is now the // same as randomBytes, and may in fact block in situations where // insufficient entropy is available. return crypto.pseudoRandomBytes(4).toString('hex') } } npm_3.5.2.orig/node_modules/unique-filename/node_modules/unique-slug/package.json0000644000000000000000000000404212631326456026454 0ustar 00000000000000{ "_args": [ [ "unique-slug@^2.0.0", "/Users/ogd/Documents/projects/npm/npm/node_modules/unique-filename" ] ], "_from": "unique-slug@>=2.0.0 <3.0.0", "_id": "unique-slug@2.0.0", "_inCache": true, "_installable": true, "_location": "/unique-filename/unique-slug", "_nodeVersion": "5.1.0", "_npmUser": { "email": "ogd@aoaioxxysz.net", "name": "othiym23" }, "_npmVersion": "3.5.1", "_phantomChildren": {}, "_requested": { "name": "unique-slug", "raw": "unique-slug@^2.0.0", "rawSpec": "^2.0.0", "scope": null, "spec": ">=2.0.0 <3.0.0", "type": "range" }, "_requiredBy": [ "/unique-filename" ], "_shasum": "db6676e7c7cc0629878ff196097c78855ae9f4ab", "_shrinkwrap": null, "_spec": "unique-slug@^2.0.0", "_where": "/Users/ogd/Documents/projects/npm/npm/node_modules/unique-filename", "author": { "email": "me@re-becca.org", "name": "Rebecca Turner", "url": "http://re-becca.org" }, "bugs": { "url": "https://github.com/iarna/unique-slug/issues" }, "dependencies": { "imurmurhash": "^0.1.4" }, "description": "Generate a unique character string suitable for use in files and URLs.", "devDependencies": { "standard": "^5.4.1", "tap": "^2.3.1" }, "directories": {}, "dist": { "shasum": "db6676e7c7cc0629878ff196097c78855ae9f4ab", "tarball": "http://registry.npmjs.org/unique-slug/-/unique-slug-2.0.0.tgz" }, "gitHead": "b1d9d082ee5bd381961a2011a9aa3d9988e83ca7", "homepage": "https://github.com/iarna/unique-slug#readme", "keywords": [], "license": "ISC", "main": "index.js", "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "name": "unique-slug", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git://github.com/iarna/unique-slug.git" }, "scripts": { "test": "standard && tap --coverage test" }, "version": "2.0.0" } npm_3.5.2.orig/node_modules/unique-filename/node_modules/unique-slug/test/0000755000000000000000000000000012631326456025145 5ustar 00000000000000npm_3.5.2.orig/node_modules/unique-filename/node_modules/unique-slug/test/index.js0000644000000000000000000000101512631326456026607 0ustar 00000000000000'use strict' var t = require('tap') var uniqueSlug = require('../index.js') t.plan(5) var slugA = uniqueSlug() t.is(slugA.length, 8, 'random slugs are 8 chars') t.notEqual(slugA, uniqueSlug(), "two slugs aren't the same") var base = '/path/to/thingy' var slugB = uniqueSlug(base) t.is(slugB.length, 8, 'string based slugs are 8 chars') t.is(slugB, uniqueSlug(base), 'two string based slugs, from the same string are the same') t.notEqual(slugB, uniqueSlug(slugA),
'two string based slugs, from diff strings are different') npm_3.5.2.orig/node_modules/unique-filename/test/index.js0000644000000000000000000000164412631326456021664 0ustar 00000000000000'use strict' var t = require('tap') var uniqueFilename = require('../index.js') t.plan(6) var randomTmpfile = uniqueFilename('tmp') t.like(randomTmpfile, /^tmp.[a-f0-9]{8}$/, 'random tmp file') var randomAgain = uniqueFilename('tmp') t.notEqual(randomAgain, randomTmpfile, 'random tmp files are not the same') var randomPrefixedTmpfile = uniqueFilename('tmp', 'my-test') t.like(randomPrefixedTmpfile, /^tmp.my-test-[a-f0-9]{8}$/, 'random prefixed tmp file') var randomPrefixedAgain = uniqueFilename('tmp', 'my-test') t.notEqual(randomPrefixedAgain, randomPrefixedTmpfile, 'random prefixed tmp files are not the same') var uniqueTmpfile = uniqueFilename('tmp', 'testing', '/my/thing/to/uniq/on') t.like(uniqueTmpfile, /^tmp.testing-7ddd44c0$/, 'unique filename') var uniqueAgain = uniqueFilename('tmp', 'testing', '/my/thing/to/uniq/on') t.is(uniqueTmpfile, uniqueAgain, 'same unique string component produces same filename') npm_3.5.2.orig/node_modules/unpipe/HISTORY.md0000644000000000000000000000007312631326456017132 0ustar 000000000000001.0.0 / 2015-06-14 ================== * Initial release npm_3.5.2.orig/node_modules/unpipe/LICENSE0000644000000000000000000000213212631326456016452 0ustar 00000000000000(The MIT License) Copyright (c) 2015 Douglas Christopher Wilson Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/unpipe/README.md0000644000000000000000000000234212631326456016727 0ustar 00000000000000# unpipe [![NPM Version][npm-image]][npm-url] [![NPM Downloads][downloads-image]][downloads-url] [![Node.js Version][node-image]][node-url] [![Build Status][travis-image]][travis-url] [![Test Coverage][coveralls-image]][coveralls-url] Unpipe a stream from all destinations. ## Installation ```sh $ npm install unpipe ``` ## API ```js var unpipe = require('unpipe') ``` ### unpipe(stream) Unpipes all destinations from a given stream. With stream 2+, this is equivalent to `stream.unpipe()`. When used with streams 1 style streams (typically Node.js 0.8 and below), this module attempts to undo the actions done in `stream.pipe(dest)`.
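A brief usage sketch (the file names here are illustrative, not part of the module):

```js
var fs = require('fs')
var unpipe = require('unpipe')

var source = fs.createReadStream('input.txt')
source.pipe(process.stdout)
source.pipe(fs.createWriteStream('copy.txt'))

// Detach `source` from both destinations at once, e.g. when an
// operation is aborted partway through streaming.
unpipe(source)
```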
## License [MIT](LICENSE) [npm-image]: https://img.shields.io/npm/v/unpipe.svg [npm-url]: https://npmjs.org/package/unpipe [node-image]: https://img.shields.io/node/v/unpipe.svg [node-url]: http://nodejs.org/download/ [travis-image]: https://img.shields.io/travis/stream-utils/unpipe.svg [travis-url]: https://travis-ci.org/stream-utils/unpipe [coveralls-image]: https://img.shields.io/coveralls/stream-utils/unpipe.svg [coveralls-url]: https://coveralls.io/r/stream-utils/unpipe?branch=master [downloads-image]: https://img.shields.io/npm/dm/unpipe.svg [downloads-url]: https://npmjs.org/package/unpipe npm_3.5.2.orig/node_modules/unpipe/index.js0000644000000000000000000000213612631326456017116 0ustar 00000000000000/*! * unpipe * Copyright(c) 2015 Douglas Christopher Wilson * MIT Licensed */ 'use strict' /** * Module exports. * @public */ module.exports = unpipe /** * Determine if there are Node.js pipe-like data listeners. * @private */ function hasPipeDataListeners(stream) { var listeners = stream.listeners('data') for (var i = 0; i < listeners.length; i++) { if (listeners[i].name === 'ondata') { return true } } return false } /** * Unpipe a stream from all destinations. * * @param {object} stream * @public */ function unpipe(stream) { if (!stream) { throw new TypeError('argument stream is required') } if (typeof stream.unpipe === 'function') { // new-style stream.unpipe() return } // Node.js 0.8 hack if (!hasPipeDataListeners(stream)) { return } var listener var listeners = stream.listeners('close') for (var i = 0; i < listeners.length; i++) { listener = listeners[i] if (listener.name !== 'cleanup' && listener.name !== 'onclose') { continue } // invoke the listener listener.call(stream) } } npm_3.5.2.orig/node_modules/unpipe/package.json0000644000000000000000000000474712631326456017751 0ustar 00000000000000{ "name": "unpipe", "description": "Unpipe a stream from all destinations", "version": "1.0.0", "author": { "name": "Douglas Christopher Wilson", "email": "doug@somethingdoug.com" }, "license": "MIT", "repository": { "type": "git", "url": "git+https://github.com/stream-utils/unpipe.git" }, "devDependencies": { "istanbul": "0.3.15", "mocha": "2.2.5", "readable-stream": "1.1.13" }, "files": [ "HISTORY.md", "LICENSE", "README.md", "index.js" ], "engines": { "node": ">= 0.8" }, "scripts": { "test": "mocha --reporter spec --bail --check-leaks test/", "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/", "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/" }, "readme": "# unpipe\n\n[![NPM Version][npm-image]][npm-url]\n[![NPM Downloads][downloads-image]][downloads-url]\n[![Node.js Version][node-image]][node-url]\n[![Build Status][travis-image]][travis-url]\n[![Test Coverage][coveralls-image]][coveralls-url]\n\nUnpipe a stream from all destinations.\n\n## Installation\n\n```sh\n$ npm install unpipe\n```\n\n## API\n\n```js\nvar unpipe = require('unpipe')\n```\n\n### unpipe(stream)\n\nUnpipes all destinations from a given stream. With stream 2+, this is\nequivalent to `stream.unpipe()`. 
When used with streams 1 style streams\n(typically Node.js 0.8 and below), this module attempts to undo the\nactions done in `stream.pipe(dest)`.\n\n## License\n\n[MIT](LICENSE)\n\n[npm-image]: https://img.shields.io/npm/v/unpipe.svg\n[npm-url]: https://npmjs.org/package/unpipe\n[node-image]: https://img.shields.io/node/v/unpipe.svg\n[node-url]: http://nodejs.org/download/\n[travis-image]: https://img.shields.io/travis/stream-utils/unpipe.svg\n[travis-url]: https://travis-ci.org/stream-utils/unpipe\n[coveralls-image]: https://img.shields.io/coveralls/stream-utils/unpipe.svg\n[coveralls-url]: https://coveralls.io/r/stream-utils/unpipe?branch=master\n[downloads-image]: https://img.shields.io/npm/dm/unpipe.svg\n[downloads-url]: https://npmjs.org/package/unpipe\n", "readmeFilename": "README.md", "bugs": { "url": "https://github.com/stream-utils/unpipe/issues" }, "homepage": "https://github.com/stream-utils/unpipe#readme", "_id": "unpipe@1.0.0", "_shasum": "b2bf4ee8514aae6165b4817829d21b2ef49904ec", "_resolved": "https://registry.npmjs.org/unpipe/-/unpipe-1.0.0.tgz", "_from": "unpipe@>=1.0.0 <1.1.0" } npm_3.5.2.orig/node_modules/validate-npm-package-license/LICENSE0000644000000000000000000002205712631326456022554 0ustar 00000000000000SPDX:Apache-2.0 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. 
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. 
You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. 
In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. npm_3.5.2.orig/node_modules/validate-npm-package-license/README.md0000644000000000000000000000500112631326456023014 0ustar 00000000000000validate-npm-package-license ============================ Give me a string and I'll tell you if it's a valid npm package license string. ```javascript var valid = require('validate-npm-package-license'); ``` SPDX license identifiers are valid license strings: ```javascript var assert = require('assert'); var validSPDXExpression = { validForNewPackages: true, validForOldPackages: true, spdx: true }; assert.deepEqual(valid('MIT'), validSPDXExpression); assert.deepEqual(valid('BSD-2-Clause'), validSPDXExpression); assert.deepEqual(valid('Apache-2.0'), validSPDXExpression); assert.deepEqual(valid('ISC'), validSPDXExpression); ``` The function will return a warning and suggestion for nearly-correct license identifiers: ```javascript assert.deepEqual( valid('Apache 2.0'), { validForOldPackages: false, validForNewPackages: false, warnings: [ 'license should be ' + 'a valid SPDX license expression (without "LicenseRef"), ' + '"UNLICENSED", or ' + '"SEE LICENSE IN "', 'license is similar to the valid expression "Apache-2.0"' ] } ); ``` SPDX expressions are valid, too ... ```javascript // Simple SPDX license expression for dual licensing assert.deepEqual( valid('(GPL-3.0 OR BSD-2-Clause)'), validSPDXExpression ); ``` ... 
except if they contain `LicenseRef`: ```javascript var warningAboutLicenseRef = { validForOldPackages: false, validForNewPackages: false, spdx: true, warnings: [ 'license should be ' + 'a valid SPDX license expression (without "LicenseRef"), ' + '"UNLICENSED", or ' + '"SEE LICENSE IN "', ] }; assert.deepEqual( valid('LicenseRef-Made-Up'), warningAboutLicenseRef ); assert.deepEqual( valid('(MIT OR LicenseRef-Made-Up)'), warningAboutLicenseRef ); ``` If you can't describe your licensing terms with standardized SPDX identifiers, put the terms in a file in the package and point users there: ```javascript assert.deepEqual( valid('SEE LICENSE IN LICENSE.txt'), { validForNewPackages: true, validForOldPackages: true, inFile: 'LICENSE.txt' } ); assert.deepEqual( valid('SEE LICENSE IN license.md'), { validForNewPackages: true, validForOldPackages: true, inFile: 'license.md' } ); ``` If there aren't any licensing terms, use `UNLICENSED`: ```javascript var unlicensed = { validForNewPackages: true, validForOldPackages: true, unlicensed: true }; assert.deepEqual(valid('UNLICENSED'), unlicensed); assert.deepEqual(valid('UNLICENCED'), unlicensed); ``` npm_3.5.2.orig/node_modules/validate-npm-package-license/index.js0000644000000000000000000000350712631326456023213 0ustar 00000000000000var parse = require('spdx-expression-parse'); var correct = require('spdx-correct'); var genericWarning = ( 'license should be ' + 'a valid SPDX license expression (without "LicenseRef"), ' + '"UNLICENSED", or ' + '"SEE LICENSE IN "' ); var fileReferenceRE = /^SEE LICEN[CS]E IN (.+)$/; function startsWith(prefix, string) { return string.slice(0, prefix.length) === prefix; } function usesLicenseRef(ast) { if (ast.hasOwnProperty('license')) { var license = ast.license; return ( startsWith('LicenseRef', license) || startsWith('DocumentRef', license) ); } else { return ( usesLicenseRef(ast.left) || usesLicenseRef(ast.right) ); } } module.exports = function(argument) { var ast; try { ast = parse(argument); } catch (e) { var match if ( argument === 'UNLICENSED' || argument === 'UNLICENCED' ) { return { validForOldPackages: true, validForNewPackages: true, unlicensed: true }; } else if (match = fileReferenceRE.exec(argument)) { return { validForOldPackages: true, validForNewPackages: true, inFile: match[1] }; } else { var result = { validForOldPackages: false, validForNewPackages: false, warnings: [genericWarning] }; var corrected = correct(argument); if (corrected) { result.warnings.push( 'license is similar to the valid expression "' + corrected + '"' ); } return result; } } if (usesLicenseRef(ast)) { return { validForNewPackages: false, validForOldPackages: false, spdx: true, warnings: [genericWarning] }; } else { return { validForNewPackages: true, validForOldPackages: true, spdx: true }; } }; npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/0000755000000000000000000000000012631326456024216 5ustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/package.json0000644000000000000000000000351012631326456024026 0ustar 00000000000000{ "name": "validate-npm-package-license", "description": "Give me a string and I'll tell you if it's a valid npm package license string", "version": "3.0.1", "author": { "name": "Kyle E. 
Mitchell", "email": "kyle@kemitchell.com", "url": "https://kemitchell.com" }, "dependencies": { "spdx-correct": "~1.0.0", "spdx-expression-parse": "~1.0.0" }, "devDependencies": { "defence-cli": "^1.0.1", "replace-require-self": "^1.0.0" }, "keywords": [ "license", "npm", "package", "validation" ], "license": "Apache-2.0", "repository": { "type": "git", "url": "git+https://github.com/kemitchell/validate-npm-package-license.js.git" }, "scripts": { "test": "defence README.md | replace-require-self | node" }, "gitHead": "00200d28f9960985f221bc1a8a71e4760daf39bf", "bugs": { "url": "https://github.com/kemitchell/validate-npm-package-license.js/issues" }, "homepage": "https://github.com/kemitchell/validate-npm-package-license.js#readme", "_id": "validate-npm-package-license@3.0.1", "_shasum": "2804babe712ad3379459acfbe24746ab2c303fbc", "_from": "validate-npm-package-license@3.0.1", "_npmVersion": "2.13.5", "_nodeVersion": "0.12.7", "_npmUser": { "name": "kemitchell", "email": "kyle@kemitchell.com" }, "dist": { "shasum": "2804babe712ad3379459acfbe24746ab2c303fbc", "tarball": "http://registry.npmjs.org/validate-npm-package-license/-/validate-npm-package-license-3.0.1.tgz" }, "maintainers": [ { "name": "kemitchell", "email": "kyle@kemitchell.com" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/validate-npm-package-license/-/validate-npm-package-license-3.0.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-correct/0000755000000000000000000000000012631326456026633 5ustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/0000755000000000000000000000000012631326456030501 5ustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-license-ids/0000755000000000000000000000000012631326456027371 5ustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-correct/README.md0000644000000000000000000000034112631326456030110 0ustar 00000000000000```javascript var correct = require('spdx-correct'); var assert = require('assert'); assert.equal(correct('mit'), 'MIT') assert.equal(correct('Apache 2'), 'Apache-2.0') assert(correct('No idea what license') === null) ``` npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-correct/index.js0000644000000000000000000001330512631326456030302 0ustar 00000000000000var licenseIDs = require('spdx-license-ids'); function valid(string) { return licenseIDs.indexOf(string) > -1; } // Common transpositions of license identifier acronyms var transpositions = [ ['APGL', 'AGPL'], ['Gpl', 'GPL'], ['GLP', 'GPL'], ['APL', 'Apache'], ['ISD', 'ISC'], ['GLP', 'GPL'], ['IST', 'ISC'], ['Claude', 'Clause'], [' or later', '+'], [' International', ''], ['GNU', 'GPL'], ['GUN', 'GPL'], ['+', ''], ['GNU GPL', 'GPL'], ['GNU/GPL', 'GPL'], ['GNU GLP', 'GPL'], ['GNU General Public License', 'GPL'], ['Gnu public license', 'GPL'], ['GNU Public License', 'GPL'], ['GNU GENERAL PUBLIC LICENSE', 'GPL'], ['MTI', 'MIT'], ['Mozilla Public License', 'MPL'], ['WTH', 'WTF'], ['-License', ''] ]; var TRANSPOSED = 0; var CORRECT = 1; // Simple corrections to nearly valid identifiers. var transforms = [ // e.g. 'mit' function(argument) { return argument.toUpperCase(); }, // e.g. 'MIT ' function(argument) { return argument.trim(); }, // e.g. 'M.I.T.' function(argument) { return argument.replace(/\./g, ''); }, // e.g. 
'Apache- 2.0' function(argument) { return argument.replace(/\s+/g, ''); }, // e.g. 'CC BY 4.0' function(argument) { return argument.replace(/\s+/g, '-'); }, // e.g. 'LGPLv2.1' function(argument) { return argument.replace('v', '-'); }, // e.g. 'Apache 2.0' function(argument) { return argument.replace(/,?\s*(\d)/, '-$1'); }, // e.g. 'GPL 2' function(argument) { return argument.replace(/,?\s*(\d)/, '-$1.0'); }, // e.g. 'Apache Version 2.0' function(argument) { return argument.replace(/,?\s*(V\.|v\.|V|v|Version|version)\s*(\d)/, '-$2'); }, // e.g. 'Apache Version 2' function(argument) { return argument.replace(/,?\s*(V\.|v\.|V|v|Version|version)\s*(\d)/, '-$2.0'); }, // e.g. 'ZLIB' function(argument) { return argument[0].toUpperCase() + argument.slice(1); }, // e.g. 'MPL/2.0' function(argument) { return argument.replace('/', '-'); }, // e.g. 'Apache 2' function(argument) { return argument .replace(/\s*V\s*(\d)/, '-$1') .replace(/(\d)$/, '$1.0'); }, // e.g. 'GPL-2.0-' function(argument) { return argument.slice(0, argument.length - 1); }, // e.g. 'GPL2' function(argument) { return argument.replace(/(\d)$/, '-$1.0'); }, // e.g. 'BSD 3' function(argument) { return argument.replace(/(-| )?(\d)$/, '-$2-Clause'); }, // e.g. 'BSD clause 3' function(argument) { return argument.replace(/(-| )clause(-| )(\d)/, '-$3-Clause'); }, // e.g. 'BY-NC-4.0' function(argument) { return 'CC-' + argument; }, // e.g. 'BY-NC' function(argument) { return 'CC-' + argument + '-4.0'; }, // e.g. 'Attribution-NonCommercial' function(argument) { return argument .replace('Attribution', 'BY') .replace('NonCommercial', 'NC') .replace('NoDerivatives', 'ND') .replace(/ (\d)/, '-$1') .replace(/ ?International/, ''); }, // e.g. 'Attribution-NonCommercial' function(argument) { return 'CC-' + argument .replace('Attribution', 'BY') .replace('NonCommercial', 'NC') .replace('NoDerivatives', 'ND') .replace(/ (\d)/, '-$1') .replace(/ ?International/, '') + '-4.0'; } ]; // If all else fails, guess that strings containing certain substrings // were meant to identify certain licenses.
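// For instance, 'The Apache Software License' upper-cases to a string
// containing 'APACHE', so validLastResort() below maps it to 'Apache-2.0';
// the table is scanned top to bottom and the first matching substring wins.
// (That example input is illustrative, not taken from the test suite.)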
var lastResorts = [ ['UNLI', 'Unlicense'], ['WTF', 'WTFPL'], ['2 CLAUSE', 'BSD-2-Clause'], ['2-CLAUSE', 'BSD-2-Clause'], ['3 CLAUSE', 'BSD-3-Clause'], ['3-CLAUSE', 'BSD-3-Clause'], ['AFFERO', 'AGPL-3.0'], ['AGPL', 'AGPL-3.0'], ['APACHE', 'Apache-2.0'], ['ARTISTIC', 'Artistic-2.0'], ['Affero', 'AGPL-3.0'], ['BEER', 'Beerware'], ['BOOST', 'BSL-1.0'], ['BSD', 'BSD-2-Clause'], ['ECLIPSE', 'EPL-1.0'], ['FUCK', 'WTFPL'], ['GNU', 'GPL-3.0'], ['LGPL', 'LGPL-3.0'], ['GPL', 'GPL-3.0'], ['MIT', 'MIT'], ['MPL', 'MPL-2.0'], ['X11', 'X11'], ['ZLIB', 'Zlib'] ]; var SUBSTRING = 0; var IDENTIFIER = 1; var validTransformation = function(identifier) { for (var i = 0; i < transforms.length; i++) { var transformed = transforms[i](identifier); if (transformed !== identifier && valid(transformed)) { return transformed; } } return null; }; var validLastResort = function(identifier) { var upperCased = identifier.toUpperCase(); for (var i = 0; i < lastResorts.length; i++) { var lastResort = lastResorts[i]; if (upperCased.indexOf(lastResort[SUBSTRING]) > -1) { return lastResort[IDENTIFIER]; } } return null; }; var anyCorrection = function(identifier, check) { for (var i = 0; i < transpositions.length; i++) { var transposition = transpositions[i]; var transposed = transposition[TRANSPOSED]; if (identifier.indexOf(transposed) > -1) { var corrected = identifier.replace( transposed, transposition[CORRECT] ); var checked = check(corrected); if (checked !== null) { return checked; } } } return null; }; module.exports = function(identifier) { identifier = identifier.replace(/\+$/, ''); if (valid(identifier)) { return identifier; } var transformed = validTransformation(identifier); if (transformed !== null) { return transformed; } transformed = anyCorrection(identifier, function(argument) { if (valid(argument)) { return argument; } return validTransformation(argument); }); if (transformed !== null) { return transformed; } transformed = validLastResort(identifier); if (transformed !== null) { return transformed; } transformed = anyCorrection(identifier, validLastResort); if (transformed !== null) { return transformed; } return null; }; npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-correct/package.json0000644000000000000000000000327012631326456031123 0ustar 00000000000000{ "name": "spdx-correct", "description": "correct invalid SPDX identifiers", "version": "1.0.1", "author": { "name": "Kyle E. 
Mitchell", "email": "kyle@kemitchell.com", "url": "https://kemitchell.com" }, "dependencies": { "spdx-license-ids": "^1.0.2" }, "devDependencies": { "defence-cli": "^1.0.1", "replace-require-self": "^1.0.0", "spdx-expression-parse": "^1.0.0", "tape": "~4.0.0" }, "keywords": [ "SPDX", "law", "legal", "license", "metadata" ], "license": "Apache-2.0", "repository": { "type": "git", "url": "git+https://github.com/kemitchell/spdx-correct.js.git" }, "scripts": { "test": "defence README.md | replace-require-self | node && tape *.test.js" }, "gitHead": "f3581dea1529d975851ceab7f86e646d8220608a", "bugs": { "url": "https://github.com/kemitchell/spdx-correct.js/issues" }, "homepage": "https://github.com/kemitchell/spdx-correct.js#readme", "_id": "spdx-correct@1.0.1", "_shasum": "ac075f5f2f6a06c0bfdd1c847eb3dde3dd8221ea", "_from": "spdx-correct@>=1.0.0 <1.1.0", "_npmVersion": "2.13.5", "_nodeVersion": "0.12.7", "_npmUser": { "name": "kemitchell", "email": "kyle@kemitchell.com" }, "dist": { "shasum": "ac075f5f2f6a06c0bfdd1c847eb3dde3dd8221ea", "tarball": "http://registry.npmjs.org/spdx-correct/-/spdx-correct-1.0.1.tgz" }, "maintainers": [ { "name": "kemitchell", "email": "kyle@kemitchell.com" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/spdx-correct/-/spdx-correct-1.0.1.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/LICENSE0000644000000000000000000000211012631326456031500 0ustar 00000000000000SPDX:MIT MIT License Copyright (c) 2015 Kyle E. Mitchell Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
././@LongLink0000000000000000000000000000014600000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/README.mdnpm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/README.m0000644000000000000000000000177312631326456031624 0ustar 00000000000000```javascript var parse = require('./') var assert = require('assert') var firstAST = { left: { license: 'LGPL-2.1' }, conjunction: 'or', right: { left: { license: 'BSD-3-Clause' }, conjunction: 'and', right: { license: 'MIT' } } } assert.deepEqual( parse('(LGPL-2.1 OR BSD-3-Clause AND MIT)'), firstAST) var secondAST = { left: { license: 'MIT' }, conjunction: 'and', right: { left: { license: 'LGPL-2.1', plus: true }, conjunction: 'and', right: { license: 'BSD-3-Clause' } } } assert.deepEqual( parse('(MIT AND (LGPL-2.1+ AND BSD-3-Clause))'), secondAST) ``` --- [The Software Package Data Exchange (SPDX) specification](http://spdx.org) is the work of the [Linux Foundation](http://www.linuxfoundation.org) and its contributors, and is licensed under the terms of [the Creative Commons Attribution License 3.0 Unported (SPDX: "CC-BY-3.0")](http://spdx.org/licenses/CC-BY-3.0). "SPDX" is a United States federally registered trademark of the Linux Foundation. npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/index.js0000644000000000000000000000017612631326456032152 0ustar 00000000000000var parser = require('./parser.generated.js').parser module.exports = function(argument) { return parser.parse(argument) } ././@LongLink0000000000000000000000000000015200000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/node_modules/npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/node_mod0000755000000000000000000000000012631326456032206 5ustar 00000000000000././@LongLink0000000000000000000000000000015100000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/package.jsonnpm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/package.0000644000000000000000000000350612631326456032101 0ustar 00000000000000{ "name": "spdx-expression-parse", "description": "parse SPDX license expressions", "version": "1.0.0", "author": { "name": "Kyle E. 
Mitchell", "email": "kyle@kemitchell.com", "url": "http://kemitchell.com" }, "dependencies": { "spdx-exceptions": "^1.0.0", "spdx-license-ids": "^1.0.0" }, "devDependencies": { "defence-cli": "^1.0.1", "jison": "^0.4.15" }, "keywords": [ "SPDX", "law", "legal", "license", "metadata", "package", "package.json", "standards" ], "license": "(MIT AND CC-BY-3.0)", "repository": { "type": "git", "url": "git+https://github.com/kemitchell/spdx-expression-parse.js.git" }, "scripts": { "generate": "node generate-parser.js > parser.generated.js", "prepublish": "npm run generate", "pretest": "npm run generate", "test": "defence -i javascript README.md | node" }, "gitHead": "213bc03808f709a4ceaadb8466740a8c96c1e896", "bugs": { "url": "https://github.com/kemitchell/spdx-expression-parse.js/issues" }, "homepage": "https://github.com/kemitchell/spdx-expression-parse.js#readme", "_id": "spdx-expression-parse@1.0.0", "_shasum": "4fbb7e738c9e98fa0b0914dfd961ac6629fbcdef", "_from": "spdx-expression-parse@>=1.0.0 <1.1.0", "_npmVersion": "2.13.3", "_nodeVersion": "0.12.7", "_npmUser": { "name": "kemitchell", "email": "kyle@kemitchell.com" }, "dist": { "shasum": "4fbb7e738c9e98fa0b0914dfd961ac6629fbcdef", "tarball": "http://registry.npmjs.org/spdx-expression-parse/-/spdx-expression-parse-1.0.0.tgz" }, "maintainers": [ { "name": "kemitchell", "email": "kyle@kemitchell.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/spdx-expression-parse/-/spdx-expression-parse-1.0.0.tgz", "readme": "ERROR: No README data found!" } ././@LongLink0000000000000000000000000000016000000000000011212 Lustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/parser.generated.jsnpm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/parser.g0000644000000000000000000011015312631326456032146 0ustar 00000000000000/* parser generated by jison 0.4.15 */ /* Returns a Parser object of the following structure: Parser: { yy: {} } Parser.prototype: { yy: {}, trace: function(), symbols_: {associative list: name ==> number}, terminals_: {associative list: number ==> name}, productions_: [...], performAction: function anonymous(yytext, yyleng, yylineno, yy, yystate, $$, _$), table: [...], defaultActions: {...}, parseError: function(str, hash), parse: function(input), lexer: { EOF: 1, parseError: function(str, hash), setInput: function(input), input: function(), unput: function(str), more: function(), less: function(n), pastInput: function(), upcomingInput: function(), showPosition: function(), test_match: function(regex_match_array, rule_index), next: function(), lex: function(), begin: function(condition), popState: function(), _currentRules: function(), topState: function(), pushState: function(condition), options: { ranges: boolean (optional: true ==> token location info will include a .range[] member) flex: boolean (optional: true ==> flex-like lexing behaviour where the rules are tested exhaustively to find the longest match) backtrack_lexer: boolean (optional: true ==> lexer regexes are tested in order and for each matching regex the action code is invoked; the lexer terminates the scan when a token is returned by the action code) }, performAction: function(yy, yy_, $avoiding_name_collisions, YY_START), rules: [...], conditions: {associative list: name ==> set}, } } token location info (@$, _$, etc.): { first_line: n, last_line: n, first_column: n, last_column: n, range: [start_number, end_number] (where the numbers are indexes into the 
input string, regular zero-based) } the parseError function receives a 'hash' object with these members for lexer and parser errors: { text: (matched text) token: (the produced terminal token, if any) line: (yylineno) } while parser (grammar) errors will also provide these members, i.e. parser errors deliver a superset of attributes: { loc: (yylloc) expected: (string describing the set of expected tokens) recoverable: (boolean: TRUE when the parser has a error recovery rule available for this particular error) } */ var spdxparse = (function(){ var o=function(k,v,o,l){for(o=o||{},l=k.length;l--;o[k[l]]=v);return o},$V0=[1,5],$V1=[1,6],$V2=[1,7],$V3=[1,4],$V4=[1,9],$V5=[1,10],$V6=[5,14,15,17],$V7=[5,12,14,15,17]; var parser = {trace: function trace() { }, yy: {}, symbols_: {"error":2,"start":3,"expression":4,"EOS":5,"simpleExpression":6,"LICENSE":7,"PLUS":8,"LICENSEREF":9,"DOCUMENTREF":10,"COLON":11,"WITH":12,"EXCEPTION":13,"AND":14,"OR":15,"OPEN":16,"CLOSE":17,"$accept":0,"$end":1}, terminals_: {2:"error",5:"EOS",7:"LICENSE",8:"PLUS",9:"LICENSEREF",10:"DOCUMENTREF",11:"COLON",12:"WITH",13:"EXCEPTION",14:"AND",15:"OR",16:"OPEN",17:"CLOSE"}, productions_: [0,[3,2],[6,1],[6,2],[6,1],[6,3],[4,1],[4,3],[4,3],[4,3],[4,3]], performAction: function anonymous(yytext, yyleng, yylineno, yy, yystate /* action[1] */, $$ /* vstack */, _$ /* lstack */) { /* this == yyval */ var $0 = $$.length - 1; switch (yystate) { case 1: return this.$ = $$[$0-1]; break; case 2: case 4: case 5: this.$ = { license: yytext }; break; case 3: this.$ = { license: $$[$0-1], plus: true }; break; case 6: this.$ = $$[$0]; break; case 7: this.$ = { exception: $$[$0] }; this.$.license = $$[$0-2].license; if ($$[$0-2].hasOwnProperty('plus')) { this.$.plus = $$[$0-2].plus; } break; case 8: this.$ = { conjunction: 'and', left: $$[$0-2], right: $$[$0] }; break; case 9: this.$ = { conjunction: 'or', left: $$[$0-2], right: $$[$0] }; break; case 10: this.$ = $$[$0-1] break; } }, table: [{3:1,4:2,6:3,7:$V0,9:$V1,10:$V2,16:$V3},{1:[3]},{5:[1,8],14:$V4,15:$V5},o($V6,[2,6],{12:[1,11]}),{4:12,6:3,7:$V0,9:$V1,10:$V2,16:$V3},o($V7,[2,2],{8:[1,13]}),o($V7,[2,4]),{11:[1,14]},{1:[2,1]},{4:15,6:3,7:$V0,9:$V1,10:$V2,16:$V3},{4:16,6:3,7:$V0,9:$V1,10:$V2,16:$V3},{13:[1,17]},{14:$V4,15:$V5,17:[1,18]},o($V7,[2,3]),{9:[1,19]},o($V6,[2,8]),o([5,15,17],[2,9],{14:$V4}),o($V6,[2,7]),o($V6,[2,10]),o($V7,[2,5])], defaultActions: {8:[2,1]}, parseError: function parseError(str, hash) { if (hash.recoverable) { this.trace(str); } else { throw new Error(str); } }, parse: function parse(input) { var self = this, stack = [0], tstack = [], vstack = [null], lstack = [], table = this.table, yytext = '', yylineno = 0, yyleng = 0, recovering = 0, TERROR = 2, EOF = 1; var args = lstack.slice.call(arguments, 1); var lexer = Object.create(this.lexer); var sharedState = { yy: {} }; for (var k in this.yy) { if (Object.prototype.hasOwnProperty.call(this.yy, k)) { sharedState.yy[k] = this.yy[k]; } } lexer.setInput(input, sharedState.yy); sharedState.yy.lexer = lexer; sharedState.yy.parser = this; if (typeof lexer.yylloc == 'undefined') { lexer.yylloc = {}; } var yyloc = lexer.yylloc; lstack.push(yyloc); var ranges = lexer.options && lexer.options.ranges; if (typeof sharedState.yy.parseError === 'function') { this.parseError = sharedState.yy.parseError; } else { this.parseError = Object.getPrototypeOf(this).parseError; } function popStack(n) { stack.length = stack.length - 2 * n; vstack.length = vstack.length - n; lstack.length = lstack.length - n; } _token_stack: function lex() 
{ var token; token = lexer.lex() || EOF; if (typeof token !== 'number') { token = self.symbols_[token] || token; } return token; } var symbol, preErrorSymbol, state, action, a, r, yyval = {}, p, len, newState, expected; while (true) { state = stack[stack.length - 1]; if (this.defaultActions[state]) { action = this.defaultActions[state]; } else { if (symbol === null || typeof symbol == 'undefined') { symbol = lex(); } action = table[state] && table[state][symbol]; } if (typeof action === 'undefined' || !action.length || !action[0]) { var errStr = ''; expected = []; for (p in table[state]) { if (this.terminals_[p] && p > TERROR) { expected.push('\'' + this.terminals_[p] + '\''); } } if (lexer.showPosition) { errStr = 'Parse error on line ' + (yylineno + 1) + ':\n' + lexer.showPosition() + '\nExpecting ' + expected.join(', ') + ', got \'' + (this.terminals_[symbol] || symbol) + '\''; } else { errStr = 'Parse error on line ' + (yylineno + 1) + ': Unexpected ' + (symbol == EOF ? 'end of input' : '\'' + (this.terminals_[symbol] || symbol) + '\''); } this.parseError(errStr, { text: lexer.match, token: this.terminals_[symbol] || symbol, line: lexer.yylineno, loc: yyloc, expected: expected }); } if (action[0] instanceof Array && action.length > 1) { throw new Error('Parse Error: multiple actions possible at state: ' + state + ', token: ' + symbol); } switch (action[0]) { case 1: stack.push(symbol); vstack.push(lexer.yytext); lstack.push(lexer.yylloc); stack.push(action[1]); symbol = null; if (!preErrorSymbol) { yyleng = lexer.yyleng; yytext = lexer.yytext; yylineno = lexer.yylineno; yyloc = lexer.yylloc; if (recovering > 0) { recovering--; } } else { symbol = preErrorSymbol; preErrorSymbol = null; } break; case 2: len = this.productions_[action[1]][1]; yyval.$ = vstack[vstack.length - len]; yyval._$ = { first_line: lstack[lstack.length - (len || 1)].first_line, last_line: lstack[lstack.length - 1].last_line, first_column: lstack[lstack.length - (len || 1)].first_column, last_column: lstack[lstack.length - 1].last_column }; if (ranges) { yyval._$.range = [ lstack[lstack.length - (len || 1)].range[0], lstack[lstack.length - 1].range[1] ]; } r = this.performAction.apply(yyval, [ yytext, yyleng, yylineno, sharedState.yy, action[1], vstack, lstack ].concat(args)); if (typeof r !== 'undefined') { return r; } if (len) { stack = stack.slice(0, -1 * len * 2); vstack = vstack.slice(0, -1 * len); lstack = lstack.slice(0, -1 * len); } stack.push(this.productions_[action[1]][0]); vstack.push(yyval.$); lstack.push(yyval._$); newState = table[stack[stack.length - 2]][stack[stack.length - 1]]; stack.push(newState); break; case 3: return true; } } return true; }}; /* generated by jison-lex 0.3.4 */ var lexer = (function(){ var lexer = ({ EOF:1, parseError:function parseError(str, hash) { if (this.yy.parser) { this.yy.parser.parseError(str, hash); } else { throw new Error(str); } }, // resets the lexer, sets new input setInput:function (input, yy) { this.yy = yy || this.yy || {}; this._input = input; this._more = this._backtrack = this.done = false; this.yylineno = this.yyleng = 0; this.yytext = this.matched = this.match = ''; this.conditionStack = ['INITIAL']; this.yylloc = { first_line: 1, first_column: 0, last_line: 1, last_column: 0 }; if (this.options.ranges) { this.yylloc.range = [0,0]; } this.offset = 0; return this; }, // consumes and returns one char from the input input:function () { var ch = this._input[0]; this.yytext += ch; this.yyleng++; this.offset++; this.match += ch; this.matched += ch; var lines = 
ch.match(/(?:\r\n?|\n).*/g); if (lines) { this.yylineno++; this.yylloc.last_line++; } else { this.yylloc.last_column++; } if (this.options.ranges) { this.yylloc.range[1]++; } this._input = this._input.slice(1); return ch; }, // unshifts one char (or a string) into the input unput:function (ch) { var len = ch.length; var lines = ch.split(/(?:\r\n?|\n)/g); this._input = ch + this._input; this.yytext = this.yytext.substr(0, this.yytext.length - len); //this.yyleng -= len; this.offset -= len; var oldLines = this.match.split(/(?:\r\n?|\n)/g); this.match = this.match.substr(0, this.match.length - 1); this.matched = this.matched.substr(0, this.matched.length - 1); if (lines.length - 1) { this.yylineno -= lines.length - 1; } var r = this.yylloc.range; this.yylloc = { first_line: this.yylloc.first_line, last_line: this.yylineno + 1, first_column: this.yylloc.first_column, last_column: lines ? (lines.length === oldLines.length ? this.yylloc.first_column : 0) + oldLines[oldLines.length - lines.length].length - lines[0].length : this.yylloc.first_column - len }; if (this.options.ranges) { this.yylloc.range = [r[0], r[0] + this.yyleng - len]; } this.yyleng = this.yytext.length; return this; }, // When called from action, caches matched text and appends it on next action more:function () { this._more = true; return this; }, // When called from action, signals the lexer that this rule fails to match the input, so the next matching rule (regex) should be tested instead. reject:function () { if (this.options.backtrack_lexer) { this._backtrack = true; } else { return this.parseError('Lexical error on line ' + (this.yylineno + 1) + '. You can only invoke reject() in the lexer when the lexer is of the backtracking persuasion (options.backtrack_lexer = true).\n' + this.showPosition(), { text: "", token: null, line: this.yylineno }); } return this; }, // retain first n characters of the match less:function (n) { this.unput(this.match.slice(n)); }, // displays already matched input, i.e. for error messages pastInput:function () { var past = this.matched.substr(0, this.matched.length - this.match.length); return (past.length > 20 ? '...':'') + past.substr(-20).replace(/\n/g, ""); }, // displays upcoming input, i.e. for error messages upcomingInput:function () { var next = this.match; if (next.length < 20) { next += this._input.substr(0, 20-next.length); } return (next.substr(0,20) + (next.length > 20 ? '...' : '')).replace(/\n/g, ""); }, // displays the character position where the lexing error occurred, i.e. 
for error messages showPosition:function () { var pre = this.pastInput(); var c = new Array(pre.length + 1).join("-"); return pre + this.upcomingInput() + "\n" + c + "^"; }, // test the lexed token: return FALSE when not a match, otherwise return token test_match:function (match, indexed_rule) { var token, lines, backup; if (this.options.backtrack_lexer) { // save context backup = { yylineno: this.yylineno, yylloc: { first_line: this.yylloc.first_line, last_line: this.last_line, first_column: this.yylloc.first_column, last_column: this.yylloc.last_column }, yytext: this.yytext, match: this.match, matches: this.matches, matched: this.matched, yyleng: this.yyleng, offset: this.offset, _more: this._more, _input: this._input, yy: this.yy, conditionStack: this.conditionStack.slice(0), done: this.done }; if (this.options.ranges) { backup.yylloc.range = this.yylloc.range.slice(0); } } lines = match[0].match(/(?:\r\n?|\n).*/g); if (lines) { this.yylineno += lines.length; } this.yylloc = { first_line: this.yylloc.last_line, last_line: this.yylineno + 1, first_column: this.yylloc.last_column, last_column: lines ? lines[lines.length - 1].length - lines[lines.length - 1].match(/\r?\n?/)[0].length : this.yylloc.last_column + match[0].length }; this.yytext += match[0]; this.match += match[0]; this.matches = match; this.yyleng = this.yytext.length; if (this.options.ranges) { this.yylloc.range = [this.offset, this.offset += this.yyleng]; } this._more = false; this._backtrack = false; this._input = this._input.slice(match[0].length); this.matched += match[0]; token = this.performAction.call(this, this.yy, this, indexed_rule, this.conditionStack[this.conditionStack.length - 1]); if (this.done && this._input) { this.done = false; } if (token) { return token; } else if (this._backtrack) { // recover context for (var k in backup) { this[k] = backup[k]; } return false; // rule action called reject() implying the next rule should be tested instead. } return false; }, // return next match in input next:function () { if (this.done) { return this.EOF; } if (!this._input) { this.done = true; } var token, match, tempMatch, index; if (!this._more) { this.yytext = ''; this.match = ''; } var rules = this._currentRules(); for (var i = 0; i < rules.length; i++) { tempMatch = this._input.match(this.rules[rules[i]]); if (tempMatch && (!match || tempMatch[0].length > match[0].length)) { match = tempMatch; index = i; if (this.options.backtrack_lexer) { token = this.test_match(tempMatch, rules[i]); if (token !== false) { return token; } else if (this._backtrack) { match = false; continue; // rule action called reject() implying a rule MISmatch. } else { // else: this is a lexer rule which consumes input without producing a token (e.g. whitespace) return false; } } else if (!this.options.flex) { break; } } } if (match) { token = this.test_match(match, rules[index]); if (token !== false) { return token; } // else: this is a lexer rule which consumes input without producing a token (e.g. whitespace) return false; } if (this._input === "") { return this.EOF; } else { return this.parseError('Lexical error on line ' + (this.yylineno + 1) + '. 
Unrecognized text.\n' + this.showPosition(), { text: "", token: null, line: this.yylineno }); } }, // return next match that has a token lex:function lex() { var r = this.next(); if (r) { return r; } else { return this.lex(); } }, // activates a new lexer condition state (pushes the new lexer condition state onto the condition stack) begin:function begin(condition) { this.conditionStack.push(condition); }, // pop the previously active lexer condition state off the condition stack popState:function popState() { var n = this.conditionStack.length - 1; if (n > 0) { return this.conditionStack.pop(); } else { return this.conditionStack[0]; } }, // produce the lexer rule set which is active for the currently active lexer condition state _currentRules:function _currentRules() { if (this.conditionStack.length && this.conditionStack[this.conditionStack.length - 1]) { return this.conditions[this.conditionStack[this.conditionStack.length - 1]].rules; } else { return this.conditions["INITIAL"].rules; } }, // return the currently active lexer condition state; when an index argument is provided it produces the N-th previous condition state, if available topState:function topState(n) { n = this.conditionStack.length - 1 - Math.abs(n || 0); if (n >= 0) { return this.conditionStack[n]; } else { return "INITIAL"; } }, // alias for begin(condition) pushState:function pushState(condition) { this.begin(condition); }, // return the number of states currently on the stack stateStackSize:function stateStackSize() { return this.conditionStack.length; }, options: {}, performAction: function anonymous(yy,yy_,$avoiding_name_collisions,YY_START) { var YYSTATE=YY_START; switch($avoiding_name_collisions) { case 0:return 5; break; case 1:/* skip whitespace */ break; case 2:return 8; break; case 3:return 16; break; case 4:return 17; break; case 5:return 11; break; case 6:return 10; break; case 7:return 9; break; case 8:return 14; break; case 9:return 15; break; case 10:return 12; break; case 11:return 7 break; case 12:return 7 break; case 13:return 7 break; case 14:return 7 break; case 15:return 7 break; case 16:return 7 break; case 17:return 7 break; case 18:return 7 break; case 19:return 7 break; case 20:return 7 break; case 21:return 7 break; case 22:return 7 break; case 23:return 7 break; case 24:return 7 break; case 25:return 7 break; case 26:return 7 break; case 27:return 7 break; case 28:return 7 break; case 29:return 7 break; case 30:return 7 break; case 31:return 7 break; case 32:return 7 break; case 33:return 7 break; case 34:return 7 break; case 35:return 7 break; case 36:return 7 break; case 37:return 7 break; case 38:return 7 break; case 39:return 7 break; case 40:return 7 break; case 41:return 7 break; case 42:return 7 break; case 43:return 7 break; case 44:return 7 break; case 45:return 7 break; case 46:return 7 break; case 47:return 7 break; case 48:return 7 break; case 49:return 7 break; case 50:return 7 break; case 51:return 7 break; case 52:return 7 break; case 53:return 7 break; case 54:return 7 break; case 55:return 7 break; case 56:return 7 break; case 57:return 7 break; case 58:return 7 break; case 59:return 7 break; case 60:return 7 break; case 61:return 7 break; case 62:return 7 break; case 63:return 7 break; case 64:return 7 break; case 65:return 7 break; case 66:return 7 break; case 67:return 7 break; case 68:return 7 break; case 69:return 7 break; case 70:return 7 break; case 71:return 7 break; case 72:return 7 break; case 73:return 7 break; case 74:return 7 break; case 75:return 7 break; case 
76:return 7 break; case 77:return 7 break; case 78:return 7 break; case 79:return 7 break; case 80:return 7 break; case 81:return 7 break; case 82:return 7 break; case 83:return 7 break; case 84:return 7 break; case 85:return 7 break; case 86:return 7 break; case 87:return 7 break; case 88:return 7 break; case 89:return 7 break; case 90:return 7 break; case 91:return 7 break; case 92:return 7 break; case 93:return 7 break; case 94:return 7 break; case 95:return 7 break; case 96:return 7 break; case 97:return 7 break; case 98:return 7 break; case 99:return 7 break; case 100:return 7 break; case 101:return 7 break; case 102:return 7 break; case 103:return 7 break; case 104:return 7 break; case 105:return 7 break; case 106:return 7 break; case 107:return 7 break; case 108:return 7 break; case 109:return 7 break; case 110:return 7 break; case 111:return 7 break; case 112:return 7 break; case 113:return 7 break; case 114:return 7 break; case 115:return 7 break; case 116:return 7 break; case 117:return 7 break; case 118:return 7 break; case 119:return 7 break; case 120:return 7 break; case 121:return 7 break; case 122:return 7 break; case 123:return 7 break; case 124:return 7 break; case 125:return 7 break; case 126:return 7 break; case 127:return 7 break; case 128:return 7 break; case 129:return 7 break; case 130:return 7 break; case 131:return 7 break; case 132:return 7 break; case 133:return 7 break; case 134:return 7 break; case 135:return 7 break; case 136:return 7 break; case 137:return 7 break; case 138:return 7 break; case 139:return 7 break; case 140:return 7 break; case 141:return 7 break; case 142:return 7 break; case 143:return 7 break; case 144:return 7 break; case 145:return 7 break; case 146:return 7 break; case 147:return 7 break; case 148:return 7 break; case 149:return 7 break; case 150:return 7 break; case 151:return 7 break; case 152:return 7 break; case 153:return 7 break; case 154:return 7 break; case 155:return 7 break; case 156:return 7 break; case 157:return 7 break; case 158:return 7 break; case 159:return 7 break; case 160:return 7 break; case 161:return 7 break; case 162:return 7 break; case 163:return 7 break; case 164:return 7 break; case 165:return 7 break; case 166:return 7 break; case 167:return 7 break; case 168:return 7 break; case 169:return 7 break; case 170:return 7 break; case 171:return 7 break; case 172:return 7 break; case 173:return 7 break; case 174:return 7 break; case 175:return 7 break; case 176:return 7 break; case 177:return 7 break; case 178:return 7 break; case 179:return 7 break; case 180:return 7 break; case 181:return 7 break; case 182:return 7 break; case 183:return 7 break; case 184:return 7 break; case 185:return 7 break; case 186:return 7 break; case 187:return 7 break; case 188:return 7 break; case 189:return 7 break; case 190:return 7 break; case 191:return 7 break; case 192:return 7 break; case 193:return 7 break; case 194:return 7 break; case 195:return 7 break; case 196:return 7 break; case 197:return 7 break; case 198:return 7 break; case 199:return 7 break; case 200:return 7 break; case 201:return 7 break; case 202:return 7 break; case 203:return 7 break; case 204:return 7 break; case 205:return 7 break; case 206:return 7 break; case 207:return 7 break; case 208:return 7 break; case 209:return 7 break; case 210:return 7 break; case 211:return 7 break; case 212:return 7 break; case 213:return 7 break; case 214:return 7 break; case 215:return 7 break; case 216:return 7 break; case 217:return 7 break; case 218:return 7 break; case 
219:return 7 break; case 220:return 7 break; case 221:return 7 break; case 222:return 7 break; case 223:return 7 break; case 224:return 7 break; case 225:return 7 break; case 226:return 7 break; case 227:return 7 break; case 228:return 7 break; case 229:return 7 break; case 230:return 7 break; case 231:return 7 break; case 232:return 7 break; case 233:return 7 break; case 234:return 7 break; case 235:return 7 break; case 236:return 7 break; case 237:return 7 break; case 238:return 7 break; case 239:return 7 break; case 240:return 7 break; case 241:return 7 break; case 242:return 7 break; case 243:return 7 break; case 244:return 7 break; case 245:return 7 break; case 246:return 7 break; case 247:return 7 break; case 248:return 7 break; case 249:return 7 break; case 250:return 7 break; case 251:return 7 break; case 252:return 7 break; case 253:return 7 break; case 254:return 7 break; case 255:return 7 break; case 256:return 7 break; case 257:return 7 break; case 258:return 7 break; case 259:return 7 break; case 260:return 7 break; case 261:return 7 break; case 262:return 7 break; case 263:return 7 break; case 264:return 7 break; case 265:return 7 break; case 266:return 7 break; case 267:return 7 break; case 268:return 7 break; case 269:return 7 break; case 270:return 7 break; case 271:return 7 break; case 272:return 7 break; case 273:return 7 break; case 274:return 7 break; case 275:return 7 break; case 276:return 7 break; case 277:return 7 break; case 278:return 7 break; case 279:return 7 break; case 280:return 7 break; case 281:return 7 break; case 282:return 7 break; case 283:return 7 break; case 284:return 7 break; case 285:return 7 break; case 286:return 7 break; case 287:return 7 break; case 288:return 7 break; case 289:return 7 break; case 290:return 7 break; case 291:return 7 break; case 292:return 7 break; case 293:return 7 break; case 294:return 7 break; case 295:return 7 break; case 296:return 7 break; case 297:return 7 break; case 298:return 7 break; case 299:return 7 break; case 300:return 7 break; case 301:return 7 break; case 302:return 7 break; case 303:return 7 break; case 304:return 7 break; case 305:return 7 break; case 306:return 7 break; case 307:return 7 break; case 308:return 7 break; case 309:return 7 break; case 310:return 7 break; case 311:return 7 break; case 312:return 13 break; case 313:return 13 break; case 314:return 13 break; case 315:return 13 break; case 316:return 13 break; case 317:return 13 break; case 318:return 13 break; case 319:return 13 break; case 320:return 13 break; case 321:return 13 break; case 322:return 13 break; case 323:return 13 break; case 324:return 13 break; case 325:return 13 break; case 326:return 13 break; case 327:return 13 break; case 328:return 13 break; case 329:return 13 break; case 330:return 13 break; case 331:return 13 break; case 332:return 13 break; case 333:return 13 break; } }, rules: 
[/^(?:$)/,/^(?:\s+)/,/^(?:\+)/,/^(?:\()/,/^(?:\))/,/^(?::)/,/^(?:DocumentRef-([0-9A-Za-z-+.]+))/,/^(?:LicenseRef-([0-9A-Za-z-+.]+))/,/^(?:AND)/,/^(?:OR)/,/^(?:WITH)/,/^(?:Glide)/,/^(?:Abstyles)/,/^(?:AFL-1.1)/,/^(?:AFL-1.2)/,/^(?:AFL-2.0)/,/^(?:AFL-2.1)/,/^(?:AFL-3.0)/,/^(?:AMPAS)/,/^(?:APL-1.0)/,/^(?:Adobe-Glyph)/,/^(?:APAFML)/,/^(?:Adobe-2006)/,/^(?:AGPL-1.0)/,/^(?:Afmparse)/,/^(?:Aladdin)/,/^(?:ADSL)/,/^(?:AMDPLPA)/,/^(?:ANTLR-PD)/,/^(?:Apache-1.0)/,/^(?:Apache-1.1)/,/^(?:Apache-2.0)/,/^(?:AML)/,/^(?:APSL-1.0)/,/^(?:APSL-1.1)/,/^(?:APSL-1.2)/,/^(?:APSL-2.0)/,/^(?:Artistic-1.0)/,/^(?:Artistic-1.0-Perl)/,/^(?:Artistic-1.0-cl8)/,/^(?:Artistic-2.0)/,/^(?:AAL)/,/^(?:Bahyph)/,/^(?:Barr)/,/^(?:Beerware)/,/^(?:BitTorrent-1.0)/,/^(?:BitTorrent-1.1)/,/^(?:BSL-1.0)/,/^(?:Borceux)/,/^(?:BSD-2-Clause)/,/^(?:BSD-2-Clause-FreeBSD)/,/^(?:BSD-2-Clause-NetBSD)/,/^(?:BSD-3-Clause)/,/^(?:BSD-3-Clause-Clear)/,/^(?:BSD-4-Clause)/,/^(?:BSD-Protection)/,/^(?:BSD-3-Clause-Attribution)/,/^(?:BSD-4-Clause-UC)/,/^(?:bzip2-1.0.5)/,/^(?:bzip2-1.0.6)/,/^(?:Caldera)/,/^(?:CECILL-1.0)/,/^(?:CECILL-1.1)/,/^(?:CECILL-2.0)/,/^(?:CECILL-B)/,/^(?:CECILL-C)/,/^(?:ClArtistic)/,/^(?:MIT-CMU)/,/^(?:CNRI-Jython)/,/^(?:CNRI-Python)/,/^(?:CNRI-Python-GPL-Compatible)/,/^(?:CPOL-1.02)/,/^(?:CDDL-1.0)/,/^(?:CDDL-1.1)/,/^(?:CPAL-1.0)/,/^(?:CPL-1.0)/,/^(?:CATOSL-1.1)/,/^(?:Condor-1.1)/,/^(?:CC-BY-1.0)/,/^(?:CC-BY-2.0)/,/^(?:CC-BY-2.5)/,/^(?:CC-BY-3.0)/,/^(?:CC-BY-4.0)/,/^(?:CC-BY-ND-1.0)/,/^(?:CC-BY-ND-2.0)/,/^(?:CC-BY-ND-2.5)/,/^(?:CC-BY-ND-3.0)/,/^(?:CC-BY-ND-4.0)/,/^(?:CC-BY-NC-1.0)/,/^(?:CC-BY-NC-2.0)/,/^(?:CC-BY-NC-2.5)/,/^(?:CC-BY-NC-3.0)/,/^(?:CC-BY-NC-4.0)/,/^(?:CC-BY-NC-ND-1.0)/,/^(?:CC-BY-NC-ND-2.0)/,/^(?:CC-BY-NC-ND-2.5)/,/^(?:CC-BY-NC-ND-3.0)/,/^(?:CC-BY-NC-ND-4.0)/,/^(?:CC-BY-NC-SA-1.0)/,/^(?:CC-BY-NC-SA-2.0)/,/^(?:CC-BY-NC-SA-2.5)/,/^(?:CC-BY-NC-SA-3.0)/,/^(?:CC-BY-NC-SA-4.0)/,/^(?:CC-BY-SA-1.0)/,/^(?:CC-BY-SA-2.0)/,/^(?:CC-BY-SA-2.5)/,/^(?:CC-BY-SA-3.0)/,/^(?:CC-BY-SA-4.0)/,/^(?:CC0-1.0)/,/^(?:Crossword)/,/^(?:CUA-OPL-1.0)/,/^(?:Cube)/,/^(?:D-FSL-1.0)/,/^(?:diffmark)/,/^(?:WTFPL)/,/^(?:DOC)/,/^(?:Dotseqn)/,/^(?:DSDP)/,/^(?:dvipdfm)/,/^(?:EPL-1.0)/,/^(?:ECL-1.0)/,/^(?:ECL-2.0)/,/^(?:eGenix)/,/^(?:EFL-1.0)/,/^(?:EFL-2.0)/,/^(?:MIT-advertising)/,/^(?:MIT-enna)/,/^(?:Entessa)/,/^(?:ErlPL-1.1)/,/^(?:EUDatagrid)/,/^(?:EUPL-1.0)/,/^(?:EUPL-1.1)/,/^(?:Eurosym)/,/^(?:Fair)/,/^(?:MIT-feh)/,/^(?:Frameworx-1.0)/,/^(?:FreeImage)/,/^(?:FTL)/,/^(?:FSFUL)/,/^(?:FSFULLR)/,/^(?:Giftware)/,/^(?:GL2PS)/,/^(?:Glulxe)/,/^(?:AGPL-3.0)/,/^(?:GFDL-1.1)/,/^(?:GFDL-1.2)/,/^(?:GFDL-1.3)/,/^(?:GPL-1.0)/,/^(?:GPL-2.0)/,/^(?:GPL-3.0)/,/^(?:LGPL-2.1)/,/^(?:LGPL-3.0)/,/^(?:LGPL-2.0)/,/^(?:gnuplot)/,/^(?:gSOAP-1.3b)/,/^(?:HaskellReport)/,/^(?:HPND)/,/^(?:IBM-pibs)/,/^(?:IPL-1.0)/,/^(?:ICU)/,/^(?:ImageMagick)/,/^(?:iMatix)/,/^(?:Imlib2)/,/^(?:IJG)/,/^(?:Intel-ACPI)/,/^(?:Intel)/,/^(?:IPA)/,/^(?:ISC)/,/^(?:JasPer-2.0)/,/^(?:JSON)/,/^(?:LPPL-1.3a)/,/^(?:LPPL-1.0)/,/^(?:LPPL-1.1)/,/^(?:LPPL-1.2)/,/^(?:LPPL-1.3c)/,/^(?:Latex2e)/,/^(?:BSD-3-Clause-LBNL)/,/^(?:Leptonica)/,/^(?:LGPLLR)/,/^(?:Libpng)/,/^(?:libtiff)/,/^(?:LPL-1.02)/,/^(?:LPL-1.0)/,/^(?:MakeIndex)/,/^(?:MTLL)/,/^(?:MS-PL)/,/^(?:MS-RL)/,/^(?:MirOS)/,/^(?:MITNFA)/,/^(?:MIT)/,/^(?:Motosoto)/,/^(?:MPL-1.0)/,/^(?:MPL-1.1)/,/^(?:MPL-2.0)/,/^(?:MPL-2.0-no-copyleft-exception)/,/^(?:mpich2)/,/^(?:Multics)/,/^(?:Mup)/,/^(?:NASA-1.3)/,/^(?:Naumen)/,/^(?:NBPL-1.0)/,/^(?:NetCDF)/,/^(?:NGPL)/,/^(?:NOSL)/,/^(?:NPL-1.0)/,/^(?:NPL-1.1)/,/^(?:Newsletr)/,/^(?:NLPL)/,/^(?:Nokia)/,/^(?:NPOSL-3.0)/,/^(?:Noweb)/,/^(?:NRL)
/,/^(?:NTP)/,/^(?:Nunit)/,/^(?:OCLC-2.0)/,/^(?:ODbL-1.0)/,/^(?:PDDL-1.0)/,/^(?:OGTSL)/,/^(?:OLDAP-2.2.2)/,/^(?:OLDAP-1.1)/,/^(?:OLDAP-1.2)/,/^(?:OLDAP-1.3)/,/^(?:OLDAP-1.4)/,/^(?:OLDAP-2.0)/,/^(?:OLDAP-2.0.1)/,/^(?:OLDAP-2.1)/,/^(?:OLDAP-2.2)/,/^(?:OLDAP-2.2.1)/,/^(?:OLDAP-2.3)/,/^(?:OLDAP-2.4)/,/^(?:OLDAP-2.5)/,/^(?:OLDAP-2.6)/,/^(?:OLDAP-2.7)/,/^(?:OLDAP-2.8)/,/^(?:OML)/,/^(?:OPL-1.0)/,/^(?:OSL-1.0)/,/^(?:OSL-1.1)/,/^(?:OSL-2.0)/,/^(?:OSL-2.1)/,/^(?:OSL-3.0)/,/^(?:OpenSSL)/,/^(?:PHP-3.0)/,/^(?:PHP-3.01)/,/^(?:Plexus)/,/^(?:PostgreSQL)/,/^(?:psfrag)/,/^(?:psutils)/,/^(?:Python-2.0)/,/^(?:QPL-1.0)/,/^(?:Qhull)/,/^(?:Rdisc)/,/^(?:RPSL-1.0)/,/^(?:RPL-1.1)/,/^(?:RPL-1.5)/,/^(?:RHeCos-1.1)/,/^(?:RSCPL)/,/^(?:RSA-MD)/,/^(?:Ruby)/,/^(?:SAX-PD)/,/^(?:Saxpath)/,/^(?:SCEA)/,/^(?:SWL)/,/^(?:SGI-B-1.0)/,/^(?:SGI-B-1.1)/,/^(?:SGI-B-2.0)/,/^(?:OFL-1.0)/,/^(?:OFL-1.1)/,/^(?:SimPL-2.0)/,/^(?:Sleepycat)/,/^(?:SNIA)/,/^(?:Spencer-86)/,/^(?:Spencer-94)/,/^(?:Spencer-99)/,/^(?:SMLNJ)/,/^(?:SugarCRM-1.1.3)/,/^(?:SISSL)/,/^(?:SISSL-1.2)/,/^(?:SPL-1.0)/,/^(?:Watcom-1.0)/,/^(?:TCL)/,/^(?:Unlicense)/,/^(?:TMate)/,/^(?:TORQUE-1.1)/,/^(?:TOSL)/,/^(?:Unicode-TOU)/,/^(?:UPL-1.0)/,/^(?:NCSA)/,/^(?:Vim)/,/^(?:VOSTROM)/,/^(?:VSL-1.0)/,/^(?:W3C-19980720)/,/^(?:W3C)/,/^(?:Wsuipa)/,/^(?:Xnet)/,/^(?:X11)/,/^(?:Xerox)/,/^(?:XFree86-1.1)/,/^(?:xinetd)/,/^(?:xpp)/,/^(?:XSkat)/,/^(?:YPL-1.0)/,/^(?:YPL-1.1)/,/^(?:Zed)/,/^(?:Zend-2.0)/,/^(?:Zimbra-1.3)/,/^(?:Zimbra-1.4)/,/^(?:Zlib)/,/^(?:zlib-acknowledgement)/,/^(?:ZPL-1.1)/,/^(?:ZPL-2.0)/,/^(?:ZPL-2.1)/,/^(?:389-exception)/,/^(?:Autoconf-exception-2.0 )/,/^(?:Autoconf-exception-3.0 )/,/^(?:Bison-exception-2.2)/,/^(?:CLISP-exception-2.0)/,/^(?:Classpath-exception-2.0)/,/^(?:FLTK-exception )/,/^(?:FLTK-exception-2.0 )/,/^(?:Font-exception-2.0 )/,/^(?:GCC-exception-2.0)/,/^(?:GCC-exception-3.1)/,/^(?:LZMA-exception )/,/^(?:Libtool-exception)/,/^(?:Nokia-Qt-exception-1.1 )/,/^(?:Qwt-exception-1.0)/,/^(?:WxWindows-exception-3.1)/,/^(?:eCos-exception-2.0 )/,/^(?:freertos-exception-2.0 )/,/^(?:gnu-javamail-exception )/,/^(?:i2p-gpl-java-exception )/,/^(?:mif-exception)/,/^(?:u-boot-exception-2.0 )/], conditions: {"INITIAL":{"rules":[0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100,101,102,103,104,105,106,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,132,133,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163,164,165,166,167,168,169,170,171,172,173,174,175,176,177,178,179,180,181,182,183,184,185,186,187,188,189,190,191,192,193,194,195,196,197,198,199,200,201,202,203,204,205,206,207,208,209,210,211,212,213,214,215,216,217,218,219,220,221,222,223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,239,240,241,242,243,244,245,246,247,248,249,250,251,252,253,254,255,256,257,258,259,260,261,262,263,264,265,266,267,268,269,270,271,272,273,274,275,276,277,278,279,280,281,282,283,284,285,286,287,288,289,290,291,292,293,294,295,296,297,298,299,300,301,302,303,304,305,306,307,308,309,310,311,312,313,314,315,316,317,318,319,320,321,322,323,324,325,326,327,328,329,330,331,332,333],"inclusive":true}} }); return lexer; })(); parser.lexer = lexer; function Parser () { this.yy = {}; } Parser.prototype = parser;parser.Parser = Parser; 
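/* Illustrative usage (a sketch, relying only on the CommonJS exports wired up
   below -- this comment is not part of the generated parser):
     var parse = require('spdx-expression-parse').parse
     parse('MIT')                 // => { license: 'MIT' }
     parse('GPL-2.0+')            // => { license: 'GPL-2.0', plus: true }
     parse('(MIT OR Apache-2.0)') // => { conjunction: 'or',
                                  //      left:  { license: 'MIT' },
                                  //      right: { license: 'Apache-2.0' } }
   Input that doesn't match the grammar reaches parseError above, which throws
   unless the error is marked recoverable. */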
return new Parser; })(); if (typeof require !== 'undefined' && typeof exports !== 'undefined') { exports.parser = spdxparse; exports.Parser = spdxparse.Parser; exports.parse = function () { return spdxparse.parse.apply(spdxparse, arguments); }; exports.main = function commonjsMain(args) { if (!args[1]) { console.log('Usage: '+args[0]+' FILE'); process.exit(1); } var source = require('fs').readFileSync(require('path').normalize(args[1]), "utf8"); return exports.parser.parse(source); }; if (typeof module !== 'undefined' && require.main === module) { exports.main(process.argv.slice(1)); } } ././@LongLink0000000000000000000000000000017200000000000011215 Lustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/node_modules/spdx-exceptions/npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/node_mod0000755000000000000000000000000012631326456032206 5ustar 00000000000000././@LongLink0000000000000000000000000000020300000000000011210 Lustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/node_modules/spdx-exceptions/README.mdnpm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/node_mod0000644000000000000000000000005112631326456032204 0ustar 00000000000000The package exports an array of strings. ././@LongLink0000000000000000000000000000020400000000000011211 Lustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/node_modules/spdx-exceptions/index.jsonnpm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/node_mod0000644000000000000000000000105712631326456032213 0ustar 00000000000000[ "389-exception", "Autoconf-exception-2.0 ", "Autoconf-exception-3.0 ", "Bison-exception-2.2", "CLISP-exception-2.0", "Classpath-exception-2.0", "FLTK-exception ", "FLTK-exception-2.0 ", "Font-exception-2.0 ", "GCC-exception-2.0", "GCC-exception-3.1", "LZMA-exception ", "Libtool-exception", "Nokia-Qt-exception-1.1 ", "Qwt-exception-1.0", "WxWindows-exception-3.1", "eCos-exception-2.0 ", "freertos-exception-2.0 ", "gnu-javamail-exception ", "i2p-gpl-java-exception ", "mif-exception", "u-boot-exception-2.0 " ] ././@LongLink0000000000000000000000000000020600000000000011213 Lustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/node_modules/spdx-exceptions/package.jsonnpm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-expression-parse/node_mod0000644000000000000000000000254612631326456032217 0ustar 00000000000000{ "name": "spdx-exceptions", "description": "list of SPDX standard license exceptions", "version": "1.0.3", "author": { "name": "The Linux Foundation" }, "contributors": [ { "name": "Kyle E. 
Mitchell", "email": "kyle@kemitchell.com", "url": "https://kemitchell.com/" } ], "license": "CC-BY-3.0", "repository": { "type": "git", "url": "git+https://github.com/kemitchell/spdx-exceptions.json.git" }, "gitHead": "fdd2c68ac29d4cd891c31e9b60523177eb9b338e", "bugs": { "url": "https://github.com/kemitchell/spdx-exceptions.json/issues" }, "homepage": "https://github.com/kemitchell/spdx-exceptions.json#readme", "_id": "spdx-exceptions@1.0.3", "scripts": {}, "_shasum": "39ec5ed2cebddf08d180555d7e99c3aff9b4764a", "_from": "spdx-exceptions@>=1.0.0 <2.0.0", "_npmVersion": "3.3.4", "_nodeVersion": "4.1.1", "_npmUser": { "name": "kemitchell", "email": "kyle@kemitchell.com" }, "dist": { "shasum": "39ec5ed2cebddf08d180555d7e99c3aff9b4764a", "tarball": "http://registry.npmjs.org/spdx-exceptions/-/spdx-exceptions-1.0.3.tgz" }, "maintainers": [ { "name": "kemitchell", "email": "kyle@kemitchell.com" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/spdx-exceptions/-/spdx-exceptions-1.0.3.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-license-ids/LICENSE0000644000000000000000000000227312631326456030402 0ustar 00000000000000This is free and unencumbered software released into the public domain. Anyone is free to copy, modify, publish, use, compile, sell, or distribute this software, either in source code form or as a compiled binary, for any purpose, commercial or non-commercial, and by any means. In jurisdictions that recognize copyright laws, the author or authors of this software dedicate any and all copyright interest in the software to the public domain. We make this dedication for the benefit of the public at large and to the detriment of our heirs and successors. We intend this dedication to be an overt act of relinquishment in perpetuity of all present and future rights to this software under copyright law. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
For more information, please refer to <http://unlicense.org/> npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-license-ids/README.md0000755000000000000000000000307612631326456030661 0ustar 00000000000000# spdx-license-ids A list of [SPDX license](https://spdx.org/licenses/) identifiers [**Download JSON**](https://raw.githubusercontent.com/shinnn/spdx-license-ids/master/spdx-license-ids.json) ## Use as a JavaScript Library [![NPM version](https://img.shields.io/npm/v/spdx-license-ids.svg)](https://www.npmjs.org/package/spdx-license-ids) [![Bower version](https://img.shields.io/bower/v/spdx-license-ids.svg)](https://github.com/shinnn/spdx-license-ids/releases) [![Build Status](https://travis-ci.org/shinnn/spdx-license-ids.svg?branch=master)](https://travis-ci.org/shinnn/spdx-license-ids) [![Coverage Status](https://img.shields.io/coveralls/shinnn/spdx-license-ids.svg)](https://coveralls.io/r/shinnn/spdx-license-ids) [![devDependency Status](https://david-dm.org/shinnn/spdx-license-ids/dev-status.svg)](https://david-dm.org/shinnn/spdx-license-ids#info=devDependencies) ### Installation #### Package managers ##### [npm](https://www.npmjs.com/) ```sh npm install spdx-license-ids ``` ##### [bower](http://bower.io/) ```sh bower install spdx-license-ids ``` ##### [Duo](http://duojs.org/) ```javascript const spdxLicenseIds = require('shinnn/spdx-license-ids'); ``` #### Standalone [Download the script file directly.](https://raw.githubusercontent.com/shinnn/spdx-license-ids/master/spdx-license-ids-browser.js) ### API #### spdxLicenseIds Type: `Array` of `String` It returns an array of SPDX license identifiers. ```javascript const spdxLicenseIds = require('spdx-license-ids'); //=> ['Glide', 'Abstyles', 'AFL-1.1', ... ] ``` ## License [The Unlicense](./LICENSE).
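A common use is a simple membership test against the exported array (an illustrative sketch, not shipped with the package):

```javascript
const spdxLicenseIds = require('spdx-license-ids');

// The module's main file is plain JSON (an array of strings),
// so ordinary array methods apply.
function isKnownLicenseId(id) {
  return spdxLicenseIds.indexOf(id) !== -1;
}

isKnownLicenseId('MIT');     //=> true
isKnownLicenseId('Not-MIT'); //=> false
```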
npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-license-ids/package.json0000644000000000000000000000432512631326456031663 0ustar 00000000000000{ "name": "spdx-license-ids", "version": "1.0.2", "description": "A list of SPDX license identifiers", "repository": { "type": "git", "url": "git+https://github.com/shinnn/spdx-license-ids.git" }, "author": { "name": "Shinnosuke Watanabe", "url": "https://github.com/shinnn" }, "scripts": { "build": "node --harmony_arrow_functions build.js", "lint": "eslint --config node_modules/@shinnn/eslintrc/rc.json --ignore-path .gitignore .", "pretest": "${npm_package_scripts_build} && ${npm_package_scripts_lint}", "test": "node --harmony_arrow_functions test.js", "coverage": "node --harmony_arrow_functions node_modules/.bin/istanbul cover test.js", "coveralls": "${npm_package_scripts_coverage} && istanbul-coveralls" }, "license": "Unlicense", "main": "spdx-license-ids.json", "files": [ "spdx-license-ids.json" ], "keywords": [ "spdx", "license", "licenses", "id", "identifier", "identifiers", "json", "array", "oss", "browser", "client-side" ], "devDependencies": { "@shinnn/eslintrc": "^1.0.0", "each-async": "^1.1.1", "eslint": "^0.24.0", "got": "^3.3.0", "istanbul": "^0.3.17", "istanbul-coveralls": "^1.0.3", "require-bower-files": "^2.0.0", "rimraf": "^2.4.1", "stringify-object": "^2.2.0", "tape": "^4.0.0" }, "gitHead": "df183ecdf1738f77b1e8e41f686ee56206a40693", "bugs": { "url": "https://github.com/shinnn/spdx-license-ids/issues" }, "homepage": "https://github.com/shinnn/spdx-license-ids#readme", "_id": "spdx-license-ids@1.0.2", "_shasum": "0674e9c9a230f980016b5b073a10aa165701677c", "_from": "spdx-license-ids@1.0.2", "_npmVersion": "2.12.1", "_nodeVersion": "2.3.3", "_npmUser": { "name": "shinnn", "email": "snnskwtnb@gmail.com" }, "maintainers": [ { "name": "shinnn", "email": "snnskwtnb@gmail.com" } ], "dist": { "shasum": "0674e9c9a230f980016b5b073a10aa165701677c", "tarball": "http://registry.npmjs.org/spdx-license-ids/-/spdx-license-ids-1.0.2.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/spdx-license-ids/-/spdx-license-ids-1.0.2.tgz", "readme": "ERROR: No README data found!" 
} ././@LongLink0000000000000000000000000000015500000000000011216 Lustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-license-ids/spdx-license-ids.jsonnpm_3.5.2.orig/node_modules/validate-npm-package-license/node_modules/spdx-license-ids/spdx-license-0000644000000000000000000001035212631326456031770 0ustar 00000000000000[ "Glide", "Abstyles", "AFL-1.1", "AFL-1.2", "AFL-2.0", "AFL-2.1", "AFL-3.0", "AMPAS", "APL-1.0", "Adobe-Glyph", "APAFML", "Adobe-2006", "AGPL-1.0", "Afmparse", "Aladdin", "ADSL", "AMDPLPA", "ANTLR-PD", "Apache-1.0", "Apache-1.1", "Apache-2.0", "AML", "APSL-1.0", "APSL-1.1", "APSL-1.2", "APSL-2.0", "Artistic-1.0", "Artistic-1.0-Perl", "Artistic-1.0-cl8", "Artistic-2.0", "AAL", "Bahyph", "Barr", "Beerware", "BitTorrent-1.0", "BitTorrent-1.1", "BSL-1.0", "Borceux", "BSD-2-Clause", "BSD-2-Clause-FreeBSD", "BSD-2-Clause-NetBSD", "BSD-3-Clause", "BSD-3-Clause-Clear", "BSD-4-Clause", "BSD-Protection", "BSD-3-Clause-Attribution", "BSD-4-Clause-UC", "bzip2-1.0.5", "bzip2-1.0.6", "Caldera", "CECILL-1.0", "CECILL-1.1", "CECILL-2.0", "CECILL-B", "CECILL-C", "ClArtistic", "MIT-CMU", "CNRI-Jython", "CNRI-Python", "CNRI-Python-GPL-Compatible", "CPOL-1.02", "CDDL-1.0", "CDDL-1.1", "CPAL-1.0", "CPL-1.0", "CATOSL-1.1", "Condor-1.1", "CC-BY-1.0", "CC-BY-2.0", "CC-BY-2.5", "CC-BY-3.0", "CC-BY-4.0", "CC-BY-ND-1.0", "CC-BY-ND-2.0", "CC-BY-ND-2.5", "CC-BY-ND-3.0", "CC-BY-ND-4.0", "CC-BY-NC-1.0", "CC-BY-NC-2.0", "CC-BY-NC-2.5", "CC-BY-NC-3.0", "CC-BY-NC-4.0", "CC-BY-NC-ND-1.0", "CC-BY-NC-ND-2.0", "CC-BY-NC-ND-2.5", "CC-BY-NC-ND-3.0", "CC-BY-NC-ND-4.0", "CC-BY-NC-SA-1.0", "CC-BY-NC-SA-2.0", "CC-BY-NC-SA-2.5", "CC-BY-NC-SA-3.0", "CC-BY-NC-SA-4.0", "CC-BY-SA-1.0", "CC-BY-SA-2.0", "CC-BY-SA-2.5", "CC-BY-SA-3.0", "CC-BY-SA-4.0", "CC0-1.0", "Crossword", "CUA-OPL-1.0", "Cube", "D-FSL-1.0", "diffmark", "WTFPL", "DOC", "Dotseqn", "DSDP", "dvipdfm", "EPL-1.0", "ECL-1.0", "ECL-2.0", "eGenix", "EFL-1.0", "EFL-2.0", "MIT-advertising", "MIT-enna", "Entessa", "ErlPL-1.1", "EUDatagrid", "EUPL-1.0", "EUPL-1.1", "Eurosym", "Fair", "MIT-feh", "Frameworx-1.0", "FreeImage", "FTL", "FSFUL", "FSFULLR", "Giftware", "GL2PS", "Glulxe", "AGPL-3.0", "GFDL-1.1", "GFDL-1.2", "GFDL-1.3", "GPL-1.0", "GPL-2.0", "GPL-3.0", "LGPL-2.1", "LGPL-3.0", "LGPL-2.0", "gnuplot", "gSOAP-1.3b", "HaskellReport", "HPND", "IBM-pibs", "IPL-1.0", "ICU", "ImageMagick", "iMatix", "Imlib2", "IJG", "Intel-ACPI", "Intel", "IPA", "ISC", "JasPer-2.0", "JSON", "LPPL-1.3a", "LPPL-1.0", "LPPL-1.1", "LPPL-1.2", "LPPL-1.3c", "Latex2e", "BSD-3-Clause-LBNL", "Leptonica", "LGPLLR", "Libpng", "libtiff", "LPL-1.02", "LPL-1.0", "MakeIndex", "MTLL", "MS-PL", "MS-RL", "MirOS", "MITNFA", "MIT", "Motosoto", "MPL-1.0", "MPL-1.1", "MPL-2.0", "MPL-2.0-no-copyleft-exception", "mpich2", "Multics", "Mup", "NASA-1.3", "Naumen", "NBPL-1.0", "NetCDF", "NGPL", "NOSL", "NPL-1.0", "NPL-1.1", "Newsletr", "NLPL", "Nokia", "NPOSL-3.0", "Noweb", "NRL", "NTP", "Nunit", "OCLC-2.0", "ODbL-1.0", "PDDL-1.0", "OGTSL", "OLDAP-2.2.2", "OLDAP-1.1", "OLDAP-1.2", "OLDAP-1.3", "OLDAP-1.4", "OLDAP-2.0", "OLDAP-2.0.1", "OLDAP-2.1", "OLDAP-2.2", "OLDAP-2.2.1", "OLDAP-2.3", "OLDAP-2.4", "OLDAP-2.5", "OLDAP-2.6", "OLDAP-2.7", "OLDAP-2.8", "OML", "OPL-1.0", "OSL-1.0", "OSL-1.1", "OSL-2.0", "OSL-2.1", "OSL-3.0", "OpenSSL", "PHP-3.0", "PHP-3.01", "Plexus", "PostgreSQL", "psfrag", "psutils", "Python-2.0", "QPL-1.0", "Qhull", "Rdisc", "RPSL-1.0", "RPL-1.1", "RPL-1.5", "RHeCos-1.1", "RSCPL", "RSA-MD", "Ruby", "SAX-PD", "Saxpath", "SCEA", "SWL", "SGI-B-1.0", 
"SGI-B-1.1", "SGI-B-2.0", "OFL-1.0", "OFL-1.1", "SimPL-2.0", "Sleepycat", "SNIA", "Spencer-86", "Spencer-94", "Spencer-99", "SMLNJ", "SugarCRM-1.1.3", "SISSL", "SISSL-1.2", "SPL-1.0", "Watcom-1.0", "TCL", "Unlicense", "TMate", "TORQUE-1.1", "TOSL", "Unicode-TOU", "UPL-1.0", "NCSA", "Vim", "VOSTROM", "VSL-1.0", "W3C-19980720", "W3C", "Wsuipa", "Xnet", "X11", "Xerox", "XFree86-1.1", "xinetd", "xpp", "XSkat", "YPL-1.0", "YPL-1.1", "Zed", "Zend-2.0", "Zimbra-1.3", "Zimbra-1.4", "Zlib", "zlib-acknowledgement", "ZPL-1.1", "ZPL-2.0", "ZPL-2.1" ] npm_3.5.2.orig/node_modules/validate-npm-package-name/.npmignore0000644000000000000000000000001512631326456023032 0ustar 00000000000000node_modules npm_3.5.2.orig/node_modules/validate-npm-package-name/LICENSE0000644000000000000000000000133012631326456022041 0ustar 00000000000000Copyright (c) 2015, npm, Inc Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/validate-npm-package-name/README.md0000644000000000000000000000546612631326456022331 0ustar 00000000000000# validate-npm-package-name Give me a string and I'll tell you if it's a valid `npm` package name. This package exports a single synchronous function that takes a `string` as input and returns an object with two properties: - `validForNewPackages` :: `Boolean` - `validForOldPackages` :: `Boolean` ## Contents - [Naming rules](#naming-rules) - [Examples](#examples) + [Valid Names](#valid-names) + [Invalid Names](#invalid-names) - [Legacy Names](#legacy-names) - [Tests](#tests) - [License](#license) ## Naming Rules Below is a list of rules that valid `npm` package name should conform to. - package name length should be greater than zero - all the characters in the package name must be lowercase i.e., no uppercase or mixed case names are allowed - package name *can* consist of hyphens - package name must *not* contain any non-url-safe characters (since name ends up being part of a URL) - package name should not start with `.` or `_` - package name should *not* contain any leading or trailing spaces - package name *cannot* be the same as a node.js/io.js core module nor a reserved/blacklisted name. 
For example, the following names are invalid: + http + stream + node_modules + favicon.ico - package name length cannot exceed 214 ## Examples ### Valid Names ```js var validate = require("validate-npm-package-name") validate("some-package") validate("example.com") validate("under_score") validate("123numeric") validate("crazy!") validate("@npm/thingy") validate("@jane/foo.js") ``` All of the above names are valid, so you'll get this object back: ```js { validForNewPackages: true, validForOldPackages: true } ``` ### Invalid Names ```js validate(" leading-space:and:weirdchars") ``` That was never a valid package name, so you get this: ```js { validForNewPackages: false, validForOldPackages: false, errors: [ 'name cannot contain leading or trailing spaces', 'name can only contain URL-friendly characters' ] } ``` ## Legacy Names In the old days of npm, package names were wild. They could have capital letters in them. They could be really long. They could be the name of an existing module in node core. If you give this function a package name that **used to be valid**, you'll see a change in the value of `validForNewPackages` property, and a warnings array will be present: ```js validate("cRaZY-paCkAgE-with-mixed-case-and-more-than-214-characters-----------------------------------------------------------------------------------------------------------------------------------------------------------") ``` returns: ```js { validForNewPackages: false, validForOldPackages: true, warnings: [ "name can no longer contain capital letters", "name can no longer contain more than 214 characters" ] } ``` ## Tests ```sh npm install npm test ``` ## License ISC npm_3.5.2.orig/node_modules/validate-npm-package-name/index.js0000644000000000000000000000520012631326456022501 0ustar 00000000000000var scopedPackagePattern = new RegExp("^(?:@([^/]+?)[/])?([^/]+?)$"); var builtins = require("builtins") var blacklist = [ "node_modules", "favicon.ico" ]; var validate = module.exports = function(name) { var warnings = [] var errors = [] if (name === null) { errors.push("name cannot be null") return done(warnings, errors) } if (name === undefined) { errors.push("name cannot be undefined") return done(warnings, errors) } if (typeof name !== "string") { errors.push("name must be a string") return done(warnings, errors) } if (!name.length) { errors.push("name length must be greater than zero") } if (name.match(/^\./)) { errors.push("name cannot start with a period") } if (name.match(/^_/)) { errors.push("name cannot start with an underscore") } if (name.trim() !== name) { errors.push("name cannot contain leading or trailing spaces") } // No funny business blacklist.forEach(function(blacklistedName){ if (name.toLowerCase() === blacklistedName) { errors.push(blacklistedName + " is a blacklisted name") } }) // Generate warnings for stuff that used to be allowed // core module names like http, events, util, etc builtins.forEach(function(builtin){ if (name.toLowerCase() === builtin) { warnings.push(builtin + " is a core module name") } }) // really-long-package-names-------------------------------such--length-----many---wow // the thisisareallyreallylongpackagenameitshouldpublishdowenowhavealimittothelengthofpackagenames-poch. 
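// Note (per the README's "Legacy Names" section): over-long names only add a
// warning, not an error, so validForOldPackages stays true for packages
// published before the 214-character limit existed.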
if (name.length > 214) { warnings.push("name can no longer contain more than 214 characters") } // mIxeD CaSe nAMEs if (name.toLowerCase() !== name) { warnings.push("name can no longer contain capital letters") } if (encodeURIComponent(name) !== name) { // Maybe it's a scoped package name, like @user/package var nameMatch = name.match(scopedPackagePattern) if (nameMatch) { var user = nameMatch[1] var pkg = nameMatch[2] if (encodeURIComponent(user) === user && encodeURIComponent(pkg) === pkg) { return done(warnings, errors) } } errors.push("name can only contain URL-friendly characters") } return done(warnings, errors) } validate.scopedPackagePattern = scopedPackagePattern var done = function (warnings, errors) { var result = { validForNewPackages: errors.length === 0 && warnings.length === 0, validForOldPackages: errors.length === 0, warnings: warnings, errors: errors } if (!result.warnings.length) delete result.warnings if (!result.errors.length) delete result.errors return result } npm_3.5.2.orig/node_modules/validate-npm-package-name/node_modules/0000755000000000000000000000000012631326456023514 5ustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-name/package.json0000644000000000000000000000321112631326456023322 0ustar 00000000000000{ "name": "validate-npm-package-name", "version": "2.2.2", "description": "Give me a string and I'll tell you if it's a valid npm package name", "main": "index.js", "directories": { "test": "test" }, "dependencies": { "builtins": "0.0.7" }, "devDependencies": { "tap": "^0.4.13" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/npm/validate-npm-package-name.git" }, "keywords": [ "npm", "package", "names", "validation" ], "author": { "name": "zeke" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/validate-npm-package-name/issues" }, "homepage": "https://github.com/npm/validate-npm-package-name", "gitHead": "3af92c881549f1b96f05ab6bfb5768bba94ad72d", "_id": "validate-npm-package-name@2.2.2", "_shasum": "f65695b22f7324442019a3c7fa39a6e7fd299085", "_from": "validate-npm-package-name@>=2.2.2 <2.3.0", "_npmVersion": "3.0.0", "_nodeVersion": "0.12.5", "_npmUser": { "name": "zkat", "email": "kat@sykosomatic.org" }, "dist": { "shasum": "f65695b22f7324442019a3c7fa39a6e7fd299085", "tarball": "http://registry.npmjs.org/validate-npm-package-name/-/validate-npm-package-name-2.2.2.tgz" }, "maintainers": [ { "name": "zeke", "email": "zeke@sikelianos.com" }, { "name": "bcoe", "email": "ben@npmjs.com" }, { "name": "zkat", "email": "kat@sykosomatic.org" } ], "_resolved": "https://registry.npmjs.org/validate-npm-package-name/-/validate-npm-package-name-2.2.2.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/validate-npm-package-name/test/0000755000000000000000000000000012631326456022016 5ustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-name/node_modules/builtins/0000755000000000000000000000000012631326456025345 5ustar 00000000000000npm_3.5.2.orig/node_modules/validate-npm-package-name/node_modules/builtins/.travis.yml0000644000000000000000000000006012631326456027452 0ustar 00000000000000language: node_js node_js: - "0.8" - "0.10" npm_3.5.2.orig/node_modules/validate-npm-package-name/node_modules/builtins/History.md0000644000000000000000000000072412631326456027333 0ustar 00000000000000 0.0.7 / 2014-09-01 ================== * update .repository 0.0.6 / 2014-09-01 ================== * add travis * add test script * add constants 0.0.5 / 2014-06-27 ================== * add module * publish to public npm 0.0.4 / 2014-04-25 ================== * add timers 0.0.3 / 2014-02-22 ================== * add buffer 0.0.2 / 2014-02-11 ================== * add assert 0.0.1 / 2014-02-11 ================== * add main * initial commit npm_3.5.2.orig/node_modules/validate-npm-package-name/node_modules/builtins/Readme.md0000644000000000000000000000047112631326456027066 0ustar 00000000000000 # builtins List of node.js [builtin modules](http://nodejs.org/api/). [![build status](https://secure.travis-ci.org/juliangruber/builtins.svg)](http://travis-ci.org/juliangruber/builtins) ## Example ```js var builtins = require('builtins'); assert(builtins.indexOf('http') > -1); ``` ## License MIT npm_3.5.2.orig/node_modules/validate-npm-package-name/node_modules/builtins/builtins.json0000644000000000000000000000052212631326456030070 0ustar 00000000000000[ "assert", "buffer", "child_process", "cluster", "constants", "crypto", "dns", "domain", "events", "fs", "http", "https", "module", "net", "os", "path", "punycode", "querystring", "repl", "stream", "string_decoder", "timers", "tls", "tty", "dgram", "url", "util", "vm", "zlib" ] npm_3.5.2.orig/node_modules/validate-npm-package-name/node_modules/builtins/package.json0000644000000000000000000000230012631326456027626 0ustar 00000000000000{ "name": "builtins", "version": "0.0.7", "description": "List of node.js builtin modules", "repository": { "type": "git", "url": "git://github.com/juliangruber/builtins.git" }, "license": "MIT", "main": "builtins.json", "publishConfig": { "registry": "https://registry.npmjs.org" }, "scripts": { "test": "node -e \"require('./builtins.json')\"" }, "bugs": { "url": "https://github.com/juliangruber/builtins/issues" }, "homepage": "https://github.com/juliangruber/builtins", "_id": "builtins@0.0.7", "dist": { "shasum": "355219cd6cf18dbe7c01cc7fd2dce765cfdc549a", "tarball": "http://registry.npmjs.org/builtins/-/builtins-0.0.7.tgz" }, "_from": "builtins@0.0.7", "_npmVersion": "1.3.22", "_npmUser": { "name": "juliangruber", "email": "julian@juliangruber.com" }, "maintainers": [ { "name": "juliangruber", "email": "julian@juliangruber.com" }, { "name": "segment", "email": "tj@segment.io" } ], "directories": {}, "_shasum": "355219cd6cf18dbe7c01cc7fd2dce765cfdc549a", "_resolved": "https://registry.npmjs.org/builtins/-/builtins-0.0.7.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/validate-npm-package-name/test/index.js0000644000000000000000000000735612631326456023476 0ustar 00000000000000var validate = require("..") var test = require("tap").test var path = require("path") var fs = require("fs") test("validate-npm-package-name", function (t) { // Traditional t.deepEqual(validate("some-package"), {validForNewPackages: true, validForOldPackages: true}) t.deepEqual(validate("example.com"), {validForNewPackages: true, validForOldPackages: true}) t.deepEqual(validate("under_score"), {validForNewPackages: true, validForOldPackages: true}) t.deepEqual(validate("period.js"), {validForNewPackages: true, validForOldPackages: true}) t.deepEqual(validate("123numeric"), {validForNewPackages: true, validForOldPackages: true}) t.deepEqual(validate("crazy!"), {validForNewPackages: true, validForOldPackages: true}) // Scoped (npm 2+) t.deepEqual(validate("@npm/thingy"), {validForNewPackages: true, validForOldPackages: true}) t.deepEqual(validate("@npm-zors/money!time.js"), {validForNewPackages: true, validForOldPackages: true}) // Invalid t.deepEqual(validate(""), { validForNewPackages: false, validForOldPackages: false, errors: ["name length must be greater than zero"]}) t.deepEqual(validate(""), { validForNewPackages: false, validForOldPackages: false, errors: ["name length must be greater than zero"]}) t.deepEqual(validate(".start-with-period"), { validForNewPackages: false, validForOldPackages: false, errors: ["name cannot start with a period"]}) t.deepEqual(validate("_start-with-underscore"), { validForNewPackages: false, validForOldPackages: false, errors: ["name cannot start with an underscore"]}) t.deepEqual(validate("contain:colons"), { validForNewPackages: false, validForOldPackages: false, errors: ["name can only contain URL-friendly characters"]}) t.deepEqual(validate(" leading-space"), { validForNewPackages: false, validForOldPackages: false, errors: ["name cannot contain leading or trailing spaces", "name can only contain URL-friendly characters"]}) t.deepEqual(validate("trailing-space "), { validForNewPackages: false, validForOldPackages: false, errors: ["name cannot contain leading or trailing spaces", "name can only contain URL-friendly characters"]}) t.deepEqual(validate("s/l/a/s/h/e/s"), { validForNewPackages: false, validForOldPackages: false, errors: ["name can only contain URL-friendly characters"]}) t.deepEqual(validate("node_modules"), { validForNewPackages: false, validForOldPackages: false, errors: ["node_modules is a blacklisted name"]}) t.deepEqual(validate("favicon.ico"), { validForNewPackages: false, validForOldPackages: false, errors: ["favicon.ico is a blacklisted name"]}) // Node/IO Core t.deepEqual(validate("http"), { validForNewPackages: false, validForOldPackages: true, warnings: ["http is a core module name"]}) // Long Package Names t.deepEqual(validate("ifyouwanttogetthesumoftwonumberswherethosetwonumbersarechosenbyfindingthelargestoftwooutofthreenumbersandsquaringthemwhichismultiplyingthembyitselfthenyoushouldinputthreenumbersintothisfunctionanditwilldothatforyou-"), { validForNewPackages: false, validForOldPackages: true, warnings: ["name can no longer contain more than 214 characters"] }) t.deepEqual(validate("ifyouwanttogetthesumoftwonumberswherethosetwonumbersarechosenbyfindingthelargestoftwooutofthreenumbersandsquaringthemwhichismultiplyingthembyitselfthenyoushouldinputthreenumbersintothisfunctionanditwilldothatforyou"), { validForNewPackages: true, validForOldPackages: true }) // Legacy Mixed-Case 
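// (capital letters downgrade to a warning rather than an error, so
//  validForOldPackages remains true -- see the README's "Legacy Names" section)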
t.deepEqual(validate("CAPITAL-LETTERS"), { validForNewPackages: false, validForOldPackages: true, warnings: ["name can no longer contain capital letters"]}) t.end() }) npm_3.5.2.orig/node_modules/which/.travis.yml0000644000000000000000000000010512631326456017356 0ustar 00000000000000sudo: false language: node_js node_js: - '0.10' - '0.12' - '4' npm_3.5.2.orig/node_modules/which/LICENSE0000644000000000000000000000137512631326456016264 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/which/README.md0000644000000000000000000000205012631326456016525 0ustar 00000000000000# which Like the unix `which` utility. Finds the first instance of a specified executable in the PATH environment variable. Does not cache the results, so `hash -r` is not needed when the PATH changes. ## USAGE ```javascript var which = require('which') // async usage which('node', function (er, resolvedPath) { // er is returned if no "node" is found on the PATH // if it is found, then the absolute path to the exec is returned }) // sync usage // throws if not found var resolved = which.sync('node') // Pass options to override the PATH and PATHEXT environment vars. which('node', { path: someOtherPath }, function (er, resolved) { if (er) throw er console.log('found at %j', resolved) }) ``` ## OPTIONS You may pass an options object as the second argument. - `path`: Use instead of the `PATH` environment variable. - `pathExt`: Use instead of the `PATHEXT` environment variable. - `all`: Return all matches, instead of just the first one. Note that this means the function returns an array of strings instead of a single string. npm_3.5.2.orig/node_modules/which/bin/0000755000000000000000000000000012631326456016021 5ustar 00000000000000npm_3.5.2.orig/node_modules/which/node_modules/0000755000000000000000000000000012631326456017726 5ustar 00000000000000npm_3.5.2.orig/node_modules/which/package.json0000644000000000000000000000262612631326456017545 0ustar 00000000000000{ "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me" }, "name": "which", "description": "Like which(1) unix command. 
Find the first instance of an executable in the PATH.", "version": "1.2.0", "repository": { "type": "git", "url": "git://github.com/isaacs/node-which.git" }, "main": "which.js", "bin": { "which": "./bin/which" }, "license": "ISC", "dependencies": { "is-absolute": "^0.1.7" }, "devDependencies": { "mkdirp": "^0.5.0", "rimraf": "^2.3.3", "tap": "^2.0.0" }, "scripts": { "test": "tap test/*.js" }, "gitHead": "98925d6bced9ba820a17fd857e7a53a491958419", "bugs": { "url": "https://github.com/isaacs/node-which/issues" }, "homepage": "https://github.com/isaacs/node-which#readme", "_id": "which@1.2.0", "_shasum": "a5c8df5abc792f6ce9652c8d9ca8f3a91b77e59d", "_from": "which@>=1.2.0 <1.3.0", "_npmVersion": "3.3.2", "_nodeVersion": "4.0.0", "_npmUser": { "name": "isaacs", "email": "isaacs@npmjs.com" }, "dist": { "shasum": "a5c8df5abc792f6ce9652c8d9ca8f3a91b77e59d", "tarball": "http://registry.npmjs.org/which/-/which-1.2.0.tgz" }, "maintainers": [ { "name": "isaacs", "email": "i@izs.me" } ], "directories": {}, "_resolved": "https://registry.npmjs.org/which/-/which-1.2.0.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/which/test/0000755000000000000000000000000012631326456016230 5ustar 00000000000000npm_3.5.2.orig/node_modules/which/which.js0000644000000000000000000000613712631326456016720 0ustar 00000000000000module.exports = which which.sync = whichSync var isWindows = process.platform === 'win32' || process.env.OSTYPE === 'cygwin' || process.env.OSTYPE === 'msys' var path = require('path') var COLON = isWindows ? ';' : ':' var isExe var fs = require('fs') var isAbsolute = require('is-absolute') var G = parseInt('0010', 8) var U = parseInt('0100', 8) var UG = parseInt('0110', 8) if (isWindows) { // On windows, there is no good way to check that a file is executable isExe = function isExe () { return true } } else { isExe = function isExe (mod, uid, gid) { var ret = (mod & 1) || (mod & U) && process.getgid && gid === process.getgid() || (mod & G) && process.getuid && uid === process.getuid() || (mod & UG) && process.getuid && 0 === process.getuid() if (!ret && process.getgroups && (mod & G)) { var groups = process.getgroups() for (var g = 0; g < groups.length; g++) { if (groups[g] === gid) return true } } return ret } } function getPathInfo(cmd, opt) { var colon = opt.colon || COLON var pathEnv = opt.path || process.env.PATH || '' var pathExt = [''] pathEnv = pathEnv.split(colon) if (isWindows) { pathEnv.unshift(process.cwd()) pathExt = (opt.pathExt || process.env.PATHEXT || '.EXE').split(colon) if (cmd.indexOf('.') !== -1 && pathExt[0] !== '') pathExt.unshift('') } // If it's absolute, then we don't bother searching the pathenv. // just check the file itself, and that's it. 
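// A quick illustration of what getPathInfo() computes (the values below are
// hypothetical, not produced by this module): on a Unix-ish system,
//   getPathInfo('node', {})                   -> { env: ['/usr/local/bin', '/usr/bin', ...], ext: [''] }
//   getPathInfo('node', { path: '/opt/bin' }) -> { env: ['/opt/bin'], ext: [''] }
// On Windows the current directory is prepended to env and ext comes from
// PATHEXT, so a bare name like 'node' can resolve to 'node.exe'.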
if (isAbsolute(cmd)) pathEnv = [''] return {env: pathEnv, ext: pathExt} } function which (cmd, opt, cb) { if (typeof opt === 'function') { cb = opt opt = {} } var info = getPathInfo(cmd, opt) var pathEnv = info.env var pathExt = info.ext var found = [] ;(function F (i, l) { if (i === l) { if (opt.all && found.length) return cb(null, found) else return cb(new Error('not found: '+cmd)) } var p = path.resolve(pathEnv[i], cmd) ;(function E (ii, ll) { if (ii === ll) return F(i + 1, l) var ext = pathExt[ii] fs.stat(p + ext, function (er, stat) { if (!er && stat.isFile() && isExe(stat.mode, stat.uid, stat.gid)) { if (opt.all) found.push(p + ext) else return cb(null, p + ext) } return E(ii + 1, ll) }) })(0, pathExt.length) })(0, pathEnv.length) } function whichSync (cmd, opt) { opt = opt || {} var info = getPathInfo(cmd, opt) var pathEnv = info.env var pathExt = info.ext var found = [] for (var i = 0, l = pathEnv.length; i < l; i ++) { var p = path.join(pathEnv[i], cmd) for (var j = 0, ll = pathExt.length; j < ll; j ++) { var cur = p + pathExt[j] var stat try { stat = fs.statSync(cur) if (stat.isFile() && isExe(stat.mode, stat.uid, stat.gid)) { if (opt.all) found.push(cur) else return cur } } catch (ex) {} } } if (opt.all && found.length) return found throw new Error('not found: '+cmd) } npm_3.5.2.orig/node_modules/which/bin/which0000755000000000000000000000173112631326456017053 0ustar 00000000000000#!/usr/bin/env node var which = require("../") if (process.argv.length < 3) usage() function usage () { console.error('usage: which [-as] program ...') process.exit(1) } var all = false var silent = false var dashdash = false var args = process.argv.slice(2).filter(function (arg) { if (dashdash || !/^-/.test(arg)) return true if (arg === '--') { dashdash = true return false } var flags = arg.substr(1).split('') for (var f = 0; f < flags.length; f++) { var flag = flags[f] switch (flag) { case 's': silent = true break case 'a': all = true break default: console.error('which: illegal option -- ' + flag) usage() } } return false }) process.exit(args.reduce(function (pv, current) { try { var f = which.sync(current, { all: all }) if (all) f = f.join('\n') if (!silent) console.log(f) return pv; } catch (e) { return 1; } }, 0)) npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/0000755000000000000000000000000012631326456022155 5ustar 00000000000000npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/LICENSE0000644000000000000000000000215012631326456023160 0ustar 00000000000000The MIT License (MIT) Copyright (c) 2014-2015, Jon Schlinkert.Copyright (c) 2009-2015, TJ Holowaychuk. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
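Taken together, `which.js` and the `bin/which` wrapper above expose the same PATH search three ways: an async callback API, `which.sync`, and the CLI. A short usage sketch (the command names are only examples):

```javascript
var which = require('which')

// async: walks the computed PATH entries and yields the first executable hit
which('node', function (er, resolved) {
  if (er) return console.error(er.message) // 'not found: node'
  console.log('found at %s', resolved)
})

// sync, collecting every match: with `all: true` an array of absolute
// paths comes back instead of a single string
try {
  var all = which.sync('node', { all: true })
  console.log(all.join('\n'))
} catch (er) {
  console.error(er.message)
}
```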
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/README.md0000644000000000000000000000337112631326456023440 0ustar 00000000000000# is-absolute [![NPM version](https://badge.fury.io/js/is-absolute.svg)](http://badge.fury.io/js/is-absolute) [![Build Status](https://travis-ci.org/jonschlinkert/is-absolute.svg)](https://travis-ci.org/jonschlinkert/is-absolute) > Return true if a file path is absolute. Based on the `isAbsolute` utility method in [express](https://github.com/visionmedia/express). ## Install with [npm](npmjs.org) ```bash npm i is-absolute --save ``` ## Usage ```js var isAbsolute = require('is-absolute'); console.log(isAbsolute('a/b/c.js')); //=> 'false'; ``` ## Running tests Install dev dependencies. ```bash npm i -d && npm test ``` ## Contributing Pull requests and stars are always welcome. For bugs and feature requests, [please create an issue](https://github.com/jonschlinkert/is-absolute/issues) ## Other projects * [is-relative](https://github.com/jonschlinkert/is-relative): Returns `true` if the path appears to be relative. * [is-dotfile](https://github.com/regexps/is-dotfile): Return true if a file path is (or has) a dotfile. * [is-glob](https://github.com/jonschlinkert/is-glob): Returns `true` if the given string looks like a glob pattern. * [cwd](https://github.com/jonschlinkert/cwd): Node.js util for easily getting the current working directory of a project based on package.json or the given path. * [git-config-path](https://github.com/jonschlinkert/git-config-path): Resolve the path to the user's global .gitconfig. ## Author **Jon Schlinkert** + [github/jonschlinkert](https://github.com/jonschlinkert) + [twitter/jonschlinkert](http://twitter.com/jonschlinkert) ## License Copyright (c) 2014-2015 Jon Schlinkert Released under the MIT license *** _This file was generated by [verb-cli](https://github.com/assemble/verb-cli) on March 05, 2015._ npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/index.js0000644000000000000000000000103712631326456023623 0ustar 00000000000000/*! * is-absolute * * Copyright (c) 2014-2015, Jon Schlinkert. * Licensed under the MIT License. 
*/ 'use strict'; var isRelative = require('is-relative'); module.exports = function isAbsolute(filepath) { if ('/' === filepath[0]) { return true; } if (':' === filepath[1] && '\\' === filepath[2]) { return true; } // Microsoft Azure absolute filepath if ('\\\\' == filepath.substring(0, 2)) { return true; } if (!isRelative(filepath)) { return true; } }; npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/node_modules/0000755000000000000000000000000012631326456024632 5ustar 00000000000000npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/package.json0000644000000000000000000000332612631326456024447 0ustar 00000000000000{ "name": "is-absolute", "description": "Return true if a file path is absolute.", "version": "0.1.7", "homepage": "https://github.com/jonschlinkert/is-absolute", "author": { "name": "Jon Schlinkert", "url": "https://github.com/jonschlinkert" }, "repository": { "type": "git", "url": "git://github.com/jonschlinkert/is-absolute.git" }, "bugs": { "url": "https://github.com/jonschlinkert/is-absolute/issues" }, "license": { "type": "MIT", "url": "https://github.com/jonschlinkert/is-absolute/blob/master/LICENSE" }, "files": [ "index.js" ], "main": "index.js", "engines": { "node": ">=0.10.0" }, "scripts": { "test": "mocha" }, "dependencies": { "is-relative": "^0.1.0" }, "devDependencies": { "mocha": "*" }, "keywords": [ "absolute", "check", "file", "filepath", "is", "normalize", "path", "path.relative", "relative", "resolve", "slash", "slashes", "uri", "url" ], "gitHead": "90cca7b671620bf28b778a61fddc8a986a2e1095", "_id": "is-absolute@0.1.7", "_shasum": "847491119fccb5fb436217cc737f7faad50f603f", "_from": "is-absolute@>=0.1.7 <0.2.0", "_npmVersion": "2.5.1", "_nodeVersion": "0.12.0", "_npmUser": { "name": "jonschlinkert", "email": "github@sellside.com" }, "maintainers": [ { "name": "jonschlinkert", "email": "github@sellside.com" } ], "dist": { "shasum": "847491119fccb5fb436217cc737f7faad50f603f", "tarball": "http://registry.npmjs.org/is-absolute/-/is-absolute-0.1.7.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/is-absolute/-/is-absolute-0.1.7.tgz", "readme": "ERROR: No README data found!" } npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/node_modules/is-relative/0000755000000000000000000000000012631326456027056 5ustar 00000000000000npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/node_modules/is-relative/LICENSE-MIT0000644000000000000000000000207112631326456030512 0ustar 00000000000000The MIT License (MIT) Copyright (c) 2014 Jon Schlinkert Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
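The `isAbsolute` implementation above recognizes three absolute shapes directly — a leading `/`, a `c:\`-style drive path, and a `\\` UNC prefix — and otherwise defers to `!isRelative(filepath)`. Note that no branch returns an explicit `false`; a relative path simply falls through to `undefined`, which is falsy. A few illustrative calls:

```javascript
var isAbsolute = require('is-absolute')

console.log(isAbsolute('/usr/local/bin'))    // true  (leading '/')
console.log(isAbsolute('c:\\Windows'))       // true  (drive letter + backslash)
console.log(isAbsolute('\\\\server\\share')) // true  (UNC '\\' prefix)
console.log(isAbsolute('a/b/c.js'))          // undefined (falsy; no branch matched)
```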
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/node_modules/is-relative/README.md0000644000000000000000000000141112631326456030332 0ustar 00000000000000# is-relative [![NPM version](https://badge.fury.io/js/is-relative.svg)](http://badge.fury.io/js/is-relative) > Returns `true` if the path appears to be relative. ## Install ### Install with [npm](npmjs.org) ```bash npm i is-relative --save ``` ## Usage ### [isRelative](index.js#L16) * `filepath` **{String}**: Path to test. * `returns`: {Boolean} ```js var isRelative = require('is-relative'); isRelative('README.md'); //=> true ``` ## Author **Jon Schlinkert** + [github/jonschlinkert](https://github.com/jonschlinkert) + [twitter/jonschlinkert](http://twitter.com/jonschlinkert) ## License Copyright (c) 2014 Jon Schlinkert Released under the MIT license *** _This file was generated by [verb](https://github.com/assemble/verb) on November 17, 2014._npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/node_modules/is-relative/index.js0000644000000000000000000000064512631326456030530 0ustar 00000000000000'use strict'; /** * ```js * var isRelative = require('is-relative'); * isRelative('README.md'); * //=> true * ``` * * @name isRelative * @param {String} `filepath` Path to test. * @return {Boolean} * @api public */ module.exports = function isRelative(filepath) { if (typeof filepath !== 'string') { throw new Error('isRelative expects a string.'); } return !/^([a-z]+:)?[\\\/]/i.test(filepath); };npm_3.5.2.orig/node_modules/which/node_modules/is-absolute/node_modules/is-relative/package.json0000644000000000000000000000330412631326456031344 0ustar 00000000000000{ "name": "is-relative", "description": "Returns `true` if the path appears to be relative.", "version": "0.1.3", "homepage": "https://github.com/jonschlinkert/is-relative", "author": { "name": "Jon Schlinkert", "url": "https://github.com/jonschlinkert" }, "repository": { "type": "git", "url": "git://github.com/jonschlinkert/is-relative.git" }, "bugs": { "url": "https://github.com/jonschlinkert/is-relative/issues" }, "licenses": [ { "type": "MIT", "url": "https://github.com/jonschlinkert/is-relative/blob/master/LICENSE-MIT" } ], "keywords": [ "absolute", "check", "file", "filepath", "is", "normalize", "path", "path.relative", "relative", "resolve", "slash", "slashes", "uri", "url" ], "main": "index.js", "files": [ "index.js", "LICENSE-MIT" ], "engines": { "node": ">=0.10.0" }, "scripts": { "test": "mocha -R spec" }, "devDependencies": { "mocha": "*", "verb": ">= 0.2.6", "verb-tag-jscomments": "^0.1.4" }, "_id": "is-relative@0.1.3", "_shasum": "905fee8ae86f45b3ec614bc3c15c869df0876e82", "_from": "is-relative@>=0.1.0 <0.2.0", "_npmVersion": "1.4.9", "_npmUser": { "name": "jonschlinkert", "email": "github@sellside.com" }, "maintainers": [ { "name": "jonschlinkert", "email": "github@sellside.com" } ], "dist": { "shasum": "905fee8ae86f45b3ec614bc3c15c869df0876e82", "tarball": "http://registry.npmjs.org/is-relative/-/is-relative-0.1.3.tgz" }, "directories": {}, "_resolved": "https://registry.npmjs.org/is-relative/-/is-relative-0.1.3.tgz", "readme": "ERROR: No README data found!" 
} npm_3.5.2.orig/node_modules/which/test/basic.js0000644000000000000000000000416012631326456017650 0ustar 00000000000000var t = require('tap') var fs = require('fs') var rimraf = require('rimraf') var mkdirp = require('mkdirp') var fixture = __dirname + '/fixture' var which = require('../which.js') var path = require('path') var isWindows = process.platform === 'win32' || process.env.OSTYPE === 'cygwin' || process.env.OSTYPE === 'msys' var skip = { skip: isWindows ? 'not relevant on windows' : false } t.test('setup', function (t) { rimraf.sync(fixture) mkdirp.sync(fixture) fs.writeFileSync(fixture + '/foo.sh', 'echo foo\n') t.end() }) t.test('does not find non-executable', skip, function (t) { t.plan(2) t.test('absolute', function (t) { t.plan(2) which(fixture + '/foo.sh', function (er) { t.isa(er, Error) }) t.throws(function () { which.sync(fixture + '/foo.sh') }) }) t.test('with path', function (t) { t.plan(2) which('foo.sh', { path: fixture }, function (er) { t.isa(er, Error) }) t.throws(function () { which.sync('foo.sh', { path: fixture }) }) }) }) t.test('make executable', function (t) { fs.chmodSync(fixture + '/foo.sh', '0755') t.end() }) t.test('find when executable', function (t) { t.plan(4) var opt = { pathExt: '.sh' } var expect = path.resolve(fixture, 'foo.sh').toLowerCase() var PATH = process.env.PATH t.test('absolute', function (t) { runTest(fixture + '/foo.sh', t) }) t.test('with process.env.PATH', function (t) { process.env.PATH = fixture runTest('foo.sh', t) }) t.test('with process.env.Path', { skip: isWindows ? false : 'Only for Windows' }, function (t) { process.env.PATH = "" process.env.Path = fixture runTest('foo.sh', t) }) t.test('with path opt', function (t) { opt.path = fixture runTest('foo.sh', t) }) function runTest(exec, t) { t.plan(2) which(exec, opt, function (er, found) { if (er) throw er t.equal(found.toLowerCase(), expect) process.env.PATH = PATH }) var found = which.sync(exec, opt).toLowerCase() t.equal(found, expect) } }) t.test('clean', function (t) { rimraf.sync(fixture) t.end() }) npm_3.5.2.orig/node_modules/which/test/bin.js0000644000000000000000000000565412631326456017350 0ustar 00000000000000var t = require('tap') var spawn = require('child_process').spawn var node = process.execPath var bin = require.resolve('../bin/which') function which (args, extraPath, cb) { if (typeof extraPath === 'function') cb = extraPath, extraPath = null var options = {} if (extraPath) { var sep = process.platform === 'win32' ? 
';' : ':' var p = process.env.PATH + sep + extraPath options.env = Object.keys(process.env).reduce(function (env, k) { if (!k.match(/^path$/i)) env[k] = process.env[k] return env }, { PATH: p }) } var out = '' var err = '' var child = spawn(node, [bin].concat(args), options) child.stdout.on('data', function (c) { out += c }) child.stderr.on('data', function (c) { err += c }) child.on('close', function (code, signal) { cb(code, signal, out.trim(), err.trim()) }) } t.test('finds node', function (t) { which('node', function (code, signal, out, err) { t.equal(signal, null) t.equal(code, 0) t.equal(err, '') t.match(out, /[\\\/]node(\.exe)?$/) t.end() }) }) t.test('does not find flergyderp', function (t) { which('flergyderp', function (code, signal, out, err) { t.equal(signal, null) t.equal(code, 1) t.equal(err, '') t.match(out, '') t.end() }) }) t.test('finds node and tap', function (t) { which(['node', 'tap'], function (code, signal, out, err) { t.equal(signal, null) t.equal(code, 0) t.equal(err, '') t.match(out.split(/\n/), [ /[\\\/]node(\.exe)?$/, /[\\\/]tap(\.cmd)?$/ ]) t.end() }) }) t.test('finds node and tap, but not flergyderp', function (t) { which(['node', 'flergyderp', 'tap'], function (code, signal, out, err) { t.equal(signal, null) t.equal(code, 1) t.equal(err, '') t.match(out.split(/\n/), [ /[\\\/]node(\.exe)?$/, /[\\\/]tap(\.cmd)?$/ ]) t.end() }) }) t.test('cli flags', function (t) { var p = require('path').dirname(bin) var cases = [ '-a', '-s', '-as', '-sa' ] t.plan(cases.length) cases.forEach(function (c) { t.test(c, function (t) { which(['which', c], p, function (code, signal, out, err) { t.equal(signal, null) t.equal(code, 0) t.equal(err, '') if (/s/.test(c)) t.equal(out, '', 'should be silent') else if (/a/.test(c)) t.ok(out.split(/\n/).length > 1, 'should have more than 1 result') t.end() }) }) }) }) t.test('shows usage', function (t) { which([], function (code, signal, out, err) { t.equal(signal, null) t.equal(code, 1) t.equal(err, 'usage: which [-as] program ...') t.equal(out, '') t.end() }) }) t.test('complains about unknown flag', function (t) { which(['node', '-sax'], function (code, signal, out, err) { t.equal(signal, null) t.equal(code, 1) t.equal(out, '') t.equal(err, 'which: illegal option -- x\nusage: which [-as] program ...') t.end() }) }) npm_3.5.2.orig/node_modules/wrappy/LICENSE0000644000000000000000000000137512631326456016504 0ustar 00000000000000The ISC License Copyright (c) Isaac Z. Schlueter and Contributors Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
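The `bin.js` suite above drives the CLI the same way any consumer could: spawn the script with the current node binary, collect stdout and stderr, and inspect the exit code. A condensed sketch of that harness pattern (the resolve path and callback shape are illustrative):

```javascript
var spawn = require('child_process').spawn

// Run bin/which with the given argv and report (code, stdout, stderr).
function runWhich (args, cb) {
  var bin = require.resolve('which/bin/which') // assumes which is installed locally
  var out = ''
  var err = ''
  var child = spawn(process.execPath, [bin].concat(args))
  child.stdout.on('data', function (c) { out += c })
  child.stderr.on('data', function (c) { err += c })
  child.on('close', function (code) { cb(code, out.trim(), err.trim()) })
}

runWhich(['-a', 'node'], function (code, out) {
  console.log('exit %d', code)
  console.log(out) // every matching node on the PATH, one per line
})
```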
npm_3.5.2.orig/node_modules/wrappy/README.md0000644000000000000000000000125512631326456016753 0ustar 00000000000000# wrappy Callback wrapping utility ## USAGE ```javascript var wrappy = require("wrappy") // var wrapper = wrappy(wrapperFunction) // make sure a cb is called only once // See also: http://npm.im/once for this specific use case var once = wrappy(function (cb) { var called = false return function () { if (called) return called = true return cb.apply(this, arguments) } }) function printBoo () { console.log('boo') } // has some rando property printBoo.iAmBooPrinter = true var onlyPrintOnce = once(printBoo) onlyPrintOnce() // prints 'boo' onlyPrintOnce() // does nothing // random property is retained! assert.equal(onlyPrintOnce.iAmBooPrinter, true) ``` npm_3.5.2.orig/node_modules/wrappy/package.json0000644000000000000000000000304712631326456017763 0ustar 00000000000000{ "name": "wrappy", "version": "1.0.1", "description": "Callback wrapping utility", "main": "wrappy.js", "directories": { "test": "test" }, "dependencies": {}, "devDependencies": { "tap": "^0.4.12" }, "scripts": { "test": "tap test/*.js" }, "repository": { "type": "git", "url": "git+https://github.com/npm/wrappy.git" }, "author": { "name": "Isaac Z. Schlueter", "email": "i@izs.me", "url": "http://blog.izs.me/" }, "license": "ISC", "bugs": { "url": "https://github.com/npm/wrappy/issues" }, "homepage": "https://github.com/npm/wrappy", "readme": "# wrappy\n\nCallback wrapping utility\n\n## USAGE\n\n```javascript\nvar wrappy = require(\"wrappy\")\n\n// var wrapper = wrappy(wrapperFunction)\n\n// make sure a cb is called only once\n// See also: http://npm.im/once for this specific use case\nvar once = wrappy(function (cb) {\n var called = false\n return function () {\n if (called) return\n called = true\n return cb.apply(this, arguments)\n }\n})\n\nfunction printBoo () {\n console.log('boo')\n}\n// has some rando property\nprintBoo.iAmBooPrinter = true\n\nvar onlyPrintOnce = once(printBoo)\n\nonlyPrintOnce() // prints 'boo'\nonlyPrintOnce() // does nothing\n\n// random property is retained!\nassert.equal(onlyPrintOnce.iAmBooPrinter, true)\n```\n", "readmeFilename": "README.md", "_id": "wrappy@1.0.1", "_shasum": "1e65969965ccbc2db4548c6b84a6f2c5aedd4739", "_resolved": "https://registry.npmjs.org/wrappy/-/wrappy-1.0.1.tgz", "_from": "wrappy@>=1.0.1 <1.1.0" } npm_3.5.2.orig/node_modules/wrappy/test/0000755000000000000000000000000012631326456016450 5ustar 00000000000000npm_3.5.2.orig/node_modules/wrappy/wrappy.js0000644000000000000000000000161112631326456017350 0ustar 00000000000000// Returns a wrapper function that returns a wrapped callback // The wrapper function should do some stuff, and return a // presumably different callback function. // This makes sure that own properties are retained, so that // decorations and such are not lost along the way. 
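The header comment above calls out wrappy's one non-obvious behavior: own properties are preserved in both directions — from the wrapper definition onto the wrapper, and from the wrapped callback onto the function that comes back. A small sketch of the second case (the `retries` decoration is hypothetical):

```javascript
var wrappy = require('wrappy')

// a wrapper that logs before delegating to the original callback
var logged = wrappy(function (cb) {
  return function () {
    console.log('calling ' + cb.name)
    return cb.apply(this, arguments)
  }
})

function work () { return 42 }
work.retries = 3 // a decoration on the original callback

var wrapped = logged(work)
console.log(wrapped.retries) // 3 -- the decoration survives wrapping
console.log(wrapped())       // logs 'calling work', then 42
```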
module.exports = wrappy function wrappy (fn, cb) { if (fn && cb) return wrappy(fn)(cb) if (typeof fn !== 'function') throw new TypeError('need wrapper function') Object.keys(fn).forEach(function (k) { wrapper[k] = fn[k] }) return wrapper function wrapper() { var args = new Array(arguments.length) for (var i = 0; i < args.length; i++) { args[i] = arguments[i] } var ret = fn.apply(this, args) var cb = args[args.length-1] if (typeof ret === 'function' && ret !== cb) { Object.keys(cb).forEach(function (k) { ret[k] = cb[k] }) } return ret } } npm_3.5.2.orig/node_modules/wrappy/test/basic.js0000644000000000000000000000215112631326456020066 0ustar 00000000000000var test = require('tap').test var wrappy = require('../wrappy.js') test('basic', function (t) { function onceifier (cb) { var called = false return function () { if (called) return called = true return cb.apply(this, arguments) } } onceifier.iAmOnce = {} var once = wrappy(onceifier) t.equal(once.iAmOnce, onceifier.iAmOnce) var called = 0 function boo () { t.equal(called, 0) called++ } // has some rando property boo.iAmBoo = true var onlyPrintOnce = once(boo) onlyPrintOnce() // prints 'boo' onlyPrintOnce() // does nothing t.equal(called, 1) // random property is retained! t.equal(onlyPrintOnce.iAmBoo, true) var logs = [] var logwrap = wrappy(function (msg, cb) { logs.push(msg + ' wrapping cb') return function () { logs.push(msg + ' before cb') var ret = cb.apply(this, arguments) logs.push(msg + ' after cb') } }) var c = logwrap('foo', function () { t.same(logs, [ 'foo wrapping cb', 'foo before cb' ]) }) c() t.same(logs, [ 'foo wrapping cb', 'foo before cb', 'foo after cb' ]) t.end() }) npm_3.5.2.orig/node_modules/write-file-atomic/.npmignore0000644000000000000000000000003312631326456021463 0ustar 00000000000000*~ DEADJOE .#* node_modulesnpm_3.5.2.orig/node_modules/write-file-atomic/.travis.yml0000644000000000000000000000021312631326456021575 0ustar 00000000000000language: node_js sudo: false before_install: - "npm -g install npm" node_js: - "0.8" - "0.10" - "0.12" - "iojs" - "4" - "5" npm_3.5.2.orig/node_modules/write-file-atomic/LICENSE0000644000000000000000000000133612631326456020500 0ustar 00000000000000Copyright (c) 2015, Rebecca Turner Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies. THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. npm_3.5.2.orig/node_modules/write-file-atomic/README.md0000644000000000000000000000312112631326456020744 0ustar 00000000000000write-file-atomic ----------------- This is an extension for node's `fs.writeFile` that makes its operation atomic and allows you to set ownership (uid/gid of the file). ### var writeFileAtomic = require('write-file-atomic')
      writeFileAtomic(filename, data, [options], callback) * filename **String** * data **String** | **Buffer** * options **Object** * chown **Object** * uid **Number** * gid **Number** * encoding **String** | **Null** default = 'utf8' * mode **Number** default = 438 (aka 0666 in Octal) * callback **Function** Atomically and asynchronously writes data to a file, replacing the file if it already exists. data can be a string or a buffer. The file is initially named `filename + "." + murmurhex(__filename, process.pid, ++invocations)`. If the write succeeds and you passed the **chown** option, it then changes the ownership of the file. Finally it renames the file back to the filename you specified. If it encounters errors at any of these steps it will attempt to unlink the temporary file and then pass the error back to the caller. If provided, the **chown** option requires both **uid** and **gid** properties or else you'll get an error. The **encoding** option is ignored if **data** is a buffer. It defaults to 'utf8'. Example: ```javascript writeFileAtomic('message.txt', 'Hello Node', {chown:{uid:100,gid:50}}, function (err) { if (err) throw err; console.log('It\'s saved!'); }); ``` ### var writeFileAtomicSync = require('write-file-atomic').sync
      writeFileAtomicSync(filename, data, [options]) The synchronous version of **writeFileAtomic**. npm_3.5.2.orig/node_modules/write-file-atomic/index.js0000644000000000000000000000246112631326456021140 0ustar 00000000000000'use strict' var fs = require('graceful-fs') var chain = require('slide').chain var MurmurHash3 = require('imurmurhash') function murmurhex () { var hash = new MurmurHash3() for (var ii = 0; ii < arguments.length; ++ii) hash.hash('' + arguments[ii]) return hash.result() } var invocations = 0 var getTmpname = function (filename) { return filename + '.' + murmurhex(__filename, process.pid, ++invocations) } module.exports = function writeFile (filename, data, options, callback) { if (options instanceof Function) { callback = options options = null } if (!options) options = {} var tmpfile = getTmpname(filename) chain([ [fs, fs.writeFile, tmpfile, data, options], options.chown && [fs, fs.chown, tmpfile, options.chown.uid, options.chown.gid], [fs, fs.rename, tmpfile, filename] ], function (err) { err ? fs.unlink(tmpfile, function () { callback(err) }) : callback() }) } module.exports.sync = function writeFileSync (filename, data, options) { if (!options) options = {} var tmpfile = getTmpname(filename) try { fs.writeFileSync(tmpfile, data, options) if (options.chown) fs.chownSync(tmpfile, options.chown.uid, options.chown.gid) fs.renameSync(tmpfile, filename) } catch (err) { try { fs.unlinkSync(tmpfile) } catch (e) {} throw err } } npm_3.5.2.orig/node_modules/write-file-atomic/package.json0000644000000000000000000000420512631326456021757 0ustar 00000000000000{ "_args": [ [ "write-file-atomic@^1.1.4", "/Users/ogd/Documents/projects/npm/npm" ] ], "_from": "write-file-atomic@>=1.1.4 <2.0.0", "_id": "write-file-atomic@1.1.4", "_inCache": true, "_installable": true, "_location": "/write-file-atomic", "_nodeVersion": "5.1.0", "_npmUser": { "email": "ogd@aoaioxxysz.net", "name": "othiym23" }, "_npmVersion": "3.5.1", "_phantomChildren": {}, "_requested": { "name": "write-file-atomic", "raw": "write-file-atomic@^1.1.4", "rawSpec": "^1.1.4", "scope": null, "spec": ">=1.1.4 <2.0.0", "type": "range" }, "_requiredBy": [ "/" ], "_shasum": "b1f52dc2e8dc0e3cb04d187a25f758a38a90ca3b", "_shrinkwrap": null, "_spec": "write-file-atomic@^1.1.4", "_where": "/Users/ogd/Documents/projects/npm/npm", "author": { "email": "me@re-becca.org", "name": "Rebecca Turner", "url": "http://re-becca.org" }, "bugs": { "url": "https://github.com/iarna/write-file-atomic/issues" }, "dependencies": { "graceful-fs": "^4.1.2", "imurmurhash": "^0.1.4", "slide": "^1.1.5" }, "description": "Write files in an atomic fashion w/configurable ownership", "devDependencies": { "require-inject": "^1.1.0", "standard": "^5.4.1", "tap": "^2.3.1" }, "directories": {}, "dist": { "shasum": "b1f52dc2e8dc0e3cb04d187a25f758a38a90ca3b", "tarball": "http://registry.npmjs.org/write-file-atomic/-/write-file-atomic-1.1.4.tgz" }, "gitHead": "42dc04a17af96ac045f4979c8c951ee5a14a8b8b", "homepage": "https://github.com/iarna/write-file-atomic", "keywords": [ "atomic", "writeFile" ], "license": "ISC", "main": "index.js", "maintainers": [ { "name": "iarna", "email": "me@re-becca.org" }, { "name": "othiym23", "email": "ogd@aoaioxxysz.net" } ], "name": "write-file-atomic", "optionalDependencies": {}, "readme": "ERROR: No README data found!", "repository": { "type": "git", "url": "git+ssh://git@github.com/iarna/write-file-atomic.git" }, "scripts": { "test": "standard && tap --coverage test/*.js" }, "version": "1.1.4" } 
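`index.js` above is the whole algorithm the README describes: write the data to a uniquely named temp file, optionally chown it, then rename it over the target, unlinking the temp file if any step fails. The rename is what makes the replacement atomic. A usage sketch (the filenames are illustrative):

```javascript
var writeFileAtomic = require('write-file-atomic')

// async: readers of config.json never observe a half-written file,
// because the data lands in a temp file that is renamed into place
writeFileAtomic('config.json', JSON.stringify({ saved: true }, null, 2), function (err) {
  if (err) throw err
  console.log('saved atomically')
})

// sync variant: same write -> (chown) -> rename steps; on error the
// temp file is unlinked and the original file is left untouched
try {
  writeFileAtomic.sync('message.txt', 'Hello Node', { mode: parseInt('0644', 8) })
} catch (err) {
  console.error('write failed:', err.message)
}
```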
npm_3.5.2.orig/node_modules/write-file-atomic/test/0000755000000000000000000000000012631326456020447 5ustar 00000000000000npm_3.5.2.orig/node_modules/write-file-atomic/test/basic.js0000644000000000000000000000657012631326456022076 0ustar 00000000000000'use strict' var test = require('tap').test var requireInject = require('require-inject') var writeFileAtomic = requireInject('../index', { 'graceful-fs': { writeFile: function (tmpfile, data, options, cb) { if (/nowrite/.test(tmpfile)) return cb(new Error('ENOWRITE')) cb() }, chown: function (tmpfile, uid, gid, cb) { if (/nochown/.test(tmpfile)) return cb(new Error('ENOCHOWN')) cb() }, rename: function (tmpfile, filename, cb) { if (/norename/.test(tmpfile)) return cb(new Error('ENORENAME')) cb() }, unlink: function (tmpfile, cb) { if (/nounlink/.test(tmpfile)) return cb(new Error('ENOUNLINK')) cb() }, writeFileSync: function (tmpfile, data, options) { if (/nowrite/.test(tmpfile)) throw new Error('ENOWRITE') }, chownSync: function (tmpfile, uid, gid) { if (/nochown/.test(tmpfile)) throw new Error('ENOCHOWN') }, renameSync: function (tmpfile, filename) { if (/norename/.test(tmpfile)) throw new Error('ENORENAME') }, unlinkSync: function (tmpfile) { if (/nounlink/.test(tmpfile)) throw new Error('ENOUNLINK') } } }) var writeFileAtomicSync = writeFileAtomic.sync test('async tests', function (t) { t.plan(7) writeFileAtomic('good', 'test', {mode: '0777'}, function (err) { t.notOk(err, 'No errors occur when passing in options') }) writeFileAtomic('good', 'test', function (err) { t.notOk(err, 'No errors occur when NOT passing in options') }) writeFileAtomic('nowrite', 'test', function (err) { t.is(err.message, 'ENOWRITE', 'writeFile failures propagate') }) writeFileAtomic('nochown', 'test', {chown: {uid: 100, gid: 100}}, function (err) { t.is(err.message, 'ENOCHOWN', 'Chown failures propagate') }) writeFileAtomic('nochown', 'test', function (err) { t.notOk(err, 'No attempt to chown when no uid/gid passed in') }) writeFileAtomic('norename', 'test', function (err) { t.is(err.message, 'ENORENAME', 'Rename errors propagate') }) writeFileAtomic('norename nounlink', 'test', function (err) { t.is(err.message, 'ENORENAME', 'Failure to unlink the temp file does not clobber the original error') }) }) test('sync tests', function (t) { t.plan(7) var throws = function (shouldthrow, msg, todo) { var err try { todo() } catch (e) { err = e } t.is(shouldthrow, err.message, msg) } var noexception = function (msg, todo) { var err try { todo() } catch (e) { err = e } t.notOk(err, msg) } noexception('No errors occur when passing in options', function () { writeFileAtomicSync('good', 'test', {mode: '0777'}) }) noexception('No errors occur when NOT passing in options', function () { writeFileAtomicSync('good', 'test') }) throws('ENOWRITE', 'writeFile failures propagate', function () { writeFileAtomicSync('nowrite', 'test') }) throws('ENOCHOWN', 'Chown failures propagate', function () { writeFileAtomicSync('nochown', 'test', {chown: {uid: 100, gid: 100}}) }) noexception('No attempt to chown when no uid/gid passed in', function () { writeFileAtomicSync('nochown', 'test') }) throws('ENORENAME', 'Rename errors propagate', function () { writeFileAtomicSync('norename', 'test') }) throws('ENORENAME', 'Failure to unlink the temp file does not clobber the original error', function () { writeFileAtomicSync('norename nounlink', 'test') }) }) npm_3.5.2.orig/scripts/clean-old.sh0000755000000000000000000001024212631326456015355 0ustar 00000000000000#!/bin/bash # look for old 0.x cruft, 
and get rid of it. # Should already be sitting in the npm folder. # This doesn't have to be quite as cross-platform as install.sh. # There are some bash-isms, because maintaining *two* # fully-portable posix/bourne sh scripts is too much for # one project with a sane maintainer. # If readlink isn't available, then this is just too tricky. # However, greadlink is fine, so Solaris can join the party, too. readlink="readlink" which $readlink >/dev/null 2>/dev/null if [ $? -ne 0 ]; then readlink="greadlink" which $readlink >/dev/null 2>/dev/null if [ $? -ne 0 ]; then echo "Can't find the readlink or greadlink command. Aborting." exit 1 fi fi if [ "x$npm_config_prefix" != "x" ]; then PREFIXES=$npm_config_prefix else node="$NODE" if [ "x$node" = "x" ]; then node=`which node` fi if [ "x$node" = "x" ]; then echo "Can't find node to determine prefix. Aborting." exit 1 fi PREFIX=`dirname $node` PREFIX=`dirname $PREFIX` echo "cleanup prefix=$PREFIX" PREFIXES=$PREFIX altprefix=`"$node" -e process.installPrefix` if [ "x$altprefix" != "x" ] && [ "x$altprefix" != "x$PREFIX" ]; then echo "altprefix=$altprefix" PREFIXES="$PREFIX $altprefix" fi fi # now prefix is where npm would be rooted by default # go hunting. packages= for prefix in $PREFIXES; do packages="$packages "`ls "$prefix"/lib/node/.npm 2>/dev/null | grep -v .cache` done packages=`echo $packages` filelist=() fid=0 for prefix in $PREFIXES; do # remove any links into the .npm dir, or links to # version-named shims/symlinks. for folder in share/man bin lib/node; do find $prefix/$folder -type l | while read file; do target=`$readlink $file | grep '/\.npm/'` if [ "x$target" != "x" ]; then # found one! filelist[$fid]="$file" let 'fid++' # also remove any symlinks to this file. base=`basename "$file"` base=`echo "$base" | awk -F@ '{print $1}'` if [ "x$base" != "x" ]; then find "`dirname $file`" -type l -name "$base"'*' \ | while read l; do target=`$readlink "$l" | grep "$base"` if [ "x$target" != "x" ]; then filelist[$fid]="$l" let 'fid++' fi done fi fi done # Scour for shim files. These are relics of 0.2 npm installs. # note: grep -r is not portable. find $prefix/$folder -type f \ | xargs grep -sl '// generated by npm' \ | while read file; do filelist[$fid]="$file" let 'fid++' done done # now remove the package modules, and the .npm folder itself. if [ "x$packages" != "x" ]; then for pkg in $packages; do filelist[$fid]="$prefix/lib/node/$pkg" let 'fid++' for i in $prefix/lib/node/$pkg\@*; do filelist[$fid]="$i" let 'fid++' done done fi for folder in lib/node/.npm lib/npm share/npm; do if [ -d $prefix/$folder ]; then filelist[$fid]="$prefix/$folder" let 'fid++' fi done done # now actually clean, but only if there's anything TO clean if [ "${#filelist[@]}" -gt 0 ]; then echo "" echo "This script will find and eliminate any shims, symbolic" echo "links, and other cruft that was installed by npm 0.x." echo "" if [ "x$packages" != "x" ]; then echo "The following packages appear to have been installed with" echo "an old version of npm, and will be removed forcibly:" for pkg in $packages; do echo " $pkg" done echo "Make a note of these. You may want to install them" echo "with npm 1.0 when this process is completed." echo "" fi OK= if [ "x$1" = "x-y" ]; then OK="yes" fi while [ "$OK" != "y" ] && [ "$OK" != "yes" ] && [ "$OK" != "no" ]; do echo "Is this OK?"
echo " enter 'yes' or 'no'" echo " or 'show' to see a list of files " read OK if [ "x$OK" = "xshow" ] || [ "x$OK" = "xs" ]; then for i in "${filelist[@]}"; do echo "$i" done fi done if [ "$OK" = "no" ]; then echo "Aborting" exit 1 fi for i in "${filelist[@]}"; do rm -rf "$i" done fi echo "" echo 'All clean!' exit 0 npm_3.5.2.orig/scripts/doc-build.sh0000755000000000000000000000627412631326456015373 0ustar 00000000000000#!/usr/bin/env bash if [[ $DEBUG != "" ]]; then set -x fi set -o errexit set -o pipefail if ! [ -x node_modules/.bin/marked-man ]; then ps=0 if [ -f .building_marked-man ]; then pid=$(cat .building_marked-man) ps=$(ps -p $pid | grep $pid | wc -l) || true fi if [ -f .building_marked-man ] && [ $ps != 0 ]; then while [ -f .building_marked-man ]; do sleep 1 done else # a race to see which make process will be the one to install marked-man echo $$ > .building_marked-man sleep 1 if [ $(cat .building_marked-man) == $$ ]; then make node_modules/.bin/marked-man rm .building_marked-man else while [ -f .building_marked-man ]; do sleep 1 done fi fi fi if ! [ -x node_modules/.bin/marked ]; then ps=0 if [ -f .building_marked ]; then pid=$(cat .building_marked) ps=$(ps -p $pid | grep $pid | wc -l) || true fi if [ -f .building_marked ] && [ $ps != 0 ]; then while [ -f .building_marked ]; do sleep 1 done else # a race to see which make process will be the one to install marked echo $$ > .building_marked sleep 1 if [ $(cat .building_marked) == $$ ]; then make node_modules/.bin/marked rm .building_marked else while [ -f .building_marked ]; do sleep 1 done fi fi fi src=$1 dest=$2 name=$(basename ${src%.*}) date=$(date -u +'%Y-%m-%d %H:%M:%S') version=$(node cli.js -v) mkdir -p $(dirname $dest) html_replace_tokens () { local url=$1 sed "s|@NAME@|$name|g" \ | sed "s|@DATE@|$date|g" \ | sed "s|@URL@|$url|g" \ | sed "s|@VERSION@|$version|g" \ | perl -p -e 's/]*)>([^\(]*\([0-9]\)) -- (.*?)<\/h1>/

      \2<\/h1>

      \3<\/p>/g' \ | perl -p -e 's/npm-npm/npm/g' \ | perl -p -e 's/([^"-])(npm-)?README(?!\.html)(\(1\))?/\1README<\/a>/g' \ | perl -p -e 's/<a href="[^"]+README.html">README<\/a><\/title>/<title>README<\/title>/g' \ | perl -p -e 's/([^"-])([^\(> ]+)(\(1\))/\1<a href="..\/cli\/\2.html">\2\3<\/a>/g' \ | perl -p -e 's/([^"-])([^\(> ]+)(\(3\))/\1<a href="..\/api\/\2.html">\2\3<\/a>/g' \ | perl -p -e 's/([^"-])([^\(> ]+)(\(5\))/\1<a href="..\/files\/\2.html">\2\3<\/a>/g' \ | perl -p -e 's/([^"-])([^\(> ]+)(\(7\))/\1<a href="..\/misc\/\2.html">\2\3<\/a>/g' \ | perl -p -e 's/\([1357]\)<\/a><\/h1>/<\/a><\/h1>/g' \ | (if [ $(basename $(dirname $dest)) == "doc" ]; then perl -p -e 's/ href="\.\.\// href="/g' else cat fi) } man_replace_tokens () { sed "s|@VERSION@|$version|g" \ | perl -p -e 's/(npm\\-)?([a-zA-Z\\\.\-]*)\(1\)/npm help \2/g' \ | perl -p -e 's/(npm\\-)?([a-zA-Z\\\.\-]*)\(([57])\)/npm help \3 \2/g' \ | perl -p -e 's/(npm\\-)?([a-zA-Z\\\.\-]*)\(3\)/npm apihelp \2/g' \ | perl -p -e 's/npm\(1\)/npm help npm/g' \ | perl -p -e 's/npm\(3\)/npm apihelp npm/g' } case $dest in *.[1357]) ./node_modules/.bin/marked-man --roff $src \ | man_replace_tokens > $dest exit $? ;; *.html) url=${dest/html\//} (cat html/dochead.html && \ cat $src | ./node_modules/.bin/marked && cat html/docfoot.html)\ | html_replace_tokens $url \ > $dest exit $? ;; *) echo "Invalid destination type: $dest" >&2 exit 1 ;; esac ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/scripts/index-build.js���������������������������������������������������������������0000755�0000000�0000000�00000003240�12631326456�015725� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node var fs = require('fs') var path = require('path') var root = path.resolve(__dirname, '..') var glob = require('glob') var conversion = { 'cli': 1, 'api': 3, 'files': 5, 'misc': 7 } glob(root + '/{README.md,doc/*/*.md}', function (er, files) { if (er) throw er output(files.map(function (f) { var b = path.basename(f) if (b === 'README.md') return [0, b] if (b === 'index.md') return null var s = conversion[path.basename(path.dirname(f))] return [s, f] }).filter(function (f) { return f }).sort(function (a, b) { return (a[0] === b[0]) ? (path.basename(a[1]) === 'npm.md' ? -1 : path.basename(b[1]) === 'npm.md' ? 1 : a[1] > b[1] ? 
1 : -1) : a[0] - b[0] })) }) function output (files) { console.log( 'npm-index(7) -- Index of all npm documentation\n' + '==============================================\n') writeLines(files, 0) writeLines(files, 1, 'Command Line Documentation', 'Using npm on the command line') writeLines(files, 3, 'API Documentation', 'Using npm in your Node programs') writeLines(files, 5, 'Files', 'File system structures npm uses') writeLines(files, 7, 'Misc', 'Various other bits and bobs') } function writeLines (files, sxn, heading, desc) { if (heading) { console.log('## %s\n\n%s\n', heading, desc) } files.filter(function (f) { return f[0] === sxn }).forEach(writeLine) } function writeLine (sd) { var sxn = sd[0] || 1 var doc = sd[1] var d = path.basename(doc, '.md') var content = fs.readFileSync(doc, 'utf8').split('\n')[0].split('-- ')[1] console.log('### %s(%d)\n', d, sxn) console.log(content + '\n') } ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/scripts/install.sh�������������������������������������������������������������������0000755�0000000�0000000�00000014137�12631326456�015174� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh # A word about this shell script: # # It must work everywhere, including on systems that lack # a /bin/bash, map 'sh' to ksh, ksh97, bash, ash, or zsh, # and potentially have either a posix shell or bourne # shell living at /bin/sh. # # See this helpful document on writing portable shell scripts: # http://www.gnu.org/s/hello/manual/autoconf/Portable-Shell.html # # The only shell it won't ever work on is cmd.exe. if [ "x$0" = "xsh" ]; then # run as curl | sh # on some systems, you can just do cat>npm-install.sh # which is a bit cuter. But on others, &1 is already closed, # so catting to another script file won't do anything. # Follow Location: headers, and fail on errors curl -f -L -s https://www.npmjs.org/install.sh > npm-install-$$.sh ret=$? if [ $ret -eq 0 ]; then (exit 0) else rm npm-install-$$.sh echo "Failed to download script" >&2 exit $ret fi sh npm-install-$$.sh ret=$? rm npm-install-$$.sh exit $ret fi # See what "npm_config_*" things there are in the env, # and make them permanent. # If this fails, it's not such a big deal. configures="`env | grep 'npm_config_' | sed -e 's|^npm_config_||g'`" npm_config_loglevel="error" if [ "x$npm_debug" = "x" ]; then (exit 0) else echo "Running in debug mode." echo "Note that this requires bash or zsh." set -o xtrace set -o pipefail npm_config_loglevel="verbose" fi export npm_config_loglevel # make sure that node exists node=`which node 2>&1` ret=$? if [ $ret -eq 0 ] && [ -x "$node" ]; then (exit 0) else echo "npm cannot be installed without node.js." >&2 echo "Install node first, and then try again." >&2 echo "" >&2 echo "Maybe node is installed, but not in the PATH?" >&2 echo "Note that running as sudo can change envs." 
>&2 echo "" echo "PATH=$PATH" >&2 exit $ret fi # set the temp dir TMP="${TMPDIR}" if [ "x$TMP" = "x" ]; then TMP="/tmp" fi TMP="${TMP}/npm.$$" rm -rf "$TMP" || true mkdir "$TMP" if [ $? -ne 0 ]; then echo "failed to mkdir $TMP" >&2 exit 1 fi BACK="$PWD" ret=0 tar="${TAR}" if [ -z "$tar" ]; then tar="${npm_config_tar}" fi if [ -z "$tar" ]; then tar=`which tar 2>&1` ret=$? fi if [ $ret -eq 0 ] && [ -x "$tar" ]; then echo "tar=$tar" echo "version:" $tar --version ret=$? fi if [ $ret -eq 0 ]; then (exit 0) else echo "No suitable tar program found." exit 1 fi # Try to find a suitable make # If the MAKE environment var is set, use that. # otherwise, try to find gmake, and then make. # If no make is found, then just execute the necessary commands. # XXX For some reason, make is building all the docs every time. This # is an annoying source of bugs. Figure out why this happens. MAKE=NOMAKE if [ "x$MAKE" = "x" ]; then make=`which gmake 2>&1` if [ $? -eq 0 ] && [ -x "$make" ]; then (exit 0) else make=`which make 2>&1` if [ $? -eq 0 ] && [ -x "$make" ]; then (exit 0) else make=NOMAKE fi fi else make="$MAKE" fi if [ -x "$make" ]; then (exit 0) else # echo "Installing without make. This may fail." >&2 make=NOMAKE fi # If there's no bash, then don't even try to clean if [ -x "/bin/bash" ]; then (exit 0) else clean="no" fi node_version=`"$node" --version 2>&1` ret=$? if [ $ret -ne 0 ]; then echo "You need node to run this program." >&2 echo "node --version reports: $node_version" >&2 echo "with exit code = $ret" >&2 echo "Please install node before continuing." >&2 exit $ret fi t="${npm_install}" if [ -z "$t" ]; then # switch based on node version. # note that we can only use strict sh-compatible patterns here. case $node_version in 0.[01234567].* | v0.[01234567].*) echo "You are using an outdated and unsupported version of" >&2 echo "node ($node_version). Please update node and try again." >&2 exit 99 ;; *) echo "install npm@latest" t="latest" ;; esac fi # need to echo "" after, because Posix sed doesn't treat EOF # as an implied end of line. url=`(curl -SsL https://registry.npmjs.org/npm/$t; echo "") \ | sed -e 's/^.*tarball":"//' \ | sed -e 's/".*$//'` ret=$? if [ "x$url" = "x" ]; then ret=125 # try without the -e arg to sed. url=`(curl -SsL https://registry.npmjs.org/npm/$t; echo "") \ | sed 's/^.*tarball":"//' \ | sed 's/".*$//'` ret=$? if [ "x$url" = "x" ]; then ret=125 fi fi if [ $ret -ne 0 ]; then echo "Failed to get tarball url for npm/$t" >&2 exit $ret fi echo "fetching: $url" >&2 cd "$TMP" \ && curl -SsL "$url" \ | $tar -xzf - \ && cd "$TMP"/* \ && (ver=`"$node" bin/read-package-json.js package.json version` isnpm10=0 if [ $ret -eq 0 ]; then if [ -d node_modules ]; then if "$node" node_modules/semver/bin/semver -v "$ver" -r "1" then isnpm10=1 fi else if "$node" bin/semver -v "$ver" -r ">=1.0"; then isnpm10=1 fi fi fi ret=0 if [ $isnpm10 -eq 1 ] && [ -f "scripts/clean-old.sh" ]; then if [ "x$skipclean" = "x" ]; then (exit 0) else clean=no fi if [ "x$clean" = "xno" ] \ || [ "x$clean" = "xn" ]; then echo "Skipping 0.x cruft clean" >&2 ret=0 elif [ "x$clean" = "xy" ] || [ "x$clean" = "xyes" ]; then NODE="$node" /bin/bash "scripts/clean-old.sh" "-y" ret=$? else NODE="$node" /bin/bash "scripts/clean-old.sh" </dev/tty ret=$? fi fi if [ $ret -ne 0 ]; then echo "Aborted 0.x cleanup. Exiting." 
>&2 exit $ret fi) \ && (if [ "x$configures" = "x" ]; then (exit 0) else echo "./configure $configures" echo "$configures" > npmrc fi) \ && (if [ "$make" = "NOMAKE" ]; then (exit 0) elif "$make" uninstall install; then (exit 0) else make="NOMAKE" fi if [ "$make" = "NOMAKE" ]; then "$node" cli.js rm npm -gf "$node" cli.js install -gf fi) \ && cd "$BACK" \ && rm -rf "$TMP" \ && echo "It worked" ret=$? if [ $ret -ne 0 ]; then echo "It failed" >&2 fi exit $ret ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/scripts/publish-tag.js���������������������������������������������������������������0000644�0000000�0000000�00000000227�12631326456�015737� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var semver = require('semver') var version = semver.parse(require('../package.json').version) console.log('v%s.%s-next', version.major, version.minor) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/scripts/release.sh�������������������������������������������������������������������0000644�0000000�0000000�00000001221�12631326456�015131� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/bash # script for creating a zip and tarball for inclusion in node unset CDPATH set -e rm -rf release *.tgz || true mkdir release node ./cli.js pack --loglevel error >/dev/null mv *.tgz release cd release tar xzf *.tgz mkdir node_modules mv package node_modules/npm # make the zip for windows users cp node_modules/npm/bin/*.cmd . zipname=npm-$(node ../cli.js -v).zip zip -q -9 -r -X "$zipname" *.cmd node_modules # make the tar for node's deps cd node_modules tarname=npm-$(node ../../cli.js -v).tgz tar czf "$tarname" npm cd .. mv "node_modules/$tarname" . 
rm -rf *.cmd rm -rf node_modules echo "release/$tarname" echo "release/$zipname" �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/scripts/relocate.sh������������������������������������������������������������������0000755�0000000�0000000�00000001231�12631326456�015313� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/bash # Change the cli shebang to point at the specified node # Useful for when the program is moved around after install. # Also used by the default 'make install' in node to point # npm at the newly installed node, rather than the first one # in the PATH, which would be the default otherwise. # bash /path/to/npm/scripts/relocate.sh $nodepath # If $nodepath is blank, then it'll use /usr/bin/env dir="$(dirname "$(dirname "$0")")" cli="$dir"/bin/npm-cli.js tmp="$cli".tmp node="$1" if [ "x$node" = "x" ]; then node="/usr/bin/env node" fi node="#!$node" sed -e 1d "$cli" > "$tmp" echo "$node" > "$cli" cat "$tmp" >> "$cli" rm "$tmp" chmod ogu+x $cli �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/scripts/update-authors.sh������������������������������������������������������������0000755�0000000�0000000�00000000267�12631326456�016472� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/bin/sh git log --reverse --format='%aN <%aE>' | perl -wnE ' BEGIN { say "# Authors sorted by whether or not they\x27re me"; } print $seen{$_} = $_ unless $seen{$_} ' > AUTHORS �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/common-tap.js�������������������������������������������������������������������0000644�0000000�0000000�00000005007�12631326456�015063� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������// cheesy hackaround for test deps (read: nock) 
npm_3.5.2.orig/test/common-tap.js
// cheesy hackaround for test deps (read: nock) that rely on setImmediate
if (!global.setImmediate || !require('timers').setImmediate) {
  require('timers').setImmediate = global.setImmediate = function () {
    var args = [arguments[0], 0].concat([].slice.call(arguments, 1))
    setTimeout.apply(this, args)
  }
}

var spawn = require('child_process').spawn
var path = require('path')

var port = exports.port = 1337
exports.registry = 'http://localhost:' + port
process.env.npm_config_loglevel = 'error'

var npm_config_cache = path.resolve(__dirname, 'npm_cache')
process.env.npm_config_cache = exports.npm_config_cache = npm_config_cache
process.env.npm_config_userconfig = exports.npm_config_userconfig =
  path.join(__dirname, 'fixtures', 'config', 'userconfig')
process.env.npm_config_globalconfig = exports.npm_config_globalconfig =
  path.join(__dirname, 'fixtures', 'config', 'globalconfig')
process.env.random_env_var = 'foo'

var bin = exports.bin = require.resolve('../bin/npm-cli.js')
var chain = require('slide').chain
var once = require('once')

exports.npm = function (cmd, opts, cb) {
  cb = once(cb)
  cmd = [bin].concat(cmd)
  opts = opts || {}

  opts.env = opts.env || process.env
  if (!opts.env.npm_config_cache) {
    opts.env.npm_config_cache = npm_config_cache
  }

  var stdout = ''
  var stderr = ''
  var node = process.execPath
  var child = spawn(node, cmd, opts)

  if (child.stderr) {
    child.stderr.on('data', function (chunk) {
      stderr += chunk
    })
  }

  if (child.stdout) {
    child.stdout.on('data', function (chunk) {
      stdout += chunk
    })
  }

  child.on('error', cb)

  child.on('close', function (code) {
    cb(null, code, stdout, stderr)
  })
  return child
}

exports.makeGitRepo = function (params, cb) {
  // git must be called after npm.load because it uses config
  var git = require('../lib/utils/git.js')

  var root = params.path || process.cwd()
  var user = params.user || 'PhantomFaker'
  var email = params.email || 'nope@not.real'
  var added = params.added || ['package.json']
  var message = params.message || 'stub repo'

  var opts = { cwd: root, env: { PATH: process.env.PATH } }
  var commands = [
    git.chainableExec(['init'], opts),
    git.chainableExec(['config', 'user.name', user], opts),
    git.chainableExec(['config', 'user.email', email], opts),
    git.chainableExec(['add'].concat(added), opts),
    git.chainableExec(['commit', '-m', message], opts)
  ]

  if (Array.isArray(params.commands)) {
    commands = commands.concat(params.commands)
  }

  chain(commands, cb)
}
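A hedged usage sketch for the helpers above: the test name, arguments, and file location (test/tap/example.js) are invented for illustration, but the callback shape matches common.npm as written.

// hypothetical test/tap/example.js
var test = require('tap').test
var common = require('../common-tap.js')

test('npm --version exits cleanly', function (t) {
  common.npm(['--version'], {}, function (er, code, stdout, stderr) {
    if (er) throw er
    t.equal(code, 0, 'exit status is 0')
    t.ok(stdout.trim(), 'printed a version string')
    t.end()
  })
})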
npm_3.5.2.orig/test/disabled/
npm_3.5.2.orig/test/fixtures/
npm_3.5.2.orig/test/packages/
npm_3.5.2.orig/test/run.js
// Everything in this file uses child processes, because we're
// testing a command line utility.

var chain = require('slide').chain
var child_process = require('child_process')
var path = require('path')
var testdir = __dirname
var fs = require('graceful-fs')
var npmpkg = path.dirname(testdir)
var npmcli = path.resolve(npmpkg, 'bin', 'npm-cli.js')

var temp = process.env.TMPDIR ||
           process.env.TMP ||
           process.env.TEMP ||
           (process.platform === 'win32' ? 'c:\\windows\\temp' : '/tmp')

temp = path.resolve(temp, 'npm-test-' + process.pid)

var root = path.resolve(temp, 'root')
var cache = path.resolve(temp, 'npm_cache')

var failures = 0
var mkdir = require('mkdirp')
var rimraf = require('rimraf')

var pathEnvSplit = process.platform === 'win32' ? ';' : ':'
var pathEnv = process.env.PATH.split(pathEnvSplit)
var npmPath = process.platform === 'win32' ? root : path.join(root, 'bin')

pathEnv.unshift(npmPath, path.join(root, 'node_modules', '.bin'))

// lastly, make sure that we get the same node that is being used to
// run this script. That's very important, especially when running this
// test file from in the node source folder.
pathEnv.unshift(path.dirname(process.execPath))

// the env for all the test installs etc.
var env = {}
Object.keys(process.env).forEach(function (i) {
  env[i] = process.env[i]
})
env.npm_config_prefix = root
env.npm_config_color = 'always'
env.npm_config_global = 'true'
// have to set this to false, or it'll try to test itself forever
env.npm_config_npat = 'false'
env.PATH = pathEnv.join(pathEnvSplit)
env.NODE_PATH = path.join(root, 'node_modules')
env.npm_config_cache = cache
env.npm_config_user_agent = ''

function cleanup (cb) {
  if (failures !== 0) return
  rimraf(root, function (er) {
    if (er) cb(er)
    mkdir(root, parseInt('0755', 8), cb)
  })
}

function prefix (content, pref) {
  return pref + (content.trim().split(/\r?\n/).join('\n' + pref))
}

var execCount = 0
function exec (cmd, cwd, shouldFail, cb) {
  if (typeof shouldFail === 'function') {
    cb = shouldFail
    shouldFail = false
  }
  console.error('\n+' + cmd + (shouldFail ? ' (expect failure)' : ''))
  // special: replace 'node' with the current execPath,
  // and 'npm' with the thing we installed.
  var cmdShow = cmd
  var npmReplace = path.resolve(npmPath, 'npm')
  var nodeReplace = process.execPath
  if (process.platform === 'win32') {
    npmReplace = '"' + npmReplace + '"'
    nodeReplace = '"' + nodeReplace + '"'
  }
  cmd = cmd.replace(/^npm /, npmReplace + ' ')
  cmd = cmd.replace(/^node /, nodeReplace + ' ')
  console.error('$$$$$$ cd %s; PATH=%s %s', cwd, env.PATH, cmd)

  child_process.exec(cmd, {cwd: cwd, env: env}, function (er, stdout, stderr) {
    console.error('$$$$$$ after command', cmd, cwd)
    if (stdout) {
      console.error(prefix(stdout, ' 1> '))
    }
    if (stderr) {
      console.error(prefix(stderr, ' 2> '))
    }

    execCount++
    if (!shouldFail && !er || shouldFail && er) {
      // stdout = (''+stdout).trim()
      console.log('ok ' + execCount + ' ' + cmdShow)
      return cb()
    } else {
      console.log('not ok ' + execCount + ' ' + cmdShow)
      cb(new Error('failed ' + cmdShow))
    }
  })
}

function execChain (cmds, cb) {
  chain(cmds.map(function (args) {
    return [exec].concat(args)
  }), cb)
}

function flatten (arr) {
  return arr.reduce(function (l, r) {
    return l.concat(r)
  }, [])
}

function setup (cb) {
  cleanup(function (er) {
    if (er) return cb(er)
    exec('node \'' + npmcli + '\' install \'' + npmpkg + '\'', root, false, cb)
  })
}

function main (cb) {
  console.log('# testing in %s', temp)
  console.log('# global prefix = %s', root)

  failures = 0

  process.chdir(testdir)
  var base = path.resolve(root, path.join('lib', 'node_modules'))

  // get the list of packages
  var packages = fs.readdirSync(path.resolve(testdir, 'packages'))
  packages = packages.filter(function (p) {
    return p && !p.match(/^\./)
  })

  installAllThenTestAll()

  function installAllThenTestAll () {
    var packagesToRm = packages.slice(0)
    if (process.platform !== 'win32') {
      // Windows can't handle npm rm npm due to file-in-use issues.
      packagesToRm.push('npm')
    }

    chain(
      [
        setup,
        [exec, 'npm install ' + npmpkg, testdir],
        [execChain, packages.map(function (p) {
          return [ 'npm install packages/' + p, testdir ]
        })],
        [execChain, packages.map(function (p) {
          return [ 'npm test -ddd', path.resolve(base, p) ]
        })],
        [execChain, packagesToRm.map(function (p) {
          return [ 'npm rm ' + p, root ]
        })],
        installAndTestEach
      ],
      cb
    )
  }

  function installAndTestEach (cb) {
    var thingsToChain = [
      setup,
      [execChain, flatten(packages.map(function (p) {
        return [
          ['npm install packages/' + p, testdir],
          ['npm test', path.resolve(base, p)],
          ['npm rm ' + p, root]
        ]
      }))]
    ]

    if (process.platform !== 'win32') {
      // Windows can't handle npm rm npm due to file-in-use issues.
      thingsToChain.push([exec, 'npm rm npm', testdir])
    }

    chain(thingsToChain, cb)
  }
}

main(function (er) {
  console.log('1..' + execCount)
  if (er) throw er
})
npm_3.5.2.orig/test/tap/
npm_3.5.2.orig/test/update-test.sh
#!/bin/bash

SELF_PATH="$0"
if [ "${SELF_PATH:0:1}" != "." ] && [ "${SELF_PATH:0:1}" != "/" ]; then
  SELF_PATH=./"$SELF_PATH"
fi
SELF_PATH=$( cd -P -- "$(dirname -- "$SELF_PATH")" \
  && pwd -P \
) && SELF_PATH=$SELF_PATH/$(basename -- "$0")

# resolve symlinks
while [ -h "$SELF_PATH" ]; do
  DIR=$(dirname -- "$SELF_PATH")
  SYM=$(readlink -- "$SELF_PATH")
  SELF_PATH=$( cd -- "$DIR" \
    && cd -- $(dirname -- "$SYM") \
    && pwd \
  )/$(basename -- "$SYM")
done

DIR=$( dirname -- "$SELF_PATH" )

export npm_config_root=$DIR/root
export npm_config_binroot=$DIR/bin

rm -rf $DIR/{root,bin}
mkdir -p $DIR/root
mkdir -p $DIR/bin

npm ls installed 2>/dev/null | grep -v npm | awk '{print $1}' | xargs npm rm &>/dev/null

npm install \
  base64@1.0.0 \
  eyes@0.1.1 \
  vows@0.2.5 \
  websocket-server@1.0.5 &>/dev/null

npm install ./test/packages/blerg &>/dev/null
npm install vows@0.3.0 &>/dev/null

echo ""
echo "##"
echo "## starting update"
echo "##"
echo ""

npm update

echo ""
echo "##"
echo "## update done, all should be 'latest'"
echo "##"
echo ""

list=$( npm ls installed remote 2>/dev/null )
echo "$list"

notlatest=$( echo "$list" | grep -v latest )
if [ "$notlatest" != "" ]; then
  echo "Failed: not latest"
  echo "$notlatest"
else
  echo "ok"
fi
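The subtle part of exec() in run.js above is the prefix substitution: a bare "npm" or "node" at the start of a command is rewritten to an absolute path inside the throwaway test prefix, so the suite never picks up a globally installed npm. A minimal sketch of that rewrite, with made-up paths:

// sketch of the substitution exec() applies; paths are invented examples
var path = require('path')
var npmPath = '/tmp/npm-test-1234/root/bin'      // assumed sandbox prefix
var npmReplace = path.resolve(npmPath, 'npm')

var cmd = 'npm install packages/blerg'
cmd = cmd.replace(/^npm /, npmReplace + ' ')
console.log(cmd)  // -> /tmp/npm-test-1234/root/bin/npm install packages/blerg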
npm_3.5.2.orig/test/disabled/bundlerecurs/
npm_3.5.2.orig/test/disabled/change-bin-1/
npm_3.5.2.orig/test/disabled/change-bin-2/
npm_3.5.2.orig/test/disabled/failer/
npm_3.5.2.orig/test/disabled/fast/
npm_3.5.2.orig/test/disabled/package-bar/
npm_3.5.2.orig/test/disabled/package-config/
npm_3.5.2.orig/test/disabled/package-foo/
npm_3.5.2.orig/test/disabled/slow/
npm_3.5.2.orig/test/disabled/bundlerecurs/package.json
{ "name" : "bundletest"
, "version" : "1.0.0"
, "dependencies" : { "bundletest" : "*" }
}
npm_3.5.2.orig/test/disabled/change-bin-1/bin/
npm_3.5.2.orig/test/disabled/change-bin-1/package.json
{"name":"npm-test-change-bin"
,"version":"1.2.3"
,"directories":{"bin":"./bin"}}
npm_3.5.2.orig/test/disabled/change-bin-1/bin/foo
#!/bin/bash
echo "foo"
npm_3.5.2.orig/test/disabled/change-bin-2/bin/
npm_3.5.2.orig/test/disabled/change-bin-2/package.json
{"name":"npm-test-change-bin"
,"version":"2.3.4"
,"directories":{"bin":"./bin"}}
npm_3.5.2.orig/test/disabled/change-bin-2/bin/bar
#!/bin/bash
echo "foo"
npm_3.5.2.orig/test/disabled/failer/package.json
{ "name" : "npm-test-failer"
, "version" : "9999.999.99"
, "dependencies" : { "base64" : "*" }
, "scripts" :
  { "install" : "exit 1"
  , "test": "echo 'This is where the test output would go'; echo 'more test output'; echo 'MOAR MOAR MoAR'; exit 1"
  }
}
npm_3.5.2.orig/test/disabled/fast/package.json
{ "name" : "fast"
, "description" : "does nothing, and not very fast"
, "version" : "1.2.3"
, "scripts" :
  { "preinstall" : "sleep 1 && echo fast 1 $(date +%s) && echo fast 2"
  , "install" : "sleep 1 && echo fast 2 $(date +%s) && echo fast 3"
  , "postinstall" : "sleep 1 && echo fast 3 $(date +%s) && echo fast 4"
  }
}
npm_3.5.2.orig/test/disabled/package-bar/package.json
{
  "name": "package-bar",
  "version": "0.5.0",
  "dependencies": {
    "package-foo": "*"
  }
}
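The fast fixture above (and the slow fixture below) exists to make lifecycle ordering observable: npm runs preinstall, then install, then postinstall, and the timestamped echoes show the sequence. A hypothetical way to watch that order from the npm source root; the fixture path is taken from this listing, but the snippet itself is not part of the suite:

// illustrative only: install the fixture and inspect the script output
var exec = require('child_process').exec
exec('npm install ./test/disabled/fast', function (er, stdout) {
  if (er) throw er
  // expect "fast 1", "fast 2", "fast 3" timestamps in increasing order
  console.log(stdout)
})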
npm_3.5.2.orig/test/disabled/package-config/package.json
{"name":"package-config"
,"version":"1.2.3"
,"config":{"foo":"bar"}
,"scripts":{"test":"./test.js"}}
npm_3.5.2.orig/test/disabled/package-config/test.js
#!/usr/bin/env node

var env = process.env
var orig = require(process.env.npm_package_name + '/package.json').config
var assert = require('assert')

console.log(
  'Before running this test, do:\n' +
  '  npm config set package-config:foo boo\n' +
  "or else it's about to fail."
)
assert.equal(env.npm_package_config_foo, 'boo', 'foo != boo')
assert.equal(orig.foo, 'bar', 'original foo != bar')
assert.equal(env['npm_config_package-config:foo'], 'boo', 'package-config:foo != boo')

console.log({
  foo: env.npm_package_config_foo,
  orig_foo: orig.foo,
  'package-config:foo': env['npm_config_package-config:foo']
})
npm_3.5.2.orig/test/disabled/package-foo/package.json
{
  "name": "package-foo",
  "version": "0.5.0"
}
npm_3.5.2.orig/test/disabled/slow/package.json
{ "name" : "slow"
, "description" : "just like fast, but even slower"
, "version" : "1.2.3"
, "scripts" :
  { "preinstall" : "sleep 1 && echo slow 1 $(date +%s) && sleep 1 && echo slow 2 $(date +%s)"
  , "install" : "sleep 1 && echo slow 2 $(date +%s) && sleep 1 && echo slow 3 $(date +%s)"
  , "postinstall" : "sleep 1 && echo slow 3 $(date +%s) && sleep 1 && echo slow 4 $(date +%s)"
  }
}
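For context on the assertions in package-config/test.js above: values from a package.json "config" object reach that package's scripts as npm_package_config_* environment variables, and a user-level `npm config set <name>:<key> <value>` overrides them. A minimal sketch of reading those variables from inside a script of that package; the printed values assume the `npm config set package-config:foo boo` the test asks for:

// run as an npm script of the package-config package (e.g. via npm test)
var env = process.env
console.log(env.npm_package_config_foo)            // 'boo': the user override wins
console.log(env['npm_config_package-config:foo'])  // 'boo': the raw scoped config entry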
npm_3.5.2.orig/test/fixtures/config/
npm_3.5.2.orig/test/fixtures/forked-underscore-1.5.1.tgz  [binary gzip data omitted]
npm_3.5.2.orig/test/fixtures/github-com-BryanDonovan-dummy-npm-bar.git.tar.gz  [binary gzip data omitted]
npm_3.5.2.orig/test/fixtures/github-com-BryanDonovan-dummy-npm-buzz.git.tar.gz  [binary gzip data omitted]
npm_3.5.2.orig/test/fixtures/github-com-BryanDonovan-dummy-npm-foo.git.tar.gz  [binary gzip data omitted]
npm_3.5.2.orig/test/fixtures/github-com-BryanDonovan-npm-git-test.git.tar.gz  [binary gzip data omitted]
npm_3.5.2.orig/test/fixtures/gitignore-and-npmignore-2.tar  [tar archive; its text entries follow]
npm-test-gitignore/.npmignore
bar
npm-test-gitignore/bar  [empty file]
npm-test-gitignore/foo  [empty file]
npm-test-gitignore/package.json
{
  "name": "npm-test-gitignore",
  "version": "2.0.0",
  "readme": "ERROR: No README data found!",
  "_id": "npm-test-gitignore@2.0.0",
  "_shasum": "4ead0b5fb8642be1b7f4a9136522c943ddab5cf6",
  "_resolved": "file:gitignore-and-npmignore.tar",
  "_from": "gitignore-and-npmignore.tar"
}
npm_3.5.2.orig/test/fixtures/gitignore-and-npmignore.tar  [tar archive; its text entries follow]
�����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������pax_global_header�����������������������������������������������������������������������������������0000666�0000000�0000000�00000000064�12372717502�0014520�g����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������52 comment=5852d0c1911fe9752aad2391e775cf310ac0b938 ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������package/��������������������������������������������������������������������������������������������0000775�0000000�0000000�00000000000�12372717502�0012457�5����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������package/.gitignore����������������������������������������������������������������������������������0000664�0000000�0000000�00000000004�12372717502�0014441�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������foo ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������package/.npmignore����������������������������������������������������������������������������������0000664�0000000�0000000�00000000004�12372717502�0014450�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������bar 
����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������package/bar�����������������������������������������������������������������������������������������0000664�0000000�0000000�00000000000�12372717502�0013134�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������package/foo�����������������������������������������������������������������������������������������0000664�0000000�0000000�00000000000�12372717502�0013153�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������package/package.json��������������������������������������������������������������������������������0000664�0000000�0000000�00000000071�12372717502�0014743�0����������������������������������������������������������������������������������������������������ustar�00root����������������������������root����������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������{ "name": "npm-test-gitignore", "version": "2.0.0" } 
npm_3.5.2.orig/test/fixtures/gitignore-and-npmignore.tgz: [binary gzip data omitted]
npm_3.5.2.orig/test/fixtures/gitignore.tgz: [binary gzip data omitted]
npm_3.5.2.orig/test/fixtures/npmignore.tgz: [binary gzip data omitted]
npm_3.5.2.orig/test/fixtures/scoped-underscore-1.3.1.tgz: [binary gzip data omitted]

npm_3.5.2.orig/test/fixtures/config/.npmrc:
  just = testing

npm_3.5.2.orig/test/fixtures/config/builtin:
  builtin-config = true

npm_3.5.2.orig/test/fixtures/config/globalconfig:
  package-config:foo = boo
npm_3.5.2.orig/test/fixtures/config/malformed:
  email = """

npm_3.5.2.orig/test/fixtures/config/multi-ca:
  -----BEGIN CERTIFICATE-----
  MIICjTCCAfigAwIBAgIEMaYgRzALBgkqhkiG9w0BAQQwRTELMAkGA1UEBhMCVVMx
  NjA0BgNVBAoTLU5hdGlvbmFsIEFlcm9uYXV0aWNzIGFuZCBTcGFjZSBBZG1pbmlz
  dHJhdGlvbjAmFxE5NjA1MjgxMzQ5MDUrMDgwMBcROTgwNTI4MTM0OTA1KzA4MDAw
  ZzELMAkGA1UEBhMCVVMxNjA0BgNVBAoTLU5hdGlvbmFsIEFlcm9uYXV0aWNzIGFu
  ZCBTcGFjZSBBZG1pbmlzdHJhdGlvbjEgMAkGA1UEBRMCMTYwEwYDVQQDEwxTdGV2
  ZSBTY2hvY2gwWDALBgkqhkiG9w0BAQEDSQAwRgJBALrAwyYdgxmzNP/ts0Uyf6Bp
  miJYktU/w4NG67ULaN4B5CnEz7k57s9o3YY3LecETgQ5iQHmkwlYDTL2fTgVfw0C
  AQOjgaswgagwZAYDVR0ZAQH/BFowWDBWMFQxCzAJBgNVBAYTAlVTMTYwNAYDVQQK
  Ey1OYXRpAAAAACBBZXJvbmF1dGljcyBhbmQgU3BhY2UgQWRtaW5pc3RyYXRpb24x
  DTALBgNVBAMTBENSTDEwFwYDVR0BAQH/BA0wC4AJODMyOTcwODEwMBgGA1UdAgQR
  MA8ECTgzMjk3MDgyM4ACBSAwDQYDVR0KBAYwBAMCBkAwCwYJKoZIhvcNAQEEA4GB
  AH2y1VCEw/A4zaXzSYZJTTUi3uawbbFiS2yxHvgf28+8Js0OHXk1H1w2d6qOHH21
  X82tZXd/0JtG0g1T9usFFBDvYK8O0ebgz/P5ELJnBL2+atObEuJy1ZZ0pBDWINR3
  WkDNLCGiTkCKp0F5EWIrVDwh54NNevkCQRZita+z4IBO
  -----END CERTIFICATE-----
  -----BEGIN CERTIFICATE-----
  AAAAAACCAfigAwIBAgIEMaYgRzALBgkqhkiG9w0BAQQwRTELMAkGA1UEBhMCVVMx
  NjA0BgNVBAoTLU5hdGlvbmFsIEFlcm9uYXV0aWNzIGFuZCBTcGFjZSBBZG1pbmlz
  dHJhdGlvbjAmFxE5NjA1MjgxMzQ5MDUrMDgwMBcROTgwNTI4MTM0OTA1KzA4MDAw
  ZzELMAkGA1UEBhMCVVMxNjA0BgNVBAoTLU5hdGlvbmFsIEFlcm9uYXV0aWNzIGFu
  ZCBTcGFjZSBBZG1pbmlzdHJhdGlvbjEgMAkGA1UEBRMCMTYwEwYDVQQDEwxTdGV2
  ZSBTY2hvY2gwWDALBgkqhkiG9w0BAQEDSQAwRgJBALrAwyYdgxmzNP/ts0Uyf6Bp
  miJYktU/w4NG67ULaN4B5CnEz7k57s9o3YY3LecETgQ5iQHmkwlYDTL2fTgVfw0C
  AQOjgaswgagwZAYDVR0ZAQH/BFowWDBWMFQxCzAJBgNVBAYTAlVTMTYwNAYDVQQK
  Ey1OYXRpb25hbCBBZXJvbmF1dGljcyBhbmQgU3BhY2UgQWRtaW5pc3RyYXRpb24x
  DTALBgNVBAMTBENSTDEwFwYDVR0BAQH/BA0wC4AJODMyOTcwODEwMBgGA1UdAgQR
  MA8ECTgzMjk3MDgyM4ACBSAwDQYDVR0KBAYwBAMCBkAwCwYJKoZIhvcNAQEEA4GB
  AH2y1VCEw/A4zaXzSYZJTTUi3uawbbFiS2yxHvgf28+8Js0OHXk1H1w2d6qOHH21
  X82tZXd/0JtG0g1T9usFFBDvYK8O0ebgz/P5ELJnBL2+atObEuJy1ZZ0pBDWINR3
  WkDNLCGiTkCKp0F5EWIrVDwh54NNevkCQRZita+z4IBO
  -----END CERTIFICATE-----

npm_3.5.2.orig/test/fixtures/config/package.json: (empty)

npm_3.5.2.orig/test/fixtures/config/userconfig:
  email = i@izs.me
  env-thing = ${random_env_var}
  init.author.name = Isaac Z. Schlueter
  init.author.email = i@izs.me
  init.author.url = http://blog.izs.me/
  init.version = 1.2.3
  proprietary-attribs = false
  npm:publishtest = true
  _npmjs.org:couch = https://admin:password@localhost:5984/registry
  npm-www:nocache = 1
  nodedir = /Users/isaacs/dev/js/node-v0.8
  sign-git-tag = true
  message = v%s
  strict-ssl = false
  tmp = ~/.tmp
  _auth = dXNlcm5hbWU6cGFzc3dvcmQ=
  [_token]
  AuthSession = yabba-dabba-doodle
  version = 1
  expires = 1345001053415
  path = /
  httponly = true

npm_3.5.2.orig/test/packages/ (fixture package directories):
  npm-test-array-bin/
  npm-test-blerg/
  npm-test-blerg3/
  npm-test-bundled-git/
  npm-test-dir-bin/
  npm-test-env-reader/
  npm-test-files/
  npm-test-ignore/
  npm-test-ignore-nested-nm/
  npm-test-missing-bindir/
  npm-test-optional-deps/
  npm-test-platform/
  npm-test-platform-all/
  npm-test-private/
  npm-test-shrinkwrap/
  npm-test-test-package/
  npm-test-url-dep/

npm_3.5.2.orig/test/packages/npm-test-array-bin/README:
  just an npm test

npm_3.5.2.orig/test/packages/npm-test-array-bin/bin/ (directory)

npm_3.5.2.orig/test/packages/npm-test-array-bin/package.json:
  {
    "name": "npm-test-array-bin",
    "version": "1.2.5",
    "bin": [ "bin/array-bin" ],
    "scripts": { "test": "node test.js" }
  }
�������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/packages/npm-test-array-bin/test.js���������������������������������������������0000644�0000000�0000000�00000000253�12631326456�021175� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������require('child_process').exec('array-bin', { env: process.env }, function (err) { if (err && err.code) throw new Error('exited badly with code = ' + err.code) } ) �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/packages/npm-test-array-bin/bin/array-bin���������������������������������������0000644�0000000�0000000�00000000046�12631326456�022237� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������#!/usr/bin/env node console.log('ok') ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/packages/npm-test-blerg/README��������������������������������������������������0000644�0000000�0000000�00000000021�12631326456�017740� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������just an npm test 
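What the array-bin fixture exercises: when package.json gives "bin" as an array, each listed file is linked under a command named after its basename, which is why test.js can invoke plain "array-bin". A minimal sketch of that mapping (illustrative, not part of the fixture):

var path = require('path')
// each array entry becomes a command named after the file's basename
var bin = [ 'bin/array-bin' ]
bin.forEach(function (file) {
  console.log(path.basename(file) + ' -> ' + file) // array-bin -> bin/array-bin
})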
npm_3.5.2.orig/test/packages/npm-test-blerg/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-blerg/package.json
{ "name":"npm-test-blerg"
, "version" : "0.0.2"
, "scripts" : { "test" : "node test.js" }
, "publishConfig": {"tag": "foo"} }

npm_3.5.2.orig/test/packages/npm-test-blerg/test.js
var assert = require('assert')
assert.equal(undefined, process.env.npm_config__password, 'password exposed!')
assert.equal(undefined, process.env.npm_config__auth, 'auth exposed!')
assert.equal(undefined, process.env.npm_config__authCrypt, 'authCrypt exposed!')

npm_3.5.2.orig/test/packages/npm-test-blerg3/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-blerg3/package.json
{ "name":"npm-test-blerg3"
, "homepage": "https://github.com/npm/npm/issues/2658"
, "version" : "0.0.0"
, "scripts" : { "test" : "node test.js" } }

npm_3.5.2.orig/test/packages/npm-test-blerg3/test.js
var assert = require('assert')
assert.equal(undefined, process.env.npm_config__password, 'password exposed!')
assert.equal(undefined, process.env.npm_config__auth, 'auth exposed!')
assert.equal(undefined, process.env.npm_config__authCrypt, 'authCrypt exposed!')
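The blerg tests rely on npm exposing its configuration to lifecycle scripts as npm_config_<key> environment variables while withholding keys that begin with "_" (credentials). A quick sketch for inspecting what a script actually sees, assuming it runs under npm (illustrative, not fixture code):

// list the npm_config_* variables visible to this script; the
// credential keys asserted above should never be among them
Object.keys(process.env)
  .filter(function (k) { return k.indexOf('npm_config_') === 0 })
  .sort()
  .forEach(function (k) { console.log(k) })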
npm_3.5.2.orig/test/packages/npm-test-bundled-git/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-bundled-git/minimatch-expected.json
{
  "name": "minimatch",
  "description": "a glob matcher in javascript",
  "version": "0.2.1",
  "repository": {
    "type": "git",
    "url": "git://github.com/isaacs/minimatch.git"
  },
  "main": "minimatch.js",
  "scripts": {
    "test": "tap test"
  },
  "engines": {
    "node": "*"
  },
  "dependencies": {
    "lru-cache": "~1.0.5"
  },
  "devDependencies": {
    "tap": "~0.1.3"
  },
  "licenses" : [
    {
      "type" : "MIT",
      "url" : "http://github.com/isaacs/minimatch/raw/master/LICENSE"
    }
  ]
}

npm_3.5.2.orig/test/packages/npm-test-bundled-git/package.json
{"name":"npm-test-bundled-git"
,"scripts":{"test":"node test.js"}
,"version":"1.2.5"
,"dependencies":{"glob":"git://github.com/isaacs/node-glob.git#npm-test"}
,"bundledDependencies":["glob"]}

npm_3.5.2.orig/test/packages/npm-test-bundled-git/test.js
var a = require('./node_modules/glob/node_modules/minimatch/package.json')
var e = require('./minimatch-expected.json')
var assert = require('assert')
Object.keys(e).forEach(function (key) {
  assert.deepEqual(a[key], e[key], "didn't get expected minimatch/package.json")
})
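The bundled-git fixture pins down bundledDependencies: a dependency named there ships inside the package's own tarball rather than being resolved at install time, which is how the nested minimatch copy checked by test.js gets there. A small sketch, assuming the fixture's dependencies are already installed (illustrative only):

var fs = require('fs')
// every name in bundledDependencies must be present locally so that
// packing can fold node_modules/<name> into the tarball
var bundled = require('./package.json').bundledDependencies // ['glob']
bundled.forEach(function (name) {
  var present = fs.existsSync('./node_modules/' + name)
  console.log(name + ': ' + (present ? 'ready to bundle' : 'not installed yet'))
})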
npm_3.5.2.orig/test/packages/npm-test-dir-bin/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-dir-bin/bin/

npm_3.5.2.orig/test/packages/npm-test-dir-bin/package.json
{ "name":"npm-test-dir-bin"
, "version":"1.2.5"
, "directories": { "bin": "./bin" }
, "scripts": { "test": "node test.js" } }

npm_3.5.2.orig/test/packages/npm-test-dir-bin/test.js
require('child_process').exec('dir-bin', { stdio: 'pipe', env: process.env },
  function (err) {
    if (err && err.code) throw new Error('exited badly with code = ' + err.code)
  }
)

npm_3.5.2.orig/test/packages/npm-test-dir-bin/bin/dir-bin
#!/usr/bin/env node
console.log('ok')
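The dir-bin fixture covers the other way to declare executables: "directories": { "bin": "./bin" } stands for listing every file in that folder. A sketch of that expansion, assuming a flat bin directory (illustrative, not npm's actual implementation):

var fs = require('fs')
var path = require('path')
// hypothetical expansion: each file under ./bin becomes a command
var dir = './bin'
fs.readdirSync(dir).forEach(function (f) {
  console.log(f + ' -> ' + path.join(dir, f)) // dir-bin -> bin/dir-bin
})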
test.js" , "start" : "node test.js" , "restart" : "node test.js" , "foo" : "node test.js" } } ����������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/packages/npm-test-env-reader/test.js��������������������������������������������0000755�0000000�0000000�00000000337�12631326456�021347� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var envs = [] for (var e in process.env) { if (e.match(/npm|^path$/i)) envs.push(e + '=' + process.env[e]) } envs.sort(function (a, b) { return a === b ? 0 : a > b ? -1 : 1 }).forEach(function (e) { console.log(e) }) �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/packages/npm-test-files/.npmignore����������������������������������������������0000644�0000000�0000000�00000000114�12631326456�021071� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������/sub/ignore1 ./sub/include2 ignore3 ./include4 ignoredir1 ignoredir2/ *.tgz ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/packages/npm-test-files/README��������������������������������������������������0000644�0000000�0000000�00000000021�12631326456�017747� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������just an npm test ���������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/packages/npm-test-files/ignore3�������������������������������������������������0000644�0000000�0000000�00000000000�12631326456�020355� 0����������������������������������������������������������������������������������������������������ustar 
npm_3.5.2.orig/test/packages/npm-test-files/.npmignore
/sub/ignore1
./sub/include2
ignore3
./include4
ignoredir1
ignoredir2/
*.tgz

npm_3.5.2.orig/test/packages/npm-test-files/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-files/ignore3

npm_3.5.2.orig/test/packages/npm-test-files/ignoredir1/
npm_3.5.2.orig/test/packages/npm-test-files/ignoredir2/

npm_3.5.2.orig/test/packages/npm-test-files/include4

npm_3.5.2.orig/test/packages/npm-test-files/package.json
{ "name":"npm-test-files"
, "version":"1.2.5"
, "files": [ "include4"
           , "sub/include"
           , "sub/include2"
           , "sub/include4"
           , "test.sh"
           , ".npmignore" ]
, "scripts":{"test":"bash test.sh"}}

npm_3.5.2.orig/test/packages/npm-test-files/sub/

npm_3.5.2.orig/test/packages/npm-test-files/test.sh
x=`find . | grep ignore | grep -v npmignore`
if [ "$x" != "" ]; then
  echo "ignored files included: $x"
  exit 1
fi

x=`find . | grep -v ignore | sort`
y=".
./include4
./package.json
./sub
./sub/include
./sub/include2
./sub/include4
./test.sh"
if [ "$x" != "$y" ]; then
  echo "missing included files"
  echo "got:"
  echo "==="
  echo "$x"
  echo "==="
  echo "wanted:"
  echo "==="
  echo "$y"
  echo "==="
  exit 1
fi

npm_3.5.2.orig/test/packages/npm-test-files/ignoredir1/a
npm_3.5.2.orig/test/packages/npm-test-files/ignoredir2/a
npm_3.5.2.orig/test/packages/npm-test-files/sub/ignore1
npm_3.5.2.orig/test/packages/npm-test-files/sub/ignore3
npm_3.5.2.orig/test/packages/npm-test-files/sub/include
npm_3.5.2.orig/test/packages/npm-test-files/sub/include2

npm_3.5.2.orig/test/packages/npm-test-files/sub/include4
This file should be in the package.
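The files fixture checks whitelist packing: with a "files" array, only the listed paths plus package.json make it in, and anything unlisted stays out — note the expected list in test.sh omits ./README even though the fixture ships one. A sketch of that selection, assuming exact relative paths as in this fixture (real npm also accepts patterns):

// approximate the whitelist the test.sh above verifies
var files = require('./package.json').files
function wanted (p) {
  return p === 'package.json' || files.indexOf(p) !== -1
}
console.log(wanted('sub/include2')) // true: listed
console.log(wanted('sub/ignore1'))  // false: not listed, so not packed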
npm_3.5.2.orig/test/packages/npm-test-ignore/.npmignore
/sub/ignore1
./sub/include2
ignore3
./include4
ignoredir1
ignoredir2/
*.tgz

npm_3.5.2.orig/test/packages/npm-test-ignore/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-ignore/ignore3

npm_3.5.2.orig/test/packages/npm-test-ignore/ignoredir1/
npm_3.5.2.orig/test/packages/npm-test-ignore/ignoredir2/

npm_3.5.2.orig/test/packages/npm-test-ignore/include4

npm_3.5.2.orig/test/packages/npm-test-ignore/package.json
{ "name":"npm-test-ignore"
, "version":"1.2.5"
, "scripts":{"test":"bash test.sh"}}

npm_3.5.2.orig/test/packages/npm-test-ignore/sub/

npm_3.5.2.orig/test/packages/npm-test-ignore/test.sh
x=`find . | grep ignore | grep -v npmignore`
if [ "$x" != "" ]; then
  echo "ignored files included: $x"
  exit 1
fi

x=`find . | grep -v ignore | sort`
y=".
./include4
./package.json
./README
./sub
./sub/include
./sub/include2
./sub/include4
./test.sh"
y="`echo "$y" | sort`"
if [ "$x" != "$y" ]; then
  echo "missing included files"
  echo "got:"
  echo "==="
  echo "$x"
  echo "==="
  echo "wanted:"
  echo "==="
  echo "$y"
  echo "==="
  exit 1
fi

npm_3.5.2.orig/test/packages/npm-test-ignore/ignoredir1/a
npm_3.5.2.orig/test/packages/npm-test-ignore/ignoredir2/a
npm_3.5.2.orig/test/packages/npm-test-ignore/sub/ignore1
npm_3.5.2.orig/test/packages/npm-test-ignore/sub/ignore3
npm_3.5.2.orig/test/packages/npm-test-ignore/sub/include
npm_3.5.2.orig/test/packages/npm-test-ignore/sub/include2

npm_3.5.2.orig/test/packages/npm-test-ignore/sub/include4
This file should be in the package.
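What this ignore fixture pins down, judging from the expected list in its test.sh: .npmignore patterns written with a leading "./" match nothing, so ./sub/include2 and ./include4 survive packing even though they appear in the ignore file, while /sub/ignore1, ignore3, ignoredir1 and ignoredir2/ are excluded. A sketch of that split (an inference from the fixture, not npm's literal matcher):

// partition the .npmignore rules above by whether they take effect
var effective = ['/sub/ignore1', 'ignore3', 'ignoredir1', 'ignoredir2/', '*.tgz']
var inert = ['./sub/include2', './include4'] // leading "./" defeats the match
console.log(effective.length + ' rules exclude files, ' + inert.length + ' rules are no-ops')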
npm_3.5.2.orig/test/packages/npm-test-ignore-nested-nm/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-ignore-nested-nm/lib/

npm_3.5.2.orig/test/packages/npm-test-ignore-nested-nm/package.json
{"name":"npm-test-ignore-nested-nm"
,"version":"1.2.5"
,"scripts":{"test":"node test.js"}}

npm_3.5.2.orig/test/packages/npm-test-ignore-nested-nm/test.js
var fs = require('fs')
fs.statSync(__dirname + '/lib/node_modules/foo')

npm_3.5.2.orig/test/packages/npm-test-ignore-nested-nm/lib/node_modules/

npm_3.5.2.orig/test/packages/npm-test-ignore-nested-nm/lib/node_modules/foo
I WILL NOT BE IGNORED!
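What the nested-nm fixture asserts: the automatic node_modules exclusion applies at the package root, not to a node_modules directory nested under another folder such as lib/ here. A sketch of that root-only rule (an assumption drawn from this fixture, not npm's literal code):

// only a top-level node_modules is dropped when packing;
// nested ones count as ordinary package contents
function excluded (relPath) {
  return relPath.split('/')[0] === 'node_modules'
}
console.log(excluded('node_modules/x'))       // true  -> dropped
console.log(excluded('lib/node_modules/foo')) // false -> kept, as test.js expects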
npm_3.5.2.orig/test/packages/npm-test-missing-bindir/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-missing-bindir/package.json
{ "name":"npm-test-missing-bindir"
, "version" : "0.0.0"
, "scripts" : { "test" : "node test.js" }
, "directories": { "bin" : "./not-found" } }

npm_3.5.2.orig/test/packages/npm-test-missing-bindir/test.js
var assert = require('assert')
assert.equal(undefined, process.env.npm_config__password, 'password exposed!')
assert.equal(undefined, process.env.npm_config__auth, 'auth exposed!')
assert.equal(undefined, process.env.npm_config__authCrypt, 'authCrypt exposed!')
npm_3.5.2.orig/test/packages/npm-test-optional-deps/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-optional-deps/package.json
{ "name": "npm-test-optional-deps"
, "version": "1.2.5"
, "scripts": { "test": "node test.js" }
, "optionalDependencies":
  { "npm-test-foobarzaaakakaka": "http://example.com/"
  , "dnode": "10.999.14234"
  , "sax": "0.3.5"
  , "glob": "some invalid version 99 #! $$ x y z"
  , "npm-test-failer": "*"
  }
}

npm_3.5.2.orig/test/packages/npm-test-optional-deps/test.js
var fs = require('fs')
var assert = require('assert')
var path = require('path')

// sax should be the only dep that ends up installed
var dir = path.resolve(__dirname, 'node_modules')
assert.deepEqual(fs.readdirSync(dir), ['sax'])
assert.equal(require('sax/package.json').version, '0.3.5')
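The optional-deps fixture feeds npm four optionalDependencies that cannot install and one that can; the install must succeed anyway, with sax the lone survivor. The usual consumer-side counterpart of that behavior is a guarded require (a standard pattern, not fixture code):

// an optional dependency may be absent, so probe for it instead of
// letting a missing module crash the program
var sax
try {
  sax = require('sax')
} catch (er) {
  sax = null // fall back to reduced functionality
}
console.log(sax ? 'sax is available' : 'sax unavailable, continuing without it')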
npm_3.5.2.orig/test/packages/npm-test-platform/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-platform/package.json
{"name":"npm-test-platform"
,"version":"9.9.9-9"
,"homepage":"http://www.youtube.com/watch?v=dQw4w9WgXcQ"
,"os":["!this_is_not_a_real_os", "!neither_is_this"]
,"cpu":["!this_is_not_a_real_cpu","!this_isnt_either"]}

npm_3.5.2.orig/test/packages/npm-test-platform-all/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-platform-all/package.json
{"name":"npm-test-platform-all"
,"version":"9.9.9-9"
,"homepage":"http://www.zombo.com/"
,"os":["darwin","linux","win32","solaris","haiku","sunos","freebsd","openbsd","netbsd"]
,"cpu":["arm","mips","ia32","x64","sparc"]}

npm_3.5.2.orig/test/packages/npm-test-private/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-private/package.json
{"name":"npm-test-private"
,"version":"9.9.9-9"
,"homepage":"http://www.youtube.com/watch?v=1MLry6Cn_D4"
,"private":"true"}
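The two platform fixtures exercise the "os" and "cpu" checks: entries prefixed with "!" act as a blocklist and plain entries as an allowlist, so npm-test-platform-all installs on the listed platforms while npm-test-platform merely blocks nonexistent ones and so installs anywhere. A sketch of that check under those assumptions (not npm's literal code):

// return true when `value` passes an os/cpu list from package.json
function checkList (list, value) {
  if (!list || list.length === 0) return true
  var hasAllow = false
  var allowed = false
  for (var i = 0; i < list.length; i++) {
    var entry = list[i]
    if (entry.charAt(0) === '!') {
      if (entry.slice(1) === value) return false // blocklisted
    } else {
      hasAllow = true
      if (entry === value) allowed = true
    }
  }
  return hasAllow ? allowed : true
}
console.log(checkList(['!this_is_not_a_real_os'], process.platform)) // true
console.log(checkList(['darwin', 'linux'], 'win32'))                 // false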
npm_3.5.2.orig/test/packages/npm-test-shrinkwrap/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-shrinkwrap/npm-shrinkwrap.json
{
  "name": "npm-test-shrinkwrap",
  "version": "0.0.0",
  "dependencies": {
    "glob": {
      "version": "3.1.5",
      "from": "git://github.com/isaacs/node-glob.git#npm-test",
      "resolved": "git://github.com/isaacs/node-glob.git#67bda227fd7a559cca5620307c7d30a6732a792f",
      "dependencies": {
        "graceful-fs": {
          "version": "1.1.5",
          "resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-1.1.5.tgz",
          "dependencies": {
            "fast-list": {
              "version": "1.0.2",
              "resolved": "https://registry.npmjs.org/fast-list/-/fast-list-1.0.2.tgz"
            }
          }
        },
        "inherits": {
          "version": "1.0.0",
          "resolved": "https://registry.npmjs.org/inherits/-/inherits-1.0.0.tgz"
        },
        "minimatch": {
          "version": "0.2.1",
          "dependencies": {
            "lru-cache": {
              "version": "1.0.5"
            }
          }
        }
      }
    },
    "minimatch": {
      "version": "0.1.5",
      "resolved": "https://registry.npmjs.org/minimatch/-/minimatch-0.1.5.tgz",
      "dependencies": {
        "lru-cache": {
          "version": "1.0.5",
          "resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-1.0.5.tgz"
        }
      }
    },
    "npm-test-single-file": {
      "version": "1.2.3",
      "resolved": "https://gist.github.com/isaacs/1837112/raw/9ef57a59fc22aeb1d1ca346b68826dcb638b8416/index.js"
    }
  }
}

npm_3.5.2.orig/test/packages/npm-test-shrinkwrap/package.json
{
  "author": "Isaac Z. Schlueter <i@izs.me> (http://blog.izs.me/)",
  "name": "npm-test-shrinkwrap",
  "version": "0.0.0",
  "dependencies": {
    "npm-test-single-file": "https://gist.github.com/isaacs/1837112/raw/9ef57a59fc22aeb1d1ca346b68826dcb638b8416/index.js",
    "glob": "git://github.com/isaacs/node-glob.git#npm-test",
    "minimatch": "~0.1.0"
  },
  "scripts": {
    "test": "node test.js"
  }
}

npm_3.5.2.orig/test/packages/npm-test-shrinkwrap/test.js
var assert = require('assert')

process.env.npm_config_prefix = process.cwd()
delete process.env.npm_config_global
delete process.env.npm_config_depth

var npm = process.env.npm_execpath

require('child_process').execFile(process.execPath, [npm, 'ls', '--json'], {
  stdio: 'pipe', env: process.env, cwd: process.cwd()
}, function (err, stdout, stderr) {
  if (err) throw err

  var actual = JSON.parse(stdout)
  var expected = require('./npm-shrinkwrap.json')
  rmFrom(actual)
  actual = actual.dependencies
  rmFrom(expected)
  expected = expected.dependencies

  console.error(JSON.stringify(actual, null, 2))
  console.error(JSON.stringify(expected, null, 2))
  assert.deepEqual(actual, expected)
})

function rmFrom (obj) {
  for (var i in obj) {
    if (i === 'from') {
      delete obj[i]
    } else if (i === 'dependencies') {
      for (var j in obj[i]) {
        rmFrom(obj[i][j])
      }
    }
  }
}

npm_3.5.2.orig/test/packages/npm-test-test-package/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-test-package/package.json
{ "name":"npm-test-test-package"
, "author" : "Testy McMock"
, "version" : "1.2.3-99-b"
, "description" : "This is a test package used for debugging. It has some random data and that's all."
}

npm_3.5.2.orig/test/packages/npm-test-url-dep/README
just an npm test

npm_3.5.2.orig/test/packages/npm-test-url-dep/package.json
{ "name":"npm-test-url-dep"
, "version" : "1.2.3"
, "dependencies" :
  { "jsonify" : "https://github.com/substack/jsonify/tarball/master"
  , "sax": "isaacs/sax-js"
  , "canonical-host": "git://github.com/isaacs/canonical-host"
  }
}

npm_3.5.2.orig/test/tap/00-check-mock-dep.js
console.log('TAP Version 13')

process.on('uncaughtException', function (er) {
  if (er) { throw er }

  console.log('not ok - Failed checking mock registry dep. Expect much fail!')
  console.log('1..1')
  process.exit(1)
})

var assert = require('assert')
var semver = require('semver')
var mock = require('npm-registry-mock/package.json').version
var req = require('../../package.json').devDependencies['npm-registry-mock']

assert(semver.satisfies(mock, req))

console.log('ok')
console.log('1..1')

npm_3.5.2.orig/test/tap/00-config-setup.js
var fs = require('graceful-fs')
var path = require('path')
var userconfigSrc = path.resolve(__dirname, '..', 'fixtures', 'config', 'userconfig')
exports.userconfig = userconfigSrc + '-with-gc'
exports.globalconfig = path.resolve(__dirname, '..', 'fixtures', 'config', 'globalconfig')
exports.builtin = path.resolve(__dirname, '..', 'fixtures', 'config', 'builtin')
exports.malformed = path.resolve(__dirname, '..', 'fixtures', 'config', 'malformed')
exports.ucData = {
  globalconfig: exports.globalconfig,
  email: 'i@izs.me',
  'env-thing': 'asdf',
  'init.author.name': 'Isaac Z. Schlueter',
  'init.author.email': 'i@izs.me',
  'init.author.url': 'http://blog.izs.me/',
  'init.version': '1.2.3',
  'proprietary-attribs': false,
  'npm:publishtest': true,
  '_npmjs.org:couch': 'https://admin:password@localhost:5984/registry',
  'npm-www:nocache': '1',
  nodedir: '/Users/isaacs/dev/js/node-v0.8',
  'sign-git-tag': true,
  message: 'v%s',
  'strict-ssl': false,
  'tmp': process.env.HOME + '/.tmp',
  _auth: 'dXNlcm5hbWU6cGFzc3dvcmQ=',
  _token: {
    AuthSession: 'yabba-dabba-doodle',
    version: '1',
    expires: '1345001053415',
    path: '/',
    httponly: true
  }
}

// set the userconfig in the env
// unset anything else that npm might be trying to foist on us
Object.keys(process.env).forEach(function (k) {
  if (k.match(/^npm_config_/i)) {
    delete process.env[k]
  }
})
process.env.npm_config_userconfig = exports.userconfig
process.env.npm_config_other_env_thing = 1000
process.env.random_env_var = 'asdf'
process.env.npm_config__underbar_env_thing = 'underful'
process.env.NPM_CONFIG_UPPERCASE_ENV_THING = 42

exports.envData = {
  userconfig: exports.userconfig,
  '_underbar-env-thing': 'underful',
  'uppercase-env-thing': '42',
  'other-env-thing': '1000'
}
exports.envDataFix = {
  userconfig: exports.userconfig,
  '_underbar-env-thing': 'underful',
  'uppercase-env-thing': 42,
  'other-env-thing': 1000
}

var projectConf = path.resolve(__dirname, '..', '..', '.npmrc')
try {
  fs.statSync(projectConf)
} catch (er) {
  // project conf not found, probably working with packed npm
  fs.writeFileSync(projectConf, 'save-prefix = ~\nproprietary-attribs = false\n')
}

var projectRc = path.join(__dirname, '..', 'fixtures', 'config', '.npmrc')
try {
  fs.statSync(projectRc)
} catch (er) {
  // project conf not found, probably working with packed npm
  fs.writeFileSync(projectRc, 'just = testing')
}

if (module === require.main) {
  // set the globalconfig in the userconfig
  var uc = fs.readFileSync(userconfigSrc)
  var gcini = 'globalconfig = ' + exports.globalconfig + '\n'
  fs.writeFileSync(exports.userconfig, gcini + uc)

  console.log('1..1')
  console.log('ok 1 setup done')
}
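The envData and envDataFix fixtures above assume npm's mapping from environment variables to config keys: a case-insensitive npm_config_ prefix is stripped, the remainder is lower-cased, and non-leading underscores become dashes; envDataFix is the same data after numeric strings are parsed. A rough sketch of that mapping under those assumptions, not npm's actual code:

// Rough sketch of the env -> config-key mapping the fixtures above assume;
// not npm's actual implementation.
function envToConfig (env) {
  var conf = {}
  Object.keys(env).forEach(function (k) {
    var m = k.match(/^npm_config_(.+)/i)
    if (!m) return
    // NPM_CONFIG_UPPERCASE_ENV_THING -> 'uppercase-env-thing';
    // npm_config__underbar_env_thing keeps its leading underscore.
    var key = m[1].toLowerCase().replace(/(?!^)_/g, '-')
    conf[key] = env[k] // env values are always strings, hence envDataFix
  })
  return conf
}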
npm_3.5.2.orig/test/tap/00-verify-bundle-deps.js
var test = require('tap').test

var manifest = require('../../package.json')
var deps = Object.keys(manifest.dependencies)
var bundled = manifest.bundleDependencies

test('all deps are bundled deps or dev deps', function (t) {
  deps.forEach(function (name) {
    t.assert(
      bundled.indexOf(name) !== -1,
      name + ' is in bundledDependencies'
    )
  })

  t.end()
})

npm_3.5.2.orig/test/tap/00-verify-ls-ok.js
var common = require('../common-tap')
var test = require('tap').test
var path = require('path')
var cwd = path.resolve(__dirname, '..', '..')
var fs = require('fs')

test('npm ls in npm', function (t) {
  t.ok(fs.existsSync(cwd), 'ensure that the path we are calling ls within exists')
  var files = fs.readdirSync(cwd)
  t.notEqual(files.length, 0, 'ensure there are files in the directory we are to ls')

  var opt = { cwd: cwd, stdio: [ 'ignore', 'ignore', 2 ] }
  common.npm(['ls'], opt, function (err, code) {
    t.ifError(err, 'error should not exist')
    t.equal(code, 0, 'npm ls exited with code')
    t.end()
  })
})

npm_3.5.2.orig/test/tap/404-parent.js
var common = require('../common-tap.js')
var test = require('tap').test
var npm = require('../../')
var osenv = require('osenv')
var path = require('path')
var fs = require('fs')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var pkg = path.resolve(__dirname, '404-parent')
var mr = require('npm-registry-mock')

test('404-parent: if parent exists, specify parent in error message', function (t) {
  setup()
  rimraf.sync(path.resolve(pkg, 'node_modules'))
  performInstall(function (err) {
    t.ok(err instanceof Error, 'error was returned')
    t.ok(err.parent === '404-parent-test', "error's parent set")
    t.end()
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})

function setup () {
  mkdirp.sync(pkg)
  mkdirp.sync(path.resolve(pkg, 'cache'))
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify({
    author: 'Evan Lucas',
    name: '404-parent-test',
    version: '0.0.0',
    description: 'Test for 404-parent',
    dependencies: {
      'test-npm-404-parent-test': '*'
    }
  }), 'utf8')
  process.chdir(pkg)
}

function plugin (server) {
  server.get('/test-npm-404-parent-test')
    .reply(404, {'error': 'version not found'})
}

function performInstall (cb) {
  mr({port: common.port, plugin: plugin}, function (er, s) { // create mock registry.
    npm.load({registry: common.registry}, function () {
      var pwd = process.cwd()
      process.chdir(pkg)
      npm.commands.install([], function (err) {
        process.chdir(pwd)
        cb(err)
        s.close() // shutdown mock npm server.
      })
    })
  })
}

npm_3.5.2.orig/test/tap/404-private-registry-scoped.js
require('../common-tap')
var nock = require('nock')
var test = require('tap').test
var npm = require('../../')
var addNamed = require('../../lib/cache/add-named')

test('scoped package names not mangled on error with non-root registry', function test404 (t) {
  nock('http://localhost:1337')
    .get('/registry/@scope%2ffoo')
    .reply(404, {
      error: 'not_found',
      reason: 'document not found'
    })

  npm.load({registry: 'http://localhost:1337/registry', global: true}, function () {
    addNamed('@scope/foo', '*', null, function checkError (err) {
      t.ok(err, 'should error')
      t.equal(err.message, '404 Not Found: @scope/foo', 'should have package name in error')
      t.equal(err.pkgid, '@scope/foo', 'err.pkgid should match package name')
      t.end()
    })
  })
})

npm_3.5.2.orig/test/tap/404-private-registry.js
require('../common-tap')
var nock = require('nock')
var test = require('tap').test
var path = require('path')
var npm = require('../../')
var addNamed = require('../../lib/cache/add-named')
var packageName = path.basename(__filename, '.js')

test('package names not mangled on error with non-root registry', function test404 (t) {
  nock('http://localhost:1337')
    .get('/registry/' + packageName)
    .reply(404, {
      error: 'not_found',
      reason: 'document not found'
    })

  npm.load({registry: 'http://localhost:1337/registry', global: true}, function () {
    addNamed(packageName, '*', null, function checkError (err) {
      t.ok(err, 'should error')
      t.equal(err.message, '404 Not Found: ' + packageName, 'should have package name in error')
      t.equal(err.pkgid, packageName, 'err.pkgid should match package name')
      t.end()
    })
  })
})

npm_3.5.2.orig/test/tap/access.js
var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var mr = require('npm-registry-mock')

var test = require('tap').test
var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'access')
var server

var scoped = {
  name: '@scoped/pkg',
  version: '1.1.1'
}

test('setup', function (t) {
  mkdirp(pkg, function (er) {
    t.ifError(er, pkg + ' made successfully')

    mr({port: common.port}, function (err, s) {
      t.ifError(err, 'registry mocked successfully')
      server = s

      fs.writeFile(
        path.join(pkg, 'package.json'),
        JSON.stringify(scoped),
        function (er) {
          t.ifError(er, 'wrote package.json')
          t.end()
        }
      )
    })
  })
})

test('npm access public on current package', function (t) {
  server.post('/-/package/%40scoped%2Fpkg/access', JSON.stringify({
    access: 'public'
  })).reply(200, {
    accessChanged: true
  })
  common.npm(
    [
      'access',
      'public',
      '--registry', common.registry,
      '--loglevel', 'silent'
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access')
      t.equal(code, 0, 'exited OK')
      t.equal(stderr, '', 'no error output')
      t.end()
    }
  )
})

test('npm access public when no package passed and no package.json', function (t) {
  // need to simulate a missing package.json
  var missing = path.join(__dirname, 'access-public-missing-guard')
  mkdirp.sync(path.join(missing, 'node_modules'))

  common.npm([
    'access',
    'public',
    '--registry', common.registry
  ], {
    cwd: missing
  }, function (er, code, stdout, stderr) {
    t.ifError(er, 'npm access')
    t.match(stderr, /no package name passed to command and no package.json found/)
    rimraf.sync(missing)
    t.end()
  })
})

test('npm access public when no package passed and invalid package.json', function (t) {
  // need to simulate a missing package.json
  var invalid = path.join(__dirname, 'access-public-invalid-package')
  mkdirp.sync(path.join(invalid, 'node_modules'))
  // it's hard to force `read-package-json` to break w/o ENOENT, but this will do it
  fs.writeFileSync(path.join(invalid, 'package.json'), '{\n')

  common.npm([
    'access',
    'public',
    '--registry', common.registry
  ], {
    cwd: invalid
  }, function (er, code, stdout, stderr) {
    t.ifError(er, 'npm access')
    t.match(stderr, /Failed to parse json/)
    rimraf.sync(invalid)
    t.end()
  })
})

test('npm access restricted on current package', function (t) {
  server.post('/-/package/%40scoped%2Fpkg/access', JSON.stringify({
    access: 'restricted'
  })).reply(200, {
    accessChanged: true
  })
  common.npm(
    [
      'access',
      'restricted',
      '--registry', common.registry,
      '--loglevel', 'silent'
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access')
      t.equal(code, 0, 'exited OK')
      t.equal(stderr, '', 'no error output')
      t.end()
    }
  )
})

test('npm access on named package', function (t) {
  server.post('/-/package/%40scoped%2Fanother/access', {
    access: 'public'
  }).reply(200, {
    accessChanged: true
  })
  common.npm(
    [
      'access',
      'public', '@scoped/another',
      '--registry', common.registry,
      '--loglevel', 'silent'
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access')
      t.equal(code, 0, 'exited OK')
      t.equal(stderr, '', 'no error output')
      t.end()
    }
  )
})

test('npm change access on unscoped package', function (t) {
  common.npm(
    [
      'access',
      'restricted', 'yargs',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ok(code, 'exited with Error')
      t.matches(
        stderr,
        /access commands are only accessible for scoped packages/)
      t.end()
    }
  )
})

test('npm access grant read-only', function (t) {
  server.put('/-/team/myorg/myteam/package', {
    permissions: 'read-only',
    package: '@scoped/another'
  }).reply(201, {
    accessChanged: true
  })
  common.npm(
    [
      'access',
      'grant', 'read-only',
      'myorg:myteam',
      '@scoped/another',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access grant')
      t.equal(code, 0, 'exited OK')
      t.end()
    }
  )
})

test('npm access grant read-write', function (t) {
  server.put('/-/team/myorg/myteam/package', {
    permissions: 'read-write',
    package: '@scoped/another'
  }).reply(201, {
    accessChanged: true
  })
  common.npm(
    [
      'access',
      'grant', 'read-write',
      'myorg:myteam',
      '@scoped/another',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access grant')
      t.equal(code, 0, 'exited OK')
      t.end()
    }
  )
})

test('npm access grant others', function (t) {
  common.npm(
    [
      'access',
      'grant', 'rerere',
      'myorg:myteam',
      '@scoped/another',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ok(code, 'exited with Error')
      t.matches(stderr, /read-only/)
      t.matches(stderr, /read-write/)
      t.end()
    }
  )
})

test('npm access revoke', function (t) {
  server.delete('/-/team/myorg/myteam/package', {
    package: '@scoped/another'
  }).reply(200, {
    accessChanged: true
  })
  common.npm(
    [
      'access',
      'revoke',
      'myorg:myteam',
      '@scoped/another',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access revoke')
      t.equal(code, 0, 'exited OK')
      t.end()
    }
  )
})

test('npm access ls-packages with no team', function (t) {
  var serverPackages = {
    '@foo/bar': 'write',
    '@foo/util': 'read'
  }
  var clientPackages = {
    '@foo/bar': 'read-write',
    '@foo/util': 'read-only'
  }
  server.get(
    '/-/org/username/package?format=cli'
  ).reply(200, serverPackages)
  common.npm(
    [
      'access',
      'ls-packages',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access ls-packages')
      t.same(JSON.parse(stdout), clientPackages)
      t.end()
    }
  )
})

test('npm access ls-packages on team', function (t) {
  var serverPackages = {
    '@foo/bar': 'write',
    '@foo/util': 'read'
  }
  var clientPackages = {
    '@foo/bar': 'read-write',
    '@foo/util': 'read-only'
  }
  server.get(
    '/-/team/myorg/myteam/package?format=cli'
  ).reply(200, serverPackages)
  common.npm(
    [
      'access',
      'ls-packages',
      'myorg:myteam',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access ls-packages')
      t.same(JSON.parse(stdout), clientPackages)
      t.end()
    }
  )
})

test('npm access ls-packages on org', function (t) {
  var serverPackages = {
    '@foo/bar': 'write',
    '@foo/util': 'read'
  }
  var clientPackages = {
    '@foo/bar': 'read-write',
    '@foo/util': 'read-only'
  }
  server.get(
    '/-/org/myorg/package?format=cli'
  ).reply(200, serverPackages)
  common.npm(
    [
      'access',
      'ls-packages',
      'myorg',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access ls-packages')
      t.same(JSON.parse(stdout), clientPackages)
      t.end()
    }
  )
})

test('npm access ls-packages on user', function (t) {
  var serverPackages = {
    '@foo/bar': 'write',
    '@foo/util': 'read'
  }
  var clientPackages = {
    '@foo/bar': 'read-write',
    '@foo/util': 'read-only'
  }
  server.get(
    '/-/org/myorg/package?format=cli'
  ).reply(404, {error: 'nope'})
  server.get(
    '/-/user/myorg/package?format=cli'
  ).reply(200, serverPackages)
  common.npm(
    [
      'access',
      'ls-packages',
      'myorg',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access ls-packages')
      t.same(JSON.parse(stdout), clientPackages)
      t.end()
    }
  )
})

test('npm access ls-packages with no package specified or package.json', function (t) {
  // need to simulate a missing package.json
  var missing = path.join(__dirname, 'access-missing-guard')
  mkdirp.sync(path.join(missing, 'node_modules'))

  var serverPackages = {
    '@foo/bar': 'write',
    '@foo/util': 'read'
  }
  var clientPackages = {
    '@foo/bar': 'read-write',
    '@foo/util': 'read-only'
  }
  server.get(
    '/-/org/myorg/package?format=cli'
  ).reply(404, {error: 'nope'})
  server.get(
    '/-/user/myorg/package?format=cli'
  ).reply(200, serverPackages)
  common.npm(
    [
      'access',
      'ls-packages',
      'myorg',
      '--registry', common.registry
    ],
    { cwd: missing },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access ls-packages')
      t.same(JSON.parse(stdout), clientPackages)
      rimraf.sync(missing)
      t.end()
    }
  )
})

test('npm access ls-collaborators on current', function (t) {
  var serverCollaborators = {
    'myorg:myteam': 'write',
    'myorg:anotherteam': 'read'
  }
  var clientCollaborators = {
    'myorg:myteam': 'read-write',
    'myorg:anotherteam': 'read-only'
  }
  server.get(
    '/-/package/%40scoped%2Fpkg/collaborators?format=cli'
  ).reply(200, serverCollaborators)
  common.npm(
    [
      'access',
      'ls-collaborators',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access ls-collaborators')
      t.same(JSON.parse(stdout), clientCollaborators)
      t.end()
    }
  )
})

test('npm access ls-collaborators on package', function (t) {
  var serverCollaborators = {
    'myorg:myteam': 'write',
    'myorg:anotherteam': 'read'
  }
  var clientCollaborators = {
    'myorg:myteam': 'read-write',
    'myorg:anotherteam': 'read-only'
  }
  server.get(
    '/-/package/%40scoped%2Fanother/collaborators?format=cli'
  ).reply(200, serverCollaborators)
  common.npm(
    [
      'access',
      'ls-collaborators',
      '@scoped/another',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access ls-collaborators')
      t.same(JSON.parse(stdout), clientCollaborators)
      t.end()
    }
  )
})

test('npm access ls-collaborators on current w/user filter', function (t) {
  var serverCollaborators = {
    'myorg:myteam': 'write',
    'myorg:anotherteam': 'read'
  }
  var clientCollaborators = {
    'myorg:myteam': 'read-write',
    'myorg:anotherteam': 'read-only'
  }
  server.get(
    '/-/package/%40scoped%2Fanother/collaborators?format=cli&user=zkat'
  ).reply(200, serverCollaborators)
  common.npm(
    [
      'access',
      'ls-collaborators',
      '@scoped/another',
      'zkat',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access ls-collaborators')
      t.same(JSON.parse(stdout), clientCollaborators)
      t.end()
    }
  )
})

test('npm access edit', function (t) {
  common.npm(
    [
      'access',
      'edit', '@scoped/another',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ok(code, 'exited with Error')
      t.match(stderr, /edit subcommand is not implemented yet/)
      t.end()
    }
  )
})

test('npm access blerg', function (t) {
  common.npm(
    [
      'access',
      'blerg', '@scoped/another',
      '--registry', common.registry
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ok(code, 'exited with Error')
      t.matches(stderr, /Usage:/)
      t.end()
    }
  )
})

test('cleanup', function (t) {
  t.pass('cleaned up')
  rimraf.sync(pkg)
  server.done()
  server.close()
  t.end()
})

npm_3.5.2.orig/test/tap/add-named-update-protocol-port.js
'use strict'
var path = require('path')
require('../common-tap')
var nock = require('nock')
var test = require('tap').test
var npm = require('../../')
var addNamed = require('../../lib/cache/add-named')

var packageName = path.basename(__filename, '.js')

var fooPkg = {
  name: packageName,
  versions: {
    '0.0.0': {
      name: packageName,
      version: '0.0.0',
      dist: {
        tarball: 'https://localhost:1338/registry/' + packageName + '/-/' + packageName + '-0.0.0.tgz',
        shasum: '356a192b7913b04c54574d18c28d46e6395428ab'
      }
    }
  }
}

var iPackageName = packageName + 'i'
var fooiPkg = {
  name: iPackageName,
  versions: {
    '0.0.0': {
      name: iPackageName,
      version: '0.0.0',
      dist: {
        tarball: 'http://127.0.0.1:1338/registry/' + iPackageName + '/-/' + iPackageName + '-0.0.0.tgz',
        shasum: '356a192b7913b04c54574d18c28d46e6395428ab'
      }
    }
  }
}

test('tarball paths should update port if updating protocol', function (t) {
  nock('http://localhost:1337/registry')
    .get('/' + packageName)
    .reply(200, fooPkg)
  nock('http://localhost:1337/registry')
    .get('/' + packageName + '/-/' + packageName + '-0.0.0.tgz')
    .reply(200, '1')
  nock('http://localhost:1338/registry')
    .get('/' + packageName + '/-/' + packageName + '-0.0.0.tgz')
    .reply(404)

  npm.load({registry: 'http://localhost:1337/registry', global: true}, function () {
    addNamed(packageName, '0.0.0', null, function checkPath (err, pkg) {
      t.ifError(err, 'addNamed worked')
      t.end()
    })
  })
})

test('tarball paths should NOT update if different hostname', function (t) {
  nock('http://localhost:1337/registry')
    .get('/' + iPackageName)
    .reply(200, fooiPkg)
  nock('http://127.0.0.1:1338/registry')
    .get('/' + iPackageName + '/-/' + iPackageName + '-0.0.0.tgz')
    .reply(200, '1')
  nock('http://127.0.0.1:1337/registry')
    .get('/' + iPackageName + '/-/' + iPackageName + '-0.0.0.tgz')
    .reply(404)

  npm.load({registry: 'http://localhost:1337/registry', global: true}, function () {
    addNamed(iPackageName, '0.0.0', null, function checkPath (err, pkg) {
      t.ifError(err, 'addNamed worked')
      t.end()
    })
  })
})

npm_3.5.2.orig/test/tap/add-remote-git-fake-windows.js
var fs = require('fs')
var resolve = require('path').resolve

var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var npm = require('../../lib/npm.js')
var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'add-remote-git')
var repo = resolve(__dirname, 'add-remote-git-repo')

var daemon
var daemonPID
var git

var pjParent = JSON.stringify({
  name: 'parent',
  version: '1.2.3',
  dependencies: {
    child: 'git://localhost:1233/child.git'
  }
}, null, 2) + '\n'

var pjChild = JSON.stringify({
  name: 'child',
  version: '1.0.3'
}, null, 2) + '\n'

test('setup', function (t) {
  bootstrap()
  setup(function (er, r) {
    if (er) {
      throw er
    }
    if (!er) {
      daemon = r[r.length - 2]
      daemonPID = r[r.length - 1]
    }
    t.end()
  })
})

test('install from repo on \'Windows\'', function (t) {
  // before we confuse everything by switching the platform
  require('../../lib/install.js')
  require('../../lib/unbuild.js')
  process.platform = 'win32'
  process.chdir(pkg)
  npm.commands.install('.', [], function (er) {
    t.ifError(er, 'npm installed via git')
    t.end()
  })
})

test('clean', function (t) {
  daemon.on('close', function () {
    cleanup()
    t.end()
  })
  process.kill(daemonPID)
})

function bootstrap () {
  rimraf.sync(pkg)
  mkdirp.sync(pkg)
  fs.writeFileSync(resolve(pkg, 'package.json'), pjParent)
}

function setup (cb) {
  rimraf.sync(repo)
  mkdirp.sync(repo)
  fs.writeFileSync(resolve(repo, 'package.json'), pjChild)
  npm.load({ registry: common.registry, loglevel: 'silent' }, function () {
    // some really cheesy monkeypatching
    require('module')._cache[require.resolve('which')] = {
      exports: function (_, cb) { cb() }
    }
    git = require('../../lib/utils/git.js')

    function startDaemon (cb) {
      // start git server
      var d = git.spawn(
        [
          'daemon',
          '--verbose',
          '--listen=localhost',
          '--export-all',
          '--base-path=.',
          '--port=1233'
        ],
        {
          cwd: pkg,
          env: process.env,
          stdio: ['pipe', 'pipe', 'pipe']
        }
      )
      d.stderr.on('data', childFinder)

      function childFinder (c) {
        var cpid = c.toString().match(/^\[(\d+)\]/)
        if (cpid[1]) {
          this.removeListener('data', childFinder)
          cb(null, [d, cpid[1]])
        }
      }
    }

    common.makeGitRepo({
      path: repo,
      commands: [
        git.chainableExec(
          ['clone', '--bare', repo, 'child.git'],
          { cwd: pkg, env: process.env }
        ),
        startDaemon
      ]
    }, cb)
  })
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(repo)
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/add-remote-git-file.js
var fs = require('fs')
var resolve = require('path').resolve
var url = require('url')

var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var npm = require('../../lib/npm.js')
var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'add-remote-git-file')
var repo = resolve(__dirname, 'add-remote-git-file-repo')

var git
var cloneURL = 'git+file://' + resolve(pkg, 'child.git')

var pjChild = JSON.stringify({
  name: 'child',
  version: '1.0.3'
}, null, 2) + '\n'

test('setup', function (t) {
  bootstrap()
  setup(function (er, r) {
    t.ifError(er, 'git started up successfully')
    t.end()
  })
})

test('cache from repo', function (t) {
  process.chdir(pkg)
  var addRemoteGit = require('../../lib/cache/add-remote-git.js')
  addRemoteGit(cloneURL, function (er, data) {
    t.ifError(er, 'cached via git')
    t.equal(
      url.parse(data._resolved).protocol,
      'git+file:',
      'npm didn\'t go crazy adding git+git+git+git'
    )
    t.end()
  })
})

test('clean', function (t) {
  cleanup()
  t.end()
})

function bootstrap () {
  cleanup()
  mkdirp.sync(pkg)
}

function setup (cb) {
  mkdirp.sync(repo)
  fs.writeFileSync(resolve(repo, 'package.json'), pjChild)
  npm.load({ registry: common.registry, loglevel: 'silent' }, function () {
    git = require('../../lib/utils/git.js')
    common.makeGitRepo({
      path: repo,
      commands: [git.chainableExec(
        ['clone', '--bare', repo, 'child.git'],
        { cwd: pkg, env: process.env }
      )]
    }, cb)
  })
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(repo)
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/add-remote-git-get-resolved.js
'use strict'
var test = require('tap').test

var npm = require('../../lib/npm.js')
var common = require('../common-tap.js')
var normalizeGitUrl = require('normalize-git-url')
var getResolved = null

/**
 * Note: This is here because `normalizeGitUrl` is usually called
 * before getResolved is, and receives *that* URL.
 */
function tryGetResolved (uri, treeish) {
  return getResolved(normalizeGitUrl(uri).url, treeish)
}

test('setup', function (t) {
  var opts = {
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    getResolved = require('../../lib/cache/add-remote-git.js').getResolved
    t.end()
  })
})

test('add-remote-git#get-resolved git: passthru', function (t) {
  verify('git:github.com/foo/repo')
  verify('git:github.com/foo/repo.git')
  verify('git://github.com/foo/repo#decadacefadabade')
  verify('git://github.com/foo/repo.git#decadacefadabade')

  function verify (uri) {
    t.equal(
      tryGetResolved(uri, 'decadacefadabade'),
      'git://github.com/foo/repo.git#decadacefadabade',
      uri + ' normalized to canonical form git://github.com/foo/repo.git#decadacefadabade'
    )
  }
  t.end()
})

test('add-remote-git#get-resolved SSH', function (t) {
  t.comment('tests for https://github.com/npm/npm/issues/7961')
  verify('git@github.com:foo/repo')
  verify('git@github.com:foo/repo#master')
  verify('git+ssh://git@github.com/foo/repo#master')
  verify('git+ssh://git@github.com/foo/repo#decadacefadabade')

  function verify (uri) {
    t.equal(
      tryGetResolved(uri, 'decadacefadabade'),
      'git+ssh://git@github.com/foo/repo.git#decadacefadabade',
      uri + ' normalized to canonical form git+ssh://git@github.com/foo/repo.git#decadacefadabade'
    )
  }
  t.end()
})

test('add-remote-git#get-resolved HTTPS', function (t) {
  verify('https://github.com/foo/repo')
  verify('https://github.com/foo/repo#master')
  verify('git+https://github.com/foo/repo.git#master')
  verify('git+https://github.com/foo/repo#decadacefadabade')

  // DEPRECATED
  // this is an invalid URL but we normalize it
  // anyway. Users shouldn't use this in the future. See note
  // below for how this affected non-hosted URLs.
  // See https://github.com/npm/npm/issues/8881
  verify('git+https://github.com:foo/repo.git#master')

  function verify (uri) {
    t.equal(
      tryGetResolved(uri, 'decadacefadabade'),
      'git+https://github.com/foo/repo.git#decadacefadabade',
      uri + ' normalized to canonical form git+https://github.com/foo/repo.git#decadacefadabade'
    )
  }
  t.end()
})

test('add-remote-git#get-resolved edge cases', function (t) {
  t.equal(
    tryGetResolved('git+ssh://user@bananaboat.com:galbi/blah.git', 'decadacefadabade'),
    'git+ssh://user@bananaboat.com:galbi/blah.git#decadacefadabade',
    'don\'t break non-hosted scp-style locations'
  )

  t.equal(
    tryGetResolved('git+ssh://bananaboat:galbi/blah', 'decadacefadabade'),
    'git+ssh://bananaboat:galbi/blah#decadacefadabade',
    'don\'t break non-hosted scp-style locations'
  )

  // DEPRECATED
  // When we were normalizing all git URIs, git+https: was being
  // automatically converted to ssh:. Some users were relying
  // on this funky behavior, so after removing the aggressive
  // normalization from non-hosted URIs, we brought this back.
  // See https://github.com/npm/npm/issues/8881
  t.equal(
    tryGetResolved('git+https://bananaboat:galbi/blah', 'decadacefadabade'),
    'git+https://bananaboat/galbi/blah#decadacefadabade',
    'don\'t break non-hosted scp-style locations'
  )

  t.equal(
    tryGetResolved('git+ssh://git.bananaboat.net/foo', 'decadacefadabade'),
    'git+ssh://git.bananaboat.net/foo#decadacefadabade',
    'don\'t break non-hosted SSH URLs'
  )

  t.equal(
    tryGetResolved('git+ssh://git.bananaboat.net:/foo', 'decadacefadabade'),
    'git+ssh://git.bananaboat.net:/foo#decadacefadabade',
    'don\'t break non-hosted SSH URLs'
  )

  t.equal(
    tryGetResolved('git://gitbub.com/foo/bar.git', 'decadacefadabade'),
    'git://gitbub.com/foo/bar.git#decadacefadabade',
    'don\'t break non-hosted git: URLs'
  )

  t.end()
})

npm_3.5.2.orig/test/tap/add-remote-git-shrinkwrap.js
var fs = require('fs')
var resolve = require('path').resolve

var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var npm = require('../../lib/npm.js')
var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'add-remote-git-shrinkwrap')
var repo = resolve(__dirname, 'add-remote-git-shrinkwrap-repo')

var daemon
var daemonPID
var git

var pjParent = JSON.stringify({
  name: 'parent',
  version: '1.2.3',
  dependencies: {
    'child': 'git://localhost:1235/child.git#master'
  }
}, null, 2) + '\n'

var pjChild = JSON.stringify({
  name: 'child',
  version: '1.0.3'
}, null, 2) + '\n'

test('setup', function (t) {
  bootstrap()
  setup(function (er, r) {
    t.ifError(er, 'git started up successfully')

    if (!er) {
      daemon = r[r.length - 2]
      daemonPID = r[r.length - 1]
    }

    t.end()
  })
})

test('install from repo', function (t) {
  process.chdir(pkg)
  npm.commands.install('.', [], function (er) {
    t.ifError(er, 'npm installed via git')
    t.end()
  })
})

test('shrinkwrap gets correct _from and _resolved (#7121)', function (t) {
  common.npm(
    [
      'shrinkwrap',
      '--loglevel', 'silent'
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm shrinkwrapped without errors')
      t.is(code, 0, '`npm shrinkwrap` exited ok')
      t.equal(stdout.trim(), 'wrote npm-shrinkwrap.json')
      t.equal(stderr.trim(), '', 'no error output on successful shrinkwrap')

      var shrinkwrap = require(resolve(pkg, 'npm-shrinkwrap.json'))
      t.equal(
        shrinkwrap.dependencies.child.from,
        'git://localhost:1235/child.git#master',
        'npm shrinkwrapped from correctly'
      )

      git.whichAndExec(
        ['rev-list', '-n1', 'master'],
        { cwd: repo, env: process.env },
        function (er, stdout, stderr) {
          t.ifErr(er, 'git rev-list ran without error')
          t.notOk(stderr, 'no error output')
          var treeish = stdout.trim()

          t.equal(
            shrinkwrap.dependencies.child.resolved,
'git://localhost:1235/child.git#' + treeish, 'npm shrinkwrapped resolved correctly' ) t.end() } ) } ) }) test('clean', function (t) { daemon.on('close', function () { cleanup() t.end() }) process.kill(daemonPID) }) function bootstrap () { mkdirp.sync(pkg) fs.writeFileSync(resolve(pkg, 'package.json'), pjParent) } function setup (cb) { mkdirp.sync(repo) fs.writeFileSync(resolve(repo, 'package.json'), pjChild) npm.load({ prefix: pkg, registry: common.registry, loglevel: 'silent' }, function () { git = require('../../lib/utils/git.js') function startDaemon (cb) { // start git server var d = git.spawn( [ 'daemon', '--verbose', '--listen=localhost', '--export-all', '--base-path=.', '--port=1235' ], { cwd: pkg, env: process.env, stdio: ['pipe', 'pipe', 'pipe'] } ) d.stderr.on('data', childFinder) function childFinder (c) { var cpid = c.toString().match(/^\[(\d+)\]/) if (cpid[1]) { this.removeListener('data', childFinder) cb(null, [d, cpid[1]]) } } } common.makeGitRepo({ path: repo, commands: [ git.chainableExec( ['clone', '--bare', repo, 'child.git'], { cwd: pkg, env: process.env } ), startDaemon ] }, cb) }) } function cleanup () { process.chdir(osenv.tmpdir()) rimraf.sync(repo) rimraf.sync(pkg) } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/add-remote-git.js�����������������������������������������������������������0000644�0000000�0000000�00000004635�12631326456�016405� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fs = require('fs') var resolve = require('path').resolve var osenv = require('osenv') var mkdirp = require('mkdirp') var rimraf = require('rimraf') var test = require('tap').test var npm = require('../../lib/npm.js') var common = require('../common-tap.js') var pkg = resolve(__dirname, 'add-remote-git') var repo = resolve(__dirname, 'add-remote-git-repo') var daemon var daemonPID var git var pjParent = JSON.stringify({ name: 'parent', version: '1.2.3', dependencies: { child: 'git://localhost:1234/child.git' } }, null, 2) + '\n' var pjChild = JSON.stringify({ name: 'child', version: '1.0.3' }, null, 2) + '\n' test('setup', function (t) { bootstrap() setup(function (er, r) { t.ifError(er, 'git started up successfully') if (!er) { daemon = r[r.length - 2] daemonPID = r[r.length - 1] } t.end() }) }) test('install from repo', function (t) { process.chdir(pkg) npm.commands.install('.', [], function (er) { t.ifError(er, 'npm installed via git') t.end() }) }) test('clean', function (t) { daemon.on('close', function () { cleanup() t.end() }) process.kill(daemonPID) }) function bootstrap () { mkdirp.sync(pkg) fs.writeFileSync(resolve(pkg, 'package.json'), pjParent) } function setup (cb) { mkdirp.sync(repo) fs.writeFileSync(resolve(repo, 'package.json'), pjChild) npm.load({ registry: common.registry, loglevel: 'silent' }, function () { git = require('../../lib/utils/git.js') function startDaemon (cb) { // start git server var d = 
git.spawn( [ 'daemon', '--verbose', '--listen=localhost', '--export-all', '--base-path=.', '--port=1234' ], { cwd: pkg, env: process.env, stdio: ['pipe', 'pipe', 'pipe'] } ) d.stderr.on('data', childFinder) function childFinder (c) { var cpid = c.toString().match(/^\[(\d+)\]/) if (cpid[1]) { this.removeListener('data', childFinder) cb(null, [d, cpid[1]]) } } } common.makeGitRepo({ path: repo, commands: [ git.chainableExec( ['clone', '--bare', repo, 'child.git'], { cwd: pkg, env: process.env } ), startDaemon ] }, cb) }) } function cleanup () { process.chdir(osenv.tmpdir()) rimraf.sync(repo) rimraf.sync(pkg) } ���������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/adduser-always-auth.js������������������������������������������������������0000644�0000000�0000000�00000007456�12631326456�017473� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fs = require('fs') var path = require('path') var rimraf = require('rimraf') var mr = require('npm-registry-mock') var test = require('tap').test var common = require('../common-tap.js') var opts = {cwd: __dirname} var outfile = path.resolve(__dirname, '_npmrc') var responses = { 'Username': 'u\n', 'Password': 'p\n', 'Email': 'u@p.me\n' } function mocks (server) { server.filteringRequestBody(function (r) { if (r.match(/\"_id\":\"org\.couchdb\.user:u\"/)) { return 'auth' } }) server.put('/-/user/org.couchdb.user:u', 'auth') .reply(201, { username: 'u', password: 'p', email: 'u@p.me' }) } test('npm login', function (t) { mr({ port: common.port, plugin: mocks }, function (er, s) { var runner = common.npm( [ 'login', '--registry', common.registry, '--loglevel', 'silent', '--userconfig', outfile ], opts, function (err, code) { t.notOk(code, 'exited OK') t.notOk(err, 'no error output') var config = fs.readFileSync(outfile, 'utf8') t.like(config, /:always-auth=false/, 'always-auth is scoped and false (by default)') s.close() rimraf(outfile, function (err) { t.ifError(err, 'removed config file OK') t.end() }) }) var o = '' var e = '' var remaining = Object.keys(responses).length runner.stdout.on('data', function (chunk) { remaining-- o += chunk var label = chunk.toString('utf8').split(':')[0] runner.stdin.write(responses[label]) if (remaining === 0) runner.stdin.end() }) runner.stderr.on('data', function (chunk) { e += chunk }) }) }) test('npm login --always-auth', function (t) { mr({ port: common.port, plugin: mocks }, function (er, s) { var runner = common.npm( [ 'login', '--registry', common.registry, '--loglevel', 'silent', '--userconfig', outfile, '--always-auth' ], opts, function (err, code) { t.notOk(code, 'exited OK') t.notOk(err, 'no error output') var config = fs.readFileSync(outfile, 'utf8') t.like(config, /:always-auth=true/, 'always-auth is scoped and true') s.close() rimraf(outfile, function (err) { t.ifError(err, 'removed config file OK') t.end() }) }) var o = '' var e = '' var remaining = Object.keys(responses).length runner.stdout.on('data', function (chunk) { remaining-- o += chunk var label = chunk.toString('utf8').split(':')[0] runner.stdin.write(responses[label]) if (remaining === 0) runner.stdin.end() }) runner.stderr.on('data', function (chunk) { e += chunk }) }) }) 
test('npm login --no-always-auth', function (t) { mr({ port: common.port, plugin: mocks }, function (er, s) { var runner = common.npm( [ 'login', '--registry', common.registry, '--loglevel', 'silent', '--userconfig', outfile, '--no-always-auth' ], opts, function (err, code) { t.notOk(code, 'exited OK') t.notOk(err, 'no error output') var config = fs.readFileSync(outfile, 'utf8') t.like(config, /:always-auth=false/, 'always-auth is scoped and false') s.close() rimraf(outfile, function (err) { t.ifError(err, 'removed config file OK') t.end() }) }) var o = '' var e = '' var remaining = Object.keys(responses).length runner.stdout.on('data', function (chunk) { remaining-- o += chunk var label = chunk.toString('utf8').split(':')[0] runner.stdin.write(responses[label]) if (remaining === 0) runner.stdin.end() }) runner.stderr.on('data', function (chunk) { e += chunk }) }) }) test('cleanup', function (t) { rimraf.sync(outfile) t.pass('cleaned up') t.end() }) ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/adduser-legacy-auth.js������������������������������������������������������0000644�0000000�0000000�00000005047�12631326456�017431� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fs = require('fs') var path = require('path') var mkdirp = require('mkdirp') var rimraf = require('rimraf') var mr = require('npm-registry-mock') var test = require('tap').test var common = require('../common-tap.js') var opts = { cwd: __dirname } var pkg = path.resolve(__dirname, 'adduser-legacy-auth') var outfile = path.resolve(pkg, '_npmrc') var contents = '_auth=' + new Buffer('u:x').toString('base64') + '\n' + 'registry=https://nonexistent.lvh.me/registry\n' + 'email=u@p.me\n' var responses = { 'Username': 'u\n', 'Password': 'p\n', 'Email': 'u@p.me\n' } function mocks (server) { server.filteringRequestBody(function (r) { if (r.match(/"_id":"org\.couchdb\.user:u"/)) { return 'auth' } }) server.put('/-/user/org.couchdb.user:u', 'auth') .reply(409, { error: 'user exists' }) server.get('/-/user/org.couchdb.user:u?write=true') .reply(200, { _rev: '3-deadcafebabebeef' }) server.put( '/-/user/org.couchdb.user:u/-rev/3-deadcafebabebeef', 'auth', { authorization: 'Basic dTpw' } ).reply(201, { username: 'u', password: 'p', email: 'u@p.me' }) } test('setup', function (t) { mkdirp(pkg, function (er) { t.ifError(er, pkg + ' made successfully') fs.writeFile(outfile, contents, function (er) { t.ifError(er, 'wrote legacy config') t.end() }) }) }) test('npm login', function (t) { mr({ port: common.port, plugin: mocks }, function (er, s) { var runner = common.npm( [ 'login', '--registry', common.registry, '--loglevel', 'silent', '--userconfig', outfile ], opts, function (err, code, stdout, stderr) { t.ifError(err, 'npm ran without issue') t.notOk(code, 'exited OK') t.notOk(stderr, 'no error output') var config = fs.readFileSync(outfile, 'utf8') t.like(config, /:always-auth=false/, 'always-auth is scoped and false (by default)') s.close() rimraf(outfile, function (err) { t.ifError(err, 'removed config file OK') t.end() }) } ) var o = '' var e = '' 
    var remaining = Object.keys(responses).length
    runner.stdout.on('data', function (chunk) {
      remaining--
      o += chunk

      var label = chunk.toString('utf8').split(':')[0]
      runner.stdin.write(responses[label])

      if (remaining === 0) runner.stdin.end()
    })
    runner.stderr.on('data', function (chunk) { e += chunk })
  })
})

test('cleanup', function (t) {
  rimraf.sync(pkg)
  t.pass('cleaned up')
  t.end()
})

npm_3.5.2.orig/test/tap/bin.js

var path = require('path')
var test = require('tap').test
var rimraf = require('rimraf')
var common = require('../common-tap.js')
var opts = { cwd: __dirname }
var binDir = '../../node_modules/.bin'
var fixture = path.resolve(__dirname, binDir)

test('setup', function (t) {
  rimraf.sync(path.join(__dirname, 'node_modules'))
  t.end()
})

test('npm bin', function (t) {
  common.npm(['bin'], opts, function (err, code, stdout, stderr) {
    t.ifError(err, 'bin ran without issue')
    t.notOk(stderr, 'should have no stderr')
    t.equal(code, 0, 'exit ok')
    var res = path.resolve(stdout)
    t.equal(res, fixture + '\n')
    t.end()
  })
})

npm_3.5.2.orig/test/tap/bitbucket-https-url-with-creds-package.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'bitbucket-https-url-with-creds-package')

var json = {
  name: 'bitbucket-https-url-with-creds-package',
  version: '0.0.0',
  dependencies: {
    'private': 'git+https://user:pass@bitbucket.org/foo/private.git'
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

test('bitbucket-https-url-with-creds-package', function (t) {
  var cloneUrls = [
    ['https://user:pass@bitbucket.org/foo/private.git', 'Bitbucket URLs with passwords try only that.']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error())
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install([], function (er) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/bitbucket-https-url-with-creds.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'bitbucket-https-url-with-creds')

var json = {
  name: 'bitbucket-https-url-with-creds',
  version: '0.0.0'
}

test('setup', function (t) {
  setup()
  t.end()
})

test('bitbucket-https-url-with-creds', function (t) {
  var cloneUrls = [
    ['https://user:pass@bitbucket.org/foo/private.git', 'Bitbucket URLs with passwords try only that.']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error('execFile mock fails on purpose'))
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install(['git+https://user:pass@bitbucket.org/foo/private.git'], function (er) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/bitbucket-shortcut-package.js
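// Companion to the two bitbucket-https-url-with-creds tests above: the mocked
// child_process.execFile below records each `git clone` attempt, pinning the
// fallback order for `bitbucket:user/repo` shortcuts (HTTPS first, then SSH).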
'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'bitbucket-shortcut-package')

var json = {
  name: 'bitbucket-shortcut-package',
  version: '0.0.0',
  dependencies: {
    'private': 'bitbucket:foo/private'
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

test('bitbucket-shortcut', function (t) {
  var cloneUrls = [
    ['https://bitbucket.org/foo/private.git', 'Bitbucket shortcuts try HTTPS URLs first'],
    ['git@bitbucket.org:foo/private.git', 'Bitbucket shortcuts try SSH second']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error())
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install([], function (er) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/bitbucket-shortcut.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'bitbucket-shortcut')

var json = {
  name: 'bitbucket-shortcut',
  version: '0.0.0'
}

test('setup', function (t) {
  setup()
  t.end()
})

test('bitbucket-shortcut', function (t) {
  var cloneUrls = [
    ['https://bitbucket.org/foo/private.git', 'Bitbucket shortcuts try HTTPS URLs first'],
    ['git@bitbucket.org:foo/private.git', 'Bitbucket shortcuts try SSH second']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error('execFile mock fails on purpose'))
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install(['bitbucket:foo/private'], function (er) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/bugs.js

if (process.platform === 'win32') {
  console.error('skipping test, because windows and shebangs')
  process.exit(0)
}

var common = require('../common-tap.js')
var mr = require('npm-registry-mock')

var test = require('tap').test
var rimraf = require('rimraf')
var fs = require('fs')
var path = require('path')
var join = path.join

var outFile = path.join(__dirname, '/_output')

var opts = { cwd: __dirname }

test('setup', function (t) {
  var s = '#!/usr/bin/env bash\n' +
          'echo "$@" > ' + JSON.stringify(__dirname) + '/_output\n'
  fs.writeFileSync(join(__dirname, '/_script.sh'), s, 'ascii')
  fs.chmodSync(join(__dirname, '/_script.sh'), '0755')
  t.pass('made script')
  t.end()
})

test('npm bugs underscore', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm(
      [
        'bugs', 'underscore',
        '--registry=' + common.registry,
        '--loglevel=silent',
        '--browser=' + join(__dirname, '/_script.sh')
      ],
      opts,
      function (err, code, stdout, stderr) {
        t.ifError(err, 'bugs ran without issue')
        t.notOk(stderr, 'should have no stderr')
        t.equal(code, 0, 'exit ok')
        var res = fs.readFileSync(outFile, 'ascii')
        s.close()
        t.equal(res, 'https://github.com/jashkenas/underscore/issues\n')
        rimraf.sync(outFile)
        t.end()
      }
    )
  })
})
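// The remaining cases all follow the same shape as 'npm bugs underscore'
// above: point --browser at the _script.sh shim written in setup, then assert
// on the URL the shim captured in _output.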
test('npm bugs optimist - github (https://)', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm(
      [
        'bugs', 'optimist',
        '--registry=' + common.registry,
        '--loglevel=silent',
        '--browser=' + join(__dirname, '/_script.sh')
      ],
      opts,
      function (err, code, stdout, stderr) {
        t.ifError(err, 'bugs ran without issue')
        t.notOk(stderr, 'should have no stderr')
        t.equal(code, 0, 'exit ok')
        var res = fs.readFileSync(outFile, 'ascii')
        s.close()
        t.equal(res, 'https://github.com/substack/node-optimist/issues\n')
        rimraf.sync(outFile)
        t.end()
      }
    )
  })
})

test('npm bugs npm-test-peer-deps - no repo', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm(
      [
        'bugs', 'npm-test-peer-deps',
        '--registry=' + common.registry,
        '--loglevel=silent',
        '--browser=' + join(__dirname, '/_script.sh')
      ],
      opts,
      function (err, code, stdout, stderr) {
        t.ifError(err, 'bugs ran without issue')
        t.notOk(stderr, 'should have no stderr')
        t.equal(code, 0, 'exit ok')
        var res = fs.readFileSync(outFile, 'ascii')
        s.close()
        t.equal(res, 'https://www.npmjs.org/package/npm-test-peer-deps\n')
        rimraf.sync(outFile)
        t.end()
      }
    )
  })
})

test('npm bugs test-repo-url-http - non-github (http://)', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm(
      [
        'bugs', 'test-repo-url-http',
        '--registry=' + common.registry,
        '--loglevel=silent',
        '--browser=' + join(__dirname, '/_script.sh')
      ],
      opts,
      function (err, code, stdout, stderr) {
        t.ifError(err, 'bugs ran without issue')
        t.notOk(stderr, 'should have no stderr')
        t.equal(code, 0, 'exit ok')
        var res = fs.readFileSync(outFile, 'ascii')
        s.close()
        t.equal(res, 'https://www.npmjs.org/package/test-repo-url-http\n')
        rimraf.sync(outFile)
        t.end()
      }
    )
  })
})

test('npm bugs test-repo-url-https - gitlab (https://)', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm(
      [
        'bugs', 'test-repo-url-https',
        '--registry=' + common.registry,
        '--loglevel=silent',
        '--browser=' + join(__dirname, '/_script.sh')
      ],
      opts,
      function (err, code, stdout, stderr) {
        t.ifError(err, 'bugs ran without issue')
        t.notOk(stderr, 'should have no stderr')
        t.equal(code, 0, 'exit ok')
        var res = fs.readFileSync(outFile, 'ascii')
        s.close()
        t.equal(res, 'https://gitlab.com/evanlucas/test-repo-url-https/issues\n')
        rimraf.sync(outFile)
        t.end()
      }
    )
  })
})

test('npm bugs test-repo-url-ssh - gitlab (ssh://)', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm(
      [
        'bugs', 'test-repo-url-ssh',
        '--registry=' + common.registry,
        '--loglevel=silent',
        '--browser=' + join(__dirname, '/_script.sh')
      ],
      opts,
      function (err, code, stdout, stderr) {
        t.ifError(err, 'bugs ran without issue')
        t.notOk(stderr, 'should have no stderr')
        t.equal(code, 0, 'exit ok')
        var res = fs.readFileSync(outFile, 'ascii')
        s.close()
        t.equal(res, 'https://gitlab.com/evanlucas/test-repo-url-ssh/issues\n')
        rimraf.sync(outFile)
        t.end()
      }
    )
  })
})

test('cleanup', function (t) {
  fs.unlinkSync(join(__dirname, '/_script.sh'))
  t.pass('cleaned up')
  t.end()
})

npm_3.5.2.orig/test/tap/build-already-built.js

// if "npm rebuild" is run with bundled dependencies,
// message "already built" should not be error
var test = require('tap').test
var path = require('path')
var osenv = require('osenv')
var rimraf = require('rimraf')
var npmlog = require('npmlog')
var mkdirp = require('mkdirp')
var requireInject = require('require-inject')

var npm = require('../../lib/npm.js')

var PKG_DIR = path.resolve(__dirname, 'build-already-built')
var fakePkg = 'foo'

test('setup', function (t) {
  cleanup()
  t.end()
})
test("issue #6735 build 'already built' message", function (t) {
  npm.load({ loglevel: 'warn' }, function () {
    // capture log messages with level
    var log = ''
    npmlog.on('log', function (chunk) {
      log += chunk.level + ' ' + chunk.message + '\n'
    })

    mkdirp.sync(fakePkg)
    var folder = path.resolve(fakePkg)

    var global = npm.config.get('global')

    var build = requireInject('../../lib/build', {})

    t.test('pin previous behavior', function (t) {
      build([fakePkg], global, false, false, function (err) {
        t.ok(err, 'build failed as expected')
        t.similar(err.message, /package.json/, 'missing package.json as expected')
        t.notSimilar(log, /already built/, 'no already built message written')
        t.end()
      })
    })

    t.test('simulate rebuild of bundledDependency', function (t) {
      log = ''

      build._didBuild[folder] = true

      build([fakePkg], global, false, false, function (err) {
        t.ok(err, 'build failed as expected')
        t.similar(err.message, /package.json/, 'missing package.json as expected')
        t.similar(log, /already built/, 'already built message written')
        t.notSimilar(log, /ERR! already built/, 'already built message written is not error')
        t.similar(log, /info already built/, 'already built message written is info')
        t.end()
      })
    })

    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(PKG_DIR)
}

npm_3.5.2.orig/test/tap/builtin-config.js

var fs = require('fs')

if (process.argv[2] === 'write-builtin') {
  var pid = process.argv[3]
  fs.writeFileSync('npmrc', 'foo=bar\npid=' + pid + '\n')
  process.exit(0)
}

var common = require('../common-tap.js')
var path = require('path')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var folder = path.resolve(__dirname, 'builtin-config')
var test = require('tap').test
var npm = path.resolve(__dirname, '../..')
var spawn = require('child_process').spawn
var node = process.execPath

test('setup', function (t) {
  t.plan(1)
  rimraf.sync(folder)
  mkdirp.sync(folder + '/first')
  mkdirp.sync(folder + '/second')
  mkdirp.sync(folder + '/cache')
  mkdirp.sync(folder + '/tmp')
  t.pass('finished setup')
  t.end()
})

test('install npm into first folder', function (t) {
  t.plan(1)
  var args = ['install', npm, '-g',
              '--prefix=' + folder + '/first',
              '--ignore-scripts',
              '--cache=' + folder + '/cache',
              '--loglevel=silent',
              '--tmp=' + folder + '/tmp']
  common.npm(args, {stdio: 'inherit'}, function (er, code) {
    if (er) throw er
    t.equal(code, 0)
    t.end()
  })
})
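// The next step re-invokes this very file with the 'write-builtin' argument
// (handled by the process.argv check at the top) via `npm explore npm -g`, so
// a builtin npmrc lands inside the npm that was just installed into 'first'.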
test('write npmrc file', function (t) {
  t.plan(1)
  common.npm(['explore', 'npm', '-g',
              '--prefix=' + folder + '/first',
              '--cache=' + folder + '/cache',
              '--tmp=' + folder + '/tmp',
              '--', node, __filename,
              'write-builtin', process.pid
             ],
             {'stdio': 'inherit'},
             function (er, code) {
               if (er) throw er
               t.equal(code, 0)
               t.end()
             })
})

test('use first npm to install second npm', function (t) {
  t.plan(3)
  // get the root location
  common.npm(
    [
      'root', '-g',
      '--prefix=' + folder + '/first',
      '--cache=' + folder + '/cache',
      '--tmp=' + folder + '/tmp'
    ],
    {},
    function (er, code, so) {
      if (er) throw er
      t.equal(code, 0)
      var root = so.trim()
      t.ok(fs.statSync(root).isDirectory())

      var bin = path.resolve(root, 'npm/bin/npm-cli.js')
      spawn(
        node,
        [
          bin,
          'install', npm,
          '-g',
          '--prefix=' + folder + '/second',
          '--cache=' + folder + '/cache',
          '--tmp=' + folder + '/tmp'
        ]
      )
      .on('error', function (er) { throw er })
      .on('close', function (code) {
        t.equal(code, 0, 'code is zero')
        t.end()
      })
    }
  )
})

test('verify that the builtin config matches', function (t) {
  t.plan(3)
  common.npm(
    [
      'root', '-g',
      '--prefix=' + folder + '/first',
      '--cache=' + folder + '/cache',
      '--tmp=' + folder + '/tmp'
    ],
    {},
    function (er, code, so) {
      if (er) throw er
      t.equal(code, 0)
      var firstRoot = so.trim()
      common.npm(
        [
          'root', '-g',
          '--prefix=' + folder + '/second',
          '--cache=' + folder + '/cache',
          '--tmp=' + folder + '/tmp'
        ],
        {},
        function (er, code, so) {
          if (er) throw er
          t.equal(code, 0)
          var secondRoot = so.trim()
          var firstRc = path.resolve(firstRoot, 'npm', 'npmrc')
          var secondRc = path.resolve(secondRoot, 'npm', 'npmrc')
          var firstData = fs.readFileSync(firstRc, 'utf8')
          var secondData = fs.readFileSync(secondRc, 'utf8')
          t.equal(firstData, secondData)
          t.end()
        }
      )
    }
  )
})

test('clean', function (t) {
  rimraf.sync(folder)
  t.end()
})

npm_3.5.2.orig/test/tap/bundled-dependencies-nonarray.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var dir = path.resolve(__dirname, 'bundleddependencies')
var pkg = path.resolve(dir, 'pkg-with-bundled')
var dep = path.resolve(dir, 'a-bundled-dep')

var pj = JSON.stringify({
  name: 'pkg-with-bundled',
  version: '1.0.0',
  dependencies: {
    'a-bundled-dep': 'file:../a-bundled-dep'
  },
  bundledDependencies: {
    'a-bundled-dep': 'file:../a-bundled-dep'
  }
}, null, 2) + '\n'

var pjDep = JSON.stringify({
  name: 'a-bundled-dep',
  version: '2.0.0'
}, null, 2) + '\n'

test('setup', function (t) {
  bootstrap()
  t.end()
})

test('errors on non-array bundleddependencies', function (t) {
  t.plan(6)
  common.npm(['install'], { cwd: pkg }, function (err, code, stdout, stderr) {
    t.ifError(err, 'npm install ran without issue')
    t.is(code, 0, 'exited with a non-error code')
    t.is(stderr, '', 'no error output')

    common.npm(['install', './pkg-with-bundled'], { cwd: dir },
      function (err, code, stdout, stderr) {
        t.ifError(err, 'npm install ran without issue')
        t.notEqual(code, 0, 'exited with an error code')
        t.like(stderr, /be an array/, 'nice error output')
      }
    )
  })
})
test('cleanup', function (t) {
  cleanup()
  t.end()
})

function bootstrap () {
  cleanup()
  mkdirp.sync(dir)
  mkdirp.sync(path.join(dir, 'node_modules'))

  mkdirp.sync(pkg)
  fs.writeFileSync(path.resolve(pkg, 'package.json'), pj)

  mkdirp.sync(dep)
  fs.writeFileSync(path.resolve(dep, 'package.json'), pjDep)
}

function cleanup () {
  rimraf.sync(dir)
}

npm_3.5.2.orig/test/tap/cache-add-localdir-fallback.js

var path = require('path')
var test = require('tap').test
var npm = require('../../lib/npm.js')
var requireInject = require('require-inject')

var realizePackageSpecifier = requireInject('realize-package-specifier', {
  'fs': {
    stat: function (file, cb) {
      process.nextTick(function () {
        switch (file) {
          case path.resolve('named'):
            cb(new Error('ENOENT'))
            break
          case path.resolve('file.tgz'):
            cb(null, { isDirectory: function () { return false } })
            break
          case path.resolve('dir-no-package'):
            cb(null, { isDirectory: function () { return true } })
            break
          case path.resolve('dir-no-package/package.json'):
            cb(new Error('ENOENT'))
            break
          case path.resolve('dir-with-package'):
            cb(null, { isDirectory: function () { return true } })
            break
          case path.resolve('dir-with-package/package.json'):
            cb(null, {})
            break
          case path.resolve(__dirname, 'dir-with-package'):
            cb(null, { isDirectory: function () { return true } })
            break
          case path.join(__dirname, 'dir-with-package', 'package.json'):
            cb(null, {})
            break
          case path.resolve(__dirname, 'file.tgz'):
            cb(null, { isDirectory: function () { return false } })
            break
          default:
            throw new Error('Unknown test file passed to stat: ' + file)
        }
      })
    }
  }
})

npm.load({ loglevel: 'silent' }, function () {
  var cache = requireInject('../../lib/cache.js', {
    'realize-package-specifier': realizePackageSpecifier,
    '../../lib/cache/add-named.js': function addNamed (name, version, data, cb) {
      cb(null, 'addNamed')
    },
    '../../lib/cache/add-local.js': function addLocal (name, data, cb) {
      cb(null, 'addLocal')
    }
  })

  test('npm install localdir fallback', function (t) {
    t.plan(12)
    cache.add('named', null, null, false, function (er, which) {
      t.ifError(er, 'named was cached')
      t.is(which, 'addNamed', 'registry package name')
    })
    cache.add('file.tgz', null, null, false, function (er, which) {
      t.ifError(er, 'file.tgz was cached')
      t.is(which, 'addLocal', 'local file')
    })
    cache.add('dir-no-package', null, null, false, function (er, which) {
      t.ifError(er, 'local directory was cached')
      t.is(which, 'addNamed', 'local directory w/o package.json')
    })
    cache.add('dir-with-package', null, null, false, function (er, which) {
      t.ifError(er, 'local directory with package was cached')
      t.is(which, 'addLocal', 'local directory with package.json')
    })
    cache.add('file:./dir-with-package', null, __dirname, false, function (er, which) {
      t.ifError(er, 'local directory (as URI) with package was cached')
      t.is(which, 'addLocal', 'file: URI to local directory with package.json')
    })
    cache.add('file:./file.tgz', null, __dirname, false, function (er, which) {
      t.ifError(er, 'local file (as URI) with package was cached')
      t.is(which, 'addLocal', 'file: URI to local file with package.json')
    })
  })
})

npm_3.5.2.orig/test/tap/cache-add-unpublished.js

var common = require('../common-tap.js')
var test = require('tap').test

test('cache add', function (t) {
  setup(function (er, s) {
    if (er) {
      throw er
    }
    common.npm(
      [
        'cache',
        'add',
        'superfoo',
        '--registry=http://localhost:1337/'
      ],
      {},
      function (er, c, so, se) {
        if (er) throw er
        t.ok(c, 'got non-zero exit code')
        t.equal(so, '', 'nothing printed to stdout')
        t.similar(se, /404 Not Found: superfoo/, 'got expected error')
        s.close()
        t.end()
      }
    )
  })
})

function setup (cb) {
  var s = require('http').createServer(function (req, res) {
    res.statusCode = 404
    res.end('{"error":"not_found"}\n')
  })
  s.listen(1337, function () {
    cb(null, s)
  })
}

npm_3.5.2.orig/test/tap/cache-shasum-fork.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

// Install from a tarball that thinks it is underscore@1.5.1
// (but is actually a fork)
var forkPath = path.resolve(
  __dirname, '..', 'fixtures', 'forked-underscore-1.5.1.tgz'
)
var pkg = path.resolve(__dirname, 'cache-shasum-fork')
var cache = path.join(pkg, 'cache')
var server

var installed_output = path.join(__dirname, 'cache-shasum-fork') +
                       '\n`-- underscore@1.5.1 \n\n'

test('setup', function (t) {
  setup()
  t.comment('test for https://github.com/npm/npm/issues/3265')
  mr({ port: common.port }, function (er, s) {
    server = s
    t.end()
  })
})
test('npm cache - install from fork', function (t) {
  setup()
  common.npm(
    [
      '--loglevel', 'silent',
      '--registry', common.registry,
      'install', forkPath
    ],
    { cwd: pkg, env: { npm_config_cache: cache } },
    function (err, code, stdout, stderr) {
      t.ifErr(err, 'install finished without error')
      t.notOk(stderr, 'Should not get data on stderr: ' + stderr)
      t.equal(code, 0, 'install finished successfully')

      t.equal(stdout, installed_output)
      var index = fs.readFileSync(
        path.join(pkg, 'node_modules', 'underscore', 'index.js'),
        'utf8'
      )
      t.equal(index, 'console.log("This is the fork");\n\n')
      t.end()
    }
  )
})

// Now install the real 1.5.1.
test('npm cache - install from origin', function (t) {
  setup()
  common.npm(
    [
      '--loglevel', 'silent',
      '--registry', common.registry,
      'install', 'underscore'
    ],
    { cwd: pkg, env: { npm_config_cache: cache } },
    function (err, code, stdout, stderr) {
      t.ifErr(err, 'install finished without error')
      t.equal(code, 0, 'install finished successfully')
      t.notOk(stderr, 'Should not get data on stderr: ' + stderr)
      t.equal(stdout, installed_output)
      var index = fs.readFileSync(
        path.join(pkg, 'node_modules', 'underscore', 'index.js'),
        'utf8'
      )
      t.equal(index, 'module.exports = require(\'./underscore\');\n')
      t.end()
    }
  )
})

test('cleanup', function (t) {
  server.close()
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

function setup () {
  mkdirp.sync(cache)
  mkdirp.sync(path.join(pkg, 'node_modules'))
  process.chdir(pkg)
}

npm_3.5.2.orig/test/tap/cache-shasum.js

var npm = require.resolve('../../')
var test = require('tap').test
var path = require('path')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var common = require('../common-tap.js')
var cache = path.resolve(__dirname, 'cache-shasum')
var spawn = require('child_process').spawn
var sha = require('sha')
var server

test('mock reg', function (t) {
  rimraf.sync(cache)
  mkdirp.sync(cache)
  mr({ port: common.port }, function (er, s) {
    server = s
    t.pass('ok')
    t.end()
  })
})

test('npm cache add request', function (t) {
  var c = spawn(process.execPath, [
    npm, 'cache', 'add', 'request@2.27.0',
    '--cache=' + cache,
    '--registry=' + common.registry,
    '--loglevel=quiet'
  ])
  c.stderr.pipe(process.stderr)

  c.stdout.on('data', function (d) {
    t.fail('Should not get data on stdout: ' + d)
  })

  c.on('close', function (code) {
    t.notOk(code, 'exit ok')
    t.end()
  })
})

test('compare', function (t) {
  var d = path.resolve(__dirname, 'cache-shasum/request')
  var p = path.resolve(d, '2.27.0/package.tgz')
  var r = require('./cache-shasum/localhost_1337/request/.cache.json')
  var rshasum = r.versions['2.27.0'].dist.shasum
  sha.get(p, function (er, pshasum) {
    if (er) throw er
    t.equal(pshasum, rshasum)
    t.end()
  })
})
test('cleanup', function (t) {
  server.close()
  rimraf.sync(cache)
  t.end()
})

npm_3.5.2.orig/test/tap/check-cpu-reqs.js

'use strict'
var path = require('path')
var fs = require('fs')

var test = require('tap').test
var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var common = require('../common-tap.js')

var base = path.join(__dirname, path.basename(__filename, '.js'))
var installFrom = path.join(base, 'from')
var installIn = path.join(base, 'in')

var json = {
  name: 'check-cpu-reqs',
  version: '0.0.1',
  description: 'fixture',
  cpu: ['fake-cpu']
}

test('setup', function (t) {
  setup()
  t.end()
})

var INSTALL_OPTS = ['--loglevel', 'silly']
var EXEC_OPTS = {cwd: installIn}

test('install bad cpu', function (t) {
  common.npm(['install', installFrom].concat(INSTALL_OPTS), EXEC_OPTS, function (err, code) {
    t.ifError(err, 'npm ran without issue')
    t.is(code, 1, 'npm install refused to install a package with an unsatisfiable cpu requirement')
    t.end()
  })
})
test('force install bad cpu', function (t) {
  common.npm(['install', '--force', installFrom].concat(INSTALL_OPTS), EXEC_OPTS, function (err, code) {
    t.ifError(err, 'npm ran without issue')
    t.is(code, 0, 'npm install with --force overrode the cpu check')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(base)
}

function setup () {
  cleanup()
  mkdirp.sync(path.resolve(installFrom, 'node_modules'))
  fs.writeFileSync(
    path.join(installFrom, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  mkdirp.sync(path.resolve(installIn, 'node_modules'))
  process.chdir(base)
}

npm_3.5.2.orig/test/tap/check-engine-reqs.js
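// Same fixture pattern as check-cpu-reqs.js above and check-os-reqs.js below:
// the package.json declares a requirement that can never be satisfied -- here
// engines.node -- so a plain install must fail while --force must succeed.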
'use strict'
var path = require('path')
var fs = require('fs')

var test = require('tap').test
var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var common = require('../common-tap.js')

var base = path.join(__dirname, path.basename(__filename, '.js'))
var installFrom = path.join(base, 'from')
var installIn = path.join(base, 'in')

var json = {
  name: 'check-engine-reqs',
  version: '0.0.1',
  description: 'fixture',
  engines: {
    node: '1.0.0-not-a-real-version'
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

var INSTALL_OPTS = ['--loglevel', 'silly']
var EXEC_OPTS = {cwd: installIn}

test('install bad engine', function (t) {
  common.npm(['install', '--engine-strict', installFrom].concat(INSTALL_OPTS), EXEC_OPTS, function (err, code) {
    t.ifError(err, 'npm ran without issue')
    t.is(code, 1, 'npm install refused to install a package with an unsatisfiable engine requirement')
    t.end()
  })
})
test('force install bad engine', function (t) {
  common.npm(['install', '--engine-strict', '--force', installFrom].concat(INSTALL_OPTS), EXEC_OPTS, function (err, code) {
    t.ifError(err, 'npm ran without issue')
    t.is(code, 0, 'npm install with --force overrode the engine check')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(base)
}

function setup () {
  cleanup()
  mkdirp.sync(path.resolve(installFrom, 'node_modules'))
  fs.writeFileSync(
    path.join(installFrom, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  mkdirp.sync(path.resolve(installIn, 'node_modules'))
  process.chdir(base)
}

npm_3.5.2.orig/test/tap/check-install-self.js

'use strict'
var path = require('path')
var fs = require('fs')

var test = require('tap').test
var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var common = require('../common-tap.js')

var base = path.join(__dirname, path.basename(__filename, '.js'))
var installFrom = path.join(base, 'from')
var installIn = path.join(base, 'in')

var json = {
  name: 'check-install-self',
  version: '0.0.1',
  description: 'fixture'
}

test('setup', function (t) {
  setup()
  t.end()
})

var INSTALL_OPTS = ['--loglevel', 'silent']
var EXEC_OPTS = {cwd: installIn}

test('install self', function (t) {
  common.npm(['install', installFrom].concat(INSTALL_OPTS), EXEC_OPTS, function (err, code) {
    t.ifError(err, 'npm ran without issue')
    t.is(code, 1, 'npm install refused to install a package in itself')
    t.end()
  })
})
test('force install self', function (t) {
  common.npm(['install', '--force', installFrom].concat(INSTALL_OPTS), EXEC_OPTS, function (err, code) {
    t.ifError(err, 'npm ran without issue')
    t.is(code, 0, 'npm install happily installed a package in itself with --force')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(base)
}

function setup () {
  cleanup()
  mkdirp.sync(path.resolve(installFrom, 'node_modules'))
  fs.writeFileSync(
    path.join(installFrom, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  mkdirp.sync(path.resolve(installIn, 'node_modules'))
  fs.writeFileSync(
    path.join(installIn, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(base)
}

npm_3.5.2.orig/test/tap/check-os-reqs.js

'use strict'
var path = require('path')
var fs = require('fs')

var test = require('tap').test
var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var common = require('../common-tap.js')

var base = path.join(__dirname, path.basename(__filename, '.js'))
var installFrom = path.join(base, 'from')
var installIn = path.join(base, 'in')

var json = {
  name: 'check-os-reqs',
  version: '0.0.1',
  description: 'fixture',
  os: ['fake-os']
}

test('setup', function (t) {
  setup()
  t.end()
})

var INSTALL_OPTS = ['--loglevel', 'silly']
var EXEC_OPTS = {cwd: installIn}

test('install bad os', function (t) {
  common.npm(['install', installFrom].concat(INSTALL_OPTS), EXEC_OPTS, function (err, code) {
    t.ifError(err, 'npm ran without issue')
    t.is(code, 1, 'npm install refused to install a package with an unsatisfiable os requirement')
    t.end()
  })
})
test('force install bad os', function (t) {
  common.npm(['install', '--force', installFrom].concat(INSTALL_OPTS), EXEC_OPTS, function (err, code) {
    t.ifError(err, 'npm ran without issue')
    t.is(code, 0, 'npm install with --force overrode the os check')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(base)
}

function setup () {
  cleanup()
  mkdirp.sync(path.resolve(installFrom, 'node_modules'))
  fs.writeFileSync(
    path.join(installFrom, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  mkdirp.sync(path.resolve(installIn, 'node_modules'))
  process.chdir(base)
}

npm_3.5.2.orig/test/tap/check-permissions.js

'use strict'
var fs = require('fs')
var path = require('path')
var test = require('tap').test
var rimraf = require('rimraf')
var writable = require('../../lib/install/writable.js').fsAccessImplementation
var writableFallback = require('../../lib/install/writable.js').fsOpenImplementation
var exists = require('../../lib/install/exists.js').fsAccessImplementation
var existsFallback = require('../../lib/install/exists.js').fsStatImplementation

var testBase = path.resolve(__dirname, 'check-permissions')
var existingDir = path.resolve(testBase, 'exists')
var nonExistingDir = path.resolve(testBase, 'does-not-exist')
var writableDir = path.resolve(testBase, 'writable')
var nonWritableDir = path.resolve(testBase, 'non-writable')

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('exists', function (t) {
  t.plan(2)
  // fs.access first introduced in node 0.12 / io.js
  if (fs.access) {
    existsTests(t, exists)
  } else {
    t.pass('# skip fs.access not available in this version')
    t.pass('# skip fs.access not available in this version')
  }
})

test('exists-fallback', function (t) {
  t.plan(2)
  existsTests(t, existsFallback)
})

test('writable', function (t) {
  t.plan(2)
  // fs.access first introduced in node 0.12 / io.js
  if (fs.access) {
    writableTests(t, writable)
  } else {
    t.pass('# skip fs.access not available in this version')
    t.pass('# skip fs.access not available in this version')
  }
})

test('writable-fallback', function (t) {
  t.plan(2)
  writableTests(t, writableFallback)
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  fs.mkdirSync(testBase)
  fs.mkdirSync(existingDir)
  fs.mkdirSync(writableDir)
  fs.mkdirSync(nonWritableDir)
  fs.chmodSync(nonWritableDir, '555')
}

function existsTests (t, exists) {
  exists(existingDir, function (er) {
    t.error(er, 'exists dir is exists')
  })
  exists(nonExistingDir, function (er) {
    t.ok(er, 'non-existing dir resulted in an error')
  })
}

function writableTests (t, writable) {
  writable(writableDir, function (er) {
    t.error(er, 'writable dir is writable')
  })
  writable(nonWritableDir, function (er) {
    t.ok(er, 'non-writable dir resulted in an error')
  })
}

function cleanup () {
  rimraf.sync(testBase)
}

npm_3.5.2.orig/test/tap/circular-dep.js

var fs = require('graceful-fs')
var path = require('path')
var existsSync = fs.existsSync || path.existsSync

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var server

var pkg = path.resolve(__dirname, 'circular-dep')
var minimist = path.join(pkg, 'minimist')

var EXEC_OPTS = {
  cwd: path.join(pkg, 'minimist/node_modules'),
  npm_config_cache: path.join(pkg, 'cache')
}

var json = {
  name: 'minimist',
  version: '0.0.5',
  dependencies: {
    optimist: '0.6.0'
  }
}
test('setup', function (t) {
  t.comment('test for https://github.com/npm/npm/issues/4312')
  setup(function () {
    t.end()
  })
})

test('installing a package that depends on the current package', function (t) {
  common.npm(
    [
      '--registry', common.registry,
      '--loglevel', 'silent',
      'install', 'optimist'
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'npm ran without issue')
      t.notOk(code, 'npm ran without raising an error code')
      t.notOk(stderr, 'no error output')

      common.npm(
        [
          '--registry', common.registry,
          '--loglevel', 'silent',
          'dedupe'
        ],
        EXEC_OPTS,
        function (err, code, stdout, stderr) {
          t.ifError(err, 'npm ran without issue')
          t.notOk(code, 'npm ran without raising an error code')
          t.notOk(stderr, 'no error output')

          t.ok(existsSync(path.resolve(
            minimist, 'node_modules', 'optimist'
          )), 'optimist in place')
          t.ok(existsSync(path.resolve(
            minimist, 'node_modules', 'minimist'
          )), 'circular dependency uncircled')
          t.end()
        }
      )
    }
  )
})

test('cleanup', function (t) {
  cleanup()
  server.close()
  t.end()
})

function setup (cb) {
  cleanup()
  mkdirp.sync(minimist)
  fs.writeFileSync(
    path.join(minimist, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(path.resolve(pkg, 'minimist'))
  fs.mkdirSync(path.resolve(pkg, 'minimist/node_modules'))
  mr({ port: common.port }, function (er, s) {
    server = s
    cb()
  })
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/config-basic.js

var test = require('tap').test
var npmconf = require('../../lib/config/core.js')
var common = require('./00-config-setup.js')
var path = require('path')

var projectData = {
  'save-prefix': '~',
  'proprietary-attribs': false
}

var ucData = common.ucData

var envData = common.envData
var envDataFix = common.envDataFix

var gcData = { 'package-config:foo': 'boo' }

var biData = {}

var cli = { foo: 'bar', umask: parseInt('022', 8) }

var expectList = [
  cli,
  envDataFix,
  projectData,
  ucData,
  gcData,
  biData
]

var expectSources = {
  cli: { data: cli },
  env: {
    data: envDataFix,
    source: envData,
    prefix: ''
  },
  project: {
    path: path.resolve(__dirname, '..', '..', '.npmrc'),
    type: 'ini',
    data: projectData
  },
  user: {
    path: common.userconfig,
    type: 'ini',
    data: ucData
  },
  global: {
    path: common.globalconfig,
    type: 'ini',
    data: gcData
  },
  builtin: { data: biData }
}

test('no builtin', function (t) {
  npmconf.load(cli, function (er, conf) {
    if (er) throw er
    t.same(conf.list, expectList)
    t.same(conf.sources, expectSources)
    t.same(npmconf.rootConf.list, [])
    t.equal(npmconf.rootConf.root, npmconf.defs.defaults)
    t.equal(conf.root, npmconf.defs.defaults)
    t.equal(conf.get('umask'), parseInt('022', 8))
    t.equal(conf.get('heading'), 'npm')
    t.end()
  })
})
npm_3.5.2.orig/test/tap/config-builtin.js

var test = require('tap').test
var npmconf = require('../../lib/config/core.js')
var common = require('./00-config-setup.js')
var path = require('path')

var ucData = common.ucData

var envData = common.envData
var envDataFix = common.envDataFix

var gcData = { 'package-config:foo': 'boo' }

var biData = { 'builtin-config': true }

var cli = { foo: 'bar', heading: 'foo', 'git-tag-version': false }

var projectData = {
  'save-prefix': '~',
  'proprietary-attribs': false
}

var expectList = [
  cli,
  envDataFix,
  projectData,
  ucData,
  gcData,
  biData
]

var expectSources = {
  cli: { data: cli },
  env: {
    data: envDataFix,
    source: envData,
    prefix: ''
  },
  project: {
    path: path.resolve(__dirname, '..', '..', '.npmrc'),
    type: 'ini',
    data: projectData
  },
  user: {
    path: common.userconfig,
    type: 'ini',
    data: ucData
  },
  global: {
    path: common.globalconfig,
    type: 'ini',
    data: gcData
  },
  builtin: { data: biData }
}

test('with builtin', function (t) {
  npmconf.load(cli, common.builtin, function (er, conf) {
    if (er) throw er
    t.same(conf.list, expectList)
    t.same(conf.sources, expectSources)
    t.same(npmconf.rootConf.list, [])
    t.equal(npmconf.rootConf.root, npmconf.defs.defaults)
    t.equal(conf.root, npmconf.defs.defaults)
    t.equal(conf.get('heading'), 'foo')
    t.equal(conf.get('git-tag-version'), false)
    t.end()
  })
})

npm_3.5.2.orig/test/tap/config-certfile.js

require('./00-config-setup.js')

var path = require('path')
var fs = require('fs')
var test = require('tap').test
var npmconf = require('../../lib/config/core.js')

test('cafile loads as ca', function (t) {
  var cafile = path.join(__dirname, '..', 'fixtures', 'config', 'multi-ca')

  npmconf.load({cafile: cafile}, function (er, conf) {
    if (er) throw er

    t.same(conf.get('cafile'), cafile)
    t.same(conf.get('ca').join('\n'), fs.readFileSync(cafile, 'utf8').trim())
    t.end()
  })
})

npm_3.5.2.orig/test/tap/config-credentials.js
var test = require('tap').test

var npmconf = require('../../lib/config/core.js')
var common = require('./00-config-setup.js')

var URI = 'https://registry.lvh.me:8661/'

test('getting scope with no credentials set', function (t) {
  npmconf.load({}, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    var basic = conf.getCredentialsByURI(URI)
    t.equal(basic.scope, '//registry.lvh.me:8661/', 'nerfed URL extracted')

    t.end()
  })
})

test('trying to set credentials with no URI', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    t.throws(function () {
      conf.setCredentialsByURI()
    }, 'enforced missing URI')

    t.end()
  })
})

test('trying to clear credentials with no URI', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    t.throws(function () {
      conf.clearCredentialsByURI()
    }, 'enforced missing URI')

    t.end()
  })
})

test('set with missing credentials object', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    t.throws(function () {
      conf.setCredentialsByURI(URI)
    }, 'enforced missing credentials')

    t.end()
  })
})

test('set with empty credentials object', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    t.throws(function () {
      conf.setCredentialsByURI(URI, {})
    }, 'enforced missing credentials')

    t.end()
  })
})

test('set with token', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    t.doesNotThrow(function () {
      conf.setCredentialsByURI(URI, { token: 'simple-token' })
    }, 'needs only token')

    var expected = {
      scope: '//registry.lvh.me:8661/',
      token: 'simple-token',
      username: undefined,
      password: undefined,
      email: undefined,
      auth: undefined,
      alwaysAuth: undefined
    }

    t.same(conf.getCredentialsByURI(URI), expected, 'got bearer token and scope')

    t.end()
  })
})

test('clear with token', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    t.doesNotThrow(function () {
      conf.setCredentialsByURI(URI, { token: 'simple-token' })
    }, 'needs only token')

    t.doesNotThrow(function () {
      conf.clearCredentialsByURI(URI)
    }, 'needs only URI')

    t.notOk(conf.getCredentialsByURI(URI).token, 'token all gone')

    t.end()
  })
})
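// Illustrative note (not part of the original suite): credentials live under
// the "nerfed" registry key asserted as `scope` in these tests, so the token
// set above would be persisted in the user config as something like
//   //registry.lvh.me:8661/:_authToken=simple-token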
test('set with old-style credentials', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    var credentials = {
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net'
    }

    t.doesNotThrow(function () {
      conf.setCredentialsByURI(URI, credentials)
    }, 'requires all of username, password, and email')

    var expected = {
      scope: '//registry.lvh.me:8661/',
      token: undefined,
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net',
      auth: 'dXNlcm5hbWU6cGFzc3dvcmQ=',
      alwaysAuth: false
    }

    t.same(conf.getCredentialsByURI(URI), expected, 'got credentials')

    t.end()
  })
})

test('clear with old-style credentials', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    var credentials = {
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net'
    }

    t.doesNotThrow(function () {
      conf.setCredentialsByURI(URI, credentials)
    }, 'requires all of username, password, and email')

    t.doesNotThrow(function () {
      conf.clearCredentialsByURI(URI)
    }, 'clearing only required URI')

    t.notOk(conf.getCredentialsByURI(URI).username, 'username cleared')
    t.notOk(conf.getCredentialsByURI(URI).password, 'password cleared')

    t.end()
  })
})

test('get old-style credentials for default registry', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    var actual = conf.getCredentialsByURI(conf.get('registry'))
    var expected = {
      scope: '//registry.npmjs.org/',
      token: undefined,
      password: 'password',
      username: 'username',
      email: 'i@izs.me',
      auth: 'dXNlcm5hbWU6cGFzc3dvcmQ=',
      alwaysAuth: false
    }
    t.same(actual, expected)
    t.end()
  })
})

test('set with always-auth enabled', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    var credentials = {
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net',
      alwaysAuth: true
    }

    conf.setCredentialsByURI(URI, credentials)

    var expected = {
      scope: '//registry.lvh.me:8661/',
      token: undefined,
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net',
      auth: 'dXNlcm5hbWU6cGFzc3dvcmQ=',
      alwaysAuth: true
    }

    t.same(conf.getCredentialsByURI(URI), expected, 'got credentials')

    t.end()
  })
})

test('set with always-auth disabled', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')

    var credentials = {
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net',
      alwaysAuth: false
    }

    conf.setCredentialsByURI(URI, credentials)

    var expected = {
      scope: '//registry.lvh.me:8661/',
      token: undefined,
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net',
      auth: 'dXNlcm5hbWU6cGFzc3dvcmQ=',
      alwaysAuth: false
    }

    t.same(conf.getCredentialsByURI(URI), expected, 'got credentials')

    t.end()
  })
})

test('set with global always-auth enabled', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')
    var original = conf.get('always-auth')
    conf.set('always-auth', true)

    var credentials = {
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net'
    }

    conf.setCredentialsByURI(URI, credentials)

    var expected = {
      scope: '//registry.lvh.me:8661/',
      token: undefined,
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net',
      auth: 'dXNlcm5hbWU6cGFzc3dvcmQ=',
      alwaysAuth: true
    }

    t.same(conf.getCredentialsByURI(URI), expected, 'got credentials')
    conf.set('always-auth', original)

    t.end()
  })
})

test('set with global always-auth disabled', function (t) {
  npmconf.load(common.builtin, function (er, conf) {
    t.ifError(er, 'configuration loaded')
    var original = conf.get('always-auth')
    conf.set('always-auth', false)

    var credentials = {
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net'
    }

    conf.setCredentialsByURI(URI, credentials)

    var expected = {
      scope: '//registry.lvh.me:8661/',
      token: undefined,
      username: 'username',
      password: 'password',
      email: 'ogd@aoaioxxysz.net',
      auth: 'dXNlcm5hbWU6cGFzc3dvcmQ=',
      alwaysAuth: false
    }

    t.same(conf.getCredentialsByURI(URI), expected, 'got credentials')
    conf.set('always-auth', original)

    t.end()
  })
})

npm_3.5.2.orig/test/tap/config-edit.js
var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'npm-global-edit')

var editorSrc = function () {/*
#!/usr/bin/env node
var fs = require('fs')
if (fs.existsSync(process.argv[2])) {
  console.log('success')
} else {
  console.log('error')
  process.exit(1)
}
*/}.toString().split('\n').slice(1, -1).join('\n')
var editorPath = path.join(pkg, 'editor')

test('setup', function (t) {
  cleanup(function (er) {
    t.ifError(er, 'old directory removed')

    mkdirp(pkg, '0777', function (er) {
      fs.writeFileSync(editorPath, editorSrc)
      fs.chmodSync(editorPath, '0777')
      t.ifError(er, 'created package directory correctly')
      t.end()
    })
  })
})

test('saving configs', function (t) {
  var opts = {
    cwd: pkg,
    env: {
      PATH: process.env.PATH,
      EDITOR: editorPath
    }
  }
  common.npm(
    [
      'config',
      '--prefix', pkg,
      '--global',
      'edit'
    ],
    opts,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'command ran without issue')

      t.equal(stderr, '', 'got nothing on stderr')
      t.equal(code, 0, 'exit ok')
      t.equal(stdout, 'success\n', 'got success message')
      t.end()
    }
  )
})

test('cleanup', function (t) {
  cleanup(function (er) {
    t.ifError(er, 'test directory removed OK')
    t.end()
  })
})

function cleanup (cb) {
  rimraf(pkg, cb)
}

npm_3.5.2.orig/test/tap/config-malformed.js
var test = require('tap').test

var npmconf = require('../../lib/config/core.js')
var common = require('./00-config-setup.js')

test('with malformed', function (t) {
  npmconf.load({}, common.malformed, function (er, conf) {
    t.ok(er, 'Expected parse error')
    if (!(er && /Failed parsing JSON config key email/.test(er.message))) {
      throw er
    }
    t.end()
  })
})
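
Taken together, the credentials tests above pin down a simple round-trip API. A minimal usage sketch of that API follows; it assumes the fixture configuration from 00-config-setup.js and execution from a file under test/tap/, so the relative requires resolve as in the tests.

// sketch only: mirrors the setCredentialsByURI/getCredentialsByURI calls
// exercised by config-credentials.js above
var npmconf = require('../../lib/config/core.js')
var common = require('./00-config-setup.js')

npmconf.load(common.builtin, function (er, conf) {
  if (er) throw er

  // credentials are stored against the "nerfed" registry URL
  conf.setCredentialsByURI('https://registry.lvh.me:8661/', { token: 'simple-token' })

  var creds = conf.getCredentialsByURI('https://registry.lvh.me:8661/')
  console.log(creds.scope) // '//registry.lvh.me:8661/'
  console.log(creds.token) // 'simple-token'
})
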
npm_3.5.2.orig/test/tap/config-meta.js
// this is a weird meta test.  It verifies that all the instances of
// `npm.config.get(...)` are:
// a) Simple strings, and not variables
// b) Documented
// c) Defined in the `npmconf` package.

var test = require('tap').test
var fs = require('fs')
var path = require('path')
var root = path.resolve(__dirname, '..', '..')
var lib = path.resolve(root, 'lib')
var nm = path.resolve(root, 'node_modules')
var doc = path.resolve(root, 'doc/misc/npm-config.md')
var FILES = []
var CONFS = {}
var DOC = {}

var exceptions = [
  path.resolve(lib, 'adduser.js'),
  path.resolve(lib, 'config.js'),
  path.resolve(lib, 'publish.js'),
  path.resolve(lib, 'utils', 'lifecycle.js'),
  path.resolve(lib, 'utils', 'map-to-registry.js'),
  path.resolve(nm, 'npm-registry-client', 'lib', 'publish.js'),
  path.resolve(nm, 'npm-registry-client', 'lib', 'request.js')
]

test('get files', function (t) {
  walk(nm)
  walk(lib)
  t.pass('got files')
  t.end()

  function walk (lib) {
    var files = fs.readdirSync(lib).map(function (f) {
      return path.resolve(lib, f)
    })
    files.forEach(function (f) {
      try {
        var s = fs.lstatSync(f)
      } catch (er) {
        return
      }
      if (s.isDirectory()) {
        walk(f)
      } else if (f.match(/\.js$/)) {
        FILES.push(f)
      }
    })
  }
})

test('get lines', function (t) {
  FILES.forEach(function (f) {
    var lines = fs.readFileSync(f, 'utf8').split(/\r|\n/)
    lines.forEach(function (l, i) {
      var matches = l.split(/conf(?:ig)?\.get\(/g)
      matches.shift()
      matches.forEach(function (m) {
        m = m.split(')').shift()
        var literal = m.match(/^['"].+?['"]/)
        if (literal) {
          m = literal[0].slice(1, -1)
          if (!m.match(/^\_/) && m !== 'argv') {
            CONFS[m] = {
              file: f,
              line: i
            }
          }
        } else if (exceptions.indexOf(f) === -1) {
          t.fail('non-string-literal config used in ' + f + ':' + i)
        }
      })
    })
  })
  t.pass('got lines')
  t.end()
})

test('get docs', function (t) {
  var d = fs.readFileSync(doc, 'utf8').split(/\r|\n/)
  // walk down until the '## Config Settings' section
  for (var i = 0; i < d.length && d[i] !== '## Config Settings'; i++);
  i++
  // now gather up all the ^###\s lines until the next ^##\s
  for (; i < d.length && !d[i].match(/^## /); i++) {
    if (d[i].match(/^### /)) {
      DOC[ d[i].replace(/^### /, '').trim() ] = true
    }
  }
  t.pass('read the docs')
  t.end()
})

test('check configs', function (t) {
  var defs = require('../../lib/config/defaults.js')
  var types = Object.keys(defs.types)
  var defaults = Object.keys(defs.defaults)

  for (var c1 in CONFS) {
    if (CONFS[c1].file.indexOf(lib) === 0) {
      t.ok(DOC[c1], 'should be documented ' + c1 + ' ' + CONFS[c1].file + ':' + CONFS[c1].line)
      t.ok(types.indexOf(c1) !== -1, 'should be defined in npmconf ' + c1)
      t.ok(defaults.indexOf(c1) !== -1, 'should have default in npmconf ' + c1)
    }
  }

  for (var c2 in DOC) {
    if (c2 !== 'versions' && c2 !== 'version' && c2 !== 'init.version') {
      t.ok(CONFS[c2], 'config in doc should be used somewhere ' + c2)
      t.ok(types.indexOf(c2) !== -1, 'should be defined in npmconf ' + c2)
      t.ok(defaults.indexOf(c2) !== -1, 'should have default in npmconf ' + c2)
    }
  }

  types.forEach(function (c) {
    if (!c.match(/^\_/) && c !== 'argv' && !c.match(/^versions?$/)) {
      t.ok(DOC[c], 'defined type should be documented ' + c)
      t.ok(CONFS[c], 'defined type should be used ' + c)
    }
  })

  defaults.forEach(function (c) {
    if (!c.match(/^\_/) && c !== 'argv' && !c.match(/^versions?$/)) {
      t.ok(DOC[c], 'defaulted type should be documented ' + c)
      t.ok(CONFS[c], 'defaulted type should be used ' + c)
    }
  })

  t.end()
})

npm_3.5.2.orig/test/tap/config-new-cafile.js
require('./00-config-setup.js')

var path = require('path')
var fs = require('graceful-fs')
var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var osenv = require('osenv')
var npmconf = require('../../lib/config/core.js')

var dir = path.resolve(__dirname, 'config-new-cafile')
var beep = path.resolve(dir, 'beep.pem')

test('setup', function (t) {
  bootstrap()
  t.end()
})

test('can set new cafile when old is gone', function (t) {
  t.plan(5)
  npmconf.load(function (error, conf) {
    npmconf.loaded = false
    t.ifError(error)
    conf.set('cafile', beep, 'user')
    conf.save('user', function (error) {
      t.ifError(error)
      t.equal(conf.get('cafile'), beep)
      rimraf.sync(beep)
      npmconf.load(function (error, conf) {
        if (error) {
          throw error
        }
        t.equal(conf.get('cafile'), beep)
        conf.del('cafile')
        conf.save('user', function (error) {
          t.ifError(error)
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function bootstrap () {
  mkdirp.sync(dir)
  fs.writeFileSync(beep, '')
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(dir)
}

npm_3.5.2.orig/test/tap/config-private.js
var fs = require('fs')
var path = require('path')
var test = require('tap').test
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'config-private')
var opts = { cwd: pkg }

test('setup', function (t) {
  rimraf.sync(pkg)
  mkdirp.sync(pkg)
  t.end()
})

test('config get private var (old auth)', function (t) {
  common.npm(
    [
      'config',
      'get', '_auth'
    ],
    opts,
    function (err, code, stdout, stderr) {
      t.ifError(err)

      t.similar(stderr, /sekretz/, 'password blocked on stderr')
      t.equal(stdout, '', 'no output')
      t.end()
    }
  )
})

test('config get private var (new auth)', function (t) {
  common.npm(
    [
      'config',
      'get', '//registry.npmjs.org/:_password'
    ],
    opts,
    function (err, code, stdout, stderr) {
      t.ifError(err)

      t.similar(stderr, /sekretz/, 'password blocked on stderr')
      t.equal(stdout, '', 'no output')
      t.end()
    }
  )
})

test('config get public var (new username)', function (t) {
  var FIXTURE_PATH = path.resolve(pkg, 'fixture_npmrc')
  var s = '//registry.lvh.me/:username = wombat\n' +
          '//registry.lvh.me/:_password = YmFkIHBhc3N3b3Jk\n' +
          '//registry.lvh.me/:email = lindsay@wdu.org.au\n'
  fs.writeFileSync(FIXTURE_PATH, s, 'ascii')
  fs.chmodSync(FIXTURE_PATH, '0444')

  common.npm(
    [
      'config',
      'get', '//registry.lvh.me/:username',
      '--userconfig=' + FIXTURE_PATH,
      '--registry=http://registry.lvh.me/'
    ],
    opts,
    function (err, code, stdout, stderr) {
      t.ifError(err)

      t.equal(stderr, '', 'stderr is empty')
      t.equal(stdout, 'wombat\n', 'got username in output')
      t.end()
    }
  )
})

test('clean', function (t) {
  rimraf.sync(pkg)
  t.end()
})

npm_3.5.2.orig/test/tap/config-project.js
var test = require('tap').test
var path = require('path')
var fix = path.resolve(__dirname, '..', 'fixtures', 'config')
var projectRc = path.resolve(fix, '.npmrc')
var npmconf = require('../../lib/config/core.js')
var common = require('./00-config-setup.js')

var projectData = { just: 'testing' }

var ucData = common.ucData

var envData = common.envData
var envDataFix = common.envDataFix

var gcData = { 'package-config:foo': 'boo' }

var biData = {}

var cli = { foo: 'bar', umask: parseInt('022', 8), prefix: fix }

var expectList = [
  cli,
  envDataFix,
  projectData,
  ucData,
  gcData,
  biData
]

var expectSources = {
  cli: { data: cli },
  env: {
    data: envDataFix,
    source: envData,
    prefix: ''
  },
  project: {
    path: projectRc,
    type: 'ini',
    data: projectData
  },
  user: {
    path: common.userconfig,
    type: 'ini',
    data: ucData
  },
  global: {
    path: common.globalconfig,
    type: 'ini',
    data: gcData
  },
  builtin: { data: biData }
}

test('no builtin', function (t) {
  npmconf.load(cli, function (er, conf) {
    if (er) throw er
    t.same(conf.list, expectList)
    t.same(conf.sources, expectSources)
    t.same(npmconf.rootConf.list, [])
    t.equal(npmconf.rootConf.root, npmconf.defs.defaults)
    t.equal(conf.root, npmconf.defs.defaults)
    t.equal(conf.get('umask'), parseInt('022', 8))
    t.equal(conf.get('heading'), 'npm')
    t.end()
  })
})
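
config-builtin.js and config-project.js together fix the precedence of the configuration layers: cli, then env, then project, then user, then global, then builtin. A minimal sketch of that lookup order follows, assuming the fixtures from 00-config-setup.js; the inferred behaviour (first layer that defines a key wins) is what the two heading assertions above exercise.

// sketch only: `heading` comes back as 'foo' when supplied on the cli
// layer (config-builtin.js) and as the 'npm' default when it is not
// (config-project.js)
var npmconf = require('../../lib/config/core.js')

npmconf.load({ heading: 'foo' }, function (er, conf) {
  if (er) throw er

  // conf.list holds one data object per layer, highest priority first;
  // conf.get() answers from the first layer that defines the key
  console.log(conf.get('heading')) // 'foo'
})
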
npm_3.5.2.orig/test/tap/config-save.js
var fs = require('fs')
var ini = require('ini')
var test = require('tap').test
var npmconf = require('../../lib/config/core.js')
var common = require('./00-config-setup.js')

var expectConf = [
  'globalconfig = ' + common.globalconfig,
  'email = i@izs.me',
  'env-thing = asdf',
  'init.author.name = Isaac Z. Schlueter',
  'init.author.email = i@izs.me',
  'init.author.url = http://blog.izs.me/',
  'init.version = 1.2.3',
  'proprietary-attribs = false',
  'npm:publishtest = true',
  '_npmjs.org:couch = https://admin:password@localhost:5984/registry',
  'npm-www:nocache = 1',
  'sign-git-tag = false',
  'message = v%s',
  'strict-ssl = false',
  '_auth = dXNlcm5hbWU6cGFzc3dvcmQ=',
  '',
  '[_token]',
  'AuthSession = yabba-dabba-doodle',
  'version = 1',
  'expires = 1345001053415',
  'path = /',
  'httponly = true',
  ''
].join('\n')

var expectFile = [
  'globalconfig = ' + common.globalconfig,
  'email = i@izs.me',
  'env-thing = asdf',
  'init.author.name = Isaac Z. Schlueter',
  'init.author.email = i@izs.me',
  'init.author.url = http://blog.izs.me/',
  'init.version = 1.2.3',
  'proprietary-attribs = false',
  'npm:publishtest = true',
  '_npmjs.org:couch = https://admin:password@localhost:5984/registry',
  'npm-www:nocache = 1',
  'sign-git-tag = false',
  'message = v%s',
  'strict-ssl = false',
  '_auth = dXNlcm5hbWU6cGFzc3dvcmQ=',
  '',
  '[_token]',
  'AuthSession = yabba-dabba-doodle',
  'version = 1',
  'expires = 1345001053415',
  'path = /',
  'httponly = true',
  ''
].join('\n')

test('saving configs', function (t) {
  npmconf.load(function (er, conf) {
    if (er) throw er

    conf.set('sign-git-tag', false, 'user')
    conf.del('nodedir')
    conf.del('tmp')
    var foundConf = ini.stringify(conf.sources.user.data)
    t.same(ini.parse(foundConf), ini.parse(expectConf))
    fs.unlinkSync(common.userconfig)
    conf.save('user', function (er) {
      if (er) throw er

      var uc = fs.readFileSync(conf.get('userconfig'), 'utf8')
      t.same(ini.parse(uc), ini.parse(expectFile))
      t.end()
    })
  })
})

test('setting prefix', function (t) {
  npmconf.load(function (er, conf) {
    if (er) throw er

    conf.prefix = 'newvalue'
    t.same(conf.prefix, 'newvalue')
    t.end()
  })
})
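
config-save.js closes the loop on persistence: mutate a named layer, save it, and the ini file on disk reflects the change. A minimal sketch of that cycle, assuming the userconfig fixture from 00-config-setup.js:

// sketch only: set-then-save against the 'user' layer, as the test above does
var fs = require('fs')
var ini = require('ini')
var npmconf = require('../../lib/config/core.js')

npmconf.load(function (er, conf) {
  if (er) throw er

  // changes are made against a named layer...
  conf.set('sign-git-tag', false, 'user')

  // ...and only reach the disk when that layer is saved
  conf.save('user', function (er) {
    if (er) throw er
    var onDisk = ini.parse(fs.readFileSync(conf.get('userconfig'), 'utf8'))
    console.log(onDisk['sign-git-tag']) // false
  })
})
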
npm_3.5.2.orig/test/tap/cruft-test.js
'use strict'
var fs = require('graceful-fs')
var path = require('path')
var mkdirpSync = require('mkdirp').sync
var rimraf = require('rimraf')
var test = require('tap').test
var common = require('../common-tap.js')

var base = path.join(__dirname, path.basename(__filename, '.js'))
var cruft = path.join(base, 'node_modules', 'cruuuft')
var pkg = {
  name: 'example',
  version: '1.0.0',
  dependencies: {}
}

function setup () {
  mkdirpSync(path.dirname(cruft))
  fs.writeFileSync(cruft, 'this is some cruft for sure')
  fs.writeFileSync(path.join(base, 'package.json'), JSON.stringify(pkg))
}

function cleanup () {
  rimraf.sync(base)
}

test('setup', function (t) {
  cleanup()
  setup()
  t.done()
})

test('cruft', function (t) {
  common.npm(['ls'], {cwd: base}, function (er, code, stdout, stderr) {
    t.is(stderr, '', 'no warnings or errors from ls')
    t.done()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.done()
})

npm_3.5.2.orig/test/tap/dedupe-scoped.js
var fs = require('fs')
var join = require('path').join

var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = join(__dirname, 'dedupe-scoped')
var modules = join(pkg, 'node_modules')

var EXEC_OPTS = { cwd: pkg }

var body = function () {/*
@scope/shared@2.1.6 node_modules/first/node_modules/@scope/shared -> node_modules/@scope/shared
firstUnique@0.6.0 node_modules/first/node_modules/firstUnique -> node_modules/firstUnique
secondUnique@1.2.0 node_modules/second/node_modules/secondUnique -> node_modules/secondUnique
- @scope/shared@2.1.6 node_modules/second/node_modules/@scope/shared
*/}.toString().split('\n').slice(1, -1)

var deduper = {
  'name': 'dedupe',
  'version': '0.0.0',
  'dependencies': {
    'first': '1.0.0',
    'second': '2.0.0'
  }
}

var first = {
  'name': 'first',
  'version': '1.0.0',
  'dependencies': {
    'firstUnique': '0.6.0',
    '@scope/shared': '2.1.6'
  }
}

var second = {
  'name': 'second',
  'version': '2.0.0',
  'dependencies': {
    'secondUnique': '1.2.0',
    '@scope/shared': '2.1.6'
  }
}

var shared = {
  'name': '@scope/shared',
  'version': '2.1.6'
}

var firstUnique = {
  'name': 'firstUnique',
  'version': '0.6.0'
}

var secondUnique = {
  'name': 'secondUnique',
  'version': '1.2.0'
}

test('setup', function (t) {
  setup()
  t.end()
})

// we like the cars
function ltrimm (l) { return l.trim() }

test('dedupe finds the common scoped modules and moves it up one level', function (t) {
  common.npm(
    [
      'find-dupes' // I actually found a use for this command!
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'successful dry run against fake install')
      t.notOk(code, 'npm ran without issue')
      t.notOk(stderr, 'npm printed no errors')
      t.same(
        stdout.trim().split('\n').map(ltrimm),
        body.map(ltrimm),
        'got expected output'
      )
      t.end()
    }
  )
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup (cb) {
  cleanup()

  mkdirp.sync(pkg)
  fs.writeFileSync(
    join(pkg, 'package.json'),
    JSON.stringify(deduper, null, 2)
  )

  mkdirp.sync(join(modules, 'first'))
  fs.writeFileSync(
    join(modules, 'first', 'package.json'),
    JSON.stringify(first, null, 2)
  )

  mkdirp.sync(join(modules, 'first', 'node_modules', 'firstUnique'))
  fs.writeFileSync(
    join(modules, 'first', 'node_modules', 'firstUnique', 'package.json'),
    JSON.stringify(firstUnique, null, 2)
  )

  mkdirp.sync(join(modules, 'first', 'node_modules', '@scope', 'shared'))
  fs.writeFileSync(
    join(modules, 'first', 'node_modules', '@scope', 'shared', 'package.json'),
    JSON.stringify(shared, null, 2)
  )

  mkdirp.sync(join(modules, 'second'))
  fs.writeFileSync(
    join(modules, 'second', 'package.json'),
    JSON.stringify(second, null, 2)
  )

  mkdirp.sync(join(modules, 'second', 'node_modules', 'secondUnique'))
  fs.writeFileSync(
    join(modules, 'second', 'node_modules', 'secondUnique', 'package.json'),
    JSON.stringify(secondUnique, null, 2)
  )

  mkdirp.sync(join(modules, 'second', 'node_modules', '@scope', 'shared'))
  fs.writeFileSync(
    join(modules, 'second', 'node_modules', '@scope', 'shared', 'package.json'),
    JSON.stringify(shared, null, 2)
  )
}

function cleanup () {
  rimraf.sync(pkg)
}
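
Both config-edit.js and dedupe-scoped.js rely on the same pre-template-literal trick for multi-line fixtures: stringify a function whose body is a block comment, then slice off the wrapper lines. A minimal self-contained sketch (note that toString() on a function is not guaranteed verbatim under minification):

// sketch only: the function-comment multi-line string trick
var lines = function () {/*
first line
second line
*/}.toString().split('\n').slice(1, -1)

console.log(lines) // [ 'first line', 'second line' ]
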
npm_3.5.2.orig/test/tap/dedupe.js
var fs = require('graceful-fs')
var path = require('path')
var existsSync = fs.existsSync || path.existsSync

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var server

var pkg = path.join(__dirname, 'dedupe')

var EXEC_OPTS = { cwd: pkg }

var json = {
  author: 'Dedupe tester',
  name: 'dedupe',
  version: '0.0.0',
  dependencies: {
    optimist: '0.6.0',
    clean: '2.1.6'
  }
}

var shrinkwrap = {
  name: 'dedupe',
  version: '0.0.0',
  dependencies: {
    clean: {
      version: '2.1.6',
      dependencies: {
        checker: {
          version: '0.5.2',
          dependencies: {
            async: { version: '0.2.10' }
          }
        },
        minimist: { version: '0.0.5' }
      }
    },
    optimist: {
      version: '0.6.0',
      dependencies: {
        wordwrap: { version: '0.0.2' },
        minimist: { version: '0.0.5' }
      }
    }
  }
}

test('setup', function (t) {
  t.comment('test for https://github.com/npm/npm/issues/4675')
  setup(function () {
    t.end()
  })
})

test('dedupe finds the common module and moves it up one level', function (t) {
  common.npm(
    [
      '--registry', common.registry,
      'install', '.'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'successfully installed directory')
      t.equal(code, 0, 'npm install exited with code 0')
      common.npm(
        [
          'dedupe'
        ],
        EXEC_OPTS,
        function (err, code) {
          t.ifError(err, 'successfully deduped against previous install')
          t.notOk(code, 'npm dedupe exited with code 0')
          t.ok(existsSync(path.join(pkg, 'node_modules', 'minimist')),
            'minimist module exists')
          t.notOk(
            existsSync(path.join(pkg, 'node_modules', 'clean', 'node_modules', 'minimist')),
            'no clean/minimist'
          )
          t.notOk(
            existsSync(path.join(pkg, 'node_modules', 'optimist', 'node_modules', 'minimist')),
            'no optimist/minimist'
          )
          t.end()
        }
      )
    }
  )
})

test('cleanup', function (t) {
  server.close()
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}

function setup (cb) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  fs.writeFileSync(
    path.join(pkg, 'npm-shrinkwrap.json'),
    JSON.stringify(shrinkwrap, null, 2)
  )
  process.chdir(pkg)
  mr({ port: common.port }, function (er, s) {
    server = s
    cb()
  })
}

npm_3.5.2.orig/test/tap/deprecate.js
var mr = require('npm-registry-mock')
var test = require('tap').test
var common = require('../common-tap.js')

var server

var cache = {
  '_id': 'cond',
  '_rev': '19-d458a706de1740662cd7728d7d7ddf07',
  'name': 'cond',
  'time': {
    'modified': '2015-02-13T07:33:58.120Z',
    'created': '2014-03-16T20:52:52.236Z',
    '0.0.0': '2014-03-16T20:52:52.236Z',
    '0.0.1': '2014-03-16T21:12:33.393Z',
    '0.0.2': '2014-03-16T21:15:25.430Z'
  },
  'versions': {
    '0.0.0': {},
    '0.0.1': {},
    '0.0.2': {}
  },
  'dist-tags': {
    'latest': '0.0.2'
  },
  'description': 'Restartable error handling system',
  'license': 'CC0'
}

test('setup', function (t) {
  mr({port: common.port}, function (err, s) {
    t.ifError(err, 'registry mocked successfully')
    server = s
    t.ok(true)
    t.end()
  })
})

test('npm deprecate an unscoped package', function (t) {
  var deprecated = JSON.parse(JSON.stringify(cache))
  deprecated.versions = {
    '0.0.0': {},
    '0.0.1': { deprecated: 'make it dead' },
    '0.0.2': {}
  }
  server.get('/cond?write=true').reply(200, cache)
  server.put('/cond', deprecated).reply(201, { deprecated: true })
  common.npm([
    'deprecate', 'cond@0.0.1', 'make it dead',
    '--registry', common.registry,
    '--loglevel', 'silent'
  ], {}, function (er, code, stdout, stderr) {
    t.ifError(er, 'npm deprecate')
    t.equal(stderr, '', 'no error output')
    t.equal(code, 0, 'exited OK')
    t.end()
  })
})

test('npm deprecate a scoped package', function (t) {
  var cacheCopy = JSON.parse(JSON.stringify(cache))
  cacheCopy.name = '@scope/cond'
  cacheCopy._id = '@scope/cond'
  var deprecated = JSON.parse(JSON.stringify(cacheCopy))
  deprecated.versions = {
    '0.0.0': {},
    '0.0.1': { deprecated: 'make it dead' },
    '0.0.2': {}
  }
  server.get('/@scope%2fcond?write=true').reply(200, cacheCopy)
  server.put('/@scope%2fcond', deprecated).reply(201, { deprecated: true })
  common.npm([
    'deprecate', '@scope/cond@0.0.1', 'make it dead',
    '--registry', common.registry,
    '--loglevel', 'silent'
  ], {}, function (er, code, stdout, stderr) {
    t.ifError(er, 'npm deprecate')
    t.equal(stderr, '', 'no error output')
    t.equal(code, 0, 'exited OK')
    t.end()
  })
})

test('npm deprecate semver range', function (t) {
  var deprecated = JSON.parse(JSON.stringify(cache))
  deprecated.versions = {
    '0.0.0': { deprecated: 'make it dead' },
    '0.0.1': { deprecated: 'make it dead' },
    '0.0.2': {}
  }
  server.get('/cond?write=true').reply(200, cache)
  server.put('/cond', deprecated).reply(201, { deprecated: true })
  common.npm([
    'deprecate', 'cond@<0.0.2', 'make it dead',
    '--registry', common.registry,
    '--loglevel', 'silent'
  ], {}, function (er, code, stdout, stderr) {
    t.ifError(er, 'npm deprecate')
    t.equal(stderr, '', 'no error output')
    t.equal(code, 0, 'exited OK')
    t.end()
  })
})

test('npm deprecate bad semver range', function (t) {
  common.npm([
    'deprecate', 'cond@-9001', 'make it dead',
    '--registry', common.registry
  ], {}, function (er, code, stdout, stderr) {
    t.equal(code, 1, 'errored')
    t.match(stderr, /invalid version range/, 'bad semver')
    t.end()
  })
})

test('cleanup', function (t) {
  server.close()
  t.ok(true)
  t.end()
})

npm_3.5.2.orig/test/tap/dist-tag.js
var fs = require('fs')
var path = require('path')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var mr = require('npm-registry-mock')
var test = require('tap').test
var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'dist-tag')
var server

var scoped = {
  name: '@scoped/pkg',
  version: '1.1.1'
}

function mocks (server) {
  // ls current package
  server.get('/-/package/@scoped%2fpkg/dist-tags')
    .reply(200, { latest: '1.0.0', a: '0.0.1', b: '0.5.0' })

  // ls named package
  server.get('/-/package/@scoped%2fanother/dist-tags')
    .reply(200, { latest: '2.0.0', a: '0.0.2', b: '0.6.0' })

  // add c
  server.get('/-/package/@scoped%2fanother/dist-tags')
    .reply(200, { latest: '2.0.0', a: '0.0.2', b: '0.6.0' })
  server.put('/-/package/@scoped%2fanother/dist-tags/c', '"7.7.7"')
    .reply(200, { latest: '7.7.7', a: '0.0.2', b: '0.6.0', c: '7.7.7' })

  // set same version
  server.get('/-/package/@scoped%2fanother/dist-tags')
    .reply(200, { latest: '2.0.0', b: '0.6.0' })

  // rm
  server.get('/-/package/@scoped%2fanother/dist-tags')
    .reply(200, { latest: '2.0.0', a: '0.0.2', b: '0.6.0', c: '7.7.7' })
  server.delete('/-/package/@scoped%2fanother/dist-tags/c')
    .reply(200, { c: '7.7.7' })

  // rm nonexistent
  server.get('/-/package/@scoped%2fanother/dist-tags')
    .reply(200, { latest: '4.0.0' })
}

test('setup', function (t) {
  mkdirp(pkg, function (er) {
    t.ifError(er, pkg + ' made successfully')

    mr({ port: common.port, plugin: mocks }, function (er, s) {
      server = s

      fs.writeFile(
        path.join(pkg, 'package.json'),
        JSON.stringify(scoped),
        function (er) {
          t.ifError(er, 'wrote package.json')
          t.end()
        }
      )
    })
  })
})

test('npm dist-tags ls in current package', function (t) {
  common.npm(
    [
      'dist-tags', 'ls',
      '--registry', common.registry,
      '--loglevel', 'silent'
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access')
      t.notOk(code, 'exited OK')
      t.notOk(stderr, 'no error output')
      t.equal(stdout, 'a: 0.0.1\nb: 0.5.0\nlatest: 1.0.0\n')

      t.end()
    }
  )
})

test('npm dist-tags ls on named package', function (t) {
  common.npm(
    [
      'dist-tags', 'ls', '@scoped/another',
      '--registry', common.registry,
      '--loglevel', 'silent'
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access')
      t.notOk(code, 'exited OK')
      t.notOk(stderr, 'no error output')
      t.equal(stdout, 'a: 0.0.2\nb: 0.6.0\nlatest: 2.0.0\n')

      t.end()
    }
  )
})

test('npm dist-tags add @scoped/another@7.7.7 c', function (t) {
  common.npm(
    [
      'dist-tags', 'add', '@scoped/another@7.7.7', 'c',
      '--registry', common.registry,
      '--loglevel', 'silent'
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access')
      t.notOk(code, 'exited OK')
      t.notOk(stderr, 'no error output')
      t.equal(stdout, '+c: @scoped/another@7.7.7\n')

      t.end()
    }
  )
})

test('npm dist-tags set same version', function (t) {
  common.npm(
    [
      'dist-tag', 'set', '@scoped/another@0.6.0', 'b',
      '--registry', common.registry,
      '--loglevel', 'warn'
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access')
      t.notOk(code, 'exited OK')
      t.equal(
        stderr,
        'npm WARN dist-tag add b is already set to version 0.6.0\n',
        'warned about setting same version'
      )
      t.notOk(stdout, 'only expecting warning message')

      t.end()
    }
  )
})

test('npm dist-tags rm @scoped/another c', function (t) {
  common.npm(
    [
      'dist-tags', 'rm', '@scoped/another', 'c',
      '--registry', common.registry,
      '--loglevel', 'silent'
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm access')
      t.notOk(code, 'exited OK')
      t.notOk(stderr, 'no error output')
      t.equal(stdout, '-c: @scoped/another@7.7.7\n')

      t.end()
    }
  )
})

test('npm dist-tags rm @scoped/another nonexistent', function (t) {
  common.npm(
    [
      'dist-tags', 'rm', '@scoped/another', 'nonexistent',
      '--registry', common.registry,
      '--loglevel', 'silent'
    ],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'npm dist-tag')
      t.ok(code, 'expecting nonzero exit code')
      t.notOk(stderr, 'no error output')
      t.notOk(stdout, 'not expecting output')

      t.end()
    }
  )
})

test('cleanup', function (t) {
  t.pass('cleaned up')
  rimraf.sync(pkg)
  server.close()
  t.end()
})
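
dist-tag.js and deprecate.js both follow the same shape: register canned replies on an npm-registry-mock server, then run the real CLI against it through common.npm. A condensed sketch of that pattern; the route and tag values here are illustrative rather than taken from a real registry.

// sketch only: preload one reply, run one command, inspect the result
var mr = require('npm-registry-mock')
var common = require('../common-tap.js')

mr({
  port: common.port,
  plugin: function (server) {
    // each request the command will make needs a canned reply
    server.get('/-/package/@scoped%2fpkg/dist-tags')
      .reply(200, { latest: '1.0.0' })
  }
}, function (er, server) {
  if (er) throw er
  common.npm(
    ['dist-tags', 'ls', '@scoped/pkg', '--registry', common.registry],
    {},
    function (er, code, stdout) {
      if (er) throw er
      console.log(code, stdout) // expect 0 and 'latest: 1.0.0\n'
      server.close()
    }
  )
})
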
npm_3.5.2.orig/test/tap/do-not-remove-other-bins.js
'use strict'
var fs = require('fs')
var path = require('path')
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test
var common = require('../common-tap')

var base = path.resolve(__dirname, path.basename(__filename, '.js'))
var installPath = path.resolve(base, 'install')
var installBin = path.resolve(installPath, 'node_modules', '.bin')
var packageApath = path.resolve(base, 'packageA')
var packageBpath = path.resolve(base, 'packageB')

var packageA = {
  name: 'a',
  version: '1.0.0',
  description: 'x',
  repository: 'x',
  license: 'MIT',
  bin: {
    testbin: './testbin.js'
  }
}
var packageB = {
  name: 'b',
  version: '1.0.0',
  description: 'x',
  repository: 'x',
  license: 'MIT',
  bin: {
    testbin: './testbin.js'
  }
}

var EXEC_OPTS = {
  cwd: installPath
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(path.join(installPath, 'node_modules'))
  mkdirp.sync(packageApath)
  fs.writeFileSync(
    path.join(packageApath, 'package.json'),
    JSON.stringify(packageA, null, 2)
  )
  fs.writeFileSync(
    path.join(packageApath, packageA.bin.testbin),
    ''
  )
  mkdirp.sync(packageBpath)
  fs.writeFileSync(
    path.join(packageBpath, 'package.json'),
    JSON.stringify(packageB, null, 2)
  )
  fs.writeFileSync(
    path.join(packageBpath, packageB.bin.testbin),
    ''
  )
  t.end()
})

test('npm install A', function (t) {
  process.chdir(installPath)
  common.npm([
    'install', packageApath
  ], EXEC_OPTS, function (err, code, stdout, stderr) {
    console.log(stdout, stderr)
    t.ifErr(err, 'install finished successfully')
    t.notOk(code, 'exit ok')
    t.notOk(stderr, 'Should not get data on stderr: ' + stderr)
    t.end()
  })
})

test('npm install B', function (t) {
  process.chdir(installPath)
  common.npm([
    'install', packageBpath
  ], EXEC_OPTS, function (err, code, stdout, stderr) {
    t.ifErr(err, 'install finished successfully')
    t.notOk(code, 'exit ok')
    t.notOk(stderr, 'Should not get data on stderr: ' + stderr)
    t.end()
  })
})

test('verify bins', function (t) {
  var bin = path.dirname(
    path.resolve(
      installBin,
      fs.readlinkSync(path.join(installBin, 'testbin'))))
  t.is(bin, path.join(installPath, 'node_modules', 'b'))
  t.end()
})

test('rm install A', function (t) {
  process.chdir(installPath)
  common.npm([
    'rm', packageApath
  ], EXEC_OPTS, function (err, code, stdout, stderr) {
    t.ifErr(err, 'install finished successfully')
    t.notOk(code, 'exit ok')
    t.notOk(stderr, 'Should not get data on stderr: ' + stderr)
    t.end()
  })
})

test('verify postremoval bins', function (t) {
  var bin = path.dirname(
    path.resolve(
      installBin,
      fs.readlinkSync(path.join(installBin, 'testbin'))))
  t.is(bin, path.join(installPath, 'node_modules', 'b'))
  t.end()
})

test('cleanup', function (t) {
  cleanup()
  t.pass('cleaned up')
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(base)
}
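
The two 'verify bins' assertions above decode where a node_modules/.bin entry points by resolving its symlink target back to a package directory. The same resolution as a tiny helper; ownerOfBin is a hypothetical name introduced here, and this sketch does not handle Windows, where npm writes cmd-shims instead of symlinks.

// sketch only: resolve which package directory a .bin symlink points into
var fs = require('fs')
var path = require('path')

function ownerOfBin (binDir, name) {
  // readlink gives the (usually relative) target; resolve it against the
  // .bin directory, then take the containing package directory
  var target = fs.readlinkSync(path.join(binDir, name))
  return path.dirname(path.resolve(binDir, target))
}
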
npm_3.5.2.orig/test/tap/extraneous-dep-cycle-ls-ok.js
'use strict'
var fs = require('fs')
var path = require('path')
var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var common = require('../common-tap')

var pkg = path.resolve(__dirname, path.basename(__filename, '.js'))
var pathModA = path.join(pkg, 'node_modules', 'moduleA')
var pathModB = path.join(pkg, 'node_modules', 'moduleB')

var modA = {
  name: 'moduleA',
  version: '1.0.0',
  _requiredBy: [ '#USER', '/moduleB' ],
  dependencies: {
    moduleB: '1.0.0'
  }
}
var modB = {
  name: 'moduleB',
  version: '1.0.0',
  _requiredBy: [ '/moduleA' ],
  dependencies: {
    moduleA: '1.0.0'
  }
}

function setup () {
  mkdirp.sync(pathModA)
  fs.writeFileSync(
    path.join(pathModA, 'package.json'),
    JSON.stringify(modA, null, 2)
  )
  mkdirp.sync(pathModB)
  fs.writeFileSync(
    path.join(pathModB, 'package.json'),
    JSON.stringify(modB, null, 2)
  )
}

function cleanup () {
  rimraf.sync(pkg)
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

var expected = pkg + '\n' +
  '└─┬ moduleA@1.0.0\n' +
  '  └── moduleB@1.0.0\n\n'

test('extraneous-dep-cycle', function (t) {
  common.npm(['ls'], {cwd: pkg}, function (er, code, stdout, stderr) {
    t.ifErr(er, 'install finished successfully')
    t.is(stdout, expected, 'ls output shows module')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

npm_3.5.2.orig/test/tap/false-name.js
// this is a test for fix #2538
// the false_name-test-package has the name property 'test-package' set
// in the package.json and a dependency named 'test-package' is also a
// defined dependency of 'test-package-with-one-dep'.
//
// this leads to a conflict during installation and the fix is covered
// by this test

var fs = require('graceful-fs')
var path = require('path')
var existsSync = fs.existsSync || path.existsSync

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'false-name')
var cache = path.join(pkg, 'cache')
var server

var EXEC_OPTS = { cwd: pkg }

var indexContent = 'module.exports = true\n'
var json = {
  name: 'test-package',
  version: '0.0.0',
  main: 'index.js',
  dependencies: {
    'test-package-with-one-dep': '0.0.0'
  }
}

test('setup', function (t) {
  t.comment('test for https://github.com/npm/npm/issues/2538')
  setup()
  mr({ port: common.port }, function (er, s) {
    server = s
    t.end()
  })
})

test('not every pkg.name can be required', function (t) {
  common.npm(
    [
      'install', '.',
      '--cache', cache,
      '--registry', common.registry
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifErr(err, 'install finished without error')
      t.equal(code, 0, 'install exited ok')
      t.ok(
        existsSync(path.join(pkg, 'node_modules', 'test-package-with-one-dep')),
        'test-package-with-one-dep installed OK'
      )
      t.ok(
        existsSync(path.join(pkg, 'node_modules', 'test-package')),
        'test-package subdep installed OK'
      )
      t.end()
    }
  )
})

test('cleanup', function (t) {
  server.close()
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}

function setup () {
  cleanup()
  mkdirp.sync(cache)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  fs.writeFileSync(path.join(pkg, 'index.js'), indexContent)
}

npm_3.5.2.orig/test/tap/full-warning-messages.js
'use strict'
var test = require('tap').test
var path = require('path')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var fs = require('graceful-fs')
var common = require('../common-tap')

var base = path.resolve(__dirname, path.basename(__filename, '.js'))
var modA = path.resolve(base, 'modA')
var modB = path.resolve(base, 'modB')

var json = {
  'name': 'test-full-warning-messages',
  'version': '1.0.0',
  'description': 'abc',
  'repository': 'git://abc/',
  'license': 'ISC',
  'dependencies': {
    'modA': modA
  }
}

var modAJson = {
  'name': 'modA',
  'version': '1.0.0',
  'optionalDependencies': {
    'modB': modB
  }
}

var modBJson = {
  'name': 'modB',
  'version': '1.0.0',
  'os': ['nope'],
  'cpu': 'invalid'
}

function modJoin () {
  var modules = Array.prototype.slice.call(arguments)
  return modules.reduce(function (a, b) {
    return path.resolve(a, 'node_modules', b)
  })
}

function writeJson (mod, data) {
  fs.writeFileSync(path.resolve(mod, 'package.json'), JSON.stringify(data))
}

function setup () {
  cleanup()
  ;[modA, modB].forEach(function (mod) { mkdirp.sync(mod) })
  writeJson(base, json)
  writeJson(modA, modAJson)
  writeJson(modB, modBJson)
}

function cleanup () {
  rimraf.sync(base)
}

test('setup', function (t) {
  setup()
  t.end()
})

function exists (t, filepath, msg) {
  try {
    fs.statSync(filepath)
    t.pass(msg)
    return true
  } catch (ex) {
    t.fail(msg, {found: null, wanted: 'exists', compare: 'fs.stat(' + filepath + ')'})
    return false
  }
}

function notExists (t, filepath, msg) {
  try {
    fs.statSync(filepath)
    t.fail(msg, {found: 'exists', wanted: null, compare: 'fs.stat(' + filepath + ')'})
    return true
  } catch (ex) {
    t.pass(msg)
    return false
  }
}

test('tree-style', function (t) {
  common.npm(['install', '--loglevel=warn'], {cwd: base}, function (err, code, stdout, stderr) {
    if (err) throw err
    t.is(code, 0, 'result code')
    t.match(stdout, /modA@1.0.0/, 'modA got installed')
    t.notMatch(stdout, /modB/, 'modB not installed')
    var stderrlines = stderr.trim().split(/\n/)
    t.is(stderrlines.length, 2, 'two lines of warnings')
    t.match(stderr, /Skipping failed optional dependency/, 'expected optional failure warning')
    t.match(stderr, /Not compatible with your operating system or architecture/, 'reason for optional failure')
    exists(t, modJoin(base, 'modA'), 'module A')
    notExists(t, modJoin(base, 'modB'), 'module B')
    t.done()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

npm_3.5.2.orig/test/tap/gently-rm-cmdshims.js
'use strict'
var path = require('path')
var fs = require('graceful-fs')
var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var npm = require('../../lib/npm.js')

var work = path.join(__dirname, path.basename(__filename, '.js'))
var doremove = path.join(work, 'doremove')
var dontremove = path.join(work, 'dontremove')

var example_json = {
  name: 'example',
  version: '1.0.0',
  bin: {
    'example': 'example.js'
  }
}
var example_bin =
  '#!/usr/bin/env node\n' +
  'true\n'

// NOTE: if this were actually produced on windows it would be \ not / of
// course, buuut, path.resolve doesn't understand \ outside of windows =/
var do_example_cmd =
  '@IF EXIST "%~dp0\\node.exe" (\n' +
  ' "%~dp0\\node.exe" "%~dp0\\../example/example.js" %*\n' +
  ') ELSE (\n' +
  ' @SETLOCAL\n' +
  ' @SET PATHEXT=%PATHEXT:;.JS;=;%\n' +
  ' node "%~dp0\\../example/example.js" %*\n' +
  ')\n'

var do_example_cygwin =
  '#!/bin/sh\n' +
  'basedir=`dirname "$0"`\n' +
  '\n' +
  'case `uname` in\n' +
  ' *CYGWIN*) basedir=`cygpath -w "$basedir"`;;\n' +
  'esac\n' +
  '\n' +
  'if [ -x "$basedir/node" ]; then\n' +
  ' "$basedir/node" "$basedir/../example/example.js" "$@"\n' +
  ' ret=$?\n' +
  'else\n' +
  ' node "$basedir/../example/example.js" "$@"\n' +
  ' ret=$?\n' +
  'fi\n' +
  'exit $ret\n'

var dont_example_cmd =
  '@IF EXIST "%~dp0\\node.exe" (\n' +
  ' "%~dp0\\node.exe" "%~dp0\\../example-other/example.js" %*\n' +
  ') ELSE (\n' +
  ' @SETLOCAL\n' +
  ' @SET PATHEXT=%PATHEXT:;.JS;=;%\n' +
  ' node "%~dp0\\../example-other/example.js" %*\n' +
  ')\n'

var dont_example_cygwin =
  '#!/bin/sh\n' +
  'basedir=`dirname "$0"`\n' +
  '\n' +
  'case `uname` in\n' +
  ' *CYGWIN*) basedir=`cygpath -w "$basedir"`;;\n' +
  'esac\n' +
  '\n' +
  'if [ -x "$basedir/node" ]; then\n' +
  ' "$basedir/node" "$basedir/../example-other/example.js" "$@"\n' +
  ' ret=$?\n' +
  'else\n' +
  ' node "$basedir/../example-other/example.js" "$@"\n' +
  ' ret=$?\n' +
  'fi\n' +
  'exit $ret\n'

function cleanup () {
  rimraf.sync(work)
}

var doremove_module = path.join(doremove, 'node_modules', 'example')
var doremove_example_cmd = path.join(doremove, 'node_modules', '.bin', 'example.cmd')
var doremove_example_cygwin = path.join(doremove, 'node_modules', '.bin', 'example')
var dontremove_module = path.join(dontremove, 'node_modules', 'example')
var dontremove_example_cmd = path.join(dontremove, 'node_modules', '.bin', 'example.cmd')
var dontremove_example_cygwin = path.join(dontremove, 'node_modules', '.bin', 'example')

function setup () {
  mkdirp.sync(doremove_module)
  mkdirp.sync(path.join(doremove, 'node_modules', '.bin'))
  fs.writeFileSync(path.join(doremove, 'node_modules', 'example', 'package.json'), JSON.stringify(example_json))
  fs.writeFileSync(path.join(doremove, 'node_modules', 'example', 'example.js'), JSON.stringify(example_bin))
  fs.writeFileSync(doremove_example_cmd, do_example_cmd)
  fs.writeFileSync(doremove_example_cygwin, do_example_cygwin)

  mkdirp.sync(dontremove_module)
  mkdirp.sync(path.join(dontremove, 'node_modules', '.bin'))
  fs.writeFileSync(path.join(dontremove, 'node_modules', 'example', 'package.json'), JSON.stringify(example_json))
  fs.writeFileSync(path.join(dontremove, 'node_modules', 'example', 'example.js'), JSON.stringify(example_bin))
  fs.writeFileSync(dontremove_example_cmd, dont_example_cmd)
  fs.writeFileSync(dontremove_example_cygwin, dont_example_cygwin)
}

test('setup', function (t) {
  cleanup()
  setup()
  npm.load({}, function () { t.done() })
})

// Like slide.chain, but runs all commands even if they have errors, also
// throws away results.
function runAll (cmds, done) {
  runNext()
  function runNext () {
    if (cmds.length === 0) return done()
    var cmdline = cmds.shift()
    var cmd = cmdline.shift()
    cmdline.push(runNext)
    cmd.apply(null, cmdline)
  }
}

test('remove-cmd-shims', function (t) {
  t.plan(2)

  var gentlyRm = require('../../lib/utils/gently-rm.js')
  runAll([
    [gentlyRm, doremove_example_cmd, true, doremove_module],
    [gentlyRm, doremove_example_cygwin, true, doremove_module]
  ], function () {
    fs.stat(doremove_example_cmd, function (er) {
      t.is(er && er.code, 'ENOENT', 'cmd-shim was removed')
    })
    fs.stat(doremove_example_cygwin, function (er) {
      t.is(er && er.code, 'ENOENT', 'cmd-shim cygwin script was removed')
    })
  })
})

test('dont-remove-cmd-shims', function (t) {
  t.plan(2)
  var gentlyRm = require('../../lib/utils/gently-rm.js')
  runAll([
    [gentlyRm, dontremove_example_cmd, true, dontremove_module],
    [gentlyRm, dontremove_example_cygwin, true, dontremove_module]
  ], function () {
    fs.stat(dontremove_example_cmd, function (er) {
      t.is(er, null, 'cmd-shim was not removed')
    })
    fs.stat(dontremove_example_cygwin, function (er) {
      t.is(er, null, 'cmd-shim cygwin script was not removed')
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.done()
})
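
The two cases above capture the contract this test relies on: gentlyRm(target, gently, base, cb) removes target only when it genuinely belongs to the module rooted at base, and leaves foreign files alone. A minimal sketch of a direct call under those assumptions; the paths are illustrative, and npm.load() is assumed to have run first, as in the test's setup.

// sketch only: remove a cmd-shim that belongs to the module at `base`
var path = require('path')
var gentlyRm = require('../../lib/utils/gently-rm.js')

var base = path.join(__dirname, 'example', 'node_modules', 'example')
var shim = path.join(__dirname, 'example', 'node_modules', '.bin', 'example')

gentlyRm(shim, true, base, function (er) {
  if (er) throw er
  // a shim pointing at some other module would have been left in place
  // instead, as the 'dont-remove-cmd-shims' case above asserts
})
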
npm_3.5.2.orig/test/tap/gently-rm-linked-module.js
var basename = require('path').basename
var resolve = require('path').resolve
var fs = require('graceful-fs')
var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var common = require('../common-tap.js')

var base = resolve(__dirname, basename(__filename, '.js'))
var pkg = resolve(base, 'gently-rm-linked')
var dep = resolve(base, 'test-linked')
var glb = resolve(base, 'test-global')
var lnk = resolve(base, 'test-global-link')

var EXEC_OPTS = { cwd: pkg }

var index = "module.exports = function () { console.log('whoop whoop') }"

var fixture = {
  name: '@test/linked',
  version: '1.0.0',
  bin: {
    linked: './index.js'
  }
}

test('setup', function (t) {
  cleanup()
  setup()

  t.end()
})

test('install and link', function (t) {
  // link our test module into the global folder
  common.npm(
    [
      '--prefix', lnk,
      '--loglevel', 'error',
      'link', dep
    ],
    EXEC_OPTS,
    function (er, code, stdout, stderr) {
      if (er) throw er
      t.is(code, 0, 'link succeeded')
      t.is(stderr, '', 'no log output')
      t.ok(doesModuleExist(), 'installed ok')

      // and try removing it and make sure that succeeds
      common.npm(
        [
          '--global',
          '--prefix', lnk,
          '--loglevel', 'error',
          'rm', '@test/linked'
        ],
        EXEC_OPTS,
        function (er, code, stdout, stderr) {
          if (er) throw er
          t.is(code, 0, 'rm succeeded')
          t.is(stderr, '', 'no log output')
          t.notOk(doesModuleExist(), 'removed ok')
          t.end()
        }
      )
    }
  )
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function doesModuleExist () {
  var binPath = resolve(lnk, 'bin', 'linked')
  var pkgPath = resolve(lnk, 'lib', 'node_modules', '@test', 'linked')

  try {
    fs.statSync(binPath)
    fs.statSync(pkgPath)
    return true
  } catch (ex) {
    return false
  }
}

function cleanup () {
  rimraf.sync(pkg)
  rimraf.sync(dep)
  rimraf.sync(lnk)
  rimraf.sync(glb)
}

function setup () {
  mkdirp.sync(pkg)
  mkdirp.sync(glb)
  fs.symlinkSync(glb, lnk)
  // so it doesn't try to install into npm's own node_modules
  mkdirp.sync(resolve(pkg, 'node_modules'))
  mkdirp.sync(dep)
  fs.writeFileSync(resolve(dep, 'package.json'), JSON.stringify(fixture))
  fs.writeFileSync(resolve(dep, 'index.js'), index)
}

npm_3.5.2.orig/test/tap/gently-rm-overeager.js
var resolve = require('path').resolve
var fs = require('graceful-fs')
var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'gently-rm-overeager')
var dep = resolve(__dirname, 'test-whoops')

var EXEC_OPTS = { cwd: pkg }

var fixture = {
  name: '@test/whoops',
  version: '1.0.0',
  scripts: {
    postinstall: 'echo \'nope\' && exit 1'
  }
}

test('setup', function (t) {
  cleanup()
  setup()

  t.end()
})

test('failed install leaves only the debug log', function (t) {
npm_3.5.2.orig/test/tap/gently-rm-overeager.js

var resolve = require('path').resolve
var fs = require('graceful-fs')
var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'gently-rm-overeager')
var dep = resolve(__dirname, 'test-whoops')

var EXEC_OPTS = { cwd: pkg }

var fixture = {
  name: '@test/whoops',
  version: '1.0.0',
  scripts: {
    postinstall: 'echo \'nope\' && exit 1'
  }
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('failing install', function (t) {
  common.npm(['install', '../test-whoops'], EXEC_OPTS, function (er, c) {
    t.ifError(er, "test-whoops install didn't explode")
    t.ok(c, 'test-whoops install failed, as expected')

    fs.readdir(pkg, function (er, files) {
      t.ifError(er, 'package directory is still there')
      t.deepEqual(files, ['npm-debug.log'], 'only debug log remains')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
  rimraf.sync(dep)
}

function setup () {
  mkdirp.sync(pkg)
  // so it doesn't try to install into npm's own node_modules
  mkdirp.sync(resolve(pkg, 'node_modules'))
  mkdirp.sync(dep)
  fs.writeFileSync(resolve(dep, 'package.json'), JSON.stringify(fixture))
}
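Nearly every test in this suite drives npm through the common.npm helper from test/common-tap.js. Its shape can be read off the call sites: an argv array, a child-process options object (passed through to the spawned npm, so cwd and env behave as usual), and a callback receiving (er, code, stdout, stderr). A minimal sketch:

var common = require('../common-tap.js')

// Run `npm --version` in this directory and inspect the captured output.
common.npm(['--version'], { cwd: __dirname }, function (er, code, stdout, stderr) {
  if (er) throw er
  console.log('exit code:', code)
  console.log('version:', stdout.trim())
})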
npm_3.5.2.orig/test/tap/gently-rm-symlinked-global-dir.js

var resolve = require('path').resolve
var fs = require('graceful-fs')
var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'gently-rm-linked')
var dep = resolve(__dirname, 'test-linked')
var glb = resolve(__dirname, 'test-global')
var lnk = resolve(__dirname, 'test-global-link')

var EXEC_OPTS = { cwd: pkg }

var index = "module.exports = function () { console.log('whoop whoop') }"

var fixture = {
  name: '@test/linked',
  version: '1.0.0',
  bin: {
    linked: './index.js'
  }
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('install and link', function (t) {
  common.npm(
    [
      '--global',
      '--prefix', lnk,
      '--loglevel', 'silent',
      'install', '../test-linked'
    ],
    EXEC_OPTS,
    function (er, code, stdout, stderr) {
      t.ifError(er, "test-linked install didn't explode")
      t.notOk(code, 'test-linked install exited cleanly')
      t.notOk(stderr, 'no log output')

      verify(t, stdout)

      // again, to make sure unlinking works properly
      common.npm(
        [
          '--global',
          '--prefix', lnk,
          '--loglevel', 'silent',
          'install', '../test-linked'
        ],
        EXEC_OPTS,
        function (er, code, stdout, stderr) {
          t.ifError(er, "test-linked install didn't explode")
          t.notOk(code, 'test-linked install exited cleanly')
          t.notOk(stderr, 'no log output')

          verify(t, stdout)

          fs.readdir(pkg, function (er, files) {
            t.ifError(er, 'package directory is still there')
            t.deepEqual(files, ['node_modules'], 'only stub modules dir remains')
            t.end()
          })
        }
      )
    }
  )
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function removeBlank (line) {
  return line !== ''
}

function verify (t, stdout) {
  var binPath = resolve(lnk, 'bin', 'linked')
  var pkgPath = resolve(lnk, 'lib', 'node_modules', '@test', 'linked')
  var trgPath = resolve(pkgPath, 'index.js')
  t.deepEqual(
    stdout.split('\n').filter(removeBlank),
    [ binPath + ' -> ' + trgPath,
      resolve(lnk, 'lib'),
      '└── @test/linked@1.0.0 ' ],
    'got expected install output'
  )
}

function cleanup () {
  rimraf.sync(pkg)
  rimraf.sync(dep)
  rimraf.sync(lnk)
  rimraf.sync(glb)
}

function setup () {
  mkdirp.sync(pkg)
  mkdirp.sync(glb)
  fs.symlinkSync(glb, lnk)
  // so it doesn't try to install into npm's own node_modules
  mkdirp.sync(resolve(pkg, 'node_modules'))
  mkdirp.sync(dep)
  fs.writeFileSync(resolve(dep, 'package.json'), JSON.stringify(fixture))
  fs.writeFileSync(resolve(dep, 'index.js'), index)
}

npm_3.5.2.orig/test/tap/get.js

var common = require('../common-tap.js')
var test = require('tap').test

var cacheFile = require('npm-cache-filename')
var npm = require('../../')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var path = require('path')
var mr = require('npm-registry-mock')
var fs = require('graceful-fs')

function nop () {}

var URI = 'https://npm.registry:8043/rewrite'
var TIMEOUT = 3600
var FOLLOW = false
var STALE_OK = true
var TOKEN = 'lolbutts'
var AUTH = {
  token: TOKEN
}
var PARAMS = {
  timeout: TIMEOUT,
  follow: FOLLOW,
  staleOk: STALE_OK,
  auth: AUTH
}
var PKG_DIR = path.resolve(__dirname, 'get-basic')
var CACHE_DIR = path.resolve(PKG_DIR, 'cache')
var BIGCO_SAMPLE = {
  name: '@bigco/sample',
  version: '1.2.3'
}

// mock server reference
var server

var mocks = {
  'get': {
    '/@bigco%2fsample/1.2.3': [200, BIGCO_SAMPLE]
  }
}

var mapper = cacheFile(CACHE_DIR)

function getCachePath (uri) {
  return path.join(mapper(uri), '.cache.json')
}

test('setup', function (t) {
  mkdirp.sync(CACHE_DIR)

  mr({ port: common.port, mocks: mocks }, function (er, s) {
    t.ifError(er)
    npm.load({ cache: CACHE_DIR, registry: common.registry }, function (er) {
      t.ifError(er)
      server = s
      t.end()
    })
  })
})

test('get call contract', function (t) {
  t.throws(function () {
    npm.registry.get(undefined, PARAMS, nop)
  }, 'requires a URI')

  t.throws(function () {
    npm.registry.get([], PARAMS, nop)
  }, 'requires URI to be a string')

  t.throws(function () {
    npm.registry.get(URI, undefined, nop)
  }, 'requires params object')

  t.throws(function () {
    npm.registry.get(URI, '', nop)
  }, 'params must be object')

  t.throws(function () {
    npm.registry.get(URI, PARAMS, undefined)
  }, 'requires callback')

  t.throws(function () {
    npm.registry.get(URI, PARAMS, 'callback')
  }, 'callback must be function')

  t.end()
})

test('basic request', function (t) {
  t.plan(9)

  var versioned = common.registry + '/underscore/1.3.3'
  npm.registry.get(versioned, PARAMS, function (er, data) {
    t.ifError(er, 'loaded specified version underscore data')
    t.equal(data.version, '1.3.3')
    fs.stat(getCachePath(versioned), function (er) {
      t.ifError(er, 'underscore 1.3.3 cache data written')
    })
  })

  var rollup = common.registry + '/underscore'
  npm.registry.get(rollup, PARAMS, function (er, data) {
    t.ifError(er, 'loaded all metadata')
    t.deepEqual(data.name, 'underscore')
    fs.stat(getCachePath(rollup), function (er) {
      t.ifError(er, 'underscore rollup cache data written')
    })
  })

  var scoped = common.registry + '/@bigco%2fsample/1.2.3'
  npm.registry.get(scoped, PARAMS, function (er, data) {
    t.ifError(er, 'loaded all metadata')
    t.equal(data.name, '@bigco/sample')
    fs.stat(getCachePath(scoped), function (er) {
      t.ifError(er, 'scoped cache data written')
    })
  })
})

test('cleanup', function (t) {
  server.close()
  rimraf.sync(PKG_DIR)
  t.end()
})
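The getCachePath helper above shows where registry metadata lands on disk: npm-cache-filename maps a registry URI to a directory under the cache root, and the response body is stored in a .cache.json file inside it. A small sketch of that mapping; the cache root and URI are illustrative:

var path = require('path')
var cacheFile = require('npm-cache-filename')

// Bind the mapper to a cache root, then feed it registry URIs.
var mapper = cacheFile('/tmp/example-cache')

// Yields a directory derived from the URI's host and path, roughly
// /tmp/example-cache/registry.npmjs.org/underscore; the cached metadata
// for that URI then lives in <dir>/.cache.json.
var dir = mapper('https://registry.npmjs.org/underscore')
console.log(path.join(dir, '.cache.json'))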
npm_3.5.2.orig/test/tap/gist-short-shortcut-package.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'gist-short-shortcut-package')

var json = {
  name: 'gist-short-shortcut-package',
  version: '0.0.0',
  dependencies: {
    'private-gist': 'gist:deadbeef'
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

test('gist-short-shortcut-package', function (t) {
  var cloneUrls = [
    ['git://gist.github.com/deadbeef.git', 'GitHub gist shortcuts try git URLs first'],
    ['https://gist.github.com/deadbeef.git', 'GitHub gist shortcuts try HTTPS URLs second'],
    ['git@gist.github.com:/deadbeef.git', 'GitHub gist shortcuts try SSH third']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error('execFile mock fails on purpose'))
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install([], function (er, result) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
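The three expected clone URLs in the test above are the expansion of the gist:deadbeef shortcut. npm does that expansion with the hosted-git-info package, so the fallback order being asserted can be reproduced directly; a sketch, assuming that package — the exact protocol prefixes it emits can vary between its versions:

var hostedGitInfo = require('hosted-git-info')

// Parse the same short gist shortcut used in the fixture above.
var info = hostedGitInfo.fromUrl('gist:deadbeef')

// These mirror the three clone attempts the test expects, in order:
// git://, then https://, then ssh.
console.log(info.git())
console.log(info.https())
console.log(info.ssh())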
npm_3.5.2.orig/test/tap/gist-short-shortcut.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'gist-short-shortcut')

var json = {
  name: 'gist-short-shortcut',
  version: '0.0.0'
}

test('setup', function (t) {
  setup()
  t.end()
})

test('gist-short-shortcut', function (t) {
  var cloneUrls = [
    ['git://gist.github.com/deadbeef.git', 'GitHub gist shortcuts try git URLs first'],
    ['https://gist.github.com/deadbeef.git', 'GitHub gist shortcuts try HTTPS URLs second'],
    ['git@gist.github.com:/deadbeef.git', 'GitHub gist shortcuts try SSH third']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error('execFile mock fails on purpose'))
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install(['gist:deadbeef'], function (er, result) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/gist-shortcut-package.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'gist-shortcut-package')

var json = {
  name: 'gist-shortcut-package',
  version: '0.0.0',
  dependencies: {
    'private-gist': 'gist:foo/deadbeef'
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

test('gist-shortcut-package', function (t) {
  var cloneUrls = [
    ['git://gist.github.com/deadbeef.git', 'GitHub gist shortcuts try git URLs first'],
    ['https://gist.github.com/deadbeef.git', 'GitHub gist shortcuts try HTTPS URLs second'],
    ['git@gist.github.com:/deadbeef.git', 'GitHub gist shortcuts try SSH third']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error('execFile mock fails on purpose'))
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install([], function (er, result) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})
function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/gist-shortcut.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'gist-shortcut')

var json = {
  name: 'gist-shortcut',
  version: '0.0.0'
}

test('setup', function (t) {
  setup()
  t.end()
})

test('gist-shortcut', function (t) {
  var cloneUrls = [
    ['git://gist.github.com/deadbeef.git', 'GitHub gist shortcuts try git URLs first'],
    ['https://gist.github.com/deadbeef.git', 'GitHub gist shortcuts try HTTPS URLs second'],
    ['git@gist.github.com:/deadbeef.git', 'GitHub gist shortcuts try SSH third']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error('execFile mock fails on purpose'))
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install(['gist:foo/deadbeef'], function (er, result) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
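All four gist tests above share one technique: require-inject's installGlobally loads lib/npm.js with a replacement child_process already in place, so every git invocation npm makes is intercepted — clone attempts are recorded and forced to fail, and nothing touches the network. A stripped-down sketch of the same pattern; './module-under-test' is a hypothetical module:

var requireInject = require('require-inject')

var attempts = []
var moduleUnderTest = requireInject('./module-under-test', {
  'child_process': {
    'execFile': function (cmd, args, options, cb) {
      process.nextTick(function () {
        // Record git clone targets, then fail so nothing real happens.
        if (args[0] === 'clone') attempts.push(args[args.length - 1])
        cb(new Error('execFile mock fails on purpose'))
      })
    }
  }
})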
npm_3.5.2.orig/test/tap/git-cache-locking.js

var test = require('tap').test
var common = require('../common-tap')
var path = require('path')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var pkg = path.resolve(__dirname, 'git-cache-locking')
var tmp = path.join(pkg, 'tmp')
var cache = path.join(pkg, 'cache')

test('setup', function (t) {
  rimraf.sync(pkg)
  mkdirp.sync(path.resolve(pkg, 'node_modules'))
  t.end()
})

test('git-cache-locking: install a git dependency', function (t) {
  // disable git integration tests on Travis.
  if (process.env.TRAVIS) return t.end()

  // package c depends on a.git#master and b.git#master
  // package b depends on a.git#master
  common.npm([
    'install', 'git://github.com/nigelzor/npm-4503-c.git'
  ], {
    cwd: pkg,
    env: {
      npm_config_cache: cache,
      npm_config_tmp: tmp,
      npm_config_prefix: pkg,
      npm_config_global: 'false',
      HOME: process.env.HOME,
      Path: process.env.PATH,
      PATH: process.env.PATH
    }
  }, function (err, code, stdout, stderr) {
    t.ifErr(err, 'npm install finished without error')
    t.equal(0, code, 'npm install should succeed')
    t.end()
  })
})

test('cleanup', function (t) {
  rimraf.sync(pkg)
  t.end()
})
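Note how this test configures the child npm entirely through its environment: any npm config key can be supplied as an npm_config_<key> environment variable, which is how the cache, tmp directory, and prefix are pointed into the fixture without a config file. A minimal sketch of the same mechanism with plain child_process; the cache path is illustrative:

var execFile = require('child_process').execFile

// Equivalent to passing --cache=/tmp/example-cache on the command line.
var env = {
  PATH: process.env.PATH,
  npm_config_cache: '/tmp/example-cache'
}

execFile('npm', ['config', 'get', 'cache'], { env: env }, function (er, stdout) {
  if (er) throw er
  console.log(stdout.trim()) // /tmp/example-cache
})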
npm_3.5.2.orig/test/tap/git-cache-no-hooks.js

var test = require('tap').test
var fs = require('fs')
var path = require('path')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var spawn = require('child_process').spawn
var npmCli = require.resolve('../../bin/npm-cli.js')
var node = process.execPath
var pkg = path.resolve(__dirname, 'git-cache-no-hooks')
var tmp = path.join(pkg, 'tmp')
var cache = path.join(pkg, 'cache')

test('setup', function (t) {
  rimraf.sync(pkg)
  mkdirp.sync(pkg)
  mkdirp.sync(cache)
  mkdirp.sync(tmp)
  mkdirp.sync(path.resolve(pkg, 'node_modules'))
  t.end()
})

test('git-cache-no-hooks: install a git dependency', function (t) {
  // disable git integration tests on Travis.
  if (process.env.TRAVIS) return t.end()

  var command = [ npmCli,
    'install',
    'git://github.com/nigelzor/npm-4503-a.git'
  ]
  var child = spawn(node, command, {
    cwd: pkg,
    env: {
      'npm_config_cache': cache,
      'npm_config_tmp': tmp,
      'npm_config_prefix': pkg,
      'npm_config_global': 'false',
      'npm_config_umask': '00',
      HOME: process.env.HOME,
      Path: process.env.PATH,
      PATH: process.env.PATH
    },
    stdio: 'inherit'
  })

  child.on('close', function (code) {
    t.equal(code, 0, 'npm install should succeed')

    // verify that git's template hooks were not copied along with the repo
    var repoDir = 'git-github-com-nigelzor-npm-4503-a-git-40c5cb24'
    var hooksPath = path.join(cache, '_git-remotes', repoDir, 'hooks')
    fs.readdir(hooksPath, function (err) {
      t.equal(err && err.code, 'ENOENT', 'hooks are not brought along with repo')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  rimraf.sync(pkg)
  t.end()
})
npm_3.5.2.orig/test/tap/git-dependency-install-link.js

var fs = require('fs')
var resolve = require('path').resolve

var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test
var readJson = require('read-package-json')
var mr = require('npm-registry-mock')

var npm = require('../../lib/npm.js')
var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'git-dependency-install-link')
var repo = resolve(__dirname, 'git-dependency-install-link-repo')
var cache = resolve(pkg, 'cache')

var daemon
var daemonPID
var git
var mockRegistry

var EXEC_OPTS = {
  registry: common.registry,
  cwd: pkg,
  cache: cache
}

var pjParent = JSON.stringify({
  name: 'parent',
  version: '1.2.3',
  dependencies: {
    'child': 'git://localhost:1234/child.git'
  }
}, null, 2) + '\n'

var pjChild = JSON.stringify({
  name: 'child',
  version: '1.0.3'
}, null, 2) + '\n'

test('setup', function (t) {
  bootstrap()
  setup(function (er, r) {
    t.ifError(er, 'git started up successfully')

    if (!er) {
      daemon = r[r.length - 2]
      daemonPID = r[r.length - 1]
    }

    mr({ port: common.port }, function (er, server) {
      t.ifError(er, 'started mock registry')
      mockRegistry = server
      t.end()
    })
  })
})

test('install from git repo [no --link]', function (t) {
  process.chdir(pkg)

  common.npm(['install', '--loglevel', 'error'], EXEC_OPTS, function (err, code, stdout, stderr) {
    t.ifError(err, 'npm install ran without error')
    t.dissimilar(stderr, /Command failed:/, 'expect git to succeed')
    t.dissimilar(stderr, /version not found/, 'should not go to repository')

    readJson(resolve(pkg, 'node_modules', 'child', 'package.json'), function (err, data) {
      t.ifError(err, 'error reading child package.json')
      t.equal(data && data.version, '1.0.3')
      t.end()
    })
  })
})

test('install from git repo [with --link]', function (t) {
  process.chdir(pkg)
  rimraf.sync(resolve(pkg, 'node_modules'))

  common.npm(['install', '--link', '--loglevel', 'error'], EXEC_OPTS, function (err, code, stdout, stderr) {
    t.ifError(err, 'npm install --link ran without error')
    t.dissimilar(stderr, /Command failed:/, 'expect git to succeed')
    t.dissimilar(stderr, /version not found/, 'should not go to repository')

    readJson(resolve(pkg, 'node_modules', 'child', 'package.json'), function (err, data) {
      t.ifError(err, 'error reading child package.json')
      t.equal(data && data.version, '1.0.3')
      t.end()
    })
  })
})

test('clean', function (t) {
  mockRegistry.close()
  daemon.on('close', function () {
    cleanup()
    t.end()
  })
  process.kill(daemonPID)
})

function bootstrap () {
  rimraf.sync(repo)
  rimraf.sync(pkg)
  mkdirp.sync(pkg)
  mkdirp.sync(cache)
  fs.writeFileSync(resolve(pkg, 'package.json'), pjParent)
}

function setup (cb) {
  mkdirp.sync(repo)
  fs.writeFileSync(resolve(repo, 'package.json'), pjChild)
  npm.load({ link: true, prefix: pkg, loglevel: 'silent' }, function () {
    git = require('../../lib/utils/git.js')

    function startDaemon (cb) {
      // start git server
      var d = git.spawn(
        [
          'daemon',
          '--verbose',
          '--listen=localhost',
          '--export-all',
          '--base-path=.',
          '--port=1234'
        ],
        {
          cwd: pkg,
          env: process.env,
          stdio: ['pipe', 'pipe', 'pipe']
        }
      )
      d.stderr.on('data', childFinder)

      function childFinder (c) {
        var cpid = c.toString().match(/^\[(\d+)\]/)
        if (cpid[1]) {
          this.removeListener('data', childFinder)
          cb(null, [d, cpid[1]])
        }
      }
    }

    common.makeGitRepo({
      path: repo,
      commands: [
        git.chainableExec(
          ['clone', '--bare', repo, 'child.git'],
          { cwd: pkg, env: process.env }
        ),
        startDaemon
      ]
    }, cb)
  })
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(repo)
  rimraf.sync(pkg)
}
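The setup above is the most involved of these fixtures: it serves a bare clone over the git protocol by spawning git daemon, and scrapes the daemon's child PID from its stderr banner so teardown can kill it. A reduced sketch of just the serving step, using plain child_process and an illustrative base path:

var spawn = require('child_process').spawn

// Serve every repo under /tmp/repos on git://localhost:1234/.
var daemon = spawn('git', [
  'daemon',
  '--verbose',
  '--listen=localhost',
  '--export-all',
  '--base-path=/tmp/repos',
  '--port=1234'
])

// With --verbose the daemon prefixes stderr lines with "[<pid>]";
// the test above parses that to learn which process to kill later.
daemon.stderr.on('data', function (chunk) {
  var m = chunk.toString().match(/^\[(\d+)\]/)
  if (m) console.log('daemon pid:', m[1])
})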
npm_3.5.2.orig/test/tap/git-npmignore.js

var cat = require('graceful-fs').writeFileSync
var exec = require('child_process').exec
var readdir = require('graceful-fs').readdirSync
var resolve = require('path').resolve

var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test
var tmpdir = require('osenv').tmpdir
var which = require('which')

var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'git-npmignore')
var dep = resolve(pkg, 'deps', 'gitch')
var packname = 'gitch-1.0.0.tgz'
var packed = resolve(pkg, packname)
var modules = resolve(pkg, 'node_modules')
var installed = resolve(modules, 'gitch')
var expected = [
  'a.js',
  'package.json',
  '.npmignore'
].sort()

var EXEC_OPTS = { cwd: pkg }

var gitignore = 'node_modules/\n'
var npmignore = 't.js\n'

var a = "console.log('hi');"
var t = "require('tap').test(function (t) { t.pass('I am a test!'); t.end(); });"

var fixture = {
  'name': 'gitch',
  'version': '1.0.0',
  'private': true,
  'main': 'a.js'
}

test('setup', function (t) {
  setup(function (er) {
    t.ifError(er, 'setup ran OK')
    t.end()
  })
})

test('npm pack directly from directory', function (t) {
  packInstallTest(dep, t)
})

test('npm pack via git', function (t) {
  packInstallTest('git+file://' + dep, t)
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function packInstallTest (spec, t) {
  common.npm(
    [
      '--loglevel', 'silent',
      'pack', spec
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'npm pack ran without error')
      t.notOk(code, 'npm pack exited cleanly')
      t.notOk(stderr, 'npm pack ran silently')
      t.equal(stdout.trim(), packname, 'got expected package name')

      common.npm(
        [
          '--loglevel', 'silent',
          'install', packed
        ],
        EXEC_OPTS,
        function (err, code, stdout, stderr) {
          t.ifError(err, 'npm install ran without error')
          t.notOk(code, 'npm install exited cleanly')
          t.notOk(stderr, 'npm install ran silently')

          var actual = readdir(installed).sort()
          t.same(actual, expected, 'no unexpected files in packed directory')

          rimraf(packed, function () {
            t.end()
          })
        }
      )
    }
  )
}

function cleanup () {
  process.chdir(tmpdir())
  rimraf.sync(pkg)
}

function setup (cb) {
  cleanup()

  mkdirp.sync(modules)
  mkdirp.sync(dep)

  process.chdir(dep)

  cat(resolve(dep, '.npmignore'), npmignore)
  cat(resolve(dep, '.gitignore'), gitignore)
  cat(resolve(dep, 'a.js'), a)
  cat(resolve(dep, 't.js'), t)
  cat(resolve(dep, 'package.json'), JSON.stringify(fixture))

  common.npm(
    [
      '--loglevel', 'silent',
      'cache', 'clean'
    ],
    EXEC_OPTS,
    function (er, code, _, stderr) {
      if (er) return cb(er)
      if (code) return cb(new Error('npm cache nonzero exit: ' + code))
      if (stderr) return cb(new Error('npm cache clean error: ' + stderr))

      which('git', function found (er, git) {
        if (er) return cb(er)

        exec(git + ' init', init)

        function init (er, _, stderr) {
          if (er) return cb(er)
          if (stderr) return cb(new Error('git init error: ' + stderr))

          exec(git + " config user.name 'Phantom Faker'", user)
        }

        function user (er, _, stderr) {
          if (er) return cb(er)
          if (stderr) return cb(new Error('git config error: ' + stderr))

          exec(git + ' config user.email nope@not.real', email)
        }

        function email (er, _, stderr) {
          if (er) return cb(er)
          if (stderr) return cb(new Error('git config error: ' + stderr))

          exec(git + ' add .', addAll)
        }

        function addAll (er, _, stderr) {
          if (er) return cb(er)
          if (stderr) return cb(new Error('git add . error: ' + stderr))

          exec(git + ' commit -m boot', commit)
        }

        function commit (er, _, stderr) {
          if (er) return cb(er)
          if (stderr) return cb(new Error('git commit error: ' + stderr))

          cb()
        }
      })
    }
  )
}
npm_3.5.2.orig/test/tap/git-races.js

var execFile = require('child_process').execFile
var path = require('path')
var zlib = require('zlib')

var asyncMap = require('slide').asyncMap
var deepEqual = require('deep-equal')
var fs = require('graceful-fs')
var mkdirp = require('mkdirp')
var once = require('once')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var tar = require('tar')
var test = require('tap').test
var tmpdir = require('osenv').tmpdir
var which = require('which')

var wd = path.resolve(tmpdir(), 'git-races')
var fixtures = path.resolve(__dirname, '../fixtures')
var testcase = 'github-com-BryanDonovan-npm-git-test'
var testcase_git = path.resolve(wd, testcase + '.git')
var testcase_path = path.resolve(wd, testcase)
var testcase_tgz = path.resolve(fixtures, testcase + '.git.tar.gz')

var testtarballs = []
var testrepos = {}
var testurls = {}

/*
This test is specifically for #7202, where the bug was that if you tried
installing multiple git urls that pointed at the same repo but had different
committishes, you'd sometimes get the wrong version.

The test cases, provided by @BryanDonovan, have a dependency tree like this:

  top
    bar#4.0.0
      buzz#3.0.0
      foo#3.0.0
        buzz#3.0.0
    foo#4.0.0
      buzz#2.0.0

But what would happen is that buzz#2.0.0 would end up installed under
bar#4.0.0. bar#4.0.0 shouldn't have gotten its own copy of buzz, and if it
did, it should've been buzz#3.0.0.
*/

;['bar', 'foo', 'buzz'].forEach(function (name) {
  var mockurl = 'ssh://git@github.com/BryanDonovan/dummy-npm-' + name + '.git'
  var realrepo = path.resolve(wd, 'github-com-BryanDonovan-dummy-npm-' + name + '.git')
  var tgz = path.resolve(fixtures, 'github-com-BryanDonovan-dummy-npm-' + name + '.git.tar.gz')

  testrepos[mockurl] = realrepo
  testtarballs.push(tgz)
})

function cleanup () {
  process.chdir(tmpdir())
  rimraf.sync(wd)
}

var npm = requireInject.installGlobally('../../lib/npm.js', {
  'child_process': {
    'execFile': function (cmd, args, options, cb) {
      // If it's a clone we swap any requests for any of the urls we're mocking
      // with the path to the bare repo
      if (args[0] === 'clone') {
        var m2 = args.length - 2
        var m1 = args.length - 1
        if (testrepos[args[m2]]) {
          testurls[args[m1]] = args[m2]
          args[m2] = testrepos[args[m2]]
        }
        execFile(cmd, args, options, cb)
      // here, we intercept npm validating the remote origin url on one of the
      // clones we've done previously and return the original url that was requested
      } else if (args[0] === 'config' && args[1] === '--get' && args[2] === 'remote.origin.url') {
        process.nextTick(function () {
          cb(null, testurls[options.cwd], '')
        })
      } else {
        execFile(cmd, args, options, cb)
      }
    }
  }
})
function extract (tarball, target, cb) {
  cb = once(cb)
  fs.createReadStream(tarball).on('error', function (er) { cb(er) })
    .pipe(zlib.createGunzip()).on('error', function (er) { cb(er) })
    .pipe(tar.Extract({ path: target })).on('error', function (er) { cb(er) })
    .on('end', function () { cb() })
}

// Copied from lib/utils/git, because we need to use
// it before calling npm.load and lib/utils/git uses npm.js
// which doesn't allow that. =( =(
function prefixGitArgs () {
  return process.platform === 'win32' ? ['-c', 'core.longpaths=true'] : []
}

var gitcmd

function execGit (args, options, cb) {
  var fullArgs = prefixGitArgs().concat(args || [])
  return execFile(gitcmd, fullArgs, options, cb)
}

function gitWhichAndExec (args, options, cb) {
  if (gitcmd) return execGit(args, options, cb)

  which('git', function (err, pathtogit) {
    if (err) {
      err.code = 'ENOGIT'
      return cb(err)
    }
    gitcmd = pathtogit

    execGit(args, options, cb)
  })
}

function andClone (gitdir, repodir, cb) {
  return function (er) {
    if (er) return cb(er)
    gitWhichAndExec(['clone', gitdir, repodir], {}, cb)
  }
}

function setup (cb) {
  cleanup()
  mkdirp.sync(wd)

  extract(testcase_tgz, wd, andClone(testcase_git, testcase_path, andExtractPackages))

  function andExtractPackages (er) {
    if (er) return cb(er)
    asyncMap(testtarballs, function (tgz, done) {
      extract(tgz, wd, done)
    }, andChdir)
  }
  function andChdir (er) {
    if (er) return cb(er)
    process.chdir(testcase_path)
    andLoadNpm()
  }
  function andLoadNpm () {
    var opts = { cache: path.resolve(wd, 'cache') }
    npm.load(opts, cb)
  }
}

// There are two valid trees that can result; we don't care which one we
// get in npm@2.
var oneTree = [
  'npm-git-test@1.0.0', [
    ['dummy-npm-bar@4.0.0', [
      ['dummy-npm-foo@3.0.0', []]
    ]],
    ['dummy-npm-buzz@3.0.0', []],
    ['dummy-npm-foo@4.0.0', [
      ['dummy-npm-buzz@2.0.0', []]
    ]]
  ]
]
var otherTree = [
  'npm-git-test@1.0.0', [
    ['dummy-npm-bar@4.0.0', [
      ['dummy-npm-buzz@3.0.0', []],
      ['dummy-npm-foo@3.0.0', []]
    ]],
    ['dummy-npm-buzz@3.0.0', []],
    ['dummy-npm-foo@4.0.0', [
      ['dummy-npm-buzz@2.0.0', []]
    ]]
  ]
]

function toSimple (tree) {
  var deps = []
  Object.keys(tree.dependencies || {}).forEach(function (dep) {
    deps.push(toSimple(tree.dependencies[dep]))
  })
  return [ tree['name'] + '@' + tree['version'], deps ]
}
test('setup', function (t) {
  setup(function (er) {
    t.ifError(er, 'setup ran OK')
    t.end()
  })
})

test('correct versions are installed for git dependency', function (t) {
  t.plan(3)
  t.comment('test for https://github.com/npm/npm/issues/7202')
  npm.commands.install([], function (er) {
    t.ifError(er, 'installed OK')
    npm.commands.ls([], true, function (er, result) {
      t.ifError(er, 'ls OK')
      var simplified = toSimple(result)
      t.ok(
        deepEqual(simplified, oneTree) || deepEqual(simplified, otherTree),
        'install tree is correct'
      )
    })
  })
})

npm_3.5.2.orig/test/tap/github-shortcut-package.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'github-shortcut-package')

var json = {
  name: 'github-shortcut-package',
  version: '0.0.0',
  dependencies: {
    'private': 'foo/private'
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

test('github-shortcut-package', function (t) {
  var cloneUrls = [
    ['git://github.com/foo/private.git', 'GitHub shortcuts try git URLs first'],
    ['https://github.com/foo/private.git', 'GitHub shortcuts try HTTPS URLs second'],
    ['git@github.com:foo/private.git', 'GitHub shortcuts try SSH third']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error('execFile mock fails on purpose'))
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install([], function (er, result) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/github-shortcut.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'github-shortcut')

var json = {
  name: 'github-shortcut',
  version: '0.0.0'
}

test('setup', function (t) {
  setup()
  t.end()
})

test('github-shortcut', function (t) {
  var cloneUrls = [
    ['git://github.com/foo/private.git', 'GitHub shortcuts try git URLs first'],
    ['https://github.com/foo/private.git', 'GitHub shortcuts try HTTPS URLs second'],
    ['git@github.com:foo/private.git', 'GitHub shortcuts try SSH third']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error('execFile mock fails on purpose'))
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  t.plan(1 + cloneUrls.length)
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install(['foo/private'], function (er, result) {
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/gitlab-shortcut-package.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'gitlab-shortcut-package')

var json = {
  name: 'gitlab-shortcut-package',
  version: '0.0.0',
  dependencies: {
    'private': 'gitlab:foo/private'
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

test('gitlab-shortcut-package', function (t) {
  var cloneUrls = [
    ['https://gitlab.com/foo/private.git', 'GitLab shortcuts try HTTPS URLs first'],
    ['git@gitlab.com:foo/private.git', 'GitLab shortcuts try SSH second']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error('execFile mock fails on purpose'))
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install([], function (er, result) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/gitlab-shortcut.js

'use strict'

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var requireInject = require('require-inject')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'gitlab-shortcut')

var json = {
  name: 'gitlab-shortcut',
  version: '0.0.0'
}

test('setup', function (t) {
  setup()
  t.end()
})

test('gitlab-shortcut', function (t) {
  var cloneUrls = [
    ['https://gitlab.com/foo/private.git', 'GitLab shortcuts try HTTPS URLs first'],
    ['git@gitlab.com:foo/private.git', 'GitLab shortcuts try SSH second']
  ]
  var npm = requireInject.installGlobally('../../lib/npm.js', {
    'child_process': {
      'execFile': function (cmd, args, options, cb) {
        process.nextTick(function () {
          if (args[0] !== 'clone') return cb(null, '', '')
          var cloneUrl = cloneUrls.shift()
          if (cloneUrl) {
            t.is(args[3], cloneUrl[0], cloneUrl[1])
          } else {
            t.fail('too many attempts to clone')
          }
          cb(new Error('execFile mock fails on purpose'))
        })
      }
    }
  })

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    prefix: pkg,
    registry: common.registry,
    loglevel: 'silent'
  }
  npm.load(opts, function (er) {
    t.ifError(er, 'npm loaded without error')
    npm.commands.install(['gitlab:foo/private'], function (er, result) {
      t.ok(er, 'mocked install failed as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/global-prefix-set-in-userconfig.js

var common = require('../common-tap.js')
var test = require('tap').test
var rimraf = require('rimraf')
var prefix = __filename.replace(/\.js$/, '')
var rcfile = __filename.replace(/\.js$/, '.npmrc')
var fs = require('fs')
var conf = 'prefix = ' + prefix + '\n'

test('setup', function (t) {
  rimraf.sync(prefix)
  fs.writeFileSync(rcfile, conf)
  t.pass('ready')
  t.end()
})

test('run command', function (t) {
  var args = ['prefix', '-g', '--userconfig=' + rcfile]
  common.npm(args, { env: {} }, function (er, code, so) {
    if (er) throw er
    t.notOk(code, 'npm prefix exited with code 0')
    t.equal(so.trim(), prefix)
    t.end()
  })
})

test('made dir', function (t) {
  t.ok(fs.statSync(prefix).isDirectory())
  t.end()
})

test('cleanup', function (t) {
  rimraf.sync(prefix)
  rimraf.sync(rcfile)
  t.pass('clean')
  t.end()
})
npm_3.5.2.orig/test/tap/graceful-restart.js

var fs = require('fs')
var resolve = require('path').resolve

var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'graceful-restart')

var outGraceless = [
  'prerestart',
  'prestop',
  'stop',
  'poststop',
  'prestart',
  'start',
  'poststart',
  'postrestart',
  ''
].join('\n')

var outGraceful = [
  'prerestart',
  'restart',
  'postrestart',
  ''
].join('\n')

var pjGraceless = JSON.stringify({
  name: 'graceless',
  version: '1.2.3',
  scripts: {
    'prestop': 'echo prestop',
    'stop': 'echo stop',
    'poststop': 'echo poststop',
    'prerestart': 'echo prerestart',
    'postrestart': 'echo postrestart',
    'prestart': 'echo prestart',
    'start': 'echo start',
    'poststart': 'echo poststart'
  }
}, null, 2) + '\n'

var pjGraceful = JSON.stringify({
  name: 'graceful',
  version: '1.2.3',
  scripts: {
    'prestop': 'echo prestop',
    'stop': 'echo stop',
    'poststop': 'echo poststop',
    'prerestart': 'echo prerestart',
    'restart': 'echo restart',
    'postrestart': 'echo postrestart',
    'prestart': 'echo prestart',
    'start': 'echo start',
    'poststart': 'echo poststart'
  }
}, null, 2) + '\n'

test('setup', function (t) {
  bootstrap()
  t.end()
})

test('graceless restart', function (t) {
  fs.writeFileSync(resolve(pkg, 'package.json'), pjGraceless)
  createChild(['run-script', 'restart'], function (err, code, out) {
    t.ifError(err, 'restart finished successfully')
    t.equal(code, 0, 'npm run-script exited with code 0')
    t.equal(out, outGraceless, 'expected all scripts to run')
    t.end()
  })
})

test('graceful restart', function (t) {
  fs.writeFileSync(resolve(pkg, 'package.json'), pjGraceful)
  createChild(['run-script', 'restart'], function (err, code, out) {
    t.ifError(err, 'restart finished successfully')
    t.equal(code, 0, 'npm run-script exited with code 0')
    t.equal(out, outGraceful, 'expected only *restart scripts to run')
    t.end()
  })
})

test('clean', function (t) {
  cleanup()
  t.end()
})

function bootstrap () {
  mkdirp.sync(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

function createChild (args, cb) {
  var env = {
    HOME: process.env.HOME,
    Path: process.env.PATH,
    PATH: process.env.PATH,
    'npm_config_loglevel': 'silent'
  }

  if (process.platform === 'win32') {
    env.npm_config_cache = '%APPDATA%\\npm-cache'
  }

  return common.npm(args, {
    cwd: pkg,
    stdio: ['ignore', 'pipe', 'ignore'],
    env: env
  }, cb)
}
npm_3.5.2.orig/test/tap/ignore-install-link.js

if (process.platform === 'win32') {
  console.log('ok - symlinks are weird on windows, skip this test')
  process.exit(0)
}

var common = require('../common-tap.js')
var test = require('tap').test
var path = require('path')
var fs = require('fs')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')

var root = path.resolve(__dirname, 'ignore-install-link')
var pkg = path.resolve(root, 'pkg')
var dep = path.resolve(root, 'dep')
var target = path.resolve(pkg, 'node_modules', 'dep')
var cache = path.resolve(root, 'cache')
var globalPath = path.resolve(root, 'global')

var pkgj = {
  'name': 'pkg',
  'version': '1.2.3',
  'dependencies': {
    'dep': '1.2.3'
  }
}
var depj = {
  'name': 'dep',
  'version': '1.2.3'
}

var myreg = require('http').createServer(function (q, s) {
  s.statusCode = 403
  s.end(JSON.stringify({ 'error': 'forbidden' }) + '\n')
}).listen(common.port)

test('setup', function (t) {
  rimraf.sync(root)
  mkdirp.sync(root)
  mkdirp.sync(path.resolve(pkg, 'node_modules'))
  mkdirp.sync(dep)
  mkdirp.sync(cache)
  mkdirp.sync(globalPath)
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify(pkgj))
  fs.writeFileSync(path.resolve(dep, 'package.json'), JSON.stringify(depj))
  fs.symlinkSync(dep, target, 'dir')
  t.end()
})

test('ignore install if package is linked', function (t) {
  common.npm(['install'], {
    cwd: pkg,
    env: {
      PATH: process.env.PATH || process.env.Path,
      HOME: process.env.HOME,
      'npm_config_prefix': globalPath,
      'npm_config_cache': cache,
      'npm_config_registry': common.registry,
      'npm_config_loglevel': 'silent'
    },
    stdio: 'inherit'
  }, function (er, code) {
    if (er) throw er
    t.equal(code, 0, 'npm install exited with code 0')
    t.end()
  })
})

test('still a symlink', function (t) {
  t.equal(true, fs.lstatSync(target).isSymbolicLink())
  t.end()
})

test('cleanup', function (t) {
  rimraf.sync(root)
  myreg.close()
  t.end()
})
npm_3.5.2.orig/test/tap/ignore-scripts.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

// ignore-scripts/package.json has scripts that always exit with non-zero
// error codes.
var pkg = path.resolve(__dirname, 'ignore-scripts')

var gypfile = 'bad_binding_file\n'
var json = {
  author: 'Milton the Aussie',
  name: 'ignore-scripts',
  version: '0.0.0',
  scripts: {
    prepublish: 'exit 123',
    publish: 'exit 123',
    postpublish: 'exit 123',
    preinstall: 'exit 123',
    install: 'exit 123',
    postinstall: 'exit 123',
    preuninstall: 'exit 123',
    uninstall: 'exit 123',
    postuninstall: 'exit 123',
    pretest: 'exit 123',
    test: 'exit 123',
    posttest: 'exit 123',
    prestop: 'exit 123',
    stop: 'exit 123',
    poststop: 'exit 123',
    prestart: 'exit 123',
    start: 'exit 123',
    poststart: 'exit 123',
    prerestart: 'exit 123',
    restart: 'exit 123',
    postrestart: 'exit 123',
    preversion: 'exit 123',
    version: 'exit 123',
    postversion: 'exit 123'
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

test('ignore-scripts: install using the option', function (t) {
  createChild(['install', '--ignore-scripts'], function (err, code) {
    t.ifError(err, 'install with scripts ignored finished successfully')
    t.equal(code, 0, 'npm install exited with code 0')
    t.end()
  })
})

test('ignore-scripts: install NOT using the option', function (t) {
  createChild(['install'], function (err, code) {
    t.ifError(err, 'install with scripts ran without error')
    t.notEqual(code, 0, 'npm install exited with a nonzero code')
    t.end()
  })
})

var scripts = [
  'prepublish', 'publish', 'postpublish',
  'preinstall', 'install', 'postinstall',
  'preuninstall', 'uninstall', 'postuninstall',
  'pretest', 'test', 'posttest',
  'prestop', 'stop', 'poststop',
  'prestart', 'start', 'poststart',
  'prerestart', 'restart', 'postrestart',
  'preversion', 'version', 'postversion'
]

scripts.forEach(function (script) {
  test('ignore-scripts: run-script ' + script + ' using the option', function (t) {
    createChild(['--ignore-scripts', 'run-script', script], function (err, code, stdout, stderr) {
      t.ifError(err, 'run-script ' + script + ' with ignore-scripts successful')
      t.equal(code, 0, 'npm run-script exited with code 0')
      t.end()
    })
  })
})

scripts.forEach(function (script) {
  test('ignore-scripts: run-script ' + script + ' NOT using the option', function (t) {
    createChild(['run-script', script], function (err, code) {
      t.ifError(err, 'run-script ' + script + ' ran without error')
      t.notEqual(code, 0, 'npm run-script exited with a nonzero code')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(path.join(pkg, 'binding.gyp'), gypfile)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
}

function createChild (args, cb) {
  return common.npm(
    args.concat(['--loglevel', 'silent']),
    { cwd: pkg },
    cb
  )
}
npm_3.5.2.orig/test/tap/ignore-shrinkwrap.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = require('path').join(__dirname, 'ignore-shrinkwrap')

var EXEC_OPTS = { cwd: pkg }

var customMocks = {
  'get': {
    '/package.js': [200, { ente: true }],
    '/shrinkwrap.js': [200, { ente: true }]
  }
}

var json = {
  author: 'Rocko Artischocko',
  name: 'ignore-shrinkwrap',
  version: '0.0.0',
  dependencies: {
    'npm-test-ignore-shrinkwrap-file': 'http://localhost:1337/package.js'
  }
}

var shrinkwrap = {
  name: 'ignore-shrinkwrap',
  version: '0.0.0',
  dependencies: {
    'npm-test-ignore-shrinkwrap-file': {
      version: '1.2.3',
      from: 'http://localhost:1337/shrinkwrap.js',
      resolved: 'http://localhost:1337/shrinkwrap.js',
      dependencies: {
        opener: {
          version: '1.3.0',
          from: 'opener@1.3.0'
        }
      }
    }
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

test('npm install --no-shrinkwrap', function (t) {
  mr({ port: common.port, mocks: customMocks }, function (err, s) {
    t.ifError(err, 'mock registry bootstrapped without issue')
    s._server.on('request', function (req) {
      switch (req.url) {
        case '/shrinkwrap.js':
          t.fail('npm-shrinkwrap.json used instead of package.json')
          break
        case '/package.js':
          t.pass('package.json used')
      }
    })

    common.npm(
      [
        '--registry', common.registry,
        '--loglevel', 'silent',
        'install', '--no-shrinkwrap'
      ],
      EXEC_OPTS,
      function (err, code) {
        t.ifError(err, 'npm ran without issue')
        t.ok(code, "install isn't going to succeed")
        s.close()
        t.end()
      }
    )
  })
})

test('npm install (with shrinkwrap)', function (t) {
  mr({ port: common.port, mocks: customMocks }, function (err, s) {
    t.ifError(err, 'mock registry bootstrapped without issue')
    s._server.on('request', function (req) {
      switch (req.url) {
        case '/shrinkwrap.js':
          t.pass('shrinkwrap used')
          break
        case '/package.js':
          t.fail('shrinkwrap ignored')
      }
    })

    common.npm(
      [
        '--registry', common.registry,
        '--loglevel', 'silent',
        'install'
      ],
      EXEC_OPTS,
      function (err, code) {
        t.ifError(err, 'npm ran without issue')
        t.ok(code, "install isn't going to succeed")
        s.close()
        t.end()
      }
    )
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  fs.writeFileSync(
    path.join(pkg, 'npm-shrinkwrap.json'),
    JSON.stringify(shrinkwrap, null, 2)
  )
  process.chdir(pkg)
}
npm_3.5.2.orig/test/tap/init-interrupt.js

// if 'npm init' is interrupted with ^C, don't report
// 'init written successfully'
var test = require('tap').test
var path = require('path')
var osenv = require('osenv')
var rimraf = require('rimraf')
var npmlog = require('npmlog')
var requireInject = require('require-inject')

var npm = require('../../lib/npm.js')

var PKG_DIR = path.resolve(__dirname, 'init-interrupt')

test('setup', function (t) {
  cleanup()
  t.end()
})

test('issue #6684 remove confusing message', function (t) {
  var initJsonMock = function (dir, input, config, cb) {
    process.nextTick(function () {
      cb({ message: 'canceled' })
    })
  }
  initJsonMock.yes = function () { return true }

  npm.load({ loglevel: 'silent' }, function () {
    var log = ''
    var init = requireInject('../../lib/init', {
      'init-package-json': initJsonMock
    })

    // capture log messages
    npmlog.on('log', function (chunk) {
      log += chunk.message + '\n'
    })

    init([], function (err, code) {
      t.ifError(err, 'init ran successfully')
      t.notOk(code, 'exited without issue')
      t.notSimilar(log, /written successfully/, 'no success message written')
      t.similar(log, /canceled/, 'alerted that init was canceled')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(PKG_DIR)
}

npm_3.5.2.orig/test/tap/install-actions.js

'use strict'
var npm = require('../../lib/npm.js')
var log = require('npmlog')
var test = require('tap').test

var mockLog = {
  finish: function () {},
  silly: function () {}
}

var actions

test('setup', function (t) {
  npm.load(function () {
    log.disableProgress()
    actions = require('../../lib/install/actions.js').actions
    t.end()
  })
})

test('->optdep:a->dep:b', function (t) {
  var moduleA = {
    name: 'a',
    path: '/',
    package: {
      scripts: {
        postinstall: 'false'
      },
      dependencies: {
        b: '*'
      }
    }
  }
  var moduleB = {
    name: 'b',
    path: '/',
    package: {},
    requires: [],
    requiredBy: [moduleA]
  }
  moduleA.requires = [moduleB]

  var tree = {
    path: '/',
    package: {
      optionalDependencies: {
        a: '*'
      }
    },
    children: [moduleA, moduleB],
    requires: [moduleA]
  }
  moduleA.requiredBy = [tree]

  t.plan(3)
  actions.postinstall('/', '/', moduleA, mockLog, function (er) {
    t.ok(er && er.code === 'ELIFECYCLE', 'lifecycle failed')
    t.ok(moduleA.failed, 'moduleA (optional dep) is marked failed')
    t.ok(moduleB.failed, 'moduleB (direct dep of moduleA) is marked as failed')
    t.end()
  })
})

test('->dep:b,->optdep:a->dep:b', function (t) {
  var moduleA = {
    name: 'a',
    path: '/',
    package: {
      scripts: {
        postinstall: 'false'
      },
      dependencies: {
        b: '*'
      }
    }
  }
  var moduleB = {
    name: 'b',
    path: '/',
    package: {},
    requires: [],
    requiredBy: [moduleA]
  }
  moduleA.requires = [moduleB]

  var tree = {
    path: '/',
    package: {
      dependencies: {
        b: '*'
      },
      optionalDependencies: {
        a: '*'
      }
    },
    children: [moduleA, moduleB],
    requires: [moduleA, moduleB]
  }
  moduleA.requiredBy = [tree]
  moduleB.requiredBy.push(tree)

  t.plan(3)
  actions.postinstall('/', '/', moduleA, mockLog, function (er) {
    t.ok(er && er.code === 'ELIFECYCLE', 'lifecycle failed')
    t.ok(moduleA.failed, 'moduleA (optional dep) is marked failed')
    t.ok(!moduleB.failed, 'moduleB (also a direct dep of the tree) is NOT marked as failed')
    t.end()
  })
})
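// A sketch of the minimal tree-node shape these action tests rely on,
// assuming lib/install/actions.js only reads the fields exercised by the
// fixtures above (nothing more is implied about npm's full internal node
// structure):
//
//   var node = {
//     name: 'a',         // package name
//     path: '/',         // install location on disk
//     package: {},       // the node's package.json contents
//     requires: [],      // nodes this node depends on
//     requiredBy: []     // nodes that depend on this node
//   }
//
// The two tests pin down the failure-propagation rule: when an optional
// dep's lifecycle script fails, node.failed is set, and the failure spreads
// only to deps reachable exclusively through the failed node.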
npm_3.5.2.orig/test/tap/install-at-locally.js

var fs = require('graceful-fs')
var path = require('path')
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'install-at-locally')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'install-at-locally-mock',
  version: '0.0.0'
}

test('setup', function (t) {
  cleanup()
  t.end()
})

test('\'npm install ./package@1.2.3\' should install local pkg', function (t) {
  var target = './package@1.2.3'
  setup(target)
  common.npm(['install', '--loglevel=silent', target], EXEC_OPTS, function (err, code) {
    var p = path.resolve(pkg, 'node_modules/install-at-locally-mock/package.json')
    t.ifError(err, 'install local package successful')
    t.equal(code, 0, 'npm install exited with code 0')
    t.ok(JSON.parse(fs.readFileSync(p, 'utf8')))
    t.end()
  })
})

test('\'npm install install/at/locally@./package@1.2.3\' should install local pkg', function (t) {
  var target = 'install/at/locally@./package@1.2.3'
  setup(target)
  common.npm(['install', target], EXEC_OPTS, function (err, code) {
    var p = path.resolve(pkg, 'node_modules/install-at-locally-mock/package.json')
    t.ifError(err, 'install local package in explicit directory successful')
    t.equal(code, 0, 'npm install exited with code 0')
    t.ok(JSON.parse(fs.readFileSync(p, 'utf8')))
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

function setup (target) {
  cleanup()
  var root = path.resolve(pkg, target)
  mkdirp.sync(root)
  fs.writeFileSync(
    path.join(root, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  mkdirp.sync(path.resolve(pkg, 'node_modules'))
  process.chdir(pkg)
}
npm_3.5.2.orig/test/tap/install-bad-dep-format.js

var fs = require('graceful-fs')
var path = require('path')
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var json = {
  author: 'John Foo',
  name: 'bad-dep-format',
  version: '0.0.0',
  dependencies: {
    'not-legit': 'npm:not-legit@1.0'
  }
}

test('invalid url format returns appropriate error', function (t) {
  setup(json)
  common.npm(['install'], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'install ran without error')
    t.equals(code, 1, 'install exited with code 1')
    t.match(stderr,
      /ERR.*Unsupported URL Type/,
      'error should report that invalid url-style formats are used')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup (json) {
  cleanup()
  process.chdir(mkPkg(json))
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  var pkgs = [json]
  pkgs.forEach(function (json) {
    rimraf.sync(path.resolve(__dirname, json.name))
  })
}

function mkPkg (json) {
  var pkgPath = path.resolve(__dirname, json.name)
  mkdirp.sync(pkgPath)
  fs.writeFileSync(
    path.join(pkgPath, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  return pkgPath
}

npm_3.5.2.orig/test/tap/install-bad-man.js

var fs = require('fs')
var resolve = require('path').resolve
var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'install-bad-man')
var target = resolve(__dirname, 'install-bad-man-target')

var EXEC_OPTS = { cwd: target }

var json = {
  name: 'install-bad-man',
  version: '1.2.3',
  man: [ './install-bad-man.1.lol' ]
}

test('setup', function (t) {
  setup()
  t.pass('setup ran')
  t.end()
})

test("install from repo on 'OS X'", function (t) {
  common.npm(
    [
      'install',
      '--prefix', target,
      '--global',
      pkg
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'npm command ran from test')
      t.equals(code, 1, 'install exited with failure (1)')
      t.notOk(stdout, 'no output indicating success')
      t.notOk(
        stderr.match(/Cannot read property '1' of null/),
        'no longer has cryptic error output'
      )
      t.ok(
        stderr.match(/install-bad-man\.1\.lol is not a valid name/),
        'got expected error output'
      )
      t.end()
    }
  )
})

test('clean', function (t) {
  cleanup()
  t.pass('cleaned up')
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  // make sure it installs locally
  mkdirp.sync(resolve(target, 'node_modules'))
  fs.writeFileSync(
    resolve(pkg, 'package.json'),
    JSON.stringify(json, null, 2) + '\n'
  )
  fs.writeFileSync(resolve(pkg, 'install-bad-man.1.lol'), 'lol\n')
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  rimraf.sync(target)
}
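// What the failing fixture above pins down: entries in a package.json "man"
// array appear to need a filename ending in a numeric man section (compare
// the passing './install-man.1' fixture in install-man.js below with the
// rejected './install-bad-man.1.lol'). A hedged sketch of a manifest that
// the install-man.js test shows succeeding (names are hypothetical):
//
//   {
//     "name": "example-with-man",
//     "version": "1.0.0",
//     "man": ["./example-with-man.1"]   // section 1 page
//   }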
npm_3.5.2.orig/test/tap/install-cli-only-development.js

var fs = require('graceful-fs')
var path = require('path')
var existsSync = fs.existsSync || path.existsSync

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'install-cli-development')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'install-cli-development',
  description: 'fixture',
  version: '0.0.0',
  dependencies: {
    dependency: 'file:./dependency'
  },
  devDependencies: {
    'dev-dependency': 'file:./dev-dependency'
  }
}

var dependency = {
  name: 'dependency',
  description: 'fixture',
  version: '0.0.0'
}

var devDependency = {
  name: 'dev-dependency',
  description: 'fixture',
  version: '0.0.0'
}

test('setup', function (t) {
  setup()
  t.pass('setup ran')
  t.end()
})

test('\'npm install --only=development\' should only install devDependencies', function (t) {
  common.npm(['install', '--only=development'], EXEC_OPTS, function (err, code) {
    t.ifError(err, 'install development successful')
    t.equal(code, 0, 'npm install did not raise error code')
    t.ok(
      JSON.parse(fs.readFileSync(
        path.resolve(pkg, 'node_modules/dev-dependency/package.json'), 'utf8')
      ),
      'devDependency was installed'
    )
    t.notOk(
      existsSync(path.resolve(pkg, 'node_modules/dependency/package.json')),
      'dependency was NOT installed'
    )
    t.end()
  })
})

test('\'npm install --only=development\' should only install devDependencies regardless of npm.config.get(\'production\')', function (t) {
  cleanup()
  setup()
  common.npm(['install', '--only=development', '--production'], EXEC_OPTS, function (err, code) {
    t.ifError(err, 'install development successful')
    t.equal(code, 0, 'npm install did not raise error code')
    t.ok(
      JSON.parse(fs.readFileSync(
        path.resolve(pkg, 'node_modules/dev-dependency/package.json'), 'utf8')
      ),
      'devDependency was installed'
    )
    t.notOk(
      existsSync(path.resolve(pkg, 'node_modules/dependency/package.json')),
      'dependency was NOT installed'
    )
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.pass('cleaned up')
  t.end()
})

function setup () {
  mkdirp.sync(path.join(pkg, 'dependency'))
  fs.writeFileSync(
    path.join(pkg, 'dependency', 'package.json'),
    JSON.stringify(dependency, null, 2)
  )

  mkdirp.sync(path.join(pkg, 'dev-dependency'))
  fs.writeFileSync(
    path.join(pkg, 'dev-dependency', 'package.json'),
    JSON.stringify(devDependency, null, 2)
  )

  mkdirp.sync(path.join(pkg, 'node_modules'))
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
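// A short summary of the dependency-selection behavior the tests above (and
// the two production-flavored siblings below) assert; the command lines are
// illustrative:
//
//   npm install --only=development                # devDependencies only
//   npm install --only=development --production   # --only wins over --production
//   npm install --only=production                 # dependencies only
//   npm install --production                      # dependencies only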
npm_3.5.2.orig/test/tap/install-cli-only-production.js

var fs = require('graceful-fs')
var path = require('path')
var existsSync = fs.existsSync || path.existsSync

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'install-cli-only-production')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'install-cli-only-production',
  description: 'fixture',
  version: '0.0.0',
  scripts: {
    prepublish: 'exit 123'
  },
  dependencies: {
    dependency: 'file:./dependency'
  },
  devDependencies: {
    'dev-dependency': 'file:./dev-dependency'
  }
}

var dependency = {
  name: 'dependency',
  description: 'fixture',
  version: '0.0.0'
}

var devDependency = {
  name: 'dev-dependency',
  description: 'fixture',
  version: '0.0.0'
}

test('setup', function (t) {
  mkdirp.sync(path.join(pkg, 'dependency'))
  fs.writeFileSync(
    path.join(pkg, 'dependency', 'package.json'),
    JSON.stringify(dependency, null, 2)
  )

  // the directory name must match the 'file:./dev-dependency' spec in the
  // fixture's package.json above
  mkdirp.sync(path.join(pkg, 'dev-dependency'))
  fs.writeFileSync(
    path.join(pkg, 'dev-dependency', 'package.json'),
    JSON.stringify(devDependency, null, 2)
  )

  mkdirp.sync(path.join(pkg, 'node_modules'))
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
  t.end()
})

test('\'npm install --only=production\' should only install dependencies', function (t) {
  common.npm(['install', '--only=production'], EXEC_OPTS, function (err, code) {
    t.ifError(err, 'install production successful')
    t.equal(code, 0, 'npm install did not raise error code')
    t.ok(
      JSON.parse(fs.readFileSync(
        path.resolve(pkg, 'node_modules/dependency/package.json'), 'utf8')
      ),
      'dependency was installed'
    )
    t.notOk(
      existsSync(path.resolve(pkg, 'node_modules/dev-dependency/package.json')),
      'devDependency was NOT installed'
    )
    t.end()
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})
npm_3.5.2.orig/test/tap/install-cli-production.js

var fs = require('graceful-fs')
var path = require('path')
var existsSync = fs.existsSync || path.existsSync

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'install-cli-production')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'install-cli-production',
  description: 'fixture',
  version: '0.0.0',
  scripts: {
    prepublish: 'exit 123'
  },
  dependencies: {
    dependency: 'file:./dependency'
  },
  devDependencies: {
    'dev-dependency': 'file:./dev-dependency'
  }
}

var dependency = {
  name: 'dependency',
  description: 'fixture',
  version: '0.0.0'
}

var devDependency = {
  name: 'dev-dependency',
  description: 'fixture',
  version: '0.0.0'
}

test('setup', function (t) {
  mkdirp.sync(path.join(pkg, 'dependency'))
  fs.writeFileSync(
    path.join(pkg, 'dependency', 'package.json'),
    JSON.stringify(dependency, null, 2)
  )

  mkdirp.sync(path.join(pkg, 'dev-dependency'))
  fs.writeFileSync(
    path.join(pkg, 'dev-dependency', 'package.json'),
    JSON.stringify(devDependency, null, 2)
  )

  mkdirp.sync(path.join(pkg, 'node_modules'))
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
  t.end()
})

test('\'npm install --production\' should only install dependencies', function (t) {
  common.npm(['install', '--production'], EXEC_OPTS, function (err, code) {
    t.ifError(err, 'install production successful')
    t.equal(code, 0, 'npm install did not raise error code')
    t.ok(
      JSON.parse(fs.readFileSync(
        path.resolve(pkg, 'node_modules/dependency/package.json'), 'utf8')
      ),
      'dependency was installed'
    )
    t.notOk(
      existsSync(path.resolve(pkg, 'node_modules/dev-dependency/package.json')),
      'devDependency was NOT installed'
    )
    t.end()
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})
npm_3.5.2.orig/test/tap/install-cli-unicode.js

var fs = require('graceful-fs')
var path = require('path')
var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var server
var pkg = path.resolve(__dirname, 'install-cli-unicode')

function hasOnlyAscii (s) {
  return /^[\000-\177]*$/.test(s)
}

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'install-cli',
  description: 'fixture',
  version: '0.0.1',
  dependencies: {
    read: '1.0.5'
  }
}

test('setup', function (t) {
  rimraf.sync(pkg)
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  mr({ port: common.port }, function (er, s) {
    server = s
    t.end()
  })
})

test('does not use unicode with --unicode false', function (t) {
  common.npm(
    [
      '--unicode', 'false',
      '--registry', common.registry,
      '--loglevel', 'silent',
      'install', 'optimist'
    ],
    EXEC_OPTS,
    function (err, code, stdout) {
      t.ifError(err, 'npm install ran without issue')
      t.notOk(code, 'npm install exited with code 0')
      t.ok(stdout, 'got some output')
      t.ok(hasOnlyAscii(stdout), 'only ASCII in install output')
      t.end()
    }
  )
})

test('cleanup', function (t) {
  server.close()
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})
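// hasOnlyAscii() above matches strings made entirely of code points
// \000-\177 (octal for 0x00-0x7F), i.e. the 7-bit ASCII range. A quick
// sanity check of the same predicate (the sample values are illustrative):
//
//   /^[\000-\177]*$/.test('plain output')    // true
//   /^[\000-\177]*$/.test('└── read@1.0.5')  // false: tree-drawing
//                                            // characters fall outside
//                                            // 7-bit ASCII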
npm_3.5.2.orig/test/tap/install-from-local.js

var fs = require('graceful-fs')
var path = require('path')
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

var root = path.join(__dirname, 'install-from-local')
var pkg = path.join(root, 'package-with-local-paths')

var EXEC_OPTS = { cwd: pkg }

var localPaths = {
  name: 'package-with-local-paths',
  version: '0.0.0',
  dependencies: {
    'package-local-dependency': 'file:../package-local-dependency'
  },
  devDependencies: {
    'package-local-dev-dependency': 'file:../package-local-dev-dependency'
  }
}

var localDependency = {
  name: 'package-local-dependency',
  version: '0.0.0',
  description: 'Test for local installs'
}

var localDevDependency = {
  name: 'package-local-dev-dependency',
  version: '0.0.0',
  description: 'Test for local installs'
}

test('setup', function (t) {
  rimraf.sync(pkg)
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(localPaths, null, 2)
  )

  mkdirp.sync(path.join(root, 'package-local-dependency'))
  fs.writeFileSync(
    path.join(root, 'package-local-dependency', 'package.json'),
    JSON.stringify(localDependency, null, 2)
  )

  mkdirp.sync(path.join(root, 'package-local-dev-dependency'))
  fs.writeFileSync(
    path.join(root, 'package-local-dev-dependency', 'package.json'),
    JSON.stringify(localDevDependency, null, 2)
  )

  process.chdir(pkg)
  t.end()
})

test('\'npm install\' should install local packages', function (t) {
  common.npm(
    [
      'install', '.'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'error should not exist')
      t.notOk(code, 'npm install exited with code 0')

      var dependencyPackageJson = path.resolve(
        pkg,
        'node_modules/package-local-dependency/package.json'
      )
      t.ok(
        JSON.parse(fs.readFileSync(dependencyPackageJson, 'utf8')),
        'package with local dependency installed'
      )

      var devDependencyPackageJson = path.resolve(
        pkg,
        'node_modules/package-local-dev-dependency/package.json'
      )
      t.ok(
        JSON.parse(fs.readFileSync(devDependencyPackageJson, 'utf8')),
        'package with local dev dependency installed'
      )
      t.end()
    }
  )
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(root)
  t.end()
})

npm_3.5.2.orig/test/tap/install-into-likenamed-folder.js

'use strict'
var path = require('path')
var fs = require('graceful-fs')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test
var common = require('../common-tap.js')

var base = path.join(__dirname, path.basename(__filename, '.js'))
var moduleDir = path.join(base, 'example-src')
var destDir = path.join(base, 'example')
var moduleJson = {
  name: 'example',
  version: '1.0.0'
}

function setup () {
  cleanup()
  mkdirp.sync(moduleDir)
  mkdirp.sync(path.join(destDir, 'node_modules'))
  fs.writeFileSync(path.join(moduleDir, 'package.json'), JSON.stringify(moduleJson))
}

function cleanup () {
  rimraf.sync(base)
}

test('setup', function (t) {
  setup()
  t.end()
})

test('like-named', function (t) {
  common.npm(['install', '../example-src'], {cwd: destDir}, function (er, code, stdout, stderr) {
    t.is(code, 0, 'no error code')
    t.is(stderr, '', 'no error output')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})
npm_3.5.2.orig/test/tap/install-link-scripts.js

var fs = require('graceful-fs')
var path = require('path')
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'install-link-scripts')
var tmp = path.join(pkg, 'tmp')
var dep = path.join(pkg, 'dep')

var json = {
  name: 'install-link-scripts',
  version: '1.0.0',
  description: 'a test',
  repository: 'git://github.com/npm/npm.git',
  license: 'ISC'
}

var dependency = {
  name: 'dep',
  version: '1.0.0',
  scripts: {
    install: './bin/foo'
  }
}

var foo = function () {/*
#!/usr/bin/env node

console.log('hey sup')
*/}.toString().split('\n').slice(1, -1).join('\n')

process.env.npm_config_prefix = tmp

test('plain install', function (t) {
  setup()

  common.npm(
    [
      'install', dep,
      '--tmp', tmp
    ],
    { cwd: pkg },
    function (err, code, stdout, stderr) {
      t.ifErr(err, 'npm install ' + dep + ' finished without error')
      t.equal(code, 0, 'exited ok')
      t.notOk(stderr, 'no output stderr')
      t.match(stdout, /hey sup/, 'install script for dep ran')
      t.end()
    }
  )
})

test('link', function (t) {
  setup()

  common.npm(
    [
      'link',
      '--tmp', tmp
    ],
    { cwd: dep },
    function (err, code, stdout, stderr) {
      t.ifErr(err, 'npm link finished without error')
      t.equal(code, 0, 'exited ok')
      t.notOk(stderr, 'no output stderr')
      t.match(stdout, /hey sup/, 'script ran')
      t.end()
    }
  )
})

test('install --link', function (t) {
  setup()

  common.npm(
    [
      'link',
      '--tmp', tmp
    ],
    { cwd: dep },
    function (err, code, stdout, stderr) {
      t.ifErr(err, 'npm link finished without error')

      common.npm(
        [
          'install', '--link', dependency.name,
          '--tmp', tmp
        ],
        { cwd: pkg },
        function (err, code, stdout, stderr) {
          t.ifErr(err, 'npm install --link finished without error')
          t.equal(code, 0, 'exited ok')
          t.notOk(stderr, 'no output stderr')
          t.notMatch(stdout, /hey sup/, "script didn't run")
          t.end()
        }
      )
    }
  )
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(tmp)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )

  mkdirp.sync(path.join(dep, 'bin'))
  fs.writeFileSync(
    path.join(dep, 'package.json'),
    JSON.stringify(dependency, null, 2)
  )
  fs.writeFileSync(path.join(dep, 'bin', 'foo'), foo, { mode: '0755' })
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
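// The `foo` fixture above uses a pre-ES6 trick for multi-line strings:
// embed the text in a block comment inside a function, stringify the
// function, and slice off the wrapper lines. A condensed sketch of the
// same technique:
//
//   var text = function () {/*
//   line one
//   line two
//   */}.toString().split('\n').slice(1, -1).join('\n')
//   // text === 'line one\nline two'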
npm_3.5.2.orig/test/tap/install-local-dep-cycle.js

'use strict'
var path = require('path')
var fs = require('graceful-fs')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test
var common = require('../common-tap.js')

var base = path.join(__dirname, path.basename(__filename, '.js'))

var baseJSON = {
  name: 'base',
  version: '1.0.0',
  dependencies: {
    a: 'file:a/',
    b: 'file:b/'
  }
}

var aPath = path.join(base, 'a')
var aJSON = {
  name: 'a',
  version: '1.0.0',
  dependencies: {
    b: 'file:../b',
    c: 'file:../c'
  }
}

var bPath = path.join(base, 'b')
var bJSON = {
  name: 'b',
  version: '1.0.0'
}

var cPath = path.join(base, 'c')
var cJSON = {
  name: 'c',
  version: '1.0.0',
  dependencies: {
    b: 'file:../b'
  }
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('install', function (t) {
  common.npm(['install'], {cwd: base}, function (er, code, stdout, stderr) {
    t.ifError(er, 'npm install ran without issue')
    t.is(code, 0, 'exited with a non-error code')
    t.is(stderr, '', 'ran without errors')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function saveJson (pkgPath, json) {
  mkdirp.sync(pkgPath)
  fs.writeFileSync(path.join(pkgPath, 'package.json'), JSON.stringify(json, null, 2))
}

function setup () {
  saveJson(base, baseJSON)
  saveJson(aPath, aJSON)
  saveJson(bPath, bJSON)
  saveJson(cPath, cJSON)
}

function cleanup () {
  rimraf.sync(base)
}
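// setup() above wires four local packages together with 'file:' specs so
// that b is reachable along several paths (from base directly, and via a
// and c); the test then asserts a plain install converges without errors.
// The on-disk layout setup() creates:
//
//   install-local-dep-cycle/
//     package.json     // dependencies: a -> file:a/, b -> file:b/
//     a/package.json   // dependencies: b -> file:../b, c -> file:../c
//     b/package.json   // no dependencies
//     c/package.json   // dependencies: b -> file:../b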
npm_3.5.2.orig/test/tap/install-man.js

var fs = require('fs')
var resolve = require('path').resolve
var osenv = require('osenv')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'install-man')
var target = resolve(__dirname, 'install-man-target')

var EXEC_OPTS = { cwd: target }

var json = {
  name: 'install-man',
  version: '1.2.3',
  man: [ './install-man.1' ]
}

test('setup', function (t) {
  setup()
  t.pass('setup ran')
  t.end()
})

test('install man page', function (t) {
  common.npm(
    [
      'install',
      '--prefix', target,
      '--global',
      pkg
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'npm command ran from test')
      t.equals(code, 0, 'install exited with success (0)')
      t.ok(stdout, 'output indicating success')
      t.ok(
        fs.existsSync(resolve(target, 'share', 'man', 'man1', 'install-man.1')),
        'man page link was created'
      )
      t.end()
    }
  )
})

test('clean', function (t) {
  cleanup()
  t.pass('cleaned up')
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  // make sure it installs locally
  mkdirp.sync(resolve(target, 'node_modules'))
  fs.writeFileSync(
    resolve(pkg, 'package.json'),
    JSON.stringify(json, null, 2) + '\n'
  )
  fs.writeFileSync(resolve(pkg, 'install-man.1'), 'THIS IS A MANPAGE\n')
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  rimraf.sync(target)
}

npm_3.5.2.orig/test/tap/install-noargs-dev.js

var fs = require('fs')
var path = require('path')
var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var server
var pkg = path.join(__dirname, 'install-noargs-dev')

var EXEC_OPTS = { cwd: pkg }

var PACKAGE_JSON1 = {
  name: 'install-noargs-dev',
  version: '0.0.1',
  devDependencies: {
    'underscore': '1.3.1'
  }
}

var PACKAGE_JSON2 = {
  name: 'install-noargs-dev',
  version: '0.0.2',
  devDependencies: {
    'underscore': '1.5.1'
  }
}

test('setup', function (t) {
  setup()
  mr({ port: common.port }, function (er, s) {
    t.ifError(er, 'started mock registry')
    server = s
    t.end()
  })
})

test('install noargs installs devDependencies', function (t) {
  common.npm(
    [
      '--registry', common.registry,
      '--loglevel', 'silent',
      'install'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm install ran without issue')
      t.notOk(code, 'npm install exited with code 0')

      var p = path.join(pkg, 'node_modules', 'underscore', 'package.json')
      var pkgJson = JSON.parse(fs.readFileSync(p))
      t.equal(pkgJson.version, '1.3.1')
      t.end()
    }
  )
})

test('install noargs installs updated devDependencies', function (t) {
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(PACKAGE_JSON2, null, 2)
  )

  common.npm(
    [
      '--registry', common.registry,
      '--loglevel', 'silent',
      'install'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm install ran without issue')
      t.notOk(code, 'npm install exited with code 0')

      var p = path.join(pkg, 'node_modules', 'underscore', 'package.json')
      var pkgJson = JSON.parse(fs.readFileSync(p))
      t.equal(pkgJson.version, '1.5.1')
      t.end()
    }
  )
})

test('cleanup', function (t) {
  server.close()
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

function setup () {
  cleanup()
  mkdirp.sync(path.resolve(pkg, 'node_modules'))
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(PACKAGE_JSON1, null, 2)
  )
  process.chdir(pkg)
}

npm_3.5.2.orig/test/tap/install-order.js

'use strict'
var test = require('tap').test
var sortActions = require('../../lib/install/diff-trees.js').sortActions

var a = {
  package: {_location: '/a', _requiredBy: []}
}
var b = {
  package: {_location: '/b', _requiredBy: []}
}
var c = {
  package: {_location: '/c', _requiredBy: ['/a', '/b']}
}

test('install-order when installing deps', function (t) {
  var plain = [
    ['add', a],
    ['add', b],
    ['add', c]]
  var sorted = [
    ['add', c],
    ['add', a],
    ['add', b]]
  t.isDeeply(sortActions(plain), sorted)
  t.end()
})

test('install-order when not installing deps', function (t) {
  var plain = [
    ['add', a],
    ['add', b]]
  var sorted = [
    ['add', a],
    ['add', b]]
  t.isDeeply(sortActions(plain), sorted)
  t.end()
})
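// install-order.js above pins down the contract of sortActions(): an action
// whose target is required by other packages in the same batch is hoisted
// ahead of its dependents; otherwise the original order is kept.
// Schematically, with the fixtures above:
//
//   input : [add a, add b, add c]   // c is required by /a and /b
//   output: [add c, add a, add b]   // dependency first
//
//   input : [add a, add b]          // no shared requirements
//   output: [add a, add b]          // order unchanged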
npm_3.5.2.orig/test/tap/install-preferglobal-warnings.js

var fs = require('graceful-fs')
var path = require('path')
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var preferGlobalJson = {
  name: 'npm-test-preferglobal-dep',
  version: '0.0.0',
  preferGlobal: true
}

var dependenciesJson = {
  name: 'npm-test-preferglobal-dependency-check',
  version: '0.0.0',
  dependencies: {
    'npm-test-preferglobal-dep': 'file:../' + preferGlobalJson.name
  }
}

var devDependenciesJson = {
  name: 'npm-test-preferglobal-devDependency-check',
  version: '0.0.0',
  devDependencies: {
    'npm-test-preferglobal-dep': 'file:../' + preferGlobalJson.name
  }
}

var emptyPackage = {
  name: 'npm-test-preferglobal-empty-package',
  version: '0.0.0'
}

test('install a preferGlobal dependency without warning', function (t) {
  setup(dependenciesJson)
  common.npm([
    'install',
    '--loglevel=warn'
  ], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'packages were installed')
    t.notMatch(
      stderr,
      /WARN.*prefer global/,
      'install should not warn when dependency is preferGlobal')
    t.end()
  })
})

test('install a preferGlobal devDependency without warning', function (t) {
  setup(devDependenciesJson)
  common.npm([
    'install',
    '--loglevel=warn'
  ], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'packages were installed')
    t.notMatch(
      stderr,
      /WARN.*prefer global/,
      'install should not warn when devDependency is preferGlobal')
    t.end()
  })
})

test('warn if a preferGlobal package is being installed direct', function (t) {
  setup(emptyPackage)
  common.npm([
    'install',
    'file:../' + preferGlobalJson.name,
    '--loglevel=warn'
  ], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'packages were installed')
    t.match(
      stderr,
      /WARN.*prefer global/,
      'install should warn when new package is preferGlobal')
    t.end()
  })
})

test('warn if a preferGlobal package is being saved', function (t) {
  setup(emptyPackage)
  common.npm([
    'install',
    'file:../' + preferGlobalJson.name,
    '--save',
    '--loglevel=warn'
  ], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'packages were installed')
    t.match(
      stderr,
      /WARN.*prefer global/,
      'install should warn when new package is preferGlobal')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup (json) {
  cleanup()
  mkPkg(preferGlobalJson)
  process.chdir(mkPkg(json))
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  var pkgs = [
    preferGlobalJson,
    dependenciesJson,
    devDependenciesJson,
    emptyPackage
  ]
  pkgs.forEach(function (json) {
    rimraf.sync(path.resolve(__dirname, json.name))
  })
}

function mkPkg (json) {
  var pkgPath = path.resolve(__dirname, json.name)
  mkdirp.sync(pkgPath)
  fs.writeFileSync(
    path.join(pkgPath, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  return pkgPath
}
npm_3.5.2.orig/test/tap/install-save-exact.js

var fs = require('graceful-fs')
var path = require('path')
var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var server
var pkg = path.join(__dirname, 'install-save-exact')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'install-save-exact',
  version: '0.0.1',
  description: 'fixture'
}

test('setup', function (t) {
  setup()
  mr({ port: common.port }, function (er, s) {
    server = s
    t.end()
  })
})

test('\'npm install --save --save-exact\' should install local pkg', function (t) {
  common.npm(
    [
      '--loglevel', 'silent',
      '--registry', common.registry,
      '--save',
      '--save-exact',
      'install', 'underscore@1.3.1'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm ran without issue')
      t.notOk(code, 'npm install exited without raising an error code')

      var p = path.resolve(pkg, 'node_modules/underscore/package.json')
      t.ok(JSON.parse(fs.readFileSync(p)))

      p = path.resolve(pkg, 'package.json')
      var pkgJson = JSON.parse(fs.readFileSync(p, 'utf8'))
      t.same(
        pkgJson.dependencies,
        { 'underscore': '1.3.1' },
        'underscore dependency should specify exactly 1.3.1'
      )
      t.end()
    }
  )
})

test('\'npm install --save-dev --save-exact\' should install local pkg', function (t) {
  setup()
  common.npm(
    [
      '--loglevel', 'silent',
      '--registry', common.registry,
      '--save-dev',
      '--save-exact',
      'install', 'underscore@1.3.1'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm ran without issue')
      t.notOk(code, 'npm install exited without raising an error code')

      var p = path.resolve(pkg, 'node_modules/underscore/package.json')
      t.ok(JSON.parse(fs.readFileSync(p)))

      p = path.resolve(pkg, 'package.json')
      var pkgJson = JSON.parse(fs.readFileSync(p, 'utf8'))
      t.same(
        pkgJson.devDependencies,
        { 'underscore': '1.3.1' },
        'underscore dependency should specify exactly 1.3.1'
      )
      t.end()
    }
  )
})

test('cleanup', function (t) {
  server.close()
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

function setup () {
  cleanup()
  mkdirp.sync(path.resolve(pkg, 'node_modules'))
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}
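// The two tests above check what lands in package.json rather than what
// lands on disk. In short (versions as used in the fixtures):
//
//   npm install --save --save-exact underscore@1.3.1
//     => "dependencies": { "underscore": "1.3.1" }   // exact, no range prefix
//
// Without --save-exact the saved range gets the configured save-prefix;
// the install-save-prefix tests below exercise '^' and '~' explicitly:
//
//   npm install --save --save-prefix '^' underscore@1.3.1
//     => "dependencies": { "underscore": "^1.3.1" }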
npm_3.5.2.orig/test/tap/install-save-local.js

var fs = require('graceful-fs')
var path = require('path')
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var root = path.join(__dirname, 'install-save-local')
var pkg = path.join(root, 'package')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'install-save-local',
  version: '0.0.0'
}

var localDependency = {
  name: 'package-local-dependency',
  version: '0.0.0'
}

var localDevDependency = {
  name: 'package-local-dev-dependency',
  version: '0.0.0'
}

test('setup', function (t) {
  setup()
  t.end()
})

test('\'npm install --save ../local/path\' should save to package.json', function (t) {
  common.npm(
    [
      '--loglevel', 'silent',
      '--save',
      'install', '../package-local-dependency'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm install ran without issue')
      t.notOk(code, 'npm install exited with code 0')

      var dependencyPackageJson = path.join(
        pkg, 'node_modules', 'package-local-dependency', 'package.json'
      )
      t.ok(JSON.parse(fs.readFileSync(dependencyPackageJson, 'utf8')))

      var pkgJson = JSON.parse(fs.readFileSync(pkg + '/package.json', 'utf8'))
      t.is(Object.keys(pkgJson.dependencies).length, 1, 'only one dep')
      t.ok(
        /file:.*?[/]package-local-dependency$/.test(pkgJson.dependencies['package-local-dependency']),
        'local package saved correctly'
      )
      t.end()
    }
  )
})

test('\'npm install --save-dev ../local/path\' should save to package.json', function (t) {
  setup()
  common.npm(
    [
      '--loglevel', 'silent',
      '--save-dev',
      'install', '../package-local-dev-dependency'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm install ran without issue')
      t.notOk(code, 'npm install exited with code 0')

      var dependencyPackageJson = path.resolve(
        pkg, 'node_modules', 'package-local-dev-dependency', 'package.json'
      )
      t.ok(JSON.parse(fs.readFileSync(dependencyPackageJson, 'utf8')))

      var pkgJson = JSON.parse(fs.readFileSync(pkg + '/package.json', 'utf8'))
      t.is(Object.keys(pkgJson.devDependencies).length, 1, 'only one dep')
      t.ok(
        /file:.*?[/]package-local-dev-dependency$/.test(pkgJson.devDependencies['package-local-dev-dependency']),
        'local package saved correctly'
      )
      t.end()
    }
  )
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  process.chdir(__dirname)
  rimraf.sync(root)
}

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )

  mkdirp.sync(path.join(root, 'package-local-dependency'))
  fs.writeFileSync(
    path.join(root, 'package-local-dependency', 'package.json'),
    JSON.stringify(localDependency, null, 2)
  )

  mkdirp.sync(path.join(root, 'package-local-dev-dependency'))
  fs.writeFileSync(
    path.join(root, 'package-local-dev-dependency', 'package.json'),
    JSON.stringify(localDevDependency, null, 2)
  )

  process.chdir(pkg)
}
npm_3.5.2.orig/test/tap/install-save-prefix.js

var fs = require('fs')
var path = require('path')
var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var server
var pkg = path.join(__dirname, 'install-save-prefix')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'install-save-prefix',
  version: '0.0.1'
}

test('setup', function (t) {
  setup()
  mr({ port: common.port }, function (er, s) {
    t.ifError(er, 'started mock registry')
    server = s
    t.end()
  })
})

test('install --save with \'^\' save prefix should accept minor updates', function (t) {
  common.npm(
    [
      '--registry', common.registry,
      '--loglevel', 'silent',
      '--save-prefix', '^',
      '--save',
      'install', 'underscore@latest'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm install ran without issue')
      t.notOk(code, 'npm install exited with code 0')

      var p = path.join(pkg, 'node_modules', 'underscore', 'package.json')
      t.ok(JSON.parse(fs.readFileSync(p)))

      var pkgJson = JSON.parse(fs.readFileSync(
        path.join(pkg, 'package.json'),
        'utf8'
      ))
      t.deepEqual(
        pkgJson.dependencies,
        { 'underscore': '^1.5.1' },
        'got expected save prefix and version of 1.5.1'
      )
      t.end()
    }
  )
})

test('install --save-dev with \'^\' save prefix should accept minor dev updates', function (t) {
  setup()
  common.npm(
    [
      '--registry', common.registry,
      '--loglevel', 'silent',
      '--save-prefix', '^',
      '--save-dev',
      'install', 'underscore@1.3.1'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm install ran without issue')
      t.notOk(code, 'npm install exited with code 0')

      var p = path.join(pkg, 'node_modules', 'underscore', 'package.json')
      t.ok(JSON.parse(fs.readFileSync(p)))

      var pkgJson = JSON.parse(fs.readFileSync(
        path.join(pkg, 'package.json'),
        'utf8'
      ))
      t.deepEqual(
        pkgJson.devDependencies,
        { 'underscore': '^1.3.1' },
        'got expected save prefix and version of 1.3.1'
      )
      t.end()
    }
  )
})

test('install --save with \'~\' save prefix should accept patch updates', function (t) {
  setup()
  common.npm(
    [
      '--registry', common.registry,
      '--loglevel', 'silent',
      '--save-prefix', '~',
      '--save',
      'install', 'underscore@1.3.1'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm install ran without issue')
      t.notOk(code, 'npm install exited with code 0')

      var p = path.join(pkg, 'node_modules', 'underscore', 'package.json')
      t.ok(JSON.parse(fs.readFileSync(p)))

      var pkgJson = JSON.parse(fs.readFileSync(
        path.join(pkg, 'package.json'),
        'utf8'
      ))
      t.deepEqual(
        pkgJson.dependencies,
        { 'underscore': '~1.3.1' },
        'got expected save prefix and version of 1.3.1'
      )
      t.end()
    }
  )
})

test('install --save-dev with \'~\' save prefix should accept patch updates', function (t) {
  setup()
  common.npm(
    [
      '--registry', common.registry,
      '--loglevel', 'silent',
      '--save-prefix', '~',
      '--save-dev',
      'install', 'underscore@1.3.1'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm install ran without issue')
      t.notOk(code, 'npm install exited with code 0')

      var p = path.join(pkg, 'node_modules', 'underscore', 'package.json')
      t.ok(JSON.parse(fs.readFileSync(p)))

      var pkgJson = JSON.parse(fs.readFileSync(
        path.join(pkg, 'package.json'),
        'utf8'
      ))
      t.deepEqual(
        pkgJson.devDependencies,
        { 'underscore': '~1.3.1' },
        'got expected save prefix and version of 1.3.1'
      )
      t.end()
    }
  )
})
test('cleanup', function (t) {
  server.close()
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

function setup () {
  cleanup()
  mkdirp.sync(path.resolve(pkg, 'node_modules'))
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

npm_3.5.2.orig/test/tap/install-scoped-already-installed.js

var fs = require('graceful-fs')
var path = require('path')
var existsSync = fs.existsSync || path.existsSync

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

var root = path.join(__dirname, 'install-scoped-already-installed')
var pkg = path.join(root, 'package-with-scoped-paths')
var modules = path.join(pkg, 'node_modules')

var EXEC_OPTS = { cwd: pkg }

var scopedPaths = {
  name: 'package-with-scoped-paths',
  version: '0.0.0',
  dependencies: {
    'package-local-dependency': 'file:../package-local-dependency',
    '@scoped/package-scoped-dependency': 'file:../package-scoped-dependency'
  }
}

var localDependency = {
  name: 'package-local-dependency',
  version: '0.0.0',
  description: 'Test for local installs'
}

var scopedDependency = {
  name: '@scoped/package-scoped-dependency',
  version: '0.0.0',
  description: 'Test for local installs'
}

test('setup', function (t) {
  rimraf.sync(root)
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(scopedPaths, null, 2)
  )

  mkdirp.sync(path.join(root, 'package-local-dependency'))
  fs.writeFileSync(
    path.join(root, 'package-local-dependency', 'package.json'),
    JSON.stringify(localDependency, null, 2)
  )

  mkdirp.sync(path.join(root, 'package-scoped-dependency'))
  fs.writeFileSync(
    path.join(root, 'package-scoped-dependency', 'package.json'),
    JSON.stringify(scopedDependency, null, 2)
  )

  process.chdir(pkg)
  t.end()
})

test('installing already installed local scoped package', function (t) {
  common.npm(
    [
      '--loglevel', 'silent',
      '--parseable',
      'install'
    ],
    EXEC_OPTS,
    function (err, code, stdout) {
      var installed = parseNpmInstallOutput(stdout)
      t.ifError(err, 'install ran to completion without error')
      t.notOk(code, 'npm install exited with code 0')
      t.ok(
        existsSync(path.join(modules, '@scoped', 'package-scoped-dependency', 'package.json')),
        'package installed'
      )
      t.ok(
        contains(installed, 'node_modules/@scoped/package-scoped-dependency'),
        'installed @scoped/package-scoped-dependency'
      )
      t.ok(
        contains(installed, 'node_modules/package-local-dependency'),
        'installed package-local-dependency'
      )

      common.npm(
        [
          '--loglevel', 'silent',
          '--parseable',
          'install'
        ],
        EXEC_OPTS,
        function (err, code, stdout) {
          t.ifError(err, 'install ran to completion without error')
          t.notOk(code, 'npm install raised no error code')

          installed = parseNpmInstallOutput(stdout)

          t.ok(
            existsSync(path.join(modules, '@scoped', 'package-scoped-dependency', 'package.json')),
            'package installed'
          )

          t.notOk(
            contains(installed, 'node_modules/@scoped/package-scoped-dependency'),
            'did not reinstall @scoped/package-scoped-dependency'
          )
          t.notOk(
            contains(installed, 'node_modules/package-local-dependency'),
            'did not reinstall package-local-dependency'
          )
          t.end()
        }
      )
    }
  )
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(root)
  t.end()
})

function contains (list, element) {
  var matcher = new RegExp(element + '$')
  for (var i = 0; i < list.length; ++i) {
    if (matcher.test(list[i])) {
      return true
    }
  }
  return false
}

function parseNpmInstallOutput (stdout) {
  return stdout.trim().split(/\n\n|\s+/)
}
npm_3.5.2.orig/test/tap/install-scoped-link.js

var exec = require('child_process').exec
var fs = require('graceful-fs')
var path = require('path')
var existsSync = fs.existsSync || path.existsSync

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'install-scoped-link')
var work = path.join(__dirname, 'install-scoped-link-TEST')
var modules = path.join(work, 'node_modules')

var EXEC_OPTS = { cwd: work }

var world = 'console.log("hello blrbld")\n'

var json = {
  name: '@scoped/package',
  version: '0.0.0',
  bin: {
    hello: './world.js'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  fs.writeFileSync(path.join(pkg, 'world.js'), world)

  mkdirp.sync(modules)
  process.chdir(work)
  t.end()
})

test('installing package with links', function (t) {
  common.npm(
    [
      '--loglevel', 'silent',
      'install', pkg
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'install ran to completion without error')
      t.notOk(code, 'npm install exited with code 0')

      t.ok(
        existsSync(path.join(modules, '@scoped', 'package', 'package.json')),
        'package installed'
      )
      t.ok(existsSync(path.join(modules, '.bin')), 'binary link directory exists')

      var hello = path.join(modules, '.bin', 'hello')
      t.ok(existsSync(hello), 'binary link exists')

      exec('node ' + hello, function (err, stdout, stderr) {
        t.ifError(err, 'command ran fine')
        t.notOk(stderr, 'got no error output back')
        t.equal(stdout, 'hello blrbld\n', 'output was as expected')
        t.end()
      })
    }
  )
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(work)
  rimraf.sync(pkg)
}
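// What the test above pins down: for a scoped package, the scope shows up in
// the installed path but not in the bin link name, so a single flat .bin
// directory still works. Layout after the install, per the fixture:
//
//   install-scoped-link-TEST/node_modules/@scoped/package/  // the package
//   install-scoped-link-TEST/node_modules/.bin/hello        // runs world.js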

npm_3.5.2.orig/test/tap/install-scoped-with-peer-dependency.js

var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'install-scoped-with-peer-dependency')
var local = path.join(pkg, 'package')

var EXEC_OPTS = { }

var json = {
  name: '@scope/package',
  version: '0.0.0',
  peerDependencies: {
    underscore: '*'
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

test('it should install peerDependencies in same tree level as the parent package', function (t) {
  common.npm(['install', '--loglevel=warn', './package'], EXEC_OPTS, function (err, code, stdout, stderr) {
    t.ifError(err, 'install local package successful')
    t.equal(code, 0, 'npm install exited with code')
    t.match(stderr, /npm WARN @scope[/]package@0[.]0[.]0 requires a peer of underscore@[*] but none was installed[.]\n/,
      'npm install warned about unresolved peer dep')

    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(local)
  mkdirp.sync(path.resolve(pkg, 'node_modules'))
  fs.writeFileSync(
    path.join(local, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
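
// Behavior pinned down here: as of npm@3, peerDependencies are no longer
// installed automatically; npm only warns when a required peer (here,
// underscore@*) is missing, which is exactly what the stderr match asserts.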

npm_3.5.2.orig/test/tap/install-with-dev-dep-duplicate.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')
var npm = require('../../')

var pkg = path.resolve(__dirname, 'dev-dep-duplicate')

var json = {
  author: 'Anders Janmyr',
  name: 'dev-dep-duplicate',
  version: '0.0.0',
  dependencies: {
    underscore: '1.5.1'
  },
  devDependencies: {
    underscore: '1.3.1'
  }
}

var expected = {
  name: 'dev-dep-duplicate',
  version: '0.0.0',
  dependencies: {
    underscore: {
      version: '1.5.1',
      from: 'underscore@1.5.1',
      resolved: common.registry + '/underscore/-/underscore-1.5.1.tgz',
      invalid: true
    }
  }
}

test('prefers version from dependencies over devDependencies', function (t) {
  t.plan(1)

  mr({ port: common.port }, function (er, s) {
    setup(function (err) {
      if (err) return t.fail(err)

      npm.install('.', function (err) {
        if (err) return t.fail(err)

        npm.commands.ls([], true, function (err, _, results) {
          if (err) return t.fail(err)
          // these contain full paths so we can't do an exact match
          // with them
          delete results.problems
          delete results.dependencies.underscore.problems
          t.deepEqual(results, expected)
          s.close()
          t.end()
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup (cb) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    registry: common.registry
  }
  npm.load(opts, cb)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/invalid-cmd-exit-code.js

var test = require('tap').test
var common = require('../common-tap.js')

var opts = { cwd: process.cwd() }

test('npm asdf should return exit code 1', function (t) {
  common.npm(['asdf'], opts, function (er, c) {
    if (er) throw er
    t.ok(c, 'exit code should not be zero')
    t.end()
  })
})

test('npm help should return exit code 0', function (t) {
  common.npm(['help'], opts, function (er, c) {
    if (er) throw er
    t.equal(c, 0, 'exit code should be 0')
    t.end()
  })
})

test('npm help fadf should return exit code 0', function (t) {
  common.npm(['help', 'fadf'], opts, function (er, c) {
    if (er) throw er
    t.equal(c, 0, 'exit code should be 0')
    t.end()
  })
})

npm_3.5.2.orig/test/tap/is-fs-access-available.js

'use strict'
var test = require('tap').test
var requireInject = require('require-inject')
var semver = require('semver')

var globalProcess = global.process

function loadIsFsAccessAvailable (newProcess, fs) {
  global.process = newProcess
  var mocks = {fs: fs}
  var isFsAccessAvailable = requireInject('../../lib/install/is-fs-access-available.js', mocks)
  global.process = globalProcess
  return isFsAccessAvailable
}

var fsWithAccess = {access: function () {}}
var fsWithoutAccess = {}

if (semver.lt(process.version, '0.12.0')) {
  test('skipping', function (t) {
    t.pass('skipping all tests on < 0.12.0 due to process not being injectable')
    t.end()
  })
} else {
  test('mac + !fs.access', function (t) {
    var isFsAccessAvailable = loadIsFsAccessAvailable({platform: 'darwin'}, fsWithoutAccess)
    t.is(isFsAccessAvailable, false, 'not available')
    t.end()
  })

  test('mac + fs.access', function (t) {
    var isFsAccessAvailable = loadIsFsAccessAvailable({platform: 'darwin'}, fsWithAccess)
    t.is(isFsAccessAvailable, true, 'available')
    t.end()
  })

  test('windows + !fs.access', function (t) {
    var isFsAccessAvailable = loadIsFsAccessAvailable({platform: 'win32'}, fsWithoutAccess)
    t.is(isFsAccessAvailable, false, 'not available')
    t.end()
  })

  test('windows + fs.access + node 0.12.7', function (t) {
    var isFsAccessAvailable = loadIsFsAccessAvailable({platform: 'win32', version: '0.12.7'}, fsWithAccess)
    t.is(isFsAccessAvailable, false, 'not available')
    t.end()
  })

  test('windows + fs.access + node 2.4.0', function (t) {
    var isFsAccessAvailable = loadIsFsAccessAvailable({platform: 'win32', version: '2.4.0'}, fsWithAccess)
    t.is(isFsAccessAvailable, true, 'available')
    t.end()
  })
}
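
// A hedged reading of the decision table these cases imply for
// lib/install/is-fs-access-available.js (the tests, not a spec, are the
// source here):
//
//   no fs.access                  -> false on any platform
//   fs.access, non-Windows        -> true
//   fs.access, win32, node 0.12.x -> false (fs.access not trusted there)
//   fs.access, win32, node >= 2.x -> true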

npm_3.5.2.orig/test/tap/it.js

var join = require('path').join
var statSync = require('graceful-fs').statSync
var writeFileSync = require('graceful-fs').writeFileSync

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

var pkg = join(__dirname, 'run-script')
var installed = join(pkg, 'node_modules', 'underscore', 'package.json')

var json = {
  name: 'npm-it-test',
  dependencies: {
    underscore: '1.5.1'
  },
  scripts: {
    test: 'echo hax'
  }
}

var server

test('run up the mock registry', function (t) {
  mr({ port: common.port }, function (err, s) {
    if (err) throw err
    server = s
    t.end()
  })
})

test('npm install-test', function (t) {
  setup()
  common.npm('install-test', { cwd: pkg }, function (err, code, stdout, stderr) {
    if (err) throw err
    t.equal(code, 0, 'command ran without error')
    t.ok(statSync(installed), 'package was installed')
    t.equal(require(installed).version, '1.5.1', 'underscore got installed as expected')
    t.match(stdout, /hax/, 'found expected test output')
    t.notOk(stderr, 'stderr should be empty')
    t.end()
  })
})

test('npm it (the form most people will use)', function (t) {
  setup()
  common.npm('it', { cwd: pkg }, function (err, code, stdout, stderr) {
    if (err) throw err
    t.equal(code, 0, 'command ran without error')
    t.ok(statSync(installed), 'package was installed')
    t.equal(require(installed).version, '1.5.1', 'underscore got installed as expected')
    t.match(stdout, /hax/, 'found expected test output')
    t.notOk(stderr, 'stderr should be empty')
    t.end()
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  server.close()
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  writeFileSync(join(pkg, 'package.json'), JSON.stringify(json, null, 2))
}
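
// Both invocations exercise the same code path: 'it' is the short alias for
// 'install-test', which runs an install followed by the package's test
// script, as the two test cases verify with identical assertions.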

npm_3.5.2.orig/test/tap/lifecycle-path.js

var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'lifecycle-path')
var link = path.resolve(pkg, 'node-bin')

var PATH
if (process.platform === 'win32') {
  // On Windows the 'comspec' environment variable is used,
  // so cmd.exe does not need to be on the path.
  PATH = 'C:\\foo\\bar'
} else {
  // On non-Windows, without the path to the shell, nothing usually works.
  PATH = '/bin:/usr/bin'
}

var printPath = 'console.log(process.env.PATH)\n'

var json = {
  name: 'glorb',
  version: '1.2.3',
  scripts: {
    path: './node-bin/node print-path.js'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  fs.writeFileSync(path.join(pkg, 'print-path.js'), printPath)
  fs.symlinkSync(path.dirname(process.execPath), link, 'dir')
  t.end()
})

test('make sure the path is correct', function (t) {
  common.npm(['run-script', 'path'], {
    cwd: pkg,
    env: {
      PATH: PATH,
      stdio: [ 0, 'pipe', 2 ]
    }
  }, function (er, code, stdout) {
    if (er) throw er
    t.equal(code, 0, 'exit code')
    // remove the banner, we just care about the last line
    stdout = stdout.trim().split(/\r|\n/).pop()
    var pathSplit = process.platform === 'win32' ? ';' : ':'
    var root = path.resolve(__dirname, '../..')
    var actual = stdout.split(pathSplit).map(function (p) {
      if (p.indexOf(root) === 0) {
        p = '{{ROOT}}' + p.substr(root.length)
      }
      return p.replace(/\\/g, '/')
    })

    // get the ones we tacked on, then the system-specific requirements
    var expect = [
      '{{ROOT}}/bin/node-gyp-bin',
      '{{ROOT}}/test/tap/lifecycle-path/node_modules/.bin'
    ].concat(PATH.split(pathSplit).map(function (p) {
      return p.replace(/\\/g, '/')
    }))
    t.same(actual, expect)
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
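
// What the expectation encodes: for lifecycle scripts, npm prepends its
// bundled node-gyp-bin directory and the project's node_modules/.bin to the
// PATH it was given, so locally installed binaries shadow system ones. On
// non-Windows the script here should therefore see something shaped like:
//
//   {{ROOT}}/bin/node-gyp-bin:{{ROOT}}/test/tap/lifecycle-path/node_modules/.bin:/bin:/usr/bin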

npm_3.5.2.orig/test/tap/lifecycle-signal.js

var fs = require('graceful-fs')
var path = require('path')
var spawn = require('child_process').spawn

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var node = process.execPath
var npm = require.resolve('../../bin/npm-cli.js')

var pkg = path.resolve(__dirname, 'lifecycle-signal')

var json = {
  name: 'lifecycle-signal',
  version: '1.2.5',
  scripts: {
    preinstall: 'node -e "process.kill(process.pid,\'SIGSEGV\')"'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
  t.end()
})

test('lifecycle signal abort', function (t) {
  // windows does not use lifecycle signals, abort
  if (process.platform === 'win32' || process.env.TRAVIS) return t.end()
  var child = spawn(node, [npm, 'install'], {
    cwd: pkg
  })
  child.on('close', function (code, signal) {
    t.equal(code, null)
    t.equal(signal, 'SIGSEGV')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/lifecycle.js

var test = require('tap').test
var npm = require('../../')
var lifecycle = require('../../lib/utils/lifecycle')

test('lifecycle: make env correctly', function (t) {
  npm.load({enteente: Infinity}, function () {
    var env = lifecycle.makeEnv({}, null, process.env)
    t.equal('Infinity', env.npm_config_enteente)
    t.end()
  })
})
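
// makeEnv turns every npm config key into an npm_config_* environment
// variable for lifecycle scripts; the test loads npm with the made-up key
// "enteente" and expects the value to arrive stringified:
//
//   env.npm_config_enteente === 'Infinity'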

npm_3.5.2.orig/test/tap/link.js

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var path = require('path')
var rimraf = require('rimraf')
var test = require('tap').test
var writeFileSync = require('fs').writeFileSync

var common = require('../common-tap.js')

var link = path.join(__dirname, 'link')
var linkScoped = path.join(__dirname, 'link-scoped')
var linkInstall = path.join(__dirname, 'link-install')
var linkRoot = path.join(__dirname, 'link-root')

var config = 'prefix = ' + linkRoot
var configPath = path.join(link, '_npmrc')

var OPTS = {
  env: {
    'npm_config_userconfig': configPath
  }
}

var readJSON = {
  name: 'foo',
  version: '1.0.0',
  description: '',
  main: 'index.js',
  scripts: {
    test: 'echo \"Error: no test specified\" && exit 1'
  },
  author: '',
  license: 'ISC'
}

var readScopedJSON = {
  name: '@scope/foo',
  version: '1.0.0',
  description: '',
  main: 'index.js',
  scripts: {
    test: 'echo \"Error: no test specified\" && exit 1'
  },
  author: '',
  license: 'ISC'
}

var installJSON = {
  name: 'bar',
  version: '1.0.0',
  description: '',
  main: 'index.js',
  scripts: {
    test: 'echo \"Error: no test specified\" && exit 1'
  },
  author: '',
  license: 'ISC'
}

test('setup', function (t) {
  setup()
  common.npm(['ls', '-g', '--depth=0'], OPTS, function (err, c, out) {
    t.ifError(err)
    t.equal(c, 0, 'set up ok')
    t.notOk(out.match(/UNMET DEPENDENCY foo@/), "foo isn't in global")
    t.end()
  })
})

test('create global link', function (t) {
  process.chdir(link)
  common.npm(['link'], OPTS, function (err, c, out) {
    t.ifError(err, 'link has no error')
    common.npm(['ls', '-g'], OPTS, function (err, c, out, stderr) {
      t.ifError(err)
      t.equal(c, 0)
      t.equal(stderr, '', 'got expected stderr')
      t.has(out, /foo@1.0.0/, 'creates global link ok')
      t.end()
    })
  })
})

test('create scoped global link', function (t) {
  process.chdir(linkScoped)
  common.npm(['link'], OPTS, function (err, c, out) {
    t.ifError(err, 'link has no error')
    common.npm(['ls', '-g'], OPTS, function (err, c, out, stderr) {
      t.ifError(err)
      t.equal(c, 0)
      t.equal(stderr, '', 'got expected stderr')
      t.has(out, /@scope[/]foo@1.0.0/, 'creates global link ok')
      t.end()
    })
  })
})

test('link-install the package', function (t) {
  process.chdir(linkInstall)
  common.npm(['link', 'foo'], OPTS, function (err) {
    t.ifError(err, 'link-install has no error')
    common.npm(['ls'], OPTS, function (err, c, out) {
      t.ifError(err)
      t.equal(c, 1)
      t.has(out, /foo@1.0.0/, 'link-install ok')
      t.end()
    })
  })
})

test('link-install the scoped package', function (t) {
  process.chdir(linkInstall)
  common.npm(['link', linkScoped], OPTS, function (err) {
    t.ifError(err, 'link-install has no error')
    common.npm(['ls'], OPTS, function (err, c, out) {
      t.ifError(err)
      t.equal(c, 1)
      t.has(out, /@scope[/]foo@1.0.0/, 'link-install ok')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  common.npm(['rm', 'foo'], OPTS, function (err, code) {
    t.ifError(err, 'npm removed the linked package without error')
    t.equal(code, 0, 'cleanup foo in local ok')
    common.npm(['rm', '-g', 'foo'], OPTS, function (err, code) {
      t.ifError(err, 'npm removed the global package without error')
      t.equal(code, 0, 'cleanup foo in global ok')
      cleanup()
      t.end()
    })
  })
})

function cleanup () {
  rimraf.sync(linkRoot)
  rimraf.sync(link)
  rimraf.sync(linkScoped)
  rimraf.sync(linkInstall)
}

function setup () {
  cleanup()
  mkdirp.sync(linkRoot)
  mkdirp.sync(link)
  writeFileSync(
    path.join(link, 'package.json'),
    JSON.stringify(readJSON, null, 2)
  )
  mkdirp.sync(linkScoped)
  writeFileSync(
    path.join(linkScoped, 'package.json'),
    JSON.stringify(readScopedJSON, null, 2)
  )
  mkdirp.sync(linkInstall)
  writeFileSync(
    path.join(linkInstall, 'package.json'),
    JSON.stringify(installJSON, null, 2)
  )
  writeFileSync(configPath, config)
}
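
// The flow covered above is the usual two-step link dance:
//   1. in the package dir, `npm link` installs the package into the global
//      prefix (redirected here to link-root via the _npmrc userconfig)
//   2. in a consuming project, `npm link foo` or `npm link <path>` symlinks
//      that global copy into ./node_modules
// The expected `ls` exit code of 1 after link-install is consistent with the
// linked package being extraneous: it is present in node_modules but not
// declared in bar's package.json.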

npm_3.5.2.orig/test/tap/locker.js

var test = require('tap').test
var path = require('path')
var fs = require('graceful-fs')
var crypto = require('crypto')
var rimraf = require('rimraf')
var osenv = require('osenv')
var mkdirp = require('mkdirp')
var npm = require('../../')
var locker = require('../../lib/utils/locker.js')
var lock = locker.lock
var unlock = locker.unlock

var pkg = path.join(__dirname, '/locker')
var cache = path.join(pkg, '/cache')
var tmp = path.join(pkg, '/tmp')
var nm = path.join(pkg, '/node_modules')

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(cache)
  mkdirp.sync(tmp)
  t.end()
})

test('locking file puts lock in correct place', function (t) {
  npm.load({cache: cache, tmpdir: tmp}, function (er) {
    t.ifError(er, 'npm bootstrapped OK')

    var n = 'correct'
    var c = n.replace(/[^a-zA-Z0-9]+/g, '-').replace(/^-+|-+$/g, '')
    var p = path.resolve(nm, n)
    var h = crypto.createHash('sha1').update(p).digest('hex')
    var l = c.substr(0, 24) + '-' + h.substr(0, 16) + '.lock'
    var v = path.join(cache, '_locks', l)

    lock(nm, n, function (er) {
      t.ifError(er, 'locked path')

      fs.exists(v, function (found) {
        t.ok(found, 'lock found OK')

        unlock(nm, n, function (er) {
          t.ifError(er, 'unlocked path')

          fs.exists(v, function (found) {
            t.notOk(found, 'lock deleted OK')
            t.end()
          })
        })
      })
    })
  })
})

test('unlocking out of order errors out', function (t) {
  npm.load({cache: cache, tmpdir: tmp}, function (er) {
    t.ifError(er, 'npm bootstrapped OK')

    var n = 'busted'
    var c = n.replace(/[^a-zA-Z0-9]+/g, '-').replace(/^-+|-+$/g, '')
    var p = path.resolve(nm, n)
    var h = crypto.createHash('sha1').update(p).digest('hex')
    var l = c.substr(0, 24) + '-' + h.substr(0, 16) + '.lock'
    var v = path.join(cache, '_locks', l)

    fs.exists(v, function (found) {
      t.notOk(found, 'no lock to unlock')

      t.throws(function () {
        unlock(nm, n, function () {
          t.fail("shouldn't get here")
          t.end()
        })
      }, 'blew up as expected')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})
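
// Both tests compute the expected lock name the same way the inline code
// above does: the module name is sanitized to [a-zA-Z0-9-], truncated to
// 24 characters, and joined with the first 16 hex characters of the sha1 of
// the resolved install path, then stored under <cache>/_locks, e.g. for
// n = 'correct':
//
//   correct-<first 16 hex chars of sha1(path)>.lock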

npm_3.5.2.orig/test/tap/logout.js

var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'logout')
var outfile = path.join(pkg, '_npmrc')
var opts = { cwd: pkg }

var contents = function () {/*
foo=boo
//localhost:1337/:_authToken=glarb
*/}.toString().split('\n').slice(1, -1).join('\n')

function mocks (server) {
  server.delete('/-/user/token/glarb')
    .reply(200, {})
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('npm logout', function (t) {
  mr({ port: common.port, plugin: mocks }, function (err, s) {
    if (err) throw err
    common.npm(
      [
        'logout',
        '--registry', common.registry,
        '--loglevel', 'silent',
        '--userconfig', outfile
      ],
      opts,
      function (err, code) {
        t.ifError(err, 'no error output')
        t.notOk(code, 'exited OK')

        var config = fs.readFileSync(outfile, 'utf8')
        t.equal(config, 'foo=boo\n', 'creds gone')
        s.close()
        t.end()
      }
    )
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  mkdirp.sync(pkg)
  fs.writeFileSync(outfile, contents)
}

function cleanup () {
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/ls-depth-cli.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

var pkg = path.resolve(__dirname, 'ls-depth-cli')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'ls-depth-cli',
  version: '0.0.0',
  dependencies: {
    'test-package-with-one-dep': '0.0.0'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  mr({ port: common.port }, function (er, s) {
    common.npm(
      [
        '--registry', common.registry,
        'install'
      ],
      EXEC_OPTS,
      function (er, c) {
        t.ifError(er, 'setup installation ran without issue')
        t.equal(c, 0)
        s.close()
        t.end()
      }
    )
  })
})

test('npm ls --depth=0', function (t) {
  common.npm(
    ['ls', '--depth=0'],
    EXEC_OPTS,
    function (er, c, out) {
      t.ifError(er, 'npm ls ran without issue')
      t.equal(c, 0, 'ls ran without raising error code')
      t.has(
        out,
        /test-package-with-one-dep@0\.0\.0/,
        'output contains test-package-with-one-dep@0.0.0'
      )
      t.doesNotHave(
        out,
        /test-package@0\.0\.0/,
        'output does not contain test-package@0.0.0'
      )
      t.end()
    }
  )
})

test('npm ls --depth=1', function (t) {
  common.npm(
    ['ls', '--depth=1'],
    EXEC_OPTS,
    function (er, c, out) {
      t.ifError(er, 'npm ls ran without issue')
      t.equal(c, 0, 'ls ran without raising error code')
      t.has(
        out,
        /test-package-with-one-dep@0\.0\.0/,
        'output contains test-package-with-one-dep@0.0.0'
      )
      t.has(
        out,
        /test-package@0\.0\.0/,
        'output contains test-package@0.0.0'
      )
      t.end()
    }
  )
})

test('npm ls --depth=Infinity', function (t) {
  // travis has a preconfigured depth=0, in general we can not depend
  // on the default value in all environments, so explicitly set it here
  common.npm(
    ['ls', '--depth=Infinity'],
    EXEC_OPTS,
    function (er, c, out) {
      t.ifError(er, 'npm ls ran without issue')
      t.equal(c, 0, 'ls ran without raising error code')
      t.has(
        out,
        /test-package-with-one-dep@0\.0\.0/,
        'output contains test-package-with-one-dep@0.0.0'
      )
      t.has(
        out,
        /test-package@0\.0\.0/,
        'output contains test-package@0.0.0'
      )
      t.end()
    }
  )
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/ls-depth-unmet.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

var pkg = path.resolve(__dirname, 'ls-depth-unmet')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'ls-depth-unmet',
  author: 'Evan You',
  version: '0.0.0',
  dependencies: {
    'test-package-with-one-dep': '0.0.0',
    underscore: '1.5.1',
    optimist: '0.6.0'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  mr({ port: common.port }, function (er, s) {
    common.npm(
      [
        '--registry', common.registry,
        'install', 'underscore@1.3.1', 'mkdirp', 'test-package-with-one-dep'
      ],
      EXEC_OPTS,
      function (er, c) {
        t.ifError(er, 'setup installation ran without issue')
        t.equal(c, 0)
        s.close()
        t.end()
      }
    )
  })
})

test('npm ls --depth=0', function (t) {
  common.npm(
    ['ls', '--depth=0'],
    EXEC_OPTS,
    function (er, c, out) {
      t.ifError(er, 'setup installation ran without issue')
      t.equal(c, 1, 'ls barfed')
      t.has(
        out,
        /UNMET DEPENDENCY optimist@0\.6\.0/,
        'output contains optimist@0.6.0 and labeled as unmet dependency'
      )
      t.has(
        out,
        /mkdirp@0\.3\.5 extraneous/,
        'output contains mkdirp@0.3.5 and labeled as extraneous'
      )
      t.has(
        out,
        /underscore@1\.3\.1 invalid/,
        'output contains underscore@1.3.1 and labeled as invalid'
      )
      t.has(
        out,
        /test-package-with-one-dep@0\.0\.0\n/,
        'output contains test-package-with-one-dep@0.0.0 and has no labels'
      )
      t.doesNotHave(
        out,
        /test-package@0\.0\.0/,
        'output does not contain test-package@0.0.0 which is at depth=1'
      )
      t.end()
    }
  )
})

test('npm ls --depth=1', function (t) {
  common.npm(
    ['ls', '--depth=1'],
    EXEC_OPTS,
    function (er, c, out) {
      t.ifError(er, 'setup installation ran without issue')
      t.equal(c, 1, 'ls barfed')
      t.has(
        out,
        /UNMET DEPENDENCY optimist@0\.6\.0/,
        'output contains optimist@0.6.0 and labeled as unmet dependency'
      )
      t.has(
        out,
        /mkdirp@0\.3\.5 extraneous/,
        'output contains mkdirp@0.3.5 and labeled as extraneous'
      )
      t.has(
        out,
        /underscore@1\.3\.1 invalid/,
        'output contains underscore@1.3.1 and labeled as invalid'
      )
      t.has(
        out,
        /test-package-with-one-dep@0\.0\.0\n/,
        'output contains test-package-with-one-dep@0.0.0 and has no labels'
      )
      t.has(
        out,
        /test-package@0\.0\.0/,
        'output contains test-package@0.0.0 which is at depth=1'
      )
      t.end()
    }
  )
})

test('npm ls --depth=Infinity', function (t) {
  // travis has a preconfigured depth=0, in general we can not depend
  // on the default value in all environments, so explicitly set it here
  common.npm(
    ['ls', '--depth=Infinity'],
    EXEC_OPTS,
    function (er, c, out) {
      t.ifError(er, 'setup installation ran without issue')
      t.equal(c, 1, 'ls barfed')
      t.has(
        out,
        /UNMET DEPENDENCY optimist@0\.6\.0/,
        'output contains optimist@0.6.0 and labeled as unmet dependency'
      )
      t.has(
        out,
        /mkdirp@0\.3\.5 extraneous/,
        'output contains mkdirp@0.3.5 and labeled as extraneous'
      )
      t.has(
        out,
        /underscore@1\.3\.1 invalid/,
        'output contains underscore@1.3.1 and labeled as invalid'
      )
      t.has(
        out,
        /test-package-with-one-dep@0\.0\.0\n/,
        'output contains test-package-with-one-dep@0.0.0 and has no labels'
      )
      t.has(
        out,
        /test-package@0\.0\.0/,
        'output contains test-package@0.0.0 which is at depth=1'
      )
      t.end()
    }
  )
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
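
// The three annotations asserted above are npm ls's standard diagnostics:
//   UNMET DEPENDENCY - declared in package.json but not installed (optimist)
//   extraneous       - installed but not declared by anything (mkdirp)
//   invalid          - installed at a version that does not satisfy the
//                      declared range (underscore@1.3.1 vs the required 1.5.1)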

npm_3.5.2.orig/test/tap/ls-env.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

var pkg = path.resolve(__dirname, 'ls-depth')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'ls-env',
  version: '0.0.0',
  dependencies: {
    'test-package-with-one-dep': '0.0.0'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  mr({port: common.port}, function (er, s) {
    common.npm(
      [
        '--registry', common.registry,
        'install'
      ],
      EXEC_OPTS,
      function (er, c) {
        t.ifError(er, 'install ran without issue')
        t.equal(c, 0)
        s.close()
        t.end()
      }
    )
  })
})

test('npm ls --dev', function (t) {
  common.npm(['ls', '--dev'], EXEC_OPTS, function (er, code, stdout) {
    t.ifError(er, 'ls --dev ran without issue')
    t.equal(code, 0)
    t.has(stdout, /(empty)/, 'output contains (empty)')
    t.end()
  })
})

test('npm ls --only=development', function (t) {
  common.npm(['ls', '--only=development'], EXEC_OPTS, function (er, code, stdout) {
    t.ifError(er, 'ls --only=development ran without issue')
    t.equal(code, 0)
    t.has(stdout, /(empty)/, 'output contains (empty)')
    t.end()
  })
})

test('npm ls --only=dev', function (t) {
  common.npm(['ls', '--only=dev'], EXEC_OPTS, function (er, code, stdout) {
    t.ifError(er, 'ls --only=dev ran without issue')
    t.equal(code, 0)
    t.has(stdout, /(empty)/, 'output contains (empty)')
    t.end()
  })
})

test('npm ls --production', function (t) {
  common.npm(['ls', '--production'], EXEC_OPTS, function (er, code, stdout) {
    t.ifError(er, 'ls --production ran without issue')
    t.notOk(code, 'npm exited ok')
    t.has(
      stdout,
      /test-package-with-one-dep@0\.0\.0/,
      'output contains test-package-with-one-dep@0.0.0'
    )
    t.end()
  })
})

test('npm ls --prod', function (t) {
  common.npm(['ls', '--prod'], EXEC_OPTS, function (er, code, stdout) {
    t.ifError(er, 'ls --prod ran without issue')
    t.notOk(code, 'npm exited ok')
    t.has(
      stdout,
      /test-package-with-one-dep@0\.0\.0/,
      'output contains test-package-with-one-dep@0.0.0'
    )
    t.end()
  })
})

test('npm ls --only=production', function (t) {
  common.npm(['ls', '--only=production'], EXEC_OPTS, function (er, code, stdout) {
    t.ifError(er, 'ls --only=production ran without issue')
    t.notOk(code, 'npm exited ok')
    t.has(
      stdout,
      /test-package-with-one-dep@0\.0\.0/,
      'output contains test-package-with-one-dep@0.0.0'
    )
    t.end()
  })
})

test('npm ls --only=prod', function (t) {
  common.npm(['ls', '--only=prod'], EXEC_OPTS, function (er, code, stdout) {
    t.ifError(er, 'ls --only=prod ran without issue')
    t.notOk(code, 'npm exited ok')
    t.has(
      stdout,
      /test-package-with-one-dep@0\.0\.0/,
      'output contains test-package-with-one-dep@0.0.0'
    )
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
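
// Why '(empty)' is expected for all the dev views: the fixture declares only
// a runtime dependency and no devDependencies, so filtering the tree with
// --dev / --only=dev(elopment) leaves nothing to print, while every
// production view shows the single real dependency.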

npm_3.5.2.orig/test/tap/ls-l-depth-0.js

var cat = require('graceful-fs').writeFileSync
var resolve = require('path').resolve

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test
var tmpdir = require('osenv').tmpdir

var common = require('../common-tap.js')

var pkg = resolve(__dirname, 'ls-l-depth-0')
var dep = resolve(pkg, 'deps', 'glock')
var modules = resolve(pkg, 'node_modules')

var expected =
  '\n' +
  '│ ' + pkg + '\n' +
  '│ \n' +
  '└── glock@1.8.7\n' +
  ' an inexplicably hostile sample package\n' +
  ' git+https://github.com/npm/glo.ck.git\n' +
  ' https://glo.ck\n' +
  '\n'

var server

var EXEC_OPTS = { cwd: pkg }

var fixture = {
  'name': 'glock',
  'version': '1.8.7',
  'private': true,
  'description': 'an inexplicably hostile sample package',
  'homepage': 'https://glo.ck',
  'repository': 'https://github.com/npm/glo.ck',
  'dependencies': {
    'underscore': '1.5.1'
  }
}

test('setup', function (t) {
  setup()
  mr({ port: common.port }, function (er, s) {
    server = s
    t.end()
  })
})

test('#6311: npm ll --depth=0 duplicates listing', function (t) {
  common.npm(
    [
      '--loglevel', 'silent',
      '--registry', common.registry,
      'install', dep
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'npm install ran without error')
      t.notOk(code, 'npm install exited cleanly')
      t.notOk(stderr, 'npm install ran silently')
      t.equal(
        stdout.trim(),
        resolve(__dirname, 'ls-l-depth-0') +
          '\n└─┬ glock@1.8.7 ' +
          '\n └── underscore@1.5.1',
        'got expected install output'
      )

      common.npm(
        [
          '--loglevel', 'silent',
          'ls', '--long',
          '--depth', '0'
        ],
        EXEC_OPTS,
        function (err, code, stdout, stderr) {
          t.ifError(err, 'npm ll ran without error')
          t.is(code, 0, 'npm ll exited cleanly')
          t.notOk(stderr, 'npm ll ran silently')
          t.equal(
            stdout,
            expected,
            'got expected package name'
          )
          t.end()
        }
      )
    }
  )
})

test('cleanup', function (t) {
  cleanup()
  server.close()
  t.end()
})

function cleanup () {
  process.chdir(tmpdir())
  rimraf.sync(pkg)
}

function setup () {
  cleanup()
  mkdirp.sync(modules)
  mkdirp.sync(dep)

  cat(resolve(dep, 'package.json'), JSON.stringify(fixture))
}

npm_3.5.2.orig/test/tap/ls-no-results.js

var test = require('tap').test
var spawn = require('child_process').spawn
var node = process.execPath
var npm = require.resolve('../../')
var args = [ npm, 'ls', 'ceci n’est pas une package' ]

test('ls exits non-zero when nothing found', function (t) {
  var child = spawn(node, args)
  child.on('exit', function (code) {
    t.notEqual(code, 0)
    t.end()
  })
})

npm_3.5.2.orig/test/tap/ls-top-errors.js

'use strict'
var fs = require('fs')
var path = require('path')

var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var common = require('../common-tap')

var pkg = path.resolve(__dirname, path.basename(__filename, '.js'))
var pathModA = path.join(pkg, 'node_modules', 'moduleA')
var pathModB = path.join(pkg, 'node_modules', 'moduleB')

var modA = {
  name: 'moduleA',
  version: '1.0.0',
  _requiredBy: [ '#USER', '/moduleB' ],
  dependencies: {
    moduleB: '1.0.0'
  }
}
var modB = {
  name: 'moduleB',
  version: '1.0.0',
  _requiredBy: [ '/moduleA' ],
  dependencies: {
    moduleA: '1.0.0'
  }
}

function setup () {
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    '{broken json'
  )
  mkdirp.sync(pathModA)
  fs.writeFileSync(
    path.join(pathModA, 'package.json'),
    JSON.stringify(modA, null, 2)
  )
  mkdirp.sync(pathModB)
  fs.writeFileSync(
    path.join(pathModB, 'package.json'),
    JSON.stringify(modB, null, 2)
  )
}

function cleanup () {
  rimraf.sync(pkg)
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('ls-top-errors', function (t) {
  common.npm(['ls'], {cwd: pkg}, function (er, code, stdout, stderr) {
    t.ifErr(er, 'install finished successfully')
    t.match(stderr, /Failed to parse json/)
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

npm_3.5.2.orig/test/tap/map-to-registry.js

var test = require('tap').test
var npm = require('../../')
var common = require('../common-tap.js')

var mapRegistry = require('../../lib/utils/map-to-registry.js')

var creds = {
  '//registry.npmjs.org/:username': 'u',
  '//registry.npmjs.org/:_password': new Buffer('p').toString('base64'),
  '//registry.npmjs.org/:email': 'e',
  cache: common.npm_config_cache
}
test('setup', function (t) {
  npm.load(creds, function (err) {
    t.ifError(err)
    t.end()
  })
})

test('mapRegistryToURI', function (t) {
  t.plan(16)

  mapRegistry('basic', npm.config, function (er, uri, auth, registry) {
    t.ifError(er, 'mapRegistryToURI worked')
    t.equal(uri, 'https://registry.npmjs.org/basic')
    t.deepEqual(auth, {
      scope: '//registry.npmjs.org/',
      token: undefined,
      username: 'u',
      password: 'p',
      email: 'e',
      auth: 'dTpw',
      alwaysAuth: false
    })
    t.equal(registry, 'https://registry.npmjs.org/')
  })

  npm.config.set('scope', 'test')
  npm.config.set('@test:registry', 'http://reg.npm/design/-/rewrite/')
  npm.config.set('//reg.npm/design/-/rewrite/:_authToken', 'a-token')
  mapRegistry('simple', npm.config, function (er, uri, auth, registry) {
    t.ifError(er, 'mapRegistryToURI worked')
    t.equal(uri, 'http://reg.npm/design/-/rewrite/simple')
    t.deepEqual(auth, {
      scope: '//reg.npm/design/-/rewrite/',
      token: 'a-token',
      username: undefined,
      password: undefined,
      email: undefined,
      auth: undefined,
      alwaysAuth: undefined
    })
    t.equal(registry, 'http://reg.npm/design/-/rewrite/')
  })

  npm.config.set('scope', '')
  npm.config.set('@test2:registry', 'http://reg.npm/-/rewrite/')
  npm.config.set('//reg.npm/-/rewrite/:_authToken', 'b-token')
  mapRegistry('@test2/easy', npm.config, function (er, uri, auth, registry) {
    t.ifError(er, 'mapRegistryToURI worked')
    t.equal(uri, 'http://reg.npm/-/rewrite/@test2%2feasy')
    t.deepEqual(auth, {
      scope: '//reg.npm/-/rewrite/',
      token: 'b-token',
      username: undefined,
      password: undefined,
      email: undefined,
      auth: undefined,
      alwaysAuth: undefined
    })
    t.equal(registry, 'http://reg.npm/-/rewrite/')
  })

  npm.config.set('scope', 'test')
  npm.config.set('@test3:registry', 'http://reg.npm/design/-/rewrite/relative')
  npm.config.set('//reg.npm/design/-/rewrite/:_authToken', 'c-token')
  mapRegistry('@test3/basic', npm.config, function (er, uri, auth, registry) {
    t.ifError(er, 'mapRegistryToURI worked')
    t.equal(uri, 'http://reg.npm/design/-/rewrite/relative/@test3%2fbasic')
    t.deepEqual(auth, {
      scope: '//reg.npm/design/-/rewrite/',
      token: 'c-token',
      username: undefined,
      password: undefined,
      email: undefined,
      auth: undefined,
      alwaysAuth: undefined
    })
    t.equal(registry, 'http://reg.npm/design/-/rewrite/relative/')
  })
})
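
// The shape of the contract under test: a scope configured as
// @scope:registry=<url> routes package names to that registry, the name is
// URL-encoded onto the registry path, and credentials come from the
// registry's nerf-darted key, e.g. (values from the cases above):
//
//   '@test2/easy' with @test2:registry=http://reg.npm/-/rewrite/
//     uri      -> http://reg.npm/-/rewrite/@test2%2feasy
//     auth     -> token from //reg.npm/-/rewrite/:_authToken
//     registry -> http://reg.npm/-/rewrite/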

npm_3.5.2.orig/test/tap/nerf-dart.js

// taken from https://raw.githubusercontent.com/indexzero/npm/bd3cad01fbd3ab481d2f5da441b9eead16029123/test/tap/config-nerf-dart.js
// originally written by Charlie Robbins, https://github.com/indexzero
var test = require('tap').test
var toNerfDart = require('../../lib/config/nerf-dart.js')

function validNerfDart (uri, valid) {
  if (!valid) valid = '//registry.npmjs.org/'
  test(uri, function (t) {
    t.equal(toNerfDart(uri), valid)
    t.end()
  })
}

validNerfDart('http://registry.npmjs.org')
validNerfDart('http://registry.npmjs.org/some-package')
validNerfDart('http://registry.npmjs.org/some-package?write=true')
validNerfDart('http://user:pass@registry.npmjs.org/some-package?write=true')
validNerfDart('http://registry.npmjs.org/#random-hash')
validNerfDart('http://registry.npmjs.org/some-package#random-hash')

validNerfDart(
  'http://relative.couchapp.npm/design/-/rewrite/',
  '//relative.couchapp.npm/design/-/rewrite/'
)
validNerfDart(
  'http://relative.couchapp.npm:8080/design/-/rewrite/',
  '//relative.couchapp.npm:8080/design/-/rewrite/'
)
validNerfDart(
  'http://relative.couchapp.npm:8080/design/-/rewrite/some-package',
  '//relative.couchapp.npm:8080/design/-/rewrite/'
)
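
// The pattern these cases pin down: a "nerf dart" is the credential scope
// key npm derives from a registry URL by dropping the protocol, user:pass,
// query string, and hash, keeping only host[:port] plus the directory part
// of the path. For example:
//
//   http://user:pass@registry.npmjs.org/some-package?write=true
//     -> //registry.npmjs.org/
//   http://relative.couchapp.npm:8080/design/-/rewrite/some-package
//     -> //relative.couchapp.npm:8080/design/-/rewrite/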

npm_3.5.2.orig/test/tap/nested-extraneous.js

var common = require('../common-tap.js')
var test = require('tap').test
var mkdirp = require('mkdirp')
var fs = require('fs')
var rimraf = require('rimraf')
var path = require('path')

var pkg = path.resolve(__dirname, 'nested-extraneous')
var pj = {
  name: 'nested-extraneous',
  version: '1.2.3'
}

var dep = path.resolve(pkg, 'node_modules', 'dep')
var deppj = {
  name: 'nested-extraneous-dep',
  version: '1.2.3',
  dependencies: {
    'nested-extra-depdep': '*'
  }
}

var depdep = path.resolve(dep, 'node_modules', 'depdep')
var depdeppj = {
  name: 'nested-extra-depdep',
  version: '1.2.3'
}

test('setup', function (t) {
  rimraf.sync(pkg)
  mkdirp.sync(depdep)
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify(pj))
  fs.writeFileSync(path.resolve(dep, 'package.json'), JSON.stringify(deppj))
  fs.writeFileSync(path.resolve(depdep, 'package.json'), JSON.stringify(depdeppj))
  t.end()
})

test('test', function (t) {
  common.npm(['ls'], {
    cwd: pkg
  }, function (er, code, sto, ste) {
    if (er) throw er
    t.notEqual(code, 0)
    t.notSimilar(ste, /depdep/)
    t.notSimilar(sto, /depdep/)
    t.end()
  })
})

test('clean', function (t) {
  rimraf.sync(pkg)
  t.end()
})

npm_3.5.2.orig/test/tap/no-global-warns.js

'use strict'
var path = require('path')
var test = require('tap').test
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var writeFileSync = require('fs').writeFileSync
var common = require('../common-tap.js')

var base = path.join(__dirname, path.basename(__filename, '.js'))
var mockGlobal = path.join(base, 'global')
var toInstall = path.join(base, 'to-install')

var config = 'prefix = ' + base
var configPath = path.join(base, '_npmrc')

var OPTS = {
  env: {
    'npm_config_userconfig': configPath
  }
}

var installJSON = {
  name: 'to-install',
  version: '1.0.0',
  description: '',
  main: 'index.js',
  scripts: {
    test: 'echo \"Error: no test specified\" && exit 1'
  },
  author: '',
  license: 'ISC'
}

test('setup', function (t) {
  setup()
  t.end()
})

test('no-global-warns', function (t) {
  common.npm(['install', '-g', toInstall], OPTS, function (err, code, stdout, stderr) {
    t.ifError(err, 'installed w/o error')
    t.is(stderr, '', 'no warnings printed to stderr')
    t.end()
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(base)
}

function setup () {
  cleanup()
  mkdirp.sync(mockGlobal)
  mkdirp.sync(toInstall)
  writeFileSync(
    path.join(toInstall, 'package.json'),
    JSON.stringify(installJSON, null, 2)
  )
  writeFileSync(configPath, config)
}

npm_3.5.2.orig/test/tap/no-scan-full-global-dir.js

'use strict'
var path = require('path')
var test = require('tap').test
var requireInject = require('require-inject')
var osenv = require('osenv')
var inherits = require('inherits')
var npm = require('../../lib/npm.js')

var packages = {
  test: {package: {name: 'test'}, path: __dirname, children: ['abc', 'def', 'ghi', 'jkl']},
  abc: {package: {name: 'abc'}, path: path.join(__dirname, 'node_modules', 'abc')},
  def: {package: {name: 'def'}, path: path.join(__dirname, 'node_modules', 'def')},
  ghi: {package: {name: 'ghi'}, path: path.join(__dirname, 'node_modules', 'ghi')},
  jkl: {package: {name: 'jkl'}, path: path.join(__dirname, 'node_modules', 'jkl')}
}

var dirs = {}
var files = {}
Object.keys(packages).forEach(function (name) {
  dirs[path.join(packages[name].path, 'node_modules')] = packages[name].children || []
  files[path.join(packages[name].path, 'package.json')] = packages[name].package
})

process.chdir(osenv.tmpdir())

var mockReaddir = function (name, cb) {
  if (dirs[name]) return cb(null, dirs[name])
  var er = new Error('No such mock: ' + name)
  er.code = 'ENOENT'
  cb(er)
}
var mockReadPackageJson = function (file, cb) {
  if (files[file]) return cb(null, files[file])
  var er = new Error('No such mock: ' + file)
  er.code = 'ENOENT'
  cb(er)
}
var mockFs = {
  realpath: function (dir, cb) {
    return cb(null, dir)
  }
}

test('setup', function (t) {
  npm.load(function () {
    t.pass('npm loaded')
    t.end()
  })
})

function loadArgMetadata (cb) {
  this.args = this.args.map(function (arg) { return {name: arg} })
  cb()
}

test('installer', function (t) {
  t.plan(1)
  var installer = requireInject('../../lib/install.js', {
    'fs': mockFs,
    'readdir-scoped-modules': mockReaddir,
    'read-package-json': mockReadPackageJson
  })

  var Installer = installer.Installer
  var TestInstaller = function () {
    Installer.apply(this, arguments)
    this.global = true
  }
  inherits(TestInstaller, Installer)
  TestInstaller.prototype.loadArgMetadata = loadArgMetadata

  var inst = new TestInstaller(__dirname, false, ['def', 'abc'])
  inst.loadCurrentTree(function () {
    var kids = inst.currentTree.children.map(function (child) { return child.package.name })
    t.isDeeply(kids, ['abc', 'def'])
    t.end()
  })
})

test('uninstaller', function (t) {
  t.plan(1)
  var uninstaller = requireInject('../../lib/uninstall.js', {
    'fs': mockFs,
    'readdir-scoped-modules': mockReaddir,
    'read-package-json': mockReadPackageJson
  })

  var Uninstaller = uninstaller.Uninstaller
  var TestUninstaller = function () {
    Uninstaller.apply(this, arguments)
    this.global = true
  }
  inherits(TestUninstaller, Uninstaller)

  var uninst = new TestUninstaller(__dirname, false, ['ghi', 'jkl'])
  uninst.loadCurrentTree(function () {
    var kids = uninst.currentTree.children.map(function (child) { return child.package.name })
    t.isDeeply(kids, ['ghi', 'jkl'])
    t.end()
  })
})

npm_3.5.2.orig/test/tap/noargs-install-config-save.js

var common = require('../common-tap.js')
var test = require('tap').test
var npm = require.resolve('../../bin/npm-cli.js')
var path = require('path')
var fs = require('fs')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')

var spawn = require('child_process').spawn
var node = process.execPath

var pkg = path.resolve(process.env.npm_config_tmp || '/tmp', 'noargs-install-config-save')

function writePackageJson () {
  rimraf.sync(pkg)
  mkdirp.sync(pkg)
  mkdirp.sync(pkg + '/cache')

  fs.writeFileSync(pkg + '/package.json', JSON.stringify({
    'author': 'Rocko Artischocko',
    'name': 'noargs',
    'version': '0.0.0',
    'devDependencies': {
      'underscore': '1.3.1'
    }
  }), 'utf8')
}

function createChild (args) {
  var env = {
    'npm_config_save': true,
    'npm_config_registry': common.registry,
    'npm_config_cache': pkg + '/cache',
    HOME: process.env.HOME,
    Path: process.env.PATH,
    PATH: process.env.PATH
  }

  if (process.platform === 'win32') {
    env.npm_config_cache = '%APPDATA%\\npm-cache'
  }

  return spawn(node, args, {
    cwd: pkg,
    env: env
  })
}

test('does not update the package.json with empty arguments', function (t) {
  writePackageJson()
  t.plan(1)

  mr({ port: common.port }, function (er, s) {
    var child = createChild([npm, 'install'])
    child.on('close', function () {
      var text = JSON.stringify(fs.readFileSync(pkg + '/package.json', 'utf8'))
      s.close()
      t.equal(text.indexOf('"dependencies'), -1, 'dependencies do not exist in file')
    })
  })
})

test('updates the package.json (adds dependencies) with an argument', function (t) {
  writePackageJson()
  t.plan(1)

  mr({ port: common.port }, function (er, s) {
    var child = createChild([npm, 'install', 'underscore'])
    child.on('close', function () {
      s.close()
      var text = JSON.stringify(fs.readFileSync(pkg + '/package.json', 'utf8'))
      t.notEqual(text.indexOf('"dependencies'), -1, 'dependencies exist in file')
    })
  })
})

test('cleanup', function (t) {
  rimraf.sync(pkg + '/cache')
  t.end()
})
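
// Note the setup: npm_config_save=true is exported for both children, so the
// pair of cases demonstrates that --save only touches package.json when
// install is given an explicit package argument ('underscore'), never on a
// bare `npm install`.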
  validateTree(idealTree, log.newGroup('validate'), function (er) {
    t.pass("we didn't crash")
    t.end()
  })
})
npm_3.5.2.orig/test/tap/npm-api-not-loaded-error.js
var test = require('tap').test
var npm = require('../..')
var path = require('path')
var rimraf = require('rimraf')
var npmrc = path.join(__dirname, 'npmrc')
var fs = require('fs')

test('setup', function (t) {
  fs.writeFileSync(npmrc, 'foo = bar\n', 'ascii')
  t.end()
})

test('calling set/get on config pre-load should throw', function (t) {
  var threw = true
  try {
    npm.config.get('foo')
    threw = false
  } catch (er) {
    t.equal(er.message, 'npm.load() required')
  } finally {
    t.ok(threw, 'get before load should throw')
  }

  threw = true
  try {
    npm.config.set('foo', 'bar')
    threw = false
  } catch (er) {
    t.equal(er.message, 'npm.load() required')
  } finally {
    t.ok(threw, 'set before load should throw')
  }

  npm.load({ userconfig: npmrc }, function (er) {
    if (er) throw er
    t.equal(npm.config.get('foo'), 'bar')
    npm.config.set('foo', 'baz')
    t.equal(npm.config.get('foo'), 'baz')
    t.end()
  })
})

test('cleanup', function (t) {
  rimraf.sync(npmrc)
  t.end()
})
npm_3.5.2.orig/test/tap/optional-metadep-rollback-collision.js
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'optional-metadep-rollback-collision')
var deps = path.resolve(pkg, 'deps')
var opdep = path.resolve(pkg, 'node_modules', 'opdep')
var cache = path.resolve(pkg, 'cache')
var pidfile = path.resolve(pkg, 'child.pid')

var json = {
  name: 'optional-metadep-rollback-collision',
  version: '1.0.0',
  description: 'let\'s just see about that race condition',
  optionalDependencies: {
    opdep: 'file:./deps/opdep'
  }
}

var d1 = {
  name: 'd1',
  version: '1.0.0',
  description: 'I FAIL CONSTANTLY',
  scripts: {
    preinstall: 'sleep 1'
  },
  dependencies: {
    foo: 'http://localhost:8080/'
  }
}

var d2 = {
  name: 'd2',
  version: '1.0.0',
  description: 'how do you *really* know you exist?',
  scripts: {
    postinstall: 'node blart.js'
  },
  dependencies: {
    'graceful-fs': '^3.0.2',
    mkdirp: '^0.5.0',
    rimraf: '^2.2.8'
  }
}

var opdep_json = {
  name: 'opdep',
  version: '1.0.0',
  description: 'To explode, of course!',
  main: 'index.js',
  scripts: {
    preinstall: 'node bad-server.js'
  },
  dependencies: {
    d1: 'file:../d1',
    d2: 'file:../d2'
  }
}

var badServer = function () {/*
var createServer = require('http').createServer
var spawn = require('child_process').spawn
var fs = require('fs')
var path = require('path')
var pidfile = path.resolve(__dirname, '..', '..', 'child.pid')

if (process.argv[2]) {
  console.log('ok')
  createServer(function (req, res) {
    setTimeout(function () {
      res.writeHead(404)
      res.end()
    }, 1000)
    this.close()
  }).listen(8080)
} else {
  var child = spawn(
    process.execPath,
    [__filename, 'whatever'],
    {
      stdio: [0, 1, 2],
      detached: true
    }
  )
  child.unref()

  // kill any prior children, if existing.
  try {
    var pid = +fs.readFileSync(pidfile)
    process.kill(pid, 'SIGKILL')
  } catch (er) {}

  fs.writeFileSync(pidfile, child.pid + '\n')
}
*/}.toString().split('\n').slice(1, -1).join('\n')

var blart = function () {/*
var rando = require('crypto').randomBytes
var resolve = require('path').resolve

var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var writeFile = require('graceful-fs').writeFile

var BASEDIR = resolve(__dirname, 'arena')

var keepItGoingLouder = {}
var writers = 0
var errors = 0

function gensym () { return rando(16).toString('hex') }

function writeAlmostForever (filename) {
  if (!keepItGoingLouder[filename]) {
    writers--
    if (writers < 1) return done()
  } else {
    writeFile(filename, keepItGoingLouder[filename], function (err) {
      if (err) errors++
      writeAlmostForever(filename)
    })
  }
}

function done () {
  rimraf(BASEDIR, function () {
    if (errors > 0) console.log('not ok - %d errors', errors)
    else console.log('ok')
  })
}

mkdirp(BASEDIR, function go () {
  for (var i = 0; i < 16; i++) {
    var filename = resolve(BASEDIR, gensym() + '.txt')

    keepItGoingLouder[filename] = ''
    for (var j = 0; j < 512; j++) keepItGoingLouder[filename] += filename

    writers++
    writeAlmostForever(filename)
  }

  setTimeout(function viktor () {
    // kill all the writers
    keepItGoingLouder = {}
  }, 3 * 1000)
})
*/}.toString().split('\n').slice(1, -1).join('\n')

test('setup', function (t) {
  cleanup()

  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )

  mkdirp.sync(path.join(deps, 'd1'))
  fs.writeFileSync(
    path.join(deps, 'd1', 'package.json'),
    JSON.stringify(d1, null, 2)
  )

  mkdirp.sync(path.join(deps, 'd2'))
  fs.writeFileSync(
    path.join(deps, 'd2', 'package.json'),
    JSON.stringify(d2, null, 2)
  )
  fs.writeFileSync(path.join(deps, 'd2', 'blart.js'), blart)

  mkdirp.sync(path.join(deps, 'opdep'))
  fs.writeFileSync(
    path.join(deps, 'opdep', 'package.json'),
    JSON.stringify(opdep_json, null, 2)
  )
  fs.writeFileSync(path.join(deps, 'opdep', 'bad-server.js'), badServer)

  t.end()
})

test('go go test racer', function (t) {
  common.npm(
    [
      '--prefix', pkg,
      '--fetch-retries', '0',
      '--loglevel', 'silent',
      '--cache', cache,
      'install'
    ],
    {
      cwd: pkg,
      env: {
        PATH: process.env.PATH,
        Path: process.env.Path
      },
      stdio: [0, 'pipe', 2]
    },
    function (er, code, stdout, stderr) {
      t.ifError(er, 'install ran to completion without error')
      t.is(code, 0, 'npm install exited with code 0')
      t.is(stderr, '')

      // stdout should be empty, because we only have one, optional, dep and
      // if it fails we shouldn't try installing anything
      t.equal(stdout, '')
      t.notOk(/not ok/.test(stdout), 'should not contain the string \'not ok\'')
      t.end()
    }
  )
})

test('verify results', function (t) {
  t.throws(function () {
    fs.statSync(opdep)
  })
  t.end()
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  try {
    var pid = +fs.readFileSync(pidfile)
    process.kill(pid, 'SIGKILL')
  } catch (er) {}

  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/outdated-color.js
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'outdated-color')

var EXEC_OPTS = { cwd: pkg }

function hasControlCodes (str) {
  return str.length !== ansiTrim(str).length
}

function ansiTrim (str) {
  var r = new RegExp('\x1b(?:\\[(?:\\d+[ABCDEFGJKSTm]|\\d+;\\d+[Hfm]|' +
    '\\d+;\\d+;\\d+m|6n|s|u|\\?25[lh])|\\w)', 'g')
  return str.replace(r, '')
}

var json = {
  name: 'outdated-color',
  description: 'fixture',
  version: '0.0.1',
  dependencies: {
    underscore: '1.3.1'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
  t.end()
})

// note hard to automate tests for color = true
// as npm kills the color config when it detects
// it's not running in a tty
test('does not use ansi styling', function (t) {
  t.plan(4)
  mr({ port: common.port }, function (er, s) { // create mock registry.
    common.npm(
      [
        '--registry', common.registry,
        'outdated', 'underscore'
      ],
      EXEC_OPTS,
      function (err, code, stdout) {
        t.ifError(err)
        t.notOk(code, 'npm outdated exited with code 0')
        t.ok(stdout, stdout.length)
        t.ok(!hasControlCodes(stdout))
        s.close()
      })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/outdated-depth-deep.js
var common = require('../common-tap')
var path = require('path')
var test = require('tap').test
var rimraf = require('rimraf')
var npm = require('../../')
var mr = require('npm-registry-mock')
var pkg = path.resolve(__dirname, 'outdated-depth-deep')
var cache = path.resolve(pkg, 'cache')

var osenv = require('osenv')
var mkdirp = require('mkdirp')
var fs = require('fs')

var pj = JSON.stringify({
  'name': 'whatever',
  'description': 'yeah idk',
  'version': '1.2.3',
  'main': 'index.js',
  'dependencies': {
    'underscore': '1.3.1',
    'npm-test-peer-deps': '0.0.0'
  },
  'repository': 'git://github.com/luk-/whatever'
}, null, 2)

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

function setup () {
  mkdirp.sync(pkg)
  process.chdir(pkg)
  fs.writeFileSync(path.resolve(pkg, 'package.json'), pj)
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('outdated depth deep (9999)', function (t) {
  var underscoreOutdated = ['underscore', '1.3.1', '1.3.1', '1.5.1', '1.3.1']
  var childPkg = path.resolve(pkg, 'node_modules', 'npm-test-peer-deps')

  var expected = [
    [childPkg].concat(underscoreOutdated).concat([null]),
    [pkg].concat(underscoreOutdated).concat([null])
  ]

  process.chdir(pkg)

  mr({ port: common.port }, function (er, s) {
    npm.load({
      cache: cache,
      loglevel: 'silent',
      registry: common.registry,
      depth: 9999
    }, function () {
      npm.install('.', function (er) {
        if (er) throw new Error(er)
        var nodepath = process.env.npm_node_execpath || process.env.NODE || process.execPath
        var clibin = path.resolve(__dirname, '../../bin/npm-cli.js')
        npm.explore('npm-test-peer-deps', nodepath, clibin, 'install', 'underscore', function (er) {
          if (er) throw new Error(er)
          npm.outdated(function (err, d) {
            if (err) throw new Error(err)
            t.deepEqual(d, expected)
            s.close()
            t.end()
          })
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})
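The `npm.outdated` assertions in the depth tests above (and in the other outdated fixtures below) all compare against rows with one shared positional layout. The following standalone sketch is not part of the npm 3.5.2 sources; it only restates that layout, with illustrative names, to make the fixtures easier to read:

// format-outdated-row.js — illustrative only; assumes the row layout the
// assertions above expect: [dir, name, current, wanted, latest, req, type]
'use strict'

function formatOutdatedRow (row) {
  var dir = row[0]
  var name = row[1]
  var current = row[2] // undefined when the package is not installed yet
  var wanted = row[3]
  var latest = row[4]
  return name + '@' + (current || 'MISSING') +
    ' (wanted ' + wanted + ', latest ' + latest + ') in ' + dir
}

// usage, mirroring the underscore fixture data used throughout these tests:
console.log(formatOutdatedRow(
  ['/tmp/fixture', 'underscore', '1.3.1', '1.3.1', '1.5.1', '1.3.1', null]
))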
npm_3.5.2.orig/test/tap/outdated-depth-integer.js
var common = require('../common-tap')
var test = require('tap').test
var rimraf = require('rimraf')
var npm = require('../../')
var mr = require('npm-registry-mock')
var pkg = __dirname + '/outdated-depth-integer'

var osenv = require('osenv')
var mkdirp = require('mkdirp')
var fs = require('fs')

var pj = JSON.stringify({
  'name': 'whatever',
  'description': 'yeah idk',
  'version': '1.2.3',
  'main': 'index.js',
  'dependencies': {
    'underscore': '1.3.1'
  },
  'repository': 'git://github.com/luk-/whatever'
}, null, 2)

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

function setup () {
  mkdirp.sync(pkg)
  process.chdir(pkg)
  fs.writeFileSync('package.json', pj)
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('outdated depth integer', function (t) {
  // todo: update with test-package-with-one-dep once the new
  // npm-registry-mock is published
  var expected = [[
    pkg,
    'underscore',
    undefined, // no version installed
    '1.3.1', // wanted
    '1.5.1', // latest
    '1.3.1',
    null
  ]]

  mr({ port: common.port }, function (er, s) {
    npm.load({
      cache: pkg + '/cache',
      loglevel: 'silent',
      registry: common.registry,
      depth: 5
    }, function () {
      npm.install('request@0.9.0', function (er) {
        if (er) throw new Error(er)
        npm.outdated(function (err, d) {
          if (err) throw new Error(err)
          t.deepEqual(d, expected)
          s.close()
          t.end()
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})
npm_3.5.2.orig/test/tap/outdated-depth.js
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test
var npm = require('../../')
var common = require('../common-tap')

var pkg = path.resolve(__dirname, 'outdated-depth')

var json = {
  name: 'outdated-depth',
  version: '1.2.3',
  dependencies: {
    underscore: '1.3.1',
    'npm-test-peer-deps': '0.0.0'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
  t.end()
})

test('outdated depth zero', function (t) {
  var expected = [
    pkg,
    'underscore',
    '1.3.1',
    '1.3.1',
    '1.5.1',
    '1.3.1',
    null
  ]

  mr({ port: common.port }, function (er, s) {
    npm.load(
      {
        loglevel: 'silent',
        registry: common.registry
      },
      function () {
        npm.install('.', function (er) {
          if (er) throw new Error(er)
          npm.outdated(function (err, d) {
            if (err) throw new Error(err)
            t.deepEqual(d[0], expected)
            s.close()
            t.end()
          })
        })
      }
    )
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/outdated-git.js
var path = require('path')
var test = require('tap').test
var mkdirp = require('mkdirp')
var fs = require('graceful-fs')
var rimraf = require('rimraf')
var common = require('../common-tap.js')
var npm = require('../../')

// config
var pkg = path.resolve(__dirname, 'outdated-git')
var cache = path.resolve(pkg, 'cache')

var json = {
  name: 'outdated-git',
  author: 'Rocko Artischocko',
  description: 'fixture',
  version: '0.0.1',
  main: 'index.js',
  dependencies: {
    'foo-github': 'robertkowalski/foo',
    'foo-private': 'git://github.com/robertkowalski/foo-private.git',
    'foo-private-credentials': 'git://user:pass@github.com/robertkowalski/foo-private.git'
  }
}

test('setup', function (t) {
  setup()
  t.end()
})

test('discovers new versions in outdated', function (t) {
  process.chdir(pkg)
  t.plan(5)
  npm.load({cache: cache, registry: common.registry, loglevel: 'silent'}, function () {
    npm.commands.outdated([], function (er, d) {
      t.equal(d[0][3], 'git')
      t.equal(d[0][4], 'git')
      t.equal(d[0][5], 'github:robertkowalski/foo')
      t.equal(d[1][5], 'git://github.com/robertkowalski/foo-private.git')
      t.equal(d[2][5], 'git://user:pass@github.com/robertkowalski/foo-private.git')
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  mkdirp.sync(cache)
  fs.writeFileSync(path.join(pkg, 'package.json'), JSON.stringify(json, null, 2), 'utf8')
}

function cleanup () {
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/outdated-include-devdependencies.js
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test

var npm = require('../../')
var common = require('../common-tap.js')

// config
var pkg = path.resolve(__dirname, 'outdated-include-devdependencies')
var cache = path.resolve(pkg, 'cache')

var json = {
  author: 'Rocko Artischocko',
  name: 'ignore-shrinkwrap',
  version: '0.0.0',
  devDependencies: {
    underscore: '>=1.3.1'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(cache)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  t.end()
})

test('includes devDependencies in outdated', function (t) {
  process.chdir(pkg)
  mr({ port: common.port }, function (er, s) {
    npm.load({ cache: cache, registry: common.registry }, function () {
      npm.outdated(function (er, d) {
        t.equal('1.5.1', d[0][3])
        s.close()
        t.end()
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/outdated-json.js
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var server

var pkg = path.resolve(__dirname, 'outdated-json')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'outdated-json',
  author: 'Rockbert',
  version: '0.0.0',
  dependencies: {
    underscore: '~1.3.1'
  },
  devDependencies: {
    request: '~0.9.0'
  }
}

var expected = {
  underscore: {
    current: '1.3.3',
    wanted: '1.3.3',
    latest: '1.5.1',
    location: 'node_modules' + path.sep + 'underscore'
  },
  request: {
    current: '0.9.5',
    wanted: '0.9.5',
    latest: '2.27.0',
    location: 'node_modules' + path.sep + 'request'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
  mr({ port: common.port }, function (er, s) {
    t.ifError(er, 'mock registry should never fail to start')
    server = s
    common.npm(
      [
        '--registry', common.registry,
        '--silent',
        'install'
      ],
      EXEC_OPTS,
      function (err, code) {
        t.ifError(err, 'npm install ran without issue')
        t.notOk(code, 'npm install ran without raising error code')
        t.end()
      }
    )
  })
})

test('it should log json data', function (t) {
  common.npm(
    [
      '--registry', common.registry,
      '--silent',
      '--json',
      'outdated'
    ],
    EXEC_OPTS,
    function (err, code, stdout) {
      t.ifError(err, 'npm outdated ran without issue')
      t.notOk(code, 'npm outdated ran without raising error code')
      var out
      t.doesNotThrow(function () {
        out = JSON.parse(stdout)
      }, 'output correctly parsed as JSON')
      t.deepEqual(out, expected)
      t.end()
    }
  )
})

test('cleanup', function (t) {
  server.close()
  cleanup()
  t.end()
})

function cleanup () {
  // windows fix for locked files
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/outdated-local.js
var common = require('../common-tap.js')
var test = require('tap').test
var npm = require('../../')
var rimraf = require('rimraf')
var path = require('path')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var mkdirp = require('mkdirp')
var fs = require('graceful-fs')

var pkg = path.resolve(__dirname, 'outdated-local')
var pkgLocal = path.resolve(pkg, 'local-module')
var pkgScopedLocal = path.resolve(pkg, 'another-local-module')
var pkgLocalUnderscore = path.resolve(pkg, 'underscore')
var pkgLocalOptimist = path.resolve(pkg, 'optimist')

var pjParent = JSON.stringify({
  name: 'outdated-local',
  version: '1.0.0',
  dependencies: {
    'local-module': 'file:local-module', // updated locally, not on repo
    '@scoped/another-local-module': 'file:another-local-module', // updated locally, scoped, not on repo
    'underscore': 'file:underscore', // updated locally, updated but lesser version on repo
    'optimist': 'file:optimist' // updated locally, updated and greater version on repo
  }
}, null, 2) + '\n'

var pjLocal = JSON.stringify({
  name: 'local-module',
  version: '1.0.0'
}, null, 2) + '\n'

var pjLocalBumped = JSON.stringify({
  name: 'local-module',
  version: '1.1.0'
}, null, 2) + '\n'

var pjScopedLocal = JSON.stringify({
  name: '@scoped/another-local-module',
  version: '1.0.0'
}, null, 2) + '\n'

var pjScopedLocalBumped = JSON.stringify({
  name: '@scoped/another-local-module',
  version: '1.2.0'
}, null, 2) + '\n'

var pjLocalUnderscore = JSON.stringify({
  name: 'underscore',
  version: '1.3.1'
}, null, 2) + '\n'

var pjLocalUnderscoreBumped = JSON.stringify({
  name: 'underscore',
  version: '1.6.1'
}, null, 2) + '\n'

var pjLocalOptimist = JSON.stringify({
  name: 'optimist',
  version: '0.4.0'
}, null, 2) + '\n'

var pjLocalOptimistBumped = JSON.stringify({
  name: 'optimist',
  version: '0.5.0'
}, null, 2) + '\n'

function mocks (server) {
  server.get('/local-module')
    .reply(404)
  server.get('/@scoped%2fanother-local-module')
    .reply(404)
}

test('setup', function (t) {
  bootstrap()
  t.end()
})

test('outdated support local modules', function (t) {
  t.plan(4)
  process.chdir(pkg)
  mr({ port: common.port, plugin: mocks }, function (err, s) {
    t.ifError(err, 'mock registry started without problems')

    function verify (actual, expected) {
      for (var i = 0; i < expected.length; i++) {
        var current = expected[i]

        var found = false
        for (var j = 0; j < actual.length; j++) {
          var target = actual[j]
          var k
          for (k = 0; k < current.length; k++) {
            if (current[k] !== target[k]) break
          }
          if (k === current.length) found = true
        }

        if (!found) return false
      }
      return true
    }

    npm.load(
      {
        loglevel: 'silent',
        parseable: true,
        registry: common.registry
      },
      function () {
        npm.install('.', function (err) {
          t.ifError(err, 'install success')
          bumpLocalModules()
          npm.outdated(function (er, d) {
            t.ifError(er, 'outdated success')
            t.ok(verify(d, [
              [
                path.resolve(__dirname, 'outdated-local'),
                'local-module',
                '1.0.0',
                '1.1.0',
                '1.1.0',
                'file:local-module'
              ],
              [
                path.resolve(__dirname, 'outdated-local'),
                '@scoped/another-local-module',
                '1.0.0',
                '1.2.0',
                '1.2.0',
                'file:another-local-module'
              ],
              [
                path.resolve(__dirname, 'outdated-local'),
                'underscore',
                '1.3.1',
                '1.6.1',
                '1.5.1',
                'file:underscore'
              ],
              [
                path.resolve(__dirname, 'outdated-local'),
                'optimist',
                '0.4.0',
                '0.6.0',
                '0.6.0',
                'optimist@0.6.0'
              ]
            ]), 'got expected outdated output')
            s.close()
          })
        })
      }
    )
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function bootstrap () {
  mkdirp.sync(pkg)
  fs.writeFileSync(path.resolve(pkg, 'package.json'), pjParent)

  mkdirp.sync(pkgLocal)
  fs.writeFileSync(path.resolve(pkgLocal, 'package.json'), pjLocal)

  mkdirp.sync(pkgScopedLocal)
  fs.writeFileSync(path.resolve(pkgScopedLocal, 'package.json'), pjScopedLocal)

  mkdirp.sync(pkgLocalUnderscore)
  fs.writeFileSync(path.resolve(pkgLocalUnderscore, 'package.json'), pjLocalUnderscore)

  mkdirp.sync(pkgLocalOptimist)
  fs.writeFileSync(path.resolve(pkgLocalOptimist, 'package.json'), pjLocalOptimist)
}

function bumpLocalModules () {
  fs.writeFileSync(path.resolve(pkgLocal, 'package.json'), pjLocalBumped)
  fs.writeFileSync(path.resolve(pkgScopedLocal, 'package.json'), pjScopedLocalBumped)
  fs.writeFileSync(path.resolve(pkgLocalUnderscore, 'package.json'), pjLocalUnderscoreBumped)
  fs.writeFileSync(path.resolve(pkgLocalOptimist, 'package.json'), pjLocalOptimistBumped)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/outdated-long.js
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')
var npm = require('../../')

// config
var pkg = path.resolve(__dirname, 'outdated-long')
var cache = path.resolve(pkg, 'cache')

var json = {
  name: 'outdated-long',
  description: 'fixture',
  version: '0.0.1',
  dependencies: {
    underscore: '1.3.1'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(cache)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
  t.end()
})

test('it should not throw', function (t) {
  var originalLog = console.log
  var output = []
  var expOut = [
    path.resolve(pkg, 'node_modules', 'underscore'),
    path.resolve(pkg, 'node_modules', 'underscore') +
      ':underscore@1.3.1' +
      ':underscore@1.3.1' +
      ':underscore@1.5.1' +
      ':dependencies'
  ]

  var expData = [
    [
      pkg,
      'underscore',
      '1.3.1',
      '1.3.1',
      '1.5.1',
      '1.3.1',
      'dependencies'
    ]
  ]

  console.log = function () {
    output.push.apply(output, arguments)
  }

  mr({ port: common.port }, function (er, s) {
    npm.load(
      {
        cache: 'cache',
        loglevel: 'silent',
        parseable: true,
        registry: common.registry
      },
      function () {
        npm.install('.', function (err) {
          t.ifError(err, 'install success')
          npm.config.set('long', true)
          npm.outdated(function (er, d) {
            t.ifError(er, 'outdated success')

            console.log = originalLog

            t.same(output, expOut)
            t.same(d, expData)

            s.close()
            t.end()
          })
        })
      }
    )
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/outdated-new-versions.js
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')
var npm = require('../../')

var pkg = path.resolve(__dirname, 'outdated-new-versions')
var cache = path.resolve(pkg, 'cache')

var json = {
  name: 'new-versions-with-outdated',
  author: 'Rockbert',
  version: '0.0.0',
  dependencies: {
    underscore: '~1.3.1'
  },
  devDependencies: {
    request: '~0.9.0'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(cache)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  t.end()
})

test('discovers new versions in outdated', function (t) {
  process.chdir(pkg)
  t.plan(2)
  mr({ port: common.port }, function (er, s) {
    npm.load({ cache: cache, registry: common.registry }, function () {
      npm.outdated(function (er, d) {
        for (var i = 0; i < d.length; i++) {
          if (d[i][1] === 'underscore') t.equal('1.5.1', d[i][4])
          if (d[i][1] === 'request') t.equal('2.27.0', d[i][4])
        }
        s.close()
        t.end()
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/outdated-notarget.js
// Fixes Issue #1770
var common = require('../common-tap.js')
var test = require('tap').test
var npm = require('../../')
var osenv = require('osenv')
var path = require('path')
var fs = require('fs')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var pkg = path.resolve(__dirname, 'outdated-notarget')
var cache = path.resolve(pkg, 'cache')
var mr = require('npm-registry-mock')

test('outdated-target: if no viable version is found, show error', function (t) {
  t.plan(1)
  setup()
  mr({ port: common.port }, function (er, s) {
    npm.load({ cache: cache, registry: common.registry }, function () {
      npm.commands.update(function (er) {
        t.equal(er.code, 'ETARGET')
        s.close()
        t.end()
      })
    })
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})

function setup () {
  mkdirp.sync(pkg)
  mkdirp.sync(cache)
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify({
    author: 'Evan Lucas',
    name: 'outdated-notarget',
    version: '0.0.0',
    description: 'Test for outdated-target',
    dependencies: {
      underscore: '~199.7.1'
    }
  }), 'utf8')
  process.chdir(pkg)
}
npm_3.5.2.orig/test/tap/outdated-private.js
var common = require('../common-tap.js')
var test = require('tap').test
var npm = require('../../')
var rimraf = require('rimraf')
var path = require('path')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var mkdirp = require('mkdirp')
var fs = require('graceful-fs')

var pkg = path.resolve(__dirname, 'outdated-private')
var pkgLocalPrivate = path.resolve(pkg, 'local-private')
var pkgScopedLocalPrivate = path.resolve(pkg, 'another-local-private')
var pkgLocalUnderscore = path.resolve(pkg, 'underscore')

var pjParent = JSON.stringify({
  name: 'outdated-private',
  version: '1.0.0',
  dependencies: {
    'local-private': 'file:local-private',
    '@scoped/another-local-private': 'file:another-local-private',
    'underscore': 'file:underscore'
  }
}, null, 2) + '\n'

var pjLocalPrivate = JSON.stringify({
  name: 'local-private',
  version: '1.0.0',
  'private': true
}, null, 2) + '\n'

var pjLocalPrivateBumped = JSON.stringify({
  name: 'local-private',
  version: '1.1.0',
  'private': true
}, null, 2) + '\n'

var pjScopedLocalPrivate = JSON.stringify({
  name: '@scoped/another-local-private',
  version: '1.0.0',
  'private': true
}, null, 2) + '\n'

var pjLocalUnderscore = JSON.stringify({
  name: 'underscore',
  version: '1.3.1'
}, null, 2) + '\n'

test('setup', function (t) {
  bootstrap()
  t.end()
})

test('outdated ignores private modules', function (t) {
  t.plan(3)
  process.chdir(pkg)
  mr({ port: common.port }, function (er, s) {
    npm.load(
      {
        loglevel: 'silent',
        parseable: true,
        registry: common.registry
      },
      function () {
        npm.install('.', function (err) {
          t.ifError(err, 'install success')
          bumpLocalPrivate()
          npm.outdated(function (er, d) {
            t.ifError(er, 'outdated success')
            t.deepEqual(d, [[
              path.resolve(__dirname, 'outdated-private'),
              'underscore',
              '1.3.1',
              '1.5.1',
              '1.5.1',
              'underscore@1.5.1',
              null
            ]])
            s.close()
          })
        })
      }
    )
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function bootstrap () {
  mkdirp.sync(pkg)
  fs.writeFileSync(path.resolve(pkg, 'package.json'), pjParent)

  mkdirp.sync(pkgLocalPrivate)
  fs.writeFileSync(path.resolve(pkgLocalPrivate, 'package.json'), pjLocalPrivate)

  mkdirp.sync(pkgScopedLocalPrivate)
  fs.writeFileSync(path.resolve(pkgScopedLocalPrivate, 'package.json'), pjScopedLocalPrivate)

  mkdirp.sync(pkgLocalUnderscore)
  fs.writeFileSync(path.resolve(pkgLocalUnderscore, 'package.json'), pjLocalUnderscore)
}

function bumpLocalPrivate () {
  fs.writeFileSync(path.resolve(pkgLocalPrivate, 'package.json'), pjLocalPrivateBumped)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/outdated.js
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test

var npm = require('../../')
var common = require('../common-tap.js')

// config
var pkg = path.resolve(__dirname, 'outdated')
var cache = path.resolve(pkg, 'cache')
var originalLog

var json = {
  name: 'outdated',
  description: 'fixture',
  version: '0.0.1',
  dependencies: {
    underscore: '1.3.1',
    async: '0.2.9',
    checker: '0.5.1'
  }
}

test('setup', function (t) {
  cleanup()
  originalLog = console.log
  mkdirp.sync(cache)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)
  t.end()
})

test('it should not throw', function (t) {
  var output = []
  var expOut = [
    path.resolve(pkg, 'node_modules', 'async') +
      ':async@0.2.9' +
      ':async@0.2.9' +
      ':async@0.2.10' +
      '\n' +
      path.resolve(pkg, 'node_modules', 'checker') +
      ':checker@0.5.1' +
      ':checker@0.5.1' +
      ':checker@0.5.2' +
      '\n' +
      path.resolve(pkg, 'node_modules', 'underscore') +
      ':underscore@1.3.1' +
      ':underscore@1.3.1' +
      ':underscore@1.5.1'
  ]

  var expData = [
    [
      pkg,
      'async',
      '0.2.9',
      '0.2.9',
      '0.2.10',
      '0.2.9',
      null
    ],
    [
      pkg,
      'checker',
      '0.5.1',
      '0.5.1',
      '0.5.2',
      '0.5.1',
      null
    ],
    [
      pkg,
      'underscore',
      '1.3.1',
      '1.3.1',
      '1.5.1',
      '1.3.1',
      null
    ]
  ]

  console.log = function () {}
  mr({ port: common.port }, function (er, s) {
    npm.load(
      {
        cache: 'cache',
        loglevel: 'silent',
        parseable: true,
        registry: common.registry
      },
      function () {
        npm.install('.', function (err) {
          t.ifError(err, 'install success')
          console.log = function () {
            output.push.apply(output, arguments)
          }
          npm.outdated(function (er, d) {
            t.ifError(er, 'outdated success')

            console.log = originalLog

            t.same(output, expOut)
            t.same(d, expData)

            s.close()
            t.end()
          })
        })
      }
    )
  })
})

test('cleanup', function (t) {
  cleanup()
  console.log = originalLog
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}
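Most of the outdated fixtures above share the same harness shape: boot npm-registry-mock on a fixed port, point the programmatic npm API at it via npm.load, run the command under test, and close the mock server before ending. The condensed sketch below is not a file from this tarball; it only restates that recurring pattern, using calls that appear in the tests above:

// harness-sketch.js — illustrative restatement of the recurring test harness
// pattern above; not part of the npm 3.5.2 sources
var mr = require('npm-registry-mock')
var npm = require('../../')
var common = require('../common-tap.js')

mr({ port: common.port }, function (er, server) {
  if (er) throw er
  // point the programmatic npm API at the mock registry
  npm.load({ registry: common.registry, loglevel: 'silent' }, function () {
    npm.outdated(function (err, rows) {
      if (err) throw err
      console.log('outdated rows:', rows)
      server.close() // always shut the mock down, or the tap process hangs
    })
  })
})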
npm_3.5.2.orig/test/tap/override-bundled.js
'use strict'
var test = require('tap').test
var fs = require('fs')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var path = require('path')
var common = require('../common-tap.js')

var testname = path.basename(__filename, '.js')
var testdir = path.resolve(__dirname, testname)
var testmod = path.resolve(testdir, 'top-test')

var bundleupdatesrc = path.resolve(testmod, 'bundle-update')
var bundleupdateNEW = path.resolve(bundleupdatesrc, 'NEW')
var bundleupdateNEWpostinstall = path.resolve(testdir, 'node_modules', 'top-test', 'node_modules', 'bundle-update', 'NEW')
var bundleupdatebad = path.resolve(testmod, 'node_modules', 'bundle-update')

var bundlekeepsrc = path.resolve(testmod, 'bundle-keep')
var bundlekeep = path.resolve(testmod, 'node_modules', 'bundle-keep')
var bundlekeepOLD = path.resolve(bundlekeep, 'OLD')
var bundlekeepOLDpostinstall = path.resolve(testdir, 'node_modules', 'top-test', 'node_modules', 'bundle-keep', 'OLD')

var bundledeepsrc = path.resolve(testmod, 'bundle-deep')
var bundledeep = path.resolve(testmod, 'node_modules', 'bundle-deep')
var bundledeepOLD = path.resolve(bundledeep, 'OLD')
var bundledeepOLDpostinstall = path.resolve(testdir, 'node_modules', 'top-test', 'node_modules', 'bundle-deep', 'OLD')

var bundledeepupdatesrc = path.resolve(testmod, 'bundle-deep-update')
var bundledeepupdate = path.resolve(bundledeep, 'node_modules', 'bundle-deep-update')
var bundledeepupdateNEW = path.resolve(bundledeepupdatesrc, 'NEW')
var bundledeepupdateNEWpostinstall = path.resolve(testdir, 'node_modules', 'top-test', 'node_modules', 'bundle-deep', 'node_modules', 'bundle-deep-update', 'NEW')

var testjson = {
  dependencies: {'top-test': 'file:top-test/'}
}

var testmodjson = {
  name: 'top-test',
  version: '1.0.0',
  dependencies: {
    'bundle-update': bundleupdatesrc,
    'bundle-keep': bundlekeepsrc,
    'bundle-deep': bundledeepsrc
  },
  bundledDependencies: ['bundle-update', 'bundle-keep', 'bundle-deep']
}

var bundlejson = {
  name: 'bundle-update',
  version: '1.0.0'
}

var bundlekeepjson = {
  name: 'bundle-keep',
  version: '1.0.0',
  _requested: {
    rawSpec: bundlekeepsrc
  }
}

var bundledeepjson = {
  name: 'bundle-deep',
  version: '1.0.0',
  dependencies: {
    'bundle-deep-update': bundledeepupdatesrc
  },
  _requested: {
    rawSpec: bundledeepsrc
  }
}

var bundledeepupdatejson = {
  version: '1.0.0',
  name: 'bundle-deep-update'
}

function writepjs (dir, content) {
  fs.writeFileSync(path.join(dir, 'package.json'), JSON.stringify(content, null, 2))
}

function setup () {
  mkdirp.sync(testdir)
  writepjs(testdir, testjson)
  mkdirp.sync(testmod)
  writepjs(testmod, testmodjson)

  mkdirp.sync(bundleupdatesrc)
  writepjs(bundleupdatesrc, bundlejson)
  fs.writeFileSync(bundleupdateNEW, '')
  mkdirp.sync(bundleupdatebad)
  writepjs(bundleupdatebad, bundlejson)

  mkdirp.sync(bundlekeepsrc)
  writepjs(bundlekeepsrc, bundlekeepjson)
  mkdirp.sync(bundlekeep)
  writepjs(bundlekeep, bundlekeepjson)
  fs.writeFileSync(bundlekeepOLD, '')
  mkdirp.sync(bundledeepsrc)
  writepjs(bundledeepsrc, bundledeepjson)
  mkdirp.sync(bundledeep)
  writepjs(bundledeep, bundledeepjson)
  fs.writeFileSync(bundledeepOLD, '')

  mkdirp.sync(bundledeepupdatesrc)
  writepjs(bundledeepupdatesrc, bundledeepupdatejson)
  mkdirp.sync(bundledeepupdate)
  writepjs(bundledeepupdate, bundledeepupdatejson)
  fs.writeFileSync(bundledeepupdateNEW, '')
}

function cleanup () {
  rimraf.sync(testdir)
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('bundled', function (t) {
  common.npm(['install', '--loglevel=warn'], {cwd: testdir}, function (err, code, stdout, stderr) {
    if (err) throw err
    t.plan(8)
    t.is(code, 0, 'npm itself completed ok')

    // This tests that after the install we have a freshly installed version
    // of `bundle-update` (in alignment with the package.json), instead of the
    // version that was bundled with `top-test`.
    // If npm doesn't do this, and selects the bundled version, things go very
    // wrong because npm thinks it has a different module (with different
    // metadata) installed in that location and will go off and try to do
    // _things_ to it. Things like chmod in particular, which in turn results
    // in the dreaded ENOENT errors.
    t.like(stderr, new RegExp('npm WARN ' + testname), "didn't stomp on other warnings")
    t.like(stderr, /npm WARN.*bundle-update/, 'included update warning about bundled dep')
    t.like(stderr, /npm WARN.*bundle-deep-update/, 'included update warning about deeply bundled dep')
    fs.stat(bundleupdateNEWpostinstall, function (missing) {
      t.ok(!missing, 'package.json overrode bundle')
    })
    fs.stat(bundledeepupdateNEWpostinstall, function (missing) {
      t.ok(!missing, 'deep package.json overrode bundle')
    })

    // Relatedly, when upgrading, if a bundled module is replacing an existing
    // module we want to choose the bundled version, not the version we're replacing.
    fs.stat(bundlekeepOLDpostinstall, function (missing) {
      t.ok(!missing, 'no override when package.json matches')
    })
    fs.stat(bundledeepOLDpostinstall, function (missing) {
      t.ok(!missing, 'deep no override when package.json matches')
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})
npm_3.5.2.orig/test/tap/owner.js
var mr = require('npm-registry-mock')
var test = require('tap').test

var common = require('../common-tap.js')

var server

var EXEC_OPTS = {}

var jashkenas = {
  name: 'jashkenas',
  email: 'jashkenas@gmail.com'
}
var othiym23 = {
  name: 'othiym23',
  email: 'forrest@npmjs.com'
}
var bcoe = {
  name: 'bcoe',
  email: 'ben@npmjs.com'
}

function shrt (user) {
  return user.name + ' <' + user.email + '>\n'
}

function mocks (server) {
  server.get('/-/user/org.couchdb.user:othiym23')
    .many().reply(200, othiym23)

  // test 1
  server.get('/underscore')
    .reply(200, {
      _id: 'underscore',
      _rev: 1,
      maintainers: [jashkenas]
    })
  server.put(
    '/underscore/-rev/1',
    {
      _id: 'underscore',
      _rev: 1,
      maintainers: [jashkenas, othiym23]
    },
    {}
  ).reply(200, {
    _id: 'underscore',
    _rev: 2,
    maintainers: [jashkenas, othiym23]
  })

  // test 2
  server.get('/@xxx%2fscoped')
    .reply(200, {
      _id: '@xxx/scoped',
      _rev: 1,
      maintainers: [bcoe]
    })
  server.put(
    '/@xxx%2fscoped/-rev/1',
    {
      _id: '@xxx/scoped',
      _rev: 1,
      maintainers: [bcoe, othiym23]
    },
    {}
  ).reply(200, {
    _id: '@xxx/scoped',
    _rev: 2,
    maintainers: [bcoe, othiym23]
  })

  // test 3
  server.get('/underscore')
    .reply(200, {
      _id: 'underscore',
      _rev: 2,
      maintainers: [jashkenas, othiym23]
    })

  // test 4
  server.get('/underscore')
    .reply(200, {
      _id: 'underscore',
      _rev: 2,
      maintainers: [jashkenas, othiym23]
    })
  server.put(
    '/underscore/-rev/2',
    {
      _id: 'underscore',
      _rev: 2,
      maintainers: [jashkenas]
    },
    {}
  ).reply(200, {
    _id: 'underscore',
    _rev: 3,
    maintainers: [jashkenas]
  })
}

test('setup', function (t) {
  common.npm(
    [
      '--loglevel', 'silent',
      'cache', 'clean'
    ],
    EXEC_OPTS,
    function (err, code) {
      t.ifError(err, 'npm cache clean ran without error')
      t.notOk(code, 'npm cache clean exited cleanly')

      mr({ port: common.port, plugin: mocks }, function (er, s) {
        server = s
        t.end()
      })
    }
  )
})

test('npm owner add', function (t) {
  common.npm(
    [
      '--loglevel', 'silent',
      '--registry', common.registry,
      'owner', 'add', 'othiym23', 'underscore'
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'npm owner add ran without error')
      t.notOk(code, 'npm owner add exited cleanly')
      t.notOk(stderr, 'npm owner add ran silently')
      t.equal(stdout, '+ othiym23 (underscore)\n', 'got expected add output')

      t.end()
    }
  )
})

test('npm owner add (scoped)', function (t) {
  common.npm(
    [
      '--loglevel', 'silent',
      '--registry', common.registry,
      'owner', 'add', 'othiym23', '@xxx/scoped'
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'npm owner add (scoped) ran without error')
      t.notOk(code, 'npm owner add (scoped) exited cleanly')
      t.notOk(stderr, 'npm owner add (scoped) ran silently')
      t.equal(stdout, '+ othiym23 (@xxx/scoped)\n', 'got expected scoped add output')

      t.end()
    }
  )
})

test('npm owner ls', function (t) {
  common.npm(
    [
      '--loglevel', 'silent',
      '--registry', common.registry,
      'owner', 'ls', 'underscore'
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'npm owner ls ran without error')
      t.notOk(code, 'npm owner ls exited cleanly')
      t.notOk(stderr, 'npm owner ls ran silently')
      t.equal(stdout, shrt(jashkenas) + shrt(othiym23), 'got expected ls output')

      t.end()
    }
  )
})

test('npm owner rm', function (t) {
  common.npm(
    [
      '--loglevel', 'silent',
      '--registry', common.registry,
      'owner', 'rm', 'othiym23', 'underscore'
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      t.ifError(err, 'npm owner rm ran without error')
      t.notOk(code, 'npm owner rm exited cleanly')
      t.notOk(stderr, 'npm owner rm ran silently')
      t.equal(stdout, '- othiym23 (underscore)\n', 'got expected rm output')

      t.end()
    }
  )
})

test('cleanup', function (t) {
  server.close()
  t.end()
})
npm_3.5.2.orig/test/tap/pack-scoped.js
// verify that prepublish runs on pack and publish
var test = require('tap').test
var common = require('../common-tap')
var fs = require('graceful-fs')
var join = require('path').join
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var pkg = join(__dirname, 'scoped_package')
var manifest = join(pkg, 'package.json')
var tmp = join(pkg, 'tmp')
var cache = join(pkg, 'cache')

var data = {
  name: '@scope/generic-package',
  version: '90000.100001.5'
}

test('setup', function (t) {
  var n = 0

  rimraf.sync(pkg)

  mkdirp(pkg, then())
  mkdirp(cache, then())
  mkdirp(tmp, then())

  function then () {
    n++
    return function (er) {
      t.ifError(er)
      if (--n === 0) next()
    }
  }

  function next () {
    fs.writeFile(manifest, JSON.stringify(data), 'ascii', done)
  }

  function done (er) {
    t.ifError(er)

    t.pass('setup done')
    t.end()
  }
})

test('test', function (t) {
  var env = {
    'npm_config_cache': cache,
    'npm_config_tmp': tmp,
    'npm_config_prefix': pkg,
    'npm_config_global': 'false'
  }

  for (var i in process.env) {
    if (!/^npm_config_/.test(i)) env[i] = process.env[i]
  }

  common.npm([
    'pack',
    '--loglevel', 'warn'
  ], {
    cwd: pkg,
    env: env
  }, function (err, code, stdout, stderr) {
    t.ifErr(err, 'npm pack finished without error')
    t.equal(code, 0, 'npm pack exited ok')
    t.notOk(stderr, 'got stderr data: ' + JSON.stringify('' + stderr))

    stdout = stdout.trim()
    var regex = new RegExp('scope-generic-package-90000.100001.5.tgz', 'ig')
    t.ok(stdout.match(regex), 'found package')
    t.end()
  })
})

test('cleanup', function (t) {
  rimraf.sync(pkg)
  t.pass('cleaned up')
  t.end()
})
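pack-scoped.js above (like noargs-install-config-save.js earlier) builds the child's environment by copying the parent environment while dropping every inherited npm_config_* variable, then setting only the config the test wants. The sketch below restates that pattern in isolation; it is not part of the npm sources, and the helper name npmEnv and the paths are illustrative:

// env-sketch.js — illustrative restatement of the env-scrubbing pattern used
// in pack-scoped.js; not part of the npm 3.5.2 sources
var spawn = require('child_process').spawn

function npmEnv (overrides) {
  var env = {}
  // start from the parent environment, dropping any inherited npm config
  for (var key in process.env) {
    if (!/^npm_config_/.test(key)) env[key] = process.env[key]
  }
  // then apply only the npm config this test actually wants
  for (var name in overrides) {
    env['npm_config_' + name] = overrides[name]
  }
  return env
}

// usage: run `npm pack` with a private cache and no global side effects
var child = spawn(process.execPath, [require.resolve('../../bin/npm-cli.js'), 'pack'], {
  env: npmEnv({ cache: '/tmp/test-cache', global: 'false' })
})
child.on('close', function (code) { console.log('npm exited', code) })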
npm_3.5.2.orig/test/tap/peer-deps-invalid.js
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var npm = require('../../')
var common = require('../common-tap')

var pkg = path.resolve(__dirname, 'peer-deps-invalid')
var cache = path.resolve(pkg, 'cache')

var json = {
  author: 'Domenic Denicola <domenic@domenicdenicola.com> (http://domenicdenicola.com/)',
  name: 'peer-deps-invalid',
  version: '0.0.0',
  dependencies: {
    'npm-test-peer-deps-file': 'http://localhost:1337/ok.js',
    'npm-test-peer-deps-file-invalid': 'http://localhost:1337/invalid.js'
  }
}

var fileFail = function () { /**package
* { "name": "npm-test-peer-deps-file-invalid"
* , "main": "index.js"
* , "version": "1.2.3"
* , "description":"This one should conflict with the other one"
* , "peerDependencies": { "underscore": "1.3.3" }
* } **/
  module.exports = 'I\'m just a lonely index, naked as the day I was born.'
}.toString().split('\n').slice(1, -1).join('\n')

var fileOK = function () { /**package
* { "name": "npm-test-peer-deps-file"
* , "main": "index.js"
* , "version": "1.2.3"
* , "description":"No package.json in sight!"
* , "peerDependencies": { "underscore": "1.3.1" }
* , "dependencies": { "mkdirp": "0.3.5" }
* } **/
  module.exports = 'I\'m just a lonely index, naked as the day I was born.'
}.toString().split('\n').slice(1, -1).join('\n')

test('setup', function (t) {
  cleanup()
  mkdirp.sync(cache)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  fs.writeFileSync(path.join(pkg, 'file-ok.js'), fileOK)
  fs.writeFileSync(path.join(pkg, 'file-fail.js'), fileFail)
  process.chdir(pkg)
  t.end()
})

test('installing dependencies that have conflicting peerDependencies', function (t) {
  var customMocks = {
    'get': {
      '/ok.js': [200, path.join(pkg, 'file-ok.js')],
      '/invalid.js': [200, path.join(pkg, 'file-fail.js')]
    }
  }
  mr({port: common.port, mocks: customMocks}, function (err, s) {
    t.ifError(err, 'mock registry started')
    npm.load(
      {
        cache: cache,
        registry: common.registry
      },
      function () {
        npm.commands.install([], function (err, additions, tree) {
          t.error(err)
          var invalid = tree.warnings.filter(function (warning) {
            return warning.code === 'EPEERINVALID'
          })
          t.is(invalid.length, 2)
          s.close()
          t.end()
        })
      }
    )
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/peer-deps-toplevel.js
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')
var npm = require('../../')

var pkg = path.resolve(__dirname, 'peer-deps-toplevel')

var expected = {
  name: 'npm-test-peer-deps-toplevel',
  version: '0.0.0',
  problems: [
    'peer dep missing: mkdirp@*, required by npm-test-peer-deps-toplevel@0.0.0',
    'peer dep missing: request@0.9.x, required by npm-test-peer-deps@0.0.0'
  ],
  dependencies: {
    'npm-test-peer-deps': {
      version: '0.0.0',
      from: 'npm-test-peer-deps@*',
      resolved: common.registry + '/npm-test-peer-deps/-/npm-test-peer-deps-0.0.0.tgz',
      dependencies: {
        underscore: {
          version: '1.3.1',
          from: 'underscore@1.3.1',
          resolved: common.registry + '/underscore/-/underscore-1.3.1.tgz'
        }
      }
    },
    mkdirp: {
      peerMissing: true,
      required: {
        _id: 'mkdirp@*',
        name: 'mkdirp',
        version: '*',
        peerMissing: [
          {requiredBy: 'npm-test-peer-deps-toplevel@0.0.0', requires: 'mkdirp@*'}
        ],
        dependencies: {}
      }
    },
    request: {
      peerMissing: true,
      required: {
        _id: 'request@0.9.x',
        dependencies: {},
        name: 'request',
        peerMissing: [
          {requiredBy: 'npm-test-peer-deps@0.0.0', requires: 'request@0.9.x'}
        ],
        version: '0.9.x'
      }
    }
  }
}

var json = {
  author: 'Domenic Denicola',
  name: 'npm-test-peer-deps-toplevel',
  version: '0.0.0',
  dependencies: {
    'npm-test-peer-deps': '*'
  },
  peerDependencies: {
    mkdirp: '*'
  }
}

test('installs the peer dependency directory structure', function (t) {
test('installs the peer dependency directory structure', function (t) {
  mr({ port: common.port }, function (er, s) {
    setup(function (err) {
      t.ifError(err, 'setup ran successfully')
      npm.install('.', function (err) {
        t.ifError(err, 'packages were installed')
        npm.commands.ls([], true, function (err, _, results) {
          t.ifError(err, 'listed tree without problems')
          t.deepEqual(results, expected, 'got expected output from ls')
          s.close()
          t.end()
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup (cb) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    registry: common.registry
  }
  npm.load(opts, cb)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/peer-deps-without-package-json.js0000644000000000000000000000411712631326456021523 0ustar 00000000000000
var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var npm = require('../../')
var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'peer-deps-without-package-json')
var cache = path.resolve(pkg, 'cache')
var nodeModules = path.resolve(pkg, 'node_modules')

var fileJS = function () {
/**package
 * { "name": "npm-test-peer-deps-file"
 * , "main": "index.js"
 * , "version": "1.2.3"
 * , "description":"No package.json in sight!"
 * , "peerDependencies": { "underscore": "1.3.1" }
 * , "dependencies": { "mkdirp": "0.3.5" }
 * }
 **/
module.exports = 'I\'m just a lonely index, naked as the day I was born.'
}.toString().split('\n').slice(1, -1).join('\n')

test('setup', function (t) {
  t.comment('test for https://github.com/npm/npm/issues/3049')
  cleanup()
  mkdirp.sync(cache)
  mkdirp.sync(nodeModules)
  fs.writeFileSync(path.join(pkg, 'file-js.js'), fileJS)
  process.chdir(pkg)
  t.end()
})

test('installing a peerDeps-using package without package.json', function (t) {
  var customMocks = {
    'get': {
      '/ok.js': [200, path.join(pkg, 'file-js.js')]
    }
  }
  mr({port: common.port, mocks: customMocks}, function (err, s) {
    t.ifError(err, 'mock registry booted')
    npm.load({
      registry: common.registry,
      cache: cache
    }, function () {
      npm.install(common.registry + '/ok.js', function (err, additions, result) {
        t.ifError(err, 'installed ok.js')
        t.ok(
          fs.existsSync(path.join(nodeModules, 'npm-test-peer-deps-file')),
          'passive peer dep installed'
        )
        var invalid = result.warnings.filter(function (warning) {
          return warning.code === 'EPEERINVALID'
        })
        t.is(invalid.length, 1, 'got a warning for a missing/invalid peer dep')
        t.end()
        s.close() // shut down the mock registry
      })
    })
  })
})
test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/peer-deps.js0000644000000000000000000000271212631326456015461 0ustar 00000000000000
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')
var npm = require('../../')

var pkg = path.resolve(__dirname, 'peer-deps')

var expected = [
  'peer dep missing: request@0.9.x, required by npm-test-peer-deps@0.0.0'
]

var json = {
  author: 'Domenic Denicola',
  name: 'npm-test-peer-deps-installer',
  version: '0.0.0',
  dependencies: {
    'npm-test-peer-deps': '*'
  }
}

test('installs the peer dependency directory structure', function (t) {
  mr({ port: common.port }, function (er, s) {
    setup(function (err) {
      if (err) return t.fail(err)
      npm.install('.', function (err) {
        if (err) return t.fail(err)
        npm.commands.ls([], true, function (err, _, results) {
          if (err) return t.fail(err)
          t.deepEqual(results.problems, expected)
          s.close()
          t.end()
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup (cb) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  process.chdir(pkg)

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    registry: common.registry
  }
  npm.load(opts, cb)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/ping.js0000644000000000000000000000245512631326456014536 0ustar 00000000000000
var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'ping')
var opts = { cwd: pkg }
var outfile = path.join(pkg, '_npmrc')
var contents = function () {
}.toString().split('\n').slice(1, -1).join('\n')

var pingResponse = {
  host: 'registry.npmjs.org',
  ok: true,
  username: null,
  peer: 'example.com'
}

function mocks (server) {
  server.get('/-/ping?write=true').reply(200, JSON.stringify(pingResponse))
}
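// `npm ping` issues a GET against `<registry>/-/ping?write=true` (per the
// mock above) and, on success, writes the JSON body it got back to stdout;
// the test below round-trips that through JSON.parse. Roughly (shape only,
// exact formatting of the CLI output may differ):
//
//   $ npm ping --registry http://localhost:1337 --loglevel silent
//   {"host":"registry.npmjs.org","ok":true,"username":null,"peer":"example.com"}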
test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('npm ping', function (t) {
  mr({ port: common.port, plugin: mocks }, function (err, s) {
    if (err) throw err

    common.npm([
      'ping',
      '--registry', common.registry,
      '--loglevel', 'silent',
      '--userconfig', outfile
    ], opts, function (err, code, stdout) {
      s.close()
      t.ifError(err, 'no error output')
      t.notOk(code, 'exited OK')
      t.same(JSON.parse(stdout), pingResponse)
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  mkdirp.sync(pkg)
  fs.writeFileSync(outfile, contents)
}

function cleanup () {
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/prepublish.js0000644000000000000000000000350112631326456015747 0ustar 00000000000000
// verify that prepublish runs on pack and publish
var common = require('../common-tap')
var test = require('tap').test
var fs = require('graceful-fs')
var join = require('path').join
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var pkg = join(__dirname, 'prepublish_package')
var tmp = join(pkg, 'tmp')
var cache = join(pkg, 'cache')

test('setup', function (t) {
  var n = 0
  cleanup()
  mkdirp(pkg, then())
  mkdirp(cache, then())
  mkdirp(tmp, then())

  function then () {
    n++
    return function (er) {
      if (er) throw er
      if (--n === 0) next()
    }
  }

  function next () {
    fs.writeFile(join(pkg, 'package.json'), JSON.stringify({
      name: 'npm-test-prepublish',
      version: '1.2.5',
      scripts: { prepublish: 'echo ok' }
    }), 'ascii', function (er) {
      if (er) throw er

      t.pass('setup done')
      t.end()
    })
  }
})

test('test', function (t) {
  var env = {
    'npm_config_cache': cache,
    'npm_config_tmp': tmp,
    'npm_config_prefix': pkg,
    'npm_config_global': 'false'
  }

  for (var i in process.env) {
    if (!/^npm_config_/.test(i)) {
      env[i] = process.env[i]
    }
  }

  common.npm([
    'pack',
    '--loglevel', 'warn'
  ], { cwd: pkg, env: env }, function (err, code, stdout, stderr) {
    t.equal(code, 0, 'pack finished successfully')
    t.ifErr(err, 'pack finished successfully')

    t.notOk(stderr, 'got stderr data:' + JSON.stringify('' + stderr))

    var c = stdout.trim()
    var regex = new RegExp('' +
      '> npm-test-prepublish@1.2.5 prepublish [^\\r\\n]+\\r?\\n' +
      '> echo ok\\r?\\n' +
      '\\r?\\n' +
      'ok\\r?\\n' +
      'npm-test-prepublish-1.2.5.tgz', 'ig')

    t.ok(c.match(regex))
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.pass('cleaned up')
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/progress-config.js0000644000000000000000000000272212631326456016705 0ustar 00000000000000
'use strict'
var test = require('tap').test
var log = require('npmlog')
// We use requireInject to get a fresh copy of
// the npm singleton each time we require it.
// If we didn't, we'd have shared state between
// these various tests.
var requireInject = require('require-inject')

// Make sure existing environment vars don't muck up the test
process.env = {}

test('disabled', function (t) {
  t.plan(1)
  var npm = requireInject('../../lib/npm.js', {})
  npm.load({progress: false}, function () {
    t.is(log.progressEnabled, false, 'should be disabled')
  })
})

test('enabled', function (t) {
  t.plan(1)
  var npm = requireInject('../../lib/npm.js', {})
  npm.load({progress: true}, function () {
    t.is(log.progressEnabled, true, 'should be enabled')
  })
})

test('default', function (t) {
  t.plan(1)
  var npm = requireInject('../../lib/npm.js', {})
  npm.load({}, function () {
    t.is(log.progressEnabled, true, 'should be enabled')
  })
})

test('default-travis', function (t) {
  t.plan(1)
  global.process.env.TRAVIS = 'true'
  var npm = requireInject('../../lib/npm.js', {})
  npm.load({}, function () {
    t.is(log.progressEnabled, false, 'should be disabled')
    delete global.process.env.TRAVIS
  })
})

test('default-ci', function (t) {
  t.plan(1)
  global.process.env.CI = 'true'
  var npm = requireInject('../../lib/npm.js', {})
  npm.load({}, function () {
    t.is(log.progressEnabled, false, 'should be disabled')
    delete global.process.env.CI
  })
})
npm_3.5.2.orig/test/tap/prune.js0000644000000000000000000000557212631326456014735 0ustar 00000000000000
var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

var server

var pkg = path.resolve(__dirname, 'prune')
var cache = path.resolve(pkg, 'cache')

var json = {
  name: 'prune',
  description: 'fixture',
  version: '0.0.1',
  main: 'index.js',
  dependencies: {
    underscore: '1.3.1'
  },
  devDependencies: {
    mkdirp: '*'
  }
}

var EXEC_OPTS = {
  cwd: pkg,
  npm_config_depth: 'Infinity'
}
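// What the prune tests below exercise: `test-package` is installed but never
// added to package.json, so the first `npm prune` removes it as extraneous
// while keeping both declared deps. The second prune runs with --production,
// which additionally unbuilds devDependencies (mkdirp here), leaving only the
// runtime dependency (underscore) behind.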
test('setup', function (t) {
  cleanup()
  mkdirp.sync(cache)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  mr({ port: common.port }, function (er, s) {
    server = s
    t.end()
  })
})

test('npm install', function (t) {
  common.npm([
    'install',
    '--cache', cache,
    '--registry', common.registry,
    '--loglevel', 'silent',
    '--production', 'false'
  ], EXEC_OPTS, function (err, code, stdout, stderr) {
    t.ifErr(err, 'install finished successfully')
    t.notOk(code, 'exit ok')
    t.notOk(stderr, 'Should not get data on stderr: ' + stderr)
    t.end()
  })
})

test('npm install test-package', function (t) {
  common.npm([
    'install', 'test-package',
    '--cache', cache,
    '--registry', common.registry,
    '--loglevel', 'silent',
    '--production', 'false'
  ], EXEC_OPTS, function (err, code, stdout, stderr) {
    t.ifErr(err, 'install finished successfully')
    t.notOk(code, 'exit ok')
    t.notOk(stderr, 'Should not get data on stderr: ' + stderr)
    t.end()
  })
})

test('verify installs', function (t) {
  var dirs = fs.readdirSync(pkg + '/node_modules').sort()
  t.same(dirs, [ 'test-package', 'mkdirp', 'underscore' ].sort())
  t.end()
})

test('npm prune', function (t) {
  common.npm([
    'prune',
    '--loglevel', 'silent',
    '--production', 'false'
  ], EXEC_OPTS, function (err, code, stdout, stderr) {
    t.ifErr(err, 'prune finished successfully')
    t.notOk(code, 'exit ok')
    t.notOk(stderr, 'Should not get data on stderr: ' + stderr)
    t.end()
  })
})

test('verify installs', function (t) {
  var dirs = fs.readdirSync(pkg + '/node_modules').sort()
  t.same(dirs, [ 'mkdirp', 'underscore' ])
  t.end()
})

test('npm prune', function (t) {
  common.npm([
    'prune',
    '--loglevel', 'silent',
    '--production'
  ], EXEC_OPTS, function (err, code, stderr) {
    t.ifErr(err, 'prune finished successfully')
    t.notOk(code, 'exit ok')
    t.equal(stderr, 'unbuild mkdirp@0.3.5\n')
    t.end()
  })
})

test('verify installs', function (t) {
  var dirs = fs.readdirSync(pkg + '/node_modules').sort()
  t.same(dirs, [ 'underscore' ])
  t.end()
})

test('cleanup', function (t) {
  server.close()
  cleanup()
  t.pass('cleaned up')
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/publish-access-scoped.js0000644000000000000000000000354512631326456017762 0ustar 00000000000000
var fs = require('fs')
var path = require('path')

var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
require('../common-tap')
var nock = require('nock')

var npm = require('../../')
var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'publish-access')

// TODO: nock uses setImmediate, breaks 0.8: replace with mockRegistry
if (!global.setImmediate) {
  global.setImmediate = function () {
    var args = [arguments[0], 0].concat([].slice.call(arguments, 1))
    setTimeout.apply(this, args)
  }
}

test('setup', function (t) {
  mkdirp(path.join(pkg, 'cache'), function () {
    var configuration = {
      cache: path.join(pkg, 'cache'),
      loglevel: 'silent',
      registry: common.registry
    }
    npm.load(configuration, next)
  })

  function next (er) {
    t.ifError(er, 'npm loaded successfully')

    process.chdir(pkg)
    fs.writeFile(
      path.join(pkg, 'package.json'),
      JSON.stringify({
        name: '@bigco/publish-access',
        version: '1.2.5'
      }),
      'ascii',
      function (er) {
        t.ifError(er)

        t.pass('setup done')
        t.end()
      }
    )
  }
})

test('scoped packages pass public access if set', function (t) {
  var put = nock(common.registry)
    .put('/@bigco%2fpublish-access')
    .reply(201, verify)

  npm.config.set('access', 'public')
  npm.commands.publish([], false, function (er) {
    t.ifError(er, 'published without error')

    put.done()
    t.end()
  })

  function verify (_, body) {
    t.doesNotThrow(function () {
      var parsed = JSON.parse(body)
      t.equal(parsed.access, 'public', 'access level is correct')
    }, 'converted body back into object')

    return {ok: true}
  }
})
test('cleanup', function (t) {
  process.chdir(__dirname)
  rimraf(pkg, function (er) {
    t.ifError(er)
    t.end()
  })
})
npm_3.5.2.orig/test/tap/publish-access-unscoped-restricted-fails.js0000644000000000000000000000241412631326456023561 0ustar 00000000000000
var fs = require('fs')
var path = require('path')

var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var npm = require('../../')
var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'publish-access-unscoped')

test('setup', function (t) {
  mkdirp(path.join(pkg, 'cache'), function () {
    var configuration = {
      cache: path.join(pkg, 'cache'),
      loglevel: 'silent',
      registry: common.registry
    }
    npm.load(configuration, next)
  })

  function next (er) {
    t.ifError(er, 'npm loaded successfully')

    process.chdir(pkg)
    fs.writeFile(
      path.join(pkg, 'package.json'),
      JSON.stringify({
        name: 'publish-access',
        version: '1.2.5'
      }),
      'ascii',
      function (er) {
        t.ifError(er)

        t.pass('setup done')
        t.end()
      }
    )
  }
})

test('unscoped packages cannot be restricted', function (t) {
  npm.config.set('access', 'restricted')
  npm.commands.publish([], false, function (er) {
    t.ok(er, 'got an error back')
    t.equal(er.message, "Can't restrict access to unscoped packages.")

    t.end()
  })
})

test('cleanup', function (t) {
  process.chdir(__dirname)
  rimraf(pkg, function (er) {
    t.ifError(er)
    t.end()
  })
})
npm_3.5.2.orig/test/tap/publish-access-unscoped.js0000644000000000000000000000354612631326456020326 0ustar 00000000000000
var fs = require('fs')
var path = require('path')

var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
require('../common-tap')
var nock = require('nock')

var npm = require('../../')
var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'publish-access-unscoped')

// TODO: nock uses setImmediate, breaks 0.8: replace with mockRegistry
if (!global.setImmediate) {
  global.setImmediate = function () {
    var args = [arguments[0], 0].concat([].slice.call(arguments, 1))
    setTimeout.apply(this, args)
  }
}
test('setup', function (t) {
  mkdirp(path.join(pkg, 'cache'), function () {
    var configuration = {
      cache: path.join(pkg, 'cache'),
      loglevel: 'silent',
      registry: common.registry
    }
    npm.load(configuration, next)
  })

  function next (er) {
    t.ifError(er, 'npm loaded successfully')

    process.chdir(pkg)
    fs.writeFile(
      path.join(pkg, 'package.json'),
      JSON.stringify({
        name: 'publish-access',
        version: '1.2.5'
      }),
      'ascii',
      function (er) {
        t.ifError(er)

        t.pass('setup done')
        t.end()
      }
    )
  }
})

test('unscoped packages can be explicitly set as public', function (t) {
  var put = nock(common.registry)
    .put('/publish-access')
    .reply(201, verify)

  npm.config.set('access', 'public')
  npm.commands.publish([], false, function (er) {
    t.ifError(er, 'published without error')

    put.done()
    t.end()
  })

  function verify (_, body) {
    t.doesNotThrow(function () {
      var parsed = JSON.parse(body)
      t.equal(parsed.access, 'public', 'access level is correct')
    }, 'converted body back into object')

    return {ok: true}
  }
})

test('cleanup', function (t) {
  process.chdir(__dirname)
  rimraf(pkg, function (er) {
    t.ifError(er)
    t.end()
  })
})
npm_3.5.2.orig/test/tap/publish-config.js0000644000000000000000000000351512631326456016510 0ustar 00000000000000
var common = require('../common-tap.js')
var test = require('tap').test
var fs = require('fs')
var osenv = require('osenv')
var pkg = process.env.npm_config_tmp || '/tmp'
pkg += '/npm-test-publish-config'

require('mkdirp').sync(pkg)

fs.writeFileSync(pkg + '/package.json', JSON.stringify({
  name: 'npm-test-publish-config',
  version: '1.2.3',
  publishConfig: { registry: common.registry }
}), 'utf8')

fs.writeFileSync(pkg + '/fixture_npmrc',
  '//localhost:1337/:email = fancy@feast.net\n' +
  '//localhost:1337/:username = fancy\n' +
  '//localhost:1337/:_password = ' + new Buffer('feast').toString('base64') + '\n' +
  'registry = http://localhost:1337/')

test(function (t) {
  var child
  t.plan(4)
  require('http').createServer(function (req, res) {
    t.pass('got request on the fakey fake registry')
    this.close()
    res.statusCode = 500
    res.end(JSON.stringify({
      error: 'sshhh. naptime nao. \\^O^/ <(YAWWWWN!)'
    }))
    child.kill('SIGHUP')
  }).listen(common.port, function () {
    t.pass('server is listening')

    // don't much care about listening to the child's results
    // just wanna make sure it hits the server we just set up.
    //
    // there are plenty of other tests to verify that publish
    // itself functions normally.
    //
    // Make sure that we don't sit around waiting for lock files
    child = common.npm(['publish', '--userconfig=' + pkg + '/fixture_npmrc'], {
      cwd: pkg,
      stdio: 'inherit',
      env: {
        'npm_config_cache_lock_stale': 1000,
        'npm_config_cache_lock_wait': 1000,
        HOME: process.env.HOME,
        Path: process.env.PATH,
        PATH: process.env.PATH,
        USERPROFILE: osenv.home()
      }
    }, function (err, code) {
      t.ifError(err, 'publish command finished successfully')
      t.notOk(code, 'npm publish exited with code 0')
    })
  })
})
npm_3.5.2.orig/test/tap/publish-invalid-semver-tag.js0000644000000000000000000000330112631326456020732 0ustar 00000000000000
var common = require('../common-tap.js')
var test = require('tap').test
var npm = require('../../lib/npm.js')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var path = require('path')
var fs = require('fs')
var mr = require('npm-registry-mock')
var osenv = require('osenv')

var PKG_DIR = path.resolve(__dirname, 'publish-invalid-semver-tag')
var CACHE_DIR = path.resolve(PKG_DIR, 'cache')

var DEFAULT_PKG = {
  'name': 'examples',
  'version': '1.2.3'
}

var mockServer

function resetPackage (options) {
  rimraf.sync(CACHE_DIR)
  mkdirp.sync(CACHE_DIR)

  fs.writeFileSync(path.resolve(PKG_DIR, 'package.json'), DEFAULT_PKG)
}

test('setup', function (t) {
  process.chdir(osenv.tmpdir())
  mkdirp.sync(PKG_DIR)
  process.chdir(PKG_DIR)

  resetPackage({})

  mr({ port: common.port }, function (er, server) {
    npm.load({
      cache: CACHE_DIR,
      registry: common.registry,
      cwd: PKG_DIR
    }, function (err) {
      t.ifError(err, 'started server')
      mockServer = server

      t.end()
    })
  })
})
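// Why the tags below are rejected (rationale, not stated in the test itself):
// `npm install pkg@<spec>` interprets the spec as a version or range before
// falling back to dist-tags, so a tag that is itself a valid SemVer range
// (like `v1.x` or `1.2.3`) could never be addressed unambiguously. Publish
// therefore refuses such tag names up front, as asserted next.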
test('attempt publish with semver-like version', function (t) {
  resetPackage({})

  npm.config.set('tag', 'v1.x')
  npm.commands.publish([], function (err) {
    t.notEqual(err, null)
    t.same(err.message, 'Tag name must not be a valid SemVer range: v1.x')
    t.end()
  })
})

test('attempt publish with semver-like version', function (t) {
  resetPackage({})

  npm.config.set('tag', '1.2.3')
  npm.commands.publish([], function (err) {
    t.notEqual(err, null)
    t.same(err.message, 'Tag name must not be a valid SemVer range: 1.2.3')
    t.end()
  })
})

test('cleanup', function (t) {
  mockServer.close()

  process.chdir(osenv.tmpdir())
  rimraf.sync(PKG_DIR)

  t.end()
})
npm_3.5.2.orig/test/tap/publish-scoped.js0000644000000000000000000000412412631326456016515 0ustar 00000000000000
var fs = require('fs')
var path = require('path')

var test = require('tap').test
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
require('../common-tap')
var nock = require('nock')

var npm = require('../../')
var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'prepublish_package')

test('setup', function (t) {
  mkdirp(path.join(pkg, 'cache'), next)

  function next () {
    process.chdir(pkg)
    fs.writeFile(
      path.join(pkg, 'package.json'),
      JSON.stringify({
        name: '@bigco/publish-organized',
        version: '1.2.5'
      }),
      'ascii',
      function (er) {
        t.ifError(er)

        t.pass('setup done')
        t.end()
      }
    )
  }
})

test('npm publish should honor scoping', function (t) {
  var put = nock(common.registry)
    .put('/@bigco%2fpublish-organized')
    .reply(201, verify)

  var configuration = {
    cache: path.join(pkg, 'cache'),
    loglevel: 'silent',
    registry: 'http://nonexistent.lvh.me',
    '//localhost:1337/:username': 'username',
    '//localhost:1337/:_password': new Buffer('password').toString('base64'),
    '//localhost:1337/:email': 'ogd@aoaioxxysz.net'
  }
  npm.load(configuration, onload)

  function onload (er) {
    t.ifError(er, 'npm bootstrapped successfully')

    npm.config.set('@bigco:registry', common.registry)
    npm.commands.publish([], false, function (er) {
      t.ifError(er, 'published without error')

      put.done()
      t.end()
    })
  }

  function verify (_, body) {
    t.doesNotThrow(function () {
      var parsed = JSON.parse(body)
      var current = parsed.versions['1.2.5']
      t.equal(
        current._npmVersion,
        require(path.resolve(__dirname, '../../package.json')).version,
        'npm version is correct'
      )
      t.equal(
        current._nodeVersion,
        process.versions.node,
        'node version is correct'
      )
    }, 'converted body back into object')

    return {ok: true}
  }
})

test('cleanup', function (t) {
  process.chdir(__dirname)
  rimraf(pkg, function (er) {
    t.ifError(er)
    t.end()
  })
})
npm_3.5.2.orig/test/tap/pwd-prefix.js0000644000000000000000000000173112631326456015662 0ustar 00000000000000
// This test ensures that a few commands do the same
// thing when the cwd is where package.json is, and when
// the package.json is one level up.
var test = require('tap').test
var common = require('../common-tap.js')
var path = require('path')
var root = path.resolve(__dirname, '../..')
var lib = path.resolve(root, 'lib')
var commands = ['run', 'version']

commands.forEach(function (cmd) {
  // Should get the same stdout and stderr each time
  var stdout, stderr

  test(cmd + ' in root', function (t) {
    common.npm([cmd], {cwd: root}, function (er, code, so, se) {
      if (er) throw er
      t.notOk(code, 'npm ' + cmd + ' exited with code 0')
      stdout = so
      stderr = se
      t.end()
    })
  })

  test(cmd + ' in lib', function (t) {
    common.npm([cmd], {cwd: lib}, function (er, code, so, se) {
      if (er) throw er
      t.notOk(code, 'npm ' + cmd + ' exited with code 0')
      t.equal(so, stdout)
      t.equal(se, stderr)
      t.end()
    })
  })
})
npm_3.5.2.orig/test/tap/referer.js0000644000000000000000000000123412631326456015225 0ustar 00000000000000
var common = require('../common-tap.js')
var test = require('tap').test
var http = require('http')

test('should send referer http header', function (t) {
  http.createServer(function (q, s) {
    t.equal(q.headers.referer, 'install foo')
    s.statusCode = 404
    s.end(JSON.stringify({error: 'whatever'}))
    this.close()
  }).listen(common.port, function () {
    var reg = 'http://localhost:' + common.port
    var args = [ 'install', 'foo', '--registry', reg ]
    common.npm(args, {}, function (er, code) {
      if (er) {
        throw er
      }
      // should not have ended nicely, since we returned an error
      t.ok(code)
      t.end()
    })
  })
})
npm_3.5.2.orig/test/tap/registry.js0000644000000000000000000000344212631326456015446 0ustar 00000000000000
// Run all the tests in the `npm-registry-couchapp` suite
// This verifies that the server-side stuff still works.
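// Rough flow of this harness, for orientation: it warns and does nothing
// unless Node >= 0.10 and a `couchdb` binary is on the PATH; otherwise it
// shells out to `npm install` and then `npm test -- -Rtap` inside the bundled
// node_modules/npm-registry-couchapp checkout, finally propagating the
// child's exit code via process.exit. The `env.npm` assignment below
// presumably tells the couchapp suite which npm binary to drive.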
var common = require('../common-tap')
var test = require('tap').test

var npmExec = require.resolve('../../bin/npm-cli.js')
var path = require('path')
var ca = path.resolve(__dirname, '../../node_modules/npm-registry-couchapp')

var which = require('which')

var v = process.versions.node.split('.').map(function (n) {
  return parseInt(n, 10)
})
if (v[0] === 0 && v[1] < 10) {
  console.error(
    'WARNING: need a recent Node for npm-registry-couchapp tests to run, have',
    process.versions.node
  )
} else {
  which('couchdb', function (er) {
    if (er) {
      console.error('WARNING: need couch to run test: ' + er.message)
    } else {
      runTests()
    }
  })
}

function runTests () {
  var env = { TAP: 1 }
  for (var i in process.env) env[i] = process.env[i]
  env.npm = npmExec

  var opts = {
    cwd: ca,
    stdio: 'inherit'
  }
  common.npm(['install'], opts, function (err, code) {
    if (err) { throw err }
    if (code) {
      return test('need install to work', function (t) {
        t.fail('install failed with: ' + code)
        t.end()
      })
    } else {
      opts = {
        cwd: ca,
        env: env,
        stdio: 'inherit'
      }
      common.npm(['test', '--', '-Rtap'], opts, function (err, code) {
        if (err) { throw err }
        if (code) {
          return test('need test to work', function (t) {
            t.fail('test failed with: ' + code)
            t.end()
          })
        }
        opts = {
          cwd: ca,
          env: env,
          stdio: 'inherit'
        }
        common.npm(['prune', '--production'], opts, function (err, code) {
          if (err) { throw err }
          process.exit(code || 0)
        })
      })
    }
  })
}
npm_3.5.2.orig/test/tap/repo.js0000644000000000000000000001027512631326456014545 0ustar 00000000000000
if (process.platform === 'win32') {
  console.error('skipping test, because windows and shebangs')
  process.exit(0)
}

var common = require('../common-tap.js')
var mr = require('npm-registry-mock')
var test = require('tap').test
var rimraf = require('rimraf')
var fs = require('fs')
var path = require('path')

var outFile = path.join(__dirname, '/_output')

var opts = { cwd: __dirname }

test('setup', function (t) {
  var s = '#!/usr/bin/env bash\n' +
          'echo \"$@\" > ' + JSON.stringify(__dirname) + '/_output\n'
  fs.writeFileSync(__dirname + '/_script.sh', s, 'ascii')
  fs.chmodSync(__dirname + '/_script.sh', '0755')
  t.pass('made script')
  t.end()
})

test('npm repo underscore', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm([
      'repo', 'underscore',
      '--registry=' + common.registry,
      '--loglevel=silent',
      '--browser=' + __dirname + '/_script.sh'
    ], opts, function (err, code, stdout, stderr) {
      t.ifError(err, 'repo command ran without error')
      t.equal(code, 0, 'exit ok')
      var res = fs.readFileSync(outFile, 'ascii')
      s.close()
      t.equal(res, 'https://github.com/jashkenas/underscore\n')
      rimraf.sync(outFile)
      t.end()
    })
  })
})

test('npm repo optimist - github (https://)', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm([
      'repo', 'optimist',
      '--registry=' + common.registry,
      '--loglevel=silent',
      '--browser=' + __dirname + '/_script.sh'
    ], opts, function (err, code, stdout, stderr) {
      t.ifError(err, 'repo command ran without error')
      t.equal(code, 0, 'exit ok')
      var res = fs.readFileSync(outFile, 'ascii')
      s.close()
      t.equal(res, 'https://github.com/substack/node-optimist\n')
      rimraf.sync(outFile)
      t.end()
    })
  })
})

test('npm repo npm-test-peer-deps - no repo', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm([
      'repo', 'npm-test-peer-deps',
      '--registry=' + common.registry,
      '--loglevel=silent',
      '--browser=' + __dirname + '/_script.sh'
    ], opts, function (err, code, stdout, stderr) {
      t.ifError(err, 'repo command ran without error')
      t.equal(code, 1, 'exit not ok')
      s.close()
      t.end()
    })
  })
})

test('npm repo test-repo-url-http - non-github (http://)', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm([
      'repo', 'test-repo-url-http',
      '--registry=' + common.registry,
      '--loglevel=silent',
      '--browser=' + __dirname + '/_script.sh'
    ], opts, function (err, code, stdout, stderr) {
      t.ifError(err, 'repo command ran without error')
      t.equal(code, 0, 'exit ok')
      var res = fs.readFileSync(outFile, 'ascii')
      s.close()
      t.equal(res, 'http://gitlab.com/evanlucas/test-repo-url-http\n')
      rimraf.sync(outFile)
      t.end()
    })
  })
})

test('npm repo test-repo-url-https - non-github (https://)', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm([
      'repo', 'test-repo-url-https',
      '--registry=' + common.registry,
      '--loglevel=silent',
      '--browser=' + __dirname + '/_script.sh'
    ], opts, function (err, code, stdout, stderr) {
      t.ifError(err, 'repo command ran without error')
      t.equal(code, 0, 'exit ok')
      var res = fs.readFileSync(outFile, 'ascii')
      s.close()
      t.equal(res, 'https://gitlab.com/evanlucas/test-repo-url-https\n')
      rimraf.sync(outFile)
      t.end()
    })
  })
})

test('npm repo test-repo-url-ssh - non-github (ssh://)', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm([
      'repo', 'test-repo-url-ssh',
      '--registry=' + common.registry,
      '--loglevel=silent',
      '--browser=' + __dirname + '/_script.sh'
    ], opts, function (err, code, stdout, stderr) {
      t.ifError(err, 'repo command ran without error')
      t.equal(code, 0, 'exit ok')
      var res = fs.readFileSync(outFile, 'ascii')
      s.close()
      t.equal(res, 'https://gitlab.com/evanlucas/test-repo-url-ssh\n')
      rimraf.sync(outFile)
      t.end()
    })
  })
})

test('cleanup', function (t) {
  fs.unlinkSync(__dirname + '/_script.sh')
  t.pass('cleaned up')
  t.end()
})
npm_3.5.2.orig/test/tap/rm-linked.js0000644000000000000000000000606312631326456015462 0ustar 00000000000000
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var path = require('path')
var rimraf = require('rimraf')
var test = require('tap').test
var writeFileSync = require('fs').writeFileSync

var common = require('../common-tap.js')

var link = path.join(__dirname, 'rmlinked')
var linkDep = path.join(link, 'node_modules', 'baz')
var linkInstall = path.join(__dirname, 'rmlinked-install')
var linkRoot = path.join(__dirname, 'rmlinked-root')

var config = 'prefix = ' + linkRoot
var configPath = path.join(link, '_npmrc')

var OPTS = {
  env: {
    'npm_config_userconfig': configPath
  }
}

var linkedJSON = {
  name: 'foo',
  version: '1.0.0',
  description: '',
  main: 'index.js',
  scripts: {
    test: 'echo \"Error: no test specified\" && exit 1'
  },
  dependencies: {
    'baz': '1.0.0'
  },
  author: '',
  license: 'ISC'
}

var linkedDepJSON = {
  name: 'baz',
  version: '1.0.0',
  description: '',
  main: 'index.js',
  scripts: {
    test: 'echo \"Error: no test specified\" && exit 1'
  },
  author: '',
  license: 'ISC'
}

var installJSON = {
  name: 'bar',
  version: '1.0.0',
  description: '',
  main: 'index.js',
  scripts: {
    test: 'echo \"Error: no test specified\" && exit 1'
  },
  dependencies: {
    'foo': '1.0.0'
  },
  author: '',
  license: 'ISC'
}
test('setup', function (t) {
  setup()
  common.npm(['ls', '-g', '--depth=0'], OPTS, function (err, c, out) {
    t.ifError(err)
    t.equal(c, 0, 'set up ok')
    t.notOk(out.match(/UNMET DEPENDENCY foo@/), "foo isn't in global")
    t.end()
  })
})

test('creates global link', function (t) {
  process.chdir(link)
  common.npm(['link'], OPTS, function (err, c, out) {
    t.ifError(err, 'link has no error')
    common.npm(['ls', '-g'], OPTS, function (err, c, out, stderr) {
      t.ifError(err)
      t.equal(c, 0)
      t.equal(stderr, '', 'got expected stderr')
      t.has(out, /foo@1.0.0/, 'creates global link ok')
      t.end()
    })
  })
})

test('uninstall the global linked package', function (t) {
  process.chdir(osenv.tmpdir())
  common.npm(['uninstall', '-g', 'foo'], OPTS, function (err) {
    t.ifError(err, 'uninstall has no error')
    process.chdir(link)
    common.npm(['ls'], OPTS, function (err, c, out) {
      t.ifError(err)
      t.equal(c, 0)
      t.has(out, /baz@1.0.0/, "uninstall didn't remove dep")
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  try { rimraf.sync(linkRoot) } catch (e) { }
  try { rimraf.sync(linkDep) } catch (e) { }
  try { rimraf.sync(link) } catch (e) { }
  try { rimraf.sync(linkInstall) } catch (e) { }
}

function setup () {
  cleanup()
  mkdirp.sync(linkRoot)
  mkdirp.sync(link)
  writeFileSync(
    path.join(link, 'package.json'),
    JSON.stringify(linkedJSON, null, 2)
  )
  mkdirp.sync(linkDep)
  writeFileSync(
    path.join(linkDep, 'package.json'),
    JSON.stringify(linkedDepJSON, null, 2)
  )
  mkdirp.sync(linkInstall)
  writeFileSync(
    path.join(linkInstall, 'package.json'),
    JSON.stringify(installJSON, null, 2)
  )
  writeFileSync(configPath, config)
}
npm_3.5.2.orig/test/tap/run-script-filter-private.js0000644000000000000000000000215212631326456020634 0ustar 00000000000000
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

var pkg = path.resolve(__dirname, 'run-script-filter-private')
var opts = { cwd: pkg }

var json = {
  name: 'run-script-filter-private',
  version: '1.2.3'
}

var npmrc = '//blah.com:_harsh=realms\n'

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.resolve(pkg, 'package.json'),
    JSON.stringify(json, null, 2) + '\n'
  )
  fs.writeFileSync(
    path.resolve(pkg, '.npmrc'),
    npmrc
  )
  t.end()
})

test('npm run-script env', function (t) {
  common.npm(['run-script', 'env'], opts, function (er, code, stdout, stderr) {
    t.ifError(er, 'using default env script')
    t.notOk(stderr, 'should not generate errors')
    t.ok(stdout.indexOf('npm_config_init_version') > 0, 'expected values in var list')
    t.notMatch(stdout, /harsh/, 'unexpected config not there')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/run-script.js0000644000000000000000000001716712631326456015705 0ustar 00000000000000
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var test = require('tap').test
var rimraf = require('rimraf')

var common = require('../common-tap')

var pkg = path.resolve(__dirname, 'run-script')
var cache = path.resolve(pkg, 'cache')
var tmp = path.resolve(pkg, 'tmp')

var opts = { cwd: pkg }

var fullyPopulated = {
  'name': 'runscript',
  'version': '1.2.3',
  'scripts': {
    'start': 'node -e "console.log(process.argv[1] || \'start\')"',
    'prewith-pre': 'node -e "console.log(process.argv[1] || \'pre\')"',
    'with-pre': 'node -e "console.log(process.argv[1] || \'main\')"',
    'with-post': 'node -e "console.log(process.argv[1] || \'main\')"',
    'postwith-post': 'node -e "console.log(process.argv[1] || \'post\')"',
    'prewith-both': 'node -e "console.log(process.argv[1] || \'pre\')"',
    'with-both': 'node -e "console.log(process.argv[1] || \'main\')"',
    'postwith-both': 'node -e "console.log(process.argv[1] || \'post\')"',
    'stop': 'node -e "console.log(process.argv[1] || \'stop\')"'
  }
}

var lifecycleOnly = {
  name: 'scripted',
  version: '1.2.3',
  scripts: {
    'prestart': 'echo prestart'
  }
}

var directOnly = {
  name: 'scripted',
  version: '1.2.3',
  scripts: {
    'whoa': 'echo whoa'
  }
}

var both = {
  name: 'scripted',
  version: '1.2.3',
  scripts: {
    'prestart': 'echo prestart',
    'whoa': 'echo whoa'
  }
}

var preversionOnly = {
  name: 'scripted',
  version: '1.2.3',
  scripts: {
    'preversion': 'echo preversion'
  }
}

function testOutput (t, command, er, code, stdout, stderr) {
  var lines

  if (er) throw er

  if (stderr) {
    throw new Error('npm ' + command + ' stderr: ' + stderr.toString())
  }

  lines = stdout.trim().split('\n')
  stdout = lines.filter(function (line) {
    return line.trim() !== '' && line[0] !== '>'
  }).join(';')

  t.equal(stdout, command)
  t.end()
}
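// Note on the filter above: npm run-script prints a banner line for each
// script it executes before the script's own output, e.g. (shape only, taken
// from the regex in the prepublish test earlier in this suite):
//
//   > runscript@1.2.3 start ...
//   > node -e "console.log(process.argv[1] || 'start')"
//
// testOutput drops blank lines and lines starting with '>' so the assertions
// below compare only the scripts' own stdout, joined with ';'.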
function writeMetadata (object) {
  fs.writeFileSync(
    path.resolve(pkg, 'package.json'),
    JSON.stringify(object, null, 2) + '\n'
  )
}

function cleanup () {
  rimraf.sync(pkg)
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(cache)
  mkdirp.sync(tmp)
  writeMetadata(fullyPopulated)
  t.end()
})

test('npm run-script start', function (t) {
  common.npm(['run-script', 'start'], opts, testOutput.bind(null, t, 'start'))
})

test('npm run-script with args', function (t) {
  common.npm(['run-script', 'start', '--', 'stop'], opts, testOutput.bind(null, t, 'stop'))
})

test('npm run-script with args that contain spaces', function (t) {
  common.npm(['run-script', 'start', '--', 'hello world'], opts, testOutput.bind(null, t, 'hello world'))
})

test('npm run-script with args that contain single quotes', function (t) {
  common.npm(['run-script', 'start', '--', 'they"re awesome'], opts, testOutput.bind(null, t, 'they"re awesome'))
})

test('npm run-script with args that contain double quotes', function (t) {
  common.npm(['run-script', 'start', '--', 'what"s "up"?'], opts, testOutput.bind(null, t, 'what"s "up"?'))
})

test('npm run-script with args that contain ticks', function (t) {
  common.npm(['run-script', 'start', '--', 'what\'s \'up\'?'], opts, testOutput.bind(null, t, 'what\'s \'up\'?'))
})

test('npm run-script with post script', function (t) {
  common.npm(['run-script', 'with-post'], opts, testOutput.bind(null, t, 'main;post'))
})

test('npm run-script with pre script', function (t) {
  common.npm(['run-script', 'with-pre'], opts, testOutput.bind(null, t, 'pre;main'))
})

test('npm run-script with both pre and post script', function (t) {
  common.npm(['run-script', 'with-both'], opts, testOutput.bind(null, t, 'pre;main;post'))
})

test('npm run-script with both pre and post script and with args', function (t) {
  common.npm(['run-script', 'with-both', '--', 'an arg'], opts, testOutput.bind(null, t, 'pre;an arg;post'))
})

test('npm run-script explicitly call pre script with arg', function (t) {
  common.npm(['run-script', 'prewith-pre', '--', 'an arg'], opts, testOutput.bind(null, t, 'an arg'))
})

test('npm run-script test', function (t) {
  common.npm(['run-script', 'test'], opts, function (er, code, stdout, stderr) {
    t.ifError(er, 'npm run-script test ran without issue')
    t.notOk(stderr, 'should not generate errors')
    t.end()
  })
})

test('npm run-script env', function (t) {
  common.npm(['run-script', 'env'], opts, function (er, code, stdout, stderr) {
    t.ifError(er, 'using default env script')
    t.notOk(stderr, 'should not generate errors')
    t.ok(stdout.indexOf('npm_config_init_version') > 0, 'expected values in var list')
    t.end()
  })
})

test('npm run-script nonexistent-script', function (t) {
  common.npm(['run-script', 'nonexistent-script'], opts, function (er, code, stdout, stderr) {
    t.ifError(er, 'npm run-script nonexistent-script did not cause npm to explode')
    t.ok(stderr, 'should generate errors')
    t.end()
  })
})

test('npm run-script restart when there isn\'t restart', function (t) {
  common.npm(['run-script', 'restart'], opts, testOutput.bind(null, t, 'stop;start'))
})

test('npm run-script nonexistent-script with --if-present flag', function (t) {
  common.npm(['run-script', '--if-present', 'nonexistent-script'], opts, function (er, code, stdout, stderr) {
    t.ifError(er, 'npm run-script --if-present non-existent-script ran without issue')
    t.notOk(stderr, 'should not generate errors')
    t.end()
  })
})

test('npm
 run-script no-params (lifecycle only)', function (t) {
  var expected = [
    'Lifecycle scripts included in scripted:',
    '  prestart',
    '    echo prestart',
    ''
  ].join('\n')
  writeMetadata(lifecycleOnly)
  common.npm(['run-script'], opts, function (err, code, stdout, stderr) {
    t.ifError(err, 'ran run-script without parameters without crashing')
    t.notOk(code, 'npm exited without error code')
    t.notOk(stderr, 'npm printed nothing to stderr')
    t.equal(stdout, expected, 'got expected output')
    t.end()
  })
})

test('npm run-script no-params (preversion only)', function (t) {
  var expected = [
    'Lifecycle scripts included in scripted:',
    '  preversion',
    '    echo preversion',
    ''
  ].join('\n')
  writeMetadata(preversionOnly)
  common.npm(['run-script'], opts, function (err, code, stdout, stderr) {
    t.ifError(err, 'ran run-script without parameters without crashing')
    t.notOk(code, 'npm exited without error code')
    t.notOk(stderr, 'npm printed nothing to stderr')
    t.equal(stdout, expected, 'got expected output')
    t.end()
  })
})

test('npm run-script no-params (direct only)', function (t) {
  var expected = [
    'Scripts available in scripted via `npm run-script`:',
    '  whoa',
    '    echo whoa',
    ''
  ].join('\n')
  writeMetadata(directOnly)
  common.npm(['run-script'], opts, function (err, code, stdout, stderr) {
    t.ifError(err, 'ran run-script without parameters without crashing')
    t.notOk(code, 'npm exited without error code')
    t.notOk(stderr, 'npm printed nothing to stderr')
    t.equal(stdout, expected, 'got expected output')
    t.end()
  })
})

test('npm run-script no-params (both)', function (t) {
  var expected = [
    'Lifecycle scripts included in scripted:',
    '  prestart',
    '    echo prestart',
    '',
    'available via `npm run-script`:',
    '  whoa',
    '    echo whoa',
    ''
  ].join('\n')
  writeMetadata(both)
  common.npm(['run-script'], opts, function (err, code, stdout, stderr) {
    t.ifError(err, 'ran run-script without parameters without crashing')
    t.notOk(code, 'npm exited without error code')
    t.notOk(stderr, 'npm printed nothing to stderr')
    t.equal(stdout, expected, 'got expected output')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})
npm_3.5.2.orig/test/tap/scripts-whitespace-windows.js0000644000000000000000000000451012631326456021104 0ustar 00000000000000
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

var pkg = path.resolve(__dirname, 'scripts-whitespace-windows')
var tmp = path.resolve(pkg, 'tmp')
var cache = path.resolve(pkg, 'cache')
var dep = path.resolve(pkg, 'dep')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'scripts-whitespace-windows',
  version: '1.0.0',
  description: 'a test',
  repository: 'git://github.com/robertkowalski/bogus',
  scripts: {
    foo: 'foo --title \"Analysis of\" --recurse -d report src'
  },
  dependencies: {
    'scripts-whitespace-windows-dep': '0.0.1'
  },
  license: 'WTFPL'
}

var dependency = {
  name: 'scripts-whitespace-windows-dep',
  version: '0.0.1',
  bin: [ 'bin/foo' ]
}

var foo = function () {/*
#!/usr/bin/env node

if (process.argv.length === 8)
  console.log('npm-test-fine')
*/}.toString().split('\n').slice(1, -1).join('\n')
test('setup', function (t) {
  cleanup()
  mkdirp.sync(tmp)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  fs.writeFileSync(
    path.join(pkg, 'README.md'),
    "### THAT'S RIGHT\n"
  )

  mkdirp.sync(path.join(dep, 'bin'))
  fs.writeFileSync(
    path.join(dep, 'package.json'),
    JSON.stringify(dependency, null, 2)
  )
  fs.writeFileSync(path.join(dep, 'bin', 'foo'), foo)

  common.npm(['i', dep], {
    cwd: pkg,
    env: {
      npm_config_cache: cache,
      npm_config_tmp: tmp,
      npm_config_prefix: pkg,
      npm_config_global: 'false'
    }
  }, function (err, code, stdout, stderr) {
    t.ifErr(err, 'npm i ' + dep + ' finished without error')
    t.equal(code, 0, 'npm i ' + dep + ' exited ok')
    t.notOk(stderr, 'no output stderr')
    t.end()
  })
})

test('test', function (t) {
  common.npm(['run', 'foo'], EXEC_OPTS, function (err, code, stdout, stderr) {
    stderr = stderr.trim()
    if (stderr) console.error(stderr)
    t.ifErr(err, 'npm run finished without error')
    t.equal(code, 0, 'npm run exited ok')
    t.notOk(stderr, 'no output stderr: ' + stderr)
    stdout = stdout.trim()
    t.ok(/npm-test-fine/.test(stdout))
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/search.js0000644000000000000000000001561312631326456015046 0ustar 00000000000000
var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'search')
var cache = path.resolve(pkg, 'cache')
var registryCache = path.resolve(cache, 'localhost_1337', '-', 'all')
var cacheJsonFile = path.resolve(registryCache, '.cache.json')

var timeMock = {
  epoch: 1411727900,
  future: 1411727900 + 100,
  all: 1411727900 + 25,
  since: 0 // filled by since server callback
}

var EXEC_OPTS = {}

var mocks = {
  /* On a since request, always respond with an _updated time later than the
     time requested */
  sinceFuture: function (server) {
    server.filteringPathRegEx(/startkey=[^&]*/g, function (s) {
      var _allMock = JSON.parse(JSON.stringify(allMock))
      timeMock.since = _allMock._updated = s.replace('startkey=', '')
      server.get('/-/all/since?stale=update_after&' + s)
        .reply(200, _allMock)
      return s
    })
  },
  allFutureUpdatedOnly: function (server) {
    server.get('/-/all')
      .reply(200, stringifyUpdated(timeMock.future))
  },
  all: function (server) {
    server.get('/-/all')
      .reply(200, allMock)
  }
}
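// Cache protocol exercised below, in brief: with no local cache, npm search
// GETs the registry's full `/-/all` index and stores it (with its `_updated`
// stamp) in .cache.json; on later searches it only asks for deltas via
// `/-/all/since?stale=update_after&startkey=<_updated>` and bumps the stamp
// from the response. The cache file is just the index object, e.g. (see
// stringifyUpdated at the bottom of this file):
//
//   { "_updated": "1411727900" }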
test('No previous cache, init cache triggered by first search', function (t) { cleanup() mr({ port: common.port, plugin: mocks.allFutureUpdatedOnly }, function (err, s) { t.ifError(err, 'mock registry started') common.npm([ 'search', 'do not do extra search work on my behalf', '--registry', common.registry, '--cache', cache, '--loglevel', 'silent', '--color', 'always' ], EXEC_OPTS, function (err, code) { s.close() t.equal(code, 0, 'search finished successfully') t.ifErr(err, 'search finished successfully') t.ok( fs.existsSync(cacheJsonFile), cacheJsonFile + ' expected to have been created' ) var cacheData = JSON.parse(fs.readFileSync(cacheJsonFile, 'utf8')) t.equal(cacheData._updated, String(timeMock.future)) t.end() }) }) }) test('previous cache, _updated set, should trigger since request', function (t) { setupCache() function m (server) { [ mocks.all, mocks.sinceFuture ].forEach(function (m) { m(server) }) } mr({ port: common.port, plugin: m }, function (err, s) { t.ifError(err, 'mock registry started') common.npm([ 'search', 'do not do extra search work on my behalf', '--registry', common.registry, '--cache', cache, '--loglevel', 'silly', '--color', 'always' ], EXEC_OPTS, function (err, code) { s.close() t.equal(code, 0, 'search finished successfully') t.ifErr(err, 'search finished successfully') var cacheData = JSON.parse(fs.readFileSync(cacheJsonFile, 'utf8')) t.equal( cacheData._updated, timeMock.since, 'cache update time gotten from since response' ) t.end() }) }) }) var searches = [ { term: 'f36b6a6123da50959741e2ce4d634f96ec668c56', description: 'non-regex', location: 241 }, { term: '/f36b6a6123da50959741e2ce4d634f96ec668c56/', description: 'regex', location: 241 } ] searches.forEach(function (search) { test(search.description + ' search in color', function (t) { cleanup() mr({ port: common.port, plugin: mocks.all }, function (er, s) { common.npm([ 'search', search.term, '--registry', common.registry, '--cache', cache, '--loglevel', 'silent', '--color', 'always' ], EXEC_OPTS, function (err, code, stdout) { s.close() t.equal(code, 0, 'search finished successfully') t.ifErr(err, 'search finished successfully') // \033 == \u001B var markStart = '\u001B\\[[0-9][0-9]m' var markEnd = '\u001B\\[0m' var re = new RegExp(markStart + '.*?' 
+ markEnd) var cnt = stdout.search(re) t.equal( cnt, search.location, search.description + ' search for ' + search.term ) t.end() }) }) }) }) test('cleanup', function (t) { cleanup() t.end() }) function cleanup () { process.chdir(osenv.tmpdir()) rimraf.sync(pkg) } function setupCache () { cleanup() mkdirp.sync(cache) mkdirp.sync(registryCache) var res = fs.writeFileSync(cacheJsonFile, stringifyUpdated(timeMock.epoch)) if (res) throw new Error('Creating cache file failed') } function stringifyUpdated (time) { return JSON.stringify({ _updated: String(time) }) } var allMock = { '_updated': timeMock.all, 'generator-frontcow': { 'name': 'generator-frontcow', 'description': 'f36b6a6123da50959741e2ce4d634f96ec668c56 This is a fake description to ensure we do not accidentally search the real npm registry or use some kind of cache', 'dist-tags': { 'latest': '0.1.19' }, 'maintainers': [ { 'name': 'bcabanes', 'email': 'contact@benjamincabanes.com' } ], 'homepage': 'https://github.com/bcabanes/generator-frontcow', 'keywords': [ 'sass', 'frontend', 'yeoman-generator', 'atomic', 'design', 'sass', 'foundation', 'foundation5', 'atomic design', 'bourbon', 'polyfill', 'font awesome' ], 'repository': { 'type': 'git', 'url': 'https://github.com/bcabanes/generator-frontcow' }, 'author': { 'name': 'ben', 'email': 'contact@benjamincabanes.com', 'url': 'https://github.com/bcabanes' }, 'bugs': { 'url': 'https://github.com/bcabanes/generator-frontcow/issues' }, 'license': 'MIT', 'readmeFilename': 'README.md', 'time': { 'modified': '2014-10-03T02:26:18.406Z' }, 'versions': { '0.1.19': 'latest' } }, 'marko': { 'name': 'marko', 'description': 'Marko is an extensible, streaming, asynchronous, high performance, HTML-based templating language that can be used in Node.js or in the browser.', 'dist-tags': { 'latest': '1.2.16' }, 'maintainers': [ { 'name': 'pnidem', 'email': 'pnidem@gmail.com' }, { 'name': 'philidem', 'email': 'phillip.idem@gmail.com' } ], 'homepage': 'https://github.com/raptorjs/marko', 'keywords': [ 'templating', 'template', 'async', 'streaming' ], 'repository': { 'type': 'git', 'url': 'https://github.com/raptorjs/marko.git' }, 'author': { 'name': 'Patrick Steele-Idem', 'email': 'pnidem@gmail.com' }, 'bugs': { 'url': 'https://github.com/raptorjs/marko/issues' }, 'license': 'Apache License v2.0', 'readmeFilename': 'README.md', 'users': { 'pnidem': true }, 'time': { 'modified': '2014-10-03T02:27:31.775Z' }, 'versions': { '1.2.16': 'latest' } } } ���������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/semver-doc.js���������������������������������������������������������������0000644�0000000�0000000�00000000652�12631326456�015642� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var test = require('tap').test test('semver doc is up to date', function (t) { var path = require('path') var moddoc = path.join(__dirname, '../../node_modules/semver/README.md') var mydoc = path.join(__dirname, '../../doc/misc/semver.md') var fs = require('fs') var mod = fs.readFileSync(moddoc, 'utf8').replace(/semver\(1\)/, 'semver(7)') var my = fs.readFileSync(mydoc, 'utf8') t.equal(my, mod) t.end() }) 
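// (The replace() above rewrites the man-section cross-reference — the module
// README says semver(1), while npm's bundled copy uses semver(7) — so the two
// files can be compared verbatim.)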
npm_3.5.2.orig/test/tap/semver-tag.js0000644000000000000000000000072612631326456015652 0ustar 00000000000000// should not allow tagging with a valid semver range
var common = require('../common-tap.js')
var test = require('tap').test

test('try to tag with semver range as tag name', function (t) {
  var cmd = ['tag', 'zzzz@1.2.3', 'v2.x', '--registry=http://localhost']
  common.npm(cmd, { stdio: 'pipe' }, function (er, code, so, se) {
    if (er) throw er
    t.similar(se, /Tag name must not be a valid SemVer range: v2.x\n/)
    t.equal(code, 1)
    t.end()
  })
})
npm_3.5.2.orig/test/tap/shrinkwrap-dev-dependency.js0000644000000000000000000000351212631326456020654 0ustar 00000000000000var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')
var npm = require('../../')

var pkg = path.resolve(__dirname, 'shrinkwrap-dev-dependency')

test("shrinkwrap doesn't strip out the dependency", function (t) {
  t.plan(1)

  mr({port: common.port}, function (er, s) {
    setup(function (err) {
      if (err) return t.fail(err)

      npm.install('.', function (err) {
        if (err) return t.fail(err)

        npm.commands.shrinkwrap([], true, function (err, results) {
          if (err) return t.fail(err)

          t.deepEqual(results, desired)
          s.close()
          t.end()
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

var desired = {
  name: 'npm-test-shrinkwrap-dev-dependency',
  version: '0.0.0',
  dependencies: {
    request: {
      version: '0.9.0',
      from: 'request@0.9.0',
      resolved: common.registry + '/request/-/request-0.9.0.tgz'
    },
    underscore: {
      version: '1.3.1',
      from: 'underscore@1.3.1',
      resolved: common.registry + '/underscore/-/underscore-1.3.1.tgz'
    }
  }
}

var json = {
  author: 'Domenic Denicola',
  name: 'npm-test-shrinkwrap-dev-dependency',
  version: '0.0.0',
  dependencies: {
    request: '0.9.0',
    underscore: '1.3.1'
  },
  devDependencies: {
    underscore: '1.5.1'
  }
}

function setup (cb) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(path.join(pkg, 'package.json'), JSON.stringify(json, null, 2))
  process.chdir(pkg)

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    registry: common.registry
  }
  npm.load(opts, cb)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
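// Note on the fixture: underscore is pinned at 1.3.1 in dependencies and at
// 1.5.1 in devDependencies, and the expected shrinkwrap records 1.3.1 — when
// the same name appears in both sections, the regular dependency wins.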
��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/shrinkwrap-empty-deps.js����������������������������������������������������0000644�0000000�0000000�00000003212�12631326456�020046� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fs = require('fs') var path = require('path') var mkdirp = require('mkdirp') var mr = require('npm-registry-mock') var osenv = require('osenv') var rimraf = require('rimraf') var test = require('tap').test var common = require('../common-tap.js') var pkg = path.resolve(__dirname, 'shrinkwrap-empty-deps') var EXEC_OPTS = { cwd: pkg } var json = { author: 'Rockbert', name: 'shrinkwrap-empty-deps', version: '0.0.0', dependencies: {}, devDependencies: {} } test('setup', function (t) { cleanup() mkdirp.sync(pkg) fs.writeFileSync( path.join(pkg, 'package.json'), JSON.stringify(json, null, 2) ) process.chdir(pkg) t.end() }) test('returns a list of removed items', function (t) { mr({ port: common.port }, function (er, s) { common.npm( [ '--registry', common.registry, '--loglevel', 'silent', 'shrinkwrap' ], EXEC_OPTS, function (err, code, stdout, stderr) { t.ifError(err, 'shrinkwrap ran without issue') t.notOk(code, 'shrinkwrap ran without raising error code') fs.readFile(path.resolve(pkg, 'npm-shrinkwrap.json'), function (err, desired) { t.ifError(err, 'read npm-shrinkwrap.json without issue') t.same( { 'name': 'shrinkwrap-empty-deps', 'version': '0.0.0' }, JSON.parse(desired), 'shrinkwrap handled empty deps without exploding' ) s.close() t.end() }) } ) }) }) test('cleanup', function (t) { cleanup() t.end() }) function cleanup () { process.chdir(osenv.tmpdir()) rimraf.sync(pkg) } ��������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/shrinkwrap-local-dependency.js����������������������������������������������0000644�0000000�0000000�00000005654�12631326456�021201� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var test = require('tap').test var path = require('path') var fs = require('fs') var osenv = require('osenv') var rimraf = require('rimraf') var mkdirp = require('mkdirp') var common = require('../common-tap.js') var PKG_DIR = path.resolve(__dirname, 'shrinkwrap-local-dependency') var CACHE_DIR = path.resolve(PKG_DIR, 'cache') var DEP_DIR = path.resolve(PKG_DIR, 'dep') var desired = { 'name': 'npm-test-shrinkwrap-local-dependency', 'version': '0.0.0', 'dependencies': { 'npm-test-shrinkwrap-local-dependency-dep': { 'version': '0.0.0', 'from': 'dep', 'resolved': 'file:dep' 
} } } var root = { 'author': 'Thomas Torp', 'name': 'npm-test-shrinkwrap-local-dependency', 'version': '0.0.0', 'dependencies': { 'npm-test-shrinkwrap-local-dependency-dep': 'file:./dep' } } var dependency = { 'author': 'Thomas Torp', 'name': 'npm-test-shrinkwrap-local-dependency-dep', 'version': '0.0.0' } test('shrinkwrap uses resolved with file: on local deps', function (t) { setup() common.npm( ['--cache=' + CACHE_DIR, '--loglevel=silent', 'install', '.'], {}, function (err, code) { t.ifError(err, 'npm install worked') t.equal(code, 0, 'npm exited normally') common.npm( ['--cache=' + CACHE_DIR, '--loglevel=silent', 'shrinkwrap'], {}, function (err, code) { t.ifError(err, 'npm shrinkwrap worked') t.equal(code, 0, 'npm exited normally') fs.readFile('npm-shrinkwrap.json', { encoding: 'utf8' }, function (err, data) { t.ifError(err, 'read file correctly') t.deepEqual(JSON.parse(data), desired, 'shrinkwrap looks correct') t.end() }) } ) } ) }) test("'npm install' should install local packages from shrinkwrap", function (t) { cleanNodeModules() common.npm( ['--cache=' + CACHE_DIR, '--loglevel=silent', 'install', '.'], {}, function (err, code) { t.ifError(err, 'install ran correctly') t.notOk(code, 'npm install exited with code 0') var dependencyPackageJson = path.resolve( PKG_DIR, 'node_modules/npm-test-shrinkwrap-local-dependency-dep/package.json' ) t.ok( JSON.parse(fs.readFileSync(dependencyPackageJson, 'utf8')), 'package with local dependency installed from shrinkwrap' ) t.end() } ) }) test('cleanup', function (t) { cleanup() t.end() }) function setup () { cleanup() mkdirp.sync(PKG_DIR) mkdirp.sync(CACHE_DIR) mkdirp.sync(DEP_DIR) fs.writeFileSync( path.resolve(PKG_DIR, 'package.json'), JSON.stringify(root, null, 2) ) fs.writeFileSync( path.resolve(DEP_DIR, 'package.json'), JSON.stringify(dependency, null, 2) ) process.chdir(PKG_DIR) } function cleanNodeModules () { rimraf.sync(path.resolve(PKG_DIR, 'node_modules')) } function cleanup () { process.chdir(osenv.tmpdir()) cleanNodeModules() rimraf.sync(PKG_DIR) } ������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/shrinkwrap-optional-dependency.js�������������������������������������������0000644�0000000�0000000�00000004200�12631326456�021716� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var fs = require('fs') var path = require('path') var mkdirp = require('mkdirp') var mr = require('npm-registry-mock') var osenv = require('osenv') var rimraf = require('rimraf') var test = require('tap').test var common = require('../common-tap.js') var npm = npm = require('../../') var pkg = path.resolve(__dirname, 'shrinkwrap-optional-dependency') test('shrinkwrap does not fail on missing optional dependency', function (t) { t.plan(1) var mocks = { get: { '/random-package': [404, {}] } } mr({port: common.port, mocks: mocks}, function (er, s) { function fail (err) { s.close() // Close on failure to allow node to exit t.fail(err) } setup(function (err) { if (err) return fail(err) // Install without the optional dependency npm.install('.', function (err) { if (err) return fail(err) // Pretend the optional dependency was specified, but somehow failed to load: json.optionalDependencies = { 
'random-package': '0.0.0'
        }
        writePackage()

        npm.commands.shrinkwrap([], true, function (err, results) {
          if (err) return fail(err)

          t.deepEqual(results, desired)
          s.close()
          t.end()
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

var desired = {
  name: 'npm-test-shrinkwrap-optional-dependency',
  version: '0.0.0',
  dependencies: {
    'test-package': {
      version: '0.0.0',
      from: 'test-package@0.0.0',
      resolved: common.registry + '/test-package/-/test-package-0.0.0.tgz'
    }
  }
}

var json = {
  author: 'Maximilian Antoni',
  name: 'npm-test-shrinkwrap-optional-dependency',
  version: '0.0.0',
  dependencies: {
    'test-package': '0.0.0'
  }
}

function writePackage () {
  fs.writeFileSync(path.join(pkg, 'package.json'), JSON.stringify(json, null, 2))
}

function setup (cb) {
  cleanup()
  mkdirp.sync(pkg)
  writePackage()
  process.chdir(pkg)

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    registry: common.registry
  }
  npm.load(opts, cb)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/shrinkwrap-prod-dependency-also.js0000644000000000000000000000370512631326456022002 0ustar 00000000000000var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var npm = require('../../')
var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'shrinkwrap-prod-dependency')

test("shrinkwrap --also=development doesn't strip out prod dependencies", function (t) {
  t.plan(1)

  mr({port: common.port}, function (er, s) {
    setup({}, function (err) {
      if (err) return t.fail(err)

      npm.install('.', function (err) {
        if (err) return t.fail(err)

        npm.config.set('also', 'development')
        npm.commands.shrinkwrap([], true, function (err, results) {
          if (err) return t.fail(err)

          t.deepEqual(results, desired)
          s.close()
          t.end()
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

var desired = {
  name: 'npm-test-shrinkwrap-prod-dependency',
  version: '0.0.0',
  dependencies: {
    request: {
      version: '0.9.0',
      from: 'request@0.9.0',
      resolved: common.registry + '/request/-/request-0.9.0.tgz'
    },
    underscore: {
      version: '1.5.1',
      from: 'underscore@1.5.1',
      resolved: common.registry + '/underscore/-/underscore-1.5.1.tgz'
    }
  }
}

var json = {
  author: 'Domenic Denicola',
  name: 'npm-test-shrinkwrap-prod-dependency',
  version: '0.0.0',
  dependencies: {
    request: '0.9.0'
  },
  devDependencies: {
    underscore: '1.5.1'
  }
}

function setup (opts, cb) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(path.join(pkg, 'package.json'), JSON.stringify(json, null, 2))
  process.chdir(pkg)

  var allOpts = {
    cache: path.resolve(pkg, 'cache'),
    registry: common.registry
  }
  for (var key in opts) {
    allOpts[key] = opts[key]
  }
  npm.load(allOpts, cb)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/shrinkwrap-prod-dependency.js0000644000000000000000000000365612631326456021043 0ustar 00000000000000
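// Same scenario as shrinkwrap-prod-dependency-also.js above, driven by the
// legacy `dev` flag instead of `--also=development`.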
var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var npm = require('../../')
var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'shrinkwrap-prod-dependency')

test("shrinkwrap --dev doesn't strip out prod dependencies", function (t) {
  t.plan(1)

  mr({port: common.port}, function (er, s) {
    setup({}, function (err) {
      if (err) return t.fail(err)

      npm.install('.', function (err) {
        if (err) return t.fail(err)

        npm.config.set('dev', true)
        npm.commands.shrinkwrap([], true, function (err, results) {
          if (err) return t.fail(err)

          t.deepEqual(results, desired)
          s.close()
          t.end()
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

var desired = {
  name: 'npm-test-shrinkwrap-prod-dependency',
  version: '0.0.0',
  dependencies: {
    request: {
      version: '0.9.0',
      from: 'request@0.9.0',
      resolved: common.registry + '/request/-/request-0.9.0.tgz'
    },
    underscore: {
      version: '1.5.1',
      from: 'underscore@1.5.1',
      resolved: common.registry + '/underscore/-/underscore-1.5.1.tgz'
    }
  }
}

var json = {
  author: 'Domenic Denicola',
  name: 'npm-test-shrinkwrap-prod-dependency',
  version: '0.0.0',
  dependencies: {
    request: '0.9.0'
  },
  devDependencies: {
    underscore: '1.5.1'
  }
}

function setup (opts, cb) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(path.join(pkg, 'package.json'), JSON.stringify(json, null, 2))
  process.chdir(pkg)

  var allOpts = {
    cache: path.resolve(pkg, 'cache'),
    registry: common.registry
  }
  for (var key in opts) {
    allOpts[key] = opts[key]
  }
  npm.load(allOpts, cb)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/shrinkwrap-save-with-existing-dev-deps.js0000644000000000000000000000432412631326456023230 0ustar 00000000000000var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var base = path.resolve(__dirname, path.basename(__filename, '.js'))
var installme = path.join(base, 'installme')
var installme_pkg = path.join(installme, 'package.json')
var example = path.join(base, 'example')
var example_shrinkwrap = path.join(example, 'npm-shrinkwrap.json')
var example_pkg = path.join(example, 'package.json')
var installed =
path.join(example, 'node_modules', 'installed') var installed_pkg = path.join(installed, 'package.json') var EXEC_OPTS = { cwd: example } var installme_pkg_json = { name: 'installme', version: '1.0.0', dependencies: {} } var example_pkg_json = { name: 'example', version: '1.0.0', dependencies: {}, devDependencies: { 'installed': '1.0' } } var example_shrinkwrap_json = { name: 'example', version: '1.0.0', dependencies: { installed: { version: '1.0.0' } } } var installed_pkg_json = { _id: 'installed@1.0.0', name: 'installed', version: '1.0.0' } function writeJson (filename, obj) { mkdirp.sync(path.dirname(filename)) fs.writeFileSync(filename, JSON.stringify(obj, null, 2)) } test('setup', function (t) { cleanup() writeJson(installme_pkg, installme_pkg_json) writeJson(example_pkg, example_pkg_json) writeJson(example_shrinkwrap, example_shrinkwrap_json) writeJson(installed_pkg, installed_pkg_json) t.end() }) test('install --save leaves dev deps alone', function (t) { common.npm(['install', '--save', 'file://' + installme], EXEC_OPTS, function (er, code, stdout, stderr) { t.ifError(er, "spawn didn't catch fire") t.is(code, 0, 'install completed ok') t.is(stderr, '', 'install completed without error output') var shrinkwrap = JSON.parse(fs.readFileSync(example_shrinkwrap)) t.ok(shrinkwrap.dependencies.installed, "save new install didn't remove dev dep") t.ok(shrinkwrap.dependencies.installme, 'save new install DID add new dep') t.end() }) }) test('cleanup', function (t) { cleanup() t.end() }) function cleanup () { process.chdir(osenv.tmpdir()) rimraf.sync(base) } ������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/shrinkwrap-scoped-auth.js���������������������������������������������������0000644�0000000�0000000�00000005677�12631326456�020214� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var resolve = require('path').resolve var writeFileSync = require('graceful-fs').writeFileSync var mkdirp = require('mkdirp') var mr = require('npm-registry-mock') var osenv = require('osenv') var rimraf = require('rimraf') var test = require('tap').test var common = require('../common-tap.js') var toNerfDart = require('../../lib/config/nerf-dart.js') var pkg = resolve(__dirname, 'shrinkwrap-scoped-auth') var outfile = resolve(pkg, '_npmrc') var modules = resolve(pkg, 'node_modules') var tarballPath = '/scoped-underscore/-/scoped-underscore-1.3.1.tgz' var tarballURL = common.registry + tarballPath var tarball = resolve(__dirname, '../fixtures/scoped-underscore-1.3.1.tgz') var server var EXEC_OPTS = { cwd: pkg } function mocks (server) { var auth = 'Bearer 0xabad1dea' server.get(tarballPath, { authorization: auth }).replyWithFile(200, tarball) server.get(tarballPath).reply(401, { error: 'unauthorized', reason: 'You are not authorized to access this db.' 
  })
}

test('setup', function (t) {
  mr({ port: common.port, plugin: mocks }, function (er, s) {
    server = s
    t.ok(s, 'set up mock registry')
    setup()
    t.end()
  })
})

test('authed npm install with shrinkwrapped scoped package', function (t) {
  common.npm(
    [
      'install',
      '--loglevel', 'silent',
      '--json',
      '--fetch-retries', 0,
      '--userconfig', outfile
    ],
    EXEC_OPTS,
    function (err, code, stdout, stderr) {
      console.error(stderr)
      t.ifError(err, 'test runner executed without error')
      t.equal(code, 0, 'npm install exited OK')
      t.notOk(stderr, 'no output on stderr')
      try {
        var results = JSON.parse(stdout)
      } catch (ex) {
        console.error('#', ex)
        t.ifError(ex, 'stdout was valid JSON')
      }

      if (results) {
        var installedversion = {
          'version': '1.3.1',
          'from': '>=1.3.1 <2',
          'resolved': 'http://localhost:1337/scoped-underscore/-/scoped-underscore-1.3.1.tgz'
        }
        t.isDeeply(results.dependencies['@scoped/underscore'], installedversion, '@scoped/underscore installed')
      }

      t.end()
    }
  )
})

test('cleanup', function (t) {
  server.close()
  cleanup()
  t.end()
})

var contents = '@scoped:registry=' + common.registry + '\n' +
               toNerfDart(common.registry) + ':_authToken=0xabad1dea\n'

var json = {
  name: 'test-package-install',
  version: '1.0.0'
}

var shrinkwrap = {
  name: 'test-package-install',
  version: '1.0.0',
  dependencies: {
    '@scoped/underscore': {
      resolved: tarballURL,
      from: '>=1.3.1 <2',
      version: '1.3.1'
    }
  }
}

function setup () {
  cleanup()
  mkdirp.sync(modules)
  writeFileSync(resolve(pkg, 'package.json'), JSON.stringify(json, null, 2) + '\n')
  writeFileSync(outfile, contents)
  writeFileSync(
    resolve(pkg, 'npm-shrinkwrap.json'),
    JSON.stringify(shrinkwrap, null, 2) + '\n'
  )
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/shrinkwrap-shared-dev-dependency.js0000644000000000000000000000402312631326456022116 0ustar 00000000000000var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')
var npm = require('../../')

var pkg = path.resolve(__dirname, 'shrinkwrap-shared-dev-dependency')

test("shrinkwrap doesn't strip out the shared dependency", function (t) {
  t.plan(1)

  mr({ port: common.port }, function (er, s) {
    setup(function (err) {
      if (err) return t.fail(err)

      npm.install('.', function (err) {
        if (err) return t.fail(err)

        npm.commands.shrinkwrap([], true, function (err, results) {
          if (err) return t.fail(err)

          t.deepEqual(results, desired)
          s.close()
          t.end()
        })
      })
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

var desired = {
  name: 'npm-test-shrinkwrap-shared-dev-dependency',
  version: '0.0.0',
  dependencies: {
    'test-package-with-one-dep': {
      version: '0.0.0',
      from: 'test-package-with-one-dep@0.0.0',
      resolved: common.registry +
        '/test-package-with-one-dep/-/test-package-with-one-dep-0.0.0.tgz'
    },
    'test-package': {
      version: '0.0.0',
      from: 'test-package@0.0.0',
      resolved: common.registry + '/test-package/-/test-package-0.0.0.tgz'
    }
  }
}

var json = {
  author: 'Domenic Denicola',
  name: 'npm-test-shrinkwrap-shared-dev-dependency',
  version: '0.0.0',
  dependencies: {
    'test-package-with-one-dep': '0.0.0'
  },
  devDependencies: {
    'test-package': '0.0.0'
  }
}

function setup (cb) {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(path.join(pkg, 'package.json'), JSON.stringify(json, null, 2))
  process.chdir(pkg)

  var opts = {
    cache: path.resolve(pkg, 'cache'),
    registry: common.registry,
    // important to make sure devDependencies don't get stripped
    dev: true
  }
  npm.load(opts, cb)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/shrinkwrap-version-match.js0000644000000000000000000000613512631326456020545 0ustar 00000000000000'use strict'
var test = require('tap').test
var fs = require('fs')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var path = require('path')
var common = require('../common-tap.js')

var testdir = path.resolve(__dirname, path.basename(__filename, '.js'))
var modAdir = path.resolve(testdir, 'modA')
var modB1dir = path.resolve(testdir, 'modB@1')
var modB2dir = path.resolve(testdir, 'modB@2')
var modCdir = path.resolve(testdir, 'modC')

var testjson = {
  dependencies: {
    modA: 'file://' + modAdir,
    modC: 'file://' + modCdir
  }
}

var testshrinkwrap = {
  dependencies: {
    modA: {
      version: '1.0.0',
      from: 'modA',
      resolved: 'file://' + modAdir
    },
    modB: {
      version: '1.0.0',
      from: 'modB@1',
      resolved: 'file://' + modB1dir
    }
  }
}

var modAjson = {
  name: 'modA',
  version: '1.0.0',
  dependencies: {
    'modB': 'file://' + modB1dir
  }
}

var modCjson = {
  name: 'modC',
  version: '1.0.0',
  dependencies: {
    'modB': 'file://' + modB2dir
  }
}

var modB1json = {
  name: 'modB',
  version: '1.0.0'
}

var modB2json = {
  name: 'modB',
  version: '2.0.0'
}

function writepjson (dir, content) {
  writejson(dir, 'package.json', content)
}

function writejson (dir, file, content) {
  writefile(dir, file, JSON.stringify(content, null, 2))
}

function writefile (dir, file, content) {
  fs.writeFileSync(path.join(dir, file), content)
}

function setup () {
  mkdirp.sync(testdir)
  writepjson(testdir, testjson)
  writejson(testdir, 'npm-shrinkwrap.json', testshrinkwrap)

  mkdirp.sync(modAdir)
  writepjson(modAdir, modAjson)

  mkdirp.sync(modB1dir)
  writepjson(modB1dir, modB1json)
  writefile(modB1dir, 'B1', '')

  mkdirp.sync(modB2dir)
  writepjson(modB2dir, modB2json)
  writefile(modB2dir, 'B2', '')

  mkdirp.sync(modCdir)
  writepjson(modCdir, modCjson)
}

function cleanup () {
  rimraf.sync(testdir)
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

// Shrinkwraps need to let you override dependency versions specified in
// package.json files. Indeed, this was already supported, but it was a bit
// too keen on this. Previously, if you had a dep in your shrinkwrap then
// anything that required that dependency would count as a match, regardless
// of version.
// This test ensures that the broad matching is not done when the matched
// module is not a direct child of the module doing the requiring.
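// Illustrative layout (not created by the test itself): after `npm install`,
//   testdir/node_modules/modA            — shrinkwrapped, needs modB@1
//   testdir/node_modules/modB            — 1.0.0, from the shrinkwrap
//   testdir/node_modules/modC/node_modules/modB
//                                        — 2.0.0, modC's own private copy,
//                                          because the shrinkwrapped modB@1
//                                          is not a direct child of modC.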
test('bundled', function (t) {
  common.npm(['install'], {cwd: testdir}, function (err, code, out, stderr) {
    t.is(err, null, 'No fatal errors running npm')
    t.is(code, 0, 'npm itself completed ok')
    // Specifically, if B2 exists (or the modB directory under modC at all)
    // that means modC was given its own copy of modB. Without the patch
    // that went with this test, it wouldn't have been installed because npm
    // would have considered modB@1 to have fulfilled modC's requirement.
    fs.stat(path.join(testdir, 'node_modules', 'modC', 'node_modules', 'modB', 'B2'), function (missing) {
      t.ok(!missing, 'modC got the right version of modB')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})
npm_3.5.2.orig/test/tap/sorted-package-json.js0000644000000000000000000000464312631326456017442 0ustar 00000000000000var test = require('tap').test
var path = require('path')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var spawn = require('child_process').spawn
var npm = require.resolve('../../bin/npm-cli.js')
var node = process.execPath
var pkg = path.resolve(__dirname, 'sorted-package-json')
var tmp = path.join(pkg, 'tmp')
var cache = path.join(pkg, 'cache')
var fs = require('fs')
var common = require('../common-tap.js')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
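// `npm install --save` rewrites package.json; the test below asserts that the
// rewritten dependencies object comes back alphabetically sorted (the fixture
// written by setup() deliberately lists underscore before request).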
function (t) { cleanup() t.pass('cleaned up') t.end() }) function setup () { mkdirp.sync(pkg) fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify({ 'name': 'sorted-package-json', 'version': '0.0.0', 'description': '', 'main': 'index.js', 'scripts': { 'test': 'echo \'Error: no test specified\' && exit 1' }, 'author': 'Rocko Artischocko', 'license': 'ISC', 'dependencies': { 'underscore': '^1.3.3', 'request': '^0.9.0' } }, null, 2), 'utf8') } function cleanup () { process.chdir(osenv.tmpdir()) rimraf.sync(cache) rimraf.sync(pkg) } ���������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/spawn-enoent-help.js��������������������������������������������������������0000644�0000000�0000000�00000001372�12631326456�017142� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var path = require('path') var test = require('tap').test var rimraf = require('rimraf') var mkdirp = require('mkdirp') var common = require('../common-tap.js') var pkg = path.resolve(__dirname, 'spawn-enoent-help') test('setup', function (t) { rimraf.sync(pkg) mkdirp.sync(pkg) t.end() }) test('enoent help', function (t) { common.npm(['help', 'config'], { cwd: pkg, env: { PATH: '', Path: '', 'npm_config_loglevel': 'warn', 'npm_config_viewer': 'woman' } }, function (er, code, sout, serr) { t.similar(serr, /Check if the file 'emacsclient' is present./) t.equal(global.cooked, undefined, "Don't leak into global scope") t.end() }) }) test('clean', function (t) { rimraf.sync(pkg) t.end() }) ����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/spawn-enoent.js�������������������������������������������������������������0000644�0000000�0000000�00000001575�12631326456�016221� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var path = require('path') var test = require('tap').test var fs = require('fs') var rimraf = require('rimraf') var mkdirp = require('mkdirp') var common = require('../common-tap.js') var pkg = path.resolve(__dirname, 'spawn-enoent') var pj = JSON.stringify({ name: 'x', version: '1.2.3', scripts: { start: 'wharble-garble-blorst' } }, null, 2) + '\n' test('setup', function (t) { rimraf.sync(pkg) mkdirp.sync(pkg) fs.writeFileSync(pkg + '/package.json', pj) t.end() }) test('enoent script', function (t) { common.npm(['start'], { cwd: pkg, env: { PATH: process.env.PATH, Path: process.env.Path, 'npm_config_loglevel': 'warn' } }, function (er, code, sout, serr) { t.similar(serr, /npm ERR! 
Failed at the x@1\.2\.3 start script 'wharble-garble-blorst'\./)
    t.end()
  })
})

test('clean', function (t) {
  rimraf.sync(pkg)
  t.end()
})
npm_3.5.2.orig/test/tap/splat-with-only-prerelease-to-latest.js0000644000000000000000000000407712631326456022715 0ustar 00000000000000'use strict'
var test = require('tap').test
var npm = require('../../lib/npm')
var stream = require('readable-stream')

var moduleName = 'xyzzy-wibble'
var testModule = {
  name: moduleName,
  'dist-tags': {
    latest: '1.3.0-a'
  },
  versions: {
    '1.0.0-a': {
      name: moduleName,
      version: '1.0.0-a',
      dist: {
        shasum: 'da39a3ee5e6b4b0d3255bfef95601890afd80709',
        tarball: 'http://registry.npmjs.org/aproba/-/xyzzy-wibble-1.0.0-a.tgz'
      }
    },
    '1.1.0-a': {
      name: moduleName,
      version: '1.1.0-a',
      dist: {
        shasum: 'da39a3ee5e6b4b0d3255bfef95601890afd80709',
        tarball: 'http://registry.npmjs.org/aproba/-/xyzzy-wibble-1.1.0-a.tgz'
      }
    },
    '1.2.0-a': {
      name: moduleName,
      version: '1.2.0-a',
      dist: {
        shasum: 'da39a3ee5e6b4b0d3255bfef95601890afd80709',
        tarball: 'http://registry.npmjs.org/aproba/-/xyzzy-wibble-1.2.0-a.tgz'
      }
    },
    '1.3.0-a': {
      name: moduleName,
      version: '1.3.0-a',
      dist: {
        shasum: 'da39a3ee5e6b4b0d3255bfef95601890afd80709',
        tarball: 'http://registry.npmjs.org/aproba/-/xyzzy-wibble-1.3.0-a.tgz'
      }
    }
  }
}

var lastFetched

test('setup', function (t) {
  npm.load(function () {
    npm.config.set('loglevel', 'silly')
    npm.registry = {
      get: function (uri, opts, cb) {
        setTimeout(function () {
          cb(null, testModule, null, {statusCode: 200})
        })
      },
      fetch: function (u, opts, cb) {
        lastFetched = u
        setTimeout(function () {
          var empty = new stream.Readable()
          empty.push(null)
          cb(null, empty)
        })
      }
    }
    t.end()
  })
})

test('splat', function (t) {
  t.plan(4)
  var addNamed = require('../../lib/cache/add-named.js')
  addNamed('xyzzy-wibble', '*', testModule, function (err, pkg) {
    t.error(err, 'Successfully resolved a splat package')
    t.is(pkg.name, moduleName)
    t.is(pkg.version, testModule['dist-tags'].latest)
    t.is(lastFetched, 'https://registry.npmjs.org/aproba/-/xyzzy-wibble-1.3.0-a.tgz')
  })
})
npm_3.5.2.orig/test/tap/startstop.js0000644000000000000000000000306712631326456015644 0ustar 00000000000000var fs = require('graceful-fs')
var path = require('path')
var
mkdirp = require('mkdirp') var osenv = require('osenv') var rimraf = require('rimraf') var test = require('tap').test var common = require('../common-tap') var pkg = path.resolve(__dirname, 'startstop') var EXEC_OPTS = { cwd: pkg } var json = { name: 'startstop', version: '1.2.3', scripts: { start: 'node -e \"console.log(\'start\')\"', stop: 'node -e \"console.log(\'stop\')\"' } } function testOutput (t, command, er, code, stdout, stderr) { t.notOk(code, 'npm ' + command + ' exited with code 0') if (stderr) throw new Error('npm ' + command + ' stderr: ' + stderr.toString()) stdout = stdout.trim().split(/\n|\r/) stdout = stdout[stdout.length - 1] t.equal(stdout, command) t.end() } test('setup', function (t) { cleanup() mkdirp.sync(pkg) fs.writeFileSync( path.join(pkg, 'package.json'), JSON.stringify(json, null, 2) ) t.end() }) test('npm start', function (t) { common.npm(['start'], EXEC_OPTS, testOutput.bind(null, t, 'start')) }) test('npm stop', function (t) { common.npm(['stop'], EXEC_OPTS, testOutput.bind(null, t, 'stop')) }) test('npm restart', function (t) { common.npm(['restart'], EXEC_OPTS, function (er, c, stdout) { if (er) throw er var output = stdout.split('\n').filter(function (val) { return val.match(/^s/) }) t.same(output.sort(), ['start', 'stop'].sort()) t.end() }) }) test('cleanup', function (t) { cleanup() t.end() }) function cleanup () { process.chdir(osenv.tmpdir()) rimraf.sync(pkg) } �������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/symlink-cycle.js������������������������������������������������������������0000644�0000000�0000000�00000002405�12631326456�016357� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������'use strict' var fs = require('fs') var path = require('path') var test = require('tap').test var mkdirp = require('mkdirp') var osenv = require('osenv') var rimraf = require('rimraf') var writeFileSync = require('fs').writeFileSync var common = require('../common-tap.js') var base = path.join(__dirname, path.basename(__filename, '.js')) var cycle = path.join(base, 'cycle') var cycleJSON = { name: 'cycle', version: '1.0.0', description: '', main: 'index.js', scripts: { test: 'echo \"Error: no test specified\" && exit 1' }, dependencies: { 'cycle': '*' }, author: '', license: 'ISC' } test('setup', function (t) { setup() t.end() }) test('ls', function (t) { process.chdir(cycle) common.npm(['ls'], {}, function (err, code, stdout, stderr) { t.ifError(err, 'installed w/o error') t.is(stderr, '', 'no warnings printed to stderr') t.end() }) }) test('cleanup', function (t) { process.chdir(osenv.tmpdir()) cleanup() t.end() }) function cleanup () { rimraf.sync(base) } function setup () { cleanup() mkdirp.sync(path.join(cycle, 'node_modules')) writeFileSync( path.join(cycle, 'package.json'), JSON.stringify(cycleJSON, null, 2) ) fs.symlinkSync(cycle, 
path.join(cycle, 'node_modules', 'cycle')) } �����������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/tag-version-prefix.js�������������������������������������������������������0000644�0000000�0000000�00000004632�12631326456�017331� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var common = require('../common-tap.js') var fs = require('fs') var path = require('path') var mkdirp = require('mkdirp') var osenv = require('osenv') var rimraf = require('rimraf') var test = require('tap').test var npm = require('../../lib/npm.js') var pkg = path.resolve(__dirname, 'version-message-config') var cache = path.resolve(pkg, 'cache') var npmrc = path.resolve(pkg, '.npmrc') var packagePath = path.resolve(pkg, 'package.json') var json = { name: 'blah', version: '0.1.2' } var configContents = 'sign-git-tag=false\nmessage=":bookmark: %s"\n' test('npm version <semver> with message config', function (t) { setup() npm.load({ prefix: pkg, userconfig: npmrc }, function () { var git = require('../../lib/utils/git.js') common.makeGitRepo({ path: pkg }, function (er) { t.ifErr(er, 'git bootstrap ran without error') common.npm( [ '--userconfig', npmrc, 'config', 'set', 'tag-version-prefix', 'q' ], { cwd: pkg, env: { PATH: process.env.PATH } }, function (err, code, stdout, stderr) { t.ifError(err, 'npm config ran without issue') t.notOk(code, 'exited with a non-error code') t.notOk(stderr, 'no error output') common.npm( [ 'version', 'patch', '--loglevel', 'silent' // package config is picked up from env ], { cwd: pkg, env: { PATH: process.env.PATH } }, function (err, code, stdout, stderr) { t.ifError(err, 'npm version ran without issue') t.notOk(code, 'exited with a non-error code') t.notOk(stderr, 'no error output') git.whichAndExec( ['tag'], { cwd: pkg, env: process.env }, function (er, tags, stderr) { t.ok(tags.match(/q0\.1\.3/g), 'tag was created by version' + tags) t.end() } ) } ) } ) }) }) }) test('cleanup', function (t) { cleanup() t.end() }) function cleanup () { // windows fix for locked files process.chdir(osenv.tmpdir()) rimraf.sync(pkg) } function setup () { cleanup() mkdirp.sync(cache) process.chdir(pkg) fs.writeFileSync(packagePath, JSON.stringify(json), 'utf8') fs.writeFileSync(npmrc, configContents, 'ascii') } ������������������������������������������������������������������������������������������������������npm_3.5.2.orig/test/tap/team.js���������������������������������������������������������������������0000644�0000000�0000000�00000006651�12631326456�014531� 0����������������������������������������������������������������������������������������������������ustar �����������������������������������������������������������������0000000�0000000������������������������������������������������������������������������������������������������������������������������������������������������������������������������var mr = require('npm-registry-mock') var test = require('tap').test var common = require('../common-tap.js') var server test('setup', function (t) { mr({port: common.port}, 
function (err, s) {
    t.ifError(err, 'registry mocked successfully')
    server = s
    t.end()
  })
})

test('team create basic', function (t) {
  var teamData = {
    name: 'test',
    scope_id: 1234,
    created: '2015-07-23T18:07:49.959Z',
    updated: '2015-07-23T18:07:49.959Z',
    deleted: null
  }
  server.put('/-/org/myorg/team', JSON.stringify({
    name: teamData.name
  })).reply(200, teamData)
  common.npm([
    'team', 'create', 'myorg:' + teamData.name,
    '--registry', common.registry,
    '--loglevel', 'silent'
  ], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'npm team')
    t.equal(code, 0, 'exited OK')
    t.equal(stderr, '', 'no error output')
    t.same(JSON.parse(stdout), teamData)
    t.end()
  })
})

test('team destroy', function (t) {
  var teamData = {
    name: 'myteam',
    scope_id: 1234,
    created: '2015-07-23T18:07:49.959Z',
    updated: '2015-07-23T18:07:49.959Z',
    deleted: '2015-07-23T18:27:27.178Z'
  }
  server.delete('/-/team/myorg/' + teamData.name).reply(200, teamData)
  common.npm([
    'team', 'destroy', 'myorg:' + teamData.name,
    '--registry', common.registry,
    '--loglevel', 'silent'
  ], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'npm team')
    t.equal(code, 0, 'exited OK')
    t.equal(stderr, '', 'no error output')
    t.same(JSON.parse(stdout), teamData)
    t.end()
  })
})

test('team add', function (t) {
  var user = 'zkat'
  server.put('/-/team/myorg/myteam/user', JSON.stringify({
    user: user
  })).reply(200)
  common.npm([
    'team', 'add', 'myorg:myteam', user,
    '--registry', common.registry,
    '--loglevel', 'silent'
  ], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'npm team')
    t.equal(code, 0, 'exited OK')
    t.equal(stderr, '', 'no error output')
    t.end()
  })
})

test('team rm', function (t) {
  var user = 'zkat'
  server.delete('/-/team/myorg/myteam/user', JSON.stringify({
    user: user
  })).reply(200)
  common.npm([
    'team', 'rm', 'myorg:myteam', user,
    '--registry', common.registry,
    '--loglevel', 'silent'
  ], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'npm team')
    t.equal(code, 0, 'exited OK')
    t.equal(stderr, '', 'no error output')
    t.end()
  })
})

test('team ls (on org)', function (t) {
  var teams = ['myorg:team1', 'myorg:team2', 'myorg:team3']
  server.get('/-/org/myorg/team?format=cli').reply(200, teams)
  common.npm([
    'team', 'ls', 'myorg',
    '--registry', common.registry,
    '--loglevel', 'silent'
  ], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'npm team')
    t.equal(code, 0, 'exited OK')
    t.equal(stderr, '', 'no error output')
    t.same(JSON.parse(stdout), teams)
    t.end()
  })
})

test('team ls (on team)', function (t) {
  var users = ['zkat', 'bcoe']
  server.get('/-/team/myorg/myteam/user?format=cli').reply(200, users)
  common.npm([
    'team', 'ls', 'myorg:myteam',
    '--registry', common.registry,
    '--loglevel', 'silent'
  ], {}, function (err, code, stdout, stderr) {
    t.ifError(err, 'npm team')
    t.equal(code, 0, 'exited OK')
    t.equal(stderr, '', 'no error output')
    t.same(JSON.parse(stdout), users)
    t.end()
  })
})

test('cleanup', function (t) {
  t.pass('cleaned up')
  server.done()
  server.close()
  t.end()
})
npm_3.5.2.orig/test/tap/test-run-ls.js0000644000000000000000000000162612631326456015775 0ustar 00000000000000
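// `npm run` with no script name lists available scripts; these tests run it
// against npm's own checkout and look for the `test` script in the default,
// --parseable (-p), and --json output formats.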
var common = require('../common-tap.js')
var test = require('tap').test
var path = require('path')
var cwd = path.resolve(__dirname, '..', '..')

var testscript = require('../../package.json').scripts.test
var tsregexp = testscript.replace(/([\[\.\*\]])/g, '\\$1')

test('default', function (t) {
  common.npm(['run'], { cwd: cwd }, function (er, code, so) {
    if (er) throw er
    t.notOk(code)
    t.similar(so, new RegExp('\\n test\\n ' + tsregexp + '\\n'))
    t.end()
  })
})

test('parseable', function (t) {
  common.npm(['run', '-p'], { cwd: cwd }, function (er, code, so) {
    if (er) throw er
    t.notOk(code)
    t.similar(so, new RegExp('\\ntest:' + tsregexp + '\\n'))
    t.end()
  })
})

test('json', function (t) {
  common.npm(['run', '--json'], { cwd: cwd }, function (er, code, so) {
    if (er) throw er
    t.notOk(code)
    t.equal(JSON.parse(so).test, testscript)
    t.end()
  })
})
npm_3.5.2.orig/test/tap/tree-style.js0000644000000000000000000000606512631326456015677 0ustar 00000000000000'use strict'
var test = require('tap').test
var path = require('path')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var fs = require('graceful-fs')
var common = require('../common-tap')

var base = path.resolve(__dirname, path.basename(__filename, '.js'))
var modA = path.resolve(base, 'modA')
var modB = path.resolve(base, 'modB')
var modC = path.resolve(base, 'modC')
var testNormal = path.resolve(base, 'testNormal')
var testGlobal = path.resolve(base, 'testGlobal')
var testLegacy = path.resolve(base, 'testLegacy')

var json = {
  'name': 'test-tree-style',
  'version': '1.0.0',
  'dependencies': {
    'modA': modA
  }
}
var modAJson = {
  'name': 'modA',
  'version': '1.0.0',
  'dependencies': {
    'modB': modB
  }
}
var modBJson = {
  'name': 'modB',
  'version': '1.0.0',
  'dependencies': {
    'modC': modC
  }
}
var modCJson = {
  'name': 'modC',
  'version': '1.0.0'
}

function modJoin () {
  var modules = Array.prototype.slice.call(arguments)
  return modules.reduce(function (a, b) {
    return path.resolve(a, 'node_modules', b)
  })
}

function writeJson (mod, data) {
  fs.writeFileSync(path.resolve(mod, 'package.json'), JSON.stringify(data))
}

function setup () {
  cleanup()
  ;[modA, modB, modC, testNormal, testGlobal, testLegacy].forEach(function (mod) {
    mkdirp.sync(mod)
  })
  writeJson(modA, modAJson)
  writeJson(modB, modBJson)
  writeJson(modC, modCJson)
  ;[testNormal, testGlobal, testLegacy].forEach(function (mod) { writeJson(mod, json) })
}

function cleanup () {
  rimraf.sync(base)
}

test('setup', function (t) {
  setup()
  t.end()
})

function exists (t, filepath, msg) {
  try {
    fs.statSync(filepath)
    t.pass(msg)
    return true
  } catch (ex) {
    t.fail(msg, {found: null, wanted: 'exists', compare: 'fs.stat(' + filepath + ')'})
    return false
  }
}

test('tree-style', function (t) {
  t.plan(12)
  common.npm(['install'], {cwd: testNormal}, function (err, code,
npm_3.5.2.orig/test/tap/tree-style.js

'use strict'
var test = require('tap').test
var path = require('path')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var fs = require('graceful-fs')
var common = require('../common-tap')

var base = path.resolve(__dirname, path.basename(__filename, '.js'))
var modA = path.resolve(base, 'modA')
var modB = path.resolve(base, 'modB')
var modC = path.resolve(base, 'modC')
var testNormal = path.resolve(base, 'testNormal')
var testGlobal = path.resolve(base, 'testGlobal')
var testLegacy = path.resolve(base, 'testLegacy')

var json = {
  'name': 'test-tree-style',
  'version': '1.0.0',
  'dependencies': {
    'modA': modA
  }
}
var modAJson = {
  'name': 'modA',
  'version': '1.0.0',
  'dependencies': {
    'modB': modB
  }
}
var modBJson = {
  'name': 'modB',
  'version': '1.0.0',
  'dependencies': {
    'modC': modC
  }
}
var modCJson = {
  'name': 'modC',
  'version': '1.0.0'
}

function modJoin () {
  var modules = Array.prototype.slice.call(arguments)
  return modules.reduce(function (a, b) {
    return path.resolve(a, 'node_modules', b)
  })
}

function writeJson (mod, data) {
  fs.writeFileSync(path.resolve(mod, 'package.json'), JSON.stringify(data))
}

function setup () {
  cleanup()
  ;[modA, modB, modC, testNormal, testGlobal, testLegacy].forEach(function (mod) {
    mkdirp.sync(mod)
  })
  writeJson(modA, modAJson)
  writeJson(modB, modBJson)
  writeJson(modC, modCJson)
  ;[testNormal, testGlobal, testLegacy].forEach(function (mod) {
    writeJson(mod, json)
  })
}

function cleanup () {
  rimraf.sync(base)
}

test('setup', function (t) {
  setup()
  t.end()
})

function exists (t, filepath, msg) {
  try {
    fs.statSync(filepath)
    t.pass(msg)
    return true
  } catch (ex) {
    t.fail(msg, {found: null, wanted: 'exists', compare: 'fs.stat(' + filepath + ')'})
    return false
  }
}

test('tree-style', function (t) {
  t.plan(12)
  common.npm(['install'], {cwd: testNormal}, function (err, code, stdout, stderr) {
    if (err) throw err
    t.is(code, 0, 'normal install; result code')
    t.is(stderr, '', 'normal install; no errors')
    exists(t, modJoin(testNormal, 'modA'), 'normal install; module A')
    exists(t, modJoin(testNormal, 'modB'), 'normal install; module B')
    exists(t, modJoin(testNormal, 'modC'), 'normal install; module C')
  })
  common.npm(['install', '--global-style'], {cwd: testGlobal}, function (err, code, stdout, stderr) {
    if (err) throw err
    t.is(code, 0, 'global-style install; result code')
    t.is(stderr, '', 'global-style install; no errors')
    exists(t, modJoin(testGlobal, 'modA', 'modB'), 'global-style install; module B')
    exists(t, modJoin(testGlobal, 'modA', 'modC'), 'global-style install; module C')
  })
  common.npm(['install', '--legacy-bundling'], {cwd: testLegacy}, function (err, code, stdout, stderr) {
    if (err) throw err
    t.is(code, 0, 'legacy-bundling install; result code')
    t.is(stderr, '', 'legacy-bundling install; no errors')
    exists(t, modJoin(testLegacy, 'modA', 'modB', 'modC'), 'legacy-bundling install; module C')
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})
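// ---------------------------------------------------------------------------
// Illustrative aside (not part of the npm source): the three layouts
// tree-style.js asserts, spelled out for a chain app -> modA -> modB using
// the same reduce trick as modJoin above. POSIX paths assumed.
var path = require('path')

function modJoin () {
  var modules = Array.prototype.slice.call(arguments)
  return modules.reduce(function (a, b) {
    return path.resolve(a, 'node_modules', b)
  })
}

// default (flat):     modA, modB, and modC all directly under <app>/node_modules
// --global-style:     deps of the top-level package nest under it; below that
//                     (modB, modC under modA) the tree stays flat
// --legacy-bundling:  every dep nests under its dependent, npm@2 style
console.log(modJoin('/app', 'modA'))         // /app/node_modules/modA
console.log(modJoin('/app', 'modA', 'modB')) // /app/node_modules/modA/node_modules/modB
// ---------------------------------------------------------------------------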
npm_3.5.2.orig/test/tap/umask-lifecycle.js

var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test
var sprintf = require('sprintf-js').sprintf

var common = require('../common-tap.js')
var pkg = path.resolve(__dirname, 'umask-lifecycle')

var pj = JSON.stringify({
  name: 'x',
  version: '1.2.3',
  scripts: {
    umask: '$npm_execpath config get umask && echo "$npm_config_umask" && node -pe "process.umask()"'
  }
}, null, 2) + '\n'

var umask = process.umask()
var expected = [
  '',
  '> x@1.2.3 umask ' + path.join(__dirname, 'umask-lifecycle'),
  '> $npm_execpath config get umask && echo "$npm_config_umask" && node -pe "process.umask()"',
  '',
  sprintf('%04o', umask),
  sprintf('%04o', umask),
  sprintf('%d', umask),
  ''
].join('\n')

test('setup', function (t) {
  rimraf.sync(pkg)
  mkdirp.sync(pkg)
  fs.writeFileSync(pkg + '/package.json', pj)
  t.end()
})

test('umask script', function (t) {
  common.npm(['run', 'umask'], {
    cwd: pkg,
    env: {
      PATH: process.env.PATH,
      Path: process.env.Path,
      'npm_config_loglevel': 'warn'
    }
  }, function (er, code, sout, serr) {
    t.equal(sout, expected)
    t.equal(serr, '')
    t.end()
  })
})

test('clean', function (t) {
  rimraf.sync(pkg)
  t.end()
})

npm_3.5.2.orig/test/tap/uninstall-in-reverse.js

'use strict'
var test = require('tap').test
var requireInject = require('require-inject')
var log = require('npmlog')

/*
The remove actions need to happen in the opposite of their normally defined
order. That is, they need to go shallow -> deep.
*/

var removed = []
var npm = requireInject.installGlobally('../../lib/npm.js', {
  '../../lib/install/action/remove.js': function (top, buildpath, pkg, log, next) {
    removed.push(pkg.package.name)
    next()
  }
})

test('setup', function (t) {
  npm.load(function () {
    t.pass('npm loaded')
    t.end()
  })
})

test('abc', function (t) {
  var Installer = require('../../lib/install.js').Installer
  var inst = new Installer(__dirname, false, [])
  inst.progress = {executeActions: log}
  inst.todo = [
    ['remove', {package: {name: 'first'}}],
    ['remove', {package: {name: 'second'}}]
  ]
  inst.executeActions(function () {
    t.isDeeply(removed, ['second', 'first'])
    t.end()
  })
})
npm_3.5.2.orig/test/tap/uninstall-package.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var pkg = path.join(__dirname, 'uninstall-package')

var EXEC_OPTS = { cwd: pkg }

var json = {
  name: 'uninstall-package',
  version: '0.0.0',
  dependencies: {
    underscore: '~1.3.1',
    request: '~0.9.0'
  }
}

test('setup', function (t) {
  cleanup()
  mkdirp.sync(pkg)
  process.chdir(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
  t.end()
})

test('returns a list of removed items', function (t) {
  mr({ port: common.port }, function (er, s) {
    common.npm(
      [
        '--registry', common.registry,
        '--loglevel', 'silent',
        'install', '.'
      ],
      EXEC_OPTS,
      function (err, code, stdout, stderr) {
        t.ifError(err, 'install ran without issue')
        t.notOk(code, 'install ran without raising error code')
        common.npm(
          [
            '--registry', common.registry,
            '--loglevel', 'silent',
            'uninstall', 'underscore', 'request', 'lala'
          ],
          EXEC_OPTS,
          function (err, code, stdout, stderr) {
            t.ifError(err, 'uninstall ran without issue')
            t.notOk(code, 'uninstall ran without raising error code')
            t.has(stdout, /- underscore@1.3.3/, 'underscore uninstalled')
            t.has(stdout, /- request@0.9.5/, 'request uninstalled')
            s.close()
            t.end()
          }
        )
      }
    )
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

npm_3.5.2.orig/test/tap/unit-child-path.js

'use strict'
var test = require('tap').test
var childPath = require('../../lib/utils/child-path.js')

test('childPath', function (t) {
  t.is(childPath('/path/to', {name: 'abc'}), '/path/to/node_modules/abc', 'basic use')
  t.is(childPath('/path/to', {package: {name: '@zed/abc'}}), '/path/to/node_modules/@zed/abc', 'scoped use')
  t.end()
})
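// ---------------------------------------------------------------------------
// Illustrative aside (not part of the npm source): lib/utils/child-path.js,
// exercised above, computes where a dependency lives under its parent. A
// simplified re-implementation of the same idea -- not the real helper,
// which takes a tree node rather than a bare name:
var path = require('path')

function childPathSketch (parentPath, name) {
  // scoped names like '@zed/abc' pass straight through path.join
  return path.join(parentPath, 'node_modules', name)
}

console.log(childPathSketch('/path/to', 'abc'))      // /path/to/node_modules/abc
console.log(childPathSketch('/path/to', '@zed/abc')) // /path/to/node_modules/@zed/abc
// ---------------------------------------------------------------------------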
npm_3.5.2.orig/test/tap/unit-deps-removeObsoleteDep.js

'use strict'
var test = require('tap').test
var requireInject = require('require-inject')

// we're just mocking to avoid having to call `npm.load`
var deps = requireInject('../../lib/install/deps.js', {
  '../../lib/npm.js': {
    config: {
      get: function () { return 'mock' }
    }
  }
})

var removeObsoleteDep = deps._removeObsoleteDep

test('removeObsoleteDep', function (t) {
  var child1 = {requiredBy: []}
  var test1 = {
    removed: true,
    requires: [ child1 ]
  }
  removeObsoleteDep(test1)
  t.is(child1.removed, undefined, 'no recursion on deps flagged as removed already')

  var child2 = {requiredBy: []}
  var test2 = {
    requires: [ child2 ]
  }
  child2.requiredBy.push(test2)
  removeObsoleteDep(test2)
  t.is(child2.removed, true, 'required by no other modules, removing')

  var child3 = {requiredBy: ['NOTEMPTY']}
  var test3 = {
    requires: [ child3 ]
  }
  child3.requiredBy.push(test3)
  removeObsoleteDep(test3)
  t.is(child3.removed, undefined, 'required by other modules, keeping')
  t.done()
})

npm_3.5.2.orig/test/tap/unit-deps-replaceModule.js

'use strict'
var test = require('tap').test
var npm = require('../../lib/npm')

test('setup', function (t) {
  npm.load({}, t.done)
})

test('replaceModule', function (t) {
  var replaceModule = require('../../lib/install/deps')._replaceModule
  var mods = []
  for (var ii = 0; ii < 10; ++ii) {
    mods.push({package: {name: ii}})
  }

  var test = {}
  test.A = mods.slice(0, 4)
  replaceModule(test, 'A', mods[2])
  t.isDeeply(test.A, mods.slice(0, 4), 'replacing an existing module leaves the order alone')
  replaceModule(test, 'A', mods[7])
  t.isDeeply(test.A, mods.slice(0, 4).concat(mods[7]), 'replacing a new module appends')

  test.B = mods.slice(0, 4)
  var replacement = {package: {name: 1}, isReplacement: true}
  replaceModule(test, 'B', replacement)
  t.isDeeply(test.B, [mods[0], replacement, mods[2], mods[3]], 'replacing existing module swaps out for the new version')

  replaceModule(test, 'C', mods[7])
  t.isDeeply(test.C, [mods[7]], 'replacing when the key does not exist yet, causes its creation')
  t.end()
})

test('replaceModuleName', function (t) {
  var replaceModuleName = require('../../lib/install/deps')._replaceModuleName
  var mods = []
  for (var ii = 0; ii < 10; ++ii) {
    mods.push('pkg' + ii)
  }

  var test = {}
  test.A = mods.slice(0, 4)
  replaceModuleName(test, 'A', mods[2])
  t.isDeeply(test.A, mods.slice(0, 4), 'replacing an existing module leaves the order alone')
  replaceModuleName(test, 'A', mods[7])
  t.isDeeply(test.A, mods.slice(0, 4).concat(mods[7]), 'replacing a new module appends')

  replaceModuleName(test, 'C', mods[7])
  t.isDeeply(test.C, [mods[7]], 'replacing when the key does not exist yet, causes its creation')
  t.end()
})
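// ---------------------------------------------------------------------------
// Illustrative aside (not part of the npm source): the unit-* tests in this
// directory lean on require-inject to swap a module's dependencies for mocks
// at require time, so internals can be tested without npm.load or a real
// registry. A sketch of the idea with hypothetical paths -- './thing.js' and
// './dep.js' do not exist in this tree:
var requireInject = require('require-inject')

// Returns './thing.js' with its require('./dep.js') replaced by the mock.
var thing = requireInject('./thing.js', {
  './dep.js': { get: function () { return 'mock' } }
})

// requireInject.installGlobally (used by uninstall-in-reverse.js and
// update-examples.js above) additionally leaves the mocks in the require
// cache, so later require() calls elsewhere see them too.
// ---------------------------------------------------------------------------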
npm_3.5.2.orig/test/tap/unit-gentlyrm.js

'use strict'
var test = require('tap').test
var requireInject = require('require-inject')

function error (code) {
  var er = new Error()
  er.code = code
  return er
}

function mockWith (fixture) {
  return {
    '../../lib/npm.js': {},
    'graceful-fs': {
      lstat: function (path, cb) {
        var entry = fixture[path]
        if (!entry) return cb(error('ENOENT'))
        cb(null, {
          isDirectory: function () { return entry.type === 'directory' },
          isSymbolicLink: function () { return entry.type === 'symlink' },
          isFile: function () { return entry.type === 'file' || entry.type === 'cmdshim' || entry.type === 'error' }
        })
      },
      readlink: function (path, cb) {
        var entry = fixture[path]
        if (!entry) return cb(error('ENOENT'))
        if (entry.type !== 'symlink') return cb(error('EINVAL'))
        cb(null, entry.dest)
      }
    },
    'read-cmd-shim': function (path, cb) {
      var entry = fixture[path]
      if (!entry) return cb(error('ENOENT'))
      if (entry.type === 'directory') return cb(error('EISDIR'))
      if (entry.type === 'error') return cb(error(entry.code))
      if (entry.type !== 'cmdshim') return cb(error('ENOTASHIM'))
      cb(null, entry.dest)
    }
  }
}

test('readLinkOrShim', function (t) {
  t.plan(10)
  var mocks = mockWith({
    '/path/to/directory': { type: 'directory' },
    '/path/to/link': { type: 'symlink', dest: '../to/file' },
    '/path/to/file': { type: 'file' },
    '/path/to/cmdshim': { type: 'cmdshim', dest: '../to/file' },
    '/path/to/invalid': { type: 'error', code: 'EINVAL' }
  })
  var gentlyRm = requireInject('../../lib/utils/gently-rm.js', mocks)
  var readLinkOrShim = gentlyRm._readLinkOrShim

  readLinkOrShim('/path/to/nowhere', function (er, path) {
    t.is(er && er.code, 'ENOENT', 'missing files are errors')
  })
  readLinkOrShim('/path/to/invalid', function (er, path) {
    t.is(er && er.code, 'EINVAL', 'other errors pass through too')
  })
  readLinkOrShim('/path/to/directory', function (er, path) {
    t.ifError(er, "reading dirs isn't an error")
    t.is(path, null, 'reading non links/cmdshims gives us null')
  })
  readLinkOrShim('/path/to/file', function (er, path) {
    t.ifError(er, "reading non-cmdshim files isn't an error")
    t.is(path, null, 'reading non links/cmdshims gives us null')
  })
  readLinkOrShim('/path/to/link', function (er, path) {
    t.ifError(er, "reading links isn't an error")
    t.is(path, '../to/file', 'reading links works')
  })
  readLinkOrShim('/path/to/cmdshim', function (er, path) {
    t.ifError(er, "reading cmdshims isn't an error")
    t.is(path, '../to/file', 'reading cmdshims works')
  })
  t.done()
})

test('resolveSymlink', function (t) {
  t.plan(9)
  var mocks = mockWith({
    '/path/to/directory': { type: 'directory' },
    '/path/to/link': { type: 'symlink', dest: '../to/file' },
    '/path/to/file': { type: 'file' },
    '/path/to/cmdshim': { type: 'cmdshim', dest: '../to/file' }
  })
  var gentlyRm = requireInject('../../lib/utils/gently-rm.js', mocks)
  var resolveSymlink = gentlyRm._resolveSymlink

  resolveSymlink('/path/to/nowhere', function (er, path) {
    t.is(er && er.code, 'ENOENT', 'missing files are errors')
  })
  resolveSymlink('/path/to/directory', function (er, path) {
    t.ifError(er, "reading dirs isn't an error")
    t.is(path, '/path/to/directory', 'reading non links/cmdshims gives us path we passed in')
  })
  resolveSymlink('/path/to/file', function (er, path) {
    t.ifError(er, "reading non-cmdshim files isn't an error")
    t.is(path, '/path/to/file', 'reading non links/cmdshims gives us the path we passed in')
  })
  resolveSymlink('/path/to/link', function (er, path) {
    t.ifError(er, "reading links isn't an error")
    t.is(path, '/path/to/file', 'reading links works')
  })
  resolveSymlink('/path/to/cmdshim', function (er, path) {
    t.ifError(er, "reading cmdshims isn't an error")
    t.is(path, '/path/to/file', 'reading cmdshims works')
  })
  t.done()
})
test('readAllLinks', function (t) {
  t.plan(16)
  var mocks = mockWith({
    '/path/to/directory': { type: 'directory' },
    '/path/to/link': { type: 'symlink', dest: '../to/file' },
    '/path/to/file': { type: 'file' },
    '/path/to/cmdshim': { type: 'cmdshim', dest: '../to/file' },
    '/path/to/linktolink': { type: 'symlink', dest: 'link' },
    '/path/to/linktolink^2': { type: 'symlink', dest: 'linktolink' },
    '/path/to/linktocmdshim': { type: 'symlink', dest: 'cmdshim' },
    '/path/to/linktobad': { type: 'symlink', dest: '/does/not/exist' }
  })
  var gentlyRm = requireInject('../../lib/utils/gently-rm.js', mocks)
  var readAllLinks = gentlyRm._readAllLinks

  readAllLinks('/path/to/nowhere', function (er, path) {
    t.is(er && er.code, 'ENOENT', 'missing files are errors')
  })
  readAllLinks('/path/to/directory', function (er, path) {
    t.ifError(er, "reading dirs isn't an error")
    t.isDeeply(path, ['/path/to/directory'], 'reading non links/cmdshims gives us path we passed in')
  })
  readAllLinks('/path/to/file', function (er, path) {
    t.ifError(er, "reading non-cmdshim files isn't an error")
    t.isDeeply(path, ['/path/to/file'], 'reading non links/cmdshims gives us the path we passed in')
  })
  readAllLinks('/path/to/linktobad', function (er, path) {
    t.is(er && er.code, 'ENOENT', 'links to missing files are errors')
  })
  readAllLinks('/path/to/link', function (er, path) {
    t.ifError(er, "reading links isn't an error")
    t.isDeeply(path, ['/path/to/link', '/path/to/file'], 'reading links works')
  })
  readAllLinks('/path/to/cmdshim', function (er, path) {
    t.ifError(er, "reading cmdshims isn't an error")
    t.isDeeply(path, ['/path/to/cmdshim', '/path/to/file'], 'reading cmdshims works')
  })
  readAllLinks('/path/to/linktolink', function (er, path) {
    t.ifError(er, "reading link to link isn't an error")
    t.isDeeply(path, ['/path/to/linktolink', '/path/to/link', '/path/to/file'], 'reading link to link works')
  })
  readAllLinks('/path/to/linktolink^2', function (er, path) {
    t.ifError(er, "reading link to link to link isn't an error")
    t.isDeeply(path, ['/path/to/linktolink^2', '/path/to/linktolink', '/path/to/link', '/path/to/file'], 'reading link to link to link works')
  })
  readAllLinks('/path/to/linktocmdshim', function (er, path) {
    t.ifError(er, "reading link to cmdshim isn't an error")
    t.isDeeply(path, ['/path/to/linktocmdshim', '/path/to/cmdshim', '/path/to/file'], 'reading link to cmdshim works')
  })
  t.done()
})
test('areAnyInsideAny', function (t) {
  var gentlyRm = requireInject('../../lib/utils/gently-rm.js', mockWith({}))
  var areAnyInsideAny = gentlyRm._areAnyInsideAny

  var noneOneToOne = areAnyInsideAny(['/abc'], ['/xyz'])
  t.is(noneOneToOne, false, 'none inside: one to one')
  var noneOneToMany = areAnyInsideAny(['/abc'], ['/rst', '/uvw', '/xyz'])
  t.is(noneOneToMany, false, 'none inside: one to many')
  var noneManyToOne = areAnyInsideAny(['/abc', '/def', '/ghi'], ['/xyz'])
  t.is(noneManyToOne, false, 'none inside: many to one')
  var noneManyToMany = areAnyInsideAny(['/abc', '/def', '/ghi'], ['/rst', '/uvw', '/xyz'])
  t.is(noneManyToMany, false, 'none inside: many to many')

  var oneToOne = areAnyInsideAny(['/one/toOne'], ['/one'])
  t.isDeeply(oneToOne, {target: '/one/toOne', path: '/one'}, 'first: one to one')
  var firstOneToMany = areAnyInsideAny(['/abc/def'], ['/abc', '/def', '/ghi'])
  t.isDeeply(firstOneToMany, {target: '/abc/def', path: '/abc'}, 'first: one to many')
  var secondOneToMany = areAnyInsideAny(['/def/ghi'], ['/abc', '/def', '/ghi'])
  t.isDeeply(secondOneToMany, {target: '/def/ghi', path: '/def'}, 'second: one to many')
  var lastOneToMany = areAnyInsideAny(['/ghi/jkl'], ['/abc', '/def', '/ghi'])
  t.isDeeply(lastOneToMany, {target: '/ghi/jkl', path: '/ghi'}, 'last: one to many')
  var firstManyToOne = areAnyInsideAny(['/abc/def', '/uvw/def', '/xyz/def'], ['/abc'])
  t.isDeeply(firstManyToOne, {target: '/abc/def', path: '/abc'}, 'first: many to one')
  var secondManyToOne = areAnyInsideAny(['/abc/def', '/uvw/def', '/xyz/def'], ['/uvw'])
  t.isDeeply(secondManyToOne, {target: '/uvw/def', path: '/uvw'}, 'second: many to one')
  var lastManyToOne = areAnyInsideAny(['/abc/def', '/uvw/def', '/xyz/def'], ['/xyz'])
  t.isDeeply(lastManyToOne, {target: '/xyz/def', path: '/xyz'}, 'last: many to one')
  var firstToFirst = areAnyInsideAny(['/abc/def', '/uvw/def', '/xyz/def'], ['/abc', '/uvw', '/xyz'])
  t.isDeeply(firstToFirst, {target: '/abc/def', path: '/abc'}, 'first to first: many to many')
  var firstToSecond = areAnyInsideAny(['/abc/def', '/uvw/def', '/xyz/def'], ['/nope', '/abc', '/xyz'])
  t.isDeeply(firstToSecond, {target: '/abc/def', path: '/abc'}, 'first to second: many to many')
  var firstToLast = areAnyInsideAny(['/abc/def', '/uvw/def', '/xyz/def'], ['/nope', '/nooo', '/abc'])
  t.isDeeply(firstToLast, {target: '/abc/def', path: '/abc'}, 'first to last: many to many')
  var secondToFirst = areAnyInsideAny(['/!!!', '/abc/def', '/xyz/def'], ['/abc', '/uvw', '/xyz'])
  t.isDeeply(secondToFirst, {target: '/abc/def', path: '/abc'}, 'second to first: many to many')
  var secondToSecond = areAnyInsideAny(['/!!!', '/abc/def', '/xyz/def'], ['/nope', '/abc', '/xyz'])
  t.isDeeply(secondToSecond, {target: '/abc/def', path: '/abc'}, 'second to second: many to many')
  var secondToLast = areAnyInsideAny(['/!!!', '/abc/def', '/uvw/def'], ['/nope', '/nooo', '/abc'])
  t.isDeeply(secondToLast, {target: '/abc/def', path: '/abc'}, 'second to last: many to many')
  var lastToFirst = areAnyInsideAny(['/!!!', '/???', '/abc/def'], ['/abc', '/uvw', '/xyz'])
  t.isDeeply(lastToFirst, {target: '/abc/def', path: '/abc'}, 'last to first: many to many')
  var lastToSecond = areAnyInsideAny(['/!!!', '/???', '/abc/def'], ['/nope', '/abc', '/xyz'])
  t.isDeeply(lastToSecond, {target: '/abc/def', path: '/abc'}, 'last to second: many to many')
  var lastToLast = areAnyInsideAny(['/!!!', '/???', '/abc/def'], ['/nope', '/nooo', '/abc'])
  t.isDeeply(lastToLast, {target: '/abc/def', path: '/abc'}, 'last to last: many to many')
  t.done()
})

test('isEverInside', function (t) {
  t.plan(13)
  var mocks = mockWith({
    '/path/other/link': { type: 'symlink', dest: '../to/file' },
    '/path/to/file': { type: 'file' },
    '/path/to': { type: 'directory' },
    '/linkpath': { type: 'symlink', dest: '../path/to' },
    '/path/to/invalid': { type: 'error', code: 'EINVAL' }
  })
  var gentlyRm = requireInject('../../lib/utils/gently-rm.js', mocks)
  var isEverInside = gentlyRm._isEverInside

  isEverInside('/path/to/invalid', ['/path/to/invalid'], function (er, inside) {
    t.is(er && er.code, 'EINVAL', 'errors bubble out')
  })
  isEverInside('/path/to/file', ['/ten'], function (er, inside) {
    t.ifError(er)
    t.is(inside, false, 'not inside')
  })
  isEverInside('/path/to/nowhere', ['/ten'], function (er, inside) {
    t.ifError(er)
    t.is(inside, false, 'missing target')
  })
  isEverInside('/path/to/file', ['/path/to'], function (er, inside) {
    t.ifError(er)
    t.isDeeply(inside, {target: '/path/to/file', path: '/path/to'}, 'plain file in plain path')
  })
  isEverInside('/path/other/link', ['/path/to'], function (er, inside) {
    t.ifError(er)
    t.isDeeply(inside, {target: '/path/to/file', path: '/path/to'}, 'link in plain path')
  })
  isEverInside('/path/to/file', ['/linkpath'], function (er, inside) {
    t.ifError(er)
    t.isDeeply(inside, {target: '/path/to/file', path: '/path/to'}, 'plain file in link path')
  })
  isEverInside('/path/other/link', ['/linkpath'], function (er, inside) {
    t.ifError(er)
    t.isDeeply(inside, {target: '/path/to/file', path: '/path/to'}, 'link in link path')
  })
  t.done()
})

test('isSafeToRm', function (t) {
  var gentlyRm = requireInject('../../lib/utils/gently-rm.js', mockWith({}))
  var isSafeToRm = gentlyRm._isSafeToRm
  t.plan(12)

  function testIsSafeToRm (t, parent, target, shouldPath, shouldBase, msg) {
    isSafeToRm(parent, target, function (er, path, base) {
      t.ifError(er, msg + ' no error')
      t.is(path, shouldPath, msg + ' path')
      t.is(base, shouldBase, msg + ' base')
    })
  }
  function testNotIsSafeToRm (t, parent, target, msg) {
    isSafeToRm(parent, target, function (er) {
      t.is(er && er.code, 'EEXIST', msg + ' error')
    })
  }

  var unmanagedParent = {path: '/foo', managed: false}
  var managedParent = {path: '/foo', managed: true}
  var targetInParent = {
    path: '/foo/bar/baz',
    inParent: {
      target: '/foo/bar/baz',
      path: '/foo'
    }
  }
  var targetLinkInParent = {
    path: '/foo/bar/baz',
    inParent: {
      target: '/other/area/baz',
      path: '/other/area'
    }
  }
  var targetManagedLinkNotInParent = {
    path: '/foo/bar/baz',
    managed: true,
    inParent: false,
    symlink: '/foo/bar/bark'
  }
  var targetUnmanagedLink = {
    path: '/not/managed/baz',
    managed: false,
    inParent: false,
    symlink: '/not/managed/foo'
  }
  var targetUnmanagedFile = {
    path: '/not/managed/baz',
    managed: false,
    inParent: false,
    symlink: false
  }
  testNotIsSafeToRm(t, unmanagedParent, targetInParent, 'unmanaged parent')
  testIsSafeToRm(t, managedParent, targetInParent, '/foo/bar/baz', '/foo', 'path is in parent')
  testIsSafeToRm(t, managedParent, targetLinkInParent, '/foo/bar/baz', '/foo/bar', 'path links to parent')
  testIsSafeToRm(t, managedParent, targetManagedLinkNotInParent, undefined, undefined, 'managed but not owned by package')
  testNotIsSafeToRm(t, managedParent, targetUnmanagedLink, 'unmanaged link')
  testNotIsSafeToRm(t, managedParent, targetUnmanagedFile, 'unmanaged file')
})
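// ---------------------------------------------------------------------------
// Illustrative aside (not part of the npm source): the heart of
// areAnyInsideAny above is a path containment check -- a target is "inside"
// a path when that path is a proper ancestor of it. A standalone sketch of
// such a check; the real gently-rm may implement it differently:
var path = require('path')

function isInside (child, parent) {
  var rel = path.relative(parent, child)
  return rel !== '' && rel.split(path.sep)[0] !== '..' && !path.isAbsolute(rel)
}

console.log(isInside('/abc/def', '/abc')) // true: proper descendant
console.log(isInside('/abcdef', '/abc'))  // false: shared prefix is not containment
console.log(isInside('/abc', '/abc'))     // false: a path is not inside itself
// ---------------------------------------------------------------------------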
npm_3.5.2.orig/test/tap/unit-module-name.js

'use strict'
var test = require('tap').test
var moduleName = require('../../lib/utils/module-name.js')

test('pathToPackageName', function (t) {
  var pathToPackageName = moduleName.test.pathToPackageName
  t.is(pathToPackageName('/foo/bar/baz/bark'), 'bark', 'simple module name')
  t.is(pathToPackageName('/foo/bar/@baz/bark'), '@baz/bark', 'scoped module name')
  t.is(pathToPackageName('/foo'), 'foo', 'module at top')
  t.is(pathToPackageName('/@foo'), '@foo', 'invalid module at top')
  t.is(pathToPackageName('/'), '', 'root, empty result')
  t.is(pathToPackageName(''), '', 'empty, empty')
  t.is(pathToPackageName(undefined), '', 'undefined is empty')
  t.is(pathToPackageName(null), '', 'null is empty')
  t.done()
})

test('isNotEmpty', function (t) {
  var isNotEmpty = moduleName.test.isNotEmpty
  t.is(isNotEmpty('abc'), true, 'string is not empty')
  t.is(isNotEmpty(''), false, 'empty string is empty')
  t.is(isNotEmpty(null), false, 'null is empty')
  t.is(isNotEmpty(undefined), false, 'undefined is empty')
  t.is(isNotEmpty(0), true, 'zero is not empty')
  t.is(isNotEmpty(true), true, 'true is not empty')
  t.is(isNotEmpty([]), true, 'empty array is not empty')
  t.is(isNotEmpty({}), true, 'object is not empty')
  t.done()
})

test('moduleName', function (t) {
  t.is(moduleName({package: {name: 'foo'}}), 'foo', 'package named')
  t.is(moduleName({name: 'foo'}), 'foo', 'package named, no tree')
  t.is(moduleName({path: '/foo/bar'}), 'bar', 'path named')
  t.is(moduleName({}), '!invalid#1', 'no named')
  t.is(moduleName({path: '/'}), '!invalid#2', 'invalid named')
  var obj = {}
  t.is(moduleName(obj), moduleName(obj), 'once computed, an invalid module name will not change')
  t.done()
})

npm_3.5.2.orig/test/tap/unit-package-id.js

'use strict'
var test = require('tap').test
var packageId = require('../../lib/utils/package-id.js')

test('packageId', function (t) {
  t.is(packageId({package: {_id: 'abc@123'}}), 'abc@123', 'basic')
  t.is(packageId({_id: 'abc@123'}), 'abc@123', 'basic no tree')
  t.is(packageId({package: {name: 'abc', version: '123'}}), 'abc@123', 'computed')
  t.is(packageId({package: {_id: '@', name: 'abc', version: '123'}}), 'abc@123', 'computed, ignore invalid id')
  t.is(packageId({package: {name: 'abc'}}), 'abc', 'no version')
  t.is(packageId({package: {version: '123'}}), '!invalid#1@123', 'version, no name')
  t.is(packageId({package: {version: '123'}, path: '/path/to/abc'}), 'abc@123', 'version path-name')
  t.is(packageId({package: {version: '123'}, path: '/path/@to/abc'}), '@to/abc@123', 'version scoped-path-name')
  t.is(packageId({path: '/path/to/abc'}), 'abc', 'path name, no version')
  t.is(packageId({}), '!invalid#2', 'nothing')
  t.done()
})
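// ---------------------------------------------------------------------------
// Illustrative aside (not part of the npm source): unit-package-id.js above
// pins down a fallback chain -- use a well-formed _id, else name@version,
// else derive the name from the path. A simplified re-implementation of that
// chain (no scoped-path handling), not lib/utils/package-id.js itself:
var path = require('path')

function pkgIdSketch (pkg, where) {
  if (pkg._id && pkg._id !== '@') return pkg._id
  var name = pkg.name || (where ? path.basename(where) : '')
  if (!name) return '!invalid'
  return pkg.version ? name + '@' + pkg.version : name
}

console.log(pkgIdSketch({ _id: 'abc@123' }))                 // abc@123
console.log(pkgIdSketch({ name: 'abc', version: '123' }))    // abc@123
console.log(pkgIdSketch({ version: '123' }, '/path/to/abc')) // abc@123
// ---------------------------------------------------------------------------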
npm_3.5.2.orig/test/tap/unpack-foreign-tarball.js

var fs = require('graceful-fs')
var path = require('path')

var test = require('tap').test
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')

var common = require('../common-tap.js')

var fixtures = path.resolve(__dirname, '..', 'fixtures')
var pkg = path.resolve(__dirname, 'unpack-foreign-tarball')
var nm = path.resolve(pkg, 'node_modules')
var target = path.resolve(nm, 'npm-test-gitignore')
var cache = path.resolve(pkg, 'cache')
var tmp = path.resolve(pkg, 'tmp')

var EXEC_OPTS = {
  env: {
    'npm_config_cache': cache,
    'npm_config_tmp': tmp
  },
  cwd: pkg,
  stdio: [ 'pipe', 'pipe', 2 ]
}

function verify (t, files, err, code) {
  if (code) {
    t.fail('exited with failure: ' + code)
    return t.end()
  }
  var actual = fs.readdirSync(target).sort()
  var expect = files.concat(['.npmignore', 'package.json']).sort()
  t.same(actual, expect)
  t.end()
}

test('setup', function (t) {
  setup()
  t.comment('test for https://github.com/npm/npm/issues/5658')
  t.end()
})

test('npmignore only', function (t) {
  var file = path.resolve(fixtures, 'npmignore.tgz')
  common.npm(['install', file], EXEC_OPTS, verify.bind(null, t, ['foo']))
})

test('gitignore only', function (t) {
  setup()
  var file = path.resolve(fixtures, 'gitignore.tgz')
  common.npm(['install', file], EXEC_OPTS, verify.bind(null, t, ['foo']))
})

test('gitignore and npmignore', function (t) {
  setup()
  var file = path.resolve(fixtures, 'gitignore-and-npmignore.tgz')
  common.npm(['install', file], EXEC_OPTS, verify.bind(null, t, ['foo', 'bar']))
})

test('gitignore and npmignore, not gzipped 1/2', function (t) {
  setup()
  var file = path.resolve(fixtures, 'gitignore-and-npmignore.tar')
  common.npm(['install', file], EXEC_OPTS, verify.bind(null, t, ['foo', 'bar']))
})

test('gitignore and npmignore, not gzipped 2/2', function (t) {
  setup()
  var file = path.resolve(fixtures, 'gitignore-and-npmignore-2.tar')
  common.npm(['install', file], EXEC_OPTS, verify.bind(null, t, ['foo', 'bar']))
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function setup () {
  cleanup()
  mkdirp.sync(nm)
  mkdirp.sync(tmp)
}

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}
npm_3.5.2.orig/test/tap/unpublish-config.js

var fs = require('graceful-fs')
var http = require('http')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var pkg = path.join(__dirname, 'npm-test-unpublish-config')
var fixturePath = path.join(pkg, 'fixture_npmrc')

var common = require('../common-tap.js')

var json = {
  name: 'npm-test-unpublish-config',
  version: '1.2.3',
  publishConfig: { registry: common.registry }
}

test('setup', function (t) {
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json),
    'utf8'
  )
  fs.writeFileSync(
    fixturePath,
    '//localhost:1337/:_authToken = beeeeeeeeeeeeef\n' +
    'registry = http://lvh.me:4321/registry/path\n'
  )
  t.end()
})

test('cursory test of unpublishing with config', function (t) {
  var child
  t.plan(4)
  http.createServer(function (req, res) {
    t.pass('got request on the fakey fake registry')
    this.close()
    res.statusCode = 500
    res.end(JSON.stringify({ error: 'shh no tears, only dreams now' }))
    child.kill('SIGHUP')
  }).listen(common.port, function () {
    t.pass('server is listening')
    child = common.npm(
      [
        '--userconfig', fixturePath,
        '--loglevel', 'silent',
        '--force', 'unpublish'
      ],
      {
        cwd: pkg,
        stdio: 'inherit',
        env: {
          'npm_config_cache_lock_stale': 1000,
          'npm_config_cache_lock_wait': 1000,
          HOME: process.env.HOME,
          Path: process.env.PATH,
          PATH: process.env.PATH,
          USERPROFILE: osenv.home()
        }
      },
      function (err, code) {
        t.ifError(err, 'unpublish command finished successfully')
        t.notOk(code, 'npm unpublish exited with code 0')
      }
    )
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})
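// ---------------------------------------------------------------------------
// Illustrative aside (not part of the npm source): unpublish-config.js fakes
// the registry with a bare http server instead of npm-registry-mock, which
// is enough when a test only needs to observe that a request arrived. The
// one-shot pattern, sketched; port 1337 stands in for common.port:
var http = require('http')

var fake = http.createServer(function (req, res) {
  console.log('got', req.method, req.url)
  res.statusCode = 500
  res.end(JSON.stringify({ error: 'nope' }))
  fake.close() // one request is all we wanted
})
fake.listen(1337)
// ---------------------------------------------------------------------------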
npm_3.5.2.orig/test/tap/update-examples.js

var common = require('../common-tap.js')
var test = require('tap').test

var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var path = require('path')
var mr = require('npm-registry-mock')
var osenv = require('osenv')

var requireInject = require('require-inject')

var PKG_DIR = path.resolve(__dirname, 'update-examples')
var CACHE_DIR = path.resolve(PKG_DIR, 'cache')

// ** constant templates for mocks **
var DEFAULT_PKG = {
  'name': 'update-examples',
  'version': '1.2.3',
  'dependencies': {
    'dep1': '*'
  }
}

var DEP_PKG = {
  name: 'dep1',
  version: '1.1.1',
  _from: '^1.1.1'
}

var INSTALLED = {
  path: '/mock/root',
  realpath: '/mock/root',
  isLink: false,
  package: DEFAULT_PKG,
  children: [ {
    path: '/mock/root/node_modules/dep1',
    realpath: '/mock/root/node_modules/dep1',
    isLink: false,
    package: DEP_PKG,
    children: []
  } ]
}

var DEP1_REGISTRY = {
  name: 'dep1',
  'dist-tags': { latest: '1.2.2' },
  versions: {
    '1.2.2': { version: '1.2.2' },
    '1.2.1': { version: '1.2.1' },
    '1.2.0': { version: '1.2.0' },
    '1.1.2': { version: '1.1.2' },
    '1.1.1': { version: '1.1.1' },
    '1.0.0': { version: '1.0.0' },
    '0.4.1': { version: '0.4.1' },
    '0.4.0': { version: '0.4.0' },
    '0.2.0': { version: '0.2.0' }
  }
}

var registryMocks = {
  'get': {
    '/dep1': [200, DEP1_REGISTRY]
  }
}

// ** dynamic mocks, cloned from templates and modified **
var mockServer
var mockDepJson = clone(DEP_PKG)
var mockInstalled = clone(INSTALLED)
var mockParentJson = clone(DEFAULT_PKG)

// target
var installAskedFor

function clone (a) { return extend({}, a) }

function extend (a, b) {
  for (var key in b) { a[key] = b[key] }
  return a
}

function resetPackage (options) {
  rimraf.sync(CACHE_DIR)
  mkdirp.sync(CACHE_DIR)

  installAskedFor = undefined

  mockParentJson = clone(DEFAULT_PKG)
  mockInstalled = clone(INSTALLED)
  mockDepJson = clone(DEP_PKG)

  if (options.wanted) {
    mockParentJson.dependencies.dep1 = options.wanted
    mockInstalled.package.dependencies.dep1 = options.wanted
    mockDepJson._from = options.wanted
  }

  if (options.installed) {
    mockInstalled.package.dependencies.dep1 = options.installed
    mockInstalled.children[0].package.version = options.installed
    mockDepJson.version = options.installed
  }
}

function mockReadPackageTree (dir, cb) {
  cb(null, mockInstalled)
}

function mockReadJson (file, cb) {
  cb(null, file.match(/dep1/) ? mockDepJson : mockParentJson)
}

function mockCommand (npm, name, fn) {
  delete npm.commands[name]
  npm.commands[name] = fn
}

function mockInstaller (where, dryrun, what) {
  installAskedFor = what[0]
}
mockInstaller.prototype = {}
mockInstaller.prototype.run = function (cb) { cb() }

var npm = requireInject.installGlobally('../../lib/npm.js', {
  'read-package-tree': mockReadPackageTree,
  'read-package-json': mockReadJson,
  '../../lib/install': {
    Installer: mockInstaller
  }
})

test('setup', function (t) {
  t.plan(5)
  process.chdir(osenv.tmpdir())
  mkdirp.sync(PKG_DIR)
  process.chdir(PKG_DIR)
  t.pass('made ' + PKG_DIR)

  resetPackage({})

  mr({ port: common.port, mocks: registryMocks }, function (er, server) {
    t.pass('mock registry active')
    npm.load({ cache: CACHE_DIR, registry: common.registry, cwd: PKG_DIR }, function (err) {
      t.ifError(err, 'started server')
      mockServer = server
      t.pass('npm.load complete')
      mockCommand(npm, 'install', function mockInstall (where, what, cb) {
        installAskedFor = what
        cb(null)
      })
      t.pass('mocks configured')
      t.end()
    })
  })
})

test('update caret dependency to latest', function (t) {
  resetPackage({ wanted: '^1.1.1' })
  npm.config.set('loglevel', 'silly')
  npm.commands.update([], function (err) {
    t.ifError(err)
    t.equal(installAskedFor, 'dep1@1.2.2', 'should want to install dep@1.2.2')
    t.end()
  })
})

test('update tilde dependency to latest', function (t) {
  resetPackage({ wanted: '~1.1.1' })
  npm.commands.update([], function (err) {
    t.ifError(err)
    t.equal(installAskedFor, 'dep1@1.1.2', 'should want to install dep@1.1.2')
    t.end()
  })
})

test('hold tilde dependency at wanted (#6441)', function (t) {
  resetPackage({ wanted: '~1.1.2', installed: '1.1.2' })
  npm.commands.update([], function (err) {
    t.ifError(err)
    t.notOk(installAskedFor, 'should not want to install anything')
    t.end()
  })
})

test('update old caret dependency with no newer', function (t) {
  resetPackage({ wanted: '^0.2.0', installed: '^0.2.0' })
  npm.commands.update([], function (err) {
    t.ifError(err)
    t.equal(installAskedFor, 'dep1@0.2.0', 'should want to install dep@0.2.0')
    t.end()
  })
})

test('update old caret dependency with newer', function (t) {
  resetPackage({ wanted: '^0.4.0', installed: '^0.4.0' })
  npm.commands.update([], function (err) {
    t.ifError(err)
    t.equal(installAskedFor, 'dep1@0.4.1', 'should want to install dep@0.4.1')
    t.end()
  })
})

test('cleanup', function (t) {
  mockServer.close()
  process.chdir(osenv.tmpdir())
  rimraf.sync(PKG_DIR)
  t.end()
})
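// ---------------------------------------------------------------------------
// Illustrative aside (not part of the npm source): the update-examples tests
// above encode how `npm update` picks a target -- the highest version that
// satisfies the wanted range. The same reasoning with the semver package
// (already a real npm dependency), using the mocked registry's version list:
var semver = require('semver')

var versions = ['0.2.0', '0.4.0', '0.4.1', '1.0.0', '1.1.1', '1.1.2', '1.2.0', '1.2.1', '1.2.2']

console.log(semver.maxSatisfying(versions, '^1.1.1')) // 1.2.2 -- caret moves to latest in range
console.log(semver.maxSatisfying(versions, '~1.1.1')) // 1.1.2 -- tilde stops within the minor line
console.log(semver.maxSatisfying(versions, '^0.4.0')) // 0.4.1 -- pre-1.0 caret stays in 0.4.x
// ---------------------------------------------------------------------------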
npm_3.5.2.orig/test/tap/update-index.js

var common = require('../common-tap.js')
var test = require('tap').test
var npm = require('../../')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var path = require('path')
var mr = require('npm-registry-mock')

var updateIndex = require('../../lib/cache/update-index.js')

var PKG_DIR = path.resolve(__dirname, 'get-basic')
var CACHE_DIR = path.resolve(PKG_DIR, 'cache')

var server

var mocks = {
  basic: function (mock) {
    mock.get('/-/all').reply(200, allMock)
  },
  auth: function (mock) {
    var littleBobbyTablesAuth = new Buffer('bobby:tables').toString('base64')
    var auth = 'Basic ' + littleBobbyTablesAuth
    mock.get('/-/all', { authorization: auth }).reply(200, allMock)
    mock.get('/-/all').reply(401, {
      error: 'unauthorized',
      reason: 'You are not authorized to access this db.'
    })
  }
}

var allMock = {
  '_updated': 1411727900 + 25,
  'generator-frontcow': {
    'name': 'generator-frontcow',
    'description': 'f36b6a6123da50959741e2ce4d634f96ec668c56 This is a fake description to ensure we do not accidentally search the real npm registry or use some kind of cache',
    'dist-tags': {
      'latest': '0.1.19'
    },
    'maintainers': [
      {
        'name': 'bcabanes',
        'email': 'contact@benjamincabanes.com'
      }
    ],
    'homepage': 'https://github.com/bcabanes/generator-frontcow',
    'keywords': [
      'sass',
      'frontend',
      'yeoman-generator',
      'atomic',
      'design',
      'sass',
      'foundation',
      'foundation5',
      'atomic design',
      'bourbon',
      'polyfill',
      'font awesome'
    ],
    'repository': {
      'type': 'git',
      'url': 'https://github.com/bcabanes/generator-frontcow'
    },
    'author': {
      'name': 'ben',
      'email': 'contact@benjamincabanes.com',
      'url': 'https://github.com/bcabanes'
    },
    'bugs': {
      'url': 'https://github.com/bcabanes/generator-frontcow/issues'
    },
    'license': 'MIT',
    'readmeFilename': 'README.md',
    'time': {
      'modified': '2014-10-03T02:26:18.406Z'
    },
    'versions': {
      '0.1.19': 'latest'
    }
  },
  'marko': {
    'name': 'marko',
    'description': 'Marko is an extensible, streaming, asynchronous, high performance, HTML-based templating language that can be used in Node.js or in the browser.',
    'dist-tags': {
      'latest': '1.2.16'
    },
    'maintainers': [
      {
        'name': 'pnidem',
        'email': 'pnidem@gmail.com'
      },
      {
        'name': 'philidem',
        'email': 'phillip.idem@gmail.com'
      }
    ],
    'homepage': 'https://github.com/raptorjs/marko',
    'keywords': [
      'templating',
      'template',
      'async',
      'streaming'
    ],
    'repository': {
      'type': 'git',
      'url': 'https://github.com/raptorjs/marko.git'
    },
    'author': {
      'name': 'Patrick Steele-Idem',
      'email': 'pnidem@gmail.com'
    },
    'bugs': {
      'url': 'https://github.com/raptorjs/marko/issues'
    },
    'license': 'Apache License v2.0',
    'readmeFilename': 'README.md',
    'users': {
      'pnidem': true
    },
    'time': {
      'modified': '2014-10-03T02:27:31.775Z'
    },
    'versions': {
      '1.2.16': 'latest'
    }
  }
}

function setup (t, mock, extra) {
  mkdirp.sync(CACHE_DIR)
  mr({ port: common.port, plugin: mock }, function (er, s) {
    npm.load({ cache: CACHE_DIR, registry: common.registry }, function (err) {
      if (extra) {
        Object.keys(extra).forEach(function (k) {
          npm.config.set(k, extra[k], 'user')
        })
      }
      t.ifError(err, 'no error')
      server = s
      t.end()
    })
  })
}

function cleanup (t) {
  server.close(function () {
    rimraf.sync(PKG_DIR)
    t.end()
  })
}

test('setup basic', function (t) {
  setup(t, mocks.basic)
})

test('request basic', function (t) {
  updateIndex(0, function (er) {
    t.ifError(er, 'no error')
    t.end()
  })
})

test('cleanup basic', cleanup)

test('setup auth', function (t) {
  setup(t, mocks.auth)
})

test('request auth failure', function (t) {
  updateIndex(0, function (er) {
    t.equals(er.code, 'E401', 'gotta get that auth')
    t.ok(/^unauthorized/.test(er.message), 'unauthorized message')
    t.end()
  })
})

test('cleanup auth failure', cleanup)

test('setup auth', function (t) {
  // mimic as if alwaysAuth had been set
  setup(t, mocks.auth, {
    _auth: new Buffer('bobby:tables').toString('base64'),
    'always-auth': true
  })
})

test('request auth success', function (t) {
  updateIndex(0, function (er) {
    t.ifError(er, 'no error')
    t.end()
  })
})

test('cleanup auth', cleanup)
npm_3.5.2.orig/test/tap/update-path.js

'use strict'
var test = require('tap').test
var requireInject = require('require-inject')

var mockNpm = {
  config: {
    get: function (key) { return false }
  },
  commands: {
    outdated: function (args, silent, cb) {
      cb(null, [
        [{path: '/incorrect', parent: {path: '/correct'}}, 'abc', '1.0.0', '1.1.0', '1.1.0', '^1.1.0']
      ])
    }
  }
}

// What we're testing here is that updates use the parent module's path to
// install from.
test('update', function (t) {
  var update = requireInject('../../lib/update.js', {
    '../../lib/npm.js': mockNpm,
    '../../lib/install.js': {
      'Installer': function (where, dryrun, args) {
        t.is(where, '/correct', 'We should be installing to the parent of the modules being updated')
        this.run = function (cb) { cb() }
      }
    }
  })
  update(['abc'], function () {
    t.end()
  })
})
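// ---------------------------------------------------------------------------
// Illustrative aside (not part of the npm source): update-path.js feeds
// `update` one mocked `outdated` row. Judging from that mock, each row is
// [treeNode, name, current, wanted, latest, spec], and update installs into
// the *parent* of the outdated module. A sketch of that extraction, with
// made-up paths:
var rows = [
  [{path: '/app/node_modules/abc', parent: {path: '/app'}}, 'abc', '1.0.0', '1.1.0', '1.1.0', '^1.1.0']
]

rows.forEach(function (row) {
  var where = row[0].parent.path // the install target update should use
  console.log('install %s@%s into %s', row[1], row[3], where)
})
// ---------------------------------------------------------------------------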
npm_3.5.2.orig/test/tap/update-save.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap.js')

var server
var pkg = path.resolve(__dirname, 'update-save')
var cache = path.resolve(pkg, 'cache')

var EXEC_OPTS = {
  cwd: pkg,
  stdio: 'ignore',
  env: {
    npm_config_registry: common.registry,
    npm_config_loglevel: 'verbose',
    npm_config_save_prefix: '^'
  }
}

var json = {
  name: 'update-save-example',
  version: '1.2.3',
  dependencies: {
    mkdirp: '~0.3.0'
  },
  devDependencies: {
    underscore: '~1.3.1'
  }
}

function clone (a) { return extend({}, a) }

function extend (a, b) {
  for (var key in b) { a[key] = b[key] }
  return a
}

test('setup', function (t) {
  setup()
  mr({ port: common.port }, function (er, s) {
    t.ifError(er)
    server = s
    t.end()
  })
})

test('update regular dependencies only', function (t) {
  setup()
  common.npm(['update', '--save'], EXEC_OPTS, function (err, code) {
    t.ifError(err)
    t.notOk(code, 'npm update exited with code 0')

    var pkgdata = JSON.parse(fs.readFileSync(path.join(pkg, 'package.json'), 'utf8'))
    t.deepEqual(
      pkgdata.dependencies,
      { mkdirp: '^0.3.5' },
      'only dependencies updated'
    )
    t.deepEqual(
      pkgdata.devDependencies,
      json.devDependencies,
      'dev dependencies should be untouched'
    )
    t.deepEqual(
      pkgdata.optionalDependencies,
      json.optionalDependencies,
      'optional dependencies should be untouched'
    )
    t.end()
  })
})

test('update devDependencies only', function (t) {
  setup()
  common.npm(['update', '--save-dev'], EXEC_OPTS, function (err, code) {
    t.ifError(err)
    t.notOk(code, 'npm update exited with code 0')

    var pkgdata = JSON.parse(fs.readFileSync(path.join(pkg, 'package.json'), 'utf8'))
    t.deepEqual(
      pkgdata.dependencies,
      json.dependencies,
      'dependencies should be untouched'
    )
    t.deepEqual(
      pkgdata.devDependencies,
      { underscore: '^1.3.3' },
      'dev dependencies should be updated'
    )
    t.deepEqual(
      pkgdata.optionalDependencies,
      json.optionalDependencies,
      'optional dependencies should be untouched'
    )
    t.end()
  })
})

test('update optionalDependencies only', function (t) {
  setup({ optionalDependencies: { underscore: '~1.3.1' } })
  common.npm(['update', '--save-optional'], EXEC_OPTS, function (err, code) {
    t.ifError(err)
    t.notOk(code, 'npm update exited with code 0')

    var pkgdata = JSON.parse(fs.readFileSync(path.join(pkg, 'package.json'), 'utf8'))
    t.deepEqual(
      pkgdata.dependencies,
      json.dependencies,
      'dependencies should be untouched'
    )
    t.deepEqual(
      pkgdata.devDependencies,
      json.devDependencies,
      'dev dependencies should be untouched'
    )
    t.deepEqual(
      pkgdata.optionalDependencies,
      { underscore: '^1.3.3' },
      'optional dependencies should be updated'
    )
    t.end()
  })
})

test('optionalDependencies are merged into dependencies during --save', function (t) {
  var cloned = setup({ optionalDependencies: { underscore: '~1.3.1' } })
  common.npm(['update', '--save'], EXEC_OPTS, function (err, code) {
    t.ifError(err)
    t.notOk(code, 'npm update exited with code 0')

    var pkgdata = JSON.parse(fs.readFileSync(path.join(pkg, 'package.json'), 'utf8'))
    t.deepEqual(
      pkgdata.dependencies,
      { mkdirp: '^0.3.5' },
      'dependencies should not include optional dependencies'
    )
    t.deepEqual(
      pkgdata.devDependencies,
      cloned.devDependencies,
      'dev dependencies should be untouched'
    )
    t.deepEqual(
      pkgdata.optionalDependencies,
      cloned.optionalDependencies,
      'optional dependencies should be untouched'
    )
    t.end()
  })
})

test('semver prefix is replaced with configured save-prefix', function (t) {
  setup()
  common.npm(['update', '--save', '--save-prefix', '~'], EXEC_OPTS, function (err, code) {
    t.ifError(err)
    t.notOk(code, 'npm update exited with code 0')

    var pkgdata = JSON.parse(fs.readFileSync(path.join(pkg, 'package.json'), 'utf8'))
    t.deepEqual(
      pkgdata.dependencies,
      { mkdirp: '~0.3.5' },
      'dependencies should be updated'
    )
    t.deepEqual(
      pkgdata.devDependencies,
      json.devDependencies,
      'dev dependencies should be untouched'
    )
    t.deepEqual(
      pkgdata.optionalDependencies,
      json.optionalDependencies,
      'optional dependencies should be untouched'
    )
    t.end()
  })
})

test('cleanup', function (t) {
  server.close()
  cleanup()
  t.end()
})

function cleanup () {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

function setup (extendWith) {
  cleanup()
  mkdirp.sync(cache)
  process.chdir(pkg)

  var template = clone(json)
  extend(template, extendWith)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(template, null, 2)
  )
  return template
}
npm_3.5.2.orig/test/tap/url-dependencies.js

var fs = require('graceful-fs')
var path = require('path')

var mkdirp = require('mkdirp')
var mr = require('npm-registry-mock')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var common = require('../common-tap')

var server
var pkg = path.resolve(__dirname, 'url-dependencies')

var json = {
  author: 'Steve Mason',
  name: 'url-dependencies',
  version: '0.0.0',
  dependencies: {
    underscore: common.registry + '/underscore/-/underscore-1.3.1.tgz'
  }
}

var mockRoutes = {
  'get': {
    '/underscore/-/underscore-1.3.1.tgz': [200]
  }
}

test('setup', function (t) {
  mr({ port: common.port, mocks: mockRoutes }, function (er, s) {
    server = s
    t.end()
  })
})

test('url-dependencies: download first time', function (t) {
  setup()
  performInstall(t, function (output) {
    if (!tarballWasFetched(output)) {
      t.fail('Tarball was not fetched')
    } else {
      t.pass('Tarball was fetched')
    }
    t.end()
  })
})

test('url-dependencies: do not download subsequent times', function (t) {
  setup()
  performInstall(t, function () {
    performInstall(t, function (output) {
      if (tarballWasFetched(output)) {
        t.fail('Tarball was fetched second time around')
      } else {
        t.pass('Tarball was not fetched')
      }
      t.end()
    })
  })
})

test('cleanup', function (t) {
  server.close()
  cleanup()
  t.end()
})

function cleanup () {
  // windows fix for locked files
  process.chdir(osenv.tmpdir())
  rimraf.sync(path.resolve(pkg))
}

function setup () {
  cleanup()
  mkdirp.sync(pkg)
  fs.writeFileSync(
    path.join(pkg, 'package.json'),
    JSON.stringify(json, null, 2)
  )
}

function tarballWasFetched (output) {
  return output.indexOf(
    'http fetch GET ' + common.registry + '/underscore/-/underscore-1.3.1.tgz'
  ) > -1
}

function performInstall (t, cb) {
  var opts = {
    cwd: pkg,
    env: {
      npm_config_registry: common.registry,
      npm_config_cache_lock_stale: 1000,
      npm_config_cache_lock_wait: 1000,
      npm_config_loglevel: 'http',
      HOME: process.env.HOME,
      Path: process.env.PATH,
      PATH: process.env.PATH
    }
  }
  common.npm(['install'], opts, function (err, code, stdout, stderr) {
    t.ifError(err, 'install success')
    t.notOk(code, 'npm install exited with code 0')
    cb(stderr)
  })
}
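// ---------------------------------------------------------------------------
// Illustrative aside (not part of the npm source): url-dependencies.js
// detects a cache hit indirectly -- it runs the install twice at
// --loglevel http and greps stderr for the fetch line, as tarballWasFetched
// does above. The same check, generalized over the tarball URL:
function wasFetched (stderr, tarballUrl) {
  return stderr.indexOf('http fetch GET ' + tarballUrl) !== -1
}

console.log(wasFetched('http fetch GET http://example.com/u.tgz 200', 'http://example.com/u.tgz')) // true
console.log(wasFetched('', 'http://example.com/u.tgz'))                                            // false
// ---------------------------------------------------------------------------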
npm_3.5.2.orig/test/tap/verify-no-lifecycle-on-repo.js

'use strict'
var path = require('path')
var fs = require('graceful-fs')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var test = require('tap').test
var common = require('../common-tap.js')

var base = path.join(__dirname, path.basename(__filename, '.js'))

var baseJSON = {
  name: 'base',
  version: '1.0.0',
  repository: {
    type: 'git',
    url: 'http://example.com'
  },
  scripts: {
    prepublish: 'false'
  }
}

test('setup', function (t) {
  cleanup()
  setup()
  t.end()
})

test('repo', function (t) {
  common.npm(['repo', '--browser=echo'], {cwd: base}, function (er, code, stdout, stderr) {
    t.ifError(er, 'npm repo ran without issue')
    t.is(code, 0, 'exited with a non-error code')
    t.is(stderr, '', 'ran without errors')
    t.end()
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function saveJson (pkgPath, json) {
  mkdirp.sync(pkgPath)
  fs.writeFileSync(path.join(pkgPath, 'package.json'), JSON.stringify(json, null, 2))
}

function setup () {
  saveJson(base, baseJSON)
}

function cleanup () {
  rimraf.sync(base)
}
test('npm version <semver> --force with working directory not clean', function (t) {
  common.npm(
    [
      '--force',
      '--no-sign-git-tag',
      '--registry', common.registry,
      '--prefix', pkg,
      'version',
      'patch'
    ],
    { cwd: pkg, env: {PATH: process.env.PATH} },
    function (err, code, stdout, stderr) {
      t.ifError(err, 'npm version ran without issue')
      t.notOk(code, 'exited with a non-error code')
      var errorLines = stderr.trim().split('\n')
        .map(function (line) {
          return line.trim()
        })
        .filter(function (line) {
          return !line.indexOf('using --force')
        })
      t.notOk(errorLines.length, 'no error output')
      t.end()
    })
})

test('cleanup', function (t) {
  // windows fix for locked files
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})

function setup () {
  mkdirp.sync(pkg)
  mkdirp.sync(cache)
  process.chdir(pkg)
}
npm_3.5.2.orig/test/tap/version-lifecycle.js0000644000000000000000000001252712631326456017224 0ustar 00000000000000
var fs = require('graceful-fs')
var path = require('path')
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test
var common = require('../common-tap.js')
var npm = require('../../')

var pkg = path.resolve(__dirname, 'version-lifecycle')
var cache = path.resolve(pkg, 'cache')
var npmrc = path.resolve(pkg, './.npmrc')
var configContents = 'sign-git-tag=false\n'

test('npm version <semver> with failing preversion lifecycle script', function (t) {
  setup()
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify({
    author: 'Alex Wolfe',
    name: 'version-lifecycle',
    version: '0.0.0',
    description: 'Test for npm version if preversion script fails',
    scripts: {
      preversion: './fail.sh'
    }
  }), 'utf8')
  fs.writeFileSync(path.resolve(pkg, 'fail.sh'), 'exit 50', 'utf8')
  fs.chmodSync(path.resolve(pkg, 'fail.sh'), 448)
  npm.load({cache: cache, 'sign-git-tag': false, registry: common.registry}, function () {
    var version = require('../../lib/version')
    version(['patch'], function (err) {
      t.ok(err)
      t.ok(err.message.match(/Exit status 50/))
      t.end()
    })
  })
})

test('npm version <semver> with failing version lifecycle script', function (t) {
  setup()
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify({
    author: 'Alex Wolfe',
    name: 'version-lifecycle',
    version: '0.0.0',
    description: 'Test for npm version if version script fails',
    scripts: {
      version: './fail.sh'
    }
  }), 'utf8')
  fs.writeFileSync(path.resolve(pkg, 'fail.sh'), 'exit 50', 'utf8')
  fs.chmodSync(path.resolve(pkg, 'fail.sh'), 448)
  npm.load({cache: cache, 'sign-git-tag': false, registry: common.registry}, function () {
    var version = require('../../lib/version')
    version(['patch'], function (err) {
      t.ok(err)
      t.ok(err.message.match(/Exit status 50/))
      t.end()
    })
  })
})
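// The postversion hook runs after the bump and commit, so a failure there
// must still be reported as an error by the version command.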
test('npm version <semver> with failing postversion lifecycle script', function (t) {
  setup()
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify({
    author: 'Alex Wolfe',
    name: 'version-lifecycle',
    version: '0.0.0',
    description: 'Test for npm version if postversion script fails',
    scripts: {
      postversion: './fail.sh'
    }
  }), 'utf8')
  fs.writeFileSync(path.resolve(pkg, 'fail.sh'), 'exit 50', 'utf8')
  fs.chmodSync(path.resolve(pkg, 'fail.sh'), 448)
  npm.load({cache: cache, 'sign-git-tag': false, registry: common.registry}, function () {
    var version = require('../../lib/version')
    version(['patch'], function (err) {
      t.ok(err)
      t.ok(err.message.match(/Exit status 50/))
      t.end()
    })
  })
})

test('npm version <semver> execution order', function (t) {
  setup()
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify({
    author: 'Alex Wolfe',
    name: 'version-lifecycle',
    version: '0.0.0',
    description: 'Test for npm version lifecycle script ordering',
    scripts: {
      preversion: './preversion.sh',
      version: './version.sh',
      postversion: './postversion.sh'
    }
  }), 'utf8')
  makeScript('preversion')
  makeScript('version')
  makeScript('postversion')
  npm.load({cache: cache, 'sign-git-tag': false, registry: common.registry}, function () {
    common.makeGitRepo({path: pkg}, function (err, git) {
      t.ifError(err, 'git bootstrap ran without error')
      var version = require('../../lib/version')
      version(['patch'], function (err) {
        t.ifError(err, 'version command complete')
        t.equal('0.0.0', readPackage('preversion').version, 'preversion')
        t.deepEqual(readStatus('preversion', t), {
          'preversion-package.json': 'A'
        })
        t.equal('0.0.1', readPackage('version').version, 'version')
        t.deepEqual(readStatus('version', t), {
          'package.json': 'M',
          'preversion-package.json': 'A',
          'version-package.json': 'A'
        })
        t.equal('0.0.1', readPackage('postversion').version, 'postversion')
        t.deepEqual(readStatus('postversion', t), {
          'postversion-package.json': 'A'
        })
        t.end()
      })
    })
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})

function setup () {
  mkdirp.sync(pkg)
  mkdirp.sync(path.join(pkg, 'node_modules'))
  mkdirp.sync(cache)
  fs.writeFileSync(npmrc, configContents, 'ascii')
  process.chdir(pkg)
}

function makeScript (lifecycle) {
  var contents = [
    'cp package.json ' + lifecycle + '-package.json',
    'git add ' + lifecycle + '-package.json',
    'git status --porcelain > ' + lifecycle + '-git.txt'
  ].join('\n')
  var scriptPath = path.join(pkg, lifecycle + '.sh')
  fs.writeFileSync(scriptPath, contents, 'utf-8')
  fs.chmodSync(scriptPath, 448)
}

function readPackage (lifecycle) {
  return JSON.parse(fs.readFileSync(path.join(pkg, lifecycle + '-package.json'), 'utf-8'))
}
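// Parses the `git status --porcelain` snapshot that each hook script wrote:
// one "<status> <path>" pair per line, skipping untracked ("??") entries.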
function readStatus (lifecycle, t) {
  var status = {}
  fs.readFileSync(path.join(pkg, lifecycle + '-git.txt'), 'utf-8')
    .trim()
    .split('\n')
    .forEach(function (line) {
      line = line.trim()
      if (line && !line.match(/^\?\? /)) {
        var parts = line.split(/\s+/)
        t.equal(parts.length, 2, lifecycle + ' : git status has too many words : ' + line)
        status[parts[1].trim()] = parts[0].trim()
      }
    })
  return status
}
npm_3.5.2.orig/test/tap/version-message-config.js0000644000000000000000000000362612631326456020154 0ustar 00000000000000
var common = require('../common-tap.js')
var fs = require('fs')
var path = require('path')

var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test

var npm = require('../../lib/npm.js')

var pkg = path.resolve(__dirname, 'version-message-config')
var cache = path.resolve(pkg, 'cache')
var npmrc = path.resolve(pkg, '.npmrc')
var packagePath = path.resolve(pkg, 'package.json')

var json = { name: 'blah', version: '0.1.2' }

var configContents = 'sign-git-tag=false\nmessage=":bookmark: %s"\n'

test('npm version <semver> with message config', function (t) {
  setup()
  npm.load({ prefix: pkg, userconfig: npmrc }, function () {
    var git = require('../../lib/utils/git.js')
    common.makeGitRepo({ path: pkg }, function (er) {
      t.ifErr(er, 'git bootstrap ran without error')
      common.npm(
        [
          'version', 'patch',
          '--loglevel', 'silent'
          // package config is picked up from env
        ],
        { cwd: pkg, env: { PATH: process.env.PATH } },
        function (err, code, stdout, stderr) {
          t.ifError(err, 'npm version ran without issue')
          t.notOk(code, 'exited with a non-error code')
          t.notOk(stderr, 'no error output')
          git.whichAndExec(
            ['log'],
            { cwd: pkg, env: process.env },
            function (er, log, stderr) {
              t.ok(log.match(/:bookmark: 0\.1\.3/g), 'config was picked up by version')
              t.end()
            }
          )
        }
      )
    })
  })
})

test('cleanup', function (t) {
  cleanup()
  t.end()
})

function cleanup () {
  // windows fix for locked files
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
}

function setup () {
  cleanup()
  mkdirp.sync(cache)
  process.chdir(pkg)
  fs.writeFileSync(packagePath, JSON.stringify(json), 'utf8')
  fs.writeFileSync(npmrc, configContents, 'ascii')
}
npm_3.5.2.orig/test/tap/version-no-git.js0000644000000000000000000000270212631326456016454 0ustar 00000000000000
var common = require('../common-tap.js')
var test = require('tap').test
var npm = require('../../')
var osenv = require('osenv')
var path = require('path')
var fs = require('fs')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')
var requireInject = require('require-inject')
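// require-inject is used below to stub out `which`, making the version code
// behave as though no git binary were installed at all.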
var pkg = path.resolve(__dirname, 'version-no-git')
var cache = path.resolve(pkg, 'cache')
var gitDir = path.resolve(pkg, '.git')

test('npm version <semver> in a git repo without the git binary', function (t) {
  setup()
  npm.load({cache: cache, registry: common.registry}, function () {
    var version = requireInject('../../lib/version', {
      which: function (cmd, cb) {
        process.nextTick(function () {
          cb(new Error('ENOGIT!'))
        })
      }
    })

    version(['patch'], function (err) {
      if (!t.error(err)) return t.end()
      var p = path.resolve(pkg, 'package')
      var testPkg = require(p)
      t.equal('0.0.1', testPkg.version, '\'' + testPkg.version + '\' === \'0.0.1\'')
      t.end()
    })
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})

function setup () {
  mkdirp.sync(pkg)
  mkdirp.sync(cache)
  mkdirp.sync(gitDir)
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify({
    author: 'Terin Stock',
    name: 'version-no-git-test',
    version: '0.0.0',
    description: "Test for npm version if git binary doesn't exist"
  }), 'utf8')
  process.chdir(pkg)
}
npm_3.5.2.orig/test/tap/version-no-package.js0000644000000000000000000000213012631326456017267 0ustar 00000000000000
var common = require('../common-tap.js')
var test = require('tap').test
var osenv = require('osenv')
var path = require('path')
var mkdirp = require('mkdirp')
var rimraf = require('rimraf')

var pkg = path.resolve(__dirname, 'version-no-package')

test('setup', function (t) {
  setup()
  t.end()
})

test('npm version in a prefix with no package.json', function (t) {
  setup()
  common.npm(
    ['version', '--json', '--prefix', pkg],
    { cwd: pkg },
    function (er, code, stdout, stderr) {
      t.ifError(er, "npm version doesn't care that there's no package.json")
      t.notOk(code, 'npm version ran without barfing')
      t.ok(stdout, 'got version output')
      t.notOk(stderr, 'no error output')
      t.doesNotThrow(function () {
        var metadata = JSON.parse(stdout)
        t.equal(metadata.node, process.versions.node, 'node versions match')
      }, 'able to reconstitute version object from stdout')
      t.end()
    }
  )
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})

function setup () {
  mkdirp.sync(pkg)
  process.chdir(pkg)
}
npm_3.5.2.orig/test/tap/version-no-tags.js0000644000000000000000000000414712631326456016634 0ustar 00000000000000
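// With git-tag-version disabled, `npm version` must bump package.json
// without creating a git tag; tagExists() below verifies none was made.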
var common = require('../common-tap.js')
var test = require('tap').test
var npm = require('../../')
var osenv = require('osenv')
var path = require('path')
var fs = require('fs')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')
var which = require('which')
var spawn = require('child_process').spawn

var pkg = path.resolve(__dirname, 'version-no-tags')
var cache = path.resolve(pkg, 'cache')

test('npm version <semver> without git tag', function (t) {
  setup()
  npm.load({ cache: cache, registry: common.registry }, function () {
    which('git', function (err, git) {
      t.ifError(err, 'git found on system')

      function tagExists (tag, _cb) {
        var child1 = spawn(git, ['tag', '-l', tag])
        var out = ''
        child1.stdout.on('data', function (d) {
          out += d.toString()
        })
        child1.on('exit', function () {
          return _cb(null, Boolean(~out.indexOf(tag)))
        })
      }

      var child2 = spawn(git, ['init'])
      child2.stdout.pipe(process.stdout)
      child2.on('exit', function () {
        npm.config.set('git-tag-version', false)
        npm.commands.version(['patch'], function (err) {
          if (err) return t.fail('Error performing version patch')
          var p = path.resolve(pkg, 'package')
          var testPkg = require(p)
          if (testPkg.version !== '0.0.1') t.fail(testPkg.version + ' !== \'0.0.1\'')
          t.equal('0.0.1', testPkg.version)
          tagExists('v0.0.1', function (err, exists) {
            t.ifError(err, 'tag check ran without error')
            t.equal(exists, false, 'git tag was not created')
            t.pass('git tag does not exist')
            t.end()
          })
        })
      })
    })
  })
})

test('cleanup', function (t) {
  // windows fix for locked files
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})

function setup () {
  mkdirp.sync(pkg)
  mkdirp.sync(cache)
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify({
    author: 'Evan Lucas',
    name: 'version-no-tags-test',
    version: '0.0.0',
    description: 'Test for git-tag-version flag'
  }), 'utf8')
  process.chdir(pkg)
}
npm_3.5.2.orig/test/tap/version-update-shrinkwrap.js0000644000000000000000000000742212631326456020733 0ustar 00000000000000
var fs = require('fs')
var path = require('path')
var mkdirp = require('mkdirp')
var osenv = require('osenv')
var rimraf = require('rimraf')
var test = require('tap').test
var npm = require('../../')
var common = require('../common-tap.js')

var pkg = path.resolve(__dirname, 'version-shrinkwrap')
var cache = path.resolve(pkg, 'cache')

test('npm version <semver> updates shrinkwrap - no git', function (t) {
  setup()
  npm.load({ cache: pkg + '/cache', registry: common.registry }, function () {
    npm.commands.version(['patch'], function (err) {
      if (err) return t.fail('Error performing version patch')
      var shrinkwrap = require(path.resolve(pkg, 'npm-shrinkwrap.json'))
      t.equal(shrinkwrap.version, '0.0.1', 'got expected version')
      t.end()
    })
  })
})

test('npm version <semver> updates git works with no shrinkwrap', function (t) {
  setup()
  rimraf.sync(path.resolve(pkg, 'npm-shrinkwrap.json'))
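  // With the shrinkwrap file removed, the version commit below should touch
  // package.json only; checkCommit() asserts npm-shrinkwrap.json stays out.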
  npm.config.set('sign-git-tag', false)
  common.makeGitRepo({ path: pkg, added: ['package.json'] }, version)

  function version (er, stdout, stderr) {
    t.ifError(er, 'git repo initialized without issue')
    t.notOk(stderr, 'no error output')
    npm.commands.version(['patch'], checkCommit)
  }

  function checkCommit (er) {
    t.ifError(er, 'version command ran without error')
    var shrinkwrap = require(path.resolve(pkg, 'npm-shrinkwrap.json'))
    t.equal(shrinkwrap.version, '0.0.1', 'got expected version')
    var opts = { cwd: pkg, env: { PATH: process.env.PATH } }
    var git = require('../../lib/utils/git.js')
    git.whichAndExec(
      ['show', 'HEAD', '--name-only'],
      opts,
      function (er, stdout, stderr) {
        t.ifError(er, 'git show ran without issues')
        t.notOk(stderr, 'no error output')
        var lines = stdout.split('\n')
        t.notEqual(lines.indexOf('package.json'), -1, 'package.json committed')
        t.equal(lines.indexOf('npm-shrinkwrap.json'), -1, 'npm-shrinkwrap.json not present')
        t.end()
      }
    )
  }
})

test('npm version <semver> updates shrinkwrap and updates git', function (t) {
  setup()
  npm.config.set('sign-git-tag', false)
  common.makeGitRepo({ path: pkg, added: ['package.json', 'npm-shrinkwrap.json'] }, version)

  function version (er, stdout, stderr) {
    t.ifError(er, 'git repo initialized without issue')
    t.notOk(stderr, 'no error output')
    npm.commands.version(['patch'], checkCommit)
  }

  function checkCommit (er) {
    t.ifError(er, 'version command ran without error')
    var shrinkwrap = require(path.resolve(pkg, 'npm-shrinkwrap.json'))
    t.equal(shrinkwrap.version, '0.0.1', 'got expected version')
    var git = require('../../lib/utils/git.js')
    var opts = { cwd: pkg, env: { PATH: process.env.PATH } }
    git.whichAndExec(
      ['show', 'HEAD', '--name-only'],
      opts,
      function (er, stdout, stderr) {
        t.ifError(er, 'git show ran without issues')
        t.notOk(stderr, 'no error output')
        var lines = stdout.split('\n')
        t.notEqual(lines.indexOf('package.json'), -1, 'package.json committed')
        t.notEqual(lines.indexOf('npm-shrinkwrap.json'), -1, 'npm-shrinkwrap.json committed')
        t.end()
      }
    )
  }
})

test('cleanup', function (t) {
  // windows fix for locked files
  process.chdir(osenv.tmpdir())
  rimraf.sync(pkg)
  t.end()
})

function setup () {
  rimraf.sync(pkg)
  mkdirp.sync(pkg)
  mkdirp.sync(cache)
  var contents = {
    author: 'Nathan Bowser && Faiq Raza',
    name: 'version-with-shrinkwrap-test',
    version: '0.0.0',
    description: 'Test for version with shrinkwrap update'
  }
  fs.writeFileSync(path.resolve(pkg, 'package.json'), JSON.stringify(contents), 'utf8')
  fs.writeFileSync(path.resolve(pkg, 'npm-shrinkwrap.json'), JSON.stringify(contents), 'utf8')
  process.chdir(pkg)
}
npm_3.5.2.orig/test/tap/view.js0000644000000000000000000002066712631326456014550 0ustar 00000000000000
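// Exercises `npm view` against a mock registry: local packages with and
// without a package.json, field selection, version pinning, JSON output,
// and the error paths for invalid or missing package names.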
var common = require('../common-tap.js')
var test = require('tap').test
var osenv = require('osenv')
var path = require('path')
var fs = require('fs')
var rimraf = require('rimraf')
var mkdirp = require('mkdirp')

var tmp = osenv.tmpdir()
var t1dir = path.resolve(tmp, 'view-local-no-pkg')
var t2dir = path.resolve(tmp, 'view-local-notmine')
var t3dir = path.resolve(tmp, 'view-local-mine')
var mr = require('npm-registry-mock')

test('setup', function (t) {
  mkdirp.sync(t1dir)
  mkdirp.sync(t2dir)
  mkdirp.sync(t3dir)

  fs.writeFileSync(t2dir + '/package.json', JSON.stringify({
    author: 'Evan Lucas',
    name: 'test-repo-url-https',
    version: '0.0.1'
  }), 'utf8')

  fs.writeFileSync(t3dir + '/package.json', JSON.stringify({
    author: 'Evan Lucas',
    name: 'biscuits',
    version: '0.0.1'
  }), 'utf8')

  t.pass('created fixtures')
  t.end()
})

function plugin (server) {
  server
    .get('/biscuits')
    .many()
    .reply(404, {'error': 'version not found'})
}

test('npm view . in global mode', function (t) {
  process.chdir(t1dir)
  common.npm([
    'view',
    '.',
    '--registry=' + common.registry,
    '--global'
  ], { cwd: t1dir }, function (err, code, stdout, stderr) {
    t.ifError(err, 'view command finished successfully')
    t.equal(code, 1, 'exit not ok')
    t.similar(stderr, /Cannot use view command in global mode./m)
    t.end()
  })
})

test('npm view --global', function (t) {
  process.chdir(t1dir)
  common.npm([
    'view',
    '--registry=' + common.registry,
    '--global'
  ], { cwd: t1dir }, function (err, code, stdout, stderr) {
    t.ifError(err, 'view command finished successfully')
    t.equal(code, 1, 'exit not ok')
    t.similar(stderr, /Cannot use view command in global mode./m)
    t.end()
  })
})

test('npm view . with no package.json', function (t) {
  process.chdir(t1dir)
  common.npm([
    'view',
    '.',
    '--registry=' + common.registry
  ], { cwd: t1dir }, function (err, code, stdout, stderr) {
    t.ifError(err, 'view command finished successfully')
    t.equal(code, 1, 'exit not ok')
    t.similar(stderr, /Invalid package.json/m)
    t.end()
  })
})

test('npm view . with no published package', function (t) {
  process.chdir(t3dir)
  mr({ port: common.port, plugin: plugin }, function (er, s) {
    common.npm([
      'view',
      '.',
      '--registry=' + common.registry
    ], { cwd: t3dir }, function (err, code, stdout, stderr) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 1, 'exit not ok')
      t.similar(stderr, /version not found/m)
      s.close()
      t.end()
    })
  })
})

test('npm view .', function (t) {
  process.chdir(t2dir)
  mr({ port: common.port, plugin: plugin }, function (er, s) {
    common.npm([
      'view',
      '.',
      '--registry=' + common.registry
    ], { cwd: t2dir }, function (err, code, stdout) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 0, 'exit ok')
      var re = new RegExp("name: 'test-repo-url-https'")
      t.similar(stdout, re)
      s.close()
      t.end()
    })
  })
})
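// Field selection: asking for a single field prints just that value.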
test('npm view . select fields', function (t) {
  process.chdir(t2dir)
  mr({ port: common.port, plugin: plugin }, function (er, s) {
    common.npm([
      'view',
      '.',
      'main',
      '--registry=' + common.registry
    ], { cwd: t2dir }, function (err, code, stdout) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 0, 'exit ok')
      t.equal(stdout.trim(), 'index.js', 'should print `index.js`')
      s.close()
      t.end()
    })
  })
})

test('npm view .@<version>', function (t) {
  process.chdir(t2dir)
  mr({ port: common.port, plugin: plugin }, function (er, s) {
    common.npm([
      'view',
      '.@0.0.0',
      'version',
      '--registry=' + common.registry
    ], { cwd: t2dir }, function (err, code, stdout) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 0, 'exit ok')
      t.equal(stdout.trim(), '0.0.0', 'should print `0.0.0`')
      s.close()
      t.end()
    })
  })
})

test('npm view .@<version> --json', function (t) {
  process.chdir(t2dir)
  mr({ port: common.port, plugin: plugin }, function (er, s) {
    common.npm([
      'view',
      '.@0.0.0',
      'version',
      '--json',
      '--registry=' + common.registry
    ], { cwd: t2dir }, function (err, code, stdout) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 0, 'exit ok')
      t.equal(stdout.trim(), '"0.0.0"', 'should print `"0.0.0"`')
      s.close()
      t.end()
    })
  })
})

test('npm view <package name>', function (t) {
  mr({ port: common.port, plugin: plugin }, function (er, s) {
    common.npm([
      'view',
      'underscore',
      '--registry=' + common.registry
    ], { cwd: t2dir }, function (err, code, stdout) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 0, 'exit ok')
      var re = new RegExp("name: 'underscore'")
      t.similar(stdout, re, 'should have name `underscore`')
      s.close()
      t.end()
    })
  })
})

test('npm view <package name> --global', function (t) {
  mr({ port: common.port, plugin: plugin }, function (er, s) {
    common.npm([
      'view',
      'underscore',
      '--global',
      '--registry=' + common.registry
    ], { cwd: t2dir }, function (err, code, stdout) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 0, 'exit ok')
      var re = new RegExp("name: 'underscore'")
      t.similar(stdout, re, 'should have name `underscore`')
      s.close()
      t.end()
    })
  })
})

test('npm view <package name> --json', function (t) {
  t.plan(3)
  mr({ port: common.port, plugin: plugin }, function (er, s) {
    common.npm([
      'view',
      'underscore',
      '--json',
      '--registry=' + common.registry
    ], { cwd: t2dir }, function (err, code, stdout) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 0, 'exit ok')
      s.close()
      try {
        var out = JSON.parse(stdout.trim())
        t.similar(out, {
          maintainers: ['jashkenas <jashkenas@gmail.com>']
        }, 'should have the same maintainer')
      } catch (er) {
        t.fail('Unable to parse JSON')
      }
    })
  })
})

test('npm view <package name> <field>', function (t) {
  mr({ port: common.port, plugin: plugin }, function (er, s) {
    common.npm([
      'view',
      'underscore',
      'homepage',
      '--registry=' + common.registry
    ], { cwd: t2dir }, function (err, code, stdout) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 0, 'exit ok')
      t.equal(stdout.trim(), 'http://underscorejs.org',
        'homepage should equal `http://underscorejs.org`')
      s.close()
      t.end()
    })
  })
})
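// Registry package names may no longer contain capital letters, so the error
// output for 'InvalidPackage' should include that hint and must not suggest
// that the name is available.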
test('npm view with invalid package name', function (t) {
  var invalidName = 'InvalidPackage'
  var obj = {}
  obj['/' + invalidName] = [404, {'error': 'not found'}]

  mr({ port: common.port, mocks: { 'get': obj } }, function (er, s) {
    common.npm([
      'view',
      invalidName,
      '--registry=' + common.registry
    ], {}, function (err, code, stdout, stderr) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 1, 'exit not ok')

      t.similar(stderr, new RegExp('is not in the npm registry'),
        'Package should NOT be found')
      t.dissimilar(stderr, new RegExp('use the name yourself!'),
        'Suggestion should not be there')
      t.similar(stderr, new RegExp('name can no longer contain capital letters'),
        'Suggestion about capital letters should be there')

      s.close()
      t.end()
    })
  })
})

test('npm view with valid but non existent package name', function (t) {
  mr({ port: common.port, mocks: {
    'get': {
      '/valid-but-non-existent-package': [404, {'error': 'not found'}]
    }
  }}, function (er, s) {
    common.npm([
      'view',
      'valid-but-non-existent-package',
      '--registry=' + common.registry
    ], {}, function (err, code, stdout, stderr) {
      t.ifError(err, 'view command finished successfully')
      t.equal(code, 1, 'exit not ok')
      t.similar(stderr,
        new RegExp("'valid-but-non-existent-package' is not in the npm registry\\."),
        'Package should NOT be found')
      t.similar(stderr, new RegExp('use the name yourself!'),
        'Suggestion should be there')
      s.close()
      t.end()
    })
  })
})

test('cleanup', function (t) {
  process.chdir(osenv.tmpdir())
  rimraf.sync(t1dir)
  rimraf.sync(t2dir)
  rimraf.sync(t3dir)
  t.pass('cleaned up')
  t.end()
})
npm_3.5.2.orig/test/tap/whoami.js0000644000000000000000000000376712631326456015074 0ustar 00000000000000
var common = require('../common-tap.js')

var fs = require('fs')
var path = require('path')
var createServer = require('http').createServer

var test = require('tap').test
var rimraf = require('rimraf')

var opts = { cwd: __dirname }

var FIXTURE_PATH = path.resolve(__dirname, 'fixture_npmrc')

test('npm whoami with basic auth', function (t) {
  var s = '//registry.lvh.me/:username = wombat\n' +
    '//registry.lvh.me/:_password = YmFkIHBhc3N3b3Jk\n' +
    '//registry.lvh.me/:email = lindsay@wdu.org.au\n'
  fs.writeFileSync(FIXTURE_PATH, s, 'ascii')
  fs.chmodSync(FIXTURE_PATH, '0444')

  common.npm(
    [
      'whoami',
      '--userconfig=' + FIXTURE_PATH,
      '--registry=http://registry.lvh.me/'
    ],
    opts,
    function (err, code, stdout, stderr) {
      t.ifError(err)

      t.equal(stderr, '', 'got nothing on stderr')
      t.equal(code, 0, 'exit ok')
      t.equal(stdout, 'wombat\n', 'got username')
      rimraf.sync(FIXTURE_PATH)
      t.end()
    }
  )
})

test('npm whoami with bearer auth', { timeout: 2 * 1000 }, function (t) {
  var s = '//localhost:' + common.port +
    '/:_authToken = wombat-developers-union\n'
  fs.writeFileSync(FIXTURE_PATH, s, 'ascii')
  fs.chmodSync(FIXTURE_PATH, '0444')

  function verify (req, res) {
    t.equal(req.method, 'GET')
    t.equal(req.url, '/-/whoami')

    res.setHeader('content-type', 'application/json')
    res.writeHead(200)
    res.end(JSON.stringify({ username: 'wombat' }), 'utf8')
  }

  var server = createServer(verify)

  server.listen(common.port, function () {
    common.npm(
      [
        'whoami',
        '--userconfig=' + FIXTURE_PATH,
        '--registry=http://localhost:' + common.port + '/'
      ],
      opts,
      function (err, code, stdout, stderr) {
        t.ifError(err)

        t.equal(stderr, '', 'got nothing on stderr')
        t.equal(code, 0, 'exit ok')
        t.equal(stdout, 'wombat\n', 'got username')

        rimraf.sync(FIXTURE_PATH)
        server.close()
        t.end()
      }
    )
  })
})
npm_3.5.2.orig/test/tap/zz-cleanup.js0000644000000000000000000000027012631326456015662 0ustar 00000000000000
var common = require('../common-tap')
var test = require('tap').test
var rimraf = require('rimraf')

test('cleanup', function (t) {
  rimraf.sync(common.npm_config_cache)
  t.end()
})