==> pax_global_header <==
comment=7127ed4d9fe93bb0c2b2809b3c10a4f23f0c8236

==> read-package-json-1.2.4/.gitignore <==
*.swp
.*.swp
.DS_Store
*~
.project
.settings
npm-debug.log
coverage.html
.idea
lib-cov
node_modules

==> read-package-json-1.2.4/LICENSE <==
The ISC License

Copyright (c) Isaac Z. Schlueter

Permission to use, copy, modify, and/or distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

==> read-package-json-1.2.4/README.md <==
# read-package-json

This is the thing that npm uses to read package.json files. It validates some stuff, and loads some default things. It keeps a cache of the files you've read, so that you don't end up reading the same package.json file multiple times.

Note that if you just want to see what's literally in the package.json file, you can usually do `var data = require('some-module/package.json')`.

This module is basically only needed by npm, but it's handy to see what npm will see when it looks at your package.

## Usage

```javascript
var readJson = require('read-package-json')

// readJson(filename, [logFunction=noop], [strict=false], cb)
readJson('/path/to/package.json', console.error, false, function (er, data) {
  if (er) {
    console.error("There was an error reading the file")
    return
  }
  console.error('the package data is', data)
})
```

## readJson(file, [logFn = noop], [strict = false], cb)

* `file` {String} The path to the package.json file
* `logFn` {Function} Function to handle logging. Defaults to a noop.
* `strict` {Boolean} True to enforce SemVer 2.0 version strings, and other strict requirements.
* `cb` {Function} Gets called with `(er, data)`, as is The Node Way.

Reads the JSON file, applies the extras described below, and calls back with the normalized data.

## `package.json` Fields

See `man 5 package.json` or `npm help json`.

## readJson.log

Older releases exposed a module-level reference to `npmlog` here. As of 1.x there is no `readJson.log`; logging goes through the optional `logFn` argument described above, which is called as `logFn('package.json', packageId, message)` when normalization has something to warn about. Pass your own logging function for fun loggy time.

## readJson.extras(file, data, cb)

Run all the extra steps relative to the file, with the parsed data. Modifies the data in place, and calls the cb when it's done.

## readJson.extraSet = [fn, fn, ...]

Array of functions that are called by `extras`. Each one receives the arguments `fn(file, data, cb)` and is expected to call `cb(er, data)` when done or when an error occurs. Order is indeterminate, so each function should be completely independent. Mix and match! (A sketch of a custom extra appears below, after the `readJson.cache` section.)

## readJson.cache

The `lru-cache` object that readJson uses so it doesn't read the same file over and over again. See [lru-cache](https://github.com/isaacs/node-lru-cache) for details.
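
For illustration, here is a minimal sketch of a custom extra. The `addReadAt` function and the `_readAt` field are made-up names for this example, not part of the module:

```javascript
var readJson = require('read-package-json')

// Hypothetical extra: stamp the parsed data with the time it was read.
// Every extra receives (file, data, cb) and must call cb(er, data).
function addReadAt (file, data, cb) {
  data._readAt = new Date().toISOString()
  cb(null, data)
}

readJson.extraSet.push(addReadAt)

readJson('/path/to/package.json', function (er, data) {
  if (er) return console.error(er)
  console.log('read at', data._readAt)
})
```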

## Other Relevant Files Besides `package.json`

Some other files have an effect on the resulting data object, in the following ways:

### `README?(.*)`

If there is a `README` or `README.*` file present, then npm will attach a `readme` field to the data with the contents of this file.

Owing to the fact that roughly 100% of existing node modules have Markdown README files, it will generally be assumed to be Markdown, regardless of the extension. Please plan accordingly.

### `server.js`

If there is a `server.js` file, and there is not already a `scripts.start` field, then `scripts.start` will be set to `node server.js`.

### `AUTHORS`

If there is not already a `contributors` field, then the `contributors` field will be set to the contents of the `AUTHORS` file, split by lines, and parsed.

### `binding.gyp`

If a `binding.gyp` file (or any other `*.gyp` file) exists, and there is not already a `scripts.install` or `scripts.preinstall` field, then the `scripts.install` field will be set to `node-gyp rebuild`.

### `wscript`

If a wscript file exists, and there is not already a `scripts.install` field, then the `scripts.install` field will be set to `node-waf clean ; node-waf configure build`. Note that a `binding.gyp` file supersedes this, since node-waf has been deprecated in favor of node-gyp.

### `index.js`

If the json file does not exist, but there is an `index.js` file present instead, and that file has a package comment, then it will try to parse the package comment, and use that as the data instead.

A package comment looks like this:

```javascript
/**package
 * { "name": "my-bare-module"
 * , "version": "1.2.3"
 * , "description": "etc...." }
 **/

// or...

/**package
{ "name": "my-bare-module"
, "version": "1.2.3"
, "description": "etc...." }
**/
```

The important thing is that it starts with `/**package`, and ends with `**/`. If the package.json file exists, then the index.js is not parsed.

### `{directories.man}/*.[0-9]`

If there is not already a `man` field defined as an array of files or a single file, and there is a `directories.man` field defined, then that directory will be searched for manpages.

Any valid manpages found in that directory will be assigned to the `man` array, and installed in the appropriate man directory at package install time, when installed globally on a Unix system.

### `{directories.bin}/*`

If there is not already a `bin` field defined as a string filename or a hash of `name : filename` pairs, then the `directories.bin` directory will be searched and all the files within it will be linked as executables at install time.

When installing locally, npm links bins into `node_modules/.bin`, which is in the `PATH` environment when npm runs scripts. When installing globally, they are linked into `{prefix}/bin`, which is presumably in the `PATH` environment variable.
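
As a concrete illustration of the `index.js` fallback described above, here is a minimal sketch (the `/path/to/bare-module` path is hypothetical) that reads a directory with no `package.json` but whose `index.js` carries a package comment:

```javascript
var readJson = require('read-package-json')

// bare-module/ is assumed to contain only an index.js with a
// /**package ... **/ comment, and no package.json file.
readJson('/path/to/bare-module/package.json', function (er, data) {
  if (er) return console.error('no package data found', er)
  // data was parsed out of the package comment in index.js
  console.log(data.name + '@' + data.version)
})
```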

==> read-package-json-1.2.4/package.json <==
{
  "name": "read-package-json",
  "version": "1.2.4",
  "author": "Isaac Z. Schlueter <i@izs.me> (http://blog.izs.me/)",
  "description": "The thing npm uses to read package.json files with semantics and defaults and validation",
  "repository": {
    "type": "git",
    "url": "git://github.com/isaacs/read-package-json.git"
  },
  "main": "read-json.js",
  "scripts": {
    "test": "tap test/*.js"
  },
  "dependencies": {
    "glob": "^4.0.2",
    "lru-cache": "2",
    "normalize-package-data": "0.4"
  },
  "devDependencies": {
    "tap": "~0.2.5"
  },
  "optionalDependencies": {
    "graceful-fs": "2 || 3"
  },
  "license": "ISC"
}

==> read-package-json-1.2.4/read-json.js <==
// vim: set softtabstop=2 shiftwidth=2:

try {
  var fs = require("graceful-fs")
} catch (er) {
  var fs = require("fs")
}

module.exports = readJson

var LRU = require("lru-cache")
readJson.cache = new LRU({max: 1000})
var path = require("path")
var glob = require("glob")
var normalizeData = require("normalize-package-data")

// put more stuff on here to customize.
readJson.extraSet = [
  gypfile,
  serverjs,
  scriptpath,
  authors,
  readme,
  mans,
  bins,
  githead
]
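
// Each extra receives (file, data, cb) and may fill in missing fields:
// gypfile -> scripts.install, serverjs -> scripts.start, scriptpath -> strips
// node_modules/.bin prefixes from script commands, authors -> contributors,
// readme -> readme/readmeFilename, mans -> man, bins -> bin, githead -> gitHead.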

var typoWarned = {}

function readJson (file, log_, strict_, cb_) {
  var log, strict, cb
  // log and strict are optional; figure out which is which by type.
  for (var i = 1; i < arguments.length - 1; i++) {
    if (typeof arguments[i] === 'boolean')
      strict = arguments[i]
    else if (typeof arguments[i] === 'function')
      log = arguments[i]
  }
  if (!log) log = function () {};
  cb = arguments[ arguments.length - 1 ]

  var c = readJson.cache.get(file)
  if (c) {
    cb = cb.bind(null, null, c)
    return process.nextTick(cb);
  }
  cb = (function (orig) { return function (er, data) {
    if (data) readJson.cache.set(file, data);
    return orig(er, data)
  } })(cb)
  readJson_(file, log, strict, cb)
}

function readJson_ (file, log, strict, cb) {
  fs.readFile(file, "utf8", function (er, d) {
    parseJson(file, er, d, log, strict, cb)
  })
}

function stripBOM(content) {
  // Remove byte order marker. This catches EF BB BF (the UTF-8 BOM)
  // because the buffer-to-string conversion in `fs.readFileSync()`
  // translates it to FEFF, the UTF-16 BOM.
  if (content.charCodeAt(0) === 0xFEFF) {
    content = content.slice(1);
  }
  return content;
}

function parseJson (file, er, d, log, strict, cb) {
  if (er && er.code === "ENOENT") {
    indexjs(file, er, log, strict, cb)
    return
  }
  if (er) return cb(er);

  try {
    d = JSON.parse(stripBOM(d))
  } catch (er) {
    d = parseIndex(d)
    if (!d) return cb(parseError(er, file));
  }

  extras(file, d, log, strict, cb)
}

function indexjs (file, er, log, strict, cb) {
  if (path.basename(file) === "index.js") {
    return cb(er);
  }

  var index = path.resolve(path.dirname(file), "index.js")
  fs.readFile(index, "utf8", function (er2, d) {
    if (er2) return cb(er);
    d = parseIndex(d)
    if (!d) return cb(er);
    extras(file, d, log, strict, cb)
  })
}

readJson.extras = extras
function extras (file, data, log_, strict_, cb_) {
  var log, strict, cb
  for (var i = 2; i < arguments.length - 1; i++) {
    if (typeof arguments[i] === 'boolean')
      strict = arguments[i]
    else if (typeof arguments[i] === 'function')
      log = arguments[i]
  }
  if (!log) log = function () {};
  cb = arguments[i]

  var set = readJson.extraSet
  var n = set.length
  var errState = null
  set.forEach(function (fn) {
    fn(file, data, then)
  })

  function then(er) {
    if (errState) return;
    if (er) return cb(errState = er);
    if (--n > 0) return;
    final(file, data, log, strict, cb);
  }
}

function scriptpath (file, data, cb) {
  if (!data.scripts) return cb(null, data);
  var k = Object.keys(data.scripts)
  k.forEach(scriptpath_, data.scripts)
  cb(null, data);
}

function scriptpath_(key) {
  var s = this[key]
  // This is never allowed, and only causes problems
  if (typeof s !== 'string') return delete this[key]

  var spre = /^(\.[\/\\])?node_modules[\/\\].bin[\\\/]/
  if (s.match(spre)) this[key] = this[key].replace(spre, '')
}

function gypfile (file, data, cb) {
  var dir = path.dirname(file)
  var s = data.scripts || {}
  if (s.install || s.preinstall) return cb(null, data);

  glob("*.gyp", { cwd: dir }, function (er, files) {
    if (er) return cb(er);
    gypfile_(file, data, files, cb)
  })
}

function gypfile_ (file, data, files, cb) {
  if (!files.length) return cb(null, data);
  var s = data.scripts || {}
  s.install = "node-gyp rebuild"
  data.scripts = s
  data.gypfile = true
  return cb(null, data);
}

function serverjs (file, data, cb) {
  var dir = path.dirname(file)
  var s = data.scripts || {}
  if (s.start) return cb(null, data)
  glob("server.js", { cwd: dir }, function (er, files) {
    if (er) return cb(er);
    serverjs_(file, data, files, cb)
  })
}

function serverjs_ (file, data, files, cb) {
  if (!files.length) return cb(null, data);
  var s = data.scripts || {}
  s.start = "node server.js"
  data.scripts = s
  return cb(null, data)
}

function authors (file, data, cb) {
  if (data.contributors) return cb(null, data);
  var af = path.resolve(path.dirname(file), "AUTHORS")
  fs.readFile(af, "utf8", function (er, ad) {
    // ignore error. just checking it.
    if (er) return cb(null, data);
    authors_(file, data, ad, cb)
  })
}

function authors_ (file, data, ad, cb) {
  ad = ad.split(/\r?\n/g).map(function (line) {
    return line.replace(/^\s*#.*$/, '').trim()
  }).filter(function (line) {
    return line
  })
  data.contributors = ad
  return cb(null, data)
}

var defDesc = "Unnamed repository; edit this file " +
              "'description' to name the repository."
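
// NOTE: gitDescription and readmeDescription below are not listed in
// readJson.extraSet, so they are currently unused helpers.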
function gitDescription (file, data, cb) {
  if (data.description) return cb(null, data);
  var dir = path.dirname(file)
  // just cuz it'd be nice if this file mattered...
  var gitDesc = path.resolve(dir, '.git/description')
  fs.readFile(gitDesc, 'utf8', function (er, desc) {
    if (desc) desc = desc.trim()
    if (!er && desc !== defDesc) data.description = desc
    return cb(null, data)
  })
}

function readmeDescription (file, data) {
  if (data.description) return;
  var d = data.readme
  if (!d) return;

  // the first block of text before the first heading
  // that isn't the first line heading
  d = d.trim().split('\n')
  for (var s = 0; d[s] && d[s].trim().match(/^(#|$)/); s ++);
  var l = d.length
  for (var e = s + 1; e < l && d[e].trim(); e ++);
  data.description = d.slice(s, e).join(' ').trim()
}

function readme (file, data, cb) {
  if (data.readme) return cb(null, data);
  var dir = path.dirname(file)
  var globOpts = { cwd: dir, nocase: true, mark: true }
  glob("{README,README.*}", globOpts, function (er, files) {
    if (er) return cb(er);
    // don't accept directories.
    files = files.filter(function (file) {
      return !file.match(/\/$/)
    })
    if (!files.length) return cb();
    var fn = preferMarkdownReadme(files)
    var rm = path.resolve(dir, fn)
    readme_(file, data, rm, cb)
  })
}

function preferMarkdownReadme(files) {
  var fallback = 0;
  var re = /\.m?a?r?k?d?o?w?n?$/i
  for (var i = 0; i < files.length; i++) {
    if (files[i].match(re))
      return files[i]
    else if (files[i].match(/README$/))
      fallback = i
  }
  // prefer README.md, followed by README; otherwise, return
  // the first filename (which could be README)
  return files[fallback];
}

function readme_(file, data, rm, cb) {
  var rmfn = path.basename(rm);
  fs.readFile(rm, "utf8", function (er, rm) {
    // maybe not readable, or something.
    if (er) return cb()
    data.readme = rm
    data.readmeFilename = rmfn
    return cb(er, data)
  })
}

function mans (file, data, cb) {
  var m = data.directories && data.directories.man
  if (data.man || !m) return cb(null, data);
  m = path.resolve(path.dirname(file), m)
  glob("**/*.[0-9]", { cwd: m }, function (er, mans) {
    if (er) return cb(er);
    mans_(file, data, mans, cb)
  })
}

function mans_ (file, data, mans, cb) {
  var m = data.directories && data.directories.man
  data.man = mans.map(function (mf) {
    return path.resolve(path.dirname(file), m, mf)
  })
  return cb(null, data)
}

function bins (file, data, cb) {
  if (Array.isArray(data.bin)) {
    return bins_(file, data, data.bin, cb)
  }
  var m = data.directories && data.directories.bin
  if (data.bin || !m) return cb(null, data);

  m = path.resolve(path.dirname(file), m)
  glob("**", { cwd: m }, function (er, bins) {
    if (er) return cb(er);
    bins_(file, data, bins, cb)
  })
}

function bins_ (file, data, bins, cb) {
  var m = data.directories && data.directories.bin || '.'
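  // Build a { name: relative-path } map from the matched files, skipping dotfiles.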
  data.bin = bins.reduce(function (acc, mf) {
    if (mf && mf.charAt(0) !== '.') {
      var f = path.basename(mf)
      acc[f] = path.join(m, mf)
    }
    return acc
  }, {})
  return cb(null, data)
}

function githead (file, data, cb) {
  if (data.gitHead) return cb(null, data);
  var dir = path.dirname(file)
  var head = path.resolve(dir, '.git/HEAD')
  fs.readFile(head, 'utf8', function (er, head) {
    if (er) return cb(null, data);
    githead_(file, data, dir, head, cb)
  })
}

function githead_ (file, data, dir, head, cb) {
  if (!head.match(/^ref: /)) {
    data.gitHead = head.trim()
    return cb(null, data)
  }

  var headFile = head.replace(/^ref: /, '').trim()
  headFile = path.resolve(dir, '.git', headFile)
  fs.readFile(headFile, 'utf8', function (er, head) {
    if (er || !head) return cb(null, data)
    head = head.replace(/^ref: /, '').trim()
    data.gitHead = head
    return cb(null, data)
  })
}

function final (file, data, log, strict, cb) {
  var pId = makePackageId(data)
  function warn(msg) {
    if (typoWarned[pId]) return;
    if (log) log("package.json", pId, msg);
  }
  try {
    normalizeData(data, warn, strict)
  } catch (error) {
    return cb(error)
  }
  typoWarned[pId] = true
  readJson.cache.set(file, data)
  cb(null, data)
}

function makePackageId (data) {
  var name = cleanString(data.name)
  var ver = cleanString(data.version)
  return name + "@" + ver
}

function cleanString(str) {
  return (!str || typeof(str) !== "string") ? "" : str.trim()
}

// /**package { "name": "foo", "version": "1.2.3", ... } **/
function parseIndex (data) {
  data = data.split(/^\/\*\*package(?:\s|$)/m)
  if (data.length < 2) return null
  data = data[1]
  data = data.split(/\*\*\/$/m)
  if (data.length < 2) return null
  data = data[0]
  data = data.replace(/^\s*\*/mg, "")

  try {
    return JSON.parse(data)
  } catch (er) {
    return null
  }
}

function parseError (ex, file) {
  var e = new Error("Failed to parse json\n"+ex.message)
  e.code = "EJSONPARSE"
  e.file = file
  return e
}

==> read-package-json-1.2.4/test/basic.js <==
// vim: set softtabstop=2 shiftwidth=2:

var tap = require("tap")
var readJson = require("../")
var path = require("path")
var fs = require("fs")

var readme = fs.readFileSync(path.resolve(__dirname, "../README.md"), "utf8")
var package = require("../package.json")
var isGit
try {
  fs.readFileSync(path.resolve(__dirname, '../.git/HEAD'));
  isGit = true
} catch (e) {
  isGit = false
}

console.error("basic test")

tap.test("basic test", function (t) {
  var p = path.resolve(__dirname, "../package.json")
  readJson(p, function (er, data) {
    if (er) throw er;
    basic_(t, data)
  })
})

function basic_ (t, data) {
  t.ok(data)
  t.equal(data.version, package.version)
  t.equal(data._id, data.name + "@" + data.version)
  t.equal(data.name, package.name)
  t.type(data.author, "object")
  t.equal(data.readme, readme)
  t.deepEqual(data.scripts, package.scripts)
  t.equal(data.main, package.main)
  t.equal(data.readmeFilename, 'README.md')

  if (isGit) t.similar(data.gitHead, /^[a-f0-9]{40}$/);

  // optional deps are folded in.
  t.deepEqual(data.optionalDependencies, package.optionalDependencies)
  t.has(data.dependencies, package.optionalDependencies)
  t.has(data.dependencies, package.dependencies)

  t.deepEqual(data.devDependencies, package.devDependencies)
  t.end()
}

==> read-package-json-1.2.4/test/bom.js <==
// vim: set softtabstop=2 shiftwidth=2:

var tap = require("tap")
var readJson = require("../")
var path = require("path")
var fs = require("fs")

console.error("BOM test")

tap.test("BOM test", function (t) {
  var p = path.resolve(__dirname, "fixtures/bom.json")
  readJson(p, function (er, data) {
    if (er) throw er;
    p = path.resolve(__dirname, "fixtures/nobom.json")
    readJson(p, function (er, data2) {
      if (er) throw er;
      t.deepEqual(data, data2)
      t.end()
    })
  })
})

==> read-package-json-1.2.4/test/fixtures/bom.json <==
{ "name": "this", "description": "file", "author": "has ", "version" : "0.0.1" }

==> read-package-json-1.2.4/test/fixtures/nobom.json <==
{ "name": "this", "description": "file", "author": "has ", "version" : "0.0.1" }

==> read-package-json-1.2.4/test/fixtures/not-json.css <==
body {
  height: yo mama
}

/**package
{
  "name": "read-package-json",
  "version": "0.1.1",
  "author": "Isaac Z. Schlueter <i@izs.me> (http://blog.izs.me/)",
  "description": "The thing npm uses to read package.json files with semantics and defaults and validation",
  "repository": {
    "type": "git",
    "url": "git://github.com/isaacs/read-package-json.git"
  },
  "main": "read-json.js",
  "scripts": {
    "test": "./node_modules/.bin/tap test/*.js"
  },
  "dependencies": {
    "glob": "~3.1.9",
    "lru-cache": "~1.1.0",
    "semver": "~1.0.14",
    "slide": "~1.1.3"
  },
  "devDependencies": {
    "tap": "~0.2.5"
  },
  "optionalDependencies": {
    "npmlog": "0",
    "graceful-fs": "~1.1.8"
  }
}
**/

html {
  width: so fat
}

==> read-package-json-1.2.4/test/fixtures/readmes/README <==
foo

==> read-package-json-1.2.4/test/fixtures/readmes/README.md <==
*markdown*

==> read-package-json-1.2.4/test/fixtures/readmes/package.json <==
{"name":"readmes", "version":"99.999.999999999"}

==> read-package-json-1.2.4/test/fixtures/readmes/readmexxx.yz <==
extra noise

==> read-package-json-1.2.4/test/non-json.js <==
// vim: set softtabstop=2 shiftwidth=2:

var tap = require('tap')
var readJson = require('../')
var path = require('path')
var fs = require('fs')
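
// This test exercises the /**package ... **/ fallback: package data is read
// from the comment in fixtures/not-json.css and from the comment at the
// bottom of this very file.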

var expect = {
  name: 'read-package-json',
  version: '0.1.1',
  author: {
    name: 'Isaac Z. Schlueter',
    email: 'i@izs.me',
    url: 'http://blog.izs.me/'
  },
  description: 'The thing npm uses to read package.json files with semantics and defaults and validation',
  repository: {
    type: 'git',
    url: 'git://github.com/isaacs/read-package-json.git'
  },
  bugs: { url: "https://github.com/isaacs/read-package-json/issues" },
  main: 'read-json.js',
  scripts: { test: 'tap test/*.js' },
  dependencies: {
    glob: '~3.1.9',
    'lru-cache': '~1.1.0',
    semver: '~1.0.14',
    slide: '~1.1.3',
    npmlog: '0',
    'graceful-fs': '~1.1.8'
  },
  devDependencies: { tap: '~0.2.5' },
  homepage: "https://github.com/isaacs/read-package-json",
  optionalDependencies: { npmlog: '0', 'graceful-fs': '~1.1.8' },
  _id: 'read-package-json@0.1.1',
  readme: 'ERROR: No README data found!'
}

tap.test('from css', function (t) {
  var c = path.join(__dirname, 'fixtures', 'not-json.css')
  readJson(c, function (er, d) {
    t.same(d, expect)
    t.end()
  })
})

tap.test('from js', function (t) {
  readJson(__filename, function (er, d) {
    t.same(d, expect)
    t.end()
  })
})

/**package
{
  "name": "read-package-json",
  "version": "0.1.1",
  "author": "Isaac Z. Schlueter <i@izs.me> (http://blog.izs.me/)",
  "description": "The thing npm uses to read package.json files with semantics and defaults and validation",
  "repository": {
    "type": "git",
    "url": "git://github.com/isaacs/read-package-json.git"
  },
  "main": "read-json.js",
  "scripts": {
    "test": "tap test/*.js"
  },
  "dependencies": {
    "glob": "~3.1.9",
    "lru-cache": "~1.1.0",
    "semver": "~1.0.14",
    "slide": "~1.1.3"
  },
  "devDependencies": {
    "tap": "~0.2.5"
  },
  "optionalDependencies": {
    "npmlog": "0",
    "graceful-fs": "~1.1.8"
  }
}
**/

==> read-package-json-1.2.4/test/readmes.js <==
// vim: set softtabstop=2 shiftwidth=2:

var tap = require("tap")
var readJson = require("../")
var path = require("path")
var fs = require("fs")

var p = path.resolve(__dirname, "fixtures/readmes/package.json")

var expect = {
  "name" : "readmes",
  "version" : "99.999.999999999",
  "readme" : "*markdown*\n",
  "readmeFilename" : "README.md",
  "description" : "*markdown*",
  "_id" : "readmes@99.999.999999999"
}

console.error("readme test")

tap.test("readme test", function (t) {
  readJson(p, function (er, data) {
    if (er) throw er;
    test(t, data)
  })
})

function test(t, data) {
  t.deepEqual(data, expect)
  t.end()
}