pbr-0.7.0/0000775000175300017540000000000012312051536013434 5ustar jenkinsjenkins00000000000000pbr-0.7.0/README.rst0000664000175300017540000000515512312051507015127 0ustar jenkinsjenkins00000000000000Introduction
============

PBR is a library that injects some useful and sensible default behaviors
into your setuptools run. It started off life as the chunks of code that
were copied between all of the OpenStack projects. Around the time that
OpenStack hit 18 different projects, each with at least 3 active
branches, it seemed like a good time to turn that code into a proper
re-usable library.

PBR is only mildly configurable. The basic idea is that there's a decent
way to run things, and if you do, you reap the rewards, because then it's
simple and repeatable. If you want to do things differently, cool! But
you've already got the power of Python at your fingertips, so you don't
really need PBR.

PBR builds on top of the work that `d2to1` started to provide declarative
configuration. `d2to1` is itself an implementation of the ideas behind
`distutils2`. Although `distutils2` is now abandoned in favor of work
towards PEP 426 and Metadata 2.0, declarative config is still a great
idea, and it is specifically important when distributing setup code as a
library, because that library will itself alter how the setup is
processed. As Metadata 2.0 and other modern Python packaging PEPs come
out, `pbr` aims to support them as quickly as possible.

You can read more in `the documentation`_.

Running Tests
=============

The testing system is based on a combination of tox and testr. The
canonical approach to running tests is to simply run the command `tox`.
This will create virtual environments, populate them with dependencies
and run all of the tests that OpenStack CI systems run. Behind the
scenes, tox is running `testr run --parallel`, but it is set up such that
you can supply any additional testr arguments that are needed to tox.
For example, you can run: `tox -- --analyze-isolation` to cause tox to
tell testr to add --analyze-isolation to its argument list.

It is also possible to run the tests inside of a virtual environment you
have created, or it is possible that you have all of the dependencies
installed locally already. If you'd like to go this route, the
requirements are listed in requirements.txt and the requirements for
testing are in test-requirements.txt. Installing them via pip, for
instance, is simply::

  pip install -r requirements.txt -r test-requirements.txt

If you go this route, you can interact with the testr command directly.
Running `testr run` will run the entire test suite. `testr run
--parallel` will run it in parallel (this is the default incantation tox
uses). More information about testr can be found at:
http://wiki.openstack.org/testr

.. _`the documentation`: http://docs.openstack.org/developer/pbr/
pbr-0.7.0/doc/0000775000175300017540000000000012312051536014201 5ustar jenkinsjenkins00000000000000pbr-0.7.0/doc/source/0000775000175300017540000000000012312051536015501 5ustar jenkinsjenkins00000000000000pbr-0.7.0/doc/source/semver.rst0000664000175300017540000004105712312051507017541 0ustar jenkinsjenkins00000000000000Linux Compatible Semantic Versioning 3.0.0
==========================================

This is a fork of Semantic Versioning 2.0. The specific changes have to
do with the format of pre-release and build labels, specifically to make
them unambiguous when co-existing with Linux distribution packaging.
Inspiration for the format of the pre-release and build labels came from
Python's PEP 440.

Summary
-------

Given a version number MAJOR.MINOR.PATCH, increment the:

#. MAJOR version when you make incompatible API changes,
#. MINOR version when you add functionality in a backwards-compatible
   manner, and
#. PATCH version when you make backwards-compatible bug fixes.
Additional labels for pre-release and build metadata are available as
extensions to the MAJOR.MINOR.PATCH format.

Introduction
------------

In the world of software management there exists a dread place called
"dependency hell." The bigger your system grows and the more packages you
integrate into your software, the more likely you are to find yourself,
one day, in this pit of despair.

In systems with many dependencies, releasing new package versions can
quickly become a nightmare. If the dependency specifications are too
tight, you are in danger of version lock (the inability to upgrade a
package without having to release new versions of every dependent
package). If dependencies are specified too loosely, you will inevitably
be bitten by version promiscuity (assuming compatibility with more future
versions than is reasonable). Dependency hell is where you are when
version lock and/or version promiscuity prevent you from easily and
safely moving your project forward.

As a solution to this problem, I propose a simple set of rules and
requirements that dictate how version numbers are assigned and
incremented. These rules are based on but not necessarily limited to
pre-existing widespread common practices in use in both closed and
open-source software. For this system to work, you first need to declare
a public API. This may consist of documentation or be enforced by the
code itself. Regardless, it is important that this API be clear and
precise. Once you identify your public API, you communicate changes to it
with specific increments to your version number. Consider a version
format of X.Y.Z (Major.Minor.Patch). Bug fixes not affecting the API
increment the patch version, backwards compatible API additions/changes
increment the minor version, and backwards incompatible API changes
increment the major version.

I call this system "Semantic Versioning."
Under this scheme, version numbers and the way they change convey meaning
about the underlying code and what has been modified from one version to
the next.

Linux Compatible Semantic Versioning is different from Semantic
Versioning in that it does not employ the use of the hyphen in ways that
are ambiguous when used with or adjacent to software packaged with dpkg
or rpm. Instead, it draws from PEP 440's approach of indicating
pre-releases with leading characters in the version segment.

Semantic Versioning Specification (SemVer)
------------------------------------------

The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
"SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this
document are to be interpreted as described in RFC 2119.

#. Software using Semantic Versioning MUST declare a public API. This
   API could be declared in the code itself or exist strictly in
   documentation. However it is done, it should be precise and
   comprehensive.
#. A normal version number MUST take the form X.Y.Z where X, Y, and Z
   are non-negative integers, and MUST NOT contain leading zeroes. X is
   the major version, Y is the minor version, and Z is the patch
   version. Each element MUST increase numerically. For instance:
   1.9.0 -> 1.10.0 -> 1.11.0.
#. Once a versioned package has been released, the contents of that
   version MUST NOT be modified. Any modifications MUST be released as a
   new version.
#. Major version zero (0.y.z) is for initial development. Anything may
   change at any time. The public API should not be considered stable.
#. Version 1.0.0 defines the public API. The way in which the version
   number is incremented after this release is dependent on this public
   API and how it changes.
#. Patch version Z (x.y.Z \| x > 0) MUST be incremented if only
   backwards compatible bug fixes are introduced. A bug fix is defined
   as an internal change that fixes incorrect behavior.
#.
   Minor version Y (x.Y.z \| x > 0) MUST be incremented if new,
   backwards compatible functionality is introduced to the public API.
   It MUST be incremented if any public API functionality is marked as
   deprecated. It MAY be incremented if substantial new functionality or
   improvements are introduced within the private code. It MAY include
   patch level changes. Patch version MUST be reset to 0 when minor
   version is incremented.
#. Major version X (X.y.z \| X > 0) MUST be incremented if any backwards
   incompatible changes are introduced to the public API. It MAY also
   include minor and patch level changes. Patch and minor version MUST
   be reset to 0 when major version is incremented.
#. A pre-release version MAY be denoted by appending a dot separated
   identifier immediately following the patch version. The identifier
   MUST comprise only a, b, or rc followed by a non-negative integer
   value. The identifier MUST NOT be empty. Pre-release versions have a
   lower precedence than the associated normal version. A pre-release
   version indicates that the version is unstable and might not satisfy
   the intended compatibility requirements as denoted by its associated
   normal version. Examples: 1.0.0.a1, 1.0.0.b99, 1.0.0.rc1000.
#. A development version MAY be denoted by appending a dot separated
   identifier immediately following the patch version. The identifier
   MUST comprise the string dev followed by a non-negative integer
   value. The identifier MUST NOT be empty. Development versions have a
   lower precedence than the associated normal version. A development
   version is completely unsupported and conveys no API promises when
   related to other versions. Development versions are more useful as
   communication vehicles between developers of a community, whereas
   pre-releases, while potentially still prone to break, are intended
   for externally facing communication of not-yet-released ideas.
   Example: 1.0.0.dev1.
#.
   git version metadata MAY be denoted by appending a dot separated
   identifier immediately following a development version. The
   identifier MUST comprise the character g followed by a seven
   character git short-sha. The sha MUST NOT be empty. git version
   metadata MUST be ignored when determining version precedence. Thus
   two versions that differ only in the git version have the same
   precedence. Example: 1.0.0.a1.g95a9beb.
#. Build metadata MAY be denoted by appending a plus sign and a series
   of dot separated identifiers immediately following the patch or
   pre-release version. Identifiers MUST comprise only ASCII
   alphanumerics [0-9A-Za-z]. Identifiers MUST NOT be empty. Build
   metadata MUST be ignored when determining version precedence. Thus
   two versions that differ only in the build metadata have the same
   precedence. Examples: 1.0.0.a1+001, 1.0.0+20130313144700,
   1.0.0.b1+exp.sha.5114f85.
#. Precedence refers to how versions are compared to each other when
   ordered. Precedence MUST be calculated by separating the version into
   major, minor, patch and pre-release identifiers in that order (build
   metadata does not figure into precedence). Precedence is determined
   by the first difference when comparing each of these identifiers from
   left to right as follows: major, minor, and patch versions are always
   compared numerically. Example: 1.0.0 < 2.0.0 < 2.1.0 < 2.1.1. When
   major, minor, and patch are equal, a pre-release version has lower
   precedence than a normal version. Example: 1.0.0.a1 < 1.0.0. When
   major, minor, and patch are equal, a development version has a lower
   precedence than a normal version and than a pre-release version.
   Example: 1.0.0.dev1 < 1.0.0 and 1.0.0.dev9 < 1.0.0.a1. Precedence for
   two pre-release or development versions with the same major, minor,
   and patch version MUST be determined by comparing the identifier to
   the right of the patch version as follows: if the alpha portion
   matches, the numeric portion is compared in numerical sort order.
   If the alpha portion does not match, the sort order is
   dev < a < b < rc. Example: 1.0.0.dev8 < 1.0.0.dev9 < 1.0.0.a1 <
   1.0.0.b2 < 1.0.0.rc1 < 1.0.0.

Why Use Semantic Versioning?
----------------------------

This is not a new or revolutionary idea. In fact, you probably do
something close to this already. The problem is that "close" isn't good
enough. Without compliance to some sort of formal specification, version
numbers are essentially useless for dependency management. By giving a
name and clear definition to the above ideas, it becomes easy to
communicate your intentions to the users of your software. Once these
intentions are clear, flexible (but not too flexible) dependency
specifications can finally be made.

A simple example will demonstrate how Semantic Versioning can make
dependency hell a thing of the past. Consider a library called
"Firetruck." It requires a Semantically Versioned package named
"Ladder." At the time that Firetruck is created, Ladder is at version
3.1.0. Since Firetruck uses some functionality that was first introduced
in 3.1.0, you can safely specify the Ladder dependency as greater than
or equal to 3.1.0 but less than 4.0.0. Now, when Ladder versions 3.1.1
and 3.2.0 become available, you can release them to your package
management system and know that they will be compatible with existing
dependent software.

As a responsible developer you will, of course, want to verify that any
package upgrades function as advertised. The real world is a messy
place; there's nothing we can do about that but be vigilant. What you
can do is let Semantic Versioning provide you with a sane way to release
and upgrade packages without having to roll new versions of dependent
packages, saving you time and hassle.

If all of this sounds desirable, all you need to do to start using
Semantic Versioning is to declare that you are doing so and then follow
the rules. Link to this website from your README so others know the
rules and can benefit from them.
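The precedence rules above can be sketched as a small Python sort key.
This is an illustrative toy under the assumptions of this spec's version
format, not pbr's actual parser; the names `version_key` and
`PHASE_ORDER` are invented here for demonstration:

```python
import re

# Pre-release phase ordering per the spec: dev < a < b < rc < final.
# Hypothetical helper for illustration only -- not pbr's real code.
PHASE_ORDER = {'dev': 0, 'a': 1, 'b': 2, 'rc': 3}
FINAL = 4  # a normal release sorts after any pre-release of the same X.Y.Z

def version_key(version):
    # Matches e.g. "1.0.0", "1.0.0.a1", "1.0.0.dev9" (no build metadata).
    m = re.match(r'^(\d+)\.(\d+)\.(\d+)(?:\.(dev|a|b|rc)(\d+))?$', version)
    major, minor, patch, phase, number = m.groups()
    if phase is None:
        return (int(major), int(minor), int(patch), FINAL, 0)
    return (int(major), int(minor), int(patch),
            PHASE_ORDER[phase], int(number))

versions = ['1.0.0', '1.0.0.rc1', '1.0.0.dev9',
            '1.0.0.a1', '1.0.0.b2', '1.0.0.dev8']
print(sorted(versions, key=version_key))
# ['1.0.0.dev8', '1.0.0.dev9', '1.0.0.a1', '1.0.0.b2', '1.0.0.rc1', '1.0.0']
```

The sorted output reproduces the example ordering from the precedence
rule: 1.0.0.dev8 < 1.0.0.dev9 < 1.0.0.a1 < 1.0.0.b2 < 1.0.0.rc1 < 1.0.0.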
FAQ
---

How should I deal with revisions in the 0.y.z initial development phase?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The simplest thing to do is start your initial development release at
0.1.0 and then increment the minor version for each subsequent release.

How do I know when to release 1.0.0?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

If your software is being used in production, it should probably already
be 1.0.0. If you have a stable API on which users have come to depend,
you should be 1.0.0. If you're worrying a lot about backwards
compatibility, you should probably already be 1.0.0.

Doesn't this discourage rapid development and fast iteration?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Major version zero is all about rapid development. If you're changing
the API every day you should either still be in version 0.y.z or on a
separate development branch working on the next major version.

If even the tiniest backwards incompatible changes to the public API require a major version bump, won't I end up at version 42.0.0 very rapidly?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

This is a question of responsible development and foresight.
Incompatible changes should not be introduced lightly to software that
has a lot of dependent code. The cost that must be incurred to upgrade
can be significant. Having to bump major versions to release
incompatible changes means you'll think through the impact of your
changes, and evaluate the cost/benefit ratio involved.

Documenting the entire public API is too much work!
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

It is your responsibility as a professional developer to properly
document software that is intended for use by others.
Managing software complexity is a hugely important part of keeping a
project efficient, and that's hard to do if nobody knows how to use your
software, or what methods are safe to call. In the long run, Semantic
Versioning, and the insistence on a well defined public API, can keep
everyone and everything running smoothly.

What do I do if I accidentally release a backwards incompatible change as a minor version?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

As soon as you realize that you've broken the Semantic Versioning spec,
fix the problem and release a new minor version that corrects the
problem and restores backwards compatibility. Even under this
circumstance, it is unacceptable to modify versioned releases. If it's
appropriate, document the offending version and inform your users of the
problem so that they are aware of it.

What should I do if I update my own dependencies without changing the public API?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

That would be considered compatible since it does not affect the public
API. Software that explicitly depends on the same dependencies as your
package should have its own dependency specifications, and the author
will notice any conflicts. Determining whether the change is a patch
level or minor level modification depends on whether you updated your
dependencies in order to fix a bug or to introduce new functionality. I
would usually expect additional code for the latter instance, in which
case it's obviously a minor level increment.

What if I inadvertently alter the public API in a way that is not compliant with the version number change (i.e. the code incorrectly introduces a major breaking change in a patch release)?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Use your best judgment. If you have a huge audience that will be
drastically impacted by changing the behavior back to what the public
API intended, then it may be best to perform a major version release,
even though the fix could strictly be considered a patch release.
Remember, Semantic Versioning is all about conveying meaning by how the
version number changes. If these changes are important to your users,
use the version number to inform them.

How should I handle deprecating functionality?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Deprecating existing functionality is a normal part of software
development and is often required to make forward progress. When you
deprecate part of your public API, you should do two things: (1) update
your documentation to let users know about the change, (2) issue a new
minor release with the deprecation in place. Before you completely
remove the functionality in a new major release there should be at least
one minor release that contains the deprecation so that users can
smoothly transition to the new API.

Does SemVer have a size limit on the version string?
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

No, but use good judgment. A 255 character version string is probably
overkill, for example. Also, specific systems may impose their own
limits on the size of the string.

About
-----

The Linux Compatible Semantic Versioning specification was modified by
Monty Taylor, member of The Satori Group, co-founder of OpenStack and
Free Software Hacker.
It was based on The Semantic Versioning specification, which was
authored by Tom Preston-Werner, inventor of Gravatars and cofounder of
GitHub, with input from PEP 440, which was authored by Nick Coghlan, who
is a core Python developer and generally a great guy. I don't really
know which things Nick invented or co-founded, and I'm not really sure
why we'd need to list those here, but Tom did, so I figured coding style
is usually about sticking to the style that was there before you showed
up.

If you'd like to leave feedback, please open an issue on GitHub.

License
-------

Creative Commons - CC BY 3.0 http://creativecommons.org/licenses/by/3.0/
pbr-0.7.0/doc/source/index.rst0000664000175300017540000001661012312051507017344 0ustar jenkinsjenkins00000000000000=================================
pbr - Python Build Reasonableness
=================================

A library for managing setuptools packaging needs in a consistent
manner.

`pbr` reads and then filters the `setup.cfg` data through a setup hook
to fill in default values and provide more sensible behaviors, and then
feeds the results in as the arguments to a call to `setup.py` - so the
heavy lifting of handling Python packaging needs is still being done by
`setuptools`.

What It Does
============

PBR can and does do a bunch of things for you:

* **Version**: Manage version number based on git revisions and tags
* **AUTHORS**: Generate AUTHORS file from git log
* **ChangeLog**: Generate ChangeLog from git log
* **Sphinx Autodoc**: Generate autodoc stub files for your whole module
* **Requirements**: Store your dependencies in a pip requirements file
* **long_description**: Use your README file as a long_description
* **Smart find_packages**: Smartly find packages under your root package

Version
-------

Version strings will be inferred from git. If a given revision is
tagged, that's the version. If it's not, and you don't provide a
version, the version will be very similar to git describe.
If you do, then we'll assume that's the version you are working towards,
and will generate alpha version strings based on commits since the last
tag and the current git sha. The versions are expected to be compliant
with :doc:`semver`.

AUTHORS and ChangeLog
---------------------

Why keep an AUTHORS or a ChangeLog file, when git already has all of the
information you need? AUTHORS generation supports filtering/combining
based on a standard .mailmap file.

Sphinx Autodoc
--------------

Sphinx can produce auto documentation indexes based on signatures and
docstrings of your project - but you have to give it index files to tell
it to autodoc each module. That's kind of repetitive and boring. PBR
will scan your project, find all of your modules, and generate all of
the stub files for you.

Sphinx documentation setups are altered to generate man pages by
default. They also have several pieces of information that are known to
setup.py injected into the sphinx config.

Requirements
------------

You may not have noticed, but there are differences in how pip
requirements.txt files work and how distutils wants to be told about
requirements. The pip way is nicer, because it sure does make it easier
to populate a virtualenv for testing, or to just install everything you
need. Duplicating the information, though, is super lame. So PBR will
let you keep requirements.txt format files around describing the
requirements for your project, will parse them and split them up
appropriately, and inject them into the install_requires and/or
tests_require and/or dependency_links arguments to setup. Voila!

You can also have a requirement file for each specific major version of
Python. If you want to have a different package list for Python 3, just
drop a requirements-py3.txt, and it will be used instead.
The requirement files are tried in that order (N being the Python major
version number used to install the package):

* requirements-pyN.txt
* tools/pip-requires-pyN
* requirements.txt
* tools/pip-requires

Only the first file found is used to install the list of packages it
contains.

long_description
----------------

There is no need to maintain two long descriptions - and your README
file is probably a good long_description. So we'll just inject the
contents of your README.rst, README.txt or README file into your empty
long_description. Yay for you.

Usage
=====

pbr requires a distribution to use distribute. Your distribution must
include a distutils2-like setup.cfg file, and a minimal setup.py script.

A simple sample can be found in pbr's own setup.cfg (it uses its own
machinery to install itself)::

    [metadata]
    name = pbr
    author = OpenStack Foundation
    author-email = openstack-dev@lists.openstack.org
    summary = OpenStack's setup automation in a reusable form
    description-file = README
    license = Apache-2
    classifier =
        Development Status :: 4 - Beta
        Environment :: Console
        Environment :: OpenStack
        Intended Audience :: Developers
        Intended Audience :: Information Technology
        License :: OSI Approved :: Apache Software License
        Operating System :: OS Independent
        Programming Language :: Python
    keywords =
        setup
        distutils

    [files]
    packages =
        pbr
    data_files =
        etc/pbr = etc/*
        etc/init =
            pbr.packaging.conf
            pbr.version.conf

    [entry_points]
    console_scripts =
        pbr = pbr.cmd:main
    pbr.config.drivers =
        plain = pbr.cfg.driver:Plain

The minimal setup.py should look something like this::

    #!/usr/bin/env python

    from setuptools import setup

    setup(
        setup_requires=['pbr'],
        pbr=True,
    )

Note that it's important to specify `pbr=True` or else the pbr
functionality will not be enabled. It should also work fine if
additional arguments are passed to `setup()`, but it should be noted
that they will be clobbered by any options in the setup.cfg file.
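The first-match file selection described above can be sketched as a
small helper. This is an illustrative sketch only; the function name
`pick_requirements_file` is an assumption introduced here, and it is not
pbr's actual implementation:

```python
import os
import sys


def pick_requirements_file(major=None):
    # Candidate files in the order described above; the first one that
    # exists wins. Hypothetical helper, not pbr's real logic.
    if major is None:
        major = sys.version_info[0]
    candidates = [
        'requirements-py%d.txt' % major,
        'tools/pip-requires-py%d' % major,
        'requirements.txt',
        'tools/pip-requires',
    ]
    for name in candidates:
        if os.path.exists(name):
            return name
    return None
```

For a Python 3 install in a tree that contains both requirements.txt and
requirements-py3.txt, this picks requirements-py3.txt, matching the
"first file found" behavior described above.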
files
-----

The format of the files section is worth explaining. There are three
fundamental keys one is likely to care about, `packages`,
`namespace_packages`, and `data_files`.

`packages` is a list of top-level packages that should be installed. The
behavior of packages is similar to `setuptools.find_packages` in that it
recurses the python package hierarchy below the given top level and
installs all of it. If `packages` is not specified, it defaults to the
name given in the `[metadata]` section.

`namespace_packages` is the same, but is a list of packages that provide
namespace packages.

`data_files` lists files to be installed. The format is an indented
block that contains key value pairs which specify target directory and
source file to install there. More than one source file for a directory
may be indicated with a further indented list. Source files are stripped
of leading directories. Additionally, `pbr` supports a simple file
globbing syntax for installing entire directory structures, so::

    [files]
    data_files =
        etc/pbr = etc/pbr/*
        etc/neutron =
            etc/api-paste.ini
            etc/dhcp-agent.ini
        etc/init.d = neutron.init

will result in `/etc/neutron` containing `api-paste.ini` and
`dhcp-agent.ini`, both of which pbr will expect to find in the `etc`
directory in the root of the source tree. Additionally, `neutron.init`
from that dir will be installed in `/etc/init.d`. All of the files and
directories located under `etc/pbr` in the source tree will be installed
into `/etc/pbr`.

entry_points
------------

The general syntax of specifying entry points is a top level name
indicating the entry point group name, followed by one or more key value
pairs naming the entry point to be installed. For instance::

    [entry_points]
    console_scripts =
        pbr = pbr.cmd:main
    pbr.config.drivers =
        plain = pbr.cfg.driver:Plain
        fancy = pbr.cfg.driver:Fancy

will cause a console script called `pbr` to be installed that executes
the `main` function found in `pbr.cmd`.
Additionally, two entry points will be installed for
`pbr.config.drivers`, one called `plain` which maps to the `Plain` class
in `pbr.cfg.driver` and one called `fancy` which maps to the `Fancy`
class in `pbr.cfg.driver`.

Additional Docs
===============

.. toctree::
   :maxdepth: 1

   packagers
   semver

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
pbr-0.7.0/doc/source/_theme/0000775000175300017540000000000012312051536016742 5ustar jenkinsjenkins00000000000000pbr-0.7.0/doc/source/_theme/theme.conf0000664000175300017540000000010712312051507020707 0ustar jenkinsjenkins00000000000000[theme]
inherit = basic
stylesheet = nature.css
pygments_style = tango
pbr-0.7.0/doc/source/_theme/layout.html0000664000175300017540000000730612312051507021151 0ustar jenkinsjenkins00000000000000{% extends "basic/layout.html" %}
{% set css_files = css_files + ['_static/tweaks.css'] %}
{% set script_files = script_files + ['_static/jquery.tweet.js'] %}

{%- macro sidebar() %}
{%- if not embedded %}{% if not theme_nosidebar|tobool %}
{%- block sidebarlogo %} {%- if logo %} {%- endif %} {%- endblock %} {%- block sidebartoc %} {%- if display_toc %}

{{ _('Table Of Contents') }}

{{ toc }} {%- endif %} {%- endblock %} {%- block sidebarrel %} {%- if prev %}

{{ _('Previous topic') }}

{{ prev.title }}

{%- endif %} {%- if next %}

{{ _('Next topic') }}

{{ next.title }}

{%- endif %} {%- endblock %} {%- block sidebarsourcelink %} {%- if show_source and has_source and sourcename %}

{{ _('This Page') }}

{%- endif %} {%- endblock %} {%- if customsidebar %} {% include customsidebar %} {%- endif %} {%- block sidebarsearch %} {%- if pagename != "search" %} {%- endif %} {%- endblock %}
{%- endif %}{% endif %}
{%- endmacro %}

{% block relbar1 %}{% endblock relbar1 %}

{% block header %}
{% endblock %}
pbr-0.7.0/doc/source/_templates/0000775000175300017540000000000012312051536017636 5ustar jenkinsjenkins00000000000000pbr-0.7.0/doc/source/_templates/.placeholder0000664000175300017540000000000012312051507022105 0ustar jenkinsjenkins00000000000000pbr-0.7.0/doc/source/conf.py0000664000175300017540000000366412312051507017007 0ustar jenkinsjenkins00000000000000# -*- coding: utf-8 -*-

import os
import sys

sys.path.insert(0, os.path.abspath('../..'))

# -- General configuration ----------------------------------------------------

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.autodoc', 'sphinx.ext.intersphinx']

# autodoc generation is a bit aggressive and a nuisance when doing heavy
# text edit cycles.
# execute "export SPHINX_DEBUG=1" in your terminal to disable

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix of source filenames.
source_suffix = '.rst'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = 'pbr'
copyright = '2013, OpenStack Foundation'

# If true, '()' will be appended to :func: etc. cross-reference text.
add_function_parentheses = True

# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
add_module_names = True

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# -- Options for HTML output --------------------------------------------------

# The theme to use for HTML and HTML Help pages. Major themes that come with
# Sphinx are currently 'default' and 'sphinxdoc'.
html_theme_path = ["."]
html_theme = '_theme'
html_static_path = ['static']

# Output file base name for HTML help builder.
htmlhelp_basename = '%sdoc' % project # Grouping the document tree into LaTeX files. List of tuples # (source start file, target name, title, author, documentclass # [howto/manual]). latex_documents = [ ('index', '%s.tex' % project, '%s Documentation' % project, 'OpenStack Foundation', 'manual'), ] # Example configuration for intersphinx: refer to the Python standard library. intersphinx_mapping = {'http://docs.python.org/': None} pbr-0.7.0/doc/source/static/0000775000175300017540000000000012312051536016770 5ustar jenkinsjenkins00000000000000pbr-0.7.0/doc/source/static/basic.css0000664000175300017540000001462512312051507020571 0ustar jenkinsjenkins00000000000000/** * Sphinx stylesheet -- basic theme * ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ */ /* -- main layout ----------------------------------------------------------- */ div.clearer { clear: both; } /* -- relbar ---------------------------------------------------------------- */ div.related { width: 100%; font-size: 90%; } div.related h3 { display: none; } div.related ul { margin: 0; padding: 0 0 0 10px; list-style: none; } div.related li { display: inline; } div.related li.right { float: right; margin-right: 5px; } /* -- sidebar --------------------------------------------------------------- */ div.sphinxsidebarwrapper { padding: 10px 5px 0 10px; } div.sphinxsidebar { float: left; width: 230px; margin-left: -100%; font-size: 90%; } div.sphinxsidebar ul { list-style: none; } div.sphinxsidebar ul ul, div.sphinxsidebar ul.want-points { margin-left: 20px; list-style: square; } div.sphinxsidebar ul ul { margin-top: 0; margin-bottom: 0; } div.sphinxsidebar form { margin-top: 10px; } div.sphinxsidebar input { border: 1px solid #98dbcc; font-family: sans-serif; font-size: 1em; } img { border: 0; } /* -- search page ----------------------------------------------------------- */ ul.search { margin: 10px 0 0 20px; padding: 0; } ul.search li { padding: 5px 0 5px 20px; background-image: url(file.png); background-repeat: no-repeat; 
background-position: 0 7px; } ul.search li a { font-weight: bold; } ul.search li div.context { color: #888; margin: 2px 0 0 30px; text-align: left; } ul.keywordmatches li.goodmatch a { font-weight: bold; } /* -- index page ------------------------------------------------------------ */ table.contentstable { width: 90%; } table.contentstable p.biglink { line-height: 150%; } a.biglink { font-size: 1.3em; } span.linkdescr { font-style: italic; padding-top: 5px; font-size: 90%; } /* -- general index --------------------------------------------------------- */ table.indextable td { text-align: left; vertical-align: top; } table.indextable dl, table.indextable dd { margin-top: 0; margin-bottom: 0; } table.indextable tr.pcap { height: 10px; } table.indextable tr.cap { margin-top: 10px; background-color: #f2f2f2; } img.toggler { margin-right: 3px; margin-top: 3px; cursor: pointer; } /* -- general body styles --------------------------------------------------- */ a.headerlink { visibility: hidden; } h1:hover > a.headerlink, h2:hover > a.headerlink, h3:hover > a.headerlink, h4:hover > a.headerlink, h5:hover > a.headerlink, h6:hover > a.headerlink, dt:hover > a.headerlink { visibility: visible; } div.body p.caption { text-align: inherit; } div.body td { text-align: left; } .field-list ul { padding-left: 1em; } .first { } p.rubric { margin-top: 30px; font-weight: bold; } /* -- sidebars -------------------------------------------------------------- */ div.sidebar { margin: 0 0 0.5em 1em; border: 1px solid #ddb; padding: 7px 7px 0 7px; background-color: #ffe; width: 40%; float: right; } p.sidebar-title { font-weight: bold; } /* -- topics ---------------------------------------------------------------- */ div.topic { border: 1px solid #ccc; padding: 7px 7px 0 7px; margin: 10px 0 10px 0; } p.topic-title { font-size: 1.1em; font-weight: bold; margin-top: 10px; } /* -- admonitions ----------------------------------------------------------- */ div.admonition { margin-top: 10px; 
margin-bottom: 10px; padding: 7px; } div.admonition dt { font-weight: bold; } div.admonition dl { margin-bottom: 0; } p.admonition-title { margin: 0px 10px 5px 0px; font-weight: bold; } div.body p.centered { text-align: center; margin-top: 25px; } /* -- tables ---------------------------------------------------------------- */ table.docutils { border: 0; border-collapse: collapse; } table.docutils td, table.docutils th { padding: 1px 8px 1px 0; border-top: 0; border-left: 0; border-right: 0; border-bottom: 1px solid #aaa; } table.field-list td, table.field-list th { border: 0 !important; } table.footnote td, table.footnote th { border: 0 !important; } th { text-align: left; padding-right: 5px; } /* -- other body styles ----------------------------------------------------- */ dl { margin-bottom: 15px; } dd p { margin-top: 0px; } dd ul, dd table { margin-bottom: 10px; } dd { margin-top: 3px; margin-bottom: 10px; margin-left: 30px; } dt:target, .highlight { background-color: #fbe54e; } dl.glossary dt { font-weight: bold; font-size: 1.1em; } .field-list ul { margin: 0; padding-left: 1em; } .field-list p { margin: 0; } .refcount { color: #060; } .optional { font-size: 1.3em; } .versionmodified { font-style: italic; } .system-message { background-color: #fda; padding: 5px; border: 3px solid red; } .footnote:target { background-color: #ffa } .line-block { display: block; margin-top: 1em; margin-bottom: 1em; } .line-block .line-block { margin-top: 0; margin-bottom: 0; margin-left: 1.5em; } /* -- code displays --------------------------------------------------------- */ pre { overflow: auto; } td.linenos pre { padding: 5px 0px; border: 0; background-color: transparent; color: #aaa; } table.highlighttable { margin-left: 0.5em; } table.highlighttable td { padding: 0 0.5em 0 0.5em; } tt.descname { background-color: transparent; font-weight: bold; font-size: 1.2em; } tt.descclassname { background-color: transparent; } tt.xref, a tt { background-color: transparent; font-weight: 
bold; } h1 tt, h2 tt, h3 tt, h4 tt, h5 tt, h6 tt { background-color: transparent; } /* -- math display ---------------------------------------------------------- */ img.math { vertical-align: middle; } div.body div.math p { text-align: center; } span.eqno { float: right; } /* -- printout stylesheet --------------------------------------------------- */ @media print { div.document, div.documentwrapper, div.bodywrapper { margin: 0 !important; width: 100%; } div.sphinxsidebar, div.related, div.footer, #top-link { display: none; } } pbr-0.7.0/doc/source/static/header-line.gif0000664000175300017540000000006012312051507021626 0ustar jenkinsjenkins00000000000000GIF89a!,Q;pbr-0.7.0/doc/source/static/jquery.tweet.js0000664000175300017540000001635412312051507022003 0ustar jenkinsjenkins00000000000000(function($) { $.fn.tweet = function(o){ var s = { username: ["seaofclouds"], // [string] required, unless you want to display our tweets. :) it can be an array, just do ["username1","username2","etc"] list: null, //[string] optional name of list belonging to username avatar_size: null, // [integer] height and width of avatar if displayed (48px max) count: 3, // [integer] how many tweets to display? intro_text: null, // [string] do you want text BEFORE your your tweets? outro_text: null, // [string] do you want text AFTER your tweets? join_text: null, // [string] optional text in between date and tweet, try setting to "auto" auto_join_text_default: "i said,", // [string] auto text for non verb: "i said" bullocks auto_join_text_ed: "i", // [string] auto text for past tense: "i" surfed auto_join_text_ing: "i am", // [string] auto tense for present tense: "i was" surfing auto_join_text_reply: "i replied to", // [string] auto tense for replies: "i replied to" @someone "with" auto_join_text_url: "i was looking at", // [string] auto tense for urls: "i was looking at" http:... 
loading_text: null, // [string] optional loading text, displayed while tweets load query: null // [string] optional search query }; if(o) $.extend(s, o); $.fn.extend({ linkUrl: function() { var returning = []; var regexp = /((ftp|http|https):\/\/(\w+:{0,1}\w*@)?(\S+)(:[0-9]+)?(\/|\/([\w#!:.?+=&%@!\-\/]))?)/gi; this.each(function() { returning.push(this.replace(regexp,"$1")); }); return $(returning); }, linkUser: function() { var returning = []; var regexp = /[\@]+([A-Za-z0-9-_]+)/gi; this.each(function() { returning.push(this.replace(regexp,"@$1")); }); return $(returning); }, linkHash: function() { var returning = []; var regexp = / [\#]+([A-Za-z0-9-_]+)/gi; this.each(function() { returning.push(this.replace(regexp, ' #$1')); }); return $(returning); }, capAwesome: function() { var returning = []; this.each(function() { returning.push(this.replace(/\b(awesome)\b/gi, '$1')); }); return $(returning); }, capEpic: function() { var returning = []; this.each(function() { returning.push(this.replace(/\b(epic)\b/gi, '$1')); }); return $(returning); }, makeHeart: function() { var returning = []; this.each(function() { returning.push(this.replace(/(<)+[3]/gi, "")); }); return $(returning); } }); function relative_time(time_value) { var parsed_date = Date.parse(time_value); var relative_to = (arguments.length > 1) ? arguments[1] : new Date(); var delta = parseInt((relative_to.getTime() - parsed_date) / 1000); var pluralize = function (singular, n) { return '' + n + ' ' + singular + (n == 1 ? '' : 's'); }; if(delta < 60) { return 'less than a minute ago'; } else if(delta < (45*60)) { return 'about ' + pluralize("minute", parseInt(delta / 60)) + ' ago'; } else if(delta < (24*60*60)) { return 'about ' + pluralize("hour", parseInt(delta / 3600)) + ' ago'; } else { return 'about ' + pluralize("day", parseInt(delta / 86400)) + ' ago'; } } function build_url() { var proto = ('https:' == document.location.protocol ? 
'https:' : 'http:'); if (s.list) { return proto+"//api.twitter.com/1/"+s.username[0]+"/lists/"+s.list+"/statuses.json?per_page="+s.count+"&callback=?"; } else if (s.query == null && s.username.length == 1) { return proto+'//twitter.com/status/user_timeline/'+s.username[0]+'.json?count='+s.count+'&callback=?'; } else { var query = (s.query || 'from:'+s.username.join('%20OR%20from:')); return proto+'//search.twitter.com/search.json?&q='+query+'&rpp='+s.count+'&callback=?'; } } return this.each(function(){ var list = $('
    ').appendTo(this); var intro = '

    '+s.intro_text+'

    '; var outro = '

    '+s.outro_text+'

    '; var loading = $('

    '+s.loading_text+'

    '); if(typeof(s.username) == "string"){ s.username = [s.username]; } if (s.loading_text) $(this).append(loading); $.getJSON(build_url(), function(data){ if (s.loading_text) loading.remove(); if (s.intro_text) list.before(intro); $.each((data.results || data), function(i,item){ // auto join text based on verb tense and content if (s.join_text == "auto") { if (item.text.match(/^(@([A-Za-z0-9-_]+)) .*/i)) { var join_text = s.auto_join_text_reply; } else if (item.text.match(/(^\w+:\/\/[A-Za-z0-9-_]+\.[A-Za-z0-9-_:%&\?\/.=]+) .*/i)) { var join_text = s.auto_join_text_url; } else if (item.text.match(/^((\w+ed)|just) .*/im)) { var join_text = s.auto_join_text_ed; } else if (item.text.match(/^(\w*ing) .*/i)) { var join_text = s.auto_join_text_ing; } else { var join_text = s.auto_join_text_default; } } else { var join_text = s.join_text; }; var from_user = item.from_user || item.user.screen_name; var profile_image_url = item.profile_image_url || item.user.profile_image_url; var join_template = ' '+join_text+' '; var join = ((s.join_text) ? join_template : ' '); var avatar_template = ''+from_user+'\'s avatar'; var avatar = (s.avatar_size ? avatar_template : ''); var date = ''+relative_time(item.created_at)+''; var text = '' +$([item.text]).linkUrl().linkUser().linkHash().makeHeart().capAwesome().capEpic()[0]+ ''; // until we create a template option, arrange the items below to alter a tweet's display. list.append('
  • ' + avatar + date + join + text + '
  • '); list.children('li:first').addClass('tweet_first'); list.children('li:odd').addClass('tweet_even'); list.children('li:even').addClass('tweet_odd'); }); if (s.outro_text) list.after(outro); }); }); }; })(jQuery); pbr-0.7.0/doc/source/static/openstack_logo.png0000664000175300017540000000712612312051507022511 0ustar jenkinsjenkins00000000000000[binary PNG data omitted]
pbr-0.7.0/doc/source/static/tweaks.css0000664000175300017540000000310312312051507020773 0ustar jenkinsjenkins00000000000000body { background: #fff url(../_static/header_bg.jpg) top left no-repeat; } #header { width: 950px; margin: 0 auto; height: 102px; } #header h1#logo { background: url(../_static/openstack_logo.png) top left no-repeat; display: block; float: left; text-indent: -9999px; width: 175px; height: 55px; } #navigation { background: url(../_static/header-line.gif) repeat-x 0 bottom; display: block; float: left; margin: 27px 0 0 25px; padding: 0; } #navigation li{ float: left; display: block; margin-right: 25px; } #navigation li a { display: block; font-weight: normal; text-decoration: none; background-position: 50% 0; padding: 20px 0 5px; color: #353535; font-size: 14px; } #navigation li a.current, #navigation li a.section { border-bottom: 3px solid #cf2f19; color: #cf2f19; } div.related { background-color: #cde2f8; border: 1px solid #b0d3f8; } div.related a { color: #4078ba; text-shadow: none; } div.sphinxsidebarwrapper { padding-top: 0; } pre { color: #555; } div.documentwrapper h1, div.documentwrapper h2, div.documentwrapper h3, div.documentwrapper h4, div.documentwrapper h5, div.documentwrapper h6 { font-family: 'PT Sans', sans-serif !important; color: #264D69; border-bottom: 1px dotted #C5E2EA; padding: 0; background: none; padding-bottom: 5px; } div.documentwrapper h3 { color: #CF2F19; } a.headerlink { color: #fff !important; margin-left: 5px; background: #CF2F19 !important; } div.body { margin-top: -25px; margin-left: 230px; } div.document { width: 960px; margin: 0 auto; }pbr-0.7.0/doc/source/static/nature.css0000664000175300017540000001015312312051507020776 0ustar jenkinsjenkins00000000000000/* * nature.css_t * ~~~~~~~~~~~~ * * Sphinx stylesheet
-- nature theme. * * :copyright: Copyright 2007-2011 by the Sphinx team, see AUTHORS. * :license: BSD, see LICENSE for details. * */ @import url("basic.css"); /* -- page layout ----------------------------------------------------------- */ body { font-family: Arial, sans-serif; font-size: 100%; background-color: #111; color: #555; margin: 0; padding: 0; } div.documentwrapper { float: left; width: 100%; } div.bodywrapper { margin: 0 0 0 {{ theme_sidebarwidth|toint }}px; } hr { border: 1px solid #B1B4B6; } div.document { background-color: #eee; } div.body { background-color: #ffffff; color: #3E4349; padding: 0 30px 30px 30px; font-size: 0.9em; } div.footer { color: #555; width: 100%; padding: 13px 0; text-align: center; font-size: 75%; } div.footer a { color: #444; text-decoration: underline; } div.related { background-color: #6BA81E; line-height: 32px; color: #fff; text-shadow: 0px 1px 0 #444; font-size: 0.9em; } div.related a { color: #E2F3CC; } div.sphinxsidebar { font-size: 0.75em; line-height: 1.5em; } div.sphinxsidebarwrapper{ padding: 20px 0; } div.sphinxsidebar h3, div.sphinxsidebar h4 { font-family: Arial, sans-serif; color: #222; font-size: 1.2em; font-weight: normal; margin: 0; padding: 5px 10px; background-color: #ddd; text-shadow: 1px 1px 0 white } div.sphinxsidebar h4{ font-size: 1.1em; } div.sphinxsidebar h3 a { color: #444; } div.sphinxsidebar p { color: #888; padding: 5px 20px; } div.sphinxsidebar p.topless { } div.sphinxsidebar ul { margin: 10px 20px; padding: 0; color: #000; } div.sphinxsidebar a { color: #444; } div.sphinxsidebar input { border: 1px solid #ccc; font-family: sans-serif; font-size: 1em; } div.sphinxsidebar input[type=text]{ margin-left: 20px; } /* -- body styles ----------------------------------------------------------- */ a { color: #005B81; text-decoration: none; } a:hover { color: #E32E00; text-decoration: underline; } div.body h1, div.body h2, div.body h3, div.body h4, div.body h5, div.body h6 { font-family: Arial, sans-serif; 
 background-color: #BED4EB; font-weight: normal; color: #212224; margin: 30px 0px 10px 0px; padding: 5px 0 5px 10px; text-shadow: 0px 1px 0 white } div.body h1 { border-top: 20px solid white; margin-top: 0; font-size: 200%; } div.body h2 { font-size: 150%; background-color: #C8D5E3; } div.body h3 { font-size: 120%; background-color: #D8DEE3; } div.body h4 { font-size: 110%; background-color: #D8DEE3; } div.body h5 { font-size: 100%; background-color: #D8DEE3; } div.body h6 { font-size: 100%; background-color: #D8DEE3; } a.headerlink { color: #c60f0f; font-size: 0.8em; padding: 0 4px 0 4px; text-decoration: none; } a.headerlink:hover { background-color: #c60f0f; color: white; } div.body p, div.body dd, div.body li { line-height: 1.5em; } div.admonition p.admonition-title + p { display: inline; } div.highlight{ background-color: white; } div.note { background-color: #eee; border: 1px solid #ccc; } div.seealso { background-color: #ffc; border: 1px solid #ff6; } div.topic { background-color: #eee; } div.warning { background-color: #ffe4e4; border: 1px solid #f66; } p.admonition-title { display: inline; } p.admonition-title:after { content: ":"; } pre { padding: 10px; background-color: White; color: #222; line-height: 1.2em; border: 1px solid #C6C9CB; font-size: 1.1em; margin: 1.5em 0 1.5em 0; -webkit-box-shadow: 1px 1px 1px #d8d8d8; -moz-box-shadow: 1px 1px 1px #d8d8d8; } tt { background-color: #ecf0f3; color: #222; /* padding: 1px 2px; */ font-size: 1.1em; font-family: monospace; } .viewcode-back { font-family: Arial, sans-serif; } div.viewcode-block:target { background-color: #f4debf; border-top: 1px solid #ac9; border-bottom: 1px solid #ac9; } pbr-0.7.0/doc/source/static/header_bg.jpg0000664000175300017540000000723212312051507021374 0ustar jenkinsjenkins00000000000000[binary JPEG data omitted]
 [binary JPEG data continued, omitted; the tar header and opening lines of the embedded shell script were lost in the binary region] * ]]; then echo "${line%%>*}" elif [[ "$line" == *\=* ]]; then echo "${line%%=*}" elif [[ "$line" == *\<* ]]; then echo "${line%%<*}" else echo "${line%%#*}" fi done < $FILE done set -x } # BASE should be a directory with a subdir called "new" and in that # dir, there should be a git repository for every entry in PROJECTS BASE=${BASE:-/opt/stack} REPODIR=${REPODIR:-$BASE/new} # TODO: Figure out how to get this on to the box properly sudo apt-get install -y --force-yes libxml2-dev libxslt-dev libmysqlclient-dev libpq-dev libnspr4-dev pkg-config libsqlite3-dev libzmq-dev libffi-dev libldap2-dev libsasl2-dev tmpdir=$(mktemp -d) whoami=$(whoami) tmpdownload=$tmpdir/download mkdir -p $tmpdownload pypidir=/var/www/pypi sudo mkdir -p $pypidir sudo chown $USER $pypidir pypimirrorvenv=$tmpdir/pypi-mirror sudo touch $HOME/pip.log sudo chown $USER $HOME/pip.log rm -f ~/.pip/pip.conf ~/.pydistutils.cfg mkdir -p ~/.pip cat <<EOF > ~/.pip/pip.conf [global] log = $HOME/pip.log EOF pypimirrorsourcedir=$tmpdir/pypimirrorsourcedir git clone $REPODIR/pypi-mirror $pypimirrorsourcedir mkvenv $pypimirrorvenv $pypimirrorvenv/bin/pip install -e $pypimirrorsourcedir cat <<EOF > $tmpdir/mirror.yaml cache-root: $tmpdownload
mirrors: - name: openstack projects: - file://$REPODIR/requirements output: $pypidir EOF # wheel mirrors are below a dir level containing distro and release # because the wheel format itself does not distinguish distro=`lsb_release -i -r -s | xargs | tr ' ' '-'` # set up local apache to serve the mirror we're about to create if [ ! -d /etc/apache2/sites-enabled/ ] ; then echo "Apache does not seem to be installed!!!" exit 1 fi sudo rm /etc/apache2/sites-enabled/* cat <<EOF > $tmpdir/pypi.conf <VirtualHost *:80> ServerAdmin webmaster@localhost DocumentRoot /var/www <Directory /var/www> Options Indexes FollowSymLinks </Directory> </VirtualHost> EOF sudo mv $tmpdir/pypi.conf /etc/apache2/sites-available/pypi sudo chown root:root /etc/apache2/sites-available/pypi sudo a2ensite pypi sudo service apache2 reload # PROJECTS is a list of projects that we're testing PROJECTS=$* pbrsdistdir=$tmpdir/pbrsdist git clone $REPODIR/pbr $pbrsdistdir cd $pbrsdistdir # Note the -b argument here is essentially a noop, as # --no-update is passed as well. The one thing -b does give us # is that it makes run-mirror install dependencies once instead # of over and over for every branch it can find.
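The `distro=` line above derives a single path-friendly label from the multi-line output of `lsb_release -i -r -s`. A minimal sketch of that normalization, using a hard-coded sample string (an assumption standing in for real `lsb_release` output, since that tool may not be installed):

```shell
# Sketch: normalize multi-line `lsb_release -i -r -s` output into a
# dash-joined label, as the script above does for its wheel-mirror
# directory. "Ubuntu"/"12.04" are hypothetical sample values.
sample_release_output="$(printf 'Ubuntu\n12.04\n')"
# xargs collapses the newlines to single spaces; tr then replaces the
# spaces with dashes, yielding one token safe to use as a dir name.
distro="$(echo "$sample_release_output" | xargs | tr ' ' '-')"
echo "$distro"   # prints: Ubuntu-12.04
```

The same pipeline works for distributors whose names already contain spaces, which is why the script pipes through `xargs` before `tr` instead of deleting whitespace outright.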
$pypimirrorvenv/bin/run-mirror -b remotes/origin/master --no-update --verbose -c $tmpdir/mirror.yaml --no-process --export=$HOME/mirror_package_list.txt # Compare packages in the mirror with the list of requirements gen_bare_package_list "$REPODIR/requirements/global-requirements.txt $REPODIR/requirements/dev-requirements.txt" > bare_all_requirements.txt gen_bare_package_list $HOME/mirror_package_list.txt > bare_mirror_package_list.txt echo "Diff between python mirror packages and requirements packages:" grep -v -f bare_all_requirements.txt bare_mirror_package_list.txt > diff_requirements_mirror.txt cat diff_requirements_mirror.txt $pypimirrorvenv/bin/pip install -i http://pypi.python.org/simple -d $tmpdownload/pip/openstack 'pip==1.0' 'setuptools>=0.7' 'd2to1' $pypimirrorvenv/bin/pip install -i http://pypi.python.org/simple -d $tmpdownload/pip/openstack -r requirements.txt $pypimirrorvenv/bin/python setup.py sdist -d $tmpdownload/pip/openstack $pypimirrorvenv/bin/run-mirror -b remotes/origin/master --no-update --verbose -c $tmpdir/mirror.yaml --no-download find $pypidir -type f -name '*.html' -delete find $pypidir # Point easy_install and pip at the local mirror pypiurl=http://localhost/pypi export no_proxy=$no_proxy${no_proxy:+,}localhost cat <<EOF > ~/.pydistutils.cfg [easy_install] index_url = $pypiurl EOF cat <<EOF > ~/.pip/pip.conf [global] index-url = $pypiurl extra-index-url = $pypiurl/$distro extra-index-url = http://pypi.openstack.org/openstack log = $HOME/pip.log EOF eptest=$tmpdir/eptest mkdir $eptest cd $eptest cat <<EOF > setup.cfg [metadata] name = test_project [entry_points] console_scripts = test_cmd = test_project:main [global] setup-hooks = pbr.hooks.setup_hook EOF cat <<EOF > setup.py import setuptools setuptools.setup( setup_requires=['pbr'], pbr=True) EOF mkdir test_project cat <<EOF > test_project/__init__.py def main(): print "Test cmd" EOF epvenv=$eptest/venv mkvenv $epvenv eppbrdir=$tmpdir/eppbrdir git clone $REPODIR/pbr $eppbrdir $epvenv/bin/pip install -e $eppbrdir PBR_VERSION=0.0
$epvenv/bin/python setup.py install cat $epvenv/bin/test_cmd grep 'PBR Generated' $epvenv/bin/test_cmd $epvenv/bin/test_cmd | grep 'Test cmd' projectdir=$tmpdir/projects mkdir -p $projectdir for PROJECT in $PROJECTS ; do SHORT_PROJECT=$(basename $PROJECT) if ! grep 'pbr' $REPODIR/$SHORT_PROJECT/setup.py >/dev/null 2>&1 then # project doesn't use pbr continue fi if [ $SHORT_PROJECT = 'pypi-mirror' ]; then # pypi-mirror doesn't consume the mirror continue fi if [ $SHORT_PROJECT = 'jeepyb' ]; then # pypi-mirror doesn't consume the mirror continue fi if [ $SHORT_PROJECT = 'tempest' ]; then # Tempest doesn't really install continue fi if [ $SHORT_PROJECT = 'requirements' ]; then # requirements doesn't really install continue fi # set up the project synced with the global requirements sudo chown -R $USER $REPODIR/$SHORT_PROJECT (cd $REPODIR/requirements && python update.py $REPODIR/$SHORT_PROJECT) pushd $REPODIR/$SHORT_PROJECT if ! git diff --quiet ; then git commit -a -m'Update requirements' fi popd # Clone from synced repo shortprojectdir=$projectdir/$SHORT_PROJECT git clone $REPODIR/$SHORT_PROJECT $shortprojectdir # Test that we can make a tarball from scratch sdistvenv=$tmpdir/sdist mkvenv $sdistvenv cd $shortprojectdir $sdistvenv/bin/python setup.py sdist cd $tmpdir # Test that the tarball installs tarballvenv=$tmpdir/tarball mkvenv $tarballvenv $tarballvenv/bin/pip install $shortprojectdir/dist/*tar.gz # Test pip installing pipvenv=$tmpdir/pip mkvenv $pipvenv $pipvenv/bin/pip install git+file://$shortprojectdir # Test python setup.py install installvenv=$tmpdir/install mkvenv $installvenv installprojectdir=$projectdir/install$SHORT_PROJECT git clone $shortprojectdir $installprojectdir cd $installprojectdir $installvenv/bin/python setup.py install # Ensure the install_package_data is doing the thing it should do if [ $SHORT_PROJECT = 'nova' ]; then find $installvenv | grep migrate.cfg fi done pbr-0.7.0/MANIFEST.in0000664000175300017540000000037112312051507015171 
0ustar jenkinsjenkins00000000000000include AUTHORS include ChangeLog include CONTRIBUTING.rst include LICENSE include README.rst include requirements.txt include test-requirements.txt include tox.ini recursive-include doc * exclude .gitignore exclude .gitreview global-exclude *.pyc pbr-0.7.0/.testr.conf0000664000175300017540000000032212312051507015515 0ustar jenkinsjenkins00000000000000[DEFAULT] test_command=OS_STDOUT_CAPTURE=1 OS_STDERR_CAPTURE=1 OS_TEST_TIMEOUT=60 ${PYTHON:-python} -m subunit.run discover -t ./ . $LISTOPT $IDOPTION test_id_option=--load-list $IDFILE test_list_option=--list pbr-0.7.0/setup.py0000775000175300017540000000131612312051507015150 0ustar jenkinsjenkins00000000000000#!/usr/bin/env python # Copyright (c) 2013 Hewlett-Packard Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or # implied. # See the License for the specific language governing permissions and # limitations under the License. import setuptools from pbr import util setuptools.setup( **util.cfg_to_args()) pbr-0.7.0/requirements.txt0000664000175300017540000000001112312051507016706 0ustar jenkinsjenkins00000000000000pip>=1.4 pbr-0.7.0/AUTHORS0000664000175300017540000000000112312051536014473 0ustar jenkinsjenkins00000000000000 pbr-0.7.0/LICENSE0000664000175300017540000002363712312051507014452 0ustar jenkinsjenkins00000000000000 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. 
"License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. 
"Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. 
Subject to the terms and conditions of this License, each Contributor hereby
grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free,
irrevocable (except as stated in this section) patent license to make, have
made, use, offer to sell, sell, import, and otherwise transfer the Work, where
such license applies only to those patent claims licensable by such
Contributor that are necessarily infringed by their Contribution(s) alone or
by combination of their Contribution(s) with the Work to which such
Contribution(s) was submitted. If You institute patent litigation against any
entity (including a cross-claim or counterclaim in a lawsuit) alleging that
the Work or a Contribution incorporated within the Work constitutes direct or
contributory patent infringement, then any patent licenses granted to You
under this License for that Work shall terminate as of the date such
litigation is filed.

4. Redistribution. You may reproduce and distribute copies of the Work or
Derivative Works thereof in any medium, with or without modifications, and in
Source or Object form, provided that You meet the following conditions:

(a) You must give any other recipients of the Work or Derivative Works a copy
    of this License; and

(b) You must cause any modified files to carry prominent notices stating that
    You changed the files; and

(c) You must retain, in the Source form of any Derivative Works that You
    distribute, all copyright, patent, trademark, and attribution notices from
    the Source form of the Work, excluding those notices that do not pertain
    to any part of the Derivative Works; and

(d) If the Work includes a "NOTICE" text file as part of its distribution,
    then any Derivative Works that You distribute must include a readable copy
    of the attribution notices contained within such NOTICE file, excluding
    those notices that do not pertain to any part of the Derivative Works, in
    at least one of the following places: within a NOTICE text file
    distributed as part of the Derivative Works; within the Source form or
    documentation, if provided along with the Derivative Works; or, within a
    display generated by the Derivative Works, if and wherever such
    third-party notices normally appear. The contents of the NOTICE file are
    for informational purposes only and do not modify the License. You may add
    Your own attribution notices within Derivative Works that You distribute,
    alongside or as an addendum to the NOTICE text from the Work, provided
    that such additional attribution notices cannot be construed as modifying
    the License.

You may add Your own copyright statement to Your modifications and may provide
additional or different license terms and conditions for use, reproduction, or
distribution of Your modifications, or for any such Derivative Works as a
whole, provided Your use, reproduction, and distribution of the Work otherwise
complies with the conditions stated in this License.

5. Submission of Contributions. Unless You explicitly state otherwise, any
Contribution intentionally submitted for inclusion in the Work by You to the
Licensor shall be under the terms and conditions of this License, without any
additional terms or conditions. Notwithstanding the above, nothing herein
shall supersede or modify the terms of any separate license agreement you may
have executed with Licensor regarding such Contributions.

6. Trademarks. This License does not grant permission to use the trade names,
trademarks, service marks, or product names of the Licensor, except as
required for reasonable and customary use in describing the origin of the Work
and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty. Unless required by applicable law or agreed to in
writing, Licensor provides the Work (and each Contributor provides its
Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied, including, without limitation, any warranties
or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any risks
associated with Your exercise of permissions under this License.

8. Limitation of Liability. In no event and under no legal theory, whether in
tort (including negligence), contract, or otherwise, unless required by
applicable law (such as deliberate and grossly negligent acts) or agreed to in
writing, shall any Contributor be liable to You for damages, including any
direct, indirect, special, incidental, or consequential damages of any
character arising as a result of this License or out of the use or inability
to use the Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all other
commercial damages or losses), even if such Contributor has been advised of
the possibility of such damages.

9. Accepting Warranty or Additional Liability. While redistributing the Work
or Derivative Works thereof, You may choose to offer, and charge a fee for,
acceptance of support, warranty, indemnity, or other liability obligations
and/or rights consistent with this License. However, in accepting such
obligations, You may act only on Your own behalf and on Your sole
responsibility, not on behalf of any other Contributor, and only if You agree
to indemnify, defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason of your
accepting any such warranty or additional liability.
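A note on version handling before the source files below: pbr's packaging
helpers (`_get_highest_tag` in pbr/packaging.py, later in this archive) choose
the highest git tag by sorting with the pkg_resources version parser rather
than comparing strings. A minimal standalone sketch of that idea — the
`highest_tag` name is ours, not pbr's:

```python
# Sketch of how pbr's _get_highest_tag orders tags: pkg_resources.parse_version
# yields a sort key that compares versions numerically, which plain string
# comparison gets wrong once a component reaches two digits.
import pkg_resources


def highest_tag(tags):
    # "0.10.0" must beat "0.9.0"; lexicographically "0.9.0" would win.
    return max(tags, key=pkg_resources.parse_version)


print(highest_tag(["0.5.0", "0.10.0", "0.9.0"]))  # -> 0.10.0
```

For contrast, `max(["0.5.0", "0.10.0", "0.9.0"])` on bare strings returns
"0.9.0", which is exactly the failure mode the parser avoids.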
pbr-0.7.0/pbr.egg-info/SOURCES.txt

.mailmap
.testr.conf
AUTHORS
CONTRIBUTING.rst
ChangeLog
LICENSE
MANIFEST.in
README.rst
requirements.txt
setup.cfg
setup.py
test-requirements.txt
tox.ini
doc/source/conf.py
doc/source/index.rst
doc/source/packagers.rst
doc/source/semver.rst
doc/source/_templates/.placeholder
doc/source/_theme/layout.html
doc/source/_theme/theme.conf
doc/source/static/basic.css
doc/source/static/default.css
doc/source/static/header-line.gif
doc/source/static/header_bg.jpg
doc/source/static/jquery.tweet.js
doc/source/static/nature.css
doc/source/static/openstack_logo.png
doc/source/static/tweaks.css
pbr/__init__.py
pbr/core.py
pbr/extra_files.py
pbr/find_package.py
pbr/packaging.py
pbr/testr_command.py
pbr/util.py
pbr/version.py
pbr.egg-info/PKG-INFO
pbr.egg-info/SOURCES.txt
pbr.egg-info/dependency_links.txt
pbr.egg-info/entry_points.txt
pbr.egg-info/not-zip-safe
pbr.egg-info/requires.txt
pbr.egg-info/top_level.txt
pbr/hooks/__init__.py
pbr/hooks/backwards.py
pbr/hooks/base.py
pbr/hooks/commands.py
pbr/hooks/files.py
pbr/hooks/metadata.py
pbr/tests/__init__.py
pbr/tests/base.py
pbr/tests/test_commands.py
pbr/tests/test_core.py
pbr/tests/test_files.py
pbr/tests/test_hooks.py
pbr/tests/test_packaging.py
pbr/tests/test_setup.py
pbr/tests/test_version.py
pbr/tests/util.py
pbr/tests/testpackage/CHANGES.txt
pbr/tests/testpackage/LICENSE.txt
pbr/tests/testpackage/MANIFEST.in
pbr/tests/testpackage/README.txt
pbr/tests/testpackage/extra-file.txt
pbr/tests/testpackage/git-extra-file.txt
pbr/tests/testpackage/setup.cfg
pbr/tests/testpackage/setup.py
pbr/tests/testpackage/data_files/a.txt
pbr/tests/testpackage/data_files/b.txt
pbr/tests/testpackage/data_files/c.rst
pbr/tests/testpackage/pbr_testpackage/__init__.py
pbr/tests/testpackage/pbr_testpackage/_setup_hooks.py
pbr/tests/testpackage/pbr_testpackage/cmd.py
pbr/tests/testpackage/pbr_testpackage/package_data/1.txt
pbr/tests/testpackage/pbr_testpackage/package_data/2.txt
pbr/tests/testpackage/src/testext.c
tools/integration.sh

pbr-0.7.0/pbr.egg-info/not-zip-safe

pbr-0.7.0/pbr.egg-info/PKG-INFO

Metadata-Version: 1.1
Name: pbr
Version: 0.7.0
Summary: Python Build Reasonableness
Home-page: http://pypi.python.org/pypi/pbr
Author: OpenStack
Author-email: openstack-dev@lists.openstack.org
License: UNKNOWN
Description: Introduction
        ============

        PBR is a library that injects some useful and sensible default
        behaviors into your setuptools run. It started off life as the chunks
        of code that were copied between all of the OpenStack projects. Around
        the time that OpenStack hit 18 different projects each with at least
        3 active branches, it seemed like a good time to make that code into a
        proper re-usable library.

        PBR is only mildly configurable. The basic idea is that there's a
        decent way to run things and if you do, you should reap the rewards,
        because then it's simple and repeatable. If you want to do things
        differently, cool! But you've already got the power of python at your
        fingertips, so you don't really need PBR.

        PBR builds on top of the work that `d2to1` started to provide for
        declarative configuration. `d2to1` is itself an implementation of the
        ideas behind `distutils2`. Although `distutils2` is now abandoned in
        favor of work towards PEP 426 and Metadata 2.0, declarative config is
        still a great idea and specifically important in trying to distribute
        setup code as a library when that library itself will alter how the
        setup is processed. As Metadata 2.0 and other modern Python packaging
        PEPs come out, `pbr` aims to support them as quickly as possible.

        You can read more in `the documentation`_.
        Running Tests
        =============

        The testing system is based on a combination of tox and testr. The
        canonical approach to running tests is to simply run the command
        `tox`. This will create virtual environments, populate them with
        dependencies and run all of the tests that OpenStack CI systems run.
        Behind the scenes, tox is running `testr run --parallel`, but is set
        up such that you can supply any additional testr arguments that are
        needed to tox. For example, you can run: `tox -- --analyze-isolation`
        to cause tox to tell testr to add --analyze-isolation to its argument
        list.

        It is also possible to run the tests inside of a virtual environment
        you have created, or it is possible that you have all of the
        dependencies installed locally already. If you'd like to go this
        route, the requirements are listed in requirements.txt and the
        requirements for testing are in test-requirements.txt. Installing them
        via pip, for instance, is simply::

          pip install -r requirements.txt -r test-requirements.txt

        If you go this route, you can interact with the testr command
        directly. Running `testr run` will run the entire test suite. `testr
        run --parallel` will run it in parallel (this is the default
        incantation tox uses). More information about testr can be found at:
        http://wiki.openstack.org/testr

        .. _`the documentation`: http://docs.openstack.org/developer/pbr/

Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Console
Classifier: Environment :: OpenStack
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Information Technology
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.3

pbr-0.7.0/pbr.egg-info/top_level.txt

pbr

pbr-0.7.0/pbr.egg-info/dependency_links.txt

pbr-0.7.0/pbr.egg-info/requires.txt

pip>=1.4

pbr-0.7.0/pbr.egg-info/entry_points.txt

[distutils.setup_keywords]
pbr = pbr.core:pbr

pbr-0.7.0/test-requirements.txt

coverage>=3.6
discover
fixtures>=0.3.14
flake8==2.0
mock>=1.0
python-subunit>=0.0.18
sphinx>=1.1.2,<1.2
testrepository>=0.0.18
testresources>=0.2.4
testscenarios>=0.4
testtools>=0.9.34

pbr-0.7.0/CONTRIBUTING.rst

If you would like to contribute to the development of OpenStack, you must
follow the steps in the "If you're a developer, start here" section of this
page:

   http://wiki.openstack.org/HowToContribute

Once those steps have been completed, changes to OpenStack should be submitted
for review via the Gerrit tool, following the workflow documented at:

   http://wiki.openstack.org/GerritWorkflow

Pull requests submitted through GitHub will be ignored.

Bugs should be filed on Launchpad, not GitHub:

   https://bugs.launchpad.net/pbr

pbr-0.7.0/pbr/hooks/__init__.py

# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from pbr.hooks import backwards
from pbr.hooks import commands
from pbr.hooks import files
from pbr.hooks import metadata


def setup_hook(config):
    """Filter config parsed from a setup.cfg to inject our defaults."""
    metadata_config = metadata.MetadataConfig(config)
    metadata_config.run()
    backwards.BackwardsCompatConfig(config).run()
    commands.CommandsConfig(config).run()
    files.FilesConfig(config, metadata_config.get_name()).run()

pbr-0.7.0/pbr/hooks/commands.py

# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import os

from setuptools.command import easy_install

from pbr.hooks import base
from pbr import packaging


class CommandsConfig(base.BaseConfig):

    section = 'global'

    def __init__(self, config):
        super(CommandsConfig, self).__init__(config)
        self.commands = self.config.get('commands', "")

    def save(self):
        self.config['commands'] = self.commands
        super(CommandsConfig, self).save()

    def add_command(self, command):
        self.commands = "%s\n%s" % (self.commands, command)

    def hook(self):
        self.add_command('pbr.packaging.LocalEggInfo')
        self.add_command('pbr.packaging.LocalSDist')
        self.add_command('pbr.packaging.LocalInstallScripts')
        if os.name != 'nt':
            easy_install.get_script_args = packaging.override_get_script_args

        if packaging.have_sphinx():
            self.add_command('pbr.packaging.LocalBuildDoc')
            self.add_command('pbr.packaging.LocalBuildLatex')

        if os.path.exists('.testr.conf') and packaging.have_testr():
            # There is a .testr.conf file. We want to use it.
            self.add_command('pbr.packaging.TestrTest')
        elif self.config.get('nosetests', False) and packaging.have_nose():
            # We seem to still have nose configured
            self.add_command('pbr.packaging.NoseTest')

        use_egg = packaging.get_boolean_option(
            self.pbr_config, 'use-egg', 'PBR_USE_EGG')
        # We always want non-egg install unless explicitly requested
        if 'manpages' in self.pbr_config or not use_egg:
            self.add_command('pbr.packaging.LocalInstall')

pbr-0.7.0/pbr/hooks/metadata.py

# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from pbr.hooks import base
from pbr import packaging


class MetadataConfig(base.BaseConfig):

    section = 'metadata'

    def hook(self):
        self.config['version'] = packaging.get_version(
            self.config['name'], self.config.get('version', None))
        packaging.append_text_list(
            self.config, 'requires_dist',
            packaging.parse_requirements())

    def get_name(self):
        return self.config['name']

pbr-0.7.0/pbr/hooks/backwards.py

# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from pbr.hooks import base
from pbr import packaging


class BackwardsCompatConfig(base.BaseConfig):

    section = 'backwards_compat'

    def hook(self):
        self.config['include_package_data'] = 'True'
        packaging.append_text_list(
            self.config, 'dependency_links',
            packaging.parse_dependency_links())
        packaging.append_text_list(
            self.config, 'tests_require',
            packaging.parse_requirements(
                packaging.TEST_REQUIREMENTS_FILES))

pbr-0.7.0/pbr/hooks/files.py

# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import os
import sys

from pbr import find_package
from pbr.hooks import base


def get_manpath():
    manpath = 'share/man'
    if os.path.exists(os.path.join(sys.prefix, 'man')):
        # This works around a bug with install where it expects every node
        # in the relative data directory to be an actual directory, since at
        # least Debian derivatives (and probably other platforms as well)
        # like to symlink Unixish /usr/local/man to /usr/local/share/man.
        manpath = 'man'
    return manpath


def get_man_section(section):
    return os.path.join(get_manpath(), 'man%s' % section)


class FilesConfig(base.BaseConfig):

    section = 'files'

    def __init__(self, config, name):
        super(FilesConfig, self).__init__(config)
        self.name = name
        self.data_files = self.config.get('data_files', '')

    def save(self):
        self.config['data_files'] = self.data_files
        super(FilesConfig, self).save()

    def expand_globs(self):
        finished = []
        for line in self.data_files.split("\n"):
            if line.rstrip().endswith('*') and '=' in line:
                (target, source_glob) = line.split('=')
                source_prefix = source_glob.strip()[:-1]
                target = target.strip()
                if not target.endswith(os.path.sep):
                    target += os.path.sep
                for (dirpath, dirnames, fnames) in os.walk(source_prefix):
                    finished.append(
                        "%s = " % dirpath.replace(source_prefix, target))
                    finished.extend(
                        ["  %s" % os.path.join(dirpath, f) for f in fnames])
            else:
                finished.append(line)

        self.data_files = "\n".join(finished)

    def add_man_path(self, man_path):
        self.data_files = "%s\n%s =" % (self.data_files, man_path)

    def add_man_page(self, man_page):
        self.data_files = "%s\n  %s" % (self.data_files, man_page)

    def get_man_sections(self):
        man_sections = dict()
        manpages = self.pbr_config['manpages']
        for manpage in manpages.split():
            section_number = manpage.strip()[-1]
            section = man_sections.get(section_number, list())
            section.append(manpage.strip())
            man_sections[section_number] = section
        return man_sections

    def hook(self):
        package = self.config.get('packages', self.name).strip()
        if os.path.isdir(package):
            self.config['packages'] = find_package.smart_find_packages(package)

        self.expand_globs()

        if 'manpages' in self.pbr_config:
            man_sections = self.get_man_sections()
            for (section, pages) in man_sections.items():
                manpath = get_man_section(section)
                self.add_man_path(manpath)
                for page in pages:
                    self.add_man_page(page)

pbr-0.7.0/pbr/hooks/base.py

# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.


class BaseConfig(object):

    section = None

    def __init__(self, config):
        self._global_config = config
        self.config = self._global_config.get(self.section, dict())
        self.pbr_config = config.get('pbr', dict())

    def run(self):
        self.hook()
        self.save()

    def hook(self):
        pass

    def save(self):
        self._global_config[self.section] = self.config

pbr-0.7.0/pbr/__init__.py

pbr-0.7.0/pbr/find_package.py

# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import os

import setuptools


def smart_find_packages(package_list):
    """Run find_packages the way we intend."""
    packages = []
    for pkg in package_list.strip().split("\n"):
        pkg_path = pkg.replace('.', os.path.sep)
        packages.append(pkg)
        packages.extend(['%s.%s' % (pkg, f)
                         for f in setuptools.find_packages(pkg_path)])
    return "\n".join(set(packages))

pbr-0.7.0/pbr/testr_command.py

# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Copyright (c) 2013 Testrepository Contributors
#
# Licensed under either the Apache License, Version 2.0 or the BSD 3-clause
# license at the users choice. A copy of both licenses are available in the
# project source as Apache-2.0 and BSD. You may not use this file except in
# compliance with one of these two licences.
#
# Unless required by applicable law or agreed to in writing, software
# distributed under these licenses is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# license you chose for the specific language governing permissions and
# limitations under that license.

"""setuptools/distutils commands to run testr via setup.py

Currently provides 'testr' which runs tests using testr. You can pass
--coverage which will also export PYTHON='coverage run --source ' and
automatically combine the coverage from each testr backend test runner after
the run completes.

To use, just use setuptools/distribute and depend on testr, and it should be
picked up automatically (as the commands are exported in the testrepository
package metadata).
"""

from distutils import cmd
import distutils.errors
import os
import sys

from testrepository import commands


class Testr(cmd.Command):

    description = "Run unit tests using testr"

    user_options = [
        ('coverage', None, "Replace PYTHON with coverage and merge coverage "
         "from each testr worker."),
        ('testr-args=', 't', "Run 'testr' with these args"),
        ('omit=', 'o', 'Files to omit from coverage calculations'),
        ('slowest', None, "Show slowest test times after tests complete."),
        ('no-parallel', None, "Run testr serially"),
    ]

    boolean_options = ['coverage', 'slowest', 'no_parallel']

    def _run_testr(self, *args):
        return commands.run_argv([sys.argv[0]] + list(args),
                                 sys.stdin, sys.stdout, sys.stderr)

    def initialize_options(self):
        self.testr_args = None
        self.coverage = None
        self.omit = ""
        self.slowest = None
        self.no_parallel = None

    def finalize_options(self):
        if self.testr_args is None:
            self.testr_args = []
        else:
            self.testr_args = self.testr_args.split()
        if self.omit:
            self.omit = "--omit=%s" % self.omit

    def run(self):
        """Set up testr repo, then run testr"""
        if not os.path.isdir(".testrepository"):
            self._run_testr("init")

        if self.coverage:
            self._coverage_before()
        if not self.no_parallel:
            testr_ret = self._run_testr("run", "--parallel", *self.testr_args)
        else:
            testr_ret = self._run_testr("run", *self.testr_args)
        if testr_ret:
            raise distutils.errors.DistutilsError(
                "testr failed (%d)" % testr_ret)
        if self.slowest:
            print("Slowest Tests")
            self._run_testr("slowest")
        if self.coverage:
            self._coverage_after()

    def _coverage_before(self):
        package = self.distribution.get_name()
        if package.startswith('python-'):
            package = package[7:]
        options = "--source %s --parallel-mode" % package
        os.environ['PYTHON'] = ("coverage run %s" % options)

    def _coverage_after(self):
        os.system("coverage combine")
        os.system("coverage html -d ./cover %s" % self.omit)

pbr-0.7.0/pbr/extra_files.py

# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from distutils import errors
import os

_extra_files = []


def get_extra_files():
    global _extra_files
    return _extra_files


def set_extra_files(extra_files):
    # Let's do a sanity check
    for filename in extra_files:
        if not os.path.exists(filename):
            raise errors.DistutilsFileError(
                '%s from the extra_files option in setup.cfg does not '
                'exist' % filename)
    global _extra_files
    _extra_files[:] = extra_files[:]

pbr-0.7.0/pbr/version.py

# Copyright 2012 OpenStack Foundation
# Copyright 2012-2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""
Utilities for consuming the version from pkg_resources.
"""

import pkg_resources


class VersionInfo(object):

    def __init__(self, package):
        """Object that understands versioning for a package

        :param package: name of the python package, such as glance, or
                        python-glanceclient
        """
        self.package = package
        self.release = None
        self.version = None
        self._cached_version = None

    def __str__(self):
        """Make the VersionInfo object behave like a string."""
        return self.version_string()

    def __repr__(self):
        """Include the name."""
        return "pbr.version.VersionInfo(%s:%s)" % (
            self.package, self.version_string())

    def _get_version_from_pkg_resources(self):
        """Obtain a version from pkg_resources or setup-time logic if missing.

        This will try to get the version of the package from the pkg_resources
        record associated with the package, and if there is no such record
        falls back to the logic sdist would use.
        """
        try:
            requirement = pkg_resources.Requirement.parse(self.package)
            provider = pkg_resources.get_provider(requirement)
            return provider.version
        except pkg_resources.DistributionNotFound:
            # The most likely cause for this is running tests in a tree
            # produced from a tarball where the package itself has not been
            # installed into anything. Revert to setup-time logic.
            from pbr import packaging
            return packaging.get_version(self.package)

    def release_string(self):
        """Return the full version of the package.

        This includes suffixes indicating VCS status.
        """
        if self.release is None:
            self.release = self._get_version_from_pkg_resources()
        return self.release

    def version_string(self):
        """Return the short version minus any alpha/beta tags."""
        if self.version is None:
            parts = []
            for part in self.release_string().split('.'):
                if part[0].isdigit():
                    parts.append(part)
                else:
                    break
            self.version = ".".join(parts)
        return self.version

    # Compatibility functions
    canonical_version_string = version_string
    version_string_with_vcs = release_string

    def cached_version_string(self, prefix=""):
        """Return a cached version string.

        This will return a cached version string if one is already cached,
        irrespective of prefix. If none is cached, one will be created with
        prefix and then cached and returned.
        """
        if not self._cached_version:
            self._cached_version = "%s%s" % (prefix,
                                             self.version_string())
        return self._cached_version

pbr-0.7.0/pbr/packaging.py

# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright 2011 OpenStack LLC.
# Copyright 2012-2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
"""
Utilities with minimum-depends for use in setup.py
"""

from __future__ import unicode_literals

import email
import io
import os
import re
import subprocess
import sys

from distutils.command import install as du_install
import distutils.errors
from distutils import log
import pkg_resources
from setuptools.command import easy_install
from setuptools.command import egg_info
from setuptools.command import install
from setuptools.command import install_scripts
from setuptools.command import sdist

try:
    import cStringIO
except ImportError:
    import io as cStringIO

from pbr import extra_files

TRUE_VALUES = ('true', '1', 'yes')
REQUIREMENTS_FILES = ('requirements.txt', 'tools/pip-requires')
TEST_REQUIREMENTS_FILES = ('test-requirements.txt', 'tools/test-requires')
# part of the standard library starting with 2.7
# adding it to the requirements list screws distro installs
BROKEN_ON_27 = ('argparse', 'importlib', 'ordereddict')


def get_requirements_files():
    files = os.environ.get("PBR_REQUIREMENTS_FILES")
    if files:
        return tuple(f.strip() for f in files.split(','))
    # Returns a list composed of:
    # - REQUIREMENTS_FILES with -py2 or -py3 in the name
    #   (e.g. requirements-py3.txt)
    # - REQUIREMENTS_FILES
    return (list(map(('-py' + str(sys.version_info[0])).join,
                     map(os.path.splitext, REQUIREMENTS_FILES)))
            + list(REQUIREMENTS_FILES))


def append_text_list(config, key, text_list):
    """Append a \n separated list to possibly existing value."""
    new_value = []
    current_value = config.get(key, "")
    if current_value:
        new_value.append(current_value)
    new_value.extend(text_list)
    config[key] = '\n'.join(new_value)


def _pip_install(links, requires, root=None, option_dict=dict()):
    if get_boolean_option(
            option_dict, 'skip_pip_install', 'SKIP_PIP_INSTALL'):
        return
    cmd = [sys.executable, '-m', 'pip.__init__', 'install']
    if root:
        cmd.append("--root=%s" % root)
    for link in links:
        cmd.append("-f")
        cmd.append(link)
    _run_shell_command(
        cmd + requires,
        throw_on_error=True, buffer=False, env=dict(PIP_USE_WHEEL="true"))


def _any_existing(file_list):
    return [f for f in file_list if os.path.exists(f)]


# Get requirements from the first file that exists
def get_reqs_from_files(requirements_files):
    for requirements_file in _any_existing(requirements_files):
        with open(requirements_file, 'r') as fil:
            return fil.read().split('\n')
    return []


def parse_requirements(requirements_files=None):
    if requirements_files is None:
        requirements_files = get_requirements_files()

    def egg_fragment(match):
        # take a versioned egg fragment and return a
        # versioned package requirement e.g.
        # nova-1.2.3 becomes nova>=1.2.3
        return re.sub(r'([\w.]+)-([\w.-]+)',
                      r'\1>=\2',
                      match.group(1))

    requirements = []
    for line in get_reqs_from_files(requirements_files):
        # Ignore comments
        if (not line.strip()) or line.startswith('#'):
            continue
        try:
            project_name = pkg_resources.Requirement.parse(line).project_name
        except ValueError:
            project_name = None
        # For the requirements list, we need to inject only the portion
        # after egg= so that distutils knows the package it's looking for
        # such as:
        # -e git://github.com/openstack/nova/master#egg=nova
        # -e git://github.com/openstack/nova/master#egg=nova-1.2.3
        if re.match(r'\s*-e\s+', line):
            line = re.sub(r'\s*-e\s+.*#egg=(.*)$', egg_fragment, line)
        # such as:
        # http://github.com/openstack/nova/zipball/master#egg=nova
        # http://github.com/openstack/nova/zipball/master#egg=nova-1.2.3
        elif re.match(r'\s*https?:', line):
            line = re.sub(r'\s*https?:.*#egg=(.*)$', egg_fragment, line)
        # -f lines are for index locations, and don't get used here
        elif re.match(r'\s*-f\s+', line):
            line = None
            reason = 'Index Location'
        elif (project_name and
                project_name in BROKEN_ON_27 and sys.version_info >= (2, 7)):
            line = None
            reason = 'Python 2.6 only dependency'
        if line is not None:
            requirements.append(line)
        else:
            log.info(
                '[pbr] Excluding %s: %s' % (project_name, reason))

    return requirements


def parse_dependency_links(requirements_files=None):
    if requirements_files is None:
        requirements_files = get_requirements_files()
    dependency_links = []
    # dependency_links inject alternate locations to find packages listed
    # in requirements
    for line in get_reqs_from_files(requirements_files):
        # skip comments and blank lines
        if re.match(r'(\s*#)|(\s*$)', line):
            continue
        # lines with -e or -f need the whole line, minus the flag
        if re.match(r'\s*-[ef]\s+', line):
            dependency_links.append(re.sub(r'\s*-[ef]\s+', '', line))
        # lines that are only urls can go in unmolested
        elif re.match(r'\s*https?:', line):
            dependency_links.append(line)
    return dependency_links


def _run_git_command(cmd, git_dir, **kwargs):
    if not isinstance(cmd, (list, tuple)):
        cmd = [cmd]
    return _run_shell_command(
        ['git', '--git-dir=%s' % git_dir] + cmd, **kwargs)


def _run_shell_command(cmd, throw_on_error=False, buffer=True, env=None):
    if buffer:
        out_location = subprocess.PIPE
        err_location = subprocess.PIPE
    else:
        out_location = None
        err_location = None

    newenv = os.environ.copy()
    if env:
        newenv.update(env)

    output = subprocess.Popen(cmd,
                              stdout=out_location,
                              stderr=err_location,
                              env=newenv)
    out = output.communicate()
    if output.returncode and throw_on_error:
        raise distutils.errors.DistutilsError(
            "%s returned %d" % (cmd, output.returncode))
    if len(out) == 0 or not out[0] or not out[0].strip():
        return ''
    return out[0].strip().decode('utf-8')


def _get_git_directory():
    return _run_shell_command(['git', 'rev-parse', '--git-dir'])


def _git_is_installed():
    try:
        # We cannot use 'which git' as it may not be available
        # in some distributions, So just try 'git --version'
        # to see if we run into trouble
        _run_shell_command(['git', '--version'])
    except OSError:
        return False
    return True


def _get_highest_tag(tags):
    """Find the highest tag from a list.

    Pass in a list of tag strings and this will return the highest
    (latest) as sorted by the pkg_resources version parser.
    """
    return max(tags, key=pkg_resources.parse_version)


def get_boolean_option(option_dict, option_name, env_name):
    return ((option_name in option_dict
             and option_dict[option_name][1].lower() in TRUE_VALUES)
            or str(os.getenv(env_name)).lower() in TRUE_VALUES)


def write_git_changelog(git_dir=None, dest_dir=os.path.curdir,
                        option_dict=dict()):
    """Write a changelog based on the git changelog."""
    should_skip = get_boolean_option(option_dict, 'skip_changelog',
                                     'SKIP_WRITE_GIT_CHANGELOG')
    if not should_skip:
        new_changelog = os.path.join(dest_dir, 'ChangeLog')
        # If there's already a ChangeLog and it's not writable, just use it
        if (os.path.exists(new_changelog)
                and not os.access(new_changelog, os.W_OK)):
            return
        log.info('[pbr] Writing ChangeLog')
        if git_dir is None:
            git_dir = _get_git_directory()
        if git_dir:
            log_cmd = ['log', '--oneline', '--decorate']
            changelog = _run_git_command(log_cmd, git_dir)
            first_line = True
            with io.open(new_changelog, "w",
                         encoding="utf-8") as changelog_file:
                changelog_file.write("CHANGES\n=======\n\n")
                for line in changelog.split('\n'):
                    line_parts = line.split()
                    if len(line_parts) < 2:
                        continue
                    # Tags are in a list contained in ()'s.
If a commit # subject that is tagged happens to have ()'s in it # this will fail if line_parts[1].startswith('(') and ')' in line: msg = line.split(')')[1].strip() else: msg = " ".join(line_parts[1:]) if "tag:" in line: tags = [ tag.split(",")[0] for tag in line.split(")")[0].split("tag: ")[1:]] tag = _get_highest_tag(tags) underline = len(tag) * '-' if not first_line: changelog_file.write(u'\n') changelog_file.write( ("%(tag)s\n%(underline)s\n\n" % dict(tag=tag, underline=underline))) if not msg.startswith("Merge "): if msg.endswith("."): msg = msg[:-1] changelog_file.write( ("* %(msg)s\n" % dict(msg=msg))) first_line = False def generate_authors(git_dir=None, dest_dir='.', option_dict=dict()): """Create AUTHORS file using git commits.""" should_skip = get_boolean_option(option_dict, 'skip_authors', 'SKIP_GENERATE_AUTHORS') if not should_skip: old_authors = os.path.join(dest_dir, 'AUTHORS.in') new_authors = os.path.join(dest_dir, 'AUTHORS') # If there's already an AUTHORS file and it's not writable, just use it if (os.path.exists(new_authors) and not os.access(new_authors, os.W_OK)): return log.info('[pbr] Generating AUTHORS') ignore_emails = '(jenkins@review|infra@lists|jenkins@openstack)' if git_dir is None: git_dir = _get_git_directory() if git_dir: authors = [] # don't include jenkins email address in AUTHORS file git_log_cmd = ['log', '--use-mailmap', '--format=%aN <%aE>'] authors += _run_git_command(git_log_cmd, git_dir).split('\n') authors = [a for a in authors if not re.search(ignore_emails, a)] # get all co-authors from commit messages co_authors_out = _run_git_command('log', git_dir) co_authors = re.findall('Co-authored-by:.+', co_authors_out, re.MULTILINE) co_authors = [signed.split(":", 1)[1].strip() for signed in co_authors if signed] authors += co_authors authors = sorted(set(authors)) with open(new_authors, 'wb') as new_authors_fh: if os.path.exists(old_authors): with open(old_authors, "rb") as old_authors_fh: 
new_authors_fh.write(old_authors_fh.read()) new_authors_fh.write(('\n'.join(authors) + '\n') .encode('utf-8')) def _find_git_files(dirname='', git_dir=None): """Behave like a file finder entrypoint plugin. We don't actually use the entrypoints system for this because it runs at absurd times. We only want to do this when we are building an sdist. """ file_list = [] if git_dir is None and _git_is_installed(): git_dir = _get_git_directory() if git_dir: log.info("[pbr] In git context, generating filelist from git") file_list = _run_git_command(['ls-files', '-z'], git_dir) file_list = file_list.split(b'\x00'.decode('utf-8')) return [f for f in file_list if f] _rst_template = """%(heading)s %(underline)s .. automodule:: %(module)s :members: :undoc-members: :show-inheritance: """ def _find_modules(arg, dirname, files): for filename in files: if filename.endswith('.py') and filename != '__init__.py': arg["%s.%s" % (dirname.replace('/', '.'), filename[:-3])] = True class LocalInstall(install.install): """Runs python setup.py install in a sensible manner. Force a non-egg installed in the manner of single-version-externally-managed, which allows us to install manpages and config files. Because non-egg installs bypass the depend processing machinery, we need to do our own. Because easy_install is evil, just use pip to process our requirements files directly, which means we don't have to do crazy extra processing. Bypass installation if --single-version-externally-managed is given, so that behavior for packagers remains the same. 
""" command_name = 'install' def run(self): option_dict = self.distribution.get_option_dict('pbr') if (not self.single_version_externally_managed and self.distribution.install_requires): _pip_install( self.distribution.dependency_links, self.distribution.install_requires, self.root, option_dict=option_dict) return du_install.install.run(self) def _newer_requires_files(egg_info_dir): """Check to see if any of the requires files are newer than egg-info.""" for target, sources in (('requires.txt', get_requirements_files()), ('test-requires.txt', TEST_REQUIREMENTS_FILES)): target_path = os.path.join(egg_info_dir, target) for src in _any_existing(sources): if (not os.path.exists(target_path) or os.path.getmtime(target_path) < os.path.getmtime(src)): return True return False def _copy_test_requires_to(egg_info_dir): """Copy the requirements file to egg-info/test-requires.txt.""" with open(os.path.join(egg_info_dir, 'test-requires.txt'), 'w') as dest: for source in _any_existing(TEST_REQUIREMENTS_FILES): dest.write(open(source, 'r').read().rstrip('\n') + '\n') class _PipInstallTestRequires(object): """Mixin class to install test-requirements.txt before running tests.""" def install_test_requirements(self): links = parse_dependency_links(TEST_REQUIREMENTS_FILES) if self.distribution.tests_require: option_dict = self.distribution.get_option_dict('pbr') _pip_install( links, self.distribution.tests_require, option_dict=option_dict) def pre_run(self): self.egg_name = pkg_resources.safe_name(self.distribution.get_name()) self.egg_info = "%s.egg-info" % pkg_resources.to_filename( self.egg_name) if (not os.path.exists(self.egg_info) or _newer_requires_files(self.egg_info)): ei_cmd = self.get_finalized_command('egg_info') ei_cmd.run() self.install_test_requirements() _copy_test_requires_to(self.egg_info) try: from pbr import testr_command class TestrTest(testr_command.Testr, _PipInstallTestRequires): """Make setup.py test do the right thing.""" command_name = 'test' def run(self): 
self.pre_run() # Can't use super - base class old-style class testr_command.Testr.run(self) _have_testr = True except ImportError: _have_testr = False def have_testr(): return _have_testr try: from nose import commands class NoseTest(commands.nosetests, _PipInstallTestRequires): """Fallback test runner if testr is a no-go.""" command_name = 'test' def run(self): self.pre_run() # Can't use super - base class old-style class commands.nosetests.run(self) _have_nose = True except ImportError: _have_nose = False def have_nose(): return _have_nose _script_text = """# PBR Generated from %(group)r import sys from %(module_name)s import %(import_target)s if __name__ == "__main__": sys.exit(%(invoke_target)s()) """ def override_get_script_args( dist, executable=os.path.normpath(sys.executable), is_wininst=False): """Override entrypoints console_script.""" header = easy_install.get_script_header("", executable, is_wininst) for group in 'console_scripts', 'gui_scripts': for name, ep in dist.get_entry_map(group).items(): if not ep.attrs or len(ep.attrs) > 2: raise ValueError("Script targets must be of the form " "'func' or 'Class.class_method'.") script_text = _script_text % dict( group=group, module_name=ep.module_name, import_target=ep.attrs[0], invoke_target='.'.join(ep.attrs), ) yield (name, header + script_text) class LocalInstallScripts(install_scripts.install_scripts): """Intercepts console scripts entry_points.""" command_name = 'install_scripts' def run(self): if os.name != 'nt': get_script_args = override_get_script_args else: get_script_args = easy_install.get_script_args import distutils.command.install_scripts self.run_command("egg_info") if self.distribution.scripts: # run first to set up self.outfiles distutils.command.install_scripts.install_scripts.run(self) else: self.outfiles = [] if self.no_ep: # don't install entry point scripts into .egg file! 
return ei_cmd = self.get_finalized_command("egg_info") dist = pkg_resources.Distribution( ei_cmd.egg_base, pkg_resources.PathMetadata(ei_cmd.egg_base, ei_cmd.egg_info), ei_cmd.egg_name, ei_cmd.egg_version, ) bs_cmd = self.get_finalized_command('build_scripts') executable = getattr( bs_cmd, 'executable', easy_install.sys_executable) is_wininst = getattr( self.get_finalized_command("bdist_wininst"), '_is_running', False ) for args in get_script_args(dist, executable, is_wininst): self.write_script(*args) class LocalManifestMaker(egg_info.manifest_maker): """Add any files that are in git and some standard sensible files.""" def _add_pbr_defaults(self): for template_line in [ 'include AUTHORS', 'include ChangeLog', 'exclude .gitignore', 'exclude .gitreview', 'global-exclude *.pyc' ]: self.filelist.process_template_line(template_line) def add_defaults(self): option_dict = self.distribution.get_option_dict('pbr') sdist.sdist.add_defaults(self) self.filelist.append(self.template) self.filelist.append(self.manifest) self.filelist.extend(extra_files.get_extra_files()) should_skip = get_boolean_option(option_dict, 'skip_git_sdist', 'SKIP_GIT_SDIST') if not should_skip: rcfiles = _find_git_files() if rcfiles: self.filelist.extend(rcfiles) elif os.path.exists(self.manifest): self.read_manifest() ei_cmd = self.get_finalized_command('egg_info') self._add_pbr_defaults() self.filelist.include_pattern("*", prefix=ei_cmd.egg_info) class LocalEggInfo(egg_info.egg_info): """Override the egg_info command to regenerate SOURCES.txt sensibly.""" command_name = 'egg_info' def find_sources(self): """Generate SOURCES.txt only if there isn't one already. If we are in an sdist command, then we always want to update SOURCES.txt. If we are not in an sdist command, then it doesn't matter one flip, and is actually destructive. 
""" manifest_filename = os.path.join(self.egg_info, "SOURCES.txt") if not os.path.exists(manifest_filename) or 'sdist' in sys.argv: log.info("[pbr] Processing SOURCES.txt") mm = LocalManifestMaker(self.distribution) mm.manifest = manifest_filename mm.run() self.filelist = mm.filelist else: log.info("[pbr] Reusing existing SOURCES.txt") self.filelist = egg_info.FileList() for entry in open(manifest_filename, 'r').read().split('\n'): self.filelist.append(entry) class LocalSDist(sdist.sdist): """Builds the ChangeLog and Authors files from VC first.""" command_name = 'sdist' def run(self): option_dict = self.distribution.get_option_dict('pbr') write_git_changelog(option_dict=option_dict) generate_authors(option_dict=option_dict) # sdist.sdist is an old style class, can't use super() sdist.sdist.run(self) try: from sphinx import apidoc from sphinx import application from sphinx import config from sphinx import setup_command class LocalBuildDoc(setup_command.BuildDoc): command_name = 'build_sphinx' builders = ['html', 'man'] def _get_source_dir(self): option_dict = self.distribution.get_option_dict('build_sphinx') if 'source_dir' in option_dict: source_dir = os.path.join(option_dict['source_dir'][1], 'api') else: source_dir = 'doc/source/api' if not os.path.exists(source_dir): os.makedirs(source_dir) return source_dir def generate_autoindex(self): log.info("[pbr] Autodocumenting from %s" % os.path.abspath(os.curdir)) modules = {} source_dir = self._get_source_dir() for pkg in self.distribution.packages: if '.' not in pkg: for dirpath, dirnames, files in os.walk(pkg): _find_modules(modules, dirpath, files) module_list = list(modules.keys()) module_list.sort() autoindex_filename = os.path.join(source_dir, 'autoindex.rst') with open(autoindex_filename, 'w') as autoindex: autoindex.write(""".. 
toctree::
   :maxdepth: 1

""")
                for module in module_list:
                    output_filename = os.path.join(source_dir,
                                                   "%s.rst" % module)
                    heading = "The :mod:`%s` Module" % module
                    underline = "=" * len(heading)
                    values = dict(module=module, heading=heading,
                                  underline=underline)
                    log.info("[pbr] Generating %s" % output_filename)
                    with open(output_filename, 'w') as output_file:
                        output_file.write(_rst_template % values)
                    autoindex.write("   %s.rst\n" % module)

        def _sphinx_tree(self):
            source_dir = self._get_source_dir()
            apidoc.main(['apidoc', '.', '-H', 'Modules',
                         '-o', source_dir])

        def _sphinx_run(self):
            if not self.verbose:
                status_stream = cStringIO.StringIO()
            else:
                status_stream = sys.stdout
            confoverrides = {}
            if self.version:
                confoverrides['version'] = self.version
            if self.release:
                confoverrides['release'] = self.release
            if self.today:
                confoverrides['today'] = self.today
            sphinx_config = config.Config(self.config_dir, 'conf.py',
                                          {}, [])
            sphinx_config.init_values()
            if self.builder == 'man' and len(sphinx_config.man_pages) == 0:
                return
            app = application.Sphinx(
                self.source_dir, self.config_dir,
                self.builder_target_dir, self.doctree_dir,
                self.builder, confoverrides, status_stream,
                freshenv=self.fresh_env, warningiserror=True)

            try:
                app.build(force_all=self.all_files)
            except Exception as err:
                from docutils import utils
                if isinstance(err, utils.SystemMessage):
                    sys.stderr.write('reST markup error:\n')
                    sys.stderr.write(err.args[0].encode('ascii',
                                                        'backslashreplace'))
                    sys.stderr.write('\n')
                else:
                    raise

            if self.link_index:
                src = app.config.master_doc + app.builder.out_suffix
                dst = app.builder.get_outfilename('index')
                os.symlink(src, dst)

        def run(self):
            option_dict = self.distribution.get_option_dict('pbr')
            tree_index = get_boolean_option(option_dict,
                                            'autodoc_tree_index_modules',
                                            'AUTODOC_TREE_INDEX_MODULES')
            auto_index = get_boolean_option(option_dict,
                                            'autodoc_index_modules',
                                            'AUTODOC_INDEX_MODULES')
            if not os.getenv('SPHINX_DEBUG'):
                # NOTE(afazekas): These options can be used together,
                # but they do a very similar thing in a different way
                if tree_index:
                    self._sphinx_tree()
                if auto_index:
                    self.generate_autoindex()
            for builder in self.builders:
                self.builder = builder
                self.finalize_options()
                self.project = self.distribution.get_name()
                self.version = self.distribution.get_version()
                self.release = self.distribution.get_version()
                if 'warnerrors' in option_dict:
                    self._sphinx_run()
                else:
                    setup_command.BuildDoc.run(self)

        def finalize_options(self):
            # Not a new style class, super keyword does not work.
            setup_command.BuildDoc.finalize_options(self)
            # Allow builders to be configurable - as a comma separated list.
            if not isinstance(self.builders, list) and self.builders:
                self.builders = self.builders.split(',')

    class LocalBuildLatex(LocalBuildDoc):
        builders = ['latex']
        command_name = 'build_sphinx_latex'

    _have_sphinx = True

except ImportError:
    _have_sphinx = False


def have_sphinx():
    return _have_sphinx


def _get_revno(git_dir):
    """Return the number of commits since the most recent tag.

    We use git-describe to find this out, but if there are no tags then
    we fall back to counting commits since the beginning of time.
    """
    describe = _run_git_command(['describe', '--always'], git_dir)
    if "-" in describe:
        return describe.rsplit("-", 2)[-2]

    # no tags found
    revlist = _run_git_command(
        ['rev-list', '--abbrev-commit', 'HEAD'], git_dir)
    return len(revlist.splitlines())


def _get_version_from_git(pre_version):
    """Return a version which is equal to the tag that's on the current
    revision if there is one, or tag plus number of additional revisions
    if the current revision has no tag.
""" git_dir = _get_git_directory() if git_dir: if pre_version: try: return _run_git_command( ['describe', '--exact-match'], git_dir, throw_on_error=True).replace('-', '.') except Exception: sha = _run_git_command( ['log', '-n1', '--pretty=format:%h'], git_dir) return "%s.dev%s.g%s" % (pre_version, _get_revno(git_dir), sha) else: return _run_git_command( ['describe', '--always'], git_dir).replace('-', '.') # If we don't know the version, return an empty string so at least # the downstream users of the value always have the same type of # object to work with. try: return unicode() except NameError: return '' def _get_version_from_pkg_info(package_name): """Get the version from PKG-INFO file if we can.""" try: pkg_info_file = open('PKG-INFO', 'r') except (IOError, OSError): return None try: pkg_info = email.message_from_file(pkg_info_file) except email.MessageError: return None # Check to make sure we're in our own dir if pkg_info.get('Name', None) != package_name: return None return pkg_info.get('Version', None) def get_version(package_name, pre_version=None): """Get the version of the project. First, try getting it from PKG-INFO, if it exists. If it does, that means we're in a distribution tarball or that install has happened. Otherwise, if there is no PKG-INFO file, pull the version from git. We do not support setup.py version sanity in git archive tarballs, nor do we support packagers directly sucking our git repo into theirs. We expect that a source tarball be made from our git repo - or that if someone wants to make a source tarball from a fork of our repo with additional tags in it that they understand and desire the results of doing that. 
""" version = os.environ.get( "PBR_VERSION", os.environ.get("OSLO_PACKAGE_VERSION", None)) if version: return version version = _get_version_from_pkg_info(package_name) if version: return version version = _get_version_from_git(pre_version) # Handle http://bugs.python.org/issue11638 # version will either be an empty unicode string or a valid # unicode version string, but either way it's unicode and needs to # be encoded. if sys.version_info[0] == 2: version = version.encode('utf-8') if version: return version raise Exception("Versioning for this project requires either an sdist" " tarball, or access to an upstream git repository." " Are you sure that git is installed?") pbr-0.7.0/pbr/util.py0000664000175300017540000005273412312051507015557 0ustar jenkinsjenkins00000000000000# Copyright (c) 2013 Hewlett-Packard Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or # implied. # See the License for the specific language governing permissions and # limitations under the License. # # Copyright (C) 2013 Association of Universities for Research in Astronomy # (AURA) # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are met: # # 1. Redistributions of source code must retain the above copyright # notice, this list of conditions and the following disclaimer. # # 2. Redistributions in binary form must reproduce the above # copyright notice, this list of conditions and the following # disclaimer in the documentation and/or other materials provided # with the distribution. # # 3. 
The name of AURA and its representatives may not be used to
#    endorse or promote products derived from this software without
#    specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED
# WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
# OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
# TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
# USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

"""The code in this module is mostly copy/pasted out of the distutils2 source
code, as recommended by Tarek Ziade.  As such, it may be subject to some
change as distutils2 development continues, and will have to be kept up to
date.

I didn't want to use it directly from distutils2 itself, since I do not want
it to be an installation dependency for our packages yet--it is still too
unstable (the latest version on PyPI doesn't even install).
"""

# These first two imports are not used, but are needed to get around an
# irritating Python bug that can crop up when using ./setup.py test.
# See: http://www.eby-sarna.com/pipermail/peak/2010-May/003355.html try: import multiprocessing # flake8: noqa except ImportError: pass import logging # flake8: noqa import os import re import sys import traceback from collections import defaultdict import distutils.ccompiler from distutils import log from distutils.errors import (DistutilsOptionError, DistutilsModuleError, DistutilsFileError) from setuptools.command.egg_info import manifest_maker from setuptools.dist import Distribution from setuptools.extension import Extension try: import ConfigParser as configparser except ImportError: import configparser from pbr import extra_files import pbr.hooks # A simplified RE for this; just checks that the line ends with version # predicates in () _VERSION_SPEC_RE = re.compile(r'\s*(.*?)\s*\((.*)\)\s*$') # Mappings from setup() keyword arguments to setup.cfg options; # The values are (section, option) tuples, or simply (section,) tuples if # the option has the same name as the setup() argument D1_D2_SETUP_ARGS = { "name": ("metadata",), "version": ("metadata",), "author": ("metadata",), "author_email": ("metadata",), "maintainer": ("metadata",), "maintainer_email": ("metadata",), "url": ("metadata", "home_page"), "description": ("metadata", "summary"), "keywords": ("metadata",), "long_description": ("metadata", "description"), "download-url": ("metadata",), "classifiers": ("metadata", "classifier"), "platforms": ("metadata", "platform"), # ** "license": ("metadata",), # Use setuptools install_requires, not # broken distutils requires "install_requires": ("metadata", "requires_dist"), "setup_requires": ("metadata", "setup_requires_dist"), "provides": ("metadata", "provides_dist"), # ** "obsoletes": ("metadata", "obsoletes_dist"), # ** "package_dir": ("files", 'packages_root'), "packages": ("files",), "package_data": ("files",), "namespace_packages": ("files",), "data_files": ("files",), "scripts": ("files",), "py_modules": ("files", "modules"), # ** "cmdclass": 
("global", "commands"),
    # Not supported in distutils2, but provided for
    # backwards compatibility with setuptools
    "use_2to3": ("backwards_compat", "use_2to3"),
    "zip_safe": ("backwards_compat", "zip_safe"),
    "tests_require": ("backwards_compat", "tests_require"),
    "dependency_links": ("backwards_compat",),
    "include_package_data": ("backwards_compat",),
}

# setup() arguments that can have multiple values in setup.cfg
MULTI_FIELDS = ("classifiers",
                "platforms",
                "install_requires",
                "provides",
                "obsoletes",
                "namespace_packages",
                "packages",
                "package_data",
                "data_files",
                "scripts",
                "py_modules",
                "dependency_links",
                "setup_requires",
                "tests_require",
                "cmdclass")

# setup() arguments that contain boolean values
BOOL_FIELDS = ("use_2to3", "zip_safe", "include_package_data")

CSV_FIELDS = ("keywords",)


def resolve_name(name):
    """Resolve a name like ``module.object`` to an object and return it.

    Raise ImportError if the module or name is not found.
    """
    parts = name.split('.')
    cursor = len(parts) - 1
    module_name = parts[:cursor]
    attr_name = parts[-1]

    while cursor > 0:
        try:
            ret = __import__('.'.join(module_name), fromlist=[attr_name])
            break
        except ImportError:
            if cursor == 0:
                raise
            cursor -= 1
            module_name = parts[:cursor]
            attr_name = parts[cursor]
            ret = ''

    for part in parts[cursor:]:
        try:
            ret = getattr(ret, part)
        except AttributeError:
            raise ImportError(name)

    return ret


def cfg_to_args(path='setup.cfg'):
    """Distutils2 to distutils1 compatibility util.

    This method uses an existing setup.cfg to generate a dictionary of
    keywords that can be used by distutils.core.setup(**kwargs).

    :param path: The setup.cfg path.
    :raises DistutilsFileError: When the setup.cfg file is not found.
    """
    # The method source code really starts here.
    parser = configparser.RawConfigParser()
    if not os.path.exists(path):
        raise DistutilsFileError("file '%s' does not exist" %
                                 os.path.abspath(path))
    parser.read(path)
    config = {}
    for section in parser.sections():
        config[section] = dict(parser.items(section))

    # Run setup_hooks, if configured
    setup_hooks = has_get_option(config, 'global', 'setup_hooks')
    package_dir = has_get_option(config, 'files', 'packages_root')

    # Add the source package directory to sys.path in case it contains
    # additional hooks, and to make sure it's on the path before any existing
    # installations of the package
    if package_dir:
        package_dir = os.path.abspath(package_dir)
        sys.path.insert(0, package_dir)

    try:
        if setup_hooks:
            setup_hooks = [
                hook for hook in split_multiline(setup_hooks)
                if hook != 'pbr.hooks.setup_hook']
            for hook in setup_hooks:
                hook_fn = resolve_name(hook)
                try:
                    hook_fn(config)
                except SystemExit:
                    log.error('setup hook %s terminated the installation'
                              % hook)
                except Exception:
                    e = sys.exc_info()[1]
                    log.error('setup hook %s raised exception: %s\n' %
                              (hook, e))
                    log.error(traceback.format_exc())
                    sys.exit(1)

        # Run the pbr hook
        pbr.hooks.setup_hook(config)

        kwargs = setup_cfg_to_setup_kwargs(config)

        # Set default config overrides
        kwargs['include_package_data'] = True
        kwargs['zip_safe'] = False

        register_custom_compilers(config)

        ext_modules = get_extension_modules(config)
        if ext_modules:
            kwargs['ext_modules'] = ext_modules

        entry_points = get_entry_points(config)
        if entry_points:
            kwargs['entry_points'] = entry_points

        wrap_commands(kwargs)

        # Handle the [files]/extra_files option
        files_extra_files = has_get_option(config, 'files', 'extra_files')
        if files_extra_files:
            extra_files.set_extra_files(split_multiline(files_extra_files))
    finally:
        # Perform cleanup if any paths were added to sys.path
        if package_dir:
            sys.path.pop(0)

    return kwargs


def setup_cfg_to_setup_kwargs(config):
    """Processes the setup.cfg options and converts them to arguments accepted
    by setuptools' setup() function.
    """
    kwargs = {}

    for arg in D1_D2_SETUP_ARGS:
        if len(D1_D2_SETUP_ARGS[arg]) == 2:
            # The distutils field name is different than distutils2's.
            section, option = D1_D2_SETUP_ARGS[arg]

        elif len(D1_D2_SETUP_ARGS[arg]) == 1:
            # The distutils field name is the same as distutils2's.
            section = D1_D2_SETUP_ARGS[arg][0]
            option = arg

        in_cfg_value = has_get_option(config, section, option)
        if not in_cfg_value:
            # There is no such option in the setup.cfg
            if arg == "long_description":
                in_cfg_value = has_get_option(config, section,
                                              "description_file")
                if in_cfg_value:
                    in_cfg_value = split_multiline(in_cfg_value)
                    value = ''
                    for filename in in_cfg_value:
                        description_file = open(filename)
                        try:
                            value += description_file.read().strip() + '\n\n'
                        finally:
                            description_file.close()
                    in_cfg_value = value
            else:
                continue

        if arg in CSV_FIELDS:
            in_cfg_value = split_csv(in_cfg_value)
        if arg in MULTI_FIELDS:
            in_cfg_value = split_multiline(in_cfg_value)
        elif arg in BOOL_FIELDS:
            # Provide some flexibility here...
            if in_cfg_value.lower() in ('true', 't', '1', 'yes', 'y'):
                in_cfg_value = True
            else:
                in_cfg_value = False

        if in_cfg_value:
            if arg in ('install_requires', 'tests_require'):
                # Replaces PEP345-style version specs with the sort expected
                # by setuptools
                in_cfg_value = [_VERSION_SPEC_RE.sub(r'\1\2', pred)
                                for pred in in_cfg_value]
            elif arg == 'package_dir':
                in_cfg_value = {'': in_cfg_value}
            elif arg in ('package_data', 'data_files'):
                data_files = {}
                firstline = True
                prev = None
                for line in in_cfg_value:
                    if '=' in line:
                        key, value = line.split('=', 1)
                        key, value = (key.strip(), value.strip())
                        if key in data_files:
                            # Multiple duplicates of the same package name;
                            # this is for backwards compatibility of the old
                            # format prior to d2to1 0.2.6.
prev = data_files[key] prev.extend(value.split()) else: prev = data_files[key.strip()] = value.split() elif firstline: raise DistutilsOptionError( 'malformed package_data first line %r (misses ' '"=")' % line) else: prev.extend(line.strip().split()) firstline = False if arg == 'data_files': # the data_files value is a pointlessly different structure # from the package_data value data_files = data_files.items() in_cfg_value = data_files elif arg == 'cmdclass': cmdclass = {} dist = Distribution() for cls in in_cfg_value: cls = resolve_name(cls) cmd = cls(dist) cmdclass[cmd.get_command_name()] = cls in_cfg_value = cmdclass kwargs[arg] = in_cfg_value return kwargs def register_custom_compilers(config): """Handle custom compilers; this has no real equivalent in distutils, where additional compilers could only be added programmatically, so we have to hack it in somehow. """ compilers = has_get_option(config, 'global', 'compilers') if compilers: compilers = split_multiline(compilers) for compiler in compilers: compiler = resolve_name(compiler) # In distutils2 compilers these class attributes exist; for # distutils1 we just have to make something up if hasattr(compiler, 'name'): name = compiler.name else: name = compiler.__name__ if hasattr(compiler, 'description'): desc = compiler.description else: desc = 'custom compiler %s' % name module_name = compiler.__module__ # Note; this *will* override built in compilers with the same name # TODO: Maybe display a warning about this? cc = distutils.ccompiler.compiler_class cc[name] = (module_name, compiler.__name__, desc) # HACK!!!! Distutils assumes all compiler modules are in the # distutils package sys.modules['distutils.' 
+ module_name] = sys.modules[module_name] def get_extension_modules(config): """Handle extension modules""" EXTENSION_FIELDS = ("sources", "include_dirs", "define_macros", "undef_macros", "library_dirs", "libraries", "runtime_library_dirs", "extra_objects", "extra_compile_args", "extra_link_args", "export_symbols", "swig_opts", "depends") ext_modules = [] for section in config: if ':' in section: labels = section.split(':', 1) else: # Backwards compatibility for old syntax; don't use this though labels = section.split('=', 1) labels = [l.strip() for l in labels] if (len(labels) == 2) and (labels[0] == 'extension'): ext_args = {} for field in EXTENSION_FIELDS: value = has_get_option(config, section, field) # All extension module options besides name can have multiple # values if not value: continue value = split_multiline(value) if field == 'define_macros': macros = [] for macro in value: macro = macro.split('=', 1) if len(macro) == 1: macro = (macro[0].strip(), None) else: macro = (macro[0].strip(), macro[1].strip()) macros.append(macro) value = macros ext_args[field] = value if ext_args: if 'name' not in ext_args: ext_args['name'] = labels[1] ext_modules.append(Extension(ext_args.pop('name'), **ext_args)) return ext_modules def get_entry_points(config): """Process the [entry_points] section of setup.cfg to handle setuptools entry points. This is, of course, not a standard feature of distutils2/packaging, but as there is not currently a standard alternative in packaging, we provide support for them. 
    """
    if 'entry_points' not in config:
        return {}

    return dict((option, split_multiline(value))
                for option, value in config['entry_points'].items())


def wrap_commands(kwargs):
    dist = Distribution()

    # This should suffice to get the same config values and command classes
    # that the actual Distribution will see (not counting cmdclass, which is
    # handled below)
    dist.parse_config_files()

    for cmd, _ in dist.get_command_list():
        hooks = {}
        for opt, val in dist.get_option_dict(cmd).items():
            val = val[1]
            if opt.startswith('pre_hook.') or opt.startswith('post_hook.'):
                hook_type, alias = opt.split('.', 1)
                hook_dict = hooks.setdefault(hook_type, {})
                hook_dict[alias] = val
        if not hooks:
            continue

        if 'cmdclass' in kwargs and cmd in kwargs['cmdclass']:
            cmdclass = kwargs['cmdclass'][cmd]
        else:
            cmdclass = dist.get_command_class(cmd)

        new_cmdclass = wrap_command(cmd, cmdclass, hooks)
        kwargs.setdefault('cmdclass', {})[cmd] = new_cmdclass


def wrap_command(cmd, cmdclass, hooks):
    def run(self, cmdclass=cmdclass):
        self.run_command_hooks('pre_hook')
        cmdclass.run(self)
        self.run_command_hooks('post_hook')

    return type(cmd, (cmdclass, object),
                {'run': run,
                 'run_command_hooks': run_command_hooks,
                 'pre_hook': hooks.get('pre_hook'),
                 'post_hook': hooks.get('post_hook')})


def run_command_hooks(cmd_obj, hook_kind):
    """Run hooks registered for that command and phase.

    *cmd_obj* is a finalized command object; *hook_kind* is either
    'pre_hook' or 'post_hook'.
    """
    if hook_kind not in ('pre_hook', 'post_hook'):
        raise ValueError('invalid hook kind: %r' % hook_kind)

    hooks = getattr(cmd_obj, hook_kind, None)
    if hooks is None:
        return

    for hook in hooks.values():
        if isinstance(hook, str):
            try:
                hook_obj = resolve_name(hook)
            except ImportError:
                err = sys.exc_info()[1]  # For py3k
                raise DistutilsModuleError('cannot find hook %s: %s' %
                                           (hook, err))
        else:
            hook_obj = hook

        if not hasattr(hook_obj, '__call__'):
            raise DistutilsOptionError('hook %r is not callable' % hook)

        log.info('running %s %s for command %s',
                 hook_kind, hook, cmd_obj.get_command_name())

        try:
            hook_obj(cmd_obj)
        except Exception:
            e = sys.exc_info()[1]
            log.error('hook %s raised exception: %s\n' % (hook, e))
            log.error(traceback.format_exc())
            sys.exit(1)


def has_get_option(config, section, option):
    if section in config and option in config[section]:
        return config[section][option]
    elif section in config and option.replace('_', '-') in config[section]:
        return config[section][option.replace('_', '-')]
    else:
        return False


def split_multiline(value):
    """Split a multi-line option value into a list of non-empty lines."""
    value = [element for element in
             (line.strip() for line in value.split('\n'))
             if element]
    return value


def split_csv(value):
    """Split a comma-separated option value into a list of non-empty
    chunks.
    """
    value = [element for element in
             (chunk.strip() for chunk in value.split(','))
             if element]
    return value


def monkeypatch_method(cls):
    """A function decorator to monkey-patch a method of the same name on the
    given class.
    """
    def wrapper(func):
        orig = getattr(cls, func.__name__, None)
        if orig and not hasattr(orig, '_orig'):  # not already patched
            setattr(func, '_orig', orig)
            setattr(cls, func.__name__, func)
        return func

    return wrapper


# The following classes are used to hack Distribution.command_options a bit
class DefaultGetDict(defaultdict):
    """Like defaultdict, but the get() method also sets and returns the
    default value.
    """
    def get(self, key, default=None):
        if default is None:
            default = self.default_factory()
        return super(DefaultGetDict, self).setdefault(key, default)


class IgnoreDict(dict):
    """A dictionary that ignores any insertions in which the key is a string
    matching any string in `ignore`.  The ignore list can also contain
    wildcard patterns using '*'.
    """
    def __init__(self, ignore):
        self.__ignore = re.compile(r'(%s)' % ('|'.join(
                                   [pat.replace('*', '.*')
                                    for pat in ignore])))

    def __setitem__(self, key, val):
        if self.__ignore.match(key):
            return
        super(IgnoreDict, self).__setitem__(key, val)

pbr-0.7.0/pbr/core.py

# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Copyright (C) 2013 Association of Universities for Research in Astronomy
# (AURA)
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright
#    notice, this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above
#    copyright notice, this list of conditions and the following
#    disclaimer in the documentation and/or other materials provided
#    with the distribution.
#
# 3.
The name of AURA and its representatives may not be used to
#    endorse or promote products derived from this software without
#    specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED
# WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED.  IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
# OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
# TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
# USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

from distutils import core
from distutils import errors
import os
import sys
import warnings

from setuptools import dist

from pbr import util


core.Distribution = dist._get_unpatched(core.Distribution)

if sys.version_info[0] == 3:
    string_type = str
    integer_types = (int,)
else:
    string_type = basestring
    integer_types = (int, long)


def pbr(dist, attr, value):
    """Implements the actual pbr setup() keyword.  When used, this should be
    the only keyword in your setup() aside from `setup_requires`.

    If given as a string, the value of pbr is assumed to be the relative path
    to the setup.cfg file to use.  Otherwise, if it evaluates to true, it
    simply assumes that pbr should be used, and the default 'setup.cfg' is
    used.

    This works by reading the setup.cfg file, parsing out the supported
    metadata and command options, and using them to rebuild the
    `DistributionMetadata` object and set the newly added command options.

    The reason for doing things this way is that a custom `Distribution`
    class will not play nicely with setup_requires; however, this
    implementation may not work well with distributions that do use a
    `Distribution` subclass.
    """
    if not value:
        return

    if isinstance(value, string_type):
        path = os.path.abspath(value)
    else:
        path = os.path.abspath('setup.cfg')
    if not os.path.exists(path):
        raise errors.DistutilsFileError(
            'The setup.cfg file %s does not exist.'
            % path)

    # Converts the setup.cfg file to setup() arguments
    try:
        attrs = util.cfg_to_args(path)
    except Exception:
        e = sys.exc_info()[1]
        raise errors.DistutilsSetupError(
            'Error parsing %s: %s: %s' % (path, e.__class__.__name__, e))

    # Repeat some of the Distribution initialization code with the newly
    # provided attrs
    if attrs:
        # Skips 'options' and 'licence' support which are rarely used; may
        # add back in later if demanded
        for key, val in attrs.items():
            if hasattr(dist.metadata, 'set_' + key):
                getattr(dist.metadata, 'set_' + key)(val)
            elif hasattr(dist.metadata, key):
                setattr(dist.metadata, key, val)
            elif hasattr(dist, key):
                setattr(dist, key, val)
            else:
                msg = 'Unknown distribution option: %s' % repr(key)
                warnings.warn(msg)

    # Re-finalize the underlying Distribution
    core.Distribution.finalize_options(dist)

    # This bit comes out of distribute/setuptools
    if isinstance(dist.metadata.version, integer_types + (float,)):
        # Some people apparently take "version number" too literally :)
        dist.metadata.version = str(dist.metadata.version)

    # This bit of hackery is necessary so that the Distribution will ignore
    # normally unsupported command options (namely pre-hooks and post-hooks).
    # dist.command_options is normally a dict mapping command names to dicts
    # of their options.  Now it will be a defaultdict that returns
    # IgnoreDicts for each command's options, so we can pass through the
    # unsupported options.
    ignore = ['pre_hook.*', 'post_hook.*']
    dist.command_options = util.DefaultGetDict(lambda: util.IgnoreDict(ignore))

pbr-0.7.0/pbr/tests/
pbr-0.7.0/pbr/tests/test_setup.py

# vim: tabstop=4 shiftwidth=4 softtabstop=4

# Copyright (c) 2011 OpenStack Foundation
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
# All Rights Reserved.
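The command_options hack at the end of core.py is easier to see in isolation. The sketch below reproduces the semantics of `util.DefaultGetDict` and `util.IgnoreDict` (copied here so it runs without pbr installed, not imported from the library) and shows how `pre_hook.*` options are silently dropped while normal command options pass through:

```python
import re
from collections import defaultdict


class DefaultGetDict(defaultdict):
    """Like defaultdict, but get() also sets and returns the default."""
    def get(self, key, default=None):
        if default is None:
            default = self.default_factory()
        return super(DefaultGetDict, self).setdefault(key, default)


class IgnoreDict(dict):
    """A dict that drops insertions whose key matches an ignore pattern."""
    def __init__(self, ignore):
        self.__ignore = re.compile(r'(%s)' % '|'.join(
            pat.replace('*', '.*') for pat in ignore))

    def __setitem__(self, key, val):
        if self.__ignore.match(key):
            return  # silently discard the unsupported option
        super(IgnoreDict, self).__setitem__(key, val)


ignore = ['pre_hook.*', 'post_hook.*']
command_options = DefaultGetDict(lambda: IgnoreDict(ignore))

opts = command_options.get('build_ext')        # created on first access
opts['pre_hook.my_hook'] = ('setup.cfg', 'x')  # matches ignore -> dropped
opts['inplace'] = ('setup.cfg', '1')           # kept

print(sorted(opts))  # ['inplace']
```

Because `DefaultGetDict.get` uses `setdefault`, the same `IgnoreDict` instance is returned on every later lookup of that command, which is what lets distutils option parsing proceed without ever seeing the hook options.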
# # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from __future__ import print_function import os import sys import tempfile try: import cStringIO as io BytesIO = io.StringIO except ImportError: import io BytesIO = io.BytesIO import fixtures import testscenarios from pbr import packaging from pbr.tests import base class SkipFileWrites(base.BaseTestCase): scenarios = [ ('changelog_option_true', dict(option_key='skip_changelog', option_value='True', env_key='SKIP_WRITE_GIT_CHANGELOG', env_value=None, pkg_func=packaging.write_git_changelog, filename='ChangeLog')), ('changelog_option_false', dict(option_key='skip_changelog', option_value='False', env_key='SKIP_WRITE_GIT_CHANGELOG', env_value=None, pkg_func=packaging.write_git_changelog, filename='ChangeLog')), ('changelog_env_true', dict(option_key='skip_changelog', option_value='False', env_key='SKIP_WRITE_GIT_CHANGELOG', env_value='True', pkg_func=packaging.write_git_changelog, filename='ChangeLog')), ('changelog_both_true', dict(option_key='skip_changelog', option_value='True', env_key='SKIP_WRITE_GIT_CHANGELOG', env_value='True', pkg_func=packaging.write_git_changelog, filename='ChangeLog')), ('authors_option_true', dict(option_key='skip_authors', option_value='True', env_key='SKIP_GENERATE_AUTHORS', env_value=None, pkg_func=packaging.generate_authors, filename='AUTHORS')), ('authors_option_false', dict(option_key='skip_authors', option_value='False', env_key='SKIP_GENERATE_AUTHORS', env_value=None, 
pkg_func=packaging.generate_authors, filename='AUTHORS')), ('authors_env_true', dict(option_key='skip_authors', option_value='False', env_key='SKIP_GENERATE_AUTHORS', env_value='True', pkg_func=packaging.generate_authors, filename='AUTHORS')), ('authors_both_true', dict(option_key='skip_authors', option_value='True', env_key='SKIP_GENERATE_AUTHORS', env_value='True', pkg_func=packaging.generate_authors, filename='AUTHORS')), ] def setUp(self): super(SkipFileWrites, self).setUp() self.temp_path = self.useFixture(fixtures.TempDir()).path self.root_dir = os.path.abspath(os.path.curdir) self.git_dir = os.path.join(self.root_dir, ".git") if not os.path.exists(self.git_dir): self.skipTest("%s is missing; skipping git-related checks" % self.git_dir) return self.filename = os.path.join(self.temp_path, self.filename) self.option_dict = dict() if self.option_key is not None: self.option_dict[self.option_key] = ('setup.cfg', self.option_value) self.useFixture( fixtures.EnvironmentVariable(self.env_key, self.env_value)) def test_skip(self): self.pkg_func(git_dir=self.git_dir, dest_dir=self.temp_path, option_dict=self.option_dict) self.assertEqual( not os.path.exists(self.filename), (self.option_value.lower() in packaging.TRUE_VALUES or self.env_value is not None)) _changelog_content = """04316fe (review/monty_taylor/27519) Make python 378261a Add an integration test script. 3c373ac (HEAD, tag: 2013.2.rc2, tag: 2013.2, milestone-proposed) Merge "Lib 182feb3 (tag: 0.5.17) Fix pip invocation for old versions of pip. fa4f46e (tag: 0.5.16) Remove explicit depend on distribute. d1c53dd Use pip instead of easy_install for installation. a793ea1 Merge "Skip git-checkout related tests when .git is missing" 6c27ce7 Skip git-checkout related tests when .git is missing 04984a5 Refactor hooks file. a65e8ee (tag: 0.5.14, tag: 0.5.13) Remove jinja pin. 
""" class GitLogsTest(base.BaseTestCase): def setUp(self): super(GitLogsTest, self).setUp() self.temp_path = self.useFixture(fixtures.TempDir()).path self.root_dir = os.path.abspath(os.path.curdir) self.git_dir = os.path.join(self.root_dir, ".git") self.useFixture( fixtures.EnvironmentVariable('SKIP_GENERATE_AUTHORS')) self.useFixture( fixtures.EnvironmentVariable('SKIP_WRITE_GIT_CHANGELOG')) def test_write_git_changelog(self): self.useFixture(fixtures.FakePopen(lambda _: { "stdout": BytesIO(_changelog_content.encode('utf-8')) })) packaging.write_git_changelog(git_dir=self.git_dir, dest_dir=self.temp_path) with open(os.path.join(self.temp_path, "ChangeLog"), "r") as ch_fh: changelog_contents = ch_fh.read() self.assertIn("2013.2", changelog_contents) self.assertIn("0.5.17", changelog_contents) self.assertIn("------", changelog_contents) self.assertIn("Refactor hooks file", changelog_contents) self.assertNotIn("Refactor hooks file.", changelog_contents) self.assertNotIn("182feb3", changelog_contents) self.assertNotIn("review/monty_taylor/27519", changelog_contents) self.assertNotIn("0.5.13", changelog_contents) self.assertNotIn('Merge "', changelog_contents) def test_generate_authors(self): author_old = u"Foo Foo " author_new = u"Bar Bar " co_author = u"Foo Bar " co_author_by = u"Co-authored-by: " + co_author git_log_cmd = ( "git --git-dir=%s log --use-mailmap --format=%%aN <%%aE>" % self.git_dir) git_co_log_cmd = ("git --git-dir=%s log" % self.git_dir) git_top_level = "git rev-parse --show-toplevel" cmd_map = { git_log_cmd: author_new, git_co_log_cmd: co_author_by, git_top_level: self.root_dir, } exist_files = [self.git_dir, os.path.join(self.temp_path, "AUTHORS.in")] self.useFixture(fixtures.MonkeyPatch( "os.path.exists", lambda path: os.path.abspath(path) in exist_files)) def _fake_run_shell_command(cmd, **kwargs): return cmd_map[" ".join(cmd)] self.useFixture(fixtures.MonkeyPatch( "pbr.packaging._run_shell_command", _fake_run_shell_command)) with 
open(os.path.join(self.temp_path, "AUTHORS.in"), "w") as auth_fh: auth_fh.write("%s\n" % author_old) packaging.generate_authors(git_dir=self.git_dir, dest_dir=self.temp_path) with open(os.path.join(self.temp_path, "AUTHORS"), "r") as auth_fh: authors = auth_fh.read() self.assertTrue(author_old in authors) self.assertTrue(author_new in authors) self.assertTrue(co_author in authors) class BuildSphinxTest(base.BaseTestCase): scenarios = [ ('true_autodoc_caps', dict(has_opt=True, autodoc='True', has_autodoc=True)), ('true_autodoc_lower', dict(has_opt=True, autodoc='true', has_autodoc=True)), ('false_autodoc', dict(has_opt=True, autodoc='False', has_autodoc=False)), ('no_autodoc', dict(has_opt=False, autodoc='False', has_autodoc=False)), ] def setUp(self): super(BuildSphinxTest, self).setUp() self.useFixture(fixtures.MonkeyPatch( "sphinx.setup_command.BuildDoc.run", lambda self: None)) from distutils import dist self.distr = dist.Distribution() self.distr.packages = ("fake_package",) self.distr.command_options["build_sphinx"] = { "source_dir": ["a", "."]} pkg_fixture = fixtures.PythonPackage( "fake_package", [("fake_module.py", b"")]) self.useFixture(pkg_fixture) self.useFixture(base.DiveDir(pkg_fixture.base)) def test_build_doc(self): if self.has_opt: self.distr.command_options["pbr"] = { "autodoc_index_modules": ('setup.cfg', self.autodoc)} build_doc = packaging.LocalBuildDoc(self.distr) build_doc.run() self.assertTrue( os.path.exists("api/autoindex.rst") == self.has_autodoc) self.assertTrue( os.path.exists( "api/fake_package.fake_module.rst") == self.has_autodoc) def test_builders_config(self): if self.has_opt: self.distr.command_options["pbr"] = { "autodoc_index_modules": ('setup.cfg', self.autodoc)} build_doc = packaging.LocalBuildDoc(self.distr) build_doc.finalize_options() self.assertEqual(2, len(build_doc.builders)) self.assertIn('html', build_doc.builders) self.assertIn('man', build_doc.builders) build_doc = packaging.LocalBuildDoc(self.distr) 
build_doc.builders = '' build_doc.finalize_options() self.assertEqual('', build_doc.builders) build_doc = packaging.LocalBuildDoc(self.distr) build_doc.builders = 'man' build_doc.finalize_options() self.assertEqual(1, len(build_doc.builders)) self.assertIn('man', build_doc.builders) build_doc = packaging.LocalBuildDoc(self.distr) build_doc.builders = 'html,man,doctest' build_doc.finalize_options() self.assertIn('html', build_doc.builders) self.assertIn('man', build_doc.builders) self.assertIn('doctest', build_doc.builders) class ParseRequirementsTest(base.BaseTestCase): def setUp(self): super(ParseRequirementsTest, self).setUp() (fd, self.tmp_file) = tempfile.mkstemp(prefix='openstack', suffix='.setup') def test_parse_requirements_normal(self): with open(self.tmp_file, 'w') as fh: fh.write("foo\nbar") self.assertEqual(['foo', 'bar'], packaging.parse_requirements([self.tmp_file])) def test_parse_requirements_with_git_egg_url(self): with open(self.tmp_file, 'w') as fh: fh.write("-e git://foo.com/zipball#egg=bar") self.assertEqual(['bar'], packaging.parse_requirements([self.tmp_file])) def test_parse_requirements_with_versioned_git_egg_url(self): with open(self.tmp_file, 'w') as fh: fh.write("-e git://foo.com/zipball#egg=bar-1.2.4") self.assertEqual(['bar>=1.2.4'], packaging.parse_requirements([self.tmp_file])) def test_parse_requirements_with_http_egg_url(self): with open(self.tmp_file, 'w') as fh: fh.write("https://foo.com/zipball#egg=bar") self.assertEqual(['bar'], packaging.parse_requirements([self.tmp_file])) def test_parse_requirements_with_versioned_http_egg_url(self): with open(self.tmp_file, 'w') as fh: fh.write("https://foo.com/zipball#egg=bar-4.2.1") self.assertEqual(['bar>=4.2.1'], packaging.parse_requirements([self.tmp_file])) def test_parse_requirements_removes_index_lines(self): with open(self.tmp_file, 'w') as fh: fh.write("-f foobar") self.assertEqual([], packaging.parse_requirements([self.tmp_file])) def 
test_parse_requirements_removes_argparse(self): with open(self.tmp_file, 'w') as fh: fh.write("argparse") if sys.version_info >= (2, 7): self.assertEqual([], packaging.parse_requirements([self.tmp_file])) def test_parse_requirements_removes_versioned_ordereddict(self): self.useFixture(fixtures.MonkeyPatch('sys.version_info', (2, 7))) with open(self.tmp_file, 'w') as fh: fh.write("ordereddict==1.0.1") self.assertEqual([], packaging.parse_requirements([self.tmp_file])) def test_parse_requirements_keeps_versioned_ordereddict(self): self.useFixture(fixtures.MonkeyPatch('sys.version_info', (2, 6))) with open(self.tmp_file, 'w') as fh: fh.write("ordereddict==1.0.1") self.assertEqual([ "ordereddict==1.0.1"], packaging.parse_requirements([self.tmp_file])) def test_parse_requirements_override_with_env(self): with open(self.tmp_file, 'w') as fh: fh.write("foo\nbar") self.useFixture( fixtures.EnvironmentVariable('PBR_REQUIREMENTS_FILES', self.tmp_file)) self.assertEqual(['foo', 'bar'], packaging.parse_requirements()) def test_parse_requirements_override_with_env_multiple_files(self): with open(self.tmp_file, 'w') as fh: fh.write("foo\nbar") self.useFixture( fixtures.EnvironmentVariable('PBR_REQUIREMENTS_FILES', "no-such-file," + self.tmp_file)) self.assertEqual(['foo', 'bar'], packaging.parse_requirements()) def test_get_requirement_from_file_empty(self): actual = packaging.get_reqs_from_files([]) self.assertEqual([], actual) def test_parse_requirements_with_comments(self): with open(self.tmp_file, 'w') as fh: fh.write("# this is a comment\nfoobar\n# and another one\nfoobaz") self.assertEqual(['foobar', 'foobaz'], packaging.parse_requirements([self.tmp_file])) def test_parse_requirements_python_version(self): with open("requirements-py%d.txt" % sys.version_info[0], "w") as fh: fh.write("# this is a comment\nfoobar\n# and another one\nfoobaz") self.assertEqual(['foobar', 'foobaz'], packaging.parse_requirements()) def test_parse_requirements_right_python_version(self): with 
open("requirements-py1.txt", "w") as fh: fh.write("thisisatrap") with open("requirements-py%d.txt" % sys.version_info[0], "w") as fh: fh.write("# this is a comment\nfoobar\n# and another one\nfoobaz") self.assertEqual(['foobar', 'foobaz'], packaging.parse_requirements()) class ParseDependencyLinksTest(base.BaseTestCase): def setUp(self): super(ParseDependencyLinksTest, self).setUp() (fd, self.tmp_file) = tempfile.mkstemp(prefix="openstack", suffix=".setup") def test_parse_dependency_normal(self): with open(self.tmp_file, "w") as fh: fh.write("http://test.com\n") self.assertEqual( ["http://test.com"], packaging.parse_dependency_links([self.tmp_file])) def test_parse_dependency_with_git_egg_url(self): with open(self.tmp_file, "w") as fh: fh.write("-e git://foo.com/zipball#egg=bar") self.assertEqual( ["git://foo.com/zipball#egg=bar"], packaging.parse_dependency_links([self.tmp_file])) def load_tests(loader, in_tests, pattern): return testscenarios.load_tests_apply_scenarios(loader, in_tests, pattern) pbr-0.7.0/pbr/tests/__init__.py0000664000175300017540000000000012312051507017456 0ustar jenkinsjenkins00000000000000pbr-0.7.0/pbr/tests/testpackage/0000775000175300017540000000000012312051536017654 5ustar jenkinsjenkins00000000000000pbr-0.7.0/pbr/tests/testpackage/pbr_testpackage/0000775000175300017540000000000012312051536023012 5ustar jenkinsjenkins00000000000000pbr-0.7.0/pbr/tests/testpackage/pbr_testpackage/__init__.py0000664000175300017540000000000012312051507025107 0ustar jenkinsjenkins00000000000000pbr-0.7.0/pbr/tests/testpackage/pbr_testpackage/cmd.py0000664000175300017540000000143612312051507024131 0ustar jenkinsjenkins00000000000000# Copyright (c) 2013 Hewlett-Packard Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. 
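The egg-URL rewriting that ParseRequirementsTest asserts on can be summarized in a standalone sketch. The helper below is a simplified reimplementation for illustration (not pbr's actual `parse_requirements` from pbr/packaging.py) of the behaviors the tests check: comments and `-f` index lines are dropped, `#egg=name` fragments become plain requirements, and `#egg=name-1.2.4` becomes a `>=` pin:

```python
import re


def rewrite_requirement(line):
    """Simplified sketch of the rewriting asserted by ParseRequirementsTest."""
    line = line.strip()
    if not line or line.startswith('#') or line.startswith('-f'):
        return None  # comments and index lines are dropped
    # handles '-e git://host/path#egg=name[-version]' and
    # 'https://host/path#egg=name[-version]' forms
    match = re.search(r'#egg=([^&#]+)', line)
    if match:
        egg = match.group(1)
        ver = re.search(r'^(.*?)-(\d[\d.]*)$', egg)
        if ver:  # a trailing '-<version>' becomes a '>=' pin
            return '%s>=%s' % (ver.group(1), ver.group(2))
        return egg
    return line


reqs = [r for r in map(rewrite_requirement, [
    "foo",
    "# a comment",
    "-e git://foo.com/zipball#egg=bar-1.2.4",
    "https://foo.com/zipball#egg=bar",
    "-f foobar",
]) if r]
print(reqs)  # ['foo', 'bar>=1.2.4', 'bar']
```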
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

from __future__ import print_function


def main():
    print("PBR Test Command")


class Foo(object):
    @classmethod
    def bar(cls):
        print("PBR Test Command - with class!")

pbr-0.7.0/pbr/tests/testpackage/pbr_testpackage/_setup_hooks.py

# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Copyright (C) 2013 Association of Universities for Research in Astronomy
# (AURA)
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright
#    notice, this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above
#    copyright notice, this list of conditions and the following
#    disclaimer in the documentation and/or other materials provided
#    with the distribution.
#
# 3.
The name of AURA and its representatives may not be used to
#    endorse or promote products derived from this software without
#    specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED
# WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED.  IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
# OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
# TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
# USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

from distutils.command import build_py


def test_hook_1(config):
    print('test_hook_1')


def test_hook_2(config):
    print('test_hook_2')


class test_command(build_py.build_py):
    command_name = 'build_py'

    def run(self):
        print('Running custom build_py command.')
        return build_py.build_py.run(self)


def test_pre_hook(cmdobj):
    print('build_ext pre-hook')


def test_post_hook(cmdobj):
    print('build_ext post-hook')

pbr-0.7.0/pbr/tests/testpackage/pbr_testpackage/package_data/
pbr-0.7.0/pbr/tests/testpackage/pbr_testpackage/package_data/1.txt
pbr-0.7.0/pbr/tests/testpackage/pbr_testpackage/package_data/2.txt
pbr-0.7.0/pbr/tests/testpackage/data_files/
pbr-0.7.0/pbr/tests/testpackage/data_files/b.txt
pbr-0.7.0/pbr/tests/testpackage/data_files/c.rst
pbr-0.7.0/pbr/tests/testpackage/data_files/a.txt
pbr-0.7.0/pbr/tests/testpackage/extra-file.txt
pbr-0.7.0/pbr/tests/testpackage/LICENSE.txt

Copyright (C) 2005 Association of Universities for Research in Astronomy (AURA)

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

    1. Redistributions of source code must retain the above copyright
      notice, this list of conditions and the following disclaimer.

    2. Redistributions in binary form must reproduce the above
      copyright notice, this list of conditions and the following
      disclaimer in the documentation and/or other materials provided
      with the distribution.

    3. The name of AURA and its representatives may not be used to
      endorse or promote products derived from this software without
      specific prior written permission.

THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED
WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED
TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
pbr-0.7.0/pbr/tests/testpackage/src/
pbr-0.7.0/pbr/tests/testpackage/src/testext.c

#include <Python.h>


static PyMethodDef TestextMethods[] = {
    {NULL, NULL, 0, NULL}
};


#if PY_MAJOR_VERSION >= 3
static struct PyModuleDef testextmodule = {
    PyModuleDef_HEAD_INIT,
    "testext",
    -1,
    TestextMethods
};

PyObject*
PyInit_testext(void)
{
    return PyModule_Create(&testextmodule);
}
#else
PyMODINIT_FUNC
inittestext(void)
{
    Py_InitModule("testext", TestextMethods);
}
#endif

pbr-0.7.0/pbr/tests/testpackage/MANIFEST.in

include data_files/*

pbr-0.7.0/pbr/tests/testpackage/setup.py

#!/usr/bin/env python
# Copyright (c) 2013 Hewlett-Packard Development Company, L.P.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import setuptools

setuptools.setup(
    setup_requires=['pbr'],
    pbr=True,
)

pbr-0.7.0/pbr/tests/testpackage/CHANGES.txt

Changelog
===========

0.3 (unreleased)
------------------

- The ``glob_data_files`` hook became a pre-command hook for the
  install_data command instead of being a setup-hook.
  This is to support the additional functionality of requiring data_files
  with relative destination paths to be installed relative to the package's
  install path (i.e. site-packages).

- Dropped support for and deprecated the easier_install custom command.
  Although it should still work, it probably won't be used anymore for
  stsci_python packages.

- Added support for the ``build_optional_ext`` command, which
  replaces/extends the default ``build_ext`` command.  See the README for
  more details.

- Added the ``tag_svn_revision`` setup_hook as a replacement for the
  setuptools-specific tag_svn_revision option to the egg_info command.
  This new hook is easier to use than the old tag_svn_revision option:
  It's automatically enabled by the presence of ``.dev`` in the version
  string, and disabled otherwise.

- The ``svn_info_pre_hook`` and ``svn_info_post_hook`` have been replaced
  with ``version_pre_command_hook`` and ``version_post_command_hook``
  respectively.  However, a new ``version_setup_hook``, which has the same
  purpose, has been added.  It is generally easier to use and will give
  more consistent results in that it will run every time setup.py is run,
  regardless of which command is used.  ``stsci.distutils`` itself uses
  this hook--see the `setup.cfg` file and `stsci/distutils/__init__.py`
  for example usage.

- Instead of creating an `svninfo.py` module, the new ``version_`` hooks
  create a file called `version.py`.  In addition to the SVN info that was
  included in `svninfo.py`, it includes a ``__version__`` variable to be
  used by the package's `__init__.py`.  This allows there to be a
  hard-coded ``__version__`` variable included in the source code, rather
  than using pkg_resources to get the version.

- In `version.py`, the variables previously named ``__svn_version__`` and
  ``__full_svn_info__`` are now named ``__svn_revision__`` and
  ``__svn_full_info__``.

- Fixed a bug when using stsci.distutils in the installation of other
  packages in the ``stsci.*`` namespace package.
If stsci.distutils was not already installed, and was downloaded automatically by distribute through the setup_requires option, then ``stsci.distutils`` would fail to import. This is because of the way the namespace package (nspkg) mechanism currently works: all packages belonging to the nspkg *must* be on the import path at initial import time. So when installing stsci.tools, for example, if ``stsci.tools`` is imported from within the source code at install time, but before ``stsci.distutils`` is downloaded and added to the path, the ``stsci`` package is already imported and can't be extended to include the path of ``stsci.distutils`` after the fact. The easiest way of dealing with this, it seems, is to delete ``stsci`` from ``sys.modules``, which forces it to be reimported, now with its ``__path__`` extended to include ``stsci.distutils``'s path. 0.2.2 (2011-11-09) ------------------ - Fixed check for the issue205 bug on actual setuptools installs; before it only worked on distribute. setuptools has the issue205 bug prior to version 0.6c10. - Improved the fix for the issue205 bug, especially on setuptools. setuptools, prior to 0.6c10, did not back up sys.modules either before sandboxing, which causes serious problems. In fact, it's so bad that it's not enough to add a sys.modules backup to the current sandbox: it's in fact necessary to monkeypatch setuptools.sandbox.run_setup so that any subsequent calls to it also back up sys.modules. 0.2.1 (2011-09-02) ------------------ - Fixed the dependencies so that not just setuptools but 'distribute' specifically is required.
Previously installation could fail if users had plain setuptools installed and not distribute. 0.2 (2011-08-23) ------------------ - Initial public release pbr-0.7.0/pbr/tests/testpackage/git-extra-file.txt pbr-0.7.0/pbr/tests/testpackage/README.txt Introduction ============ This package contains utilities used to package some of STScI's Python projects; specifically those projects that comprise stsci_python_ and Astrolib_. It currently consists mostly of some setup_hook scripts meant for use with `distutils2/packaging`_ and/or pbr_, and a customized easy_install command meant for use with distribute_. This package is not meant for general consumption, though it might be worth looking at for examples of how to do certain things with your own packages, but YMMV. Features ======== Hook Scripts ------------ Currently the main features of this package are a couple of setup_hook scripts. In distutils2, a setup_hook is a script that runs at the beginning of any pysetup command, and can modify the package configuration read from setup.cfg. There are also pre- and post-command hooks that only run before/after a specific setup command (e.g. build_ext, install) is run. stsci.distutils.hooks.use_packages_root ''''''''''''''''''''''''''''''''''''''' If using the ``packages_root`` option under the ``[files]`` section of setup.cfg, this hook will add that path to ``sys.path`` so that modules in your package can be imported and used in setup. This can be used even if ``packages_root`` is not specified--in this case it adds ``''`` to ``sys.path``.
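The behavior described above can be sketched roughly as follows. This is a minimal illustration, not the actual stsci.distutils implementation; the nested-dict config layout (``config['files']['packages_root']``) is an assumption based on the distutils2 setup_hook convention:

```python
import sys


def use_packages_root(config):
    """Prepend packages_root (or '' if unset) to sys.path."""
    root = config.get('files', {}).get('packages_root', '')
    if root not in sys.path:
        sys.path.insert(0, root)


# After the hook runs, modules under ./lib are importable during setup.
use_packages_root({'files': {'packages_root': 'lib'}})
print(sys.path[0])  # -> 'lib'
```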
stsci.distutils.hooks.version_setup_hook '''''''''''''''''''''''''''''''''''''''' Creates a Python module called version.py which currently contains four variables: * ``__version__`` (the release version) * ``__svn_revision__`` (the SVN revision info as returned by the ``svnversion`` command) * ``__svn_full_info__`` (as returned by the ``svn info`` command) * ``__setup_datetime__`` (the date and time that setup.py was last run). These variables can be imported in the package's `__init__.py` for debugging purposes. The version.py module will *only* be created in a package that imports from the version module in its `__init__.py`. It should be noted that this is generally preferable to writing these variables directly into `__init__.py`, since this provides more control and is less likely to unexpectedly break things in `__init__.py`. stsci.distutils.hooks.version_pre_command_hook '''''''''''''''''''''''''''''''''''''''''''''' Identical to version_setup_hook, but designed to be used as a pre-command hook. stsci.distutils.hooks.version_post_command_hook ''''''''''''''''''''''''''''''''''''''''''''''' The complement to version_pre_command_hook. This will delete any version.py files created during a build in order to prevent them from cluttering an SVN working copy (note, however, that version.py is *not* deleted from the build/ directory, so a copy of it is still preserved). It will also not be deleted if the current directory is not an SVN working copy. For example, if source code is extracted from a source tarball it will be preserved. stsci.distutils.hooks.tag_svn_revision '''''''''''''''''''''''''''''''''''''' A setup_hook to add the SVN revision of the current working copy path to the package version string, but only if the version ends in .dev. For example, ``mypackage-1.0.dev`` becomes ``mypackage-1.0.dev1234``. This is in accordance with the version string format standardized by PEP 386.
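The version-tagging logic amounts to something like the following sketch. It is illustrative only: the revision lookup is abstracted into a callable here, whereas the real hook shells out to the ``svnversion`` command, and the distutils2-style nested ``metadata`` dict is an assumed layout:

```python
def tag_svn_revision(config, get_revision):
    """Append the SVN revision to the version, but only for .dev versions."""
    metadata = config.setdefault('metadata', {})
    version = metadata.get('version', '')
    if version.endswith('.dev'):
        metadata['version'] = version + get_revision()
    return metadata['version']


# A .dev version is tagged; a final release is left untouched.
print(tag_svn_revision({'metadata': {'version': '1.0.dev'}}, lambda: '1234'))  # -> 1.0.dev1234
print(tag_svn_revision({'metadata': {'version': '1.0'}}, lambda: '1234'))      # -> 1.0
```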
This should be used as a replacement for the ``tag_svn_revision`` option to the egg_info command. This hook is more compatible with packaging/distutils2, which does not include any VCS support. This hook is also more flexible in that it turns the revision number on/off depending on the presence of ``.dev`` in the version string, so that it's not automatically added to the version in final releases. This hook does require the ``svnversion`` command to be available in order to work. It does not examine the working copy metadata directly. stsci.distutils.hooks.numpy_extension_hook '''''''''''''''''''''''''''''''''''''''''' This is a pre-command hook for the build_ext command. To use it, add a ``[build_ext]`` section to your setup.cfg, and add to it:: pre-hook.numpy-extension-hook = stsci.distutils.hooks.numpy_extension_hook This hook must be used to build extension modules that use Numpy. The primary side-effect of this hook is to add the correct numpy include directories to `include_dirs`. To use it, add 'numpy' to the 'include-dirs' option of each extension module that requires numpy to build. The value 'numpy' will be replaced with the actual path to the numpy includes. stsci.distutils.hooks.is_display_option ''''''''''''''''''''''''''''''''''''''' This is not actually a hook, but is a useful utility function that can be used in writing other hooks. Basically, it returns ``True`` if setup.py was run with a "display option" such as --version or --help. This can be used to prevent your hook from running in such cases. stsci.distutils.hooks.glob_data_files ''''''''''''''''''''''''''''''''''''' A pre-command hook for the install_data command. Allows filename wildcards as understood by ``glob.glob()`` to be used in the data_files option. This hook must be used in order to have this functionality since it does not normally exist in distutils. This hook also ensures that data files are installed relative to the package path. 
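The wildcard expansion itself can be sketched like this. It is a simplified sketch, assuming data_files is a list of ``(destination, patterns)`` pairs in the distutils style; the real hook additionally rewrites destinations relative to the package install path:

```python
import glob


def expand_data_files(data_files):
    """Expand glob.glob() wildcards in each (destination, patterns) pair."""
    expanded = []
    for dest, patterns in data_files:
        matched = []
        for pattern in patterns:
            matched.extend(sorted(glob.glob(pattern)))
        expanded.append((dest, matched))
    return expanded
```

A pattern that matches nothing simply yields an empty file list for that destination, so missing data directories do not break the build.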
data_files shouldn't normally be installed this way, but the functionality is required for a few special cases. Commands -------- build_optional_ext '''''''''''''''''' This serves as an optional replacement for the default build_ext command, which compiles C extension modules. Its purpose is to allow extension modules to be *optional*, so that if their build fails the rest of the package is still allowed to be built and installed. This can be used when an extension module is not definitely required to use the package. To use this custom command, add:: commands = stsci.distutils.command.build_optional_ext.build_optional_ext under the ``[global]`` section of your package's setup.cfg. Then, to mark an individual extension module as optional, under the setup.cfg section for that extension add:: optional = True Optionally, you may also add a custom failure message by adding:: fail_message = The foobar extension module failed to compile. This could be because you lack such and such headers. This package will still work, but such and such features will be disabled. .. _stsci_python: http://www.stsci.edu/resources/software_hardware/pyraf/stsci_python .. _Astrolib: http://www.scipy.org/AstroLib/ .. _distutils2/packaging: http://distutils2.notmyidea.org/ .. _d2to1: http://pypi.python.org/pypi/d2to1 ..
_distribute: http://pypi.python.org/pypi/distribute pbr-0.7.0/pbr/tests/testpackage/setup.cfg0000664000175300017540000000261312312051507021475 0ustar jenkinsjenkins00000000000000[metadata] name = pbr_testpackage version = 0.1.dev author = OpenStack author-email = openstack-dev@lists.openstack.org home-page = http://pypi.python.org/pypi/pbr summary = Test package for testing pbr description-file = README.txt CHANGES.txt requires-python = >=2.5 requires-dist = setuptools classifier = Development Status :: 3 - Alpha Intended Audience :: Developers License :: OSI Approved :: BSD License Programming Language :: Python Topic :: Scientific/Engineering Topic :: Software Development :: Build Tools Topic :: Software Development :: Libraries :: Python Modules Topic :: System :: Archiving :: Packaging keywords = packaging, distutils, setuptools [files] packages = pbr_testpackage package-data = testpackage = package_data/*.txt data-files = testpackage/data_files = data_files/*.txt extra-files = extra-file.txt [entry_points] console_scripts = pbr_test_cmd = pbr_testpackage.cmd:main pbr_test_cmd_with_class = pbr_testpackage.cmd:Foo.bar [extension=pbr_testpackage.testext] sources = src/testext.c optional = True [global] #setup-hooks = # pbr_testpackage._setup_hooks.test_hook_1 # pbr_testpackage._setup_hooks.test_hook_2 commands = pbr_testpackage._setup_hooks.test_command [build_ext] #pre-hook.test_pre_hook = pbr_testpackage._setup_hooks.test_pre_hook #post-hook.test_post_hook = pbr_testpackage._setup_hooks.test_post_hook pbr-0.7.0/pbr/tests/test_core.py0000664000175300017540000001222512312051507017722 0ustar jenkinsjenkins00000000000000# Copyright (c) 2013 Hewlett-Packard Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. 
# You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or # implied. # See the License for the specific language governing permissions and # limitations under the License. # # Copyright (C) 2013 Association of Universities for Research in Astronomy # (AURA) # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are met: # # 1. Redistributions of source code must retain the above copyright # notice, this list of conditions and the following disclaimer. # # 2. Redistributions in binary form must reproduce the above # copyright notice, this list of conditions and the following # disclaimer in the documentation and/or other materials provided # with the distribution. # # 3. The name of AURA and its representatives may not be used to # endorse or promote products derived from this software without # specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED # WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF # MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE # DISCLAIMED. 
IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT, # INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, # BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS import glob import os import tarfile import fixtures from pbr.tests import base class TestCore(base.BaseTestCase): cmd_names = ('pbr_test_cmd', 'pbr_test_cmd_with_class') def check_script_install(self, install_stdout): for cmd_name in self.cmd_names: install_txt = 'Installing %s script to %s' % (cmd_name, self.temp_dir) self.assertIn(install_txt, install_stdout) cmd_filename = os.path.join(self.temp_dir, cmd_name) script_txt = open(cmd_filename, 'r').read() self.assertNotIn('pkg_resources', script_txt) stdout, _, return_code = self._run_cmd(cmd_filename) self.assertIn("PBR", stdout) def test_setup_py_keywords(self): """setup.py --keywords. Test that the `./setup.py --keywords` command returns the correct value without balking. """ self.run_setup('egg_info') stdout, _, _ = self.run_setup('--keywords') assert stdout == 'packaging,distutils,setuptools' def test_sdist_extra_files(self): """Test that the extra files are correctly added.""" stdout, _, return_code = self.run_setup('sdist', '--formats=gztar') # There can be only one try: tf_path = glob.glob(os.path.join('dist', '*.tar.gz'))[0] except IndexError: assert False, 'source dist not found' tf = tarfile.open(tf_path) names = ['/'.join(p.split('/')[1:]) for p in tf.getnames()] self.assertIn('extra-file.txt', names) def test_console_script_install(self): """Test that we install a non-pkg-resources console script.""" if os.name == 'nt': self.skipTest('Windows support is passthrough') stdout, _, return_code = self.run_setup( 'install_scripts', '--install-dir=%s' % self.temp_dir) self.useFixture( fixtures.EnvironmentVariable('PYTHONPATH', '.')) self.check_script_install(stdout) def test_console_script_develop(self): """Test that we develop a non-pkg-resources console script.""" if os.name == 'nt': self.skipTest('Windows 
support is passthrough') self.useFixture( fixtures.EnvironmentVariable( 'PYTHONPATH', ".:%s" % self.temp_dir)) stdout, _, return_code = self.run_setup( 'develop', '--install-dir=%s' % self.temp_dir) self.check_script_install(stdout) class TestGitSDist(base.BaseTestCase): def setUp(self): super(TestGitSDist, self).setUp() stdout, _, return_code = self._run_cmd('git', ('init',)) if return_code: self.skipTest("git not installed") stdout, _, return_code = self._run_cmd('git', ('add', '.')) stdout, _, return_code = self._run_cmd( 'git', ('commit', '-m', 'Turn this into a git repo')) stdout, _, return_code = self.run_setup('sdist', '--formats=gztar') def test_sdist_git_extra_files(self): """Test that extra files found in git are correctly added.""" # There can be only one tf_path = glob.glob(os.path.join('dist', '*.tar.gz'))[0] tf = tarfile.open(tf_path) names = ['/'.join(p.split('/')[1:]) for p in tf.getnames()] self.assertIn('git-extra-file.txt', names) pbr-0.7.0/pbr/tests/test_version.py0000664000175300017540000000216112312051507020455 0ustar jenkinsjenkins00000000000000# vim: tabstop=4 shiftwidth=4 softtabstop=4 # Copyright 2012 Red Hat, Inc. # Copyright 2012-2013 Hewlett-Packard Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. 
from pbr.tests import base from pbr import version class DeferredVersionTestCase(base.BaseTestCase): def test_cached_version(self): class MyVersionInfo(version.VersionInfo): def _get_version_from_pkg_resources(self): return "5.5.5.5" deferred_string = MyVersionInfo("openstack").\ cached_version_string() self.assertEqual("5.5.5.5", deferred_string) pbr-0.7.0/pbr/tests/test_commands.py0000664000175300017540000000471712312051507020602 0ustar jenkinsjenkins00000000000000# Copyright (c) 2013 Hewlett-Packard Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or # implied. # See the License for the specific language governing permissions and # limitations under the License. # # Copyright (C) 2013 Association of Universities for Research in Astronomy # (AURA) # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are met: # # 1. Redistributions of source code must retain the above copyright # notice, this list of conditions and the following disclaimer. # # 2. Redistributions in binary form must reproduce the above # copyright notice, this list of conditions and the following # disclaimer in the documentation and/or other materials provided # with the distribution. # # 3. The name of AURA and its representatives may not be used to # endorse or promote products derived from this software without # specific prior written permission. 
# # THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED # WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF # MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE # DISCLAIMED. IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT, # INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, # BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS from testtools import content from pbr.tests import base class TestCommands(base.BaseTestCase): def test_custom_build_py_command(self): """Test custom build_py command. Test that a custom subclass of the build_py command runs when listed in the commands [global] option, rather than the normal build command. """ stdout, stderr, return_code = self.run_setup('build_py') self.addDetail('stdout', content.text_content(stdout)) self.addDetail('stderr', content.text_content(stderr)) self.assertIn('Running custom build_py command.', stdout) self.assertEqual(return_code, 0) pbr-0.7.0/pbr/tests/test_hooks.py0000664000175300017540000000750112312051507020116 0ustar jenkinsjenkins00000000000000# Copyright (c) 2013 Hewlett-Packard Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or # implied. # See the License for the specific language governing permissions and # limitations under the License. # # Copyright (C) 2013 Association of Universities for Research in Astronomy # (AURA) # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are met: # # 1. 
Redistributions of source code must retain the above copyright # notice, this list of conditions and the following disclaimer. # # 2. Redistributions in binary form must reproduce the above # copyright notice, this list of conditions and the following # disclaimer in the documentation and/or other materials provided # with the distribution. # # 3. The name of AURA and its representatives may not be used to # endorse or promote products derived from this software without # specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED # WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF # MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE # DISCLAIMED. IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT, # INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, # BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS import os import textwrap from testtools.matchers import Contains from pbr.tests import base from pbr.tests import util class TestHooks(base.BaseTestCase): def setUp(self): super(TestHooks, self).setUp() with util.open_config( os.path.join(self.package_dir, 'setup.cfg')) as cfg: cfg.set('global', 'setup-hooks', 'pbr_testpackage._setup_hooks.test_hook_1\n' 'pbr_testpackage._setup_hooks.test_hook_2') cfg.set('build_ext', 'pre-hook.test_pre_hook', 'pbr_testpackage._setup_hooks.test_pre_hook') cfg.set('build_ext', 'post-hook.test_post_hook', 'pbr_testpackage._setup_hooks.test_post_hook') def test_global_setup_hooks(self): """Test setup_hooks. Test that setup_hooks listed in the [global] section of setup.cfg are executed in order. """ stdout, _, return_code = self.run_setup('egg_info') assert 'test_hook_1\ntest_hook_2' in stdout assert return_code == 0 def test_command_hooks(self): """Test command hooks. Simple test that the appropriate command hooks run at the beginning/end of the appropriate command. 
""" stdout, _, return_code = self.run_setup('egg_info') assert 'build_ext pre-hook' not in stdout assert 'build_ext post-hook' not in stdout assert return_code == 0 stdout, _, return_code = self.run_setup('build_ext') assert textwrap.dedent(""" running build_ext running pre_hook pbr_testpackage._setup_hooks.test_pre_hook for command build_ext build_ext pre-hook """) in stdout # flake8: noqa assert stdout.endswith('build_ext post-hook') assert return_code == 0 def test_custom_commands_known(self): stdout, _, return_code = self.run_setup('--help-commands') self.assertFalse(return_code) self.assertThat(stdout, Contains(" testr ")) pbr-0.7.0/pbr/tests/test_files.py0000664000175300017540000000510012312051507020066 0ustar jenkinsjenkins00000000000000# vim: tabstop=4 shiftwidth=4 softtabstop=4 # Copyright (c) 2013 Hewlett-Packard Development Company, L.P. # All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. 
from __future__ import print_function import os import fixtures from pbr.hooks import files from pbr.tests import base class FilesConfigTest(base.BaseTestCase): def setUp(self): super(FilesConfigTest, self).setUp() pkg_fixture = fixtures.PythonPackage( "fake_package", [ ("fake_module.py", b""), ("other_fake_module.py", b""), ]) self.useFixture(pkg_fixture) pkg_etc = os.path.join(pkg_fixture.base, 'etc') pkg_sub = os.path.join(pkg_etc, 'sub') subpackage = os.path.join( pkg_fixture.base, 'fake_package', 'subpackage') os.makedirs(pkg_sub) os.makedirs(subpackage) with open(os.path.join(pkg_etc, "foo"), 'w') as foo_file: foo_file.write("Foo Data") with open(os.path.join(pkg_sub, "bar"), 'w') as foo_file: foo_file.write("Bar Data") with open(os.path.join(subpackage, "__init__.py"), 'w') as foo_file: foo_file.write("# empty") self.useFixture(base.DiveDir(pkg_fixture.base)) def test_implicit_auto_package(self): config = dict( files=dict( ) ) files.FilesConfig(config, 'fake_package').run() self.assertIn('subpackage', config['files']['packages']) def test_auto_package(self): config = dict( files=dict( packages='fake_package', ) ) files.FilesConfig(config, 'fake_package').run() self.assertIn('subpackage', config['files']['packages']) def test_data_files_globbing(self): config = dict( files=dict( data_files="\n etc/pbr = etc/*" ) ) files.FilesConfig(config, 'fake_package').run() self.assertIn( '\netc/pbr/ = \n etc/foo\netc/pbr/sub = \n etc/sub/bar', config['files']['data_files']) pbr-0.7.0/pbr/tests/base.py0000664000175300017540000001242412312051507016646 0ustar jenkinsjenkins00000000000000# vim: tabstop=4 shiftwidth=4 softtabstop=4 # Copyright 2010-2011 OpenStack Foundation # Copyright (c) 2013 Hewlett-Packard Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. 
You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. # Copyright (C) 2013 Association of Universities for Research in Astronomy # (AURA) # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are met: # # 1. Redistributions of source code must retain the above copyright # notice, this list of conditions and the following disclaimer. # # 2. Redistributions in binary form must reproduce the above # copyright notice, this list of conditions and the following # disclaimer in the documentation and/or other materials provided # with the distribution. # # 3. The name of AURA and its representatives may not be used to # endorse or promote products derived from this software without # specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED # WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF # MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE # DISCLAIMED. IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT, # INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, # BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS """Common utilities used in testing""" import os import shutil import subprocess import sys import fixtures import testresources import testtools from pbr import packaging class DiveDir(fixtures.Fixture): """Dive into given directory and return back on cleanup. :ivar path: The target directory. 
""" def __init__(self, path): self.path = path def setUp(self): super(DiveDir, self).setUp() self.addCleanup(os.chdir, os.getcwd()) os.chdir(self.path) class BaseTestCase(testtools.TestCase, testresources.ResourcedTestCase): def setUp(self): super(BaseTestCase, self).setUp() test_timeout = os.environ.get('OS_TEST_TIMEOUT', 30) try: test_timeout = int(test_timeout) except ValueError: # If timeout value is invalid, fail hard. print("OS_TEST_TIMEOUT set to invalid value" " defaulting to no timeout") test_timeout = 0 if test_timeout > 0: self.useFixture(fixtures.Timeout(test_timeout, gentle=True)) if os.environ.get('OS_STDOUT_CAPTURE') in packaging.TRUE_VALUES: stdout = self.useFixture(fixtures.StringStream('stdout')).stream self.useFixture(fixtures.MonkeyPatch('sys.stdout', stdout)) if os.environ.get('OS_STDERR_CAPTURE') in packaging.TRUE_VALUES: stderr = self.useFixture(fixtures.StringStream('stderr')).stream self.useFixture(fixtures.MonkeyPatch('sys.stderr', stderr)) self.log_fixture = self.useFixture( fixtures.FakeLogger('pbr')) self.useFixture(fixtures.NestedTempfile()) self.useFixture(fixtures.FakeLogger()) self.useFixture(fixtures.EnvironmentVariable('PBR_VERSION', '0.0')) self.temp_dir = self.useFixture(fixtures.TempDir()).path self.package_dir = os.path.join(self.temp_dir, 'testpackage') shutil.copytree(os.path.join(os.path.dirname(__file__), 'testpackage'), self.package_dir) self.addCleanup(os.chdir, os.getcwd()) os.chdir(self.package_dir) self.addCleanup(self._discard_testpackage) def _discard_testpackage(self): # Remove pbr.testpackage from sys.modules so that it can be freshly # re-imported by the next test for k in list(sys.modules): if (k == 'pbr_testpackage' or k.startswith('pbr_testpackage.')): del sys.modules[k] def run_setup(self, *args): return self._run_cmd(sys.executable, ('setup.py',) + args) def _run_cmd(self, cmd, args=[]): """Run a command in the root of the test working copy. 
Runs a command, with the given argument list, in the root of the test working copy--returns the stdout and stderr streams and the exit code from the subprocess. """ return _run_cmd([cmd] + list(args), cwd=self.package_dir) def _run_cmd(args, cwd): """Run the command args in cwd. :param args: The command to run e.g. ['git', 'status'] :param cwd: The directory to run the comamnd in. :return: ((stdout, stderr), returncode) """ p = subprocess.Popen( args, stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=cwd) streams = tuple(s.decode('latin1').strip() for s in p.communicate()) for content in streams: print(content) return (streams) + (p.returncode,) pbr-0.7.0/pbr/tests/util.py0000664000175300017540000000477412312051507016722 0ustar jenkinsjenkins00000000000000# Copyright (c) 2013 Hewlett-Packard Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or # implied. # See the License for the specific language governing permissions and # limitations under the License. # # Copyright (C) 2013 Association of Universities for Research in Astronomy # (AURA) # # Redistribution and use in source and binary forms, with or without # modification, are permitted provided that the following conditions are met: # # 1. Redistributions of source code must retain the above copyright # notice, this list of conditions and the following disclaimer. # # 2. Redistributions in binary form must reproduce the above # copyright notice, this list of conditions and the following # disclaimer in the documentation and/or other materials provided # with the distribution. # # 3. 
The name of AURA and its representatives may not be used to # endorse or promote products derived from this software without # specific prior written permission. # # THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED # WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF # MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE # DISCLAIMED. IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT, # INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, # BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS import contextlib import os import shutil import stat try: import ConfigParser as configparser except ImportError: import configparser @contextlib.contextmanager def open_config(filename): cfg = configparser.ConfigParser() cfg.read(filename) yield cfg with open(filename, 'w') as fp: cfg.write(fp) def rmtree(path): """shutil.rmtree() with error handler. Handle 'access denied' from trying to delete read-only files. """ def onerror(func, path, exc_info): if not os.access(path, os.W_OK): os.chmod(path, stat.S_IWUSR) func(path) else: raise return shutil.rmtree(path, onerror=onerror) pbr-0.7.0/pbr/tests/test_packaging.py0000664000175300017540000001125412312051507020717 0ustar jenkinsjenkins00000000000000# Copyright (c) 2013 New Dream Network, LLC (DreamHost) # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or # implied. # See the License for the specific language governing permissions and # limitations under the License. 
#
# Copyright (C) 2013 Association of Universities for Research in Astronomy
# (AURA)
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright
#    notice, this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above
#    copyright notice, this list of conditions and the following
#    disclaimer in the documentation and/or other materials provided
#    with the distribution.
#
# 3. The name of AURA and its representatives may not be used to
#    endorse or promote products derived from this software without
#    specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY AURA ``AS IS'' AND ANY EXPRESS OR IMPLIED
# WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
# MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL AURA BE LIABLE FOR ANY DIRECT, INDIRECT,
# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
# BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS
# OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
# TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE
# USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

import os

import fixtures
import mock

from pbr import packaging
from pbr.tests import base


class TestPackagingInGitRepoWithCommit(base.BaseTestCase):

    def setUp(self):
        super(TestPackagingInGitRepoWithCommit, self).setUp()
        self.useFixture(fixtures.TempHomeDir())
        self._run_cmd(
            'git', ['config', '--global', 'user.email', 'nobody@example.com'])
        self._run_cmd('git', ['init', '.'])
        self._run_cmd('git', ['add', '.'])
        self._run_cmd('git', ['commit', '-m', 'test commit'])
        self.run_setup('sdist')

    def test_authors(self):
        # One commit, something should be in the authors list
        with open(os.path.join(self.package_dir, 'AUTHORS'), 'r') as f:
            body = f.read()
        self.assertNotEqual(body, '')

    def test_changelog(self):
        with open(os.path.join(self.package_dir, 'ChangeLog'), 'r') as f:
            body = f.read()
        # One commit, something should be in the ChangeLog list
        self.assertNotEqual(body, '')


class TestPackagingInGitRepoWithoutCommit(base.BaseTestCase):

    def setUp(self):
        super(TestPackagingInGitRepoWithoutCommit, self).setUp()
        self._run_cmd('git', ['init', '.'])
        self._run_cmd('git', ['add', '.'])
        self.run_setup('sdist')

    def test_authors(self):
        # No commits, no authors in list
        with open(os.path.join(self.package_dir, 'AUTHORS'), 'r') as f:
            body = f.read()
        self.assertEqual(body, '\n')

    def test_changelog(self):
        # No commits, nothing should be in the ChangeLog list
        with open(os.path.join(self.package_dir, 'ChangeLog'), 'r') as f:
            body = f.read()
        self.assertEqual(body, 'CHANGES\n=======\n\n')


class TestPackagingInPlainDirectory(base.BaseTestCase):

    def setUp(self):
        super(TestPackagingInPlainDirectory, self).setUp()
        self.run_setup('sdist')

    def test_authors(self):
        # Not a git repo, no AUTHORS file created
        filename = os.path.join(self.package_dir, 'AUTHORS')
        self.assertFalse(os.path.exists(filename))

    def test_changelog(self):
        # Not a git repo, no ChangeLog created
        filename = os.path.join(self.package_dir, 'ChangeLog')
        self.assertFalse(os.path.exists(filename))


class TestPresenceOfGit(base.BaseTestCase):

    def testGitIsInstalled(self):
        with mock.patch.object(packaging, '_run_shell_command') as _command:
            _command.return_value = 'git version 1.8.4.1'
            self.assertEqual(True, packaging._git_is_installed())

    def testGitIsNotInstalled(self):
        with mock.patch.object(packaging, '_run_shell_command') as _command:
            _command.side_effect = OSError
            self.assertEqual(False, packaging._git_is_installed())

pbr-0.7.0/setup.cfg

[metadata]
name = pbr
author = OpenStack
author-email = openstack-dev@lists.openstack.org
summary = Python Build Reasonableness
description-file = README.rst
home-page = http://pypi.python.org/pypi/pbr
requires-python = >=2.6
classifier =
    Development Status :: 5 - Production/Stable
    Environment :: Console
    Environment :: OpenStack
    Intended Audience :: Developers
    Intended Audience :: Information Technology
    License :: OSI Approved :: Apache Software License
    Operating System :: OS Independent
    Programming Language :: Python
    Programming Language :: Python :: 2
    Programming Language :: Python :: 2.6
    Programming Language :: Python :: 2.7
    Programming Language :: Python :: 3
    Programming Language :: Python :: 3.3

[files]
packages =
    pbr

[global]
setup-hooks =
    pbr.hooks.setup_hook

[pbr]
warnerrors = True

[entry_points]
distutils.setup_keywords =
    pbr = pbr.core:pbr

[build_sphinx]
all_files = 1
build-dir = doc/build
source-dir = doc/source

[wheel]
universal = 1

[egg_info]
tag_build = 
tag_date = 0
tag_svn_revision = 0

pbr-0.7.0/.mailmap

# Format is:
#
#
Davanum Srinivas
Erik M. Bray Erik Bray
Zhongyue Luo
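The `[global] setup-hooks` and `[entry_points] distutils.setup_keywords` sections in the `setup.cfg` above are how pbr hooks itself into setuptools. A project consuming pbr keeps all of its own metadata declaratively in its `setup.cfg` and reduces `setup.py` to a stub that hands control to pbr; a sketch of that stub (the documented pbr pattern of this era, not part of this archive):

```python
#!/usr/bin/env python
# Minimal setup.py for a pbr-driven project: all metadata lives in
# setup.cfg, so the script only pulls in pbr and enables its keyword.
import setuptools

setuptools.setup(
    setup_requires=['pbr'],
    pbr=True)
```

The `pbr=True` keyword is resolved through the `distutils.setup_keywords` entry point registered above, which is what lets pbr take over the rest of the setup processing.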
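The `open_config` helper in `pbr/tests/util.py` above is a small read-modify-write context manager for INI files: it loads the file into a parser, yields it for editing, and writes the result back when the block exits. A minimal sketch of that behavior (the helper is reproduced inline so the snippet is self-contained; the temporary file and the `example`/`renamed` values are illustrative):

```python
import contextlib
import os
import tempfile

try:
    import ConfigParser as configparser  # Python 2
except ImportError:
    import configparser  # Python 3


@contextlib.contextmanager
def open_config(filename):
    # Read the INI file, hand the parser to the caller, then persist
    # any modifications when the with-block exits.
    cfg = configparser.ConfigParser()
    cfg.read(filename)
    yield cfg
    with open(filename, 'w') as fp:
        cfg.write(fp)


# Create a throwaway setup.cfg-style file to edit.
path = os.path.join(tempfile.mkdtemp(), 'setup.cfg')
with open(path, 'w') as f:
    f.write('[metadata]\nname = example\n')

# Edit in place: change the project name and write it back.
with open_config(path) as cfg:
    cfg.set('metadata', 'name', 'renamed')

with open_config(path) as cfg:
    print(cfg.get('metadata', 'name'))  # -> renamed
```

Because the write happens on exit from the context manager, any change made through the yielded parser lands on disk without an explicit save call.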