pgxnclient-1.2.1/Makefile
# pgxnclient Makefile
#
# Copyright (C) 2011-2012 Daniele Varrazzo
#
# This file is part of the PGXN client
.PHONY: sdist upload docs
PYTHON := python$(PYTHON_VERSION)
PYTHON_VERSION ?= $(shell $(PYTHON) -c 'import sys; print ("%d.%d" % sys.version_info[:2])')
build:
$(PYTHON) setup.py build
check:
$(PYTHON) setup.py test
sdist:
$(PYTHON) setup.py sdist --formats=gztar
upload:
$(PYTHON) setup.py sdist --formats=gztar upload
docs:
$(MAKE) -C docs
clean:
rm -rf build pgxnclient.egg-info
rm -rf simplejson-*.egg mock-*.egg unittest2-*.egg
$(MAKE) -C docs $@
pgxnclient-1.2.1/bin/pgxnclient
#!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/bin/pgxn
#!/usr/bin/env python
"""
pgxnclient -- commands dispatcher
The script dispatches commands based on the name, e.g. upon the command::
    pgxn foo --arg blah ...
a script called pgxn-foo is searched for and executed with the remaining
arguments.
The commands are looked for by default in the dir ``libexec/pgxnclient/``
sibling of the directory containing this script, then are looked for in the
``PATH`` directories.
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
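# A rough sketch (an assumption, not the shipped implementation) of the lookup
# rule described in the docstring above: try ``libexec/pgxnclient/pgxn-<cmd>``
# in the sibling directory of the one containing this script, then fall back to
# the directories in the PATH. The real lookup is performed by
# ``pgxnclient.find_script()``, used by ``command_dispatch()`` below.
def _locate_command_sketch(name):
    import os
    here = os.path.dirname(os.path.abspath(__file__))
    dirs = [os.path.normpath(os.path.join(here, '..', 'libexec', 'pgxnclient'))]
    dirs.extend(os.environ.get('PATH', '').split(os.pathsep))
    for dir in dirs:
        fn = os.path.join(dir, 'pgxn-' + name)
        if os.path.isfile(fn) and os.access(fn, os.X_OK):
            return fn
    return None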
from pgxnclient.cli import command_dispatch
command_dispatch()
pgxnclient-1.2.1/CHANGES
.. _changes:
PGXN Client changes log
-----------------------
pgxnclient 1.2.1
================
- Fixed traceback on error when a dir doesn't contain META.json (ticket #19).
- Handle version numbers both with and without hyphen (ticket #22).
pgxnclient 1.2
==============
- Packages can be downloaded, installed, and loaded by specifying a URL
(ticket #15).
- Added support for ``.tar`` files (ticket #17).
- Use ``gmake`` in favour of ``make`` for platforms where the two are
distinct, such as BSD (ticket #14).
- Added ``--make`` option to select the make executable (ticket #16).
pgxnclient 1.1
==============
- Dropped support for Python 2.4.
- ``sudo`` is not invoked automatically: the ``--sudo`` option must be
specified if the user doesn't have permission to write into PostgreSQL's libdir
(ticket #13). The ``--sudo`` option can also be specified without an argument.
- Make sure the same ``pg_config`` is used both by the current user and by
sudo.
pgxnclient 1.0.3
================
- Can deal with extensions whose ``Makefile`` is created by ``configure``
and with a makefile not in the package root. Patch provided by Hitoshi
Harada (ticket #12).
pgxnclient 1.0.2
================
- Correctly handle PostgreSQL identifiers to be quoted (ticket #10).
- Don't crash with a traceback if some external command is not found
(ticket #11).
pgxnclient 1.0.1
================
- Fixed simplejson dependency on Python 2.6 (ticket #8).
- Added ``pgxn help CMD`` as a synonym for ``pgxn CMD --help`` (ticket #7).
- Fixed a few compatibility problems with Python 3.
pgxnclient 1.0
==============
- Extensions to load/unload from a distribution can be specified on the
command line.
- ``pgxn help --libexec`` returns a single directory, possibly independent
from the client version.
pgxnclient 0.3
==============
- ``pgxn`` script converted into a generic dispatcher in order to allow
additional commands to be implemented in external scripts and in any
language.
- Commands accept extension names too, not only specs.
- Added ``help`` command to get information about program and commands.
pgxnclient 0.2.1
================
- Lowercase search for distributions in the API (issue #3).
- Fixed handling of zip files not containing entries for the directory.
- More informative error messages when some item is not found on PGXN.
pgxnclient 0.2
==============
- Dropped ``list`` command (use ``info --versions`` instead).
- Skip extension load/unload if the provided file is not SQL.
pgxnclient 0.1a4
================
- The spec can point to a local file/directory for install.
- Read the sha1 from the ``META.json`` as it may be different from the one
in the ``dist.json``.
- Run sudo in the installation phase of the install command.
pgxn.client 0.1a3
=================
- Fixed executable mode for scripts unpacked from the zip files.
- Added ``list`` and ``info`` commands.
pgxn.client 0.1a2
=================
- Added database connection parameters for the ``check`` command.
pgxn.client 0.1a1
=================
- First version released on PyPI.
pgxnclient-1.2.1/README.rst
=====================================================================
PGXN Client
=====================================================================
A command line tool to interact with the PostgreSQL Extension Network
=====================================================================
The `PGXN Client <http://pgxnclient.projects.postgresql.org/>`__ is a command
line tool designed to interact with the `PostgreSQL Extension Network
<http://pgxn.org/>`__ allowing searching, compiling, installing, and removing
extensions in PostgreSQL databases.
For example, to install the semver_ extension, the client can be invoked as::

    $ pgxn install semver

which would download and compile the extension for one of the PostgreSQL
servers hosted on the machine, and::

    $ pgxn load -d somedb semver

which would load the extension in one of the databases of the server.
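The same operations can also be driven from Python by calling the ``main()``
entry point, as the project's test suite does (a minimal sketch, assuming a
working PGXN client installation; ``pgxnclient.cli.main`` is not a documented
public API)::

    from pgxnclient.cli import main

    main(['install', 'semver'])               # download, build and install
    main(['load', '-d', 'somedb', 'semver'])  # run CREATE EXTENSION in somedb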
The client interacts with the PGXN web service and a ``Makefile`` provided by
the extension. The best results are achieved with makefiles using the
PostgreSQL `Extension Building Infrastructure`__; however, the client tries to
degrade gracefully in the presence of any package hosted on PGXN.
.. _semver: http://pgxn.org/dist/semver
.. __: http://www.postgresql.org/docs/9.1/static/extend-pgxs.html
- Home page: http://pgxnclient.projects.postgresql.org/
- Downloads: http://pypi.python.org/pypi/pgxnclient/
- Discussion group: http://groups.google.com/group/pgxn-users/
- Source repository: https://github.com/dvarrazzo/pgxnclient/
- PgFoundry project: http://pgfoundry.org/projects/pgxnclient/
Please refer to the files in the ``docs`` directory for instructions about
the program installation and usage.
pgxnclient-1.2.1/COPYING
Copyright (c) 2011-2012, Daniele Varrazzo
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
* The name of Daniele Varrazzo may not be used to endorse or promote
products derived from this software without specific prior written
permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT
OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
pgxnclient-1.2.1/AUTHORS
Who has contributed to the PGXN client?
=======================================
Daniele Varrazzo
He rushed to implement a client before David could do it in Perl!
David Wheeler
He is the PGXN mastermind: a lot of helpful design discussions.
Peter Eisentraut
First implementation of tarball support. Auto-sudo is not a good idea, I
got it.
Hitoshi Harada
Tricky installation corner cases.
Andrey Popp
Make selection. Helped the program not to suck on BSD!
Thank you also to everybody for the useful discussions on the PGXN mailing list,
the bug reports, the proofreading of the docs, and the general support to the project.
pgxnclient-1.2.1/setup.py
#!/usr/bin/env python
"""
pgxnclient -- setup script
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import os
import sys
from setuptools import setup, find_packages
# Grab the version without importing the module
# or we will get import errors on install if prerequisites are still missing
fn = os.path.join(os.path.dirname(__file__), 'pgxnclient', '__init__.py')
f = open(fn)
try:
for line in f:
if line.startswith('__version__ ='):
version = line.split("'")[1]
break
else:
raise ValueError('cannot find __version__ in the pgxnclient module')
finally:
f.close()
# External dependencies, depending on the Python version
requires = []
tests_require = []
if sys.version_info < (2, 5):
raise ValueError("PGXN client requires at least Python 2.5")
elif sys.version_info < (2, 7):
requires.append('simplejson>=2.1')
tests_require.append('mock')
if sys.version_info < (2, 7):
tests_require.append('unittest2')
classifiers = """
Development Status :: 5 - Production/Stable
Environment :: Console
Intended Audience :: Developers
Intended Audience :: System Administrators
License :: OSI Approved :: BSD License
Operating System :: POSIX
Programming Language :: Python :: 2
Programming Language :: Python :: 2.5
Programming Language :: Python :: 2.6
Programming Language :: Python :: 2.7
Programming Language :: Python :: 3
Programming Language :: Python :: 3.1
Programming Language :: Python :: 3.2
Topic :: Database
"""
setup(
name = 'pgxnclient',
description = 'A command line tool to interact with the PostgreSQL Extension Network.',
author = 'Daniele Varrazzo',
author_email = 'daniele.varrazzo@gmail.com',
url = 'http://pgxnclient.projects.postgresql.org/',
license = 'BSD',
packages = find_packages(),
package_data = {'pgxnclient': ['libexec/*']},
entry_points = {'console_scripts': [
'pgxn = pgxnclient.cli:command_dispatch',
'pgxnclient = pgxnclient.cli:script', ]},
test_suite = 'pgxnclient.tests',
classifiers = [x for x in classifiers.split('\n') if x],
zip_safe = False, # because we dynamically look for commands
install_requires = requires,
tests_require = tests_require,
version = version,
use_2to3 = True,
)
pgxnclient-1.2.1/pgxnclient/errors.py
"""
pgxnclient -- package exceptions
These exceptions can be used to signal expected problems and to exit in a
controlled way from the program.
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
class PgxnException(Exception):
"""Base class for the exceptions known in the pgxn package."""
class PgxnClientException(PgxnException):
"""Base class for the exceptions raised by the pgxnclient package."""
class UserAbort(PgxnClientException):
"""The user requested to stop the operation."""
class BadSpecError(PgxnClientException):
"""A bad package specification."""
class ProcessError(PgxnClientException):
"""An error raised calling an external program."""
class InsufficientPrivileges(PgxnClientException):
"""Operation will fail because the user is too lame."""
class NotFound(PgxnException):
"""Something requested by the user not found on PGXN"""
class NetworkError(PgxnClientException):
"""An error from the other side of the wire."""
class BadChecksum(PgxnClientException):
"""A downloaded file is not what expected."""
class ResourceNotFound(NetworkError):
"""Resource not found on the server."""
class BadRequestError(Exception):
"""Bad request from our side.
This exception is a basic one because it should be raised upon an error
on our side.
"""
pgxnclient-1.2.1/pgxnclient/cli.py
"""
pgxnclient -- command line entry point
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import os
import sys
from pgxnclient import find_script
from pgxnclient.i18n import _
from pgxnclient.errors import PgxnException, UserAbort
from pgxnclient.commands import get_option_parser, load_commands, run_command
def main(argv=None):
"""
The program main function.
The function is still relatively self contained: it can be called with
arguments and raises whatever exception, so it's the best entry point
for whole system testing.
"""
if argv is None:
argv = sys.argv[1:]
load_commands()
parser = get_option_parser()
opt = parser.parse_args(argv)
run_command(opt, parser)
def script():
"""
Execute the program as a script.
Set up logging, invoke main() using the user-provided arguments and handle
any exception raised.
"""
# Setup logging
import logging
logging.basicConfig(
format="%(levelname)s: %(message)s",
datefmt="%Y-%m-%d %H:%M:%S",
stream=sys.stdout)
logger = logging.getLogger()
# Dispatch to the command according to the script name
script = sys.argv[0]
args = sys.argv[1:]
if os.path.basename(script).startswith('pgxn-'):
args.insert(0, os.path.basename(script)[5:])
# for help print
sys.argv[0] = os.path.join(os.path.dirname(script), 'pgxn')
# Execute the script
try:
main(args)
# Different ways to fail
except UserAbort, e:
# The user replied "no" to some question
logger.info("%s", e)
sys.exit(1)
except PgxnException, e:
# A regular error from the program
logger.error("%s", e)
sys.exit(1)
except SystemExit, e:
# Usually the arg parser bailing out.
pass
except Exception, e:
logger.error(_("unexpected error: %s - %s"),
e.__class__.__name__, e, exc_info=True)
sys.exit(1)
except BaseException, e:
# ctrl-c
sys.exit(1)
def command_dispatch(argv=None):
"""
Entry point for a script to dispatch commands to external scripts.
Upon invocation of a command ``pgxn cmd --arg``, locate pgxn-cmd and
execute it with --arg arguments.
"""
if argv is None:
argv = sys.argv[1:]
# Assume the first arg after the option is the command to run
for icmd, cmd in enumerate(argv):
if not cmd.startswith('-'):
argv = [_get_exec(cmd)] + argv[:icmd] + argv[icmd+1:]
break
else:
# No command specified: dispatch to the pgxnclient script
# to print basic help, main command etc.
argv = ([os.path.join(os.path.dirname(sys.argv[0]), 'pgxnclient')]
+ argv)
if not os.access(argv[0], os.X_OK):
# This is our friend setuptools' job: the script has lost the
# executable flag. We assume the script is a Python one and run it
# through the current executable.
argv.insert(0, sys.executable)
os.execv(argv[0], argv)
def _get_exec(cmd):
fn = find_script('pgxn-' + cmd)
if not fn:
print >>sys.stderr, \
"pgxn: unknown command: '%s'. See 'pgxn --help'" % cmd
sys.exit(2)
return fn
if __name__ == '__main__':
script()
pgxnclient-1.2.1/pgxnclient/i18n.py
"""
pgxnclient -- internationalization support
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
def gettext(msg):
# TODO: real l10n
return msg
_ = gettext
def N_(msg):
"""Designate a string to be found by gettext but not to be translated."""
return msg
pgxnclient-1.2.1/pgxnclient/archive.py
"""
pgxnclient -- archives handling
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import os
from pgxnclient.i18n import _
from pgxnclient.utils import load_jsons
from pgxnclient.errors import PgxnClientException
def from_spec(spec):
"""Return an `Archive` instance to handle the file requested by *spec*
"""
assert spec.is_file()
return from_file(spec.filename)
def from_file(filename):
"""Return an `Archive` instance to handle the file *filename*
"""
from pgxnclient.zip import ZipArchive
from pgxnclient.tar import TarArchive
for cls in (ZipArchive, TarArchive):
a = cls(filename)
if a.can_open():
return a
raise PgxnClientException(
_("can't open archive '%s': file type not recognized")
% filename)
class Archive(object):
"""Base class to handle archives."""
def __init__(self, filename):
self.filename = filename
def can_open(self):
"""Return `!True` if the `!filename` can be opened by the obect."""
raise NotImplementedError
def open(self):
"""Open the archive for usage.
Raise PgxnClientException if the archive can't be opened.
"""
raise NotImplementedError
def close(self):
"""Close the archive after usage."""
raise NotImplementedError
def list_files(self):
"""Return an iterable with the list of file names in the archive."""
raise NotImplementedError
def read(self, fn):
"""Return a file's data from the archive."""
raise NotImplementedError
def unpack(self, destdir):
raise NotImplementedError
def get_meta(self):
filename = self.filename
self.open()
try:
# Return the first file with the expected name
for fn in self.list_files():
if fn.endswith('META.json'):
return load_jsons(self.read(fn).decode('utf8'))
else:
raise PgxnClientException(
_("file 'META.json' not found in archive '%s'") % filename)
finally:
self.close()
def _find_work_directory(self, destdir):
"""
Choose the directory to work in.
Because we are mostly a wrapper for pgxs, let's look for a makefile.
The archive should contain a single base directory, so return the first
dir we find containing a Makefile; alternatively just return the
unpacked dir.
"""
for dir in os.listdir(destdir):
for fn in ('Makefile', 'makefile', 'GNUmakefile', 'configure'):
if os.path.exists(os.path.join(destdir, dir, fn)):
return os.path.join(destdir, dir)
return destdir
pgxnclient-1.2.1/pgxnclient/tests/test_commands.py
from mock import patch, Mock
import os
import tempfile
import shutil
from urllib import quote
from pgxnclient.utils import b
from pgxnclient.errors import PgxnClientException, ResourceNotFound, InsufficientPrivileges
from pgxnclient.tests import unittest
from pgxnclient.tests.testutils import ifunlink, get_test_filename
class FakeFile(object):
def __init__(self, *args):
self._f = open(*args)
self.url = None
def __enter__(self):
self._f.__enter__()
return self
def __exit__(self, type, value, traceback):
self._f.__exit__(type, value, traceback)
def __getattr__(self, attr):
return getattr(self._f, attr)
def fake_get_file(url, urlmap=None):
if urlmap: url = urlmap.get(url, url)
fn = get_test_filename(quote(url, safe=""))
if not os.path.exists(fn):
raise ResourceNotFound(fn)
f = FakeFile(fn, 'rb')
f.url = url
return f
def fake_pg_config(**map):
def f(what):
return map[what]
return f
class InfoTestCase(unittest.TestCase):
def _get_output(self, cmdline):
@patch('sys.stdout')
@patch('pgxnclient.network.get_file')
def do(mock, stdout):
mock.side_effect = fake_get_file
from pgxnclient.cli import main
main(cmdline)
return u''.join([a[0] for a, k in stdout.write.call_args_list]) \
.encode('ascii')
return do()
def test_info(self):
output = self._get_output(['info', '--versions', 'foobar'])
self.assertEqual(output, b("""\
foobar 0.43.2b1 testing
foobar 0.42.1 stable
foobar 0.42.0 stable
"""))
def test_info_op(self):
output = self._get_output(['info', '--versions', 'foobar>0.42.0'])
self.assertEqual(output, b("""\
foobar 0.43.2b1 testing
foobar 0.42.1 stable
"""))
def test_info_empty(self):
output = self._get_output(['info', '--versions', 'foobar>=0.43.2'])
self.assertEqual(output, b(""))
def test_info_case_insensitive(self):
output = self._get_output(['info', '--versions', 'Foobar'])
self.assertEqual(output, b("""\
foobar 0.43.2b1 testing
foobar 0.42.1 stable
foobar 0.42.0 stable
"""))
def test_mirrors_list(self):
output = self._get_output(['mirror'])
self.assertEqual(output, b("""\
http://pgxn.depesz.com/
http://www.postgres-support.ch/pgxn/
http://pgxn.justatheory.com/
http://pgxn.darkixion.com/
http://mirrors.cat.pdx.edu/pgxn/
http://pgxn.dalibo.org/
http://pgxn.cxsoftware.org/
http://api.pgxn.org/
"""))
def test_mirror_info(self):
output = self._get_output(['mirror', 'http://pgxn.justatheory.com/'])
self.assertEqual(output, b("""\
uri: http://pgxn.justatheory.com/
frequency: daily
location: Portland, OR, USA
bandwidth: Cable
organization: David E. Wheeler
email: justatheory.com|pgxn
timezone: America/Los_Angeles
src: rsync://master.pgxn.org/pgxn/
rsync:
notes:
"""))
class CommandTestCase(unittest.TestCase):
def test_popen_raises(self):
from pgxnclient.commands import Command
c = Command([])
self.assertRaises(PgxnClientException,
c.popen, "this-script-doesnt-exist")
class DownloadTestCase(unittest.TestCase):
@patch('pgxnclient.network.get_file')
def test_download_latest(self, mock):
mock.side_effect = fake_get_file
fn = 'foobar-0.42.1.zip'
self.assert_(not os.path.exists(fn))
from pgxnclient.cli import main
try:
main(['download', 'foobar'])
self.assert_(os.path.exists(fn))
finally:
ifunlink(fn)
@patch('pgxnclient.network.get_file')
def test_download_testing(self, mock):
mock.side_effect = fake_get_file
fn = 'foobar-0.43.2b1.zip'
self.assert_(not os.path.exists(fn))
from pgxnclient.cli import main
try:
main(['download', '--testing', 'foobar'])
self.assert_(os.path.exists(fn))
finally:
ifunlink(fn)
@patch('pgxnclient.network.get_file')
def test_download_url(self, mock):
mock.side_effect = fake_get_file
fn = 'foobar-0.43.2b1.zip'
self.assert_(not os.path.exists(fn))
from pgxnclient.cli import main
try:
main(['download', 'http://api.pgxn.org/dist/foobar/0.43.2b1/foobar-0.43.2b1.zip'])
self.assert_(os.path.exists(fn))
finally:
ifunlink(fn)
@patch('pgxnclient.network.get_file')
def test_download_ext(self, mock):
mock.side_effect = fake_get_file
fn = 'pg_amqp-0.3.0.zip'
self.assert_(not os.path.exists(fn))
from pgxnclient.cli import main
try:
main(['download', 'amqp'])
self.assert_(os.path.exists(fn))
finally:
ifunlink(fn)
@patch('pgxnclient.network.get_file')
def test_download_rename(self, mock):
mock.side_effect = fake_get_file
fn = 'foobar-0.42.1.zip'
fn1= 'foobar-0.42.1-1.zip'
fn2= 'foobar-0.42.1-2.zip'
for tmp in (fn, fn1, fn2):
self.assert_(not os.path.exists(tmp))
try:
f = open(fn, "w")
f.write('test')
f.close()
from pgxnclient.cli import main
main(['download', 'foobar'])
self.assert_(os.path.exists(fn1))
self.assert_(not os.path.exists(fn2))
main(['download', 'foobar'])
self.assert_(os.path.exists(fn2))
f = open(fn)
self.assertEquals(f.read(), 'test')
f.close()
finally:
ifunlink(fn)
ifunlink(fn1)
ifunlink(fn2)
@patch('pgxnclient.network.get_file')
def test_download_bad_sha1(self, mock):
def fakefake(url):
return fake_get_file(url, urlmap = {
'http://api.pgxn.org/dist/foobar/0.42.1/META.json':
'http://api.pgxn.org/dist/foobar/0.42.1/META-badsha1.json'})
mock.side_effect = fakefake
fn = 'foobar-0.42.1.zip'
self.assert_(not os.path.exists(fn))
try:
from pgxnclient.cli import main
from pgxnclient.errors import BadChecksum
e = self.assertRaises(BadChecksum,
main, ['download', 'foobar'])
self.assert_(not os.path.exists(fn))
finally:
ifunlink(fn)
@patch('pgxnclient.network.get_file')
def test_download_case_insensitive(self, mock):
mock.side_effect = fake_get_file
fn = 'pyrseas-0.4.1.zip'
self.assert_(not os.path.exists(fn))
from pgxnclient.cli import main
try:
main(['download', 'pyrseas'])
self.assert_(os.path.exists(fn))
finally:
ifunlink(fn)
try:
main(['download', 'Pyrseas'])
self.assert_(os.path.exists(fn))
finally:
ifunlink(fn)
def test_version(self):
from pgxnclient import Spec
from pgxnclient.commands.install import Download
from pgxnclient.errors import ResourceNotFound
opt = Mock()
opt.status = Spec.STABLE
cmd = Download(opt)
for spec, res, data in [
('foo', '1.2.0', {'stable': [ '1.2.0' ]}),
('foo', '1.2.0', {'stable': [ '1.2.0', '1.2.0b' ]}),
('foo=1.2', '1.2.0', {'stable': [ '1.2.0' ]}),
('foo>=1.1', '1.2.0', {'stable': [ '1.1.0', '1.2.0' ]}),
('foo>=1.1', '1.2.0', {
'stable': [ '1.1.0', '1.2.0' ],
'testing': [ '1.3.0' ],
'unstable': [ '1.4.0' ], }),
]:
spec = Spec.parse(spec)
data = { 'releases':
dict([(k, [{'version': v} for v in vs])
for k, vs in data.items()]) }
self.assertEqual(res, cmd.get_best_version(data, spec))
for spec, res, data in [
('foo>=1.3', '1.2.0', {'stable': [ '1.2.0' ]}),
('foo>=1.3', '1.2.0', {
'stable': [ '1.2.0' ],
'testing': [ '1.3.0' ], }),
]:
spec = Spec.parse(spec)
data = { 'releases':
dict([(k, [{'version': v} for v in vs])
for k, vs in data.items()]) }
self.assertRaises(ResourceNotFound, cmd.get_best_version, data, spec)
opt.status = Spec.TESTING
for spec, res, data in [
('foo>=1.1', '1.3.0', {
'stable': [ '1.1.0', '1.2.0' ],
'testing': [ '1.3.0' ],
'unstable': [ '1.4.0' ], }),
]:
spec = Spec.parse(spec)
data = { 'releases':
dict([(k, [{'version': v} for v in vs])
for k, vs in data.items()]) }
self.assertEqual(res, cmd.get_best_version(data, spec))
opt.status = Spec.UNSTABLE
for spec, res, data in [
('foo>=1.1', '1.4.0', {
'stable': [ '1.1.0', '1.2.0' ],
'testing': [ '1.3.0' ],
'unstable': [ '1.4.0' ], }),
]:
spec = Spec.parse(spec)
data = { 'releases':
dict([(k, [{'version': v} for v in vs])
for k, vs in data.items()]) }
self.assertEqual(res, cmd.get_best_version(data, spec))
class Assertions(object):
make = object()
def assertCallArgs(self, pattern, args):
if len(pattern) != len(args):
self.fail('args and pattern have different lengths')
for p, a in zip(pattern, args):
if p is self.make:
if not a.endswith('make'):
self.fail('%s is not a make in %s' % (a, args))
else:
if not a == p:
self.fail('%s is not a %s in %s' % (a, p, args))
# With mock, patching a method seems tricky: it looks like there's no way to
# get to 'self' as the mock method is unbound.
from pgxnclient.tar import TarArchive
TarArchive.unpack_orig = TarArchive.unpack
from pgxnclient.zip import ZipArchive
ZipArchive.unpack_orig = ZipArchive.unpack
class InstallTestCase(unittest.TestCase, Assertions):
def setUp(self):
self._p1 = patch('pgxnclient.network.get_file')
self.mock_get = self._p1.start()
self.mock_get.side_effect = fake_get_file
self._p2 = patch('pgxnclient.commands.Popen')
self.mock_popen = self._p2.start()
self.mock_popen.return_value.returncode = 0
self._p3 = patch('pgxnclient.commands.WithPgConfig.call_pg_config')
self.mock_pgconfig = self._p3.start()
self.mock_pgconfig.side_effect = fake_pg_config(
libdir='/', bindir='/')
def tearDown(self):
self._p1.stop()
self._p2.stop()
self._p3.stop()
def test_install_latest(self):
from pgxnclient.cli import main
main(['install', '--sudo', '--', 'foobar'])
self.assertEquals(self.mock_popen.call_count, 2)
self.assertCallArgs([self.make], self.mock_popen.call_args_list[0][0][0][:1])
self.assertCallArgs(['sudo', self.make], self.mock_popen.call_args_list[1][0][0][:2])
def test_install_missing_sudo(self):
from pgxnclient.cli import main
self.assertRaises(InsufficientPrivileges, main, ['install', 'foobar'])
def test_install_local(self):
self.mock_pgconfig.side_effect = fake_pg_config(
libdir=os.environ['HOME'], bindir='/')
from pgxnclient.cli import main
main(['install', 'foobar'])
self.assertEquals(self.mock_popen.call_count, 2)
self.assertCallArgs([self.make], self.mock_popen.call_args_list[0][0][0][:1])
self.assertCallArgs([self.make], self.mock_popen.call_args_list[1][0][0][:1])
def test_install_url(self):
self.mock_pgconfig.side_effect = fake_pg_config(
libdir=os.environ['HOME'], bindir='/')
from pgxnclient.cli import main
main(['install', 'http://api.pgxn.org/dist/foobar/0.42.1/foobar-0.42.1.zip'])
self.assertEquals(self.mock_popen.call_count, 2)
self.assertCallArgs([self.make], self.mock_popen.call_args_list[0][0][0][:1])
self.assertCallArgs([self.make], self.mock_popen.call_args_list[1][0][0][:1])
def test_install_fails(self):
self.mock_popen.return_value.returncode = 1
self.mock_pgconfig.side_effect = fake_pg_config(
libdir=os.environ['HOME'], bindir='/')
from pgxnclient.cli import main
self.assertRaises(PgxnClientException, main, ['install', 'foobar'])
self.assertEquals(self.mock_popen.call_count, 1)
def test_install_bad_sha1(self):
def fakefake(url):
return fake_get_file(url, urlmap = {
'http://api.pgxn.org/dist/foobar/0.42.1/META.json':
'http://api.pgxn.org/dist/foobar/0.42.1/META-badsha1.json'})
self.mock_get.side_effect = fakefake
from pgxnclient.cli import main
from pgxnclient.errors import BadChecksum
self.assertRaises(BadChecksum,
main, ['install', '--sudo', '--', 'foobar'])
def test_install_nosudo(self):
self.mock_pgconfig.side_effect = fake_pg_config(libdir=os.environ['HOME'])
from pgxnclient.cli import main
main(['install', '--nosudo', 'foobar'])
self.assertEquals(self.mock_popen.call_count, 2)
self.assertCallArgs([self.make], self.mock_popen.call_args_list[0][0][0][:1])
self.assertCallArgs([self.make], self.mock_popen.call_args_list[1][0][0][:1])
def test_install_sudo(self):
from pgxnclient.cli import main
main(['install', '--sudo', 'gksudo -d "hello world"', 'foobar'])
self.assertEquals(self.mock_popen.call_count, 2)
self.assertCallArgs([self.make],
self.mock_popen.call_args_list[0][0][0][:1])
self.assertCallArgs(['gksudo', '-d', 'hello world', self.make],
self.mock_popen.call_args_list[1][0][0][:4])
@patch('pgxnclient.tar.TarArchive.unpack')
def test_install_local_tar(self, mock_unpack):
fn = get_test_filename('foobar-0.42.1.tar.gz')
mock_unpack.side_effect = TarArchive(fn).unpack_orig
from pgxnclient.cli import main
main(['install', '--sudo', '--', fn])
self.assertEquals(self.mock_popen.call_count, 2)
self.assertCallArgs([self.make], self.mock_popen.call_args_list[0][0][0][:1])
self.assertCallArgs(['sudo', self.make],
self.mock_popen.call_args_list[1][0][0][:2])
make_cwd = self.mock_popen.call_args_list[1][1]['cwd']
self.assertEquals(mock_unpack.call_count, 1)
tmpdir, = mock_unpack.call_args[0]
self.assertEqual(make_cwd, os.path.join(tmpdir, 'foobar-0.42.1'))
@patch('pgxnclient.zip.ZipArchive.unpack')
def test_install_local_zip(self, mock_unpack):
fn = get_test_filename('foobar-0.42.1.zip')
mock_unpack.side_effect = ZipArchive(fn).unpack_orig
from pgxnclient.cli import main
main(['install', '--sudo', '--', fn])
self.assertEquals(self.mock_popen.call_count, 2)
self.assertCallArgs([self.make], self.mock_popen.call_args_list[0][0][0][:1])
self.assertCallArgs(['sudo', self.make],
self.mock_popen.call_args_list[1][0][0][:2])
make_cwd = self.mock_popen.call_args_list[1][1]['cwd']
self.assertEquals(mock_unpack.call_count, 1)
tmpdir, = mock_unpack.call_args[0]
self.assertEqual(make_cwd, os.path.join(tmpdir, 'foobar-0.42.1'))
def test_install_url_file(self):
fn = get_test_filename('foobar-0.42.1.zip')
url = 'file://' + os.path.abspath(fn).replace("f", '%%%2x' % ord('f'))
from pgxnclient.cli import main
main(['install', '--sudo', '--', url])
self.assertEquals(self.mock_popen.call_count, 2)
self.assertCallArgs([self.make], self.mock_popen.call_args_list[0][0][0][:1])
self.assertCallArgs(['sudo', self.make],
self.mock_popen.call_args_list[1][0][0][:2])
def test_install_local_dir(self):
self.mock_get.side_effect = lambda *args: self.fail('network invoked')
tdir = tempfile.mkdtemp()
try:
from pgxnclient.zip import unpack
dir = unpack(get_test_filename('foobar-0.42.1.zip'), tdir)
from pgxnclient.cli import main
main(['install', '--sudo', '--', dir])
finally:
shutil.rmtree(tdir)
self.assertEquals(self.mock_popen.call_count, 2)
self.assertCallArgs([self.make], self.mock_popen.call_args_list[0][0][0][:1])
self.assertCallArgs(dir, self.mock_popen.call_args_list[0][1]['cwd'])
self.assertCallArgs(['sudo', self.make],
self.mock_popen.call_args_list[1][0][0][:2])
self.assertEquals(dir, self.mock_popen.call_args_list[1][1]['cwd'])
class CheckTestCase(unittest.TestCase, Assertions):
def setUp(self):
self._p1 = patch('pgxnclient.network.get_file')
self.mock_get = self._p1.start()
self.mock_get.side_effect = fake_get_file
self._p2 = patch('pgxnclient.commands.Popen')
self.mock_popen = self._p2.start()
self.mock_popen.return_value.returncode = 0
self._p3 = patch('pgxnclient.commands.WithPgConfig.call_pg_config')
self.mock_pgconfig = self._p3.start()
self.mock_pgconfig.side_effect = fake_pg_config(
libdir='/', bindir='/')
def tearDown(self):
self._p1.stop()
self._p2.stop()
self._p3.stop()
def test_check_latest(self):
from pgxnclient.cli import main
main(['check', 'foobar'])
self.assertEquals(self.mock_popen.call_count, 1)
self.assertCallArgs([self.make], self.mock_popen.call_args_list[0][0][0][:1])
def test_check_url(self):
from pgxnclient.cli import main
main(['check', 'http://api.pgxn.org/dist/foobar/0.42.1/foobar-0.42.1.zip'])
self.assertEquals(self.mock_popen.call_count, 1)
self.assertCallArgs([self.make], self.mock_popen.call_args_list[0][0][0][:1])
def test_check_fails(self):
self.mock_popen.return_value.returncode = 1
from pgxnclient.cli import main
self.assertRaises(PgxnClientException, main, ['check', 'foobar'])
self.assertEquals(self.mock_popen.call_count, 1)
def test_check_diff_moved(self):
def create_regression_files(*args, **kwargs):
cwd = kwargs['cwd']
open(os.path.join(cwd, 'regression.out'), 'w').close()
open(os.path.join(cwd, 'regression.diffs'), 'w').close()
return Mock()
self.mock_popen.side_effect = create_regression_files
self.mock_popen.return_value.returncode = 1
self.assert_(not os.path.exists('regression.out'),
"Please remove temp file 'regression.out' from current dir")
self.assert_(not os.path.exists('regression.diffs'),
"Please remove temp file 'regression.diffs' from current dir")
from pgxnclient.cli import main
try:
self.assertRaises(PgxnClientException, main, ['check', 'foobar'])
self.assertEquals(self.mock_popen.call_count, 1)
self.assert_(os.path.exists('regression.out'))
self.assert_(os.path.exists('regression.diffs'))
finally:
ifunlink('regression.out')
ifunlink('regression.diffs')
def test_check_bad_sha1(self):
def fakefake(url):
return fake_get_file(url, urlmap = {
'http://api.pgxn.org/dist/foobar/0.42.1/META.json':
'http://api.pgxn.org/dist/foobar/0.42.1/META-badsha1.json'})
self.mock_get.side_effect = fakefake
self.mock_popen.return_value.returncode = 1
from pgxnclient.cli import main
from pgxnclient.errors import BadChecksum
self.assertRaises(BadChecksum, main, ['check', 'foobar'])
self.assertEquals(self.mock_popen.call_count, 0)
class LoadTestCase(unittest.TestCase):
def setUp(self):
self._p1 = patch('pgxnclient.commands.Popen')
self.mock_popen = self._p1.start()
self.mock_popen.return_value.returncode = 0
self.mock_popen.return_value.communicate.return_value = (b(''), b(''))
self._p2 = patch('pgxnclient.commands.install.LoadUnload.is_extension')
self.mock_isext = self._p2.start()
self.mock_isext.return_value = True
self._p3 = patch('pgxnclient.commands.install.LoadUnload.get_pg_version')
self.mock_pgver = self._p3.start()
self.mock_pgver.return_value = (9,1,0)
def tearDown(self):
self._p1.stop()
self._p2.stop()
self._p3.stop()
def test_parse_version(self):
from pgxnclient.commands.install import Load
cmd = Load(None)
self.assertEquals((9,0,3), cmd.parse_pg_version(
'PostgreSQL 9.0.3 on i686-pc-linux-gnu, compiled by GCC'
' gcc-4.4.real (Ubuntu/Linaro 4.4.4-14ubuntu5) 4.4.5, 32-bit'))
self.assertEquals((9,1,0), cmd.parse_pg_version(
'PostgreSQL 9.1alpha5 on i686-pc-linux-gnu, compiled by GCC gcc'
' (Ubuntu/Linaro 4.4.4-14ubuntu5) 4.4.5, 32-bit '))
@patch('pgxnclient.network.get_file')
def test_check_psql_options(self, mock_get):
mock_get.side_effect = fake_get_file
from pgxnclient.cli import main
main(['load', '--yes', '--dbname', 'dbdb', 'foobar'])
args = self.mock_popen.call_args[0][0]
self.assertEqual('dbdb', args[args.index('--dbname') + 1])
main(['load', '--yes', '-U', 'meme', 'foobar'])
args = self.mock_popen.call_args[0][0]
self.assertEqual('meme', args[args.index('--username') + 1])
main(['load', '--yes', '--port', '666', 'foobar'])
args = self.mock_popen.call_args[0][0]
self.assertEqual('666', args[args.index('--port') + 1])
main(['load', '--yes', '-h', 'somewhere', 'foobar'])
args = self.mock_popen.call_args[0][0]
self.assertEqual('somewhere', args[args.index('--host') + 1])
@patch('pgxnclient.zip.ZipArchive.unpack')
@patch('pgxnclient.network.get_file')
def test_load_local_zip(self, mock_get, mock_unpack):
mock_get.side_effect = lambda *args: self.fail('network invoked')
mock_unpack.side_effect = ZipArchive.unpack_orig
from pgxnclient.cli import main
main(['load', '--yes', get_test_filename('foobar-0.42.1.zip')])
self.assertEquals(mock_unpack.call_count, 0)
self.assertEquals(self.mock_popen.call_count, 1)
self.assert_('psql' in self.mock_popen.call_args[0][0][0])
communicate = self.mock_popen.return_value.communicate
self.assertEquals(communicate.call_args[0][0],
b('CREATE EXTENSION foobar;'))
@patch('pgxnclient.tar.TarArchive.unpack')
@patch('pgxnclient.network.get_file')
def test_load_local_tar(self, mock_get, mock_unpack):
mock_get.side_effect = lambda *args: self.fail('network invoked')
mock_unpack.side_effect = TarArchive.unpack_orig
from pgxnclient.cli import main
main(['load', '--yes', get_test_filename('foobar-0.42.1.tar.gz')])
self.assertEquals(mock_unpack.call_count, 0)
self.assertEquals(self.mock_popen.call_count, 1)
self.assert_('psql' in self.mock_popen.call_args[0][0][0])
communicate = self.mock_popen.return_value.communicate
self.assertEquals(communicate.call_args[0][0],
b('CREATE EXTENSION foobar;'))
@patch('pgxnclient.network.get_file')
def test_load_local_dir(self, mock_get):
mock_get.side_effect = lambda *args: self.fail('network invoked')
tdir = tempfile.mkdtemp()
try:
from pgxnclient.zip import unpack
dir = unpack(get_test_filename('foobar-0.42.1.zip'), tdir)
from pgxnclient.cli import main
main(['load', '--yes', dir])
finally:
shutil.rmtree(tdir)
self.assertEquals(self.mock_popen.call_count, 1)
self.assert_('psql' in self.mock_popen.call_args[0][0][0])
communicate = self.mock_popen.return_value.communicate
self.assertEquals(communicate.call_args[0][0],
b('CREATE EXTENSION foobar;'))
@patch('pgxnclient.zip.ZipArchive.unpack')
@patch('pgxnclient.network.get_file')
def test_load_zip_url(self, mock_get, mock_unpack):
mock_get.side_effect = fake_get_file
mock_unpack.side_effect = ZipArchive.unpack_orig
from pgxnclient.cli import main
main(['load', '--yes',
'http://api.pgxn.org/dist/foobar/0.42.1/foobar-0.42.1.zip'])
self.assertEquals(mock_unpack.call_count, 0)
self.assertEquals(self.mock_popen.call_count, 1)
self.assert_('psql' in self.mock_popen.call_args[0][0][0])
communicate = self.mock_popen.return_value.communicate
self.assertEquals(communicate.call_args[0][0],
b('CREATE EXTENSION foobar;'))
@patch('pgxnclient.tar.TarArchive.unpack')
@patch('pgxnclient.network.get_file')
def test_load_tar_url(self, mock_get, mock_unpack):
mock_get.side_effect = fake_get_file
mock_unpack.side_effect = TarArchive.unpack_orig
from pgxnclient.cli import main
main(['load', '--yes',
'http://example.org/foobar-0.42.1.tar.gz'])
self.assertEquals(mock_unpack.call_count, 0)
self.assertEquals(self.mock_popen.call_count, 1)
self.assert_('psql' in self.mock_popen.call_args[0][0][0])
communicate = self.mock_popen.return_value.communicate
self.assertEquals(communicate.call_args[0][0],
b('CREATE EXTENSION foobar;'))
def test_load_extensions_order(self):
tdir = tempfile.mkdtemp()
try:
from pgxnclient.zip import unpack
dir = unpack(get_test_filename('foobar-0.42.1.zip'), tdir)
shutil.copyfile(
get_test_filename('META-manyext.json'),
os.path.join(dir, 'META.json'))
from pgxnclient.cli import main
main(['load', '--yes', dir])
finally:
shutil.rmtree(tdir)
self.assertEquals(self.mock_popen.call_count, 4)
self.assert_('psql' in self.mock_popen.call_args[0][0][0])
communicate = self.mock_popen.return_value.communicate
self.assertEquals(communicate.call_args_list[0][0][0],
b('CREATE EXTENSION foo;'))
self.assertEquals(communicate.call_args_list[1][0][0],
b('CREATE EXTENSION bar;'))
self.assertEquals(communicate.call_args_list[2][0][0],
b('CREATE EXTENSION baz;'))
self.assertEquals(communicate.call_args_list[3][0][0],
b('CREATE EXTENSION qux;'))
def test_unload_extensions_order(self):
tdir = tempfile.mkdtemp()
try:
from pgxnclient.zip import unpack
dir = unpack(get_test_filename('foobar-0.42.1.zip'), tdir)
shutil.copyfile(
get_test_filename('META-manyext.json'),
os.path.join(dir, 'META.json'))
from pgxnclient.cli import main
main(['unload', '--yes', dir])
finally:
shutil.rmtree(tdir)
self.assertEquals(self.mock_popen.call_count, 4)
self.assert_('psql' in self.mock_popen.call_args[0][0][0])
communicate = self.mock_popen.return_value.communicate
self.assertEquals(communicate.call_args_list[0][0][0],
b('DROP EXTENSION qux;'))
self.assertEquals(communicate.call_args_list[1][0][0],
b('DROP EXTENSION baz;'))
self.assertEquals(communicate.call_args_list[2][0][0],
b('DROP EXTENSION bar;'))
self.assertEquals(communicate.call_args_list[3][0][0],
b('DROP EXTENSION foo;'))
def test_load_list(self):
tdir = tempfile.mkdtemp()
try:
from pgxnclient.zip import unpack
dir = unpack(get_test_filename('foobar-0.42.1.zip'), tdir)
shutil.copyfile(
get_test_filename('META-manyext.json'),
os.path.join(dir, 'META.json'))
from pgxnclient.cli import main
main(['load', '--yes', dir, 'baz', 'foo'])
finally:
shutil.rmtree(tdir)
self.assertEquals(self.mock_popen.call_count, 2)
self.assert_('psql' in self.mock_popen.call_args[0][0][0])
communicate = self.mock_popen.return_value.communicate
self.assertEquals(communicate.call_args_list[0][0][0],
b('CREATE EXTENSION baz;'))
self.assertEquals(communicate.call_args_list[1][0][0],
b('CREATE EXTENSION foo;'))
def test_unload_list(self):
tdir = tempfile.mkdtemp()
try:
from pgxnclient.zip import unpack
dir = unpack(get_test_filename('foobar-0.42.1.zip'), tdir)
shutil.copyfile(
get_test_filename('META-manyext.json'),
os.path.join(dir, 'META.json'))
from pgxnclient.cli import main
main(['unload', '--yes', dir, 'baz', 'foo'])
finally:
shutil.rmtree(tdir)
self.assertEquals(self.mock_popen.call_count, 2)
self.assert_('psql' in self.mock_popen.call_args[0][0][0])
communicate = self.mock_popen.return_value.communicate
self.assertEquals(communicate.call_args_list[0][0][0],
b('DROP EXTENSION baz;'))
self.assertEquals(communicate.call_args_list[1][0][0],
b('DROP EXTENSION foo;'))
def test_load_missing(self):
tdir = tempfile.mkdtemp()
try:
from pgxnclient.zip import unpack
dir = unpack(get_test_filename('foobar-0.42.1.zip'), tdir)
shutil.copyfile(
get_test_filename('META-manyext.json'),
os.path.join(dir, 'META.json'))
from pgxnclient.cli import main
self.assertRaises(PgxnClientException, main,
['load', '--yes', dir, 'foo', 'ach'])
finally:
shutil.rmtree(tdir)
self.assertEquals(self.mock_popen.call_count, 0)
def test_unload_missing(self):
tdir = tempfile.mkdtemp()
try:
from pgxnclient.zip import unpack
dir = unpack(get_test_filename('foobar-0.42.1.zip'), tdir)
shutil.copyfile(
get_test_filename('META-manyext.json'),
os.path.join(dir, 'META.json'))
from pgxnclient.cli import main
self.assertRaises(PgxnClientException, main,
['unload', '--yes', dir, 'foo', 'ach'])
finally:
shutil.rmtree(tdir)
self.assertEquals(self.mock_popen.call_count, 0)
def test_missing_meta_dir(self):
# issue #19
tdir = tempfile.mkdtemp()
try:
from pgxnclient.zip import unpack
dir = unpack(get_test_filename('foobar-0.42.1.zip'), tdir)
os.unlink(os.path.join(dir, 'META.json'))
from pgxnclient.cli import main
self.assertRaises(PgxnClientException, main, ['load', dir])
finally:
shutil.rmtree(tdir)
class SearchTestCase(unittest.TestCase):
@patch('sys.stdout')
@patch('pgxnclient.network.get_file')
def test_search_quoting(self, mock_get, stdout):
mock_get.side_effect = fake_get_file
from pgxnclient.cli import main
main(['search', '--docs', 'foo bar', 'baz'])
if __name__ == '__main__':
unittest.main()
pgxnclient-1.2.1/pgxnclient/tests/__init__.py
"""
pgxnclient -- test suite package
The test suite can be run via setup.py test. But you better use "make check"
in order to correctly set up the pythonpath.
The test suite relies on the files in the 'testdata' dir, which are currently
not installed but only available in the sdist.
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import sys
# import here the unit test module we want to use
if sys.version_info < (2,7):
import unittest2 as unittest
else:
import unittest
# fix the unittest maintainers' stubbornness: see Python issue #9424
if unittest.TestCase.assert_ is not unittest.TestCase.assertTrue:
# Vaffanculo, Wolf
unittest.TestCase.assert_ = unittest.TestCase.assertTrue
unittest.TestCase.assertEquals = unittest.TestCase.assertEqual
if __name__ == '__main__':
unittest.main()
pgxnclient-1.2.1/pgxnclient/tests/test_label.py
from pgxnclient.tests import unittest
from pgxnclient import Label, Term, Identifier
class LabelTestCase(unittest.TestCase):
def test_ok(self):
for s in [
'd',
'a1234',
'abcd1234-5432XYZ',
'a12345678901234567890123456789012345678901234567890123456789012',]:
self.assertEqual(Label(s), s)
self.assertEqual(Label(s), Label(s))
self.assert_(Label(s) <= Label(s))
self.assert_(Label(s) >= Label(s))
def test_bad(self):
def ar(s):
try: Label(s)
except ValueError: pass
else: self.fail("ValueError not raised: '%s'" % s)
for s in [
'',
' a',
'a ',
'1a',
'-a',
'a-',
'a_b',
'a123456789012345678901234567890123456789012345678901234567890123',]:
ar(s)
def test_compare(self):
self.assertEqual(Label('a'), Label('A'))
self.assertNotEqual(str(Label('a')), str(Label('A'))) # preserving
def test_order(self):
self.assert_(Label('a') < Label('B') < Label('c'))
self.assert_(Label('A') < Label('b') < Label('C'))
self.assert_(Label('a') <= Label('B') <= Label('c'))
self.assert_(Label('A') <= Label('b') <= Label('C'))
self.assert_(Label('c') > Label('B') > Label('a'))
self.assert_(Label('C') > Label('b') > Label('A'))
self.assert_(Label('c') >= Label('B') >= Label('a'))
self.assert_(Label('C') >= Label('b') >= Label('A'))
class TermTestCase(unittest.TestCase):
def test_ok(self):
for s in [
'aa',
'adfkjh"()', ]:
self.assertEqual(Term(s), s)
self.assertEqual(Term(s), Term(s))
self.assert_(Term(s) <= Term(s))
self.assert_(Term(s) >= Term(s))
def test_bad(self):
def ar(s):
try: Term(s)
except ValueError: pass
else: self.fail("ValueError not raised: '%s'" % s)
for s in [
'a',
'aa ',
'a/a',
'a\\a',
'a\ta',
'aa\x01' ]:
ar(s)
class TestIdentifier(unittest.TestCase):
def test_nonblank(self):
self.assertRaises(ValueError, Identifier, "")
def test_unquoted(self):
for s in [
'x',
'xxxxx',
'abcxyz_0189',
'ABCXYZ_0189', ]:
self.assertEqual(Identifier(s), s)
def test_quoted(self):
for s, q in [
('x-y', '"x-y"'),
(' ', '" "'),
('x"y', '"x""y"')]:
self.assertEqual(Identifier(s), q)
if __name__ == '__main__':
unittest.main()
pgxnclient-1.2.1/pgxnclient/tests/testutils.py
"""
pgxnclient -- unit test utilities
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import os
def ifunlink(fn):
"""Delete a file if exists."""
if os.path.exists(fn):
os.unlink(fn)
_testdata_dir = None
def get_test_filename(*parts):
"""Return the complete file name for a testing file.
Note: The unit test is currently a pgxnclient sub-package: this is
required to have it converted to Python 3. However this results in the
subpackage being installed together with the main package. I don't mind
that (well, I do, but I don't think I can do anything else), but I don't
want the crap of the test data files added to the package too. So,
the test files are found wherever they are stored in any parent directory of
this module, which is ok for just about any development setup.
"""
global _testdata_dir
if _testdata_dir is None:
_testdata_dir = os.path.dirname(__file__)
while not os.path.isdir(os.path.join(_testdata_dir, 'testdata')):
tmp = os.path.dirname(_testdata_dir)
if not tmp or tmp == _testdata_dir:
raise ValueError("'testdata' directory not found")
_testdata_dir = tmp
return os.path.join(_testdata_dir, 'testdata', *parts)
pgxnclient-1.2.1/pgxnclient/tests/test_semver.py
from pgxnclient.tests import unittest
from pgxnclient import SemVer
# Tests based on
# https://github.com/theory/pg-semver/blob/master/test/sql/base.sql
class SemVerTestCase(unittest.TestCase):
def test_ok(self):
for s in [
'1.2.2',
'0.2.2',
'1.2.2',
'0.0.0',
'0.1.999',
'9999.9999999.823823',
'1.0.0beta1', # no more valid according to semver
'1.0.0beta2', # no more valid according to semver
'1.0.0-beta1',
'1.0.0-beta2',
'1.0.0',
'20110204.0.0', ]:
self.assertEqual(SemVer(s), s)
def test_bad(self):
def ar(s):
try: SemVer(s)
except ValueError: pass
else: self.fail("ValueError not raised: '%s'" % s)
for s in [
'1.2',
'1.2.02',
'1.2.2-',
'1.2.3b#5',
'03.3.3',
'v1.2.2',
'1.3b',
'1.4b.0',
'1v',
'1v.2.2v',
'1.2.4b.5', ]:
ar(s)
def test_eq(self):
for s1, s2 in [
('1.2.2', '1.2.2'),
('1.2.23', '1.2.23'),
('0.0.0', '0.0.0'),
('999.888.7777', '999.888.7777'),
('0.1.2-beta3', '0.1.2-beta3'),
('1.0.0-rc-1', '1.0.0-RC-1'), ]:
self.assertEqual(SemVer(s1), SemVer(s2))
self.assertEqual(hash(SemVer(s1)), hash(SemVer(s2)))
self.assert_(SemVer(s1) <= SemVer(s2),
"%s <= %s failed" % (s1, s2))
self.assert_(SemVer(s1) >= SemVer(s2),
"%s >= %s failed" % (s1, s2))
def test_ne(self):
for s1, s2 in [
('1.2.2', '1.2.3'),
('0.0.1', '1.0.0'),
('1.0.1', '1.1.0'),
('1.1.1', '1.1.0'),
('1.2.3-b', '1.2.3'),
('1.2.3', '1.2.3-b'),
('1.2.3-a', '1.2.3-b'),
('1.2.3-aaaaaaa1', '1.2.3-aaaaaaa2'), ]:
self.assertNotEqual(SemVer(s1), SemVer(s2))
self.assertNotEqual(hash(SemVer(s1)), hash(SemVer(s2)))
def test_dis(self):
for s1, s2 in [
('2.2.2', '1.1.1'),
('2.2.2', '2.1.1'),
('2.2.2', '2.2.1'),
('2.2.2-b', '2.2.1'),
('2.2.2', '2.2.2-b'),
('2.2.2-c', '2.2.2-b'),
('2.2.2-rc-2', '2.2.2-RC-1'),
('0.9.10', '0.9.9'), ]:
self.assert_(SemVer(s1) >= SemVer(s2),
"%s >= %s failed" % (s1, s2))
self.assert_(SemVer(s1) > SemVer(s2),
"%s > %s failed" % (s1, s2))
self.assert_(SemVer(s2) <= SemVer(s1),
"%s <= %s failed" % (s2, s1))
self.assert_(SemVer(s2) < SemVer(s1),
"%s < %s failed" % (s2, s1))
def test_clean(self):
for s1, s2 in [
('1.2.2', '1.2.2'),
('01.2.2', '1.2.2'),
('1.02.2', '1.2.2'),
('1.2.02', '1.2.2'),
('1.2.02b', '1.2.2-b'),
('1.2.02beta-3 ', '1.2.2-beta-3'),
('1.02.02rc1', '1.2.2-rc1'),
('1.0', '1.0.0'),
('1', '1.0.0'),
('.0.02', '0.0.2'),
('1..02', '1.0.2'),
('1..', '1.0.0'),
('1.1', '1.1.0'),
('1.2.b1', '1.2.0-b1'),
('9.0beta4', '9.0.0-beta4'), # PostgreSQL format.
('9b', '9.0.0-b'),
('rc1', '0.0.0-rc1'),
('', '0.0.0'),
('..2', '0.0.2'),
('1.2.3 a', '1.2.3-a'),
('..2 b', '0.0.2-b'),
(' 012.2.2', '12.2.2'),
('20110204', '20110204.0.0'), ]:
try:
self.assertEqual(SemVer.clean(s1), SemVer(s2))
except:
print s1, s2
raise
def test_cant_clean(self):
def ar(s):
try: SemVer.clean(s)
except ValueError: pass
else: self.fail("ValueError not raised: '%s'" % s)
for s in [
'1.2.0 beta 4',
'1.2.2-',
'1.2.3b#5',
'v1.2.2',
'1.4b.0',
'1v.2.2v',
'1.2.4b.5',
'1.2.3.4',
'1.2.3 4',
'1.2000000000000000.3.4',]:
ar(s)
if __name__ == '__main__':
unittest.main()
pgxnclient-1.2.1/pgxnclient/tests/test_archives.py
from pgxnclient import tar
from pgxnclient import zip
from pgxnclient import archive
from pgxnclient.tests import unittest
from pgxnclient.errors import PgxnClientException
from pgxnclient.tests.testutils import get_test_filename
class TestArchive(unittest.TestCase):
def test_from_file_zip(self):
fn = get_test_filename('foobar-0.42.1.zip')
a = archive.from_file(fn)
self.assert_(isinstance(a, zip.ZipArchive))
self.assertEqual(a.filename, fn)
def test_from_file_tar(self):
fn = get_test_filename('foobar-0.42.1.tar.gz')
a = archive.from_file(fn)
self.assert_(isinstance(a, tar.TarArchive))
self.assertEqual(a.filename, fn)
def test_from_file_unknown(self):
fn = get_test_filename('META-manyext.json')
self.assertRaises(PgxnClientException, archive.from_file, fn)
class TestZipArchive(unittest.TestCase):
def test_can_open(self):
fn = get_test_filename('foobar-0.42.1.zip')
a = zip.ZipArchive(fn)
self.assert_(a.can_open())
a.open()
a.close()
def test_can_open_noext(self):
fn = get_test_filename('zip.ext')
a = zip.ZipArchive(fn)
self.assert_(a.can_open())
a.open()
a.close()
def test_cannot_open(self):
fn = get_test_filename('foobar-0.42.1.tar.gz')
a = zip.ZipArchive(fn)
self.assert_(not a.can_open())
class TestTarArchive(unittest.TestCase):
def test_can_open(self):
fn = get_test_filename('foobar-0.42.1.tar.gz')
a = tar.TarArchive(fn)
self.assert_(a.can_open())
a.open()
a.close()
def test_can_open_noext(self):
fn = get_test_filename('tar.ext')
a = tar.TarArchive(fn)
self.assert_(a.can_open())
a.open()
a.close()
def test_cannot_open(self):
fn = get_test_filename('foobar-0.42.1.zip')
a = tar.TarArchive(fn)
self.assert_(not a.can_open())
pgxnclient-1.2.1/pgxnclient/tests/test_spec.py
from pgxnclient.tests import unittest
from pgxnclient import Spec
class SpecTestCase(unittest.TestCase):
def test_str(self):
self.assertEqual(
str(Spec('foo')), 'foo')
self.assertEqual(
str(Spec('foo>2.0')), 'foo>2.0')
self.assertEqual(
str(Spec('foo>2.0')), 'foo>2.0')
self.assertEqual(
str(Spec(dirname='/foo')), '/foo')
self.assertEqual(
str(Spec(dirname='/foo/foo.zip')), '/foo/foo.zip')
if __name__ == '__main__':
unittest.main()
pgxnclient-1.2.1/pgxnclient/commands/info.py
"""
pgxnclient -- informative commands implementation
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.i18n import _, N_
from pgxnclient import SemVer
from pgxnclient.errors import NotFound, ResourceNotFound
from pgxnclient.commands import Command, WithSpec
import logging
logger = logging.getLogger('pgxnclient.commands')
class Mirror(Command):
name = 'mirror'
description = N_("return information about the available mirrors")
@classmethod
def customize_parser(self, parser, subparsers, **kwargs):
subp = super(Mirror, self).customize_parser(
parser, subparsers, **kwargs)
subp.add_argument('uri', nargs='?', metavar="URI",
help = _("return detailed info about this mirror."
" If not specified return a list of mirror URIs"))
subp.add_argument('--detailed', action="store_true",
help = _("return full details for each mirror"))
return subp
def run(self):
data = self.api.mirrors()
if self.opts.uri:
detailed = True
data = [ d for d in data if d['uri'] == self.opts.uri ]
if not data:
raise ResourceNotFound(
_('mirror not found: %s') % self.opts.uri)
else:
detailed = self.opts.detailed
for i, d in enumerate(data):
if not detailed:
print d['uri']
else:
for k in [
"uri", "frequency", "location", "bandwidth", "organization",
"email", "timezone", "src", "rsync", "notes",]:
print "%s: %s" % (k, d.get(k, ''))
print
import re
import textwrap
import xml.sax.saxutils as saxutils
class Search(Command):
name = 'search'
description = N_("search in the available extensions")
@classmethod
def customize_parser(self, parser, subparsers, **kwargs):
subp = super(Search, self).customize_parser(
parser, subparsers, **kwargs)
g = subp.add_mutually_exclusive_group()
g.add_argument('--docs', dest='where', action='store_const',
const='docs', default='docs',
help=_("search in documentation [default]"))
g.add_argument('--dist', dest='where', action='store_const',
const="dists",
help=_("search in distributions"))
g.add_argument('--ext', dest='where', action='store_const',
const='extensions',
help=_("search in extensions"))
subp.add_argument('query', metavar='TERM', nargs='+',
help = _("a string to search"))
return subp
def run(self):
data = self.api.search(self.opts.where, self.opts.query)
for hit in data['hits']:
print "%s %s" % (hit['dist'], hit['version'])
if 'excerpt' in hit:
excerpt = self.clean_excerpt(hit['excerpt'])
for line in textwrap.wrap(excerpt, 72):
print " " + line
print
def clean_excerpt(self, excerpt):
"""Clean up the excerpt returned in the json result for output."""
# replace ellipsis with three dots, as there's no chance
# to have them printed on non-utf8 consoles.
# Also, they suck obscenely on fixed-width output.
excerpt = excerpt.replace('…', '...')
# TODO: this apparently misses a few entities
excerpt = saxutils.unescape(excerpt)
excerpt = excerpt.replace('&quot;', '"')
# Convert numerical entities
excerpt = re.sub(r'\&\#(\d+)\;',
lambda c: unichr(int(c.group(1))),
excerpt)
# Highlight found terms
# TODO: use proper highlight with escape chars?
excerpt = excerpt.replace('<strong></strong>', '')
excerpt = excerpt.replace('<strong>', '*')
excerpt = excerpt.replace('</strong>', '*')
return excerpt
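# An illustrative round trip of the method above (the excerpt is made up,
# not a real API response): an input such as
#   'a &quot;key&quot; <strong>pair</strong> &#224; la carte'
# would be printed as
#   'a "key" *pair* à la carte'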
class Info(WithSpec, Command):
name = 'info'
description = N_("print information about a distribution")
@classmethod
def customize_parser(self, parser, subparsers, **kwargs):
subp = super(Info, self).customize_parser(
parser, subparsers, **kwargs)
g = subp.add_mutually_exclusive_group()
g.add_argument('--details', dest='what',
action='store_const', const='details', default='details',
help=_("show details about the distribution [default]"))
g.add_argument('--meta', dest='what',
action='store_const', const='meta',
help=_("show the distribution META.json"))
g.add_argument('--readme', dest='what',
action='store_const', const='readme',
help=_("show the distribution README"))
g.add_argument('--versions', dest='what',
action='store_const', const='versions',
help=_("show the list of available versions"))
return subp
def run(self):
spec = self.get_spec()
getattr(self, 'print_' + self.opts.what)(spec)
def print_meta(self, spec):
data = self._get_dist_data(spec.name)
ver = self.get_best_version(data, spec, quiet=True)
print self.api.meta(spec.name, ver, as_json=False)
def print_readme(self, spec):
data = self._get_dist_data(spec.name)
ver = self.get_best_version(data, spec, quiet=True)
print self.api.readme(spec.name, ver)
def print_details(self, spec):
data = self._get_dist_data(spec.name)
ver = self.get_best_version(data, spec, quiet=True)
data = self.api.meta(spec.name, ver)
for k in [u'name', u'abstract', u'description', u'maintainer', u'license',
u'release_status', u'version', u'date', u'sha1']:
try:
v = data[k]
except KeyError:
logger.warn(_("data key '%s' not found"), k)
continue
if isinstance(v, list):
for vv in v:
print "%s: %s" % (k, vv)
elif isinstance(v, dict):
for kk, vv in v.iteritems():
print "%s: %s: %s" % (k, kk, vv)
else:
print "%s: %s" % (k, v)
k = 'provides'
for ext, dext in data[k].iteritems():
print "%s: %s: %s" % (k, ext, dext['version'])
k = 'prereqs'
if k in data:
for phase, rels in data[k].iteritems():
for rel, pkgs in rels.iteritems():
for pkg, ver in pkgs.iteritems():
print "%s: %s: %s %s" % (phase, rel, pkg, ver)
def print_versions(self, spec):
data = self._get_dist_data(spec.name)
name = data['name']
vs = [ (SemVer(d['version']), s)
for s, ds in data['releases'].iteritems()
for d in ds ]
vs = [ (v, s) for v, s in vs if spec.accepted(v) ]
vs.sort(reverse=True)
for v, s in vs:
print name, v, s
def _get_dist_data(self, name):
try:
return self.api.dist(name)
except NotFound, e:
# maybe the user was looking for an extension instead?
try:
ext = self.api.ext(name)
except NotFound:
pass
else:
vs = ext.get('versions', {})
for extver, ds in vs.iteritems():
for d in ds:
if 'dist' not in d: continue
dist = d['dist']
distver = d.get('version', 'unknown')
logger.info(
_("extension %s %s found in distribution %s %s"),
name, extver, dist, distver)
raise e
pgxnclient-1.2.1/pgxnclient/commands/__init__.py 0000664 0001750 0001750 00000054516 12143727745 021750 0 ustar piro piro 0000000 0000000 """
pgxnclient -- commands package
This module contains base classes and functions to implement and deal with
commands. Concrete command implementations are available in other package
modules.
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from __future__ import with_statement
import os
import sys
import logging
from subprocess import Popen, PIPE
from pgxnclient.utils import load_json, argparse, find_executable
from pgxnclient import __version__
from pgxnclient import network
from pgxnclient import Spec, SemVer
from pgxnclient import archive
from pgxnclient.api import Api
from pgxnclient.i18n import _, gettext
from pgxnclient.errors import NotFound, PgxnClientException, ProcessError, ResourceNotFound, UserAbort
from pgxnclient.utils.temp import temp_dir
logger = logging.getLogger('pgxnclient.commands')
def get_option_parser():
"""
Return an option parser populated with the available commands.
The parser is populated with all the options defined by the implemented
commands. Only commands defining a ``name`` attribute are added.
The function relies on the `Command` subclasses being already
created: call `load_commands()` before calling this function.
"""
parser = argparse.ArgumentParser(
# usage = _("%(prog)s [global options] COMMAND [command options]"),
description =
_("Interact with the PostgreSQL Extension Network (PGXN)."),
)
parser.add_argument("--version", action='version',
version="%%(prog)s %s" % __version__,
help = _("print the version number and exit"))
subparsers = parser.add_subparsers(
title = _("available commands"),
metavar = 'COMMAND',
help = _("the command to execute."
" The complete list is available using `pgxn help --all`."
" Builtin commands are:"))
clss = [ cls for cls in CommandType.subclasses if cls.name ]
clss.sort(key=lambda c: c.name)
for cls in clss:
cls.customize_parser(parser, subparsers)
return parser
def load_commands():
"""
Load all the commands known by the program.
Currently commands are read from modules in the `pgxnclient.commands`
package.
Importing the package causes the `Command` classes to be created: they
register themselves thanks to the `CommandType` metaclass.
"""
pkgdir = os.path.dirname(__file__)
for fn in os.listdir(pkgdir):
if fn.startswith('_'): continue
modname = __name__ + '.' + os.path.splitext(fn)[0]
# skip already imported modules
if modname in sys.modules: continue
try:
__import__(modname)
except Exception, e:
logger.warn(_("error importing commands module %s: %s - %s"),
modname, e.__class__.__name__, e)
def run_command(opts, parser):
"""Run the command specified by options parsed on the command line."""
# setup the logging
logging.getLogger().setLevel(
opts.verbose and logging.DEBUG or logging.INFO)
return opts.cmd(opts, parser=parser).run()
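# A minimal sketch of how the three functions above are meant to be combined;
# the real entry point, with error handling around it, lives in pgxnclient.cli:
#
#   load_commands()
#   parser = get_option_parser()
#   opts = parser.parse_args()
#   run_command(opts, parser)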
class CommandType(type):
"""
Metaclass for the Command class.
This metaclass allows self-registration of the commands: any Command
subclass is automatically added to the `subclasses` list.
"""
subclasses = []
def __new__(cls, name, bases, dct):
rv = type.__new__(cls, name, bases, dct)
CommandType.subclasses.append(rv)
return rv
def __init__(cls, name, bases, dct):
super(CommandType, cls).__init__(name, bases, dct)
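# Defining a subclass is enough to register it; a hypothetical example
# (command name and body are made up):
#
#   class Frobnicate(Command):
#       name = 'frobnicate'
#       description = N_("frobnicate a distribution")
#       def run(self):
#           print self.opts
#
# After the definition the class is in CommandType.subclasses and is picked
# up by get_option_parser().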
class Command(object):
"""
Base class to implement client commands.
Provide the argument parsing framework and API dispatch.
Commands should subclass this class and possibly other mixin classes, set
a value for the `name` and `description` arguments and implement the
`run()` method. If command line parser customization is required,
`customize_parser()` should be extended.
"""
__metaclass__ = CommandType
name = None
description = None
def __init__(self, opts, parser=None):
"""Initialize a new Command.
The parser will be specified if the class has been initialized
by that parser itself, so run() can expect it not to be None.
"""
self.opts = opts
self.parser = parser
self._api = None
@classmethod
def customize_parser(self, parser, subparsers, **kwargs):
"""Customise the option parser.
:param parser: the option parser to be customized
:param subparsers: the action object where to register a command subparser
:return: the new subparser created
Subclasses should extend this method in order to add new options or a
subparser implementing a new command. Be careful in calling the
superclass' `customize_parser()` via `super()` in order to call all
the mixins methods. Also note that the method must be a classmethod.
"""
return self.__make_subparser(parser, subparsers, **kwargs)
def run(self):
"""The actions to take when the command is invoked."""
raise NotImplementedError
@classmethod
def __make_subparser(self, parser, subparsers,
description=None, epilog=None):
"""Create a new subparser with help populated."""
subp = subparsers.add_parser(self.name,
help = gettext(self.description),
description = description or gettext(self.description),
epilog = epilog)
subp.set_defaults(cmd=self)
glb = subp.add_argument_group(_("global options"))
glb.add_argument("--mirror", metavar="URL",
default = 'http://api.pgxn.org/',
help = _("the mirror to interact with [default: %(default)s]"))
glb.add_argument("--verbose", action='store_true',
help = _("print more information"))
glb.add_argument("--yes", action='store_true',
help = _("assume affirmative answer to all questions"))
return subp
@property
def api(self):
"""Return an `Api` instance to communicate with PGXN.
Use the value provided with ``--mirror`` to decide where to connect.
"""
if self._api is None:
self._api = Api(mirror=self.opts.mirror)
return self._api
def confirm(self, prompt):
"""Prompt an user confirmation.
Raise `UserAbort` if the user replies "no".
The method is no-op if the ``--yes`` option is specified.
"""
if self.opts.yes:
return True
while 1:
ans = raw_input(_("%s [y/N] ") % prompt)
if _('no').startswith(ans.lower()):
raise UserAbort(_("operation interrupted on user request"))
elif _('yes').startswith(ans.lower()):
return True
else:
prompt = _("Please answer yes or no")
def popen(self, cmd, *args, **kwargs):
"""
Execute subprocess.Popen.
Commands should use this method instead of importing subprocess.Popen:
this allows replacement with a mock in the test suite.
"""
logger.debug("running command: %s", cmd)
try:
return Popen(cmd, *args, **kwargs)
except OSError, e:
if not isinstance(cmd, basestring):
cmd = ' '.join(cmd)
msg = _("%s running command: %s") % (e, cmd)
raise ProcessError(msg)
from pgxnclient.errors import BadSpecError
class WithSpec(Command):
"""Mixin to implement commands taking a package specification.
This class adds a positional argument SPEC to the parser and related
options.
"""
@classmethod
def customize_parser(self, parser, subparsers,
with_status=True, epilog=None, **kwargs):
"""
Add the SPEC related options to the parser.
If *with_status* is true, options ``--stable``, ``--testing``,
``--unstable`` are also handled.
"""
epilog = _("""
SPEC can either specify just a name or contain required versions
indications, for instance 'pkgname=1.0', or 'pkgname>=2.1'.
""") + (epilog or "")
subp = super(WithSpec, self).customize_parser(
parser, subparsers, epilog=epilog, **kwargs)
subp.add_argument('spec', metavar='SPEC',
help = _("name and optional version of the package"))
if with_status:
g = subp.add_mutually_exclusive_group(required=False)
g.add_argument('--stable', dest='status',
action='store_const', const=Spec.STABLE, default=Spec.STABLE,
help=_("only accept stable distributions [default]"))
g.add_argument('--testing', dest='status',
action='store_const', const=Spec.TESTING,
help=_("accept testing distributions too"))
g.add_argument('--unstable', dest='status',
action='store_const', const=Spec.UNSTABLE,
help=_("accept unstable distributions too"))
return subp
def get_spec(self, _can_be_local=False, _can_be_url=False):
"""
Return the package specification requested.
Return a `Spec` instance.
"""
spec = self.opts.spec
try:
spec = Spec.parse(spec)
except (ValueError, BadSpecError), e:
self.parser.error(_("cannot parse package '%s': %s")
% (spec, e))
if not _can_be_local and spec.is_local():
raise PgxnClientException(
_("you cannot use a local resource with this command"))
if not _can_be_url and spec.is_url():
raise PgxnClientException(
_("you cannot use an url with this command"))
return spec
def get_best_version(self, data, spec, quiet=False):
"""
Return the best version a user may want for a distribution.
Return a `SemVer` instance.
Raise `ResourceNotFound` if no version is found with the provided
specification and options.
"""
drels = data['releases']
# Get the maximum version for each release status satisfying the spec
vers = [ None ] * len(Spec.STATUS)
for n, d in drels.iteritems():
vs = filter(spec.accepted, [SemVer(r['version']) for r in d])
if vs:
vers[Spec.STATUS[n]] = max(vs)
return self._get_best_version(vers, spec, quiet)
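# For illustration, 'releases' looks roughly like this (abridged, made-up
# version numbers; the PGXN 'dist' API is the authoritative reference):
#   {'stable': [{'version': '1.2.0'}, {'version': '1.1.0'}],
#    'testing': [{'version': '1.3.0'}]}
# With a spec such as 'foo>=1.1.0' and the default --stable level, the
# method above would return SemVer('1.2.0').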
def get_best_version_from_ext(self, data, spec):
"""
Return the best distribution version from an extension's data
"""
# Get the maximum version for each release status satisfying the spec
vers = [ [] for i in xrange(len(Spec.STATUS)) ]
vmap = {} # ext_version -> (dist_name, dist_version)
for ev, dists in data.get('versions', {}).iteritems():
ev = SemVer(ev)
if not spec.accepted(ev):
continue
for dist in dists:
dv = SemVer(dist['version'])
ds = dist.get('status', 'stable')
vers[Spec.STATUS[ds]].append(ev)
vmap[ev] = (dist['dist'], dv)
# for each rel status only take the max one.
for i in xrange(len(vers)):
vers[i] = vers[i] and max(vers[i]) or None
ev = self._get_best_version(vers, spec, quiet=False)
return vmap[ev]
def _get_best_version(self, vers, spec, quiet):
# Is there any result at the desired release status?
want = [ v for lvl, v in enumerate(vers)
if lvl >= self.opts.status and v is not None ]
if want:
ver = max(want)
if not quiet:
logger.info(_("best version: %s %s"), spec.name, ver)
return ver
# Not found: is there any hint we can give?
if self.opts.status > Spec.TESTING and vers[Spec.TESTING]:
hint = (vers[Spec.TESTING], _('testing'))
elif self.opts.status > Spec.UNSTABLE and vers[Spec.UNSTABLE]:
hint = (vers[Spec.UNSTABLE], _('unstable'))
else:
hint = None
msg = _("no suitable version found for %s") % spec
if hint:
msg += _(" but there is version %s at level %s") % hint
raise ResourceNotFound(msg)
def get_meta(self, spec):
"""
Return the content of the ``META.json`` file for *spec*.
Return the object obtained parsing the JSON.
"""
if spec.is_name():
# Get the metadata from the API
try:
data = self.api.dist(spec.name)
except NotFound:
# Distro not found: maybe it's an extension?
ext = self.api.ext(spec.name)
name, ver = self.get_best_version_from_ext(ext, spec)
return self.api.meta(name, ver)
else:
ver = self.get_best_version(data, spec)
return self.api.meta(spec.name, ver)
elif spec.is_dir():
# Get the metadata from a directory
fn = os.path.join(spec.dirname, 'META.json')
logger.debug("reading %s", fn)
if not os.path.exists(fn):
raise PgxnClientException(
_("file 'META.json' not found in '%s'") % spec.dirname)
with open(fn) as f:
return load_json(f)
elif spec.is_file():
arc = archive.from_spec(spec)
return arc.get_meta()
elif spec.is_url():
with network.get_file(spec.url) as fin:
with temp_dir() as dir:
fn = network.download(fin, dir)
arc = archive.from_file(fn)
return arc.get_meta()
else:
assert False
class WithSpecLocal(WithSpec):
"""
Mixin to implement commands that can also refer to a local file or dir.
"""
@classmethod
def customize_parser(self, parser, subparsers, epilog=None, **kwargs):
epilog = _("""
SPEC may also be a local zip file or unpacked directory, but in this case
it should contain at least a '%s', for instance '.%spkgname.zip'.
""") % (os.sep, os.sep) + (epilog or "")
subp = super(WithSpecLocal, self).customize_parser(
parser, subparsers, epilog=epilog, **kwargs)
return subp
def get_spec(self, **kwargs):
kwargs['_can_be_local'] = True
return super(WithSpecLocal, self).get_spec(**kwargs)
class WithSpecUrl(WithSpec):
"""
Mixin to implement commands that can also refer to a URL.
"""
@classmethod
def customize_parser(self, parser, subparsers, epilog=None, **kwargs):
epilog = _("""
SPEC may also be a URL specifying a protocol such as 'http://' or 'https://'.
""") + (epilog or "")
subp = super(WithSpecUrl, self).customize_parser(
parser, subparsers, epilog=epilog, **kwargs)
return subp
def get_spec(self, **kwargs):
kwargs['_can_be_url'] = True
return super(WithSpecUrl, self).get_spec(**kwargs)
class WithPgConfig(object):
"""
Mixin to implement commands that should query :program:`pg_config`.
"""
@classmethod
def customize_parser(self, parser, subparsers, **kwargs):
"""
Add the ``--pg_config`` option to the options parser.
"""
subp = super(WithPgConfig, self).customize_parser(
parser, subparsers, **kwargs)
subp.add_argument('--pg_config', metavar="PROG", default='pg_config',
help = _("the pg_config executable to find the database"
" [default: %(default)s]"))
return subp
def call_pg_config(self, what, _cache={}):
"""
Call :program:`pg_config` and return its output.
"""
if what in _cache:
return _cache[what]
logger.debug("running pg_config --%s", what)
cmdline = [self.get_pg_config(), "--%s" % what]
p = self.popen(cmdline, stdout=PIPE)
out, err = p.communicate()
if p.returncode:
raise ProcessError(_("command returned %s: %s")
% (p.returncode, cmdline))
out = out.rstrip().decode('utf-8')
rv = _cache[what] = out
return rv
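# Illustrative call (the returned path is installation-specific):
#   self.call_pg_config('libdir')  ->  u'/usr/lib/postgresql/9.2/lib'
# The result is cached, so asking twice for the same value does not spawn
# pg_config again.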
def get_pg_config(self):
"""
Return the absolute path of the pg_config binary.
"""
pg_config = self.opts.pg_config
if os.path.split(pg_config)[0]:
pg_config = os.path.abspath(pg_config)
else:
pg_config = find_executable(pg_config)
if not pg_config:
raise PgxnClientException(_("pg_config executable not found"))
return pg_config
import shlex
class WithMake(WithPgConfig):
"""
Mixin to implement commands that should invoke :program:`make`.
"""
@classmethod
def customize_parser(self, parser, subparsers, **kwargs):
"""
Add the ``--make`` option to the options parser.
"""
subp = super(WithMake, self).customize_parser(
parser, subparsers, **kwargs)
subp.add_argument('--make', metavar="PROG",
default=self._find_default_make(),
help = _("the 'make' executable to use to build the extension "
"[default: %(default)s]"))
return subp
def run_make(self, cmd, dir, env=None, sudo=None):
"""Invoke make with the selected command.
:param cmd: the make target or list of options to pass make
:param dir: the directory to run the command in
:param env: variables to add to the make environment
:param sudo: if set, use the provided command/arg to elevate
privileges
"""
# check if the directory contains a makefile
for fn in ('GNUmakefile', 'makefile', 'Makefile'):
if os.path.exists(os.path.join(dir, fn)):
break
else:
raise PgxnClientException(
_("no Makefile found in the extension root"))
cmdline = []
if sudo:
cmdline.extend(shlex.split(sudo))
cmdline.extend([self.get_make(), 'PG_CONFIG=%s' % self.get_pg_config()])
if isinstance(cmd, basestring):
cmdline.append(cmd)
else: # a list
cmdline.extend(cmd)
logger.debug(_("running: %s"), cmdline)
p = self.popen(cmdline, cwd=dir, shell=False, env=env, close_fds=True)
p.communicate()
if p.returncode:
raise ProcessError(_("command returned %s: %s")
% (p.returncode, ' '.join(cmdline)))
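# For example (paths are illustrative), a call such as:
#   self.run_make('install', dir=pdir, sudo='sudo')
# ends up executing something like:
#   sudo gmake PG_CONFIG=/usr/bin/pg_config install
# in the directory pdir ('make' instead of 'gmake' where that is the default).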
def get_make(self, _cache=[]):
"""
Return the path of the make binary.
"""
# the cache is not for performance but to return a consistent value
# even if the cwd is changed
if _cache:
return _cache[0]
make = self.opts.make
if os.path.split(make)[0]:
# At least a relative dir specified.
if not os.path.exists(make):
raise PgxnClientException(_("make executable not found: %s")
% make)
# Convert to abs path to be robust in case the dir is changed.
make = os.path.abspath(make)
else:
# we don't resolve make to an abs path here, because it would be a
# security hole: make may be run under sudo and in that case we
# don't want root to execute a make binary planted in a user-local dir
if not find_executable(make):
raise PgxnClientException(_("make executable not found: %s")
% make)
_cache.append(make)
return make
@classmethod
def _find_default_make(self):
for make in ('gmake', 'make'):
path = find_executable(make)
if path:
return make
# if nothing was found, fall back on 'gmake'. If it was missing we
# will give an error when attempting to use it
return 'gmake'
class WithSudo(object):
"""
Mixin to implement commands that may invoke sudo.
"""
@classmethod
def customize_parser(self, parser, subparsers, **kwargs):
subp = super(WithSudo, self).customize_parser(
parser, subparsers, **kwargs)
g = subp.add_mutually_exclusive_group()
g.add_argument('--sudo', metavar="PROG", const='sudo', nargs="?",
help = _("run PROG to elevate privileges when required"
" [default: %(const)s]"))
g.add_argument('--nosudo', dest='sudo', action='store_false',
help = _("never elevate privileges "
"(no more needed: for backward compatibility)"))
return subp
class WithDatabase(object):
"""
Mixin to implement commands that should communicate to a database.
"""
@classmethod
def customize_parser(self, parser, subparsers, epilog=None, **kwargs):
"""
Add the options related to database connections.
"""
epilog = _("""
The default database connection options depend on the value of environment
variables PGDATABASE, PGHOST, PGPORT, PGUSER.
""") + (epilog or "")
subp = super(WithDatabase, self).customize_parser(
parser, subparsers, epilog=epilog, **kwargs)
g = subp.add_argument_group(_("database connections options"))
g.add_argument('-d', '--dbname', metavar="DBNAME",
help = _("database name to install into"))
g.add_argument('-h', '--host', metavar="HOST",
help = _("database server host or socket directory"))
g.add_argument('-p', '--port', metavar="PORT", type=int,
help = _("database server port"))
g.add_argument('-U', '--username', metavar="NAME",
help = _("database user name"))
return subp
def get_psql_options(self):
"""
Return the cmdline options to connect to the specified database.
"""
rv = []
if self.opts.dbname: rv.extend(['--dbname', self.opts.dbname])
if self.opts.host: rv.extend(['--host', self.opts.host])
if self.opts.port: rv.extend(['--port', str(self.opts.port)])
if self.opts.username: rv.extend(['--username', self.opts.username])
return rv
def get_psql_env(self):
"""
Return a dict with env variables to connect to the specified db.
"""
rv = {}
if self.opts.dbname: rv['PGDATABASE'] = self.opts.dbname
if self.opts.host: rv['PGHOST'] = self.opts.host
if self.opts.port: rv['PGPORT'] = str(self.opts.port)
if self.opts.username: rv['PGUSER'] = self.opts.username
return rv
pgxnclient-1.2.1/pgxnclient/commands/help.py 0000664 0001750 0001750 00000004024 12143727745 021126 0 ustar piro piro 0000000 0000000 """
pgxnclient -- help commands implementation
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import os
from pgxnclient import get_scripts_dirs, get_public_scripts_dir
from pgxnclient.i18n import _, N_
from pgxnclient.commands import Command
class Help(Command):
name = 'help'
description = N_("display help and other program information")
@classmethod
def customize_parser(self, parser, subparsers, **kwargs):
subp = super(Help, self).customize_parser(
parser, subparsers, **kwargs)
g = subp.add_mutually_exclusive_group()
g.add_argument('--all', action="store_true",
help = _("list all the available commands"))
g.add_argument('--libexec', action="store_true",
help = _("print the location of the scripts directory"))
g.add_argument('command', metavar='CMD', nargs='?',
help = _("the command to get help about"))
# To print the basic help
self._parser = parser
return subp
def run(self):
if self.opts.command:
from pgxnclient.cli import main
main([self.opts.command, '--help'])
elif self.opts.all:
self.print_all_commands()
elif self.opts.libexec:
self.print_libexec()
else:
self._parser.print_help()
def print_all_commands(self):
cmds = self.find_all_commands()
title = _("Available PGXN Client commands")
print title
print "-" * len(title)
for cmd in cmds:
print " " + cmd
def find_all_commands(self):
rv = []
path = os.environ.get('PATH', '').split(os.pathsep)
path[0:0] = get_scripts_dirs()
for p in path:
if not os.path.isdir(p):
continue
for fn in os.listdir(p):
if fn.startswith('pgxn-'):
rv.append(fn[5:])
rv.sort()
return rv
def print_libexec(self):
print get_public_scripts_dir()
pgxnclient-1.2.1/pgxnclient/commands/install.py 0000664 0001750 0001750 00000043357 12143727745 021660 0 ustar piro piro 0000000 0000000 """
pgxnclient -- installation/loading commands implementation
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from __future__ import with_statement
import os
import re
import shutil
import difflib
import logging
import tempfile
from subprocess import PIPE
from pgxnclient import SemVer
from pgxnclient import archive
from pgxnclient import network
from pgxnclient.i18n import _, N_
from pgxnclient.utils import sha1, b
from pgxnclient.errors import BadChecksum, PgxnClientException, InsufficientPrivileges
from pgxnclient.commands import Command, WithDatabase, WithMake, WithPgConfig
from pgxnclient.commands import WithSpecUrl, WithSpecLocal, WithSudo
from pgxnclient.utils.temp import temp_dir
from pgxnclient.utils.strings import Identifier
logger = logging.getLogger('pgxnclient.commands')
class Download(WithSpecUrl, Command):
name = 'download'
description = N_("download a distribution from the network")
@classmethod
def customize_parser(self, parser, subparsers, **kwargs):
subp = super(Download, self).customize_parser(
parser, subparsers, **kwargs)
subp.add_argument('--target', metavar='PATH', default='.',
help = _('Target directory and/or filename to save'))
return subp
def run(self):
spec = self.get_spec()
assert not spec.is_local()
if spec.is_url():
return self._run_url(spec)
data = self.get_meta(spec)
try:
chk = data['sha1']
except KeyError:
raise PgxnClientException(
"sha1 missing from the distribution meta")
with self.api.download(data['name'], SemVer(data['version'])) as fin:
fn = network.download(fin, self.opts.target)
self.verify_checksum(fn, chk)
return fn
def _run_url(self, spec):
with network.get_file(spec.url) as fin:
fn = network.download(fin, self.opts.target)
return fn
def verify_checksum(self, fn, chk):
"""Verify that a downloaded file has the expected sha1."""
sha = sha1()
logger.debug(_("checking sha1 of '%s'"), fn)
f = open(fn, "rb")
try:
while 1:
data = f.read(8192)
if not data: break
sha.update(data)
finally:
f.close()
sha = sha.hexdigest()
if sha != chk:
os.unlink(fn)
logger.error(_("file %s has sha1 %s instead of %s"),
fn, sha, chk)
raise BadChecksum(_("bad sha1 in downloaded file"))
class InstallUninstall(WithMake, WithSpecUrl, WithSpecLocal, Command):
"""
Base class to implement the ``install`` and ``uninstall`` commands.
"""
def run(self):
with temp_dir() as dir:
return self._run(dir)
def _run(self, dir):
spec = self.get_spec()
if spec.is_dir():
pdir = os.path.abspath(spec.dirname)
elif spec.is_file():
pdir = archive.from_file(spec.filename).unpack(dir)
elif not spec.is_local():
self.opts.target = dir
fn = Download(self.opts).run()
pdir = archive.from_file(fn).unpack(dir)
else:
assert False
self.maybe_run_configure(pdir)
self._inun(pdir)
def _inun(self, pdir):
"""Run the specific command, implemented in the subclass."""
raise NotImplementedError
def maybe_run_configure(self, dir):
fn = os.path.join(dir, 'configure')
logger.debug("checking '%s'", fn)
if not os.path.exists(fn):
return
logger.info(_("running configure"))
p = self.popen(fn, cwd=dir)
p.communicate()
if p.returncode:
raise PgxnClientException(
_("configure failed with return code %s") % p.returncode)
class SudoInstallUninstall(WithSudo, InstallUninstall):
"""
Installation commands base class supporting sudo operations.
"""
def run(self):
if not self.is_libdir_writable() and not self.opts.sudo:
dir = self.call_pg_config('libdir')
raise InsufficientPrivileges(_(
"PostgreSQL library directory (%s) not writable: "
"you should run the program as superuser, or specify "
"a 'sudo' program") % dir)
return super(SudoInstallUninstall, self).run()
def get_sudo_prog(self):
if self.is_libdir_writable():
return None # not needed
return self.opts.sudo
def is_libdir_writable(self):
"""
Check if the Postgres installation directory is writable.
If it is, we will assume that sudo is not required to
install/uninstall the library, so the sudo program will not be invoked
and its specification will not be required.
"""
dir = self.call_pg_config('libdir')
logger.debug("testing if %s is writable", dir)
try:
f = tempfile.TemporaryFile(prefix="pgxn-", suffix=".test", dir=dir)
f.write(b('test'))
f.close()
except (IOError, OSError):
rv = False
else:
rv = True
return rv
class Install(SudoInstallUninstall):
name = 'install'
description = N_("download, build and install a distribution")
def _inun(self, pdir):
logger.info(_("building extension"))
self.run_make('all', dir=pdir)
logger.info(_("installing extension"))
self.run_make('install', dir=pdir, sudo=self.get_sudo_prog())
class Uninstall(SudoInstallUninstall):
name = 'uninstall'
description = N_("remove a distribution from the system")
def _inun(self, pdir):
logger.info(_("removing extension"))
self.run_make('uninstall', dir=pdir, sudo=self.get_sudo_prog())
class Check(WithDatabase, InstallUninstall):
name = 'check'
description = N_("run a distribution's test")
def _inun(self, pdir):
logger.info(_("checking extension"))
upenv = self.get_psql_env()
logger.debug("additional env: %s", upenv)
env = os.environ.copy()
env.update(upenv)
cmd = ['installcheck']
if 'PGDATABASE' in upenv:
cmd.append("CONTRIB_TESTDB=" + env['PGDATABASE'])
try:
self.run_make(cmd, dir=pdir, env=env)
except PgxnClientException:
# if the test failed, copy locally the regression result
for ext in ('out', 'diffs'):
fn = os.path.join(pdir, 'regression.' + ext)
if os.path.exists(fn):
dest = './regression.' + ext
if not os.path.exists(dest) or not os.path.samefile(fn, dest):
logger.info(_('copying regression.%s'), ext)
shutil.copy(fn, dest)
raise
class LoadUnload(WithPgConfig, WithDatabase, WithSpecUrl, WithSpecLocal, Command):
"""
Base class to implement the ``load`` and ``unload`` commands.
"""
@classmethod
def customize_parser(self, parser, subparsers, **kwargs):
subp = super(LoadUnload, self).customize_parser(
parser, subparsers, **kwargs)
subp.add_argument('--schema', metavar="SCHEMA",
type=Identifier.parse_arg,
help=_("use SCHEMA instead of the default schema"))
subp.add_argument('extensions', metavar='EXT', nargs='*',
help = _("only specified extensions [default: all]"))
return subp
def get_pg_version(self):
"""Return the version of the selected database."""
data = self.call_psql('SELECT version();')
pgver = self.parse_pg_version(data)
logger.debug("PostgreSQL version: %d.%d.%d", *pgver)
return pgver
def parse_pg_version(self, data):
m = re.match(r'\S+\s+(\d+)\.(\d+)(?:\.(\d+))?', data)
if m is None:
raise PgxnClientException(
"cannot parse version number from '%s'" % data)
return (int(m.group(1)), int(m.group(2)), int(m.group(3) or 0))
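# e.g. parsing the output of "SELECT version();":
#   'PostgreSQL 9.1.4 on x86_64-unknown-linux-gnu, ...'  ->  (9, 1, 4)
#   'PostgreSQL 9.2beta2 on ...'                          ->  (9, 2, 0)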
def is_extension(self, name):
fn = os.path.join(self.call_pg_config('sharedir'),
"extension", name + ".control")
logger.debug("checking if exists %s", fn)
return os.path.exists(fn)
def call_psql(self, command):
cmdline = [self.find_psql()]
cmdline.extend(self.get_psql_options())
if command is not None:
cmdline.append('-tA') # tuple only, unaligned
cmdline.extend(['-c', command])
logger.debug("calling %s", cmdline)
p = self.popen(cmdline, stdout=PIPE)
out, err = p.communicate()
if p.returncode:
raise PgxnClientException(
"psql returned %s running command" % (p.returncode))
return out.decode('utf-8')
def load_sql(self, filename=None, data=None):
cmdline = [self.find_psql()]
cmdline.extend(self.get_psql_options())
# load via pipe to enable psql commands in the file
if not data:
logger.debug("loading sql from %s", filename)
with open(filename, 'r') as fin:
p = self.popen(cmdline, stdin=fin)
p.communicate()
else:
if len(data) > 105:
tdata = data[:100] + "..."
else:
tdata = data
logger.debug('running sql command: "%s"', tdata)
p = self.popen(cmdline, stdin=PIPE)
# for Python 3: just assume default encoding will do
if isinstance(data, unicode):
data = data.encode()
p.communicate(data)
if p.returncode:
raise PgxnClientException(
"psql returned %s loading extension" % (p.returncode))
def find_psql(self):
return self.call_pg_config('bindir') + '/psql'
def find_sql_file(self, name, sqlfile):
# In the extension the sql can be specified with a directory,
# but it gets flattened into the target dir by the Makefile
sqlfile = os.path.basename(sqlfile)
sharedir = self.call_pg_config('sharedir')
# TODO: we only check in contrib and in <name>: actually it may be
# somewhere else - only the makefile knows!
tries = [
name + '/' + sqlfile,
sqlfile.rsplit('.', 1)[0] + '/' + sqlfile,
'contrib/' + sqlfile,
]
tried = set()
for fn in tries:
if fn in tried:
continue
tried.add(fn)
fn = sharedir + '/' + fn
logger.debug("checking sql file in %s" % fn)
if os.path.exists(fn):
return fn
else:
raise PgxnClientException(
"cannot find sql file for extension '%s': '%s'"
% (name, sqlfile))
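# For example, find_sql_file('pair', 'sql/pair.sql') would look, in order, for
# (sharedir being whatever pg_config --sharedir reports; 'pair' is just an
# illustrative name):
#   <sharedir>/pair/pair.sql
#   <sharedir>/contrib/pair.sql
# and raise PgxnClientException if neither exists.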
def patch_for_schema(self, fn):
"""
Patch a sql file to set the schema where the commands are executed.
If no schema has been requested, return the data unchanged.
Else, ask for confirmation and return the data for a patched file.
The schema is only useful for PG < 9.1: for proper PG extensions there
is no need to patch the sql.
"""
schema = self.opts.schema
f = open(fn)
try: data = f.read()
finally: f.close()
if not schema:
return data
self._check_schema_exists(schema)
re_path = re.compile(r'SET\s+search_path\s*(?:=|to)\s*([^;]+);', re.I)
m = re_path.search(data)
if m is None:
newdata = ("SET search_path = %s;\n\n" % schema) + data
else:
newdata = re_path.sub("SET search_path = %s;" % schema, data)
diff = ''.join(difflib.unified_diff(
[r + '\n' for r in data.splitlines()],
[r + '\n' for r in newdata.splitlines()],
fn, fn + ".schema"))
msg = _("""
In order to operate in the schema %s, the following changes will be
performed:\n\n%s\n\nDo you want to continue?""")
self.confirm(msg % (schema, diff))
return newdata
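# An illustrative example: with --schema myschema, a script containing
#   SET search_path = public;
# is rewritten (after confirmation) to
#   SET search_path = myschema;
# while a script with no search_path statement gets one prepended.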
def _register_loaded(self, fn):
if not hasattr(self, '_loaded'):
self._loaded = []
self._loaded.append(fn)
def _is_loaded(self, fn):
return hasattr(self, '_loaded') and fn in self._loaded
def _check_schema_exists(self, schema):
cmdline = [self.find_psql()]
cmdline.extend(self.get_psql_options())
cmdline.extend(['-c', 'SET search_path=%s' % schema])
p = self.popen(cmdline, stdin=PIPE, stdout=PIPE, stderr=PIPE)
p.communicate()
if p.returncode:
raise PgxnClientException(
"schema %s does not exist" % schema)
def _get_extensions(self):
"""
Return a list of pairs (name, sql file) to be loaded/unloaded.
Items are in loading order.
"""
spec = self.get_spec()
dist = self.get_meta(spec)
if 'provides' not in dist:
# No 'provides' specified: assume a single extension named
# after the distribution. This is automatically done by PGXN,
# but we should do it ourselves to deal with local META files
# not mangled by the PGXN upload script yet.
name = dist['name']
for ext in self.opts.extensions:
if ext != name:
raise PgxnClientException(
"can't find extension '%s' in the distribution '%s'"
% (ext, spec))
return [ (name, None) ]
rv = []
if not self.opts.extensions:
# All the extensions, in the order specified
# (assume we got an orddict from json)
for name, data in dist['provides'].items():
rv.append((name, data.get('file')))
else:
# Only the specified extensions
for name in self.opts.extensions:
try:
data = dist['provides'][name]
except KeyError:
raise PgxnClientException(
"can't find extension '%s' in the distribution '%s'"
% (name, spec))
rv.append((name, data.get('file')))
return rv
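# Illustrative META.json fragment (made up) and the resulting list:
#   {"name": "pair", "provides": {"pair": {"file": "sql/pair.sql"}}}
# yields [('pair', 'sql/pair.sql')]; a distribution without a 'provides'
# section yields the single pair [('pair', None)] instead.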
class Load(LoadUnload):
name = 'load'
description = N_("load a distribution's extensions into a database")
def run(self):
items = self._get_extensions()
for (name, sql) in items:
self.load_ext(name, sql)
def load_ext(self, name, sqlfile):
logger.debug(_("loading extension '%s' with file: %s"),
name, sqlfile)
if sqlfile and not sqlfile.endswith('.sql'):
logger.info(
_("the specified file '%s' doesn't seem SQL:"
" assuming '%s' is not a PostgreSQL extension"),
sqlfile, name)
return
pgver = self.get_pg_version()
if pgver >= (9,1,0):
if self.is_extension(name):
self.create_extension(name)
return
else:
self.confirm(_("""\
The extension '%s' doesn't contain a control file:
it will be installed as a loose set of objects.
Do you want to continue?""")
% name)
confirm = False
if not sqlfile:
sqlfile = name + '.sql'
confirm = True
fn = self.find_sql_file(name, sqlfile)
if confirm:
self.confirm(_("""\
The extension '%s' doesn't specify a SQL file.
'%s' is probably the right one.
Do you want to load it?""")
% (name, fn))
# TODO: is confirmation asked only once? Also, check for repetition
# in unload.
if self._is_loaded(fn):
logger.info(_("file %s already loaded"), fn)
else:
data = self.patch_for_schema(fn)
self.load_sql(data=data)
self._register_loaded(fn)
def create_extension(self, name):
name = Identifier(name)
schema = self.opts.schema
cmd = ["CREATE EXTENSION", name]
if schema:
cmd.extend(["SCHEMA", schema])
cmd = " ".join(cmd) + ';'
self.load_sql(data=cmd)
class Unload(LoadUnload):
name = 'unload'
description = N_("unload a distribution's extensions from a database")
def run(self):
items = self._get_extensions()
if not self.opts.extensions:
items.reverse()
for (name, sql) in items:
self.unload_ext(name, sql)
def unload_ext(self, name, sqlfile):
logger.debug(_("unloading extension '%s' with file: %s"),
name, sqlfile)
if sqlfile and not sqlfile.endswith('.sql'):
logger.info(
_("the specified file '%s' doesn't seem SQL:"
" assuming '%s' is not a PostgreSQL extension"),
sqlfile, name)
return
pgver = self.get_pg_version()
if pgver >= (9,1,0):
if self.is_extension(name):
self.drop_extension(name)
return
else:
self.confirm(_("""\
The extension '%s' doesn't contain a control file:
will look for an SQL script to unload the objects.
Do you want to continue?""")
% name)
if not sqlfile:
sqlfile = name + '.sql'
tmp = os.path.split(sqlfile)
sqlfile = os.path.join(tmp[0], 'uninstall_' + tmp[1])
fn = self.find_sql_file(name, sqlfile)
self.confirm(_("""\
In order to unload the extension '%s' it looks like you will have
to load the file '%s'.
Do you want to execute it?""")
% (name, fn))
data = self.patch_for_schema(fn)
self.load_sql(data=data)
def drop_extension(self, name):
# TODO: cascade
cmd = "DROP EXTENSION %s;" % Identifier(name)
self.load_sql(data=cmd)
pgxnclient-1.2.1/pgxnclient/__init__.py 0000664 0001750 0001750 00000003540 12143727754 020136 0 ustar piro piro 0000000 0000000 """
pgxnclient -- main package
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
__version__ = '1.2.1'
# Paths where to find the command executables.
# If relative, it's from the `pgxnclient` package directory.
# Distribution packagers may move them around if they wish.
#
# Only one of the paths should be marked as "public": it will be returned by
# pgxn help --libexec
LIBEXECDIRS = [
# public, path
(False, './libexec/'),
(True, '/usr/local/libexec/pgxnclient/'),
]
assert len([x for x in LIBEXECDIRS if x[0]]) == 1, \
"only one libexec directory should be public"
__all__ = [
'Spec', 'SemVer', 'Label', 'Term', 'Identifier',
'get_scripts_dirs', 'get_public_scripts_dir', 'find_script' ]
import os
from pgxnclient.spec import Spec
from pgxnclient.utils.semver import SemVer
from pgxnclient.utils.strings import Label, Term, Identifier
def get_scripts_dirs():
"""
Return the absolute path of the directories containing the client scripts.
"""
return [ os.path.normpath(os.path.join(
os.path.dirname(__file__), p))
for (_, p) in LIBEXECDIRS ]
def get_public_scripts_dir():
"""
Return the absolute path of the public directory for the client scripts.
"""
return [ os.path.normpath(os.path.join(
os.path.dirname(__file__), p))
for (public, p) in LIBEXECDIRS if public ][0]
def find_script(name):
"""Return the absoulute path of a pgxn script.
The script are usually found in the `LIBEXEC` dir, but any script on the
path will do (they are usually prefixed by ``pgxn-``).
Return `None` if the script is not found.
"""
path = os.environ.get('PATH', '').split(os.pathsep)
path[0:0] = get_scripts_dirs()
for p in path:
fn = os.path.join(p, name)
if os.path.isfile(fn):
return fn
pgxnclient-1.2.1/pgxnclient/spec.py 0000664 0001750 0001750 00000006246 12143727745 017337 0 ustar piro piro 0000000 0000000 """
pgxnclient -- specification object
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import os
import re
import urllib
import operator as _op
from pgxnclient.i18n import _
from pgxnclient.errors import BadSpecError, ResourceNotFound
from pgxnclient.utils.semver import SemVer
from pgxnclient.utils.strings import Term
class Spec(object):
"""A name together with a range of versions."""
# Available release statuses.
# Order matters.
UNSTABLE = 0
TESTING = 1
STABLE = 2
STATUS = {
'unstable': UNSTABLE,
'testing': TESTING,
'stable': STABLE, }
def __init__(self, name=None, op=None, ver=None,
dirname=None, filename=None, url=None):
self.name = name and name.lower()
self.op = op
self.ver = ver
# point to local files or specific resources
self.dirname = dirname
self.filename = filename
self.url = url
def is_name(self):
return self.name is not None
def is_dir(self):
return self.dirname is not None
def is_file(self):
return self.filename is not None
def is_url(self):
return self.url is not None
def is_local(self):
return self.is_dir() or self.is_file()
def __str__(self):
name = self.name or self.filename or self.dirname or self.url or "???"
if self.op is None:
return name
else:
return "%s%s%s" % (name, self.op, self.ver)
@classmethod
def parse(self, spec):
"""Parse a spec string into a populated Spec instance.
Raise BadSpecError if couldn't parse.
"""
# check if it's a network resource
if spec.startswith('http://') or spec.startswith('https://'):
return Spec(url=spec)
# check if it's a local resource
if spec.startswith('file://'):
try_file = urllib.unquote_plus(spec[len('file://'):])
elif os.sep in spec:
try_file = spec
else:
try_file = None
if try_file:
# This is a local thing, let's see what it is
if os.path.isdir(try_file):
return Spec(dirname=try_file)
elif os.path.exists(try_file):
return Spec(filename=try_file)
else:
raise ResourceNotFound(_("cannot find '%s'") % try_file)
# so we think it's a PGXN spec
# split operator/version and name
m = re.match(r'(.+?)(?:(==|=|>=|>|<=|<)(.*))?$', spec)
if m is None:
raise BadSpecError(
_("bad format for version specification: '%s'"), spec)
name = Term(m.group(1))
op = m.group(2)
if op == '=':
op = '=='
if op is not None:
ver = SemVer.clean(m.group(3))
else:
ver = None
return Spec(name, op, ver)
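# A few illustrative results (the local ones require the path to exist):
#   Spec.parse('pair')             -> name='pair'
#   Spec.parse('pair>=0.1.0')      -> name='pair', op='>=', ver='0.1.0'
#   Spec.parse('pair=0.1.0')       -> as above, with '=' normalized to '=='
#   Spec.parse('./pair-0.1.0.zip') -> filename='./pair-0.1.0.zip'
#   Spec.parse('http://example.org/pair-0.1.0.zip') -> url=<that string>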
def accepted(self, version, _map = {
'==': _op.eq, '<=': _op.le, '<': _op.lt, '>=': _op.ge, '>': _op.gt}):
"""Return True if the given version is accepted in the spec."""
if self.op is None:
return True
return _map[self.op](version, self.ver)
pgxnclient-1.2.1/pgxnclient/utils/ 0000775 0001750 0001750 00000000000 12143730213 017143 5 ustar piro piro 0000000 0000000 pgxnclient-1.2.1/pgxnclient/utils/semver.py 0000664 0001750 0001750 00000010731 12143727754 021040 0 ustar piro piro 0000000 0000000 """
SemVer -- (not quite) semantic version specification
http://semver.org/
IMPORTANT: don't trust this implementation. And don't trust SemVer AT ALL.
We have a bloody mess because the specification changed after being published
and after several extensions had been uploaded with version numbers that
suddenly were no longer valid.
https://github.com/mojombo/semver.org/issues/49
My plea for forking the spec and keeping our schema has been ignored. So this
module only tries to make sure people can use PGXN, not to be conform to an
half-aborted specification. End of rant.
This implementation conforms to the SemVer 0.3.0 implementation by David
Wheeler (http://pgxn.org/dist/semver/0.3.0/) and passes all its unit tests.
Note that it is slightly non-conformant to the original specification, as the
trailing part should be compared in ASCII order while our comparison is not
case sensitive. David has already stated that the meaning is independent of
the case (http://blog.pgxn.org/post/4948135198/case-insensitivity) and I'm
fine with that: the important thing is that the client and the server
understand each other.
"""
# Copyright (C) 2011-2013 Daniele Varrazzo
# This file is part of the PGXN client
import re
import operator
from pgxnclient.i18n import _
class SemVer(str):
"""A string representing a semantic version number.
Non valid version numbers raise ValueError.
"""
def __new__(cls, value):
self = str.__new__(cls, value)
self.tuple = SemVer.parse(value)
return self
@property
def major(self): return self.tuple[0]
@property
def minor(self): return self.tuple[1]
@property
def patch(self): return self.tuple[2]
@property
def trail(self): return self.tuple[3]
def __repr__(self):
return "%s(%r)" % (self.__class__.__name__, str(self))
def __eq__(self, other):
if isinstance(other, SemVer):
return self.tuple[:3] == other.tuple[:3] \
and self.tuple[3].lower() == other.tuple[3].lower()
elif isinstance(other, str):
return self == SemVer(other)
else:
return NotImplemented
def __ne__(self, other):
return not self == other
def __hash__(self):
return hash(self.tuple[:3] + (self.tuple[3].lower(),))
def _ltgt(self, other, op):
if isinstance(other, SemVer):
t1 = self.tuple[:3]
t2 = other.tuple[:3]
if t1 != t2: return op(t1, t2)
s1 = self.tuple[3].lower()
s2 = other.tuple[3].lower()
if s1 == s2: return False
if s1 and s2: return op(s1, s2)
return op(bool(s2), bool(s1))
elif isinstance(other, str):
return op(self, SemVer(other))
else:
return NotImplemented
def __lt__(self, other, op=operator.lt):
return self._ltgt(other, operator.lt)
def __gt__(self, other):
return self._ltgt(other, operator.gt)
def __ge__(self, other):
return not self < other
def __le__(self, other):
return not other < self
@classmethod
def parse(self, s):
"""
Split a valid version number in components (major, minor, patch, trail).
"""
m = re_semver.match(s)
if m is None:
raise ValueError(_("bad version number: '%s'") % s)
maj, min, patch, trail = m.groups()
if not patch: patch = 0
if not trail: trail = ''
return (int(maj), int(min), int(patch), trail)
@classmethod
def clean(self, s):
"""
Convert an invalid but still recognizable version number into a valid SemVer string.
"""
m = re_clean.match(s.strip())
if m is None:
raise ValueError(_("bad version number: '%s' - can't clean") % s)
maj, min, patch, trail = m.groups()
maj = maj and int(maj) or 0
min = min and int(min) or 0
patch = patch and int(patch) or 0
trail = trail and '-' + trail.strip() or ''
return "%d.%d.%d%s" % (maj, min, patch, trail)
re_semver = re.compile(r"""
^
(0|[1-9][0-9]*)
\. (0|[1-9][0-9]*)
\. (0|[1-9][0-9]*)
(?:
-? # should be mandatory, but see rant above
([a-z][a-z0-9-]*)
)?
$
""",
re.IGNORECASE | re.VERBOSE)
re_clean = re.compile(r"""
^
([0-9]+)?
\.? ([0-9]+)?
\.? ([0-9]+)?
(?:
-? \s*
([a-z][a-z0-9-]*)
)?
$
""",
re.IGNORECASE | re.VERBOSE)
pgxnclient-1.2.1/pgxnclient/utils/argparse.py 0000775 0001750 0001750 00000253447 12143727745 021363 0 ustar piro piro 0000000 0000000 # Author: Steven J. Bethard .
"""Command-line parsing library
This module is an optparse-inspired command-line parsing library that:
- handles both optional and positional arguments
- produces highly informative usage messages
- supports parsers that dispatch to sub-parsers
The following is a simple usage example that sums integers from the
command-line and writes the result to a file::
parser = argparse.ArgumentParser(
description='sum the integers at the command line')
parser.add_argument(
'integers', metavar='int', nargs='+', type=int,
help='an integer to be summed')
parser.add_argument(
'--log', default=sys.stdout, type=argparse.FileType('w'),
help='the file where the sum should be written')
args = parser.parse_args()
args.log.write('%s' % sum(args.integers))
args.log.close()
The module contains the following public classes:
- ArgumentParser -- The main entry point for command-line parsing. As the
example above shows, the add_argument() method is used to populate
the parser with actions for optional and positional arguments. Then
the parse_args() method is invoked to convert the args at the
command-line into an object with attributes.
- ArgumentError -- The exception raised by ArgumentParser objects when
there are errors with the parser's actions. Errors raised while
parsing the command-line are caught by ArgumentParser and emitted
as command-line messages.
- FileType -- A factory for defining types of files to be created. As the
example above shows, instances of FileType are typically passed as
the type= argument of add_argument() calls.
- Action -- The base class for parser actions. Typically actions are
selected by passing strings like 'store_true' or 'append_const' to
the action= argument of add_argument(). However, for greater
customization of ArgumentParser actions, subclasses of Action may
be defined and passed as the action= argument.
- HelpFormatter, RawDescriptionHelpFormatter, RawTextHelpFormatter,
ArgumentDefaultsHelpFormatter -- Formatter classes which
may be passed as the formatter_class= argument to the
ArgumentParser constructor. HelpFormatter is the default,
RawDescriptionHelpFormatter and RawTextHelpFormatter tell the parser
not to change the formatting for help text, and
ArgumentDefaultsHelpFormatter adds information about argument defaults
to the help.
All other classes in this module are considered implementation details.
(Also note that HelpFormatter and RawDescriptionHelpFormatter are only
considered public as object names -- the API of the formatter objects is
still considered an implementation detail.)
"""
__version__ = '1.2.1'
__all__ = [
'ArgumentParser',
'ArgumentError',
'ArgumentTypeError',
'FileType',
'HelpFormatter',
'ArgumentDefaultsHelpFormatter',
'RawDescriptionHelpFormatter',
'RawTextHelpFormatter',
'Namespace',
'Action',
'ONE_OR_MORE',
'OPTIONAL',
'PARSER',
'REMAINDER',
'SUPPRESS',
'ZERO_OR_MORE',
]
import copy as _copy
import os as _os
import re as _re
import sys as _sys
import textwrap as _textwrap
from gettext import gettext as _
try:
set
except NameError:
# for python < 2.4 compatibility (sets module is there since 2.3):
from sets import Set as set
try:
basestring
except NameError:
basestring = str
try:
sorted
except NameError:
# for python < 2.4 compatibility:
def sorted(iterable, reverse=False):
result = list(iterable)
result.sort()
if reverse:
result.reverse()
return result
def _callable(obj):
return hasattr(obj, '__call__') or hasattr(obj, '__bases__')
SUPPRESS = '==SUPPRESS=='
OPTIONAL = '?'
ZERO_OR_MORE = '*'
ONE_OR_MORE = '+'
PARSER = 'A...'
REMAINDER = '...'
_UNRECOGNIZED_ARGS_ATTR = '_unrecognized_args'
# =============================
# Utility functions and classes
# =============================
class _AttributeHolder(object):
"""Abstract base class that provides __repr__.
The __repr__ method returns a string in the format::
ClassName(attr=name, attr=name, ...)
The attributes are determined either by a class-level attribute,
'_kwarg_names', or by inspecting the instance __dict__.
"""
def __repr__(self):
type_name = type(self).__name__
arg_strings = []
for arg in self._get_args():
arg_strings.append(repr(arg))
for name, value in self._get_kwargs():
arg_strings.append('%s=%r' % (name, value))
return '%s(%s)' % (type_name, ', '.join(arg_strings))
def _get_kwargs(self):
return sorted(self.__dict__.items())
def _get_args(self):
return []
def _ensure_value(namespace, name, value):
if getattr(namespace, name, None) is None:
setattr(namespace, name, value)
return getattr(namespace, name)
# ===============
# Formatting Help
# ===============
class HelpFormatter(object):
"""Formatter for generating usage messages and argument help strings.
Only the name of this class is considered a public API. All the methods
provided by the class are considered an implementation detail.
"""
def __init__(self,
prog,
indent_increment=2,
max_help_position=24,
width=None):
# default setting for width
if width is None:
try:
width = int(_os.environ['COLUMNS'])
except (KeyError, ValueError):
width = 80
width -= 2
self._prog = prog
self._indent_increment = indent_increment
self._max_help_position = max_help_position
self._width = width
self._current_indent = 0
self._level = 0
self._action_max_length = 0
self._root_section = self._Section(self, None)
self._current_section = self._root_section
self._whitespace_matcher = _re.compile(r'\s+')
self._long_break_matcher = _re.compile(r'\n\n\n+')
# ===============================
# Section and indentation methods
# ===============================
def _indent(self):
self._current_indent += self._indent_increment
self._level += 1
def _dedent(self):
self._current_indent -= self._indent_increment
assert self._current_indent >= 0, 'Indent decreased below 0.'
self._level -= 1
class _Section(object):
def __init__(self, formatter, parent, heading=None):
self.formatter = formatter
self.parent = parent
self.heading = heading
self.items = []
def format_help(self):
# format the indented section
if self.parent is not None:
self.formatter._indent()
join = self.formatter._join_parts
for func, args in self.items:
func(*args)
item_help = join([func(*args) for func, args in self.items])
if self.parent is not None:
self.formatter._dedent()
# return nothing if the section was empty
if not item_help:
return ''
# add the heading if the section was non-empty
if self.heading is not SUPPRESS and self.heading is not None:
current_indent = self.formatter._current_indent
heading = '%*s%s:\n' % (current_indent, '', self.heading)
else:
heading = ''
# join the section-initial newline, the heading and the help
return join(['\n', heading, item_help, '\n'])
def _add_item(self, func, args):
self._current_section.items.append((func, args))
# ========================
# Message building methods
# ========================
def start_section(self, heading):
self._indent()
section = self._Section(self, self._current_section, heading)
self._add_item(section.format_help, [])
self._current_section = section
def end_section(self):
self._current_section = self._current_section.parent
self._dedent()
def add_text(self, text):
if text is not SUPPRESS and text is not None:
self._add_item(self._format_text, [text])
def add_usage(self, usage, actions, groups, prefix=None):
if usage is not SUPPRESS:
args = usage, actions, groups, prefix
self._add_item(self._format_usage, args)
def add_argument(self, action):
if action.help is not SUPPRESS:
# find all invocations
get_invocation = self._format_action_invocation
invocations = [get_invocation(action)]
for subaction in self._iter_indented_subactions(action):
invocations.append(get_invocation(subaction))
# update the maximum item length
invocation_length = max([len(s) for s in invocations])
action_length = invocation_length + self._current_indent
self._action_max_length = max(self._action_max_length,
action_length)
# add the item to the list
self._add_item(self._format_action, [action])
def add_arguments(self, actions):
for action in actions:
self.add_argument(action)
# =======================
# Help-formatting methods
# =======================
def format_help(self):
help = self._root_section.format_help()
if help:
help = self._long_break_matcher.sub('\n\n', help)
help = help.strip('\n') + '\n'
return help
def _join_parts(self, part_strings):
return ''.join([part
for part in part_strings
if part and part is not SUPPRESS])
def _format_usage(self, usage, actions, groups, prefix):
if prefix is None:
prefix = _('usage: ')
# if usage is specified, use that
if usage is not None:
usage = usage % dict(prog=self._prog)
# if no optionals or positionals are available, usage is just prog
elif usage is None and not actions:
usage = '%(prog)s' % dict(prog=self._prog)
# if optionals and positionals are available, calculate usage
elif usage is None:
prog = '%(prog)s' % dict(prog=self._prog)
# split optionals from positionals
optionals = []
positionals = []
for action in actions:
if action.option_strings:
optionals.append(action)
else:
positionals.append(action)
# build full usage string
format = self._format_actions_usage
action_usage = format(optionals + positionals, groups)
usage = ' '.join([s for s in [prog, action_usage] if s])
# wrap the usage parts if it's too long
text_width = self._width - self._current_indent
if len(prefix) + len(usage) > text_width:
# break usage into wrappable parts
part_regexp = r'\(.*?\)+|\[.*?\]+|\S+'
opt_usage = format(optionals, groups)
pos_usage = format(positionals, groups)
opt_parts = _re.findall(part_regexp, opt_usage)
pos_parts = _re.findall(part_regexp, pos_usage)
assert ' '.join(opt_parts) == opt_usage
assert ' '.join(pos_parts) == pos_usage
# helper for wrapping lines
def get_lines(parts, indent, prefix=None):
lines = []
line = []
if prefix is not None:
line_len = len(prefix) - 1
else:
line_len = len(indent) - 1
for part in parts:
if line_len + 1 + len(part) > text_width:
lines.append(indent + ' '.join(line))
line = []
line_len = len(indent) - 1
line.append(part)
line_len += len(part) + 1
if line:
lines.append(indent + ' '.join(line))
if prefix is not None:
lines[0] = lines[0][len(indent):]
return lines
# if prog is short, follow it with optionals or positionals
if len(prefix) + len(prog) <= 0.75 * text_width:
indent = ' ' * (len(prefix) + len(prog) + 1)
if opt_parts:
lines = get_lines([prog] + opt_parts, indent, prefix)
lines.extend(get_lines(pos_parts, indent))
elif pos_parts:
lines = get_lines([prog] + pos_parts, indent, prefix)
else:
lines = [prog]
# if prog is long, put it on its own line
else:
indent = ' ' * len(prefix)
parts = opt_parts + pos_parts
lines = get_lines(parts, indent)
if len(lines) > 1:
lines = []
lines.extend(get_lines(opt_parts, indent))
lines.extend(get_lines(pos_parts, indent))
lines = [prog] + lines
# join lines into usage
usage = '\n'.join(lines)
# prefix with 'usage:'
return '%s%s\n\n' % (prefix, usage)
def _format_actions_usage(self, actions, groups):
# find group indices and identify actions in groups
group_actions = set()
inserts = {}
for group in groups:
try:
start = actions.index(group._group_actions[0])
except ValueError:
continue
else:
end = start + len(group._group_actions)
if actions[start:end] == group._group_actions:
for action in group._group_actions:
group_actions.add(action)
if not group.required:
if start in inserts:
inserts[start] += ' ['
else:
inserts[start] = '['
inserts[end] = ']'
else:
if start in inserts:
inserts[start] += ' ('
else:
inserts[start] = '('
inserts[end] = ')'
for i in range(start + 1, end):
inserts[i] = '|'
# collect all actions format strings
parts = []
for i, action in enumerate(actions):
# suppressed arguments are marked with None
# remove | separators for suppressed arguments
if action.help is SUPPRESS:
parts.append(None)
if inserts.get(i) == '|':
inserts.pop(i)
elif inserts.get(i + 1) == '|':
inserts.pop(i + 1)
# produce all arg strings
elif not action.option_strings:
part = self._format_args(action, action.dest)
# if it's in a group, strip the outer []
if action in group_actions:
if part[0] == '[' and part[-1] == ']':
part = part[1:-1]
# add the action string to the list
parts.append(part)
# produce the first way to invoke the option in brackets
else:
option_string = action.option_strings[0]
# if the Optional doesn't take a value, format is:
# -s or --long
if action.nargs == 0:
part = '%s' % option_string
# if the Optional takes a value, format is:
# -s ARGS or --long ARGS
else:
default = action.dest.upper()
args_string = self._format_args(action, default)
part = '%s %s' % (option_string, args_string)
# make it look optional if it's not required or in a group
if not action.required and action not in group_actions:
part = '[%s]' % part
# add the action string to the list
parts.append(part)
# insert things at the necessary indices
for i in sorted(inserts, reverse=True):
parts[i:i] = [inserts[i]]
# join all the action items with spaces
text = ' '.join([item for item in parts if item is not None])
# clean up separators for mutually exclusive groups
open = r'[\[(]'
close = r'[\])]'
text = _re.sub(r'(%s) ' % open, r'\1', text)
text = _re.sub(r' (%s)' % close, r'\1', text)
text = _re.sub(r'%s *%s' % (open, close), r'', text)
text = _re.sub(r'\(([^|]*)\)', r'\1', text)
text = text.strip()
# return the text
return text
def _format_text(self, text):
if '%(prog)' in text:
text = text % dict(prog=self._prog)
text_width = self._width - self._current_indent
indent = ' ' * self._current_indent
return self._fill_text(text, text_width, indent) + '\n\n'
def _format_action(self, action):
# determine the required width and the entry label
help_position = min(self._action_max_length + 2,
self._max_help_position)
help_width = self._width - help_position
action_width = help_position - self._current_indent - 2
action_header = self._format_action_invocation(action)
# no help; start on same line and add a final newline
if not action.help:
tup = self._current_indent, '', action_header
action_header = '%*s%s\n' % tup
# short action name; start on the same line and pad two spaces
elif len(action_header) <= action_width:
tup = self._current_indent, '', action_width, action_header
action_header = '%*s%-*s ' % tup
indent_first = 0
# long action name; start on the next line
else:
tup = self._current_indent, '', action_header
action_header = '%*s%s\n' % tup
indent_first = help_position
# collect the pieces of the action help
parts = [action_header]
# if there was help for the action, add lines of help text
if action.help:
help_text = self._expand_help(action)
help_lines = self._split_lines(help_text, help_width)
parts.append('%*s%s\n' % (indent_first, '', help_lines[0]))
for line in help_lines[1:]:
parts.append('%*s%s\n' % (help_position, '', line))
# or add a newline if the description doesn't end with one
elif not action_header.endswith('\n'):
parts.append('\n')
# if there are any sub-actions, add their help as well
for subaction in self._iter_indented_subactions(action):
parts.append(self._format_action(subaction))
# return a single string
return self._join_parts(parts)
def _format_action_invocation(self, action):
if not action.option_strings:
metavar, = self._metavar_formatter(action, action.dest)(1)
return metavar
else:
parts = []
# if the Optional doesn't take a value, format is:
# -s, --long
if action.nargs == 0:
parts.extend(action.option_strings)
# if the Optional takes a value, format is:
# -s ARGS, --long ARGS
else:
default = action.dest.upper()
args_string = self._format_args(action, default)
for option_string in action.option_strings:
parts.append('%s %s' % (option_string, args_string))
return ', '.join(parts)
def _metavar_formatter(self, action, default_metavar):
if action.metavar is not None:
result = action.metavar
elif action.choices is not None:
choice_strs = [str(choice) for choice in action.choices]
choice_strs.sort()
result = '{%s}' % ','.join(choice_strs)
else:
result = default_metavar
def format(tuple_size):
if isinstance(result, tuple):
return result
else:
return (result, ) * tuple_size
return format
def _format_args(self, action, default_metavar):
get_metavar = self._metavar_formatter(action, default_metavar)
if action.nargs is None:
result = '%s' % get_metavar(1)
elif action.nargs == OPTIONAL:
result = '[%s]' % get_metavar(1)
elif action.nargs == ZERO_OR_MORE:
result = '[%s [%s ...]]' % get_metavar(2)
elif action.nargs == ONE_OR_MORE:
result = '%s [%s ...]' % get_metavar(2)
elif action.nargs == REMAINDER:
result = '...'
elif action.nargs == PARSER:
result = '%s ...' % get_metavar(1)
else:
formats = ['%s' for _ in range(action.nargs)]
result = ' '.join(formats) % get_metavar(action.nargs)
return result
def _expand_help(self, action):
params = dict(vars(action), prog=self._prog)
for name in list(params):
if params[name] is SUPPRESS:
del params[name]
for name in list(params):
if hasattr(params[name], '__name__'):
params[name] = params[name].__name__
if params.get('choices') is not None:
choices_str = ', '.join([str(c) for c in params['choices']])
params['choices'] = choices_str
return self._get_help_string(action) % params
def _iter_indented_subactions(self, action):
try:
get_subactions = action._get_subactions
except AttributeError:
pass
else:
self._indent()
for subaction in get_subactions():
yield subaction
self._dedent()
def _split_lines(self, text, width):
text = self._whitespace_matcher.sub(' ', text).strip()
return _textwrap.wrap(text, width)
def _fill_text(self, text, width, indent):
text = self._whitespace_matcher.sub(' ', text).strip()
return _textwrap.fill(text, width, initial_indent=indent,
subsequent_indent=indent)
def _get_help_string(self, action):
return action.help
class RawDescriptionHelpFormatter(HelpFormatter):
"""Help message formatter which retains any formatting in descriptions.
Only the name of this class is considered a public API. All the methods
provided by the class are considered an implementation detail.
"""
def _fill_text(self, text, width, indent):
return ''.join([indent + line for line in text.splitlines(True)])
class RawTextHelpFormatter(RawDescriptionHelpFormatter):
"""Help message formatter which retains formatting of all help text.
Only the name of this class is considered a public API. All the methods
provided by the class are considered an implementation detail.
"""
def _split_lines(self, text, width):
return text.splitlines()
class ArgumentDefaultsHelpFormatter(HelpFormatter):
"""Help message formatter which adds default values to argument help.
Only the name of this class is considered a public API. All the methods
provided by the class are considered an implementation detail.
"""
def _get_help_string(self, action):
help = action.help
if '%(default)' not in action.help:
if action.default is not SUPPRESS:
defaulting_nargs = [OPTIONAL, ZERO_OR_MORE]
if action.option_strings or action.nargs in defaulting_nargs:
help += ' (default: %(default)s)'
return help
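# A minimal usage sketch, not part of the original module: the formatter
# classes above are chosen by passing one as ``formatter_class`` when a
# parser is created. The program and option names below are invented for
# illustration only.
#
#   parser = ArgumentParser(prog='demo',
#                           formatter_class=ArgumentDefaultsHelpFormatter)
#   parser.add_argument('--retries', type=int, default=3, help='retry count')
#   parser.print_help()   # the help line would end with "(default: 3)"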
# =====================
# Options and Arguments
# =====================
def _get_action_name(argument):
if argument is None:
return None
elif argument.option_strings:
return '/'.join(argument.option_strings)
elif argument.metavar not in (None, SUPPRESS):
return argument.metavar
elif argument.dest not in (None, SUPPRESS):
return argument.dest
else:
return None
class ArgumentError(Exception):
"""An error from creating or using an argument (optional or positional).
The string value of this exception is the message, augmented with
information about the argument that caused it.
"""
def __init__(self, argument, message):
self.argument_name = _get_action_name(argument)
self.message = message
def __str__(self):
if self.argument_name is None:
format = '%(message)s'
else:
format = 'argument %(argument_name)s: %(message)s'
return format % dict(message=self.message,
argument_name=self.argument_name)
class ArgumentTypeError(Exception):
"""An error from trying to convert a command line string to a type."""
pass
# ==============
# Action classes
# ==============
class Action(_AttributeHolder):
"""Information about how to convert command line strings to Python objects.
Action objects are used by an ArgumentParser to represent the information
needed to parse a single argument from one or more strings from the
command line. The keyword arguments to the Action constructor are also
all attributes of Action instances.
Keyword Arguments:
- option_strings -- A list of command-line option strings which
should be associated with this action.
- dest -- The name of the attribute to hold the created object(s)
- nargs -- The number of command-line arguments that should be
consumed. By default, one argument will be consumed and a single
value will be produced. Other values include:
- N (an integer) consumes N arguments (and produces a list)
- '?' consumes zero or one arguments
- '*' consumes zero or more arguments (and produces a list)
- '+' consumes one or more arguments (and produces a list)
Note that the difference between the default and nargs=1 is that
with the default, a single value will be produced, while with
nargs=1, a list containing a single value will be produced.
- const -- The value to be produced if the option is specified and the
option uses an action that takes no values.
- default -- The value to be produced if the option is not specified.
- type -- The type which the command-line arguments should be converted
to, should be one of 'string', 'int', 'float', 'complex' or a
callable object that accepts a single string argument. If None,
'string' is assumed.
- choices -- A container of values that should be allowed. If not None,
after a command-line argument has been converted to the appropriate
type, an exception will be raised if it is not a member of this
collection.
- required -- True if the action must always be specified at the
command line. This is only meaningful for optional command-line
arguments.
- help -- The help string describing the argument.
- metavar -- The name to be used for the option's argument with the
help string. If None, the 'dest' value will be used as the name.
"""
def __init__(self,
option_strings,
dest,
nargs=None,
const=None,
default=None,
type=None,
choices=None,
required=False,
help=None,
metavar=None):
self.option_strings = option_strings
self.dest = dest
self.nargs = nargs
self.const = const
self.default = default
self.type = type
self.choices = choices
self.required = required
self.help = help
self.metavar = metavar
def _get_kwargs(self):
names = [
'option_strings',
'dest',
'nargs',
'const',
'default',
'type',
'choices',
'help',
'metavar',
]
return [(name, getattr(self, name)) for name in names]
def __call__(self, parser, namespace, values, option_string=None):
raise NotImplementedError(_('.__call__() not defined'))
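# A hedged sketch of how Action is meant to be subclassed; the concrete
# actions below follow the same pattern. The class and option names are
# invented for illustration and are not part of this module.
#
#   parser = ArgumentParser(prog='demo')          # hypothetical parser
#   class _UpperAction(Action):
#       def __call__(self, parser, namespace, values, option_string=None):
#           # store the converted value upper-cased on the namespace
#           setattr(namespace, self.dest, values.upper())
#   parser.add_argument('--name', action=_UpperAction)
#   ns = parser.parse_args(['--name', 'pgxn'])    # ns.name == 'PGXN'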
class _StoreAction(Action):
def __init__(self,
option_strings,
dest,
nargs=None,
const=None,
default=None,
type=None,
choices=None,
required=False,
help=None,
metavar=None):
if nargs == 0:
raise ValueError('nargs for store actions must be > 0; if you '
'have nothing to store, actions such as store '
'true or store const may be more appropriate')
if const is not None and nargs != OPTIONAL:
raise ValueError('nargs must be %r to supply const' % OPTIONAL)
super(_StoreAction, self).__init__(
option_strings=option_strings,
dest=dest,
nargs=nargs,
const=const,
default=default,
type=type,
choices=choices,
required=required,
help=help,
metavar=metavar)
def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, values)
class _StoreConstAction(Action):
def __init__(self,
option_strings,
dest,
const,
default=None,
required=False,
help=None,
metavar=None):
super(_StoreConstAction, self).__init__(
option_strings=option_strings,
dest=dest,
nargs=0,
const=const,
default=default,
required=required,
help=help)
def __call__(self, parser, namespace, values, option_string=None):
setattr(namespace, self.dest, self.const)
class _StoreTrueAction(_StoreConstAction):
def __init__(self,
option_strings,
dest,
default=False,
required=False,
help=None):
super(_StoreTrueAction, self).__init__(
option_strings=option_strings,
dest=dest,
const=True,
default=default,
required=required,
help=help)
class _StoreFalseAction(_StoreConstAction):
def __init__(self,
option_strings,
dest,
default=True,
required=False,
help=None):
super(_StoreFalseAction, self).__init__(
option_strings=option_strings,
dest=dest,
const=False,
default=default,
required=required,
help=help)
class _AppendAction(Action):
def __init__(self,
option_strings,
dest,
nargs=None,
const=None,
default=None,
type=None,
choices=None,
required=False,
help=None,
metavar=None):
if nargs == 0:
raise ValueError('nargs for append actions must be > 0; if arg '
'strings are not supplying the value to append, '
'the append const action may be more appropriate')
if const is not None and nargs != OPTIONAL:
raise ValueError('nargs must be %r to supply const' % OPTIONAL)
super(_AppendAction, self).__init__(
option_strings=option_strings,
dest=dest,
nargs=nargs,
const=const,
default=default,
type=type,
choices=choices,
required=required,
help=help,
metavar=metavar)
def __call__(self, parser, namespace, values, option_string=None):
items = _copy.copy(_ensure_value(namespace, self.dest, []))
items.append(values)
setattr(namespace, self.dest, items)
class _AppendConstAction(Action):
def __init__(self,
option_strings,
dest,
const,
default=None,
required=False,
help=None,
metavar=None):
super(_AppendConstAction, self).__init__(
option_strings=option_strings,
dest=dest,
nargs=0,
const=const,
default=default,
required=required,
help=help,
metavar=metavar)
def __call__(self, parser, namespace, values, option_string=None):
items = _copy.copy(_ensure_value(namespace, self.dest, []))
items.append(self.const)
setattr(namespace, self.dest, items)
class _CountAction(Action):
def __init__(self,
option_strings,
dest,
default=None,
required=False,
help=None):
super(_CountAction, self).__init__(
option_strings=option_strings,
dest=dest,
nargs=0,
default=default,
required=required,
help=help)
def __call__(self, parser, namespace, values, option_string=None):
new_count = _ensure_value(namespace, self.dest, 0) + 1
setattr(namespace, self.dest, new_count)
class _HelpAction(Action):
def __init__(self,
option_strings,
dest=SUPPRESS,
default=SUPPRESS,
help=None):
super(_HelpAction, self).__init__(
option_strings=option_strings,
dest=dest,
default=default,
nargs=0,
help=help)
def __call__(self, parser, namespace, values, option_string=None):
parser.print_help()
parser.exit()
class _VersionAction(Action):
def __init__(self,
option_strings,
version=None,
dest=SUPPRESS,
default=SUPPRESS,
help="show program's version number and exit"):
super(_VersionAction, self).__init__(
option_strings=option_strings,
dest=dest,
default=default,
nargs=0,
help=help)
self.version = version
def __call__(self, parser, namespace, values, option_string=None):
version = self.version
if version is None:
version = parser.version
formatter = parser._get_formatter()
formatter.add_text(version)
parser.exit(message=formatter.format_help())
class _SubParsersAction(Action):
class _ChoicesPseudoAction(Action):
def __init__(self, name, help):
sup = super(_SubParsersAction._ChoicesPseudoAction, self)
sup.__init__(option_strings=[], dest=name, help=help)
def __init__(self,
option_strings,
prog,
parser_class,
dest=SUPPRESS,
help=None,
metavar=None):
self._prog_prefix = prog
self._parser_class = parser_class
self._name_parser_map = {}
self._choices_actions = []
super(_SubParsersAction, self).__init__(
option_strings=option_strings,
dest=dest,
nargs=PARSER,
choices=self._name_parser_map,
help=help,
metavar=metavar)
def add_parser(self, name, **kwargs):
# set prog from the existing prefix
if kwargs.get('prog') is None:
kwargs['prog'] = '%s %s' % (self._prog_prefix, name)
# create a pseudo-action to hold the choice help
if 'help' in kwargs:
help = kwargs.pop('help')
choice_action = self._ChoicesPseudoAction(name, help)
self._choices_actions.append(choice_action)
# create the parser and add it to the map
parser = self._parser_class(**kwargs)
self._name_parser_map[name] = parser
return parser
def _get_subactions(self):
return self._choices_actions
def __call__(self, parser, namespace, values, option_string=None):
parser_name = values[0]
arg_strings = values[1:]
# set the parser name if requested
if self.dest is not SUPPRESS:
setattr(namespace, self.dest, parser_name)
# select the parser
try:
parser = self._name_parser_map[parser_name]
except KeyError:
tup = parser_name, ', '.join(self._name_parser_map)
msg = _('unknown parser %r (choices: %s)') % tup
raise ArgumentError(self, msg)
# parse all the remaining options into the namespace
# store any unrecognized options on the object, so that the top
# level parser can decide what to do with them
namespace, arg_strings = parser.parse_known_args(arg_strings, namespace)
if arg_strings:
vars(namespace).setdefault(_UNRECOGNIZED_ARGS_ATTR, [])
getattr(namespace, _UNRECOGNIZED_ARGS_ATTR).extend(arg_strings)
# ==============
# Type classes
# ==============
class FileType(object):
"""Factory for creating file object types
Instances of FileType are typically passed as type= arguments to the
ArgumentParser add_argument() method.
Keyword Arguments:
- mode -- A string indicating how the file is to be opened. Accepts the
same values as the builtin open() function.
- bufsize -- The file's desired buffer size. Accepts the same values as
the builtin open() function.
"""
def __init__(self, mode='r', bufsize=None):
self._mode = mode
self._bufsize = bufsize
def __call__(self, string):
# the special argument "-" means sys.std{in,out}
if string == '-':
if 'r' in self._mode:
return _sys.stdin
elif 'w' in self._mode:
return _sys.stdout
else:
msg = _('argument "-" with mode %r') % self._mode
raise ValueError(msg)
# all other arguments are used as file names
if self._bufsize:
return open(string, self._mode, self._bufsize)
else:
return open(string, self._mode)
def __repr__(self):
args = [self._mode, self._bufsize]
args_str = ', '.join([repr(arg) for arg in args if arg is not None])
return '%s(%s)' % (type(self).__name__, args_str)
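# A brief usage sketch for FileType, with invented file and option names:
# passing an instance as ``type=`` makes the parser open the file, and '-'
# is mapped to stdin/stdout as described in the docstring above.
#
#   parser = ArgumentParser(prog='demo')
#   parser.add_argument('infile', type=FileType('r'))
#   parser.add_argument('--log', type=FileType('w'))
#   ns = parser.parse_args(['data.txt', '--log', 'out.log'])
#   # ns.infile and ns.log are open file objects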
# ===========================
# Optional and Positional Parsing
# ===========================
class Namespace(_AttributeHolder):
"""Simple object for storing attributes.
Implements equality by attribute names and values, and provides a simple
string representation.
"""
def __init__(self, **kwargs):
for name in kwargs:
setattr(self, name, kwargs[name])
__hash__ = None
def __eq__(self, other):
return vars(self) == vars(other)
def __ne__(self, other):
return not (self == other)
def __contains__(self, key):
return key in self.__dict__
class _ActionsContainer(object):
def __init__(self,
description,
prefix_chars,
argument_default,
conflict_handler):
super(_ActionsContainer, self).__init__()
self.description = description
self.argument_default = argument_default
self.prefix_chars = prefix_chars
self.conflict_handler = conflict_handler
# set up registries
self._registries = {}
# register actions
self.register('action', None, _StoreAction)
self.register('action', 'store', _StoreAction)
self.register('action', 'store_const', _StoreConstAction)
self.register('action', 'store_true', _StoreTrueAction)
self.register('action', 'store_false', _StoreFalseAction)
self.register('action', 'append', _AppendAction)
self.register('action', 'append_const', _AppendConstAction)
self.register('action', 'count', _CountAction)
self.register('action', 'help', _HelpAction)
self.register('action', 'version', _VersionAction)
self.register('action', 'parsers', _SubParsersAction)
# raise an exception if the conflict handler is invalid
self._get_handler()
# action storage
self._actions = []
self._option_string_actions = {}
# groups
self._action_groups = []
self._mutually_exclusive_groups = []
# defaults storage
self._defaults = {}
# determines whether an "option" looks like a negative number
self._negative_number_matcher = _re.compile(r'^-\d+$|^-\d*\.\d+$')
# whether or not there are any optionals that look like negative
# numbers -- uses a list so it can be shared and edited
self._has_negative_number_optionals = []
# ====================
# Registration methods
# ====================
def register(self, registry_name, value, object):
registry = self._registries.setdefault(registry_name, {})
registry[value] = object
def _registry_get(self, registry_name, value, default=None):
return self._registries[registry_name].get(value, default)
# ==================================
# Namespace default accessor methods
# ==================================
def set_defaults(self, **kwargs):
self._defaults.update(kwargs)
# if these defaults match any existing arguments, replace
# the previous default on the object with the new one
for action in self._actions:
if action.dest in kwargs:
action.default = kwargs[action.dest]
def get_default(self, dest):
for action in self._actions:
if action.dest == dest and action.default is not None:
return action.default
return self._defaults.get(dest, None)
# =======================
# Adding argument actions
# =======================
def add_argument(self, *args, **kwargs):
"""
add_argument(dest, ..., name=value, ...)
add_argument(option_string, option_string, ..., name=value, ...)
"""
# if no positional args are supplied or only one is supplied and
# it doesn't look like an option string, parse a positional
# argument
chars = self.prefix_chars
if not args or len(args) == 1 and args[0][0] not in chars:
if args and 'dest' in kwargs:
raise ValueError('dest supplied twice for positional argument')
kwargs = self._get_positional_kwargs(*args, **kwargs)
# otherwise, we're adding an optional argument
else:
kwargs = self._get_optional_kwargs(*args, **kwargs)
# if no default was supplied, use the parser-level default
if 'default' not in kwargs:
dest = kwargs['dest']
if dest in self._defaults:
kwargs['default'] = self._defaults[dest]
elif self.argument_default is not None:
kwargs['default'] = self.argument_default
# create the action object, and add it to the parser
action_class = self._pop_action_class(kwargs)
if not _callable(action_class):
raise ValueError('unknown action "%s"' % action_class)
action = action_class(**kwargs)
# raise an error if the action type is not callable
type_func = self._registry_get('type', action.type, action.type)
if not _callable(type_func):
raise ValueError('%r is not callable' % type_func)
return self._add_action(action)
def add_argument_group(self, *args, **kwargs):
group = _ArgumentGroup(self, *args, **kwargs)
self._action_groups.append(group)
return group
def add_mutually_exclusive_group(self, **kwargs):
group = _MutuallyExclusiveGroup(self, **kwargs)
self._mutually_exclusive_groups.append(group)
return group
def _add_action(self, action):
# resolve any conflicts
self._check_conflict(action)
# add to actions list
self._actions.append(action)
action.container = self
# index the action by any option strings it has
for option_string in action.option_strings:
self._option_string_actions[option_string] = action
# set the flag if any option strings look like negative numbers
for option_string in action.option_strings:
if self._negative_number_matcher.match(option_string):
if not self._has_negative_number_optionals:
self._has_negative_number_optionals.append(True)
# return the created action
return action
def _remove_action(self, action):
self._actions.remove(action)
def _add_container_actions(self, container):
# collect groups by titles
title_group_map = {}
for group in self._action_groups:
if group.title in title_group_map:
msg = _('cannot merge actions - two groups are named %r')
raise ValueError(msg % (group.title))
title_group_map[group.title] = group
# map each action to its group
group_map = {}
for group in container._action_groups:
# if a group with the title exists, use that, otherwise
# create a new group matching the container's group
if group.title not in title_group_map:
title_group_map[group.title] = self.add_argument_group(
title=group.title,
description=group.description,
conflict_handler=group.conflict_handler)
# map the actions to their new group
for action in group._group_actions:
group_map[action] = title_group_map[group.title]
# add container's mutually exclusive groups
# NOTE: if add_mutually_exclusive_group ever gains title= and
# description= then this code will need to be expanded as above
for group in container._mutually_exclusive_groups:
mutex_group = self.add_mutually_exclusive_group(
required=group.required)
# map the actions to their new mutex group
for action in group._group_actions:
group_map[action] = mutex_group
# add all actions to this container or their group
for action in container._actions:
group_map.get(action, self)._add_action(action)
def _get_positional_kwargs(self, dest, **kwargs):
# make sure required is not specified
if 'required' in kwargs:
msg = _("'required' is an invalid argument for positionals")
raise TypeError(msg)
# mark positional arguments as required if at least one is
# always required
if kwargs.get('nargs') not in [OPTIONAL, ZERO_OR_MORE]:
kwargs['required'] = True
if kwargs.get('nargs') == ZERO_OR_MORE and 'default' not in kwargs:
kwargs['required'] = True
# return the keyword arguments with no option strings
return dict(kwargs, dest=dest, option_strings=[])
def _get_optional_kwargs(self, *args, **kwargs):
# determine short and long option strings
option_strings = []
long_option_strings = []
for option_string in args:
# error on strings that don't start with an appropriate prefix
if not option_string[0] in self.prefix_chars:
msg = _('invalid option string %r: '
'must start with a character %r')
tup = option_string, self.prefix_chars
raise ValueError(msg % tup)
# strings starting with two prefix characters are long options
option_strings.append(option_string)
if option_string[0] in self.prefix_chars:
if len(option_string) > 1:
if option_string[1] in self.prefix_chars:
long_option_strings.append(option_string)
# infer destination, '--foo-bar' -> 'foo_bar' and '-x' -> 'x'
dest = kwargs.pop('dest', None)
if dest is None:
if long_option_strings:
dest_option_string = long_option_strings[0]
else:
dest_option_string = option_strings[0]
dest = dest_option_string.lstrip(self.prefix_chars)
if not dest:
msg = _('dest= is required for options like %r')
raise ValueError(msg % option_string)
dest = dest.replace('-', '_')
# return the updated keyword arguments
return dict(kwargs, dest=dest, option_strings=option_strings)
def _pop_action_class(self, kwargs, default=None):
action = kwargs.pop('action', default)
return self._registry_get('action', action, action)
def _get_handler(self):
# determine function from conflict handler string
handler_func_name = '_handle_conflict_%s' % self.conflict_handler
try:
return getattr(self, handler_func_name)
except AttributeError:
msg = _('invalid conflict_handler value: %r')
raise ValueError(msg % self.conflict_handler)
def _check_conflict(self, action):
# find all options that conflict with this option
confl_optionals = []
for option_string in action.option_strings:
if option_string in self._option_string_actions:
confl_optional = self._option_string_actions[option_string]
confl_optionals.append((option_string, confl_optional))
# resolve any conflicts
if confl_optionals:
conflict_handler = self._get_handler()
conflict_handler(action, confl_optionals)
def _handle_conflict_error(self, action, conflicting_actions):
message = _('conflicting option string(s): %s')
conflict_string = ', '.join([option_string
for option_string, action
in conflicting_actions])
raise ArgumentError(action, message % conflict_string)
def _handle_conflict_resolve(self, action, conflicting_actions):
# remove all conflicting options
for option_string, action in conflicting_actions:
# remove the conflicting option
action.option_strings.remove(option_string)
self._option_string_actions.pop(option_string, None)
# if the option now has no option string, remove it from the
# container holding it
if not action.option_strings:
action.container._remove_action(action)
class _ArgumentGroup(_ActionsContainer):
def __init__(self, container, title=None, description=None, **kwargs):
# add any missing keyword arguments by checking the container
update = kwargs.setdefault
update('conflict_handler', container.conflict_handler)
update('prefix_chars', container.prefix_chars)
update('argument_default', container.argument_default)
super_init = super(_ArgumentGroup, self).__init__
super_init(description=description, **kwargs)
# group attributes
self.title = title
self._group_actions = []
# share most attributes with the container
self._registries = container._registries
self._actions = container._actions
self._option_string_actions = container._option_string_actions
self._defaults = container._defaults
self._has_negative_number_optionals = \
container._has_negative_number_optionals
def _add_action(self, action):
action = super(_ArgumentGroup, self)._add_action(action)
self._group_actions.append(action)
return action
def _remove_action(self, action):
super(_ArgumentGroup, self)._remove_action(action)
self._group_actions.remove(action)
class _MutuallyExclusiveGroup(_ArgumentGroup):
def __init__(self, container, required=False):
super(_MutuallyExclusiveGroup, self).__init__(container)
self.required = required
self._container = container
def _add_action(self, action):
if action.required:
msg = _('mutually exclusive arguments must be optional')
raise ValueError(msg)
action = self._container._add_action(action)
self._group_actions.append(action)
return action
def _remove_action(self, action):
self._container._remove_action(action)
self._group_actions.remove(action)
class ArgumentParser(_AttributeHolder, _ActionsContainer):
"""Object for parsing command line strings into Python objects.
Keyword Arguments:
- prog -- The name of the program (default: sys.argv[0])
- usage -- A usage message (default: auto-generated from arguments)
- description -- A description of what the program does
- epilog -- Text following the argument descriptions
- parents -- Parsers whose arguments should be copied into this one
- formatter_class -- HelpFormatter class for printing help messages
- prefix_chars -- Characters that prefix optional arguments
- fromfile_prefix_chars -- Characters that prefix files containing
additional arguments
- argument_default -- The default value for all arguments
- conflict_handler -- String indicating how to handle conflicts
- add_help -- Add a -h/--help option
"""
def __init__(self,
prog=None,
usage=None,
description=None,
epilog=None,
version=None,
parents=[],
formatter_class=HelpFormatter,
prefix_chars='-',
fromfile_prefix_chars=None,
argument_default=None,
conflict_handler='error',
add_help=True):
if version is not None:
import warnings
warnings.warn(
"""The "version" argument to ArgumentParser is deprecated. """
"""Please use """
""""add_argument(..., action='version', version="N", ...)" """
"""instead""", DeprecationWarning)
superinit = super(ArgumentParser, self).__init__
superinit(description=description,
prefix_chars=prefix_chars,
argument_default=argument_default,
conflict_handler=conflict_handler)
# default setting for prog
if prog is None:
prog = _os.path.basename(_sys.argv[0])
self.prog = prog
self.usage = usage
self.epilog = epilog
self.version = version
self.formatter_class = formatter_class
self.fromfile_prefix_chars = fromfile_prefix_chars
self.add_help = add_help
add_group = self.add_argument_group
self._positionals = add_group(_('positional arguments'))
self._optionals = add_group(_('optional arguments'))
self._subparsers = None
# register types
def identity(string):
return string
self.register('type', None, identity)
# add help and version arguments if necessary
# (using explicit default to override global argument_default)
if '-' in prefix_chars:
default_prefix = '-'
else:
default_prefix = prefix_chars[0]
if self.add_help:
self.add_argument(
default_prefix*2+'help',
action='help', default=SUPPRESS,
help=_('show this help message and exit'))
if self.version:
self.add_argument(
default_prefix+'v', default_prefix*2+'version',
action='version', default=SUPPRESS,
version=self.version,
help=_("show program's version number and exit"))
# add parent arguments and defaults
for parent in parents:
self._add_container_actions(parent)
try:
defaults = parent._defaults
except AttributeError:
pass
else:
self._defaults.update(defaults)
# =======================
# Pretty __repr__ methods
# =======================
def _get_kwargs(self):
names = [
'prog',
'usage',
'description',
'version',
'formatter_class',
'conflict_handler',
'add_help',
]
return [(name, getattr(self, name)) for name in names]
# ==================================
# Optional/Positional adding methods
# ==================================
def add_subparsers(self, **kwargs):
if self._subparsers is not None:
self.error(_('cannot have multiple subparser arguments'))
# add the parser class to the arguments if it's not present
kwargs.setdefault('parser_class', type(self))
if 'title' in kwargs or 'description' in kwargs:
title = _(kwargs.pop('title', 'subcommands'))
description = _(kwargs.pop('description', None))
self._subparsers = self.add_argument_group(title, description)
else:
self._subparsers = self._positionals
# prog defaults to the usage message of this parser, skipping
# optional arguments and with no "usage:" prefix
if kwargs.get('prog') is None:
formatter = self._get_formatter()
positionals = self._get_positional_actions()
groups = self._mutually_exclusive_groups
formatter.add_usage(self.usage, positionals, groups, '')
kwargs['prog'] = formatter.format_help().strip()
# create the parsers action and add it to the positionals list
parsers_class = self._pop_action_class(kwargs, 'parsers')
action = parsers_class(option_strings=[], **kwargs)
self._subparsers._add_action(action)
# return the created parsers action
return action
def _add_action(self, action):
if action.option_strings:
self._optionals._add_action(action)
else:
self._positionals._add_action(action)
return action
def _get_optional_actions(self):
return [action
for action in self._actions
if action.option_strings]
def _get_positional_actions(self):
return [action
for action in self._actions
if not action.option_strings]
# =====================================
# Command line argument parsing methods
# =====================================
def parse_args(self, args=None, namespace=None):
args, argv = self.parse_known_args(args, namespace)
if argv:
msg = _('unrecognized arguments: %s')
self.error(msg % ' '.join(argv))
return args
def parse_known_args(self, args=None, namespace=None):
# args default to the system args
if args is None:
args = _sys.argv[1:]
# default Namespace built from parser defaults
if namespace is None:
namespace = Namespace()
# add any action defaults that aren't present
for action in self._actions:
if action.dest is not SUPPRESS:
if not hasattr(namespace, action.dest):
if action.default is not SUPPRESS:
default = action.default
if isinstance(action.default, basestring):
default = self._get_value(action, default)
setattr(namespace, action.dest, default)
# add any parser defaults that aren't present
for dest in self._defaults:
if not hasattr(namespace, dest):
setattr(namespace, dest, self._defaults[dest])
# parse the arguments and exit if there are any errors
try:
namespace, args = self._parse_known_args(args, namespace)
if hasattr(namespace, _UNRECOGNIZED_ARGS_ATTR):
args.extend(getattr(namespace, _UNRECOGNIZED_ARGS_ATTR))
delattr(namespace, _UNRECOGNIZED_ARGS_ATTR)
return namespace, args
except ArgumentError:
err = _sys.exc_info()[1]
self.error(str(err))
def _parse_known_args(self, arg_strings, namespace):
# replace arg strings that are file references
if self.fromfile_prefix_chars is not None:
arg_strings = self._read_args_from_files(arg_strings)
# map all mutually exclusive arguments to the other arguments
# they can't occur with
action_conflicts = {}
for mutex_group in self._mutually_exclusive_groups:
group_actions = mutex_group._group_actions
for i, mutex_action in enumerate(mutex_group._group_actions):
conflicts = action_conflicts.setdefault(mutex_action, [])
conflicts.extend(group_actions[:i])
conflicts.extend(group_actions[i + 1:])
# find all option indices, and determine the arg_string_pattern
# which has an 'O' if there is an option at an index,
# an 'A' if there is an argument, or a '-' if there is a '--'
option_string_indices = {}
arg_string_pattern_parts = []
arg_strings_iter = iter(arg_strings)
for i, arg_string in enumerate(arg_strings_iter):
# all args after -- are non-options
if arg_string == '--':
arg_string_pattern_parts.append('-')
for arg_string in arg_strings_iter:
arg_string_pattern_parts.append('A')
# otherwise, add the arg to the arg strings
# and note the index if it was an option
else:
option_tuple = self._parse_optional(arg_string)
if option_tuple is None:
pattern = 'A'
else:
option_string_indices[i] = option_tuple
pattern = 'O'
arg_string_pattern_parts.append(pattern)
# join the pieces together to form the pattern
arg_strings_pattern = ''.join(arg_string_pattern_parts)
# converts arg strings to the appropriate type and then takes the action
seen_actions = set()
seen_non_default_actions = set()
def take_action(action, argument_strings, option_string=None):
seen_actions.add(action)
argument_values = self._get_values(action, argument_strings)
# error if this argument is not allowed with other previously
# seen arguments, assuming that actions that use the default
# value don't really count as "present"
if argument_values is not action.default:
seen_non_default_actions.add(action)
for conflict_action in action_conflicts.get(action, []):
if conflict_action in seen_non_default_actions:
msg = _('not allowed with argument %s')
action_name = _get_action_name(conflict_action)
raise ArgumentError(action, msg % action_name)
# take the action if we didn't receive a SUPPRESS value
# (e.g. from a default)
if argument_values is not SUPPRESS:
action(self, namespace, argument_values, option_string)
# function to convert arg_strings into an optional action
def consume_optional(start_index):
# get the optional identified at this index
option_tuple = option_string_indices[start_index]
action, option_string, explicit_arg = option_tuple
# identify additional optionals in the same arg string
# (e.g. -xyz is the same as -x -y -z if no args are required)
match_argument = self._match_argument
action_tuples = []
while True:
# if we found no optional action, skip it
if action is None:
extras.append(arg_strings[start_index])
return start_index + 1
# if there is an explicit argument, try to match the
# optional's string arguments to only this
if explicit_arg is not None:
arg_count = match_argument(action, 'A')
# if the action is a single-dash option and takes no
# arguments, try to parse more single-dash options out
# of the tail of the option string
chars = self.prefix_chars
if arg_count == 0 and option_string[1] not in chars:
action_tuples.append((action, [], option_string))
char = option_string[0]
option_string = char + explicit_arg[0]
new_explicit_arg = explicit_arg[1:] or None
optionals_map = self._option_string_actions
if option_string in optionals_map:
action = optionals_map[option_string]
explicit_arg = new_explicit_arg
else:
msg = _('ignored explicit argument %r')
raise ArgumentError(action, msg % explicit_arg)
# if the action expects exactly one argument, we've
# successfully matched the option; exit the loop
elif arg_count == 1:
stop = start_index + 1
args = [explicit_arg]
action_tuples.append((action, args, option_string))
break
# error if a double-dash option did not use the
# explicit argument
else:
msg = _('ignored explicit argument %r')
raise ArgumentError(action, msg % explicit_arg)
# if there is no explicit argument, try to match the
# optional's string arguments with the following strings
# if successful, exit the loop
else:
start = start_index + 1
selected_patterns = arg_strings_pattern[start:]
arg_count = match_argument(action, selected_patterns)
stop = start + arg_count
args = arg_strings[start:stop]
action_tuples.append((action, args, option_string))
break
# add the Optional to the list and return the index at which
# the Optional's string args stopped
assert action_tuples
for action, args, option_string in action_tuples:
take_action(action, args, option_string)
return stop
# the list of Positionals left to be parsed; this is modified
# by consume_positionals()
positionals = self._get_positional_actions()
# function to convert arg_strings into positional actions
def consume_positionals(start_index):
# match as many Positionals as possible
match_partial = self._match_arguments_partial
selected_pattern = arg_strings_pattern[start_index:]
arg_counts = match_partial(positionals, selected_pattern)
# slice off the appropriate arg strings for each Positional
# and add the Positional and its args to the list
for action, arg_count in zip(positionals, arg_counts):
args = arg_strings[start_index: start_index + arg_count]
start_index += arg_count
take_action(action, args)
# slice off the Positionals that we just parsed and return the
# index at which the Positionals' string args stopped
positionals[:] = positionals[len(arg_counts):]
return start_index
# consume Positionals and Optionals alternately, until we have
# passed the last option string
extras = []
start_index = 0
if option_string_indices:
max_option_string_index = max(option_string_indices)
else:
max_option_string_index = -1
while start_index <= max_option_string_index:
# consume any Positionals preceding the next option
next_option_string_index = min([
index
for index in option_string_indices
if index >= start_index])
if start_index != next_option_string_index:
positionals_end_index = consume_positionals(start_index)
# only try to parse the next optional if we didn't consume
# the option string during the positionals parsing
if positionals_end_index > start_index:
start_index = positionals_end_index
continue
else:
start_index = positionals_end_index
# if we consumed all the positionals we could and we're not
# at the index of an option string, there were extra arguments
if start_index not in option_string_indices:
strings = arg_strings[start_index:next_option_string_index]
extras.extend(strings)
start_index = next_option_string_index
# consume the next optional and any arguments for it
start_index = consume_optional(start_index)
# consume any positionals following the last Optional
stop_index = consume_positionals(start_index)
# if we didn't consume all the argument strings, there were extras
extras.extend(arg_strings[stop_index:])
# if we didn't use all the Positional objects, there were too few
# arg strings supplied.
if positionals:
self.error(_('too few arguments'))
# make sure all required actions were present
for action in self._actions:
if action.required:
if action not in seen_actions:
name = _get_action_name(action)
self.error(_('argument %s is required') % name)
# make sure all required groups had one option present
for group in self._mutually_exclusive_groups:
if group.required:
for action in group._group_actions:
if action in seen_non_default_actions:
break
# if no actions were used, report the error
else:
names = [_get_action_name(action)
for action in group._group_actions
if action.help is not SUPPRESS]
msg = _('one of the arguments %s is required')
self.error(msg % ' '.join(names))
# return the updated namespace and the extra arguments
return namespace, extras
def _read_args_from_files(self, arg_strings):
# expand arguments referencing files
new_arg_strings = []
for arg_string in arg_strings:
# for regular arguments, just add them back into the list
if arg_string[0] not in self.fromfile_prefix_chars:
new_arg_strings.append(arg_string)
# replace arguments referencing files with the file content
else:
try:
args_file = open(arg_string[1:])
try:
arg_strings = []
for arg_line in args_file.read().splitlines():
for arg in self.convert_arg_line_to_args(arg_line):
arg_strings.append(arg)
arg_strings = self._read_args_from_files(arg_strings)
new_arg_strings.extend(arg_strings)
finally:
args_file.close()
except IOError:
err = _sys.exc_info()[1]
self.error(str(err))
# return the modified argument list
return new_arg_strings
def convert_arg_line_to_args(self, arg_line):
return [arg_line]
def _match_argument(self, action, arg_strings_pattern):
# match the pattern for this action to the arg strings
nargs_pattern = self._get_nargs_pattern(action)
match = _re.match(nargs_pattern, arg_strings_pattern)
# raise an exception if we weren't able to find a match
if match is None:
nargs_errors = {
None: _('expected one argument'),
OPTIONAL: _('expected at most one argument'),
ONE_OR_MORE: _('expected at least one argument'),
}
default = _('expected %s argument(s)') % action.nargs
msg = nargs_errors.get(action.nargs, default)
raise ArgumentError(action, msg)
# return the number of arguments matched
return len(match.group(1))
def _match_arguments_partial(self, actions, arg_strings_pattern):
# progressively shorten the actions list by slicing off the
# final actions until we find a match
result = []
for i in range(len(actions), 0, -1):
actions_slice = actions[:i]
pattern = ''.join([self._get_nargs_pattern(action)
for action in actions_slice])
match = _re.match(pattern, arg_strings_pattern)
if match is not None:
result.extend([len(string) for string in match.groups()])
break
# return the list of arg string counts
return result
def _parse_optional(self, arg_string):
# if it's an empty string, it was meant to be a positional
if not arg_string:
return None
# if it doesn't start with a prefix, it was meant to be positional
if not arg_string[0] in self.prefix_chars:
return None
# if the option string is present in the parser, return the action
if arg_string in self._option_string_actions:
action = self._option_string_actions[arg_string]
return action, arg_string, None
# if it's just a single character, it was meant to be positional
if len(arg_string) == 1:
return None
# if the option string before the "=" is present, return the action
if '=' in arg_string:
option_string, explicit_arg = arg_string.split('=', 1)
if option_string in self._option_string_actions:
action = self._option_string_actions[option_string]
return action, option_string, explicit_arg
# search through all possible prefixes of the option string
# and all actions in the parser for possible interpretations
option_tuples = self._get_option_tuples(arg_string)
# if multiple actions match, the option string was ambiguous
if len(option_tuples) > 1:
options = ', '.join([option_string
for action, option_string, explicit_arg in option_tuples])
tup = arg_string, options
self.error(_('ambiguous option: %s could match %s') % tup)
# if exactly one action matched, this segmentation is good,
# so return the parsed action
elif len(option_tuples) == 1:
option_tuple, = option_tuples
return option_tuple
# if it was not found as an option, but it looks like a negative
# number, it was meant to be positional
# unless there are negative-number-like options
if self._negative_number_matcher.match(arg_string):
if not self._has_negative_number_optionals:
return None
# if it contains a space, it was meant to be a positional
if ' ' in arg_string:
return None
# it was meant to be an optional but there is no such option
# in this parser (though it might be a valid option in a subparser)
return None, arg_string, None
def _get_option_tuples(self, option_string):
result = []
# option strings starting with two prefix characters are only
# split at the '='
chars = self.prefix_chars
if option_string[0] in chars and option_string[1] in chars:
if '=' in option_string:
option_prefix, explicit_arg = option_string.split('=', 1)
else:
option_prefix = option_string
explicit_arg = None
for option_string in self._option_string_actions:
if option_string.startswith(option_prefix):
action = self._option_string_actions[option_string]
tup = action, option_string, explicit_arg
result.append(tup)
# single character options can be concatenated with their arguments
# but multiple character options always have to have their argument
# separate
elif option_string[0] in chars and option_string[1] not in chars:
option_prefix = option_string
explicit_arg = None
short_option_prefix = option_string[:2]
short_explicit_arg = option_string[2:]
for option_string in self._option_string_actions:
if option_string == short_option_prefix:
action = self._option_string_actions[option_string]
tup = action, option_string, short_explicit_arg
result.append(tup)
elif option_string.startswith(option_prefix):
action = self._option_string_actions[option_string]
tup = action, option_string, explicit_arg
result.append(tup)
# shouldn't ever get here
else:
self.error(_('unexpected option string: %s') % option_string)
# return the collected option tuples
return result
def _get_nargs_pattern(self, action):
# in all examples below, we have to allow for '--' args
# which are represented as '-' in the pattern
nargs = action.nargs
# the default (None) is assumed to be a single argument
if nargs is None:
nargs_pattern = '(-*A-*)'
# allow zero or one arguments
elif nargs == OPTIONAL:
nargs_pattern = '(-*A?-*)'
# allow zero or more arguments
elif nargs == ZERO_OR_MORE:
nargs_pattern = '(-*[A-]*)'
# allow one or more arguments
elif nargs == ONE_OR_MORE:
nargs_pattern = '(-*A[A-]*)'
# allow any number of options or arguments
elif nargs == REMAINDER:
nargs_pattern = '([-AO]*)'
# allow one argument followed by any number of options or arguments
elif nargs == PARSER:
nargs_pattern = '(-*A[-AO]*)'
# all others should be integers
else:
nargs_pattern = '(-*%s-*)' % '-*'.join('A' * nargs)
# if this is an optional action, -- is not allowed
if action.option_strings:
nargs_pattern = nargs_pattern.replace('-*', '')
nargs_pattern = nargs_pattern.replace('-', '')
# return the pattern
return nargs_pattern
# ========================
# Value conversion methods
# ========================
def _get_values(self, action, arg_strings):
# for everything but PARSER args, strip out '--'
if action.nargs not in [PARSER, REMAINDER]:
arg_strings = [s for s in arg_strings if s != '--']
# optional argument produces a default when not present
if not arg_strings and action.nargs == OPTIONAL:
if action.option_strings:
value = action.const
else:
value = action.default
if isinstance(value, basestring):
value = self._get_value(action, value)
self._check_value(action, value)
# when nargs='*' on a positional, if there were no command-line
# args, use the default if it is anything other than None
elif (not arg_strings and action.nargs == ZERO_OR_MORE and
not action.option_strings):
if action.default is not None:
value = action.default
else:
value = arg_strings
self._check_value(action, value)
# single argument or optional argument produces a single value
elif len(arg_strings) == 1 and action.nargs in [None, OPTIONAL]:
arg_string, = arg_strings
value = self._get_value(action, arg_string)
self._check_value(action, value)
# REMAINDER arguments convert all values, checking none
elif action.nargs == REMAINDER:
value = [self._get_value(action, v) for v in arg_strings]
# PARSER arguments convert all values, but check only the first
elif action.nargs == PARSER:
value = [self._get_value(action, v) for v in arg_strings]
self._check_value(action, value[0])
# all other types of nargs produce a list
else:
value = [self._get_value(action, v) for v in arg_strings]
for v in value:
self._check_value(action, v)
# return the converted value
return value
def _get_value(self, action, arg_string):
type_func = self._registry_get('type', action.type, action.type)
if not _callable(type_func):
msg = _('%r is not callable')
raise ArgumentError(action, msg % type_func)
# convert the value to the appropriate type
try:
result = type_func(arg_string)
# ArgumentTypeErrors indicate errors
except ArgumentTypeError:
name = getattr(action.type, '__name__', repr(action.type))
msg = str(_sys.exc_info()[1])
raise ArgumentError(action, msg)
# TypeErrors or ValueErrors also indicate errors
except (TypeError, ValueError):
name = getattr(action.type, '__name__', repr(action.type))
msg = _('invalid %s value: %r')
raise ArgumentError(action, msg % (name, arg_string))
# return the converted value
return result
def _check_value(self, action, value):
# converted value must be one of the choices (if specified)
if action.choices is not None and value not in action.choices:
tup = map(repr, action.choices)
tup.sort()
tup = value, ', '.join(tup)
msg = _('invalid choice: %r (choose from %s)') % tup
raise ArgumentError(action, msg)
# =======================
# Help-formatting methods
# =======================
def format_usage(self):
formatter = self._get_formatter()
formatter.add_usage(self.usage, self._actions,
self._mutually_exclusive_groups)
return formatter.format_help()
def format_help(self):
formatter = self._get_formatter()
# usage
formatter.add_usage(self.usage, self._actions,
self._mutually_exclusive_groups)
# description
formatter.add_text(self.description)
# positionals, optionals and user-defined groups
for action_group in self._action_groups:
formatter.start_section(action_group.title)
formatter.add_text(action_group.description)
formatter.add_arguments(action_group._group_actions)
formatter.end_section()
# epilog
formatter.add_text(self.epilog)
# determine help from format above
return formatter.format_help()
def format_version(self):
import warnings
warnings.warn(
'The format_version method is deprecated -- the "version" '
'argument to ArgumentParser is no longer supported.',
DeprecationWarning)
formatter = self._get_formatter()
formatter.add_text(self.version)
return formatter.format_help()
def _get_formatter(self):
return self.formatter_class(prog=self.prog)
# =====================
# Help-printing methods
# =====================
def print_usage(self, file=None):
if file is None:
file = _sys.stdout
self._print_message(self.format_usage(), file)
def print_help(self, file=None):
if file is None:
file = _sys.stdout
self._print_message(self.format_help(), file)
def print_version(self, file=None):
import warnings
warnings.warn(
'The print_version method is deprecated -- the "version" '
'argument to ArgumentParser is no longer supported.',
DeprecationWarning)
self._print_message(self.format_version(), file)
def _print_message(self, message, file=None):
if message:
if file is None:
file = _sys.stderr
file.write(message)
# ===============
# Exiting methods
# ===============
def exit(self, status=0, message=None):
if message:
self._print_message(message, _sys.stderr)
_sys.exit(status)
def error(self, message):
"""error(message: string)
Prints a usage message incorporating the message to stderr and
exits.
If you override this in a subclass, it should not return -- it
should either exit or raise an exception.
"""
self.print_usage(_sys.stderr)
self.exit(2, _('%s: error: %s\n') % (self.prog, message))
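# A minimal end-to-end sketch of the ArgumentParser API documented above.
# It is illustrative only: the program and option names are invented and
# this block is not part of the original module.
#
#   parser = ArgumentParser(prog='demo', description='demo parser')
#   parser.add_argument('spec', help='distribution specification')
#   parser.add_argument('--yes', action='store_true', help='assume yes')
#   ns = parser.parse_args(['foo', '--yes'])
#   # ns.spec == 'foo' and ns.yes is True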
pgxnclient-1.2.1/pgxnclient/utils/__init__.py 0000664 0001750 0001750 00000003422 12143727745 021275 0 ustar piro piro 0000000 0000000 """
pgxnclient -- misc utilities package
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
__all__ = ['OrderedDict', 'load_json', 'load_jsons', 'sha1', 'b',
'find_executable']
import sys
import os
# OrderedDict available from Python 2.7
if sys.version_info >= (2, 7):
from collections import OrderedDict
else:
from pgxnclient.utils.ordereddict import OrderedDict
# Import the proper JSON library.
#
# Dependencies note: simplejson is certified for Python 2.5. Support for
# Python 2.4 was available up to version 2.0.9, but that version doesn't
# support ordered dicts. In Python 2.6 the json package is in the stdlib, but
# without OrderedDict support, so we use simplejson 2.1 again. From Python 2.7
# the stdlib json module supports OrderedDict, so we don't need the external
# dependency.
if sys.version_info >= (2, 7):
import json
else:
import simplejson as json
def load_json(f):
data = f.read()
if not isinstance(data, unicode):
data = data.decode('utf-8')
return load_jsons(data)
def load_jsons(data):
return json.loads(data, object_pairs_hook=OrderedDict)
# Import the sha1 object without warnings
from hashlib import sha1
# For compatibility from Python 2.4 to 3.x
# b('str') is equivalent to b'str' but works on Python < 2.6 too
if sys.version_info < (3,):
def b(s):
return s
else:
def b(s):
return s.encode('utf8')
def find_executable(name):
"""
Find an executable called ``name`` by inspecting the ``PATH`` environment
variable; return ``None`` if nothing is found.
"""
for dir in os.environ.get('PATH', '').split(os.pathsep):
if not dir:
continue
fn = os.path.abspath(os.path.join(dir, name))
if os.path.exists(fn):
return os.path.abspath(fn)
pgxnclient-1.2.1/pgxnclient/utils/ordereddict.py 0000664 0001750 0001750 00000021351 12143727745 022027 0 ustar piro piro 0000000 0000000 ## {{{ http://code.activestate.com/recipes/576693/ (r9)
# Backport of OrderedDict() class that runs on Python 2.4, 2.5, 2.6, 2.7 and pypy.
# Passes Python2.7's test suite and incorporates all the latest updates.
try:
from thread import get_ident as _get_ident
except ImportError:
from dummy_thread import get_ident as _get_ident
try:
from _abcoll import KeysView, ValuesView, ItemsView
except ImportError:
pass
class OrderedDict(dict):
'Dictionary that remembers insertion order'
# An inherited dict maps keys to values.
# The inherited dict provides __getitem__, __len__, __contains__, and get.
# The remaining methods are order-aware.
# Big-O running times for all methods are the same as for regular dictionaries.
# The internal self.__map dictionary maps keys to links in a doubly linked list.
# The circular doubly linked list starts and ends with a sentinel element.
# The sentinel element never gets deleted (this simplifies the algorithm).
# Each link is stored as a list of length three: [PREV, NEXT, KEY].
def __init__(self, *args, **kwds):
'''Initialize an ordered dictionary. Signature is the same as for
regular dictionaries, but keyword arguments are not recommended
because their insertion order is arbitrary.
'''
if len(args) > 1:
raise TypeError('expected at most 1 arguments, got %d' % len(args))
try:
self.__root
except AttributeError:
self.__root = root = [] # sentinel node
root[:] = [root, root, None]
self.__map = {}
self.__update(*args, **kwds)
def __setitem__(self, key, value, dict_setitem=dict.__setitem__):
'od.__setitem__(i, y) <==> od[i]=y'
# Setting a new item creates a new link which goes at the end of the linked
# list, and the inherited dictionary is updated with the new key/value pair.
if key not in self:
root = self.__root
last = root[0]
last[1] = root[0] = self.__map[key] = [last, root, key]
dict_setitem(self, key, value)
def __delitem__(self, key, dict_delitem=dict.__delitem__):
'od.__delitem__(y) <==> del od[y]'
# Deleting an existing item uses self.__map to find the link which is
# then removed by updating the links in the predecessor and successor nodes.
dict_delitem(self, key)
link_prev, link_next, key = self.__map.pop(key)
link_prev[1] = link_next
link_next[0] = link_prev
def __iter__(self):
'od.__iter__() <==> iter(od)'
root = self.__root
curr = root[1]
while curr is not root:
yield curr[2]
curr = curr[1]
def __reversed__(self):
'od.__reversed__() <==> reversed(od)'
root = self.__root
curr = root[0]
while curr is not root:
yield curr[2]
curr = curr[0]
def clear(self):
'od.clear() -> None. Remove all items from od.'
try:
for node in self.__map.itervalues():
del node[:]
root = self.__root
root[:] = [root, root, None]
self.__map.clear()
except AttributeError:
pass
dict.clear(self)
def popitem(self, last=True):
'''od.popitem() -> (k, v), return and remove a (key, value) pair.
Pairs are returned in LIFO order if last is true or FIFO order if false.
'''
if not self:
raise KeyError('dictionary is empty')
root = self.__root
if last:
link = root[0]
link_prev = link[0]
link_prev[1] = root
root[0] = link_prev
else:
link = root[1]
link_next = link[1]
root[1] = link_next
link_next[0] = root
key = link[2]
del self.__map[key]
value = dict.pop(self, key)
return key, value
# -- the following methods do not depend on the internal structure --
def keys(self):
'od.keys() -> list of keys in od'
return list(self)
def values(self):
'od.values() -> list of values in od'
return [self[key] for key in self]
def items(self):
'od.items() -> list of (key, value) pairs in od'
return [(key, self[key]) for key in self]
def iterkeys(self):
'od.iterkeys() -> an iterator over the keys in od'
return iter(self)
def itervalues(self):
'od.itervalues -> an iterator over the values in od'
for k in self:
yield self[k]
def iteritems(self):
'od.iteritems -> an iterator over the (key, value) items in od'
for k in self:
yield (k, self[k])
def update(*args, **kwds):
'''od.update(E, **F) -> None. Update od from dict/iterable E and F.
If E is a dict instance, does: for k in E: od[k] = E[k]
If E has a .keys() method, does: for k in E.keys(): od[k] = E[k]
Or if E is an iterable of items, does: for k, v in E: od[k] = v
In either case, this is followed by: for k, v in F.items(): od[k] = v
'''
if len(args) > 2:
raise TypeError('update() takes at most 2 positional '
'arguments (%d given)' % (len(args),))
elif not args:
raise TypeError('update() takes at least 1 argument (0 given)')
self = args[0]
# Make progressively weaker assumptions about "other"
other = ()
if len(args) == 2:
other = args[1]
if isinstance(other, dict):
for key in other:
self[key] = other[key]
elif hasattr(other, 'keys'):
for key in other.keys():
self[key] = other[key]
else:
for key, value in other:
self[key] = value
for key, value in kwds.items():
self[key] = value
__update = update # let subclasses override update without breaking __init__
__marker = object()
def pop(self, key, default=__marker):
'''od.pop(k[,d]) -> v, remove specified key and return the corresponding value.
If key is not found, d is returned if given, otherwise KeyError is raised.
'''
if key in self:
result = self[key]
del self[key]
return result
if default is self.__marker:
raise KeyError(key)
return default
def setdefault(self, key, default=None):
'od.setdefault(k[,d]) -> od.get(k,d), also set od[k]=d if k not in od'
if key in self:
return self[key]
self[key] = default
return default
def __repr__(self, _repr_running={}):
'od.__repr__() <==> repr(od)'
call_key = id(self), _get_ident()
if call_key in _repr_running:
return '...'
_repr_running[call_key] = 1
try:
if not self:
return '%s()' % (self.__class__.__name__,)
return '%s(%r)' % (self.__class__.__name__, self.items())
finally:
del _repr_running[call_key]
def __reduce__(self):
'Return state information for pickling'
items = [[k, self[k]] for k in self]
inst_dict = vars(self).copy()
for k in vars(OrderedDict()):
inst_dict.pop(k, None)
if inst_dict:
return (self.__class__, (items,), inst_dict)
return self.__class__, (items,)
def copy(self):
'od.copy() -> a shallow copy of od'
return self.__class__(self)
@classmethod
def fromkeys(cls, iterable, value=None):
'''OD.fromkeys(S[, v]) -> New ordered dictionary with keys from S
and values equal to v (which defaults to None).
'''
d = cls()
for key in iterable:
d[key] = value
return d
def __eq__(self, other):
'''od.__eq__(y) <==> od==y. Comparison to another OD is order-sensitive
while comparison to a regular mapping is order-insensitive.
'''
if isinstance(other, OrderedDict):
return len(self)==len(other) and self.items() == other.items()
return dict.__eq__(self, other)
def __ne__(self, other):
return not self == other
# -- the following methods are only used in Python 2.7 --
def viewkeys(self):
"od.viewkeys() -> a set-like object providing a view on od's keys"
return KeysView(self)
def viewvalues(self):
"od.viewvalues() -> an object providing a view on od's values"
return ValuesView(self)
def viewitems(self):
"od.viewitems() -> a set-like object providing a view on od's items"
return ItemsView(self)
## end of http://code.activestate.com/recipes/576693/ }}}
pgxnclient-1.2.1/pgxnclient/utils/strings.py 0000664 0001750 0001750 00000005607 12143727745 021236 0 ustar piro piro 0000000 0000000 """
Strings -- implementation of a few specific string subclasses.
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import re
from pgxnclient.i18n import _
from pgxnclient.utils.argparse import ArgumentTypeError
class CIStr(str):
"""
A case preserving string with non case-sensitive comparison.
"""
def __eq__(self, other):
if isinstance(other, CIStr):
return self.lower() == other.lower()
else:
return NotImplemented
def __ne__(self, other):
return not self == other
def __lt__(self, other):
if isinstance(other, CIStr):
return self.lower() < other.lower()
else:
return NotImplemented
def __gt__(self, other):
return other < self
def __le__(self, other):
return not other < self
def __ge__(self, other):
return not self < other
class Label(CIStr):
"""A string following the rules in RFC 1034.
Labels can then be used as host names in domains.
http://tools.ietf.org/html/rfc1034
"The labels must follow the rules for ARPANET host names. They must
start with a letter, end with a letter or digit, and have as interior
characters only letters, digits, and hyphen. There are also some
restrictions on the length. Labels must be 63 characters or less."
"""
def __new__(cls, value):
if not Label._re_chk.match(value):
raise ValueError(_("bad label: '%s'") % value)
return CIStr.__new__(cls, value)
_re_chk = re.compile(
r'^[a-z]([-a-z0-9]{0,61}[a-z0-9])?$',
re.IGNORECASE)
class Term(CIStr):
"""
A Term is a subtype of String that must be at least two characters long
and contain no slash (/), backslash (\), control, or space characters.
See http://pgxn.org/spec#Term
"""
def __new__(cls, value):
if not Term._re_chk.match(value) or min(map(ord, value)) < 32:
raise ValueError(_("not a valid term term: '%s'") % value)
return CIStr.__new__(cls, value)
_re_chk = re.compile( r'^[^\s/\\]{2,}$')
class Identifier(CIStr):
"""
A string modeling a PostgreSQL identifier.
"""
def __new__(cls, value):
if not value:
raise ValueError("PostgreSQL identifiers cannot be blank")
if not Identifier._re_chk.match(value):
value = '"%s"' % value.replace('"', '""')
# TODO: identifiers are actually case sensitive if quoted
return CIStr.__new__(cls, value)
_re_chk = re.compile(
r'^[a-z_][a-z0-9_\$]*$',
re.IGNORECASE)
@classmethod
def parse_arg(cls, s):
"""
Parse an Identifier from a command line argument.
"""
try:
return Identifier(s)
except ValueError, e:
# shouldn't happen anymore as we quote invalid identifiers
raise ArgumentTypeError(e)
pgxnclient-1.2.1/pgxnclient/utils/temp.py 0000664 0001750 0001750 00000000546 12143727745 020507 0 ustar piro piro 0000000 0000000 """
pgxnclient -- temp files utilities
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import shutil
import tempfile
import contextlib
@contextlib.contextmanager
def temp_dir():
"""Context manager to create a temp dir and delete after usage."""
dir = tempfile.mkdtemp()
yield dir
shutil.rmtree(dir)
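# A minimal usage sketch (illustrative only): work inside a scratch directory
# and let the context manager remove it afterwards; unpack() and the archive
# name below are assumptions, not part of this module.
#
#   with temp_dir() as dir:
#       unpack('foo.zip', dir)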
pgxnclient-1.2.1/pgxnclient/utils/uri.py 0000775 0001750 0001750 00000020457 12143727745 020347 0 ustar piro piro 0000000 0000000 """
Simple implementation of URI-Templates
(http://bitworking.org/projects/URI-Templates/).
Some bits are inspired by or based on:
* Joe Gregorio's example implementation
(http://code.google.com/p/uri-templates/)
* Addressable (http://addressable.rubyforge.org/)
Simple usage::
>>> import uri
>>> args = {'foo': 'it worked'}
>>> uri.expand_template("http://example.com/{foo}", args)
'http://example.com/it%20worked'
>>> args = {'a':'foo', 'b':'bar', 'a_b':'baz'}
>>> uri.expand_template("http://example.org/{a}{b}/{a_b}", args)
'http://example.org/foobar/baz'
You can also use keyword arguments for a more pythonic style::
>>> uri.expand_template("http://example.org/?q={a}", a="foo")
'http://example.org/?q=foo'
"""
import re
import urllib
__all__ = ["expand_template", "TemplateSyntaxError"]
class TemplateSyntaxError(Exception):
pass
_template_pattern = re.compile(r"{([^}]+)}")
def expand_template(template, values={}, **kwargs):
"""Expand a URI template."""
values = values.copy()
values.update(kwargs)
values = percent_encode(values)
return _template_pattern.sub(lambda m: _handle_match(m, values), template)
def _handle_match(match, values):
op, arg, variables = parse_expansion(match.group(1))
if op:
try:
return getattr(_operators, op)(variables, arg, values)
except AttributeError:
raise TemplateSyntaxError("Unexpected operator: %r" % op)
else:
assert len(variables) == 1
key, default = variables.items()[0]
return values.get(key, default)
#
# Parse an expansion
# Adapted directly from the spec (Appendix A); extra validation has been added
# to make it pass all the tests.
#
_varname_pattern = re.compile(r"^[A-Za-z0-9]\w*$")
def parse_expansion(expansion):
"""
Parse an expansion -- the part inside {curlybraces} -- into its component
parts. Returns a tuple of (operator, argument, variabledict).
For example::
>>> parse_expansion("-join|&|a,b,c=1")
('join', '&', {'a': None, 'c': '1', 'b': None})
>>> parse_expansion("c=1")
(None, None, {'c': '1'})
"""
if "|" in expansion:
(op, arg, vars_) = expansion.split("|")
op = op[1:]
else:
(op, arg, vars_) = (None, None, expansion)
vars_ = vars_.split(",")
variables = {}
for var in vars_:
if "=" in var:
(varname, vardefault) = var.split("=")
if not vardefault:
raise TemplateSyntaxError("Invalid variable: %r" % var)
else:
(varname, vardefault) = (var, None)
if not _varname_pattern.match(varname):
raise TemplateSyntaxError("Invalid variable: %r" % varname)
variables[varname] = vardefault
return (op, arg, variables)
#
# Encode an entire dictionary of values
#
def percent_encode(values):
rv = {}
for k, v in values.items():
if isinstance(v, basestring):
rv[k] = urllib.quote(v)
else:
rv[k] = [urllib.quote(s) for s in v]
return rv
#
# Operators; see Section 3.3.
# Shoved into a class just so we have an ad hoc namespace.
#
class _operators(object):
@staticmethod
def opt(variables, arg, values):
for k in variables.keys():
v = values.get(k, None)
if v is None or (not isinstance(v, basestring) and len(v) == 0):
continue
else:
return arg
return ""
@staticmethod
def neg(variables, arg, values):
if _operators.opt(variables, arg, values):
return ""
else:
return arg
@staticmethod
def listjoin(variables, arg, values):
k = variables.keys()[0]
return arg.join(values.get(k, []))
@staticmethod
def join(variables, arg, values):
return arg.join([
"%s=%s" % (k, values.get(k, default))
for k, default in variables.items()
if values.get(k, default) is not None
])
@staticmethod
def prefix(variables, arg, values):
k, default = variables.items()[0]
v = values.get(k, default)
if v is not None and len(v) > 0:
return arg + v
else:
return ""
@staticmethod
def append(variables, arg, values):
k, default = variables.items()[0]
v = values.get(k, default)
if v is not None and len(v) > 0:
return v + arg
else:
return ""
#
# A bunch more tests that don't rightly fit in docstrings elsewhere
# Taken from Joe Gregorio's template_parser.py.
#
_test_pre = """
>>> expand_template('{foo}', {})
''
>>> expand_template('{foo}', {'foo': 'barney'})
'barney'
>>> expand_template('{foo=wilma}', {})
'wilma'
>>> expand_template('{foo=wilma}', {'foo': 'barney'})
'barney'
>>> expand_template('{-prefix|&|foo}', {})
''
>>> expand_template('{-prefix|&|foo=wilma}', {})
'&wilma'
>>> expand_template('{-prefix||foo=wilma}', {})
'wilma'
>>> expand_template('{-prefix|&|foo=wilma}', {'foo': 'barney'})
'&barney'
>>> expand_template('{-append|/|foo}', {})
''
>>> expand_template('{-append|#|foo=wilma}', {})
'wilma#'
>>> expand_template('{-append|&?|foo=wilma}', {'foo': 'barney'})
'barney&?'
>>> expand_template('{-join|/|foo}', {})
''
>>> expand_template('{-join|/|foo,bar}', {})
''
>>> expand_template('{-join|&|q,num}', {})
''
>>> expand_template('{-join|#|foo=wilma}', {})
'foo=wilma'
>>> expand_template('{-join|#|foo=wilma,bar}', {})
'foo=wilma'
>>> expand_template('{-join|&?|foo=wilma}', {'foo': 'barney'})
'foo=barney'
>>> expand_template('{-listjoin|/|foo}', {})
''
>>> expand_template('{-listjoin|/|foo}', {'foo': ['a', 'b']})
'a/b'
>>> expand_template('{-listjoin||foo}', {'foo': ['a', 'b']})
'ab'
>>> expand_template('{-listjoin|/|foo}', {'foo': ['a']})
'a'
>>> expand_template('{-listjoin|/|foo}', {'foo': []})
''
>>> expand_template('{-opt|&|foo}', {})
''
>>> expand_template('{-opt|&|foo}', {'foo': 'fred'})
'&'
>>> expand_template('{-opt|&|foo}', {'foo': []})
''
>>> expand_template('{-opt|&|foo}', {'foo': ['a']})
'&'
>>> expand_template('{-opt|&|foo,bar}', {'foo': ['a']})
'&'
>>> expand_template('{-opt|&|foo,bar}', {'bar': 'a'})
'&'
>>> expand_template('{-opt|&|foo,bar}', {})
''
>>> expand_template('{-neg|&|foo}', {})
'&'
>>> expand_template('{-neg|&|foo}', {'foo': 'fred'})
''
>>> expand_template('{-neg|&|foo}', {'foo': []})
'&'
>>> expand_template('{-neg|&|foo}', {'foo': ['a']})
''
>>> expand_template('{-neg|&|foo,bar}', {'bar': 'a'})
''
>>> expand_template('{-neg|&|foo,bar}', {'bar': []})
'&'
>>> expand_template('{foo}', {'foo': ' '})
'%20'
>>> expand_template('{-listjoin|&|foo}', {'foo': ['&', '&', '|', '_']})
'%26&%26&%7C&_'
# Extra hoops to deal with unpredictable dict ordering
>>> expand_template('{-join|#|foo=wilma,bar=barney}', {}) in ('bar=barney#foo=wilma', 'foo=wilma#bar=barney')
True
"""
_syntax_errors = """
>>> expand_template("{fred=}")
Traceback (most recent call last):
...
TemplateSyntaxError: Invalid variable: 'fred='
>>> expand_template("{f:}")
Traceback (most recent call last):
...
TemplateSyntaxError: Invalid variable: 'f:'
>>> expand_template("{f<}")
Traceback (most recent call last):
...
TemplateSyntaxError: Invalid variable: 'f<'
>>> expand_template("{<:}")
Traceback (most recent call last):
...
TemplateSyntaxError: Invalid variable: '<:'
>>> expand_template("{<:fred,barney}")
Traceback (most recent call last):
...
TemplateSyntaxError: Invalid variable: '<:fred'
>>> expand_template("{>:}")
Traceback (most recent call last):
...
TemplateSyntaxError: Invalid variable: '>:'
>>> expand_template("{>:fred,barney}")
Traceback (most recent call last):
...
TemplateSyntaxError: Invalid variable: '>:fred'
"""
__test__ = {"test_pre": _test_pre, "syntax_errors": _syntax_errors}
if __name__ == '__main__':
import doctest
doctest.testmod() pgxnclient-1.2.1/pgxnclient/libexec/ 0000775 0001750 0001750 00000000000 12143730213 017416 5 ustar piro piro 0000000 0000000 pgxnclient-1.2.1/pgxnclient/libexec/pgxn-download 0000775 0001750 0001750 00000000303 12143727745 022141 0 ustar piro piro 0000000 0000000 #!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/pgxnclient/libexec/pgxn-search 0000775 0001750 0001750 00000000303 12143727745 021577 0 ustar piro piro 0000000 0000000 #!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/pgxnclient/libexec/pgxn-check 0000775 0001750 0001750 00000000303 12143727745 021407 0 ustar piro piro 0000000 0000000 #!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/pgxnclient/libexec/pgxn-install 0000775 0001750 0001750 00000000303 12143727745 022000 0 ustar piro piro 0000000 0000000 #!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/pgxnclient/libexec/pgxn-uninstall 0000775 0001750 0001750 00000000303 12143727745 022343 0 ustar piro piro 0000000 0000000 #!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/pgxnclient/libexec/README 0000664 0001750 0001750 00000001352 12143727745 020317 0 ustar piro piro 0000000 0000000 This directory contains the PGXN Client builtin commands. If you want to
extend the client with your own command, please use the directory returned by
``pgxn help --libexec``. See the documentation for further information about
how to create new commands.
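As an example, a user-defined command can be a small executable script; the
name ``pgxn-hello`` and its behaviour below are purely illustrative. Saved
(with the executable flag set) in the directory returned by
``pgxn help --libexec``, it would be run as ``pgxn hello``, receiving any
remaining command line arguments:

    #!/usr/bin/env python
    """
    pgxn-hello -- hypothetical example of a user-defined command
    """
    import sys
    print "hello from pgxn-hello, arguments: %r" % (sys.argv[1:],)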
Note that setuptools doesn't do a perfect job and replaces the links with the
script content, dropping the executable flag. If you are packaging pgxnclient
for a distribution, you may use soft/hard links instead. The location of this
directory may also be changed if your distribution policies prefer a better
location (e.g. ``/usr/lib/pgxnclient/libexec``...): in this case change the
`LIBEXECDIR` constant in ``pgxnclient/__init__.py`` with the absolute path of
the scripts directory.
pgxnclient-1.2.1/pgxnclient/libexec/pgxn-unload 0000775 0001750 0001750 00000000303 12143727745 021614 0 ustar piro piro 0000000 0000000 #!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/pgxnclient/libexec/pgxn-load 0000775 0001750 0001750 00000000303 12143727745 021251 0 ustar piro piro 0000000 0000000 #!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/pgxnclient/libexec/pgxn-mirror 0000775 0001750 0001750 00000000303 12143727745 021644 0 ustar piro piro 0000000 0000000 #!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/pgxnclient/libexec/pgxn-info 0000775 0001750 0001750 00000000303 12143727745 021265 0 ustar piro piro 0000000 0000000 #!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/pgxnclient/libexec/pgxn-help 0000775 0001750 0001750 00000000303 12143727745 021262 0 ustar piro piro 0000000 0000000 #!/usr/bin/env python
"""
pgxnclient -- command line interface
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from pgxnclient.cli import script
script()
pgxnclient-1.2.1/pgxnclient/network.py 0000664 0001750 0001750 00000005403 12143727745 020070 0 ustar piro piro 0000000 0000000 """
pgxnclient -- network interaction
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import os
import urllib2
from urlparse import urlsplit
from itertools import count
from contextlib import closing
from pgxnclient import __version__
from pgxnclient.i18n import _
from pgxnclient.errors import PgxnClientException, NetworkError, ResourceNotFound, BadRequestError
import logging
logger = logging.getLogger('pgxnclient.network')
def get_file(url):
opener = urllib2.build_opener()
opener.addheaders = [('User-agent', 'pgxnclient/%s' % __version__)]
logger.debug('opening url: %s', url)
try:
return closing(opener.open(url))
except urllib2.HTTPError, e:
if e.code == 404:
raise ResourceNotFound(_("resource not found: '%s'") % e.url)
elif e.code == 400:
raise BadRequestError(_("bad request on '%s'") % e.url)
elif e.code == 500:
raise NetworkError(_("server error"))
elif e.code == 503:
raise NetworkError(_("service unavailable"))
else:
raise NetworkError(_("unexpected response %d for '%s'")
% (e.code, e.url))
except urllib2.URLError, e:
raise NetworkError(_("network error: %s") % e.reason)
def get_local_file_name(target, url):
"""Return a good name for a local file.
If *target* is a dir, make a name out of the url. Otherwise return target
itself. Always return an absolute path.
"""
if os.path.isdir(target):
basename = urlsplit(url)[2].rsplit('/', 1)[-1]
fn = os.path.join(target, basename)
else:
fn = target
return os.path.abspath(fn)
def download(f, fn, rename=True):
"""Download a file locally.
:param f: open file to read
:param fn: name of the file to write. If a dir, save into it.
:param rename: if true and a file *fn* exists, rename the downloaded file
adding a suffix ``-1``, ``-2``... before the extension.
Return the name of the file saved.
"""
if os.path.isdir(fn):
fn = get_local_file_name(fn, f.url)
if rename:
if os.path.exists(fn):
base, ext = os.path.splitext(fn)
for i in count(1):
logger.debug(_("file %s exists"), fn)
fn = "%s-%d%s" % (base, i, ext)
if not os.path.exists(fn):
break
logger.info(_("saving %s"), fn)
try:
fout = open(fn, "wb")
except Exception, e:
raise PgxnClientException(
_("cannot open target file: %s: %s")
% (e.__class__.__name__, e))
try:
while 1:
data = f.read(8192)
if not data: break
fout.write(data)
finally:
fout.close()
return fn
pgxnclient-1.2.1/pgxnclient/tar.py 0000664 0001750 0001750 00000003421 12143727745 017163 0 ustar piro piro 0000000 0000000 """
pgxnclient -- tar file utilities
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import os
import tarfile
from pgxnclient.i18n import _
from pgxnclient.errors import PgxnClientException
from pgxnclient.archive import Archive
import logging
logger = logging.getLogger('pgxnclient.tar')
class TarArchive(Archive):
"""Handle .tar archives"""
_file = None
def can_open(self):
return tarfile.is_tarfile(self.filename)
def open(self):
assert not self._file, "archive already open"
try:
self._file = tarfile.open(self.filename, 'r')
except Exception, e:
raise PgxnClientException(
_("cannot open archive '%s': %s") % (self.filename, e))
def close(self):
if self._file is not None:
self._file.close()
self._file = None
def list_files(self):
assert self._file, "archive not open"
return self._file.getnames()
def read(self, fn):
assert self._file, "archive not open"
return self._file.extractfile(fn).read()
def unpack(self, destdir):
tarname = self.filename
logger.info(_("unpacking: %s"), tarname)
destdir = os.path.abspath(destdir)
self.open()
try:
for fn in self.list_files():
fname = os.path.abspath(os.path.join(destdir, fn))
if not fname.startswith(destdir):
raise PgxnClientException(
_("archive file '%s' trying to escape!") % fname)
self._file.extractall(path=destdir)
finally:
self.close()
return self._find_work_directory(destdir)
def unpack(filename, destdir):
return TarArchive(filename).unpack(destdir)
pgxnclient-1.2.1/pgxnclient/zip.py 0000664 0001750 0001750 00000005504 12143727745 017203 0 ustar piro piro 0000000 0000000 """
pgxnclient -- zip file utilities
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
import os
import stat
import zipfile
from pgxnclient.utils import b
from pgxnclient.i18n import _
from pgxnclient.errors import PgxnClientException
from pgxnclient.archive import Archive
import logging
logger = logging.getLogger('pgxnclient.zip')
class ZipArchive(Archive):
"""Handle .zip archives"""
_file = None
def can_open(self):
return zipfile.is_zipfile(self.filename)
def open(self):
assert not self._file, "archive already open"
try:
self._file = zipfile.ZipFile(self.filename, 'r')
except Exception, e:
raise PgxnClientException(
_("cannot open archive '%s': %s") % (self.filename, e))
def close(self):
if self._file is not None:
self._file.close()
self._file = None
def list_files(self):
assert self._file, "archive not open"
return self._file.namelist()
def read(self, fn):
assert self._file, "archive not open"
return self._file.read(fn)
def unpack(self, destdir):
zipname = self.filename
logger.info(_("unpacking: %s"), zipname)
destdir = os.path.abspath(destdir)
self.open()
try:
for fn in self.list_files():
fname = os.path.abspath(os.path.join(destdir, fn))
if not fname.startswith(destdir):
raise PgxnClientException(
_("archive file '%s' trying to escape!") % fname)
# Looks like checking for a trailing / is the only way to
# tell if the file is a directory.
if fn.endswith('/'):
os.makedirs(fname)
continue
# The directory is not always explicitly present in the archive
if not os.path.exists(os.path.dirname(fname)):
os.makedirs(os.path.dirname(fname))
# Copy the file content
logger.debug(_("saving: %s"), fname)
fout = open(fname, "wb")
try:
data = self.read(fn)
# In order to restore the executable bit, I haven't found
# anything that looks like an executable flag in the zipinfo,
# so look at the hashbangs...
isexec = data[:2] == b('#!')
fout.write(data)
finally:
fout.close()
if isexec:
os.chmod(fname, stat.S_IREAD | stat.S_IWRITE | stat.S_IEXEC)
finally:
self.close()
return self._find_work_directory(destdir)
def unpack(filename, destdir):
return ZipArchive(filename).unpack(destdir)
pgxnclient-1.2.1/pgxnclient/api.py 0000664 0001750 0001750 00000007264 12143727754 017157 0 ustar piro piro 0000000 0000000 """
pgxnclient -- client API stub
"""
# Copyright (C) 2011-2012 Daniele Varrazzo
# This file is part of the PGXN client
from __future__ import with_statement
from urllib import urlencode
from pgxnclient import network
from pgxnclient.utils import load_json
from pgxnclient.errors import NetworkError, NotFound, ResourceNotFound
from pgxnclient.utils.uri import expand_template
class Api(object):
def __init__(self, mirror):
self.mirror = mirror
def dist(self, dist, version=''):
try:
with self.call(version and 'meta' or 'dist',
{'dist': dist, 'version': version}) as f:
return load_json(f)
except ResourceNotFound:
raise NotFound("distribution '%s' not found" % dist)
def ext(self, ext):
try:
with self.call('extension', {'extension': ext}) as f:
return load_json(f)
except ResourceNotFound:
raise NotFound("extension '%s' not found" % ext)
def meta(self, dist, version, as_json=True):
with self.call('meta', {'dist': dist, 'version': version}) as f:
if as_json:
return load_json(f)
else:
return f.read().decode('utf-8')
def readme(self, dist, version):
with self.call('readme', {'dist': dist, 'version': version}) as f:
return f.read()
def download(self, dist, version):
dist = dist.lower()
version = version.lower()
return self.call('download', {'dist': dist, 'version': version})
def mirrors(self):
with self.call('mirrors') as f:
return load_json(f)
def search(self, where, query):
"""Search into PGXN.
:param where: where to search. The server currently supports "docs",
"dists", "extensions"
:param query: list of strings to search
"""
# convert the query list into a string
q = ' '.join([' ' in s and ('"%s"' % s) or s for s in query])
with self.call('search', {'in': where}, query={'q': q}) as f:
return load_json(f)
def stats(self, arg):
with self.call('stats', {'stats': arg}) as f:
return load_json(f)
def user(self, username):
with self.call('user', {'user': username}) as f:
return load_json(f)
def call(self, meth, args=None, query=None):
url = self.get_url(meth, args, query)
try:
return network.get_file(url)
except ResourceNotFound:
# check if it is one of the broken URLs as reported in
# https://groups.google.com/group/pgxn-users/browse_thread/thread/e41fbc202680c92c
version = args and args.get('version')
if not (version and version.trail):
raise
args = args.copy()
args['version'] = str(version).replace('-', '', 1)
url = self.get_url(meth, args, query)
return network.get_file(url)
def get_url(self, meth, args=None, query=None):
tmpl = self.get_template(meth)
url = expand_template(tmpl, args or {})
url = self.mirror.rstrip('/') + url
if query is not None:
url = url + '?' + urlencode(query)
return url
def get_template(self, meth):
return self.get_index()[meth]
_api_index = None
def get_index(self):
if self._api_index is None:
url = self.mirror.rstrip('/') + '/index.json'
try:
with network.get_file(url) as f:
self._api_index = load_json(f)
except ResourceNotFound:
raise NetworkError("API index not found at '%s'" % url)
return self._api_index
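# A minimal usage sketch (illustrative only): the mirror URL is the client's
# default one and 'pair' is just an example distribution name; dist() returns
# the JSON document published by the mirror, parsed into an OrderedDict.
#
#   api = Api(mirror='http://api.pgxn.org/')
#   data = api.dist('pair')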
pgxnclient-1.2.1/PKG-INFO 0000664 0001750 0001750 00000001701 12143730213 014724 0 ustar piro piro 0000000 0000000 Metadata-Version: 1.1
Name: pgxnclient
Version: 1.2.1
Summary: A command line tool to interact with the PostgreSQL Extension Network.
Home-page: http://pgxnclient.projects.postgresql.org/
Author: Daniele Varrazzo
Author-email: daniele.varrazzo@gmail.com
License: BSD
Description: UNKNOWN
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: System Administrators
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: POSIX
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.5
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.1
Classifier: Programming Language :: Python :: 3.2
Classifier: Topic :: Database
pgxnclient-1.2.1/setup.cfg 0000664 0001750 0001750 00000000073 12143730213 015451 0 ustar piro piro 0000000 0000000 [egg_info]
tag_build =
tag_date = 0
tag_svn_revision = 0
pgxnclient-1.2.1/pgxnclient.egg-info/ 0000775 0001750 0001750 00000000000 12143730213 017475 5 ustar piro piro 0000000 0000000 pgxnclient-1.2.1/pgxnclient.egg-info/entry_points.txt 0000664 0001750 0001750 00000000135 12143730213 022772 0 ustar piro piro 0000000 0000000 [console_scripts]
pgxn = pgxnclient.cli:command_dispatch
pgxnclient = pgxnclient.cli:script
pgxnclient-1.2.1/pgxnclient.egg-info/SOURCES.txt 0000664 0001750 0001750 00000005235 12143730213 021366 0 ustar piro piro 0000000 0000000 AUTHORS
CHANGES
COPYING
MANIFEST.in
Makefile
README.rst
setup.py
bin/pgxn
bin/pgxnclient
docs/Makefile
docs/changes.rst
docs/conf.py
docs/ext.rst
docs/index.rst
docs/install.rst
docs/usage.rst
pgxnclient/__init__.py
pgxnclient/api.py
pgxnclient/archive.py
pgxnclient/cli.py
pgxnclient/errors.py
pgxnclient/i18n.py
pgxnclient/network.py
pgxnclient/spec.py
pgxnclient/tar.py
pgxnclient/zip.py
pgxnclient.egg-info/PKG-INFO
pgxnclient.egg-info/SOURCES.txt
pgxnclient.egg-info/dependency_links.txt
pgxnclient.egg-info/entry_points.txt
pgxnclient.egg-info/not-zip-safe
pgxnclient.egg-info/top_level.txt
pgxnclient/commands/__init__.py
pgxnclient/commands/help.py
pgxnclient/commands/info.py
pgxnclient/commands/install.py
pgxnclient/libexec/README
pgxnclient/libexec/pgxn-check
pgxnclient/libexec/pgxn-download
pgxnclient/libexec/pgxn-help
pgxnclient/libexec/pgxn-info
pgxnclient/libexec/pgxn-install
pgxnclient/libexec/pgxn-load
pgxnclient/libexec/pgxn-mirror
pgxnclient/libexec/pgxn-search
pgxnclient/libexec/pgxn-uninstall
pgxnclient/libexec/pgxn-unload
pgxnclient/tests/__init__.py
pgxnclient/tests/test_archives.py
pgxnclient/tests/test_commands.py
pgxnclient/tests/test_label.py
pgxnclient/tests/test_semver.py
pgxnclient/tests/test_spec.py
pgxnclient/tests/testutils.py
pgxnclient/utils/__init__.py
pgxnclient/utils/argparse.py
pgxnclient/utils/ordereddict.py
pgxnclient/utils/semver.py
pgxnclient/utils/strings.py
pgxnclient/utils/temp.py
pgxnclient/utils/uri.py
testdata/META-manyext.json
testdata/download.py
testdata/foobar-0.42.1.tar.gz
testdata/foobar-0.42.1.zip
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Ffoobar%2F0.42.1%2FMETA-badsha1.json
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Ffoobar%2F0.42.1%2FMETA.json
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Ffoobar%2F0.42.1%2Ffoobar-0.42.1.zip
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Ffoobar%2F0.43.2b1%2FMETA.json
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Ffoobar%2F0.43.2b1%2Ffoobar-0.43.2b1.zip
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Ffoobar.json
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Fpg_amqp%2F0.3.0%2FMETA.json
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Fpg_amqp%2F0.3.0%2Fpg_amqp-0.3.0.zip
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Fpyrseas%2F0.4.1%2FMETA.json
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Fpyrseas%2F0.4.1%2Fpyrseas-0.4.1.zip
testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Fpyrseas.json
testdata/http%3A%2F%2Fapi.pgxn.org%2Fextension%2Famqp.json
testdata/http%3A%2F%2Fapi.pgxn.org%2Findex.json
testdata/http%3A%2F%2Fapi.pgxn.org%2Fmeta%2Fmirrors.json
testdata/http%3A%2F%2Fapi.pgxn.org%2Fsearch%2Fdocs%2F%3Fq%3D%2522foo%2Bbar%2522%2Bbaz
testdata/http%3A%2F%2Fexample.org%2Ffoobar-0.42.1.tar.gz
testdata/tar.ext
testdata/zip.ext pgxnclient-1.2.1/pgxnclient.egg-info/PKG-INFO 0000664 0001750 0001750 00000001701 12143730213 020571 0 ustar piro piro 0000000 0000000 Metadata-Version: 1.1
Name: pgxnclient
Version: 1.2.1
Summary: A command line tool to interact with the PostgreSQL Extension Network.
Home-page: http://pgxnclient.projects.postgresql.org/
Author: Daniele Varrazzo
Author-email: daniele.varrazzo@gmail.com
License: BSD
Description: UNKNOWN
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Console
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: System Administrators
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: POSIX
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2.5
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.1
Classifier: Programming Language :: Python :: 3.2
Classifier: Topic :: Database
pgxnclient-1.2.1/pgxnclient.egg-info/top_level.txt 0000664 0001750 0001750 00000000013 12143730213 022221 0 ustar piro piro 0000000 0000000 pgxnclient
pgxnclient-1.2.1/pgxnclient.egg-info/dependency_links.txt 0000664 0001750 0001750 00000000001 12143730213 023543 0 ustar piro piro 0000000 0000000
pgxnclient-1.2.1/pgxnclient.egg-info/not-zip-safe 0000664 0001750 0001750 00000000001 12143730213 021723 0 ustar piro piro 0000000 0000000
pgxnclient-1.2.1/MANIFEST.in 0000664 0001750 0001750 00000000363 12143727745 015410 0 ustar piro piro 0000000 0000000 include AUTHORS CHANGES COPYING MANIFEST.in README.rst setup.py Makefile
include bin/pgxn bin/pgxnclient
recursive-include pgxnclient *.py
recursive-include testdata *
include pgxnclient/libexec/*
include docs/conf.py docs/Makefile docs/*.rst
pgxnclient-1.2.1/docs/ 0000775 0001750 0001750 00000000000 12143730213 014560 5 ustar piro piro 0000000 0000000 pgxnclient-1.2.1/docs/index.rst 0000664 0001750 0001750 00000003125 12143727745 016442 0 ustar piro piro 0000000 0000000 PGXN Client's documentation
===========================
The `PGXN Client <http://pgxnclient.projects.postgresql.org/>`__ is a command
line tool designed to interact with the `PostgreSQL Extension Network
<http://pgxn.org/>`__ allowing searching, compiling, installing, and removing
extensions in PostgreSQL databases.
For example, to install the semver_ extension, the client can be invoked as::
$ pgxn install semver
which would download and compile the extension for one of the PostgreSQL
servers hosted on the machine and::
$ pgxn load -d somedb semver
which would load the extension in one of the databases of the server.
The client interacts with the PGXN web service and a ``Makefile`` provided by
the extension. The best results are achieved with makefiles using the
PostgreSQL `Extension Building Infrastructure`__; however the client tries to
degrade gracefully in the presence of any package hosted on PGXN and any package
available outside the extension network.
.. _semver: http://pgxn.org/dist/semver
.. __: http://www.postgresql.org/docs/current/static/extend-pgxs.html
- Home page: http://pgxnclient.projects.postgresql.org/
- Downloads: http://pypi.python.org/pypi/pgxnclient/
- Discussion group: http://groups.google.com/group/pgxn-users/
- Source repository: https://github.com/dvarrazzo/pgxnclient/
Contents:
.. toctree::
:maxdepth: 2
install
usage
ext
.. toctree::
:hidden:
changes
Indices and tables
==================
* :ref:`Changes Log `
* :ref:`search`
* :ref:`genindex`
..
* :ref:`modindex`
..
To do
=====
.. todolist::
pgxnclient-1.2.1/docs/Makefile 0000664 0001750 0001750 00000002360 12143727745 016241 0 ustar piro piro 0000000 0000000 # PGXN Client -- documentation makefile
#
# Use 'make env', then 'make html' to build the HTML documentation
#
# Copyright (C) 2011-2012 Daniele Varrazzo
PYTHON := python$(PYTHON_VERSION)
PYTHON_VERSION ?= $(shell $(PYTHON) -c 'import sys; print ("%d.%d" % sys.version_info[:2])')
BUILD_DIR = $(shell pwd)/build/lib.$(PYTHON_VERSION)
ENV_DIR = $(shell pwd)/env/py-$(PYTHON_VERSION)
ENV_BIN = $(ENV_DIR)/bin
ENV_LIB = $(ENV_DIR)/lib
EASY_INSTALL = PYTHONPATH=$(ENV_LIB) $(ENV_BIN)/easy_install-$(PYTHON_VERSION) -d $(ENV_LIB) -s $(ENV_BIN)
EZ_SETUP = $(ENV_BIN)/ez_setup.py
SPHINXOPTS =
SPHINXBUILD = $(ENV_BIN)/sphinx-build
PAPER =
BUILDDIR = .
.PHONY: env clean html
default: html
html:
PYTHONPATH=$(ENV_LIB) $(SPHINXBUILD) -b html $(ALLSPHINXOPTS) \
. $(BUILDDIR)/html
# The environment is currently required to build the documentation.
# It is not cleaned by 'make clean'
env: easy_install
mkdir -p $(ENV_BIN)
mkdir -p $(ENV_LIB)
$(EASY_INSTALL) docutils
$(EASY_INSTALL) sphinx
easy_install: ez_setup
PYTHONPATH=$(ENV_LIB) $(PYTHON) $(EZ_SETUP) -d $(ENV_LIB) -s $(ENV_BIN) setuptools
ez_setup:
mkdir -p $(ENV_BIN)
mkdir -p $(ENV_LIB)
wget -O $(EZ_SETUP) http://peak.telecommunity.com/dist/ez_setup.py
clean:
$(RM) -r html
pgxnclient-1.2.1/docs/changes.rst 0000777 0001750 0001750 00000000000 12143727745 020143 2../CHANGES ustar piro piro 0000000 0000000 pgxnclient-1.2.1/docs/install.rst 0000664 0001750 0001750 00000003752 12143727745 017007 0 ustar piro piro 0000000 0000000 Installation
============
Prerequisites
-------------
The program is implemented in Python. Versions from Python 2.5 onwards are
supported, including Python 3.0 and later.
PostgreSQL client-side development tools are required to build and install
extensions.
Installation from the Python Package Index
------------------------------------------
The PGXN client is `hosted on PyPI`__, therefore the easiest way to install
the program is through a Python installation tool such as easy_install_, pip_
or `zc.buildout`_. For example a system-wide installation can be obtained
with::
$ sudo easy_install pgxnclient
To upgrade from a previous version to the most recent available you may run
instead::
$ sudo easy_install -U pgxnclient
The documentation of the installation tool of your choice will also show how
to perform a local installation.
.. __: http://pypi.python.org/pypi/pgxnclient
.. _easy_install: http://peak.telecommunity.com/DevCenter/EasyInstall
.. _pip: http://www.pip-installer.org/en/latest/
.. _zc.buildout: http://www.buildout.org/
Installation from source
------------------------
The program can also be installed from the source, either from a `source
package`__ or from the `source repository`__: in this case you can install the
program using::
$ python setup.py install
.. __: http://pypi.python.org/pypi/pgxnclient/
.. __: https://github.com/dvarrazzo/pgxnclient/
Running from the project directory
----------------------------------
You can also run PGXN Client directly from the project directory, either
unpacked from a `source package`__, or cloned from the `source repository`__,
without performing any installation.
Just make sure that the project directory is in the :envvar:`PYTHONPATH` and
run the :program:`bin/pgxn` script::
$ cd /path/to/pgxnclient
$ export PYTHONPATH=`pwd`
$ ./bin/pgxn --version
pgxnclient 1.0.3.dev0 # just an example
.. __: http://pypi.python.org/pypi/pgxnclient/
.. __: https://github.com/dvarrazzo/pgxnclient/
pgxnclient-1.2.1/docs/usage.rst 0000664 0001750 0001750 00000043517 12143727745 016450 0 ustar piro piro 0000000 0000000 Program usage
=============
The program entry point is the script called :program:`pgxn`.
Usage:
.. parsed-literal::
:class: pgxn
pgxn [--help] [--version] *COMMAND*
[--mirror *URL*] [--verbose] [--yes] ...
The script offers several commands, whose list can be obtained using ``pgxn
--help``. The options available for each subcommand can be obtained using
:samp:`pgxn {COMMAND} --help`.
The main commands you may be interested in are `install`_ (to download, build
and install an extension distribution into the system) and `load`_ (to load an
installed extension into a database). Commands to perform reverse operations
are `uninstall`_ and `unload`_. Use `download`_ to get a package from a mirror
without installing it.
There are also informative commands: `search <#pgxn-search>`_ is used to
search the network, `info`_ to get information about a distribution.
The `mirror`_ command can be used to get a list of mirrors.
A few options are available to all the commands:
:samp:`--mirror {URL}`
Select a mirror to interact with. If not specified the default is
``http://api.pgxn.org/``.
``--verbose``
Print more information during the process.
``--yes``
Assume affirmative answer to all questions. Useful for unattended scripts.
Package specification
---------------------
Many commands such as install_ require a *package specification* to operate.
In its simplest form the specification is just the name of a distribution:
``pgxn install foo`` means "install the most recent stable release of the
``foo`` distribution". If a distribution with given name is not found, many
commands will look for an *extension* with the given name, and will work on
it.
The specification allows specifying an operator and a version number, so that
``pgxn install 'foo<2.0'`` will install the most recent stable release of the
distribution before the release 2.0. The version numbers are ordered according to
the `Semantic Versioning specification <http://semver.org/>`__. Supported
operators are ``=``, ``==`` (alias for ``=``), ``<``, ``<=``, ``>``, ``>=``.
Note that you probably need to quote the string as in the example to avoid
invoking shell command redirection.
Whenever a command takes a specification as input, it also accepts options
``--stable``, ``--testing`` and ``--unstable`` to specify the minimum release
status accepted. The default is "stable".
A few commands also allow specifying a local archive or local directory
containing a distribution: in this case the specification should contain at
least a path separator to disambiguate it from a distribution name (for
instance ``pgxn install ./foo.zip``) or it should be specified as a URL with
the ``file://`` scheme.
A few commands also allow specifying a remote package with a URL. Currently
the ``http://`` and ``https://`` schemes are supported.
Currently the client supports ``.zip`` and ``.tar`` archives (optionally
compressed with *gzip* or *bz2*).
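For example (package names, versions and paths below are only illustrative):

.. code-block:: console

    $ pgxn install foo                    # most recent stable release of 'foo'
    $ pgxn install 'foo<2.0'              # most recent stable release before 2.0
    $ pgxn install --testing foo          # accept testing releases too
    $ pgxn install ./foo-1.0.0.zip        # local archive
    $ pgxn install http://example.org/foo-1.0.0.zip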
.. _install:
``pgxn install``
----------------
Download, build, and install a distribution on the local system.
Usage:
.. parsed-literal::
:class: pgxn-install
pgxn install [--help] [--stable | --testing | --unstable]
[--pg_config *PROG*] [--make *PROG*]
[--sudo [*PROG*] | --nosudo]
*SPEC*
The program takes a `package specification`_ identifying the distribution to
work with. The download phase is skipped if the distribution specification
refers to a local directory or package. The package may be specified with a
URL.
Note that the built extension is not loaded in any database: use the command
`load`_ for this purpose.
The command will run the ``configure`` script if available in the package,
then will perform ``make all`` and ``make install``. It is assumed that the
``Makefile`` provided by the distribution uses PGXS_ to build the extension,
but this is not enforced: you may provide any Makefile as long as the expected
commands are implemented.
.. _PGXS: http://www.postgresql.org/docs/current/static/extend-pgxs.html
If there are many PostgreSQL installations on the system, the extension will
be built and installed against the instance whose :program:`pg_config` is
first found on the :envvar:`PATH`. A different instance can be specified using
the option :samp:`--pg_config {PATH}`.
The PGXS_ build system relies on the presence of `GNU Make`__: on many systems
it is installed as the :program:`gmake` or :program:`make` executable. The
program will use the first of them found on the path. You can specify an
alternative program using the ``--make`` option.
.. __: http://www.gnu.org/software/make/
If the extension is being installed into a system PostgreSQL installation, the
install phase will likely require root privileges to be performed. In this
case either run the command under :program:`sudo` or specify the ``--sudo``
option: in the latter case :program:`sudo` will only be invoked during the
"install" phase. An optional program :samp:`{PROG}` to elevate the user
privileges can be given as the argument of the ``--sudo`` option; if none is specified,
:program:`sudo` will be used.
.. note::
If ``--sudo`` is the last option and no :samp:`{PROG}` is specified, a
``--`` separator may be required to disambiguate the :samp:`{SPEC}`::
pgxn install --sudo -- foobar
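For example, the following would build the extension against a specific
PostgreSQL installation (the :program:`pg_config` path shown is only
illustrative) and invoke :program:`sudo` for the install phase only:

.. code-block:: console

    $ pgxn install --pg_config /usr/lib/postgresql/9.2/bin/pg_config --sudo -- semver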
.. _check:
``pgxn check``
--------------
Run a distribution's unit test.
Usage:
.. parsed-literal::
:class: pgxn-check
pgxn check [--help] [--stable | --testing | --unstable]
[--pg_config *PROG*] [--make *PROG*]
[-d *DBNAME*] [-h *HOST*] [-p *PORT*] [-U *NAME*]
*SPEC*
The command takes a `package specification`_ identifying the distribution to
work with, which can also be a local file or directory or a URL. The
distribution is unpacked if required and the ``installcheck`` make target is
run.
.. note::
The command doesn't run ``make all`` before ``installcheck``: if any file
required for testing is to be built, it should be listed as
``installcheck`` prerequisite in the ``Makefile``, for instance:
.. code-block:: make
myext.sql: myext.sql.in
some_command
installcheck: myext.sql
The script exits with a non-zero value if any test fails. In this case,
if files ``regression.diff`` and ``regression.out`` are produced (as
:program:`pg_regress` does), these files are copied to the local directory
where the script is run.
The database connection options are similar to the ones in load_, with the
difference that the variable :envvar:`PGDATABASE` doesn't influence the
database name.
See the install_ command for details about the command arguments.
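For example (the database name is illustrative):

.. code-block:: console

    $ pgxn check -d testdb pair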
.. warning::
At the time of writing, :program:`pg_regress` on Debian and derivatives is
affected by `bug #554166`__ which makes *HOST* selection impossible.
.. __: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=554166
.. _uninstall:
``pgxn uninstall``
------------------
Remove a distribution from the system.
Usage:
.. parsed-literal::
:class: pgxn-uninstall
pgxn uninstall [--help] [--stable | --testing | --unstable]
[--pg_config *PROG*] [--make *PROG*]
[--sudo [*PROG*] | --nosudo]
*SPEC*
The command does the opposite of the install_ command, removing a
distribution's files from the system. It doesn't issue any command to the
databases where the distribution's extensions may have been loaded: you should
first drop the extension (the unload_ command can do this).
The distribution should match what was installed via the `install`_ command.
See the install_ command for details about the command arguments.
.. _load:
``pgxn load``
-------------
Load the extensions included in a distribution into a database. The
distribution must be already installed in the system, e.g. via the `install`_
command.
Usage:
.. parsed-literal::
:class: pgxn-load
pgxn load [--help] [--stable | --testing | --unstable] [-d *DBNAME*]
[-h *HOST*] [-p *PORT*] [-U *NAME*] [--pg_config *PATH*]
[--schema *SCHEMA*]
*SPEC* [*EXT* [*EXT* ...]]
The distribution is specified according to the `package specification`_ and
can refer to a local directory or file or to a URL. No consistency check is
performed between the packages specified in the ``install`` and ``load``
command: the specifications should refer to compatible packages. The specified
distribution is only used to read the metadata: only installed files are
actually used to issue database commands.
The database to install into can be specified using options
``-d``/``--dbname``, ``-h``/``--host``, ``-p``/``--port``,
``-U``/``--username``. The default values for these parameters are the regular
system ones and can be also set using environment variables
:envvar:`PGDATABASE`, :envvar:`PGHOST`, :envvar:`PGPORT`, :envvar:`PGUSER`.
The command also supports a ``--pg_config`` option that can be used to specify
an alternative :program:`pg_config` to use to look for installation scripts:
you may need to specify the parameter if there are many PostgreSQL
installations on the system, and should be consistent with the one specified
in the ``install`` command.
If the specified database version is at least PostgreSQL 9.1, and if the
extension specifies a ``.control`` file, it will be loaded using the `CREATE
EXTENSION`_ command, otherwise it will be loaded as a loose set of objects.
For more information see the `extensions documentation`__.
.. _CREATE EXTENSION: http://www.postgresql.org/docs/current/static/sql-createextension.html
.. __: http://www.postgresql.org/docs/current/static/extend-extensions.html
The command is based on the `'provides' section`_ of the distribution's
``META.json``: if a SQL file is specified, that file will be used to load the
extension. Note that loading is only attempted if the file extension is
``.sql``: if it's not, we assume that the extension is not really a PostgreSQL
extension (it may be for instance a script). If no ``file`` is specified, a
file named :samp:`{extension}.sql` will be looked for in a few directories
under the PostgreSQL ``shared`` directory and it will be loaded after an user
confirmation.
If the distribution provides more than one extension, the extensions are
loaded in the order in which they are specified in the ``provides`` section of
the ``META.json`` file. It is also possible to load only a few of the
extensions provided, specifying them after *SPEC*: the extensions will be
loaded in the order specified.
If a *SCHEMA* is specified, the extensions are loaded in the provided schema.
Note that if ``CREATE EXTENSION`` is used, the schema is directly supported;
otherwise the ``.sql`` script loaded will be patched to create the objects in
the provided schema (a confirmation will be asked before attempting loading).
.. _'provides' section: http://pgxn.org/spec/#provides
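For example, assuming the ``pair`` distribution has already been installed on
the system, the following would load its extension into the database
``somedb``, creating the objects in the ``utils`` schema (database and schema
names are only illustrative):

.. code-block:: console

    $ pgxn load -d somedb --schema utils pair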
.. _unload:
``pgxn unload``
---------------
Unload a distribution's extensions from a database.
Usage:
.. parsed-literal::
:class: pgxn-unload
pgxn unload [--help] [--stable | --testing | --unstable] [-d *DBNAME*]
[-h *HOST*] [-p *PORT*] [-U *NAME*] [--pg_config *PATH*]
[--schema *SCHEMA*]
*SPEC* [*EXT* [*EXT* ...]]
The command does the opposite of the load_ command: it drops a distribution's
extensions from the specified database, either issuing `DROP EXTENSION`_
commands or running the uninstall scripts, if provided.
For every extension specified in the `'provides' section`_ of the
distribution's ``META.json``, the command will look for a file called
:samp:`uninstall_{file.sql}` where :samp:`{file.sql}` is the ``file``
specified. If no file is specified, :samp:`{extension}.sql` is assumed. If
a file with an extension different from ``.sql`` is specified, it is
assumed that the extension is not a PostgreSQL extension so unload is not
performed.
If a *SCHEMA* is specified, the uninstall script will be patched to drop the
objects in the selected schema. However, if the extension was loaded via
``CREATE EXTENSION``, the server will be able to figure out the correct schema
itself, so the option will be ignored.
If the distribution specifies more than one extension, they are unloaded in
reverse order with respect to the order in which they are specified in the
``META.json`` file. It is also possible to unload only a few of the
extensions provided, specifying them after *SPEC*: the extensions will be
unloaded in the order specified.
.. _DROP EXTENSION: http://www.postgresql.org/docs/current/static/sql-dropextension.html
See the load_ command for details about the command arguments.
.. _download:
``pgxn download``
-----------------
Download a distribution from the network.
Usage:
.. parsed-literal::
:class: pgxn-download
pgxn download [--help] [--stable | --testing | --unstable]
[--target *PATH*]
*SPEC*
The distribution is specified according to the `package specification`_ and
can be represented by a URL. The file is saved in the current directory,
usually named :samp:`{distribution}-{version}.zip`. If a file with the same
name exists, a suffix ``-1``, ``-2`` etc. is added to the name, before the
extension. A different directory or name can be specified using the
``--target`` option.
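A couple of illustrative invocations (the distribution names and the version
constraint are only examples) are:
.. code-block:: console
$ pgxn download pgtap
$ pgxn download --target /tmp 'pair<0.1.2'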
.. _pgxn-search:
``pgxn search``
---------------
Search in the extensions available on PGXN.
Usage:
.. parsed-literal::
:class: pgxn-search
pgxn search [--help] [--dist | --ext | --docs] *TERM* [*TERM* ...]
The command prints on ``stdout`` a list of packages and versions matching
:samp:`{TERM}`. By default the search is performed in the documentation:
alternatively the distributions (using the ``--dist`` option) or the
extensions (using the ``--ext`` option) can be searched.
Example:
.. code-block:: console
$ pgxn search --dist integer
tinyint 0.1.1
Traditionally, PostgreSQL core has a policy not to have 1 byte *integer*
in it. With this module, you can define 1 byte *integer* column on your
tables, which will help query performances and...
check_updates 1.0.0
... test2 defined as: CREATE TABLE test2(a *INTEGER*, b *INTEGER*, c
*INTEGER*, d *INTEGER*); To make a trigger allowing updates only when c
becomes equal to 5: CREATE TRIGGER c_should_be_5 BEFORE UPDATE ON...
ssn 1.0.0
INSERT INTO test VALUES('124659876'); The output is always represented
using the format with dashes, i.e: 123-45-6789 124-65-9876 Internals:
The type is stored as a 4 bytes *integer*.
The search will return all the matches containing any of *TERM*. In order to
search for items containing more than one word, join the words into a single
token. For instance to search for items containing the terms "double
precision" or the terms "floating point" use:
.. code-block:: console
$ pgxn search "double precision" "floating point"
semver 0.2.2
... to semver semver(12.0::real) 12.0.0semver(*double precision*) Cast
*double precision* to semver semver(9.2::*double precision*)
9.2.0semver(integer) Cast integer to semver semver(42::integer)...
saio 0.0.1
Defaults to true. saio_seed A *floating point* seed for the random
numbers generator. saio_equilibrium_factor Scaling factor for the query
size, determining the number of loops before equilibrium is...
pgTAP 0.25.0
... ) casts_are( casts[] ) SELECT casts_are( ARRAY[ 'integer AS *double
precision*', 'integer AS reltime', 'integer AS numeric', -- ...
.. _info:
``pgxn info``
-------------
Print information about a distribution obtained from PGXN.
Usage:
.. parsed-literal::
:class: pgxn-info
pgxn info [--help] [--stable | --testing | --unstable]
[--details | --meta | --readme | --versions]
*SPEC*
The distribution is specified according to the `package specification`_. It
cannot be a local directory, a file, or a URL. The command output is a list of
values obtained from the distribution's ``META.json`` file, for example:
.. code-block:: console
$ pgxn info pair
name: pair
abstract: A key/value pair data type
description: This library contains a single PostgreSQL extension,
a key/value pair data type called “pair”, along with a convenience
function for constructing key/value pairs.
maintainer: David E. Wheeler
license: postgresql
release_status: stable
version: 0.1.2
date: 2011-04-20T23:47:22Z
sha1: 9988d7adb056b11f8576db44cca30f88a08bd652
provides: pair: 0.1.2
Alternatively the raw ``META.json`` (using the ``--meta`` option) or the
distribution README (using the ``--readme`` option) can be obtained.
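For example, to fetch the raw metadata of a distribution (output not shown
here):
.. code-block:: console
$ pgxn info --meta pair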
Using the ``--versions`` option, the command prints a list of available
versions for the specified distribution, together with their release status.
Only the distributions matching :samp:`{SPEC}` and the release status options,
if any are specified, are printed, for example:
.. code-block:: console
$ pgxn info --versions 'pair<0.1.2'
pair 0.1.1 stable
pair 0.1.0 stable
.. _mirror:
``pgxn mirror``
---------------
Return information about the available mirrors.
Usage:
.. parsed-literal::
:class: pgxn-mirror
pgxn mirror [--help] [--detailed] [*URI*]
If no :samp:`{URI}` is specified, print a list of known mirror URIs. Otherwise
print details about the specified mirror. It is also possible to print details
for all the known mirrors using the ``--detailed`` option.
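For example (the mirror URI below is only an illustration and may not be
reachable):
.. code-block:: console
$ pgxn mirror
$ pgxn mirror http://pgxn.depesz.com/
$ pgxn mirror --detailed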
.. _help:
``pgxn help``
-------------
Display help and other program information.
Usage:
.. parsed-literal::
:class: pgxn-help
pgxn help [--help] [--all | --libexec | *CMD*]
Without options, show the same information obtained by ``pgxn --help``, which
includes a list of the builtin commands. With the ``--all`` option, print the
complete list of the commands installed on the system.
The option ``--libexec`` prints the full path of the directory containing
the external commands scripts: see :ref:`extending` for more information.
:samp:`pgxn help {CMD}` is an alias for :samp:`pgxn {CMD} --help`.
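For example (``download`` is just one of the builtin commands documented
above):
.. code-block:: console
$ pgxn help download
$ pgxn help --libexec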
pgxnclient-1.2.1/docs/conf.py 0000664 0001750 0001750 00000016203 12143727745 016101 0 ustar piro piro 0000000 0000000 # -*- coding: utf-8 -*-
#
# PGXN Client documentation build configuration file, created by
# sphinx-quickstart on Tue May 3 00:34:03 2011.
#
# This file is execfile()d with the current directory set to its containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys, os, re
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#sys.path.insert(0, os.path.abspath('.'))
# -- General configuration -----------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be extensions
# coming with Sphinx (named 'sphinx.ext.*') or your custom ones.
extensions = ['sphinx.ext.todo']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix of source filenames.
source_suffix = '.rst'
# The encoding of source files.
#source_encoding = 'utf-8-sig'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'PGXN Client'
copyright = u'2011-2012, Daniele Varrazzo'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
sys.path.insert(0, '..')
try:
from pgxnclient import __version__
finally:
del sys.path[0]
# The short X.Y version.
version = re.match(r'\d+\.\d+', __version__).group()
# The full version, including alpha/beta/rc tags.
release = __version__
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#language = None
# There are two options for replacing |today|: either, you set today to some
# non-false value, then it is used:
#today = ''
# Else, today_fmt is used as the format for a strftime call.
#today_fmt = '%B %d, %Y'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
exclude_patterns = ['_build', 'env']
# The reST default role (used for this markup: `text`) to use for all documents.
default_role = 'obj'
# If true, '()' will be appended to :func: etc. cross-reference text.
#add_function_parentheses = True
# If true, the current module name will be prepended to all description
# unit titles (such as .. function::).
#add_module_names = True
# If true, sectionauthor and moduleauthor directives will be shown in the
# output. They are ignored by default.
#show_authors = False
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# A list of ignored prefixes for module index sorting.
#modindex_common_prefix = []
todo_include_todos = True
# -- Options for HTML output ---------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
html_theme = 'default'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#html_theme_options = {}
# Add any paths that contain custom themes here, relative to this directory.
#html_theme_path = []
# The name for this set of Sphinx documents. If None, it defaults to
# " v documentation".
#html_title = None
# A shorter title for the navigation bar. Default is the same as html_title.
#html_short_title = None
# The name of an image file (relative to this directory) to place at the top
# of the sidebar.
#html_logo = None
# The name of an image file (within the static path) to use as favicon of the
# docs. This file should be a Windows icon file (.ico) being 16x16 or 32x32
# pixels large.
#html_favicon = None
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
# using the given strftime format.
#html_last_updated_fmt = '%b %d, %Y'
# If true, SmartyPants will be used to convert quotes and dashes to
# typographically correct entities.
# Disabled because of issue #647
# https://bitbucket.org/birkenfeld/sphinx/issue/647/
html_use_smartypants = False
# Custom sidebar templates, maps document names to template names.
#html_sidebars = {}
# Additional templates that should be rendered to pages, maps page names to
# template names.
#html_additional_pages = {}
# If false, no module index is generated.
#html_domain_indices = True
# If false, no index is generated.
#html_use_index = True
# If true, the index is split into individual pages for each letter.
#html_split_index = False
# If true, links to the reST sources are added to the pages.
#html_show_sourcelink = True
# If true, "Created using Sphinx" is shown in the HTML footer. Default is True.
#html_show_sphinx = True
# If true, "(C) Copyright ..." is shown in the HTML footer. Default is True.
#html_show_copyright = True
# If true, an OpenSearch description file will be output, and all pages will
# contain a tag referring to it. The value of this option must be the
# base URL from which the finished HTML is served.
#html_use_opensearch = ''
# This is the file name suffix for HTML files (e.g. ".xhtml").
#html_file_suffix = None
# Output file base name for HTML help builder.
htmlhelp_basename = 'PGXNClientdoc'
# -- Options for LaTeX output --------------------------------------------------
# The paper size ('letter' or 'a4').
#latex_paper_size = 'letter'
# The font size ('10pt', '11pt' or '12pt').
#latex_font_size = '10pt'
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, documentclass [howto/manual]).
latex_documents = [
('index', 'PGXNClient.tex', u'PGXN Client Documentation',
u'Daniele Varrazzo', 'manual'),
]
# The name of an image file (relative to this directory) to place at the top of
# the title page.
#latex_logo = None
# For "manual" documents, if this is true, then toplevel headings are parts,
# not chapters.
#latex_use_parts = False
# If true, show page references after internal links.
#latex_show_pagerefs = False
# If true, show URL addresses after external links.
#latex_show_urls = False
# Additional stuff for the LaTeX preamble.
#latex_preamble = ''
# Documents to append as an appendix to all manuals.
#latex_appendices = []
# If false, no module index is generated.
#latex_domain_indices = True
# -- Options for manual page output --------------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
('index', 'pgxnclient', u'PGXN Client Documentation',
[u'Daniele Varrazzo'], 1)
]
pgxnclient-1.2.1/docs/ext.rst 0000664 0001750 0001750 00000003200 12143727745 016125 0 ustar piro piro 0000000 0000000 .. _extending:
Extending PGXN client
=====================
PGXN Client can be easily extended, either by adding new builtin commands to
be included in the `!pgxnclient` package, or by writing new scripts in any
language you want.
In order to add new builtin commands, add a Python module to the
``pgxnclient/commands`` directory containing your command or a set of
logically-related commands. The commands are implemented by subclassing the
`!Command` class. Your commands will benefit from all the infrastructure
available for the other commands. For up-to-date information, take a look at
the implementation of simple builtin commands, such as the ones in ``info.py``.
If you are not into Python and want to add commands written in other
languages, you can provide a link (either soft or hard) to your command under
one of the ``libexec`` directories. The exact location of the directories
depends on the client installation: distribution packagers may decide to move
them according to their own policies. The location of one of the directories,
which can be considered the "public" one, can always be known using the command
``pgxn help --libexec``. Note that this directory may not exist: in this case
the command being installed is responsible for creating it. Links are also looked
for in the :envvar:`PATH` directories.
In order to implement the command :samp:`pgxn {foo}`, the link should be named
:samp:`pgxn-{foo}`. The :program:`pgxn` script will dispatch the command and
all the options to your script. Note that you can package many commands into
the same script by looking at ``argv[0]`` to know the name of the link through
which your script has been invoked.
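For example (the script name and its location are purely hypothetical), a new
``hello`` command could be hooked up as follows:
.. code-block:: console
$ mkdir -p "$(pgxn help --libexec)"
$ ln -s ~/src/pgxn-hello "$(pgxn help --libexec)"/pgxn-hello
$ pgxn hello --help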
pgxnclient-1.2.1/testdata/ 0000775 0001750 0001750 00000000000 12143730213 015441 5 ustar piro piro 0000000 0000000 pgxnclient-1.2.1/testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Fpyrseas.json 0000664 0001750 0001750 00000005367 12143727745 025511 0 ustar piro piro 0000000 0000000 {
"abstract": "Framework and utilities to upgrade and maintain databases",
"date": "2011-10-27T16:43:33Z",
"description": "Pyrseas provides a framework and utilities to upgrade and maintain a PostgreSQL database. Its utilities output a database schema in YAML format suitable for committing to a version control system and read this format to generate SQL to sync to another database. Supports PostgreSQL 8.4, 9.0 and 9.1.",
"docs": {
"README": {
"title": "README"
},
"docs/dbtoyaml": {
"abstract": "Output PostgreSQL schemas in YAML format",
"title": "dbtoyaml"
},
"docs/yamltodb": {
"abstract": "Generate SQL to sync a database with a YAML schema spec",
"title": "yamltodb"
}
},
"license": "bsd",
"maintainer": "Joe Abbate ",
"name": "Pyrseas",
"prereqs": {
"runtime": {
"requires": {
"PostgreSQL": "8.4.0"
}
}
},
"provides": {
"dbtoyaml": {
"abstract": "Output PostgreSQL schemas in YAML format",
"docfile": "docs/dbtoyaml.rst",
"docpath": "docs/dbtoyaml",
"file": "pyrseas/dbtoyaml.py",
"version": "0.4.1"
},
"yamltodb": {
"abstract": "Generate SQL to sync a database with a YAML schema spec",
"docfile": "docs/yamltodb.rst",
"docpath": "docs/yamltodb",
"file": "pyrseas/yamltodb.py",
"version": "0.4.1"
}
},
"release_status": "stable",
"releases": {
"stable": [
{
"date": "2011-10-27T16:43:33Z",
"version": "0.4.1"
},
{
"date": "2011-09-27T01:35:22Z",
"version": "0.4.0"
},
{
"date": "2011-08-26T17:15:44Z",
"version": "0.3.1"
},
{
"date": "2011-06-30T15:31:05Z",
"version": "0.3.0"
},
{
"date": "2011-06-07T20:41:39Z",
"version": "0.2.1"
},
{
"date": "2011-05-19T18:43:06Z",
"version": "0.2.0"
}
]
},
"resources": {
"bugtracker": {
"web": "https://github.com/jmafc/Pyrseas/issues"
},
"homepage": "http://pyrseas.org/",
"repository": {
"type": "git",
"url": "git://github.com/jmafc/Pyrseas.git",
"web": "https://github.com/jmafc/Pyrseas"
}
},
"sha1": "11f5085f99811cc1d78ac97d2dfef42c90aeda08",
"special_files": [
"README",
"META.json",
"Makefile",
"MANIFEST.in"
],
"tags": [
"version control",
"yaml",
"database version control",
"schema versioning"
],
"user": "jma",
"version": "0.4.1"
}
pgxnclient-1.2.1/testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Fpg_amqp%2F0.3.0%2Fpg_amqp-0.3.0.zip 0000777 0001750 0001750 00000000000 12143727745 033046 2foobar-0.42.1.zip ustar piro piro 0000000 0000000 pgxnclient-1.2.1/testdata/http%3A%2F%2Fapi.pgxn.org%2Fdist%2Ffoobar%2F0.43.2b1%2FMETA.json 0000664 0001750 0001750 00000003140 12143727745 027307 0 ustar piro piro 0000000 0000000 {
"abstract": "A mock distribution",
"date": "2011-04-20T23:47:22Z",
"description": "This library doesn't exist.",
"docs": {
"README": {
"title": "foobar 0.42.1"
},
"doc/pair": {
"abstract": "Abstract of an abstract project",
"title": "foobar 0.42.1"
}
},
"license": "postgresql",
"maintainer": [
"Daniele Varrazzo "
],
"name": "foobar",
"provides": {
"foobar": {
"abstract": "A non existing extension",
"docfile": "doc/foobar.md",
"docpath": "doc/foobar",
"file": "sql/foobar.sql",
"version": "0.43.2b1"
}
},
"release_status": "stable",
"releases": {
"stable": [
{
"date": "2010-10-29T22:44:42Z",
"version": "0.42.1"
},
{
"date": "2010-10-19T03:59:54Z",
"version": "0.42.0"
}
],
"testing": [
{
"date": "2011-04-20T23:47:22Z",
"version": "0.43.2b1"
}
]
},
"resources": {
"bugtracker": {
"web": "http://github.com/piro/foobar/issues/"
},
"repository": {
"type": "git",
"url": "git://github.com/piro/foobar.git",
"web": "http://github.com/piro/foobar/"
}
},
"sha1": "6ae083946254210f6bfc9c5b2cae538bbaf59142",
"special_files": [
"Changes",
"README.md",
"META.json",
"Makefile",
"foobar.control"
],
"tags": [
"foo",
"bar",
"testing"
],
"user": "piro",
"version": "0.43.2b1"
}
pgxnclient-1.2.1/testdata/tar.ext 0000777 0001750 0001750 00000000000 12143727745 022103 2foobar-0.42.1.tar.gz ustar piro piro 0000000 0000000 pgxnclient-1.2.1/testdata/http%3A%2F%2Fapi.pgxn.org%2Fmeta%2Fmirrors.json 0000664 0001750 0001750 00000005511 12143727745 025472 0 ustar piro piro 0000000 0000000 [
{
"uri": "http://pgxn.depesz.com/",
"frequency": "every 6 hours",
"location": "Nürnberg, Germany",
"organization": "depesz Software Hubert Lubaczewski",
"timezone": "CEST",
"email": "depesz.com|web_pgxn",
"bandwidth": "100Mbps",
"src": "rsync://master.pgxn.org/pgxn/",
"rsync": "",
"notes": "access via http only"
},
{
"uri": "http://www.postgres-support.ch/pgxn/",
"frequency": "hourly",
"location": "Basel, Switzerland, Europe",
"organization": "micro systems",
"timezone": "CEST",
"email": "msys.ch|marc",
"bandwidth": "10Mbps",
"src": "rsync://master.pgxn.org/pgxn",
"rsync": "",
"notes": ""
},
{
"uri": "http://pgxn.justatheory.com/",
"frequency": "daily",
"location": "Portland, OR, USA",
"organization": "David E. Wheeler",
"timezone": "America/Los_Angeles",
"email": "justatheory.com|pgxn",
"bandwidth": "Cable",
"src": "rsync://master.pgxn.org/pgxn/",
"rsync": "",
"notes": ""
},
{
"uri": "http://pgxn.darkixion.com/",
"frequency": "hourly",
"location": "London, UK",
"organization": "Thom Brown",
"timezone": "Europe/London",
"email": "darkixion.com|pgxn",
"bandwidth": "1Gbps",
"src": "rsync://master.pgxn.org/pgxn",
"rsync": "rsync://pgxn.darkixion.com/pgxn",
"notes": ""
},
{
"uri": "http://mirrors.cat.pdx.edu/pgxn/",
"frequency": "hourly",
"location": "Portland, OR, USA",
"organization": "PSU Computer Action Team",
"timezone": "America/Los_Angeles",
"email": "cat.pdx.edu|support",
"bandwidth": "60Mbsec",
"src": "rsync://master.pgxn.org/pgxn",
"rsync": "rsync://mirrors.cat.pdx.edu/pgxn",
"notes": "I2 and IPv6"
},
{
"uri": "http://pgxn.dalibo.org/",
"frequency": "hourly",
"location": "Marseille, France",
"organization": "DALIBO SARL",
"timezone": "CEST",
"email": "dalibo.com|contact",
"bandwidth": "100Mbps",
"src": "rsync://master.pgxn.org/pgxn/",
"rsync": "",
"notes": ""
},
{
"uri": "http://pgxn.cxsoftware.org/",
"frequency": "hourly",
"location": "Seattle, WA, USA",
"organization": "CxNet",
"timezone": "America/Los_Angeles",
"email": "cxnet.cl|cristobal",
"bandwidth": "100Mbps",
"src": "rsync://master.pgxn.org/pgxn/",
"rsync": "",
"notes": ""
},
{
"uri": "http://api.pgxn.org/",
"frequency": "hourly",
"location": "Portland, OR, USA",
"organization": "PGXN",
"timezone": "America/Los_Angeles",
"email": "pgexperts.com|pgxn",
"bandwidth": "10MBps",
"src": "rsync://master.pgxn.org/pgxn",
"rsync": "",
"notes": "API server."
}
]
pgxnclient-1.2.1/testdata/http%3A%2F%2Fapi.pgxn.org%2Fextension%2Famqp.json 0000664 0001750 0001750 00000000672 12143727745 026024 0 ustar piro piro 0000000 0000000 {
"extension": "amqp",
"latest": "stable",
"stable": {
"abstract": "AMQP protocol support for PostgreSQL",
"dist": "pg_amqp",
"docpath": "doc/amqp",
"sha1": "4c7112a28584ecd2ac9607cf62274a0ec9d586cf",
"version": "0.3.0"
},
"versions": {
"0.3.0": [
{
"date": "2011-05-19T21:47:12Z",
"dist": "pg_amqp",
"version": "0.3.0"
}
]
}
}
pgxnclient-1.2.1/testdata/foobar-0.42.1.tar.gz 0000664 0001750 0001750 00000012006 12143727745 020577 0 ustar piro piro 0000000 0000000